Issues in Table with Multi-Row Insert

I have created master-detail screens using JHeadstart on two separate pages: the master in Form layout and the detail in Table layout, with the multi-row insert, update, and delete flags ON. I have set the New Rows count to 2.
Issue 1
If I try to delete any existing rows, I get errors for the new rows saying a value is required for the mandatory fields. It should simply ignore the new rows if I have not entered any values in them (as it does for a non master-detail Table layout). I suspect this happens because the JHeadstart code sets the foreign key on the new detail rows but does not reset the row status back to INITIALIZED.
I also noticed that the create() method of the underlying EO is called for those blank rows when I click the 'Save' button, even though I have not changed any data in them.
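To illustrate the suspected cause, here is a minimal sketch (not taken from JHeadstart's generated code; the view object and FK attribute name are hypothetical) of how a programmatically defaulted detail row can be put back into the blank state so that an untouched row is ignored at commit time. setNewRowState() and Row.STATUS_INITIALIZED are standard oracle.jbo.Row members.
import oracle.jbo.Row;
import oracle.jbo.ViewObject;
public class BlankRowHelper {
    // Create a blank detail row, default its foreign key, then mark the row
    // "initialized" again so it no longer counts as a pending new row.
    public static void addBlankDetailRow(ViewObject detailVO, Object masterId) {
        Row newRow = detailVO.createRow();
        detailVO.insertRow(newRow);
        newRow.setAttribute("MasterId", masterId);        // hypothetical FK attribute
        newRow.setNewRowState(Row.STATUS_INITIALIZED);
    }
}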
Issue 2
When I also select the new rows for deletion, I get a '500 Internal Server Error' with the following stack trace. This also happens for a normal (non master-detail) Table layout.
java.lang.IllegalStateException: AdfFacesContext was already released or had never been attached.
    at oracle.adf.view.faces.context.AdfFacesContext.release(AdfFacesContext.java:342)
    at oracle.adfinternal.view.faces.webapp.AdfFacesFilterImpl.doFilter(AdfFacesFilterImpl.java:253)
    at oracle.adf.view.faces.webapp.AdfFacesFilter.doFilter(AdfFacesFilter.java:87)
Issue 3
I have put some validation code in the validate() method of the MyEntityImpl.java class.
The validate() method seems to be getting called many times, in my case 20 times, even though there are only 2 new rows.
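For context (a general ADF BC point, not a diagnosis of the specific count): entity validation can be invoked several times per row in a single request, because the validation cycle repeats while any entity in the transaction is still invalid and validation also runs when rows are posted. A minimal sketch of keeping the entity-level check cheap and side-effect-free, using the standard validateEntity() hook (the attribute name and rule are hypothetical, not from the post):
import oracle.jbo.JboException;
import oracle.jbo.domain.Number;
import oracle.jbo.server.EntityImpl;
public class MyEntityImpl extends EntityImpl {
    // The rule only reads attributes and throws, so repeated invocations by
    // the framework are harmless.
    protected void validateEntity() {
        super.validateEntity();
        Number qty = (Number) getAttribute("Quantity");   // hypothetical attribute
        if (qty != null && qty.intValue() <= 0) {
            throw new JboException("Quantity must be positive");
        }
    }
}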
Environment:
JDeveloper 10.1.3, JHeadstart 10.1.3 build 78, Windows XP
thanks

Thanks for the reply.
Issue 1:
What I have observed is that in tables with multi-row select enabled, the blank rows do not have any data. This is because the EO's create() method is called only when we post the data using the 'Save' button, so the foreign keys are not set up either. That is correct behavior, since create() and the FK setup should happen only if the user has entered a value in a new row and therefore intends to insert new data into the table.
I was able to find the exact cause of this issue. It happens because the detail table has a column that needs to be shown as a checkbox. Since a checkbox can only be bound to a Boolean attribute in the VO, I created a transient attribute of type Boolean that calls the getter/setter of the actual attribute and converts between the String values "Y"/"N" and true/false. Here is the code for the transient attribute's getter/setter:
public Boolean getDisplayOnWebBoolean() {
    return "Y".equals(getDisplayOnWeb()) ? Boolean.TRUE : Boolean.FALSE;
}

public void setDisplayOnWebBoolean(Boolean value) {
    if (Boolean.TRUE.equals(value)) {
        setDisplayOnWeb("Y");
    } else {
        setDisplayOnWeb("N");
    }
}
Now when I click the 'Save' button, the setter for the Boolean field is called with value = false, which results in the row being marked as dirty; the validation for the required attributes then runs and fails.
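One possible workaround, shown only as a sketch (it is not from this thread): have the setter write through to the underlying attribute only when the value actually changes, so that posting an untouched blank row does not dirty it.
public void setDisplayOnWebBoolean(Boolean value) {
    // Hypothetical guard: only write through when the incoming value differs
    // from what the row already represents, so an untouched blank row
    // (attribute still null, incoming value false) is not marked dirty.
    if (!getDisplayOnWebBoolean().equals(value)) {
        setDisplayOnWeb(Boolean.TRUE.equals(value) ? "Y" : "N");
    }
}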
Issue 2:
Confirmed that the correct filter-mapping entries are present in web.xml.
Now when I select the new blank rows for deletion and click Save, the following exception is thrown:
java.lang.ClassCastException: oracle.jheadstart.controller.jsf.bean.NewTableRowBean
    at oracle.jheadstart.controller.jsf.bean.JhsCollectionModel.getRowsToRemove(JhsCollectionModel.java:412)
    at oracle.jheadstart.controller.jsf.bean.JhsCollectionModel.doModelUpdate(JhsCollectionModel.java:604)
    at oracle.jheadstart.controller.jsf.lifecycle.JhsPageLifecycle.processModelUpdaters(JhsPageLifecycle.java:541)
    at oracle.jheadstart.controller.jsf.lifecycle.JhsPageLifecycle.validateModelUpdates(JhsPageLifecycle.java:571)
thanks - rutwik

Similar Messages

  • Issue in retrieving all the records from ADF Table with multiple row

    Hi,
    As per my requirement, I need to fill the table with multi-selected LOV values, and when the user clicks Commit, I need to save them to the database.
    I am using ADF 11g with a multi-select table. Using the ADD method below, I am able to add the records, but if the user clicks Cancel, I need to remove them from the view and clear the table as well.
    The issue I am facing is that in my cancel method I always get only half of the records. For example, if the table contains 100 records, my cancel method sees only 50.
    Please let me know what the issue in my source code is.
    ADD Method:
    public void insertRecInCMProcessParamVal(String commType, String processType, Number seqNumber) {
        try {
            Row row = this.getCmProcessParamValueView1().createRow();
            row.setAttribute("ParamValue7", commType);
            row.setAttribute("ProcessType", processType);
            row.setAttribute("CreationDate", new Date());
            row.setAttribute("CreatedBy", uid);
            row.setAttribute("ParamValueSeqNum", seqNumber);
            row.setAttribute("ProcessedFlag", "N");
            this.getCmProcessParamValueView1().insertRow(row);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    Table Code:
    <af:table value="#{bindings.CmProcessParamValueView11.collectionModel}"
    var="row"
    rows="#{bindings.CmProcessParamValueView11.rangeSize}"
    emptyText="#{bindings.CmProcessParamValueView11.viewable ? 'No data to display.' : 'Access Denied.'}"
    fetchSize="#{bindings.CmProcessParamValueView11.rangeSize}"
    rowBandingInterval="1"
    selectedRowKeys="#{bindings.CmProcessParamValueView11.collectionModel.selectedRow}"
    selectionListener="#{bindings.CmProcessParamValueView11.collectionModel.makeCurrent}"
    rowSelection="multiple"
    binding="#{backingBeanScope.backing_app_RunCalcPage.t1}"
    id="t1" width="100%" inlineStyle="height:100px;" >
    <af:column sortProperty="ParamValue6"
    sortable="true"
    headerText="#{bindings.CmProcessParamValueView11.hints.ParamValue6.label}"
    id="c1" visible="false">
    <af:inputText value="#{row.bindings.ParamValue6.inputValue}"
    label="#{bindings.CmProcessParamValueView11.hints.ParamValue6.label}"
    required="#{bindings.CmProcessParamValueView11.hints.ParamValue6.mandatory}"
    columns="#{bindings.CmProcessParamValueView11.hints.ParamValue6.displayWidth}"
    maximumLength="#{bindings.CmProcessParamValueView11.hints.ParamValue6.precision}"
    shortDesc="#{bindings.CmProcessParamValueView11.hints.ParamValue6.tooltip}"
    id="it3">
    <f:validator binding="#{row.bindings.ParamValue6.validator}"/>
    </af:inputText>
    </af:column>
    <af:column sortProperty="ParamValue7"
    sortable="true"
    headerText="Comm Type"
    id="c2">
    <af:inputText value="#{row.bindings.ParamValue7.inputValue}"
    label="#{bindings.CmProcessParamValueView11.hints.ParamValue7.label}"
    required="#{bindings.CmProcessParamValueView11.hints.ParamValue7.mandatory}"
    columns="#{bindings.CmProcessParamValueView11.hints.ParamValue7.displayWidth}"
    maximumLength="#{bindings.CmProcessParamValueView11.hints.ParamValue7.precision}"
    shortDesc="#{bindings.CmProcessParamValueView11.hints.ParamValue7.tooltip}"
    id="it4">
    <f:validator binding="#{row.bindings.ParamValue7.validator}"/>
    </af:inputText>
    </af:column>
    <af:column sortProperty="ParamValue8"
    sortable="true"
    headerText="#{bindings.CmProcessParamValueView11.hints.ParamValue8.label}"
    id="c3" visible="false">
    <af:inputText value="#{row.bindings.ParamValue8.inputValue}"
    label="#{bindings.CmProcessParamValueView11.hints.ParamValue8.label}"
    required="#{bindings.CmProcessParamValueView11.hints.ParamValue8.mandatory}"
    columns="#{bindings.CmProcessParamValueView11.hints.ParamValue8.displayWidth}"
    maximumLength="#{bindings.CmProcessParamValueView11.hints.ParamValue8.precision}"
    shortDesc="#{bindings.CmProcessParamValueView11.hints.ParamValue8.tooltip}"
    id="it2">
    <f:validator binding="#{row.bindings.ParamValue8.validator}"/>
    </af:inputText>
    </af:column>
    </af:table>
    Backing Bean Code:
    DCBindingContainer dcBindings = (DCBindingContainer) getBindings();
    DCIteratorBinding dcIterator = dcBindings.findIteratorBinding("CmProcessParamValueView1Iterator");
    RowSetIterator rs = dcIterator.getRowSetIterator();
    System.out.println("In Cancel Row Count is : " + rs.getRowCount());
    if (rs.getRowCount() > 0) {
        Row firstRow = rs.first();
        firstRow.refresh(Row.REFRESH_UNDO_CHANGES);
        firstRow.remove();
        while (rs.hasNext()) {
            int count = rs.getRowCount();
            System.out.println("Count is : " + count);
            Row row = rs.next();
            System.out.println("Row === " + row);
            if (row != null) {
                row.refresh(Row.REFRESH_UNDO_CHANGES);
                row.remove();
            }
        }
    }
    Thanks.

    Issue resolved.
    Remove the selectionListener and selectedRowKeys attributes from the table.
    Code to get all the selected rows:
    RowSetIterator rs = dcIterator.getRowSetIterator();
    RowKeySet rks = this.t1.getSelectedRowKeys();
    Iterator rksIter = rks.iterator();
    while (rksIter.hasNext()) {
        List l = (List) rksIter.next();
        Key key = (Key) l.get(0);
        Row row = rs.getRow(key);
        // ... work with the selected row here
    }
    Thanks.
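    A hedged sketch of a complete cancel handler built on the code above (t1 and dcIterator come from the posted code; the helper method name and the use of java.util.ArrayList are illustrative, not from the thread). Collecting the rows first and removing them afterwards also avoids the pitfall of removing rows while still advancing the same iterator, which skips rows.
    // Sketch: resolve each selected key to a Row first, then undo and remove.
    private void removeSelectedRows(DCIteratorBinding dcIterator) {
        RowSetIterator rs = dcIterator.getRowSetIterator();
        List<Row> selected = new ArrayList<Row>();
        Iterator keys = this.t1.getSelectedRowKeys().iterator();
        while (keys.hasNext()) {
            List keyPath = (List) keys.next();
            Row row = rs.getRow((Key) keyPath.get(0));
            if (row != null) {
                selected.add(row);
            }
        }
        for (Row row : selected) {
            row.refresh(Row.REFRESH_UNDO_CHANGES);
            row.remove();
        }
    }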

  • Multi-row insert in master-detail tables

    Hi, I'm using JDeveloper 10.1.3.2 with JHeadstart and my problem is:
    I have a master-detail structure, both laid out as tables, and my goal is multi-row insert (exactly 3 rows) in both the master and the detail table when the user creates a new order (a business scenario of ours). I cannot create rows in the master or detail VO by overriding the create() method, because the entities have complex (composite) primary keys and part of each key is populated by the user with an LOV. So I set New Rows to 3 in JHeadstart and checked 'Multi-Row Insert Allowed', but the problem is that I can only create rows in the master table after I submit the form. I want to create a row in the master table and fill rows in the detail table, and after that have the opportunity to create a second (or even third) master row and fill its detail rows.
    Thanks for the help.
    Piotr

    See JHS DevGuide: 3.2.1. Review Database Design:
    If you are in the position to create or modify the database design, make sure all
    tables have a non-updateable primary key, preferably consisting of only one
    column. If you have updateable and/or composite primary keys, introduce a
    surrogate primary key by adding an ID column that is automatically populated.
    See section 3.2.4 Generating Primary Key Values for more info. Although ADF
    Business Components can handle composite and updateable primary keys, this
    will cause problems in ADF Faces pages. For example, an ADF Faces table
    manages its rows using the key of the underlying row. If this key changes,
    unexpected behavior can occur in your ADF Faces page. In addition, if you want
    to provide a drop down list on a lookup table to populate a foreign key, the
    foreign key can only consist of one column, which in turn means the referenced
    table must have a single primary key column.
    Groeten,
    HJH
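    For illustration, here is a minimal sketch (not code from the thread; it assumes a database sequence named ORDERS_SEQ and a surrogate Id attribute on the entity) of populating a surrogate primary key in the entity's create() method, along the lines of the Developer's Guide section on generating primary key values:
    import oracle.jbo.AttributeList;
    import oracle.jbo.server.EntityImpl;
    import oracle.jbo.server.SequenceImpl;
    public class OrderImpl extends EntityImpl {   // hypothetical entity
        protected void create(AttributeList attributeList) {
            super.create(attributeList);
            // Fetch the next value from the database sequence and use it as the
            // non-updateable surrogate primary key.
            SequenceImpl seq = new SequenceImpl("ORDERS_SEQ", getDBTransaction());
            setAttribute("Id", seq.getSequenceNumber());
        }
    }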

  • How to insert a table with variable rows in smart form

    Hi all,
    How to insert a table with variable rows in smart form?
    Any help would be appreciated.
    Regards,
    Mahesh.

    Hi,
    Right click the mouse->create->table
    If you want 5 columns, you need to declare 5 cells in one line type of the table
    Click on Table -> Details, then do the following
    Line Type    1      2      3      4      5
    L1           2mm    3mm    etc.
    Here specify the widths of as many columns as you want.
    Then, in the header/main area of the table, click 'Create Table Line' with row type L1; 5 cells will appear automatically. In each cell create a text element and display the variable to be printed there.

  • Multi row insert with checkboxes and textfield

    I'm trying to build a multi-row insert on a tabular form.
    I have checkboxes to choose the rows that I want to insert.
    In each of these rows I have to enter a value into the text field.
    But when I loop over my checked checkboxes, my procedure always takes the first text fields in my report.
    What's going wrong?
    -- report query
    select apex_item.checkbox(30, ar_id, 'class="checkThis"') as ar_id
         , apex_item.display_and_save(31, ar_nr) as ar_nr
         , ar_name
         , apex_item.text(32, null) as menge
    from ar
    -- on-submit process
    FOR i IN 1 .. APEX_APPLICATION.G_F30.COUNT LOOP
      INSERT INTO blabla (st_profil_id, ar_id, menge)
      VALUES (:P5_st_profil_id, APEX_APPLICATION.G_F30(i), APEX_APPLICATION.G_F32(i));
    END LOOP;

    Hi,
    Please read the following
    http://docs.oracle.com/cd/E23903_01/doc/doc.41/e21676/apex_app.htm#CHDGJBAB
    http://docs.oracle.com/cd/E23903_01/doc/doc.41/e21676/apex_item.htm#CHDDCHAF , especially the Creating an On-Submit Process.
    Cheers,

  • Import the table with 0 rows

    Hi
    I have a problem importing a dump that contains tables with 0 rows.
    When I exported from Oracle 11.2 64-bit on Server 2008, I noticed that the log did not list the tables with 0 rows.
    When I import into Oracle 11.2 64-bit on another Server 2008, I get a lot of errors on these tables with 0 rows.
    In the log I see the tables that have at least 1 row, but none of the tables with 0 rows.
    I opened my dump in TextPad and can see that it contains the "CREATE ..." statements for these tables.
    I don't understand why this happens. I used a full dump by SYS; it didn't help.
    This is not the first time I have exported and imported dumps; usually there are no errors.
    I'm using the EXP and IMP commands, and every other time it has been OK (if that's relevant).
    Why does this happen? Any solutions for this issue?
    Thanks

    I think I've found the solution to this issue.
    Here are two links about the new feature, which is called deferred segment creation.
    The reason for this behavior is the 11.2 feature 'Deferred Segment Creation': the creation of a table segment is deferred until the first row is inserted.
    As a result, empty tables are not listed in dba_segments and are not exported by the exp utility.
    http://www.nativeread.com/2010/04/09/11gr2-empty-tables-skipped-by-export-deferred-segment-creation/
    http://antognini.ch/2010/10/deferred-segment-creation-as-of-11-2-0-2/
    And this is what I found in the official Oracle documentation:
    Beginning in Oracle Database 11g Release 2, when creating a non-partitioned heap-organized table in a locally managed tablespace, table segment creation is deferred until the first row is inserted. In addition, creation of segments is deferred for any LOB columns of the table, any indexes created implicitly as part of table creation, and any indexes subsequently explicitly created on the table. The advantages of this space allocation method are the following: a significant amount of disk space can be saved for applications that create hundreds or thousands of tables upon installation, many of which might never be populated; application installation time is reduced. There is a small performance penalty when the first row is inserted, because the new segment must be created at that time.
    To enable deferred segment creation, compatibility must be set to '11.2.0' or higher. You can disable deferred segment creation by setting the initialization parameter DEFERRED_SEGMENT_CREATION to FALSE. The new clauses SEGMENT CREATION DEFERRED and SEGMENT CREATION IMMEDIATE are available for the CREATE TABLE statement. These clauses override the setting of the DEFERRED_SEGMENT_CREATION initialization parameter.
    Note that when you create a table with deferred segment creation (the default), the new table appears in the *_TABLES views, but no entry for it appears in the *_SEGMENTS views until you insert the first row. There is a new SEGMENT_CREATED column in *_TABLES, *_INDEXES, and *_LOBS that can be used to verify deferred segment creation.
    Note:
    The original Export utility does not export any table that was created with deferred segment creation and has not had a segment created for it. The most common way for a segment to be created is to store a row into the table, though other operations such as ALTER TABLE ALLOCATE EXTENTS will also create a segment. If a segment does exist for the table and the table is exported, the SEGMENT CREATION DEFERRED clause will not be included in the CREATE TABLE statement that is executed by the original Import utility.

  • Select on table with 1800 rows is slow

    I have a table with 1800 rows. Each entry has a geometry position and a geometry polygon around the position. I am using the polygon to detect which (other) entries are near the current entry.
    In the following test data and the subsequent query, I am filtering on 625 (of 1865) rows and then using the STContains method to find other rows (the test data is fully matched by this query; in the live database the values are not as regular as in the test data).
    The query takes 6500 ms. In the live database, only 800 records are in the table so far, and it takes 2200 ms.
    select SlowQueryTable.id
    from SlowQueryTable
    inner join dbo.SlowQueryTable as SlowQueryTableSeen
    on SlowQueryTable.[box].STContains(SlowQueryTableSeen.position) = 1
    where SlowQueryTable.userId = 2
    (The query in the live system is even more complex, but this is the main part of it, and even simplified as it is, it just takes too long.)
    This script generates test data and runs the query:
    -- The number table is just needed to generate test data
    CREATE TABLE [dbo].[numbers](
    [number] [int] NOT NULL
    )
    go
    declare @t table (number int)
    insert into @t select 0 union all select 1 union all select 2 union all select 3 union all select 4 union all select 5 union all select 6 union all select 7 union all select 8 union all select 9
    insert into numbers
    select * from (
    select
    t1.number + t2.number*10 + t3.number*100 + t4.number*1000 as x
    from
    @t as t1,
    @t as t2,
    @t as t3,
    @t as t4
    ) as t1
    order by x
    go
    -- this is the table which has the slow query. The Columns [userId], [position] and [box] are the relevant ones
    CREATE TABLE [dbo].SlowQueryTable(
    [id] [int] IDENTITY(1,1) NOT NULL,
    [userId] [int] NOT NULL,
    [position] [geometry] NOT NULL,
    [box] [geometry] NULL,
    constraint SlowQueryTable_primary primary key clustered (id)
    )
    create nonclustered index SlowQueryTable_UserIdKey on [dbo].SlowQueryTable(userId);
    --insert testdata: three users with each 625 entries. Each entry per user has its unique position, and a rectangle (box) around it.
    -- In the database in question, the positions are a bit more random, often tens of entries have the same position. The slow query is nevertheless visible with these testdata
    declare @range int;
    set @range = 5;
    INSERT INTO [dbo].SlowQueryTable (userId,position,box)
    select
    users.number,
    geometry::STGeomFromText('POINT (' + convert(varchar(15), X) + ' ' + convert(varchar(15), Y) + ')',0),
    geometry::STPolyFromText('POLYGON ((' + convert(varchar(15), X - @range) + ' ' + convert(varchar(15), Y - @range) + ', '
    + convert(varchar(15), X + @range) + ' ' + convert(varchar(15), Y - @range) + ', '
    + convert(varchar(15), X + @range) + ' ' + convert(varchar(15), Y + @range) + ', '
    + convert(varchar(15), X - @range) + ' ' + convert(varchar(15), Y + @range) + ','
    + convert(varchar(15), X - @range) + ' ' + convert(varchar(15), Y - @range) + '))', 0)
    from (
    select
    (numberX.number * 40) + 4520 as X
    ,(numberY.number * 40) + 4520 as Y
    from numbers as numberX
    cross apply numbers as numberY
    where numberX.number < (1000 / 40)
    and numberY.number < (1000 / 40)) as positions
    cross apply numbers as users
    where users.number < 3
    CREATE SPATIAL INDEX [SlowQueryTable_position]
    ON [dbo].SlowQueryTable([position])
    USING GEOMETRY_GRID
    WITH (
    BOUNDING_BOX = ( 4500, 4500, 5500, 5500 ),
    GRIDS =(LEVEL_1 = HIGH,LEVEL_2 = HIGH,LEVEL_3 = HIGH,LEVEL_4 = HIGH),
    CELLS_PER_OBJECT = 64, PAD_INDEX = OFF, SORT_IN_TEMPDB = OFF, DROP_EXISTING = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    go
    ALTER INDEX [SlowQueryTable_position] ON [dbo].SlowQueryTable
    REBUILD;
    go
    CREATE SPATIAL INDEX [SlowQueryTable_box]
    ON [dbo].SlowQueryTable(box)
    USING GEOMETRY_GRID
    WITH ( BOUNDING_BOX = ( 4500, 4500, 5500, 5500 ) ,
    GRIDS =(LEVEL_1 = HIGH,LEVEL_2 = HIGH,LEVEL_3 = HIGH,LEVEL_4 = HIGH),
    CELLS_PER_OBJECT = 64, PAD_INDEX = OFF, SORT_IN_TEMPDB = OFF, DROP_EXISTING = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    go
    ALTER INDEX [SlowQueryTable_box] ON [dbo].SlowQueryTable
    REBUILD;
    go
    SET STATISTICS IO ON
    SET STATISTICS TIME ON
    -- this is finally the query. it takes about 6500 ms
    select SlowQueryTable.id
    into #t1
    from SlowQueryTable
    inner join dbo.SlowQueryTable as SlowQueryTableSeen
    on SlowQueryTable.[box].STContains(SlowQueryTableSeen.position) = 1
    --on SlowQueryTable.position.STDistance(SlowQueryTableSeen.position) < 5
    where SlowQueryTable.userId = 2
    drop table #t1
    drop table SlowQueryTable
    drop table numbers
    Using an explicit index hint does do the job, but then the query gets slow if I change the where clause:
    select SlowQueryTable.id
    into #t1
    from SlowQueryTable
    with (index([SlowQueryTable_box]))
    inner join dbo.SlowQueryTable as SlowQueryTableSeen
    on SlowQueryTable.[box].STContains(SlowQueryTableSeen.position) = 1
    where SlowQueryTable.userId = 2
    leads to 600ms, and changing the where clause
    where SlowQueryTable.id = 100
    slows it down again to 1200 ms. Filtering on id gets massively slowed down when using an index hint on the spatial index.
    Since the table in the live system will grow to 10000+ rows, and the query is called often by users, I badly need a more efficient query.
    Do I have to create different queries for each use case, some with index hints and some without?

    I've run your example and can confirm your results. There's a couple of things that I noticed though.
    After looking at query plans, it's not a matter of "with spatial index" vs. "without spatial index". You have two spatial indexes, one on each column (position and box). When you don't hint the "box" spatial index, the query
    uses the "position" spatial index. Because of what they are indexing (points vs. polygons), the "box" spatial index requires a lot more IO. With some (non-spatial) predicates, the "box" spatial index gives better performance,
    with others the "position" one does. I've yet to figure out exactly why (short on time, I might get back to it in future), but you can examine query plans and use the spatial index diagnostic procs (e.g. sp_help_spatial_geometry_index_xml ) in
    addition to the diagnostics you're running to see why and if you can find a better performing plan/index.
    Bear this in mind. Given a choice of multiple spatial indexes, the SQL Server query optimizer is not able to choose (for the most part, IO etc. aside), which one is best. Also, there is usually only one choice of spatial query plan shape, in general. If
    your query is more complex than the one in your example, you might benefit by breaking it in two: one query to filter out all the rows and predicates that don't use a spatial index and one query that uses the spatial index on the subset. I've had good
    luck with this in other situations with complex queries involving spatial predicates. This method may not be applicable to a spatial query as simple as the one in your example, however.
    Hope this helps, Bob 

  • How to create table with 1 row 1MB in size?

    Hello,
    I am doing some R&D and want to create a Table with 1 row, which is 1 MB in size.
    i.e. I want to create a row which is 1 MB in size.
    I am using a 11g DB.
    I do this in SQL*Plus:
    (1.) CREATE TABLE onembrow  (pk NUMBER PRIMARY KEY, onembcolumn CLOB);
    (2.) Since 1MB is 1024*1024 bytes (i.e. 1048576 bytes) and since in English 1 letter = 1 byte, I do this
    SQL> INSERT INTO onembrow VALUES (1, RPAD('A', 1048576, 'B'));
    1 row created.
    (3.) Now, after committing, I do an analyze table.
    SQL> ANALYZE TABLE onembrow COMPUTE STATISTICS;
    Table analyzed.
    (4.) Now, I check the actual size of the table using this query.
    select segment_name,segment_type,bytes/1024/1024 MB
    from user_segments where segment_type='TABLE' and segment_name='ONEMBROW';
    SEGMENT_NAME    SEGMENT_TYPE    MB
    ONEMBROW        TABLE           .0625
    Why is the size only .0625 MB, when it should be 1 MB?
    Here is the DB Block related parameters:
    SELECT * FROM v$parameter WHERE upper(name) LIKE '%BLOCK%';
      NUM NAME                                                                                   TYPE VALUE 
      478 db_block_buffers                                                                          3 0     
      482 db_block_checksum                                                                         2 TYPICAL
      484 db_block_size                                                                             3 8192  
      682 db_file_multiblock_read_count                                                             3 128   
      942 db_block_checking                                                                         2 FALSE 
    What am I doing wrong here???

    When testing, it is necessary to do something that is a reasonably realistic model of a problem you might anticipate appearing in a production system - a row of 1MB doesn't seem likely to be a useful source of information for "R&D on performance tuning".
    What's wrong with creating millions of rows?
    Here's a cut and paste from a windows system running 11.2.0.3
    SQL> set timing on
    SQL>
    SQL> drop table t1 purge;
    Table dropped.
    Elapsed: 00:00:00.04
    SQL>
    SQL> create table t1
      2  nologging
      3  as
      4  with generator as (
      5     select
      6             rownum id
      7     from dual
      8     connect by
      9             level <= 50
    10  ),
    11  ao as (
    12     select
    13             *
    14     from
    15             all_objects
    16     where   rownum <= 50000
    17  )
    18  select
    19     rownum          id,
    20     ao.*
    21  from
    22     generator       v1,
    23     ao
    24  ;
    Table created.
    Elapsed: 00:00:07.09
    7 seconds to generate 2.5M rows doesn't seem like a problem.  For a modelling example I have one script that generates 6.5M (carefully engineered) rows, with a couple of indexes and a foreign key or two, then collects stats (no histograms) in 3.5 minutes.
    Regards
    Jonathan Lewis
    http://jonathanlewis.wordpress.com
    Now on Twitter: @jloracle

  • Master-Detail Multi-Row Insert

    I'm still using Oracle Forms 6i. How do we build in ADF the kind of master-detail transaction we have in Forms, especially inserting multiple rows in the detail? In Forms we can use a KEY-NEXT-ITEM trigger with NEXT_RECORD to start a new record for each drug, or we can use bar code scanning to insert new records and automatically move on to the next record, waiting for the next input...
    Please help me, I want to upgrade (and sorry for my English)...
    I just want ADF to behave like Forms in terms of master-detail transaction entry...
    I would appreciate it if someone could help me or give me a demo file to download and play with...
    Edited by: user8983555 on Nov 10, 2010 10:30 AM

    Thanks for the fast reply.
    I'm new to JDeveloper and have no knowledge of Java or HTML; I'm concentrating on PL/SQL and Forms 6i. We are still using it, character-based in a Unix environment, with some modules in GUI mode. I work in a hospital with a complete information system (stock, pharmacy, etc., GL...) that handles bulk transactions, especially outpatient clinic and inpatient charges. We now also have a reservation system for the outpatient clinics; that is why I would like to develop for the web, so a patient can reserve online, and I don't like running our application on two different front ends (Forms 6i and JDeveloper).
    (I am currently checking out Forms 10g and have just set up AS 10g, which is working; I can deploy and connect over the LAN.) But I became very interested in ADF when I saw the demo on the Oracle website.
    Our company is now planning to change our application to PowerBuilder, which, judging from the application vendors' demonstrations, is not good performance-wise when handling big databases (slow query, needs burst AS...).
    I would be very glad if you can help. Can you post a link or a demo file showing how to do what you said? ("You can replicate the code in the button in some other event on your page, for example when the value of the last field in the row is changed. It all comes down to the question of when you actually want to create a new row, and in that event you call the createInsert method.")
    This is my only problem now: making this master-detail multi-row insert behave like Oracle Forms.
    Again, sorry for my English...
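    The quoted advice about calling createInsert from another event can be sketched as follows; this is only an illustration under assumed names (the bean class, the field's valueChangeListener wiring, and the 'CreateInsert' operation binding are hypothetical, not code from the thread). The idea is that a listener on the last field of the detail row invokes the page's CreateInsert operation, so a fresh blank row appears as soon as the current one is completed.
    import javax.faces.event.ValueChangeEvent;
    import oracle.adf.model.BindingContext;
    import oracle.binding.BindingContainer;
    import oracle.binding.OperationBinding;
    public class OrderEntryBean {
        // Registered as the valueChangeListener of the last input field in the detail row.
        public void lastFieldChanged(ValueChangeEvent event) {
            BindingContainer bindings = BindingContext.getCurrent().getCurrentBindingsEntry();
            OperationBinding createInsert = bindings.getOperationBinding("CreateInsert");
            createInsert.execute();   // creates the next blank detail row
        }
    }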

  • Sqlite multi-row insert

    In PHP & MySQL I am used to being able to do multi-row inserts into a table using:
    INSERT INTO table (column1,column2) VALUES (val1,val2),(val3,val4)
    Is there an SQLite equivalent? I need to populate around 100 rows on the first run of a mobile application.
    Thanks,
    Pete

    If what _spoboyle recommends doesn't work in AIR, and if you haven't already, look into SQLConnection.begin() and commit(). At least you can group your SQL statements into one transaction that way.

  • Why should avoid OLTP compression on tables with massive update/insert?

    Dear expert,
    We are planning Oracle OLTP compression on an IS-U system; could you tell me:
    Why should one avoid OLTP compression on tables with massive update/insert activity?
    What kind of performance impact is there in the worst case?
    Best regards,
    Kate

    Hi
    When updating compressed data Oracle has to read it, uncompress it and update it.
    The compression is then performed again later asynchronously. This does require a lot more CPU than for a simple update.
    Another drawback is that compression on highly modified tables will generate a major increase in redo/undo generation. I've experienced it on a DB where RFC tables were compressed by mistake; the redo increase was over 15%.
    Check the remark at the end of Jonathan Lewis' post:
    http://allthingsoracle.com/compression-in-oracle-part-3-oltp-compression/
    Regards
    Possibly this is all part of the trade-off that helps to explain why Oracle doesn't end up compressing the last few rows that get inserted into the block.
    The effect can be investigated fairly easily by inserting about 250 rows into the empty table - we see Oracle inserting 90 rows, then generating a lot of undo and redo as it compresses those 90 rows; then we insert another 40 rows, then generate a lot of undo and redo compressing the 130 rows. Ultimately, by the time the block is full we have processed the first 90 rows into the undo and redo four or five times.

  • Multi Row Inserts

    Is there any way of providing multi-row inserts in a single region?
    I am looking for a way to add several records at once in a table format, much like the multi-row update functionality.
    Is this possible?

    Can you use an OUTPUT clause? See examples
    MERGE INTO dbo.Customers AS TGT
    USING dbo.CustomersStage AS SRC
      ON TGT.custid = SRC.custid
    WHEN MATCHED THEN
      UPDATE SET
        TGT.companyname = SRC.companyname,
        TGT.phone = SRC.phone,
        TGT.address = SRC.address
    WHEN NOT MATCHED THEN 
      INSERT (custid, companyname, phone, address)
      VALUES (SRC.custid, SRC.companyname, SRC.phone, SRC.address)
    WHEN NOT MATCHED BY SOURCE THEN
      DELETE
    OUTPUT 
      $action AS the_action, deleted.custid AS del_custid, inserted.custid AS ins_custid;
    Best Regards, Uri Dimant (SQL Server MVP)
    http://sqlblog.com/blogs/uri_dimant/

  • Master/Detail in create mode with single row insert

    Hi,
    My jheadstart version is 10.1.3.2.52
    I want to insert master and detail together on the same page in create mode.
    I have these settings:
    - Master Layout: Form
    - Detail Layout: Table
    -"Single-Row Insert Allow" (Checked)
    -"Single-Row Update Allow" (Checked)
    -"Single-Row Delete Allow" (Checked)
    -"Multi-Row Insert Allow" (unchecked)
    -"Multi-Row Update Allow" (unchecked)
    -"Multi-Row Delete Allow" (unchecked)
    -"Show Add Row Button" (Checked)
    When I generate and run the application and click the New button, my detail table doesn't have a New button of its own.
    Then I set "New Rows" to 1 for the detail table and generated and ran again, but the problem is still there!
    Can you help me?
    Thanks

    Hi,
    Thanks for your reply.
    But I don't want multi-row insert to be enabled (because of some business rules imposed by my scenario).
    If I enable multi-row insert, even without the "new row button" and with just "new rows = 1", then when the user inserts the first row and clicks the Save button, one more empty row is automatically added to the detail table. I don't want this behavior; I want only one empty row in my detail table in create mode.
    thanks again.

  • To draw a table with multiple rows and multiple columns in a form

    Hi,
    How to draw a table with multiple rows and columns separated by lines in form printing?

    check this
    http://sap-img.com/ts003.htm
    Regards
    Prabhu

  • Filtering a table with 19699 rows

    Hello
    I am new to Xcelsius and I am trying to apply a filter component to a table with 19699 rows. It is supposed to show 45 different classifications in the filter. It seems to be using only the first 500 rows to apply the filter. I changed the maximum number of rows in the preferences. What should I do?

    A component cannot handle 20,000 rows; you should filter them in the database by using queries when getting the data into the dashboard. If there is no database connection available, you can use pivot tables to create lists for the different dimensions and then use lookup formulas like MATCH and INDEX to display the data for your chart.
