Inserting 10 million rows into a table hangs

Hi, through Toad I am using a simple for loop to insert 10 million rows into a table, something like:
for i in 1 .. 10000000 loop
  insert .................
end loop;
It hangs for a long time.
Is there a better way to insert the rows into the table? I have to test for performance, and I also have to insert 50 million rows into its child table.
When the code moves to production it will actually have this many rows (maybe more), which is why I have to test at this volume.
Please suggest a better way to do this.
Regards
raj

Must be a 'hardware thing'.
My ancient desktop (Pentium IV, 1.8 GHz, 512 MB), running XE, needs:
MHO%xe> desc t
Name                                      Null?    Type
N                                                  NUMBER
A                                                  VARCHAR2(10)
B                                                  VARCHAR2(10)
MHO%xe> insert /*+ APPEND */ into t
  2  with my_data as (
  3  select level n, 'abc' a, 'def' b from dual
  4  connect by level <= 10000000
  5  )
  6  select * from my_data;
10000000 rows created.
Elapsed: 00:04:09.71
MHO%xe> drop table t;
Table dropped.
Elapsed: 00:00:31.50
MHO%xe> create table t (n number, a varchar2(10), b varchar2(10));
Table created.
Elapsed: 00:00:01.04
MHO%xe> insert into t
  2  with my_data as (
  3  select level n, 'abc' a, 'def' b from dual
  4  connect by level <= 10000000
  5  )
  6  select * from my_data;
10000000 rows created.
Elapsed: 00:02:44.12
MHO%xe>  drop table t;
Table dropped.
Elapsed: 00:00:09.46
MHO%xe> create table t (n number, a varchar2(10), b varchar2(10));
Table created.
Elapsed: 00:00:00.15
MHO%xe> insert /*+ APPEND */ into t
  2   with my_data as (
  3   select level n, 'abc' a, 'def' b from dual
  4   connect by level <= 10000000
  5   )
  6   select * from my_data;
10000000 rows created.
Elapsed: 00:01:03.89
MHO%xe>  drop table t;
Table dropped.
Elapsed: 00:00:27.17
MHO%xe> create table t (n number, a varchar2(10), b varchar2(10));
Table created.
Elapsed: 00:00:01.15
MHO%xe> insert into t
  2  with my_data as (
  3  select level n, 'abc' a, 'def' b from dual
  4  connect by level <= 10000000
  5  )
  6  select * from my_data;
10000000 rows created.
Elapsed: 00:01:56.10
Yeah, it 'cached' a bit the second time around (of course ;) )
But the APPEND hint still seems to shave about 50 seconds off anyway (using NO indexes at all) on my 'configuration'.
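For the 50 million rows in the child table mentioned in the question, the same set-based pattern extends by cross-joining the generated parent keys with a small row generator. A rough sketch, assuming a hypothetical parent/child pair (table and column names are illustrative only, not from the question):
-- parent_t(id, ...) already holds the 10 million parent rows;
-- child_t(parent_id, seq_no, payload) gets 5 child rows per parent = 50 million rows.
insert /*+ APPEND */ into child_t (parent_id, seq_no, payload)
select p.id, g.n, 'xyz'
from   parent_t p
       cross join (select level n from dual connect by level <= 5) g;
commit;
Loading first and creating the child table's indexes and foreign key afterwards is generally much faster than maintaining them row by row during the insert.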

Similar Messages

  • How to insert a row in the detail table of a treeTable (master/detail)?

    Is there any small, working example with JDev 11.1.2.1 on how to programmatically insert a row in the detail table represented by the treeTable component?
    Please don't send links to this example http://jobinesh.blogspot.com/2010/05/crud-operations-on-tree-table.html , which does not work.
    Thanks.

    Erp, why do you keep giving the link to that example, or the zip containing that example's project? I explicitly asked not to send any links to that example; it does not work with JDev 11.1.2.1.
    This is the exception from that one when trying to create a new record in the detail rows:
    <UIXEditableValue> <_isBeanValidationAvailable> A Bean Validation provider is not present, therefore bean validation is disabled
    :bndVarFirstName =null
    (the line above repeats about 25 times in the log)
    <FormRenderer> <_warnUnpoppedContextChanges> ADF_FACES-60095: While processing the form renderer, the context change found does not match the expected component.
    <UIXCollection> <processSaveState> the row key may not be reset correctly at the end of the request. Component ID: :pc1:tt1, ViewId: /treeSample.jspx
    <UIXComponentBase> <processSaveState> Saving state for the children of component RichPanelCollection[UIXFacesBeanImpl, id=pc1] failed.
    <UIXComponentBase> <processSaveState> Saving state for the children of component RichPanelGroupLayout[UIXFacesBeanImpl, id=pgl1] failed.
    <UIXComponentBase> <processSaveState> Saving state for the children of component RichForm[UIXFacesBeanImpl, id=f1] failed.
    <UIXComponentBase> <processSaveState> Saving state for the children of component RichDocument[UIXFacesBeanImpl, id=d1] failed.
    <LifecycleImpl> <_handleException> ADF_FACES-60098: The Faces lifecycle received unhandled exceptions in phase RENDER_RESPONSE 6
    java.lang.NullPointerException
         at oracle.jbo.uicli.binding.JUCtrlHierNodeBinding.findChildNode(JUCtrlHierNodeBinding.java:867)
         at oracle.jbo.uicli.binding.JUCtrlHierBinding.bringNodeToRangeKeyPath(JUCtrlHierBinding.java:788)
         at oracle.adfinternal.view.faces.model.binding.FacesCtrlHierBinding.bringNodeToRangeKeyPath(FacesCtrlHierBinding.java:111)
         at oracle.adfinternal.view.faces.model.binding.RowDataManager.setRowKey(RowDataManager.java:130)
         at oracle.adfinternal.view.faces.model.binding.FacesCtrlHierBinding$FacesModel.setRowKey(FacesCtrlHierBinding.java:830)
         at org.apache.myfaces.trinidad.component.UIXCollection.setRowKey(UIXCollection.java:513)
         at org.apache.myfaces.trinidad.component.UIXCollection.processSaveState(UIXCollection.java:270)
         at org.apache.myfaces.trinidad.component.TreeState.saveState(TreeState.java:175)
         at org.apache.myfaces.trinidad.component.UIXComponentBase.processSaveState(UIXComponentBase.java:1043)
         at org.apache.myfaces.trinidad.component.TreeState.saveState(TreeState.java:175)
         at org.apache.myfaces.trinidad.component.UIXComponentBase.processSaveState(UIXComponentBase.java:1043)
         at org.apache.myfaces.trinidad.component.TreeState.saveState(TreeState.java:175)
         at org.apache.myfaces.trinidad.component.UIXComponentBase.processSaveState(UIXComponentBase.java:1043)
         at org.apache.myfaces.trinidad.component.TreeState.saveState(TreeState.java:175)
         at org.apache.myfaces.trinidad.component.UIXComponentBase.processSaveState(UIXComponentBase.java:1043)
         at javax.faces.component.UIComponentBase.processSaveState(UIComponentBase.java:1156)
         at org.apache.myfaces.trinidadinternal.application.StateManagerImpl.saveView(StateManagerImpl.java:193)
         at org.apache.myfaces.trinidadinternal.application.StateManagerImpl.getViewState(StateManagerImpl.java:134)
         at oracle.adfinternal.view.faces.renderkit.rich.PprResponseWriter._writeViewState(PprResponseWriter.java:514)
         at oracle.adfinternal.view.faces.renderkit.rich.PprResponseWriter.endDocument(PprResponseWriter.java:83)
         at oracle.adfinternal.view.faces.renderkit.rich.DocumentRenderer.encodeAll(DocumentRenderer.java:1490)
         at oracle.adf.view.rich.render.RichRenderer.encodeAll(RichRenderer.java:1452)
         at org.apache.myfaces.trinidad.render.CoreRenderer.encodeEnd(CoreRenderer.java:511)
         at org.apache.myfaces.trinidad.component.UIXComponentBase.encodeEnd(UIXComponentBase.java:923)
         at javax.faces.component.UIComponent.encodeAll(UIComponent.java:1659)
         at oracle.adfinternal.view.faces.context.PartialViewContextImpl._processRender(PartialViewContextImpl.java:321)
         at oracle.adfinternal.view.faces.context.PartialViewContextImpl.processPartial(PartialViewContextImpl.java:152)
         at javax.faces.component.UIViewRoot.encodeChildren(UIViewRoot.java:974)
         at javax.faces.component.UIComponent.encodeAll(UIComponent.java:1652)
         at oracle.adfinternal.view.faces.component.AdfViewRoot.encodeAll(AdfViewRoot.java:91)
         at com.sun.faces.application.view.JspViewHandlingStrategy.doRenderView(JspViewHandlingStrategy.java:431)
         at com.sun.faces.application.view.JspViewHandlingStrategy.renderView(JspViewHandlingStrategy.java:233)
         at org.apache.myfaces.trinidadinternal.application.ViewDeclarationLanguageFactoryImpl$ChangeApplyingVDLWrapper.renderView(ViewDeclarationLanguageFactoryImpl.java:350)
         at com.sun.faces.application.view.MultiViewHandler.renderView(MultiViewHandler.java:131)
         at javax.faces.application.ViewHandlerWrapper.renderView(ViewHandlerWrapper.java:273)
         at org.apache.myfaces.trinidadinternal.application.ViewHandlerImpl.renderView(ViewHandlerImpl.java:165)
         at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._renderResponse(LifecycleImpl.java:1027)
         at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._executePhase(LifecycleImpl.java:334)
         at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:232)
         at javax.faces.webapp.FacesServlet.service(FacesServlet.java:313)
         at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
         at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
         at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.adf.model.servlet.ADFBindingFilter.doFilter(ADFBindingFilter.java:173)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.adfinternal.view.faces.webapp.rich.RegistrationFilter.doFilter(RegistrationFilter.java:122)
         at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:468)
         at oracle.adfinternal.view.faces.activedata.AdsFilter.doFilter(AdsFilter.java:60)
         at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:468)
         at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl._doFilterImpl(TrinidadFilterImpl.java:293)
         at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl.doFilter(TrinidadFilterImpl.java:199)
         at org.apache.myfaces.trinidad.webapp.TrinidadFilter.doFilter(TrinidadFilter.java:92)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:111)
         at java.security.AccessController.doPrivileged(Native Method)
         at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:313)
         at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:413)
         at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:94)
         at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:161)
         at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:136)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3715)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
         at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
         at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
         at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
    <RegistrationConfigurator> <handleError> ADF_FACES-60096: Server exception during PPR, #1
    java.lang.NullPointerException
         (stack trace identical to the one above)
    while I get this from my own project based on the above one:
    <UIXEditableValue> <_isBeanValidationAvailable> A Bean Validation provider is not present, therefore bean validation is disabled
    <UIXRegion> <_logIllegalContextChangeMessage> ADF_FACES-10026: While processing the region component, no context change was found, or the change found does not match the instance set by the current component. Expected oracle.adf.view.rich.component.fragment.UIXRegion$RegionContextChange, found UIXCollection.CollectionComponentChange[Component class: oracle.adf.view.rich.component.rich.data.RichTreeTable, component ID: tt1].
    <UIXRegion$RegionSiteImpl> <validate> Attempt to validate an already invalid RegionSite:
    <UIXCollection> <processSaveState> the row key may not be reset correctly at the end of the request. Component ID: :pt1:r1:pt1:pc1:tt1, ViewId: /main.jspx
    Edited by: user10047839 on 20-Oct-2011 7.23

  • Insert multiple rows into a same table from a single record

    Hi All,
    I need to insert multiple rows into a same table from a single record. Here is what I am trying to do and I need your expertise. I am using Oracle 11g
    DataFile
    1,"1001,2001,3001,4001"
    2,"1002,2002,3002,4002"
    The data needs to be loaded as
    Field1      Field2
    1               1001
    1               2001
    1               3001
    1               4001
    2               1002
    2               2002
    2               3002
    2               4002
    Thanks

    You could use SQL*Loader to load the data into a staging table with a varray column, then use a SQL insert statement to distribute it to the destination table, as demonstrated below.
    SCOTT@orcl> host type test.dat
    1,"1001,2001,3001,4001"
    2,"1002,2002,3002,4002"
    SCOTT@orcl> host type test.ctl
    load data
    infile test.dat
    into table staging
    fields terminated by ','
    ( field1
    , numbers varray enclosed by '"' and '"' (x))
    SCOTT@orcl> create table staging
      2    (field1  number,
      3     numbers sys.odcinumberlist)
      4  /
    Table created.
    SCOTT@orcl> host sqlldr scott/tiger control=test.ctl log=test.log
    SQL*Loader: Release 11.2.0.1.0 - Production on Wed Dec 18 21:48:09 2013
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Commit point reached - logical record count 2
    SCOTT@orcl> column numbers format a60
    SCOTT@orcl> select * from staging
      2  /
        FIELD1 NUMBERS
             1 ODCINUMBERLIST(1001, 2001, 3001, 4001)
             2 ODCINUMBERLIST(1002, 2002, 3002, 4002)
    2 rows selected.
    SCOTT@orcl> create table destination
      2    (field1  number,
      3     field2  number)
      4  /
    Table created.
    SCOTT@orcl> insert into destination
      2  select s.field1, t.column_value
      3  from   staging s, table (s.numbers) t
      4  /
    8 rows created.
    SCOTT@orcl> select * from destination
      2  /
        FIELD1     FIELD2
             1       1001
             1       2001
             1       3001
             1       4001
             2       1002
             2       2002
             2       3002
             2       4002
    8 rows selected.

  • Best way to insert millions of records into the table

    Hi,
    From a performance point of view, I am looking for suggestions on the best way to insert millions of records into a table.
    Please also guide me on how to implement it in a straightforward way that gives good performance.
    Thanks,
    Orahar.

    Orahar wrote:
    It's distributed data. N number of clients fetch transaction data from the database based on different conditions and insert it into another transaction table, which is like a batch process.
    Sounds contradictory.
    If the source data is already in the database, it is centralised.
    In that case you ideally do not want the overhead of shipping that data to a client, the client processing it, and the client shipping the results back to the database to be stored (inserted).
    It is much faster and more scalable for the client to instruct the database (via a stored proc or package) what to do, and for that code (running on the database) to process the data.
    For a stored proc, the same principle applies. It is faster for it to instruct the SQL engine what to do (via an INSERT..SELECT statement) than to pull the data from the SQL engine using a cursor fetch loop and then push that data back to the SQL engine using an insert statement.
    An INSERT..SELECT can also be done as a direct path insert. This introduces some limitations, but is faster than a normal insert.
    If the data processing is too complex for an INSERT..SELECT, then pulling the data into PL/SQL, processing it there, and pushing it back into the database is the next best option. This should be done using bulk processing though in order to optimise the data transfer process between the PL/SQL and SQL engines.
    Other performance considerations are the constraints on the insert table, the triggers, the indexes and so on. Make sure that data integrity is guaranteed (e.g. via PKs and FKs) and optimal (e.g. FK columns should be indexed). Using triggers may not be the best approach (for example, using a trigger to assign a sequence value when it can be done faster in the insert SQL itself). Personally, I avoid triggers - I would rather have that code residing in a PL/SQL API for manipulating data in that table.
    The type of table also plays a role. Make sure that the decision about the table structure, hashed, indexed, partitioned, etc, is the optimal one for the data structure that is to reside in that table.
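    As a rough sketch of the "let the database do the work" point above: the client only calls a procedure, and the rows never leave the database (procedure, table and column names here are hypothetical, not from the original post):
    create or replace procedure archive_new_rows as
    begin
      -- set-based, direct-path insert; the APPEND hint brings the
      -- direct-path restrictions mentioned above
      insert /*+ APPEND */ into target_tab (id, val, created_dt)
      select id, val, created_dt
      from   source_tab
      where  status = 'NEW';
      commit;
    end archive_new_rows;
    /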

  • Tuning an insert sql that inserts a million rows doing a full table scan

    Hi Experts,
    I am on Oracle 11.2.0.3 on Linux. I have a SQL statement that inserts data into a history/archive table from a main application table based on date. The application table has 3 million rows in it, and all rows older than 6 months should go into the history/archive table. This was recently decided, and we have 1 million rows that satisfy this criterion. The insert into the archive table is taking about 3 minutes. The explain plan shows a full table scan on the main table, which is the right thing since we are pulling 1 million rows from the main table into the history table.
    My question is that, is there a way I can make this sql go faster?
    Here is the query plan (I changed the table names etc.)
    INSERT INTO EMP_ARCH
    SELECT *
    FROM   EMP M
    WHERE  HIRE_DATE < (SYSDATE - :v_num_days);
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        2      0.00       0.00          0          0          0           0
    Execute      2     96.22     165.59      92266     147180    8529323     1441230
    Fetch        0      0.00       0.00          0          0          0           0
    total        4     96.22     165.59      92266     147180    8529323     1441230
    Misses in library cache during parse: 1
    Misses in library cache during execute: 1
    Optimizer mode: FIRST_ROWS
    Parsing user id: 166
    Rows     Row Source Operation
    1441401   TABLE ACCESS FULL EMP (cr=52900 pr=52885 pw=0 time=21189581 us)
    I heard that there is a way to use the opt_param hint to increase the multiblock read count, but it didn't seem to work for me. I will be thankful for suggestions on this. Also, can collections, and changing this to PL/SQL, make it faster?
    Thanks,
    OrauserN

    Also, I wish experts would share their insight on how to make a full table scan go faster (apart from the parallel suggestion, I mean).
    Please make up your mind about what question you actually want help with.
    First you said you want help making the INSERT query go faster but the rest of your replies, including the above statement, imply that you are obsessed with making full table scans go faster.
    You also said:
    our management would like us to come forth with the best way to do it
    But when people make suggestions you make replies about you and your abilities:
    I do not have the liberty to do the "alter session enable parallel dml". I have to work within these constraints
    Does 'management' want the best way to do whichever question you are asking?
    Or is it just YOU that wants the best way (whatever you mean by best) based on some unknown set of constraints?
    As SB already said, you clearly do NOT have an actual problem since you have already completed the task of inserting the data, several times in fact. So the time it takes to do it is irrelevant.
    There is no universal agreement on what the word 'best' means for any given use case and you haven't given us your definition either. So how would we know what might be 'best'?
    So until you provide the COMPLETE list of constraints you are just wasting our time asking for suggestions that you refute with a comment about some 'constraint' you have.
    You also haven't provided ANY information that indicates that it is the full table scan that is the root of the problem. It is far more likely to be the INSERT into the table and a simple use of NOLOGGING with an APPEND hint might be all that is needed.
    IMHO the 'best' way would be to partition both the source and target tables and just use EXCHANGE PARTITION to move the data. That operation would only take a millisecond or two.
    But, let me guess, that would not conform to one of your 'unknown' constraints would it?
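    For what it's worth, the partition-exchange idea above would look roughly like this, assuming EMP is range-partitioned by HIRE_DATE and the rows to archive all sit in one partition (partition and staging-table names are hypothetical):
    -- EMP_ARCH_STAGE must match the partition's column structure exactly;
    -- the exchange is a dictionary-only operation, so no rows are copied.
    ALTER TABLE EMP
      EXCHANGE PARTITION p_old_rows
      WITH TABLE emp_arch_stage
      INCLUDING INDEXES
      WITHOUT VALIDATION;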

  • Problem with inserting a row in the current table

    I am working on an HTML editor using Swing.
    I am trying to insert a row using:
    private void insertTableRow(int cols)
    {
              StringBuffer sRow = new StringBuffer();
              sRow.append("<TR>");
              for (int i = 0; i < cols; i++)
                   sRow.append("<TD></TD>");
              sRow.append("</TR>");
              //System.out.println(sRow);
              int caretPos = jtpMain.getCaretPosition();
              System.out.println(caretPos);
              ActionEvent actionEvent = new ActionEvent(jtpMain, 0, "insertTableRow");
              //System.out.println("raju" + sRow.toString());
              new HTMLEditorKit.InsertHTMLTextAction("insertTableRow", sRow.toString(), HTML.Tag.TABLE, HTML.Tag.TR).actionPerformed(actionEvent);
              refreshOnUpdate(); // optional refresh code
         }
    This performs well when the cursor is at the last cell of the row, but it is not working when the cursor is in the other cells; it produces an incorrect table.
    Please, can anyone help?

    You probably have to first search your text for a <TR> tag to insert rows above (place your new row text in front of the tag), or search for a </TR> tag to insert rows below (place the new row text after the tag). Otherwise you will be inserting the row in the middle of a cell.
    Jerome.

  • Insert multiple rows from cursor to table

    Hi,
    I want to insert multiple rows retrieved from a cursor into a table, without looping through the cursor to the end.
    I have the following structure; please guide me.
    Create or replace procedure procedure_name as
      cursor mycursor is
        select * from table1; -- returns multiple rows
      t_data table1%rowtype;
    begin
      open mycursor;
      loop
        fetch mycursor into t_data;
        exit when mycursor%notfound;
        insert into table2 values t_data;
      end loop;
      close mycursor;
    end procedure_name;
    ===========================
    Now my cursor contains multiple records. I can iterate over the records in a loop and insert the data into another table, e.g. table2, with the same structure.
    But I want to insert all of the data at once, without looping through the cursor for each row. Please suggest how to do this.
    Thanks in advance.

    You should be able to do it in a single statement (assuming table1 and table2 have the same columns/datatypes)
    insert into table2 (select * from table1);

  • Inserting millions of records into new table based on condition

    Hi All,
    We have a range-partitioned table that contains 950,000,000 records (dating back to 2004), which is in turn list-subpartitioned on status. Possible values of status are 0, 1, 2, 3 and 4.
    The requirement is to get all the rows with status 1 and date less than 24-Aug-2011 (Oracle 11g R2).
    I am trying the code below:
    CREATE TABLE RECONCILIATION_TAB PARALLEL 3 NOLOGGING
    AS SELECT /*+ INDEX(CARDS_TAB STATUS_IDX) */ ID, STATUS, DATE_D
    FROM CARDS_TAB
    WHERE DATE_D < TO_DATE('24-AUG-2011','DD-MON-YYYY')
    AND STATUS = 1;
    CARDS_TAB has two global indexes, one on status and another on date_d.
    The above query has been running for the last 28 hours! Is this the right approach?
    With Regards,
    Farooq Abdulla

    You said the table was range partitioned but you didn't say by what. I'm guessing the table is range partitioned by DATE_D. Is that a valid assumption?
    You said that the table was subpartitioned by status. If the table is subpartitioned by status, what do you mean that the data is randomly distributed? Surely it's confined to particular subpartitions, right?
    What is the query plan without the hint?
    What is the query plan with the hint?
    Why do you believe that adding the hint will be beneficial?
    Justin
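    For reference, the hint-free variant being asked about is simply the same CTAS without the index hint, letting the optimizer prune to the STATUS=1 subpartitions and full-scan them in parallel; whether that is actually faster depends on the answers to the questions above:
    CREATE TABLE RECONCILIATION_TAB PARALLEL 3 NOLOGGING AS
    SELECT ID, STATUS, DATE_D
    FROM   CARDS_TAB
    WHERE  DATE_D < TO_DATE('24-AUG-2011','DD-MON-YYYY')
    AND    STATUS = 1;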

  • How to insert multiple rows into a database table with high performance

    Hello everybody,
    I am using the Struts, JSP and Spring frameworks. In my application there are hundreds of rows I have to insert into the database one by one. I am using UserTransaction; everything else is working right, but I am not getting the performance I need.
    Can anyone tell me the proper method to insert multiple records, and quickly?

    I don't know much about Spring etc., but the JDBC Statement.addBatch() and Statement.executeBatch() methods let you bundle a whole lot of SQL commands into one lump to execute.
    Might help a bit...

  • Inserting Multiple Rows into Database Table using JDBC Adapter - Efficiency

    I need to insert multiple rows into a database table using the JDBC adapter (receiver).
    I understand the traditional way of repeating the statement multiple times, each having its <access> element. However, I am just wondering whether this might be performance-inefficient, as it might insert records one by one.
    Is there a way to ensure that the records are inserted into the table as a block, rather than record-by-record?

    Hi Bhavesh/Kanwaljit,
    If we have multiple ACCESS tags then what happens is that the connection to the database is made only once. But the data is inserted row by row.
    Why am I saying this?
    If we set logSQLStatement = true in the JDBC adapter, then in the case of multiple inserts we can see that there are multiple statements:
    Insert into tablename(EMP_NAME,EMP_ID) VALUES('J','1000')
    Insert into tablename(EMP_NAME,EMP_ID) VALUES('J','2000')
    Doesn't this mean that rows are inserted one by one?
    Correct me if I am wrong.
    This does not mean that the transaction is not guaranteed. Either all the rows will be inserted or all rolled back.
    Regards,
    Sumit

  • ADF Editable Table: CANNOT Insert Multiple Rows, Please Help

    Hi All,
    Our customer's requirement is to be able to insert multiple rows in an editable table and submit ALL of them with just one click.
    Is it really possible?
    I have tried: I can insert three empty rows and fill them all with values, but when I press Submit ONLY one row is submitted and the other TWO become BLANK again. Is my code wrong?
    Or is this a limitation of the ADF editable table?
    Here is my code. I have tried two ways, both with the same problem:
    (1)
    public String create_action() {
        BindingContainer bindings = getBindings();
        OperationBinding operationBinding =
            bindings.getOperationBinding("Create");
        Object result = operationBinding.execute();
        if (!operationBinding.getErrors().isEmpty()) {
            return null;
        }
        return null;
    }
    public String myCreate_action() {
        create_action();
        create_action();
        create_action();
        return null;
    }
    (2)
    public String createMultiRows() {
        DCBindingContainer dcbc = (DCBindingContainer) getBindings();
        // get iterator binding
        DCIteratorBinding ib = (DCIteratorBinding) dcbc.get("DeptView1Iterator");
        // get view object
        ViewObject vo = ib.getViewObject();
        System.out.println(vo.getCurrentRowIndex());
        vo.clearCache();
        int currentRowIndex = vo.getRowCount() - 1;
        Row row1 = vo.createRow();
        row1.setNewRowState(Row.STATUS_NEW);
        vo.insertRowAtRangeIndex(++currentRowIndex, row1);
        Row row2 = vo.createRow();
        row2.setNewRowState(Row.STATUS_NEW);
        vo.insertRow(row2);
        Row row3 = vo.createRow();
        row3.setNewRowState(Row.STATUS_NEW);
        vo.insertRow(row3);
        return null;
    }
    Please help... I am stuck ...
    Thank you very much,
    xtanto

    Xtanto,
    Could you try changing
    bindings.getOperationBinding("Create");
    to
    bindings.getOperationBinding("CreateInsert");
    in your first example to see if it fixes your problem?
    Regards,
    John

  • Inserting multiple rows into a table (using sequences)

    Hi!
    I want to insert multiple rows into a db table, but I don't know how.
    I know how to insert 1 row using a sequence:
    I put all the fields in the jsp form's page and in the submit page I put something like this:
    <jbo:Row id="myRow" datasource="ds" action="Update" rowkeyparam="MyRowKey" >
    <jbo:SetAttribute dataitem="*" />
    </jbo:Row>
    But how can I insert multiple rows like this:
    Id          Name
    1          ana
    2          monteiro
    3          maria
    Thanks!

    Hi Chris,
    I think this should give you what you need: Working with a Multiple Select List Item
    --Jennifer

  • Want to insert multiple rows in table using EO

    Hi,
    I have a requirement where I need to insert multiple rows at once into a table, let's say Previous Employers.
    What I am trying to do: I have created a few text input boxes and I am getting their values and putting them in a HashMap,
    then manually inserting the rows into the EO. I am not getting any error, but the data is not populating in the table.
    Here is the code snippet; please suggest!
    public void updateKoelHrPreEmpVO(HashMap map)
    {
        OADBTransaction txn = getOADBTransaction();
        //HashMap map = new HashMap();
        KoelHrPreEmpVOImpl empVO = getKoelHrPreEmpVO1();
        KoelHrPreEmpVORowImpl fetchedRow = null;
        int fetchedRowCount = empVO.getFetchedRowCount();
        RowSetIterator deleteIter = empVO.createRowSetIterator("deleteIter");
        if (fetchedRowCount > 0)
        {
            if (txn.isLoggingEnabled(2))
                txn.writeDiagnostics(this, "Removing rows from KoelHrPreEmpVO :: Current count :: " + fetchedRowCount, 2);
            //System.out.println("Removing rows from KoelHrPreEmpVO :: Current count :: " + fetchedRowCount);
            deleteIter.setRangeStart(0);
            deleteIter.setRangeSize(fetchedRowCount - 1);
            for (int i = 0; i < fetchedRowCount; i++)
            {
                //System.out.println("Removing Row :: " + i);
                if (txn.isLoggingEnabled(2))
                    txn.writeDiagnostics(this, "Removing row :: " + i, 2);
                fetchedRow = (KoelHrPreEmpVORowImpl) deleteIter.getRowAtRangeIndex(i);
                fetchedRow.remove();
            }
            getTransaction().commit();
            deleteIter.closeRowSetIterator();
        }
        if (empVO.getRowCount() == 0)
        {
            Row row = null;
            KoelHrPreEmpVORowImpl insertRow = null;
            empVO.first();
            if (txn.isLoggingEnabled(2))
                txn.writeDiagnostics(this, "Inserting rows to KoelHrPreEmpVO :: ", 2);
            try
            {
                for (int i = 1; i <= 4; i++)
                {
                    insertRow = (KoelHrPreEmpVORowImpl) empVO.createRow();
                    insertRow.setEmployeeNumber(map.get("EmployeeNumber").toString());
                    insertRow.setPersonId(new Number(map.get("PersonId").toString()));
                    insertRow.setEmployer(map.get("Employer" + i).toString());
                    insertRow.setStartDate(Date.toDate(map.get("StartDate" + i).toString()));
                    insertRow.setEndDate(Date.toDate(map.get("EndDate" + i).toString()));
                    insertRow.setEmployer(map.get("Designation" + i).toString());
                    empVO.insertRow(insertRow);
                    insertRow.setNewRowState(Row.STATUS_INITIALIZED);
                }
                getTransaction().commit();
            }
            catch (SQLException ex)
            {
                ex.printStackTrace();
            }
        }
    }
    Regards,
    Mukesh

    1. Please check that the create() method in the EOImpl is in place and sets the primary key.
    2. Even though we insert values in this fashion, they only get inserted if the user performs some action, like manipulating a column. Please check whether there is any mechanism to make the row dirty (as if done by the user).
    Srikanth

  • JTable insert a row after the table is displayed.

    I'm using Sun ONE Studio, and I define the columns and rows for the table up front. My problem is that I need to have an ADD button and be able to insert a row into this predefined table.
    I know it's pretty basic, but I'm stuck here.
    THANKS A LOT FOR YOUR HELP ..

    This thread should get you started:
    http://forum.java.sun.com/thread.jsp?forum=31&thread=314155

  • Problem with BULK COLLECT with million rows - Oracle 9.0.1.4

    We have a requirement where we are supposed to load 58 million rows into a FACT table in our data warehouse. We initially planned to use Oracle Warehouse Builder but, for performance reasons, decided to write custom code. We wrote a custom procedure which opens a simple cursor, reads all 58 million rows from the SOURCE table and, in a loop, processes the rows and inserts the records into a TARGET table. The logic works fine but it took 20 hours to complete the load.
    We then tried to leverage the BULK COLLECT and FORALL and PARALLEL options and modified our PL/SQL code completely to reflect these. Our code looks very simple.
    1. We declared PL/SQL tables indexed by BINARY_INTEGER to store the data in memory.
    2. We used BULK COLLECT INTO to fetch the data.
    3. We used FORALL statement while inserting the data.
    We did not introduce any of our transformation logic yet.
    We tried with 600,000 records first and it completed in 1 min 29 sec with no problems. We then doubled the number of rows to 1.2 million and the program crashed with the following error:
    ERROR at line 1:
    ORA-04030: out of process memory when trying to allocate 16408 bytes (koh-kghu
    call ,pmucalm coll)
    ORA-06512: at "VVA.BULKLOAD", line 66
    ORA-06512: at line 1
    We got the same error even with 1 million rows.
    We do have the following configuration:
    SGA - 8.2 GB
    PGA
    - Aggregate Target - 3GB
    - Current Allocated - 439444KB (439 MB)
    - Maximum allocated - 2695753 KB (2.6 GB)
    Temp Table Space - 60.9 GB (Total)
    - 20 GB (Available approximately)
    I think we do have more than enough memory to process the 1 million rows!!
    Also, some times the same program results in the following error:
    SQL> exec bulkload
    BEGIN bulkload; END;
    ERROR at line 1:
    ORA-03113: end-of-file on communication channel
    We did not even attempt the full load. Also, we are not using the PARALLEL option yet.
    Are we hitting a bug here? Or is PL/SQL not capable of mass loads? I would appreciate any thoughts on this.
    Thanks,
    Haranadh
    Following is the code:
    set echo off
    set timing on
    create or replace procedure bulkload as
    -- SOURCE --
    TYPE src_cpd_dt IS TABLE OF ima_ama_acct.cpd_dt%TYPE;
    TYPE src_acqr_ctry_cd IS TABLE OF ima_ama_acct.acqr_ctry_cd%TYPE;
    TYPE src_acqr_pcr_ctry_cd IS TABLE OF ima_ama_acct.acqr_pcr_ctry_cd%TYPE;
    TYPE src_issr_bin IS TABLE OF ima_ama_acct.issr_bin%TYPE;
    TYPE src_mrch_locn_ref_id IS TABLE OF ima_ama_acct.mrch_locn_ref_id%TYPE;
    TYPE src_ntwrk_id IS TABLE OF ima_ama_acct.ntwrk_id%TYPE;
    TYPE src_stip_advc_cd IS TABLE OF ima_ama_acct.stip_advc_cd%TYPE;
    TYPE src_authn_resp_cd IS TABLE OF ima_ama_acct.authn_resp_cd%TYPE;
    TYPE src_authn_actvy_cd IS TABLE OF ima_ama_acct.authn_actvy_cd%TYPE;
    TYPE src_resp_tm_id IS TABLE OF ima_ama_acct.resp_tm_id%TYPE;
    TYPE src_mrch_ref_id IS TABLE OF ima_ama_acct.mrch_ref_id%TYPE;
    TYPE src_issr_pcr IS TABLE OF ima_ama_acct.issr_pcr%TYPE;
    TYPE src_issr_ctry_cd IS TABLE OF ima_ama_acct.issr_ctry_cd%TYPE;
    TYPE src_acct_num IS TABLE OF ima_ama_acct.acct_num%TYPE;
    TYPE src_tran_cnt IS TABLE OF ima_ama_acct.tran_cnt%TYPE;
    TYPE src_usd_tran_amt IS TABLE OF ima_ama_acct.usd_tran_amt%TYPE;
    src_cpd_dt_array src_cpd_dt;
    src_acqr_ctry_cd_array      src_acqr_ctry_cd;
    src_acqr_pcr_ctry_cd_array     src_acqr_pcr_ctry_cd;
    src_issr_bin_array      src_issr_bin;
    src_mrch_locn_ref_id_array     src_mrch_locn_ref_id;
    src_ntwrk_id_array      src_ntwrk_id;
    src_stip_advc_cd_array      src_stip_advc_cd;
    src_authn_resp_cd_array      src_authn_resp_cd;
    src_authn_actvy_cd_array      src_authn_actvy_cd;
    src_resp_tm_id_array      src_resp_tm_id;
    src_mrch_ref_id_array      src_mrch_ref_id;
    src_issr_pcr_array      src_issr_pcr;
    src_issr_ctry_cd_array      src_issr_ctry_cd;
    src_acct_num_array      src_acct_num;
    src_tran_cnt_array      src_tran_cnt;
    src_usd_tran_amt_array      src_usd_tran_amt;
    j number := 1;
    CURSOR c1 IS
    SELECT
    cpd_dt,
    acqr_ctry_cd ,
    acqr_pcr_ctry_cd,
    issr_bin,
    mrch_locn_ref_id,
    ntwrk_id,
    stip_advc_cd,
    authn_resp_cd,
    authn_actvy_cd,
    resp_tm_id,
    mrch_ref_id,
    issr_pcr,
    issr_ctry_cd,
    acct_num,
    tran_cnt,
    usd_tran_amt
    FROM ima_ama_acct ima_ama_acct
    ORDER BY issr_bin;
    BEGIN
    OPEN c1;
    FETCH c1 bulk collect into
    src_cpd_dt_array ,
    src_acqr_ctry_cd_array ,
    src_acqr_pcr_ctry_cd_array,
    src_issr_bin_array ,
    src_mrch_locn_ref_id_array,
    src_ntwrk_id_array ,
    src_stip_advc_cd_array ,
    src_authn_resp_cd_array ,
    src_authn_actvy_cd_array ,
    src_resp_tm_id_array ,
    src_mrch_ref_id_array ,
    src_issr_pcr_array ,
    src_issr_ctry_cd_array ,
    src_acct_num_array ,
    src_tran_cnt_array ,
    src_usd_tran_amt_array ;
    CLOSE C1;
    FORALL j in 1 .. src_cpd_dt_array.count
    INSERT INTO ima_dly_acct (
         CPD_DT,
         ACQR_CTRY_CD,
         ACQR_TIER_CD,
         ACQR_PCR_CTRY_CD,
         ACQR_PCR_TIER_CD,
         ISSR_BIN,
         OWNR_BUS_ID,
         USER_BUS_ID,
         MRCH_LOCN_REF_ID,
         NTWRK_ID,
         STIP_ADVC_CD,
         AUTHN_RESP_CD,
         AUTHN_ACTVY_CD,
         RESP_TM_ID,
         PROD_REF_ID,
         MRCH_REF_ID,
         ISSR_PCR,
         ISSR_CTRY_CD,
         ACCT_NUM,
         TRAN_CNT,
         USD_TRAN_AMT)
         VALUES (
         src_cpd_dt_array(j),
         src_acqr_ctry_cd_array(j),
         null,
         src_acqr_pcr_ctry_cd_array(j),
              null,
              src_issr_bin_array(j),
              null,
              null,
              src_mrch_locn_ref_id_array(j),
              src_ntwrk_id_array(j),
              src_stip_advc_cd_array(j),
              src_authn_resp_cd_array(j),
              src_authn_actvy_cd_array(j),
              src_resp_tm_id_array(j),
              null,
              src_mrch_ref_id_array(j),
              src_issr_pcr_array(j),
              src_issr_ctry_cd_array(j),
              src_acct_num_array(j),
              src_tran_cnt_array(j),
              src_usd_tran_amt_array(j));
    COMMIT;
    END bulkload;
    SHOW ERRORS
    -----------------------------------------------------------------------------

    Do you have a unique key available in the rows you are fetching?
    It seems a cursor with 20 million rows that is as wide as all the columns you want to work with is a lot of memory for the server to use at once. You may be able to do this with parallel processing (DOP over 8) and a lot of memory for the warehouse box (and the box you are extracting data from)... but is this the most efficient (and thereby fastest) way to do it?
    What if you used a cursor to select a unique key only, and then during the cursor loop fetched each record, transformed it, and inserted it into the target?
    It's a different way to do a lot at once, but it cuts down on the overall memory overhead for the process.
    I know this isn't as elegant as a single insert to do it all at once, but sometimes trimming a process down so it takes less resources at any given moment is much faster than trying to do the whole thing at once.
    My solution is probably biased by transaction systems, so I would be interested in what the data warehouse community thinks of this.
    For example:
    source table my_transactions (tx_seq_id number, tx_fact1 varchar2(10), tx_fact2 varchar2(20), tx_fact3 number, ...)
    select a cursor of tx_seq_id only (even at 20 million rows this is not much)
    you could then either use a for loop or even bulk collect into a plsql collection or table
    then process individually like this:
    procedure process_a_tx(p_tx_seq_id in number)
    is
      rTX my_transactions%rowtype;
    begin
      select * into rTX from my_transactions where tx_seq_id = p_tx_seq_id;
      -- modify values as needed
      insert into my_target (a, b, c) values (rTX.tx_fact1, rTX.tx_fact2, rTX.tx_fact3);
      commit;
    exception
      when others then
        rollback;
        -- write to a log or raise an exception
    end process_a_tx;
    procedure collect_tx
    is
      cursor cTx is
        select tx_seq_id from my_transactions;
    begin
      for rTx in cTx loop
        process_a_tx(rTx.tx_seq_id);
      end loop;
    end collect_tx;
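    Another way to attack the ORA-04030 above is to keep the BULK COLLECT/FORALL structure but cap the collection size with the LIMIT clause, so only one batch of rows is ever held in memory. A minimal sketch with illustrative table and column names (not the poster's actual schema):
    declare
      type t_num is table of number index by binary_integer;
      l_ids  t_num;
      l_amts t_num;
      cursor c_src is select id, amount from src_tab;
    begin
      open c_src;
      loop
        fetch c_src bulk collect into l_ids, l_amts limit 50000; -- one bounded batch at a time
        exit when l_ids.count = 0;
        forall i in 1 .. l_ids.count
          insert into tgt_tab (id, amount) values (l_ids(i), l_amts(i));
        commit;
      end loop;
      close c_src;
    end;
    /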
