Regarding Direct Load or Direct Path Load

Why are Direct Load and Direct Path Load faster?
In which situations are Direct Load and Direct Path Load faster?

>
Why are Direct Load and Direct Path Load faster?
In which situations are Direct Load and Direct Path Load faster?
>
See About Direct-Path INSERT in the DBA doc
http://docs.oracle.com/cd/E11882_01/server.112/e17120/tables004.htm#i1009100
>
About Direct-Path INSERT
Oracle Database inserts data into a table in one of two ways:
• During conventional INSERT operations, the database reuses free space in the table, interleaving newly inserted data with existing data. During such operations, the database also maintains referential integrity constraints.
• During direct-path INSERT operations, the database appends the inserted data after existing data in the table. Data is written directly into datafiles, bypassing the buffer cache. Free space in the table is not reused, and referential integrity constraints are ignored. Direct-path INSERT can perform significantly better than conventional insert.
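A minimal SQL illustration of requesting direct-path INSERT (the table names here are placeholders, not from the documentation excerpt):
INSERT /*+ APPEND */ INTO sales_history
SELECT * FROM sales_staging;
COMMIT;
Until the commit, the inserting session cannot read or modify sales_history (ORA-12838), because the rows are written above the high water mark rather than through the buffer cache.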

Similar Messages

  • Sql loader not able to load more than 4.2 billion rows

    Hi,
    I am facing a problem with the SQL*Loader utility.
    The SQL*Loader process stops after loading 342 GB of data, i.e. it seems to load 4.294 billion rows out of 6.9 billion rows, and the loader process completes without throwing any error.
    Is there any limit on the volume of data SQL*Loader can load?
    Also, how do I identify that the SQL*Loader process has not loaded the whole data from the file, given that it is not throwing any error?
    Thanks in Advance.

    It may be a problem with the DB config, not with SQL*Loader...
    check all the parameters
    Maximizing SQL*Loader Performance
    SQL*Loader is flexible and offers many options that should be considered to maximize the speed of data loads. These include:
    1. Use Direct Path Loads - The conventional path loader essentially loads the data by using standard insert statements. The direct path loader (direct=true) loads directly into the Oracle data files and creates blocks in Oracle database block format. The fact that SQL is not being issued makes the entire process much less taxing on the database. There are certain cases, however, in which direct path loads cannot be used (clustered tables). To prepare the database for direct path loads, the script $ORACLE_HOME/rdbms/admin/catldr.sql must be executed.
    2. Disable Indexes and Constraints. For conventional data loads only, the disabling of indexes and constraints can greatly enhance the performance of SQL*Loader.
    3. Use a Larger Bind Array. For conventional data loads only, larger bind arrays limit the number of calls to the database and increase performance. The size of the bind array is specified using the bindsize parameter. The bind array's size is equivalent to the number of rows it contains (rows=) times the maximum length of each row.
    4. Use ROWS=n to Commit Less Frequently. For conventional data loads only, the rows parameter specifies the number of rows per commit. Issuing fewer commits will enhance performance.
    5. Use Parallel Loads. Available with direct path data loads only, this option allows multiple SQL*Loader jobs to execute concurrently.
    $ sqlldr control=first.ctl parallel=true direct=true
    $ sqlldr control=second.ctl parallel=true direct=true
    6. Use Fixed Width Data. Fixed width data format saves Oracle some processing when parsing the data. The savings can be tremendous, depending on the type of data and number of rows.
    7. Disable Archiving During Load. While this may not be feasible in certain environments, disabling database archiving can increase performance considerably.
    8. Use UNRECOVERABLE. The UNRECOVERABLE option (specified as "unrecoverable load data" in the control file) disables the writing of redo for the loaded data. This option is available for direct path loads only; a control file sketch follows this list.
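    A minimal control file sketch for item 8 (file, table, and column names are illustrative only):
    UNRECOVERABLE LOAD DATA
    INFILE 'big_file.dat'
    APPEND
    INTO TABLE target_table
    FIELDS TERMINATED BY ','
    (col1, col2, col3)
    invoked with, for example: sqlldr userid=scott/tiger control=load.ctl direct=true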
    Edited by: 879090 on 18-Aug-2011 00:23

  • SQLLoader, direct path load issue

    Hi all,
    I have a sqlldr control file which has couple of columns like,
    my_column_1 ,
    my_column_2 "decode(:my_column_1,'ONE','AAA','TWO','BBB', :my_column_1)"
    The table I am loading to is in user X and I am running the load from user Y. Everything works fine with conventional path load (not direct path) as grants are made for the table to user Y.
    When I load the data with the same control file with direct path, I get an error:
    01031 - insufficient privileges
    Is this anything to do with the syntax I have used in the control file, or is it a privilege issue? If it is a privilege issue, which privilege is that?
    I did following tests,
    1) Load is run with conventional path load, from user Y and the decode statement is in control file - Load works
    2) Load is run with direct path load, from user Y and decode statement is in control file - Load fails with above mentioned error
    3) Load is run with direct path load, from user Y and decode IS REMOVED from the control file - Load works (!!!)
    What can be the conclusion? Way out ?
    Thanks and Regards

    >
    What can be the conclusion? Way out?
    >
    Adi is right ... in conventional mode, sqlldr will go through the SQL engine to insert data in your table (and the SQL engine knows what DECODE is); in direct path mode, sqlldr formats database blocks directly ... so it cannot do SQL functions.
    A way out ... load my_column_1 into the table the way you do and put a view on top having the extra my_column_2 with the decode on my_column_1.
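    A sketch of that workaround (the object names are made up for illustration; the thread does not name its real table):
    -- the direct-path load fills only my_column_1, then:
    CREATE OR REPLACE VIEW my_table_v AS
    SELECT my_column_1,
           DECODE(my_column_1, 'ONE', 'AAA', 'TWO', 'BBB', my_column_1) AS my_column_2
    FROM my_table;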

  • SQL Loader choosing conventional path when direct path is requested

    We have a mystery regarding SQL Loader choosing to load with conventional path even though direct path is requested.
    We have a control file that produces direct-path loads and one which does not. The difference between them does not seem to account for the difference in behavior.
    The following control file does not give us direct-path:
    OPTIONS (
         SKIP=0,
         ERRORS=0,          
         DIRECT=TRUE,          
         NOLOGGING
         )
    LOAD DATA
    INFILE "[file path]" "STR x'0A'"
    BADFILE "[file path].bad"
    DISCARDFILE "[file path].dsc"
    DISCARDMAX 0
    INSERT
    INTO [schema name].[table name]
    FIELDS TERMINATED BY X'2C'
    OPTIONALLY ENCLOSED BY '?'
    TRAILING NULLCOLS
         C1_ACD_LINE_CD     CHAR(2000),
    [column specifications continue]
    )
    When running with this control file, the log shows:
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 0
    Bind array:     64 rows, maximum of 256000 bytes
    Continuation:    none specified
    Path used:      Conventional
    Table [schema name].[table name], loaded from every logical record.
    Insert option in effect for this table: INSERT
    TRAILING NULLCOLS option in effect
    If we use a control file that is modified as follows:
    OPTIONS (
         SKIP=0,
         ERRORS=0,     
         DIRECT=TRUE,     
         PARALLEL=TRUE,
         NOLOGGING
         )
    Then we do get a direct-path load:
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 0
    Continuation:    none specified
    Path used:      Direct
    Table [schema name].[table name], loaded from every logical record.
    Insert option in effect for this table: INSERT
    TRAILING NULLCOLS option in effect
    So there is nothing about the table (constraints, triggers, etc.) that is preventing direct-path loads.
    Now, we stumbled into this PARALLEL thing by accident - we are not really trying to do parallel loads.
    In my reading of the Utilities guide (http://docs.oracle.com/cd/E11882_01/server.112/e22490/ldr_modes.htm#autoId64 ), the PARALLEL option lets SQL*Loader tolerate multiple sessions loading into the same segment at once, but does not perform parallel processing itself. So, is it possible that some other lock on the table is causing SQL*Loader to block direct-path loads to the table (because of a previous SQL*Loader direct-path load, perhaps) unless the PARALLEL option is invoked? If so, how do we recognize that state and how do we resolve it?
    Version information:
    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    PL/SQL Release 11.2.0.3.0 - Production
    CORE     11.2.0.3.0     Production
    TNS for Solaris: Version 11.2.0.3.0 - Production
    NLSRTL Version 11.2.0.3.0 - Production
    Any thoughts or suggestions would be appreciated.
    Thanks,
    Mike

    From the same link
    >
    To use a direct path load (except for parallel loads), SQL*Loader must have exclusive write access to the table and exclusive read/write access to any indexes.
    >
    So I suspect that when using only DIRECT=TRUE, Oracle is not able to get an exclusive lock on the required objects, so it uses the conventional mode.
    From a later section
    >
    - Segments to be loaded do not have any active transactions pending.
    To check for this condition, use the Oracle Enterprise Manager command MONITOR TABLE to find the object ID for the tables you want to load. Then use the command MONITOR LOCK to see if there are any locks on the tables.
    >
    Would be interested in knowing what you find
    HTH
    Srini
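    As a rough alternative to the Enterprise Manager commands quoted above (my own sketch, assuming access to V$LOCK, V$SESSION, and DBA_OBJECTS), you could look for sessions holding locks on the target table before starting the load:
    SELECT s.sid, s.serial#, l.type, l.lmode, l.request
    FROM v$lock l JOIN v$session s ON s.sid = l.sid
    WHERE l.id1 = (SELECT object_id FROM dba_objects
                   WHERE owner = 'SCHEMA_NAME' AND object_name = 'TABLE_NAME');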

  • Direct Path Loading

    Sir,
    I tried the Oracle utility SQL*Loader. I created a table named
    TEST1 (id NUMBER PRIMARY KEY, var VARCHAR2(50)).
    I correctly loaded data into TEST1 using a normal (not direct) load; the bad file entries are filled with the duplicate records.
    When a direct path load is used, all data is loaded into TEST1, violating the PRIMARY KEY constraint.
    After this I tried to insert data into TEST1 and got an error saying it is in an unusable state.
    Running the direct path load again shows the same error as above.
    Please give me valuable information about the working of direct path loading and data insertion.
    Regards, Manojvilayil

    OCI_ATTR_DATA_SIZE in OCI Programmer's guide 9.2:
    Steps used in OCI defining - example: OCIAttrGet() into <undefined>
    Attributes of Type Attributes: ub4
    Attributes of Collection Types: ub2
    Attributes belonging to Columns of Tables or Views: ub2
    Attributes belonging to Arguments/Results: ub2
    Retrieving Column Data Types For a Table - example: OCIAttrGet() into ub4
    Describing with Character Length Semantics - example: OCIAttrGet() into ub4
    Creating a Parameter Descriptor for OCIType Calls: ub4
    Handle and Descriptor Attributes, Example on page A-72: OCIAttrSet from <undefined> with length 0
    Handle and Descriptor Attributes, Direct Path Column Parameter Attributes, Attribute Data Type: "ub2 */ub2 *".
    OCI_ATTR_DATA_SIZE in OCI Programmer's guide 12c R1:
    Implicit describe of a result - example: OCIAttrGet() into ub2
    Steps used in OCI defining - example: OCIAttrGet() into <undefined>
    Attributes of Type Attributes: ub2
    Attributes of Collection Types: ub2
    Attributes of Columns of Tables or Views: ub2
    Attributes of Arguments and Results: ub2
    Example 6–2: OCIAttrGet() into ub2
    Example 6–6: OCIAttrGet() into ub2
    Creating a Parameter Descriptor for OCIType Calls: ub2
    Handle and Descriptor Attributes, Example on page A-78: OCIAttrSet from <undefined> with length 0
    Handle and Descriptor Attributes, Direct Path Column Parameter Attributes, Attribute Data Type: "ub4 */ub4 *".
    Apparently the 12c manual is already adapted to ub4.
    Maybe it is a good idea to be careful with all length definitions?

  • Direct path load question

    I don't understand why during a direct path load insert an exclusive lock on the table is required.
    Suppose I have a table T without any indexes or constraints, why can't I update the table in a session and bulk load it in another session above the HWM ?
    Thanks in advance

    I don't understand why during a direct path load insert an exclusive lock on the table is required.
    direct path write
    During Direct Path operations, the data is asynchronously written to the database files. At some stage the session needs to make sure that all outstanding asynchronous I/Os have been completed to disk. This can also happen if, during a direct write, no more slots are available to store outstanding load requests (a load request could consist of multiple I/Os).
    http://docs.oracle.com/cd/B28359_01/server.111/b28320/waitevents003.htm
    If you wish to get more details then check the link below, where Tanel describes it beautifully. Tanel is talking about the "asynch descriptor resize" wait event in Oracle 11gR2, but since we are discussing direct path writes, I think it may also be relevant and of interest.
    http://blog.tanelpoder.com/2010/11/23/asynch-descriptor-resize-wait-event-in-oracle/
    Regards
    Girish Sharma

  • External Table and Direct path load

    Hi,
    I was just playing with the Oracle SQL*Loader and external table features. A few things I observed are that data loading through the direct path method of SQL*Loader is much faster and takes much less hard disk space than the external table method. Here are the stats I found while loading data:
    For Direct Path:
    # OF RECORDS    TIME           SOURCE FILE SIZE    DATAFILE SIZE (.dbf)
    478849          00:00:43.53    108,638 KB          142,088 KB
    957697          00:01:08.81    217,365 KB          258,568 KB
    1915393         00:02:54.43    434,729 KB          509,448 KB
    For External Table:
    # OF RECORDS    TIME           SOURCE FILE SIZE    DATAFILE SIZE (.dbf)
    478849          00:02:51.03    108,638 KB          966,408 KB
    957697          00:08:05.32    217,365 KB          1,930,248 KB
    1915393         00:17:16.31    434,729 KB          3,860,488 KB
    1915393         00:23:17.05    434,729 KB          3,927,048 KB   (with PARALLEL)
    I used the same files for testing and all other conditions were similar as well. In my case the datafile is autoextensible, so its size increases automatically as required and free disk space shrinks accordingly. The issue is: is this expected behaviour, and why does the external table method use so much more disk space than the direct path method? The performance of the external table load is also very poor compared to the direct path load.
    One more thing: when using the external table load with the PARALLEL option it should ideally take less time, but what I actually get is more time than without the PARALLEL option.
    In both cases I am loading data from the same file into the same table (once using direct path and once using an external table). Before every fresh load I truncate the internal table into which the data is loaded.
    any views??
    Deep
    Message was edited by:
    Deep

    Thanx to all for your suggestions.
    John, my scripts are as follows:
    for external table:
    CREATE TABLE LOG_TBL_LOAD
    (COL1 CHAR(20), COL2 CHAR(2), COL3 CHAR(20), COL4 CHAR(400),
     COL5 CHAR(20), COL6 CHAR(400), COL7 CHAR(20), COL8 CHAR(20),
     COL9 CHAR(400), COL10 CHAR(400))
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
     DEFAULT DIRECTORY EXT_TAB_DIR
     ACCESS PARAMETERS
     (RECORDS DELIMITED BY NEWLINE
      FIELDS TERMINATED BY WHITESPACE OPTIONALLY ENCLOSED BY '"' MISSING FIELD VALUES ARE NULL)
     LOCATION ('LOGZ3.DAT'))
    REJECT LIMIT 10;
    For loading I did:
    INSERT INTO LOG_TBL (COL1, COL2, COL3, COL4,COL5, COL6,
    COL7, COL8, COL9, COL10)
    (SELECT COL1, COL2, COL3, COL4, COL5, COL6, COL7, COL8,
    COL9, COL10 FROM LOG_TBL_load_1);
    for direct path my control file is like this:
    OPTIONS (
    DIRECT = TRUE
    )
    LOAD DATA
    INFILE 'F:\DATAFILES\LOGZ3.DAT' "str '\n'"
    INTO TABLE LOG_TBL
    APPEND
    FIELDS TERMINATED BY WHITESPACE OPTIONALLY ENCLOSED BY '"'
    (COL1 CHAR(20),
    COL2 CHAR(2),
    COL3 CHAR(20),
    COL4 CHAR(400),
    COL5 CHAR(20),
    COL6 CHAR(400),
    COL7 CHAR(20),
    COL8 CHAR(20),
    COL9 CHAR(400),
    COL10 CHAR(400))
    And yes, I have used the same table in both situations. After each load I truncate my table, LOG_TBL. I used the same source file, LOGZ3.DAT.
    My tablespace USERS is locally managed.
    thanks
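    One thing worth checking (my assumption, not something stated in the thread) is that a plain INSERT ... SELECT from an external table is a conventional insert; asking for direct-path semantics requires the APPEND hint, for example:
    INSERT /*+ APPEND */ INTO LOG_TBL (COL1, COL2, COL3, COL4, COL5, COL6, COL7, COL8, COL9, COL10)
    SELECT COL1, COL2, COL3, COL4, COL5, COL6, COL7, COL8, COL9, COL10
    FROM LOG_TBL_LOAD;
    COMMIT;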

  • SQL Loader direct path loads and unusable indexes

    Sorry about all the questions; I am researching several issues. I am reading the Utilities document. It says that in certain circumstances indexes will become unusable. I have some questions about my scenario.
    1. tables partitioned by range
    2. local indexes
    3. all tables have 1 primary key and other indexes are non-unique
    4. all sql loads will go into the most recent partition
    5. users will be querying tables while sql loader is occurring
    6. only one sql loader session will run per table
    7. no foreign keys, triggers, or other constraints other than primary keys.
    The docs are not clear. Do I have a concern about unusable indexes with direct path loads? Will indexes function while the SQL*Loader direct path load is occurring (this I can't test since I have small data files now and they load fast, but I will have larger ones in production)?
    My understanding is that external tables using insert append are exactly the same as a SQL*Loader direct path load. Is this true?
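    One simple check after a test load (a dictionary query I am adding as a sketch; the view and columns are standard) is to look at the status of the local index partitions:
    SELECT index_name, partition_name, status
    FROM user_ind_partitions
    ORDER BY index_name, partition_name;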

    if you don't have anything productive to say, how about you don't post at all? You have made ignorant posts like this for years.
    As far as reading the docs, what do you think "the docs are not clear" means? By the docs I am referring to the Utilities document.
    As far as the version number, it's 10.2 and I forgot that. However, it does not appear that SQL*Loader has really changed all that much over the last few versions.
    Finally, I plan on testing it out and it's more than a 2-minute test. I wanted to make sure I don't miss anything in my tests.
    Don't respond to any threads or posts I make from now on.

  • Cannot get Oracle sequence to be used with SQL Loader direct path load.

    I attempted to load approximately 3 million rows from a flat file into a 10g database using a SQL*Loader direct path load. In order to generate a primary key for each of these rows I used a simple SQL expression which gets the next number of an Oracle sequence, and assigned that to my primary key column inside the control file. The strange thing is that no primary key was generated for these rows, as I can see the sequence was not advanced. Furthermore, when I try to run a query to count the rows that have this column NOT NULL, I get the total of all the rows, even though on inspection this column is empty in each of the rows.
    As far as I know, in the control file for a direct path load I am able to use a SQL expression that returns simple scalar data, which a NEXTVAL on a sequence does. (O'Reilly, Oracle SQL*Loader: The Definitive Guide, p. 184)
    Does anybody know why my primary key was not generated, and furthermore why does this column appear to be NOT NULL in a query when in fact it has nothing in it?
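    As an aside (not from the thread): SQL*Loader has its own SEQUENCE field generator, which is a loader-side counter rather than an Oracle sequence object, and it can populate a key column directly in the control file, e.g.:
    my_pk_column SEQUENCE(MAX,1)
    This assigns the current maximum value of the column plus 1, 2, 3, ... to the loaded rows.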

    Daniel,
    I appreciate your response a lot.
    Now a follow-on. You say that SQL*Loader is the fastest way to get data into Oracle Spatial even though it uses the conventional path. How does SQL*Loader do this? Doesn't it use OCI?
    I just can't help but think that converting my data to SQL*Loader format, and then having it parse it and send it to the database in some form, must be slower than me just sending whatever SQL*Loader sends to the DB myself using OCI. Am I missing something?
    The SQL*Loader documentation seems to suggest that SQL*Loader just turns the input into a lot of INSERT statements. If so, can't I just do this?
    My performance with OCI and INSERT statements isn't very good, but I believe I am doing one transaction per insert. Might I be as well off concentrating on fine-tuning my OCI-based code using INSERTs?
    I will actually do some time tests myself, but I would appreciate your opinion.
    Once again thanks for the great info you have provided.

  • Using functions with direct path load

    When loading data with sqlldr is it possible to have functions ( to alter the data) when loading in a direct path? Thanks

    When loading data with sqlldr is it possible to have functions (to alter the data) when loading in a direct path?
    Did you try it? Which version do you use?
    How can I speed up the load?
    Did you study chapter 11 in the utilities guide Conventional and Direct Path Loads?
    http://download.oracle.com/docs/cd/B28359_01/server.111/b28319/ldr_modes.htm#i1011553

  • Concepts and usage of DIRECT PATH LOAD

    Product: ORACLE SERVER
    Written: 1998-11-27
    When a very large amount of data needs to be loaded in a short time, a direct path load can be used. This note explains the concept of the direct path load in detail, how to use it, and the points to consider when using it.
    1. Conventional path load
    The ordinary SQL*Loader method inserts the data from the datafile into an existing table using SQL INSERT commands. Because SQL commands are used, an insert command has to be generated and parsed for each piece of data; the data to be inserted is first placed in the bind array buffer (data block buffer) and then written to disk.
    A conventional path load must be used in the following cases:
    --- When the table must be accessed through an index during the load.
    During a direct load the indexes go into 'direct load state' and cannot be used.
    --- When the table must be updated or inserted into, without using an index, during the load.
    During a direct load an exclusive write (X) lock is placed on the table.
    --- When the load must be performed over SQL*NET.
    --- When loading into a clustered table.
    --- When loading a small amount of data into a large indexed table.
    --- When loading into a large table that has referential or check integrity constraints defined.
    --- When SQL functions must be applied to data fields during the load.
    2. How the direct path load works
    Because of the following characteristics, direct path loads are the preferred way to load a very large amount of data in a short time.
    (1) No SQL INSERT statements are generated or executed.
    (2) Instead of using the bind array buffer in memory, data blocks with the same format as database blocks are built in memory, filled with data, and written to disk as they are. (The block buffers in memory and the blocks on disk normally differ in format.)
    (3) A lock is placed on the table when the load starts and released when the load finishes.
    (4) Data is loaded into blocks above the table's HWM (High Water Mark). The HWM keeps growing as data is inserted into the table and cannot be lowered except by a truncate.
    Therefore, completely empty new blocks are always allocated to receive the data.
    (5) Redo log files are not required in case of an instance failure.
    (6) No UNDO information is generated; in other words, rollback segments are not used.
    (7) If the OS supports asynchronous I/O, data can be read into multiple buffers concurrently while the buffers are being written to disk.
    (8) Performance can be improved further by using the parallel option.
    3. How to use the direct path load, and its options
    The views needed for direct path loads are contained in the following script, which must be run beforehand as the sys user. Note that this script is included in catalog.sql, so it has already been run when the database was created.
    @$ORACLE_HOME/rdbms/admin/catldr.sql
    To use a direct path load, you only need to add DIRECT=TRUE to the usual sqlload command line, for example:
    sqlload username/password control=loadtest.ctl direct=true
    The following are additional options worth considering when using a direct path load, and clauses that can be written in the control file.
    (1) ROWS = n
    In a conventional path load, rows defaults to 64, and a commit occurs every time the number of rows given by rows has been loaded. Similarly, in a direct path load the rows option controls data saves; when a data save occurs, the data loaded so far becomes part of the table, so the inserted data is not lost.
    Note, however, that in a direct path load the indexes are built only after all the data has been loaded, so even if a data save occurs the indexes remain in direct load state and cannot be used.
    In a direct path load the default for rows is unlimited, and if the specified value does not fill a database block it is rounded up to a value that fills the block completely, so that no partial blocks are created.
    (2) PIECED clause
    This is written in the control file in the order column_spec datatype_spec PIECED and is valid only for direct path loads. When a single data item, such as a LONG, is larger than the maximum buffer size, the item is loaded in several pieces. This option can be applied only to the very last field of the table and cannot be used on an index column.
    If a problem is found in the data during the load, only the truncated part of the data currently being loaded is written to the bad file, because the earlier pieces have already been written to the datafile and no longer remain in the buffer.
    (3) READBUFFERS = n (default is 4)
    If a very large data item is not the last field, or is part of an index, the PIECED option cannot be used. In that case the buffer size must be increased, which is done with the readbuffers option. The default number of buffers is 4; if the message ORA-2374 (No more slots for read buffer queue) appears during the load, the number of buffers is insufficient and should be increased. In general, however, increasing this value only adds system overhead and rarely yields a performance improvement.
    4. Index handling in direct path loads
    The procedure for building indexes during a direct path load is as follows:
    (1) The data is loaded into the table.
    (2) The key portions of the loaded data are copied into a temporary segment and sorted.
    (3) The pre-existing index is merged with the keys sorted in step (2).
    (4) A new index is created from step (3). The pre-existing index, the temporary segment, and the newly created index all exist until the merge has completely finished.
    (5) The old index and the temporary segment are dropped.
    In contrast to this procedure, a conventional path load adds each row to the index as it is inserted. It therefore needs no temporary storage space, but index creation is slower than with a direct path load and the index tree ends up less well balanced.
    The temporary space needed for index creation can be estimated with the following formula:
    1.3 * key_storage
    key_storage = (number_of_rows) * (10 + sum_of_column_sizes + number_of_columns)
    Here 1.3 covers the extra space needed on average during the sort; use 2 if the data is completely reverse-ordered and 1 if it is already fully sorted.
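    As a worked example with assumed figures (not from the note): loading 1,000,000 rows whose index key columns total 20 bytes across 2 columns gives
    key_storage = 1,000,000 * (10 + 20 + 2) = 32,000,000 bytes
    1.3 * key_storage = 41,600,000 bytes, i.e. roughly 40 MB of temporary space for the sort.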
    --- SINGLEROW clause
    Because index creation during a direct path load can consume a lot of space, the SINGLEROW option can be used when resources are scarce.
    It is written in the control file in the following form and can be used only with direct path loads:
    into tables table_name [sorted indexes...] singlerow
    With this option the index is not built after all the data has been loaded; instead, each row is added to the index as it is loaded.
    The option exists to avoid the extra space required for the merge while the index is being built when an index already exists, so it is meant to be used with APPEND rather than with INSERT.
    It is recommended when the existing table is at least 20 times larger than the data to be loaded.
    A direct path load does not record rollback information, but with the singlerow option undo information for the inserted index entries is written to the rollback segments.
    However, if an instance failure occurs in the middle, the data is preserved up to the last data save, but the index is still left in direct load state and cannot be used.
    --- Direct Load State
    If a direct path load does not finish successfully, the indexes are left in direct load state.
    Trying to query through such an index raises the following error:
    ORA-01502 : index 'SCOTT.DEPT_PK' is in direct load state.
    Looking more closely, an index ends up in direct load state for the following reasons:
    (1) Space runs out while the index is being created.
    (2) The SORTED INDEXES clause was used but the data was not actually sorted.
    In this case all of the data is loaded and only the index is left in direct load state.
    (3) An instance failure occurs while the index is being created.
    (4) Duplicate data is loaded into a column that has a unique index.
    To check whether a particular index is in direct load state:
    select index_name, status
    from user_indexes
    where table_name = 'TABLE_NAME';
    If an index shows up in direct load state, it must be dropped and recreated before it can be used. Note that during a direct load all indexes go into direct load state and are automatically changed back to valid when the load finishes successfully.
    --- Pre-sorting (SORTED INDEXES)
    To reduce the time spent sorting for index creation during a direct load, the data can be pre-sorted on the index columns before loading. In that case the SORTED INDEXES option is defined in the control file as follows.
    This option is valid only for direct path loads and can be specified for multiple indexes:
    into table table_name SORTED INDEXES (index_names_with_blank)
    If an index already exists, enough temporary storage is needed to hold the new keys temporarily; if there was no existing index, even this temporary space is not needed.
    Because building indexes during a direct path load into a table that already contains data takes extra space, and the indexes must be recreated if the load does not finish completely and successfully, the usual practice is to drop the table's indexes before the direct path load and recreate them after the load has finished.
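    A sketch of that common practice (the table and index names are placeholders):
    DROP INDEX emp_name_idx;
    -- run the direct path load here, e.g. sqlload username/password control=loadtest.ctl direct=true
    CREATE INDEX emp_name_idx ON emp (ename);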
    5. Recovery
    A direct load does not insert data into the middle of existing segments; it allocates completely new blocks, which become part of the segment only after they have been fully written, so no redo log information is needed for instance failure. By default, however, a direct load does record the loaded data in the redo log, and this is for media recovery. If the database is not in archive log mode, the redo information generated by the direct load is therefore useless, so in NOARCHIVELOG mode the UNRECOVERABLE option should always be used in the control file so that no redo entries are written to the redo log.
    Whereas the data, even without redo log information, is protected up to the last data save on instance failure, the indexes unconditionally end up in direct load state and must be recreated. Also, the extents that were allocated to the table being loaded after the last data save are not released as free space, even though the data loaded into them is not visible to the user.
    6. Integrity Constraints & Triggers
    During a direct path load, NOT NULL, UNIQUE, and PRIMARY KEY constraints remain enabled. NOT NULL is checked at insert time, and UNIQUE is checked when the index is built after the load.
    CHECK constraints and referential constraints, however, are put into the disabled state when the load starts. To re-enable these disabled constraints after all the data has been loaded, the REENABLE option must be specified in the control file. The reenable option cannot be specified per constraint; once specified in the control file it affects all integrity/check constraints. If data violating a constraint is found while it is being re-enabled, that constraint cannot be enabled and is left with a disabled status; to identify the violating data, add the exceptions option to the reenable clause as follows:
    reenable [exceptions table_name]
    Here table_name can be created by copying $ORACLE_HOME/rdbms/admin/utlexcpt.sql to another directory, changing the table name to something other than exceptions, and running it.
    Insert triggers, like the integrity/check constraints, are disabled when the direct load starts and are automatically re-enabled when the load finishes. Note, however, that even after being re-enabled the triggers do not fire for the data inserted by the load.
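    A small sketch of finding the violating rows afterwards (assuming the default table created by utlexcpt.sql and an illustrative target table named dept):
    SELECT d.* FROM dept d
    WHERE d.rowid IN (SELECT row_id FROM exceptions);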

  • Direct Path Loading Issues with Global Temporary Tables - OCI & OCILib

    I am writing some code to import data into a warehouse from a CPU grid which computes risk data. Due to the fact a computing grid is used there will be many clients which can load the data concurrently and at any point in time.
    Currently the import uses Binding in OCCI and chunking with a prepared statement to import the data into a global temporary table in a staging area after which a stored procedure is called within the same session which will process the data and load the data into a star schema.
    The GTT has the advantage that if any clients have issues no dirty data will be left and each client only sees their own instance of the data.
    I have been looking at using direct path loading to increase the performance of the load and have written some OCI code to perform the same task. I have managed to import the data into a regular heap-based table using the OCI direct path APIs. However when I try to use the same code to import against a Global Temporary Table I get an OCI error (ORA-00600: internal error code, arguments: [6979], [16], [1], [1318528], [], [], [], [], [], [], [], [])
    I get the error when the function OCIDirPathPrepare is executed. The same issue occurs in both OCI and OCILib.
    Is it not possible to use Direct Path Loading against a Global Temporary Table? Because you can use the /*+ APPEND */ hint and load global temporary tables this way from tools like SQL Developer / Toad, which is surely informing the SQL engine to use direct path?
    Looking at the view USER_OBJECTS I can see that for a Global Temporary Table the DATA_OBJECT_ID is null. Does this mean that it is impossible to use a direct path load into Global Temporary Tables?
    Any ideas / suggestions would be really appreciated. If this means redesigning the application then I would appreciate suggestions which would allow many clients to write quickly in a parallel fashion. If this means creating a new partition in a heap table for each writer and direct path loading into that table, then so be it.
    Thanks
    H
    Edited by: 813640 on 19-Nov-2010 11:08

    Replying to my own message in case anyone else is interested.
    I have now managed to successfully load data using direct path into a global temporary table with OCI. There appears to be no reason why this approach will not work.
    I loaded data into the temporary table and then issued a select count(*) on the table from within the session and from a new session. The results were as expected.
    The reason for the ORA-00600 error was that I had enabled table-level parallel loading,
    i.e.
    ub1 parallel = 1;
    OCIAttrSet((dvoid *) context, (ub4) OCI_HTYPE_DIRPATH_CTX, (dvoid *) &parallel, (ub4) 0, (ub4) OCI_ATTR_DIRPATH_PARALLEL, errhp);
    When loading a Global Temporary Table the OCI_ATTR_DIRPATH_PARALLEL attribute needs to be zero.
    This makes sense, since the temp table does not have any partitions, so it would not be possible to write in parallel to multiple partitions.
    Edited by: 813640 on 22-Nov-2010 08:42

  • SQL*Loader problem with direct path load

    Hi all,
    It's on Oracle 9.2.
    I have a sqlldr control file which has couple of columns like,
    my_column_1 ,
    my_column_2 "decode(:my_column_1,'ONE','AAA','TWO','BBB', :my_column_1)"
    The table I am loading to is in user X and I am running the load from
    user Y. Everything works fine with conventional path load (not direct
    path) as grants are made for the table to user Y.
    When I load the data with the same control file with direct path, I get an error:
    01031 - insufficient privileges
    Is this anything to do with the syntax I have used in the control file, or is it a privilege issue? If it is a privilege issue, which privilege is that?
    I did following tests,
    1) Load is run with conventional path load, from user Y and the decode
    statement is in control file - Load works
    2) Load is run with direct path load, from user Y and decode statement
    is in control file - Load fails with above mentioned error
    3) Load is run with direct path load, from user Y and decode IS REMOVED
    from the control file - Load works (!!!)
    What can be the conclusion? Way out ?
    Thanks and Regards

    You need to grant
    grant lock any table to userY;
    For more information see MetaLink Note 1082550.6

  • How to use direct select and insert or load to speed up the process instead of a cursor

    Hi friends,
    I have a stored procedure. In the SP I am using a cursor to load data from a parent table into several child tables.
    I have attached the script with this message.
    My problem is how to use a direct select and insert (or load) to speed up the process instead of the cursor; a set-based sketch follows the procedure below.
    Can anyone please suggest how to change this script?
    USE [IconicMarketing]
    GO
    /****** Object: StoredProcedure [dbo].[SP_DMS_INVENTORY] Script Date: 3/6/2015 3:34:03 PM ******/
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    -- =============================================
    -- Author: <ARUN,NAGARAJ>
    -- Create date: <11/21/2014>
    -- Description: <STORED PROCEDURE FOR DMS_INVENTORY>
    -- =============================================
    ALTER PROCEDURE [dbo].[SP_DMS_INVENTORY]
    @Specific_Date varchar(20) ,
    @DealerNum Varchar(6),
    @Date_Daily varchar(50)
    AS
    BEGIN
    SET NOCOUNT ON;
    --==========================================================================
    -- INVENTORY_CURSOR
    --==========================================================================
    DECLARE
    @FileType varchar(50),
    @ACDealerID varchar(50),
    @ClientDealerID varchar(50),
    @DMSType varchar(50),
    @StockNumber varchar(50),
    @InventoryDate datetime ,
    @StockType varchar(100),
    @DMSStatus varchar(50),
    @InvoicePrice numeric(18, 2),
    @CostPack varchar(50),
    @SalesCost numeric(18, 2),
    @HoldbackAmount numeric(18, 2),
    @ListPrice numeric(18, 2),
    @MSRP varchar(max),
    @LotLocation varchar(50),
    @TagLine varchar(max),
    @Certification varchar(max),
    @CertificationNumber varchar(max),
    @VehicleVIN varchar(50),
    @VehicleYear bigint ,
    @VehicleMake varchar(50),
    @VehicleModel varchar(50),
    @VehicleModelCode varchar(50),
    @VehicleTrim varchar(50),
    @VehicleSubTrimLevel varchar(max),
    @Classification varchar(max),
    @TypeCode varchar(100),
    @VehicleMileage bigint ,
    @EngineCylinderCount varchar(10) ,
    @TransmissionType varchar(50),
    @VehicleExteriorColor varchar(50),
    @VehicleInteriorColor varchar(50),
    @CreatedDate datetime ,
    @LastModifiedDate datetime ,
    @ModifiedFlag varchar(max),
    @InteriorColorCode varchar(50),
    @ExteriorColorCode varchar(50),
    @PackageCode varchar(50),
    @CodedCost varchar(50),
    @Air varchar(100),
    @OrderType varchar(max),
    @AgeDays bigint ,
    @OutstandingRO varchar(50),
    @DlrAccessoryRetail varchar(50),
    @DlrAccessoryCost varchar(max),
    @DlrAccessoryDesc varchar(max),
    @ModelDesc varchar(50),
    @Memo1 varchar(1000),
    @Memo2 varchar(max),
    @Weight varchar(max),
    @FloorPlan numeric(18, 2),
    @Purchaser varchar(max),
    @PurchasedFrom varchar(max),
    @InternetPrice varchar(50),
    @InventoryAcctDollar numeric(18, 2),
    @VehicleType varchar(50),
    @DealerAccessoryCode varchar(50),
    @AllInventoryAcctDollar numeric(18, 2),
    @BestPrice varchar(50),
    @InStock bigint ,
    @AccountingMake varchar(50),
    @GasDiesel varchar(max),
    @BookValue varchar(10),
    @FactoryAccessoryDescription varchar(max),
    @TotalReturn varchar(10),
    @TotalCost varchar(10),
    @SS varchar(max),
    @VehicleBody varchar(max),
    @StandardEquipment varchar(max),
    @Account varchar(max),
    @CalculatedPrice varchar(10),
    @OriginalCost varchar(10),
    @AccessoryCore varchar(10),
    @OtherDollar varchar(10),
    @PrimaryBookValue varchar(10),
    @AmountDue varchar(10),
    @LicenseFee varchar(10),
    @ICompany varchar(max),
    @InvenAcct varchar(max),
    @Field23 varchar(max),
    @Field24 varchar(max),
    @SalesCode bigint,
    @BaseRetail varchar(10),
    @BaseInvAmt varchar(10),
    @CommPrice varchar(10),
    @Price1 varchar(10),
    @Price2 varchar(10),
    @StickerPrice varchar(10),
    @TotInvAmt varchar(10),
    @OptRetail varchar(max),
    @OptInvAmt varchar(10),
    @OptCost varchar(10),
    @Options1 varchar(max),
    @Category varchar(max),
    @Description varchar(max),
    @Engine varchar(max),
    @ModelType varchar(max),
    @FTCode varchar(max),
    @Wholesale varchar(max),
    @Retail varchar(max),
    @Draft varchar(max),
    @myerror varchar(500),
    @Inventoryid int,
    @errornumber int,
    @errorseverity varchar(500),
    @errortable varchar(50),
    @errorstate int,
    @errorprocedure varchar(500),
    @errorline varchar(50),
    @errormessage varchar(1000),
    @Invt_Id int,
    @flatfile_createddate datetime,
    @FtpDate date,
    @Inv_cur varchar(1000),
    @S_Year varchar(4),
    @S_Month varchar(2),
    @S_Date varchar(2),
    @Date_Specfic varchar(50),
    @Param_list nvarchar(max),
    @Daily_Date Varchar(50);
    --====================================================================================
    --DECLARE CURSOR FOR SPECIFIC DATE (OR) DEALER-ID WITH SPECIFIC DATE (OR) CURRENT DATE
    --====================================================================================
    set @Date_Specfic = Substring(@Specific_Date,1,4) +'-'+Substring(@Specific_Date,5,2)+'-'+Substring(@Specific_Date,7,2);
    set @Daily_Date = SUBSTRING(@Date_Daily,14,4) + '-' + SUBSTRING(@Date_Daily,18,2)+ '-' + SUBSTRING(@date_Daily,20,2)
    IF @Daily_Date IS NOT NULL
    BEGIN
    Delete From [dbo].[DMS_INVENTORY_DETAILS]
    Where DMSInventoryID in(select ID from [dbo].[DMS_INVENTORY] where CONVERT (date,FtpDate)=CONVERT (date,GETDATE()));
    Delete From [dbo].[DMS_INVENTORY_AMOUNT]
    Where DMSInventoryID in(select ID from [dbo].[DMS_INVENTORY] where CONVERT (date,FtpDate)=CONVERT (date,GETDATE()));
    Delete From [dbo].[ICONIC_INVENTORY_VEHICLE]
    Where DMSInventoryVehicleID in(select ID from [dbo].[DMS_INVENTORY] where CONVERT (date,FtpDate)=CONVERT (date,GETDATE()));
    Delete From [dbo].[DMS_INVENTORY_VEHICLE]
    Where DMSInventoryID in(select ID from [dbo].[DMS_INVENTORY] where CONVERT (date,FtpDate)=CONVERT (date,GETDATE()));
    Delete From [dbo].[ICONIC_EQUITY_INVENTORY_COMPARE]
    Where InventoryVehicleId in(select ID from [dbo].[DMS_INVENTORY] where CONVERT (date,FtpDate)=CONVERT (date,GETDATE()));
    Delete From [dbo].[DMS_INVENTORY]
    Where ID in(select ID from [dbo].[DMS_INVENTORY] where CONVERT (date,FtpDate)=CONVERT (date,GETDATE()));
    DECLARE Inventory_Cursor CURSOR FOR
    SELECT * from [dbo].[FLATFILE_INVENTORY] where
    CONVERT (date,flatfile_createddate) = CONVERT (date,GETDATE()) order by flatfile_createddate;
    END
    Else
    BEGIN
    if (@Date_Specfic IS NOT NULL AND @DealerNum != '?????')
    BEGIN
    Delete From [dbo].[DMS_INVENTORY_DETAILS]
    Where DMSInventoryID in(select ID from [dbo].[DMS_INVENTORY] where FtpDate=@Date_Specfic AND DMSDealerID='ACTEST' + @DealerNum);
    Delete From [dbo].[DMS_INVENTORY_AMOUNT]
    Where DMSInventoryID in(select ID from [dbo].[DMS_INVENTORY] where FtpDate=@Date_Specfic AND DMSDealerID='ACTEST' + @DealerNum);
    Delete From [dbo].[ICONIC_INVENTORY_VEHICLE]
    Where DMSInventoryVehicleID in(select ID from [dbo].[DMS_INVENTORY] where FtpDate=@Date_Specfic AND DMSDealerID='ACTEST' + @DealerNum);
    Delete From [dbo].[DMS_INVENTORY_VEHICLE]
    Where DMSInventoryID in(select ID from [dbo].[DMS_INVENTORY] where FtpDate=@Date_Specfic AND DMSDealerID='ACTEST' + @DealerNum);
    Delete From [dbo].[ICONIC_EQUITY_INVENTORY_COMPARE]
    Where InventoryVehicleId in(select ID from [dbo].[DMS_INVENTORY] where FtpDate=@Date_Specfic AND DMSDealerID='ACTEST' + @DealerNum);
    Delete From [dbo].[DMS_INVENTORY]
    Where ID in(select ID from [dbo].[DMS_INVENTORY] where FtpDate=@Date_Specfic AND DMSDealerID='ACTEST' + @DealerNum);
    DECLARE Inventory_Cursor CURSOR FOR
    SELECT * from [dbo].[FLATFILE_INVENTORY] where FtpDate=@Date_Specfic AND ACDealerID='ACTEST' + @DealerNum;
    END
    ELSE
    BEGIN
    Delete From [dbo].[DMS_INVENTORY_DETAILS]
    Where DMSInventoryID in(select ID from [dbo].[DMS_INVENTORY] where FtpDate=@Date_Specfic);
    Delete From [dbo].[DMS_INVENTORY_AMOUNT]
    Where DMSInventoryID in(select ID from [dbo].[DMS_INVENTORY] where FtpDate=@Date_Specfic);
    Delete From [dbo].[ICONIC_INVENTORY_VEHICLE]
    Where DMSInventoryVehicleID in(select ID from [dbo].[DMS_INVENTORY] where FtpDate=@Date_Specfic);
    Delete From [dbo].[DMS_INVENTORY_VEHICLE]
    Where DMSInventoryID in(select ID from [dbo].[DMS_INVENTORY] where FtpDate=@Date_Specfic);
    Delete From [dbo].[ICONIC_EQUITY_INVENTORY_COMPARE]
    Where InventoryVehicleId in(select ID from [dbo].[DMS_INVENTORY] where FtpDate=@Date_Specfic);
    Delete From [dbo].[DMS_INVENTORY]
    Where ID in(select ID from [dbo].[DMS_INVENTORY] where FtpDate=@Date_Specfic);
    DECLARE Inventory_Cursor CURSOR FOR
    SELECT * from [dbo].[FLATFILE_INVENTORY] where FtpDate=@Date_Specfic;
    END
    END
    OPEN Inventory_Cursor
    FETCH NEXT FROM Inventory_Cursor
    INTO
    @FileType ,
    @ACDealerID ,
    @ClientDealerID ,
    @DMSType ,
    @StockNumber ,
    @InventoryDate ,
    @StockType ,
    @DMSStatus ,
    @InvoicePrice ,
    @CostPack ,
    @SalesCost ,
    @HoldbackAmount ,
    @ListPrice ,
    @MSRP ,
    @LotLocation ,
    @TagLine ,
    @Certification ,
    @CertificationNumber ,
    @VehicleVIN ,
    @VehicleYear ,
    @VehicleMake ,
    @VehicleModel ,
    @VehicleModelCode ,
    @VehicleTrim ,
    @VehicleSubTrimLevel ,
    @Classification ,
    @TypeCode ,
    @VehicleMileage ,
    @EngineCylinderCount ,
    @TransmissionType ,
    @VehicleExteriorColor ,
    @VehicleInteriorColor ,
    @CreatedDate ,
    @LastModifiedDate ,
    @ModifiedFlag ,
    @InteriorColorCode ,
    @ExteriorColorCode ,
    @PackageCode ,
    @CodedCost ,
    @Air ,
    @OrderType ,
    @AgeDays ,
    @OutstandingRO ,
    @DlrAccessoryRetail ,
    @DlrAccessoryCost ,
    @DlrAccessoryDesc ,
    @ModelDesc ,
    @Memo1 ,
    @Memo2 ,
    @Weight ,
    @FloorPlan ,
    @Purchaser ,
    @PurchasedFrom ,
    @InternetPrice ,
    @InventoryAcctDollar ,
    @VehicleType ,
    @DealerAccessoryCode ,
    @AllInventoryAcctDollar ,
    @BestPrice ,
    @InStock ,
    @AccountingMake ,
    @GasDiesel ,
    @BookValue ,
    @FactoryAccessoryDescription ,
    @TotalReturn ,
    @TotalCost ,
    @SS ,
    @VehicleBody ,
    @StandardEquipment ,
    @Account ,
    @CalculatedPrice ,
    @OriginalCost ,
    @AccessoryCore ,
    @OtherDollar ,
    @PrimaryBookValue ,
    @AmountDue ,
    @LicenseFee ,
    @ICompany ,
    @InvenAcct ,
    @Field23 ,
    @Field24 ,
    @SalesCode ,
    @BaseRetail ,
    @BaseInvAmt ,
    @CommPrice ,
    @Price1 ,
    @Price2 ,
    @StickerPrice ,
    @TotInvAmt ,
    @OptRetail ,
    @OptInvAmt ,
    @OptCost ,
    @Options1 ,
    @Category ,
    @Description ,
    @Engine ,
    @ModelType ,
    @FTCode ,
    @Wholesale ,
    @Retail ,
    @Draft ,
    @flatfile_createddate,
    @FtpDate;
    WHILE @@FETCH_STATUS = 0
    BEGIN
    --==========================================================================
    -- INSERT INTO INVENTORY (PARENT TABLE)
    --==========================================================================
    BEGIN TRY
    INSERT INTO [dbo].[DMS_INVENTORY] (
    DMSDealerID,
    StockNumber,
    DMSType,
    InventoryDate,
    FtpDate
    )
    VALUES (@ClientDealerID,@StockNumber,@DMSType,@InventoryDate,@FtpDate);
    END TRY
    BEGIN CATCH
    SELECT
    @errornumber = ERROR_NUMBER(),
    @errorseverity = ERROR_SEVERITY(),
    @errortable = 'DMS_INVENTORY',
    @errorstate = ERROR_STATE(),
    @errorprocedure = ERROR_PROCEDURE(),
    @errorline = ERROR_LINE(),
    @errormessage = ERROR_MESSAGE();
    --==========================================================================
    -- INSERT ERRORS INTO DMSLOG_INVENTORY_ERROR
    --==========================================================================
    EXEC [SP_DMS_INVENTORY_ERROR] @FileType,@ACDealerID,@ClientDealerID,@DMSType,@StockNumber,@InventoryDate,@StockType,@DMSStatus,@InvoicePrice,@CostPack,
    @SalesCost,@HoldbackAmount,@ListPrice,@MSRP,@LotLocation,@TagLine,@Certification,@CertificationNumber,@VehicleVIN,@VehicleYear,@VehicleMake,@VehicleModel,@VehicleModelCode,
    @VehicleTrim,@VehicleSubTrimLevel,@Classification,@TypeCode,@VehicleMileage,@EngineCylinderCount,@TransmissionType,@VehicleExteriorColor,@VehicleInteriorColor,
    @CreatedDate,@LastModifiedDate,@ModifiedFlag,@InteriorColorCode,@ExteriorColorCode,@PackageCode,@CodedCost,@Air,@OrderType,@AgeDays,@OutstandingRO,
    @DlrAccessoryRetail,@DlrAccessoryCost,@DlrAccessoryDesc,@ModelDesc,@Memo1,@Memo2,@Weight,@FloorPlan,@Purchaser,@PurchasedFrom,@InternetPrice,
    @InventoryAcctDollar,@VehicleType,@DealerAccessoryCode,@AllInventoryAcctDollar,@BestPrice,@InStock,@AccountingMake,@GasDiesel,@BookValue,
    @FactoryAccessoryDescription,@TotalReturn,@TotalCost,@SS,@VehicleBody,@StandardEquipment,@Account,@CalculatedPrice,@OriginalCost,@AccessoryCore,
    @OtherDollar,@PrimaryBookValue,@AmountDue,@LicenseFee,@ICompany,@InvenAcct,@Field23,@Field24,@SalesCode,@BaseRetail,@BaseInvAmt,@CommPrice,@Price1,
    @Price2,@StickerPrice,@TotInvAmt,@OptRetail,@OptInvAmt,@OptCost,@Options1,@Category,@Description,@Engine,@ModelType,@FTCode,@Wholesale,@Retail,@Draft,
    @ERRORNUMBER,@ERRORSEVERITY,@ERRORTABLE,@ERRORSTATE,@ERRORPROCEDURE,@ERRORLINE,@errormessage,@FtpDate
    END CATCH
    -- PRINT @errornumber;
    -- PRINT @errorseverity;
    -- PRINT @errortable;
    -- PRINT @errorprocedure;
    -- PRINT @errorline;
    -- PRINT @errormessage;
    -- PRINT @errorstate;
    set @myerror = @@ERROR;
    -- This -- PRINT statement -- PRINTs 'Error = 0' because
    -- @@ERROR is reset in the IF statement above.
    -- PRINT N'Error = ' + @myerror;
    set @Inventoryid = scope_identity();
    -- PRINT @Inventoryid;
    --==========================================================================
    -- INSERT INTO DMS_INVENTORY_DETAILS (CHILD TABLE)
    --==========================================================================
    BEGIN TRY
    INSERT INTO [dbo].[DMS_INVENTORY_DETAILS] (
    DMSInventoryID,
    StockType,
    DMSStatus,
    LotLocation,
    TagLine,
    Certification,
    CertificationNumber,
    CreatedDate,
    LastModifiedDate,
    ModifiedFlag,
    PackageCode,
    OrderType,
    AgeDays,
    OutstandingRO,
    Memo1,
    Memo2,
    Purchaser,
    PurchasedFrom,
    DealerAccessoryCode,
    InStock,
    AccountingMake,
    SS,
    Account,
    AccessoryCore,
    ICompany,
    InvenAcct,
    Field23,
    Field24,
    SalesCode,
    Draft,
    FTCode,
    FtpDate
    )
    VALUES (
    @InventoryID,
    @StockType,
    @DMSStatus,
    @LotLocation,
    @TagLine,
    @Certification,
    @CertificationNumber,
    @CreatedDate,
    @LastModifiedDate,
    @ModifiedFlag,
    @PackageCode,
    @OrderType,
    @AgeDays,
    @OutstandingRO,
    @Memo1,
    @Memo2,
    @Purchaser,
    @PurchasedFrom,
    @DealerAccessoryCode,
    @InStock,
    @AccountingMake,
    @SS,
    @Account,
    @AccessoryCore,
    @ICompany,
    @InvenAcct,
    @Field23,
    @Field24,
    @SalesCode,
    @Draft,
    @FTCode,
    @FtpDate
    );
    END TRY
    BEGIN CATCH
    SELECT
    @errornumber = ERROR_NUMBER(),
    @errorseverity = ERROR_SEVERITY(),
    @errorstate = ERROR_STATE(),
    @errortable = 'DMS_INVENTORY_DETAILS',
    @errorprocedure = ERROR_PROCEDURE(),
    @errorline = ERROR_LINE(),
    @errormessage = ERROR_MESSAGE();
    --==========================================================================
    -- INSERT ERRORS INTO DMSLOG_INVENTORY_ERROR
    --==========================================================================
    EXECUTE [SP_DMS_INVENTORY_ERROR] @FileType,@ACDealerID,@ClientDealerID,@DMSType,@StockNumber,@InventoryDate,@StockType,@DMSStatus,@InvoicePrice,@CostPack,
    @SalesCost,@HoldbackAmount,@ListPrice,@MSRP,@LotLocation,@TagLine,@Certification,@CertificationNumber,@VehicleVIN,@VehicleYear,@VehicleMake,@VehicleModel,@VehicleModelCode,
    @VehicleTrim,@VehicleSubTrimLevel,@Classification,@TypeCode,@VehicleMileage,@EngineCylinderCount,@TransmissionType,@VehicleExteriorColor,@VehicleInteriorColor,
    @CreatedDate,@LastModifiedDate,@ModifiedFlag,@InteriorColorCode,@ExteriorColorCode,@PackageCode,@CodedCost,@Air,@OrderType,@AgeDays,@OutstandingRO,
    @DlrAccessoryRetail,@DlrAccessoryCost,@DlrAccessoryDesc,@ModelDesc,@Memo1,@Memo2,@Weight,@FloorPlan,@Purchaser,@PurchasedFrom,@InternetPrice,
    @InventoryAcctDollar,@VehicleType,@DealerAccessoryCode,@AllInventoryAcctDollar,@BestPrice,@InStock,@AccountingMake,@GasDiesel,@BookValue,
    @FactoryAccessoryDescription,@TotalReturn,@TotalCost,@SS,@VehicleBody,@StandardEquipment,@Account,@CalculatedPrice,@OriginalCost,@AccessoryCore,
    @OtherDollar,@PrimaryBookValue,@AmountDue,@LicenseFee,@ICompany,@InvenAcct,@Field23,@Field24,@SalesCode,@BaseRetail,@BaseInvAmt,@CommPrice,@Price1,
    @Price2,@StickerPrice,@TotInvAmt,@OptRetail,@OptInvAmt,@OptCost,@Options1,@Category,@Description,@Engine,@ModelType,@FTCode,@Wholesale,@Retail,@Draft,
    @ERRORNUMBER,@ERRORSEVERITY,@ERRORTABLE,@ERRORSTATE,@ERRORPROCEDURE,@ERRORLINE,@errormessage,@FtpDate
    END CATCH
    --==========================================================================
    -- INSERT INTO DMS_INVENTORY_AMOUNT (CHILD TABLE)
    --==========================================================================
    BEGIN TRY
    INSERT INTO [dbo].[DMS_INVENTORY_AMOUNT] (
    DMSInventoryID,
    AllInventoryAcctDollar,
    OtherDollar,
    PrimaryBookValue,
    AmountDue,
    LicenseFee,
    CalculatedPrice,
    OriginalCost,
    BookValue,
    TotalReturn,
    TotalCost,
    DlrAccessoryRetail,
    DlrAccessoryCost,
    DlrAccessoryDesc,
    InternetPrice,
    InventoryAcctDollar,
    BestPrice,
    Weight,
    FloorPlan,
    CodedCost,
    InvoicePrice,
    CostPack,
    SalesCost,
    HoldbackAmount,
    ListPrice,
    MSRP,
    BaseRetail,
    BaseInvAmt,
    CommPrice,
    Price1,
    Price2,
    StickerPrice,
    TotInvAmt,
    OptRetail,
    OptInvAmt,
    OptCost,
    Wholesale,
    Retail,
    FtpDate
    )
    VALUES (
    @InventoryID,
    @AllInventoryAcctDollar,
    @OtherDollar,
    @PrimaryBookValue,
    @AmountDue,
    @LicenseFee,
    @CalculatedPrice,
    @OriginalCost,
    @BookValue,
    @TotalReturn,
    @TotalCost,
    @DlrAccessoryRetail,
    @DlrAccessoryCost,
    @DlrAccessoryDesc,
    @InternetPrice,
    @InventoryAcctDollar,
    @BestPrice,
    @Weight,
    @FloorPlan,
    @CodedCost,
    @InvoicePrice,
    @CostPack,
    @SalesCost,
    @HoldbackAmount,
    @ListPrice,
    @MSRP,
    @BaseRetail,
    @BaseInvAmt,
    @CommPrice,
    @Price1,
    @Price2,
    @StickerPrice,
    @TotInvAmt,
    @OptRetail,
    @OptInvAmt,
    @OptCost,
    @Wholesale,
    @Retail,
    @FtpDate
    );
    END TRY
    BEGIN CATCH
    SELECT
    @errornumber = ERROR_NUMBER(),
    @errorseverity = ERROR_SEVERITY(),
    @errortable = 'DMS_INVENTORY_AMOUNT',
    @errorstate = ERROR_STATE(),
    @errorprocedure = ERROR_PROCEDURE(),
    @errorline = ERROR_LINE(),
    @errormessage = ERROR_MESSAGE();
    --==========================================================================
    -- INSERT ERRORS INTO DMSLOG_INVENTORY_ERROR
    --==========================================================================
    EXEC [SP_DMS_INVENTORY_ERROR] @FileType,@ACDealerID,@ClientDealerID,@DMSType,@StockNumber,@InventoryDate,@StockType,@DMSStatus,@InvoicePrice,@CostPack,
    @SalesCost,@HoldbackAmount,@ListPrice,@MSRP,@LotLocation,@TagLine,@Certification,@CertificationNumber,@VehicleVIN,@VehicleYear,@VehicleMake,@VehicleModel,@VehicleModelCode,
    @VehicleTrim,@VehicleSubTrimLevel,@Classification,@TypeCode,@VehicleMileage,@EngineCylinderCount,@TransmissionType,@VehicleExteriorColor,@VehicleInteriorColor,
    @CreatedDate,@LastModifiedDate,@ModifiedFlag,@InteriorColorCode,@ExteriorColorCode,@PackageCode,@CodedCost,@Air,@OrderType,@AgeDays,@OutstandingRO,
    @DlrAccessoryRetail,@DlrAccessoryCost,@DlrAccessoryDesc,@ModelDesc,@Memo1,@Memo2,@Weight,@FloorPlan,@Purchaser,@PurchasedFrom,@InternetPrice,
    @InventoryAcctDollar,@VehicleType,@DealerAccessoryCode,@AllInventoryAcctDollar,@BestPrice,@InStock,@AccountingMake,@GasDiesel,@BookValue,
    @FactoryAccessoryDescription,@TotalReturn,@TotalCost,@SS,@VehicleBody,@StandardEquipment,@Account,@CalculatedPrice,@OriginalCost,@AccessoryCore,
    @OtherDollar,@PrimaryBookValue,@AmountDue,@LicenseFee,@ICompany,@InvenAcct,@Field23,@Field24,@SalesCode,@BaseRetail,@BaseInvAmt,@CommPrice,@Price1,
    @Price2,@StickerPrice,@TotInvAmt,@OptRetail,@OptInvAmt,@OptCost,@Options1,@Category,@Description,@Engine,@ModelType,@FTCode,@Wholesale,@Retail,@Draft,
    @ERRORNUMBER,@ERRORSEVERITY,@ERRORTABLE,@ERRORSTATE,@ERRORPROCEDURE,@ERRORLINE,@errormessage,@FtpDate
    END CATCH
    --==========================================================================
    -- INSERT INTO DMS_INVENTORY_VEHICLE (CHILD TABLE)
    --==========================================================================
    BEGIN TRY
    INSERT INTO [dbo].[DMS_INVENTORY_VEHICLE] (
    DMSInventoryID,
    InteriorColorCode,
    ExteriorColorCode,
    Air,
    ModelDesc,
    VehicleType,
    VehicleVIN,
    VehicleYear,
    VehicleMake,
    VehicleModel,
    VehicleModelCode,
    VehicleTrim,
    VehicleSubTrimLevel,
    Classification,
    TypeCode,
    VehicleMileage,
    FtpDate,
    EngineCylinderCount
    )
    VALUES (
    @InventoryID,
    @InteriorColorCode,
    @ExteriorColorCode,
    @Air,
    @ModelDesc,
    @VehicleType,
    @VehicleVIN,
    @VehicleYear,
    @VehicleMake,
    @VehicleModel,
    @VehicleModelCode,
    @VehicleTrim,
    @VehicleSubTrimLevel,
    @Classification,
    @TypeCode,
    @VehicleMileage,
    @FtpDate,
    @EngineCylinderCount
    );
    END TRY
    BEGIN CATCH
    SELECT
    @errornumber = ERROR_NUMBER(),
    @errorseverity = ERROR_SEVERITY(),
    @errortable = 'DMS_INVENTORY_VEHICLE',
    @errorstate = ERROR_STATE(),
    @errorprocedure = ERROR_PROCEDURE(),
    @errorline = ERROR_LINE(),
    @errormessage = ERROR_MESSAGE();
    --==========================================================================
    -- INSERT ERRORS INTO DMSLOG_INVENTORY_ERROR
    --==========================================================================
    EXEC [SP_DMS_INVENTORY_ERROR] @FileType,@ACDealerID,@ClientDealerID,@DMSType,@StockNumber,@InventoryDate,@StockType,@DMSStatus,@InvoicePrice,@CostPack,
    @SalesCost,@HoldbackAmount,@ListPrice,@MSRP,@LotLocation,@TagLine,@Certification,@CertificationNumber,@VehicleVIN,@VehicleYear,@VehicleMake,@VehicleModel,@VehicleModelCode,
    @VehicleTrim,@VehicleSubTrimLevel,@Classification,@TypeCode,@VehicleMileage,@EngineCylinderCount,@TransmissionType,@VehicleExteriorColor,@VehicleInteriorColor,
    @CreatedDate,@LastModifiedDate,@ModifiedFlag,@InteriorColorCode,@ExteriorColorCode,@PackageCode,@CodedCost,@Air,@OrderType,@AgeDays,@OutstandingRO,
    @DlrAccessoryRetail,@DlrAccessoryCost,@DlrAccessoryDesc,@ModelDesc,@Memo1,@Memo2,@Weight,@FloorPlan,@Purchaser,@PurchasedFrom,@InternetPrice,
    @InventoryAcctDollar,@VehicleType,@DealerAccessoryCode,@AllInventoryAcctDollar,@BestPrice,@InStock,@AccountingMake,@GasDiesel,@BookValue,
    @FactoryAccessoryDescription,@TotalReturn,@TotalCost,@SS,@VehicleBody,@StandardEquipment,@Account,@CalculatedPrice,@OriginalCost,@AccessoryCore,
    @OtherDollar,@PrimaryBookValue,@AmountDue,@LicenseFee,@ICompany,@InvenAcct,@Field23,@Field24,@SalesCode,@BaseRetail,@BaseInvAmt,@CommPrice,@Price1,
    @Price2,@StickerPrice,@TotInvAmt,@OptRetail,@OptInvAmt,@OptCost,@Options1,@Category,@Description,@Engine,@ModelType,@FTCode,@Wholesale,@Retail,@Draft,
    @ERRORNUMBER,@ERRORSEVERITY,@ERRORTABLE,@ERRORSTATE,@ERRORPROCEDURE,@ERRORLINE,@errormessage,@FtpDate
    END CATCH
    --==========================================================================
    -- MOVE CURSOR TO NEXT RECORD
    --==========================================================================
    FETCH NEXT FROM Inventory_Cursor
    INTO @FileType ,
    @ACDealerID ,
    @ClientDealerID ,
    @DMSType ,
    @StockNumber ,
    @InventoryDate ,
    @StockType ,
    @DMSStatus ,
    @InvoicePrice ,
    @CostPack ,
    @SalesCost ,
    @HoldbackAmount ,
    @ListPrice ,
    @MSRP ,
    @LotLocation ,
    @TagLine ,
    @Certification ,
    @CertificationNumber ,
    @VehicleVIN ,
    @VehicleYear ,
    @VehicleMake ,
    @VehicleModel ,
    @VehicleModelCode ,
    @VehicleTrim ,
    @VehicleSubTrimLevel ,
    @Classification ,
    @TypeCode ,
    @VehicleMileage ,
    @EngineCylinderCount ,
    @TransmissionType ,
    @VehicleExteriorColor ,
    @VehicleInteriorColor ,
    @CreatedDate ,
    @LastModifiedDate ,
    @ModifiedFlag ,
    @InteriorColorCode ,
    @ExteriorColorCode ,
    @PackageCode ,
    @CodedCost ,
    @Air ,
    @OrderType ,
    @AgeDays ,
    @OutstandingRO ,
    @DlrAccessoryRetail ,
    @DlrAccessoryCost ,
    @DlrAccessoryDesc ,
    @ModelDesc ,
    @Memo1 ,
    @Memo2 ,
    @Weight ,
    @FloorPlan ,
    @Purchaser ,
    @PurchasedFrom ,
    @InternetPrice ,
    @InventoryAcctDollar ,
    @VehicleType ,
    @DealerAccessoryCode ,
    @AllInventoryAcctDollar ,
    @BestPrice ,
    @InStock ,
    @AccountingMake ,
    @GasDiesel ,
    @BookValue ,
    @FactoryAccessoryDescription ,
    @TotalReturn ,
    @TotalCost ,
    @SS ,
    @VehicleBody ,
    @StandardEquipment ,
    @Account ,
    @CalculatedPrice ,
    @OriginalCost ,
    @AccessoryCore ,
    @OtherDollar ,
    @PrimaryBookValue ,
    @AmountDue ,
    @LicenseFee ,
    @ICompany ,
    @InvenAcct ,
    @Field23 ,
    @Field24 ,
    @SalesCode ,
    @BaseRetail ,
    @BaseInvAmt ,
    @CommPrice ,
    @Price1 ,
    @Price2 ,
    @StickerPrice ,
    @TotInvAmt ,
    @OptRetail ,
    @OptInvAmt ,
    @OptCost ,
    @Options1 ,
    @Category ,
    @Description ,
    @Engine ,
    @ModelType ,
    @FTCode ,
    @Wholesale ,
    @Retail ,
    @Draft ,
    @flatfile_createddate,
    @FtpDate;
    END
    CLOSE Inventory_Cursor;
    DEALLOCATE Inventory_Cursor;
    SET ANSI_PADDING OFF
    END
    Arunraj Kumar
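    As an illustration of the set-based approach being asked about (a sketch only: it uses a handful of the columns from the procedure above, and it assumes that DMS_INVENTORY.ID is an IDENTITY column and that DMSDealerID + StockNumber + FtpDate identifies a flat-file row uniquely):
    DECLARE @new_parents TABLE (ID int, ClientDealerID varchar(50), StockNumber varchar(50), FtpDate date);
    -- 1. Insert all parent rows in one statement and capture the generated IDs.
    INSERT INTO dbo.DMS_INVENTORY (DMSDealerID, StockNumber, DMSType, InventoryDate, FtpDate)
    OUTPUT inserted.ID, inserted.DMSDealerID, inserted.StockNumber, inserted.FtpDate
    INTO @new_parents (ID, ClientDealerID, StockNumber, FtpDate)
    SELECT f.ClientDealerID, f.StockNumber, f.DMSType, f.InventoryDate, f.FtpDate
    FROM dbo.FLATFILE_INVENTORY AS f
    WHERE f.FtpDate = @Date_Specfic;
    -- 2. Insert the child rows by joining the flat file back to the captured IDs.
    INSERT INTO dbo.DMS_INVENTORY_VEHICLE (DMSInventoryID, VehicleVIN, VehicleYear, VehicleMake, FtpDate)
    SELECT p.ID, f.VehicleVIN, f.VehicleYear, f.VehicleMake, f.FtpDate
    FROM dbo.FLATFILE_INVENTORY AS f
    JOIN @new_parents AS p
      ON p.ClientDealerID = f.ClientDealerID
     AND p.StockNumber = f.StockNumber
     AND p.FtpDate = f.FtpDate;
    The same join can feed the other child tables (DMS_INVENTORY_DETAILS, DMS_INVENTORY_AMOUNT), replacing the row-by-row cursor entirely.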

    Thank you.
    And another question: if the data is already there in the child table, then when I try to load again it must delete the old data in the child table and then load the new data.
    How do I do this?
    Arunraj Kumar
    You can do that with an IF EXISTS condition:
    IF EXISTS (SELECT 1
    FROM YourChildTable c
    INNER JOIN @temptable t
    ON c.Bkey1 = t.Bkey1
    AND c.Bkey2 = t.Bkey2)
    DELETE c
    FROM YourChildTable c
    INNER JOIN @temptable t
    ON c.Bkey1 = t.Bkey1
    AND c.Bkey2 = t.Bkey2;
    INSERT INTO YourChildTable
    where Bkey1, Bkey2, etc. form the business key of the table
    Visakh

  • How to load metadata directly into essbase 9.3 ?

    Hi all,
    When I load metadata directly into Essbase with IKM SQL to Hyperion Essbase (METADATA), my metadata is already in Microsoft SQL Server 2000. After executing an interface that loads metadata into Essbase and checking the Essbase outline, it seems that the outline data could not be retrieved correctly, and there are still many rejected records.
    My question is :
    Can I use the same rules file that is used to load metadata from a text file?
    Because in this case I am using the same rules file that loads metadata from a text file.
    Let me know if need more information on this.
    Thank you in advance.

    Hi,
    You can use the same rules file; make sure that in the IKM options you have the same rules separator as in your rules file.
    You can also turn on error logging and logging in the IKM options, so you should get more information as to why the metadata load is failing.
    Cheers
    John
    http://john-goodwin.blogspot.com/
