Access issues while inserting data into a table in the same schema

Hi All,
I have a script that first creates and then populates a table. My script used to run fine in the production environment until a few hours ago, but all of a sudden it is throwing an error while inserting data into the table.
Error message - "Insufficient privileges".
Please suggest what the reasons for this kind of error might be.
Thanks in advance

Sonika wrote:
Hi All,
I have a script that first creates and then populates a table. My script used to run fine in the production environment until a few hours ago, but all of a sudden it is throwing an error while inserting data into the table.
Error message - "Insufficient privileges".
Please suggest what the reasons for this kind of error might be.
1) something changed
2) you are hitting a bug
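If "something changed" is the likely cause, a quick sketch of what can be checked from the affected schema (all standard data dictionary views; nothing here is specific to the original script):
SELECT * FROM session_privs ORDER BY privilege;   -- privileges active in this session right now
SELECT * FROM user_sys_privs;                     -- system privileges granted directly to the user
SELECT * FROM user_role_privs;                    -- roles granted to the user (a revoked role is a common culprit)
Comparing the output against what the script needs (for example CREATE TABLE) usually shows what was revoked or changed.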

Similar Messages

  • Error while inserting data into a table.

    Hi All,
      I created a table. While inserting data into the table I am getting an error. It says "Create data Processing Function Module". Can anyone help me with this?
    Thanks in advance
    anirudh

    Hi Anirudh,
      It seems there is already an entry in the table with the same primary key.
    An INSERT statement will give a short dump if you try to insert data with the same key.
    Why don't you use a MODIFY statement to achieve the same thing?
    Reward points if this helps.
    Manish

  • Error while Inserting data into flow table

    Hi All,
    I am very new to ODI and I am facing a lot of problems in my first interface, so I have many questions here; please bear with me.
    ========================
    I am developing a simple Project to load a data from an input source file (csv) file into a staging table.
    My plan is to achieve this in 3 interfaces:
    1. Interface-1 : Load the data from an input source (csv) file into a staging table (say Stg_1)
    2. Interface-2 : Read the data from the staging table (stg_1) apply the business rules to it and copy the processed records into another staging table (say stg_2)
    3. Interface-3 : Copy the data from staging table (stg_2) into the target table (say Target) in the target database.
    Question-1 : Is this approach correct?
    ========================
    I don't have any key columns in the staging table (stg_1). When I tried to execute the Flow Control of this I got an error:
    Flow Control not possible if no Key is declared in your Target Datastore
    Based on one of the responses in this forum ("FLOW control requires a KEY in the target table") I introduced a column called "Record_ID", made it a primary key column in my staging table (stg_1), and my problem was resolved.
    Question-2 : Is a key column compulsory in the target table? In BO Data Integrator there is no such requirement ... I am a little confused.
    ========================
    Next, I have defined one project-level sequence. I have mapped the newly introduced key column Record_Id (primary key) to the project-level sequence. Now I got another error: "CKM not selected".
    For this, I inserted the "Insert Check (CKM)" knowledge module into my project. With this, the above problem of "CKM not selected" was resolved.
    Question-3 : When is this CKM knowledge module required?
    ========================
    After this, the flow/interface is failing while loading data into the intermediate ODI-created flow table (I$):
    1 - Loading - SS_0 - Drop work table
    2 - Loading - SS_0 - Create work table
    3 - Loading - SS_0 - Load data
    5 - Integration - FTE Actual data to Staging table - Drop flow table
    6 - Integration - FTE Actual data to Staging table - Create flow table I$
    7 - Integration - FTE Actual data to Staging table - Delete target table
    8 - Integration - FTE Actual data to Staging table - Insert flow into I$ table
    The error is at Step 8 above. When I opened the "Execution" tab for this step I found the message - "Missing parameter Project_1.FTE_Actual_Data_seq_NEXTVAL RECORD_ID".
    Question-4 : What/why is this error? Did I make any mistake while creating the sequence?

    Everyone is new and starts somewhere. And the community is there to help you.
    1.) What is the idea of moving data from stg_1 and then to stg_2? Do you really need it for any purpose other than moving data from SourceFile to the Target DB?
    Otherwise, it is simpler to move data from SourceFile -> Target Table.
    2.) Does your Target table have a Key?
    3.) CKM (Check KM) is required when you want to do constraint validation (checking) on your data. You can define constraints (business rules) on the target table, and Flow Control will check the data that is flowing from the source file to the target table using the CKM. All the records that do not satisfy the constraints will be added to E$ (the error table) and will not be added to the target table.
    4.) Try to avoid ODI sequences. They are slow and aren't scalable. Use a database sequence wherever possible, and reference the DB sequence in the target mapping as
    <%=odiRef.getObjectName( "L" , "MY_DB_Sequence_Row" , "D" )%>.nextval
    where MY_DB_Sequence_Row is the oracle sequence in the target schema.
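    A minimal sketch of that approach, assuming an Oracle target schema (the sequence name matches the placeholder used above and is otherwise hypothetical):
    CREATE SEQUENCE my_db_sequence_row
      START WITH 1
      INCREMENT BY 1
      CACHE 100;   -- caching keeps NEXTVAL cheap during bulk loads
    -- The mapping expression above then resolves to something equivalent to:
    -- INSERT INTO stg_1 (record_id, ...) VALUES (my_db_sequence_row.NEXTVAL, ...);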
    HTH

  • Issues while Populating data in Tree Table..

    Hi,
    I am using the Tree Table component to populate hierarchical data in it.
    I created data controls based on a web service proxy.
    While creating the Tree Table I selected the Display Attributes to show in the Tree Table.
    Now I am stuck with two requirements:
    1. At run time, the display attributes in the tree table appear mixed together.
    For example:
    Andrew
    --Phone
    ----Work 123456789 // two attribute values (phone type and phone number) are displayed with only a space between them
    ----Home 987654321 // I need a separator character between them, something like: Work - 123456789. Is that possible?
    --Email
    ----work [email protected]
    2. I need to show a popup on click of the root node of the tree table. In the above example, I want to handle a click event on "Andrew" so that I can open a popup and show some details.
    But when I try to insert a command link in the node, both parent and child nodes are populated with the command link. How can I have the command link only for the parent node?
    Code I am using:
    <af:treeTable value="#{bindings.contact.treeModel}" var="node"
                  selectionListener="#{bindings.contact.treeModel.makeCurrent}"
                  rowSelection="single"
                  binding="#{backingBeanScope.EditValidationDetails.tt1}"
                  id="tt1" width="920">
      <f:facet name="nodeStamp">
        <af:column id="c1" width="800" filterable="true">
          <af:commandLink text="#{node}" id="cl2"/>
        </af:column>
      </f:facet>
      <f:facet name="pathStamp">
        <af:outputText value="#{node}"
                       binding="#{backingBeanScope.EditValidationDetails.ot3}"
                       id="ot3"/>
      </f:facet>
    </af:treeTable>
    Thanks in Advance...
    Regards
    Thoom


  • Error in native SQL while inserting data into CORP table

    HI all,
    I am getting an exception while inserting records using native SQL into CORP table.
    PFB the code.
    LOOP AT gi_hrp1001 INTO wa_hrp1001.
      TRY.
            EXEC SQL.
              INSERT INTO misuser.Table_4
              VALUES (wa_hrp1001-otype , wa_hrp1001-objid ).
            ENDEXEC.
          CATCH cx_sy_native_sql_error.
            MESSAGE 'Connect - Error ' TYPE 'E'.
        ENDTRY.
      ENDLOOP.
    Please help.
    Thanks & Regards
    Nitesh

    I see two issues with your INSERT: 1) you need to specify the field names of the table, for example otype and objid; 2) you need to indicate in the VALUES clause that you're using host variables by prefixing them with a colon (:).
    I think the correct syntax is:
            EXEC SQL.
              INSERT INTO misuser.Table_4
             (otype, objid)
              VALUES (:wa_hrp1001-otype, :wa_hrp1001-objid)
            ENDEXEC.
    Good luck.

  • Performance issue while inserting data

    I have a .dat file containing the required data for the table. It has almost half a million rows. When I inserted the data using SQL*Loader it took
    a whopping 70 minutes. How can I insert the data faster? What should I include in my control file to speed up the load?
    Or is there any other method?
    oracle - 11gR2
    Can't use external tables as the data is not of fixed length.
    Edited by: Rahul_India on Oct 6, 2012 1:56 AM

    True but irrelevant.
    If the reason for the 70 minute load is CPU starvation the load time will remain 70 minutes.
    If the reason for the 70 minute load time is that a DBA implemented workload management the load time will remain 70 minutes.
    If the reason for the 70 minute load time is network saturation the load time will remain 70 minutes.
    If the reason for the 70 minute load time is the SAN bus is saturated the load time will remain 70 minutes.
    If the OP is on a RAC cluster and loading is happening on the wrong node causing remastering the load time will remain 70 minutes.
    and I could add many dozens more to this list without breaking a sweat.
    None of us has any basis for making any recommendation, and again my intent here is not to be rude, but sending the OP off on a wild goose chase saying DIRECT LOAD is faster (of course it is), or external tables are faster, or whatever, has no value if you do not first stop and ask the single most important question: "Why is the load taking 70 minutes?" That is a question none of us can answer.
    What I am encouraging you to do is stop giving advice when you don't have a single byte of relevant information upon which to base a recommendation.
    Are you seriously suggesting parallel execution? I do have experience with it. If CPU starvation is the root cause, I recommend you try to work through the impact of your suggestion ... you will bring the server to its knees, begging for mercy. So again, please stop making recommendations until the OP responds with facts.
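    For what it's worth, a sketch of the kind of first-step diagnosis being asked for here: while the load is running, look at what the loader session is actually waiting on. The views are standard; the filter on PROGRAM is an assumption about how the sqlldr session registers itself, so adjust it to your environment.
    SELECT s.sid,
           e.event,
           e.total_waits,
           ROUND(e.time_waited_micro / 1e6, 1) AS seconds_waited
      FROM v$session s
      JOIN v$session_event e ON e.sid = s.sid
     WHERE s.program LIKE 'sqlldr%'
     ORDER BY e.time_waited_micro DESC;
    Whether I/O waits, commit waits, or CPU dominates leads to very different fixes, which is exactly the point being made above.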

  • Performance issues while querying data from a table with a large number of records

    Hi all,
    I have a performance issue with queries on the mtl_transaction_accounts table, which has around 48,000,000 rows. One of the queries is as below:
    SQL ID: 98pqcjwuhf0y6 Plan Hash: 3227911261
    SELECT SUM (B.BASE_TRANSACTION_VALUE)
    FROM
    MTL_TRANSACTION_ACCOUNTS B , MTL_PARAMETERS A  
    WHERE A.ORGANIZATION_ID =    B.ORGANIZATION_ID 
    AND A.ORGANIZATION_ID =  :b1 
    AND B.REFERENCE_ACCOUNT =    A.MATERIAL_ACCOUNT 
    AND B.TRANSACTION_DATE <=  LAST_DAY (TO_DATE (:b2 ,   'MON-YY' )  )  
    AND B.ACCOUNTING_LINE_TYPE !=  15  
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.00       0.00          0          0          0           0
    Execute      3      0.02       0.05          0          0          0           0
    Fetch        3    134.74     722.82     847951    1003824          0           2
    total        7    134.76     722.87     847951    1003824          0           2
    Misses in library cache during parse: 1
    Misses in library cache during execute: 2
    Optimizer mode: ALL_ROWS
    Parsing user id: 193  (APPS)
    Number of plan statistics captured: 1
    Rows (1st) Rows (avg) Rows (max)  Row Source Operation
             1          1          1  SORT AGGREGATE (cr=469496 pr=397503 pw=0 time=237575841 us)
        788242     788242     788242   NESTED LOOPS  (cr=469496 pr=397503 pw=0 time=337519154 us cost=644 size=5920 card=160)
             1          1          1    TABLE ACCESS BY INDEX ROWID MTL_PARAMETERS (cr=2 pr=0 pw=0 time=59 us cost=1 size=10 card=1)
             1          1          1     INDEX UNIQUE SCAN MTL_PARAMETERS_U1 (cr=1 pr=0 pw=0 time=40 us cost=0 size=0 card=1)(object id 181399)
        788242     788242     788242    TABLE ACCESS BY INDEX ROWID MTL_TRANSACTION_ACCOUNTS (cr=469494 pr=397503 pw=0 time=336447304 us cost=643 size=4320 card=160)
       8704356    8704356    8704356     INDEX RANGE SCAN MTL_TRANSACTION_ACCOUNTS_N3 (cr=28826 pr=28826 pw=0 time=27109752 us cost=28 size=0 card=7316)(object id 181802)
    Rows     Execution Plan
          0  SELECT STATEMENT   MODE: ALL_ROWS
          1   SORT (AGGREGATE)
    788242    NESTED LOOPS
          1     TABLE ACCESS   MODE: ANALYZED (BY INDEX ROWID) OF
                    'MTL_PARAMETERS' (TABLE)
          1      INDEX   MODE: ANALYZED (UNIQUE SCAN) OF
                     'MTL_PARAMETERS_U1' (INDEX (UNIQUE))
    788242     TABLE ACCESS   MODE: ANALYZED (BY INDEX ROWID) OF
                    'MTL_TRANSACTION_ACCOUNTS' (TABLE)
    8704356      INDEX   MODE: ANALYZED (RANGE SCAN) OF
                     'MTL_TRANSACTION_ACCOUNTS_N3' (INDEX)
    Elapsed times include waiting on following events:
      Event waited on                             Times   Max. Wait  Total Waited
      ----------------------------------------   Waited  ----------  ------------
      row cache lock                                 29        0.00          0.02
      SQL*Net message to client                       2        0.00          0.00
      db file sequential read                    847951        0.40        581.90
      latch: object queue header operation            3        0.00          0.00
      latch: gc element                              14        0.00          0.00
      gc cr grant 2-way                               3        0.00          0.00
      latch: gcs resource hash                        1        0.00          0.00
      SQL*Net message from client                     2        0.00          0.00
      gc current block 3-way                          1        0.00          0.00
    ********************************************************************************
    On a 5-node RAC environment the program completes in 15 hours, whereas on a single-node environment the program completes in 2 hours.
    Is there any way I can improve the performance of this query?
    Regards
    Edited by: mhosur on Dec 10, 2012 2:41 AM
    Edited by: mhosur on Dec 10, 2012 2:59 AM
    Edited by: mhosur on Dec 11, 2012 10:32 PM

    CREATE INDEX mtl_transaction_accounts_n0
      ON mtl_transaction_accounts (
                                   transaction_date
                                 , organization_id
                                 , reference_account
                                 , accounting_line_type
                                 )
    /

  • Compare data between two tables of same schema

    Folks,
    I have one very interesting problem which I would like to share with you all, and I am looking forward to a solution.
    Scenario
    I have two tables, say TableA and TableB, both having the same structure, as below:
    TableA
    Col1 VARCHAR(10)
    Col2 INT
    TableB
    Col1 VARCHAR(10)
    Col2 INT
    I want to compare the data between these two tables and store the comparison result in a third table. Let me explain the whole scenario.
    TableA
    ColA          ColB
    INDIA          1
    PAKistan      2
    TableB
    ColA          ColB
    INDIA          1
    PAK             3
    I want result like
    Difference
    ColA          ColB
    True            0
    False           -1
    I want to store this difference in the third table.
    i.e. when comparing text, I need TRUE when the values match 100%, else FALSE; case is not considered.
         When comparing numeric values, a simple subtraction is required: TableA - TableB.
    Note - I don't want to use any external tool to compare the table data; I need a SQL query to do the same.
    Thanks
    Amit Srivastava
    Amit
    Please mark as answer if helpful
    http://fascinatingsql.wordpress.com/

    Assuming the abbreviations of the countries in TableB are the first three letters of the country name (*), here's a suggestion:
    -- code #1 v2
    INSERT into [Difference] (Col1, Col2, ACol1, BCol1)
    SELECT case when A.Col1 = B.Col1 then 'true' else 'false' end,
    (IsNull(A.Col2, 0) - IsNull(B.Col2, 0)), A.Col1, B.Col1
    from TableA as A full outer join
    TableB as B on (A.Col1 = B.Col1
    or Left(A.Col1, 3) = B.Col1);
    Is the COLLATE database case insensitive? If not, the code #1 above will have to be modified, using the upper () function or using COLLATE case insensitive in A.Col1 and B.Col1 columns.
    But if the country abbreviations follow the
    ISO 3166-1 alpha-3 standard, a fourth table containing the code and name of each country will be required.
    -- code #2 v2
    ;with
    TableB_2 as (
    SELECT case when Len(Col1) = 3
    then (SELECT Country_name from [ISO 3166-1 a3] where Cod = Col1)
    else Col1 end as Col1, Col2
    from TableB
    )
    INSERT into [Difference] (Col1, Col2, ACol1, BCol1)
    SELECT case when A.Col1 = B.Col1 then 'true' else 'false' end,
    (IsNull(A.Col2, 0) - IsNull(B.Col2, 0)), A.Col1, B.Col1
    from TableA as A full outer join
    TableB_2 as B on A.Col1 = B.Col1;
    Structure and data to test:
    use tempdb;
    CREATE TABLE TableA (Col1 varchar(10), Col2 int);
    CREATE TABLE TableB (Col1 varchar(10), Col2 int);
    CREATE TABLE [Difference] (Col1 varchar(10), Col2 int, ACol1 varchar(10), BCol1 varchar(10));
    INSERT into TableA values ('INDIA', 1), ('PAKistan', 2), ('China', 12);
    INSERT into TableB values ('INDIA', 1), ('PAK', 3), ('Bhutan', 3);
    go
    CREATE TABLE [ISO 3166-1 a3] (Cod char(3) primary key, [Country_name] varchar(30));
    INSERT into [ISO 3166-1 a3] values
    ('IND', 'India'), ('PAK', 'Pakistan'), ('CHN', 'China'), ('BGD', 'Bangladesh'),
    ('BTN', 'Bhutan'), ('MMR', 'Myanmar'), ('NPL', 'Nepal');
    go
    (*) If the short form of the country name uses the first three letters of the country name,
    false positives can occur. For example,
    Mali and Malta, or
    Angola and Anguilla.
    José Diz     Belo Horizonte, MG - Brasil

  • ORA-00600: internal error code while inserting data in table

    hi gems..
    i am getting the below error while inserting data into a table:
    ORA-00600: internal error code, arguments: [kqd-objerror$ ] , , [0], [98], [BIN$sm1O+fYhF1jgRAAhKNYyZA==$0], [], [], [], [], [], [], []
    I can select from the table without any problem but cannot insert data (and this is the schema owner, so the data should get inserted).
    I have checked the alert.log; the entries in the last few lines are like this:
    <msg time='2011-11-25T03:08:55.763+05:30' org_id='oracle' comp_id='clients'
    type='UNKNOWN' level='16' host_id='ICS167DOR'
    host_addr='10.184.134.139'>
    <txt>Directory does not exist for read/write [oracle/ora11g/app/ora11g/product/11.2.0/dbhome_1/log] [oracle/ora11g/app/ora11g/product/11.2.0/dbhome_1/log/diag/clients]
    </txt>
    </msg>
    please help...thanks in advance
    Edited by: user12780416 on Nov 25, 2011 3:29 AM

    hi...
    finally I got the solution. I know that this problem may also occur for other reasons for different users, but the problem that caused the developers to face the error in this case is below:
    they faced the error while importing dumps into the server. At the same time the application developers reported that they could select from the tables but could not insert any data.
    after listening to this, I assumed that this might be a space problem with the SYSTEM tablespace, as it is responsible for storing the data dictionary.
    I checked the free space for the SYSTEM tablespace and found the reason: it had only 0.2% left.
    I told them to issue the resize command for the system01.dbf datafile (allocated 2GB more) and the problem got resolved.
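    For reference, a sketch of the check and the fix described above (the datafile path is an example only; use the actual path of system01.dbf and a size that suits your system):
    SELECT tablespace_name,
           ROUND(SUM(bytes) / 1024 / 1024) AS free_mb
      FROM dba_free_space
     WHERE tablespace_name = 'SYSTEM'
     GROUP BY tablespace_name;
    ALTER DATABASE DATAFILE '/u01/oradata/ORCL/system01.dbf' RESIZE 4G;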
    Hope this helps..thanks

  • Issue while inserting and updating data in DB tables

    Hello all,
    i am having an issue while inserting data into a DB table.
    my scenario is DB1 to DB2. I have a sender channel with a select query which fetches data from DB1 and inserts it into DB2.
    so the select query will fetch the records that were INSERTED into DB1 and the records that were UPDATED in DB1, and these need to be inserted/updated in the DB2 table.
    Now the issue is that I am able to insert the records but not able to update the records in the DB2 table, due to a primary key issue.
    In message mapping:
    sender message type is as follows:
    <src_message1>
    ----<row>
    -------<fieldA>
    -------<filedB>
    -------<filedC>
    Receiver message type as follows:
    <trgt_message1>
    ----<STATEMENT_1>
    ----<TABLE_NAME>
    ----<ACTION> INSERT
    ----<TABLE>
    ----<ACCESS>
    ----<field1> primary key
    ----<field2>
    ----<field3>
    ----<field4>
    ----<KEY>
    ----<field1>
    ----<field2>
    ----<field3>
    ----<field4>
    my query in the sender channel is: select fieldA, fieldB, fieldC from test_table where createdate=sysdate or updatedate=sysdate
    so it fetches the data from DB1 and inserts it into DB2, but it does not update the records in DB2, due to the primary key issue.
    please suggest how to solve this ... will it be solved by using UPDATE_INSERT for the action?
    Best Regards,SARAN

    Hi Nagarjuna,
    I have done the following changes to the target mapping structure:
    1. action as UPDATE_INSERT
    2. in the Access tab, I mapped fieldDate to field4.
    3. in the Key tab, I assigned sysdate to field4.
    but the issue still exists. Could you please check whether my above changes are correct? If they are wrong, please provide the details of what needs to be done.
    thanks in advance.
    I'm providing the error details again:
    my query in the sender channel is: select fieldA, fieldB, fieldC, fieldDate from TEST_TABLE where fieldDate=sysdate or updatedate=sysdate
    it returns 4 records as follows:
    fieldA    fieldB    fieldC    fieldDate
    1001      EU        1         2011-11-10
    1002      CN        0         2011-11-10
    1003      AP        1         2008-03-15 (already exists in DB2)
    1004      JP        1         2007-04-12 (already exists in DB2)
    the first two records were created today, and for the remaining 2 records fieldC was updated from 0 to 1 (in DB1).
    while inserting these 4 records into DB2, we get the following error: "java.sql.SQLException: ORA-00001: unique constraint (data.TEST_TABLE_PK) violated".
    Best Regards,SARAN
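    For reference (this is not the PI configuration itself): with the action UPDATE_INSERT, the JDBC receiver first tries an UPDATE using the columns listed under <KEY> and falls back to an INSERT when no row matches. In plain Oracle SQL the net effect is similar to a MERGE; a minimal sketch using the field names from the post, assuming fieldA is the column behind TEST_TABLE_PK:
    MERGE INTO test_table t
    USING (SELECT 1003 AS fielda, 'AP' AS fieldb, 1 AS fieldc,
                  DATE '2008-03-15' AS fielddate
             FROM dual) s
       ON (t.fielda = s.fielda)
     WHEN MATCHED THEN
       UPDATE SET t.fieldb = s.fieldb, t.fieldc = s.fieldc, t.fielddate = s.fielddate
     WHEN NOT MATCHED THEN
       INSERT (fielda, fieldb, fieldc, fielddate)
       VALUES (s.fielda, s.fieldb, s.fieldc, s.fielddate);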

  • Error while inserting data using EXECUTE IMMEDIATE into a dynamic table in Oracle

    Error while inserting data using EXECUTE IMMEDIATE into a dynamic table created in Oracle 11g.
    First, the dynamic nested table (op_sample) was created using EXECUTE IMMEDIATE...
    The object is:
    CREATE OR REPLACE TYPE ASI.sub_mark AS OBJECT (
    mark1 number,
    mark2 number
    );
    t_sub_mark is a collection type of sub_mark:
    CREATE OR REPLACE TYPE ASI.t_sub_mark is table of sub_mark;
    create table sam1(id number,name varchar2(30));
    nested table is created below:
    begin
    EXECUTE IMMEDIATE ' create table '||op_sample||'
    (id number,name varchar2(30),subject_obj t_sub_mark) nested table subject_obj store as nest_tab return as value';
    end;
    now data from sam1 table and object (subject_obj) are inserted into the dynamic table
    declare
    subject_obj t_sub_mark;
    begin
    subject_obj:= t_sub_mark();
    EXECUTE IMMEDIATE 'insert into op_sample (select id,name,subject_obj from sam1) ';
    end;
    and got the below error:
    ORA-00904: "SUBJECT_OBJ": invalid identifier
    ORA-06512: at line 7
    Then, when we tried to insert the data into the dynam_table with the subject_marks object as null, we received the following error:
    execute immediate 'insert into '||dynam_table ||'
    (SELECT

    887684 wrote:
    ORA-00904: "SUBJECT_OBJ": invalid identifier
    ORA-06512: at line 7
    The problem is that your variable subject_obj is not in scope inside the dynamic SQL you are building. The SQL engine does not know your PL/SQL variable, so it tries to find a column named SUBJECT_OBJ in your SAM1 table.
    If you need to use dynamic SQL for this, then you must bind the variable. Something like this:
    EXECUTE IMMEDIATE 'insert into op_sample (select id, name, :bind_subject_obj from sam1)' USING subject_obj;
    Alternatively, you might figure out how to use static SQL rather than dynamic SQL (if possible for your project). In static SQL the PL/SQL engine binds the variables for you automatically.
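    A slightly fuller sketch of the bound version, assuming op_sample was created with the three columns (id, name, subject_obj) from the dynamic CREATE TABLE above, and that it is run as the type owner (ASI) or with the type names qualified; the sample marks are made up:
    DECLARE
      v_marks t_sub_mark := t_sub_mark(sub_mark(90, 80));  -- collection value to insert
    BEGIN
      EXECUTE IMMEDIATE
        'INSERT INTO op_sample (id, name, subject_obj)
           SELECT id, name, :marks FROM sam1'
        USING v_marks;
    END;
    /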

  • Error while selecting data from external table

    Hello all,
    I am getting the following error while selecting data from an external table. Any idea why?
    SQL> CREATE TABLE SE2_EXT (SE_REF_NO VARCHAR2(255),
      2        SE_CUST_ID NUMBER(38),
      3        SE_TRAN_AMT_LCY FLOAT(126),
      4        SE_REVERSAL_MARKER VARCHAR2(255))
      5  ORGANIZATION EXTERNAL (
      6    TYPE ORACLE_LOADER
      7    DEFAULT DIRECTORY ext_tables
      8    ACCESS PARAMETERS (
      9      RECORDS DELIMITED BY NEWLINE
    10      FIELDS TERMINATED BY ','
    11      MISSING FIELD VALUES ARE NULL
    12      (
    13        country_code      CHAR(5),
    14        country_name      CHAR(50),
    15        country_language  CHAR(50)
    16      )
    17    )
    18    LOCATION ('SE2.csv')
    19  )
    20  PARALLEL 5
    21  REJECT LIMIT UNLIMITED;
    Table created.
    SQL> select * from se2_ext;
    SQL> select count(*) from se2_ext;
    select count(*) from se2_ext
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04043: table column not found in external source: SE_REF_NO
    ORA-06512: at "SYS.ORACLE_LOADER", line 19

    It would appear that your external table definition and the external data file do not match up. Post a few input records so someone can duplicate the problem and determine the fix.
    HTH -- Mark D Powell --
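    In this case the field list inside ACCESS PARAMETERS (country_code, country_name, country_language) does not name the table's own columns, which is what triggers KUP-04043 for SE_REF_NO. A sketch of a matching definition, assuming the CSV really contains the four SE_* values in that order:
    CREATE TABLE se2_ext (
      se_ref_no          VARCHAR2(255),
      se_cust_id         NUMBER(38),
      se_tran_amt_lcy    FLOAT(126),
      se_reversal_marker VARCHAR2(255)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_tables
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
        (
          se_ref_no          CHAR(255),
          se_cust_id         CHAR(255),
          se_tran_amt_lcy    CHAR(255),
          se_reversal_marker CHAR(255)
        )
      )
      LOCATION ('SE2.csv')
    )
    PARALLEL 5
    REJECT LIMIT UNLIMITED;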

  • Insert data from a table

    Hi,
    I would like to insert data in a table. The data are selected by a query. I wrote this code but there is an error:
    PROCEDURE GLMASTER_SAVE IS
    BEGIN
      INSERT INTO GLMASTER_SAVE
        (COMPANY, ACCT_UNIT, ACCOUNT, SUB_ACCOUNT, ACTIVE_STATUS)
      VALUES
        (SELECT COMPANY, ACCT_UNIT, ACCOUNT, SUB_ACCOUNT, ACTIVE_STATUS
           FROM GLMASTER
          ORDER BY COMPANY, ACCT_UNIT, ACCOUNT, SUB_ACCOUNT);
      COMMIT;
    END;
    Do you know why?
    Thank you for your help.
    Patrick

    try this:
    INSERT INTO GLMASTER_SAVE
      (COMPANY, ACCT_UNIT, ACCOUNT, SUB_ACCOUNT, ACTIVE_STATUS)
    SELECT COMPANY, ACCT_UNIT, ACCOUNT, SUB_ACCOUNT, ACTIVE_STATUS
      FROM GLMASTER
     ORDER BY COMPANY, ACCT_UNIT, ACCOUNT, SUB_ACCOUNT;
    you do not need to include the VALUES keyword when the rows come from a SELECT.
    Message was edited by:
    Warren Tolentino
    sorry ps while posting this you had already posted your solution.

  • Inserting Data into nested table

    I am exploring the differences between OBJECT & RECORD.
    As I am still in the process of learning, I found that both are structures which basically group elements (columns) of different datatypes; one is used in SQL and the other is used in PL/SQL. Please correct me if I am wrong in my understanding.
    Below I am trying to insert data into a table of an object type but I am unsuccessful; can you please help?
    CREATE OR REPLACE type sam as OBJECT
    (
    v1 NUMBER,
    v2 VARCHAR2(20 CHAR)
    );
    ---Nested Table---
    create or replace type t_sam as table of sam;
    --Inserting data----
    insert into table(t_sam) values(sam(10,'Dsouza'));
    Error Message:
    Error starting at line 22 in command:
    insert into table(t_sam) values(sam(10,'Dsouza'))
    Error at Command Line:22 Column:13
    Error report:
    SQL Error: ORA-00903: invalid table name
    00903. 00000 -  "invalid table name"
    *Cause:   
    *Action:

    Ariean wrote:
    So the only purpose of the SQL-level nested table type is to use it as one of the data types while defining an actual table?
    Sort of - you can definitely use them for more than just "defining an actual table". (I'm fairly certain you could pass a nested table into a procedure, for example - try it, though - I'm not 100% sure on that - it just "makes sense". If you can define a type, you can use it, pass it around, whatever.).
    Ariean wrote:
    And that nested table could be a record in SQL, or an object in PL/SQL, or just a simple datatype (number, varchar, etc.)?
    Nested tables are just like any other custom data type. You can create a nested table of other data types. You can create a custom data type of nested tables.
    It could get stupidly .. er, stupid O_0
    CREATE TYPE o_myobj1 AS object ( id1   number, cdate1  date );
    CREATE TYPE t_mytype1 AS table of o_myobj1;
    CREATE TYPE o_myobj2 AS object ( id2   number,  dumb  t_mytype1 );
    CREATE TYPE t_dumber AS table of o_myobj2;
    O_0
    Ok, my brain's starting to hurt - I hope you get the idea
    Ariean wrote:
    Secondly is my understanding correct about OBJECT & RECORD?
    I can't think of any benefit of describing it another way.
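    For the insert the original poster was attempting: a nested table type cannot be INSERTed into on its own; it has to be a column of a real table. A minimal working sketch using the types from the question (the table and column names here are made up):
    CREATE TABLE student_marks (
      id    NUMBER,
      name  VARCHAR2(30),
      marks t_sam
    ) NESTED TABLE marks STORE AS student_marks_nt;
    INSERT INTO student_marks (id, name, marks)
    VALUES (1, 'Dsouza', t_sam(sam(10, 'Dsouza')));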

  • Issue while inserting a BDC program in Inbound Proxy (JDBC --> PI --> SAP)

    The scenario is jdbcsender --> sappi --> inboundproxy (ECC 6.0).
    The issue is related to the SAP Plant Maintenance module, where we have a requirement to create Maintenance Items
    programmatically from (SQL Database) legacy data for one of the interfaces.
    Since there are no standard BAPIs/IDocs or function modules available on the SAP side for creating the Maintenance Item, I
    have written a BDC program, and with the help of a SUBMIT statement in the inbound proxy program I am calling the BDC program to
    create the Maintenance Item.
    When I tested the program independently on the proxy side, the Maintenance Item gets created successfully, but when I
    executed it end-to-end, i.e. SQL DATABASE -> SAP PI -> SAP, the message gets stuck in the queue and the queue is stopped on the SAP
    ECC side; the status of the message is "Scheduled" in the SXMB_MONI transaction of SAP ECC.
    As the message is in scheduled state, multiple Maintenance Items are getting created with the same values.
    Has anyone encountered this type of issue while calling a BDC program from an inbound proxy? Please help
    in fixing this issue. FYI, I am using SAP PI 7.0 with service pack 24 and ECC 6.0.
    Waiting for your kind expert guidance...
    cheers,
    Ram

    Raj,
    Thanks for the reply. I have tried registering the queues but the problem is still the same. The message got stuck in the ECC queue, showing the message below:
    function module                                    StatusText
    SXMS_ASYNC_EXEC                  connection closed (no data)
    I have searched the forums specifically for this issue but no one has provided an answer for it.
    Thanks and Regards
    Ram
