Error in rcv_transactions_interface table but no record in po_interface_errors

Hi Everyone,
I'm having an issue with the receipts interface. I'm inserting data into the rcv_headers_interface and rcv_transactions_interface tables and then submitting the Receiving Transaction Processor program via fnd_request.submit_request. The import program completed with Normal status, but the status_flag in both interface tables is 'ERROR'. I tried to get the error message from the po_interface_errors table using the interface_header_id of rcv_headers_interface and also using the interface_transaction_id of rcv_transactions_interface.
There is no record in po_interface_errors table.
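For reference, the lookup I am running is roughly the following (a simplified sketch; the join column names are from my memory of the interface tables, so please correct them if they differ in your release):
SELECT pie.interface_type,
       pie.column_name,
       pie.error_message
  FROM po_interface_errors pie
 WHERE pie.interface_transaction_id IN
       (SELECT rti.interface_transaction_id
          FROM rcv_transactions_interface rti
         WHERE rti.header_interface_id = :my_interface_header_id);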
Is there any other table where I need to look for the error?
Please help me out.
Thanks
Sunny

Can you do this and let us know the results?
1. Change the profile value for RCV: Processing Mode to Batch.
2. Change the profile value for RCV: Debug Mode to Yes.
3. Perform the transaction.
4. Run the Receiving Transaction Processor.
5. Check the log file at the bottom.
6. If you can send the log file, we can assist.
Also, as a clue, this error looks like it is coming back from Inventory. Please check whether the item controls are set up correctly and the item attributes are enabled correctly, and that all the code combinations used on the PO distributions are still enabled, and so on.
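One way to check the code combinations on the PO distributions is something along these lines (just a sketch, assuming the standard PO_DISTRIBUTIONS_ALL and GL_CODE_COMBINATIONS columns; adjust as needed):
SELECT pod.po_distribution_id,
       gcc.code_combination_id,
       gcc.enabled_flag,
       gcc.end_date_active
  FROM po_distributions_all pod,
       gl_code_combinations gcc
 WHERE gcc.code_combination_id = pod.code_combination_id
   AND pod.po_header_id = :your_po_header_id
   AND (NVL(gcc.enabled_flag, 'N') <> 'Y'
        OR gcc.end_date_active < SYSDATE);
Any row returned points to a disabled or end-dated combination that could fail the receipt delivery.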
Thanks
Nagamohan

Similar Messages

  • ORA-06502 error with external table having long records

    I'm getting a strange error with the oci driver that I don't get with the thin driver.
    The basic situation is that we are using external tables, and both the oci and thin drivers had been working until we tested with a table that has longer records than the previous tables. The new table has 1800-byte records.
    The thin driver still works fine with the new table. However, the oci driver generates the following error with the new table:
    ORA-06502: PL/SQL: numeric or value error: character string buffer too small
    I suspect that with the oci driver, the OCI DLLs are reading the external file and can't handle the longer record length.
    Particulars
    - Oracle DB Server is 10.2.0.1 on SunOS 5.9
    - OCI Instant Client instantclient_10_2 on Windows XP
    - OCI client code from DB server installation running on DB server
    - Works with thin driver from Windows XP and on DB server machine for all record lengths
    - Works with both OCI drivers for records < 1800 bytes (don't know actual limit)
    - Fails with both OCI drivers for records = 1800 bytes.
    Does anyone out there have any thoughts?
    Thanks in advance.

    Your access parameters are in the wrong order. External tables are a bit fussy like that. Refer to the access_parameters section in the Utilities manual and follow the order there. From memory it will go something like this:
    RECORDS DELIMITED...
    LOG/BAD/DISCARDFILE...
    FIELDS TERMINATED...LDRTRIM
    MISSING FIELD VALUES...
       fields...
    )
    Regards...
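    For illustration, a full access parameters clause in that order might look like the following (the directory, file names, delimiter, and field sizes here are invented, not taken from the original post):
    CREATE TABLE ext_demo (
      col1 VARCHAR2(100),
      col2 VARCHAR2(1800)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        LOGFILE 'ext_demo.log'
        BADFILE 'ext_demo.bad'
        DISCARDFILE 'ext_demo.dsc'
        FIELDS TERMINATED BY '|' LDRTRIM
        MISSING FIELD VALUES ARE NULL
        (
          col1 CHAR(100),
          col2 CHAR(1800)
        )
      )
      LOCATION ('ext_demo.dat')
    )
    REJECT LIMIT UNLIMITED;
    Note that sizing the CHAR() field lengths explicitly also matters for long records, since the access driver's default field type is CHAR(255).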

  • Error OGG-01517  Position of first record processed

    INFO OGG-01517 Position of first record processed LSN: 0x00000008:00000038:0004, Jul 10, 2012 1:45:47 PM.
    TABLEWildcard resolved (entry DBO.*):
    table DBO.NEW TABLE;
    Source Context :
    SourceModule : [er.main]
    SourceID : [er/rep.c]
    SourceFunction : [get_map_entry]
    SourceLine : [8661]
    2012-07-10 13:55:37 ERROR OGG-00212 Invalid option for MAP: TABLE.
    2012-07-10 13:55:37 INFO OGG-00178 VAM Client Report <Last LSN Read: 00000008:0000003d:0002
    Open Transactions
    0000:0000025e (2012/07/10 13:55:35.943) @ 00000008:0000003d:0001: Upd(Comp) = 0(0), Row(comp) = 1(0)
    .2012-07-10 13:55:37 INFO OGG-00178 VAM Client Report <Sanity checking is not enabled.
    I tried to fix the error by editing the parameter file to use table dbo."New Table"; but then no records were extracted.

    Bug 13458343 is fixed in version 11.2.1.
    Please try it: change from version 11.1 to 11.2 and post your results.

  • Records stuck in rcv_transactions_interface table

    RVTTH-115f: Subroutine process_transaction() returned error
    Cause: Subroutine process_transaction() returned an internal error.
    Action: Note this error number and the actions you are trying to perform. Contact your system administrator.
    -- This is the error message in the po_interface_errors table for the records stuck in rcv_transactions_interface. We are unable to deliver the receipts made against the PO.
    The patch suggested on Metalink is already in place. Can someone please help me?
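    For reference, this is roughly how I am listing the stuck rows (column names are from my memory of rcv_transactions_interface, so treat it as a sketch):
    SELECT rti.interface_transaction_id,
           rti.transaction_type,
           rti.processing_status_code,
           rti.transaction_status_code
      FROM rcv_transactions_interface rti
     WHERE rti.processing_status_code IN ('ERROR', 'PENDING');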

    Can you do this and let us know the results?
    1. Change the profile value for RCV: Processing Mode to Batch.
    2. Change the profile value for RCV: Debug Mode to Yes.
    3. Perform the transaction.
    4. Run the Receiving Transaction Processor.
    5. Check the log file at the bottom.
    6. If you can send the log file, we can assist.
    Also, as a clue, this error looks like it is coming back from Inventory. Please check whether the item controls are set up correctly and the item attributes are enabled correctly, and that all the code combinations used on the PO distributions are still enabled, and so on.
    Thanks
    Nagamohan

  • Error on extend an internal table, but the required space was not available

    Hi All,
    I am trying to retrieve cost data from the COVP table, as well as the quantity, cost centre, and price unit fields from the MSEG / BSEG tables. The thing is, if the AWTYP value in COVP = 'MKPF', I have to get the data from MSEG; otherwise, I need to get it from the BSEG table.
    I kept getting this error message:
    <i>You attempted to extend an internal table, but the required space was not available.</i>
    Does anyone have any idea how to make my code more efficient?
    Please help. Thanks!
    Regards,
    Cecilia
    REPORT  ZMISCY004.
    TABLES : COVP, BSIS, BSEG, MSEG, MAKT, CSKU.
    PARAMETERS :
    P_FILE(300) DEFAULT '\sapdcdatasaprptoh.txt' LOWER CASE.
    DATA MSG_TEXT(50).
    DATA :
    XKOKRS LIKE COVP-KOKRS,
    XBELNR LIKE COVP-BELNR,
    XBUZEI LIKE COVP-BUZEI,
    XGJAHR LIKE COVP-GJAHR,
    XPERIO LIKE COVP-PERIO,
    XWKGBTR LIKE COVP-WKGBTR,
    XWTGBTR LIKE COVP-WTGBTR,
    XREFBN LIKE COVP-REFBN,
    XREFBZ LIKE COVP-REFBZ,
    XKSTAR LIKE COVP-KSTAR,
    XBEKNZ LIKE COVP-BEKNZ,
    XMATNR LIKE COVP-MATNR,
    XBUKRS LIKE COVP-BUKRS,
    XREFGJ LIKE COVP-REFGJ,
    XREFBK LIKE COVP-REFBK,
    XLTEXT LIKE CSKU-LTEXT,
    XMAKTX LIKE MAKT-MAKTX,
    XAWTYP LIKE COVP-AWTYP,
    XTWAER LIKE COVP-TWAER,
    XSGTXT LIKE COVP-SGTXT,
    BSEG_KOSTL LIKE BSEG-KOSTL,
    BSEG_MEINS LIKE BSEG-MEINS,
    BSEG_MENGE LIKE BSEG-MENGE,
    MSEG_KOSTL LIKE MSEG-KOSTL,
    MSEG_MENGE LIKE MSEG-MENGE,
    MSEG_MEINS LIKE MSEG-MEINS.
    DATA : BEGIN OF ITAB_COVP OCCURS 10,
    KOKRS LIKE COVP-KOKRS,
    BELNR LIKE COVP-BELNR,
    BUZEI LIKE COVP-BUZEI,
    GJAHR LIKE COVP-GJAHR,
    PERIO LIKE COVP-PERIO,
    WKGBTR(15) TYPE C,
    WTGBTR(15) TYPE C,
    REFBN LIKE COVP-REFBN,
    REFBZ(3) TYPE C,
    KSTAR LIKE COVP-KSTAR,
    BEKNZ LIKE COVP-BEKNZ,
    MATNR LIKE COVP-MATNR,
    BUKRS LIKE COVP-BUKRS,
    REFGJ LIKE COVP-REFGJ,
    REFBK LIKE COVP-REFBK,
    LTEXT LIKE CSKU-LTEXT,
    MAKTX LIKE MAKT-MAKTX,
    AWTYP LIKE COVP-AWTYP,
    TWAER LIKE COVP-TWAER,
    SGTXT LIKE COVP-SGTXT,
    BSIS_WRBTR(13) TYPE C,
    BSEG_KOSTL LIKE BSEG-KOSTL,
    BSEG_MEINS LIKE BSEG-MEINS,
    BSEG_MENGE LIKE BSEG-MENGE,
    MSEG_KOSTL LIKE MSEG-KOSTL,
    MSEG_MENGE LIKE MSEG-MENGE,
    MSEG_MEINS LIKE MSEG-MEINS,
    END OF ITAB_COVP.
    SELECT M1~KOKRS
           M1~BELNR
           M1~BUZEI
           M1~GJAHR
           M1~PERIO
           M1~WKGBTR
           M1~WTGBTR
           M1~REFBN
           M1~REFBZ
           M1~KSTAR
           M1~BEKNZ
           M1~MATNR
           M1~BUKRS
           M1~REFGJ
           M1~REFBK
           M1~AWTYP
           M1~TWAER
           M1~SGTXT
           M4~KOSTL
           M4~MENGE
           M4~MEINS
    INTO (XKOKRS,
          XBELNR,
          XBUZEI,
          XGJAHR,
          XPERIO,
          XWKGBTR,
          XWTGBTR,
          XREFBN,
          XREFBZ,
          XKSTAR,
          XBEKNZ,
          XMATNR,
          XBUKRS,
          XREFGJ,
          XREFBK,
          XAWTYP,
          XTWAER,
          XSGTXT,
          MSEG_KOSTL,
          MSEG_MENGE,
          MSEG_MEINS)
    FROM COVP AS M1
    LEFT OUTER JOIN MSEG AS M4
    ON M1~REFBN = M4~MBLNR AND M1~REFBZ = M4~ZEILE AND M1~REFGJ = M4~MJAHR
    WHERE M1~SCOPE = 'OCOST' AND M1~AWTYP = 'BKPF'
      OR M1~SCOPE = 'OCOST' AND M1~AWTYP = 'MKPF'
      OR M1~KSTAR = '972022'.
    IF XAWTYP = 'BKPF'.
    SELECT KOSTL MENGE MEINS INTO (BSEG_KOSTL, BSEG_MENGE, BSEG_MEINS) FROM
    BSEG WHERE BELNR = XREFBN AND BUZEI = XREFBZ AND GJAHR = XREFGJ AND
    BUKRS = XREFBK.
    ENDSELECT.
    MOVE BSEG-KOSTL TO BSEG_KOSTL.
    MOVE BSEG-MEINS TO BSEG_MEINS.
    MOVE BSEG-MENGE TO BSEG_MENGE.
    ELSE.
    MOVE ' ' TO BSEG_KOSTL.
    MOVE ' ' TO BSEG_MEINS.
    MOVE ' ' TO BSEG_MENGE.
    ENDIF.
    *GET LTEST
    SELECT LTEXT INTO XLTEXT
    FROM CSKU
    WHERE KTOPL = 'COAA' AND SPRAS = 'EN'.
    *GET MAKTX
    IF XMATNR <> ' '.
    SELECT SINGLE * FROM MAKT WHERE MATNR = XMATNR.
    MOVE MAKT-MAKTX TO XMAKTX.
    ELSE.
    MOVE ' ' TO XMAKTX.
    ENDIF.
    MOVE : XKOKRS TO ITAB_COVP-KOKRS,
           XBELNR TO ITAB_COVP-BELNR,
           XBUZEI TO ITAB_COVP-BUZEI,
           XGJAHR TO ITAB_COVP-GJAHR,
           XPERIO TO ITAB_COVP-PERIO,
           XWKGBTR TO ITAB_COVP-WKGBTR,
           XWTGBTR TO ITAB_COVP-WTGBTR,
           XREFBN TO ITAB_COVP-REFBN,
           XREFBZ TO ITAB_COVP-REFBZ,
           XKSTAR TO ITAB_COVP-KSTAR,
           XBEKNZ TO ITAB_COVP-BEKNZ,
           XMATNR TO ITAB_COVP-MATNR,
           XBUKRS TO ITAB_COVP-BUKRS,
           XREFGJ TO ITAB_COVP-REFGJ,
           XREFBK TO ITAB_COVP-REFBK,
           XLTEXT TO ITAB_COVP-LTEXT,
           XMAKTX TO ITAB_COVP-MAKTX,
           XAWTYP TO ITAB_COVP-AWTYP,
           XTWAER TO ITAB_COVP-TWAER,
           XSGTXT TO ITAB_COVP-SGTXT,
           BSEG_KOSTL TO ITAB_COVP-BSEG_KOSTL,
           BSEG_MEINS TO ITAB_COVP-BSEG_MEINS,
           BSEG_MENGE TO ITAB_COVP-BSEG_MENGE,
           MSEG_KOSTL TO ITAB_COVP-MSEG_KOSTL,
           MSEG_MENGE TO ITAB_COVP-MSEG_MENGE,
           MSEG_MEINS TO ITAB_COVP-MSEG_MEINS.
    APPEND ITAB_COVP.
    CLEAR ITAB_COVP.
    ENDSELECT.
    ENDSELECT.
    OPEN DATASET P_FILE FOR OUTPUT IN TEXT MODE.
    IF SY-SUBRC NE 0.
       WRITE: 'File cannot be opened. Reason:', MSG_TEXT.
       EXIT.
    ENDIF.
    LOOP AT ITAB_COVP.
    TRANSFER ITAB_COVP TO P_FILE.
    ENDLOOP.
    CLOSE DATASET P_FILE.

    Cecilia - I think your problem is a nested select:
    *GET LTEST
      SELECT ltext INTO xltext
      FROM csku
      WHERE ktopl = 'COAA' AND spras = 'EN'.
    *GET MAKTX
        IF xmatnr <> ' '.
          SELECT SINGLE * FROM makt WHERE matnr = xmatnr.
          MOVE makt-maktx TO xmaktx.
        ELSE.
          MOVE ' ' TO xmaktx.
        ENDIF.
        MOVE : xkokrs TO itab_covp-kokrs,
                     etc.
              mseg_meins TO itab_covp-mseg_meins.
        APPEND itab_covp.
        CLEAR itab_covp.
      ENDSELECT.
    Do you need to do the inner select for every cost element text?
    Rob

  • How to return zero for a column value when the record does not exist in the table, but return the column value when it exists

    Hello Everyone,
    How do I return zero for a column value when the record does not exist in the table, but return the actual column value when it does exist?
    Regards,
    Gautam S

    As per my understanding, you would like to see this:
    Code:
    SELECT COALESCE(Salary, 0) FROM Employee;
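    If the value has to come from a table where the row may not exist at all, the usual pattern is an outer join combined with COALESCE, for example (the table and column names below are only illustrative):
    SELECT e.employee_id,
           COALESCE(s.salary, 0) AS salary
      FROM employee e
      LEFT OUTER JOIN salary_detail s
        ON s.employee_id = e.employee_id;
    Rows with no matching salary_detail record show 0; rows with a match show the stored value.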
    Regards
    Shivaprasad S
    Please mark as answer if helpful
    Shiv

  • Error while updating table before branching to a report

    Hi,
    I have an APEX form page where, on click of a button, I need to change a few flags on the screen and then display a BI Publisher report.
    When the user clicks the PRINT button, a JavaScript function is called which sets the flags and submits the page. I have the default Process Row of Table process to update the form fields.
    In the branch I gave the BI Publisher report URL.
    Everything works fine so far.
    If I click the print button a second time, I get a checksum error as shown below.
    ORA-20001: Error in DML: p_rowid=982-000790, p_alt_rowid=_ID_NUMBER, p_rowid2=, p_alt_rowid2=. ORA-20001: Current version of data in database has changed since user initiated update process. current checksum = "BD63FDD3142B79017CCD2C8DA8ED8CA7" application checksum = "B2FD7581A9478214E59264F9C1CFAF96"
    Error Unable to process row of table .
    OK
    Any idea how to fix this?
    What I am trying to do is: when the print button is clicked, I need to set the print flag on the screen and in the database to true. I am using the default row update process generated by APEX for a table form.
    Thanks
    Knut

    Hi Scott,
    example 1
    I have a demo at the following url
    http://apex.oracle.com/pls/otn/f?p=30091:6:1206476651563662::NO:::
    1. Page 6 is the report and page 7 is the details form.
    2. Select a record from the report, click Edit, and it will take you to the details screen, page 7.
    3. On page 7 I have two select lists with a YES/NO flag. Initially set them to NO and save the changes.
    4. Click on the print button. It resets the select lists to YES and opens a report.
    5. Now go and edit anything on the screen and click Apply Changes. You will get an error.
    All I am trying to do is update the flags to YES before displaying the report, which works fine; but if I try to edit any data and save it, it throws an error.
    This is my table:
    CREATE TABLE "DEMO_MINISTER"
    (     "MINISTER_ID" VARCHAR2(12),
         "NAME" VARCHAR2(50),
         "CERT_FLAG" VARCHAR2(1),
         "LABEL_FLAG" VARCHAR2(1)
    MINISTER_ID" is the primary key generated using a sequence. (i have to actually generate it with - year prefix etc hence a varchar)
    example 2:
    http://apex.oracle.com/pls/otn/f?p=30091:1:607292687304632:::::
    I have another page, page 1, with the demo_customers list; here the ID is a number. On edit it displays page 5, where I tried the same print/reset-flag approach, and there it works.
    CREATE TABLE "DEMO_CUSTOMERS"
    (     "CUSTOMER_ID" NUMBER NOT NULL ENABLE,
         "CUST_FIRST_NAME" VARCHAR2(20) NOT NULL ENABLE,
         "CUST_LAST_NAME" VARCHAR2(20) NOT NULL ENABLE,
         "CUST_STREET_ADDRESS1" VARCHAR2(60),
         "CUST_STREET_ADDRESS2" VARCHAR2(60),
         "CUST_CITY" VARCHAR2(30),
         "CUST_STATE" VARCHAR2(2),
         "CUST_POSTAL_CODE" VARCHAR2(10),
         "PHONE_NUMBER1" VARCHAR2(25),
         "PHONE_NUMBER2" VARCHAR2(25),
         "CREDIT_LIMIT" NUMBER(9,2),
         "CUST_EMAIL" VARCHAR2(30),
         "PRINT_FLAG" VARCHAR2(1),
         "PRINT_LABEL_FLAG" VARCHAR2(1),
         CONSTRAINT "DEMO_CUST_CREDIT_LIMIT_MAX" CHECK (credit_limit <= 5000) ENABLE,
         CONSTRAINT "DEMO_CUSTOMERS_PK" PRIMARY KEY ("CUSTOMER_ID") ENABLE
    So what should I do to get example 1 to work?
    Thanks
    knut

  • STATUS field is blank on AP_INVOICES_INTERFACE table but REQUEST_ID is there

    Hi All,
    After running Payables Open Interface Import, I see that REQUEST_ID is populated on the AP_INVOICES_INTERFACE table, but the STATUS field is still blank, and I don't see any error details in the AP_INTERFACE_REJECTIONS table for these invoices either, and of course nothing in the base tables. I am under the impression that whenever this standard import is run, it updates REQUEST_ID along with the STATUS field, setting it to either PROCESSED or REJECTED. But in my case only REQUEST_ID is populated, not STATUS. Has anyone faced this, and what could be the cause?
    Thank you.

    Hello,
    Does the report output show whether records have been processed?
    Otherwise, check whether ORG_ID and SOURCE (note: the source name must be set up in the Payables lookups) are correctly populated in AP_INVOICES_INTERFACE.
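    A quick way to look at those columns for the affected run is something like this (a sketch using the standard interface columns):
    SELECT invoice_id, source, org_id, status, request_id
      FROM ap_invoices_interface
     WHERE request_id = :your_request_id;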
    Vik

  • Error while importing table having ctxcat index

    Hi
    I created a table and a ctxcat index on it, and exported them. While importing, the table got imported, but index creation gave error IMP-00017 with ORA-29855.
    The index appears to have been created, but with some errors: I can see the index in ALL_INDEXES, but if I try to find the entry in DBA_SEGMENTS, no record is found.
    It seems the index was not created properly.
    Please suggest.
    Regards
    Rajiv

    What are your source and target Oracle versions?
    1) make sure the context option is installed on the target DB
    2) create the CTXSYS user, the schema where the context-related objects reside
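    To verify point 1 on the target database, queries along these lines can be used (DBA_REGISTRY and DBA_USERS are the standard dictionary views; the exact COMP_ID value is from memory):
    SELECT comp_id, comp_name, version, status
      FROM dba_registry
     WHERE comp_id = 'CONTEXT';
    SELECT username, account_status
      FROM dba_users
     WHERE username = 'CTXSYS';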

  • Error: ORA-06502: PL/SQL: numeric or value error: NULL index table key value

    Hi,
    I am trying to create an interface which collects data from the database, makes some transformations, and populates seeded tables in the same database.
    My approach is:
    a) Create a record type variable (consisting of multiple segments)
    b) Create a PL/SQL table type of the record type in a)
    c) Create a cursor with the same structure as the PL/SQL table type (collects the data)
    d) BULK COLLECT the data from the cursor into the PL/SQL table
    e) Print the data
    But during the bulk collect I get the below error:
    Unexpected error in xxc_hr2hr_populate_elements.main at FLOW TRACE-1.120. Error: ORA-06502: PL/SQL: numeric or value error: NULL index table key value
    ORA-20003: Unexpected error in xxc_hr2hr_populate_elements.get_data at FLOW TRACE-1.1.100. Error: ORA-06502: PL/SQL: numeric or value error: Bulk Bind: Truncated Bind

    Ashish_Apps wrote:
    Hi,
    I am trying to create an interface which collects data from the database, makes some transformations, and populates seeded tables in the same database.
    My approach is:
    a) Create a record type variable (consisting of multiple segments)
    b) Create a PL/SQL table type of the record type in a)
    c) Create a cursor with the same structure as the PL/SQL table type (collects the data)
    d) BULK COLLECT the data from the cursor into the PL/SQL table
    e) Print the data
    But during the bulk collect I get the below error:
    Unexpected error in xxc_hr2hr_populate_elements.main at FLOW TRACE-1.120. Error: ORA-06502: PL/SQL: numeric or value error: NULL index table key value
    ORA-20003: Unexpected error in xxc_hr2hr_populate_elements.get_data at FLOW TRACE-1.1.100. Error: ORA-06502: PL/SQL: numeric or value error: Bulk Bind: Truncated Bind
    Verify that the rows you are fetching have the same data types as your record: dates should go to DATE fields, a decimal should not be fetched into a field defined as a whole NUMBER, etc.
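    As an illustration of that advice, here is a minimal pattern where the record type exactly mirrors the cursor's select list in order and data type (this is only a sketch against a dictionary view, not the original code):
    DECLARE
      TYPE obj_rec_t IS RECORD (
        object_name VARCHAR2(128),
        created     DATE,
        object_id   NUMBER
      );
      TYPE obj_tab_t IS TABLE OF obj_rec_t INDEX BY PLS_INTEGER;
      l_objs obj_tab_t;
      CURSOR c_objs IS
        SELECT object_name, created, object_id   -- same order and types as obj_rec_t
          FROM all_objects
         WHERE ROWNUM <= 100;
    BEGIN
      OPEN c_objs;
      FETCH c_objs BULK COLLECT INTO l_objs;
      CLOSE c_objs;
      DBMS_OUTPUT.put_line('Fetched ' || l_objs.COUNT || ' rows');
    END;
    /
    A "Truncated Bind" typically means one of the record fields (often a VARCHAR2) is shorter than the value the cursor returns, so sizing the fields from the source columns (or using %TYPE) avoids it.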

  • Dml error logging for tables in 10.2.0.5

    Hi experts,
    I have problems loading error records into the shadow table specified in the configuration window for tables. Upon executing the mapping, the execution results show warnings with the error records but these records were not moved to the shadow/error table.
    The actual scenario is: I specified a non-null data rule on the supplier_code column in the supplier table. Inside the mapping where I use this supplier table, I chose the option "Move to error" for this non-null data rule. Upon executing, it seems that OWB treats these as warnings rather than errors and thus does not propagate them to the error table. Does anyone have any suggestion as to how to make use of the error tables in OWB?

    DML error logging is generated for set-based PL/SQL mappings if the following conditions are satisfied:
    1. the Error table name property is set for the operator (table/view/mv)
    2. the PL/SQL Generated Mode of the module that contains the mapping is set to 10gR2 and above or Default.
    If the value is set to Default, ensure that location associated with the module has the Version set to 10.2 or above.
    When you use a data object in a mapping, the Error Table Name property for this data object is derived from the shadow table name property of the data object . If you modify the error table name of a data object (using the shadow table name property), you must synchronize all the operators bound to this data object.
    If you are still facing the issue then provide your email and I can send a simple example of data rule mapping in OWB 10.2.0.5
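    For what it's worth, the underlying mechanism can also be exercised directly in SQL, which helps confirm the shadow table is usable; a minimal sketch (SUPPLIER is from your scenario, while SUPPLIER_STG and the column list are just placeholders):
    -- create the error (shadow) table once; it defaults to ERR$_<table_name>
    BEGIN
      DBMS_ERRLOG.create_error_log(dml_table_name => 'SUPPLIER');
    END;
    /
    -- set-based insert that logs failing rows instead of aborting
    INSERT INTO supplier (supplier_id, supplier_code)
    SELECT supplier_id, supplier_code
      FROM supplier_stg
       LOG ERRORS INTO err$_supplier ('test load') REJECT LIMIT UNLIMITED;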
    Sutirtha

  • Database Adapter: cannot access table with complex record type as columns

    Hi all,
    I cannot perform any operations on a table that has columns with complex record type.
    I have created a table to store purchase order details.
    Sample script:
    CREATE TYPE XX_CUST_INFO_TYP AS OBJECT (
      ssn    VARCHAR2(20),
      rating NUMBER(15)
    );
    CREATE TYPE XX_ITEM_TYP AS OBJECT (
      item_name  VARCHAR2(20),
      unit_price NUMBER(15),
      quantity   NUMBER(15)
    );
    CREATE TABLE XX_PORDER (cust XX_CUST_INFO_TYP, porder XX_ITEM_TYP);
    When I try to access the table XX_PORDER in JDeveloper through a database adapter, I receive the error
    "some tables contain columns that are not recognized by the database adapter"
    1) So in this case, how do I include such tables that have complex types?
    Also, check out this scenario:
    1. Add a table through a database adapter.
    2. Drop the table in the backend.
    3. I can still see the table and its structure in the database adapter wizard, even after restarting JDeveloper. How is this possible?
    These are some really interesting scenarios to experiment with. Please suggest your ideas on this.
    Thanks All!

    Hi Hem,
    for a select you could select against a view. And for inserts you could create a stored procedure. They support complex types since 10.1.2. Complex types support in tables/views was added for 11 (next major release).
    You might be able to use PureSQL as a workaround too, i.e.
    insert into XX_PORDER values (XX_CUST_INFO_TYP(?,?), XX_ITEM_TYP(?, ?, ?))
    As for your other problem, in 10.1.2/10.1.3 the DBAdapter wizard sits on top of the Jdev Offline Tables and TopLink Mapping Workbench components. When you remove a table in the wizard it won't delete the Offline DB component. It was added by the wizard, but afterwards it is public to the entire Jdev project. You must remove it from Jdev yourself. This has been improved for the next major release too, no artifacts from underlying components are created.
    To remove it select:
    Offline DB Objects -> <schema> -> <table> and try File.. Erase From Disk.
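    Going back to the stored procedure route for inserts, a minimal wrapper over the types above might look like this (the procedure name and parameter list are only an example):
    CREATE OR REPLACE PROCEDURE xx_insert_porder (
      p_ssn        IN VARCHAR2,
      p_rating     IN NUMBER,
      p_item_name  IN VARCHAR2,
      p_unit_price IN NUMBER,
      p_quantity   IN NUMBER
    ) AS
    BEGIN
      -- build the object instances from scalar arguments the adapter can bind
      INSERT INTO xx_porder (cust, porder)
      VALUES (xx_cust_info_typ(p_ssn, p_rating),
              xx_item_typ(p_item_name, p_unit_price, p_quantity));
    END xx_insert_porder;
    /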
    Thanks
    Steve

  • SQL*Loader to append data into the same table but using different WHEN clauses

    In my data file i have a header record and a detail record identified by Record_type = 1 and 2 respectively.
    The database table has all the columns to capture the detail records, but now I also want to capture just one column of the header record in my existing table. So I have added that column (DATA_DATE) to my table, but how do I capture its value?
    I'm writing my control file using two WHEN clauses, something like:
    load data
    into table t_bdn
    append
    when RECORD_TYPE = '2'
    FIELDS TERMINATED BY "|" TRAILING NULLCOLS
    SEQUENCE_NO
    , RECORD_TYPE
    , DISTRIBUTOR_CODE
    , SUPPLIER_CODE
    , SUPPLIER_DISTRIBUTOR_CODE
    , DISTRIBUTOR_SKU
    , SUPPLIER_SKU
    when RECORD_TYPE = '1'
    FIELDS TERMINATED BY "|" TRAILING NULLCOLS
    SEQUENCE_NO FILLER
    , RECORD_TYPE FILLER
    , CREATE_DATE FILLER
    , DATA_DATE "NVL(to_date(:DATA_DATE, 'YYYY/MM/DD'),to_date('9999/12/31', 'YYYY/MM/DD'))"
    I'm getting the error "expecting INTO and found WHEN RECORD_TYPE = '1'".
    If I give INTO a second time, it will append a new row altogether to my table, but I want the same row to be updated with the DATA_DATE value coming from RECORD_TYPE = 1. The header record has only 4 delimited text fields and I am interested in fetching just the 4th column.
    Kindly suggest what to do.

    Ravneek, I could be wrong but sqlldr is a 'load' program, that is, it inserts data. I am unaware of any ability to update existing rows as you seem to want. What you appear to want to do is more the job of a merge statement.
    I would look at writing a pro* language, a .net, or a java program to perform inserts where some or all of the newly inserted rows are also to be updated.
    From the manual: (Oracle® Database Utilities 10g Release 2 (10.2) Part Number B14215-01)
    Updating Existing Rows
    The REPLACE method is a table replacement, not a replacement of individual rows. SQL*Loader does not update existing records, even if they have null columns. To update existing rows, use the following procedure:
    1. Load your data into a work table.
    2. Use the SQL language UPDATE statement with correlated subqueries.
    3. Drop the work table.
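    A rough sketch of step 2, assuming the single header row per file is loaded into a hypothetical work table T_BDN_HDR_WORK holding just DATA_DATE:
    -- copy the header's DATA_DATE onto the detail rows that were just loaded
    UPDATE t_bdn d
       SET d.data_date = (SELECT MAX(w.data_date) FROM t_bdn_hdr_work w)
     WHERE d.data_date IS NULL;
    -- then drop (or truncate) t_bdn_hdr_work before the next load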
    HTH -- Mark D Powell --

  • Error applying Adobe Reader 11.0.07 patch - "Error 2762. Cannot write script record. Transaction not started"

    I have a problem applying the "adobe reader 11.0.07" patch. I have downloaded "AdbeRdrUpd11007_MUI.msp" from Adobe site and tried applying the patch using the command.
    The following error is prompted
    "Error 2762. Cannot write script record. Transaction not started" at the end of installation, But the version is updated to "11.0.07" in ARP.
    This error is mainly related to the custom action sequence in the "Execute" sequence.
    I have created a log file and checked it; it points to the custom action "CreateAcroPDFRegForIE11".
    Could someone help me resolve this issue?
    Thanks in Advance..

    Hello Kashif,
    Find the answers below:
    1) How did you install reader MUI for example by running setup.exe or through an AIP or may be via SCCM or GPO? Any customizations during first install?
         The package is for SCCM and the installation is carried out through the MSI / AIP.
    2) Which version of Reader MUI do you have installed curently?
         Adobe Reader 11.0.05
    3) Which command did you use to run the patch?
         msiexec /p AdbeRdrUpd11007_MUI.msp
    4) Did you run the patch from an elevated command prompt?
         Yes
    5) Which OS version are you at?
         Windows 7 64 bit
    I am unable to share the log file; I am just copying the lines where the installation stops and prompts the error:
    MSI (s) (24:20) [11:26:44:071]: Doing action: CreateAcroPDFRegForIE11
    Action start 11:26:44: CreateAcroPDFRegForIE11.
    MSI (s) (24:20) [11:26:44:071]: Transforming table CustomAction.
    MSI (s) (24:20) [11:26:44:071]: Transforming table CustomAction.
    MSI (s) (24:20) [11:26:44:071]: Note: 1: 2262 2: CustomAction 3: -2147287038
    MSI (s) (24:20) [11:26:44:071]: Transforming table CustomAction.
    MSI (s) (24:20) [11:26:44:071]: Transforming table Binary.
    MSI (s) (24:20) [11:26:44:071]: Transforming table Binary.
    MSI (s) (24:20) [11:26:44:071]: Note: 1: 2262 2: Binary 3: -2147287038
    MSI (s) (24:20) [11:26:44:071]: Transforming table Binary.
    MSI (s) (24:20) [11:26:44:071]: Note: 1: 2762
    MSI (c) (80:E8) [11:26:44:243]: Font created.  Charset: Req=0, Ret=0, Font: Req=MS Shell Dlg, Ret=MS Shell Dlg
    Error 2762.Cannot write script record. Transaction not started.
    MSI (s) (24:20) [11:33:58:305]: Product: Adobe Reader XI (11.0.07)  MUI -- Error 2762.Cannot write script record. Transaction not started.
    Action ended 11:33:58: CreateAcroPDFRegForIE11. Return value 3.
    Thanks In Advance..

  • Dynamics Connector Error CRM to NAV, no integration record within filter

    While working to connect CRM Account records to the NAV Contact Card as Companies, we are experiencing an inner exception error that there is no integration record within the filter. Searching for a solution has not turned up a solid answer or path to resolution.
    We are mapping basic data fields: address/city/state/zip/email/website/account name/account number/... The only required fields in NAV on the Contact Card are No. and Name, which we populate. The Account record from CRM does get created in NAV on the Contact Card, BUT updates within those data fields will not take, and we get these errors (#1 & #2) in sequence. Error #3 below comes from an attempt at updating a field's data to sync from CRM Account to NAV after receiving errors #1 & #2. It should be noted that we are trying to send the Account record to the Contact Card and then to the Customer Card. The Customer Card record is created without issue, but we believe error #3 may be fallout from exception error #1.
    1) WARNING TID:18 [2013-09-23T11:19:02.5269850-05:00]: [Account to NAV Contact Card] has encountered an error while processing key [[Test Company, 59057fb9-2cdb-e111-86eb-00155d006a06]]. Exception occurred in Microsoft Dynamics NAV
     --- Exception Dump ---
     Caught Exception: Exception occurred in Microsoft Dynamics NAV
     Stack trace:
       at Microsoft.Dynamics.Integration.Adapters.Nav2009.NavObjectProvider.WriteObject(Object value)
       at Microsoft.Dynamics.Integration.Service.MapThread.ProcessReads()
     Inner Exception: There is no Integration Record within the filter.
    Filters: Record ID: Contact: 88888
     Stack trace:
       at System.Web.Services.Protocols.SoapHttpClientProtocol.ReadResponse(SoapClientMessage message, WebResponse response, Stream responseStream, Boolean asyncCall)
       at System.Web.Services.Protocols.SoapHttpClientProtocol.Invoke(String methodName, Object[] parameters)
       at Microsoft.Dynamics.Integration.Adapters.Nav2009.NavIntegrationHelper.InvokeUpdateMethod(Object& newobj, Boolean create, String integrationId, Boolean updateIntegrationId)
       at Microsoft.Dynamics.Integration.Adapters.Nav2009.NavIntegrationHelper.DoUpdate(Object newobj, Object obj, String integrationId, Boolean create, Boolean hasExistingLines)
       at Microsoft.Dynamics.Integration.Adapters.Nav2009.NavIntegrationHelper.DoWrite(Object newobj)
       at Microsoft.Dynamics.Integration.Adapters.Nav2009.NavObjectProvider.WriteDynamicObject(Object value)
       at Microsoft.Dynamics.Integration.Adapters.Nav2009.NavObjectProvider.WriteObject(Object value)
    2) [Account to NAV Contact Card] has encountered an error while processing key [[Test Company, 59057fb9-2cdb-e111-86eb-00155d006a06]]. The record already exists.
    3) [Account to NAV Contact Card] has encountered an error while processing key [[Test Company, 59057fb9-2cdb-e111-86eb-00155d006a06]]. Table Id 18 of a supplied key does not match Table Id 5050 of the page 5050  source table.
    Any direction to resolution would be much appreciated.
    Thanks,
    Troy

    We may have isolated the cause to the Type field on the Contact Card in NAV not accepting a "Company" value as sent through the Connector. There are two values on the Contact Card for Type, Person/Company; Person is accepted during the send and an Integration Record is created, but Company is not (even using it as the designed default doesn't work).
    Has anyone experienced NAV not accepting the Type code of Company on the Contact Card through the Connector from CRM Accounts, which in turn stops the creation of an Integration Record for the Account?
    Any help or input would be very beneficial.  Thanks.
