Data is not getting reconciled with the R/3 Source

Hi Gurus,
I am trying to extract data from R/3, and I have extracted data from R/3 to the PSA (no target has been created yet).
The records in R/3 are around 1003 (2LIS_11_VAITM), but after extraction the data records in the PSA are around 5000+. I don't have any transformations in between; it is just a replica of the data from R/3 to the PSA. Can anybody clear up this issue, please?
Surabhi

Hi Surabhi,
Check RSA3 and increase the packet size to the maximum number.
Also check the PSA for any existing data before loading to the PSA.
Is it a full load? If yes (and you are in DEV or Quality), try deleting the data in the setup tables and filling the setup tables again.
Regards,
Siv

Similar Messages

  • Data is not getting replicated to the destination db.

    I have set up Streams replication on two databases running Oracle 10.1.0.2 on Windows, following the Metalink doc's steps for setting up one-way replication between two Oracle databases using Streams at the schema level.
    I entered a few records in the source db, but the data is not getting replicated to the destination db. Could you please guide me on how to analyse this problem and reach a solution?
    Steps for configuration, as followed from the Metalink doc:
    ==================
    Set up ARCHIVELOG mode.
    Set up the Streams administrator.
    Set initialization parameters.
    Create a database link.
    Set up source and destination queues.
    Set up supplemental logging at the source database.
    Configure the capture process at the source database.
    Configure the propagation process.
    Create the destination table.
    Grant object privileges.
    Set the instantiation system change number (SCN).
    Configure the apply process at the destination database.
    Start the capture and apply processes.
    Section 2 : Create user and grant privileges on both Source and Target
    2.1 Create Streams Administrator :
    connect SYS/password as SYSDBA
    create user STRMADMIN identified by STRMADMIN;
    2.2 Grant the necessary privileges to the Streams Administrator :
    In 9i :
    GRANT CONNECT, RESOURCE, AQ_ADMINISTRATOR_ROLE, DBA to STRMADMIN;
    In 10g :
    GRANT CONNECT, RESOURCE, AQ_ADMINISTRATOR_ROLE, DBA to STRMADMIN;
    execute DBMS_STREAMS_AUTH.GRANT_ADMIN_PRIVILEGE('STRMADMIN');
    2.3 Create streams queue :
    connect STRMADMIN/STRMADMIN
    BEGIN
       DBMS_STREAMS_ADM.SET_UP_QUEUE(
          queue_table => 'STREAMS_QUEUE_TABLE',
          queue_name  => 'STREAMS_QUEUE',
          queue_user  => 'STRMADMIN');
    END;
    /
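    To confirm that the queue and queue table were actually created before moving on, you can check the AQ dictionary views (a quick sketch, run as a privileged user; DBA_QUEUES and DBA_QUEUE_TABLES are the standard views):
    -- Verify that the Streams queue and its queue table exist and are enabled
    SELECT owner, name, queue_table, enqueue_enabled, dequeue_enabled
      FROM dba_queues
     WHERE owner = 'STRMADMIN';
    SELECT owner, queue_table, type
      FROM dba_queue_tables
     WHERE owner = 'STRMADMIN';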
    Section 3 : Steps to be carried out at the Destination Database PLUTO
    3.1 Add apply rules for the Schema at the destination database :
    BEGIN
       DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
          schema_name     => 'SCOTT',
          streams_type    => 'APPLY',
          streams_name    => 'STRMADMIN_APPLY',
          queue_name      => 'STRMADMIN.STREAMS_QUEUE',
          include_dml     => true,
          include_ddl     => true,
          source_database => 'REP2');
    END;
    /
    3.2 Specify an 'APPLY USER' at the destination database:
    This is the user who would apply all DML statements and DDL statements.
    The user specified in the APPLY_USER parameter must have the necessary
    privileges to perform DML and DDL changes on the apply objects.
    BEGIN
       DBMS_APPLY_ADM.ALTER_APPLY(
          apply_name => 'STRMADMIN_APPLY',
          apply_user => 'SCOTT');
    END;
    /
    3.3 Start the Apply process :
    DECLARE
       v_started number;
    BEGIN
       SELECT decode(status, 'ENABLED', 1, 0) INTO v_started
         FROM DBA_APPLY WHERE APPLY_NAME = 'STRMADMIN_APPLY';
       IF (v_started = 0) THEN
          DBMS_APPLY_ADM.START_APPLY(apply_name => 'STRMADMIN_APPLY');
       END IF;
    END;
    /
    Section 4 :Steps to be carried out at the Source Database REP2
    4.1 Move LogMiner tables from SYSTEM tablespace:
    By default, all LogMiner tables are created in the SYSTEM tablespace.
    It is a good practice to create an alternate tablespace for the LogMiner
    tables.
    CREATE TABLESPACE LOGMNRTS DATAFILE 'logmnrts.dbf' SIZE 25M AUTOEXTEND ON
    MAXSIZE UNLIMITED;
    BEGIN
       DBMS_LOGMNR_D.SET_TABLESPACE('LOGMNRTS');
    END;
    /
    4.2 Turn on supplemental logging for DEPT and EMPLOYEES table :
    connect SYS/password as SYSDBA
    ALTER TABLE scott.dept ADD SUPPLEMENTAL LOG GROUP dept_pk(deptno) ALWAYS;
    ALTER TABLE scott.EMPLOYEES ADD SUPPLEMENTAL LOG GROUP dep_pk(empno) ALWAYS;
    Note: If the number of tables is large, supplemental logging can be set at the database level instead.
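    For reference, the database-level alternative to the per-table log groups above might look like this (a sketch only; the exact supplemental logging options depend on the release):
    -- Minimal plus primary-key supplemental logging for the whole database
    ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
    ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS;
    -- Verify
    SELECT supplemental_log_data_min, supplemental_log_data_pk FROM v$database;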
    4.3 Create a database link to the destination database :
    connect STRMADMIN/STRMADMIN
    CREATE DATABASE LINK PLUTO connect to
    STRMADMIN identified by STRMADMIN using 'PLUTO';
    Test that the database link is working properly by querying against the destination database.
    Eg : select * from global_name@PLUTO;
    4.4 Add capture rules for the schema SCOTT at the source database:
    BEGIN
       DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
          schema_name     => 'SCOTT',
          streams_type    => 'CAPTURE',
          streams_name    => 'STREAM_CAPTURE',
          queue_name      => 'STRMADMIN.STREAMS_QUEUE',
          include_dml     => true,
          include_ddl     => true,
          source_database => 'REP2');
    END;
    /
    4.5 Add propagation rules for the schema SCOTT at the source database.
    This step will also create a propagation job to the destination database.
    BEGIN
       DBMS_STREAMS_ADM.ADD_SCHEMA_PROPAGATION_RULES(
          schema_name            => 'SCOTT',
          streams_name           => 'STREAM_PROPAGATE',
          source_queue_name      => 'STRMADMIN.STREAMS_QUEUE',
          destination_queue_name => 'STRMADMIN.STREAMS_QUEUE@PLUTO',
          include_dml            => true,
          include_ddl            => true,
          source_database        => 'REP2');
    END;
    /
    Section 5 : Export, import and instantiation of tables from
    Source to Destination Database
    5.1 If the objects are not present in the destination database, perform
    an export of the objects from the source database and import them
    into the destination database
    Export from the Source Database:
    Specify the OBJECT_CONSISTENT=Y clause on the export command.
    By doing this, an export is performed that is consistent for each
    individual object at a particular system change number (SCN).
    exp USERID=SYSTEM/manager@rep2 OWNER=SCOTT FILE=scott.dmp
    LOG=exportTables.log OBJECT_CONSISTENT=Y STATISTICS = NONE
    Import into the Destination Database:
    Specify STREAMS_INSTANTIATION=Y clause in the import command.
    By doing this, the streams metadata is updated with the appropriate
    information in the destination database corresponding to the SCN that
    is recorded in the export file.
    imp USERID=SYSTEM@pluto FULL=Y CONSTRAINTS=Y FILE=scott.dmp IGNORE=Y
    COMMIT=Y LOG=importTables.log STREAMS_INSTANTIATION=Y
    5.2 If the objects are already present in the destination database, there
    are two ways of instantiating the objects at the destination site.
    1. By means of a metadata-only export/import :
    Specify ROWS=N during Export.
    Specify IGNORE=Y during Import along with the above import parameters.
    2. By manually instantiating the objects :
    Get the Instantiation SCN at the source database:
    connect STRMADMIN/STRMADMIN@source
    set serveroutput on
    DECLARE
       iscn NUMBER; -- Variable to hold instantiation SCN value
    BEGIN
       iscn := DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER();
       DBMS_OUTPUT.PUT_LINE('Instantiation SCN is: ' || iscn);
    END;
    /
    Instantiate the objects at the destination database with
    this SCN value. The SET_TABLE_INSTANTIATION_SCN procedure
    controls which LCRs for a table are to be applied by the
    apply process. If the commit SCN of an LCR from the source
    database is less than or equal to this instantiation SCN,
    then the apply process discards the LCR. Else, the apply
    process applies the LCR.
    connect STRMADMIN/STRMADMIN@destination
    BEGIN
       DBMS_APPLY_ADM.SET_SCHEMA_INSTANTIATION_SCN(
          source_schema_name   => 'SCOTT',
          source_database_name => 'REP2',
          instantiation_scn    => &iscn);
    END;
    /
    Enter value for iscn:
    <Provide the value of SCN that you got from the source database>
    Note: In 9i, you must instantiate each table individually.
    In 10g, the recursive => true parameter of DBMS_APPLY_ADM.SET_SCHEMA_INSTANTIATION_SCN
    is used for the instantiation.
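    Based on the note above, the 10g schema-level instantiation might look like the sketch below (recursive => true also sets the instantiation SCN for each table in the schema; verify the parameter against your exact release):
    BEGIN
       DBMS_APPLY_ADM.SET_SCHEMA_INSTANTIATION_SCN(
          source_schema_name   => 'SCOTT',
          source_database_name => 'REP2',
          instantiation_scn    => &iscn,
          recursive            => true);  -- instantiate the schema and its tables in one call
    END;
    /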
    Section 6 : Start the Capture process
    begin
    DBMS_CAPTURE_ADM.START_CAPTURE(capture_name => 'STREAM_CAPTURE');
    end;
    /

    Same problem, data not replicated.
    It is captured and propagated from the source, but not applied.
    There are also no apply errors in DBA_APPLY_ERROR. It looks like the LCRs propagated from the source db do not reach the target queue. Can I get any help on this? (Some diagnostic queries are sketched after the status output below.)
    The queried results are as follows:
    1. At source (capture process):
    Capture Process: CP01, Session ID: 16, Serial Number: 7, State: CAPTURING CHANGES, Total Redo Entries Scanned: 1010143, Total LCRs Enqueued: 72
    2. Data propagated from source:
    Total Time Executing in Seconds: 7, Total Events Propagated: 13, Total Bytes Propagated: 6731
    3. Apply at target (nothing is applied):
    Coordinator Process: A001, Session ID: 154, Serial Number: 33, State: APPLYING, Total Trans Received: 0, Total Trans Applied: 0, Total Apply Errors: 0
    4. At target (nothing in the buffer):
    Queue Owner: STRMADMIN, Queue Name: STREAMS_QUEUE, Captured LCRs in Memory: 0, Spilled LCRs: 0, Total Captured LCRs in Buffered Queue: 0
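    Since the apply coordinator shows zero transactions received, the propagation leg is the usual suspect. A minimal set of diagnostic queries, assuming the 10g Streams dictionary views (DBA_CAPTURE, DBA_PROPAGATION, DBA_QUEUE_SCHEDULES, DBA_APPLY, DBA_APPLY_ERROR); exact column names can differ slightly between releases:
    -- On the source database (REP2): capture health and propagation job errors
    SELECT capture_name, status FROM dba_capture;
    SELECT propagation_name, destination_dblink, rule_set_name FROM dba_propagation;
    SELECT qname, destination, failures, last_error_msg
      FROM dba_queue_schedules
     WHERE qname = 'STREAMS_QUEUE';  -- non-zero FAILURES or a LAST_ERROR_MSG usually points at the db link
    -- On the destination database (PLUTO): apply status and any errors
    SELECT apply_name, status FROM dba_apply;
    SELECT apply_name, local_transaction_id, error_message FROM dba_apply_error;
    If the queue schedule shows failures, re-test the STRMADMIN database link to PLUTO (select * from global_name@PLUTO;) and check that the propagation's destination queue really is STRMADMIN.STREAMS_QUEUE@PLUTO.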

  • Data is not getting displayed in the report from an Infoset.

    Hi All,
    I have a report based on an InfoSet. This report displays data in the Dev environment. When it is transported to QA, it does not display data in BEx or in RSRT in the QA environment. The patch levels of Dev and QA are the same, and the queries are the same in Dev and QA as well.
    When trying to display data from the InfoSet (right-click > display data), I am able to view the data in QA.
    Could anyone please suggest why the data is not getting displayed in the Query Designer?
    Thanks & Regards,
    A.V.N.Rao

    Hi Ashish,
    I ran the "ZPS/!ZPS" in RSRT where ZPS is the infoset name. In Dev, it displayed the values. In QA, it displayed the below messages:
    ECharacteristic 0TCAKYFNM does not exist. Check authorizations
    WThere are calculated elements. These results are bracketed [  ]
    and below that, it displayed the values for Number of records. But, it has not displayed the values for the other figures.
    Does this has any impact in QA.
    Thanks & Regards,
    AVN Rao.

  • Master data is not getting enabled in the InfoSet.

    Hi Experts,
    I have 3 DSOs in an InfoSet, and I have also brought 2 master data InfoObjects into this InfoSet.
    The master data checkbox in the InfoSet is not enabled, and the master data is also not displayed in the Query Designer.
    Can anyone please let me know how I can bring the master data into the query?
    Also, please advise whether I should have only 3 DSOs in the InfoSet.
    Please help.
    Thanks

    Hi,
    In my DSO I have an InfoObject called Emp No. in the data fields.
    Emp No. is maintained as master data with Emp Name, Address, Telephone No and DOB as attributes.
    I have loaded the data into the Emp No. master and then tried loading the transaction data into the DSO.
    Emp No. is there in the DSO active data, but it is not displayed in the Query Designer.
    Hope this is clear.
    Please help.
    Thanks

  • Data is not getting loaded in the Cube

    Hi Experts,
    I had a cube for which I had created aggregates; then I deleted the data in the cube, made some changes to it, and am now trying to load the data again from the PSA.
    But the data is not getting loaded into the cube.
    Can anyone tell me whether I have to delete the aggregates before I load fresh data into my cube? If yes, can you tell me how to delete them and load again?
    If it is something else, please help me resolve my issue.
    Thanks

    Hi,
    Deactivate the aggregates and then load the data from the PSA to the cube.
    Then reactivate the aggregates; while reactivating them, you can also have them filled.
    Regards,
    Anil Kumar Sharma. P

  • Master data is not getting displayed in the Query Designer

    Hi,
    I have a DSO in which I have an InfoObject called Emp No. in the data fields.
    Emp No. is maintained as master data with Emp Name, Address, Telephone No and DOB as attributes.
    I have loaded the data into the Emp No. master and then tried loading the transaction data into the DSO.
    Emp No. is there in the DSO active data, but it is not displayed in the Query Designer.
    Hope this is clear.
    Please help.
    Thanks

    Hi,
    I have brought the Emp No. into the key fields and have also activated the master data again.
    Yet my Query Designer does not have the Emp No.
    I have done a full load for both master and transaction data.
    Please advise me on any other alternative.
    Thanks

  • If Records of different list items are entered, then the data is not getting inserted in the table.

    Hi Everyone,
    A Very Very Happy, Fun-filled, Awesome New Year to You All.
    Now coming to the discussion of my problem in Oracle Forms 6i:
    I have created a form in which the data is entered & saved in the database.
    CREATE TABLE MATURED_FD_DTL (
      ACCT_FD_NO    VARCHAR2(17 BYTE)               NOT NULL,
      CUST_CODE     NUMBER(9),
      FD_AMT        NUMBER(15),
      FD_INT_BAL    NUMBER(15),
      TDS           NUMBER(15),
      CHQ_NO        NUMBER(10),
      CREATED_DATE  DATE,
      CREATED_BY    VARCHAR2(15 BYTE),
      PREV_YR_TDS   NUMBER(15),
      ADD_FD_AMT    NUMBER(15),
      DESCRIPTION   VARCHAR2(100 BYTE),
      P_SAP_CODE    NUMBER(10),
      P_TYPE        VARCHAR2(1 BYTE)
    );
    The form looks like below (buttons across the top, then one item per line):
    ENTER_QUERY     EXECUTE_QUERY     SAVE     CLEAR     EXIT
    ACCT_FD_NO
    CUST_CODE
    FD_AMT
    FD_INT_BAL
    PREV_YR_TDS
    TDS
    ADD_FD_AMT
    P_SAP_CODE
    P_TYPE (list item: R / W / P)
    CHQ_NO
    DESCRIPTION
    There are 5 push buttons, namely ENTER_QUERY, EXECUTE_QUERY, SAVE, CLEAR and EXIT.
    The fields are the same as the columns of the table above. All the fields are text items, except P_TYPE, which is a list item (the elements in the list item are R, W & P).
    The user will enter the data and save it, so all of this gets saved in the table MATURED_FD_DTL.
    I am also updating one column in another table named KEC_FDACCT_MSTR,
    and I want these details to be inserted into another table named KEC_FDACCT_DTL only if P_TYPE = 'P'.
    CREATE TABLE KEC_FDACCT_DTL (
      FD_SR_NO                NUMBER(8)             NOT NULL,
      FD_DTL_SL_NO            NUMBER(5),
      ACCT_FD_NO              VARCHAR2(17 BYTE)     NOT NULL,
      FD_AMT                  NUMBER(15,2),
      INT_RATE                NUMBER(15,2),
      SAP_GLCODE              NUMBER(10),
      CATOGY_NAME             VARCHAR2(30 BYTE),
      PROCESS_YR_MON          NUMBER(6),
      INT_AMT                 NUMBER(16,2),
      QUTERLY_FD_AMT          NUMBER(16,2),
      ITAX                    NUMBER(9,2),
      MATURITY_DT             DATE,
      FDR_STAUS               VARCHAR2(2 BYTE),
      PAY_ACC_CODE            VARCHAR2(85 BYTE),
      BANK_CODE               VARCHAR2(150 BYTE),
      NET_AMOUNT_PAYABLE      NUMBER,
      QUATERLY_PAY_DT         DATE,
      CHEQUE_ON               VARCHAR2(150 BYTE),
      CHEQUE_NUMBER           VARCHAR2(10 BYTE),
      CHEQUE_DATE             DATE,
      MICR_NUMBER             VARCHAR2(10 BYTE),
      PAY_TYPE                VARCHAR2(3 BYTE),
      ADD_INT_AMT             NUMBER(16,2),
      ADD_QUTERLY_FD_AMT      NUMBER(16,2),
      ADD_ITAX                NUMBER(16,2),
      ECS_ADD_INT_AMT         NUMBER(16),
      ECS_ADD_QUTERLY_FD_AMT  NUMBER(16),
      ECS_ADD_ITAX            NUMBER(16)
    );
    So for the push button 'SAVE', I have put the following code in the WHEN-BUTTON-PRESSED trigger:
    BEGIN
         Commit_form;
              UPDATE KEC_FDACCT_MSTR SET PAY_STATUS='P' WHERE ACCT_FD_NO IN (SELECT ACCT_FD_NO FROM MATURED_FD_DTL);
              UPDATE MATURED_FD_DTL SET CREATED_DATE=sysdate, CREATED_BY = :GLOBAL.USER_ID WHERE ACCT_FD_NO = :acct_fd_NO;
    IF :P_TYPE='P' THEN
         INSERT INTO KEC_FDACCT_DTL
              SELECT FD_SR_NO, NULL, MATURED_FD_DTL.ACCT_FD_NO, FD_AMT, INT_RATE, P_SAP_CODE,
                   GROUP_TYPE, (TO_CHAR(SYSDATE, 'YYYYMM'))PROCESS_YR_MON,
                   FD_INT_BAL, (FD_INT_BAL-MATURED_FD_DTL.TDS)QUTERLY_FD_AMT , MATURED_FD_DTL.TDS,
                   MATURITY_DATE, P_TYPE, NULL, NULL, (FD_INT_BAL-MATURED_FD_DTL.TDS)NET_AMOUNT_PAYABLE,
                   NULL, NULL, CHQ_NO, SYSDATE, NULL, 'CHQ', NULL, NULL, NULL, NULL, NULL, NULL
              FROM MATURED_FD_DTL, KEC_FDACCT_MSTR
         WHERE KEC_FDACCT_MSTR.ACCT_FD_NO=MATURED_FD_DTL.ACCT_FD_NO;
    END IF;
    COMMIT;
         MESSAGE('RECORD HAS BEEN UPDATED AS PAID');
         MESSAGE(' ',no_acknowledge);
    END;
    If P_TYPE = 'P', then the data must get saved in the KEC_FDACCT_DTL table.
    The problem is the following:
    If I enter the details with all the records as 'P', the records get inserted into the table KEC_FDACCT_DTL.
    If I enter the details with records of 'P' and 'R', then nothing gets inserted into the table KEC_FDACCT_DTL;
    even the records with 'P' are not getting inserted.
    I want the records with 'P' to be inserted into the table KEC_FDACCT_DTL even when records with all types of P_TYPE (R, W & P) are entered together.
    So, can you please help me with this?
    Thank You.
    Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
    Oracle Forms Builder 6i.

    It's not working properly.
    At the form-level trigger POST-INSERT, I have put in the following code:
    IF :P_TYPE = 'P' THEN
      INSERT INTO KEC_FDACCT_DTL
      SELECT FD_SR_NO, NULL, MATURED_FD_DTL.ACCT_FD_NO, FD_AMT, INT_RATE, P_SAP_CODE,
      GROUP_TYPE, (TO_CHAR(SYSDATE, 'YYYYMM'))PROCESS_YR_MON,
      FD_INT_BAL, (FD_INT_BAL-MATURED_FD_DTL.TDS)QUTERLY_FD_AMT , MATURED_FD_DTL.TDS,
      MATURITY_DATE, P_TYPE, NULL, NULL, (FD_INT_BAL-MATURED_FD_DTL.TDS)NET_AMOUNT_PAYABLE,
      NULL, NULL, CHQ_NO, SYSDATE, NULL, 'CHQ', NULL, NULL, NULL, NULL, NULL, NULL
      FROM MATURED_FD_DTL, KEC_FDACCT_MSTR
      WHERE KEC_FDACCT_MSTR.ACCT_FD_NO=MATURED_FD_DTL.ACCT_FD_NO;
      END IF;
    MESSAGE('RECORD HAS BEEN UPDATED AS PAID');
    MESSAGE(' ',no_acknowledge);
    It worked properly when I executed it the first time, but the second time duplicate values were stored in the database.
    Example: first, I entered the following in the form and saved it:
    Record 1: ACCT_FD_NO = 250398, CUST_CODE = 52, FD_AMT = 50000, FD_INT_BAL = 6000, PREV_YR_TDS = 0, TDS = 600, ADD_FD_AMT = 0, P_SAP_CODE = 45415, P_TYPE = P, CHQ_NO = 5678, DESCRIPTION = int1
    Record 2: ACCT_FD_NO = 320107, CUST_CODE = 56, FD_AMT = 100000, FD_INT_BAL = 22478, PREV_YR_TDS = 3456, TDS = 2247, ADD_FD_AMT = 0, P_SAP_CODE = 45215, P_TYPE = R, CHQ_NO = 456, DESCRIPTION = (blank)
    Record 3: ACCT_FD_NO = 320108, CUST_CODE = 87, FD_AMT = 50000, FD_INT_BAL = 6500, PREV_YR_TDS = 0, TDS = 650, ADD_FD_AMT = 0, P_SAP_CODE = 21545, P_TYPE = W, CHQ_NO = 0, DESCRIPTION = (blank)
    In the database, in table KEC_FDACCT_DTL, the record with ACCT_FD_NO = 250398 and P_TYPE = P was inserted.
    But the second time, when I entered the following in the form and saved:
    Record 1: ACCT_FD_NO = 260189, CUST_CODE = 82, FD_AMT = 50000, FD_INT_BAL = 6000, PREV_YR_TDS = 0, TDS = 600, ADD_FD_AMT = 0, P_SAP_CODE = 45415, P_TYPE = P, CHQ_NO = 5678, DESCRIPTION = interest567
    Record 2: ACCT_FD_NO = 120011, CUST_CODE = 46, FD_AMT = 200000, FD_INT_BAL = 44478, PREV_YR_TDS = 0, TDS = 4447, ADD_FD_AMT = 0, P_SAP_CODE = 45215, P_TYPE = R, CHQ_NO = 456, DESCRIPTION = (blank)
    Record 3: ACCT_FD_NO = 30191, CUST_CODE = 86, FD_AMT = 50000, FD_INT_BAL = 6500, PREV_YR_TDS = 0, TDS = 650, ADD_FD_AMT = 0, P_SAP_CODE = 21545, P_TYPE = W, CHQ_NO = 56, DESCRIPTION = (blank)
    In the database, in the table KEC_FDACCT_DTL, the following rows were inserted (ACCT_FD_NO / P_TYPE):
    250398 / P
    250398 / P
    260189 / P
    320107 / R
    320108 / W
    There was a duplicate of 250398, which I did not enter in the form the second time, and all the other P_TYPE values were inserted as well.
    I want only the records with P_TYPE = 'P' to be inserted into the table, and duplicate rows must not be inserted.
    How do I do this?
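    One possible approach, as a minimal sketch only: restrict the INSERT ... SELECT to the record that was just saved and guard against rows that were already copied in an earlier save. The column list is taken from the posted code; the block name in the bind variables (:MATURED_FD_DTL.ACCT_FD_NO) and the NOT EXISTS guard are assumptions, so adjust them to the actual form.
    IF :MATURED_FD_DTL.P_TYPE = 'P' THEN
       INSERT INTO KEC_FDACCT_DTL
          SELECT FD_SR_NO, NULL, M.ACCT_FD_NO, M.FD_AMT, INT_RATE, M.P_SAP_CODE,
                 GROUP_TYPE, TO_CHAR(SYSDATE, 'YYYYMM'),
                 M.FD_INT_BAL, (M.FD_INT_BAL - M.TDS), M.TDS,
                 MATURITY_DATE, M.P_TYPE, NULL, NULL, (M.FD_INT_BAL - M.TDS),
                 NULL, NULL, M.CHQ_NO, SYSDATE, NULL, 'CHQ',
                 NULL, NULL, NULL, NULL, NULL, NULL
            FROM MATURED_FD_DTL M, KEC_FDACCT_MSTR K
           WHERE K.ACCT_FD_NO = M.ACCT_FD_NO
             AND M.ACCT_FD_NO = :MATURED_FD_DTL.ACCT_FD_NO   -- only the record just saved
             AND M.P_TYPE = 'P'                              -- only 'P' rows
             AND NOT EXISTS (SELECT 1                        -- skip rows copied by an earlier save
                               FROM KEC_FDACCT_DTL D
                              WHERE D.ACCT_FD_NO = M.ACCT_FD_NO);
    END IF;
    The two differences from the posted code are the restriction to the current ACCT_FD_NO with P_TYPE = 'P' and the NOT EXISTS guard, which together prevent both the extra R/W rows and the duplicate of 250398.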

  • Master data is not getting displayed in the Report in Quality

    HI Experts,
    I have ZESP_Num as the master data for my transaction data.
    I have loaded the data in Development, designed the report, and checked it; it is working fine.
    But after moving it to Quality, my report is not getting the master data.
    When I try to do a right-click > Manage on ZESP_Num,
    I get a message like this:
    The data from the InfoProvider ZESP_NUM involved could not be checked
    Message no. DBMAN242
    Diagnosis
    A participating InfoProvider is virtual and agreement between results and selections cannot be guaranteed.
    Can anyone please help me?
    Thanks

    Hi,
    Please check SAP Note 1364083 - "Message text will appear unexpectedly in surface".
    Hope this is helpful.
    Regards,
    RAO

  • Proposed delivery date should not get copied to the target document

    Hi gurus,
    I am creating a quotation, and it is taking the proposed delivery date as the creation date and confirming a schedule line. When I create the sales order, the same date is copied to the sales order. I have a different delivery date in the sales order, but the system does not allow me to change the date, saying that a schedule line has already been confirmed for this line item.
    I have unticked 'copy schedule lines' in the copy control of the item category from quotation to order, but I still have the same problem. Please help.

    Hi,
    If you look carefully at the problem, the root cause is not at the schedule line but at the document header where the delivery date is given.
    So if you really don't want the delivery date to be copied, you should do that in a new copying routine, and in that routine you should copy all the details except the delivery date. This means that even if you create with reference, the system will ask you for the requested delivery date. Now you can give the delivery date of your choice, and further scheduling will happen automatically.

  • Data is not getting loaded using the daemon for RDA

    Hi friends,
    We are doing a POC for real-time data acquisition (RDA).
    For this we have copied an existing DataSource and made it real-time enabled.
    We replicated the DataSource, and now we can see the new DataSource in BW.
    We have created an init InfoPackage (this loads only up to the PSA).
    We loaded the data using a full DTP from the PSA to the DSO, which successfully loaded the data into the DSO.
    After that, we created a delta InfoPackage and a delta DTP, assigned them to a daemon and started the daemon. But strangely, I can see the delta data getting loaded into the PSA, while nothing is happening on the DTP. When I check the DSO's Manage screen, there is no delta request; the only request we have is the full load.
    Is there any step or something I'm missing? Please guide me.
    Regards
    BN


  • Data not getting uploaded in the AQ table.

    Hi All
    We have an issue with the AQ tables.
    The scenario is that we first send data from Oracle to SOA and then to a third-party tool.
    For this, we first populate data in one custom payload table and then in one custom AQ table.
    Now the data is getting updated in the payload table and the XML is getting generated, but it is not getting interfaced to the third-party tool.
    Reason: data is not getting populated in the custom AQ table from where SOA picks up the data and interfaces it to the third-party tool.
    The error is a custom one. While enqueuing the data using the statement below
    l_chr_corrid := p_in_event.getvalueforparameter ('CORRID');
    since the CORRID is null, it gives us the message: Correlation ID has not been set for the payload.
    Part of the code for setting the WF parameters:
    l_chr_corrid_val := l_chr_event_name || '.order';
    wf_event.addparametertolist (p_name => l_chr_corrid_name,
                                 p_value => l_chr_corrid_val,
                                 p_parameterlist => l_param_list);
    l_local_event.setcorrelationid (l_chr_corrid_val);
    l_chr_rule_ret := wf_rule.default_rule (p_in_subscription_guid, l_local_event); -- Returns success
    This is quite urgent, so kindly help us find a solution to this issue.
    Thanks in advance
    Trupti

    Looks like you haven't populated the correlation id. To make sure it is unique, use the conversation ID, a GUID, etc.
    Not sure what version you are on. If 10g, look at this note (obviously change from File to AQ; the concept is the same):
    http://middleware1.wordpress.com/category/adapters/page/2/
    If 11g, you should be able to see the property when you edit the invoke activity; there is a tab called Properties.
    cheers
    James
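    A minimal sketch of the parameter setup, based only on the fragment posted above. The assumptions are that l_chr_corrid_name must hold the literal 'CORRID' (the same name later read with getvalueforparameter('CORRID')) and that the modified parameter list has to be put back on the event with SetParameterList before the enqueue:
    l_chr_corrid_name := 'CORRID';                     -- must match the name read later
    l_chr_corrid_val  := l_chr_event_name || '.order';
    wf_event.AddParameterToList(p_name          => l_chr_corrid_name,
                                p_value         => l_chr_corrid_val,
                                p_parameterlist => l_param_list);
    l_local_event.SetParameterList(l_param_list);      -- put the updated list back on the event
    l_local_event.SetCorrelationID(l_chr_corrid_val);  -- set the AQ correlation id itself
    If getvalueforparameter('CORRID') still returns NULL after this, check that the same event object (l_local_event) is the one that actually gets enqueued.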

  • Data is not getting posted in ABAP Proxy.

    Hi,
    I am working on a File-to-ABAP-Proxy scenario. The data is not reaching the proxy.
    In PI, the status in SXMB_MONI is successful, and in R/3 the status in SXMB_MONI is also successful, but the data is not getting posted in the tables (ABAP proxy). I have checked the inbound and outbound queues. With the same input data, the ABAPers checked their code in the ABAP proxy and the table gets updated; but when we trigger from PI, the tables are not getting updated. Please help me.
    Thanks,
    Pragathi.

    Hi,
    The problem may be one of the following:
    Case 1: the cache is not getting refreshed (check SXI_CACHE)
    Case 2: the queue is blocked (check SMQ1 & SMQ2)
    Regards,
    Sainath

  • Not getting cleared with internal working of JSF

    Hi All,
    I have started with the basics of JSF and created some simple apps using NetBeans.
    However, I am not clear about the exact internal working of JSF, i.e. how things are implemented in the JSF libraries jsf-api and jsf-impl.
    Please advise or share some links on this.
    --Sushant

    Hi,
    While creating JSP custom tags, we start from scratch by first creating the tag handler class, then the .tld file, and finally deploying both of these files.
    I am looking for a similar approach for making our own JSF component.
    I don't want to put the jsf-impl API into the lib directory. Is it possible to make a simple component using the JSTL standard libraries?
    --Sushant

  • Dispute case does not get updated with subsequent partial payment

    Hi Experts
    I am still testing Dispute Management, but it is the first time I have seen that sometimes a created dispute case does not get updated with a payment posted against the invoice for which the dispute case was raised. When I try to add the open item to the current dispute case, I face this error:
    Process step 004: Change not possible, process step 003 missing
    Message no. UDM_MSG037
    Diagnosis
    The dispute case is to be changed by process step 004 from accounting. However, there is at least one process (for example, clearing transaction from payment or credit memo) that was performed before the current step and that has not yet updated the dispute case. The changes to the dispute case must be carried out in the correct order.
    System Response
    The dispute case could not be changed.
    Procedure
    In an asynchronous change to dispute cases using IDoc, make sure that all IDocs of the previous process have been posted. Then you can carry out the required action (for example, post the current IDoc from process step 004 or include further items). The immediate previous process step is 003.
    Your kind feedback is highly appreciated.
    Regards
    Mahmoud EL Nady

    Hi
    Thanks for the straightforward solution; it is now working properly after running the program. Thanks very much.
    Also, do I need to run this program periodically, or only once I notice that a dispute case is not getting updated?
    Regards
    Mahmoud El Nady

  • Freight condition getting multiplied with the quantity

    Dear Gurus,
    I have configured a freight condition, ZHD0, copied from HD00. When I put Rs 1000 in the header and have 10 quantities, it gets multiplied by 10, giving a total of 10000. I want it to be 1000 only and not get multiplied by the quantity. Kindly suggest what needs to be done.
    Also, if there are 10 line items, the freight value should not be multiplied by the quantity.
    Kindly suggest.
    Shilpa Shah

    Hi;
    This could be an issue with the unit of measure. The sales unit of measure might be different from the pricing unit of measure, so the system might be picking up the unit conversion from the material master. For example, if your sales unit is CS and the pricing unit is PC, and the material is maintained as 1 CS = 10 PC, then the system will multiply the amount.
    I hope this will help.
    Regards;
    Avinash
