Need to post Full Load data (55,000 records) to the target system.

Hi All,
We receive data from the SAP HR system and need to post it to a partner system, so we configured a Proxy (SAP) to File (Partner) scenario. The data of each message must be appended to the target file. Since this is a very critical interface, we have used dedicated queues. The scenario works fine in D. When the interface was transported to Q, it was tested with a full load, i.e. 55,000 messages. All messages were processed successfully in the Integration Engine, but processing in the Adapter Engine took nearly 37 hours. We need to post all 55,000 records within 2 hours.
The design of this interface is simple: we use a direct mapping and the size of each message is 1 KB, but all messages must be appended to one file on the target side. We are using the Advantco SFTP adapter as the receiver and a proxy as the sender.
Could you please suggest a solution to process all 55,000 messages within 2 hours?
Thanks,
Soumya.

Hi Soumya,
I understand your scenario as follows: HR data has to be sent to a third-party system once a day. I guess they are synchronizing employee data (55,000 records) in the third-party system with SAP HR data daily.
I would design this scenario as follows:
I would ask an ABAPer to write an ABAP program that runs at 12:00, picks up the 55,000 records from the SAP HR tables, and writes them to one file. That file is placed in the SAP HR file system (you can see it using AL11). At 12:30, a PI file channel picks up the file and transfers it to the third-party target system as is, without any transformation: a File-to-File pass-through scenario (no ESR objects). Then ask the target system to take the file and run their program (they should have some SQL routines) that inserts these records into the target system tables; a minimal sketch of such a loader follows.
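Purely as an illustration of that last step (the target team would write it in whatever tool they use; the table name EMPLOYEE_STG, its columns, and the JDBC connection details below are invented placeholders, not anything from your landscape), a minimal batch loader could look like this:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    // Hypothetical target-side loader: reads the transferred flat file
    // (pipe-delimited) and batch-inserts it into a staging table.
    public class TargetFileLoader {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@target-host:1521:ORCL", "user", "pwd");
                 BufferedReader in = new BufferedReader(new FileReader(args[0]))) {
                con.setAutoCommit(false);
                PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO EMPLOYEE_STG (EMP_ID, NAME, ORG_UNIT) VALUES (?, ?, ?)");
                String line;
                int count = 0;
                while ((line = in.readLine()) != null) {
                    String[] f = line.split("\\|", -1);      // keep empty trailing fields
                    ps.setString(1, f[0]);
                    ps.setString(2, f[1]);
                    ps.setString(3, f[2]);
                    ps.addBatch();
                    if (++count % 1000 == 0) ps.executeBatch();  // flush every 1,000 rows
                }
                ps.executeBatch();
                con.commit();                                // one commit for the whole load
            }
        }
    }

Batching with a single commit is what makes 55,000 rows fast here; row-by-row processing with per-message overhead is exactly what turns the same volume into hours.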
If the 55,000 records make a huge file on the SAP HR system, ask the ABAPer to split it into parts; PI will pick them up in sequence based on the file name.
In this approach, I would ask both the SAP HR (sender) and third-party (target) system people to be flexible. Otherwise, I would say it is not technically possible with the current PI resources. In my opinion, PI is middleware, not a system in which heavy computation should be done. If messages are coming from different systems, then collecting them in the middleware makes sense; in your case, collecting a large number of messages from a single system at high frequency is not advisable.
If the third-party target system people are not flexible, then go for a File-to-JDBC scenario. Ask the SAP HR ABAPer to split the input file into more files (10-15, which your PI system should be able to handle). At the receiver JDBC adapter, use native SQL; you need a java mapping to construct the SQL statements in PI. Don't convert the flat file to the JDBC XML structure, because in your case PI cannot handle such a huge XML payload. A rough sketch of that mapping logic is shown below.
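As a sketch of the java-mapping idea only (the table Z_EMPLOYEE and its columns are invented for illustration; in PI this logic would sit inside your mapping class, typically an AbstractTransformation on newer releases), the core of such a mapping could be:

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.StringReader;

    // Sketch: turn pipe-delimited flat-file lines into native SQL INSERT
    // statements for the receiver JDBC channel. Names are placeholders.
    public class FlatFileToSql {

        public static String toSql(BufferedReader in) throws IOException {
            StringBuilder sql = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) {
                if (line.trim().isEmpty()) continue;       // skip blank lines
                String[] f = line.split("\\|", -1);
                sql.append("INSERT INTO Z_EMPLOYEE (EMP_ID, NAME, ORG_UNIT) VALUES ('")
                   .append(esc(f[0])).append("', '")
                   .append(esc(f[1])).append("', '")
                   .append(esc(f[2])).append("');\n");
            }
            return sql.toString();
        }

        // Double single quotes so field values cannot break the statement.
        private static String esc(String s) {
            return s.replace("'", "''");
        }

        public static void main(String[] args) throws IOException {
            String sample = "1001|First employee|HR\n1002|Second employee|IT\n";
            System.out.print(toSql(new BufferedReader(new StringReader(sample))));
        }
    }

The point of this shape is that the payload stays plain SQL text end to end, so PI never has to build or parse a 55,000-row XML document.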
Note that a hardware upgrade is very difficult (you need a lot of approvals, depending on your client's process) and very costly; in my experience a hardware upgrade takes 2-3 months.
Regards,
Raghu_Vamsee

Similar Messages

  • Delta load is done, now need to have full load

    Hi,
    Delta load is done; now we need a full load.
    Here, for the DTP we don't have full repair. What is the process?

    Hi,
    You will have to use a repair full load at the InfoPackage level.
    But a repair full load should be done only if there is a problem
    in the load.
    Process:
    1) Selectively delete the data where the problem occurred.
    2) Run a repair full load in the InfoPackage for that selection.
    You can't do a direct full load on the same selection as that of the delta.
    Please assign points if it helped you
    Regards,
    Senoy

  • To load data from a cube in SCM(APO) system to a cube in BI system.

    Experts,
    Please let me know whether it is possible to load data from a cube in the SCM (APO) system to a cube in the BI system. If so, please explain the steps to perform it.
    Thanks,
    Meera

    Hi,
    Think of it this way:
    To load data from any source we need a DataSource, OK? You can generate an export DataSource for the cube in APO, then use that DataSource for BW extraction; try it like this. I think it will work; in my case I'm directly loading data from APO to BW using the DataSources that are generated on the planning area.
    Why do you need to take the data from the APO cube? Is there any condition for that? If it is not mandatory, you can use the same DataSource and load the data to BW. If there are any conditions applied while loading the data from APO into the APO cube, then check whether the same is possible in BW; if it is, use the DataSource and do the same calculation directly in BW.
    Thanks
    Reddy

  • Customized delta DataSource for deleted data records in the source system

    Hello Gurus,
    There is a customized delta DataSource; how can the delta function handle records deleted in the source system?
    I mean, if a record is deleted in the source system, how can the SAP BW system be notified of this deletion by this customized delta
    DataSource?
    Many thanks.

    Hi,
    Whenever a record is deleted, write code to insert that record into a Z table, and load these records into BW into a cube with a similar structure. While loading into this cube, multiply the key figure by -1.
    Add this cube to the MultiProvider. The union of the records in the original cube and the cube holding the deleted records will result in a zero value and will not be displayed in the report. For example, if the original cube holds a record with amount 100 and the deleted-records cube holds the same record with amount -100, the union in the MultiProvider sums to 0.
    Regards,

  • MacBook Pro SSD error - ALERT: The partition map needs to be repaired because there's a problem with the EFI system partition's file system.

    MacBook Pro, OS X Mavericks. I have a SanDisk 256 GB SSD and I am getting the error message below.
    ALERT: The partition map needs to be repaired because there's a problem with the EFI system partition's file system.
    Does anyone know how to resolve this error and fix it? Unfortunately I don't have a Time Machine backup. Thanks!

    Have you tried running Verify/Repair Permissions and Verify/Repair Disk from the recovery disk? If that doesn't work, I would reinstall the OS after backing up. If that still doesn't work, I would take it in.

  • 0EC_PCA_3 does not return data in RSA3 if the target system is entered

    Hello all,
    Datasource 0EC_PCA_3 is not transferring data to the BW 7.0 system. In the ECC 6.0 system datasource 0EC_PCA_3 does not return data in RSA3 if the target system (BWDCLNT100) is entered in the selection fields. When omitting the target system all expected data is returned.
    Datasource 0EC_PCA_1 works fine for the same target system.
    There is one thread dealing with the same problem but the person does not provide the solution he found.
    Does anyone know what the issue is?
    Regards and thanks,
    József.

    The problem was solved by removing the DataSource in both BW and R/3 and activating everything from scratch again. We didn't find out what the cause of the problem was, but at least it is working now.

  • When I try to load InDesign, an error message appears: "the operating system you are using is no longer supported by InDesign". What can I do to load the software?

    When I try to load InDesign, an error message appears: "the operating system you are using is no longer supported by InDesign". What can I do to load the software?

    Do you have a license for InDesign?

  • Date Field to Load the full load data

    I have master data fields like plant, factory, industry units, and machine areas, all in a sequential hierarchy in my DataSource; these all have "valid to" and "valid from" dates. Now I want to load this DataSource with a full load on a monthly basis.
    1. Please tell me how I can add a date field that captures all records for a month.
    2. The "valid from" field is there, but if I say 07/01/2007 as valid from, it will not pull the previous ones.
    3. Please let me know the step-by-step approach for running the full load on a monthly basis. I do not have any date field in the DataSource other than "valid from" and "valid to", which are CHAR 8 date fields (YYYYMMDD); for example, factory valid from = 01/2008,
    valid to = 08/2008.
    Please send me any documents or ABAP code that can solve this.
    [email protected]
    Please help me solve this.
    Thanks
    Poonam Roy

    Hi,
    It seems you have already deleted the data for one month from the cube?
    You should not have done that, as the DSO would have picked up the delta correctly.
    Just make sure that the delta was not too large; otherwise it is better to run full repairs.
    If you have deleted the data from the cube, then do a full repair for the cube from the DSO for the same selections that you deleted from the cube.
    It will correct the data in the cube.
    There is no need to do a re-init after that, as it will pick up the delta correctly next time.
    Thanks
    Ajeet

  • Need help - How to load data which contains \r\n

    Hi All,
    We have a requirement wherein we need to load data from a .dat file, with fields separated by the | symbol and the line separator specified as '\r\n':
    INFILE '/scratch/xyz/abcd.dat' "STR '\r\n'"
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    Can you please help us load data that contains multiple lines in a single field into a database table column, keeping the line separator in the ctl file as '\r\n' itself?
    When we bring a '\r\n' inside a text field enclosed in double quotes, sqlldr considers it the end of that record and not data that needs to go into a column.
    One option we have is for the extraction process to create the data in such a way that fields with multiple lines are written to the .dat file enclosed in double quotes, as
    "Test
    "|8989|abcd
    where the newline in the first field is just '\n'.
    Is there any other way in which the requirement can be addressed?
    Version: SQL*Loader: Release 11.1.0.7.0
    Thanks,
    Rohin

    In addition, you would need to know the character set encoding of the CSV files (e.g. the code page used).
    Basically, you need to have the facts about both the client (CSV file) and database character sets.
    As suggested, use the available documentation - it's there to help you!
    http://www.oracle.com/technology/tech/globalization/htdocs/nls_lang%20faq.htm
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14225/ch2charset.htm
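    Separately, if you want to keep STR '\r\n' as the record terminator, one option is to pre-process the .dat file so that line breaks inside quoted fields become plain '\n', matching the workaround you describe. A rough sketch, assuming fields with embedded line breaks are always enclosed in double quotes and the field data itself contains no quote characters:

    import java.io.IOException;
    import java.io.Reader;
    import java.io.StringReader;
    import java.io.StringWriter;
    import java.io.Writer;

    // Sketch: rewrite CRLF inside double-quoted fields as LF so that
    // SQL*Loader's "STR '\r\n'" only matches real record boundaries.
    public class CrlfNormalizer {

        public static void normalize(Reader in, Writer out) throws IOException {
            boolean inQuotes = false;
            boolean pendingCR = false;
            int c;
            while ((c = in.read()) != -1) {
                if (pendingCR) {
                    pendingCR = false;
                    if (c == '\n') {
                        if (!inQuotes) out.write('\r');   // real record end: keep CRLF
                        out.write('\n');                  // inside quotes: LF only
                        continue;
                    }
                    out.write('\r');                      // lone CR: pass through
                }
                if (c == '\r') { pendingCR = true; continue; }
                if (c == '"') inQuotes = !inQuotes;
                out.write(c);
            }
            if (pendingCR) out.write('\r');               // file ended on a CR
        }

        public static void main(String[] args) throws IOException {
            String rec = "\"Test\r\nline\"|8989|abcd\r\n";
            StringWriter w = new StringWriter();
            normalize(new StringReader(rec), w);
            System.out.print(w);   // -> "Test\nline"|8989|abcd\r\n (record end intact)
        }
    }

    This keeps the ctl file unchanged; only the data file is rewritten before sqlldr runs.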

  • Need to DELETE and LOAD data of the Last month to

    Hi Experts,
    I need to delete the last month's data from the cube because one material is not updated with the required value. We have made some changes in the update routine for a value, and the same change must be reflected in the last month of data.
    So I need to delete the data based on a selection and reload it again. The data flows from
    2LIS_13_VDITM (InfoSource) to ZSD_C03 (InfoCube).
    I have read many SDN threads, but I am getting confused.
    How can I proceed with this?
    Thanks,
    Utpal.

    Hi Srikanth,
    Thank you for responding.
    I had a problem with one material, so I deleted the request and reloaded it from the PSA. That issue got solved.
    Now the problem is that my senior is asking me to delete the data from April 2009 till date and reload it. The issue is that my cube (ZSD_C03) is updated from 4 DataSources: 2LIS_11_V_ITM, 2LIS_13_VDITM, 2LIS_12_VCHDR, and 2LIS_11_VAITM.
    I need to delete only the data from the 2LIS_13_VDITM DataSource. How can I proceed with the current issue?
    Please suggest...
    Thank you ,
    Utpal

  • Error while loading data into an external table from flat files

    HI ,
    We have a data load in our project which feeds Oracle external tables with data from flat files (.bcp files) on Unix.
    While loading the data, we are encountering the following error:
    Error occured (Error Code : -29913 and Error Message : ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04063: un) while loading data into table_ext
    Please let us know what needs to be done in this case to solve this problem.
    Thanks,
    Kartheek

    Kartheek,
    I used Google (mine still works)... please check these links:
    http://oraclequirks.blogspot.com/2008/07/ora-29400-data-cartridge-error.html
    http://jonathanlewis.wordpress.com/2011/02/15/ora-29913/
    HTH,
    Thierry

  • How to move full repair data from DSO1 (source) to DSO2 (target)?

    Hello experts,
    I am doing a full repair for POs in the first-level ODS. Do I have to move this data from the first-level ODS to the second-level ODS, or will the daily nightly delta from the first-level ODS to the second-level ODS take care of it?
    If I have to load the data from the first-level ODS to the second-level ODS, how do I do it?
    Do I have to delete the data from the first-level ODS before I do the full repair for the POs, or just run the InfoPackage for the POs with the full repair option checked?
    Note: we don't use DTPs here to load data from BI to BI.
    Thanks in advance.
    Sharat.

    Delete the data from both the first-level and second-level DSOs and set up a full repair in the InfoPackage. The delta will then carry all the data to the second level, since you are deleting the data from the second level as well.

  • Loading data: Creation of entries in the Qualified Table

    Hi,
    I was wondering if anyone has succesfully achieved the following...
    I'm looking at the standard materials repository.
    - Main Table: Product
    - Qualified Table: Location Data. This qualified table's single non-qualifier field is a look-up to a flat look-up table called Plants.
    The flat table is populated with the valid list of plants using check table synchronisation. The main table is empty, and so is the Location table.
    In order to load products into this repository using the Import Manager, it seems to be required to "create" entries in the qualified table (Location), one for each plant code.
    Is this true? I would actually expect the SAP MDM Import Manager client to be able to create the records in both the main table and, if needed, the qualified table. As long as my plant code exists in the Plants look-up table, that should be sufficient. It seems that is not the case.
    Thx,
    Dirk

    The sequence of import:
    - Load the flat look-up tables (in your case, Plants).
    - Load the qualified tables (in your case, load the non-qualifiers of the Location Data).
    - Load the main table (in your case, Products).
    Though MDM allows you to create look-up values on the fly, I would rather go with this approach.
    If you are using Import Manager, you can add the qualified table records on the fly.
    But if you are using Import Server, you will NOT be able to add qualified table records on the fly. I tested this, and I see the files in the ImportX error directory.

  • How to load data to fact table for the year 2008 without modifying session sql

    Hi,
    I need to load data into the OOTB Financial - General Ledger related fact tables using a new DAC execution plan for the year 2008 (just the year 2008, not before or after), without modifying the OOTB Informatica session SQL. Do you know how to do it?
    I appreciate your help. Thank you.

    Do you know why you are going for a new execution plan instead of using the existing one?
    Let me know; then I will share what I know.
    Thanks for checking.
    PS: JV, it's you!!

  • STPOs: load date is not appearing in the item detail

    Hi SAP Expert's,
    We have an issue with an inter-company STPO. The quantity of material is 80 kg, and we received 10 kg on 1st Feb.
    In the item detail, Delivery Schedule tab, it shows the delivery date as 1st Feb and the goods issue date as 5th Feb.
    Similarly, they delivered another 10 kg with a delivery date of 25th Feb, but the system is not showing the transport date, load date, and goods issue date. The material availability date is before the delivery date (1st Feb).
    When we change the delivery date to 26th Feb, the system shows the transport date, load date, and goods issue date, and the material availability date is then 25th Feb.
    Can anyone clarify my doubt? Why is the system not showing the transport date, load date, and goods issue date?
    Please tell me where the system picks these dates from.
    Thanks in Advance.
    Thanks and Regards
    Chandru.

    Hi Venkatesh,
    Check the condition record in transaction NACE to see whether the print output field is present in the condition record.
    Regards,
    Gajendra.
