Understanding the data transfer from source system to PSA

Hi experts,
I am new to BI. I have created a transactional DataSource for the VBAK table and replicated it to BI, then created transformations and an InfoPackage for loading the data into the PSA. I just want to know the sequence of steps the system executes when we run the InfoPackage. I know that it makes an RFC call to the source system and brings the data back to BI; I want to know where that RFC call happens, what the RFC function module name is, etc. I tried to debug the process using /h, but as you know, it is very difficult to debug the whole standard code. It is very complex and I got lost somewhere in the middle. If anybody has any idea or has done any research on this, please share your findings.
Thanks

Hi,
Once you click the Start button in the InfoPackage:
1. The BW system sends a request to ECC along with the DataSource details (technically, the request arrives as an RSRQST IDoc).
2. ECC responds with an OK status message and starts the extraction.
3. Based on the InfoPackage selections, the extraction runs in ECC and picks up the data in packets.
4. ECC sends the data to BW in the form of data packets (via tRFC, with RSINFO status IDocs reporting progress).
5. Once the selection criteria have been processed, the job finishes.
6. You can see the data in BW.
To your function module question: the extractor function module that actually runs in the source system is the one registered for the DataSource in table ROOSOURCE (field EXTRACTOR).
Thanks
Reddy
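The six steps above can be sketched as a toy model (illustrative Python, not SAP code; names like vbeln/vkorg are just stand-ins for VBAK fields):

```python
# Hypothetical sketch of the request/packet handshake: BW sends one request
# with selections, the source system filters and streams the result back in
# fixed-size data packets.
def extract(source_rows, selections, package_size=3):
    """Simulate the source-system side: apply the InfoPackage selections
    (step 3), then yield the hits in packets (step 4)."""
    hits = [r for r in source_rows if r["vkorg"] in selections]
    for i in range(0, len(hits), package_size):
        yield hits[i:i + package_size]

# Made-up VBAK-like rows; 'vkorg' stands in for a selection field.
rows = [{"vbeln": n, "vkorg": "1000" if n % 2 else "2000"} for n in range(10)]

packets = list(extract(rows, selections={"1000"}))  # steps 1-2: the "request"
loaded = [r for p in packets for r in p]            # step 6: data lands in PSA
print(len(packets), len(loaded))
```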

Similar Messages

  • ERec - Data Transfer from eRec system to HCM System

    Hello All,
    Currently I am able to do the data transfer from the eRec system to the HCM system, but I am finding some inconsistencies.
    In our scenario we are on EhP4 with no XI in place; we are doing it via RFC only, with a standalone eRec system.
    Here are a few queries; if someone can address them it would be really great and I would appreciate the help.
    1) While I do the data transfer activity, when I select the activity and do not give the hiring date in the Org Assignment column, it throws the error "When transferring data using XI, the hiring date must be specified". But I am not using XI, so why does it give this error?
    Also, is it not possible to do the data transfer without any hiring date?
    There are also two options, Planned and Completed. When I do the data transfer I keep it in Planned status, and it automatically sets it to Completed.
    Is this correct?
    2) When I do the data transfer, the candidate appears in the HR system when I execute transaction PA48.
    Here my queries are:
    a) Suppose I do a data transfer for candidate Abc on 10.10.2010 and run PA48; it appears in the list with hire date 10.10.2010.
    Now, as a recruiter, I do the data transfer again for the same candidate Abc on 11.10.2010, and when I check PA48 there is another entry for candidate Abc for hiring.
    Is this the correct process? How should these duplicate entries be avoided? Please help me.
    3) Also, in case I want additional infotypes to be transferred in PA48, how should we go about this development?
    Please help me with these queries.

    Hi ,
    Check the following OSS notes:
    1. 1006679
    2. 997181 (read the attachment in the OSS note; it gives detailed information regarding eRec and HR).
    Let me know if you need more info.
    Regards,
    Ram

  • DataSource extraction very slow (from source system to PSA it takes 23 hrs)

    Friends,
    We have enhanced the DataSource 0CRM_SALES_ORDER_I with a user exit. After the enhancement (adding the new fields and writing some code in the exit), the data extraction takes around 23 hours for approximately 250,000 records.
    Can you please suggest any steps to tune the performance of the DataSource?
    NOTE: data extraction from the source system to the PSA alone takes 23 hrs; once the data has arrived in the PSA, loading the data to the cube is fast.
    Please help me to solve this issue.
    BASKAR

    Hi Friends,
    This is the code used for the DataSource enhancement (EXIT_SAPLRSAP_001):
    DATA : IS_CRMT_BW_SALES_ORDER_I LIKE CRMT_BW_SALES_ORDER_I.
    DATA:  MKT_ATTR TYPE STANDARD TABLE OF  CRMT_BW_SALES_ORDER_I.
    DATA: L_TABIX TYPE I.
    DATA: LT_LINK TYPE STANDARD TABLE OF CRMD_LINK,
          LS_LINK TYPE CRMD_LINK.
    DATA: LT_PARTNER TYPE STANDARD TABLE OF CRMD_PARTNER,
          LS_PARTNER TYPE CRMD_PARTNER.
    DATA: LT_BUT000 TYPE STANDARD TABLE OF BUT000,
          LS_BUT000 TYPE BUT000.
    DATA: GUID TYPE CRMT_OBJECT_GUID.
    DATA: GUID1 TYPE CRMT_OBJECT_GUID_TAB.
    DATA: ET_PARTNER TYPE CRMT_PARTNER_EXTERNAL_WRKT,
          ES_PARTNER TYPE CRMT_PARTNER_EXTERNAL_WRK.
    TYPES: BEGIN OF M_BINARY,
           OBJGUID_A_SEL TYPE CRMT_OBJECT_GUID,
            END OF M_BINARY.
    DATA: IT_BINARY TYPE STANDARD TABLE OF M_BINARY,
          WA_BINARY TYPE M_BINARY.
    TYPES : BEGIN OF M_COUPON,
             OFRCODE TYPE CRM_MKTPL_OFRCODE,
             END OF M_COUPON.
    DATA: IT_COUPON TYPE STANDARD TABLE OF M_COUPON,
          WA_COUPON TYPE M_COUPON.
    DATA: CAMPAIGN_ID TYPE CGPL_EXTID.
    TYPES : BEGIN OF M_ITEM,
             GUID TYPE CRMT_OBJECT_GUID,
            END OF M_ITEM.
    DATA: IT_ITEM TYPE STANDARD TABLE OF M_ITEM,
          WA_ITEM TYPE M_ITEM.
    TYPES : BEGIN OF M_PRICE,
                  KSCHL TYPE PRCT_COND_TYPE,
                  KWERT  TYPE PRCT_COND_VALUE,
                  KBETR   TYPE PRCT_COND_RATE,
            END OF M_PRICE.
    DATA: IT_PRICE TYPE STANDARD TABLE OF M_PRICE,
          WA_PRICE TYPE M_PRICE.
    DATA: PRODUCT_GUID TYPE COMT_PRODUCT_GUID.
    TYPES : BEGIN OF M_FRAGMENT,
             PRODUCT_GUID TYPE COMT_PRODUCT_GUID,
             FRAGMENT_GUID TYPE COMT_FRG_GUID,
             FRAGMENT_TYPE TYPE COMT_FRGTYPE_GUID,
           END OF M_FRAGMENT.
    DATA: IT_FRAGMENT TYPE STANDARD TABLE OF M_FRAGMENT,
          WA_FRAGMENT TYPE M_FRAGMENT.
    TYPES : BEGIN OF M_UCORD,
             PRODUCT_GUID TYPE COMT_PRODUCT_GUID,
             FRAGMENT_TYPE TYPE COMT_FRGTYPE_GUID,
             ZZ0010 TYPE Z1YEARPLAN,
             ZZ0011 TYPE Z6YAERPLAN_1,
             ZZ0012 TYPE Z11YEARPLAN,
             ZZ0013 TYPE Z16YEARPLAN,
             ZZ0014 TYPE Z21YEARPLAN,
            END OF M_UCORD.
    DATA: IT_UCORD TYPE STANDARD TABLE OF M_UCORD,
          WA_UCORD TYPE M_UCORD.
    DATA: IT_CATEGORY TYPE STANDARD TABLE OF COMM_PRPRDCATR,
          WA_CATEGORY TYPE COMM_PRPRDCATR.
    DATA: IT_CATEGORY_MASTER TYPE STANDARD TABLE OF ZPROD_CATEGORY ,
          WA_CATEGORY_MASTER TYPE ZPROD_CATEGORY .
    types : begin of st_final,
               OBJGUID_B_SEL  TYPE CRMT_OBJECT_GUID,
               OFRCODE TYPE CRM_MKTPL_OFRCODE,
               PRODJ_ID TYPE CGPL_GUID16,
               OBJGUID_A_SEL type     CRMT_OBJECT_GUID,
              end of st_final.
    data : t_final1 type  standard table of st_final.
    data : w_final1 type  st_final.
    SELECT b~OBJGUID_B_SEL a~OFRCODE a~PROJECT_GUID b~OBJGUID_A_SEL
      INTO TABLE t_final1
      FROM CRMD_MKTPL_COUP AS a INNER JOIN CRMD_BRELVONAE AS b
        ON b~OBJGUID_A_SEL = a~PROJECT_GUID.
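A frequent cause of this kind of multi-hour runtime is doing one database read per document inside EXIT_SAPLRSAP_001 instead of one set-based read per data package. A language-neutral sketch of that fix (illustrative Python, not the poster's ABAP; the table and field names are made up):

```python
# Sketch of the usual user-exit performance fix: read lookup data once per
# data package into an in-memory table, instead of one SELECT per record.
# 'partners' stands in for a database table such as CRMD_PARTNER.
partners = {f"GUID{n}": f"BP{n}" for n in range(100_000)}

def enrich_package(package):
    """One set-based read for the whole package (the FOR ALL ENTRIES
    pattern), then a hash lookup per record - per-record database
    round trips collapse into a single one."""
    needed = {r["guid"] for r in package}
    cache = {g: partners[g] for g in needed if g in partners}  # single "SELECT"
    for r in package:
        r["partner"] = cache.get(r["guid"])
    return package

package = [{"guid": f"GUID{n}"} for n in range(5)]
print(enrich_package(package)[0]["partner"])
```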

  • Is there is any way to find the data transfer from client to Configuration Manager for health monitoring and hardware Inventory

    Hi
    Can Configuration Manager provide a way to find the data transfer from the client to Configuration Manager for health monitoring and hardware inventory? How can I know what amount of data is consumed during that process?

    Place archive_reports.sms in %systemroot%\ccm\inventory\temp\ for both 64-bit and 32-bit computers.
    There are two situations where you can use this depending on the type of client:
    1. To keep inventory reports on a client (that is not an MP), create the following file:
    %systemroot%\ccm\inventory\temp\archive_reports.sms
    2. To keep inventory reports on a MP (that is also a client), create the following file:
    <x>:\sms_ccm\inventory\temp\archive_reports.sms
    The XML file will be saved in the inventory\temp folder.
    More information on the above here: http://blogs.technet.com/b/configurationmgr/archive/2012/09/17/controlling-configuration-manager-2012-using-hidden-files.aspx
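The file creation itself can be sketched as follows (illustrative Python with a stand-in directory; on a real client the directory is %systemroot%\ccm\inventory\temp as described above):

```python
# Create the zero-byte archive_reports.sms flag file; its presence, not its
# content, is what switches on inventory-report archiving.
from pathlib import Path

ccm_temp = Path("/tmp/ccm/inventory/temp")   # stand-in for the real client path
ccm_temp.mkdir(parents=True, exist_ok=True)
flag = ccm_temp / "archive_reports.sms"
flag.write_bytes(b"")                        # a zero-byte file is sufficient
print(flag.exists())
```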

  • Significant slowness in data transfer from source DB to target DB

    Hi DB Wizards,
    My customer is noticing significant slowness in the data copy from the source DB to the target DB. The copy process itself uses PL/SQL code with cursors. The process copies about 7M records from the source DB to the target DB as part of a complicated data migration (a one-time go-live process). I have also attached the AWR reports generated during the data migration. Are there any recommendations to help improve the performance of the data transfer?
    Thanks in advance,
    Nitin

    Multiple COMMITs will take longer to complete the task than a single COMMIT at the end! Let's check how much longer:
    create table T1 as
    select OWNER,TABLE_NAME,COLUMN_NAME,DATA_TYPE,DATA_TYPE_MOD,DATA_TYPE_OWNER,DATA_LENGTH,DATA_PRECISION,DATA_SCALE,NULLABLE,COLUMN_ID,DEFAULT_LENGTH,NUM_DISTINCT,LOW_VALUE,HIGH_VALUE,DENSITY,NUM_NULLS,NUM_BUCKETS,LAST_ANALYZED,SAMPLE_SIZE,CHARACTER_SET_NAME,CHAR_COL_DECL_LENGTH,GLOBAL_STATS,USER_STATS,AVG_COL_LEN,CHAR_LENGTH,CHAR_USED,V80_FMT_IMAGE,DATA_UPGRADED,HISTOGRAM
    from DBA_TAB_COLUMNS;
    insert /*+APPEND*/ into T1 select * from T1;
    commit;
    -- repeat until it is > 7M rows
    select count(*) from T1;
    9233824
    create table T2 as select * from T1;
    set autotrace on timing on;
    truncate table t2;
    declare r number:=0;
    begin
    for t in (select * from t1) loop
    insert into t2 values ( t.OWNER,t.TABLE_NAME,t.COLUMN_NAME,t.DATA_TYPE,t.DATA_TYPE_MOD,t.DATA_TYPE_OWNER,t.DATA_LENGTH,t.DATA_PRECISION,t.DATA_SCALE,t.NULLABLE,t.COLUMN_ID,t.DEFAULT_LENGTH,t.NUM_DISTINCT,t.LOW_VALUE,t.HIGH_VALUE,t.DENSITY,t.NUM_NULLS,t.NUM_BUCKETS,t.LAST_ANALYZED,t.SAMPLE_SIZE,t.CHARACTER_SET_NAME,t.CHAR_COL_DECL_LENGTH,t.GLOBAL_STATS,t.USER_STATS,t.AVG_COL_LEN,t.CHAR_LENGTH,t.CHAR_USED,t.V80_FMT_IMAGE,t.DATA_UPGRADED,t.HISTOGRAM );
    r:=r+1;
    if mod(r,10000)=0 then commit; end if;
    end loop;
    commit;
    end;
    -- Run that a couple of times, with and without the "if mod(r,10000)=0 then commit; end if;" line commented out.
    Results:
    One commit
    anonymous block completed
    Elapsed: 00:11:07.683
    Statistics
    18474603 recursive calls
    0 spare statistic 4
    0 ges messages sent
    0 db block gets direct
    0 calls to kcmgrs
    0 PX remote messages recv'd
    0 buffer is pinned count
    1737 buffer is not pinned count
    2 workarea executions - optimal
    0 workarea executions - onepass
    10000 rows commit
    anonymous block completed
    Elapsed: 00:10:54.789
    Statistics
    18475806 recursive calls
    0 spare statistic 4
    0 ges messages sent
    0 db block gets direct
    0 calls to kcmgrs
    0 PX remote messages recv'd
    0 buffer is pinned count
    1033 buffer is not pinned count
    2 workarea executions - optimal
    0 workarea executions - onepass
    one commit
    anonymous block completed
    Elapsed: 00:10:39.139
    Statistics
    18474228 recursive calls
    0 spare statistic 4
    0 ges messages sent
    0 db block gets direct
    0 calls to kcmgrs
    0 PX remote messages recv'd
    0 buffer is pinned count
    1123 buffer is not pinned count
    2 workarea executions - optimal
    0 workarea executions - onepass
    10000 rows commit
    anonymous block completed
    Elapsed: 00:11:46.259
    Statistics
    18475707 recursive calls
    0 spare statistic 4
    0 ges messages sent
    0 db block gets direct
    0 calls to kcmgrs
    0 PX remote messages recv'd
    0 buffer is pinned count
    1000 buffer is not pinned count
    2 workarea executions - optimal
    0 workarea executions - onepass
    What have we got?
    Single commit at the end, avg elapsed: 10:53.4s
    Commit every 10,000 rows (923 commits), avg elapsed: 11:20.5s
    Difference: 00:27.1s (3.98%)
    Multiple commits are just 4% slower, but safer with regard to the undo consumed.
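The same comparison can be reproduced in miniature (a sketch using Python's built-in sqlite3, not the poster's Oracle environment; the table and row counts are illustrative):

```python
# Compare a single commit at the end against committing every N rows,
# mirroring the PL/SQL experiment above on a toy scale.
import sqlite3
import time

def load(conn, n_rows, commit_every=None):
    """Insert n_rows one at a time, committing every `commit_every` rows
    (or only once at the end when commit_every is None)."""
    cur = conn.cursor()
    cur.execute("DROP TABLE IF EXISTS t2")
    cur.execute("CREATE TABLE t2 (id INTEGER, name TEXT)")
    conn.commit()
    start = time.perf_counter()
    for i in range(n_rows):
        cur.execute("INSERT INTO t2 VALUES (?, ?)", (i, f"row{i}"))
        if commit_every and (i + 1) % commit_every == 0:
            conn.commit()
    conn.commit()
    return time.perf_counter() - start

conn = sqlite3.connect(":memory:")
t_single = load(conn, 50_000)                       # one commit at the end
t_batched = load(conn, 50_000, commit_every=10_000) # periodic commits
print(f"single commit: {t_single:.3f}s, batched: {t_batched:.3f}s")
```

As in the Oracle test, the difference is usually small; the per-row INSERT dominates, not the commit frequency.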

  • Data loading from source system takes long time.

    Hi,
         I am loading data from R/3 to BW. I am getting following message in the monitor.
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended.
    Current status
    in the source system
    Is there anything wrong with the partner profile maintenance in the source system?
    Cheers
    Senthil

    Hi,
    I will suggest you to check a few places where you can see the status
    1) SM37 job log (in the source system if the load is from R/3, or in BW if it's a datamart load): give the request name and it should give you the details about the request. If it's active, make sure that the job log is getting updated at frequent intervals.
    Also see if there is any 'sysfail' for any data packet in SM37.
    2) SM66: get the job details (server name, PID etc. from SM37) and see in SM66 if the job is running or not (in the source system if the load is from R/3, or in BW if it's a datamart load). See if it's accessing/updating some tables or not doing anything at all.
    3) RSMO: see what is available in the details tab. It may be stuck in the update rules.
    4) ST22: check if any short dump has occurred (in the source system if the load is from R/3, or in BW if it's a datamart load).
    5) SM58 and BD87 for pending tRFCs and IDocs.
    Once you identify the problem you can rectify the error.
    If all the records are in the PSA you can pull them from the PSA to the target. Else you may have to pull them again from the source InfoProvider.
    If it's running and you are able to see it active in SM66, you can wait for some time to let it finish. You can also try SM50 / SM51 to see what is happening at the system level, like reading/inserting tables etc.
    If you feel it's active and running, you can verify by checking whether the number of records has increased in the data tables.
    SM21 (system log) can also be helpful.
    Thanks,
    JituK

  • Data doesn't load from source system to PSA and DSO

    Hi Gurus,
    We have launched a delta load from the source system (POSDM) to a DSO. In the source system there are 5,000,000 rows, but none were loaded into the DSO or the PSA. The delta load finished OK, but no data was loaded.
    If the delta load is launched again, will those 5,000,000 rows be loaded, or can they not be loaded again?
    Any idea about this issue? Any feedback will be really appreciated.
    Thanks in advance.

    Hi David,
    Are you sure these 5 million records are delta records and should be pulled into BW as delta?
    Did you count the number of records in the underlying tables?
    Which DataSource are you using?
    Delta loads are supposed to bring only the new or changed records since the last data load, not all the records.
    Since the request is green, as you said, and it still shows 0 records, it is possible that nothing has changed since the last delta. Try to see the details of the data load; if it failed, maybe that's the reason you are not able to see any records.
    If you schedule the delta again, it will bring the records changed or created after the last delta.
    If the delta was unsuccessful, then turn the QM status and overall status of the request to red manually in the monitor, delete the request from the PSA and all the targets into which it was loaded, and schedule the delta again; it should bring the delta.
    Thanks
    Ajeet
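The delta principle Ajeet describes can be sketched in a few lines (illustrative Python, not the POSDM extractor; the field name changed_on and the records are made up):

```python
# A delta request only returns records created or changed after the pointer
# left behind by the previous extraction.
from datetime import datetime

records = [
    {"order": "A1", "changed_on": datetime(2010, 10, 1)},
    {"order": "A2", "changed_on": datetime(2010, 10, 5)},
    {"order": "A3", "changed_on": datetime(2010, 10, 9)},
]

def delta_extract(records, last_pointer):
    """Return records changed after last_pointer, plus the new pointer."""
    delta = [r for r in records if r["changed_on"] > last_pointer]
    new_pointer = max((r["changed_on"] for r in delta), default=last_pointer)
    return delta, new_pointer

# First delta after an init on 2010-09-30 brings all three records.
delta1, ptr = delta_extract(records, datetime(2010, 9, 30))
# Running the delta again with nothing changed brings 0 records -
# a green request with 0 records, exactly as described above.
delta2, ptr = delta_extract(records, ptr)
print(len(delta1), len(delta2))
```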

  • How to restrict number of Data Records from Source system?

    Hi,
    How can I restrict the number of data records from the R/3 source system that are being loaded into BI? For example, I have 1000 source data records but only wish to transfer the first 100. How can I achieve this? Is there some option in the DataSource definition or the InfoPackage definition?
    Pls help,
    SD

    Hi SD,
    You can surely restrict the number of records. The best and simplest way is to check which characteristics are present in the selection screen of the InfoPackage, and check in R/3 which characteristics, if given a selection, would fetch you the desired number of records. Use that as the selection in the InfoPackage.
    Regards,
    Pankaj

  • Regarding data transfer from SAP system to XI

    Hi Experts,
    I have a program in which a file of type XSTRING is sent to SAP Business Connector through a function module. Now we want that file to be transferred to a third-party system through XI as the interface, instead of SAP BC.
    Can anybody suggest what the required steps will be?
    Also, please mention the required things to be done on the ABAP side.
    Thanks in adv.,
    Akash

    Hi Gordon,
    We don't have the IDoc creation part in our program, so it will be better if we can use the XSTRING file only.
    Can you suggest any function modules for data transfer to XI?
    Thanks,
    Akash

  • Function Module required to convert the date coming from external system

    Hi Friends,
       I need a function module that would convert the incoming date from the external system in format YYYYMMDD to the SAP system in DDMMYYYY. The external system's data type for the date is numeric. Please suggest an FM if you know one; I found many in SAP but didn't find one for this requirement.

    Wait a minute. I am definitely missing something here. You want to convert from an external format YYYYMMDD to SAP's internal format? SAP's internal format is YYYYMMDD.
    All you should have to do is move your external date directly to your internal date:
    data: date1(8) type c value '20051225'. "Format YYYYMMDD
    data: date2 type sy-datum.              "Format YYYYMMDD
    date2 = date1.
    write: / date2.
    When you write out date2 it will be in whatever format your user profile has.
    If you want to force the date format independent of your user profile settings during the WRITE statement:
    write: / date2 DD/MM/YY.
    Message was edited by: Thomas Jung
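For the original requirement (YYYYMMDD in, DDMMYYYY out), the reformatting itself is trivial; sketched here in Python for illustration rather than as an SAP function module:

```python
# Reformat an external numeric YYYYMMDD date as DDMMYYYY; going through
# datetime also validates that the input is a real calendar date.
from datetime import datetime

def yyyymmdd_to_ddmmyyyy(value):
    """value may arrive as int or string, e.g. 20051225 -> '25122005'."""
    return datetime.strptime(str(value), "%Y%m%d").strftime("%d%m%Y")

print(yyyymmdd_to_ddmmyyyy(20051225))
```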

  • How to restrict no. of data packets from source system into BW

    Hi,
    When I try loading data from the IDES system to the PSA in BI 7.0, I get the following error from data packet 41 onwards:
    Data Package 41 : arrived in BW ; Processing : Selected number does not agree with transferred n
    I have already read the various suggestions on the forum. Since I am loading data for testing purposes, I would like to restrict the data load to the first 40 data packets. Is this possible, and how?
    Pls help. Thanks
    SD

    Hi,
    I don't think there's a parameter you can set for the number of data packages.
    There's a parameter for the size of a data package, but that's different.
    Of course, you can always try to restrict the data you load with the selection in your InfoPackage.
    That's a more sensible way of doing it than restricting the number of data packages.
    Because even if you somehow managed to get them down to 40, you wouldn't know what data you were missing that the packages from 41 onwards would have brought, and this will lead to reconciliation issues with the R/3 data in your testing.
    -RMP
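RMP's point (you can tune the size of a data package, not the count; the count just falls out of the selected volume) comes down to one line of arithmetic. A small illustrative sketch, with made-up numbers:

```python
# Number of data packages = ceil(selected records / package size).
# Restricting the InfoPackage selection shrinks the record count, which is
# what actually reduces the package count.
import math

def number_of_packages(total_records, package_size):
    """Packages generated for a load of total_records rows."""
    return math.ceil(total_records / package_size)

# 1,000,000 selected records at 20,000 records per package -> 50 packages;
# halve the selection and the package count halves too.
print(number_of_packages(1_000_000, 20_000))
print(number_of_packages(500_000, 20_000))
```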

  • How the data transfer from r/3 to bw

    Hi gurus,
    For example, I am extracting data from the SD module.
    On the R/3 side some postings are done; exactly where do those postings land on the R/3 side?
    And then how do we take that posting data from the R/3 side to the BW side?
    Can anyone tell me the brief flow? No links please.
    I will appreciate your help and assign points.
    Thank you

    Hi,
    check the following
    http://help.sap.com/bp_biv235/BI_EN/html/bw.htm
    business content
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/biw/g-i/how%20to%20co-pa%20extraction%203.0x
    https://websmp203.sap-ag.de/co
    http://help.sap.com/saphelp_nw04/helpdata/en/37/5fb13cd0500255e10000000a114084/frameset.htm
    (navigate with expand left nodes)
    also co-pa
    http://help.sap.com/saphelp_nw04/helpdata/en/53/c1143c26b8bc00e10000000a114084/frameset.htm
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/fb07ab90-0201-0010-c489-d527d39cc0c6
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/1910ab90-0201-0010-eea3-c4ac84080806
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/ff61152b-0301-0010-849f-839fec3771f3
    FI-CO 'Data Extraction -Line Item Level-FI-CO'
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/a7f2f294-0501-0010-11bb-80e0d67c3e4a
    FI-GL
    http://help.sap.com/saphelp_nw04/helpdata/en/c9/fe943b2bcbd11ee10000000a114084/frameset.htm
    http://help.sap.com/saphelp_470/helpdata/en/e1/8e51341a06084de10000009b38f83b/frameset.htm
    http://www.sapgenie.com/sapfunc/fi.htm
    FI-SL
    http://help.sap.com/saphelp_nw2004s/helpdata/en/28/5ccfbb45b01140a3b59298c267604f/frameset.htm
    http://help.sap.com/saphelp_erp2005/helpdata/en/41/65be27836d300ae10000000a114b54/frameset.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/ee/cd143c5db89b00e10000000a114084/frameset.htm
    Please reward points.

  • Master Data Extraction from source system

    Hi BW Gurus,
    I need to extract the "Purchase Document Type" and "Purchase Document Status" master data from R/3.
    Can anybody shed some light on how I can find out which table in R/3 hosts this data?
    Thank you in advance, and I promise to assign points for your answers.
    Thank you

    Hi,
    I feel the base table for the purchasing document type is T161.
    For the status of the purchasing document there is no base table; the available values come from fixed domain values.
    With rgds,
    Anil Kumar Sharma .P

  • Data Flow from the Source System Side: LUWs and Extraction Structures

    Hi
    Can anybody explain the data flow from the source system to the BI system? I especially mean the extract structure and the LUWs: where do they come into the picture in the core data flow of the inbound and outbound queues? A link to a document would also be helpful.
    Regards
    Santosh

    Hi, see these articles:
    http://wiki.sdn.sap.com/wiki/display/profile/Surendra+Reddy
    Data Flow from LBWQ/SMQ1 to RSA7 in ECC (Records Comparison).
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/enterprise-data-warehousing/data%20flow%20from%20lbwq%20smq1%20to%20rsa7%20in%20ecc%20(Records%20Comparison).pdf
    Checking the Data using Extractor Checker (RSA3) in ECC Delta Repeat Delta etc...
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/80f4c455-1dc2-2c10-f187-d264838f21b5&overridelayout=true 
    Data Flow from LBWQ/SMQ1 to RSA7 in ECC and Delta Extraction in BI
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/d-f/data%20flow%20from%20lbwq_smq1%20to%20rsa7%20in%20ecc%20and%20delta%20extraction%20in%20bi.pdf
    Thanks
    Reddy

  • Data transfer from one crm system (Prod) to another crm system ( quality)

    Hi,
    I have a requirement in which I have to transfer a set of data (BP, Product, Sales Order, etc.) for a given date range from the production system to the quality system. I am more concerned about the business documents which move from CRM to ERP.
    I thought of moving the outbound BDocs which are generated in the CRM system for passing the data into ERP from the Prod to the Qty system, and trying to reprocess them as inbound BDocs in Qty through some program. But I came to know that BDocs do not contain the full data of the business documents created in the CRM system, so even if I reprocess those BDocs I will not get the full data of those business documents.
    Is there any other alternative?
    Regards,
    M Khan

    Hi,
    In all data transfers between SAP systems, RFC is the best option.
    Just create a source system for the new BW system and it will work.
    Thanks
    Message was edited by:
            Ajeet Singh
