Delta upload in XI and calling InfoPackages in an inbound proxy

Hi All
I have a scenario where File ---> XI ---> BW.
In this scenario I have to pick up a file, compare it, and send it to the BW system. Before sending it to BW I have to apply delta-upload logic in XI.
Requirement: the most recent file contains data for both the current week and the previous week. We have to eliminate the previous week's data and send only the most recent week's data to BW in file format, i.e. the data of the recent week will be sent and the data of previous weeks eliminated.
I have one more query: how do I call six different InfoPackages from an ABAP inbound proxy?
Does anyone know the process for doing this?
Abhishek Mahajan
Any guidance would be a great help.
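One common approach (not spelled out in this thread) is to have the inbound proxy method on the BW side start each InfoPackage with the standard BAPI `BAPI_IPAK_START`. A minimal sketch; the six InfoPackage technical names below are placeholders:

```abap
METHOD execute_asynchronous.
  " Sketch: trigger six InfoPackages from an inbound proxy method.
  " The InfoPackage IDs below are placeholders - use your own.
  DATA: lt_ipak      TYPE TABLE OF rslogdpid,
        lv_ipak      TYPE rslogdpid,
        lv_requestid TYPE rsrequnr,
        ls_return    TYPE bapiret2.

  APPEND 'ZPAK_PLACEHOLDER_ID_1' TO lt_ipak.
  APPEND 'ZPAK_PLACEHOLDER_ID_2' TO lt_ipak.
  " ... append the remaining four InfoPackage IDs ...

  LOOP AT lt_ipak INTO lv_ipak.
    CALL FUNCTION 'BAPI_IPAK_START'
      EXPORTING
        infopackage = lv_ipak
      IMPORTING
        requestid   = lv_requestid
        return      = ls_return.
    IF ls_return-type CA 'EA'.
      " Error or abort: log it or raise a proxy fault as required
      MESSAGE ls_return-message TYPE 'I'.
    ENDIF.
  ENDLOOP.
ENDMETHOD.
```

The BAPI returns the request ID of the load it triggered, so the proxy can log it or monitor the request status afterwards.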

Similar Messages

  • Delta upload of structure and content in CCM catalogs

    Hi guys,
    I want to do a delta upload of structure and content in supplier catalogs, and a delta upload of structures in the master catalog of my procurement catalog.
    With the CSV file I'm using now, every time I end up doing an initial (full) upload.
    This is the file I'm using (example):
    SAP CATALOG CSV 2.0 <,>FULL,,,,,,,,,,,,,,
    Schema,OS,Image,Sort,{}Office Supplies Schema,,,,,,,,,,,
    Category,OS,,,,{}Office Supplies,,,,,,,,,,
    Category,275mm_Lateral_Suspension_Files,OS,,{}275mm Lateral Suspension Files,,,,,,,,,,,
    Category,330mm_Lateral_Suspension_Files,OS,,{}330mm Lateral Suspension Files,,,,,,,,,,,
    ItemValuation,parent cty,/ccm/supplier_part_no,/ccm/supplier_name,/ccm/short_description,/CCM/LONG_DESCRIPTION,/CCM/order_unit,/ccm/price[1]#/ccm/amount,/ccm/price[1]#/ccm/currency_code,/ccm/contract_id,/ccm/contract_item_id,/ccm/Product_Group,/ccm/lead_time,/ccm/supplier_id,/ccm/product_id,/ccm/base_uom, Sort
    1,275mm_Lateral_Suspension_Files#OS,13,Office Supplies 2,Patch para cable 9231,Patch para cable 9231,PCE,2000,CLP,4410000091,20,44120000,3,100006,2611,PCE,2
    2,275mm_Lateral_Suspension_Files#OS,548750,Office Supplies 3,VRF RS DATO, VRF RS dato,PCE,10000000,CLP,4410000088,10,44120000,2,100007,1000037,PCE,3
    What am I doing wrong? Maybe it is the FULL indicator in the CSV file provoking this?
    Thanks for any help.

    I think your idea is correct regarding the "FULL" indicator.
    Refer to the SAP help document:
    We leave this blank when doing delta uploads - that works for us.
    Here is the comment from the help file to support this change:
    Enter the identifier full if you want the CSV file to completely replace the existing catalog.
    If you leave this column blank, SAP Catalog Authoring Tool treats the CSV file as an update file: only those catalog objects (for example, schema, categories, items) that are contained in the CSV file are replaced. All other catalog objects remain unchanged. This means you cannot use an update to delete catalog objects.
    SAP CATALOG CSV 2.0 <|> full
    The separator is “|” and this upload replaces the existing catalog.
    End of the example.
    If you use “Microsoft Excel” to create a CSV file (with “;” as the separator), you must ensure that the content of the first line is located in the first column (the first cell of the worksheet).
    End of the note.
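    Applying the quoted help text, a delta (update) upload of the catalog above would keep the same first line but leave the identifier after the separator definition blank, for example (a sketch using the "," separator from the original file):

    ```
    SAP CATALOG CSV 2.0 <,>,,,,,,,,,,,,,,
    ```

    Only the catalog objects contained in the rest of the file are then replaced; everything else in the catalog stays unchanged.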

  • Upload data from an ABAP program to an ABAP inbound proxy

    I have a requirement to upload flat-file data into an internal table, call an inbound proxy ABAP class, and pass all the internal table data to the proxy class method's structure.
    Could anyone help me with how to send/pass the data to the class? Please give an example.
    Edited by: Bobby G on Nov 18, 2009 4:35 AM

    You may follow one of these approaches:
    1. Create a report that uses GUI_UPLOAD (with the path given as a default) to convert the flat file's data into an internal table.
    2. Call that report from the proxy method, returning the table as a parameter; this table can then be used further in the proxy.
    Alternatively:
    1. Create a transparent table and store the data in it using GUI_UPLOAD in a report.
    2. Read the data from that table in the proxy.
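    A condensed sketch of the first approach (the proxy class ZCL_FILE_PROXY, its method, and the message types are placeholder names invented for illustration, not actual generated proxy objects):

    ```abap
    REPORT z_file_to_proxy.

    * Hypothetical row structure matching the flat file layout
    TYPES: BEGIN OF ty_rec,
             matnr TYPE char18,
             menge TYPE char17,
           END OF ty_rec.

    DATA: lt_file  TYPE STANDARD TABLE OF ty_rec,
          ls_rec   TYPE ty_rec,
          lo_proxy TYPE REF TO zcl_file_proxy,   " placeholder proxy class
          ls_out   TYPE zmt_file_data.           " placeholder message type

    PARAMETERS p_file TYPE string DEFAULT 'C:\data.txt' LOWER CASE.

    START-OF-SELECTION.

    * 1. Load the flat file into an internal table
      CALL FUNCTION 'GUI_UPLOAD'
        EXPORTING
          filename            = p_file
          filetype            = 'ASC'
          has_field_separator = 'X'
        TABLES
          data_tab            = lt_file
        EXCEPTIONS
          OTHERS              = 1.
      IF sy-subrc <> 0.
        MESSAGE 'File upload failed' TYPE 'E'.
      ENDIF.

    * 2. Map the rows into the proxy message structure and call the proxy
      LOOP AT lt_file INTO ls_rec.
        APPEND ls_rec TO ls_out-file_data-row.   " illustrative field mapping
      ENDLOOP.

      CREATE OBJECT lo_proxy.
      lo_proxy->execute_asynchronous( output = ls_out ).
      COMMIT WORK.   " an asynchronous proxy message is sent on commit
    ```

    For an asynchronous outbound call, the COMMIT WORK is what actually hands the message over to the integration engine.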

  • Generic Datasource Delta upload Issue

    Hi all,
    I have created a generic DataSource for the Solution Manager ODS 0crm_process_srv_h which contains only these fields: the user status, transaction number, transaction description, and the GUID of the CRM order object (the ODS key field).
    I have taken the transaction number as the delta field, as a timestamp, but the delta upload is not happening. If I use the GUID or the status instead, the delta upload is still not happening. Please suggest the best field to use for the delta upload.
    Thanks and Regards,

    In your case you need to create a function module as an extractor. Create an extract structure as well and add a timestamp field (let's call it ZDELTAHELP) to be used as the delta-relevant field of the DataSource. After a successful init, the last extraction timestamp will be passed to the function module, and you can then select all RESB entries for orders created and/or changed from that time on, as well as the deleted ones (for which you need to read the change documents as well). For more information about generic extraction using a function module, search the forum or check out my business card, where you will find a link to a weblog about this topic.
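    Such an extractor function module is usually modeled on the template RSAX_BIW_GET_DATA_SIMPLE: on the init call BW passes the selection criteria (including the last extraction timestamp for the delta-relevant field), and the later calls return data packages until NO_MORE_DATA is raised. A condensed sketch; the table ZORDERS, the extract structure ZOXEXT, and the field ZDELTAHELP are placeholders:

    ```abap
    FUNCTION z_biw_get_orders_delta.
    *"  Interface as in the template RSAX_BIW_GET_DATA_SIMPLE:
    *"  IMPORTING VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG
    *"            VALUE(I_MAXSIZE)  TYPE SRSC_S_IF_SIMPLE-MAXSIZE
    *"  TABLES    I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT
    *"            E_T_DATA STRUCTURE ZOXEXT OPTIONAL
    *"  EXCEPTIONS NO_MORE_DATA

      STATICS: s_cursor TYPE cursor,
               s_ts_low TYPE timestamp.
      DATA ls_select TYPE rsselect.

      IF i_initflag = 'X'.
        " Init call: remember the timestamp BW passes for the delta field
        READ TABLE i_t_select INTO ls_select WITH KEY fieldnm = 'ZDELTAHELP'.
        s_ts_low = ls_select-low.
        OPEN CURSOR WITH HOLD s_cursor FOR
          SELECT * FROM zorders WHERE zdeltahelp >= s_ts_low.
      ELSE.
        " Data calls: return one package per call until the cursor is empty
        FETCH NEXT CURSOR s_cursor
          INTO CORRESPONDING FIELDS OF TABLE e_t_data
          PACKAGE SIZE i_maxsize.
        IF sy-subrc <> 0.
          CLOSE CURSOR s_cursor.
          RAISE no_more_data.
        ENDIF.
      ENDIF.
    ENDFUNCTION.
    ```

    Picking up deleted records would need an additional read of the change documents, as mentioned above.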

  • Delta Upload Problem

    Hi All,
    I have an ODS which feeds a cube; I designed this to give delta capability to the cube. I loaded 17,000 records (delta upload) to the ODS and it says transferred 17,000 and added 17,000, but only 12,000 records were transferred to the cube, and there are no routines in the update rules from the ODS to the cube. I am not sure why only 12,000 records were transferred to the cube when 17,000 records were added in the ODS.

    Hi Shetty,
    If your keys are key1 key2 key3 key4 with Qty and Value, and you want overwriting behavior, you have to load in overwrite mode.
    It works like this:
    key1 key2 key3 key4 Qty value
    A     B   C    D    10  100  -->> already there
    if you are loading new data from Flat file in overwrite mode contains below data then,
    It works like this:
    key1 key2 key3 key4 Qty value
    A     B   C    D    15  150 
    A     B   C    E    20  200
    After this load your ODS data will be
    key1 key2 key3 key4 Qty value
    A     B   C    D    15  150 
    A     B   C    E    20  200
    Hope it Helps

  • Just updated to iOS 8 and now can't hear anyone when making a call. If I put the speaker on then I can hear! Any advice?

    Just updated to iOS 8 and now I can't hear anyone when making a call. If I put the speaker on then I can hear! Any advice?

    I just tried to help someone in another discussion with a similar problem: the headset input was not working.
    I would suggest an iPhone restore and a fresh load from backup... but there is no guarantee.

  • Init and delta uploads using process chain

    I have a scenario where for the first upload I use an init upload, for which I have an InfoPackage. For the subsequent uploads, i.e. delta uploads, I only get the delta option once I have run the init upload.
    So my doubt is: can I use a single process chain to carry out the whole task?

    Hi Virat,
    As init is a one-time activity, what you can do is:
    1. Set the init in the InfoPackage and run the process chain.
    2. Remove that InfoPackage and include a new InfoPackage with the delta option, or change the existing InfoPackage to the delta option.
    3. Schedule the InfoPackage.
    What you are expecting (one chain doing both) is not possible.
    Hope I answered your question.
    Thanks and regards,

  • How to do a delta upload and consolidation in SSM, and how to use transformer.ini

    Can anybody explain how to do a delta data upload and also how to do delta data consolidation in SAP SSM?
    Also, how can I create and use the transformer.ini file?
    Thanks and regards
    Edited by: Himanshu Bisla on Mar 25, 2009 2:09 PM

    Where from are you planning to upload data?
    When you mention a delta upload, does it mean loading new data for a new PERIOD, where no data yet existed, or loading values to measures for periods where data already exists, doing a SUM?
    When loading from an external source, you use a READ command, which by default replaces existing values with those that are being loaded. If you add the parameter ADD or SUBTRACT to that IDQL command you are able to change this default behaviour to enable delta loads (with SUM or SUBTRACTION operations).
    As for the CONSOLIDATIONS, you can define a date range for the consolidation, or run it for a single period (month, quarter) if needed, or restrict the number of measures you want to consolidate. There is no way to automatically define that only the measures and periods that have new data will be consolidated. This has to be made explicit in the CONSOLIDATE instruction.
    Regarding the Transformer.ini, are you planning to use PAS as an ETL tool? Unfortunately, I can only point you to the information that is available on the PAS Help, but that's probably where you found the reference to this function in the first place...
    Hope this helps!
    Best regards,

  • Delta upload in different times and targets

    Dear BW Gurus,
    I have to fill a Master Data table with a delta (for performance reasons) from an ODS object, and later fill an InfoCube with the same delta from the same ODS Object. This two step approach is necessary because the Cube uses the Master Data filled in the first step - so I cannot do the two uploads within the same InfoPackage.
    My approach was the following in a Process Chain:
    1. Fill and activate "source ODS"
    2. Load request from "source ODS" to PSA only
    3. Update Material Master from PSA using "Read PSA and Update Data Target" process with the variant setting the PSA table and the Master Data InfoObject as Data Target.
    4. Run "Attribute Change" process for the Master Data IO
    5. Update InfoCube from the same PSA that was used for the Master Data InfoObject. This is also a "Read PSA and Update Data Target" process with the variant setting the same PSA table as before and now the InfoCube as the Data Target.
    To me this seemed logical but when I run the Process Chain I got the following error message at point 3 (update Master Data from PSA): "No request for posting found in the variant".
    This appears strange to me because if I specified the request, it would only work for the first upload and not for all the others.
    Can anyone please help me how to overcome this issue of Delta upload in different times and targets?
    Thanks in advance,

    If I create two delta InfoPackages, the first one clears the delta queue, so the second InfoPackage won't find any data. Unfortunately that is not the way to do it...
    Your proposal is quite original - if I upload the same ODS with its data (amended by the date field) that would in fact generate a new delta for the same ODS.
    However, in the meantime a more simple idea came up in our BW team, inspired by your proposal. For the uploading of the Master Data I now don't use the delta functionality of the ODS export Data Source, but instead I set up a full upload filtered for the actual date. This ensures the good performance of Master Data load (load time will not increase with time) and also we will have all recent Master Data uploaded from the ODS because that load is in the same Process Chain where the "Source ODS" is updated. Later in the same Process Chain I simply start the delta upload for the Cube.
    Thanks a lot for all your help.

  • RSA7 shows zero records for delta upload of 2LIS_03_BF Datasource

    Dear Professionals,
    I am having constant issues with the inventory management DataSource (2LIS_03_BF). I scheduled the delta upload in the R/3 production system using LBWE, keeping in mind that LBWE has no options for posting dates, etc. In RSA3 there are records, but when I check RSA7, it shows 0 records. What could be the problem? Do I need to do any activation anywhere in LBWE or RSA7 to see the records? Please advise.

    Normally there should be a collective run which transfers data from the application-specific queues to the BW delta queue (RSA7), just like this workflow:
    MCEX03 queue==> RMBWV03 (Collective run job) ==> RSA7-Delta queue => BW
    It is necessary to schedule the job control regularly - see point 3 of SAP Note 505700.
    Check also these SAP Notes:
    869628     Constant WAITUPDA in SMQ1 and performance optimization
    728687     Delta queued: No data in RSA7
    527481     tRFC or qRFC calls are not processed
    739863    Repairing data in BW

  • Routine,delta upload

    Hi experts,
    I am new to BW and SDN.
    Could you answer these questions that I was asked in an interview?
    1) What is the scenario for a start routine?
    2) How do you generate a report for the present year and the previous three years?
    3) What is the mechanism behind the delta upload?
    thanks in advance

    Hi Krishna,
    1) Start routine - A start routine is executed prior to the execution of the transfer rules, so if there is any global variable to be used in all transfer rules, it is better to declare it in the start routine. After the start routine's execution is over, the transfer rules are executed.
    Start routines exist at two levels:
    1. Update rules
    2. Transfer rules
    A start routine is processed once per data package load. For example, if you want to eliminate a set of records based on some criteria, you can do that in a start routine. Update rules and transfer rules, by contrast, are processed once per data record, where you can manipulate each single record.
    There is an internal table called DATA_PACKAGE which holds all the data available for processing; you can store that data in another internal table (which you declare) to be used later in the update rules.
    3) Delta upload mechanism - This is simply the mechanism that updates only the changed data, rather than doing a full update. The field 0RECORDMODE determines whether records are added or overwritten, i.e. how a record is updated in the delta process: a blank signifies an after image, 'X' a before image, 'D' deletes the record, and 'R' means a reverse image.
    You can check following link for details.
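    For instance, the record elimination mentioned in 1) usually comes down to a single DELETE on DATA_PACKAGE in the start routine (the field name /BIC/ZZSTATUS and the filter value here are placeholders):

    ```abap
    * Start routine sketch: filter the data packet before the transfer
    * or update rules run. DATA_PACKAGE holds the whole packet.
    DELETE DATA_PACKAGE WHERE /bic/zzstatus = 'D'.

    * ABORT <> 0 cancels the load; leave it at 0 when everything is fine.
    ABORT = 0.
    ```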

  • Failure in Delta Upload

    We have a delta upload for our purchasing data to BW. From the InfoPackages, the data is updated to ODS 0PUR_O01, which provides data to InfoCubes 0PUR_C08 and 0PUR_C07 and to ODS 0PUR_O02 (which in turn provides data to InfoCube 0PUR_C09).

    Three days back, one request in ODS 0PUR_O01 went into error status. After marking the request red, it was deleted from the ODS; but since that request had the same PSA ID as another successful (green) request in the same ODS with the data mart status ticked, the green request also got deleted by mistake. As a result, the data mart status for all the requests in ODS 0PUR_O01 is missing, and from that time onwards the subsequent data updates to InfoCubes 0PUR_C08 and 0PUR_C07 and to ODS 0PUR_O02 (and also to InfoCube 0PUR_C09) have been finishing in error status.

    We have deleted all the requests for the previous three days (after marking them red) from the data targets (InfoCubes 0PUR_C08, 0PUR_C07, and 0PUR_C09 and ODS 0PUR_O02). We also deleted the corresponding requests from ODS 0PUR_O01 (after marking them red). Now when we try to reconstruct the requests in 0PUR_O01 it succeeds, but when we try to activate the requests into the subsequent data targets (background job BI_ODSA*), the job fails with a short dump and the following error message.
    Error Message:-
    Check Load from InfoSource 80PUR_O01 , Packet Delta Package for Update by ODS 0PUR_O01
    Please execute the mail for additional information.
    Error message from the source system
    An error occurred in the source system.
    System response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.
    How you remove the error depends on the error message.
    If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
    Short Dump:-
    What happened?
       The current application program detected a situation which really
       should not occur. Therefore, a termination with a short dump was
       triggered on purpose by the key word MESSAGE (type X).
    What can you do?
       Print out the error message (using the "Print" function)
       and make a note of the actions and input that caused the
       To resolve the problem, contact your SAP system administrator.
       You can use transaction ST22 (ABAP Dump Analysis) to view and administer
        termination messages, especially those beyond their normal deletion
       is especially useful if you want to keep a particular message.
    Error analysis
       Short text of error message:
       Request 0000017556 in ODS 0PUR_O02 must have QM status green before it i
       s activated
       Technical information about the message:
       Message classe...... "RSM1"
       Number.............. 110
       Variable 1.......... 0000017556
       Variable 2.......... "0PUR_O02"
       Variable 3.......... " "
       Variable 4.......... " "
       Variable 3.......... " "
       Variable 4.......... " "
    How to correct the error
       Probably the only way to eliminate the error is to correct the program.
       You may able to find an interim solution to the problem
       in the SAP note system. If you have access to the note system yourself,
       use the following search criteria:
       If you cannot solve the problem yourself and you wish to send
       an error message to SAP, include the following documents:
    Please suggest a solution for this problem and to bring back the missing data mart statuses of the all the requests in ODS 0PUR_O01.

    hi olivier,
       the said request is in ODS 0PUR_O02. will try to explain the scenario:
    the data flow in this regard is:
    PSA (InfoPackages 2LIS_02_SGR, 2LIS_02_ITM, etc.)
        --> ODS 0PUR_O01 --> InfoCube 0PUR_C07
        --> ODS 0PUR_O01 --> InfoCube 0PUR_C08
        --> ODS 0PUR_O01 --> ODS 0PUR_O02 --> InfoCube 0PUR_C09
    Now when we try to reconstruct a request in ODS 0PUR_O01, a new request is created (request ID 175429) with green status but without the data mart icon. When this request is activated in 0PUR_O01, then, since "Activate ODS object data automatically" and "Update data targets from ODS object automatically" are checked in 0PUR_O01, new requests are created in ODS 0PUR_O02 (request ID 17566) and in InfoCubes 0PUR_C07 (request ID 17566) and 0PUR_C08 (request ID 17566). All these requests contain 0 records and their PSA IDs are also 0, and then the above short dump and error come in. One thing to note here is that
    the "Activate ODS object data automatically" and "Update data targets from ODS object automatically" options are also checked in ODS 0PUR_O02.
    Hope this clears up everything..
    Edited by: Nivin Joseph Varkey on Jan 14, 2008 7:20 AM

  • Delta upload for generic DataSources

    Hi all,
    I tried as per the SAP online documentation, but I am still not getting the result, i.e. a delta upload on the BW side. Here is everything I did.
    In R/3:
    1. I created a table with 3 fields: SNO, SNAME, DOB.
    2. Then I created some entries in that table.
    3. With transaction RSO2 I created one DataSource; I chose a master data attribute DataSource.
    4. In the generic delta settings I took DOB as the delta-relevant field, of type timestamp, with an upper limit of 10 seconds.
    On the BW side:
    1. I replicated the DataSource under the application component I created in R/3.
    2. Then I activated the DataSource and created an InfoPackage for it.
    3. In the selection I specified 01.01.1900 to 01.01.9999.
    4. First I ran a full update and got all the records from R/3.
    5. In R/3 I created 2 more entries.
    6. In the InfoPackage's Update tab I selected "Initialize Delta Process (Initialization with Data Transfer)".
    For this I am getting the following error message:
    Error message from the source system
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.
    How you remove the error depends on the error message.
    If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
    I closed everything in R/3 as per the note mentioned above, but it still gives the error message.
    If I select "Initialize Delta Process (Initialization without Data Transfer)", I then get the Delta Update option. I selected this delta update and scheduled it, but no data is coming.
    Please help.
    Prashanth K

    Hi Sachin,
    I am getting the problem at the PSA itself, which is connected to an ODS object. Unless we get the delta data into the PSA, we can't proceed further. So please help. I am working on NW 2004s.
    Prashanth K

  • To upload an RTF and a PDF file to SAP R/3 and print them through SAP

    I have a requirement to upload a PDF file and an RTF file to SAP R/3 and print them.
    I wrote the following code to upload an RTF file to SAP R/3 and print it. However, the problem is that the formatting present in the RTF document (bold, italics, etc.) is not reflected when I do a print preview after executing the code below:
    report z_test_upload.

    * Reconstructed from the garbled post: the commented-out parameter lines
    * of the original have been restored into complete function calls. The
    * conversion and print calls are assumed to be the SAPscript function
    * modules CONVERT_TEXT and PRINT_TEXT, based on their parameter names.

    data: begin of itab occurs 0,
            rec type string,
          end of itab.
    data: begin of rtfline occurs 0,
            line(134) type c,
          end of rtfline.
    data: itfline type tline occurs 0 with header line.
    data: header    type thead,
          newheader type thead,
          options   like itcpo.
    data: filename type string.
    data: filetype(10) type c value 'ASC'.
    data: string_len type i,
          n1 type i.

    selection-screen begin of block b1.
    parameter: p_file1(128) default 'C:\test_itf.rtf'.
    selection-screen end of block b1.

    move p_file1 to filename.

    call function 'GUI_UPLOAD'
      exporting
        filename                = filename
        filetype                = filetype
      tables
        data_tab                = itab
      exceptions
        file_open_error         = 1
        file_read_error         = 2
        no_batch                = 3
        gui_refuse_filetransfer = 4
        invalid_type            = 5
        no_authority            = 6
        unknown_error           = 7
        bad_data_format         = 8
        header_not_allowed      = 9
        separator_not_allowed   = 10
        header_too_long         = 11
        unknown_dp_error        = 12
        access_denied           = 13
        dp_out_of_memory        = 14
        disk_full               = 15
        dp_timeout              = 16
        others                  = 17.
    if sy-subrc <> 0.
      message id sy-msgid type sy-msgty number sy-msgno
              with sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    endif.

    * Split each uploaded line into 134-character pieces for the converter
    loop at itab.
      string_len = strlen( itab-rec ).
      n1 = string_len div 134.
      add 1 to n1.
      do n1 times.
        rtfline-line = itab-rec.
        append rtfline.
        shift itab-rec by 134 places.
      enddo.
    endloop.

    header-tdspras = 'E'.

    * Convert the RTF lines into SAPscript ITF format
    call function 'CONVERT_TEXT'
      exporting
        codepage    = '0000'
        direction   = 'IMPORT'
        format_type = 'RTF'
        formatwidth = 72
        header      = header
        ssheet      = 'WINHELP.DOT'
        with_tab    = 'X'
        word_langu  = sy-langu
        tabletype   = 'ASC'
      importing
        newheader   = newheader
      tables
        foreign     = rtfline
        itf_lines   = itfline.
    if sy-subrc <> 0.
    * handle conversion errors here
    endif.

    * Print (or preview) the converted ITF text
    call function 'PRINT_TEXT'
      exporting
        header  = newheader
        options = options
      tables
        lines   = itfline.
    if sy-subrc <> 0.
    * handle print errors here
    endif.
    Any hints or suggestions to solve this problem will be highly appreciated.

    Hi Vishwas,
    Check out the thread "Efficient way of saving documents uploaded by users" and the blog by Raja Thangamani.
    Also check the thread "Export Images through Function Modules".
    Hope it helps you.

  • Delta upload job gets cancelled in SAP R/3

    A question: during delta loading from SAP R/3 to BW, if the delta upload job gets cancelled twice for some reason, will I lose some delta data?
    Pls advise!
    Many Thanks!

    If your delta jobs are erroring out in BW, then you don't need to worry, because until then the delta pointer in the source system is not yet updated. But if you set any request to green manually in the BW monitor, then your delta pointer will be updated and you might lose some delta data.
    You can request a repeat delta multiple times without any issue.
