Can we do multiple DTPs to load data from 2LIS_03_BF and compress?

Dear Experts
We are loading data in SAP BI from the PSA level to cube 0IC_C03 from the 2LIS_03_BF DataSource using DTPs.
Data from the 2LIS_03_BF DataSource has already been loading daily to the retail cubes for a long time.
For our client's requirement, we have set up the standard Business Content cube 0IC_C03.
I have transported the cube to PRD, and now the issue is that while running the DTP
it fails at data packet 1250 out of 2950.
My question is: can I create a filter on 0CALMONTH or plant so that the material movement data is loaded through multiple DTPs, and can I then compress all the requests in one go and have everything reconcile with MC.9?
Awaiting your replies.
Thanks
Satish

Well, the first thing to do before anything else is to find out why it is failing...
Check the DTP properties... Are you running parallel processes in the DTP that fail to finish? Is it a single serial process that times out? Any dumps in ST22?
Sometimes the issue is simple to fix. Before you go ahead and make changes like the split you describe, keep in mind that parallel loads have a good chance of locking each other while trying to write to the same tables at the same time...
Just a suggestion...
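If, after finding the root cause, you still want to split the load by 0CALMONTH or plant, a routine on the DTP filter is one way to compute the selection per run. A minimal sketch, assuming the classic system-generated routine interface (TABLES l_t_range, CHANGING p_subrc); the month values are placeholders that each DTP of the split would vary:

* Sketch of a filter routine on 0CALMONTH. The FORM skeleton around
* this body is generated by the system; the values are placeholders.
DATA: l_idx LIKE sy-tabix.

READ TABLE l_t_range WITH KEY fieldname = 'CALMONTH'.
l_idx = sy-tabix.

l_t_range-sign   = 'I'.
l_t_range-option = 'BT'.
l_t_range-low    = '200801'.   "first month of this slice
l_t_range-high   = '200806'.   "last month of this slice

MODIFY l_t_range INDEX l_idx.
p_subrc = 0.

Compressing all the resulting requests in one go is then possible once they are all green; for 0IC_C03 specifically, double-check the 'No Marker Update' setting during compression, since it decides whether the stock marker stays consistent for reconciliation against MC.9.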

Similar Messages

  • Load data from file and send to background

    Hello, is it possible to load data from a file on the presentation server and then create a batch input to call a transaction with those values, but in the background, NOT using a dataset?
    Thanks in advance.

    You can surely design the process in various ways; one of them is as below:
    Upload ABAP (online):
    1. Reads the file from the presentation server and EXPORTs it to the INDX database with a specific filename as the key.
    2. Updates a ztable with user / upload date and time / filename / status = 'U'.
    Batch ABAP (background):
    1. Reads the ztable for all files with status = 'U'.
    2. Using the filename, IMPORTs the data from the INDX database.
    3. Creates the batch input to call the transaction.
    4. Sets the status to processed, along with the process date and time.
    With the above you can have multiple users uploading data and the batch job picking up each file and doing the needful. A sketch of the INDX hand-over follows below.
    Regards
    Anurag
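    A hedged sketch of the INDX hand-over described above; the cluster area 'ZZ', the key value, the file path, and the flat string table are illustrative assumptions, not part of Anurag's original design:

    * Upload program (online): read the file from the presentation server...
    DATA: it_file TYPE TABLE OF string,
          l_key   TYPE indx-srtfd VALUE 'FILE0001'.  "unique filename key

    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename = 'C:\upload\data.txt'
      TABLES
        data_tab = it_file.

    * ...and park it in the INDX cluster under that key.
    EXPORT it_file FROM it_file TO DATABASE indx(zz) ID l_key.

    * Batch program (background): retrieve the data by the same key,
    * taken from the ztable entry with status = 'U'.
    IMPORT it_file TO it_file FROM DATABASE indx(zz) ID l_key.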

  • What is the difference between loading data from SBIW and LO (LBWE)

    Hi all experts,
    I am wondering what I need to do to generate reports (info structure S001) in BW. Can I just load data from the "Business Content DataSources" (SBIW) -> select S001, or do I have to load data from LO?
    What is the difference between the two (SBIW, LBWE)?
    Thank you
    Koala

    Hi Koala,
    After executing the SBIW tcode you get a screen from which other tcodes are accessible, including LBWE. Namely, in the SBIW screen,
    'Data Transfer to the SAP Business Information Warehouse / Settings for Application-Specific DataSources (PI) / Logistics / Managing Extract Structures / Logistics Extraction Structures Customizing Cockpit' is the LBWE tcode!
    As I already answered you in
    anyone experienced data load for S001?
    you need to choose between LIS extraction and LO extraction, not between SBIW and LBWE.
    BTW, managing the LIS setup is also situated in SBIW:
    ...Logistics / Managing Transfer Information Structures / Application-Specific Setup of Statistical Data.
    Best regards,
    Eugene

  • DTP error when loading data from one cube to another cube?

    hi, experts,
    it seems strange that when I tick some characteristics as navigational attributes in the cube, an error occurs during execution of the DTP: error while updating to target Z* (cube name).
    Once I turn the flag off, no error appears. Could anyone give me a clue?
    thanks in advance!

    Hi,
    When you make changes in the cube you need to make the corresponding changes in the transformation and reactivate the DTP. The navigational attributes you checked in the cube will appear in your transformation as new fields; create a mapping for them and activate the transformation, then activate your DTP and load the data. This should resolve your issue.
    Regards

  • How can I use a hash table when processing the data from CDPOS and CDHDR

    Hello Gurus,
    I have a question.
    I need to reduce the access time to both CDHDR and CDPOS,
    because I may get a huge number of entries.
    Processing CDHDR and CDPOS data can take many seconds,
    depending on how much data you need to find.
    Hint: will putting the instructions inside a FORM slow down the program?
    Also, I want to use a hashed table, and I need a loop going over the hashed table inside the FORM.
    I know indexed access to a hashed table is not possible, but I can declare my own index field inside my customized hashed table.
    For example:
    DATA ls_line LIKE LINE OF specific_hash_table.
    DO.
      " read by the self-maintained index field, which is the table key
      READ TABLE specific_hash_table INTO ls_line
           WITH TABLE KEY oindex = d_oindex.
      " process data...
      d_oindex = d_oindex + 1.
      IF d_oindex > c_max_lines.
        EXIT.
      ENDIF.
    ENDDO.
    Doing this would not necessarily improve performance: accessed this way the table behaves like a standard table. There may be a hash function, but it could be a bad one.
    Also, I need something like COUNT(*) to know how many lines the SELECT returns.
    FORM find_cdpos_data_with_loop
      TABLES
        i_otf_objcs TYPE STANDARD TABLE
      USING
        i_cdhdr_data TYPE HASHED TABLE
        i_objcl TYPE j_objnr
    *    i_obj_lst TYPE any
        i_option TYPE c
      CHANGING
        i_global TYPE STANDARD TABLE.
      " Hint: cdpos is a cluster-table
      CONSTANTS : objectid TYPE string VALUE 'objectid = i_obj_lst-objectid',
                  changenr TYPE string VALUE 'changenr = i_obj_lst-changenr',
                  tabname TYPE string VALUE 'tabname = i_otf_objcs-tablename',
                  tabnameo1 TYPE string VALUE 'tabname NE ''''',
                  tabnameo2 TYPE string VALUE 'tabname NE ''DRAD''',
                  fname TYPE string VALUE 'fname = i_otf_objcs-fieldname'.
      DATA : BEGIN OF i_object_list OCCURS 0,
                objectclas LIKE cdpos-objectclas,
                objectid LIKE cdpos-objectid,
                changenr LIKE cdpos-changenr,
             END OF i_object_list.
      DATA : i_cdpos LIKE TABLE OF i_object_list WITH HEADER LINE,
             i_obj_lst LIKE LINE OF i_cdpos.
      DATA : tabnamev2 TYPE string.
      IF i_option EQ 'X'.
        MOVE tabnameo2 TO tabnamev2.
      ELSE.
        MOVE tabnameo1 TO tabnamev2.
      ENDIF.
    *LOOP AT i_cdhdr_data INTO i_obj_lst.
      SELECT objectclas objectid changenr
        INTO TABLE i_cdpos
        FROM cdpos
        FOR ALL ENTRIES IN i_otf_objcs
        WHERE objectclas = i_objcl AND
              (objectid) AND
              (changenr) AND
              (tabname) AND
              (tabnamev2) AND
              (fname).
      LOOP AT i_cdpos.
        APPEND i_cdpos-objectid TO i_global.
      ENDLOOP.
    *ENDLOOP.
    ENDFORM.                    "find_cdpos_data

    Hey Mart,
    this is what I came up with; unfortunately I get the same performance as with FOR ALL ENTRIES,
    but with a lot more code.
    FORM find_cdpos_data
      TABLES
        i_otf_objcs TYPE STANDARD TABLE
      USING
        i_objcl TYPE j_objnr
        i_obj_lst TYPE any
        i_option TYPE c
      CHANGING
        i_global TYPE STANDARD TABLE.
      " Hint: cdpos is a cluster-table
      CONSTANTS : objectid TYPE string VALUE 'objectid = i_obj_lst-objectid',
                  changenr TYPE string VALUE 'changenr = i_obj_lst-changenr',
                  tabname TYPE string VALUE 'tabname = i_otf_objcs-tablename',
                  tabnameo1 TYPE string VALUE 'tabname NE ''''',
                  tabnameo2 TYPE string VALUE 'tabname NE ''DRAD''',
                  fname TYPE string VALUE 'fname = i_otf_objcs-fieldname'.
    *  DATA : BEGIN OF i_object_list OCCURS 0,
    *            objectclas LIKE cdpos-objectclas,
    *            objectid LIKE cdpos-objectid,
    *            changenr LIKE cdpos-changenr,
    *         END OF i_object_list.
    ** complete modified code [begin]
      DATA : BEGIN OF i_object_list OCCURS 0,
                objectclas LIKE cdpos-objectclas,
                objectid LIKE cdpos-objectid,
                changenr LIKE cdpos-changenr,
                tabname LIKE cdpos-tabname,
                fname LIKE cdpos-fname,
             END OF i_object_list.
    ** complete modified code [end]
      DATA : i_cdpos LIKE TABLE OF i_object_list WITH HEADER LINE.
      DATA : tabnamev2 TYPE string.
    ** complete modified code [begin]
    FIELD-SYMBOLS : <otf> TYPE ANY,
                    <otf_field_tabname>,
                    <otf_field_fname>.
    ** complete modified code [end]
      IF i_option EQ 'X'.
        MOVE tabnameo2 TO tabnamev2.
      ELSE.
        MOVE tabnameo1 TO tabnamev2.
      ENDIF.
    **  SELECT objectclas objectid changenr
    **    INTO TABLE i_cdpos
    *  SELECT objectid
    *      APPENDING CORRESPONDING FIELDS OF TABLE i_global
    *      FROM cdpos
    *      FOR ALL ENTRIES IN i_otf_objcs
    *      WHERE objectclas = i_objcl AND
    *            (objectid) AND
    *            (changenr) AND
    *            (tabname) AND
    *            (tabnamev2) AND
    *            (fname).
    ** complete modified code [begin]
      SELECT objectid tabname fname
          INTO CORRESPONDING FIELDS OF TABLE i_cdpos
          FROM cdpos
          WHERE objectclas = i_objcl AND
                (objectid) AND
                (changenr) AND
                (tabnamev2).
      " Work area for the generic table parameter (ASSIGN LOCAL COPY OF
      " is obsolete but working syntax).
      ASSIGN LOCAL COPY OF i_otf_objcs TO <otf>.
      LOOP AT i_cdpos.
        LOOP AT i_otf_objcs INTO <otf>.
          ASSIGN COMPONENT 'TABLENAME' OF STRUCTURE <otf> TO <otf_field_tabname>.
          ASSIGN COMPONENT 'FIELDNAME' OF STRUCTURE <otf> TO <otf_field_fname>.
          IF ( <otf_field_tabname> EQ i_cdpos-tabname ) AND ( <otf_field_fname> EQ i_cdpos-fname ).
            APPEND i_cdpos-objectid TO i_global.
            " EXIT leaves only the inner loop; the original RETURN here
            " would have ended the whole FORM after the first match.
            EXIT.
          ENDIF.
        ENDLOOP.
      ENDLOOP.
    ** complete modified code [end]
    **  LOOP AT i_cdpos.
    **    APPEND i_cdpos-objectid TO i_global.
    **  ENDLOOP.
    ENDFORM.                    "find_cdpos_data
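    A hedged side note, since the thread title asks about hashed tables: the inner LOOP over i_otf_objcs above can be replaced by a single hashed-table read. The names ty_otf, lt_otf_hash, and ls_otf below are invented for illustration, not part of the original code:

    TYPES: BEGIN OF ty_otf,
             tablename TYPE cdpos-tabname,
             fieldname TYPE cdpos-fname,
           END OF ty_otf.
    DATA: lt_otf_hash TYPE HASHED TABLE OF ty_otf
            WITH UNIQUE KEY tablename fieldname,
          ls_otf TYPE ty_otf.

    * Build the lookup table once, before LOOP AT i_cdpos
    * (assumes the tablename/fieldname pairs in i_otf_objcs are unique):
    LOOP AT i_otf_objcs INTO <otf>.
      ASSIGN COMPONENT 'TABLENAME' OF STRUCTURE <otf> TO <otf_field_tabname>.
      ASSIGN COMPONENT 'FIELDNAME' OF STRUCTURE <otf> TO <otf_field_fname>.
      ls_otf-tablename = <otf_field_tabname>.
      ls_otf-fieldname = <otf_field_fname>.
      INSERT ls_otf INTO TABLE lt_otf_hash.
    ENDLOOP.

    * Inside LOOP AT i_cdpos, each lookup is then one O(1) read
    * instead of a scan over i_otf_objcs:
    READ TABLE lt_otf_hash INTO ls_otf
         WITH TABLE KEY tablename = i_cdpos-tabname
                        fieldname = i_cdpos-fname.
    IF sy-subrc = 0.
      APPEND i_cdpos-objectid TO i_global.
    ENDIF.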

  • Any difference in loading data from (SD, MM, PP) and from (SCM, CRM)?

    Guys,
    Can someone tell me whether there will be any difference between loading data into BW from modules like SD, MM, and PP and loading data from SCM or CRM?
    Any source of material/docs for loading data from SCM and CRM?
    Thanks in advance

    Hi Punzu,
    Check these:
    http://help.sap.com/bp_biv235/BI_EN/html/bw.htm
    CRM Business Content:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/af/ed833b2ab3ae0ee10000000a11402f/frameset.htm
    Extract data from BW to CRM using the APD (Analysis Process Designer):
    http://help.sap.com/saphelp_nw2004s/helpdata/en/6d/40a2bb63ac744a80eb288830e01f7c/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/49/7e960481916448b20134d471d36a6b/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/a5/9b1b40fcdd8f5ce10000000a155106/frameset.htm
    Also check this:
    Re: CRM to BW loads
    Bye
    Dinesh

  • How to load data from QuickBooks into the PeopleSoft General Ledger module?

    Folks,
    Hello.
    My client's data is currently stored in QuickBooks.
    After installing PeopleSoft FSCM 9.0, I need to load the data from QuickBooks into the PeopleSoft General Ledger module, but I don't know how to use PeopleSoft Integration Broker to integrate with QuickBooks.
    Can any folks tell me how to load data from QuickBooks into the PeopleSoft General Ledger module?

    Hi,
    If the data load is a one-time process, you can use PL/SQL or Data Mover to load the data into the PeopleSoft application.
    If you want to load the data in real time, you need to create inbound and outbound nodes to perform the transaction.
    Thanks
    Soundappan

  • DTP load of data from DSO to cube: can't debug

    Hi, all experts.
    I met a problem yesterday.
    When I load data from the DataSource to the DSO and debug the start routine, it is OK. But when I load data from the DSO to the cube and debug the routine, I get an error like this:
    inconsistent input parameter (parameter: <unknown>, value <unknown>)
    message no. RS_EXCEPTION 101.
    I don't know why. Please help me, thank you very much.

    Hi,
    An alternative way to do this is to put a small infinite loop in your start routine. When you start the load, go to SM50 and debug the work process that is executing the load; from there you can break out of the infinite loop and carry on with the debugging. A sketch follows below. Please let me know if you require further information.
    Thanks,
    Arminder
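    A minimal sketch of that trick, assuming a simple flag (the name l_wait is illustrative); put it at the top of the start routine and remove it after debugging:

    * Temporary debugging aid only - do not transport this.
    DATA: l_wait TYPE c VALUE 'X'.

    WHILE l_wait = 'X'.
      " Attach from SM50: select the running work process and choose
      " the debugging entry from the menu (the exact path varies by release).
      " In the debugger, set l_wait to space and continue (F8).
    ENDWHILE.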

  • How to load data from one Infocube to another request by request using DTP

    Hi All,
    I have a scenario where we are maintaining a backup InfoCube B for InfoCube A. The user loads data from a flat file many times a day in different requests. We need to maintain a backup of InfoCube A on a weekly basis, i.e. the data is to be refreshed with a delta update from cube A to cube B. In some situations the user randomly deletes some of the requests in InfoCube A after use. Will such a deletion be reflected in backup cube B when the data refresh from cube A to cube B is performed? This functionality is similar to reconstruct in BW 3.5. We are running BI 7.0 SP 9.
    Can anyone answer this ASAP?
    Many Thanks,
    Ravi

    You cannot pick and choose requests: a "Get Data by Request" DTP loads request by request on a first-in, first-out basis. You can run some pseudo/fake DTPs if you don't want to load the data from a particular request.
    If the user deletes a request from cube A, it won't be loaded to cube B. But if it has already been loaded to cube B and the user later deletes the request from cube A, you have to delete the request from cube B yourself. In order to monitor request by request, run the DTP with "Get Data by Request" set.

  • How to load data from PSA to cube & DSO at the same time using DTP in BI 7?

    Hi all,
    I am new to BI 7. How do I load the data to a DSO and an InfoCube at the same time using DTPs?
    Please provide me the steps to load, and please specify which update mode I have to use (full or delta) and which one is best.
    Please suggest.
    Thanks & Regards,
    Kiran m.
    Message was edited by:
            kiran manyam

    Below are the basic steps we follow in any BI 2004s system:
    1) Create the DataSource. Here you can set/check the Source System fields.
    2) Create a transformation for that DataSource (no more update rules/transfer rules).
    2.1) While creating the transformation for the DS it will ask you for the data target name, so just assign where you want to update your data.
    DataSource -> Transformation -> Data Target
    Now, if you want to load data into the data target from a Source System DataSource:
    1) Create an InfoPackage for that DataSource. If you are creating an InfoPackage for a new DataSource, it will only allow you to update up to the PSA; you will see all other options as disabled.
    2) Now create a DTP (Data Transfer Process) for that DataSource.
    3) Now schedule the InfoPackage; once the data is loaded to the PSA, you can execute your DTP, which will load the data to the data target.
    If you are loading data from one data target to another, there is no need to use the PSA; you can directly execute the DTP in that case.
    Data Source -> Transformation (IP/DTP) -> Data Target 1 -> DTP -> Data Target 2
    Use the below link for a detailed example:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/fc61e12d-0a01-0010-2883-e2fc63ef729b
    InfoSources are no longer mandatory with BI 7.0; below is a link to scenarios where we use InfoSources:
    http://help.sap.com/saphelp_nw04s/helpdata/en/44/0243dd8ae1603ae10000000a1553f6/content.htm
    Full or delta depends on your requirement...
    Check the below thread to understand it better:
    difference between the various loads
    Hope it helps.
    Message was edited by:
            sriram viswanathan

  • Error while loading data from DataSource to InfoCube/DSO through DTP

    Hi Friends,
    I am trying to load data from a DataSource to an InfoCube/DSO through a DTP and am getting the following error:
    Exception in substep: Extraction completed
    Processing terminated
    Data package 1 / 30.04.2009 22:38:30 / Status 'Processed with Errors'
    When I look at the detail message it says:
    Syntax error in routine "convert_to_applstru", line 25 of generated program
    I checked all my transformations and they look OK. I was able to load data earlier, and I am getting this error all of a sudden. Does someone know what this routine "convert_to_applstru" is?
    Thanks,
    Amit

    Hi Arun,
    Where do I see this generated program in RSRV? I don't see it there. Please guide me.
    Thanks,
    Amit

  • If your database is in Full Recovery mode, can you use the Bulk Insert Task to load data?

    If your database is in Full Recovery mode, can you use the Bulk Insert Task to load data?

    Yes, of course you can, but don't be under the impression that logging will be minimal. Logging will be as per the Full recovery model: everything will be logged. If you are going to use the Bulk Insert Task you can consider switching the recovery model to Bulk-logged, but then you will not have the option to do point-in-time recovery.
    PS: please don't create duplicate threads.
    If you read the first Note section in the link below, it clearly states that logging will be full and that you can use it:
    http://technet.microsoft.com/en-us/library/ms191244(v=sql.105).aspx

  • Error in loading data from PSA to master data tables using 'Process Manually'

    Hello,
    While loading attribute master data to the master data tables, the data loads successfully to the PSA; but when I click 'Process Manually' in the monitor screen to load the attribute master data to the master data tables, it shows an error.
    Can anyone please help me solve this issue?
    Thanks,
    Soujanya

    Hi Soujanya,
    If it is a SAP BI 7 data flow, then in order to load data from the PSA to the InfoObject master data tables, you have to run the corresponding DTP.
    On the other hand, if it is a SAP BW 3.x data flow, the InfoPackage should be configured to update data to the subsequent targets (that is to say, the InfoObject).
    Finally, take into account that if the DataSource provides duplicate records, the DTP or the InfoPackage (depending on the data flow version) should be configured to deal with them. Otherwise, you could receive error messages during the data load.
    I hope this helps you.
    Regards,
    Maximiliano

  • Error while loading data from DSO to Infocube

    Hi all,
    I'm loading data from a flat file to the cube via a DSO. I get an error in the DTP.
    The data loads to the DSO; I'm able to see the contents in the active data table.
    The transformation from the DSO to the cube activates without any errors. It's a one-to-one mapping; no routines used.
    When I execute the DTP (I use the full load option as it is a one-time load) I get the following errors in the DTP, all of which are red:
    Process Request
    Data Package 1: Error during processing
    Extract from DSO XXXXX
    Processing Terminated
    The long text of the error gives the message: Data package 1: Processed with errors RSBK257
    Thanks

    Hi,
    You can check the below forum thread:
    DTP with error RSBK257
    - Jaimin

  • How to load data from PSA in BI 7.0

    Hi,
    Gurus
    I'm trying to load data from the PSA to my InfoObject master data, but I can't get into change mode, so I changed the processing option to 'Data Targets'. Right now it is set to 'PSA only'.
    I will reward points.
    Regards
    Alex

    Hi,
    I am just a beginner in 7.0; I had the same issue: I loaded the master data into the PSA but couldn't load it into the data target, i.e. the InfoObject (correct me if my understanding of your problem is wrong).
    Try these steps:
    1) Create and activate the transformation.
    2) Create and activate the DTP, then in the DTP itself, on the Execute tab, click "Execute". Go to the InfoObject, right-click on the InfoObject you have created and select "maintain info object"; the data will have been loaded into the InfoObject.
    *Please assign points if useful.
