Dynamic selection data load in DTP

Hi all,
I need to load data one calendar week at a time, from 2000 to 2012, through a DTP.
I can write a routine in the DTP for the field Cal Week,
but can anyone guide me on how to run this DTP multiple times through a process chain?
Thank you.
Regards
Bala

Hi,
I think he wants to load calweek 01.2000 as a full load, then 02.2000, 03.2000, ...
=> you would have to create 53 * 12 = 636 DTPs
Other solution:
- Create a Z-table in which you store the last loaded calweek.
- Create a Z-program that triggers an event.
- Insert a filter routine into your DTP: read the Z-table, add one week, and save the new week back to the Z-table.
- Create a process chain; insert the DTP and the Z-program.
=> This gives you a loop: the process chain restarts itself.
=> Each DTP run loads the next week.
=> You need a stop condition for this loop. For example: raise a short dump (or simply end the chain) once you reach 01.2013.
Sven
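The filter routine Sven describes could look roughly like this. This is only a minimal sketch: the control table ZCALWEEK_CTRL and its field LAST_WEEK are hypothetical names, the range work area is assumed to match the generated routine frame, and the 52/53-week year rollover is left out.

```abap
* Sketch of a DTP filter routine on 0CALWEEK (format YYYYWW).
* ZCALWEEK_CTRL with field LAST_WEEK is a hypothetical Z-table.
DATA: l_last_week TYPE /bi0/oicalweek,
      l_next_week TYPE /bi0/oicalweek,
      ls_range    LIKE LINE OF l_t_range.

* Read the last loaded week from the control table.
SELECT SINGLE last_week FROM zcalweek_ctrl INTO l_last_week.

* Derive the next week (simplified: no year-end rollover handling).
l_next_week = l_last_week + 1.

* Restrict the DTP selection to exactly this one week.
CLEAR ls_range.
ls_range-fieldname = 'CALWEEK'.
ls_range-sign      = 'I'.
ls_range-option    = 'EQ'.
ls_range-low       = l_next_week.
APPEND ls_range TO l_t_range.

* Persist the new week for the next iteration of the chain loop.
UPDATE zcalweek_ctrl SET last_week = l_next_week.
```

The Z-program at the end of the chain would then raise the event the chain's start process waits on (e.g. via the standard function module BP_EVENT_RAISE), unless the stop week has been reached.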

Similar Messages

  • Selective data load using DTP

    Hi,
We have created a data flow from one cube to another cube using transformations. Now we would like to do a selective data load from the source cube to the target cube. The problem is that in the DTP we are not able to enter the selected weeks in the filter area, because filter conditions can only be entered in change mode, and in the production system we can't switch the DTP to change mode. So we are stuck there. Can anyone tell me how to do a selective data load in this scenario?
    Thanks in advance

    Hi,
As a try, create a new DTP and try to get into change mode; it might accept it that way.
Otherwise, you can go the way Manisha explained in the previous post:
do the load and then a selective deletion. You can do the selective deletion using a program.
    Cheers,
    Srinath.
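For the selective deletion step Srinath mentions, the standard function module RSDRD_SEL_DELETION is typically used from a program. A hedged sketch — the cube name and week values are illustrative, and the exact structure types and interface should be verified in your release:

```abap
* Sketch: selective deletion of one calendar-week range from a cube.
* 'ZCUBE01' and the week values are illustrative assumptions.
DATA: l_thx_sel TYPE rsdrd_thx_sel,
      ls_sel    TYPE rsdrd_sx_sel,
      ls_range  TYPE rsdrd_s_range,
      l_t_msg   TYPE rs_t_msg.

* Build the selection: 0CALWEEK between 201201 and 201252.
ls_range-sign   = 'I'.
ls_range-option = 'BT'.
ls_range-low    = '201201'.
ls_range-high   = '201252'.
APPEND ls_range TO ls_sel-t_sel.
ls_sel-iobjnm = '0CALWEEK'.
INSERT ls_sel INTO TABLE l_thx_sel.

CALL FUNCTION 'RSDRD_SEL_DELETION'
  EXPORTING
    i_datatarget = 'ZCUBE01'
    i_thx_sel    = l_thx_sel
    i_mode       = 'C'
  CHANGING
    c_t_msg      = l_t_msg.
```

Make sure this selection matches the filter of the subsequent full DTP exactly, otherwise you delete more (or less) than you reload.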

  • Selective data load and transformations

    Hi,
Can you please clarify this for me:
    1.Selective data load and transformations can be done in
        A.     Data package
        B.     Source system
        C.     Routine
        D.     Transformation Library-formulas
        E.     BI7 rule details
        F.     Anywhere else?
If the above is correct, what is the order performance-wise?
2. Can anyone tell me why not all fields appear in the data package data selection tab, even though many are included in the DataSource and data target?
Thanks in advance
    Suneth

    Hi Wijey,
1. If you are talking about a selective data load, you need to write an ABAP program in the InfoPackage for the field on which you want to select. The other way is to write a start routine in the transformation and delete all the records that you do not want. With the second method you extract all the data but delete the unwanted records, so that only the required data is processed further. Performance-wise, you need to observe: if the selection logic is complicated and takes a lot of time, the second option is better. Try both and decide for yourself which works better.
2. Only the fields that are marked as available for selection in the DataSource are available as selections in the data package. That is how the system works.
    Thanks and Regards
    Subray Hegde
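The start-routine alternative Subray describes is usually just a few lines. A sketch — the field name CALWEEK and the week values are illustrative, and the internal table name assumes a BI 7.x transformation start routine:

```abap
* Start routine in the transformation: drop everything outside the
* wanted calendar weeks before further processing.
DELETE SOURCE_PACKAGE WHERE calweek < '201101'
                         OR calweek > '201152'.
```

Note that this still extracts all records from the source; it only reduces what the transformation and update steps have to process.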

  • Selective data load to InfoCube

    Dear All,
I am facing the following problem:
I have created staging DSOs for billing items (D1) and order items (D2). I have also created one InfoCube (C1) which requires combined order and billing data, so we have a direct transformation from the billing DSO (D1 --> C1), and in the transformation routines we look up the order item DSO (D2).
All the deltas are running fine. But in today's delta a particular order, say 123, was not retrieved, while the corresponding billing document, say 456, was retrieved through the delta.
So when the DTP ran for cube C1, that particular billing document (456) and the corresponding order details (123) were not loaded correctly.
I thought of loading this particular data by creating a new full DTP to cube C1. Is this approach OK?
    Please help on the same.
    Regards,
    SS

    Hi,
Yes, you can do a full load. Just make sure the selection condition in your DTP is EXACTLY THE SAME as the selective deletion on C1.
I'd suggest putting a consolidation DSO D3 in the position of C1; then you can always delta-update C1 from D3. In my company there are similar cases and we love the consolidation DSO.
    Regards,
    Frank

  • Data load through DTP giving Error while calling up FM RSDRI_INFOPROV_READ

    Hi All
We are trying to load data into a cube through a DTP from a DSO. In the transformation, we look up InfoCube data through the SAP standard function module RSDRI_INFOPROV_READ. The problem we are facing is that our loads are failing with the error 'Unknown error in SQL interface' and a parallel process error.
In the DTP, we have changed the number of parallel processes from 3 (the default) to 1, but the issue with the data loads persists.
We had a similar flow developed the BW 3.5 way, where we used the function module RSDRI_INFOPROV_READ, and there our data loads run fine.
We suspect a compatibility issue of this FM with BI 7.0 data flows, but are not sure. If anybody has any relevant input on this, or has used this FM in a BI 7.0 flow, please let me know.
    Thanks in advance.
    Kind Regards
    Swapnil

    Hello Swapnil.
Please check note 979660, which mentions this issue.
    Thanks,
    Walter Oliveira.

  • Data load from DTP

    Dear All,
I have loaded around 2 million records through a DTP with parallel processing enabled; the load ran in 250 data packages. One of the data packages in the DTP monitor has been in yellow status for a long time, stopped at the 'Write to fact table' step.
If I have to reload, the data load will take a long time. Any ideas how to correct that package alone,
as we used to do in 3.x (manual update)?
    Please let me know any ideas on this.
    Regards
    PV

    Hi,
With that switch you can decide whether the status is set automatically or whether you want to check the result and set it manually. There should be no real reason to do so.
The processing is parallel by default. Do you have any special update rules that might be the cause of a deadlock? In general the process is designed exactly for this, so no deadlock should occur.

  • Master Data load via DTP (Updating attribute section taking long time)

    Hi all,
I am loading to a Z InfoObject; it is a master data load for attributes. Surprisingly, the PSA pulls records very fast (2 minutes), but the DTP which updates the InfoObject takes a lot of time; it runs into hours.
Observing the DTP execution monitor, which shows the breakdown of time between extraction, filter, transformation, and update of attributes,
I can see that the last step, "update of attributes for InfoObject", is taking most of the time.
The master data InfoObject also has two compounded InfoObjects,
and they are mapped in the transformation.
The number of parallel processes for the DTP is set to 3 in our system,
with job class "C".
Can anyone think of what the reason could be?

    Hi,
Check transaction ST22 for any short dump while loading this master data; there must have been a short dump.
There is also a chance that you are trying to load some invalid data (such as "!" as the first character of a field) into the master data.
    Regards,
    Yogesh.

Clarification on data load through DTP

    Hi All,
We are loading data from a DSO to a cube through a DTP. The DSO contains the complete history, but I need to load only one year of data to the cube. Can I load this to the cube through the filter option in the DTP, or are there other possibilities? Can you guide me on this?
    Thanks
    Sathiya

    Hi,
There are two ways in which you can achieve this:
1. Through a filter in the DTP.
2. Write a start routine that deletes all the data except the current year.
    - Jaimin

  • Master data Load - Strategy

    Hi All,
    I would like to know the better master data strategy.
When using the 3.x master data load strategy I can load directly into the InfoObject without the additional DTP step. Are there any specific advantages to the DTP approach, besides it being 7.x?
I have been using 2004s since 2005, but in most implementations we used the 3.x methodology for master data loads and DTPs for transaction loads. I would like to know whether SAP recommends the new methodology using DTPs for master data loads. If I load my master data the 3.x way I can avoid one extra step, but will that approach be discontinued in the future, so that I have to use a DTP even for this?
Please advise if you know the best way forward, strategically and technically.
    Thanks,
    Alex.

    Alex,
    Please read my answer...
The new data flow designed by SAP uses the DTP, even for master data. Right now you can use the 3.x style, which is maintained for backward compatibility, but down the road it will eventually be dropped. So, looking ahead, technically and strategically the right way to go is the DTP.
    You can go here http://help.sap.com/saphelp_nw70/helpdata/en/e3/e60138fede083de10000009b38f8cf/frameset.htm and check under Data Warehousing, Data Distribution, Data Transfer Process...
    You could also open an OSS note to SAP and ask them directly.
    Thanks,
    Luis

  • Selection in data load from infoprovider

    Hi Guys,
In BPC NW 7.5 we have to load data from an InfoProvider while allowing users to select data (e.g. for the dimension TIME) via a BPC prompt. We found two solutions that solve our problem only partially: in both, users have to modify the selections manually.
SOLUTION 1:
To load the data we want to use the process chain /CPMB/INFOPROVIDER, but we know it is not possible to insert selections in the /CPMB/INFOPROVIDER prompt (as described in the post Re: Package LOAD INFOPROVIDER, Select input ENTITY).
To select the data we can use an intermediate InfoCube as a BW workaround (as described in the post Re: BPC 7.5: Delta Load when loading from BI InfoProvider) to obtain a source with only the selected data. This could be done by a selection in the DTP between the source InfoCube and the intermediate InfoCube. This solution is not dynamic; users have to modify the DTP selection manually.
How can we allow users to insert this selection into the DTP via a BPC prompt?
SOLUTION 2:
To select the data we can use a transformation file with a selection like
SELECTION = <Dimension1_techname>,<Dimension1_value>
This is also not dynamic; users have to modify the file selection manually.
Do you know how to drive these selections from a BPC prompt, to avoid these manual changes?
Do you know of other solutions?
    Thank you for your support.

    Hi D-Mark,
    This is definitely a place where it would be nice to see some additional functionality added to BPC. Variable replacement in the transformation file based on the data manager prompt would probably be the best thing to have in the software.
    In any case, getting back to your question, manually modifying the transformation file selection is the most common practice on BPC projects. The blog linked by Naresh is a fairly elegant way to do this, though it doesn't completely get around the fact that it's easy to forget to do and easy to get confused about what is going on in the transformation file.
    A third option that no one has mentioned is to do a SELECTION statement in the transformation file based on navigational attributes in the source InfoProvider. This approach can make the selection statement dynamic based on the contents of BW InfoObjects. Still not very user-friendly, but if you can put an automatic process in place to update the BW navigational attributes this might meet your need without having to set up an extra BW staging InfoProvider.
    The SELECTION syntax is documented here, though it doesn't mention that you can select on navigational attributes: [http://help.sap.com/saphelp_bpc75_nw/helpdata/en/5d/9a3fba600e4de29e2d165644d67bd1/frameset.htm]
    With navigational attributes (the profit center attribute of cost center, for example) it would be something like:
    SELECTION=0COST_CENTER___0PROFIT_CENTER,PC01
    Ethan

  • Can we use 0INFOPROV as a selection in Load from Data Stream

    Hi,
    We have implemented BW-SEM BPS and BCS (SEM-BW - 602 and BI 7 ) in our company.
    We have two BPS cubes for Cost Center and Revenue Planning and we have Actuals Data staging cube, we use 0SEM_BCS_10 to load actuals.
    We created a MultiProvider on BPS cubes and Staging cube as a Source Data Basis for BCS.
    Issue:
When loading plan data or actuals into the BCS cube (0BCS_C11) using the "load from data stream" method, we have a performance issue. We automated the load in a process chain; sometimes it takes about 20 hours for the plan data load alone for 3 group currencies, followed by the elimination tasks.
What I noticed is that when loading plan data, for example, the system also reads the actuals cube, which is not required, and there is no selection available on the mapping or selection tab where I could restrict the data load to a particular cube.
I tried to add 0INFOPROV to the data basis, but it does not show up as a selection option in the data collection tasks.
Is there a way to restrict the data load into BCS with this load option, so that I can restrict which cube the data is read from?
I know there is a filter BAdI available, but I am not sure how it works.
    Thanks !!
    Naveen Rao Kattela

    Thanks Eugene,
We do have other characteristics like value type (10 = actual, 20 = plan) and version (100 = USD actual, 200 = USD plan), but when I load data into BCS using the "load from data stream" method, the request goes to all the underlying cubes, which in my case are the planning cubes and the actuals cube. I don't want the request to go to the actuals cube when I am running only the plan load; I think it is causing a performance issue.
For this reason I am thinking of using 0INFOPROV, as we do in BEx queries, to filter the InfoProvider so that the data load performance improves.
I was able to bring 0INFOPROV into the data basis by adding it to the characteristics folder used by the data basis.
I can see this InfoObject on the data stream fields tab; I marked it for use in the selection and regenerated the data basis.
I was expecting this field to now be available for selection in the data collection method, but it is not.
So if it is confirmed that there is no way to use 0INFOPROV as a selection, I will suggest to my client a redesign of the data basis itself.
    Thanks,
    Naveen Rao Kattela

  • Data Load PSA to IO (DTP) Dump: EXPORT_TOO_MUCH_DATA

    Hi Gurus,
I'm loading data from the PSA to the InfoObject 0BPARTNER; there are around 5 million entries.
During the load, the control job terminates with the following dump:
    EXPORT_TOO_MUCH_DATA
    1. Data must be distributed into portions of 2GB
    2. 3 possible solutions:
        - Either increase the sequence counter (field SRTF2) to include INT4
        or
        -export less data
        or
        -distribute the data across several IDs
    If the error occures in a non-modified SAP program, you may be able to
    find an interim solution in an SAP Note.
    If you have access to SAP Notes, carry out a search with the following
    keywords:
    "EXPORT_TOO_MUCH_DATA" " "
    "CL_RSBK_DATA_SEGMENT==========CP" or "CL_RSBK_DATA_SEGMENT==========CM00V"
    "GET_DATA_FOR_EXPORT"
This is not the first time I have done such a large load.
The field SRTF2 is already of type INT4.
BI version: 701, SP06.
I found a lot of OSS notes for monitoring jobs, industry solutions, BI change run… nothing, however, about the data loading process.
    Has anyone encountered this problem please?
    Thanks in advance
    Martin

    Hi Martin,
There is a series of notes which may be applicable here.
However, if you have semantic grouping enabled, this may be a data-driven issue:
the system will try to put all records with the same semantic key into one package.
If the key is too generic, many records can end up in one data package.
Please choose additional fields for the semantic grouping, or deselect all fields if the grouping is not necessary at all.
    1409851  - Problems with package size in the case of DTPs with grouping.
    Hope this helps.
    Regards,
    Mani

Extraction problem - selection conditions for data load using ABAP program

    Hi All,
I have a problem loading data over a selected period where the date range selection is done using an ABAP routine (type 6). Although on the request header tab of the monitor screen I can see the selection date range populated correctly, no records are extracted. But if I delete the ABAP filter and enter the same date range directly as the selection, we are able to extract data. If anybody has faced a similar problem and has a solution, please help me with your suggestions.
    Thanks,
    nithin.

It seems the date range is not set properly in the routine.
You can check the value of the selection period generated by the routine on the data selection tab; there is an execute button there.
Click it to test the selection values generated by the ABAP routine.
If the values there look correct, then paste the code of the routine you have written, with brief details of the logic you applied.
    Sonal.....
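For reference, a type-6 InfoPackage selection routine typically modifies the generated l_t_range entry for the field in question. A minimal sketch — the field name BLDAT and the 30-day window are assumptions, and the generated routine frame differs by release:

```abap
* Sketch of an InfoPackage ABAP selection routine (routine type 6).
DATA: l_from TYPE sy-datum,
      l_to   TYPE sy-datum.

l_to   = sy-datum.
l_from = sy-datum - 30.            " rolling 30-day window, illustrative

* Locate and update the range entry generated for the selection field.
READ TABLE l_t_range WITH KEY fieldname = 'BLDAT'.
l_t_range-sign   = 'I'.
l_t_range-option = 'BT'.
l_t_range-low    = l_from.
l_t_range-high   = l_to.
MODIFY l_t_range INDEX sy-tabix.
```

If the monitor shows the right values but no records arrive, also compare the internal date format (YYYYMMDD) the routine writes against what the DataSource expects.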

Hierarchy data loading in SAP BW 7.3 for 0COSTCENTER - DTP error

    Hi ,
I am loading data from ECC with the DataSource 0COSTCENTER_0101_HIER to BW. For this I created transformations between the hierarchy of InfoObject 0COSTCENTER and the DataSource, as well as a DTP. I loaded the data up to the PSA with an InfoPackage with no issues, but while loading data from the PSA to the hierarchy I got an error in the DTP: "Node [00000006, 00000010 ]: Leaf ' ' already exists as child of node 00000005".
Has anyone faced a similar issue in BW 7.3? Please share your thoughts.
    Thanks,
    Sri.

    Hi Srivardhan,
I got the same error too. It probably arises if your hierarchy uses characteristics whose InfoObject data type is NUMC (character string with only digits).
Try using an InfoObject with data type CHAR instead, or simply change the InfoObject on which you are building your hierarchy to the CHAR data type,
and give it a sufficient length.
    Regards.

  • DTP problem on master data loading

    Dear Experts
When we load the 0EMPLOYEE master data daily, sometimes the PSA data is not reflected in the master data table.
For example, with the following data in the PSA:
10001    01/01/2008     01/31/2008    C1
10001    02/01/2008     08/31/2008    C1
10001    09/01/2008     12/31/9999    C2
after the DTP request finishes successfully, the data looks like this:
10001    01/01/2008     01/31/2008    C1
10001    02/01/2008     08/31/2008    C1
10001    09/01/2008     12/31/9999    C1
The C2 record in the PSA is not updated to the master data table. This problem happens from time to time, but it is hard to spot until users complain.
When I manually delete the DTP request from the master data attribute list and reload from the PSA to the master data table, the problem is fixed.
By the way, I have already set the DTP semantic group to the employee ID.
Can anyone advise? Thanks in advance.
    Regards
    Jie

Jie Miao,
The previous master data is already available in BW; in the daily load you only need to load the changes, so the last record is enough.
E.g.:
Initial load on 01-01-2008:
10001 01/01/2008 12/31/9999 C1
--> Data loaded on 2nd Jan, and the above data is available in BW.
On 03-11-2008 you will get the following records:
10001 01/01/2008 11/02/2008 C1
10001 09/01/2008 12/31/9999 C2
Load these records on 03 Nov, and the system updates accordingly.
Hope it helps
    Srini
