Selective data load using DTP

Hi,
We have created a data flow from one cube to another cube using transformations. Now we would like to do a selective data load from the source cube to the target cube. The problem is that in the DTP we are not able to enter the selective weeks in the filter area, because filter conditions can only be maintained in change mode, and in the production system we can't switch the DTP to change mode. So we are stuck there. Can anyone tell me how to do a selective data load in this scenario?
Thanks in advance

Hi,
As a first try, create a new DTP and see whether you can get into change mode; it might be editable that way.
The other way round, you can go the way Manisha explained in the previous post:
do the load and then do a selective deletion. You can do the selective deletion using a program.
Cheers,
Srinath.

Similar Messages

  • Dynamic selection data load in DTP

    Hi all,
    I need to load data one calendar week at a time, from 2000 to 2012, through a DTP.
    I can write a routine in the DTP for the calendar week field,
    but can anyone guide me on how to run this DTP multiple times through a process chain?
    thank You.
    Regards
    Bala

    Hi,
    I think he wants to do a full load per calendar week: 01.2000, then 02.2000, 03.2000, ...
    => You would have to create one DTP per week, roughly 53 × 13 ≈ 690 DTPs.
    Another solution:
    - Create a Z-table in which you store the last loaded calendar week.
    - Create a Z-program that triggers an event.
    - Insert a filter routine in your DTP: read the Z-table, add one week, and save the new week back to the Z-table (a sketch follows below).
    - Create a process chain; insert the DTP and the Z-program.
    => This gives you a loop: the process chain restarts itself.
    => Each run of the DTP loads the next week.
    => You need a stop condition for this loop. For example, once you reach 01.2013, force a short dump so the chain stops.
    Sven
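    As a rough illustration of the filter-routine step above, here is a minimal sketch in ABAP. It assumes a hypothetical single-row control table ZCALWEEK_CTRL with a field LAST_WEEK (calendar week in YYYYWW format), and the routine frame that BW generates for a DTP filter routine (range table l_t_range, return code p_subrc); the exact field and frame names may differ in your system.
      DATA: l_last_week(6) TYPE n,
            l_next_week(6) TYPE n,
            l_idx          LIKE sy-tabix.
      " Read the last loaded calendar week from the single-row control table.
      SELECT SINGLE last_week FROM zcalweek_ctrl INTO l_last_week.
      " Determine the next week to load (simplified: no year-end rollover;
      " a real routine must switch to week 01 of the following year).
      l_next_week = l_last_week + 1.
      " Restrict the DTP filter for the calendar week field to exactly this value.
      READ TABLE l_t_range WITH KEY fieldname = 'CALWEEK'.
      l_idx = sy-tabix.
      l_t_range-sign   = 'I'.
      l_t_range-option = 'EQ'.
      l_t_range-low    = l_next_week.
      MODIFY l_t_range INDEX l_idx.
      " Remember the week just selected so that the next run loads the following one.
      UPDATE zcalweek_ctrl SET last_week = l_next_week.
      p_subrc = 0.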

  • How to push a perticular request from PSA to Data target using DTP in BI

    Hello all.
      I have multiple requests in PSA and want to move only particular requests (for example, the first request and the last request) to the data target using DTP in BI 7.0. How can we do this? Is there any option in the DTP (BI 7.0) for selective loading of requests from PSA to the data target?
      Thanks in advance..
    Cheers,
    sami.

    Hi,
      It is possible by using a 'zero fetch' in the DTP if you want only the most recent request to be loaded:
    1. In the PSA, change the status of the recent request to red.
    2. Do a zero fetch, i.e. processing mode 'No data transfer; delta status in source: fetched'.
    With this processing mode you execute a delta without transferring data, so all remaining green requests are marked as already fetched.
    3. Change the status of the particular request back to green.
    4. Run a delta load; only that request will be transferred.
    Regards,
    Priya.D

  • Selective data load and transformations

    Hi,
    Can you please clarify this for me:
    1. Selective data loads and transformations can be done in
        A.     Data package
        B.     Source system
        C.     Routine
        D.     Transformation Library-formulas
        E.     BI7 rule details
        F.     Anywhere else?
    If the above is correct, what is the order, performance-wise?
    2. Can anyone tell me why not all the fields appear in the data package's data selection tab, even though many are included in the DataSource and data target?
    Thanks in advance
    Suneth

    Hi Wijey,
    1. If you are talking about a selective data load, you can write an ABAP routine in the InfoPackage for the field on which you want to select. The other way is to write a start routine in the transformation and delete all the records you do not want (see the sketch below). In the second method you extract all the data but delete the unwanted records, so that only the required data is processed. Performance-wise you need to observe: if the selection logic is complicated and takes a lot of time, the second option is better. Try both and decide for yourself which works better.
    2. Only the fields that are marked as available for selection in the DataSource are available as selections in the data package. That is how the system works.
    Thanks and Regards
    Subray Hegde
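    As a rough illustration of the second option, here is a minimal sketch of the body of a BW 7.x transformation start routine that drops unwanted records before further processing; the field name CALWEEK and the selection values are assumptions for the example.
      " Body of the generated start routine method of the transformation.
      " Assumption: the source structure contains a field CALWEEK (YYYYWW).
      " Everything outside the wanted interval is removed, so only the
      " required records are processed further.
      DELETE SOURCE_PACKAGE WHERE calweek < '201001' OR calweek > '201052'.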

  • Data load using 0HR_PA_OS_1 data source

    Hi,
    I am trying to do a full load using the 0HR_PA_OS_1 (Staffing Assignments) DataSource, but it is extracting 0 records.
    I checked in RSA3 (R/3) and it is still showing 0 records, yet there is a significant amount of data already present in the BW cube (50,000 records).
    Do I need to perform any prerequisites before loading data with 0HR_PA_OS_1?
    Thanks

    Refer to SAP Note 429145.

  • Extraction problem - selection conditions for data load using ABAP program

    Hi All,
           I have a problem loading data for a selected period where the date-range selection is done using an ABAP routine (type 6). Although I can see the selection date range populated correctly in the request header tab of the monitor screen, no records are extracted. But if I delete the ABAP filter and enter the same date range directly as the selection, we are able to extract data. If anybody has faced a similar problem and has a solution, please help me with your suggestions.
    Thanks,
    nithin.

    It seems the date range is not properly set in the routine.
    You can check the value of the selection period generated by the routine on the data selection tab: there is an Execute button there.
    Click it to test the selection values generated by the ABAP routine (see the sketch below for how the filled range should look).
    If the values look correct, then paste the code of the routine you have written, with brief details of the logic you applied.
    Sonal.....
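    For reference, here is a minimal sketch of how the body of such a type-6 selection routine typically fills the range. The field name CALDAY and the concrete dates are assumptions, and the frame (range table l_t_range, return code p_subrc) is the one the system generates, so check the exact names in your own routine.
      DATA: l_idx LIKE sy-tabix.
      " Locate the row that was pre-created for the selection field.
      READ TABLE l_t_range WITH KEY fieldname = 'CALDAY'.
      l_idx = sy-tabix.
      " Fill the complete range row: SIGN and OPTION must be supplied
      " along with LOW and HIGH for the interval to be evaluated.
      l_t_range-sign   = 'I'.
      l_t_range-option = 'BT'.
      l_t_range-low    = '20120101'.
      l_t_range-high   = '20121231'.
      MODIFY l_t_range INDEX l_idx.
      p_subrc = 0.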

  • Selective data load to InfoCube

    Dear All,
    I am facing the following problem :
    I have created staging DSOs for billing items (D1) and order items (D2). I have also created one InfoCube (C1) which requires combined order and billing data, so we have a direct transformation from the billing DSO (D1 --> C1), and in the transformation routines we look up data from the order item DSO (D2).
    All the deltas are running fine, but in today's delta a particular order, say 123, was not retrieved, while the corresponding billing document, say 456, was retrieved through the delta.
    So when the DTP ran for cube C1, it did not load that particular billing document (456) with the corresponding order details (123).
    I thought of loading this particular data by creating a new full DTP to cube C1. Is this approach OK?
    Please help on the same.
    Regards,
    SS

    Hi,
    Yes, you can do a full load. Just make sure the selection condition in your DTP is EXACTLY THE SAME as the selective deletion on C1.
    I'd suggest putting a consolidation DSO (D3) in the position of C1; you can then always delta-update C1 from D3. In my company there are similar cases and we love the consolidation DSO.
    Regards,
    Frank

  • Loads using DTP - manual update possible?

    Hi,
    I have a question regarding loads done using DTP in BW 7.0.
    In 3.x loads using InfoPackages, sometimes a data package gets a runtime error such as DBIF_RSQL_SQL_ERROR (deadlocks). Usually we would change the status to red and do a manual update of the affected data package in the data load monitor.
    However, I cannot find this option for DTPs, and I have to reload the whole request even if only one data package has failed because of the SQL error. Does anyone know how to repair this in a DTP load?
    Any help will be appreciated, thanks in advance!

    Hi CK,
    I have not found a way of doing this so far, but I know where you are coming from: you're thinking of the old 'set manual status', or even going into SM58 and triggering the IDocs. If a DTP fails you might have the option of using the error stack, but only if this has been configured; a failed load will prompt you to do this. Another way of tackling the SQL errors is to check what the underlying SQL problems are; most of our problems have been tablespace issues, etc.
    Daily monitoring by the Basis/BW admin team should sort these out.
    Cheers,
    Pom

  • Data load through DTP giving Error while calling up FM RSDRI_INFOPROV_READ

    Hi All
    We are trying to load data into a cube through a DTP from a DSO. In the transformation, we look up InfoCube data through the SAP standard function module RSDRI_INFOPROV_READ. The problem we are facing is that our loads fail with the errors 'Unknown error in SQL Interface' and a parallel process error.
    In the DTP we have changed the number of parallel processes from 3 (the default) to 1, but the issue with the data loads still exists.
    We had a similar flow developed in 3.5 (the BW 3.5 way) where we used the function module RSDRI_INFOPROV_READ, and there our data loads run fine.
    We suspect a compatibility issue of this FM with BI 7.0 data flows but are not sure. If anybody has any relevant input on this, or has used this FM with a BI 7.0 flow, please let me know.
    Thanks in advance.
    Kind Regards
    Swapnil

    Hello Swapnil.
    Please check SAP Note 979660, which mentions this issue.
    Thanks,
    Walter Oliveira.

  • Data load from DTP

    Dear All,
    I have loaded around 2 million records through a DTP with parallel processing enabled. The load was split into 250 data packages. One of the data packages in the DTP monitor has been in yellow status for a long time; it stopped at the 'Write to fact table' step.
    If I have to reload, the data load will take a long time. Any ideas on how to correct that package alone,
    as we used to do in 3.x (manual update)?
    Please let me know any ideas on this.
    Regards
    PV

    Hi,
    With that switch you can decide whether the status is set automatically or whether you want to check the result and set it manually. There should be no real reason to do so.
    The processing runs in parallel by default. Do you have any special update rules that might be the cause of a deadlock? In general the process is designed exactly for this, so no deadlock should occur.

  • Unable to load CSV data using APEX Data Load using Firefox/Safari on a MAC

    I have APEX installed on a Windows XP machine connected to an 11g database on the same Windows XP machine.
    While on Windows XP, using IE 7, I am able to successfully load a CSV spreadsheet of data using the APEX Data Load utility.
    However, if I switch to my MacBook Pro running OS X Leopard, log in to the same APEX machine using Firefox 2 or 3 or Safari 3, and then try to upload CSV data, it fails on the "Table Properties" step. After you enter the name of the new table and are asked to set the table properties, the table properties just never appear (they do appear in IE 7 on Windows XP), and if you hit the NEXT button you get the error message: "1 error has occurred. At least one column must be specified to include in new table." Of course, you can't specify any of the columns, because there is nothing under SET TABLE PROPERTIES in the interface.
    I also tried to load data with Firefox 2, Firefox 3 (beta), and Safari 3.1, but got the same failed result on all three. If I return to the Windows XP machine and use IE 7.0, Data Load works just fine. I work in an all-Mac environment; it was difficult to get a Windows machine into my workplace, and all my end users will be using Macs. There is no current version of IE for the Mac, so I have to use Firefox or Safari.
    Is there some option in Firefox or Safari that I can turn on so this Data Load feature will work on the Mac?
    Thanks for your help. Any assistance appreciated.
    Tony

    I managed to get this to work by saving the CSV file as Windows CSV (not DOS CSV), which allowed the CSV data to be read by Oracle running on Windows XP. I think the problem had to do with different character sets being used for CSV on the Mac versus CSV on Windows. Maybe if I had created my Windows XP Oracle database with Unicode as the default character set, I would never have experienced this problem.

  • Master Data load via DTP (Updating attribute section taking long time)

    Hi all,
    I am loading to a Z InfoObject; it is a master data load for attributes. Surprisingly, the PSA pulls records very fast (2 minutes), but the DTP which updates the InfoObject takes a lot of time; it runs into hours.
    When I look at the DTP execution monitor, which shows the breakdown of time between extraction, filter, transformation, and update of attributes,
    I can see that the last step, 'update of attributes for InfoObject', is taking most of the time.
    The master data InfoObject also has two InfoObjects compounded to it,
    and they are mapped in the transformation.
    The number of parallel processes for the DTP is set to 3 in our system,
    with job class 'C'.
    Can anyone think of what the reason could be?

    Hi,
    Check transaction ST22 for any short dumps during this master data load; a short dump has probably occurred.
    There is also a chance that you are trying to load invalid data (such as a '!' character as the first character of a field) into the master data.
    Regards,
    Yogesh.

  • Data load uses wrong character set, where to correct? APEX bug/omission?

    Hi,
    I created a set of Data Load pages in my application, so the users can upload a CSV file.
    But unlike the 'Load spreadsheet data' utility (under SQL Workshop\Utilities\Data Workshop), where you can set the 'File Character Set', I didn't see where to set the character set for the Data Load pages in my application.
    Now there is a character set mismatch: "m³/h" and "°C" become "m�/h" and "�C".
    Where can this be set?
    It seems like an APEX bug, or at least an omission; IMHO the Data Load page should ask for the character set, as clients with different character sets could be uploading CSV files.
    Apex 4.1 (testing on the apex.oracle.com website)

    Hello JP,
    Please give us some more details about your database version and its character set, and the character set of the CSV file.
    >> …But unlike the Load spreadsheet data (under SQL Workshop\Utilities\Data Workshop), where you can set the 'File Character Set', I didn't see where to set the Character set for Data Load pages in my application.
    It seems that you are right. I was not able to find any reference to the (expected/default) character set of the uploaded file in the current APEX documentation.
    >> If it's an APEX omission, where could I report that?
    Usually, an entry on this forum is enough as some of the development team members are frequent participants. Just to be sure, I’ll draw the attention of one of them to the thread.
    Regards,
    Arie.
    ♦ Please remember to mark appropriate posts as correct/helpful. For the long run, it will benefit us all.
    ♦ Author of Oracle Application Express 3.2 – The Essentials and More

  • Dump when loading master data (attributes) using DTP

    Hello. I am loading the activity network and getting a dump:
    "MESSAGE_TYPE_X" " "
    "CL_RSDMD_UPDATE_MASTER_DATA===CP" or "CL_RSDMD_UPDATE_MASTER_DATA===CM00R"
    "_CHECK_DATA_PACKET"
    I checked OSS but couldn't find any solution. I am on BI 7.0 (2004s), support package SAPKW70015. If any of you have come across this, please let me know how you solved the issue.
    thank you.

    Hi Gopal,
       Here is the information from the error message:
    "There are duplicates of the data record 1 & with the key '00000000038 &' for characteristic EPROPERTY &."
    Hope this helps in understanding the DTP error. A sketch of one way to handle such duplicates follows below.
    Thanks,
    SB.
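    When the dump is caused by duplicate keys within one data package, a commonly used countermeasure is to remove the duplicates in the transformation's start routine before the master data update. A minimal sketch, assuming placeholder field names EPROPERTY, COMP_CHAR1 and COMP_CHAR2 for the InfoObject key and its compounded characteristics:
      " Keep only one record per master data key in this data package.
      " EPROPERTY, COMP_CHAR1 and COMP_CHAR2 are placeholder field names.
      SORT SOURCE_PACKAGE BY eproperty comp_char1 comp_char2.
      DELETE ADJACENT DUPLICATES FROM SOURCE_PACKAGE
             COMPARING eproperty comp_char1 comp_char2.
    Alternatively, the DTP for attribute and text loads offers a 'Handle Duplicate Record Keys' setting that achieves a similar effect.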

  • Need faster data loading (using SQL*Loader)

    I am trying to load approximately 230 million records (around 60 bytes per record) from a flat file into a single table. I have tried SQL*Loader (conventional path load) and I'm seeing performance degrade as the file is processed. I am avoiding direct-path SQL*Loader because I need to maintain uniqueness using my primary key index during the load, so the degradation of the load performance doesn't shock me. The source data file contains duplicate records and may contain records that duplicate rows already in the table (I am appending during the SQL*Loader run).
    My other option is to unload the entire table to a flat file, concatenate the new file onto it, run it through a unique sort, and then direct-path load it.
    Has anyone had a similar experience? Any quick, clever solutions available?
    thanks,
    jeff

    It would be faster to load the file into an Oracle staging table (call it a temporary table) and then make a final move into the final table.
    This way you could direct-path load into the staging table and then run:
    INSERT /*+ APPEND */ INTO Final_Table
        SELECT DISTINCT *
        FROM   Temp_Table
        ORDER BY ID;
    This does a 'direct load' type move from your temp table to the final table, which automatically merges the duplicate records.
    So
    1) Direct Load from SQL*Loader into temp table.
    2) Place index (non-unique) on temp table column ID.
    3) Direct load INSERT into the final table.
    Step 2 may make this process faster or slower; only testing will tell.
    Good Luck,
    Eric Kamradt
