W/O DSO - Delta doesn't work (Request by Request)

Hi experts,
I'm facing something that is really strange to me. I need to load data from one W/O DSO to another W/O DSO with delta functionality (request by request), but it doesn't work at all. All the requests are being condensed into a single request in the target W/O DSO. Why is this happening?
The whole scenario:
1) The data target is initially empty
2) I have 3 requests in the source W/O DSO with 1000 records each
3) The DTP is set to delta, request by request, with a package size of 50,000
After the data load, I get one single request with 3000 records. I really need 3 requests with 1000 records each.
Any ideas????
We are on NW 7.0 EHP1 - SP3

Hi Fabio:
>So you are telling me that you loaded more than one request into the first DSO, executed the DTP, and had the same number of requests in the second DSO?
Exactly.
>If that's true, which version of NetWeaver and which SP level are you on?
On my BW server, I opened the "System" menu, selected the "Status..." option, and clicked the "Component Information" button to obtain the information below.
Software Component:                      SAP_BW
Release:                                 700
Level:                                   0022
Highest Support Package:                 SAPKW70022
Short Description of Software Component: SAP NetWeaver BI 7.0
Regards,
Francisco Milán.
P.S. Concerning the flags, you're right, the settings must be:
- "Only Get Delta Once" (unchecked)
- "Get All New Data Request By Request" (checked)
- "Retrieve Until No More New Data" (checked)
Edited by: Francisco Milan on Jun 18, 2010 4:40 PM

Similar Messages

  • 'Get All New Data Request by Request' option not working between DSO and Cube

    Hi BI's..
    Could anyone please tell me why the options 'Get one Request only' and 'Get All New Data Request by Request' are not working in a DTP between a Standard DSO and an InfoCube?
    Scenario:
    I loaded the data year by year (FY 2000 to FY 2009) via InfoPackage into a write-optimized DSO (10 requests), then loaded it request by request into a Standard DSO and activated each request. I selected the option 'Get All New Data Request by Request' in the DTPs, and it works fine between the WDSO and the SDSO, but not between the SDSO and the Cube. When I execute the DTP, it loads the data from the SDSO into the Cube as a single request (10 requests become one request).
    Regards,
    Sari.

    Hi,
    What does your DTP setting look like among the options below? It should be Change Log, assuming you are not deleting the change log data.
    Delta Init. Extraction from...
    - Active Table (with archive)
    - Active Table (without archive)
    - Archive (full extraction only)
    - Change Log
    Also, if you want to enable deltas, please do not delete the change log. Deleting it can cause issues with further updates from the DSO.
    Hope that helps.
    Regards
    Mr Kapadia
    *Assigning points is the way to say thanks*

  • How does the delta functionality work while updating an InfoCube from a Planning Area (PA)?

    Hi Gurus
    How does the delta functionality work when you send the forecast from the Planning Area to the InfoCube?
    As I have weekly planning, we have to update the forecast into the Cube every week. If I load into the Cube every week, the values keep getting appended in the InfoCube.
    Shall I create an ODS between the Planning Area and the InfoCube,
    (or)
    is there any other method?
    Please Suggest.

    Hi Preetham,
    Data extraction from a planning area to an InfoCube usually supports full update only. When you load data from the planning area to the cube, the previous data is deleted and you get the fresh data; performance-wise this is the best method.
    If you want to use delta functionality, you can go from the planning area to a DSO and from the DSO to the cube. That supports delta, but it degrades system performance.
    regards
    ravi

  • Will repeat delta work for master data delta loads?

    Hi Guru's,
    Will repeat delta work for master data delta loads?
    As I am new to SAP BI, can anyone give a detailed explanation of it?
    That would be great and helpful.
    Thanks and Regards,
    CM

    Hi CM,
    Yes, you can perform a repeat delta even for master data, as long as the DataSource supplying the data is delta-capable.
    You can do a repeat delta only if your previous request was erroneous. Don't forget to set that request to red in the status, so that the system treats it as if you are requesting the failed request again; if you don't set the request to red, you will miss the previous (erroneous) request.
    When you set the request to red manually, the delta pointer moves back to the previous request in the delta queue, so there is no chance of missing delta records.
    This process is the same for both transaction data and master data.
    Regards,
    Sunil...

  • How to load request by request from DSO to Cube

    Hi All,
    I am loading data from a DSO to a Cube. In the DSO, each request has about 50 lakh (5 million) records. When I execute the DTP into the cube, after some time it gives a timeout error (Basis has set the limit to 25 minutes). I also tried the DTP option 'Get all new data request by request', but with that option I still do not get the data properly and again run into the timeout error.
    If anyone knows any other option please let me know.
    Thanks for your help in advance
    Regards
    Sathiya

    Hi,
    You can restrict the load using a filter on the key fields of the DSO.
    Alternatively, filtering by fiscal period may help; see the sketch below.
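    For illustration only, a filter routine on the fiscal period could look roughly like the following. The routine frame (l_t_range, p_subrc) follows the classic BW selection-routine pattern that the system generates; the exact names in your system may differ, and the field name and period values are placeholders.
    " Hypothetical sketch of a selection/filter routine on the fiscal period.
    " l_t_range and p_subrc are provided by the generated routine frame.
    DATA: l_idx LIKE sy-tabix.

    READ TABLE l_t_range WITH KEY fieldname = 'FISCPER'.
    l_idx = sy-tabix.

    l_t_range-fieldname = 'FISCPER'.
    l_t_range-sign      = 'I'.
    l_t_range-option    = 'BT'.
    l_t_range-low       = '2009001'.    "from fiscal period 001.2009
    l_t_range-high      = '2009012'.    "to fiscal period 012.2009

    IF l_idx <> 0.
      MODIFY l_t_range INDEX l_idx.
    ELSE.
      APPEND l_t_range.
    ENDIF.

    p_subrc = 0.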
    Hope this helps,
    Regards,
    anil

  • How to make the delta mechanism work for a custom field added to the LO Cockpit?

    We appended a new custom field to the extraction structure of the LO Cockpit DataSource 2LIS_02_ITM through RSA6. We then wrote exit code in CMOD to populate the value, and that works fine. But after reading Roberto's weblog:
    /people/sap.user72/blog/2005/02/14/logistic-cockpit--when-you-need-more--first-option-enhance-it , we found that with this kind of enhancement the delta mechanism does not work if only the new field changes in a record, so the change is not reflected in BW.
    To make the delta mechanism work, one option is to append the custom field to MCEKKO, one of the LIS communication structures of the 2LIS_02_ITM extraction structure, and then write the enhancement code against this LIS communication structure. This way the custom field is treated like a standard field: whenever it changes, the delta mechanism and the user exit are triggered. The enhancement for this LIS communication structure is LEINS001.
    We now have two problems with the above enhancement of the LIS communication structure:
    1. Maintaining the extraction structure in LBWE:
    We added a custom field, e.g. ZZZ, to the LIS communication structure MCEKKO (Purchasing Document Header) by creating a new custom append structure and adding the field ZZZ to it. When we activate the new append structure, we get a warning like "Field name DUMMY is reserved (Do not use structure as include in DB table)".
    There is indeed a field called DUMMY in the structure MCEKKO. How do we get rid of the warning and successfully activate the new append structure with the field ZZZ?
    You could say we can simply ignore the warning, but then why does the newly appended custom field not show up in the right-hand pool of Maintain Extract Structure in LBWE? We need the new custom field to appear in the right-hand pool so that we can drag it to the left-hand frame and activate the changed extraction structure, but it does not show up there.
    2. Referring to Sanyam's weblog
    /people/sanyam.kapur/blog/2005/04/30/custom-fields-and-bw-extractors-making-a-mixed-marriage-work-part-ii, there is sample code in that article that shows how to write the LIS communication structure enhancement so that the delta mechanism works. Below is the code we adapted to our needs, but we don't know whether it will work:
    " Enhancement LEINS001: fill the custom field ZZZ in the LIS
    " communication structure MCEKKO so that changes to ZZZ create a delta.
    DATA: i_t_ekko LIKE ekko OCCURS 1 WITH HEADER LINE.  "New document data
    DATA: ebeln   LIKE ekko-ebeln,
          it_ekko TYPE TABLE OF ekko WITH HEADER LINE,   "Old document data
          old_val(50) TYPE c.         "Name of the table for the dynamic ASSIGN
    FIELD-SYMBOLS <fs> TYPE ANY TABLE.

    CASE zeitp.
      WHEN 'MA'.                           "When creating a purchase order
        " Access the new header data via the global table T_EKKO of SAPLEINS
        MOVE '(SAPLEINS)T_EKKO[]' TO old_val.
        ASSIGN (old_val) TO <fs>.
        IF sy-subrc = 0.
          i_t_ekko[] = <fs>.
        ENDIF.
        LOOP AT xmcekko.
          ebeln = xmcekko-ebeln.
          IF xmcekko-supkz = '1'.          "Before image: read old value from DB
            SELECT SINGLE * FROM ekko INTO it_ekko WHERE ebeln = ebeln.
            xmcekko-ZZZ = it_ekko-ZZZ.
          ELSE.                            "After image: take new value from memory
            READ TABLE i_t_ekko WITH KEY ebeln = ebeln.
            IF sy-subrc = 0.
              xmcekko-ZZZ = i_t_ekko-ZZZ.
            ENDIF.
          ENDIF.
          MODIFY xmcekko.
        ENDLOOP.
    ENDCASE.
    Share your ideas and we will give you reward points!

    Hello Kevin
    We have exactly the same problem. Did you solve it? If you did, could you please tell us how? We don't see our field in LBWE either. Our field is a Z field from a Z table.
    I hope you can help us; this is the last issue we need to resolve to finish our project.
    Thanks a lot in advance

  • Error during approval of customized Work Schedule Substitution Request

    Hi Gurus,
    Good day!
    Just wanted to check your inputs on an error encountered during execution of a customized application for the MSS approval of a work schedule substitution request:
      Application error occurred during request processing.
      Details:   com.sap.tc.webdynpro.services.sal.core.DispatcherException: Wrong WebDynpro-URL: no application name specified
    Exception id: [E41F13C50EFA007600000227000011180004AAD57AFD5474]
    In simple terms, what does this error mean and what are the items that we need to check?
    Regards,

    The error means the dispatcher can't find the WebDynpro application.
    1. Make sure the Business package which contains the application is installed properly (most likely not).
    2. Make sure the PCD-Path / Folder which contains the iView has end user permissions.
    3. Also check whether the System ID the iView corresponds with has end user permissions.
    Also, please always post the entire error log; most of the time the errors are nested, and analyzing an issue with only the thrown error (which isn't the root cause) is like pulling teeth :-S
    regards, Lukas

  • How to switch a work process from request type DIA to BTC

    We have a problem in our SAP system. When we try to log in, it throws the ABAP dump START_CALL_SICK, "no application server offers a BTC service". In SM50 only dialog work processes are shown and no background work processes are present. When I checked the corresponding profile parameter for BGD, it looks fine.
    Please help me solve this problem. Can I manually switch a work process from request type DIA to BTC? If so, please let me know the transaction code and the procedure in detail. I am new to Basis.
    Thanks,
    Megha.

    Hi,
    It may be that you or someone else changed the profile parameters to include background processes, but did not restart the system.
    Check the following:
    1. The date and time when the profile file was last changed
    2. The last restart of the system
    3. The date and time when the number of dialog and BTC work processes was last changed
    Maybe this will give you some clue. For reference, a sketch of the relevant profile entries follows below.
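    The number of work processes per type is controlled by instance profile parameters such as the ones below (the values are only illustrative, not recommendations; adjust them to your system's sizing):
    # Illustrative instance profile entries
    rdisp/wp_no_dia = 10    # number of dialog (DIA) work processes
    rdisp/wp_no_btc = 3     # number of background (BTC) work processes
    Changed profile values only take effect after the instance has been restarted; if a restart is not possible, operation modes (RZ04/SM63) are the usual way to shift work processes between DIA and BTC at defined times.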

  • I can't get my Adobe PDF converter to work. It asks me to buy it again.

    I can't get my Adobe PDF converter to work. It asks me to buy it again.

    Hi,
    Please try to log in at https://cloud.acrobat.com with your Adobe ID and password. Then you can select "Export PDF" in the top menu.
    From Adobe Reader X or XI, please do the following:
    1. Launch Reader
    2. Select "Tools" at the top right
    3. Click the blue "Sign in" link, then sign in with your Adobe ID and password.
    4. Select "ExportPDF" tab
    5. Click "Select PDF File" link to select your PDF file to export
    6. Click "Convert" button.
    Thank you.
    Hisami

  • Delta not working for 2LIS_17_I3OPER

    Hi BW Experts,
    Can somebody please help me solve the following problem?
    Software: BI 7.0
    There is a custom-built release flag in the extractor 2LIS_17_I3OPER and it can be changed by users. Whenever the flag is changed (to R or blank), it should generate a delta that is reflected in the BW system.
    But at the moment the delta is generated seemingly at random, and I am not sure why. When I checked 2LIS_17_I3OPER in RSA3, all the PM orders have a blank release flag although the actual PM orders contain R.
    So how can I check whether delta is working?
    Can anybody please help me solve this issue?

    Hi Joseph,
    In LBWE, check the update mode of the DataSource. If it is queued delta or V3, run the job (in job control) before running RSA3.
    I don't know where you are getting the flag from. Your view is nice, but the system will do what it was programmed to do, not what you think. If changing the flag doesn't generate a modification of the document, you won't get a delta. Was this field added through RSA6, or have you enhanced LIS?
    Regards,
    Diego

  • Delta not working for datasource 0CRM_SRV_IBASE_TRAN

    Hi All
    We are facing a problem setting up delta for the DataSource 0CRM_SRV_IBASE_TRAN after transporting it to the QA system.
    In the QA system:
    a. Initialization and full load worked successfully and did fetch records.
    b. But newly created records are not coming through via delta.
    c. Moreover, delta is working in Dev but not in the QA environment. All the transports went from Dev to QA successfully.
    d. The only difference we found between Dev and QA is that the DataSource is not listed in SMQ1 (outbound queue) in QA, whereas it is listed in the Dev system.
    We have already checked and ensured that the following things are in order:
    Source system checks:
    1. Init is successfully in place. Entry present in RSA7.
    2. SMW01 checked.
    3. BWA5 and BWA7 checked.
    4. GNRWB checked.
    5. No entry present in SMQ1 in the QA system, but it is present in the Dev system???
    BW checks:
    6. DataSource replication done. InfoSource activated.
    Would appreciate any help on this.
    Thanks
    Pramod

    Hi Pramod,
    SMQ1 only shows an entry when there is data available for the delta queue, i.e. when someone has made a change or added new data. So checking RSA7 and SMQ1 will give the same result if RSA7 shows more than 0 entries.
    The problem may be the init. Which init did you do: an init with data transfer, an init without data transfer, or an init with a filter? Experience with our CRM system tells me the best way to initialize any CRM data is either a full init with data transfer (without a filter) or an init without data transfer (without a filter). I also wonder about the FULL load you mentioned; when did you do it?
    Kind regards,
    Jürgen

  • How does the delta mechanism work for the DataSource 0CRM_SALES_ACT_1 in CRM?

    Hi all,
    How does the delta mechanism work for the DataSource 0CRM_SALES_ACT_1 in CRM?
    I mean: timestamp, date, etc.? And how can I check that?
    Appreciate your response.
    Thanks
    Pramod

    Hi,
    In your system, go to RSA1 -> Tools -> Assign Source System ID...
    regards
    Siggi
    PS: I strongly recommend using the search functionality; this has already been asked a few times.

  • TS4207 Why doesn't the BBC radio app work since upgrading to iOS 7? It says the requested URL was not found

    Since upgrading to iOS 7, my BBC radio app doesn't work. It says the requested URL was not found on this server. What should I do?

    Hi makc3d,
    Thanks for the response. After all that time I had started losing hope.
    I would not say that the obvious solution is to remove the tag. I think it should not be put there in the first place if there is no code at all. Moreover, this only happens when exporting a MC from the Library; it works just fine when exporting normally (publish). It seems more like a bug than expected behaviour.
    Also, AIR should check for actual code in this tag instead of relying on the tag alone.
    Your solution is a nice workaround, and I will certainly try it out. But as I said, it will not fix the original bug; it only patches it later in the process.

  • DTP is not running request by request?

    Hi ,
    I am loading data from a DSO to a cube with a DTP whose extraction mode is set to 'Get all new data request by request'.
    In the actual scenario, I have collected around 7 requests of data in the DSO. With the cube DTP set to 'Get all new data request by request', when I run the DTP it should ideally load into the target cube request by request, but in my case all requests are collected into one request.
    So what could be the reason for this behaviour?
    cheers
    vas

    Hi,
    The option you mentioned (Get All New Data Request by Request) works in delta mode only. More details:
    This indicator belongs to a data transfer process (DTP) that gets data from a DataSource or InfoPackage in delta mode. To prevent DTP requests from becoming too large to be sent to the target as one DTP request (when the source contains a large amount of data), delta DTPs can use this setting to get the data request by request.
    USE:
    If you set this indicator, a DTP request only gets data from one request in the source.
    When it completes processing, the DTP request checks whether the source contains any further new requests. If the source contains more requests, a new DTP request is automatically generated and processed.
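    To make that behaviour concrete, here is a toy simulation in ABAP (a pure illustration with no BW objects involved; the report name zdemo_request_by_request is only hypothetical). Each pass of the loop stands for one automatically generated DTP request that processes exactly one source request.
    REPORT zdemo_request_by_request.

    DATA: lt_source_requests TYPE STANDARD TABLE OF i,
          lv_source_request  TYPE i,
          lv_dtp_request     TYPE i VALUE 0.

    " Simulate three requests waiting in the source (e.g. a W/O DSO)
    APPEND 1 TO lt_source_requests.
    APPEND 2 TO lt_source_requests.
    APPEND 3 TO lt_source_requests.

    " As long as the source still contains new requests, a new DTP
    " request is generated that reads exactly one source request.
    WHILE lines( lt_source_requests ) > 0.
      lv_dtp_request = lv_dtp_request + 1.
      READ TABLE lt_source_requests INDEX 1 INTO lv_source_request.
      DELETE lt_source_requests INDEX 1.
      WRITE: / 'DTP request', lv_dtp_request,
               'loads source request', lv_source_request.
    ENDWHILE.
    With three requests in the source, this produces three separate passes, which is exactly the request-for-request split the indicator is supposed to give you in the target.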
    Regards,
    Suman

  • "get all new data request by request" after compressing source Cube

    Hi
    I need to transfer data from one InfoCube to another and use delta request by request.
    I tried this when the data in the source InfoCube was not compressed and it worked.
    Afterwards some requests were compressed, and since then the delta request by request transfers all the information to the target InfoCube in only one request.
    Do you know if this is normal behavior?
    Thanks in advance

    Hi
    Compression deletes the requests in your F table and moves the data to the E table. After compression you no longer have the data request by request.
    This is the reason you are getting all the data in a single request.
    'Get data request by request' only works if you do not compress the data in your Cube.
    If you want to know about compression, check the below one
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c035d300-b477-2d10-0c92-f858f7f1b575?QuickLink=index&overridelayout=true
    Regards,
    Venkatesh.
