Tables for delta loads

Hi
Can you tell me in which tables I can find the total number of delta loads in the entire system, and also provide a list of tables for the various objects like cubes, ODS objects, InfoSources, etc.?
Regards,
Siva

Hi,
Check this link:
http://sapbwneelam.blogspot.com/2007/08/bw-useful-tables.html
Hope this helps.
Regards,
Debjani

Similar Messages

  • Table on delta loads from data mart

    Hi,
    I am loading data from two DSOs (let's call them A and B) to another DSO (C) in BI 7 with a BW 3.5 delta InfoPackage.
    Now I want to know where I can find information on the timestamp or last request from the last delta load from A and B to C.
    So in fact I would like to know how the system determines, at the next delta load, which requests in A and B have not yet been loaded to C.
    In which table can I find the information for ODS A and B that the system uses to decide which data in the change log has or has not been loaded to ODS C or other targets? (In fact there should be a comparable table in BW, like the ROOS* tables in R/3.)
    Thanks in advance!
    Kind regards,
    Bart

    Hi Guys,
    Thanks for the answers.
    I know how to check everything in the Workbench, but I want to know where the information of the delta is stored technically.
    Just for the sake of completeness:
    Due to some issues, several successive loads from A and B were correctly loaded into DSO C (the 'new' table), but could not be activated. It is not possible to do a repeat or anything else. I am not going into too much detail, but just take this for granted.
    The only way we can 'solve' the problem is to make the system believe that the last 3 loads (activated data) in A and the last two loads in B have not been loaded to C yet. Just deleting the last deltas in C and running a new delta from A and B to C will not work.
    Therefore I want to 'manipulate' the table that is read by a delta load. If I can change the timestamp or request numbers in that table, I can make the system believe that some requests have not been loaded to C yet.
    Dirty, but I think it should work. I am still figuring out which table contains information about the data mart DataSources (8A and 8B) and the last delta load to the targets.
    Hope this is more clear.
    Thanks in advance!
    Kind regards,
    Bart

  • ALE and IDOC Delta loads

    ALE delta is used for all master data delta extracts and references the change pointer tables.
    IDoc delta loads are used for transactional deltas and access the source system queues.
    Are the above statements correct? Thanks

    Nitesh
    As you know, there are two types of transfer methods (PSA and IDoc). Info IDocs are used with both transfer methods, and ALE is only used to transfer Info IDocs.
    The transfer method only determines how the data is transferred.
    In the PSA transfer method, tRFC is used to transfer data directly from the source system to BW.
    Thanks
    Sat

  • Material delta load parameter table it_marc is empty

    Hi,
    we are on SRM 7.0 using the classic scenario. We use the middleware to download the material master from ECC. We use BADI method IF_EX_PRODUCT_CUSTOMER2~MAP_R3_TO_CRM_MATERIAL to set the material status in SRM depending on MARC data.
    When we do the initial download, the parameter table IT_MARC is not empty, and we can use it to set the material status depending on MARC data.
    But when the material in ECC is changed and the delta load runs, the parameter table is empty, even when values in ECC table MARC are changed. In the R3AC1 filter settings the table is active.
    Does anyone have an idea why the parameter table IT_MARC is empty in the BADI method?
    Best Regards,
    Ben

    Hi Ben,
    I suggest you check the documentation in note 428989. It should help with using this BADI.
    Regards,
    Ivy
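
    For illustration, here is a minimal sketch of the kind of coding inside IF_EX_PRODUCT_CUSTOMER2~MAP_R3_TO_CRM_MATERIAL that guards against an empty IT_MARC. Apart from IT_MARC and MARC-MMSTA, all names are assumptions; check the actual method signature in SE18/SE19 before adapting anything.
    METHOD if_ex_product_customer2~map_r3_to_crm_material.
      " Sketch only: skip the mapping when no plant data was passed
      " (as observed for delta loads), otherwise derive the status
      " from MARC-MMSTA. The line type of IT_MARC is assumed here.
      DATA ls_marc TYPE marc.

      IF it_marc IS INITIAL.
        RETURN.  " nothing to map on this (delta) call
      ENDIF.

      LOOP AT it_marc INTO ls_marc.
        IF ls_marc-mmsta IS NOT INITIAL.
          " ... map ls_marc-mmsta to the SRM material status here ...
        ENDIF.
      ENDLOOP.
    ENDMETHOD.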

  • Set up table and transaction data delta load init

    Dear experts,
    I am following a document left by my predecessor on "post-processing activities" for BI production after go-live.
    And for transaction data it says to :
    1. execute the init with data transfer InfoPackage manually
    2. execute the delta update InfoPackage manually
    3. go to ECC: delete the setup data (LBWG)
    4. fill the setup tables (OLIxxBW)
    5. then finally activate the process chains.
    Is this sequence correct?
    Shouldn't I first delete the setup tables, fill them, and only then execute the InfoPackages?
    Regards,
    Alice

    Hi,
    The sequence is correct, and after this you need to schedule the collective run job in SAP, which depends on your delta update mode.
    That is, if you choose direct delta it is not required, but if you use queued delta then you need to run the collective run to move the delta from the extraction queue to the delta queue. This job should run before the delta load.
    It is better to fill the setup tables during downtime, i.e. with no documents being posted during the setup run. If you don't have that option, you can fill the setup tables during operating hours, but you need to follow certain steps to get all the delta data; you may need to use 'Init without data transfer' or early delta initialization in the InfoPackage, and you need an ODS as a data target to achieve this. Please search SDN, you will find more information about this.
    Thanks.

  • Populate setup table impact on delta loading

    Hi  Expert,
    For an ECC standard DataSource supporting delta mode, taking 2LIS_03_BF for example: DSO A is currently loading data from 2LIS_03_BF, but there is a new requirement to load the history data from this DataSource. So I want to populate the setup table of 2LIS_03_BF and then make a full upload to another DSO B. If I do this, is there any impact on the delta loading to DSO A?
    In my opinion, populating the setup table has no impact on the delta loading to DSO A, because delta loading is related to the delta queue and a full upload is related to the setup table.
    Thanks,
    Dragon

    Hi Dragon,
    Are you on the 3.x or 7.0 version?
    This scenario needs to be addressed differently based on the version.
    If it is the 3.x version, then you need to create a full repair request (overwrite mode in DSO A) and load the data into the two targets,
    or
    create a full update InfoPackage, remove the tick from the data target tab for DSO A, and load to DSO B.
    If it is the 7.x version:
    Create a full upload InfoPackage, extract the data into the PSA, and use a new DTP to load from the PSA to DSO B.
    Create another DTP from the PSA to DSO A, do the init without data transfer, and run regular deltas. If you don't do this, the DTP for DSO A will pick up the full update request from the PSA in the delta run.
    Gurus: what are your comments on this approach?
    thanks
    Srikanth

  • Table with Full / Delta Load information?

    Is there a table I can go to where I can see if a cube is a full load vs delta?
    Thanks, I will assign points!
    ~Nathaniel

    Hi,
    Check the table ROOSPRMSC in R/3. It gives you the complete details of your init and delta loads.
    Hope this helps.
    Assign points if useful.
    Regards,
    Venkat
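
    For reference, a minimal sketch of how ROOSPRMSC could be read in the source system (SE16 on the table shows the same thing). The field name OLTPSOURCE is an assumption from memory; verify the table's fields in SE11 first.
    REPORT z_check_delta_init.

    " Sketch only: list the init/delta administration entries that the
    " source system keeps per DataSource and target BW system.
    DATA lt_init TYPE STANDARD TABLE OF roosprmsc.

    SELECT * FROM roosprmsc
      INTO TABLE lt_init
      WHERE oltpsource = '2LIS_13_VDITM'.  " example DataSource to check

    " Each row holds the init request and related delta information for
    " one DataSource / target system combination.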

  • No initial load of Customers, Material and delta load of Sales Orders.

    Hi Experts,
    I am facing a very troublesome issue. I am not able to set up the middleware portion for initial and delta loads. I read a lot of documents and corrected a lot of things; finally, the connectivity between R/3 and CRM is done. The initial load of all objects is successful (as per the Best Practices guide), and the customizing load is successful.
    But now I have these open issues for which I am unable to find any answers (I am really exhausted!):
    - Customer_main load: it was successful, but no BPs from R/3 are available.
    - Material: it failed in SMW01 / SMQ2, the errors are:
    Mat. for Initial Download: Function table not supported
    EAN xxxxxxxxxxxxxxxxxx does not correspond to the GTIN format and cannot be transferred
    EAN yyyyyyyyyyyyyyyyyy does not correspond to the GTIN format and cannot be transferred
    Plant xx is not assigned to a business partner
    - Sales order: it shows a green BDoc, but the error segment says "No upload to R/3" and the order does not flow to R/3.
    We had our system set up 3 years back for data transfer and middleware, but a few things changed and the connectivity stopped. I did all of that again now, but am not yet successful. Any inputs will be greatly appreciated.
    Thanks,
    -Pat

    Hi Ashvin,
    The error messages in SMW01 for the material initial load are:
         Mat. for Initial Download: Function table not supported
         EAN 123456789000562 does not correspond to the GTIN format and cannot be transferred
         EAN 900033056531434 does not correspond to the GTIN format and cannot be transferred
         Plant 21 is not assigned to a business partner
    I have done the DNL_PLANT load successfully. Why then the plant error?
    Some of the messages for BP:
    Messages for business partner 1331:
    No classification is assigned to business partner 1331
    For another,
         Partner 00001872(469206A60E5F61C6E10000009F70045E): the following errors occurred
         City Atlanta does not exist in country US
         Time zone EST_NA does not exist
         You are not allowed to enter a tax jurisdiction code for country US
         Validation error occurred: Module CRM_BUPA_MAIN_VAL, BDoc type BUPA_MAIN.
    Now, the time zone EST is assigned by default in R/3. Where do I change that? I do not want to change time zones as this may have other impacts. Maybe I can change this in CRM, but probably not in R/3. The city check has been deactivated in R/3 and CRM, yet the error still appears.
    Until these two issues are solved, I cannot move on to the sales order loads.
    Any ideas will be greatly appreciated.
    Thanks,
    -Pat

  • Delta loading procedure from Write Optimized DSO to Infocube

    Hi All,
    We are using a write-optimized DSO in our project, to which I am loading data using the standard DSO 0FI_GL_12.
    From the write-optimized DSO we are loading delta records into an InfoCube. Please provide your inputs on the following questions:
    1) I am quite interested to know how the delta records get loaded into the InfoCube when we use a write-optimized DSO, as we don't have any image concept in a write-optimized DSO.
    Ex: If I am using a standard DSO, we have the change log table, and the image concept allows the updated value to reach the cube.
    let us assume
    Active Table
    111            50
    111            70 (overwrite)
    Change Log Table
    111            -50        (X -- Before Image)
    111             70    (' ' -- After Image; the symbol for the after image is a space)
    So if we load this to the target as a delta, the above two records from the change log table are loaded to the cube, and the cube ends up with 70 as the updated value (the cube already held 50, and -50 + 70 = +20 is added, giving 70).
    If am using 'Write Optimized',
    Active Table
    111            50
    111            70 (overwrite)
    When this record is loaded to the cube, since the InfoCube always behaves additively, the total value will be 50 + 70 = 120, which is wrong.
    Correct me: what feature will work here to get the updated value of '70' into the cube from the write-optimized DSO?
    2) As the DataSource is delta capable and has an 'ADDITIVE' delta process, are only the delta records (based on request ID) loaded into the InfoCube with the updated key figure value?
    Thanks for your inputs and much appreciated.
    Regards,
    Madhu

    Hi Madhu,
    As a best practice, we use a WODSO in the initial layer and then a standard DSO, just for mass data load/staging purposes.
    Best practice: DataSource ----> WODSO ----> standard DSO
    In your case: DataSource ----> standard DSO ----> WODSO
    In both cases, if the data load design is not accurate, your cube will have incorrect entries.
    For example: today at 9 am the active table holds 111, 50.
    You load to the cube the same day at 11 am: the cube will then have 111, 50.
    The same day at 1 pm the value changes in the standard DSO: 111, 70 (overwrite in the active table).
    If you load to the cube the same day or the next day, it will have two records, one with value 50 and the other with 70. So to avoid such scenarios we should plan the loads carefully, or change your DTP setting so that the source is the change log table.
    Coming to your case:
    After the load to the standard DSO, load the data to the WODSO by changing the DTP setting 'Delta Init. Extraction from' to 'Change log'.
    Now the data is available in the WODSO from the change log table, and you can load it to the cube in delta mode.

  • Issue with Delta Load in BI 7.0... Need resolution

    Hi
    I am having difficulty with a delta load that uses a generic extractor. The generic extractor is based on a view of two tables. I use the system date to perform the delta load; if the system date increases by a day, the load is expected to pick up the extra records. One of the tables used in the view, for master data, does not have the system date in it.
    The data does not even come up to the PSA; it keeps saying there are no records. Is it because I loaded the data for yesterday and am manually adding today's data?
    I am not sure what the cause of the delta failing is.
    Appreciate any suggestions to take care of the issue.
    Thanks.... SMaa

    Hi
    The generic DataSource supports the following delta types:
    1. Calendar day
    2. Numeric pointer
    3. Time stamp
    Calendar day - the delta is based on a calendar day; we can run the delta only once per day, as late in the day as possible, to minimize missing delta records.
    Numeric pointer - this type of delta is suitable only when we are extracting data from a table that supports only the creation of new records / changes to existing records.
    It supports:
    Additive delta: with this delta type, each record to be loaded returns only the respective changes to key figures, for key figures that can be aggregated. The extracted data is added up in BI (targets: DSO and InfoCube).
    New status for changed records: with this delta type, each record to be loaded returns the new status for all key figures and characteristics. The values in BI are overwritten (targets: DSO and master data).
    Time stamp - using a timestamp we can run the delta multiple times per day, but we need to use the safety lower limit and safety upper limit with a minimum of 5 minutes.
    As you specified, the DataSource is based on a view of two tables, one containing the date and one that does not.
    Kindly check the above points and verify whether the view (its primary key) can be used to determine the delta for your DataSource.
    Also let us know whether any standard SAP tables are used in creating the view and, if so, what the size of the DataSource load is every day.
    Thanks.
    Nazeer
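
    To make the timestamp-based delta above concrete, here is a rough illustration only, not the actual extractor code: the table ZSOURCE_TAB, the field AEDAT_TSTMP and the stored lower limit are placeholders. It shows how a delta with a 5-minute safety upper limit would select records.
    " Illustration: a timestamp delta reads everything changed after the last
    " stored pointer and up to 'now minus the safety upper limit'.
    DATA: lv_low   TYPE timestamp,  " last delta pointer, stored by the system
          lv_high  TYPE timestamp,
          lt_delta TYPE STANDARD TABLE OF zsource_tab.

    GET TIME STAMP FIELD lv_high.
    lv_high = cl_abap_tstmp=>subtractsecs( tstmp = lv_high
                                           secs  = 300 ).  " 5-minute safety upper limit

    SELECT * FROM zsource_tab
      INTO TABLE lt_delta
      WHERE aedat_tstmp > lv_low
        AND aedat_tstmp <= lv_high.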

  • Delta load to ODS taking too much time

    Hi,
    I have one delta load running every day to an ODS. It gets data from R/3 and takes 3-4 hours daily; because of this the data is not updated in the cube on time, and issues arise with the data in the reports.
    Please let me know the possible solutions to check this ODS delta load and improve its loading time.
    thanks
    nilesh

    Please check whether the extraction is taking more time in R/3 or the data loading is taking more time in BW. If the R/3 extraction is taking more time, try creating indexes on the R/3 tables.
    -Vikram

  • Repair request for Delta load? Orders missing in Billing line Items

    Hi
    I am using BW 3.5; Some Orders are missing in BW for Billing line item reports (2LIS_13_VDITM) according to the users.
    The report is based on Cube directly, ODS is not used. I planned to do a repair full request for this month's data so as to get the missing orders.
    Could you please advise me on the correct procedure to do that? Delta loads run every day and today's delta is completed.
    Thanks and BR
    P.B

    While it's usually recommended to execute setups during "quiet time", it's not required. Especially in this case. Since you're going to be restricting the setup to only extract a range of Billing Documents that are representative of the dates of data missing (or purely the identified missing documents), there is no need to wait for "quiet time" because you'll pick up the data either in your Full Repair extract or in the delta extract. Depending on how the target InfoProvider for this data is setup (cumulative v. non-cumulative), you may have to look out for duplicate data.
    My recommendation would be to run your setup for only those documents that have been identified as missing. That way, you're limiting the amount of data that's required to be extracted to the setup table and then into BW, but also will remove any risk of duplicate data in your target InfoProviders.

  • Delta Load on DSO and Infocube

    Hi All,
            I would like to know the procedure for the scenario mentioned below.
    Example Scenario:
    I have created a DSO with 10 characteristics and 3 key figures. I have 10 customers whose transactions happen every day. A full upload into the DSO was done on 7 October 2009. How can I load their changing data from 8 October until today into the DSO? And what would the situation be for the same case with an InfoCube?
    Step by step guidance will be a great help
    Thanks in advance
    Liquid

    Hi,
    The key fields play an important role at the DSO level alone, as they give you the ability to overwrite records. Once that is handled at the DSO level, you can simply carry on with a delta load into the cube from your change log table and you don't have to worry about anything. Just to add: all the characteristics in the cube act as key fields, so you will get a new record for each different combination of values, and only the key figures will sum up for a set of identical characteristics.
    Thanks,
    Arminder

  • Issue with delta load from R/3 to BW

    Hi frnds,
    There is a standard DataSource 2LIS_05_ITEM in R/3. Every day we extract data from R/3 to BW based on this standard DataSource with a delta load. There are two fields ZX and ZY in BW; sometimes the data for these two fields is not extracted to BW with the delta update, but at other times these two fields are extracted properly with the same delta update.
    Why does this happen? Please give us some inputs on this.
    Regards,
    Satya.

    Hello,
    If it is a standard field, then it is being populated in the correct way.
    If it is a custom field, then you need to analyze the records for which it is populated and the ones for which it is not.
    It is quite possible that some customization in CMOD results in this behaviour.
    Also, check the underlying tables to see the correct values.
    Regards
    Ajeet
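
    For illustration, a rough sketch of the kind of CMOD coding to review. The user exit EXIT_SAPLRSAP_001 (include ZXRSAU01) is the usual place for enhancing transaction DataSources; every other name below is a placeholder, not taken from your system. The point is that if the lookup finds nothing for a particular delta record, ZX/ZY stay empty for just that record, which would explain the intermittent behaviour.
    " Placeholder sketch of include ZXRSAU01 (user exit EXIT_SAPLRSAP_001).
    " ZEXT_05_ITEM, ZLOOKUP_TAB and QMNUM are assumed names.
    DATA ls_item TYPE zext_05_item.  " assumed extract structure

    CASE i_datasource.
      WHEN '2LIS_05_ITEM'.
        LOOP AT c_t_data INTO ls_item.
          SELECT SINGLE zx zy
            FROM zlookup_tab
            INTO (ls_item-zx, ls_item-zy)
            WHERE qmnum = ls_item-qmnum.
          IF sy-subrc <> 0.
            " Lookup failed: ZX/ZY remain empty for this delta record.
            CLEAR: ls_item-zx, ls_item-zy.
          ENDIF.
          MODIFY c_t_data FROM ls_item.
        ENDLOOP.
    ENDCASE.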

  • Problem in delta load with Z field

    Hello Experts,
    We have a CRM 5.0 system, and as per the requirement we have added a ZFIELD to the BUT000 table. This field was added to BUT000 via EEWB (Easy Enhancement Workbench). The field is also BW enabled, meaning it is available to the BW extractor for further analysis.
    We update this ZFIELD with a custom program using FM BUPA_CENTRAL_CI_CHANGE.
    When a full load is done, the data flows to BW correctly and the ZFIELD values are reflected correctly. But when we do a delta load it does not work.
    If we use the BP transaction, the delta works perfectly, so an issue on the BW side is ruled out. The FM BUPA_CENTRAL_CI_CHANGE is not triggering the change record for this BP, but I don't see any other FM / BAPI to use so that the change record is also created.
    Could you please provide any pointers or checks to overcome this problem?
    Thanks in advance.

    Hi Ashtankar,
    When you change the attribute (say ZFIELD) and save the transaction in BP, does this update CHDAT (the change date) in BUT000?
    If the change date is being updated, then the delta shouldn't be a problem, provided you are using DataSource 0BPARTNER_ATTR.
    To check whether the delta is being captured or not, you can use transaction RSA7. Also, since you say the full load works while the delta doesn't, try re-initializing the DataSource from BW after DataSource replication, because there may have been changes in the source system that the BW system / delta queue needs to take into account.
    Regards,
    Thejas K
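
    To make the first check above concrete, here is a small sketch that reads the change date on BUT000 for a given partner; run it before and after the custom program, and if the date does not move, the 0BPARTNER_ATTR delta has nothing to pick up. The field names CHDAT/CHTIM are assumptions from memory, so verify them in SE11 first.
    REPORT z_check_bp_change.

    " Sketch only: show the last change date/time recorded on BUT000 for
    " one business partner (run it before and after your custom update).
    PARAMETERS p_partn TYPE but000-partner.

    DATA: lv_chdat TYPE but000-chdat,
          lv_chtim TYPE but000-chtim.

    SELECT SINGLE chdat chtim
      FROM but000
      INTO (lv_chdat, lv_chtim)
      WHERE partner = p_partn.

    IF sy-subrc = 0.
      WRITE: / 'Last change on BUT000:', lv_chdat, lv_chtim.
    ELSE.
      WRITE: / 'Partner not found.'.
    ENDIF.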
