Transactional Data questions

We have a data issue with our transactional data. We are loading inventory data to a DSO (full load) --> cube (delta load) daily. Here is an example of the issue:
Day 1
ITEM               QTY
00001              4
Day 2
ITEM               QTY
00001              200
Day 3
ITEM               QTY
00001              100
We would like to see 200 overwrite 4 on Day 2, and 100 overwrite 200 on Day 3.
The results I got are:
Day 1               4
Day 2               196
Day 3               96-
Can anyone explain why it does this, and how to overwrite the previous day's QTY instead of calculating on it? Thank you!

Hi Qingbo,
Refer to this link:
http://help.sap.com/saphelp_nw04s/helpdata/en/a0/eddc370be9d977e10000009b38f8cf/frameset.htm
It talks about the overwrite option in the transformation rules, such as this:
Update
Transformation rules define the rules that are used to write data to a DataStore object. They are very similar to the transformation rules for InfoCubes. The main difference is the behavior of data fields in the update. When you update requests into a DataStore object, you have an overwrite option as well as an addition option.
See also Transformation Type.
The delta process that is defined for the DataSource also influences the update. When loading files, the user must select a suitable delta process so that the correct transformation type is used.
Unit fields and currency fields operate just like normal key figures, meaning that they must be explicitly filled using a rule.
and this as well:
http://help.sap.com/saphelp_nw04s/helpdata/en/49/851342bc45dd2ce10000000a1550b0/frameset.htm
Hope this helps.
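
For background on the numbers above: when a key figure is set to Overwrite in the DSO, activating a changed record writes two entries to the change log, a before image (the old value, negated) and an after image (the new value). An InfoCube only supports addition, so the delta request it receives is the net of those images, which is why Day 2 shows 196 (200 - 4) when you look at the request rather than the overall total. A minimal illustrative sketch of the arithmetic (not real BW code), assuming the QTY rule in the DSO is set to Overwrite:

    REPORT zqty_changelog_demo.

    * Illustrative only: simplified DSO change-log arithmetic.
    DATA lv_cube_qty TYPE i VALUE 4.       " cube total after Day 1 (+4)

    * Day 2: overwrite 4 -> 200 produces two change-log images.
    lv_cube_qty = lv_cube_qty - 4 + 200.   " before image -4, after image +200
    WRITE: / 'Day 2 delta = 196, cube total =', lv_cube_qty.   " 200

    * Day 3: overwrite 200 -> 100.
    lv_cube_qty = lv_cube_qty - 200 + 100. " before image -200, after image +100
    WRITE: / 'Day 3 delta = -100, cube total =', lv_cube_qty.  " 100

So with Overwrite in the DSO the cube total stays correct after each load; if you need the per-day stock rather than the per-request deltas, report on the DSO itself or use a non-cumulative key figure design for inventory.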

Similar Messages

  • Transactional data loads PIR, IM stock, open POs documentation

    I have to write documentation for the process of transactional data loads: purchase info records (PIR), IM stock, and open POs.
    The transactional data is live in both SAP and the legacy system, so they have to stay consistent in both systems during all stages of this process.
    How do I maintain that? That is the question.
    Please send me any details regarding this.
    thank you
    sridhar

    Check these three things:
    /n/sapapo/CCR
    /n/sapapo/CQ
    Check what type of stock has active IM, and what type of stock went in after you created the GR.
    If you still have any problem, let us know.

  • Transactional data PIR, IM stock, open POs documentation

    (Same question as in the previous thread.)

    CIN master data:
    J1ID - Material Chapter ID / Customer Excise Details / Vendor Excise Details.
    Data migration:
    To load open POs, use BAPI_PO_CREATE1.
    To load open stock, use BAPI_GOODSMVT_CREATE.
    To load invoices, use BAPI_INCOMINGINVOICE_CREATE.
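
    For illustration, a minimal sketch of an initial stock load with BAPI_GOODSMVT_CREATE (movement type 561, goods movement code 05); all values are placeholders, and error handling is reduced to the essentials:

      REPORT zload_open_stock.

      DATA: ls_header TYPE bapi2017_gm_head_01,
            ls_code   TYPE bapi2017_gm_code,
            lt_items  TYPE STANDARD TABLE OF bapi2017_gm_item_create,
            ls_item   TYPE bapi2017_gm_item_create,
            lt_return TYPE STANDARD TABLE OF bapiret2.

      ls_header-pstng_date = sy-datum.
      ls_header-doc_date   = sy-datum.
      ls_code-gm_code      = '05'.            " other goods receipt (MB1C)

      ls_item-material  = 'MAT001'.           " placeholder values
      ls_item-plant     = '1000'.
      ls_item-stge_loc  = '0001'.
      ls_item-move_type = '561'.              " initial stock entry
      ls_item-entry_qnt = 100.
      ls_item-entry_uom = 'EA'.
      APPEND ls_item TO lt_items.

      CALL FUNCTION 'BAPI_GOODSMVT_CREATE'
        EXPORTING
          goodsmvt_header = ls_header
          goodsmvt_code   = ls_code
        TABLES
          goodsmvt_item   = lt_items
          return          = lt_return.

      " Commit only if the BAPI returned no error messages.
      READ TABLE lt_return TRANSPORTING NO FIELDS WITH KEY type = 'E'.
      IF sy-subrc = 0.
        CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
      ELSE.
        CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
          EXPORTING
            wait = 'X'.
      ENDIF.

    BAPI_PO_CREATE1 and BAPI_INCOMINGINVOICE_CREATE follow the same commit/rollback pattern.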

  • Deleting master data after loading transactional data using flat file

    Dear All,
    I have loaded transaction data into an InfoCube using a flat file. In the DTP I checked the option to load transactional data even if no master data exists, so the transactional data was loaded even though no master data was in BW.
    While loading the flat file, I made a mistake in the DIVISION characteristic: the original master data value is '04000', but I loaded the transactional data with the value '4000'. I realized this later after seeing the data in the InfoCube, deleted the request, and then reloaded the data with the value '04000'. Up to this point everything is fine.
    But when I look at the master data for DIVISION, I can see a new entry with the value '4000'.
    My question is how to delete the entry '4000' from DIVISION. I tried deleting this entry manually from master data maintenance, but it does not allow me to do so.
    I have also checked whether any transactional data exists for the value '4000'; as I said earlier, I deleted the transactional data with that value. I even tried to delete the entries from the master data table, but I do not see an option to delete entries there.
    Please suggest me on this.
    Regards,
    Veera

    Hi,
    Go to RSA1, right-click on the InfoObject and select Delete Master Data. This deletes the unused master data values existing in the table.
    If this master data is not used anywhere else, just delete the master data completely with the SID option.
    If even this does not work, you can delete the entire table contents in SE14. But this will wipe out the whole table, so be sure you want to do this.
    Hope this helps
    Akhan.

  • Error in Transaction Data - Full Load

    Hello All,
        This is the current scenario that I am working on:
    There is a process chain which has two transaction data load (FULL LOAD) processes to the same cube. In the process monitor everything seems okay (the data loads seem fine), but the overall status for both loads failed due to 'Error in source system/extractor', and it says 'error in data selection'.
    Processing is set to data targets only.
    On doing a Manage on the cube, I found 3 old requests that were red and NOT set to QM status red. So I set them to QM status red and deleted them, and the difference I saw was that the subsequent requests became available for reporting.
    Now this data load, which is a full load, takes forever. I don't even know why I do not see an 'initialize delta update' option there; can anyone tell me why I don't see it?
    And, coming to the main question: how do I get the process chain completed? Will I have to repeat the data loads, or what options do I have to get a successfully running process chain, or at least these 2 full loads of transaction data?
    Thank you - points will be assigned for helpful answers
    - DB
    Edited by: Darshana on Jun 6, 2008 12:01 AM
    Edited by: Darshana on Jun 6, 2008 12:05 AM

    One interesting discovery I just made in R/3 was this job log with respect to the above process chain:
    it says that the job was cancelled in R/3 because the material ledger currencies were changed.
    The process chain is for inventory management, and the data load processes that get cancelled correspond to these jobs that get cancelled in the source system:
    1. Material Valuation: period ending inventories
    2. Material Valuation: prices
    The performance assistant says the following, but I am not sure how far I can work on the R/3 side to rectify this:
    Material ledger currencies were changed
    Diagnosis
    The currencies currently set for the material ledger and the currency types set for valuation area 6205 differ from those set at conversion of the data (production startup).
    System Response
    The system does not allow you to post transactions after changing the currency settings to ensure consistency.
    Procedure
    Replace the current settings with those entered at production start-up.
    If you wish to change the currency settings, you must use programs to convert data from the old to the new currencies.
    Inform your system administrator.
    Anyone knowledgable in this area please give your inputs.
    - DB

  • Upload transaction data for transactions like KSU2/KSU5, KB31N

    We are trying to upload transaction data for transactions like KSU2/KSU5 and KB31N using batch data processing within LSMW. We are facing a couple of problems, as follows:
    We have to upload data which has a single header record with 200-300 line item records. How do we define target structures to represent the header and line items? Currently we are able to upload the header with 1-2 line items, depending on the recording; the present recording gives us the option to load 1 line item per document.
    Can Excel be used instead of a text file for uploading data in LSMW?
    During processing of the transaction, the user clicks on a button and checks a value; based on this value the future course of action is determined. How can we do the BDC recording, or customize this (write custom code)?
    If errors occur during the upload of data, will processing stop at that line item, or will it ignore the error record and continue processing the remaining data?
    How can we write custom code to modify the data during the BDC session and not before the session?
    Immediate pointers to this would be very useful. Let me know if more information is required.

    Thanks,
    Mehfuze

    Hi Ali,
    As for your question related to Excel upload via LSMW, you may find the links below useful:
    [Upload of excel file in LSMW]
    [Upload excel by LSMW for FV60 ?]
    [Upload Excel file through LSMW]
    [How upload an excel file using an LSMW.]
    regards,
    koolspy.
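
    To address the Excel question directly: LSMW itself reads text files, but a small wrapper program can pull the spreadsheet into an internal table first, e.g. with ALSM_EXCEL_TO_INTERNAL_TABLE. A minimal sketch; the file name and the column/row limits are placeholders:

      DATA lt_excel TYPE STANDARD TABLE OF alsmex_tabline.

      " Read the spreadsheet cell by cell into LT_EXCEL (row/column/value).
      CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
        EXPORTING
          filename                = 'C:\upload\ksu5.xls'   " placeholder path
          i_begin_col             = 1
          i_begin_row             = 2        " skip the header row
          i_end_col               = 10
          i_end_row               = 65000
        TABLES
          intern                  = lt_excel
        EXCEPTIONS
          inconsistent_parameters = 1
          upload_ole              = 2
          OTHERS                  = 3.
      IF sy-subrc <> 0.
        MESSAGE 'Excel upload failed' TYPE 'E'.
      ENDIF.
      " LT_EXCEL can then be written out as a tab-delimited text file
      " (e.g. with GUI_DOWNLOAD) for LSMW to consume.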

  • What are the settings for transferring master data and transaction data from R/3 to APO?

    Hi all,
    Can you suggest what I need to confirm in APO regarding the settings when transferring master data and transaction data from R/3?
    From,
    Babu

    Hi
    The data gets transferred from R/3 to APO via CIF (Core Interface), which is SAP standard.
    Please find below the link, which will provide you with detailed information about it.
    http://help.sap.com/saphelp_scm41/helpdata/en/9b/954d3baf755b67e10000000a114084/frameset.htm
    Please let us know if it helps. Please also let us know if you have any more specific questions.
    Thanks
    Amol

  • Problem in Flat file transaction data loading to BI7

    Hi Experts,
    I am new to BI 7. I have a problem activating the data source for transaction data and creating a transfer routine to calculate sales revenue, which is not in the flat file; the flat file has price per unit and quantity sold.
    Flat file fields are:
    Cust id   SrepID  Mat ID  Price per unit  Unit measure  Quantity   TR. Date
    cu100    dr01      mat01         50                 CS              5     19991001       
    created info objects are
    IO_CUID
    IO_SRID
    IO_MATID
    IO_PRC
    IO_QTY
    IO_REV
    0CALDAY
    When I created IO_QTY, the unit/currency was given as 0UNIT. In the creation of the DataSource, under the Fields tab, an extra field CS was shown, and I was confused about which object I should map it to. At the same time, when I entered IO_QTY it prompted a dialog box with 0UNIT, so I said OK. Now I can't map CS to the unit, as the system automatically created 0UNIT when I entered IO_QTY. Could you please give me a solution for this, and the procedure to enter a formula for calculating sales revenue at this level instead of at query level? Your solutions will be much appreciated with points.
    Awaiting your solutions.
    Thanks
    Shrinu

    Hi Sunil,
    Thanks for your quick response. I tried to assign the InfoObjects under the Fields tab in the creation of the DataSource. I have not reached the data target yet.
    Could you please answer my full question?
    Will you be able to send some screenshots to my email id [email protected]?
    Thanks
    Shri
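
    On the second part of the question (calculating revenue at load time rather than in the query): in a BI 7 transformation you can either use a formula rule (price * quantity) on IO_REV, or a field routine. A minimal sketch of the routine body; the method name is generated by BW, and the source field names depend on how the DataSource fields were defined, so treat them as placeholders:

      " Field routine for the target IO_REV in the transformation (BI 7).
      " SOURCE_FIELDS and RESULT are supplied by the generated routine frame.
      METHOD compute_io_rev.
        " Revenue = price per unit * quantity sold (placeholder field names).
        RESULT = source_fields-price * source_fields-quantity.
      ENDMETHOD.

    If no conditional logic is needed, the formula rule is the simpler choice, since it requires no ABAP at all.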

  • BPC:NW - Best practices to load Transaction data from ECC to BW

    I have a very basic question about loading GL transaction data into BPC for a variety of purposes. It would be great if you could point me towards best practices/standard ways of building such interfaces.
    1. For Planning
    When we are doing planning for cost center expenses and need to make variance reports against the budgets, what would be the source InfoCube/DSO for loading the data from ECC via BW, if the application is -
    YTD entry mode:
    Periodic entry mode:
    What difference does it make to use the 0FI_GL_12 DataSource versus the 0FIGL_C10 cube or the 0FIGL_O14 or 0FIGL_D40 DSOs?
    Based on the data entry mode of the planning application, what is the best way to make use of the 0BALANCE or debit/credit key figures on the BI side?
    2. For consolidation:
    Since we need to have trading partner, what are the best practices for loading the actual data from ECC?
    What are the typical mappings to be maintained for movement type with flow dimensions?
    I have seen multiple threads with different responses, but I am looking for the best practices and the scenarios you are using to load such transactions from the OLTP system. I will really appreciate it if you can provide some functional insight into such scenarios.
    Thanks in advance.
    -SM

    For planning, please take a look at the SAP Extended Financial Planning rapid-deployment solution: the G/L Financial Planning module. This RDS captures best-practice integration from BPC 10 NW to SAP G/L. The RDS (including content and documentation) is free to licensed customers of SAP BPC, and it leverages the 0FIGL_C10 cube mentioned above.
      https://service.sap.com/public/rds-epm-planning
    For consolidation, please take a look at the SAP Financial Close & Disclosure Management rapid-deployment solution. This RDS captures best-practice integration from BPC 10 NW to SAP G/L. This RDS (including content and documentation) is also free to licensed customers of SAP BPC.
    https://service.sap.com/public/rds-epm-fcdm
    Note:  You will require an SAP ServiceMarketplace ID (S-ID) to download the underlying SAP RDS content and documentation.
    The RDS documentation discusses the how/why of best-practice integration. You can also contact me directly at [email protected] for consultation.
    We are also in the process of rolling out the updated 2015 free training on these two RDS. Please register at this link and you will be sent an invite.
    https://www.surveymonkey.com/s/878J92K
    If the link is inactive at some point after this post, please contact [email protected]

  • ALE Configuration for Transactional Data (Purchase Order)

    Dear Experts,
    I want to configure ALE for Purchase Orders(Transactional Data).
    For that:
    I have maintained the necessary condition records in NACE with the necessary output type configuration, configured the port and partner profile with message types ORDERS and ORDCHG (as outbound parameters), assigned those message types to the defined sender and receiver in the customer distribution model, and configured the port and partner profile on the receiver system as well.
    Now I have 2 queries.
    i) Is this much ALE configuration enough for transactional data communication between the 2 systems, or do I have to do something more?
    ii) For master data we configure change pointers (we can see field assignments against the change document object in BD52 for master data message types). Is it necessary to configure change pointers for transactional data, or is that handled automatically by the system (field assignments cannot be seen against change document objects in BD52 for transactional data message types)?
    Regards
    Arnab

    Hi Kumar,
    What Alexander said is absolutely correct.
    On your second question: no master data distribution configuration (change pointers) is required for transactional data distribution.
    Let us know if you face any problems.
    ~linagnna
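
    On question (ii), to make the point concrete: PO IDocs are driven by message control, so the trail to check is the output records themselves rather than change pointers. An illustrative selection; treat application 'EF' (purchasing documents) and the status value as assumptions to verify in your system:

      " Unprocessed output records for purchase orders (application EF).
      " VSTAT = '0' means not yet processed; once processed (immediately or
      " via report RSNAST00), the ORDERS/ORDCHG IDoc is created.
      DATA lt_nast TYPE STANDARD TABLE OF nast.

      SELECT * FROM nast INTO TABLE lt_nast
        WHERE kappl = 'EF'
          AND vstat = '0'.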

  • Modeling transaction data

    Hello all,
    I have 2 questions that I was hoping to get an answer to:
    Question 1:
    What is the normal way of modeling transaction data with a changing status in a BW system? Are there any links/threads to read?
    I thought that transaction data would go into the DSO, so that any changes to transaction data would be recorded there (very granular), while the aggregation of data would be placed in the cube.
    Question 2:
    For what reason would someone place a navigation attribute in the dimension of a cube?
    TIA
    PS - this is for BI 7.0
    Edited by: Casey Harris on Feb 4, 2008 10:15 PM

    Casey,
    A couple of quick answers that aren't links:
    1) Ideally, BW 7.0 allows you to create an Enterprise Data Warehouse (EDW), where you have granular data loaded to DSOs that then aggregate the data into cubes. That is what we strive for; in practice it doesn't always work out that way. Do some searches on EDW and you should find some info.
    2) Navigation attributes are essentially links to master data attributes. By not putting them directly in a cube, you save a little bit of space in the cube. The most common use we have for them is when users tell us they want to filter on a field that is not directly in a cube but is in the master data attributes; we can then easily make that field a navigation attribute. Otherwise, if you wanted to add the field to the cube, you'd have to reload all the data, which can be quite painful.
    Michael

  • Loading transaction data from SAP ECC 6.0 system failed

    Hi!
    I have SAP ECC 6.0 IDES system with the following clients:
    100 DEV client
    200 BI client of SAP 6.0
    Now I would like to load transaction data from InfoSources 0FI_AR_6 and 0FI_AR_4 from SAP ECC client 100.
    The extraction process froze in status "yellow" and after a while changed to "red" with the following information:
    Errors while sending packages from OLTP to BI
    Diagnosis
    No IDocs could be sent to BI using RFC.
    System Response
    There are IDocs in the source system ALE outbox that did not arrive in the ALE inbox of BI.
    Further analysis:
    Check the TRFC log.
    You can access this log using the wizard or the menu path "Environment -> Transact. RFC -> In source system".
    Error handling:
    If the TRFC is incorrect, check whether the source system is fully connected to BI. In particular, check the authorizations of the background user in the source system.
    The users BWREMOTE and ALEREMOTE seem to be OK and have suitable authorizations.
    Transaction SM58 does not show any entries.
    Transaction BD87 shows that the IDocs were sent (source system) and received on the target system.
    Question:
    How can I solve my problem?  Please give me some technical information (tcode, report).
    Thank you very much!
    regards

    Walk through your Basis setup and check out
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c0751ba5-b7a8-2b10-6d97-e91e85c0fafa
    pts appreciated.
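
    If you want to cross-check the IDoc traffic programmatically in addition to BD87, here is an illustrative selection on the IDoc control table; message type RSRQST for BI data-request IDocs is an assumption to verify in WE02:

      " Recent IDoc control records for the BI load communication.
      DATA: lt_edidc TYPE STANDARD TABLE OF edidc,
            lv_from  TYPE sy-datum.

      lv_from = sy-datum - 2.                 " last two days

      SELECT * FROM edidc INTO TABLE lt_edidc
        WHERE mestyp = 'RSRQST'               " BI data request (assumption)
          AND credat >= lv_from.
      " STATUS 03 = dispatched OK; 51/56 indicate inbound processing errors.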

  • How to rectify the errors in master data loads & transactional data loads?

    Hi,
    Can anyone please tell me
    how to rectify errors in master data loads and transactional data loads?
    Thank you,
    Ravi

    Hi,
    Please post specific questions in the forum.
    Please explain the error you are getting.
    -Vikram

  • Why APO transaction data is not stored in tables

    Hi All,
    Why is APO transaction data not stored in tables?
    Can you explain this to me?
    Babu

    Good question Babu,
    There indeed are tables for master data in the APO database, whereas the transaction data (quantities, amounts, dates, and document numbers) is stored in a denormalized database called liveCache. But the master data too is identified by GUIDs: cryptic, unique numbers given to e.g. a location, product, transportation lane, customer, or document. This is for speed of access: if an external system is to fetch information from APO using a unique session ID, instead of making a sequential read on the APO tables it reads liveCache by calling the GUIDs that characterize the event. If one or more ECC systems or an external execution system is connected to APO and one of the systems collapses and is rebuilt, or if a system is assigned to a new business system group, then the systems can get out of sync, and that is where liveCache syncs up the data with the APO tables again. If you still want to see tables in a human-readable format, there are views available: e.g. the tables for locations (LOCID) and for products (MATLOC) take only GUIDs as input, but the view V_MATLOC will show you the materials and locations by the numbers you know.
    Hope I was able to explain. But yes, the "why" part is still something of a mystery to me, because with transparent tables it would be easier to create local reports in APO, and that is probably what SAP does not want. SAP can best answer what the initial user and commercial considerations were when developing APO.
    Regards,
    Loknath
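
    A small illustrative read of the human-readable view mentioned above; the view name (e.g. /SAPAPO/V_MATLOC) and its field names vary by release, so verify them in SE11 before relying on this:

      " Read product/location data by external numbers instead of GUIDs.
      " View and field names are assumptions based on the post above.
      DATA lt_matloc TYPE STANDARD TABLE OF /sapapo/v_matloc.

      SELECT * FROM /sapapo/v_matloc INTO TABLE lt_matloc
        WHERE matnr = 'FINISHED01'.           " placeholder product number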

  • Transactional Data vs. Master Data

    I need to implement a BW system, and I have the following question:
    Imagine I have Master Data like:
    Material:
    Materialname
    Materialgroup
    Time:
    Month
    Year
    Customer
    Name
    Town
    Region
    On the other hand I have transactional data:
    Sales
    Amount
    I wonder how I get the transaction data and the master data connected. I guess I have my master data in characteristics and the transactional data in a DSO.
    But when combining them into an InfoCube, how does the InfoCube get the information about how sales is connected to material?
    Or do I also fill my DSO with the material number?
    I also wonder what the upload files would look like for master data and for transactional data; I don't have any example here (see the sample files sketched after this thread).

    When the master data is delivered time-dependent,
    I create an upload for master data and deliver a timestamp from and to.
    When uploading transactional data, do I also have to deliver the from and to times in the upload file?
    And do I then at least have historical truth with my master data? Could I load transaction data for a specific time period and have it automatically be correct against the master data for that period, or is there something wrong with that concept?
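
    As asked above, a minimal sketch of what the two upload files could look like (all field and value names hypothetical). The transaction file carries the material and customer keys, which is how the cube links facts to master data via SIDs; the time-dependent master data file carries a DATEFROM/DATETO validity interval, while the transaction file only needs the posting date. Note that in standard BW reporting, time-dependent attributes are evaluated at a single query key date, not automatically per transaction date, so the "historical truth" in the follow-up question is not automatic:

      Master data file (material, time-dependent attributes):
      MATERIAL;MATERIALNAME;MATERIALGROUP;DATEFROM;DATETO
      MAT01;Pump A;PUMPS;20080101;99991231

      Transaction data file:
      MATERIAL;CUSTOMER;CALDAY;QUANTITY;AMOUNT
      MAT01;CUST01;20080115;10;500.00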
