Loading transactional data before go-live

Dear colleagues,
In my current project we have an unexpected and rather strange requirement, which is to load all the postings made in the legacy system during the year before go-live (we are going live in May). We have therefore divided the initial load into two phases:
1) Classical load (G/L account balances, open line items, assets) as of 31.12.2008
2) Load of all postings made in the legacy system during the first half of the year. This is where my doubts arise, especially regarding vendor and customer invoices and their payments. How can we link the invoices to their payments? How can we clear them? And what about partial payments?
If anybody has done a similar load, I would appreciate your help and guidelines.
Kind regards
Marc Puigvert

Hi Marc,
Interesting issue.
Going live in May is only the theoretical go-live date:
the moment you post your first live transaction in SAP, you have gone live.
So, when posting the invoices and payments (AR and AP), you can either post them through the normal process, or
create an LSMW to upload them on a daily or weekly basis.
Because the posting date will carry the actual transaction date, there should not be any issue with clearing.
Good Luck
Cheers
Redoxcube
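
For illustration only, a minimal sketch (Python, with hypothetical field names, not an SAP API) of how legacy invoices could be matched to their payments before loading, so that fully paid invoices can be posted and cleared together while partially paid ones are loaded with their residual open amount:

    # Illustrative sketch only: hypothetical record layouts, not SAP structures.
    legacy_invoices = [
        {"ref": "INV-1001", "amount": 1000.0},
        {"ref": "INV-1002", "amount": 500.0},
    ]
    legacy_payments = [
        {"invoice_ref": "INV-1001", "amount": 1000.0},  # full payment
        {"invoice_ref": "INV-1002", "amount": 200.0},   # partial payment
    ]

    for inv in legacy_invoices:
        paid = sum(p["amount"] for p in legacy_payments
                   if p["invoice_ref"] == inv["ref"])
        open_amount = inv["amount"] - paid
        if open_amount == 0:
            # Invoice and payment share a reference, so after posting both
            # they can be cleared against each other.
            status = "fully paid - post invoice and payment, then clear"
        elif paid == 0:
            status = "unpaid - load as open item"
        else:
            # Partial payment: the invoice stays open for the residual amount.
            status = "partially paid - residual open amount %.2f" % open_amount
        print(inv["ref"], status)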

Similar Messages

  • Loading transactional data before go-live

    Hi,
    It is advisable to leave the legacy data trail in the legacy system itself and upload only the closing balances.
    Regards
    Suresh Addagiri

  • Can I load transactional data from a generic DataSource without having loaded master data before

    Hello,
    I want to load transactional data from a generic DataSource.
    I just want to know whether this would be successful, given that the 0MATERIAL master data I loaded into the PSA could not be loaded into the InfoObject by the DTP (000000XXXXXXX -> material number error; see thread BI - BI general - user arnaud).
    If I can load the transactional data without having loaded the master data into the InfoObject, is it possible that it also loads the master data at the same time?
    Thank you very much for helping
    AR.

    Loading transactions before master data can result in data elements not being populated in the InfoProvider in some cases, hence the "Best Practice" recommendation to load master data first.
    For example, if you have update/transformation rules that read master data to enrich the transactional data, or you map master data attributes during the transaction load, those attributes could be missing values if you don't load master data first.
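
    As a rough illustration (a Python-style sketch, not BW code): when a transformation enriches transactional records from a master data lookup, a missing master record simply yields a blank attribute in the target:

        # Illustrative only: a dict standing in for master data attributes.
        material_attributes = {"MAT_A": {"material_group": "GRP01"}}

        def enrich(transaction_record):
            # Mimics an update/transformation rule reading master data.
            attrs = material_attributes.get(transaction_record["material"], {})
            transaction_record["material_group"] = attrs.get("material_group", "")
            return transaction_record

        print(enrich({"material": "MAT_A", "qty": 10}))  # group filled
        print(enrich({"material": "MAT_B", "qty": 5}))   # group left blank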

  • Why do we load master data first before loading transaction data

    Hi Experts,
    Why do we load master data first, before loading transaction data? Please give the reasons for that. Is it mandatory to load master data first?
    I will allocate points to those who help me in detail. Thanks in advance to anyone who responds to my query.
    Edited by: Nagireddy Pothireddy on Mar 10, 2008 8:17 AM

    Hi Nagireddy,
    I hope this helps....
    The bottom line for building cubes is to view facts against dimensions. By facts I mean the key figures, e.g. sales volume or sales VAT, measured against characteristics such as sales area, cost center, and plant.
    Characteristics are those against which key figures are measured, e.g. cost center, plant, material.
    Dimensions are groupings of related characteristics, so a cube basically has a central fact table with dimensions associated to it in a relational schema. Imagine you want to view the key figure sales volume against the dimension plant. Plant has a distribution channel, purchasing organisation, company code, sales area, region, etc. associated with it; these form the attributes of plant, which also has descriptions (texts) and possibly a hierarchy. That is why we load the master data first and then the transaction data.
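
    A very loose sketch of that star-schema idea (illustrative Python, not the actual BW tables): the fact row carries only dimension keys, and the characteristic and its attributes are resolved through the dimension:

        # Illustrative only: a simplified star schema with one dimension.
        fact_table = [{"dim_plant": 1, "sales_volume": 250.0}]
        plant_dimension = {1: {"plant": "P100"}}          # dimension -> characteristic
        plant_attributes = {"P100": {"company_code": "1000",
                                     "region": "EMEA"}}   # master data attributes

        for row in fact_table:
            plant = plant_dimension[row["dim_plant"]]["plant"]
            attrs = plant_attributes.get(plant, {})       # empty if master data is missing
            print(plant, attrs.get("region", ""), row["sales_volume"])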

  • Error while loading Transactional data from NW BW Infoprovider

    Hi,
    I am trying to load the transactional data using the delivered "Transactional data from NW BW InfoProvider" package and am getting the error message "Error occurs when loading transaction data from other cube".
    Below is the transformation file content
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = TAB
    AMOUNTDECIMALPOINT = .
    SKIP = 0
    SKIPIF =
    VALIDATERECORDS=NO
    CREDITPOSITIVE=YES
    MAXREJECTCOUNT= 9999999999999
    ROUNDAMOUNT=
    *MAPPING
    ACCOUNT = 0ACCOUNT
    BUSINESSTYPE = *NEWCOL(NOBTYPE)
    BWSRC = *NEWCOL(R3)
    COMPANYCODE = 0COMP_CODE
    DATASRC = *NEWCOL(R3)
    FUNCTIONALAREA = *IF(0FUNC_AREA = *STR() then *STR(NOFA);0FUNC_AREA)
    GAME = *NEWCOL(NOGAME)
    INPUTCURRENCY = *IF(0CURRENCY = *STR() then *STR(NOCURR) ; 0CURRENCY)
    PROBABILITY = *NEWCOL(NOPROB)
    PRODUCT = *NEWCOL(NOPROD)
    PROFITCENTER = 0PROFIT_CTR
    PROJECT = *NEWCOL(NOPROJECT)
    TIME = 0FISCPER
    VERSION = *NEWCOL(REV0)
    WEEKS = *NEWCOL(NOWEEK)
    SIGNEDDATA= 0AMOUNT
    *CONVERSION
    Below is the Error Log
    /CPMB/MODIFY completed in 0 seconds
    /CPMB/INFOPROVIDER_CONVERT completed in 0 seconds
    /CPMB/CLEAR completed in 0 seconds
    [Selection]
    InforProvide=ZPCA_C01
    TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\R3_TD_P&L.xls
    CLEARDATA= Yes
    RUNLOGIC= No
    CHECKLCK= Yes
    [Messages]
    Task name CONVERT:
    No 1 Round:
    Error occurs when loading transaction data from other cube
    Application: ACTUALREPORTING Package status: ERROR
    Has anyone had this issue before, or is there something that needs to be looked at in the BW-delivered task?
    Any input will be appreciated.
    Sanjay

    Hello Guru,
    Currently I am getting the below error while loading cost center master data from BW to BPC.
    Task name MASTER DATA SOURCE:
    Record count: 189
    Task name TEXT SOURCE:
    Record count: 189
    Task name CONVERT:
    No 1 Round:
    Info provider  is not available
    Application: ZRB_SALES_CMB Package status: ERROR
    Can anybody tell me if I have missed anything?
    Regards,
    BI NEW
    Edited by: BI  NEW on Feb 23, 2011 12:25 PM

  • What if I load transaction data without loading master data

    Hello experts,
    What are the consequences if I load transaction data without loading master data? Are there any factors other than load performance (because of SID generation, etc.) and inconsistencies?
    What kind of potential inconsistencies will occur?
    The problem here is:
    by the time the transaction load starts, new master data such as employee (x) may have been created in R/3 that does not yet exist in BW, and hence the transaction load fails.
    Thanks and Regards
    Uma Srinivasa rao

    Hi Rao,
    In case you load the master data after loading the transaction data, and there is a master data lookup in the update rules, you can delete and reconstruct the requests in the ODS/cube so that the latest master data is pulled into the data target.
    Make sure you run the attribute/hierarchy change run before doing the delete and reconstruct.
    Bye
    Dinesh

  • Error while loading transaction data

    Dear Guru's
    I am loading transaction data, and the load terminated with a "processing overdue" error.
    While analysing it, I found there are a lot of records in the PSA marked in red.
    Please suggest how to resolve this and load the valid records.
    Regards
    Karan

    Hi,
    "While analysing it, I found there are a lot of records in the PSA marked in red."
    Correct the error records in the PSA. Errors can occur for different reasons (e.g. special characters, unit conversion, etc.), so try to find the cause of each error and correct it.
    Before this, set the QM status (in the target) and the technical status (in the monitor screen) of the error request to red and delete the request from the target. Only then can you change the records in the PSA.
    After correcting, click "Process Manually" in the monitor screen, or go to the PSA (DataSource) manage screen and click "Schedule Immediately". The error request will then be updated again.
    "I am loading transaction data, and the load terminated with a processing overdue error."
    Just wait for some time until it is loaded.
    If it is still not resolved, check the error details in the status tab of the monitor screen.
    Normally this type of message is due to internal table space. Try reducing the packet size and increasing the number of packets. Ask the Basis team to increase the table space and clean up the logs.
    Delete the request from the target and load again.
    With regards,
    Kishore.
    Edited by: Siv Kishore on May 26, 2009 10:47 AM

  • Deleting master data after loading transactional data using flat file

    Dear All,
    I have loaded transaction data into an InfoCube using a flat file. In the DTP I checked the option "load transactional data without master data exists", so the transactional data is loaded even if no master data exists in BW.
    While loading the flat file, I made a mistake with the DIVISION characteristic: the original master data value is '04000', but I loaded the transactional data with the value '4000'. I realized this later after looking at the data in the InfoCube and deleted the request, then reloaded the data with the value '04000'. Up to this point everything is fine.
    But when I look at the master data for DIVISION, I can see a new entry with the value '4000'.
    My question is how to delete the entry '4000' from DIVISION. I tried deleting this entry manually from master data maintenance, but it does not allow me to do so.
    I have also checked whether any transactional data exists for the value '4000'; as I said earlier, I have deleted the transactional data with that value. I even tried to delete the entries from the master data table, but I do not see an option to delete entries there.
    Please suggest me on this.
    Regards,
    Veera

    Hi,
    Go to RSA1, right-click on the InfoObject and select "Delete Master Data". This will delete the unused master data existing in the table.
    If this master data is not used anywhere else, just delete the master data completely with the SID option.
    If even this doesn't work, you can delete the complete table contents in SE14. But this will wipe out the entire table, so be sure you really want to do this.
    Hope this helps
    Akhan.

  • While loading transaction data into a cube, what tables are generated

    Hi,
    While loading transaction data into a cube, what tables are normally generated?

    Hi,
    Normally the data is loaded into the 'F' fact table (/BIC/F****, where **** is the cube name).
    When you compress a request, the data is moved into the 'E' fact table (/BIC/E****).
    Regards,
    Siva.
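
    Roughly what compression does (an illustrative Python sketch, not the actual table layout): the request ID is dropped and rows with identical keys are summed, so the E table ends up with fewer rows:

        from collections import defaultdict

        # Illustrative only: F-table rows still carry their request ID.
        f_table = [
            {"request": "REQ1", "material": "M1", "amount": 10.0},
            {"request": "REQ2", "material": "M1", "amount": 5.0},
        ]

        # Compression: the request ID is dropped and duplicate keys are summed.
        e_table = defaultdict(float)
        for row in f_table:
            e_table[row["material"]] += row["amount"]

        print(dict(e_table))  # {'M1': 15.0} -- one row instead of two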

  • Need more Info about "Load transactional data when master data not loaded"

    Hi,
    Can you please explain this option in the InfoPackage: "Load transactional data when master data is not loaded"?
    Say I load a transactional data record which has the material no. AAAXX.
    In the fact table, the material no. is replaced with the corresponding DIM ID.
    Now assume that there is no entry for this material no.
    AAAXX in the master data table... so there is no DIM ID for it.
    How is it then stored in the fact table?
    Hope I have managed to explain the scenario.
    Thanks in advance,
    Punkuj

    Hello Punkuj K,
    How are you?
    No. If the entry for that material number "AAAXX" is not there in the master data, the system will create SID and DIM IDs for it, and the transaction data will be loaded.
    Use
    Choose this indicator if you want to always update the data, even if no master data for the navigation attributes of the loaded records exists. The master data is generated from the loaded transaction data. The system draws SIDs. You can subsequently load the master data.
    Dependencies
    The texts, attributes, and hierarchies are not recognized by the system. Load the texts / attributes / hierarchies for the corresponding InfoObjects separately.
    This function corresponds to the update. Possible errors when reading the master data into the update rules are not associated with this indicator.
    Recommendation
    We recommend loading the master data into the productive system in advance. Select this indicator especially in test systems.
    Best Regards....
    Sankar Kumar
    +91 98403 47141
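
    A loose illustration of "the system draws SIDs" mentioned above (a Python sketch, not the real SID handling): an unknown key simply gets a new SID during the transaction load, while its texts and attributes stay empty until the master data is loaded afterwards:

        # Illustrative only: the SID table as a plain dict.
        sid_table = {"MAT_A": 1}
        attribute_table = {"MAT_A": {"material_group": "GRP01"}}

        def get_or_create_sid(material):
            # Unknown values are assigned a new SID on the fly during the load.
            if material not in sid_table:
                sid_table[material] = max(sid_table.values()) + 1
            return sid_table[material]

        print(get_or_create_sid("AAAXX"))        # new SID is drawn, e.g. 2
        print(attribute_table.get("AAAXX", {}))  # {} until master data is loaded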

  • Loading transaction data from flat file to SNP order series objects

    Hi,
    I am a BW developer and I need to provide data to my SNP team.
    Can you please let me know more about loading transaction data (sales orders, purchase orders, etc.) from external systems into order-based SNP objects/structures? There is a third-party tool called WebConnect that gets data from external systems and can provide it as flat files or in database tables, in whatever format we require.
    I know we can use BAPIs, but I don't know how. Can you please send any sample ABAP program code that calls a BAPI to read a flat file and write to SNP order series objects?
    Please let me know how to get data from a flat file into SNP order-based objects, and what the options are; I will be very grateful.
    thanks in advance
    Rahul

    Hi,
    Please go through the following links:
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4180456
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4040057
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=3832922
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4067999
    Hope this helps...
    Regards,
    Habeeb
    Assign points if helpful..:)

  • Problem in loading transactional data from 0MKT_DSO1 (ODS) to 0MKTG_C01

    Hi,
    I am trying to load lead transaction data from the ODS into the standard CRM lead management cube. There is a problem while loading transaction data from 0MKT_DSO1 (ODS) to the InfoCube 0MKTG_C01, as the field 0STATECSYS2 (CRM status) is set to 10 in the ODS, meaning an incorrect transaction. This field is not in the InfoCube.
    There is a routine in the cube that deletes data records with 0STATECSYS2 set to 10.
    This field does not come in with the transaction.
    So where can I see the master data in the CRM source system, and why is that field getting set to 10?
    thanks in advance!

    Thanks for the reply..
    I have checked the Fact table which shows
    1. packet Dimension
    2. Time dimension
    3. Unit dimension.
    I have kept 0CALDAY as the time characteristic.
    I have loaded sample data from the ODS to the cube.
    Sample data in ODS.
    Sales order No_____0CALDAY_____AMOUNT
    800001___________12/02/2009____15
    I have loaded this data in Cube with Full Upload.
    Data in Cube.
    Sales order No_____0CALDAY_____AMOUNT
    800001___________12/02/2009____15
    Again i am loading the same data to cube
    Data in cube after loading.
    Sales order No_____0CALDAY_____AMOUNT
    800001___________12/02/2009____15
    800001___________12/02/2009____15
    The data is duplicated and is not cumulating.
    Am I missing anything here?
    Pls help..
    Thanks,
    Siva.
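
    A small sketch of why the repeated full upload doubles the value in the cube while a DSO with overwrite would not (illustrative Python, assuming an additive key figure; not the actual table handling):

        # Illustrative only. Cube: every request appends rows; queries sum them.
        cube_rows = []
        def load_to_cube(records):
            cube_rows.extend(records)

        request = [{"sales_order": "800001", "calday": "12/02/2009", "amount": 15.0}]
        load_to_cube(request)
        load_to_cube(request)  # same full upload loaded a second time
        print(sum(r["amount"] for r in cube_rows))  # 30.0 -- value is doubled

        # DSO with overwrite: the same key (sales order) is overwritten, not added.
        dso = {}
        def load_to_dso(records):
            for r in records:
                dso[r["sales_order"]] = dict(r)

        load_to_dso(request)
        load_to_dso(request)
        print(sum(r["amount"] for r in dso.values()))  # 15.0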

  • Error Loading Transactional Data into Cube(0PCA_C01)

    Hi Guys,
    I am trying to install the following cubes from Business Content: 0PCA_C01 / 0PCA_C02 (Profit Center Analysis). Everything got replicated, and I am now trying to load transaction data. I created an InfoPackage and loaded the data. It has been running for a long time and still says "not yet completed/warning". If I try to see the content of the cube, I get the following errors.
    "Your user master record is not sufficiently maintained for object Authorization Object3.
    System error: RSDRC / FORM AUTHORITY_CHECK USER NOT AUTHORIZED 0PCA_C01 0PCA_C01
    System error: RSDRC / FUNC RSDRC_BASIC_CUBE_DATA_GET ERROR IN RSDRC_BASIC_QUERY_DATA_GET 0PCA_C01 64
    System error: RSDRC / FORM DATA_GET ERROR IN RSDRC_BASIC_CUBE_DATA_GET 0PCA_C01 64"
    Also, if I try to change something in the InfoPackage, it says "Init. select. for field name  currently running in". I guess this is because the job is still running.
    Please let me know if i missed something.
    Raj

    Hi Raj
    This seems to be an authorization issue.
    I guess you are in the BW development system.
    Go to SU01, enter your user ID and display it, then go to the Roles and Profiles tabs.
    Ideally, in development a developer's ID should have access to all development activities, so check with the Basis folks and get access to the relevant profiles. If you can get SAP_ALL, then perfect!
    Prakash
    Assigning points is a way of saying thanks on SDN!

  • How to rectify the errors in master data loads & transactional data loads?

    Hi,
    Can anyone please tell me
    how to rectify errors in master data loads and transactional data loads?
    Thank you,
    Ravi

    Hi,
    Please post specific questions in the forum.
    Please explain the error you are getting.
    -Vikram

  • Incorrect records when loading transaction data

    Hi experts:
    I just loaded transactional data. The data first came from a CSV file into the PSA, then from the PSA into a DSO, and finally from the DSO into a cube.
    The data in the PSA looks correct, but after the data was loaded into the DSO, I found many item numbers that are not in the PSA. I deleted all the old requests in the PSA and only have one request now. I don't know where those old item numbers come from; they get loaded every time. Does anyone know?
    Thanks!

    In that case you must have some update routine that is increasing the number of records in the DSO. Can you please paste here the number of records transferred and added (from the DSO Manage screen) for the data load request you are talking about? Also, let me know what you mean by "I found many item numbers that are not in the PSA".
    Regards
    Pradip
