Data Load for 20M records from PSA

Hi Team,
We need to reload a huge volume of Billing data (around 20 million records, DataSource 2LIS_13_VDITM) from the PSA to the first-level DSO and then on to the higher-level targets.
If we run the entire 20M-record load as one full request from PSA to DSO, will it cause any performance issues?
Would it be a good approach to split the load based on 'Billing Document Number'?
In case we split the load by 'Billing Document Number', will receiving the data in multiple requests create any performance issues from the reporting perspective? Most reports are run by Date and not by 'Billing Document Number'.
Thanks
San

Hi,
A better solution is to put a filter on calendar year or fiscal year.
Check how many years of data you have; based on that, you can run one filtered request per year.
Thanks,
Phani.
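
To make the split concrete, here is a hedged sketch of an ABAP selection routine for the InfoPackage (data selection tab, type 6 = ABAP routine). The field name FISCPER and the year values are assumptions for illustration; the framework supplies the range table L_T_RANGE (structure RSSDLRANGE, with header line) and the return code P_SUBRC.

  " Body of the generated InfoPackage selection routine.
  data: l_idx like sy-tabix.

  " Locate an existing range line for the selection field, if any.
  read table l_t_range with key fieldname = 'FISCPER'.  " field name is an assumption
  l_idx = sy-tabix.

  " Restrict this request to a single fiscal year (example values);
  " schedule one InfoPackage run per year.
  l_t_range-sign   = 'I'.
  l_t_range-option = 'BT'.
  l_t_range-low    = '2010001'.
  l_t_range-high   = '2010012'.

  if l_idx > 0.
    modify l_t_range index l_idx.
  else.
    append l_t_range.
  endif.

  p_subrc = 0.

Splitting by a date or period field also sidesteps the reporting concern above: requests cut by year align with date-based queries, and once the data is activated in the DSO the number of load requests should not matter for reporting anyway.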

Similar Messages

  • Query regarding the data type for fetching records from multiple ODS tables

    Hey guys,
    I have a query regarding the data type for fetching records from multiple ODS tables.
    If I have two tables with the same column name, then in the data type, under the parent row node, I can't add two nodes with the same name.
    Can anyone help with a suggestion?

    Hi Mudit,
    One option would be to go as mentioned by Padamja, prefixing the table name to the column name; another would be to use the AS keyword in your SQL statement.
    AS is used to rename a column when data is being selected from your DB.
    So the query Select ename as empname from emptable will return the data with the column name empname.
    Regards,
    Bhavesh
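
    To make the renaming concrete for the two-table case, here is a hedged ABAP Open SQL sketch (7.40+ syntax); the table and field names are made up for illustration, and the same AS aliasing works in plain SQL on the database side.

      " Two tables share the column ENAME; AS gives each copy a
      " distinct alias, so the two result nodes no longer collide.
      select a~ename as empname_hr,
             b~ename as empname_fi
        from emptable_hr as a
        inner join emptable_fi as b on a~empid = b~empid
        into table @data(lt_employees).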

  • No data exists for this selection in PSA.

    Hi Guys,
    I received an error message while loading the 2LIS_03_BF DataSource.
    Error message when processing in the Business Warehouse
    Diagnosis
    An error occurred in the SAP BW when processing the data. The error is documented in an error message.
    System response
    A caller 01, 02 or equal to or greater than 20 contains an error message.
    Further analysis:
    The error message(s) was (were) sent by:
    PSA Table
    DETAILS Tab: Processing (data packet): Errors occurred
                       : Data Package 6 ( ? Records ) : Errors occurred
                       : Update PSA ( 0 Records posted ) : Errors occurred
                       : No data exists for this selection in PSA
    For Data Package 6 (Records sent: 21572 - Records received: 0)
    All other data packets updated fine, with almost 1,000,000 records in total. Any help on this, guys?
    Many thanks
    Arun

    Hi,
    While the load was in progress, someone might have deleted the data in the PSA, a PSA deletion job might have been running, or a PSA deletion chain that includes this table might have run. Check all of these to find out how the data got deleted.
    To load the data again you can do a reload, and it will probably work fine this time.
    Anup.

  • sqlldr is loading only the first record from an XML document

    Hi,
    I am trying to load an XML doc with multiple records using SQL*Loader.
    I have registered my XSD perfectly.
    This is my control file
    LOAD DATA
    INFILE *
    INTO TABLE Orders APPEND
    XMLType(xmldata)
    FIELDS(
         xmldata LOBFILE (CONSTANT FULDTL_2.xml)
    TERMINATED BY '???')
    BEGINDATA
    FULDTL_2.xml
    -- Here, what do I have to give for TERMINATED BY '???'
    My xml doc
    <Order ID="146120486" Status="CL" Comments="Shipped On 08/05/2008"/>
    <Order ID="143417590" Status="CL" Comments="Handset/Device has been received at NRC" ShipDate=""/>
    sqlldr is loading only the first record from the file.
    How can I make it load all the records from my XML doc?
    Thanks in advance.

    Thanks for both the replies above - essentially the same correct solution.
    Something worth noting, now that I've written and tested both a SAX solution and a DOM solution, is that there is a significant (4x) time penalty using SAX.
    I am considering dividing the vector I am storing/recovering into chunks and saving each chunk separately using DOM to speed things up...
    Any thoughts on this approach?

  • Deleting records from PSA

    Hi,
    Can anyone tell me how to delete the erroneous records from the PSA table? I have about 24 data packets, each with about 23,000 records.
    The errors are in data packets 23 and 24.
    regards,
    nithesh

    Hi,
    Another thing: why did your data packet fail? Did your extraction complete?
    If your data packets failed due to a SID issue, then load the master data, delete the request from the target without making it red, go to RSA1, click on the PSA, find the request, and reconstruct it.
    Otherwise, if your extraction completed, do a manual update: right-click on the data packet and choose manual update. Before that, set the processing to background; in the foreground it will take a long time.
    Hope this helps you.
    Regards,
    Debjani

  • Master Data Loading for Prices and Conditions in CRM - "/SAPCND/GCM"

    Hi,
    Could anyone give me some inputs on master data loading for Prices and Conditions in CRM?
    T. Code is:  /SAPCND/GCM
    I need to load data from a file (extracted from 4.6) for service contracts.
    I tried LSMW: recording does not work for this transaction.
    I am trying to load through IDocs (LSMW), but that too is not really working.
    Do we require some custom development for this, or is some SAP standard functionality available?
    Can anyone provide some valuable inputs on this?
    Would appreciate your responses.

    Hi Tiest,
    Thanks for responding.
    You are right, our client is upgrading from 4.6 to ECC.
    So, as per the client's requirements, we are maintaining all the configuration for Services in CRM.
    The Services data that was in 4.6 is being pulled out into flat files, which need to be loaded into CRM, so the middleware would not be able to do this.
    What I am looking for is some standard upload program.
    LSMW recording does not work.
    With the IDoc CRMXIF_COND_REC_SLIM_SAVE_M, I am able to load a single record, but I am not able to find out how to make this work for multiple entries.
    In the standard approach for loading master data through IDocs, we map the values to the standard fields available in that IDoc.
    But in this particular IDoc there is a common field for which I need to define a field name and a field value.
    Till now, I am only able to define just one field name and one field value.
    I want this to work for multiple entries.
    Hope you get my point.
    Thanks

  • Master Data Load for Substance Creation: Using BAPI in LSMW?

    Hi ,
    Has anyone done a master data load for creating a Substance in SAP RM? If so, please let me know the best possible option. I have been trying to figure out how to load Substances and their IDENTIFIER / MATERIAL ASSIGNMENT from my legacy system, but I am not getting a clue on how to use LSMW (direct input or batch input).
    I am not seeing any BAPI for uploading it through LSMW except BUS1077 (method SAVEREPMUL), and I guess this is used along with an IDoc.
    Appreciate your help and suggestion.
    David

    Thanks John.
    I will try the option you have suggested. I was also told by the business team to look at CG33, but they mentioned some format issue. They probably meant the same thing you have said, but they were not aware of a workaround for it. Your information is interesting.
    The other option given to me was to create a recording for each and every tab (Substance header / Identifier / Material assignment, etc.) separately. Technically that does not feel right to me. Do you think that is a viable alternate approach?
    With Respect & Regards
    David

  • Master data load for 0COSTCENTER (Cost Center) failing

    Hi Experts
    I have a master data load for 0COSTCENTER (Cost Center). The load started failing a couple of days ago at the DTP step.
    (R) Filter Out New Records with the Same Key
          (G) Starting Processing...
          (R) Dump: ABAP/4 processor: DBIF_RSQL_SQL_ERROR
    I am unable to understand the reason for this failure. I tried loading the data one cost center at a time and it still fails, so I doubt it is an internal table storage issue as suggested by the dump.
    Could you help me with this one, please?
    Regards
    Akshay Chonkar

    Thanks All
    I have got the issue resolved.
    The error stack for the DTP had accumulated so many entries that it was unable to process further.
    So I deleted the entries in the error stack using SE14 and then executed my DTP again.
    Everything is fine now; thanks for your help.
    Regards
    Akshay Chonkar

  • Is there a way to notify/alert users about records from PSA

    Hi All,
    During transaction data loads in BW 3.5 I am using the option "No update of transaction data if master data does not exist" in the InfoPackage.
    Any records that don't meet this criterion show up in a different request in the PSA.
    My question: is there a way to notify/alert users about these records from the PSA?
    Through a process chain we can't send any alerts in this case, as the loads will not fail.
    Please help.
    Thank you.
    Sree

    Brian, accepted. We cannot create an InfoSet on the PSA.
    To create a report on the PSA, follow these:
    https://websmp104.sap-ag.de/~sapidb/011000358700008145112002
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/968dab90-0201-0010-c093-9d2a326969f1
    /message/5239445#5239445 [original link is broken]
    Thanks...
    Shambhu
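
    For the notification part, one hedged option (not from the links above) is a small ABAP program scheduled as the last step of the chain that counts the diverted records and mails the users. The PSA table name, request ID and receiver below are placeholders you would have to determine in your own system.

      report z_psa_alert.

      " Placeholders: look up the real PSA table name in RSA1 and the
      " request ID of the separate error request in the monitor.
      data: lv_psa_tab type tabname value '/BIC/B0001234000',
            lv_count   type i,
            lv_count_c type char10,
            ls_doc     type sodocchgi1,
            lt_text    type table of solisti1 with header line,
            lt_recv    type table of somlreci1 with header line.

      " Count the records that landed in the separate PSA request.
      select count( * ) into lv_count from (lv_psa_tab)
        where request = 'REQU_PLACEHOLDER'.

      check lv_count > 0.

      ls_doc-obj_descr = 'Records without master data in PSA'.
      write lv_count to lv_count_c left-justified.
      concatenate lv_count_c 'record(s) diverted to a separate PSA request'
        into lt_text-line separated by space.
      append lt_text.

      lt_recv-receiver = 'USERNAME'.   " placeholder receiver
      lt_recv-rec_type = 'B'.          " 'B' = SAP user, 'U' = internet address
      append lt_recv.

      call function 'SO_NEW_DOCUMENT_SEND_API1'
        exporting
          document_data  = ls_doc
          commit_work    = 'X'
        tables
          object_content = lt_text
          receivers      = lt_recv.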

  • A problem with data exchange for a sales order from CRM to R3.

    Dear Friends:
    I did the data exchange for a sales order from CRM to R3 today; the problem's details are as follows:
    When I save a sales order in CRM (version 5.0), it automatically generates a BDoc whose type is BUS_TRANS_MSG, but the BDoc status always stays "Sent to receivers (not all have confirmed)", and the original order in CRM cannot be changed; it reports "Document is being distributed - changes are not possible". So I checked the order status analysis in detail; it presents two error messages: "Event 'BEFORE_CHANGE', attribute 'FINI': Error code for function module 'CRM_STATUS_BEFORE_COMPLETED_EC'" and "Item is not yet completed in OLTP system". I then checked the order in R/3; it has already been created, without any error messages.
    Could you tell me how to solve this? Thanks for any ideas.

    Hi Benjamin,
    When performing uploads to R/3 from CRM, there is a response from the OLTP system that is sent back to the CRM Middleware to confirm that the data records were received and processed correctly.
    Here is a checklist you can run through to verify that the connections, systems and objects that are needed are all in place:
    On R/3 system:
    - Check R/3 outbound queue (transaction SMQ1) for any entries that are not reaching CRM.
    - Check that all RFC destinations on R/3 are defined correctly and are pointing to CRM
    - Check the CRMCONSUM table in R/3 to ensure CRM is registered as a consumer
    - Check the CRMRFCPAR table in R/3 to ensure that order objects are valid for exchange between R/3 and CRM
    - Check for any short dumps in R/3 (ST22/ST21)
    On CRM:
    - Are there entries stuck in the inbound queue (SMQ2) with R3AU* names?
    - What does the CRM Middleware Trace show (SMWT)?  Sometimes this has more detail than the specific BDoc overview (SMW01)
    - Check for short dumps in CRM (ST22)
    Let us know what else you uncover and we can work from there.
    Brad

  • Need to skip the data load for one of the sub-chains.

    Hi All,
    In our project we have one meta chain with many sub-chains inside it. Today we don't want to run one of the sub-chains. As we are on BW 7.0, skip options are not available. Can you please help with how to stop the data load for that particular sub-chain? Please suggest.
    Thanks.

    Hi Jalina,
    If this is a frequent request, you can create a custom ABAP program and use the simple logic below to skip/run the chain. You will also have to change the link between the sub-chains to "Always" instead of "Successful", so that the next chain proceeds whether or not the dependent chain is successful. If you are comfortable with events, you can also use an event to achieve this (see the sketch after this reply).
    Program logic: create a new table and maintain meaningful values (indicator/description) in it; the program reads the data from the table and, based on those values, finishes successfully or fails.
    Thanks
    Abhishek Shanbhogue
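
    A minimal sketch of such a program; the control table ZBW_CHAIN_CTRL, its fields and the chain ID are assumptions you would replace with your own objects. When the flag is set, the program raises an error message, which turns the ABAP process red; with the subsequent link set to "Always", the rest of the meta chain still runs.

      report z_skip_subchain.

      parameters p_chain type char30 default 'SUBCHAIN_01'. " placeholder

      data lv_skip type char1.

      " Hypothetical control table with key CHAIN_ID and flag SKIP_FLAG,
      " maintained by the operations team before the meta chain starts.
      select single skip_flag from zbw_chain_ctrl into lv_skip
        where chain_id = p_chain.

      if lv_skip = 'X'.
        " Message 398(00) is the generic '& & & &' text; an E message
        " in a background step ends the process with an error.
        message e398(00) with 'Sub-chain' p_chain 'skipped by control'.
      endif.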

  • Master Data Load for New Attribute

    Hi Users,
    We had to implement a separate load flow for a new field coming from R3. This field was to be added to an existing master data object.
    I added a new display attribute to the existing 0GL_ACCOUNT master data object.
    This new attribute, along with some other existing fields, gets its data from another master data object, with an InfoSource in between because two transformations cannot be created for the same source and target.
    When I load the data, I don't see data being populated for this new field. I ran the attribute change run, checked the keys, etc.
    The source object has data, but after executing the DTP no data comes to this attribute. There are no routines or anything.
    Please suggest
    Regards
    Zabi

    Hi,
    The situation is:
    Field x from the source maps to
    1. field y (which was an existing field), and also to
    2. field z, which is the new attribute.
    Field y has to get updated for company code 10, for example, and field z for company code 30.
    Now, if I use the same flow and map field x to both y and z, overwriting happens: if company code 10 does not have a value for x and 20 has one, that is not good.
    So if I use a separate flow with an InfoSource, I map only x to z. After the loads this means that if, for company code 10, no value went to y in the first DTP, and company code 30 has a value for x, then only z is updated and y remains empty.
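
    For the company-code-dependent part, a field routine inside a single transformation is another hedged option; a sketch, assuming the source fields are called COMP_CODE and FIELD_X (the generated routine supplies SOURCE_FIELDS and RESULT). Note the caveat in the comments: in overwrite mode a cleared RESULT still overwrites, which is exactly why the separate InfoSource flow can be the safer choice.

      " Field routine for target field z: take field x only for
      " company code 30, otherwise leave the result initial.
      " Caveat: an initial RESULT still overwrites an existing value
      " of z when the attribute is loaded in overwrite mode.
      if source_fields-comp_code = '0030'.
        result = source_fields-field_x.
      else.
        clear result.
      endif.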

  • Master data loads for Attributes and texts failing

    Hello
    Master data loads for attributes and texts are failing. The errors are:
    1. Lock not set for loading master data attributes
    2. Table /BI0/YAccount does not exist (this error is for the 0Account master data attribute load)
    3. Error 1 in the update
    We faced this error a few days ago and rebooting the server resolved it, but it has recurred after 4 days.
    RS12 and SM12 do not show any locks. Activating the InfoObject has also not resolved the error.
    Any insight is appreciated.
    Thanks
    Inder


  • Data load for VK11

    Hi All,
    Can anyone advise me about the data load for VK11? I mean, which method would be easier?
    I would appreciate it if you could send me the developed custom code.
    thanks.

    Hi Kiran,
    You can also use BAPI BAPI_PRICES_CONDITIONS.
    Please check this link for sample codes.
    Re: Sample code for  BAPI_PRICES_CONDITIONS
    Hope this will help.
    Regards,
    Ferry Lianto
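
    If the BAPI route proves difficult, a batch-input recording on VK11 is a common fallback. A minimal hedged skeleton follows; the module pool, screen number and field names are assumptions taken from a typical SHDB recording of VK11, so record the transaction yourself to get the exact values for your condition type and key combination.

      data: lt_bdc type table of bdcdata with header line.

      " Initial screen of VK11 (program/screen assumed; verify in SHDB).
      clear lt_bdc.
      lt_bdc-program  = 'SAPMV13A'.
      lt_bdc-dynpro   = '0100'.
      lt_bdc-dynbegin = 'X'.
      append lt_bdc.

      " Condition type, e.g. PR00 (field name assumed from a recording).
      clear lt_bdc.
      lt_bdc-fnam = 'RV13A-KSCHL'.
      lt_bdc-fval = 'PR00'.
      append lt_bdc.

      clear lt_bdc.
      lt_bdc-fnam = 'BDC_OKCODE'.
      lt_bdc-fval = '/00'.          " Enter
      append lt_bdc.

      " ...fill the key-combination popup and the condition record
      " screen the same way, then post the transaction...

      call transaction 'VK11' using lt_bdc mode 'N' update 'S'.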

  • How to run the EBS adapter data load for 1 year using DAC

    Hi,
    I am trying to run the EBS adapter data load for 1 year for Procurement and Spend. Please let me know the parameter to set in DAC.
    The last extract date is set as Custom Format (@DAC_SOURCE_PRUNED_REFRESH_TIMESTAMP).
    Thanks

    You need to set $$INITIAL_EXTRACT_DATE to a year ago. The LAST EXTRACT DATE is used for incremental loads, and you do not set it manually.
    If this helps, mark the reply as correct or helpful.
