Data Load (Info package and DTP) performance

Hi,
We are planning to run an initialization that is expected to bring in 30 million records.
To improve data load performance, I am planning to increase the data package size from the default value to double the default.
Will that improve performance?

Adjusting the InfoPackage package size can be helpful...
But don't just increase the numbers: the right value depends on your extract structure size and the source system's memory settings.
I would suggest doing some trials in your DEV/QA systems with enough data.
I remember tuning this in production itself for the CRM 0BPARTNER extractor - we actually reduced the number of records extracted per package to improve performance, and we did see the impact.
First calculate your current package size from the number of records extracted per package and your extract structure width, and then experiment from there. It will certainly help, unless you have excessive ABAP code in user exits.
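As a rough illustration (assuming the standard control parameters maintained via SBIW, stored in table ROIDOCPRMS; all figures are only examples):

records per package = MAXSIZE (KB) * 1000 / extract structure width (bytes), capped by MAXLINES

So with MAXSIZE = 20,000 KB and a 1,000-byte extract structure, each package carries roughly 20,000 records. Doubling MAXSIZE roughly doubles the records per package - and with it the memory each work process needs on the source system, which is why trials in DEV/QA matter.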
Let us know if you need further help on this

Similar Messages

  • Routines for File name at External Data in Info Package level.

    Hi All,
    Can anyone give example code for writing the routine for files under External Data at InfoPackage level?
    Regards
    srinivas

    Hi Srinivas,
    I am attaching sample code at the InfoPackage level. This code selects the current version from the TVARV table; based on the version in that variable, the data is loaded into the ODS.
    tables: tvarv.
    data: l_idx        like sy-tabix,
          v_prever(6)  type c,
          v_fiscper(7) type n.

    " Locate the FISCPER entry in the selection table
    read table l_t_range with key fieldname = 'FISCPER'.
    l_idx = sy-tabix.

    " Read the current version from TVARV
    clear tvarv.
    select single low
      from tvarv
      into v_prever
      where name = 'ZBSK_PREVIOUS_RELEASED_VERSION'
        and type = 'P'
        and numb = '0000'.

    " Build the fiscal period in the format YYYYPPP (e.g. 200912 -> 2009012)
    concatenate v_prever(4) '0' v_prever+4(2) into v_fiscper.

    l_t_range-low = v_fiscper.
    modify l_t_range index l_idx.
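    For the file-name routine itself that the question asks about, a minimal sketch could look like the following (hedged: BW generates the form skeleton when you create the routine in the InfoPackage; the parameter names below follow the generated template and may differ by release, and the path is purely illustrative):

    form compute_flat_file_filename
      changing p_filename type rsfilenm
               p_subrc    like sy-subrc.

      " Build a date-stamped file name, e.g. /interfaces/sales_20090703.csv
      concatenate '/interfaces/sales_' sy-datum '.csv' into p_filename.
      p_subrc = 0.
    endform.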
    Hope the above code helps you.
    Let me know in case of any concerns or if further help is needed.
    Bye,
    Shu Moh.

  • Master data loads for Attributes and texts failing

    Hello,
    Master data loads for attributes and texts are failing. The errors are:
    1. Lock not set for loading master data attributes
    2. Table /BI0/YAccount does not exist (this error is for the 0Account master data attribute load)
    3. Error 1 in the update
    We faced this error a few days ago and rebooting the server resolved it, but it has recurred after 4 days.
    RS12 and SM12 do not show any locks. Activating the InfoObject has also not resolved the error.
    Any insight is appreciated.
    Thanks,
    Inder


  • Master Data Loading for Prices and Conditions in CRM - "/SAPCND/GCM"

    Hi,
    Could anyone give me some inputs on Master Data Loading for Prices and Conditions in CRM.
    T. Code is:  /SAPCND/GCM
    I need to load data on a file (extracted from 4.6) for service contracts.
    I tried LSMW : for this transaction, recording does not work.
    I am trying loading thru Idocs (LSMW). But that too is note really working.
    Do we require some custom development for this , or is some SAP standard funcntionality available ??
    Can anyone provide some valuable inputs one this.
    Would appreciate your responses.

    Hi Tiest,
    Thanks for responding.
    You are right, our client is upgrading from 4.6 to ECC.
    So, as per the client's requirements, we are maintaining all the configuration for Services in CRM.
    The Services data that was in 4.6 is being pulled out into flat files, which need to be loaded into CRM, so the middleware would not be able to do this.
    What I am looking for is some standard upload program.
    LSMW recording does not work.
    With the IDoc CRMXIF_COND_REC_SLIM_SAVE_M I am able to load a single record, but I cannot find out how to make this work for multiple entries.
    Normally, when loading master data through IDocs, we map the values to the standard fields available in that IDoc.
    But in this particular IDoc there is a common field for which I need to define a field name and a field value.
    So far I am only able to define just one field name and one field value.
    I want this to work for multiple entries.
    Hope you get my point.
    Thanks

  • Status of InfoPackages and process chains become yellow after Unicode conversion

    Dear Friends,
    We have been facing a very critical issue in the production system after a Unicode conversion.
    The sequence of events is provided below for your reference.
    1. We suspended all BW loads for 48 hours (2 days) before the Unicode conversion started and handed the BW system over to Basis.
    2. Basis started the Unicode conversion on BW and found some standard jobs (BI_*, e.g. BI_PROCESS_LOADING) in released status. They assumed these jobs could run at any time, so they suspended all of them, but they didn't inform the BW team that they had done so. We don't know when they did this - before the Unicode installation, during the conversion, or afterwards.
    3. Basis handed the system over to the BW team for Unicode testing (we call it five-finger testing) and we didn't find any issues. We then handed the system back to Basis for the post-installation settings and backups. At that time we didn't release any jobs to load data into BW, because Basis hadn't told us that we could start the loads.
    4. After the post-installation work was done, Basis told us we could start loads from Monday. The entire Unicode activity was done over the weekend.
    5. Before releasing all BW loads to run regularly, I found that the status of all completed process chains had become yellow, and detailed analysis revealed some more strange things:
              1. Only the InfoPackages in the process chains have become yellow; other process variants such as activation, DTPs, index and DB statistics are fine.
              2. Requests in the PSA that had completed successfully have become yellow.
              3. InfoPackages in the process chains that completed successfully are not updating the correct status - instead of green they show yellow, and because of that the dependent objects are not triggered.
    We all think that the suspension of the standard jobs must have caused the problem.
    How can we change the status of the InfoPackages, PSA requests and process chains automatically?
    We have reactivated all PCs and released them to run tomorrow.
    Could you please share your thoughts about the issue and help us resolve it?
    Thanks in advance,
    Nageswar.

    Nageswar,
    A couple of points:
    1. If your previous PC runs are in yellow status, that by itself would not impact future PC runs.
    2. If your InfoPackages are in yellow, then you might have an issue with subsequent deltas, since the last status would be treated as incomplete.
    3. Suspending the standard jobs should not affect the status: only the released jobs were stopped, not loads that were already running halfway. You might have to schedule the process chains again to get back the old schedule.
    Usually in a Unicode conversion the data is exported out of the database and imported back - this might have caused some inconsistency in the RS tables used by the monitors.
    Which version of BW are you on?
    Were there any errors in the export/import procedure of the Unicode conversion?
    As part of testing, did you check whether loads went through properly?
    Were statistics run on the tables after the Unicode conversion?
    Also, if possible, regenerate all the indexes, and rerun statistics and index rebuilds for all RS* tables.

  • Data Load PSA to IO (DTP) Dump: EXPORT_TOO_MUCH_DATA

    Hi Gurus,
    I'm loading data from the PSA to the InfoObject 0BPARTNER. I have around 5 million entries.
    During the load, the control job terminates with the following short dump:
    EXPORT_TOO_MUCH_DATA
    1. Data must be distributed into portions of 2GB
    2. 3 possible solutions:
        - Either increase the sequence counter (field SRTF2) to include INT4
        or
        -export less data
        or
        -distribute the data across several IDs
    If the error occurs in a non-modified SAP program, you may be able to
    find an interim solution in an SAP Note.
    If you have access to SAP Notes, carry out a search with the following
    keywords:
    "EXPORT_TOO_MUCH_DATA" " "
    "CL_RSBK_DATA_SEGMENT==========CP" or "CL_RSBK_DATA_SEGMENT==========CM00V"
    "GET_DATA_FOR_EXPORT"
    This is not the first time I have done such a large load.
    The field SRTF2 is already of type INT4.
    BI version: 7.01, SP06.
    I found a lot of OSS Notes about monitoring jobs, industry solutions, the BI change run... nothing, however, about the data loading process.
    Has anyone encountered this problem please?
    Thanks in advance
    Martin

    Hi Martin,
    There is a series of notes which may be applicable here.
    However, if you have semantic grouping enabled, this may be a data-driven issue.
    The system will try to put all records with the same semantic key into one package.
    If the key is too generic, many records can end up in one data package.
    Please choose different (or more) fields for the semantic grouping, or deselect all fields if the grouping is not necessary at all.
    1409851 - Problems with package size in the case of DTPs with grouping
    Hope this helps.
    Regards,
    Mani

  • Impact of hourly Data load on BW and R/3

    Hi...
    As of now, the delta load from R/3 to BW takes place every night, but the users claim that they need it hourly.
    Is this practice (hourly data loads from R/3 to BW) commonly done? What would be the impact on BW and R/3 (any performance issues)?
    What is the minimum sensible time interval for the delta loads?
    Thanks,
    Sai.

    I think you have to let the user business requirements dictate the answer, as long as there are no system or application constraints.
    We have a load that runs 4 times a day during a 9-hour window: once at the beginning to pick up any overnight work, and then about every 3 hours during the business day to pick up new information as it is needed.
    There is no discernible impact on R/3 here, but that probably depends on your transaction volume and the efficiency of the extractors that are used.
    The user base does need to understand that loads will occur during the day and that, depending on when they run a query, the results could change - but presumably that's the whole reason you are doing this in the first place.
    This does result in more requests, so you want to make sure you are compressing your cubes based on request age rather than the number of requests.
    Can't really think of much else.

  • Data load problem - BW and Source System on the same AS

    Hi experts,
    I'm starting with BW (7.0) in a sandbox environment where BW and the source system are installed on the same server (same AS). The source system is SRM (Supplier Relationship Management) 5.0.
    BW is working on client 001 while SRM is on client 100, and I want to load data from SRM into BW.
    I've configured the RFC connections and the BWREMOTE users with their corresponding profiles in both clients, added an SAP source system (named SRMCLNT100), installed the SRM Business Content, replicated the DataSources from this source system, and everything worked fine.
    Now I want to load data from SRM (client 100) into BW (client 001) using standard DataSources and extractors. To do this, I created an InfoPackage on a standard DataSource (which has data, checked through RSA3 on client 100, the source system). I started the data load process, but the monitor says "no IDocs arrived from the source system" and keeps the status yellow forever.
    Additional information:
    BW Monitor Status:
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System Response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended.
    Current status
    No Idocs arrived from the source system.
    BW Monitor Details:
    0 from 0 records - but there are 2 records in RSA3 for this DataSource
    Overall status: Missing messages or warnings
    -     Requests (messages): Everything OK
    o     Data request arranged
    o     Confirmed with: OK
    -     Extraction (messages): Missing messages
    o     Missing message: Request received
    o     Missing message: Number of sent records
    o     Missing message: Selection completed
    -     Transfer (IDocs and TRFC): Missing messages or warnings
    o     Request IDoc: sent, not arrived ; Data passed to port OK
    -     Processing (data packet): No data
    Transactional RFC (SM58):
    Function Module: IDOC_INBOUND_ASYNCHRONOUS
    Target System: SRMCLNT100
    Date Time: 08.03.2006 14:55:56
    Status text: No service for system SAPSRM, client 001 in Integration Directory
    Transaction ID: C8C415C718DC440F1AAC064E
    Host: srm
    Program: SAPMSSY1
    Client: 001
    Rpts: 0000
    System Log (SM21):
    14:55:56 DIA  000 100 BWREMOTE  D0  1 Transaction Canceled IDOC_ADAPTER 601 ( SAPSRM 001 )
    Documentation for system log message D0 1 :
    The transaction has been terminated.  This may be caused by a termination message from the application (MESSAGE Axxx) or by an error detected by the SAP System due to which it makes no sense to proceed with the transaction.  The actual reason for the termination is indicated by the T100 message and the parameters.
    Additional documentation for message IDOC_ADAPTER 601: "No service for system &1, client &2 in Integration Directory". No documentation exists for message ID 601.
    RFC Destinations (SM59):
    Both RFC destinations look fine, with connection and authorization tests successful.
    RFC Users (SU01):
    BW: BWREMOTE with profile S_BI-WHM_RFC (plus SAP_ALL and SAP_NEW temporarily)
    Source System: BWREMOTE with profile S_BI-WX_RFCA (plus SAP_ALL and SAP_NEW temporarily)
    Could someone help?
    Thanks,
    Guilherme

    Guilherme,
    I don't see any reason why it's not bringing the data across. Are you doing a full extraction or a delta? If delta, please check whether the extractor is delta-enabled; sometimes this causes problems.
    Also check this weblog on basic checks for data load errors; it may help:
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
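    If you want to check the delta capability quickly, here is a hedged sketch (ZMY_DATASOURCE is a placeholder name; you can equally just display the DataSource in transaction RSA2 on the source system):

    " Read the delta method of a DataSource from table ROOSOURCE
    " (on the source system). An empty DELTA field means no delta support.
    data: lv_delta type roosource-delta.
    select single delta from roosource into lv_delta
      where oltpsource = 'ZMY_DATASOURCE'   " placeholder name
        and objvers    = 'A'.
    write: / 'Delta method:', lv_delta.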
    Thanks
    Sat

  • Error Regarding Data loading in Planning and budgeting cloud service.

    Hi,
    I was able to load data into Planning and Budgeting Cloud Service a month ago.
    I loaded via Administration -> Import and Export -> Import Data from File.
    Today I loaded the same file to the cloud instance, for the same entity as before, after clearing the existing data.
    I am getting an error during validation itself:
    Unrecognized column header value(s) were specified for the "Account" dimension: "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec", "Jan", "Feb", "Mar", "Point-of-View", "Data Load Cube Name". (Check delimiter settings; also, column header values are case sensitive.)
    I checked the source file and everything is correct. I actually loaded the same file before and it worked.
    Does anyone know the problem behind this? Can anyone give me a suggestion, please?
    Thanks in advance,
    Pragadeesh.J

    Thanks for your response, John.
    I had the Period and Year dimensions in the columns.
    I moved the Year dimension from the columns to the POV and loaded the data file; it loaded without any errors.
    I then changed the layout back to the original form and loaded again, and I didn't get any errors this time either.
    It worked somehow.
    Thank you.
    Cheers, John

  • Need a program to fetch the data from the package and delete the programs

    I need a program with a selection screen that accepts a package as an input parameter; after execution it should list all the programs in that package, and then the programs should be deleted from the package.

    Hi,
    Use the query below to fetch the programs from the entered development class:
    data: itab type standard table of tadir.
    select * from tadir into table itab
      where pgmid    = 'R3TR'
        and object   = 'PROG'
        and devclass = p_class.
    In the program we could delete the records from TADIR by looping over ITAB, but other program-related tables would still hold entries for these programs.
    Hope this helps.
    Praveen
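    A minimal sketch of the whole report (hedged: ZDEL_PROGS and the parameter names are illustrative; DELETE REPORT removes the program source and its text pool, which is safer than deleting TADIR rows directly - use it with great care):

    report zdel_progs.

    parameters: p_class type tadir-devclass obligatory,
                p_del   as checkbox.   " only delete when ticked

    data: lt_tadir type standard table of tadir,
          ls_tadir type tadir.

    " List all programs in the given package
    select * from tadir into table lt_tadir
      where pgmid    = 'R3TR'
        and object   = 'PROG'
        and devclass = p_class.

    loop at lt_tadir into ls_tadir.
      write: / ls_tadir-obj_name.
      if p_del = 'X'.
        " Removes the program source and text pool
        delete report ls_tadir-obj_name.
      endif.
    endloop.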

  • Flat file loading - InfoPackage settings

    Hi guys,
    I'm trying to load a flat file into an InfoCube.
    While doing the settings in the InfoPackage, I came across a tab strip named EXTERNAL DATA, and there are a few things I did not understand:
    1. There are two options, of which I have to choose one: File is... Data file or Control file.
    What do these two options mean, and where do we use them in real time?
    2. Currency conversion for external system: what will happen if I choose this option, and in which scenarios do we use it in real time?
    3. In the DATA TARGETS tab strip there are some options to choose:
         a. Update in all targets for which active rules exist
         b. Delete entire content of data target
    What will happen if I choose these options, and where do we use them in real time?
    Thanks and regards,
    Surekha

    1. File is... Data file or Control file:
    Here you should choose Data File.
    Please refer to:
    http://help.sap.com/saphelp_nw04/helpdata/en/80/1a6567e07211d2acb80000e829fbfe/frameset.htm
    2. Currency conversion for external system:
    This is important when we load a flat file that contains amounts with currencies. Several currency-related calculations need to be performed, such as determining the number of decimal places for a currency.
    For example:
    Whenever the amount 123 KRW is loaded into BW, the system checks the TCURX table for the KRW currency. After finding the entry in TCURX, it stores the amount in the target as 123 / 10^(2 - 0) = 1.23 KRW; the amount is divided by 100 because KRW is defined with 0 decimal places.
    For this to happen, we have to make a specific setting in the InfoPackage through which we schedule the loads: the checkbox "Currency Conversion for External Systems" should be ticked.
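    A small sketch of the decimal-shift logic described above (hedged: the variable names are illustrative; TCURX keys the number of decimal places by currency in field CURRDEC):

    " Reproduce the TCURX decimal shift for an external amount
    data: lv_currdec type tcurx-currdec,
          lv_ext     type p decimals 0 value 123,   " external amount, e.g. 123 KRW
          lv_int     type p decimals 2.

    select single currdec from tcurx into lv_currdec
      where currkey = 'KRW'.
    if sy-subrc = 0.
      " internal value = external / 10 ** ( 2 - decimals )
      lv_int = lv_ext / ( 10 ** ( 2 - lv_currdec ) ).   " 123 -> 1.23
    endif.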

  • " Data Load error issues in  DTP "

    Hi Friends,
    I am loading data from R/3 to BI.
    My data is loaded into the PSA.
    After that, when I run the DTP, the data does not arrive in the MultiProvider.
    Let me know the solution, so that I can run the DTP.

    Hi,
    A DTP loads the data into data targets.
    So, after checking the PSA, check the targeted data target itself (the InfoCube or DSO).
    If the data is there, then check the assignment between the data targets and the MultiProvider - a MultiProvider does not store data itself; it reads from the underlying targets.
    Otherwise, give us a clearer picture.
    Hope you got the idea.

  • How to select date range 01/01/2007 to till date in info package selection

    Hi,
    I have a business requirement to load data from 01.05.2007 up to the current date, and the end date should move forward each day when the load is scheduled daily.
    data: l_idx like sy-tabix.

    read table l_t_range with key fieldname = 'KDAY'.
    l_idx = sy-tabix.

    l_t_range-iobjnm    = '0CALDAY'.
    l_t_range-fieldname = 'KDAY'.
    l_t_range-sign      = 'I'.           " include
    l_t_range-option    = 'BT'.          " between
    l_t_range-low       = '20070501'.    " internal format YYYYMMDD
    l_t_range-high      = sy-datum.      " today; moves forward every day
    modify l_t_range index l_idx.

    Hi,
    Try to debug it by putting a break-point.
    There is a reply from A.H.P on how to debug a routine in the InfoPackage:
    put an endless loop at the start of the routine,
    data: debug(1).
    do.
      if debug = 'X'.
        exit.
      endif.
    enddo.
    and when you run the InfoPackage, go to SM50,
    select that process and choose the menu Program -> Debug Program.
    In the debug screen, type in debug, fill it with X and click the edit (pencil) icon.
    Press F5 to step to the next statement (and out of the loop).

  • Read data from excel sheet and then perform the required operations.

    Hi all
    I need to write a procedure which can read data from an Excel sheet. In the Excel sheet I have two options: one is modification and the other is addition. If it reads modification, then I need to read the concerned table name and check its availability in the PL/SQL database. If the table exists, I read the related columns in that row to build the query. The Excel sheet is saved on local disk C.
    Can anybody help me with this? How should I start, especially with reading the data from an Excel sheet saved on local disk C?

    Hi
    If you can convert the Excel file to CSV format, then it can simply be queried from the database by creating an Oracle external table. The best part is that you can still change the CSV using Excel.
    The following steps can be adopted:
    1. Convert the Excel file to a CSV file: File -> Save As -> CSV.
    2. Create an Oracle directory (this has to be the location of your CSV file):
    SQL> create directory mydir as 'C:\testdb';  -- "testdb" is the folder holding your CSV file
    3. Create the external table (the column names and datatypes below are placeholders; match them to your file):
    SQL> create table my_ext_tab (
           field1 varchar2(50),
           field2 varchar2(50),
           field3 varchar2(50),
           field4 number,
           field5 date
         )
         organization external
         ( type oracle_loader
           default directory mydir
           access parameters ( records delimited by newline
                               fields terminated by ',' )
           location ('my_ext_tab.csv') )
         reject limit 100;
    4. Now you can query the table:
    SQL> select * from my_ext_tab;
    Please avoid reformatting the data columns inside the spreadsheet (CSV).

  • Data Modeler importing packages and triggers

    Hi,
    I don't see any option to import triggers or packages from an Oracle database. Is it possible?
    Despite there being no option for trigger import, and no option in the GUI tree or table properties to display a table's triggers, to my surprise there is a 'Triggers' tab in the DDL generation window. So I think triggers are imported automatically, but they are neither visible nor creatable. Please correct me if I'm wrong.
    For packages I haven't seen any option; can this be true?
    Thanks

    Importing packages would be useful, as would having more control over the other PL/SQL objects such as triggers.
    By the way, DDL file generation lacks customization, such as generating multiple files depending on the type of object.
    I hope points like these can be introduced in the next release.
