Data Loading in BI From R/3

Hi Experts,
Subject: Time taken to extract data from R/3 to BI.
1. R/3: I have created a generic DataSource for the BSIS table, and for it I created a function module as the extractor.
2. Done the corresponding modeling in BI.
It is working fine in Development.
3. In Production BI: when I execute the InfoPackage for this DataSource, it takes 2 hours to extract 2 lakh (200,000) records.
   Some analysis on this:
     a. With 2 lakh records to be fetched from R/3, each packet extracts 36,232 records. The first 5 packets (around 181,160 records) finish in at most 30 minutes, but the last packet, which contains no more than 20,000 records, takes 1.5 hours.
Please provide some information on why it behaves like this; a better solution would be much appreciated.
Thanks in advance.
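
For reference, a function-module extractor for a generic DataSource normally follows the RSAX_BIW_GET_DATA_SIMPLE template: open a database cursor once, then fetch one packet per call. A common cause of the pattern described above (fast early packets, very slow final packet) is re-selecting from the table on every call instead of holding the cursor open. A minimal sketch, in which the extract structure ZBSIS_EXTRACT, the field list, and the selection logic are assumptions to be adapted:

    FUNCTION z_bsis_extract.
    * Sketch modelled on the RSAX_BIW_GET_DATA_SIMPLE template.
    *"  IMPORTING
    *"     VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
    *"     VALUE(I_MAXSIZE)  TYPE SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
    *"  TABLES
    *"     I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
    *"     E_T_DATA   STRUCTURE ZBSIS_EXTRACT OPTIONAL
    *"  EXCEPTIONS
    *"     NO_MORE_DATA ERROR_PASSED_TO_MESS_HANDLER
      STATICS: s_cursor  TYPE cursor,
               s_counter TYPE sy-tabix.

      IF i_initflag = 'X'.
        " Initialization call: store the selection criteria (omitted).
      ELSE.
        IF s_counter = 0.
          " Open the cursor ONCE; WITH HOLD keeps it open across commits.
          OPEN CURSOR WITH HOLD s_cursor FOR
            SELECT * FROM bsis.        " add a WHERE clause from I_T_SELECT
        ENDIF.
        " Fetch exactly one packet per call - no re-selection.
        FETCH NEXT CURSOR s_cursor
          APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
          PACKAGE SIZE i_maxsize.
        IF sy-subrc <> 0.
          CLOSE CURSOR s_cursor.
          RAISE no_more_data.          " signals that all packets are done
        ENDIF.
        s_counter = s_counter + 1.
      ENDIF.
    ENDFUNCTION.

If the cursor pattern is already in place, compare the per-packet times in the extraction monitor to see where the final call spends its time.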

Let's take an example of loading data into an InfoObject:
1. Create the InfoObjects/InfoProviders you might need in BI
2. Create your target InfoProvider (InfoObject)
3. Create a source system
4. Create a DataSource, or use a replicated R/3 DataSource
5. Create and configure an InfoPackage that will bring your records to the PSA
6. Create a transformation from the DataSource to the InfoProvider
7. Create a Data Transfer Process (DTP) from the DataSource to the InfoProvider
8. Schedule the InfoPackage (this can also be triggered programmatically; see the sketch after this list)
9. Once successful, run the DTP to move the data from the PSA to the InfoObject
10. Check the InfoObject for data
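
For step 8, an InfoPackage can also be started from ABAP via BAPI_IPAK_START. A minimal sketch; the InfoPackage technical name below is a placeholder (look up the real one in RSA1), and RETURN should always be checked:

    * Hedged sketch: start an InfoPackage programmatically.
    DATA: l_requestid TYPE rsrequnr,
          lt_return   TYPE STANDARD TABLE OF bapiret2,
          ls_return   TYPE bapiret2.

    CALL FUNCTION 'BAPI_IPAK_START'
      EXPORTING
        infopackage = 'ZPAK_XXXXXXXX'   " placeholder technical name
      IMPORTING
        requestid   = l_requestid
      TABLES
        return      = lt_return.

    " Report any errors or abort messages returned by the BAPI.
    LOOP AT lt_return INTO ls_return WHERE type CA 'EA'.
      WRITE: / 'InfoPackage start failed:', ls_return-message.
    ENDLOOP.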
Go through flat-file loading:
Loading an InfoObject using a flat file
http://help.sap.com/saphelp_nw2004s/helpdata/en/43/01ed2fe3811a77e10000000a422035/content.htm
- http://help.sap.com/saphelp_nw04s/helpdata/en/43/03450525ee517be10000000a1553f6/frameset.htm
Please search SDN before posting; there are already a lot of posts on this topic.
Regards,
Senthil

Similar Messages

  • Data Loading in BW from XI

    Hi,
    I am sending data from XI to BI 7.0 by using a proxy.
    We have created a DataSource of type Web Service in BI 7.0.
    Data is getting loaded into the PSA, but the request still has status yellow.
    How can we close the request automatically once all the data has been loaded into the PSA?
    Thanks
    Mohan

    Hi,
    We have done the configuration based on that document's steps only.
    We are able to send the data into the PSA.
    But the data has to go from the PSA to the ODS by using a DTP.
    Since the PSA request is yellow, the DTP does not extract any data from the PSA.
    Currently we are manually setting the request to green and uploading the data.
    We are looking for a setting in BW that turns the PSA request from yellow to green once all the records have been loaded from XI.
    Thanks
    Mohan

  • Master Data Load to APO from SAP MDM through SAP PI

    Hi SDners,
    This is a parts master data solution for one of the largest auto manufacturers, where SAP MDM will be the central hub of all the global and local attributes of parts. The subscribing system is SAP SCM APO. Please advise whether direct integration between APO and MDM through PI is possible. Does PI have the standard IDoc types to communicate with APO?
    Also, does APO have an RFC/BAPI to load master data for the Product Master (/n/sapapo/mat1)?
    Thanks,
    Prabuddha

    Hi,
    Check the LUWs in SM58 in the source system and then execute them. Otherwise, check in BD87 and push the IDocs manually in ECC.
    Thanks
    Reddy
    Edited by: Surendra Reddy on Feb 24, 2009 10:59 AM

  • Data Load log #'s from 1003000 - 1003999

    Hi,
    Does anyone know what each of the numbers mentioned above means? We are looking for a number that will tell us that the data load has started and that it has completed. We want to search within the log file based on these numbers.
    Thanks,
    Minash

    I have written a parser for just this function. Do you do VB? Let me know if you want the code.
    Jeff McAhren
    Dallas, Texas

  • Data loading to Infocube from SAP R/3 System

    Hello Friends,
    I started a load in BW four days back in the background. Initially I could see the load running in transaction SM50. This load has 28 data packets. Out of 28, five data packets have successfully updated data to the cube; the remaining packets are not through yet and are still yellow. Now there is no job running in the background in SM50, and I didn't find any runtime error in ST22. The source system is SAP R/3. The data is related to Employee HR Position.
    Could anyone please throw some light on this? Thanks in advance; your response will be appreciated.
    Regards,
    Sreekanth

    Hi,
    For this you have to check the LUW entries in SM58 of R/3. If there are entries there, your load won't be successful.
    Also check all of the following:
    1) In the extraction details, has your data selection ended?
    2) After that, check SM58; there should be no entries there.
    3) If your load is still yellow, you can manually change the status to red, then go to each data packet, right-click, and select manual update. This takes some time to turn green.
    Do the same for all packets in yellow; after all the packets are green, you can change the status from red back to the technical status.
    Regards,

  • How is offline data loading possible from MS Access DB to Oracle DB

    Hi All,
    I am new to Oracle.
    I am trying to migrate a few tables from an MS Access DB to an Oracle DB (10g).
    I have used the migration utility from the Sql Developer tool.
    I was successful in creating the schema and also transferring the data for tables.
    But if the data in my Access DB is updated every day, how am I supposed to link it to the newly created Oracle tables?
    Is there any way to do this?
    And if I want to load the Oracle tables offline using the "Generate Offline Data Move Scripts" option in SQL Developer, how do I do it?
    I tried using the "Generate Data Move Scripts" right-click option on the converted object, and the utility created a few files on my local machine.
    I am not able to make my way forward.
    Please help me.
    regards,
    Sushil

    Hi Sushil,
    I will try to address each of your questions:
    Q: "if my data in Access DB is updated everyday then how am i suppose to link this to newly created oracle tables? Is there any way to do this?"
    A: Depending on your reasons for maintaining the use of the MS Access database, you have two options.
    1. lf you are continuing to use the MS Access database because the application front end is MS Access forms & reports then you could just create linked tables to the newly migrated Oracle tables. Doing so will mean that any data modified via the frond end forms & reports will be saved directly in the Oracle database.
    2. If you wish to continue saving data to the MS Access database, then transfer it to the Oracle database, then you can just use the "Migrate Data" option in Oracle SQL Developer Migration Workbench to carry out this task. You just need to ensure that you've an open connection to your migration repository, and have the converted model for the migrated database. You will then just transfer the data online, like you have already done.
    Q: "And if i want to load the Oracle tables offline using "Generate Offline Data Move Scripts" option in sql developer then how do i do it?"
    A: To migrate your MS Access data offline, you need to do the following:
    1. Run the Exporter tool for MS Access, and select the "Export for Oracle SQL Developer" option. On the second screen of the tool, browse to the location of your MDB file, provide a location for the output directory and ensure you select the "Export Table Data" option before you click the "Export" button. This option generates a .DAT file for each table containing data in your MDB file.
    2. In Oracle SQL Developer, once you have carried out your migration, select the Migration > Script Generation > Generate Data Move Scripts menu item and select the Converted Model to generate the scripts for. An "MSAccess" folder will be generated in the path specified in the "Generate Offline Data Move Files" dialog.
    3. Navigate to the "MSAccess" folder generated in step 2, and this folder should contain a "oracle_ctl.bat" file, which
    for your converted model.
    4. Edit the "oracle_ctl.bat" file and update the script to replace <Username>/<Password> with the actual username/password combination required to connect to the migrated schema e.g. for a migrated Northwind database, this combination would be northwind/northwind. Save the changes to the file.
    5. Copy the .DAT files generated in step 1 to the /MSAccess/Oracle folder. The .ctl files refer to the .DAT file in order to load the data into Oracle using SQL*Loader.
    6. Open each of the .ctl files and check the file name referenced on the 2nd line of the file, e.g. infile '[Categories].null', where the file name is "Categories.null". The ".null" part must be updated to ".dat" (see the example after this list). This is a known issue, and a fix will be available in a future release of the Oracle SQL Developer Migration Workbench. Save the changes to the .ctl files.
    7. Open up a Command Prompt & navigate to the /MSAccess folder, and run the "oracle_ctl.bat" file to load the data into the Oracle database tables. If you experience any issues during this process, check the output in the command prompt & try to resolve any reported issues. If you are unable to do so, please refer to the Migration Workbench forum - Database and Application Migrations If you cannot find a solution from the existing threads, please post a new thread, giving the full syntax of any reported error messages.
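
    As an illustration of step 6, a corrected control file for the Northwind Categories table might look as follows. The column list and delimiter options are illustrative; the generated files will reflect your own schema:

        -- Illustrative SQL*Loader control file after the step 6 fix.
        -- The INFILE line previously read: INFILE '[Categories].null'
        LOAD DATA
        INFILE 'Categories.dat'
        INTO TABLE CATEGORIES
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
        (CATEGORYID, CATEGORYNAME, DESCRIPTION)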
    I hope this helps.
    Regards,
    Hilary

  • Data Loading into Infocube from ECC.

    Hi,
    Can someone help me with the list of steps to be followed for loading data into an InfoCube using a Data Transfer Process (DTP)? That is, the steps to be performed in the system.
    Also, please share any links on the various methods of loading data into an InfoCube from ECC.
    Thanks!

    The steps would be as follows:
    1) Create the InfoObjects.
    2) Create the InfoCube. If you have a planning area and the cube is a replica of the PA, you can use /SAPAPO/TS_PAREA_TO_ICUBE to create the InfoCube.
    3) Create a source system as a file system.
    4) Create the DataSource with the relevant fields.
    5) Create a transformation for the cube using the DataSource you created in step 4.
    6) The last step is to create the DTP. When you create the DTP, on the first screen, check the checkbox "Do not load data from PSA". This gives you additional fields to enter the file path. When this DTP is created and generated, go to the last tab of the DTP to execute it.
    This will help you pull the data directly from the file and load it into the cube.
    You can also create a process chain to automate this.
    Hope this helps.
    Let me know if you need any more information.

  • Data Loading into BW (from Excel to database table)

    Hi Guys
    I have an excel sheet which has the fields:
    --> Soldto(3.1), Soldto(4.6), Shipto, Billto, Payer, SalesOrg, AAG.
    I also have a database table /BIC/AZSTXREF00 in BW QA (SE16), which has fields:
    --> /BIC/ZSOLD_TO(3.1), /BIC/ZBTOPRTY, PAYER, SHIP_TO, SOLD_TO(4.6), BILLTOPRTY, ACCNT_ASG, RECORDMODE. 
    The fields appear in this order in the respective places.
    Please note that the fields are not in the same order in the two places, but they are equal in number.
    How can I delete the previous data in the database table in BW, and then load the fresh data from my Excel sheet?
    Thanks.

    Hi Chintai,
    /BIC/AZSTXREF00 seems to be the active table of the ZSTXREF ODS. If that is the situation, you can delete all ODS content from RSA1 --> right mouse click on the ODS --> Delete Data.
    For the upload: right mouse click on the ODS --> Show Data Flow --> take the name of the InfoSource for this ODS --> go to the InfoSources tree --> open the Transfer Rules --> modify the mapping between the fields of the MS Excel file and your transfer structure --> Activate --> Upload.
    Ciao.
    Riccardo.

  • Data loading of 0VENDOR from PSA to data target (InfoObject)

    Dear Friends
    I have a problem: I have loaded the master data for 0VENDOR; the data is coming up to the PSA but not going on to the data target, i.e. the InfoObject.
    Here is the system message from RSMO (status tab):
    Processing in Warehouse timed out; processing steps missing
    Diagnosis
    Processing the request in the BW system is taking a long time and the processing step has still not been executed.
    System response
    Caller is still missing.
    Procedure
    Check in the process overview in the BW system whether processes are running in the BW system under the background user.
    If this is not the case, check the short dump overview in the BW system.
    I also got this error yesterday. Is this something for the BASIS team, or does it have to be handled in BW?
    I should mention that this is a daily job.
    Thanks and Regards
    Rajasekar

    Hi Manfred,
    Thanks for your response.
    I checked ST22 and didn't find a short dump. In the details tab of RSMO I find: "Transfer (IDocs and tRFC): Errors occurred; Data Package 1: arrived in BW; Processing: data packet not yet processed". I have also activated the transfer rules, and the data is available in the PSA. In the InfoPackage I have selected full update.
    In the Processing tab I find:
    Data Package 1 (11442 records): Errors occurred
    It also shows the transfer rules with a red button.
    If you can offer any solution for the above problem, I will be highly thankful to you.
    Thanks and Regards
    Rajasekar

  • Adding leading zeros before data is loaded into the DSO

    Hi
    In PROD_ID below, some IDs are missing their leading zeros before the data is loaded into BI from SRM. The data type is character, with a total length of 40. If the leading zeros are missing, DSO activation fails, and we have to add them manually in the PSA table. I want to add the missing leading zeros before the data is loaded into the DSO. For example, if the value is 1502 there should be 36 zeros before it, and if it is 265721 there should be 34. Only values of length 4 or 6 arrive, so 34 or 36 leading zeros are always needed when they are missing.
    Can we use the CONVERSION_EXIT_ALPHA_INPUT function module? Since this is a character field, I'm not sure how to use it in this case. Do we need to convert the value to an integer first?
    Can someone please give me sample code? We're using the BW 3.5 data flow to load data into the DSO. Please also indicate where the code needs to go: in the rule type or in the start routine.

    Hi,
    Can you check at the InfoObject level what kind of conversion routine it uses?
    Use transaction RSD1, enter your InfoObject, and display it.
    At the DataSource level you can also see the external/internal format that is maintained.
    If your InfoObject uses the ALPHA conversion routine, it will get the leading zeros automatically.
    Check how the data arrives from the source in RSA3.
    If you are receiving this issue for certain records only, then you need to check those records.
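
    On the sample-code request: in the BW 3.5 flow this is typically done per field in a transfer rule routine on PROD_ID rather than in the start routine, and no integer conversion is needed, because CONVERSION_EXIT_ALPHA_INPUT left-pads purely numeric values with zeros up to the field length. A minimal sketch, assuming the source field is TRAN_STRUCTURE-PROD_ID and the target is CHAR40:

        * Hedged sketch for a BW 3.5 transfer rule routine on PROD_ID.
        * TRAN_STRUCTURE-PROD_ID is an assumed field name; adapt it to
        * your transfer structure. RESULT is the routine's CHAR40 output.
        DATA l_prod_id(40) TYPE c.

        l_prod_id = tran_structure-prod_id.   " e.g. '1502' or '265721'

        * Pads numeric values with leading zeros to the full 40 chars;
        * values containing non-numeric characters are left unchanged.
        CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
          EXPORTING
            input  = l_prod_id
          IMPORTING
            output = l_prod_id.

        result = l_prod_id.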
    Thanks

  • Data extraction to BW from R/3 taking too much time

    Hi,
    We have one delta data load to an ODS from R/3 that takes 4-5 hours. The job runs in R/3 itself for 4-5 hours, even for 30-40 records. After this, the ODS data is updated to the cube; but since the ODS load itself takes so much time, the delta brings 0 records into the cube, and we have to update it manually.
    Also, while the job for the load to the ODS is running, we can't check the delta records in RSA3; it gives the error "error occurs during extraction".
    Can you please guide us on how to make this load faster, and if any index needs to be built, how to proceed on that front?
    Thanks
    Nilesh

    Rahul,
    I tried with R; it gives me a dump with the message "Result of customer enhancement: 19571 records".
    The error details are:
    Short text
        Function module " " not found.
    What happened?
        The function module " " is called,
        but cannot be found in the library.
        Error in the ABAP Application Program
        The current ABAP program "SAPLRSA3" had to be terminated because
        it came across a statement that unfortunately cannot be executed.
    What can you do?
        Note down which actions and inputs caused the error.
        To process the problem further, contact your SAP system
        administrator.
        Using transaction ST22 for ABAP Dump Analysis, you can look
        at and manage termination messages, and you can also
        keep them for a long time.

  • Data Loading Error for cube 0TCT_C22

    Dear Colleagues,
    I am working on BI 7.0 / SP09.
    I am loading the technical content cube 0TCT_C22 from DataSource 0TCT_DS22. Up to the PSA there is no problem with the data load, but from the PSA to the data target the load fails. The monitor shows the error "Error calling number range object 0TCTPRCSCHN for dimension D2 (  ). Message no: RSAU023".
    I tried to find SAP notes for this, but with no success. I also checked the dumps and application logs, but nothing is there.
    Please advise.
    Regards
    PS

    Hi Pank,
    I just solved a very similar issue. Try what I did and see if it works for you. For each dimension in each InfoCube, a number range is created. For some weird reason, during the activation of the InfoCube the number range for the troublesome dimension was not created. Look for it in transaction SNRO: you should find a number range for each dimension in the cube except the one giving you the error.
    To solve it (the easiest way I found), just add any characteristic to the problematic dimension and activate the InfoCube. After that, modify the InfoCube again, remove the characteristic you just added to leave the dimension as you need it, and activate the InfoCube once more. This forces the regeneration of the dimension and, with it, of the number range. You can check in SNRO that the number range is now there. Try loading your data again and it should work.
    One thing I don't understand is why that number range sometimes is not created during activation.
    Good luck, I hope you can solve it!!!
    Regards,
    Raimundo Alvarez

  • "master data deletion for requisition" before master data loading

    Hello Gurus,
    In our BW system, the process chains for loading master InfoObjects all include a "master data deletion for requisition" ABAP process, except for one process chain. My questions are:
    Why is that process chain for master data loading different from the others in lacking the "master data deletion for requisition" step?
    And does it matter whether you include the "master data deletion for requisition" ABAP process in a process chain for master data loading?
    Many thanks.

    Hi,
    ABAP Process means some ABAP program is being executed in this particular step.
    It's possible that for all of your process chains except that one, the requirement was to do some ABAP program processing.
    You can check which program is executed by following below process:
    Open your process chain in planning view -> Double click on that particular ABAP process -> Here you can see program name as well as program variant.
    Hope this helps!
    Regards,
    Nilima

  • Data load Tuning

    Hello All,
    What are the possible ways to tune data loads from R/3 to BW? Please help.
    Thanks,
    Suman

    Hi,
    To improve the data load performance
    1. If they are full loads, see whether you can make them delta loads.
    2. Check whether complex routines/transformations are being performed in any layer. If so, see if you can optimize the code with the help of an ABAPer.
    3. Ensure that you are following the standard procedures in the chain, such as deleting indexes/secondary indexes before loading (see the sketch near the end of this reply).
    For example:
    1) Create Index
    2) Delete Index
    3) Aggregate Creation on Info Cube
    4) Compressing Info Cube data
    5) Rollup Data to Aggregates
    6) Partitioning infoCube
    7) Load Master data before loading Transactional Data
    8) Adjusting Datapackage size
    https://forums.sdn.sap.com/click.jspa?searchID=10049032&messageID=4373697
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3a699d90-0201-0010-bc99-d5c0e3a2c87b
    4. Check whether the system processes are free when this load is running
    5. Try making the load as parallel as possible if it is currently running serially. Remove the PSA from the flow if it is not needed.
    6. Go to Manage ODS -> Activate -> activate in parallel -> increase the number of processes from there. For direct access, try transaction RSODSO_SETTINGS.
    7. Remove the BEx Reporting checkbox in the ODS if it is not required.
    8. When the load is not getting processed due to a huge volume of data, or a large number of records per data packet, try the following:
    1) Reduce the IDoc size to 8000 and the number of data packets per IDoc to 10. This can be done in the InfoPackage settings.
    2) Run the load only to the PSA.
    3) Once the load is successful, push the data to the targets.
    In this way you can overcome the issue.
    Ensure correct data packet sizing, number range buffering, PSA partition size, and upload sequence, i.e. always load master data first, perform the change run, and then run the transaction data loads.
    Use InfoPackages with disjoint selection criteria to parallelize the data export.
    Complex database selections can be split to several less complex requests.
    Number Range Buffering Performance  
    /thread/754694
    Review OSS notes 857998 and 130253; the first note tells you how to find the dimensions and InfoObjects that need number range buffering.
    Check this doc on BW data load performance optimization:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/1955ba90-0201-0010-d3aa-8b2a4ef6bbb2
    BI Performance Tuning
    Business Intelligence Journal Improving Query Performance in Data Warehouses
    http://www.tdwi.org/Publications/BIJournal/display.aspx?ID=7891
    Achieving BI Query Performance Building Business Intelligence
    http://www.dmreview.com/issues/20051001/1038109-1.html
    SAP Business Intelligence Accelerator: A High-Performance Analytic Engine for SAP NetWeaver Business Intelligence
    http://www.sap.com/platform/netweaver/pdf/BWP_AR_IDC_BI_Accelerator.pdf
    BI Performance Audit
    http://www.xtivia.com/downloads/Xtivia_BIT_Performance%20Audit.pdf
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/10564d5c-cf00-2a10-7b87-c94e38267742
    https://websmp206.sap-ag.de/~form/sapnet?_SHORTKEY=01100035870000689436&
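
    As a sketch for point 3 above (dropping indexes before the load and rebuilding them afterwards), the function modules behind the "Delete Index"/"Create Index" process chain steps can also be called directly. The cube name is a placeholder, and the exact signatures should be verified in SE37 on your release:

        * Hedged sketch: drop and rebuild InfoCube indexes around a load.
        DATA l_cube TYPE rsinfocube VALUE 'ZSALES_C1'.  " placeholder name

        * Before the load: drop the secondary indexes on the fact table.
        CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
          EXPORTING
            i_infocube = l_cube.

        * ... run the data load here (InfoPackage / process chain) ...

        * After the load: recreate the indexes.
        CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_REPAIR'
          EXPORTING
            i_infocube = l_cube.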
    Thanks,
    JituK

  • Data loading delay

    Hi Friends,
    I need an answer to one issue.
    The issue is: every day I load to one InfoCube, and whatever the cube, each load takes 2 hours; but once it took 5 hours. What might be the reason? I am just confused by that. Can anybody clarify?
    Regards.,
    Balaji Reddy K.

    Reddy,
    1. Is the time taken loading to the PSA, or loading from the PSA to the cube? If it is the load to the PSA, then usually the problem lies at the extractor.
    2. If it is the load to the cube, then check whether statistics are being maintained for the cube; they would give an accurate picture of where the data load is taking up most time.
    Do an SQL trace during the data load. If you find a lot of master data lookups, make sure that master data is loaded first; and if there are a lot of lookups on table NRIV, check whether number range buffering is on so that DIM IDs get generated faster.
    Check whether the data load runs faster if you drop any existing indexes.
    Are you loading any aggregates after the data load? Check whether the aggregates are necessary, and whether they have been enabled for delta loads.
    If you have active indexes and there is a huge data load, then depending on the index, the load can get delayed.
    If the cube is not compressed, the data load can sometimes also get delayed.
    Also, while the data load is running, check SM50 and SM37 to see whether the jobs are active; this confirms that the load is active on both sides.
    Always update the statistics for the cube before the load and after the load; this helps in deciphering the time the data load takes. After activating the statistics, check table RSDDSTAT or the standard reports available as part of the BW technical content.
    Hope it helps,
    Arun
    Assign points if helpful

Maybe you are looking for

  • Elvismx no device detected & Elvismx hangs and stops working

    Hi There, I bought a Mydaq, it is detected on device manager and  NI Max. However none of the devices from Elvismx is working, once it launches hangs. I am using windows 7, Home Premium 64 Bits The NI troubleshoooting utility stops working and gives

  • How do I transfer my library from this mac to a new mac

    How do I transfer my itunes library from this MacBook Pro to a new one?  I am using an iPod Touch...

  • Can't send audio messages from Nokia E63 to email ...

    I used to send audio messages from my mobile to my friend's email, and it worked fine. But some days ago it stopped working. I keep on seeing this whenever I try to send audio message. "In order to send/download MM messages, the current active connec

  • Two video cards?

    The info on the Macbook Pro says: "... NVIDIA GeForce 9400M + 9600M GT 512M..." Do the Macbook Pros actually come with two video cards or are we to choose between the two when ordering. If they come with both, are we forced to allow the OS to choose

  • WANTED!! My Desktop Icons

    Ok heres the issue, I had a power out in my house and my Mac shut down. When i booted it back up there were no icons on the desktop. Except for The Disk i had in the drive and my external hard drive. Whats up?