Data Load Speed

Hi all.
We are starting an SAP implementation at the company I work for, and I have been designated to prepare the data load from the legacy systems. I have already asked our consultants about data load speed, but they didn't really answer what I need.
Does anyone have statistics on data load speed (records per hour) using tools like LSMW, CATT, eCATT, etc.?
I know that the speed depends on what data I'm loading and also on the CPU speed, but any information is good to me.
Thank you and best regards.

Hi Friedel,
Again, here are the complete details regarding the data transfer techniques.
<b>Call Transaction:</b>
1. Synchronous processing
2. Synchronous and asynchronous database updates
3. Data is transferred for an individual transaction each time the CALL TRANSACTION statement is executed
4. No batch input log is generated
5. No automatic error handling
<b>Session Method:</b>
1. Asynchronous processing
2. Synchronous database updates
3. Data is transferred for multiple transactions
4. A batch input log is generated
5. Automatic error handling
6. SAP's standard approach
<b>Direct Input Method:</b>
1. Best suited for transferring large amounts of data
2. No screens are processed
3. The database is updated directly using standard function modules; e.g., see the program RFBIBL00
<b>LSMW:</b>
1. A code-free tool that helps you transfer data into SAP
2. Suited for one-time transfers only
<b>CALL DIALOG:</b>
This approach is outdated; you should choose one of the techniques above.
Also check the knowledge pool for more reference:
http://help.sap.com
Cheers,
Abdul Hakim

Similar Messages

  • How can I increase the data loading speed

    Hi Expert,
    I need to populate a table across a dblink from another database each night. There are about 600,000 rows and it takes about two hours. These are the steps:
    delete from target_table;
    insert into target_table
    select * from source_table@dblink;
    commit;
    I can't use TRUNCATE, because if the load fails I still want the old data.
    How can I increase the loading speed?
    Thanks!

    DELETE and INSERT /*+ APPEND */ aren't a good combination, as the high water mark will keep going up.
    With a trivial number of rows like this I would not expect the delete or insert to be the problem. (How long does the delete take anyway?) It's more likely to do with the query over the db link. Is it just one table, or is that a simplified example? Can you check the equivalent insert over on the remote database? If it's much faster there, I would investigate the network connection.
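    If the remote query itself checks out, one way to keep the old data while still avoiding DELETE's high-water-mark growth is to pull into a scratch table first and only swap the data in once the pull has succeeded. A minimal sketch, reusing the poster's table names; target_table_stage is a hypothetical scratch table:
    -- Empty copy of the target's structure (hypothetical scratch table).
    CREATE TABLE target_table_stage AS
    SELECT * FROM source_table@dblink WHERE 1 = 0;
    -- Direct-path pull; if this fails, target_table is untouched.
    INSERT /*+ APPEND */ INTO target_table_stage
    SELECT * FROM source_table@dblink;
    COMMIT;
    -- Only after the pull succeeds: TRUNCATE resets the high water mark
    -- that repeated DELETEs keep pushing up, then reload locally.
    TRUNCATE TABLE target_table;
    INSERT /*+ APPEND */ INTO target_table
    SELECT * FROM target_table_stage;
    COMMIT;
    DROP TABLE target_table_stage;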

  • ODI data loading Speed

    Hi All,
    The speed of the ODI data loading steps is too low.
    I am using:
    LKM = LKM SQL to Oracle,
    CKM = CKM SQL,
    IKM = IKM SQL Incremental Update, for replicating from SQL Server to Oracle.
    I don't use flow control in the interfaces.
    SQL Server and Oracle are installed on the same server.
    How can I make this faster?

    If the two database servers are on the same machine and you are dealing with bulk data, you should use an LKM which uses bulk methods (BCP to extract the data from SQL Server, and external tables to get the data into Oracle) - something like this KM: https://s3.amazonaws.com/Ora/KM_MSSQL2ORACLE.zip (which is actually an IKM, but the LKM is not so different).
    Hope it helps.
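    For reference, the Oracle half of such a bulk approach usually maps the BCP-extracted flat file as an external table, which can then be bulk-read like any other table. A rough sketch only; the table, column, directory, and file names, and the '|' delimiter (which must match the delimiter given to bcp -t), are all assumptions:
    CREATE TABLE ext_src_orders (
      order_id NUMBER,
      amount   NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY load_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY '|'
      )
      LOCATION ('orders.dat')
    );
    -- Bulk read into a (hypothetical) staging table:
    INSERT /*+ APPEND */ INTO stg_orders SELECT * FROM ext_src_orders;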

  • Data load happening terribly slow

    Hi all,
    I opened up the quality server and made a change (removed its compounding attribute) at the definition level to an InfoObject present in the InfoCube.
    The InfoObject, InfoCube, and update rules are all active.
    When I scheduled the load now, there was a huge variation in the data load speed.
    Before:
    3 hrs - 60 lakh (6 million) records
    Now:
    3 hrs - 10 lakh (1 million) records.
    Also, I am observing in SM66 that the processes are all reading the NRIV table.
    Can anyone throw some insight on this scenario?
    Any useful input will be rewarded!
    Regards
    Dhanya.

    1. Yes, select main memory. How many entries do you expect in this dimension?
    In other terms, how many different combinations of characteristic values included in your DIM are to be posted? As a first guess you could enter 50,000 there, but please let me know the cardinality of this dimension (see the sketch below).
    2. Having master data or not doesn't matter here. Your fact table is booked with DIMIDs as keys of the table. Every time you book a record, the system checks in the dimension tables whether this combination of characteristic values already has a record in its DIM; if yes, fine, nothing to do. If a new combination comes, the system has to add a record to the dimension, so it will first look for the next number range value (= DIMID).
    In addition, the system will create master data SIDs as well (even if there is no master data). In each dimension table you'll find the corresponding master data SIDs for each of the InfoObjects belonging to the dimension.
    That's why filling empty cubes takes much more time than loading into a cube that already has data. That's also why the more data you load, the less time it takes.
    Please also make sure that all your F table indexes are dropped (manage cube, performance tab, delete indexes prior to loading). This will help initial loads considerably.
    Understanding these BW data warehousing concepts is of paramount importance in order to set the system up properly.
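    A quick way to estimate the cardinality asked about in point 1 is to count the distinct characteristic combinations in the data to be loaded. A throwaway sketch; the staging table and column names are hypothetical placeholders for the characteristics assigned to the dimension:
    SELECT COUNT(*) AS dim_cardinality
    FROM (
      SELECT DISTINCT char_a, char_b, char_c  -- the characteristics in this dimension
      FROM source_staging                     -- hypothetical staging table for the load
    );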
    Message was edited by:
            Olivier Cora

  • Essbase Studio Performance Issue : Data load into BSO cube

    Hello,
    Having successfully built my outline by member loading through Essbase Studio, I tried to load data into my application, again with Studio. However, I was never able to complete the data load because it takes forever. Each time I tried to work with Studio in streaming mode (hoping to increase the query speed), the load was terminated with the following error: Socket read timed out.
    In the Studio properties file I typed in oracle.jdbc.ReadTimeout=1000000000, but the result has not changed. Even if it did work, I am not sure the streaming mode would be much faster than the non-streaming mode anyway. What I'd like to know is which Essbase settings I can change (either on the Essbase or the Studio server) in order to speed up my data load.
    I am loading into a block storage database with 3 dense, 8 sparse, and 2 attribute dimensions. I filtered some dimensions and tried a load to see exactly how long it takes to create a certain number of blocks. With the ODBC setting in Essbase Studio, it took 2.15 hours to load data into my application, and only 153 blocks were created with a block size of 24B. Assuming that in my real application the number of blocks created will be at least 1000 times more than this, I need to make some changes to the settings.
    I am transferring the data from an Oracle database, with 5 tables joined to a fact table (view) from the same data source. All the cache settings in Essbase are at their defaults. Would changing cache settings, buffer size, or multiple threads help to increase the performance? Or what would you suggest I do?
    Thank you very much.

    Hello user13695196,
    (sorry, I no longer remember my system number here)
    Before any optimisation attempts in the Essbase (or Studio) environment, you should definitely make sure that your source data query performs well on the Oracle DB.
    I would recommend:
    1. Create, in your DB source schema, a view from your SQL statement (the one behind your data load rule).
    2. Query against this view with any GUI (SQL Developer, TOAD, etc.) to fetch all rows and measure the time it takes to complete. Also note the number of rows returned, for your information and for comparing future results.
    If your query runs longer than you think is acceptable, then
    a) check DB statistics,
    b) check and/or consider creating indexes,
    c) if you are unsure, kindly ask your DBA for help. Usually they can help you very fast.
    (Don't be shy - a DBA is a human being like you and me :-) )
    Only when your SQL runs fast at the database (fast enough for you, or your DBA says it's the best you can achieve) should you move your effort over to Essbase.
    One hint in addition:
    We have often had problems when using views for data loads (not only performance, but other strange behavior too). That's the reason I prefer to build directly on (persistent) tables.
    Just to keep in mind: if nothing helps, create a table from your view and then query your data from that table for your Essbase data load. Normally, however, this should be your last option.
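    As a concrete illustration of steps 1 and 2 above - the view name, tables, and joins are stand-ins for whatever SQL actually sits behind the load rule:
    -- 1. Persist the load-rule SQL as a view (all names hypothetical).
    CREATE OR REPLACE VIEW v_essbase_load AS
    SELECT d1.member_name AS dim1_member,
           d2.member_name AS dim2_member,
           f.amount
    FROM fact_view f
    JOIN dim_table1 d1 ON d1.dim_key = f.dim1_key
    JOIN dim_table2 d2 ON d2.dim_key = f.dim2_key;
    -- 2. Note elapsed time and row count. (COUNT(*) may read less data
    -- than a full fetch; pulling all rows in SQL Developer/TOAD is the
    -- truer test of what Studio will experience.)
    SELECT COUNT(*) FROM v_essbase_load;
    -- Last resort per the hint above: materialize the view as a table
    -- and point the Essbase load at the table instead.
    CREATE TABLE t_essbase_load AS SELECT * FROM v_essbase_load;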
    Best Regards
    (also to you Torben :-) )
    Andre
    Edited by: andreml on Mar 17, 2012 4:31 AM

  • Design pattern / data loading solution

    Hello all!
    I have been working on a few projects which involve loading data: sometimes remote, sometimes local, sometimes JSON, sometimes XML. The problem I am having is that, due to the speed of development and the changing minds of various clients, I am finding my designs are too rigid, and I would like them to be more flexible. I have been trying to think of a reusable solution to data loading, and would like some advice, as I imagine many of you out there have had the same problem.
    What I would like to do is create a generic LoadingOperation abstract class, which has member variables of type Parser and Loader, with parse() and loadData() methods respectively. Parser and Loader are interfaces, and classes that implement them could be XMLParser and JSONParser, LocalLoader and RemoteLoader, etc. With something like this I would like to have a new class extending LoadingOperation for each thing to be loaded, whether that's a local XML file, remote JSON, or whatever.
    The problem is that a specific Parser implementation cannot return custom data types without breaking the polymorphic behavior of the LoadingOperation class. I have been messing around with generics and declaring subclasses of LoadingOperation like
    class SpecificLoader extends LoadingOperation<CustomDataType>
    and doing similar things with Parser classes, but this seems a bit weird.
    Does anyone have any suggestions on what I'm doing wrong / could be doing better? I want to be able to react quickly to changing specifications (ignoring the fact that they shouldn't be changing that much!) and have a logical separation of code, etc.
    Thanks for any help!
    PS: I have also asked this question here: http://stackoverflow.com/questions/4329087/design-pattern-data-loading-solution

    rackham wrote:
    [...] Does anyone have any suggestions on what I'm doing wrong / could be doing better?
    That depends on the specifics.
    The fact that two processes seem similar doesn't mean they are in fact the same. My code editor and Word both seem to be basically the same thing, but I am rather sure that generalizing between the two would be a big mistake.
    And I speak from experience (parsing customer data and attempting to generalize the process).
    The problem with attempting to generalize is that if you generalize functionality that is not in fact the same, you end up with conditional logic all over the place to deal with the differences. Rather than saving time, that actually costs time, because the code becomes more fragile.
    That doesn't mean it isn't possible, but rather that you should ensure the behavior is in fact common before implementing anything.

  • Internal Disk to Disk Data Transfer Speed Very Slow

    I have a G5 Xserve running Tiger, with all updates applied, that has recently started experiencing very slow drive-to-drive data transfer speeds.
    When transferring data from one drive to another (internal to internal, internal to USB, internal to FW, USB to USB, or any other combination of the three), we are only getting about 2 GB/hr transfer speeds.
    I initially thought the internal drive was going bad. I tested the drive and found some minor header issues etc. that could be repaired, so I replaced the internal boot drive.
    I tested and immediately got the same issue.
    I also tried booting from a FW drive and got the same issue.
    If I connect to the server over the Ethernet network, I get what I would expect to be typical data transfer rates of about 20 GB+/hr. That is much higher than the internal rates, and I am copying data from the same internal drives, so I really don't think the drive is the issue.
    I called AppleCare and discussed the issue with them. They said it sounded like a controller issue, so I purchased a replacement MLB from them. After the replacement, data transfer speeds jumped back to normal for about a day, maybe two.
    Now we are back to slow data transfer speeds internally (2 GB/hr) and normal transfer speeds (20 GB+/hr) over the network.
    Any ideas on what might be causing the problem would be appreciated.

    As suggested, check for other I/O load on the spindles, and check for general system load.
    I don't know of a good built-in GUI I/O monitor here (particularly for Tiger Server), though there are iopending and DTrace and Apple-provided [performance scripts|http://support.apple.com/kb/HT1992] with Leopard and Leopard Server. top would show you busy processes.
    Also look for memory errors and memory constraints, and check for anything interesting in the contents of the system logs.
    The next spot after the controllers (it's usually my first "hardware" stop for these sorts of cases, and usually before swapping the motherboard) is the disks that are involved, and whatever widgets are in the PCI slots. Loose cables, bad cables, and spindle swaps. Yes, disks can sometimes slow down like this, and that's not usually a Good Thing. I know you think this isn't the disks, but that's one of the remaining common hardware factors. And don't presume any SMART disk monitoring has predictive value; SMART can miss a number of these cases.
    (Sometimes you have to use the classic "field service" technique of swapping parts and shutting down software pieces until the problem goes away, then work from there.)
    And the other question is around how much time and effort should be spent on this Xserve G5 box; whether you're now in the market for a replacement G5 box or a newer Intel Xserve box as a more cost-effective solution.
    (How current and how reliable is your disk archive?)

  • Regarding ERPI Data Loading

    Dear All,
    I have a few doubts about ERP Integrator.
    1) What is required from Oracle GL to Planning for data loading using ERP Integrator? (Is the trial balance enough, or do we require some other file from Oracle GL?)
    2) Are there any scheduling options available for data loading using ERP Integrator?
    3) What is the process for loading data to Planning using ERP Integrator?
    4) How do we load the data to Planning (i.e. monthly load, hourly load)?
    Can anyone please guide me here?
    Thanks,
    PC

    1) What is required from Oracle GL to Planning for data loading using ERP Integrator? (Is the trial balance enough, or do we require some other file from Oracle GL?)
    Assuming you have the right version of Oracle EBS, ERP Integrator queries the tables within the Oracle EBS database to get the appropriate information. In my case, the trial balance file was enough. Within the trial balance file you will have the appropriate dimension intersection (account, entity, period, etc.), the type of account (asset vs. liability, etc.), and finally the dollar amount.
    2) Are there any scheduling options available for data loading using ERP Integrator?
    Yes. You can use FDQM to map and validate the data, and then load the data via the command line or via the FDQM batch scheduler.
    3) What is the process for loading data to Planning using ERP Integrator?
    I'll do my best to summarize (assuming you are using FDQM): create rules in ERPi -> configure the adapters in the Workbench Client for the ERPi rules -> configure the FDQM Web Client to call the adapters set up in the Workbench Client -> import the data into FDQM. From there you can call your command-line automation for batching if you wish.
    4) How do we load the data to Planning (i.e. monthly load, hourly load)?
    This depends on your business. Assuming you are going to load the data for budget and planning purposes, your business may well be happy with a monthly load (most of the time this is the case). An hourly load might be helpful if you deal with users that need up-to-date actuals. Loading hourly actuals might be overkill for a budget or planning application, but I have run into situations where it is needed, and then found myself worried about speeding up the calculations after the data is loaded. Long story short: you can load monthly or hourly.

  • Issue in Data Loading in BW

    Dear BW GURUS,
    We are facing a data loading issue in BW. When we start the process chain, it shows a job ID, but the data load does not start. We could not trace the log files for it.
    Please throw some light on this. Any pointers would be appreciated.
    Thanks in advance.
    Regards,
    Mohankumar.G

    Hi Mohan,
    By buffering the number ranges, the system reduces the number of database reads on the NRIV table, thus speeding up large data loads.
    The SAP BW system uses number ranges when loading master data or transactional data into BW. The system uses a unique master data number ID for each loaded record. Each new record reads the number range table NRIV to determine the next number in sequence, which ensures that there are no duplicate numbers. Not using unique number range values could compromise the data's integrity.
    Number ranges can cause significant overhead when loading large volumes of data, as the system repeatedly accesses the number range table NRIV to establish a unique number. If a large volume of data is waiting for these values, the data loading process becomes slow.
    One way to alleviate this bottleneck is for the system to take a packet of number range values from the table and place them in memory. This way, the system can assign numbers to many of the new records from memory rather than repeatedly accessing and reading table NRIV. This speeds up the data load.
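    If you want to see the contention point for yourself, the interval table can be inspected directly (normally via SE16 rather than raw SQL). A sketch only; the column list is from the standard NRIV table, but the BIM*/BID* object-name patterns for BW SID and dimension number ranges are an assumption on my part:
    SELECT object, nrrangenr, fromnumber, tonumber, nrlevel
    FROM nriv
    WHERE object LIKE 'BIM%' OR object LIKE 'BID%';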
    Regards,
    Vamsi Krishna Chandolu

  • Load speed: Flex mx:TabNavigator versus Spark s:TabBar

    Today I noticed a large difference between the older mx:TabNavigator and the newer s:TabBar (Spark).
    Summary:
    My test application was built in Flex Builder 4, and I was using my localhost (ASP.NET development environment) for testing.
    Tests I built:
    I built many proofs of concept using Flex objects, UI, and data handlers.
    How this test was created:
    I built two components to be used as children of the main application. The first child component used the Spark TabBar and the other used the mx:TabNavigator. The main application declared these children (using ActionScript) initially, and a List control actually loaded them into the main application.
    Both child components had 10 tabs; 5 of the 10 tabs contained "grand" child components. (None of the components - main application, children, grandchildren - worked with external data.)
    Results:
    The load speed difference between the two controls was very noticeable.
    The Spark component (s:TabBar) averaged 3-5 seconds to load, whereas the mx:TabNavigator only took 1-2 seconds.
    Has anybody else experienced the same results?
    Thanks,
    Doug Luby of Louisiana
    www.douglubey.com
    Search: Flex mx:TabNavigator load speed versus Spark TabBar

    It definitely seems to fit (pretty well).
    My labels in the tabs were of different lengths, so I converted them all to the same length.
    <s:TabBar dataProvider="{viewstackSampleB}" id="sampleBMainTabBar" left="11" top="8" right="11" height="34" fontSize="6"/>
    <mx:ViewStack id="viewstackSampleB" cornerRadius="20" top="40" bottom="10" left="10" right="10">
    <s:NavigatorContent label="Profile___________________" width="100%" height="100%">
    <s:NavigatorContent left="0" top="0" right="0" bottom="0" backgroundColor="#000080">
    <sample:SampleB_Profile id="studentProfile" left="2" top="2" right="2" bottom="2"/>
    </s:NavigatorContent>
    </s:NavigatorContent>
    <s:NavigatorContent label="SampleB_DragNDrop1________" width="100%" height="100%">
    <s:NavigatorContent left="0" top="0" right="0" bottom="0" backgroundColor="#000080">
    <sample:SampleB_DragNDrop1 id="studentDragNDrop1" left="2" top="2" right="2" bottom="2"/>
    </s:NavigatorContent>  
    </s:NavigatorContent>
    <s:NavigatorContent label="SampleB_DragNDrop2________" width="100%" height="100%">
    <s:NavigatorContent left="0" top="-3" right="0" bottom="0" backgroundColor="#AEAEAE">
    <sample:SampleB_DragNDrop2 id="studentDragNDrop2" left="2" top="2" right="2" bottom="2"/>
    </s:NavigatorContent>  
    </s:NavigatorContent>  
    <s:NavigatorContent label="SampleB_PullDataFromParent" width="100%" height="100%">
    <s:NavigatorContent left="0" top="-3" right="0" bottom="0" backgroundColor="#AEAEAE">
    <sample:SampleB_PullingDataParent id="studentPullingDataParent" left="2" top="2" right="2" bottom="2"/>
    </s:NavigatorContent>  
    </s:NavigatorContent>  
    <s:NavigatorContent label="Tab Five__________________" width="100%" height="100%">
    <s:NavigatorContent left="0" top="-3" right="0" bottom="0" backgroundColor="#AEAEAE">
    </s:NavigatorContent>  
    </s:NavigatorContent>  
    <s:NavigatorContent label="Tab Six___________________" width="100%" height="100%">
    <s:NavigatorContent left="0" top="-3" right="0" bottom="0" backgroundColor="#AEAEAE">
    </s:NavigatorContent>  
    </s:NavigatorContent>  
    <s:NavigatorContent label="Tab Seven_________________" width="100%" height="100%">
    <s:NavigatorContent left="0" top="-3" right="0" bottom="0" backgroundColor="#AEAEAE">
    </s:NavigatorContent>  
    </s:NavigatorContent>
    <s:NavigatorContent label="Tab Eight_________________" width="100%" height="100%">
    <s:NavigatorContent left="0" top="-3" right="0" bottom="0" backgroundColor="#AEAEAE">
    </s:NavigatorContent>  
    </s:NavigatorContent>
    <s:NavigatorContent label="Tab Nine__________________" width="100%" height="100%">
    <s:NavigatorContent left="0" top="-3" right="0" bottom="0" backgroundColor="#AEAEAE">
    </s:NavigatorContent>  
    </s:NavigatorContent>
    <s:NavigatorContent label="Tab Ten___________________" width="100%" height="100%">
    <s:NavigatorContent left="0" top="-3" right="0" bottom="0" backgroundColor="#AEAEAE">
    </s:NavigatorContent>  
    </s:NavigatorContent>  
    </mx:ViewStack>
    This appears to have the same load time as the TabNavigator.
    So I have to ask:
    What is a permanent fix for this?
    Where can I update my current build of Flex 4 so my application does not retain this bug?
    Or, another question: what can I exclude in my declarations to fix this bug?
    I would prefer to use the Spark TabBar over the mx:TabNavigator.
    My current build is 4.0.1.277662.

  • Data load problem - BW and Source System on the same AS

    Hi experts,
    I'm starting with BW (7.0) in a sandbox environment where BW and the source system are installed on the same server (same AS). The source system is SRM (Supplier Relationship Management) 5.0.
    BW is working on client 001 while SRM is on client 100, and I want to load data from SRM into BW.
    I've configured the RFC connections and the BWREMOTE users with their corresponding profiles in both clients, added an SAP source system (named SRMCLNT100), installed the SRM Business Content, replicated the DataSources from this source system, and everything worked fine.
    Now I want to load data from SRM (client 100) into BW (client 001) using standard DataSources and extractors. To do this, I created an InfoPackage for one standard metadata DataSource (with data, checked through RSA3 on client 100, the source system). I started the data load process, but the monitor says "No IDocs arrived from the source system" and keeps the status yellow forever.
    Additional information:
    <u><b>BW Monitor Status:</b></u>
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System Response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended.
    Current status
    No Idocs arrived from the source system.
    <b><u>BW Monitor Details:</u></b>
    0 from 0 records - but there are 2 records in RSA3 for this DataSource
    Overall status: Missing messages or warnings
    - Requests (messages): Everything OK
      - Data request arranged
      - Confirmed with: OK
    - Extraction (messages): Missing messages
      - Missing message: Request received
      - Missing message: Number of sent records
      - Missing message: Selection completed
    - Transfer (IDocs and TRFC): Missing messages or warnings
      - Request IDoc: sent, not arrived; data passed to port OK
    - Processing (data packet): No data
    <b><u>Transactional RFC (sm58):</u></b>
    Function Module: IDOC_INBOUND_ASYNCHRONOUS
    Target System: SRMCLNT100
    Date Time: 08.03.2006 14:55:56
    Status text: No service for system SAPSRM, client 001 in Integration Directory
    Transaction ID: C8C415C718DC440F1AAC064E
    Host: srm
    Program: SAPMSSY1
    Client: 001
    Rpts: 0000
    <b><u>System Log (sm21):</u></b>
    14:55:56 DIA  000 100 BWREMOTE  D0  1 Transaction Canceled IDOC_ADAPTER 601 ( SAPSRM 001 )
    Documentation for system log message D0 1 :
    The transaction has been terminated.  This may be caused by a termination message from the application (MESSAGE Axxx) or by an error detected by the SAP System due to which it makes no sense to proceed with the transaction.  The actual reason for the termination is indicated by the T100 message and the parameters.
    Additional documentation for message IDOC_ADAPTER 601 ("No service for system &1, client &2 in Integration Directory"): no documentation exists for message 601.
    <b><u>RFC Destinations (sm59):</u></b>
    Both RFC destinations look fine, with connection and authorization tests successful.
    <b><u>RFC Users (su01):</u></b>
    BW: BWREMOTE with profile S_BI-WHM_RFC (plus SAP_ALL and SAP_NEW temporarily)
    Source System: BWREMOTE with profile S_BI-WX_RFCA (plus SAP_ALL and SAP_NEW temporarily)
    Could someone help?
    Thanks,
    Guilherme

    Guilherme
    I don't see any reason why it's not bringing the data in. Are you doing a full extraction or a delta? If a delta extraction, please check whether the extractor is delta-enabled; sometimes this causes problems.
    Also check this weblog on basic checks for data load errors; it may help:
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    Thanks
    Sat

  • BI 7.0 data load issue: InfoPackage can only load data to PSA?

    BI 7.0 backend extraction gurus,
    We created a generic DataSource on R3 and replicated it to our BI system, then created an InfoSource, the transformation from the DataSource to the InfoSource, an ODS, and the transformation from the InfoSource to the ODS.
    After the transformation between the InfoSource and the ODS is created on this BI system, a new folder called "Data Transfer Process" also appears under this ODS in the InfoProvider view. In the Data Transfer Process, on the Extraction tab, we pick 'Full' in the Extraction Mode field; on the Execute tab there is an 'Execute' button. Clicking this button (note: so far we have not created an InfoPackage yet) appears to conduct the data load, but we find there is no data available even though all the statuses show green (we do have a couple of records in the R3 table).
    Then we tried to create an InfoPackage. On the Processing tab, we find the 'Only PSA' radio button checked and all the others, like 'PSA and then into Data Targets (Package by Package)', dimmed! On the Data Target tab, the ODS can't be selected as a target! There are also some new columns on this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under another column, 'DTP(s) are active and load to this target', there is an inactive picture icon, which is weird since we have already activated the Data Transfer Process! Anyway, we started the data load in the InfoPackage, and the monitor shows the records being brought in, but since 'Only PSA' is checked with all the others dimmed, no data goes to this ODS! Why, in BI 7.0, can 'Only PSA' be checked with all the others dimmed?
    Many new features in BI 7.0! Anyone's idea/experience on how to load data in BI 7.0 is greatly appreciated!

    You don't have to select anything.
    Once the data is loaded to the PSA, the DTP gives you the option of FULL or DELTA: full loads all the data from the PSA, and delta loads only the last load in the PSA.
    Go through these links for lucid explanations:
    Infopackage -
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
    DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    Creating DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
    <b>Prerequisite:</b>
    You have used transformations to define the data flow between the source and target object.
    Creating transformations-
    http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
    Hope it helps
    Chetan
    @CP..

  • Open Hub (SAP BW) to SAP HANA through DB Connection data loading: "Delete Data from Table" option is not working

    Issue:
    I have an SAP BW system and an SAP HANA system.
    SAP BW connects to SAP HANA through a DB connection (named HANA).
    Whenever I create an Open Hub destination of type DB Table using this DB connection, the table is created at the HANA schema level (L_F50800_D).
    I executed the Open Hub service without checking the "Deleting Data from Table" option, and 16 records were loaded from BW to HANA.
    The second time I executed it, 32 records were there (it appends).
    Then I executed the Open Hub service with the "Deleting Data from Table" option checked.
    Now I am getting the short dump DBIF_RSQL_TABLE_KNOWN.
    From SAP BW to SAP BW this works fine.
    Is this option supported through a DB connection or not?
    Please see the attachment along with this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu,
    First of all, thanks for the reply.
    The issue is at the Open Hub level (definition level, Destination tab and field definition): I have already selected that check box, and that is exactly my issue; even though it is selected, the deletion is not performed at the target.
    SAP BW to SAP HANA via the DB connection:
    1. First execution from BW: 16 records; the DTP ran and loaded 16 records to HANA.
    2. Second execution from BW: the HANA side appended, so 16 + 16 = 32.
    3. So I selected the "Deleting Data from Table" check box at the Open Hub level.
    4. Now the DTP throws a short dump: DBIF_RSQL_TABLE_KNOWN.
    Please tell me how to resolve this. Is the "delete data from table" option applicable to HANA?
    Thanks
    Santhosh Kumar

  • Data load times

    Hi,
    I have a question regarding data loads. We have a process chain which includes 3 ODSs and a cube.
    Basically, ODS A gets loaded from R/3, and then from ODS A the data loads into ODS B, ODS C, and Cube A.
    When I went to the monitor screen of this load (ODS A -> ODS B, ODS C, Cube A), the total time shows as 24 minutes.
    We have some other steps in the process chain, where ODS B -> ODS C and ODS C -> Cube 1.
    When I go to the monitor screen of these data loads, the total time shows as 40 minutes.
    I am surprised, because the total run time for the chain itself is 40 minutes, and the chain includes data extraction from R/3 plus ODS activations, indexes, and so on.
    Can anybody throw me some light?
    Thank you all
    Edited by: amrutha pal on Sep 30, 2008 4:23 PM

    Hi All,
    I am not asking which steps need to be included in which chain.
    My question is: when I look at the process chain run time, it says the total time is 40 minutes, and when I go to RSMO to check the time taken for the data load from the ODS to the 3 other data targets, it also shows 40 minutes.
    The process chain also includes ODS activation, building indexes, and extracting data from R/3.
    So what are the times we see when we click on a step in the process chain and display its messages, and what is the time we see in RSMO?
    Let's take an example:
    In process chain A there is a step LOAD DATA, from ODS A -> ODS B, ODS C, Cube A.
    When I right-click and display the messages for the successful load, it shows messages like:
    Job started...
    Job ended...
    The total time here shows 15 minutes.
    When I go to RSMO for the same step, it shows 30 minutes.
    I am confused...
    Please help me!

  • Master data loading failed: error "Update mode R is not supported by the extraction API"

    Hello Experts,
    I load master data for 0Customer_Attr through a daily process chain, and it was running successfully.
    For the last 2 days, the master data load for 0Customer_Attr has failed with the following error message:
    "Update mode R is not supported by the extraction API"
    Can anyone tell me what that error means and how to resolve the issue?
    Regards,
    Nirav

    Hi
    The update mode R error comes up in the following case:
    You run a delta (for master data) which fails due to some error. To resolve the error, you set the load to red and try to repeat it.
    This time the load fails with update mode R, because a repeat delta is not supported.
    So now the only thing you can do is re-init the delta (as described in the posts above) and then proceed. The earlier problem has nothing to do with update mode R.
    For example, suppose your first delta failed with a replication issue: replicating and repeating alone will not solve the update mode R error. You will have to do both, replicate the DataSource and re-init, for the update mode R.
    One more thing I would like to add:
    If the delta failed with an error the first time (not update mode R), then you have to do an init with data transfer;
    if it failed without picking any records, then do an init without data transfer.
    Hope this helps
    Regards
    Shilpa
    Edited by: Shilpa Vinayak on Oct 14, 2008 12:48 PM
