Impact of hourly Data load on BW and R/3

Hi...
As of now, the delta load from R/3 to BW takes place every night, but the users say they need it hourly.
Is this practice (hourly data loads from R/3 to BW) commonly done? What would be the impact on BW and R/3 (any performance issues)?
What is the recommended minimum interval between delta loads?
Thanks,
Sai.

I think you have to let the business requirements dictate the answer, as long as there are no system or application constraints.
We have a load that runs 4 times a day during a 9-hour window: once at the beginning to pick up any overnight work, and then about every 3 hours during the business day to pick up new information that is needed.
No discernible impact on R/3 here, but that probably depends on your transaction volume and the efficiency of the extractors used.
The user base does need to understand that loads will occur during the day and that, depending on when they run a query, the results could change - but presumably that is the whole reason you are doing this in the first place.
This does result in more requests, so you want to make sure you are compressing your cubes based on the requests' age rather than the number of requests.
Can't really think of much else.

Similar Messages

  • Master data loads for Attributes and texts failing

    Hello
    Master data loads for attributes and texts are failing. The errors are:
    1. Lock not set for loading master data attributes
    2. Table /BI0/YAccount does not exist (this error is for the 0Account master data attribute load)
    3. Error 1 in the update
    We faced this error a few days ago and rebooting the server resolved it, but it has recurred after 4 days.
    RS12 and SM12 do not show any locks. Activating the InfoObject has also not resolved the error.
    Any insight is appreciated.
    Thanks
    Inder

  • Master Data Loading for Prices and Conditions in CRM - "/SAPCND/GCM"

    Hi,
    Could anyone give me some input on master data loading for prices and conditions in CRM?
    T. Code is:  /SAPCND/GCM
    I need to load data from a file (extracted from 4.6) for service contracts.
    I tried LSMW: recording does not work for this transaction.
    I am also trying to load through IDocs (LSMW), but that is not really working either.
    Do we require custom development for this, or is standard SAP functionality available?
    Can anyone provide some input on this?
    Would appreciate your responses.

    Hi Tiest,
    Thanks for responding.
    You are right, our client is upgrading from 4.6 to ECC.
    As per the client's requirements, we are maintaining all the configuration for Services in CRM.
    The services data from 4.6 is being pulled out into flat files, which need to be loaded into CRM, so the middleware would not be able to do this.
    What I am looking for is some standard upload program.
    LSMW recording does not work.
    With the IDoc "CRMXIF_COND_REC_SLIM_SAVE_M" I am able to load a single record, but I am not able to find out how to make this work for multiple entries.
    Normally, when loading master data through IDocs, we map the values to the standard fields available in that IDoc.
    But in this particular IDoc there is a generic field for which I need to define a field name and a field value.
    So far I am only able to define just one field name and one field value.
    I want this to work for multiple entries.
    Hope you get my point.
    Thanks

  • Data Load (Info package and DTP) performance

    Hi,
    We are planning to perform an initialization, expecting 30 million records.
    To improve the data load performance, I am planning to increase the data package size from the default value to double the default.
    Will that improve the performance?

    Adjusting the InfoPackage size can be helpful...
    But don't just increase the numbers; it all depends on your extract structure size and the source system's memory settings.
    I would suggest doing some trials in the DEV/QA systems with enough data.
    I remember tuning this in production for the CRM 0BPARTNER extractor: we actually reduced the number of records extracted per package to improve performance, and we did see the impact.
    First calculate your current package memory footprint from the number of records extracted per package and your extract structure size, and then experiment from there; it will help, unless you have excessive ABAP code in user exits.
    Let us know if you need further help on this.
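    As a rough illustration of that calculation, here is a minimal sketch; the record width and the candidate package sizes are made-up example values, not recommendations:

      # Rough estimate of the memory footprint of one data package, following the
      # "records per package x extract structure width" rule of thumb.
      # All numbers below are hypothetical examples.

      RECORD_WIDTH_BYTES = 1200                         # assumed width of one record in the extract structure
      CANDIDATE_PACKAGE_SIZES = [10000, 20000, 50000, 100000]

      def package_footprint_mb(records_per_package, record_width=RECORD_WIDTH_BYTES):
          """Approximate memory needed to hold one package in an internal table."""
          return records_per_package * record_width / (1024.0 * 1024.0)

      if __name__ == "__main__":
          for size in CANDIDATE_PACKAGE_SIZES:
              print("%7d records/package ~ %7.1f MB per package" % (size, package_footprint_mb(size)))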

  • Data load problem - BW and Source System on the same AS

    Hi experts,
    I’m starting with BW (7.0) in a sandbox environment where BW and the source system are installed on the same server (same AS). The source system is SRM (Supplier Relationship Management) 5.0.
    BW runs on client 001 while SRM is on client 100, and I want to load data from SRM into BW.
    I’ve configured the RFC connections and the BWREMOTE users with their corresponding profiles in both clients, added an SAP source system (named SRMCLNT100), installed the SRM Business Content, replicated the DataSources from this source system, and everything worked fine.
    Now I want to load data from SRM (client 100) into BW (client 001) using standard DataSources and extractors. To do this, I’ve created an InfoPackage for one standard metadata DataSource (with data, checked through RSA3 on client 100, the source system). I’ve started the data load process, but the monitor says that “no IDocs arrived from the source system” and keeps the status yellow forever.
    Additional information:
    BW Monitor Status:
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System Response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended.
    Current status
    No Idocs arrived from the source system.
    BW Monitor Details:
    0 from 0 records
    – but there are 2 records on RSA3 for this data source
    Overall status: Missing messages or warnings
    -     Requests (messages): Everything OK
    o     Data request arranged
    o     Confirmed with: OK
    -     Extraction (messages): Missing messages
    o     Missing message: Request received
    o     Missing message: Number of sent records
    o     Missing message: Selection completed
    -     Transfer (IDocs and TRFC): Missing messages or warnings
    o     Request IDoc: sent, not arrived ; Data passed to port OK
    -     Processing (data packet): No data
    Transactional RFC (SM58):
    Function Module: IDOC_INBOUND_ASYNCHRONOUS
    Target System: SRMCLNT100
    Date Time: 08.03.2006 14:55:56
    Status text: No service for system SAPSRM, client 001 in Integration Directory
    Transaction ID: C8C415C718DC440F1AAC064E
    Host: srm
    Program: SAPMSSY1
    Client: 001
    Rpts: 0000
    System Log (SM21):
    14:55:56 DIA  000 100 BWREMOTE  D0  1 Transaction Canceled IDOC_ADAPTER 601 ( SAPSRM 001 )
    Documentation for system log message D0 1 :
    The transaction has been terminated.  This may be caused by a termination message from the application (MESSAGE Axxx) or by an error detected by the SAP System due to which it makes no sense to proceed with the transaction.  The actual reason for the termination is indicated by the T100 message and the parameters.
    Additional documentation for message IDOC_ADAPTER 601, "No service for system &1, client &2 in Integration Directory": no documentation exists for this message.
    RFC Destinations (SM59):
    Both RFC destinations look fine, with connection and authorization tests successful.
    RFC Users (SU01):
    BW: BWREMOTE with profile S_BI-WHM_RFC (plus SAP_ALL and SAP_NEW temporarily)
    Source System: BWREMOTE with profile S_BI-WX_RFCA (plus SAP_ALL and SAP_NEW temporarily)
    Could someone help?
    Thanks,
    Guilherme

    Guilherme
    I don't see any reason why it isn't bringing the data over. Are you doing a full extraction or a delta? If a delta extraction, please check whether the extractor is delta-enabled; sometimes this causes problems.
    Also check this weblog on basic checks for data load errors; it may help:
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    Thanks
    Sat
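    As a side note, if you ever want to sanity-check plain RFC connectivity to the source client from outside the system while troubleshooting, a minimal sketch using the pyrfc library could look like the following. The connection parameters are placeholders, and this only exercises the RFC layer; it will not diagnose the missing partner profile or logical system setup that message IDOC_ADAPTER 601 points at:

      from pyrfc import Connection

      # Placeholder connection details for the source client (SRM, client 100).
      conn = Connection(ashost="srm.example.local", sysnr="00",
                        client="100", user="BWREMOTE", passwd="secret")

      # STFC_CONNECTION simply echoes the text back if the RFC path works.
      result = conn.call("STFC_CONNECTION", REQUTEXT="ping from BW troubleshooting")
      print(result["ECHOTXT"])   # should print the request text back
      print(result["RESPTEXT"])  # system information returned by the source system
      conn.close()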

  • Error regarding data loading in Planning and Budgeting Cloud Service

    Hi ,
    I was able to load data into Planning and Budgeting Cloud Service a month ago.
    I loaded it via Administration -> Import and Export -> Import Data from File.
    Today I loaded the same file to the cloud instance, for the same entity, after clearing the existing data.
    I am getting an error during validation itself:
    Unrecognized column header value(s) were specified for the "Account" dimension: "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec", "Jan", "Feb", "Mar", "Point-of-View", "Data Load Cube Name". (Check delimiter settings, also, column header values are case sensitive.)
    I checked the source file and everything is correct. I actually loaded the same file before and it worked.
    Does anyone know the problem behind this? Can anyone give me a suggestion, please?
    Thanks in Advance
    Pragadeesh.J

    Thanks for your response John.
    I had the Period and Year dimensions in the columns.
    I changed the layout, moving the Year dimension to the POV, and loaded the data file; it loaded without any errors.
    I then changed the layout back to the original form and loaded again, and did not get any errors that time either.
    It worked somehow.
    Thank you
    Cheers John
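    If the validation error ever comes back, the two things it complains about (delimiter settings and case-sensitive column headers) can be pre-checked locally before uploading. A rough sketch; the expected header list and the file name are hypothetical examples for a Period-in-columns layout:

      import csv

      # Hypothetical expected trailing column headers for a Period-in-columns load file.
      EXPECTED_HEADERS = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
                          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec",
                          "Point-of-View", "Data Load Cube Name"]

      def unrecognized_headers(path, delimiter=","):
          """Return header values that do not exactly match the expected list
          (the comparison is case sensitive, mirroring the validator)."""
          with open(path, newline="", encoding="utf-8") as f:
              header_row = next(csv.reader(f, delimiter=delimiter))
          tail = header_row[-len(EXPECTED_HEADERS):]
          return [h for h in tail if h not in EXPECTED_HEADERS]

      if __name__ == "__main__":
          bad = unrecognized_headers("data_load.csv")   # hypothetical file name
          print("Unrecognized headers:", bad or "none")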

  • HT1595 It takes literally hours to load a movie and I have a good wireless signal... what's up with that?

    It is taking literally hours for movies to load on Apple TV using both NETFLIX and iTunes.  I have a strong wifi signal and according to the connection test, all is well.  What's up with that?

    The Wi-Fi signal and the connection test don't really give a good indication of where things are at.
    Check speedtest.net to see your actual connection speed; you need at least 6 Mbps consistently for instant HD streaming.
    Check istumbler.net or NetStumbler to check for interference.
    Make sure you are on the ISP's DNS (Settings > Network > Configure DNS > Automatic).

  • My MacBook has been working very slowly lately. It was beach-balling for a while, which stopped when I installed OS X Snow Leopard, but the problem is still there with loading videos: XBMC takes hours to load a movie and YouTube does not work as it used to

    Please help me fix it. It even takes more than two hours to download something that is just 315 MB.


  • Which LKM and IKM to use for fast data loading between MSSQL 2005 and Oracle 11

    Hi,
    Can anybody help us decide which LKMs and IKMs are best for data loading between MSSQL and Oracle?
    The staging area is Oracle. We have to load around 400 million rows from MSSQL to Oracle 11g.
    Best regards,
    Muhammad

    Thanks Ayush,
    You are right, and it dumped the file very quickly, but it is giving an error on the sqlldr call through Jython. I have raised an SR with Oracle to look into it further.
    Thanks again and have a very nice time.
    Regards,
    Muhammad
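    For context, the approach being discussed dumps the MSSQL data to a flat file and then calls SQL*Loader; stripped of the ODI wrapper, that call boils down to something like the sketch below. The sqlldr path, credentials, and control/log file names are hypothetical placeholders:

      import subprocess

      # Hypothetical values; in ODI these come from the topology and KM options.
      SQLLDR = "sqlldr"                        # assumes sqlldr is on the agent host's PATH
      CONNECT = "stage_user/stage_pwd@ORCL"    # placeholder Oracle credentials / TNS alias
      CONTROL_FILE = "/tmp/load_rows.ctl"      # placeholder control file describing the dump file
      LOG_FILE = "/tmp/load_rows.log"

      def run_sqlldr():
          """Invoke SQL*Loader in direct path mode and return its exit code.
          A non-zero exit code signals warnings or errors; details are in the log file."""
          cmd = [SQLLDR, "userid=" + CONNECT, "control=" + CONTROL_FILE,
                 "log=" + LOG_FILE, "direct=true", "errors=0"]
          return subprocess.call(cmd)

      if __name__ == "__main__":
          raise SystemExit(run_sqlldr())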

  • Full and delta Data loads in SAP BI 7.0

    Hi,
    Kindly explain the difference between full and delta data loads into a cube and a DSO.
    When do we select the change log or the active table (with or without archive) with full/delta in a DTP for a DSO?
    Please explain the differences in data load behaviour for the different combinations above.
    With thanks and regards,
    Mukul

    Please search the forums. This topic has already been discussed a lot of times.

  • ODI: How to determine whether the automated load is complete and all data has gone through

    Hi All,
    I am currently executing the scenario below and following http://docs.oracle.com/cd/E26070_11/otn/pdf/integration/E26075_01.pdf for it.
    SRC: AGILE
    TRG: OBIEE
    As per the above, I get out-of-the-box packages which are used and executed. Since this comes out of the box, how do I determine whether my data load is complete and the data has gone through from SRC -> ODM -> TRG?
    My Operator sessions show success, and I don't see any errors in the logs either.
    Any help? If anybody has already experienced the above scenario, please help.
    Cheers,
    Mak

    So who do I believe?
    Blog:  http://www.jamoroki.com
    James King
    about.me/jamoroki

  • Roll back data load

    Hi All,
    I am using Essbase 7.1.5. Is there any way to roll back the database if the load hits an error? For example, the source file (a flat file) contains 1,000 records; at record 501 Essbase finds an error, rejects that record, and aborts the load. In this case, is there a way to roll the database back to its state before the 500 records were loaded? If so, where should I make those settings? Please advise.
    Thanks in advance.
    Hari

    A 6-hour data load? I've never heard of such a thing. It sounds to me like the data should be sorted for a more optimal data load.
    As for the two-stage approach: this assumes you are loading variances (deltas) against the existing values rather than replacing the data itself. It further assumes you can create an input-level cube to handle the conversion. You keep your base data in one scenario and your variances/deltas in another; you can reload your deltas (as absolutes) at any time and derive the absolutes from the sum of the base and variance values.
    - Scenario
    -- Base (+) <--- this gets recalculated when the deltas are considered "good"
    -- Delta (+) <--- this gets loaded for changes only, and is reset to zero when the base is recalculated
    You export the modified data from this cube to your existing/consolidation cube. If you can "pre-load" the changes (outside the calc window for the main cube), you can optimize the calculation window, although if it takes 6 hours just to load the database your calc window is probably shot no matter what you do.
    However, if you mean that the load AND the calc take 6 hours, and loading the data alone is relatively quick, this can be a performance enhancement: you can recalc this "input cube" in seconds from a new, complete load, compared with the "in place" reset-and-reload-changes approach in your existing cube.
    You are essentially redirecting your data into a staging table, and the staging table handles the conversion of variances to absolutes, so you can make the process more efficient overall (it is often more efficient to break the process up into smaller pieces).
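    To make the base/delta pattern above concrete, here is a small, purely illustrative sketch; the member names and numbers are made up, and in the real cube this derivation would of course be done with scenarios and a calc rather than Python:

      # Illustrative only: mimics the Base/Delta scenario pattern described above.
      base = {"East-Sales": 1000.0, "West-Sales": 750.0}    # last "good" absolute values
      delta = {"East-Sales": 50.0, "West-Sales": -25.0}     # variances loaded since then

      def current_absolutes(base, delta):
          """Absolute values are always derivable as base + delta."""
          keys = set(base) | set(delta)
          return {k: base.get(k, 0.0) + delta.get(k, 0.0) for k in keys}

      def promote_deltas(base, delta):
          """When the deltas are considered good, fold them into base and reset delta to zero."""
          for k, v in delta.items():
              base[k] = base.get(k, 0.0) + v
              delta[k] = 0.0

      print(current_absolutes(base, delta))   # {'East-Sales': 1050.0, 'West-Sales': 725.0}
      promote_deltas(base, delta)
      print(base, delta)                      # deltas are now zero, base holds the new absolutes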

  • Essbase data load process never terminating

    Hi
    We are using Essbase 11.1.2.1.
    We are trying to load a data file into a BSO Essbase application (Essbase export format, so no load rule). We are using the "execute in background" option.
    If the file contains no unknown members, it loads correctly.
    If the file contains an unknown member, the data load never terminates: we killed it after more than 15 hours, the request was still in "terminating", and we had to kill the Essbase process on the server. There is no error message or file.
    Thanks in advance for your help
    Fanny

    Hi
    I am trying to restart this discussion because we still have the same problem, no idea why (no idea from Oracle Support either), and my client is waiting for an explanation :-/
    So, some more details:
    Context:
    I have an export (Essbase format) from a database.
    If I import this file into a database that has all the members used in the file => no problem. The data load takes around 10 seconds.
    If I import this file into a database that has all the members used in the file except one => problem. The data load never ends and I have to kill the Essbase application on the Essbase server.
    I have done another test:
    I imported the file into a database with all the members used in the file (so no problem).
    I exported the level-0 data in column format.
    I imported this new file into the database with all the members used in the file, using a load rule => no problem. The data load takes around 10 seconds, with a dataload.err generated.
    So, John, you are right, the client should really use a column-format export. But I need to explain to him why he cannot (not just should not) use an export-format file.
    If using an export format is possible, I have to identify the problem and make the right modification to solve it.
    If it is not possible, then this is a bug, it is no longer my responsibility, and I can close the project!
    Thanks in advance for your great ideas!
    Fanny

  • Data load status stays yellow

    Hi,
    My BW platform is BW 7.01 with SP 5. I am loading ECCS data hourly with the 3.5 method, which includes a transfer rule and an update rule. Most of the time the data loads complete successfully. The total volume is about 180,000 records, extracted in 4 packets. Once in a while, randomly, one of the data packets stays yellow and the load cannot complete. But in the next hourly load the data loads successfully, so we know it is not a data issue. Does anyone know why the data load is not consistent? Any suggestions are much appreciated.
    Thanks for your suggestions,
    Frank

    Hi Frank,
    This might be because some of the tRFCs or IDocs got hung.
    Check whether the source system job finished or not.
    If the source system job completed, check the tRFCs and IDocs to see whether any are hung.
    To check the job status: in the monitor screen, choose Environment > Job Overview > In the Source System (enter the ID and password).
    Once that is done, to check the tRFCs:
    InfoPackage > Monitor > Environment > Transact. RFC > In the Source System: this displays all the tRFCs. Look for hung tRFCs and try to flush them manually (F6).
    If IDocs are hung, process the IDocs manually.
    Regards
    KP

  • Impact of Changing Data Package Size with DTP

    Hi All,
    We have a delta DTP to load data from a DSO to an InfoCube. The default data package size for the DTP is 50,000 records.
    Due to the huge volume of data, the internal table memory is exhausted and the data load fails.
    We then changed the data package size to 10,000, and the data load executed successfully.
    The DTP with a package size of 50,000 took 40 minutes and failed, while the DTP with a package size of 10,000 took 15 minutes (for the same amount of data).
    Please find my questions below:
    Why does a DTP with a bigger package size run longer than a DTP with a smaller package size?
    Also, by reducing the standard data package size from 50,000 to 10,000, will we impact any other data loads?
    Thanks

    Hi Sri,
    If your DTP is taking more time, check your transformation.
    1. Transformations with routines always take more time, so if you want to reduce the execution time, the routines should be optimized for good performance.
    2. Also check whether you have filters at the DTP level. Filters can make the DTP take a long time; if the same data is filtered at routine level, it takes much less time.
    3. If you cannot change the routine, you can set semantic keys on your DTP. The package data will be sorted according to the semantic keys, which may help the routine process faster.
    4. Your routine is failing due to internal table memory, so check whether you have a SELECT statement in the routine without a FOR ALL ENTRIES IN RESULT_PACKAGE or SOURCE_PACKAGE clause. Using it will reduce the record count.
    5. Wherever possible, delete duplicate records and, if possible, filter out unneeded data in the start routine itself.
    6. Refresh (clear) internal tables once their data is no longer needed. If your tables are global, the data is kept across routine calls, so refreshing them helps reduce memory usage.
    7. The maximum memory that can be occupied by an internal table (including its internal administration) is 2 gigabytes; a more realistic figure is up to 500 megabytes.
    8. Also check the number of jobs running at that time. If many jobs are active at the same time, less memory is available and the DTP may fail.
    Why does a DTP with a bigger package size run longer than a DTP with a smaller package size?
    Start and end routines work at package level, so the routine runs for each package one by one. By default, a package contains data sorted by key (the non-unique keys, i.e. characteristics, of the source or target), and by setting semantic keys you can change this grouping. So a package with more data takes more processing time than a package with less data.
    By reducing the standard data package size from 50,000 to 10,000, will it impact any other data loads?
    It will only impact the running of that load, although if many other loads run simultaneously the server can allocate more memory to them. So before reducing the package size, check whether it actually helps routine performance (start and end) or just increases overhead.
    Hope these points are helpful.
    Regards,
    Jaya Tiwari
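    As a rough, non-SAP illustration of point 3 and the package-level answer above, the sketch below shows how records that share a semantic key stay together in one package, so package sizes (and therefore per-package routine run times) vary with the grouping. All names and numbers are made up:

      from itertools import groupby
      from operator import itemgetter

      # Hypothetical source records: (semantic_key, value)
      records = [("C1", 10), ("C1", 20), ("C2", 5), ("C2", 7), ("C2", 9),
                 ("C3", 1), ("C4", 4), ("C4", 6)]

      def build_packages(rows, max_size):
          """Split rows into packages of roughly max_size records, but never split a
          semantic-key group across packages (loosely mirroring DTP semantic groups)."""
          packages, current = [], []
          rows = sorted(rows, key=itemgetter(0))      # semantic keys define the sort order
          for _, group in groupby(rows, key=itemgetter(0)):
              group = list(group)
              if current and len(current) + len(group) > max_size:
                  packages.append(current)
                  current = []
              current.extend(group)
          if current:
              packages.append(current)
          return packages

      if __name__ == "__main__":
          for i, pkg in enumerate(build_packages(records, max_size=3), start=1):
              # A bigger package means more work for each start/end routine call.
              print("package %d: %d records -> %s" % (i, len(pkg), pkg))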
