History Load Challenges from 3.x to 7.0

Hello,
We recently upgraded from 3.x to 7.0 for one of our areas, and we need to bring history data, say 100 million records, into 7.0 as part of the history load.
Please advise what challenges I could face during this load. Or if someone has a document, that would be a great help as well.
Thanks = Points
Andy

Your biggest problem during loading will most likely be tablespace issues.  Make sure the basis team has added enough tablespace or changed it to "extend automatically".
Another thing will be data volume.  You may want to break this up into smaller chunks (rather than one 100M-record load).  The reason is that if you load 99M records and record 99M+1 has a special character, you will unhappily be forced to reload.  Sorry, but I would recommend 3-5M load sizes.  If you load too much data, the system will be much slower and the loads will take a lot longer, so a small, fast load is better than a large, extremely slow one.  It's not a 1:1 thing where 3M records take an hour so 9M should take 3 hours; more like 9M may take 20 hours instead.
I think I covered special characters above; make sure it's understood what might come in and that those characters are handled.
Also, try to bring in the data in understandable chunks if possible, like by fiscal year/period, fiscal year, or company code.  That way, if you need to reload a specific period, you don't need to reload the entire dataset.
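The chunking idea can be sketched in a few lines of Python. This is only an illustration of the strategy, not an SAP API: `load_chunk` is a hypothetical stand-in for running one InfoPackage/DTP with a fiscal year/period selection.

```python
# Sketch of the chunked-load idea: split one huge history load into
# per-fiscal-period chunks so that a bad record only forces a reload
# of that one chunk, never the whole 100M-row set.
# `load_chunk` is a hypothetical stand-in for one InfoPackage/DTP run.

def build_selections(years, periods_per_year=12):
    """One selection (chunk) per fiscal year/period."""
    return [(year, period)
            for year in years
            for period in range(1, periods_per_year + 1)]

def run_history_load(years, load_chunk):
    """Run every chunk; collect failures for a targeted reload."""
    failed = []
    for year, period in build_selections(years):
        try:
            load_chunk(year, period)          # load only this slice
        except ValueError:                    # e.g. a special character
            failed.append((year, period))     # reload just this chunk later
    return failed
```

The point of the failure list is exactly Brian's advice: a bad record in one period costs you a 3-5M-row reload, not a 100M-row one.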
Good Luck,
Brian

Similar Messages

  • Load data from one Application to another in BPC NW 7.5

    Hi
    I have a requirement to load data from One application to another application.
    e.g. HR Planning to CostCenter Planning.
    I am aware of a method to load data using a transformation file mapping source and target dimensions and then running a DM package, and this works fine.
    However, the challenge is that if I try to automate this process, I will have to hardcode the application tech name and save it as a variant for use in a process chain. If I hardcode the tech name of the application, it might happen that the tech name changes during full optimization, and the automated load will fail.
    So how do I handle this situation? Are there any standard practices available?
    I'd appreciate your inputs/thoughts.
    Regards.

    Hi,
    Exactly. The main problem is that there is no statement to transfer the data. Please take a look at the below link from help.sap:
    http://help.sap.com/saphelp_bpc75_nw/helpdata/en/f4/33f0bbc46c4a53860f1410f20ffb8b/content.htm
    Since the structure of both applications is the same, you don't require add_dim or rename_dim, etc. However, you definitely need the WHEN statement.
    If you want to transfer all the records, then you can have something like
    *WHEN ACCOUNT
    *IS *
        *REC(EXPRESSION = %VALUE%)
    *ENDWHEN
    Hope this helps.

  • Error loading item from WAD

    Hello,
    I have a problem loading a template from WAD in the browser.
    I created the template and then pushed the Execute button in the browser. I see only the message: Error Loading item*
    Can you help me please.
    Thanks
    Petr

    Hello,
    I created the template with one table, saved it, and executed it in the browser, and again I have the problem with loading the item.
    My problem has a longer history: I used to work on the BWT system normally (I created many templates and had no problems with loading items), and one day I started to work and saw this error message. Now I can't execute any templates (including all that worked correctly before).
    It's possible that I made some change in the WAD settings, but I don't know where....
    Thanks
    Petr

  • Owb11gr2 dimension property: support multiple history loading

    Hi all,
    does anyone know how the "support multiple history loading" feature for dimensions in 11gR2 is supposed to work? I have a scenario that I think this feature can solve, but I got some warning messages when using it. The warnings complain that chunking is enabled for some sources in the mapping but not for others. I don't understand what this chunking warning refers to; how can I get rid of these warnings?
    The scenario which I am trying to solve using this feature is:
    I have a warehouse table called supplier that captures changes. Changes to suppliername in the source table named supplier end up triggering history capturing in this warehouse table.
    Now assuming my warehouse supplier table only has this row:
    suppkey | suppname | suppcode | supptype | suppnature | effectivedate | expirationdate
    1       | terry    | 123      | A        | H          | 31-dec-2009   | null
    What happens is that there is more than one change to the suppliername column for the same row in the source database, for instance:
    update supplier set suppliername = 'james' where suppcode = '123';
    update supplier set suppliername = 'tom' where suppcode = '123';
    update supplier set suppliername = 'jason' where suppcode = '123';
    Thus the corresponding cdc change table/subscriber view will contain 3 UN/UU rows that correspond to the same row in the source table.
    Now when executing the mapping, I want these 3 rows from the subscriber view to be loaded correctly into my warehouse table, triggering history capturing whenever needed.
    Desired warehouse table state:
    suppkey | suppname | suppcode | supptype | suppnature | effectivedate | expirationdate
    1       | terry    | 123      | A        | H          | 31-dec-2009   | 1-jan-2010
    1       | james    | 123      | A        | H          | 1-jan-2010    | 1-feb-2010
    1       | tom      | 123      | A        | H          | 1-feb-2010    | 1-mar-2010
    1       | jason    | 123      | A        | H          | 1-mar-2010    | null
    I tried doing so without enabling the "support multiple history loading" option, but the end result was not totally correct. Does anyone have any advice on this chunking issue?
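    The desired end state described above is standard type-2 (slowly changing dimension) history capture. A minimal sketch of applying several changes in arrival order, using the column names and dates from the example; the logic is an assumption for illustration, not how OWB implements it internally:

```python
from dataclasses import dataclass, replace
from typing import Optional

@dataclass
class Row:
    suppkey: int
    suppname: str
    effectivedate: str
    expirationdate: Optional[str]  # None = current (open-ended) row

def apply_changes(history, changes):
    """Type-2 style: for each (new_name, change_date) pair, expire the
    current row at the change date and append a new open-ended row."""
    for new_name, change_date in changes:
        current = history[-1]                              # open-ended row
        history[-1] = replace(current, expirationdate=change_date)
        history.append(Row(current.suppkey, new_name, change_date, None))
    return history

history = [Row(1, "terry", "31-dec-2009", None)]
changes = [("james", "1-jan-2010"), ("tom", "1-feb-2010"), ("jason", "1-mar-2010")]
apply_changes(history, changes)
```

    Applied in order, the three changes yield the four-row table shown above: each change closes the previous row and opens a new one, which is exactly what "support multiple history loading" needs to do when several UN/UU rows hit the same dimension member in one run.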

    Hi David,
    yes, they are just validation warnings. I tested the mapping and history capturing seems to be working according to what I expected, but I am worried that these warnings may pose problems in unforeseen circumstances.
    And thanks for the link given. It describes perfectly the problem I had been facing when using owb10gr2!

  • Loading data from a remote machine

    Hi,
    I would like to know the process for loading data from a file on a remote machine into an Oracle table. The load file is dumped to a folder on the remote machine and is pipe-delimited. I am using Oracle 11g.
    Can I create a directory in Oracle that maps to the remote location, and create an external table?
    Thanks in advance
    Arun

    The best method would depend on the requirements. Are files regularly created on that server? What are the file sizes? Are files to be kept after loading for history or auditing purposes? Etc.
    You can run SQL*Loader on the remote platform and load the data, across the network, into the database.
    However, if the data set is large, it may be faster to zip that data, scp it across to the Oracle server, and then unzip and load it there.
    If the remote platform is a user platform and the user needs to load data - then that can be done via a web interface. User uploads the file to Oracle (via http). The file is received as a CLOB by the database. PL/SQL code can then process that file - either by parsing it directly, or unloading the CLOB to a local file and then loading that file via an external table.
    You'll need to look at what the requirements are, before looking for a solution.
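    Whichever transport route is chosen, the pipe-delimited format itself is straightforward to pre-parse. A minimal sketch (the three-column layout is invented for illustration, and the actual database insert, e.g. `executemany` with an Oracle driver, is deliberately left out):

```python
import csv
import io

def parse_pipe_delimited(text):
    """Turn pipe-delimited text into a list of tuples, ready for a
    bulk insert; fields are stripped of stray whitespace."""
    reader = csv.reader(io.StringIO(text), delimiter="|")
    return [tuple(field.strip() for field in row) for row in reader if row]

sample = "1|terry |2009-12-31\n2|james|2010-01-01\n"
rows = parse_pipe_delimited(sample)
```

    For large files, though, the answer above stands: SQL*Loader or an external table on the database host will outperform row-by-row client-side loading.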

  • Help in loading data from an infocube to another

    all,
    I have been trying to load data from a consolidation InfoCube in one BW system to a similar InfoCube in another BW system.
    I managed to load the data by identifying the first system as the source system.
    The challenge I am facing is as follows: the first InfoCube has approx. 70,000 records, while when I uploaded the data only approx. 20,000 records got updated.
    I would appreciate help on why this has happened.
    Thanks for your time.
    thanks for your time

    Hi,
    Are you using the Open Hub interface?
    If yes, is the structure/data model of the two InfoCubes the same?
    If both InfoCubes have the same structure, did you apply any custom transformation logic or selection field values in the InfoSpoke definition?
    If all of the above are true, check the status overview of the executed InfoSpokes, grouped by logical target system, destination, InfoSpoke and day.
    Also double-click to access the single request log, then drill down to the record count and errors.
    regards,
    ravi

  • Automation Infocube History load

    Hi,
    Scenario: I have 3 InfoCubes into which we load yearly data, say 2004, 2005 and 2006 data, one year per cube. Now, to load 2007 data, we have to delete the contents of the 2004 cube and put the 2007 records there, so 3 years of history are maintained. When 2008 comes, the 2005 data will be deleted, and so on.
    I know I can do this manually, deleting and populating the records at the start of each year.
    Does anyone have an idea how I can automate this process and maintain it globally, so I can use the same code or method for other cubes as well?
    Can we implement this using decision-making process types in process chains?
    Waiting for your replies, guys.
    Regards
    Ankur Shah

    Hey,
    I have 3 cubes, and the records are loaded from an ODS.
    I need to load data from the ODS to a cube as a daily delta load. For the current year it goes to, say, cube 1; for the years 2005 and 2004, I have loaded the data to cube 2 and cube 3.
    Now when 2007 comes in, I need to delete the contents of cube 3 (the 2004 data) and load it with the 2007 year, daily from the ODS.
    Suppose I include this flow in my process chain; how can I make it target a particular cube? I mean, when 2008 comes, how does the process chain know which cube to delete and where to load the data? Once it knows which cube to delete, I can obviously fill the records.
    Regards
    Ankur
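    The "which cube" decision above reduces to arithmetic on the year, since each year always lands in the same slot of a rolling three-year window. A sketch with hypothetical cube names (the result is what a decision process type in the chain would branch on):

```python
def cube_for_year(year, base_year=2004, cubes=("ZCUBE3", "ZCUBE2", "ZCUBE1")):
    """Rolling 3-year window: 2004 -> ZCUBE3, 2005 -> ZCUBE2,
    2006 -> ZCUBE1, then 2007 overwrites ZCUBE3 (the 2004 data),
    2008 overwrites ZCUBE2, and so on. Delete the returned cube's
    contents, then load the new year into it."""
    return cubes[(year - base_year) % len(cubes)]
```

    Because the mapping depends only on `(year - base_year) mod 3`, the same rule works unchanged every year and can be reused for any other set of cubes by swapping the tuple.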

  • Can't watch or load videos from an SD card

    I have an HP desktop I purchased in 2009. I have always been able to load pictures from my camera with no problem, but recently I have been trying to watch and load videos from my son's cell-phone SD card. It will not let me watch or load them. All I get is "Photo gallery can't open pictures or videos - file format not supported - you don't have updates to photo gallery". I run Windows Vista. I really need some help with this. I have tried to look for updates to fix this but have not found any. Please help!

    Hello t48,
    Welcome to the HP Forums, I hope you enjoy your experience! To help you get the most out of the HP Forums I would like to direct your attention to the HP Forums Guide First Time Here? Learn How to Post and More.
    I understand you are having issues viewing files from your son's phone, yet you are able to view files from your camera with no problem. I suspect this is a difference in software technology and would recommend that you go to the phone manufacturer's website and look in their support forums for a solution. I believe you will need to download a program or codec to view the files in their current format, and that would be the place to look for it.
    I hope I have answered your question to your satisfaction. Thank you for posting on the HP Forums. Have a great day!
    Please click the "Thumbs Up" on the bottom right of this post to say thank you if you appreciate the support I provide!
    Also be sure to mark my post as “Accept as Solution" if you feel my post solved your issue, it will help others who face the same challenge find the same solution.
    Dunidar
    I work on behalf of HP
    Find out a bit more about me by checking out my profile!
    "Customers don’t expect you to be perfect. They do expect you to fix things when they go wrong." ~ Donald Porter

  • Data loading issue from BW Production to BW Sandbox

    Hello Guys,
    I am trying to load data from one InfoCube (Production) to another InfoCube (Sandbox), applying update rules, for a prototype model. I am not getting any error in the monitoring window for the InfoPackage, but what I see is '0 from 0 records' for a long time. I have tested the connection between the two systems and it looks OK. There is data in the Production cube. This is the first time I am loading data between two BW systems, and I am not sure what could be wrong. Can someone please help me ASAP? Thanks a lot.
    The export data source for that cube in Production has been created.
    Kate

    Hi Kate,
    The first thing you might want to check is the status of the extractor job in your Production system. The job name will be the same as the request name, BI_REQU*. So, in SM37 of the Production system, look for the extractor job <request number> and check its status. If it is running, your extraction is still in progress; you can wait some more or monitor the job closely in SM50 to track its progress.
    If it is in cancelled state, you need to check the job log or any short dumps for the error message.
    Regards,
    Sree

  • Unable to load data from an ODS to a Cube

    Hi all,
    I am trying to load data from an ODS to a cube, and I'm getting the following message at the bottom of the monitor screen: "No corresponding requests were found for request REQU_BZUH9ZPYNL5XLQNB7K8IQDKFS in the ODS/Cube", Message no. RSBM043.
    I am unable to load the data. The QM status is yellow. When the process starts, the left-hand side of the monitor shows "0 of 84408 records", which is OK; I have loaded with this package before and it worked well.
    Can you tell me what I have to do? I tried waiting, but it was the same: no progress and no data in the cube.
    Thank you very much and kind regards,
    MM.
    Maybe this helps...
    When I look at the status, it says that I have a short dump in BW. It was CALL_FUNCTION_REMOTE_ERROR, and the short text is "The function module "SUBST_TRANSFER_UPGEVAL" cannot be used for 'remote'". This short dump occurs well before I start the upload.
    Thanks again.

    Hello MM,
    Can you do the following..
    Make the total status red and delete the request from the cube.
    Go to the ODS -> Remove the Data Mart Status -> Try loading it again.
    The error message you get is common when trying to run an InfoPackage whose source object has no new request (meaning all the requests available in the source have already been moved to the targets). So please try the steps given above.
    Hope it will help!

  • To load data from a cube in SCM(APO) system to a cube in BI system.

    Experts,
         Please let me know whether it is possible to load data from a cube in SCM(APO) system to a cube in BI system.If so explain the steps to perform.
    Thanks,
    Meera

    Hi,
    Think of it this way: to load data from any source, we need a DataSource. You can generate an export DataSource for the cube in APO and then use that DataSource for BW extraction; I think it will work. In my case, I'm directly loading data from APO to BW using the DataSources that are generated on the Planning Area.
    Why do you need to take data from the APO cube? Is there any condition attached to it? If it is not mandatory, you can use the same DataSource and load the data to BW. If conditions are applied while loading the data from APO into the APO cube, then check whether the same is possible in BW; if so, use the DataSource and do the same calculation directly in BW.
    Thanks
    Reddy

  • ERROR :  "Exceptions in Substep: Rules" while loading data from DSO to CUBE

    Hi All,
    I have a scenario like this.
    I was loading data from an ODS to a cube daily. Everything was working fine until 2 days ago. I added some characteristic fields to my cube but didn't change my key figures. When I then reloaded data from the ODS to the cube, I got the error "Exceptions in Substep: Rules" after some packages of data had loaded successfully.
    I deleted the added fields from the InfoCube and tried to reload the data again, but I am facing the same error: "Exceptions in Substep: Rules".
    Can any one tell me how to come out from this situation ?
    Regards,
    Komik Shah

    Dear,
    Check some related SDN posts:
    zDSo Loading Problem
    Re: Regarding Data load in cube
    Re: Regarding Data load in cube
    Re: Error : Exceptions in Substep: Rules
    Regards,
    Syed Hussain.

  • Unable to load data from DSO to Cube

    Good morning all,
    I was trying to load data from a DSO to a cube for validation. Before loading the new data, I deleted everything from the DSO and the cube; they contained no requests at all. The cube uses "Delta Update". First the DSO was loaded with 138,300 records successfully; then I activated the DSO. Finally, when I clicked Execute (DSO --> Cube), it loaded 0 records. I was able to load the data yesterday. What might be the reason for this?
    Thank you so much!

    Hi BI User,
    To run a delta upload into a data target, there must be an initialization request in the data target with the same selection criteria.
    So first do the initialization, then perform the delta upload into the cube.
    Regards,
    Subhashini.

  • DTP load data from DSO to CUBE can't debug

    Hi,all expert.
    I ran into a problem yesterday.
    When I load data from the DataSource to the DSO and debug the start routine, it is OK. But when I load data from the DSO to the cube and debug the routine, I get an error like this:
    inconsistent input parameter (parameter: <unknown>, value <unknown>)
    message no: RS_EXCEPTION 101.
    I don't know why. Please help me, thank you very much.

    Hi,
    An alternative is to put a small piece of code with an infinite loop in your start routine. When you start the load, go to SM50 and debug the work process executing the load; from there you can break out of the infinite loop and carry on with the debugging. Please let me know if you require further information.
    Thanks,
    Arminder

  • Error when loading data from DSO to Cube

    Hi,
    After upgrading to 2004s we get the following error message when trying to load data from a DSO to a Cube:
    Unexpected error: RSDRC_CONVERT_RANGES_TO_SELDR
    Has anyone experienced a similar problem or have some guidelines on how to resolve it?
    Kind regards,
    Fredrik

    Hi Martin,
    Thanks!
    Problem solved
    //Fredrik
