Regarding Data Loads

Hi Gurus,
I want to know in which real-time scenarios we'll use a "Repair Full Request", "Initialization without Data Transfer" and "Early Delta Initialization", and when we can perform "selective deletion". I know these things theoretically, but I don't know when and how we use them in real time...
Please help me and I'll assign the points...
Regards
Ravi

Hi ,
Just to add more information ,
A repair full load is only relevant for ODS objects when you need to do a full load from a DataSource which has already been initialized. If you have done an init and then do a simple full load, the request will not activate; that is why the repair full option works here. For the cube you can also mark the request as repair, but in any case it will load into the cube. The only thing you need to remember is that this will not do the job of removing some records and replacing them with the repaired data.
You can selectively delete from the cube, and then do a full load with the required selections.
Look at OSS Note 739863 'Repairing data in BW' for all the details!!
Go for a Repair Full Request. In order not to disturb your delta selections you need to do the repair full load. After your repair full load you can continue loading deltas normally.
Create a new InfoPackage, mark it as Repair, then choose full upload and execute. There is no repair flag for deltas, and we will not lose any deltas.
Generally we go for initialization without data transfer in a situation like this: suppose you have a load which goes to many targets, and that load fails in one target. You might think you can just run the init for that one target, but ideally you would have to run the init for all the targets. To avoid this, we generally do the initialization only for the particular target where it is required, and after that run an init without data transfer for all the targets to avoid delta failures. Or, if you have already done a full load and now need to run the initialization, we prefer an init without data transfer over one with data, to save time.
We go for selective deletion when the data is compressed, or when you have a huge amount of data and need to delete only some of the requests; then we generally prefer selective deletion instead of deleting the entire data.
Regards,
mallikarjun

Similar Messages

  • Error regarding data load into Essbase cube for Measures using ODI

    Hi Experts,
I am able to load metadata for dimensions into the Essbase cube using ODI, but when we try the same for loading data for Measures we encounter the following errors:
    Time,Item,Location,Quantity,Price,Error_Reason
    '07_JAN_97','0011500100','0000001001~1000~12~00','20','12200','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
    '28_JAN_97','0011500100','0000001300~1000~12~00','30','667700','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
    '28_JAN_97','0011500100','0000001300~1000~12~00','500','667700','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
Can anyone look into this and reply quickly, as it's an urgent requirement?
    Regards,
    Rohan

    We are having a similar problem. We're using the IKM SQL to Hyperion Essbase (DATA) knowledge module. We are mapping the actual data to the field called 'Data' in the model. But it kicks everything out saying 'Unknown Member [Data] in Data Load', as if it's trying to read that field as a dimension member. We can't see what we missed in building the interface. I would think the knowledge module would just know that the Data field is, um, data; not a dimension member. Has anyone else encountered this?
    Sabrina

  • Error Regarding Data loading in Planning and budgeting cloud service.

    Hi ,
I was able to load data in Planning and Budgeting Cloud Service a month ago.
I loaded via Administration -> Import and Export -> Import Data from File.
Today I loaded the same file to the cloud instance, for the same entity I loaded before, after clearing the existing data.
I am getting an error during validation itself:
Unrecognized column header value(s) were specified for the "Account" dimension: "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec", "Jan", "Feb", "Mar", "Point-of-View", "Data Load Cube Name". (Check delimiter settings, also, column header values are case sensitive.)
I checked the source file and everything is correct. I actually loaded the same file before and it worked.
Does anyone know the problem behind this? Can anyone give me a suggestion, please?
    Thanks in Advance
    Pragadeesh.J

    Thanks for your response John.
I had the Period and Year dimensions in the columns.
I moved the Year dimension from the columns to the POV and loaded the data file. I was able to load without any errors.
Then I changed the layout back to the original form and loaded again. I didn't get any errors this time either.
It worked somehow.
    Thank you
    Cheers John
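For reference, the importer expects the Period members as the data columns, with the remaining dimensions carried in the "Point-of-View" column; a minimal sketch of such a file (all member names here are hypothetical) looks like:

```text
Account,Apr,May,Jun,Jul,Aug,Sep,Oct,Nov,Dec,Jan,Feb,Mar,Point-of-View,Data Load Cube Name
Sales,100,110,120,130,125,140,150,145,160,155,170,165,"FY15,Actual,Final,E100",Plan1
```

If the driver dimension configured for the load does not match the file layout, the header row gets validated against the wrong dimension (here Account), which produces exactly this kind of message.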

  • Regarding data loading

    Hi Friends,
I am loading data from an Oracle DB Connect source to the BI system.
While loading, I am getting the error below:
'Error occurred in the data selection'.
Please, can anyone help me?
    Thanks & Regards,
    ramnaresh.p.

    Hi,
There is no single usual cause for a job cancellation in the source system; it could be due to a lot of issues:
1> Time limit exceeded, memory overflow, currency conversion, message type X etc. All of these can be traced through ST22, or in the job overview via the entry against the ARFC.
2> It might be due to a connection error with the source system, which can be traced through SM59 (RFC connection test).
3> The reason for the job cancellation will be clearly highlighted in the job log for the corresponding job, either in SM37 or in the job overview in the Environment drop-down.
4> It could be due to an extractor problem; track it in RSA3.
5> It could be due to a stuck tRFC/IDoc; check SM58.
6> The InfoPackage specified in the question belongs to LO, so try going to LBWE and checking whether the extract structure is active, or whether there were any enhancements to the DataSource which left it inactive.
All of these are symptoms of the same thing; if you can pin down any error in particular using the above, then we can proceed to the solution.
    Regards,
    Raj

  • Regarding Data load in cube

    Hello Experts,
While loading data from an ODS to a cube, I am getting this error while executing the DTP: "Exceptions in Substep: Rules"
    Please Help me.
    Thanks in advance

I think you are using 7.0 and doing full loads with the DTP. If you go into the error, it will say duplicate records were found. The issue happens because your DTP is set to full load from the DSO to the cube; make it delta and reload the cube all over again, and it should work.

  • Query regarding data loading from xls

    Hi
I want to read data (integers, only one column) from an xls file. I do not want to load it into a table, otherwise I could have tried using the loader. What I need to do is this: I have lakhs of rows in an Excel sheet and I need to pick them up in a query. I cannot create a table either, as I am working on production. Is there any way I can pick the values directly from the Excel sheet? The volume is large, so I cannot keep them in an IN (...) clause either.

Lakhs of rows! You do realise that an Excel spreadsheet is limited to 65536 rows? I'm right in thinking 1 lakh = 100,000, aren't I?

  • URGENT: regarding data load in essbase through ODI after upgradation

    Hi,
I have got this error. Please give me the solution...
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
    File "<string>", line 29, in <module>
         at org.hsqldb.jdbc.Util.sqlException(Unknown Source)
         at org.hsqldb.jdbc.JDBCStatement.fetchResult(Unknown Source)
         at org.hsqldb.jdbc.JDBCStatement.executeQuery(Unknown Source)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
    java.sql.SQLException: java.sql.SQLException: unexpected token: -
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
         at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
         at java.lang.Thread.run(Thread.java:662)
    Thanks,
    Rubi

I had used a condition on the data column.
If I delete that condition and map the data column directly, the interface works fine.
But whenever I use a normal CONVERT function, like CONVERT("Source column name", Numeric), it gives me that error.
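For what it's worth, an "unexpected token: -" from the staging database usually means the generated SQL contains an identifier or expression it cannot parse, for example a hyphen in an unquoted column name. One thing to try (a sketch only; the column and table names are hypothetical, and whether it helps depends on your staging technology) is the ANSI CAST syntax with a quoted identifier instead of CONVERT:

```sql
-- A hyphenated source column must be quoted, or the parser reads
-- SRC-COL as the subtraction SRC - COL and stops at the '-' token.
SELECT CAST("SRC-COL" AS NUMERIC) AS quantity
FROM   src_table;
```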

  • Data load times

    Hi,
I have a question regarding data loads. We have a process chain which includes 3 ODS objects and a cube.
Basically ODS A gets loaded from R/3, and then from ODS A it loads into two other ODS objects (ODS B and ODS C) and Cube A.
When I went to the monitor screen of this load (ODS A -> ODS B, ODS C, Cube A), the total time shows as 24 minutes.
We have some other steps in the process chain where ODS B -> ODS C and ODS C -> Cube 1.
When I go to the monitor screen of these data loads, the total time shown for them is 40 minutes.
I am surprised, because the total run time for the chain itself is 40 minutes, and the chain also includes the data extraction from R/3, the ODS activations, the indexes...
Can anybody throw me some light?
Thank you all
Edited by: amrutha pal on Sep 30, 2008 4:23 PM

    Hi All,
I am not asking which steps need to be included in which chain.
My question is: when I look at the process chain run time it says the total time is 40 minutes, and when I go to RSMO to check the time taken for the data load from the ODS to the 3 other data targets, it also shows 40 minutes.
But the process chain also includes ODS activation, building indexes, and extracting data from R/3.
So what is the time we see when we click on a step in the process chain and display its messages, and what is the time we see in RSMO?
Let's take an example:
In process chain A there is a step LOAD DATA from ODS A -> ODS B, ODS C, Cube A.
When I right-click and display the messages for the successful load, it shows all the messages like
Job started...
Job ended...
and the total time shown here is 15 minutes.
When I go to RSMO for the same step it shows 30 minutes...
I am confused...
Please help me.

  • Data load from R/3....(problem)

    Hello all,
    I have a query regarding data load.
    Business scenario  is :
1) I had taken an "init delta" 2 months ago and had started deltas at that time.
2) Somehow I needed to delete the whole data.
3) Present RSA7 in R/3: Delta queue: 0 records; Repeat delta: 53000 records.
4) Reloaded the whole data to date in BW through a process chain (full load).
5) Now my question is:
Should I do the "init delta" again and then start deltas?
Will there be any missing data?
What will happen to the repeat delta queue data?
Please tell me the steps I should take.
    Thanks...

    hello,
    One more thing .....
I have to load fiscal-period-wise.
April 2007 is 001.2007, with 013.2006 as a special period.
Now I have loaded up to March 2007 (012.2006).
So before starting deltas, is this sequence for loading the data correct?
1) 013.2006 to 016.2006 (data for these periods may or may not exist)
2) 001.2007 to 002.2007 (current open month) (full load)
3) Then init delta, then delta?
Please help me, it's urgent.
    Thanks....

  • Related Data Loading

    Hi,
I am new to BI and am going to work on a support project, mainly monitoring the process chains and data loads.
So please, can anyone tell me what things I have to keep in mind for data loading, and share some real-time scenarios and issues regarding data loading?
    <removed by moderator>.
    <removed by moderator>.
    Thanks & Regards,
    San
    Edited by: Siegfried Szameitat on Nov 17, 2008 12:27 PM

    Hi,
    Take a look at  the links below.
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/08f1b622-0c01-0010-618c-cb41e12c72be
    Data load errors - basic checks
    Process Chain Errors
    Regards.

  • Regarding MM Data Loads.

    Hi All,
I am loading data for the MM cubes, but found that many DataSources like 2LIS_02_CGR, 2LIS_02_SCN and 2LIS_02_SGR are blank and do not have any data. I suspect I forgot to do something in R/3 to get the data. I have also populated the setup tables.
Is there any sequence for loading data through these DataSources? For a few DataSources like 2LIS_02_SCL, lakhs of records are transferred but 0 records are added.
    Please help me. I will assign points.
    Thanks and Regards,
    Sangini.

    Hi Sangini,
    Refer to following link
    http://help.sap.com/saphelp_nw04/helpdata/en/8d/bc383fe58d5900e10000000a114084/frameset.htm
It refers to the SAP standard sequence of loading data through a process chain.
    Hope it helps.
    Regards
    Mr Kapadia

  • Data load error regarding timestamp

    Hi All,
I am getting a data load error related to time.
I am using 2LIS_02_SCN. For the CNFTM (timestamp) field, some records come with the value "08:00:", which needs to be converted to "08:00:00", while the rest of the records come with a normal 6-digit time.
Please send your suggestions as early as possible.
    Thanks
    Rupa

    Hi Rupa,
Try the following sample code in the transfer rule for that InfoObject and it should work:
DATA lv_len TYPE i.
lv_len = strlen( str ).
IF lv_len = 4.
  CONCATENATE str '00' INTO str.
ENDIF.
    Regards
    Kapadia

  • Regarding master data loading for different source systems

    Hi Friends,
    I have an issue regarding master data loading.
We have two source systems: one is 4.6C and the other is ECC 6.0.
First I am loading the master data from 4.6C to BI 7.0.
Now this 4.6C is being upgraded to ECC 6.0.
In both 4.6C and ECC 6.0 the master data is changing.
After some time there will be no 4.6C; only ECC 6.0 will remain.
Now if I load master data from ECC 6.0 to BI 7.0, what will happen?
Is it possible?
    Could you please tell me?
    Regards,
    ramnaresh.

    Hi ramnaresh porana,
Yes, it's possible. You can load data from ECC.
The data will not change. You may get more fields in the DataSource on the R/3 side, but on the BW/BI side there is no change; the mappings and structures are the same, so the data is the same too.
You do need to take care of the deltas before and after the upgrade.
    Hope it Helps
    Srini

  • Regarding ERPI Data Loading

    Dear All,
    I have few doubts on ERP Integrator.
1) What is required from Oracle GL to Planning for data loading using ERP Integrator? (Is a trial balance enough, or do we require some other file from Oracle GL?)
2) Are there any scheduling options available for data loading using ERP Integrator?
3) What is the process for loading the data to Planning using ERP Integrator?
4) How do we load the data to Planning? (i.e. monthly load, hourly load)
    Anyone please guide me in this situation.
    Thanks,
    PC

1) What is required from Oracle GL to Planning for data loading using ERP Integrator? (Is a trial balance enough, or do we require some other file from Oracle GL?)
Assuming you have the right version of Oracle EBS, ERP Integrator queries the tables within the Oracle EBS database to get the appropriate information. In my case, the trial balance file was enough. Within the trial balance file you will have the appropriate dimension intersection (account, entity, period, etc.), the type of account (asset vs. liability, etc.) and finally the dollar amount.
2) Are there any scheduling options available for data loading using ERP Integrator?
Yes. You can use FDQM to map and validate the data, and then use the FDQM batch scheduler to load it, either from the command line or on a schedule.
3) What is the process for loading the data to Planning using ERP Integrator?
    I'll try to do my best to summarize. (Assuming you are using FDQM) Create rules in ERPi -> Configure the adapters in the Workbench Client for the ERPi Rules -> Configure the FDQM Web Client to call the Adapters set in the Workbench Client -> Import the data into FDQM. Then from here you can call your command line automation for batching if you wish.
4) How do we load the data to Planning? (i.e. monthly load, hourly load)
This depends on your business. Assuming you are going to load the data for budget and planning purposes, your business may be happy with a monthly load (and most of the time this is the case). An hourly load might be helpful if you deal with users that need up-to-date actuals. Loading actuals hourly might be overkill for a budget or planning application, but I have run into situations where it is needed, and then found myself worried about speeding up the calculations after the data is loaded. Long story short, you can load monthly or hourly.

  • Regarding Master Data Loading by using Process Chain

Hi,
Can anyone tell me the step-by-step process, and which processes and process types are used, for loading master data and transaction data into an ODS and into an InfoCube?
I'll assign maximum points.
    bye
    rizwan

Hi Mohammad,
    1. Master Data loading:
    http://help.sap.com/saphelp_nw04/helpdata/en/3d/320e3d89195c59e10000000a114084/content.htm
    2. Transactional data to ODS:
    http://help.sap.com/saphelp_nw04/helpdata/en/ce/dc87c1d711f846b34e0e42ede5ebb7/content.htm
    3. Transactional data to CUBE:
    http://help.sap.com/saphelp_nw04/helpdata/en/d5/e80d3dbd82f72ce10000000a114084/content.htm
    Hope it Helps
    Srini
