ODI data loading Speed

Hi All,
The speed of my ODI data loading step is very low.
I am using:
LKM = LKM SQL to Oracle,
CKM = CKM SQL,
IKM = IKM SQL Incremental Update
for replication from SQL Server to Oracle.
I don't use flow control in the interfaces.
SQL Server and the Oracle database are installed on the same server.
How can I make this faster?

If the two database servers are on the same machine and you are dealing with bulk data, you should use an LKM that uses bulk methods: BCP to extract the data from SQL Server and external tables to get the data into Oracle. Something like this KM: https://s3.amazonaws.com/Ora/KM_MSSQL2ORACLE.zip (which is actually an IKM, but the LKM approach is not so different).
Hope it helps.
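As a hedged illustration of that bulk path (all table, column, file, and directory names below are made up for the example), the two stages look roughly like this:

```sql
-- Stage 1 (run at the OS level): export from SQL Server with BCP.
-- bcp mydb.dbo.src_table out C:\stage\src_table.dat -c -t"|" -S localhost -T

-- Stage 2 (run in Oracle): expose the flat file as an external table,
-- then load from it with a direct-path INSERT ... SELECT.
CREATE TABLE src_table_ext (
  id   NUMBER,
  name VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY stage_dir          -- directory object pointing at C:\stage
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
  )
  LOCATION ('src_table.dat')
);

INSERT /*+ APPEND */ INTO target_table
SELECT id, name FROM src_table_ext;
COMMIT;
```

The KM linked above packages these stages as KM steps; the sketch only shows the underlying mechanism.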

Similar Messages

  • Data Load Speed

    Hi all.
    We are starting the implementation of SAP at the company I work for, and I have been designated to prepare the data load from the legacy systems. I have already asked our consultants about data load speed, but they didn't really answer what I need.
    Does anyone have statistics on data load speed using tools like LSMW, CATT, eCATT, etc., per hour?
    I know that the speed depends on what data I'm loading and also on the CPU speed, but any information is helpful to me.
    Thank you and best regards.

    hi friedel,
    Again, here are the complete details regarding the data transfer techniques:
    <b>Call Transaction:</b>
    1. Synchronous processing
    2. Synchronous and asynchronous database updates
    3. Data is transferred for an individual transaction each time the CALL TRANSACTION statement is executed
    4. No batch input log is generated
    5. No automatic error handling
    <b>Session Method:</b>
    1. Asynchronous processing
    2. Synchronous database updates
    3. Data is transferred for multiple transactions
    4. A batch input log is generated
    5. Automatic error handling
    6. SAP's standard approach
    <b>Direct Input Method:</b>
    1. Best suited for transferring large amounts of data
    2. No screens are processed
    3. The database is updated directly using standard function modules, e.g. see the program RFBIBL00
    <b>LSMW:</b>
    1. A code-free tool which helps you transfer data into SAP
    2. Suited for one-time transfers only
    <b>CALL DIALOG:</b>
    This approach is outdated; you should choose one of the techniques above.
    Also check the knowledge pool for more references:
    http://help.sap.com
    Cheers,
    Abdul Hakim

  • How can I increase the data loading speed

    Hi Expert,
    I need to populate a table across a dblink from another database each night. There are about 600,000 rows, and it takes about two hours. These are the steps:
    delete from target_table;
    insert into target_table
    select * from source_table@dblink;
    commit;
    I can't use TRUNCATE, because if the load fails I still need the old data.
    How can I increase the loading speed?
    Thanks!

    DELETE and INSERT /*+ APPEND */ aren't a good combination, as the high water mark will keep going up.
    With a trivial number of rows like this I would not expect the delete or insert to be the problem. (How long does the delete take anyway?) It's more likely to be to do with the query over the db link. Is it just one table or is that a simplified example? Can you check the equivalent insert over on the remote database? If it's much faster I would investigate the network connection.

  • ODI Data load fails everytime with different resons

    Hi All,
    We are loading data from Oracle Views to MSSQL tables using ODI 11g.
    One of the interfaces in our package fails every time, at different steps and with different errors: temp space, agent failed, connection closed.
    It has 15 sources (Oracle views) which are unioned to load the target table, and one of the views takes 2 hours to query the database.
    This interface should take approximately 4 hours based on our previous loads, but it hangs, runs for hours, and then errors out.
    Can anyone help me with this?

    Hi,
    What you could do is create a new procedure, then add a new command and set the technology to Operating System.
    Then use something like:
    essmsh -D /<enterlocation>/ordclract.mxls 3416683,1342131001
    You might need to give the full path to essmsh as well.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Data load into Essbase (append instead of overwrite)

    Hello,
    We are loading data from an Oracle table to a target Essbase cube. How do we configure the ODI data load to append to the last value instead of overwriting it?
    For example, we have a data source with an M:1 mapping, so we incorporated a CASE statement [CASE WHEN Group A, B, C THEN D]. Is there a setting in ODI that allows data to be appended (added) instead of overwritten?
    Currently, only the data value of C is loaded into D, instead of A+B+C into D.
    Thanks.

    You can put the CASE WHEN in the target mapping and still use a load rule; a load rule has nothing to do with what you do in the target mappings.
    Cheers
    John
    http://john-goodwin.blogspot.com/
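    Another option, independent of the load rule, is to aggregate in the source query so that only one record per target member reaches Essbase; then there is nothing left to overwrite. A sketch with made-up table and column names:

```sql
-- Collapse the M:1 mapping in SQL: A, B and C are summed into one D record
-- instead of arriving as three records that overwrite each other.
SELECT CASE WHEN grp IN ('A', 'B', 'C') THEN 'D' ELSE grp END AS member,
       SUM(data_value)                                        AS data_value
FROM   src_table
GROUP  BY CASE WHEN grp IN ('A', 'B', 'C') THEN 'D' ELSE grp END;
```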

  • Data load happening terribly slow

    Hi all,
    I opened up the quality server and made a change at the definition level (removed its compounding attribute) to an InfoObject present in the InfoCube.
    The InfoObject, InfoCube, and update rules are all active.
    When I scheduled the load now, there is a huge variation in the data load speed.
    Before: 3 hours for 60 lakh records
    Now: 3 hours for 10 lakh records
    I am also observing in SM66 that the processes are all reading the NRIV table.
    Can anyone throw some insight on this scenario?
    Any useful input will be rewarded!
    Regards
    Dhanya.

    1. Yes, select main memory. How many entries do you expect in this dimension?
    In other terms, how many different combinations of characteristic values included in your dimension are to be posted?
    As a first guess you could enter 50,000 there, but please let me know the cardinality of this dimension.
    2. Whether or not you have master data doesn't matter. Your fact table is booked with DIMIDs as keys. Every time you book a record, the system checks in the dimension tables whether this combination of characteristic values already has a record in its dimension; if yes, fine, nothing to do. If a new combination comes, the system has to add a record to the dimension, so it first looks for the next number range value (= DIMID).
    In addition, the system creates master data SIDs as well (even if there is no master data). In each dimension table you'll find the corresponding master data SIDs for each of the InfoObjects belonging to the dimension.
    That's why filling empty cubes takes much more time than loading into a cube that already has data, and that's also why the more data you load, the less time it takes.
    Please also make sure that all your F table indexes are dropped (manage cube, performance tab, delete indexes prior to loading).
    This will help initial loads considerably.
    Understanding these BW data warehousing concepts is of paramount importance in order to set the system up properly.
    Message was edited by:
            Olivier Cora

  • "UNICODE_IN_DATA" error in ODI 11.1.1.5 data load interface

    Hello!
    I am sorry to have to ask for help again, this time with a new issue in ODI 11.1.1.5. This is a multiple-column data load interface. I am loading data from a tab-delimited text file into Essbase ASO 11.1.2. The ODI repository database is MS SQL Server. In the target datastore some fields are not mapped to the source but are hardcoded with a fixed value; for example, since only budget data is loaded by default, the mapping for the "Scenario" field in the target has the input string 'Budget'. This data load interface has no rules file.
    At "Prepare for loading" step the following error is produced:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
    File "<string>", line 86, in <module>
    AttributeError: type object 'com.hyperion.odi.common.ODIConstants' has no attribute 'UNICODE_IN_DATA'
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
         at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
         at java.lang.Thread.run(Thread.java:662)
    I will be very grateful for any hints

    Have you changed any of the Hyperion Java files?
    I have not seen this exact error before, but errors like this occur when the KM is not in sync with the Java files.
    Also, I always suggest using a rules file.
    If you have changed the files, revert back to the original odihapp_common.jar and see if it works. If you changed the files to get around the issues I described in the blog, you should be all right having changed only odihapp_essbase.jar.
    This is the problem now with Oracle and all their different versions and patches of ODI; it seems to me they put effort into the 10.1.3.x Hyperion modules and then in 11.1.1.5 just gave up and totally messed a lot of things up.
    I hope somebody from Oracle reads this, because they need to get their act together.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • ODI - How to clear a slice before executing the data load interface

    Hi everyone,
    I am using ODI 10.1.3.6 to load data daily into an ASO cube (version 11.1.2.1). Before loading data for a particular date, I want the region defined by that date to be cleared in the ASO cube.
    I suppose I need to run a PRE_LOAD_MAXL_SCRIPT that clears the region defined by an MDX function, but I don't know how I can automatically define the region by looking at several columns in the data source.
    Thanks a lot.

    Hi, thank you for the response.
    I know how to clear a region in an ASO database. I wrote a MaxL statement like the following:
    alter database App.Db clear data in region '{([DAY].[Day_01],[MONTH].[Month_01],[YEAR].[2011])}'
    physical;
    I have 3 separate dimensions: DAY, MONTH and YEAR. My question was, I don't know how to automate the clearing process before each data load for a particular date.
    Can I somehow automatically set the Day, Month, and Year information in the MDX function by looking at the day, month, and year columns in the relational data source? For example, if I am loading data for 03.01.2011, I want my MDX function to become {([DAY].[Day_01],[MONTH].[Month_03],[YEAR].[2011])}. In the data source table I also have separate columns for Day, Month, and Year, which should make it easier, I guess.
    I also thought of using substitution variables to define the region, but then again the variables need to be set according to the day, month, and year columns in the data source table. I should also mention that the data source table is truncated and loaded daily, so there can't be more than one day or one month etc. in the table.
    I don't know if I have stated my problem clearly; please let me know if there are any confusing bits.
    Thanks a lot.
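    One hedged sketch of the automation, assuming an ODI refresh variable (called, say, #REGION_SPEC, a made-up name) feeds the PRE_LOAD_MAXL_SCRIPT: build the region tuple from the day/month/year columns of the source table, which by design holds exactly one date. Table and column names are illustrative:

```sql
-- Refresh query for the variable: for 03.01.2011 it would return
-- {([DAY].[Day_01],[MONTH].[Month_03],[YEAR].[2011])}
SELECT '{([DAY].[Day_'     || LPAD(day_col,   2, '0')
    || '],[MONTH].[Month_' || LPAD(month_col, 2, '0')
    || '],[YEAR].['        || year_col || '])}' AS region_spec
FROM   src_table
WHERE  ROWNUM = 1;
```

    The MaxL script then becomes: alter database App.Db clear data in region '#REGION_SPEC' physical; with the agent substituting the variable before execution.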

  • No dataload.err: Using ODI to load data into Essbase

    Hi,
    I am using ODI to load data into Essbase. In the load data KM options, I select the rules file and specify that errors should be logged.
    After executing the data load through ODI, I get a log file and an error file, but the error file does not contain the bad record the way dataload.err does when I load the same data and rules file through EAS.
    Am I missing some option? Is there an option to log the bad data to the error file, or to generate a dataload.err file while using ODI?
    Thanks.

    Hi John,
    I have set LOG_ERRORS to Yes and given a file name. A file is generated every time the interface is executed, but the file is blank.
    I have also set LOG_ENABLED to Yes. The log file is generated and has the error in it. The error line in the log file (not the error file) looks like this:
    2009-04-16 10:45:38,457 INFO [DwgCmdExecutionThread]: ODI Hyperion Essbase Adapter Version 9.3.1.1
    2009-04-16 10:45:38,457 INFO [DwgCmdExecutionThread]: Connecting to Essbase application [XXXXXX] on [XXXXXX]:[1423] using username [admin].
    2009-04-16 10:45:39,315 INFO [DwgCmdExecutionThread]: Successfully connected to the Essbase application.
    2009-04-16 10:45:39,315 INFO [DwgCmdExecutionThread]: Essbase Load IKM option RULES_FILE = d_PL
    2009-04-16 10:45:39,315 INFO [DwgCmdExecutionThread]: Essbase Load IKM option RULE_SEPARATOR = ,
    2009-04-16 10:45:39,315 INFO [DwgCmdExecutionThread]: Essbase Load IKM option PRE_LOAD_MAXL_SCRIPT =
    2009-04-16 10:45:39,315 INFO [DwgCmdExecutionThread]: Essbase Load IKM option POST_LOAD_MAXL_SCRIPT =
    2009-04-16 10:45:39,315 INFO [DwgCmdExecutionThread]: Essbase Load IKM option ABORT_ON_PRE_MAXL_ERROR = true
    2009-04-16 10:45:39,315 INFO [DwgCmdExecutionThread]: Essbase Load IKM option CLEAR_DATABASE = None
    2009-04-16 10:45:39,315 INFO [DwgCmdExecutionThread]: Essbase Load IKM option COMMIT_INTERVAL = 1000
    2009-04-16 10:45:39,315 INFO [DwgCmdExecutionThread]: Essbase Load IKM option CALCULATION_SCRIPT = null
    2009-04-16 10:45:39,315 INFO [DwgCmdExecutionThread]: Essbase Load IKM option RUN_CALC_SCRIPT_ONLY = false
    2009-04-16 10:45:39,346 DEBUG [DwgCmdExecutionThread]: LoadData Begins
    2009-04-16 10:45:39,549 DEBUG [DwgCmdExecutionThread]: Error occured in sending record chunk...Cannot end dataload. Analytic Server Error(1003014): Unknown Member [DPT_DFDFDF] in Data Load, [2] Records Completed
    2009-04-16 10:45:39,549 DEBUG [DwgCmdExecutionThread]: Sending data record by record to essbase
    2009-04-16 10:45:40,968 INFO [DwgCmdExecutionThread]: Logging out and disconnecting from the essbase application.
    There is nothing in the error file. There should have been a record for the unknown member, DPT_DFDFDF.
    I am using ODI version 10.1.3.5.
    and John, thanks for the ODI blog. It has helped me a lot in the past couple of months.
    Thank you.

  • Error regarding data load into Essbase cube for Measures using ODI

    Hi Experts,
    I am able to load metadata for dimensions into an Essbase cube using ODI, but when we try the same for loading data for Measures, we encounter the following errors:
    Time,Item,Location,Quantity,Price,Error_Reason
    '07_JAN_97','0011500100','0000001001~1000~12~00','20','12200','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
    '28_JAN_97','0011500100','0000001300~1000~12~00','30','667700','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
    '28_JAN_97','0011500100','0000001300~1000~12~00','500','667700','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
    Can anyone look into this and reply quickly, as it is an urgent requirement?
    Regards,
    Rohan

    We are having a similar problem. We're using the IKM SQL to Hyperion Essbase (DATA) knowledge module. We are mapping the actual data to the field called 'Data' in the model. But it kicks everything out saying 'Unknown Member [Data] in Data Load', as if it's trying to read that field as a dimension member. We can't see what we missed in building the interface. I would think the knowledge module would just know that the Data field is, um, data; not a dimension member. Has anyone else encountered this?
    Sabrina

  • ODI Data Quality metabase Load connections

    Hi Guys
    I am trying to get started with ODI Data Quality and Profiling. I would like to connect from the ODI metabase manager's loader connections to the database on my local machine using the following:
    username: thiza
    Password:
    url: 10.12.12.12:1521:EDW
    The problem is that the ODI metabase manager requires a TNS name. I tried to put the following string in the TNS field, but it is still not working:
    EDW=(DESCRIPTION =(ADDRESS = (PROTOCOL = TCP)(HOST = tvictor-za)(PORT = 1521))(CONNECT_DATA =(SERVER = DEDICATED)(SERVICE_NAME = EDW)))
    Can anyone give a step-by-step guide on how to connect to an Oracle database from the ODI metabase manager (loader connections)?
    Any help will be highly appreciated.
    Thanks
    Umapada
    I tried to put the following string in the TNS field, but it is still not working, with only "EDW" in place of the TNS name in the metabase administrator. When testing the connection at the time of creating an entity, it says "Please wait, Validating Connection", but this wait never ends and continues for hours.
    Edited by: user10612738 on Mar 2, 2009 10:54 PM
    It seems the following shared library is missing. I don't know how to get it from Oracle.
    Has anybody encountered this problem before?
    2009-03-03 12:12:15 02837 WARNING CONNECT Remote oracle connection failure, couldn't load file "/ora/ora10g/odi101350/oracledq/metabase_server/metabase/lib/pkgOracleAdapter/pkgOracleAdapter.sl": no such file or directory - couldn't load file "/ora/ora10g/odi101350/oracledq/metabase_server/metabase/lib/pkgOracleAdapter/pkgOracleAdapter.sl": no such file or directory
    2009-03-03 12:12:15 02837 WARNING ADAPTER Authentication failed. - couldn't load file "/ora/ora10g/odi101350/oracledq/metabase_server/metabase/lib/pkgOracleAdapter/pkgOracleAdapter.sl": no such file or directory
    2009-03-03 12:12:15 02837 INFO CLIENT_DISCONNECT interpffffec50
    2009-03-03 12:12:15 02837 INFO METABASE removing session directory ->/ora/ora10g/odi101350/oracledq/metabase_data/tmp/session/3fb551e8-4c99-4be3-860d-5953ef6512fe<-
    2009-03-03 12:12:25 27177 INFO CLIENT_DISCONNECT interp0029c060

    If you are trying to connect to an Oracle box, try this; it worked for me.
    Go to Add Loader Connections.
    Instead of pasting the entire string, use only the TNS name.
    In your case this would be EDW.
    Once added, save your metabase connections and go into the Data Quality and Profiling part.
    Type in the name of the metabase you recently created, along with the user name and password.
    It should log you on. It worked for me; I have tried a lot of Oracle connections. The only problem for me is that I have never been able to configure an ODBC loader connection with SQL Server.
    Hope this helps.
    Chapanna

  • HFM Data Load Error in ODI

    Hi,
    I'm loading data into HFM from a flat file. When the interface is executed, only some of the data gets loaded. When I checked for errors in the log, I found the following error message:
    'Line: 56, Error: Invalid cell for Period Apr'
    I then found that it is an invalid intersection in HFM which I am trying to load.
    In FDM there is an option to validate invalid intersections during data load.
    I would like to know how to do the same in ODI to overcome this kind of error, i.e. is there any option in ODI to ignore it?
    Kindly help me.
    Thanks in advance

    Hi,
    I think even if the metadata exists, there might still be issues with HFM forbidden cells. There are HFM rules that determine which intersections are editable/loadable and which are not. Please ask your HFM admin about the forbidden rules, or otherwise change the property of the Custom dimensions so that they accept data into all intersections.
    Thanks,
    Debasis

  • Data load Error to Planning

    Hi All,
    I have created one interface for loading data into a Planning application. I made the Account dimension the load dimension and Segment the driver dimension (member: Audio).
    I have created one form for this. When I run the interface it executes successfully, but the report statistics show the following problem:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 2, in ?
    Planning Writer Load Summary:
         Number of rows successfully processed: 0
         Number of rows rejected: 48
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:345)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:169)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2374)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java:1615)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java:1580)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java:2755)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java:68)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2515)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:534)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:449)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1954)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:322)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:224)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:246)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:237)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:794)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:114)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:619)
    I found a log file for it which says:
    Account,Data Load Cube Name,MP3,Point-of-View,Error_Reason
    Max Hits Per Day,Consol,Jan,"USD,Cannot load dimension member, error message is: java.lang.IllegalArgumentException: Unbalanced double quotes found at character 3 in the member list: "USD
    Avg Hits Per Day,Consol,Jan,"USD,Cannot load dimension member, error message is: java.lang.IllegalArgumentException: Unbalanced double quotes found at character 3 in the member list: "USD
    Can anybody tell me what I did wrong? My data file does not contain any double quote mark with USD; the POV is like this: USD,E01_0 ,Jan,Current,Working,FY11
    Thanks
    Gourav Atalkar
    Edited by: Gourav Atalkar(ELT) on May 3, 2011 4:06 AM

    As it is comma separated, and all the members for the POV need to be together, try enclosing it in quotes, e.g. "USD,E01_0 ,Jan,Current,Working,FY11"
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Essbase Studio Performance Issue : Data load into BSO cube

    Hello,
    Having successfully built my outline by member loading through Essbase Studio, I have tried to load data into my application, again with Studio. However, I was never able to complete the data load because it takes forever. Each time I tried to work with Studio in streaming mode (hoping to increase the query speed), the load was terminated due to the following error: Socket read timed out.
    In the Studio properties file I typed in oracle.jdbc.ReadTimeout=1000000000, but the result has not changed. Even if it did work, I am not sure streaming mode is going to be a much faster alternative to non-streaming mode. What I'd like to know is which Essbase settings I can change (either on the Essbase or the Studio server) in order to speed up my data load. I am loading into a block storage database with 3 dense, 8 sparse, and 2 attribute dimensions. I filtered some dimensions and tried to load data to see exactly how long it takes to create a certain number of blocks. With the ODBC setting in Essbase Studio, it took 2.15 hours to load data into my application, where only 153 blocks were created with a block size of 24B. Assuming that in my real application the number of blocks created is going to be at least 1,000 times more than this, I need to make some changes to the settings. I am transferring the data from an Oracle database, with 5 tables joined to a fact table (view) from the same data source. All the cache settings in Essbase are at their defaults. Would changing cache settings, buffer size, or multiple threads help to increase the performance? What would you suggest I do?
    Thank you very much.

    Hello user13695196,
    (sorry, I no longer remember my system number here)
    Before attempting any optimisation in the Essbase (or Studio) environment, you should definitely make sure that your source data query performs well on the Oracle DB.
    I would recommend:
    1. Creating in your DB source schema a view from your SQL statement (the one behind your data load rule).
    2. Querying this view with any GUI (SQL Developer, TOAD, etc.) to fetch all rows, and measuring the time it takes to complete. Also count the number of returned rows, for your information and for future comparison of results.
    If your query runs longer than you think is acceptable, then:
    a) check the DB statistics,
    b) check and/or consider creating indexes,
    c) if you are unsure, kindly ask your DBA for help; usually they can help you very quickly.
    (Don't be shy; a DBA is a human being like you and me :-) )
    Only when your SQL runs fast enough at the database (for you, or when your DBA says it is the best you can achieve) should you move your effort over to Essbase.
    One hint in addition:
    We have often had problems when using views for data loads (not only performance but also other strange behavior). That is why I prefer to set up directly on (persistent) tables.
    Just keep in mind: if nothing helps, create a table from your view and then query your data from this table for your Essbase data load. Normally, however, this should be your last option.
    Best Regards
    (also to you, Torben :-) )
    Andre
    Edited by: andreml on Mar 17, 2012 4:31 AM
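    Step 2 of the advice above can also be done from SQL*Plus; a minimal sketch (the view name is illustrative):

```sql
-- Time the load query in isolation, outside Essbase Studio:
SET TIMING ON
SELECT COUNT(*) FROM v_cube_load;   -- view built from the load-rule SQL

-- If it is slow, refresh optimizer statistics before looking at indexes:
EXEC DBMS_STATS.GATHER_SCHEMA_STATS(ownname => USER)
```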

  • How can I add dimensions and load data into Planning applications?

    Please let me know how I can add dimensions and load data into a Planning application without doing it manually.

    You can use tools like ODI, DIM, or HAL to load metadata and data into Planning applications.
    The data load can be done at the Essbase end using a rules file, but metadata changes should flow from Planning to Essbase through one of the above-mentioned tools. There are also many other ways to achieve the same.
    - Krish
