ODI to Planning data loading

Gents,
I am Rajesh and I would like to discuss ODI connectivity to Planning.
I am completely new to ODI and don't even know how to log in; I have been provided three user names (i.e. ODI Work, Master, Profiling) and their respective passwords.
ODI is installed on my PC and was working perfectly until last year, but suddenly there is a problem.
I would like some assistance with the following:
1. How do I log in to ODI? I also want to see the previous connection (ODI to Planning).
2. What might be the problem when ODI is unable to send data to Planning?
3. How do I establish a new connection without disturbing the old one?
I know these sound like basic questions, but I am completely stuck as I don't have any clue about this.
Regards,
Rajesh
Edited by: user8800516 on Feb 1, 2010 9:38 PM

Hi,
I know you have another post on this subject, but I am finding it hard to understand what you are actually asking.
I think you need to expand on where exactly you are running into issues.
Cheers
John
http://john-goodwin.blogspot.com/

Similar Messages

  • Why is the delivery date the same as the transportation planning date, loading date, goods issue date and GR end date?

    Hi Experts,
    Why is the delivery date the same as the transportation planning date, loading date, goods issue date and GR end date?
    In the shipping tab I can see Planned Deliv. Time = 170 days. What could be the reason?
    Many thanks,
    Raj Kashyap

    Hi Jurgen,
    Thanks for the quick reply!
    But I did not find anything like that. What could the customizing setting be? We are using GATP on the APO side.
    Raj Kashyap

  • Planning - Data Load Setting

    On 11.1.2.1, is this a single load setting? Meaning, when you have different CSV files with different driver dimensions, do you have to change the Data Load Settings every time before using the appropriate CSV file with the Outline Load utility? If that is the case, then to use the batch option, the /TR option is more appropriate. Is my understanding correct?
    Thanks.

    You have two options: you can either define the driver information through Planning administration (though if your file's data load dimension or drivers change, you need to update it through Planning again),
    or you can include the driver information directly in the data load file (a small sketch follows below).
    Cheers
    John
    http://john-goodwin.blogspot.com/
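    To illustrate the second option, here is a minimal Python sketch of a data file that carries its own driver member in its header, so the Data Load Settings screen does not have to be revisited when the driver changes. The layout mirrors the Outline Load style seen elsewhere in this thread; the member, cube and file names are assumptions, not taken from the original post.

        # Sketch only: writes a data file whose header names the load dimension
        # (Account), a driver member (Audio), the Point-of-View and the plan type.
        # All member/application names here are illustrative assumptions.
        lines = [
            'Account,Audio,Point-of-View,Data Load Cube Name',
            'Max Hits Per Day,100,"USD,E01_0,Jan,Current,Working,FY11",Consol',
        ]
        with open("account_load.csv", "w") as f:
            f.write("\n".join(lines) + "\n")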

  • Performance issues with Planning data load & Agg in 11.1.2.3.500

    We recently upgraded from 11.1.1.3 to 11.1.2.3. Post-upgrade we are facing performance issues with one of our Planning jobs (e.g. Job E): it takes 3x the time to complete in our new environment (11.1.2.3) compared to the old one (11.1.1.3). This job loads the actuals data and then does the aggregation. The pattern we noticed is that if we run a restructure on the application and execute this job immediately, it completes in the same time as on 11.1.1.3. However, in current production (11.1.1.3) the jobs run in the sequence Job A -> Job B -> Job C -> Job D -> Job E and complete on time, but if we run the same test in 11.1.2.3 in that sequence, Job E takes 3x the time. We don't have a window to restructure the application before running Job E every time in production. The specs of the new environment are much higher than the old one's.
    We have Essbase clustering (MS active/passive) in the new environment and the files are stored on the SAN drive. Could this be the cause? Has anyone faced performance issues in a clustered environment?

    Do you have exactly the same Essbase config settings and calculations performing the AGG? Remember, something very small like UPDATECALC ON/OFF can make a BIG difference in timing.

  • Transportation planning date

    Hi experts,
    When I create a sales order, the planned delivery date I get in the schedule line is 4 days out.
    The problem is with the transportation planning date, loading date and goods issue date. How do I change those dates, and where?
    Thanks in advance.

    Hi Senya,
    The system does backward scheduling by default and all the dates are calculated backwards from the requested delivery date. The date calculation depends on the various lead times that you have defined in your material master record (MMR).
    For example, in the MMR you might have defined lead times like this: GI time - 5 days (time required for the order to reach the customer once dispatched), loading time - 1 day (time required to load the material onto the means of transport), transportation planning time - 1 day (time required to find and arrange the means of transport), material availability time - 1 day (time by which the material should be ready and available for the subsequent scheduling steps).
    Suppose you create an order with a delivery date 10 days from now. Depending on the lead-time settings made in the MMR, the calculation would be something like this:
    Requested delivery date: 15/06/09
    GI date: 10/06/09
    Loading date: 09/06/09
    Transportation planning date: 08/06/09
    Material availability date: 07/06/09
    If the material availability date is on or after my order date then it is fine and my RDD will be confirmed; otherwise backward scheduling fails, the system does forward scheduling and gives you a new delivery date, and that date becomes your new confirmed delivery date. For example:
    Material availability date: 04/06/09
    Transportation planning date: 05/06/09
    Loading date: 06/06/09
    GI date: 07/06/09
    Confirmed delivery date: 12/06/09
    This would be the case when somebody orders on today's date (05/06/09) but wants delivery 5 days from now (10/06/09); the system would not allow that under the default backward scheduling (a small date-calculation sketch follows this reply).
    Hope this is now somewhat clearer to you.
    Experts, if I am wrong anywhere please feel free to correct me.
    Regards,
    Kant
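    For readers who want to check the arithmetic, here is a small Python sketch of the backward-scheduling logic described above. It uses the lead times from the example and ignores factory calendars and route transit times, so treat it as an illustration rather than a reproduction of SAP's scheduling.

        from datetime import date, timedelta

        # Lead times (in days) from the reply above.
        GI_TIME, LOADING, TRANSP_PLANNING, MATL_AVAIL = 5, 1, 1, 1

        def backward_schedule(requested_delivery, order_date):
            gi_date       = requested_delivery - timedelta(days=GI_TIME)
            loading_date  = gi_date - timedelta(days=LOADING)
            transp_date   = loading_date - timedelta(days=TRANSP_PLANNING)
            matl_avb_date = transp_date - timedelta(days=MATL_AVAIL)
            # The RDD can be confirmed only if the material availability date
            # falls on or after the order date; otherwise the system would
            # re-schedule forward and propose a later delivery date.
            confirmed = matl_avb_date >= order_date
            return matl_avb_date, transp_date, loading_date, gi_date, confirmed

        # Order placed on 05/06/09 with a requested delivery date of 15/06/09:
        print(backward_schedule(date(2009, 6, 15), date(2009, 6, 5)))
        # -> (2009-06-07, 2009-06-08, 2009-06-09, 2009-06-10, True), as in the example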

  • Transaction Data load fail

    Hi Experts,
    I have an issue while loading transaction data from a flat file into a BPC model: I am getting the error "Task name LOAD: Cannot perform read". It was working fine in the development box, and the transformation files are all OK. Any inputs on this?
    Thank you.

    Hi Ng,
    Please check your BPC cube properties and try switching between the options below:
    if option 2 is currently selected, change it to option 1 and load the data, or vice versa.
    BPC cube (/CPMB/) --> right click --> Planning-specific properties --> change real-time load behaviour
    1. Real-time data target can be loaded with data; planning not allowed
    2. Real-time data target can be planned; data loading not allowed
    Thanks

  • Data load Error to Planning

    Hi All,
    I have created one interface for loading data into a Planning application. I have made the Account dimension the load dimension and Segment the driver dimension (the member is Audio).
    I have created one form for this. Now when I run the interface it executes successfully, but the report statistics show the following problem:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 2, in ?
    Planning Writer Load Summary:
         Number of rows successfully processed: 0
         Number of rows rejected: 48
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:345)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:169)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2374)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java:1615)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java:1580)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java:2755)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java:68)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2515)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:534)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:449)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1954)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:322)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:224)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:246)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:237)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:794)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:114)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:619)
    I found a log file for it which says:
    Account,Data Load Cube Name,MP3,Point-of-View,Error_Reason
    Max Hits Per Day,Consol,Jan,"USD,Cannot load dimension member, error message is: java.lang.IllegalArgumentException: Unbalanced double quotes found at character 3 in the member list: "USD
    Avg Hits Per Day,Consol,Jan,"USD,Cannot load dimension member, error message is: java.lang.IllegalArgumentException: Unbalanced double quotes found at character 3 in the member list: "USD
    Can anybody tell me what I did wrong? My data file does not contain any double-quote mark with USD; the POV is like this: USD,E01_0,Jan,Current,Working,FY11
    Thanks
    Gourav Atalkar
    Edited by: Gourav Atalkar(ELT) on May 3, 2011 4:06 AM

    As the file is comma separated and all the members for the POV need to stay together, try enclosing the POV in quotes, e.g. "USD,E01_0,Jan,Current,Working,FY11" (see the sketch below).
    Cheers
    John
    http://john-goodwin.blogspot.com/
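    If the file is being generated programmatically, letting a CSV writer quote the Point-of-View field achieves the same thing. A small Python sketch (the data value 100 is an assumption; the members are the ones from the post):

        import csv, io

        # The POV members have to travel as ONE field, so the field containing
        # commas must be wrapped in double quotes; QUOTE_MINIMAL does exactly that.
        pov = "USD,E01_0,Jan,Current,Working,FY11"
        record = ["Max Hits Per Day", "Consol", "100", pov]   # Account, cube, driver value, POV

        buf = io.StringIO()
        csv.writer(buf, quoting=csv.QUOTE_MINIMAL).writerow(record)
        print(buf.getvalue().strip())
        # Max Hits Per Day,Consol,100,"USD,E01_0,Jan,Current,Working,FY11"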

  • How can I add dimensions and load data into Planning applications?

    Please let me know how I can add dimensions and load data into a Planning application without doing it manually.

    You can use tools like ODI, DIM or HAL to load metadata and data into Planning applications.
    The data load itself can be done at the Essbase end using a rules file (see the sketch after this reply), but metadata changes should flow from Planning to Essbase through any of the above-mentioned tools. There are also many other ways to achieve the same.
    - Krish
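    As a rough illustration of the Essbase-side data load Krish mentions, the Python sketch below writes a MaxL script that imports a data file using a server rules file and runs it through the MaxL shell. The application, database, file and rules-file names (and the plain-text password) are placeholders, not anything from this thread.

        import subprocess, textwrap

        # Placeholder names throughout; a real setup would not keep the password
        # in clear text inside the script.
        maxl = textwrap.dedent("""
            login admin 'password' on essbase-server;
            import database PlanApp.Plan1 data
                from data_file '/data/actuals.txt'
                using server rules_file 'ldData'
                on error write to '/data/actuals.err';
            logout;
        """)

        with open("load_data.msh", "w") as f:
            f.write(maxl)

        # essmsh is the MaxL shell that ships with Essbase
        subprocess.run(["essmsh", "load_data.msh"], check=True)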

  • Data load using ODI 11.1.1.5 to Hyperion

    Hi All,
    I have installed ODI, and Hyperion was already installed.
    After installing ODI I created a master repository and a work repository.
    Now I am planning to load data into the dimensions of a Hyperion Planning application (e.g. Entity).
    1) In the Topology section I right-clicked on Hyperion Planning and clicked on "New Data Server", filled in the required information and saved it.
    2) Then I right-clicked on the newly created data server and chose "New Physical Schema", but after that it does not show any application in the Application (Catalog) and Application (Work Catalog) drop-downs.
    Before these steps we had already created the Planning applications in Hyperion.
    Please let me know what is wrong or missing in the above steps, or share the steps to load data into Hyperion.
    -PM

    Yes, I have entered the values in the text boxes/drop-downs. Afterwards I tried to save, but it asks me to fill in "Context". I selected the Context option; there I have only one choice, Global, so I chose that.
    The second field, Logical Schema, does not have any value, and the system does not allow saving without providing these values.
    Can you please give me the further steps?
    -PM

  • ERPI: Data Loading problem Hyperion Planning & Oracle EBS

    Hi,
    I am trying to load data from Oracle EBS to Hyperion Planning.
    When I push data, zero rows are inserted into the target.
    When I look at the table (SELECT * FROM TDATASEG),
    it shows me data, but the data is not being committed to the target application.
    The reason is a data difference between the source (EBS) and the target:
    in my source the year is 2013 but in the target it is 'FY14', and the source entity is '21' while the target is '2143213'.
    Can you please let me know how to solve this issue?
    Can I place a lookup table for this in ERPI?
    I am using ERPI and ODI to push the data.
    Regards
    Sher

    Have you set up the data load mapping correctly to map the source values to the proper target values? Based on what you are describing, it seems the system-generated * to * map is being used; if you are mapping a source to a different target, that mapping needs to be added to the data load mapping.
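    As a toy illustration of that point (the values are taken from the question; the dictionary structure is only for explanation and is not how ERPI stores maps):

        # With only the generated * -> * rule, the raw source values (2013, 21)
        # pass straight through and the target rejects them; an explicit map per
        # dimension is what turns them into valid target members.
        explicit_maps = {
            "Year":   {"2013": "FY14"},
            "Entity": {"21": "2143213"},
        }

        def map_member(dimension, source_value):
            # Fall back to pass-through (the * -> * behaviour) when no rule exists.
            return explicit_maps.get(dimension, {}).get(source_value, source_value)

        assert map_member("Year", "2013") == "FY14"
        assert map_member("Entity", "21") == "2143213"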

  • Automate dimension/data load update in Planning?

    Hi all,
    Without App Link and Translator (the client did not buy them), how can we automate dimension and data load updates in Planning? Any sample?
    Regards,
    Kenneth

    Hi John,
    I did the tests in Windows 2003 Server with Planning 9.3.1.
    1. Bug 6829439: 'Task Flow is always set to active although the job has completed successfully'.
    I had a flow with 2 tasks. In this case, when I launch the job it remains in a '4% complete' state because once the first task is completed the system doesn't put the task in the 'completed' state and thus doesn't start the second one.
    2. Bug 6785224: 'Manage TaskFlow does not show with interface tables'.
    When working with interface tables one cannot schedule taskflows.
    A colleague of mine working for another client has already opened calls with Hyperion regarding these issues. Hyperion provided the bug numbers and informed us they will be fixed in version 9.5.
    3. Now I am testing the ODI connectors on a Planning 9.2 system and have some small problems (I've just opened a thread on this forum).
    Daniela S.

  • Extracting data from Planning and loading it into Oracle/SQL Server (RDBMS)

    Hi All,
    ODI can extract data from an Oracle/SQL Server RDBMS and load it into Hyperion Planning, but I wanted to know whether the other way round is also possible, i.e. extracting data from Hyperion Planning through ODI and loading it into Oracle or SQL Server.
    Kindly let me know whether that is possible; if yes, please let me know the exact process to achieve this through ODI.
    Thanks & Regards,
    Gurpreet

    Yes, this can be done. Remember that Planning data is actually stored in Essbase, so the knowledge module you will need to use is LKM Essbase to SQL (DATA).

  • "UNICODE_IN_DATA" error in ODI 11.1.1.5 data load interface

    Hello!
    I am sorry to have to ask for help again, this time with a new issue in ODI 11.1.1.5. This is a multiple-column data load interface. I am loading data from a tab-delimited text file into Essbase ASO 11.1.2. The ODI repository database is MS SQL Server. In the target datastore some fields are not mapped to the source but are hard-coded with a fixed value; for example, since only budget data is loaded by default, the mapping for the "Scenario" field in the target has the input string 'Budget'. This data load interface has no rules file.
    At the "Prepare for loading" step the following error is produced:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
    File "<string>", line 86, in <module>
    AttributeError: type object 'com.hyperion.odi.common.ODIConstants' has no attribute 'UNICODE_IN_DATA'
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
         at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
         at java.lang.Thread.run(Thread.java:662)
    I will be very grateful for any hints

    Have you changed any of the Hyperion Java files?
    I have not seen this exact error before, but errors like this occur when the KM is not in sync with the Java files.
    Also, I always suggest using a rules file.
    If you have changed the files, revert back to the original odihapp_common.jar and see if it works; if you changed the files to get round the issues I described in the blog, you should be all right having changed only odihapp_essbase.jar.
    This is the problem now with Oracle and all their different versions and patches of ODI; it seems to me they put effort into the 10.1.3.x Hyperion modules and then in 11.1.1.5 just gave up and totally messed a lot of things up.
    I hope somebody from Oracle reads this, because they need to get their act together.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • ODI - How to clear a slice before executing the data load interface

    Hi everyone,
    I am using ODI 10.1.3.6 to load data daily into an ASO cube (version 11.1.2.1). Before loading data for a particular date, I want the region defined by that date to be cleared in the ASO cube.
    I suppose I need to run a PRE_LOAD_MAXL_SCRIPT that clears the area defined by an MDX set, but I don't know how I can automatically define the region by looking at several columns in the data source.
    Thanks a lot.

    Hi, thank you for the response.
    I know how to clear a region in an ASO database. I wrote a MaxL statement like the following:
    alter database App.Db clear data in region '{([DAY].[Day_01],[MONTH].[Month_01],[YEAR].[2011])}'
    physical;
    I have 3 separate dimensions: DAY, MONTH and YEAR. My question was that I don't know how I can automate the clearing process before each data load for a particular date.
    Can I somehow automatically set the Day, Month and Year information in the MDX set by looking at the day, month and year columns in the relational data source? For example, if I am loading data for 03.01.2011, I want my MDX set to become {([DAY].[Day_01],[MONTH].[Month_03],[YEAR].[2011])}. In the data source table I also have separate columns for Day, Month and Year, which should make it easier, I guess.
    I also thought of using substitution variables to define the region, but then again the variables would need to be set according to the day, month and year columns in the data source table. I would also like to mention that the data source table is truncated and loaded daily, so there can't be more than one day or one month etc. in the table.
    I don't know if I have stated my problem clearly; please let me know if there are any confusing bits.
    Thanks a lot.
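    One possible way to automate this (a sketch only; the table, column, connection and credential names are placeholders): read the single day/month/year combination present in the staging table, splice it into the MaxL statement already shown above, and run that through the MaxL shell before the load. In ODI the same idea can be implemented with variables refreshed from the source table and referenced inside the PRE_LOAD_MAXL_SCRIPT.

        import sqlite3, subprocess

        # Placeholder source connection; in practice this would be the relational
        # staging area that the daily load reads from.
        conn = sqlite3.connect("staging.db")
        day, month, year = conn.execute(
            "select distinct day_col, month_col, year_col from stage_table").fetchone()

        # Build the same clear-region statement as above, e.g. for 03.01.2011 this
        # yields {([DAY].[Day_01],[MONTH].[Month_03],[YEAR].[2011])}.
        maxl = (
            "alter database App.Db clear data in region "
            f"'{{([DAY].[Day_{int(day):02d}],[MONTH].[Month_{int(month):02d}],[YEAR].[{int(year)}])}}' "
            "physical;"
        )

        with open("clear_region.msh", "w") as f:
            f.write(f"login admin 'password' on essbase-server;\n{maxl}\nlogout;\n")

        subprocess.run(["essmsh", "clear_region.msh"], check=True)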

  • Loading non-SAP planning data into BW-BPS

    Hi BPS Gurus,
    I'm new to BPS. I want to load Hyperion Planning data into BW-BPS. Could you suggest how to do it?
    Thanks in advance.
    Thanks & Regards,
    Bhaskar.

    Hi raki,
    My mail id is
    [email protected]
    Regards,
    Bhaskar.
