Loading data (including Smart List data) using the ODI Planning KM

Hi all,
A question regarding loading Smart List data with ODI...
Can somebody confirm whether we can load data from a source file that contains the Smart List labels instead of the Smart List value IDs (e.g. YES instead of 1)? I assume Planning will do the conversion when loading to Essbase.
By the way, does the Outline Load utility work in the same way?
Thanks in advance

Hi John,
No, because your site was blocked by the customer's firewall :-(
By the way, I was referring to loading data containing Smart List accounts. In any case, I have seen that you have another blog entry where you load data using labels instead of keys:
http://john-goodwin.blogspot.com/2009/05/odi-series-loading-smart-list-data-into.html
This is what I was looking for.
Another question: with the Outline Load utility we can put driver members in the source file, which makes the process more flexible. Think of the case where our driver members (Accounts) change dynamically.
Would any customization be possible to add the same functionality to the ODI Planning KM?
We are not likely to use the Essbase KM and joins against the Planning repository tables, as our source data has dates, free text, figures, Smart List accounts, etc.
Thanks!

Similar Messages

  • Can we load data using .xls in a user-defined format (without using the default template)?

    Hi All,
    I'm a newbie to FDM. As part of HFM support I use FDM to load flat-file data, and I have just a bit more knowledge than an end user.
    The requirement is that I need to load data from MS Excel into a Planning application via FDM.
    Previously the application was in Excel (macro driven) and the upstream data is also in Excel (multi-tab).
    To my knowledge, data can be loaded from a .csv file (Excel saved as CSV) with a single tab.
    Could you please let me know the possibilities for loading data from .xls (.xlsx) into FDM?
    Thanks in advance.

    If you want to load data using Excel with FDM's out-of-the-box functionality, you will have to use one of the supplied templates, i.e. the Excel Trial Balance or Excel Multi-load template.

  • Server hangs while trying to load data using SQL*Loader

    Hi gurus,
    OS Version: Centos 6 64 bit
    DB Version : 11gR2 64 bit
    When I try to load data using SQL*Loader, the load hangs partway through. Also, when I try to transfer text files of approximately 14 GB using WinSCP, it shows me the error below:
    Host is not communicating for more than 15 seconds. Still waiting...
    Note: if the problem repeats, try turning off 'Optimize connection buffer size'.
    Warning: aborting this operation will close the connection.
    I talked to our network admin regarding this problem.
    He blames Oracle for it rather than any network issue.
    Can anyone help me in this respect?

    Check MOS note:
    SQL*Loader Hangs After Start [ID 1186863.1]

  • Export the data using ODI

    I want to export data from Essbase v9 cubes using ODI. ODI and the Essbase server are on different machines.
    I am using a calculation script in the LKM step, but the exported files end up on the Essbase server and ODI is not able to find them.
    A report script takes too much time to export the data, hence the calculation script.
    Is this related to the agent? I am not able to create an agent on the Essbase server; I was only successful in creating an agent on the local system where ODI is installed.
    Please suggest.
    Regards,
    Murali

    Are you on 10.1.3.6.x? Then: http://john-goodwin.blogspot.com/2008/09/odi-series-part-2-agent.html
    Are you on 11g? Then: http://john-goodwin.blogspot.com/2010/12/managing-odi-11g-standalone-agents.html
    I will say with only a mild amount of shame and a large amount of gratitude that I installed both releases' agents through John's blog posts.
    Regards,
    Cameron Lackpour
    Edited by: CL on Jun 4, 2012 5:48 PM
    Whoops, had the same link in there twice.

  • Loading Data Using Outline Load Utility - Error

    I'm trying to load some data using the Outline Load utility. I followed the Oracle documentation: under Administration -> Manage Data Load I defined Accounts as the data load dimension and Period as the driver dimension, then added Jan - Feb as driver members. I want to load data for Jan. My data header looks like this:
    Accounts,Jan,Point-of-View,Data Load Cube Name
    acct1, 768, "2010, Ver1, Actuals, entity1, etc" Plan1
    The command I typed is this:
    C:\Oracle\Middleware\user_projects\epmsystem1\Planning\planning1>outlineload /A:Plan1 /U:Admin /M /I:C:\MockData2.txt /D:Accounts /L:C:\dataload.log /X:c:\dataload.exc
    I get the following errors:
    [Mon Nov 22 11:03:02 CST 2010] Unrecognized column header value "Jan".
    [Mon Nov 22 11:03:02 CST 2010] Unrecognized column header value "Point-of-View".
    [Mon Nov 22 11:03:02 CST 2010] Unrecognized column header value "Data Load Cube Name".
    [Mon Nov 22 11:03:02 CST 2010]Unable to obtain dimension information and/or perform a data load: Unrecognized column header value(s), refer to previous messages. (Note: column header values are case sensitive.)
    [Mon Nov 22 11:03:02 CST 2010]Planning Outline data store load process finished with exceptions: not all input records were read due to errors (or an empty input file). 0 data records were read, 0 data records were processed, 0 were successfully loaded, 0 were rejected.
    This is version 11.1.2. What am I doing wrong here? I also find it interesting that, per the Oracle docs, the command for a data load and a metadata load is the same. I guess Planning knows whether we're trying to load data or metadata based on the CSV file header?

    I don't usually bother loading data with the Outline Load utility, but as a test on 11.1.2 using the Planning sample application I gave it a quick go.
    In Planning, I went to Administration > Data Load Settings, picked Account as the data load dimension and Period as the driver dimension, and selected Jan as a member.
    I created a file :-
    Account,Jan,Point-of-View,Data Load Cube Name
    TestMember,100,"Local,E05,Actual,NoSegment,Working,FY11",Consol
    And used the following command line from the directory with the planning utilities
    OutlineLoad /A:plansamp /U:admin /M /N /I:F:/temp/dload.csv /D:Account /L:F:/temp/outlineLoad.log /X:F:/temp/outlineLoad.exc
    The log produced :-
    [Tue Nov 23 10:02:01 GMT 2010]Successfully located and opened input file "F:\temp\dload.csv".
    [Tue Nov 23 10:02:01 GMT 2010]Header record fields: Account, Jan, Point-of-View, Data Load Cube Name
    [Tue Nov 23 10:02:01 GMT 2010]Located and using "Account" dimension for loading data in "plansamp" application.
    [Tue Nov 23 10:02:01 GMT 2010]Load dimension "Account" has been unlocked successfully.
    [Tue Nov 23 10:02:01 GMT 2010]A cube refresh operation will not be performed.
    [Tue Nov 23 10:02:01 GMT 2010]Create security filters operation will not be performed.
    [Tue Nov 23 10:02:01 GMT 2010]Examine the Essbase log files for status if Essbase data was loaded.
    [Tue Nov 23 10:02:01 GMT 2010]Planning Outline load process finished (with no data load as specified (/N)). 1 data record was read, 1 data record was processed, 1 was accepted, 0 were rejected.
    There you go, no problems.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Need Help..!! Loading Data Using Outline Load Utility - No Data

    Hi All,
    I've tried to load data using the Outline Load utility, but the data is not loaded into Essbase.
    Here are the steps I've done:
    1. In planning, Administration > Data Load Settings
    - Entity as the data load dimension,
    - Account as the driver dimension, and selected Rental and test_desc as the members
    2. The .csv file
    Entity, Rental, test_desc, Point-of-View,Data Load Cube Name
    TLT, 100, 100, "FY11,Plan,Working,No Product,No Condition Type,No Vehicle Type,No Term Period,No Term,No Movement, Local, Apr", Retail
    3. Run the Outline Load through cmd
    E:\Oracle\Middleware\user_projects\epmsystem1\Planning\planning1>OutlineLoad /A:PLNDEV /U:xyz /M /N /I:E:/testload.csv /D:Entity /L:E:/OutlineLogs /X:E:/outlineLoad.exc
    and below is the Outline Load log:
    [Mon Jul 25 04:01:42 PDT 2011]Successfully located and opened input file "E:\testload.csv".
    [Mon Jul 25 04:01:42 PDT 2011]Header record fields: Entity, Rental, test_desc, Point-of-View, Data Load Cube Name
    [Mon Jul 25 04:01:42 PDT 2011]Located and using "Entity" dimension for loading data in "PLNDEV" application.
    [Mon Jul 25 04:01:42 PDT 2011]Load dimension "Entity" has been unlocked successfully.
    [Mon Jul 25 04:01:42 PDT 2011]A cube refresh operation will not be performed.
    [Mon Jul 25 04:01:42 PDT 2011]Create security filters operation will not be performed.
    [Mon Jul 25 04:01:42 PDT 2011]Examine the Essbase log files for status if Essbase data was loaded.
    [Mon Jul 25 04:01:42 PDT 2011]Planning Outline load process finished (with no data load as specified (/N)). 1 data record was read, 1 data record was processed, 1 was accepted, 0 were rejected.
    No data was rejected there, but when I try to retrieve the data from my Planning form or the Essbase Add-in, there is no data.
    Am I missing something in my steps above?
    Please advise.
    Thanks.
    Regards,
    VieN

    Hi John,
    thanks for reply..
    I haven't checked the Essbase log yet.
    From the log (Planning Outline load process finished (with no data load as specified (/N)). 1 data record was read, 1 data record was processed, 1 was accepted, 0 were rejected.), that doesn't mean the data was loaded successfully, does it?
    Thanks.
    Regards,
    VieN

  • Loading data using the data sources 0RT_PA_TRAN_CONTROL_TR and 2LIS_44_POSREC

    Hi gurus,
    I am trying to load data using these two data sources; I am loading for the first time.
    Can anyone guide me step by step on how to load the data using these two data sources in SAP BI? Below are the data source names:
    0RT_PA_TRAN_CONTROL_TR - BW data source
    2LIS_44_POSREC - ECC data source

    Hi Lisa,
    Quoting your earlier post: "In this scenario, we are uploading data into PIPE and then pushing the data to the DataSource / InfoCubes. We are trying to upload sales data into PIPE using the internal function module /POSDW/CREATE_TRANSACTIONS_INT. Using the POS monitor (/POSDW/MON0), we are able to view the data."
    From the sentences above I understand that you had almost the same requirement as I do. Did you find a solution? If yes, kindly share the procedure for loading the data into PIPE.

  • Loading Data Using Process Chains

    Hi All,
    We have 3 layers for loading data using DB Connect: the 1st and 2nd layers are DSOs and the 3rd layer is a cube. We use the following update modes for the 3 layers:
    1st layer- full
    2nd layer - delta
    3rd layer - delta
    Is there any possibility of getting duplicates in the cube, and if so, how can we solve it using process chains?
    regards,

    Hi,
    Your data flow looks OK.
    Assuming you are using a standard DSO, there is no chance of getting duplicate records.
    But just in case:
    if you are using a write-optimized DSO with the 'Do not check Uniqueness of records' checkbox enabled,
    then you can get duplicates, as a write-optimized DSO with this setting will simply keep appending records.
    Thanks
    Mayank  

  • How to load Matrix report data into basic table data using ODI

    Hi,
    How do I load matrix report data into basic table data using Oracle Data Integrator?
    Requirement Description:
    Following is the matrix report data:
    JOB            DEPT10      DEPT20
    ANALYST                    6000
    CLERK          1300        1900
    It needs to be converted into the format below:
    JOB            DEPT        SALARY
    ANALYST        DEPT10
    ANALYST        DEPT20      6000
    CLERK          DEPT10      1300
    CLERK          DEPT20      1900
        Thanks for your help in advance. Let me know if any further explanation is required.

    Your list seems a little restrictive; you can do a lot more with ODI procedures.
    Create a new procedure and add a step. In the 'Command on Source' tab, set the technology and schema for your source database. Use the UNPIVOT functionality as described in the link, but rather than using 'SELECT *', use the appropriate column names and alias them, e.g.:
    SELECT job as job,
    deptsal as deptsal,
    saldesc as saledesc
    FROM pivoted_data
    UNPIVOT (
    deptsal --<-- unpivot_clause
    FOR saldesc --<-- unpivot_for_clause
    IN (d10_sal, d20_sal, d30_sal, d40_sal) --<-- unpivot_in_clause
    )
    Then in your 'Command on Target' tab, set the technology and schema to your target DB and put your INSERT statement, e.g.:
    INSERT INTO job_sales
    (job,
    deptsal,
    saledesc)
    VALUES
    (:job,
    :deptsal,
    :saledesc)
    This way you are using bind variables from the source to load data into the target.
    Obviously, if the source and target tables are in the same database, you can have it all in one statement in the 'Command on Target' tab, as:
    INSERT INTO job_sales
    (job,
    deptsal,
    saledesc)
    SELECT job as job,
    deptsal as deptsal,
    saldesc as saledesc
    FROM pivoted_data
    UNPIVOT (
    deptsal --<-- unpivot_clause
    FOR saldesc --<-- unpivot_for_clause
    IN (d10_sal, d20_sal, d30_sal, d40_sal) --<-- unpivot_in_clause
    )
    Also set the log counter to 'Insert' on the tab where your INSERT statement is, so you know how many rows you inserted into the table.
    Hope this helps.
    But remember that UNPIVOT only became available in Oracle 11g.
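    For reference, here is a minimal self-contained sketch of the same UNPIVOT approach shaped to the matrix data in the question. The staging and target table names (pivoted_jobs, job_sales) and their column names are hypothetical, chosen only for illustration:
    -- Hypothetical staging table holding the matrix report data
    CREATE TABLE pivoted_jobs (job VARCHAR2(30), dept10_sal NUMBER, dept20_sal NUMBER);
    INSERT INTO pivoted_jobs VALUES ('ANALYST', NULL, 6000);
    INSERT INTO pivoted_jobs VALUES ('CLERK', 1300, 1900);
    -- Hypothetical target table in the requested JOB / DEPT / SALARY shape
    CREATE TABLE job_sales (job VARCHAR2(30), dept VARCHAR2(10), salary NUMBER);
    -- Unpivot the department columns into (dept, salary) rows and load them (Oracle 11g+);
    -- INCLUDE NULLS keeps the empty ANALYST/DEPT10 cell as a row with a NULL salary
    INSERT INTO job_sales (job, dept, salary)
    SELECT job, dept, salary
    FROM pivoted_jobs
    UNPIVOT INCLUDE NULLS (
    salary FOR dept IN (dept10_sal AS 'DEPT10', dept20_sal AS 'DEPT20')
    );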

  • Unable to load data to a Hyperion Planning application using ODI

    Hi All,
    When I try to load data into Planning using ODI, the ODI process completes successfully with the following status in the Operator ReportStatistics step (shown below), but the data doesn't appear in the Planning data form or in Essbase.
    Can anyone please help?
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
    File "<string>", line 2, in <module>
    Planning Writer Load Summary:
         Number of rows successfully processed: 20
         Number of rows rejected: 0
    The source is an Oracle database and the target is the Account dimension.
    LKM SQL to SQL and IKM SQL to Hyperion Planning are used.
    In the target, the following columns were mapped:
    Account(load dimension)
    Data load cube name
    driverdimensionmetadata
    Point of view
    LOG FILE
    2012-08-27 09:46:43,214 INFO [SimpleAsyncTaskExecutor-3]: Oracle Data Integrator Adapter for Hyperion Planning
    2012-08-27 09:46:43,214 INFO [SimpleAsyncTaskExecutor-3]: Connecting to planning application [OPAPP] on [mcg-b055]:[11333] using username [admin].
    2012-08-27 09:46:43,277 INFO [SimpleAsyncTaskExecutor-3]: Successfully connected to the planning application.
    2012-08-27 09:46:43,277 INFO [SimpleAsyncTaskExecutor-3]: The load options for the planning load are
         Dimension Name: Account Sort Parent Child : false
         Load Order By Input : false
         Refresh Database : false
    2012-08-27 09:46:43,339 INFO [SimpleAsyncTaskExecutor-3]: Begining the load process.
    2012-08-27 09:46:43,355 DEBUG [SimpleAsyncTaskExecutor-3]: Number of columns in the source result set does not match the number of planning target columns.
    2012-08-27 09:46:43,371 INFO [SimpleAsyncTaskExecutor-3]: Load type is [Load dimension member].
    2012-08-27 09:46:43,996 INFO [SimpleAsyncTaskExecutor-3]: Load process completed.

    Do any members exist in the Account dimension before the load? If not, can you try adding one member manually and then running the load again?
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Error regarding data load into Essbase cube for Measures using ODI

    Hi Experts,
    I am able to load metadata for dimensions into an Essbase cube using ODI, but when trying the same for loading Measures data I encounter the following errors:
    Time,Item,Location,Quantity,Price,Error_Reason
    '07_JAN_97','0011500100','0000001001~1000~12~00','20','12200','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
    '28_JAN_97','0011500100','0000001300~1000~12~00','30','667700','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
    '28_JAN_97','0011500100','0000001300~1000~12~00','500','667700','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
    Can anyone look into this and reply quickly, as it's an urgent requirement?
    Regards,
    Rohan

    We are having a similar problem. We're using the IKM SQL to Hyperion Essbase (DATA) knowledge module. We are mapping the actual data to the field called 'Data' in the model. But it kicks everything out saying 'Unknown Member [Data] in Data Load', as if it's trying to read that field as a dimension member. We can't see what we missed in building the interface. I would think the knowledge module would just know that the Data field is, um, data; not a dimension member. Has anyone else encountered this?
    Sabrina

  • Problem loading data to Hyperion Planning app. from Oracle EBS using ERPi

    I'm using ERPi to load data to Planning application from EBS.
    I have configured Source and Target, created import formats, location, data load rules etc.
    When I run the data load rule it shows that the data is imported and exported successfully. I can see the data in the right format in the Data Load Workbench, and ODI shows the process completed successfully. But still no data is loaded into Planning.
    Even the AIF_HS_BALANCES staging view in the ERPi repository shows the correct data.
    Can someone tell me what the problem could be,
    or where I can see logs for the Planning data load step?
    The ODI Operator logs are not very clear to me, and the same goes for the log tables in the ODI work repository.

    Hi Tony
    Apologies for the delay (my email spam-filtered the notification that you had replied, helpful!!). When you say that you have done it but discourage accessing the tables directly, how did you achieve it? Were you just careful about which data intersections you selected from the tables (e.g. always base-level data, so that it is stored rather than calculated)?
    I have used the HFM relational tables a couple of times to make use of the metadata they contain, but I haven't tried to get data out of them before.
    Regards
    Stuart

  • Error in loading data using SQL*Loader

    I am getting an error like 'SQL*Loader-350: syntax error, illegal combination of non-alphanumeric characters' while loading a file using SQL*Loader on RHEL. The command used to run SQL*Loader is:
    sqlldr userid=<username>/<password> control=data.ctl
    The control file, data.ctl, is:
    LOAD data
    infile '/home/oraprod/data.txt'
    append  into table test
    empid terminated by ',',
    fname terminated by ',',
    lname terminated by ',',
    salary terminated by whitespace
    The data.txt file is:
    1,Kaushal,halani,5000
    2,Chetan,halani,1000
    I hope my question is clear.
    Please reply to my query.
    Regards

    Replace "{" with "(" in your control file:
    LOAD data
    infile 'c:\data.txt'
    append into table emp_t
    (empid terminated by ',',
    fname terminated by ',',
    lname terminated by ',',
    salary terminated by whitespace)
    C:\>sqlldr user/pwd@database control=c.ctl
    SQL*Loader: Release 10.2.0.3.0 - Production on Wed Nov 13 10:10:24 2013
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Commit point reached - logical record count 1
    Commit point reached - logical record count 2
    SQL> select * from emp_t;
         EMPID FNAME                LNAME                    SALARY
             1 Kaushal              halani                     5000
             2 Chetan               halani                     1000
    Best regards
    Mohamed Houri

  • How to load data using two maps for the same source file in Import Manager

    Hi,
    I am trying to load the source data into MDM using two map files, since one map is too big (it causes a memory error).
    I can see the data from the first map (SAP data) after loading, but when loading the second map (non-SAP data), using the remote key to match the previously loaded SAP key, the data is not reflected in Data Manager even though MDM processes it.
    Is there any reason for this?
    I'd appreciate your suggestions and tips on this.
    Regards,
    Reo

    Hi Reo,
    As your requirement seems to be, in the second pass perform record matching by selecting the default import action 'Update NULL fields only' for the data records that match, and 'Create' for the data records that do not match previous data records from the SAP system.
    If you choose 'Update NULL fields only', then only the key of the new non-SAP record will be appended to the existing record. If you select 'Create', it will create new records and apply the non-SAP key to them.
    Please check the remote key values using Data Manager to see whether the key has been applied to the records.
    Remember that import maps are client-specific. A remote key comprises both the key and the client system, so be sure that you are connecting to two different client systems.
    I have tried this and it works fine on my system.
    Thanks,
    Shiv Prashant Dixit

  • Problem Loading Data Using the UTL_FILE Package

    Hi Friends..
    My database is Oracle 10gR2 and the OS is Windows.
    I have one Excel file which contains 10 fields.
    My requirement is to load data from this one Excel file into two tables:
    a master table and a detail table.
    Below are sample data and the structure:
    Excel file format
    TEST.CSV
    Srno  Empno  Empname  City       Challanno  Challandate  Materialno  Materialname  Materialqty  Materialcost
    1     232    raj      Hyderabad  533        20/04/2010   11          abc           34           10
    1     232    raj      Hyderabad  533        20/04/2010   12          aa            4            110
    1     231    ram      Baroda     533        20/04/2010   14          abcd          33           210
    Master table
    empno
    Challanno
    challandate
    Detail table
    empno
    materialno
    materialname
    materialqty
    materialcost
    My question: while reading the first line, if its empno is new, a record is inserted into the master table for the first time and the remaining fields for that empno go into the detail table; when the empno changes, a new entry is made in the master table and the associated records are entered in the detail table.
    So in this case, for empno 232 the master table would have:
    232,533,20/4/2010
    The detail table for empno 232 would now have 2 records:
    232,11,abc,34,10
    232,12,aa,4,110
    I am using the UTL_FILE package to achieve this, as the file is on the server.
    Kindly help me to proceed with this.
    Really appreciate your help...

    sai121 wrote:
    "It's OK if you don't want to reply, sir... but that's what I'm told to do, and I have 4 years of industry experience too... but I can't argue with the boss, you know that, right?"
    It's not that people don't want to reply; it's that what you're asking for is achieved very simply in a few lines of code using external tables, but is a convoluted and complex thing to do using UTL_FILE. So why would anybody want to waste their time giving you a load of code when they know it's the wrong approach anyway?
    I've been computer programming for 28 years (jeez, has it really been that long :D), so I wouldn't tell you that you're doing it the wrong way without knowing that there are better ways to do it. Speak with your boss and tell him that you've been advised to use external tables instead, because they're the right way to read such data and UTL_FILE is not the right approach.
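    For illustration, here is a minimal sketch of the external-table approach for the TEST.CSV layout described above, assuming the file really is comma-separated (as the .csv name suggests) and sits in a folder on the database server; the directory object name (data_dir), the external table name, and the existing master/detail table definitions are assumptions for the example:
    -- Directory object pointing at the server folder that holds TEST.CSV
    CREATE OR REPLACE DIRECTORY data_dir AS 'C:\data';
    -- External table exposing the CSV rows; no UTL_FILE parsing needed
    CREATE TABLE test_csv_ext (
      srno NUMBER, empno NUMBER, empname VARCHAR2(50), city VARCHAR2(50),
      challanno NUMBER, challandate VARCHAR2(10), materialno NUMBER,
      materialname VARCHAR2(50), materialqty NUMBER, materialcost NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        SKIP 1
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('TEST.CSV')
    );
    -- One row per employee/challan combination goes to the master table...
    INSERT INTO master (empno, challanno, challandate)
    SELECT DISTINCT empno, challanno, TO_DATE(challandate, 'DD/MM/YYYY')
    FROM test_csv_ext;
    -- ...and every line goes to the detail table
    INSERT INTO detail (empno, materialno, materialname, materialqty, materialcost)
    SELECT empno, materialno, materialname, materialqty, materialcost
    FROM test_csv_ext;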
