EPM 11.1.2.2: Load sample data

I have installed EPM 11.1.2.2 and am now trying to load the sample data, but I could use help locating instructions. Documentation for earlier EPM versions included steps for loading the sample data, but I have not found the equivalent for 11.1.2.2. Assuming the earlier versions' steps might still apply, I read them: initialize the application, then in EAS right-click a console icon, which offers an option to load data. Unfortunately, I am using EPMA and saw no option to initialize when I created the app. In addition, EAS does not show the console icon, and right-clicking the application does nothing. In short, the prior documentation has not helped so far. I did find the documentation for EPM 11.1.2.2, but I have yet to locate the instructions for loading the sample application and data within it. Does anyone have any suggestions?

I considered your comment about the app already existing, but if that were true, I would expect the database to exist in EAS, and it doesn't.
I've also deleted all applications, even reinstalled the Essbase components (including the database) so everything would be fresh, and then tried to create a new application, but still no initialization option is available. At this point I'm assuming something is wrong with my installation. Any suggestion on how to proceed would be appreciated.
I've noticed that I don't have the create classic application option in my menu, only the option to transform from classic to EPMA. I also can't navigate to http://<machinename>:8300/HyperionPlanning/AppWizard.jsp. When I noticed this, I did as John suggested in another post and reselected the web server. That didn't make a difference, so I redeployed the apps AND selected the web server. I'm still not seeing the option in the menu.
Edited by: dirkp:) on Apr 8, 2013 5:45 PM

Similar Messages

  • Can't Upload Sample Data to SAP HANA Sandbox-Can't Navigate to SQL Editor

    Hi HANA Freaks,
    I want to load sample data E2EModelingTutorialData into SAP HANA Sandbox but the following steps are required.
    To load the data to your SAP In-Memory Database do the following:
    - Copy the files to your database server.
    - Copy all SQL statements in the file SQLStatements.txt in an SQL view of your In-Memory Computing Studio
    - Modify the load statement according to your file location, e.g.
      LOAD FROM '<TO BE CHANGED>\EFASHION_SHOP_FACTS.CTL';
      could become
      LOAD FROM '/tmp/EFASHION_SHOP_FACTS.CTL';
    - Run the SQL statements which takes some minutes
    - Verify that all the files *bad.txt are of size 0 which means that the import of all records was successful.
    At step 2 I can't navigate to the editor where the SQL statements should be pasted for execution.
    Can someone please give step-by-step guidance on how to achieve this?
    Many Thanks !

    Hi,
    For the step2,
    1. Right-click in the navigator panel,
    2. Click SQL Editor, where you can write or paste the SQL code,
    3. Click the Execute link.
    Hope it is useful.
    Thanks
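    The path substitution in the readme's step 3 can be scripted rather than edited by hand. A minimal Python sketch - the "<TO BE CHANGED>" placeholder follows the example above, and the /tmp target directory is just an illustration:

```python
# Minimal sketch: point the LOAD statements at the real server-side location
# before running them. The "<TO BE CHANGED>\" placeholder follows the readme's
# example; the target directory is an assumption for illustration.
def localize_load_statements(sql_text: str, data_dir: str) -> str:
    """Replace the Windows-style placeholder with a concrete directory."""
    return sql_text.replace("<TO BE CHANGED>\\", data_dir.rstrip("/") + "/")

stmt = "LOAD FROM '<TO BE CHANGED>\\EFASHION_SHOP_FACTS.CTL';"
print(localize_load_statements(stmt, "/tmp"))
# LOAD FROM '/tmp/EFASHION_SHOP_FACTS.CTL';
```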

  • Loading BI Apps with demo (sample) data

    I would like to load the out-of-the-box OBI Apps with sample data from Oracle so that clients can see what Employee Expense reports would look like. Does anyone know how to go about it? In simple terms, this is like using the BISAMPLE schema in OBIEE. Any help will be greatly appreciated.

    Are you asking for sample data for the Fusion Applications similar to what EBS had with the "Vision" database, e.g. data populated in the Fusion Applications schema OOTB ? If so unfortunately such data is not available.
    Jani Rautiainen
    Fusion Applications Developer Relations
    https://blogs.oracle.com/fadevrel/

  • Error loading the data for the sample application

    Dear Experts, I am getting the following error while loading the data for the sample application in the Essbase Administration Services console from SampleApp_data.txt:
    Parallel dataload enabled: [1] block prepare threads, [1] block write threads.
    Unknown Member [E01_0] in Data Load, [1] Records Completed
    Unexpected Essbase error 1003014
    Can you please help me in this regard?
    Thanks and regards,
    Kishore

    Hi,
    The error message is saying you don't have a member named "E01_0" in your outline; you can check this by opening the outline in EAS and trying to find the member.
    Once you have checked that, go into the Planning web interface > Administration > Dimensions, select Entities and search for "E01_0". If it exists, then you have not refreshed from Planning to Essbase; go to Manage Database and refresh.
    Cheers
    John
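    John's outline check can also be done mechanically by diffing the members referenced in the load file against an export of the outline. A hedged Python sketch - the one-member-per-leading-field layout is a simplifying assumption, since real Essbase load files depend on the load rule:

```python
# Sketch: list members referenced by a load file that are missing from the
# outline. Assumes the member name is the first field of each record, which
# in practice depends on the load rule.
def unknown_members(data_rows, outline_members):
    """Return load-file members absent from the outline, sorted."""
    known = set(outline_members)
    return sorted({row[0] for row in data_rows} - known)

rows = [("E01_0", 100), ("E01", 250)]
print(unknown_members(rows, ["E01", "E02"]))   # ['E01_0']
```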

  • Unable to load sample planning data through EAS

    Hi
    I am using Planning version 9.3.1. I installed Planning and EAS, and I have loaded the sample data many times before. I recently installed Provider Services, and now I get an error when loading data through EAS: I am unable to select the error file, although I can select the load file. Even the online help does not open. Before installing APS I was able to load. Has anyone faced this issue?
    Thanks


  • BP 258 Sample Data Load

    Hello,
    I've configured the prerequisites for the Best Practices for BPC MS 7.5 (V1.31) and am on BP 258, trying to upload the sample data provided (258_Y2010_Sales_Actual.txt). When I run the DM package for IMPORT, it errors out with:
    Task Name : Convert Data
    Dimension  list          CATEGORY,P_CC,RPTCURRENCY,TIME,CUSTOMER,PRODUCT,SALESACCOUNT,AMOUNT
    [2010.004] Calculated member or invalid member          ACTUAL,SALESTEAM1,USD,2010.004,CUSTOMERA,PRODUCTA,SalesUnitPrice,-15000
    [2010.004] Calculated member or invalid member          ACTUAL,SALESTEAM1,USD,2010.004,CUSTOMERA,PRODUCTA,SalesQuantity,200
    [2010.004] Calculated member or invalid member          ACTUAL,SALESTEAM1,USD,2010.004,CUSTOMERA,PRODUCTA,TotalSalesRevenue,-3000000
    All I can figure from this limited info is that it does not like the conversion of the TIME dimension. I checked my dimension members, and they cover the values shown in the file (2010.004, for example). The BP doc says nothing about maintaining the conversion file or a custom transformation file - it says to leave that empty to default to IMPORT.xls.
    I just finished the BPC 410 course, but can't correlate anything we did to this error (or I missed it).
    Can anyone shed some light on this error? 
    thanks

    Thank you for responding. I ran through all the checks and everything looks OK to me.
    1 - Optimize all green no errors
    2 - Yes, I already checked this, but I am not 100% sure I am seeing what the system is seeing. My dimension members include the TIME items that exist in the sample data. Only one value did not exist: my members start at 2010.004 and the sample data included 2010.001. I removed those records, but it still fails.
    Here is the reject list:
    Task Name : Convert Data
    [Dimension:  TIME]
    2010.007
    2011.002
    2010.005
    2010.008
    2011.001
    2010.006
    2011.003
    2010.011
    2010.004
    2010.001
    2010.012
    2010.009
    This tells me (an inexperienced BPC person) that there is a problem converting the external data to internal data for that dimension. However, I have already validated that they are equal - I can't see how 2010.004 would not equate to 2010.004, unless it is comparing against the EVDESC rather than the ID (2010.004 versus 2010 APR). Am I correct in that assumption?
    3 - Yes, I've also tried switching to a new transformation file with the conversion and delimiters mapped differently, but with the same result. I'm sure I'm missing something trivial here, so I appreciate any help figuring it out.
    Thanks again
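    One way to test the EVDESC-versus-ID hypothesis is to classify each rejected value against both lists. A small Python sketch with illustrative member lists (not the actual BP 258 dimension):

```python
# Sketch: check whether rejected TIME values match dimension member IDs or
# only their descriptions (EVDESC). The member lists passed in below are
# illustrative assumptions, not the real dimension content.
def classify_rejects(rejects, ids, descriptions):
    """For each rejected value, say whether it matches an ID, an EVDESC, or nothing."""
    desc_to_id = dict(zip(descriptions, ids))
    report = {}
    for value in rejects:
        if value in ids:
            report[value] = "matches an ID (conversion file problem?)"
        elif value in desc_to_id:
            report[value] = "matches only EVDESC of " + desc_to_id[value]
        else:
            report[value] = "not in dimension"
    return report

print(classify_rejects(["2010.004", "2010 APR"], ["2010.004"], ["2010 APR"]))
```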

  • FDQM Sample data load file for HFM

    Hi all,
    I have just started working on FDQM 11.1.1.3 and have integrated an HFM application with FDQM. I need to load data to the HFM application, but I don't have a GL file or CSV file with HFM dimensions. Can anyone help me get such a file?
    One more thing:
    I just want to know the basic steps I need to perform to load data to the HFM application after creating the FDM application and integrating it with the HFM application.
    Thanks.

    Hi,
    Creating the FDM application and integrating it with the HFM application also includes setting the Target Application name in FDM.
    The FDM application is then ready, with its target set to the HFM application.
    1. You can create an import format like the one below, using only the Custom1 and Custom2 dimensions of the 4 available Custom dimensions (you can modify it according to your dimensions):
    Use '|' (pipe) as the delimiter:
    Account|Account Description|Entity|C1|C2|Amount
    2. You can create a text file with data like mentioned below by making a combination of members from each dimension specified in import format:
    11001|Capex|111|000|000|500000
    11002|b-Capex|111|000|000|600000
    Note: these dimension members should be mapped in the 'Mappings' section; use the members specified in the file as source and select any target member of HFM for them
    3. Then tag this import format with any location
    4. Import the text file using 'Browse' option
    5. Validate the data loaded
    Note: If mapping is done for all dimension members used in file, then validation will be successful
    6. Export the data
    Now you can export and hence load the data to HFM application using Replace/Merge option.
    Here you are with the basic steps to load the data from FDM to HFM.
    Regards,
    J
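    The pipe-delimited file described in steps 1-2 is easy to generate or sanity-check with a script. A Python sketch assuming the exact field order shown above (the field names are shorthand for the import-format columns):

```python
# Sketch: parse the pipe-delimited FDM load file from steps 1-2. The field
# order follows the import format above; "AccountDescription" is shorthand.
FIELDS = ["Account", "AccountDescription", "Entity", "C1", "C2", "Amount"]

def parse_line(line: str) -> dict:
    """Split one record on '|' and type the Amount field."""
    record = dict(zip(FIELDS, line.strip().split("|")))
    record["Amount"] = float(record["Amount"])
    return record

rec = parse_line("11001|Capex|111|000|000|500000")
print(rec["Account"], rec["Amount"])   # 11001 500000.0
```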

  • Error while loading Master Data - BPC 10.0 NW

    Hello gurus,
    I am trying to load master data from a BW InfoObject to a dimension in BPC, but am getting the error as shown below:
    The transformation and conversion files are configured as shown below:
    I would greatly appreciate if you can help me troubleshoot the issue with the java script command.  Thanks in advance for your help.
    Best regards,
    Vijay

    Hello Vijay,
    Which patch of the EPM add-in are you using?
    It seems that js:%external%.replace("#","_") does not work with some patches,
    so in that case you have to use js:%external%.toString().replace("#","_") in your conversion file.
    Deepesh

  • Best method to load XML data into Oracle

    Hi,
    I have to load XML data into Oracle tables. I tried using different options and have run into a dead end in each of those. I do not have knowledge of java and hence have restricted myself to PL/SQL solutions. I tried the following options.
    1. Using the DBMS_XMLSave package: it expects the ROWSET and ROW tags, and I cannot change the format of the incoming XML file (gives the error oracle.xml.sql.OracleXMLSQLException: Start of root element expected).
    2. Using the XMLPARSER and XMLDOM PL/SQL APIs : Works fine for small files. Run into memory problems for large files (Gives error java.lang.OutOfMemoryError). Have tried increasing the JAVA_POOL_SIZE but does not work. I am not sure whether I am changing the correct parameter.
    I have read that the SAX API does not hog memory resources since it does not build the entire DOM tree structure. But the problem is that it does not have a PL/SQL implementation.
    Can anyone PLEASE guide me in the right direction as to the best way to achieve this through PL/SQL? I have not designed the tables yet, so I am flexible on using a purely relational or object-relational design, although I would prefer to keep a purely relational design. (I had tried object-relational for 1. and purely relational for 2. above.)
    The XML files are in the following format, (EXAMINEEs with single DEMOGRAPHIC and multiple TESTs)
    <?xml version="1.0"?>
    <Root_Element>
    <Examinee>
    <MACode>A</MACode>
    <TestingJID>TN</TestingJID>
    <ExamineeID>100001</ExamineeID>
    <CreateDate>20020221</CreateDate>
    <Demographic>
    <InfoDate>20020221</InfoDate>
    <FirstTime>1</FirstTime>
    <LastName>JANE</LastName>
    <FirstName>DOE</FirstName>
    <MiddleInitial>C</MiddleInitial>
    <LithoNumber>73</LithoNumber>
    <StreetAddress>SomeAddress</StreetAddress>
    <City>SomeCity</City>
    <StateCode>TN</StateCode>
    <ZipCode>37000</ZipCode>
    <PassStatus>1</PassStatus>
    </Demographic>
    <Test>
    <TestDate>20020221</TestDate>
    <TestNbr>1</TestNbr>
    <SrlNbr>13773784</SrlNbr>
    </Test>
    <Test>
    <TestDate>20020221</TestDate>
    <TestNbr>2</TestNbr>
    <SrlNbr>13773784</SrlNbr>
    </Test>
    </Examinee>
    </Root_Element>
    Thanks for the help.

    Please refer to the XSU (XML SQL Utility) or the TransX utility (for multi-language documents) if you want to load XML-format data into the database.
    Both of them require specific XML formats; please first refer to the following docs:
    http://otn.oracle.com/docs/tech/xml/xdk_java/doc_library/Production9i/doc/java/xsu/xsu_userguide.html
    http://otn.oracle.com/docs/tech/xml/xdk_java/doc_library/Production9i/doc/java/transx/readme.html
    You can use XSLT to transform your document into the required format.
    If your document is large, you can use the SAX method to insert the data into the database, but you need to write the code.
    The following sample may be useful:
    http://otn.oracle.com/tech/xml/xdk_sample/xdksample_040602i.html
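    To make the streaming (SAX-style) approach concrete, here is a Python sketch using iterparse on a trimmed version of the Examinee document shown in the question. It processes the file incrementally instead of building a full DOM; the table/column mapping is omitted since it is site-specific:

```python
# Sketch of the streaming approach the reply recommends: parse incrementally
# (SAX-like, bounded memory) instead of building the whole DOM tree.
import xml.etree.ElementTree as ET
from io import StringIO

XML = """<?xml version="1.0"?>
<Root_Element>
  <Examinee>
    <ExamineeID>100001</ExamineeID>
    <Test><TestNbr>1</TestNbr></Test>
    <Test><TestNbr>2</TestNbr></Test>
  </Examinee>
</Root_Element>"""

def stream_tests(source):
    """Yield (examinee_id, test_nbr) rows without keeping the whole tree."""
    examinee_id = None
    for event, elem in ET.iterparse(source, events=("end",)):
        if elem.tag == "ExamineeID":
            examinee_id = elem.text
        elif elem.tag == "Test":
            yield examinee_id, elem.find("TestNbr").text
            elem.clear()   # free the processed element as we go

print(list(stream_tests(StringIO(XML))))
# [('100001', '1'), ('100001', '2')]
```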

  • BPC:NW - Best practices to load Transaction data from ECC to BW

    I have a very basic question about loading GL transaction data into BPC for a variety of purposes. It would be great if you could point me towards best practices/standard ways of building such interfaces.
    1. For Planning
    When we are doing the planning for cost center expenses and need to make variance reports against the budgets, what would be the source Infocube/DSO for loading the data from ECC via BW, if the application is -
    YTD entry mode:
    Periodic entry mode:
    What difference does it make to use the 0FI_GL_12 DataSource versus the 0FIGL_C10 cube or the 0FIGL_O14 or 0FIGL_D40 DSOs?
    Based on the data entry mode of the planning application, what is the best way to make use of the 0BALANCE or debit/credit key figures on the BI side?
    2. For consolidation:
    Since we need to have trading partner, what are the best practices for loading the actual data from ECC.
    What are the typical mappings to be maintained for movement type with flow dimensions.
    I have seen multiple threads with different responses but I am looking for the best practices and what scenarios you are using to load such transactions from OLTP system. I will really appreciate if you can provide some functional insight in such scenarios.
    Thanks in advance.
    -SM

    For - planning , please take a look at SAP Extended Financial Planning rapid-deployment solution:  G/L Financial Planning module.   This RDS captures best practice integration from BPC 10 NW to SAP G/L.  This RDS (including content and documentation) is free to licensed customers of SAP BPC.   This RDS leverages the 0FIGL_C10 cube mentioned above.
      https://service.sap.com/public/rds-epm-planning
    For consolidation, please take a look at SAP Financial Close & Disclosure Management rapid-deployment solution.   This RDS captures best practice integration from BPC 10 NW to SAP G/L.  This RDS (including content and documentation) is also free to licensed customers of SAP BPC.
    https://service.sap.com/public/rds-epm-fcdm
    Note:  You will require an SAP ServiceMarketplace ID (S-ID) to download the underlying SAP RDS content and documentation.
    The documentation of RDS will discuss the how/why of best practice integration.  You can also contact me direct at [email protected] for consultation.
    We are also in the process of rolling out the updated 2015 free training on these two RDS.  Please register at this link and you will be sent an invite.
    https://www.surveymonkey.com/s/878J92K
    If the link is inactive at some point after this post, please contact [email protected]

  • Loading transaction data from flat file to SNP order series objects

    Hi,
    I am a BW developer and I need to provide data to my SNP team.
    Can you please tell me more about loading transaction data (sales orders, purchase orders, etc.) from external systems into order-based SNP objects/structures? There is a 3rd-party tool called WebConnect that gets data from external systems and can deliver it as flat files or database tables in whatever format we require.
    I know we can use BAPIs, but I don't know how. Can you please send sample ABAP program code that calls a BAPI to read a flat file and write to SNP order series objects?
    Please let me know how to get data from a flat file into SNP order-based objects, and with what options; I will be very grateful.
    thanks in advance
    Rahul

    Hi,
    Please go through the following links:
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4180456
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4040057
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=3832922
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4067999
    Hope this helps...
    Regards,
    Habeeb
    Assign points if helpful..:)

  • Problem in loading transactional data from to 0MKT_DSO1(ods) to 0MKTG_C01

    Hi,
    I am trying to load lead transaction data into the standard CRM lead management cube from the ODS. There is a problem while loading transaction data from 0MKT_DSO1 (ODS) to the InfoCube 0MKTG_C01, because the field 0STATECSYS2 (CRM status) is set to 10 in the ODS, meaning an incorrect transaction. This field is not in the InfoCube.
    There is a routine in the cube that deletes data records with 0STATECSYS2 set to 10.
    This field is not coming through in the transaction.
    So where can I see the master data in the CRM source system, and why is that field getting set to 10?
    thanks in advance!

    Thanks for the reply..
    I have checked the Fact table which shows
    1. packet Dimension
    2. Time dimension
    3. Unit dimension.
    I have kept the 0CALDAY as the time characteristics.
    Sample data i have loaded from ODS to Cube.
    Sample data in ODS.
    Sales order No_____0CALDAY_____AMOUNT
    800001___________12/02/2009____15
    I have loaded this data in Cube with Full Upload.
    Data in Cube.
    Sales order No_____0CALDAY_____AMOUNT
    800001___________12/02/2009____15
    Again i am loading the same data to cube
    Data in cube after loading.
    Sales order No_____0CALDAY_____AMOUNT
    800001___________12/02/2009____15
    800001___________12/02/2009____15
    The data is duplicated rather than cumulated.
    Am I missing anything here?
    Please help.
    Thanks,
    Siva.
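    For what it's worth, what Siva describes is the expected additive behaviour of an InfoCube: a repeated full upload adds the same records again rather than overwriting them, and at query time the two fact rows aggregate to double the amount (an ODS with overwrite, or deleting the overlapping request in the InfoPackage, avoids this). In purely illustrative Python terms:

```python
# Illustrative only: an additive cube accumulates repeated loads instead of
# overwriting them, so the same full request loaded twice doubles the amount
# seen at query time.
from collections import defaultdict

def load(cube, rows):
    """Additive load: amounts for the same key combination accumulate."""
    for key, amount in rows:
        cube[key] += amount

cube = defaultdict(float)
request = [(("800001", "12/02/2009"), 15.0)]
load(cube, request)
load(cube, request)          # same full upload repeated
print(cube[("800001", "12/02/2009")])   # 30.0
```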

  • Navteq sample data

    hi,
    I have problems during the NAVTEQ sample data import (http://www.oracle.com/technology/software/products/mapviewer/htdocs/navsoft.html). I'm following the instructions from the readme file exactly.
    Import: Release 10.2.0.1.0 - Production on Thursday, 25 September, 2008 10:35:28
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01": system/********@ORCL REMAP_SCHEMA=world_sample:world_sample dumpfile=world_sample.dmp directory=sample_world_data_dir
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    . . imported "WORLD_SAMPLE"."M_ADMIN_AREA4" 394.5 MB 118201 r
    After the last log message the system hangs at 50% CPU and nothing further happens. What could be the problem? Only the table M_ADMIN_AREA4 contains the 118201 rows.
    Regards, Arnd
    Edited by: aspie on Sep 25, 2008 11:45 AM

    Hi, I followed the instructions from the readme file:
    1. Create a work directory on your machine. Change directory to your work
    directory and download the world_sample.zip file into it. Unzip the
    world_sample.zip file:
    C:\> mkdir world_sample
    C:\> cd world_sample
    C:\world_sample> unzip world_sample.zip
    2. In your work directory, open a Sqlplus session and connect as the SYSTEM
    user. Create a user for the World Sample data (if you have not previously
    done so):
    C:\world_sample> sqlplus SYSTEM/password_for_system
    SQL> CREATE USER world_sample IDENTIFIED BY world_sample;
    SQL> GRANT CONNECT, RESOURCE TO world_sample;
    3. This step loads the World Sample data into the specified user, and creates
    the metadata required to view the sample data as a Map in MapViewer. Before
    you begin, make a note of the full directory path of your work directory,
    since you will be prompted for it. You will also be prompted for the user
    (created in step 2), and the SYSTEM-user password. This step assumes that
    you are already connected as the SYSTEM user, and that you are in your work
    directory. To begin, run the load_sample_data.sql script in your Sqlplus
    session. Exit the Sqlplus session after the script has successfully
    concluded:
    SQL> @load_sample_data.sql
              Enter value for directory_name: C:\world_sample
              Enter value for user_name: world_sample
              Password: password_for_system
    SQL> exit;
    In step 2 I also tried creating the user with a dedicated tablespace - the same problem.
    Regards Arnd.

  • Stage tab delimited CSV file and load the data into a different table

    Hi,
    I'm pretty new to writing PL/SQL packages.
    We are using Application Express for our development. We receive CSV files, which are stored as BLOB content in a table. I need to write a trigger that executes once the user uploads the file, parses through the BLOB content, and loads or stages the data into a different table.
    I would like to see a tutorial or article that explains the above process with example or sample code. Any help in this regard will be highly appreciated.

    Hi,
    This is slightly unusual but at the same time easy to solve. You can read through a blob using the dbms_lob package, which is one of the Oracle supplied packages. This is presumably the bit you are missing, as once you know how you read a lob the rest is programming 101.
    Alternatively, you could write the lob out to a file on the server using another built in package called utl_file. This file can be parsed using an appropriately defined external table. External tables are the easiest way of reading data from flat files, including csv.
    I say unusual because why are you loading a csv file into a blob? A clob is almost understandable but if you can load into a column in a table why not skip this bit and just load the data as it comes in straight into the right table?
    All of what I have described is documented functionality, assuming you are on 9i or greater. But you didn't provide a version so I can't provide a link to the documentation ;)
    HTH
    Chris
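    Chris's parse-and-stage flow can be prototyped outside the database. A Python sketch of the decode-and-parse step for the tab-delimited content (in APEX this would live in PL/SQL using dbms_lob, as described above; the column names are illustrative):

```python
# Sketch: decode BLOB bytes and split the tab-delimited rows into dicts
# ready for staging. Column names ("id", "name") are illustrative.
import csv
from io import StringIO

def stage_rows(blob: bytes, encoding: str = "utf-8"):
    """Turn tab-delimited BLOB content into dict rows keyed by the header line."""
    text = blob.decode(encoding)
    reader = csv.DictReader(StringIO(text), delimiter="\t")
    return list(reader)

blob = b"id\tname\n1\tAlice\n2\tBob\n"
rows = stage_rows(blob)
print(rows[0])   # {'id': '1', 'name': 'Alice'}
```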

  • Sample Data ODM

    Where can I get the sample data for the ODM schema -
    the *.ctl and *.dat files for SQL*Loader?
    The directory $ORACLE_HOME/dm/demo/sample is not present in my
    installation of 10g on Linux.
    Data mining makes no sense without data...

    Thanks for your help, Marko.
    I found the data but still have no idea what data mining is... I think I have to read some books.
