Data load between disparate outlines

Hello,

We have a Hyperion Planning DB with Accounts, Time and 5 custom dimensions. We also have 9 attribute dimensions: 7 of them pertaining to Entities, 2 pertaining to the Employee dimension. If we wish to create a new ASO DB with 16 dimensions including the attribute dimensions:

1. Will we be able to load the data from the source BSO DB data export file to the target ASO 16-dimension DB? If yes, how?
2. What are the considerations to make the source records match up with the intersections in the new DB?

Just for information, we have HAL in our environment for data extraction, transformation and loading. We operate on a SQL Server DB.

Regards,
Vijay

Hi Claudia,

Thanks for your input. Here is how the hierarchies look:

Source Database (BSO) - 7 dimensions:
Accounts, Time, Scenarios, Versions, FiscalYears, Employees, Entities

Target Database (ASO) - 16 dimensions:
Accounts, Time, Scenarios, Versions, FiscalYears, Employees, Entities, Custom Dim 1, ..., Custom Dim 9

Level zero for the members will be exactly the same only for the common set of dimensions between the two hierarchies.

Can we manipulate the exported data to accommodate Custom Dimensions 1 to 9 before loading into the destination? Is this technically feasible? Any thoughts?

Regards,
Vijay
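
To Vijay's follow-up question: one common approach is to post-process the BSO export file so that every record also carries a level-0 member for each of the nine dimensions that exist only in the ASO outline, and then map those extra columns in the ASO load rule. Below is a minimal sketch of that padding step, assuming a tab-delimited, column-format export (for example from a report script or DATAEXPORT) with the data value in the last field, and assuming placeholder members named No_CustomDim1 to No_CustomDim9 exist at level 0 of the new dimensions; the file and member names are illustrative, not taken from this thread.

    import csv

    SOURCE = "bso_export.txt"        # hypothetical export file name
    TARGET = "aso_load_ready.txt"    # file the ASO load rule will read
    # hypothetical level-0 placeholder members for Custom Dim 1..9
    PLACEHOLDERS = [f"No_CustomDim{i}" for i in range(1, 10)]

    with open(SOURCE, newline="") as src, open(TARGET, "w", newline="") as tgt:
        reader = csv.reader(src, delimiter="\t")
        writer = csv.writer(tgt, delimiter="\t")
        for row in reader:
            if not row:
                continue
            *members, value = row            # last field is the data value
            # append one placeholder member per new ASO dimension
            writer.writerow(members + PLACEHOLDERS + [value])

The ASO load rule then maps the nine appended columns to Custom Dim 1 through Custom Dim 9, so every source record lands on a valid 16-dimension intersection.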

Similar Messages

  • Issue with Data Loading between 2 Cubes

    Hi All
    I have a Cube A which holds a huge amount of data, around 7 years' worth. This cube is on BWA. In order to free up space in this cube we have created a new Cube B.
    We have now started loading data from Cube A to Cube B based on the 'created on' date, but we are facing a lot of memory issues, and so far we have not even been able to load a week of data. At present we are loading one date at a time, which is not practical, as it will take a lot of time to load 4 years of data.
    Can you propose some alternative way to make this data transfer between the 2 cubes faster? I thought of loading Cube B from the DSO under Cube A, but that's not possible as the DSO does not hold data that old.
    Please share your thoughts.
    Thanks
    Prateek

    Hi SUV / All,
    I have tried running with parallel processes, as there are 4 available on my system. There are no routines between the cubes. There is already a MultiProvider on this cube. I just want to shift 4 years of data from this cube into another.
    1) Data packet size 10,000 - 8 out of some 488 packets failed
    2) Data packet size 20,000 - 4 out of some 244 packets failed
    3) Data packet size 50,000 - waited for some 50 minutes with no extraction, so I killed the job.
    Error : Dump: Internal session terminated with runtime error DBIF_RSQL_SQL_ERROR (see ST22)
    In ST22:
    Database error text........: "ORA-00060: deadlock detected while waiting for
      resource"
    Can you help resolve this issue or give some pointers?
    Thanks
    Prateek

  • Which LKM and IKM to use for Fast data loading b/w MSSQL 2005 and Oracle 11

    Hi,
    Can anybody help us decide which LKMs and IKMs are best for data loading between MSSQL and Oracle?
    The staging area is Oracle. We have to load around 400 million rows from MSSQL to Oracle 11g.
    Best regards,
    Muhammad

    Thanks Ayush,
    You are right, and it dumped the file very quickly, but it gives an error on the sqlldr call through jython. I have raised an SR with Oracle to look into it further.
    Thanks again, and have a very nice time.
    Regards,
    Muhammad
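
    One way to narrow down where the failure comes from, sketched here outside ODI/jython (credentials and file names below are placeholders, not taken from the thread): run the same SQL*Loader call from plain Python and inspect its output, which shows whether sqlldr itself fails or only the way the KM invokes it.

        import subprocess

        # placeholder credentials and file names - substitute the ones the KM generates
        result = subprocess.run(
            ["sqlldr",
             "userid=stage_user/stage_pw@ORCL11",
             "control=load_mssql_extract.ctl",
             "log=load_mssql_extract.log"],
            capture_output=True, text=True,
        )
        print(result.returncode)
        print(result.stdout)
        print(result.stderr)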

  • Data load taking very long time between cube to cube

    Hi
    In our system the data load between cubes using a DTP in BI 7.0 is taking a very long time;
    most of the time is spent in the 'start of extraction' step. Can anybody help in reducing the 'start of extraction' time, please?
    Thanks
    Kiran

    Kindly elaborate your issue a little: how is the mapping between the two cubes? Is it a one-to-one mapping, or is there any routine in the transformation? Is there any filter or routine in the DTP? Also, before loading data to the cube, did you delete the indexes?
    Regards,
    Sushant

  • Relationship between ERPi metadata rules and Data load rules

    I am using version 11.1.1.3 to load EBS data to ERPi and then to FDM. I have created a Metadata rule where I have assigned a ledger from the Source Accounting Entities. I have also created a Data Load rule that references the same ledger as the metadata rule. I have about 50 ledgers that I have to integrate so I have added the source adapters that reference the Data Load rule.
    My question is: what is the relationship between the metadata rule and the Data Load rule with respect to the ledger? If you can specify only one ledger per metadata rule, then how does FDM know to use another metadata rule with another ledger attached to it? Thanks!

    Aksik
    1. How frequently does this activation problem occur? If it is a one-time issue, replicate the DataSource and activate the transfer structure (although in general, as you know, activation of the transfer structure should happen automatically after transport of the object).
    2. One factor behind the difference in runtime is environmental: as you know, in the production system many jobs run at the same time, so system performance will obviously be slower compared to the Dev system. In your case both systems are actually performing equally: you said the dev system took half an hour for 50,000 records and production took 2 hours for 200,000 records, so there are simply more records in the production system and it took longer. If it is really causing a problem then you will have to do some performance tuning.
    Hope this helps
    Thanks
    Sat

  • Outline load utility Data load error in Planning

    Hi,
    We are loading the data using the OutlineLoad utility in Hyperion Planning 11.1.2.1, but the data is not loading and the log file shows the error below.
    Unable to obtain dimension information and/or perform a data load: com.hyperion.planning.olap.NoAvailableOlapConnectionsException: Unable to obtain a connection to Hyperion Essbase. If this problem persists, please contact your administrator.
    Can you please tell us why we are getting this error?
    Thanks.

    Hi John,
    I tried refreshing the DB and running the utility again; it still shows the same error,
    and the log file shows the following.
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///2560/Info(1051164)
    Received login request from [::ffff:IP]
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///2560/Info(1051187)
    Logging in user [admin@Native Directory] from [::ffff:IP]
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///2788/Info(1051001)
    Received client request: List Applications (from user [admin@Native Directory])
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///11872/Info(1051001)
    Received client request: List Databases (from user [admin@Native Directory])
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///11232/Info(1051164)
    Received login request from [::ffff:IP]
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///11232/Info(1051187)
    Logging in user [admin@Native Directory] from [::ffff:IP]
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///2824/Info(1051001)
    Received client request: Get Application Info (from user [admin@Native Directory])
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///2560/Info(1051001)
    Received client request: Get Application State (from user [admin@Native Directory])
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///2788/Info(1051001)
    Received client request: Select Application/Database (from user [admin@Native Directory])
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///2788/Info(1051009)
    Setting application BUD_APP active for user [admin@Native Directory]
    [Sat Oct 15 16:52:23 2011]Local/ESSBASE0///11872/Info(1051001)
    Received client request: Logout (from user [admin@Native Directory])
    Thanks

  • EIS data load via rules file

    Hi,
    Please let me know the process to control the EIS data load into Essbase using a rules file. I did not find the option in EIS. Please help me.
    Thanks,
    pr

    In EIS
    1) You have to define the logical OLAP model connecting to the relational source. It defines the joins between the fact table and the dimension tables.
    2) Based on the OLAP model, you have to create a metaoutline, which defines the rules for loading members and data into Essbase.

  • EIS to Essbase data load process

    I have two user-defined queries in a metaoutline. When I perform a data load from EIS, the Essbase application log says that the data load updated [] cells and Data Load Elapsed Time [] seconds, but the EIS data load process is still running. It seems that Essbase loads the data only for the first query and completes the data load while EIS is still running the second query. Has anyone else experienced this problem? Essbase and EIS are both 6.5.0.

  • Data Load error Please respond ASAP

    Hi ,
    When I try to load data for year 07 using rule files, it shows an error and aborts the data load.
    Error Msg " Data Value [2007] Encountered before all dimensions Selected,[5] Records Completed"
    Any help greatly appreciated.
    Thanks

    The key here is that 5 records were loaded before failing on the 6th, indicating it's not a header issue in the load rule.
    Check the first 5 records and see how the column containing 2007 differs between them and the failing 6th record. Perhaps the 6th record has a null (as indicated earlier), or perhaps 2007 is not the name of a member in the outline.
    I have had both issues give similar error results.
    J
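
    If it helps, here is a rough way to compare the fifth (last good) and sixth (failing) records field by field; it assumes a tab-delimited load file with a hypothetical name, so adjust the delimiter and file name to match the actual file. A missing field or a value that is not a member name should stand out immediately.

        from itertools import zip_longest

        with open("year07_data.txt") as f:          # hypothetical file name
            rows = [line.rstrip("\n").split("\t") for line in f]

        good, bad = rows[4], rows[5]                # 5th record loaded, 6th failed
        for i, (g, b) in enumerate(zip_longest(good, bad, fillvalue="<missing>")):
            marker = "  <-- differs" if g != b else ""
            print(f"field {i}: {g!r} vs {b!r}{marker}")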

  • BI 7.0 data load issue: InfoPackage can only load data to PSA?

    BI 7.0 backend extraction gurus,
    We created a generic DataSource on R/3 and replicated it to our BI system, then created an InfoSource, the transformation from the DataSource to the InfoSource, an ODS, and the transformation from the InfoSource to the ODS.
    After the transformation between the InfoSource and the ODS is created on the BI system, a new folder called "Data Transfer Process" also appears under this ODS in the InfoProvider view. In the Data Transfer Process, on the Extraction tab we pick 'Full' in the Extraction Mode field; on the Execute tab there is an 'Execute' button. Clicking this button (note: so far we have not created an InfoPackage yet) sounds like it should carry out the data load, but we find there is no data available even though all the statuses show green (we do have a couple of records in the R/3 table).
    Then we tried to create an InfoPackage. On the Processing tab, we find the 'Only PSA' radio button is checked and all the others, like 'PSA and then into Data Targets (Package by Package)', are dimmed! On the Data Target tab, the ODS cannot be selected as a target! There are also some new columns on this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under the column 'DTP(s) are active and load to this target' there is an inactive icon, which is odd since we have already activated the Data Transfer Process! Anyway, we started the data load in the InfoPackage, and the monitor shows the records are brought in, but since the 'Only PSA' radio button is checked on the Processing tab with all the others dimmed, no data goes into this ODS! Why, in BI 7.0, can the 'Only PSA' radio button be checked with all the others dimmed?
    Many new features in BI 7.0! Anyone's ideas or experience on how to load data in BI 7.0 would be greatly appreciated!

    You don't have to select anything.
    Once data is loaded to the PSA, in the DTP you have the option of FULL or DELTA: FULL loads all the data from the PSA, and DELTA loads only the last load of the PSA.
    Go through the links below for lucid explanations.
    Infopackage -
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
    DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    Creating DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
    <b>Pre-requisite-</b>
    You have used transformations to define the data flow between the source and target object.
    Creating transformations-
    http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
    Hope it Helps
    Chetan
    @CP..

  • Data load stuck from DSO to Master data Infoobject

    Hello Experts,
    We have an issue where a data load is stuck between a DSO and a master data InfoObject.
    Data is uploaded from the (standard) DSO to the master data InfoObject.
    This InfoObject has display and navigation attributes which are mapped from the DSO to the InfoObject.
    We have now added a new InfoObject as an attribute to the master data InfoObject and made it a navigation attribute.
    Now when we run a full load via DTP, the load gets stuck and does not process.
    Earlier it took only 5 minutes to complete the full load.
    Please advise what could be the reason and cause behind this.
    Regards,
    santhosh.

    Hello guys,
    Thanks for the quick response.
    But it is not proceeding any further.
    The request is still running.
    Earlier this same data loaded in 5 minutes.
    Please find the screenshot.
    Master data for the InfoObjects is loaded as well.
    In SM50 I can see the process sitting at the P table of the InfoObject.
    Please advise.
    Please find the details:
    Updating attributes for InfoObject YCVGUID
    Start of Master Data Update
    Check Duplicate Key Values
    Check Data Values
    Process time dependent attributes- green.
    No Message: Process Time-Dependent Attributes- yellow
    No Message: Generates Navigation Data- yellow
    No Message: Update Master Data Attributes - yellow
    No Message: End of Master Data Update - yellow
    and nothing is progressing further in SM37.
    Thanks,
    Santhosh.

  • Problem with Master Data Load

    Dear Experts,
    If somebody can help me with the following case, please give me a solution. I'm working on a BI 7.0 project where I needed to delete master data for an InfoObject (material). The way I chose for this was through tcode "S14". After that, I tried to load the master data again, but the process broke and the load only completed half of the data.
    This is the error:
    Second attempt to write record 'YY99993' to /BIC/PYYYY00006 failed
    Message no. RSDMD218
    Diagnosis
    During the master data update, the master data tables are read to determine which records of the data package that was passed have to be inserted, updated, or modified. Some records are inserted in the master data table by a concurrently running request between reading the tables at the start of package processing and the actual record insertion at the end of package processing.
    The master data update tries to overwrite the records inserted by the concurrently running process, but the database record modification returns an unexpected error.
    Procedure
    •     Check if the values of the master data record with the key specified in this message are updated correctly.
    •     Run the RSRV master data test "Time Overlaps of Load Requests" and enter the current request to analyze which requests are running concurrently and may have affected the master data update process.
    •     Re-schedule the master data load process to avoid such situations in future.
    •     Read SAP note 668466 to get more information about master data update scheduling.
    On the other hand, the SID table of the product master data is empty.
    Thanks for your help!
    Luis

    Dear Daya,
    Thanks for your help, but I had already applied your suggestion. I sent the following details to OSS:
    We are on BI 7.0 (system ID DXX)
    While loading master data for InfoObject XXXX00001 (the main characteristic in our system, like material) we are facing the following error:
    Yellow warning "Second attempt to write record '2.347.263' to /BIC/XXXX00001 was successful"
    We are loading the master data from DataSource ZD_BW_XXXXXXX (from the APO system) through the DTP ZD_BW_XXXXX / XXX130 -> XXXX00001
    The Master Data tables (S, P, X) are not updated properly.
    The following repair actions have been taken so far:
    1.     Delete all related transactional and master data, checking all relations (tcode SLG1 -> RSDMD, MD_DEL)
    2.     Follow the instructions from OSS note 632931 (tcode RSRV)
    3.     Run report RSDMD_CHECKPRG_ALL from tcode SE38 (using both the check and repair options).
    After deleting all the data, the previous tests were OK, but once we load new master data the same problem appears again, and the report RSDMD_CHECKPRG_ALL gives the following error:
    "Characteristic XXXX00001: error found during this test."
    The RSRV check "Compare sizes of P and X and/or Q and Y tables for characteristic XXXX00001" shows:
    Characteristic XXXX00001: tables /BIC/PXXXX00001 and /BIC/XXXXX00001 are not consistent (351.196 deviations).
    It seems that our problem is described in OSS note 1143433 (SP13), even though we are already on SP16.
    Could somebody please help us and let us know how to solve the problem?
    Thanks to all,
    Luis

  • Is it possible to upload only a few columns into a table through APEX Data Loading

    Hi All,
    I have to do an upload into the table from a CSV file. The table's primary key I have to fill myself; the rest I load from the user's uploaded file. Is it possible to load only the required columns into the table and fill the other columns from the backend? Or is there any other way to do this?

    Hi,
    Your query is not really clear.
    >
    Is it possible to load only the required columns into the table and fill the other columns from the backend? Or is there any other way to do this?
    >
    How do you plan to "link" the rows from these 2 sets of data in the "backend"? There has to be a way to have a relation between them.
    Regards,
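
    One possible pattern, independent of the APEX Data Loading wizard and sketched here with cx_Oracle (the table, sequence, column and connection names are all invented for illustration): bind only the columns that come from the user's CSV and let the INSERT fill the primary key from a sequence and the remaining columns with defaults.

        import csv
        import cx_Oracle

        # hypothetical connection, table and sequence names
        conn = cx_Oracle.connect("app_user", "app_pw", "dbhost/XEPDB1")
        cur = conn.cursor()

        with open("upload.csv", newline="") as f:   # the user's uploaded file
            rows = [(r["ename"], r["sal"]) for r in csv.DictReader(f)]

        # only the two CSV columns are bound; id and load_date are filled by the database
        cur.executemany(
            """INSERT INTO emp_upload (id, ename, sal, load_date)
               VALUES (emp_upload_seq.NEXTVAL, :1, :2, SYSDATE)""",
            rows,
        )
        conn.commit()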

  • Data load failed while loading data from one DSO to another DSO..

    Hi,
    On SID generation, the data load failed while loading data from the source DSO to the target DSO.
    The following error occurred:
    Value "External Ref # 2421-0625511EXP  " (HEX 450078007400650072006E0061006C0020005200650066
    Error when assigning SID: Action VAL_SID_CONVERT, InfoObject 0BBP
    So I do not understand why it was successful in one DSO, i.e. the source, but failed in the other DSO, i.e. the target.
    While analyzing, I checked that 'SIDs Generation upon Activation' is checked in the source DSO but not in the target DSO, so is that the reason it failed?
    Please explain.
    Thanks,
    Sneha

    Hi,
    I hope your data flow has been designed in such a way that the 1st DSO acts as a staging layer, all transformation rules and routines are maintained between the 1st and 2nd DSO, and 'SID Generation upon Activation' is set on the 2nd DSO. That way the data in your 1st DSO is the same as your source system data, since no transformation rules or routines are applied there, which helps avoid data load failures.
    Please analyze the following:
    Have you loaded master data before transaction data? If not, please do that first.
    Go to the properties of the first DSO and check whether 'SID Generation upon Activation' is maintained there (it may not be, I guess).
    Go to the properties of the 2nd DSO and check whether 'SID Generation upon Activation' is maintained there (it probably is, I hope).
    This may be the reason.
    Also check whether any special characters are involved in your transaction data (even lowercase letters).
    Regards
    BVR
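
    A quick way to hunt for offending values before reloading, sketched below for a tab-delimited extract file with a hypothetical name: flag fields containing lowercase letters or characters outside a typical BW default permitted set (an assumption; adjust ALLOWED to whatever is maintained in RSKC on your system). As a side note, the HEX string in the error message can be decoded with bytes.fromhex("4500...").decode("utf-16-le"), which makes trailing blanks or hidden characters visible.

        # assumed permitted set - replace with the characters maintained in RSKC
        ALLOWED = set(' !"%&\'()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ')

        def problems(value):
            issues = []
            if value != value.upper():
                issues.append("contains lowercase letters")
            bad = sorted({c for c in value if c.upper() not in ALLOWED})
            if bad:
                issues.append(f"characters outside the permitted set: {bad}")
            if value.startswith("!") or value.strip() == "#":
                issues.append("starts with '!' or is just '#'")
            return issues

        with open("dso_extract.txt") as f:          # hypothetical extract file name
            for lineno, line in enumerate(f, 1):
                for field in line.rstrip("\n").split("\t"):
                    for issue in problems(field):
                        print(f"line {lineno}: {field!r}: {issue}")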

  • XML Data Load into relational structures

    Hi,
    I am very inexperienced with XML and have the problem of importing very large XML data files into existing relational structures.
    In our production DB we don't use the Java engine, so PL/SQL and SQL*Loader are the only available ways to import the data.
    At the moment we receive flat files and use the SQL*Loader utility. But an interface to a new system now sends XML data, and I have to fill the same old relational structure with the new data.
    Can anybody give me a hint about the best technique for a high-performance import? Are there any existing tools for the relational mapping?
    Regards, Ralph

    Thank you for your reply.
    You are right. We only want to break up the XML to fill our relational structures; we don't need the XML data afterwards. But we have to load the data into temporary structures, because we have to transform it into our own format. (The system which delivers the XML data is external and uses a different data model.)
    Is there no more elegant way using the database's built-in techniques? The XML data we receive can be validated against an XML schema.
    So I thought one approach could be to load the XML into XML DB, register the schema in the database, store the XML data in the default generated object-relational structures, and then program the data transformation and the data flow between these default structures and our target data structures in PL/SQL.
    I don't know whether this approach is performant enough.
    If I use an external tool I have to code the relational mapping outside the database and insert the data via ODBC into temporary structures which I have to create manually.
    So I hoped to find a way to load the data into the relational structures using the advantages of XML and XML Schema and to code the necessary logic inside the DB.
    Do you have any further hints for my problem?
    Regards Ralph
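
    For what it is worth, one middle ground that keeps the existing SQL*Loader path is to shred the XML outside the database into the same kind of flat file you load today. The sketch below streams the document with ElementTree.iterparse so memory use stays flat on very large files; the element names (order, id, qty) and file names are invented placeholders for whatever the real schema defines. The registered-schema/XML DB route described above also works; this simply avoids building the object-relational structures only to copy the data out of them again.

        import csv
        import xml.etree.ElementTree as ET

        with open("orders_flat.txt", "w", newline="") as out:    # flat file for SQL*Loader
            writer = csv.writer(out, delimiter="\t")
            for event, elem in ET.iterparse("orders.xml", events=("end",)):
                if elem.tag == "order":                          # one row per <order> element
                    writer.writerow([
                        elem.findtext("id", default=""),
                        elem.findtext("qty", default=""),
                    ])
                    elem.clear()     # release the parsed element to keep memory use low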
