Data Migration using ODI

Hi All-
Has anyone used ODI for data migration (from any legacy system to any new application)?
I'm looking for documents on how the solution was designed and implemented.
Regards,
-Ranjit

Hi,
Yes, I'm using ODI for data migration.
The migration is between a legacy system (SQL Server) and SAP (SQL Server), and we also use web services.

Similar Messages

  • Logical data Model Using ODI

    Can I create a logical data model using ODI? If yes, how can I do it?

    User,
    ODI is not a design tool; it's a comprehensive integration platform.
    However, if you would like to model your data, you can use, for example, Oracle Designer 9i, which is available on the Oracle website for free.
    Oracle Designer 10gR2 is also available at: http://www.oracle.com/technology/software/products/designer/htdocs/winsoft.html
    Good luck.

  • GL Data load using ODI to Essbase

    Hi, I am trying to load GL actuals data into an Essbase application using ODI. The source file has 10 columns and the target has 11 columns. We are using a rules file to load the data into Essbase: the rules file splits the 9th column into two columns and loads the data. When we test the rules file in Essbase, the data loads into the application; when we use the same rules file in an ODI interface, the data does not load, and we get an "Unknown member" error for the member we are splitting out of the 9th column.
    Source file header: HSP_RATES ACCOUNT PERIOD YEAR SCENARIO VERSION CURRENCY ENTITY SFUND PROGRAM DATA
    Sample row: HSP_InputValue 611101 Jul FY13 ACTUAL Final Local 0000 SBNR AC0001PS0001 25000
    AC0001PS0001 is the concatenated string from GL; we split it into two columns using the rules file to load into the Essbase application. I am mapping one column (AC0001PS0001) to two dimensions (Program, Activity) in Essbase. Please suggest what might be causing the error, and how to do the mapping between source and target.
    Thanks
    Sri

    In ODI, you have to do the split in ODI itself: while mapping, you can use SQL functions to map the source column to two different target columns, similar to the way you do it in the rules file.
    Regards
    Amarnath
    ORACLE | Essbase
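    The split that the rules file performs can also be expressed directly in the ODI interface mapping. A minimal sketch of the logic (assuming the first six characters are the Program member and the remainder is the Activity member; adjust the split point to your member naming):

    ```python
    # Sketch of the split the ODI mapping needs to perform. The split point
    # (first 6 characters = Program, rest = Activity) is an assumption --
    # match it to whatever your Essbase rules file does.
    def split_gl_member(concatenated: str) -> tuple[str, str]:
        """Split a concatenated GL string like 'AC0001PS0001' into
        (program, activity), the same logic as the rules file."""
        return concatenated[:6], concatenated[6:]

    program, activity = split_gl_member("AC0001PS0001")
    print(program, activity)  # AC0001 PS0001
    ```

    In the ODI mapping itself, this would be two SQL expressions on the two target columns, e.g. SUBSTR(SRC.PROGRAM, 1, 6) and SUBSTR(SRC.PROGRAM, 7) (the column names here are placeholders).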

  • Data loading using ODI

    Hi,
    I am using ODI 10g to load data into an Oracle 10g database from a flat file. The flat file is in fixed format, but all the data is on a single row, so it is not loaded correctly: for example, 1000 records appear on one line.
    Can someone please suggest what customization I should make to the IKM to achieve this.
    Please let me know the questions you may have.
    Thanks a lot.

    Hi John,
    I reversed my Planning model and am able to see those columns now. I tried executing my interface and got the error below:
    The source result set contains column [EMPLOYEE_ACCOUNT] which has no corresponding column on the planning side.
    My source is an Oracle table, and I have separate columns for each dimension member. What I did was map the data column (JobTitle) from the source to the target column (JobTitle), and map all the remaining dimension member columns from the source to the Point of View column, separated by commas. My Planning cube name is EMPLOYEE, so in the Data Cube Name column I manually entered "EMPLOYEE" and tried executing it.
    Please guide me as always
    Thanks,
    Sravan
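    For the single-line fixed-format problem in the first post, one option is to re-chunk the stream into fixed-size records before handing the file to ODI, rather than customizing the IKM. A minimal sketch (assuming a known, constant record length and no embedded newlines; the 20-byte layout below is invented for illustration):

    ```python
    # Re-chunk a fixed-format file whose records all arrive on a single
    # physical line into one record per line, so the file driver can read it.
    # RECORD_LEN is an assumption -- use your file's actual record size.
    RECORD_LEN = 20

    def split_records(raw: str, record_len: int = RECORD_LEN) -> list[str]:
        """Cut a single-line stream into fixed-length records."""
        return [raw[i:i + record_len] for i in range(0, len(raw), record_len)]

    # Example with an invented 20-byte layout (10-char id + 10-char name):
    raw = "0000000001JOHN      0000000002MARY      "
    for rec in split_records(raw):
        print(rec)
    ```

    Writing the re-chunked file out once, before the interface runs, is usually simpler than changing the IKM, since the knowledge of the record length stays in one place.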

  • Data Migration using solution Manager

    Hi All,
    Is there any way to use Solution Manager for data migration?
    Thanks in advance.
    Regards,
    Venkanna

    Hi,
    In case you want to use it as middleware to connect two systems, rather use SAP XI/PI. SolMan does not provide such functionality!
    Regards,
    Kai

  • Reverse engg data store using ODI SDK

    How do I reverse engineer a data store (and not a model) using the ODI SDK?
    Is there a class for reverse engineering, or should I use KMs?

    When I reverse engineer my delimited or fixed file, I am able to see the data of the datastore; I used the same datastore in an interface.
    Yes, I used the datastore in an interface using code (the SDK).
    I am not doing anything manually in Studio; everything is done through Java code (using the SDK). But I have checked the data of the datastore by right-clicking on it, and it shows the data correctly.
    The file descriptor contains the start position and the number of bytes.
    Your text file looks like FIXED format;
    my Delimited format text file:
    empNo,empName
    1,Deepa
    2,Deepali
    3,Patil
    4,Deeps
    and FIXED format text file:
    10   Georges      Hamilton     15/01/2001 00:00:00
    11   Andrew       Andersen     22/02/1999 00:00:00
    12   John         Galagers     20/04/2000 00:00:00
    13   Jeffrey      Jeferson     10/06/1988 00:00:00
    20   Jennie       Daumesnil    28/02/1988 00:00:00
    21   Steve        Barrot       24/09/1992 00:00:00
    22   Mary         Carlin       14/03/1995 00:00:00
    30   Paul         Moore        11/03/1999 00:00:00
    31   Paul         Edwood       18/03/2003 00:00:00
    32   Megan        Keegan       29/05/2001 00:00:00
    40   Rodolph      Bauman       29/05/2000 00:00:00
    41   Stanley      Fischer      12/08/2001 00:00:00
    42   Brian        Schmidt      25/08/1992 00:00:00
    50   Anish        Ishimoto     30/01/1992 00:00:00
    51   Cynthia      Nagata       28/02/1994 00:00:00
    52   William      Kudo         28/03/1993 00:00:00
    If the file format is delimited, we need not worry about physical or logical length, because we are able to reverse engineer it, which automatically sets all the fields.
    The odiColumn.setLength(length) method sets both the physical and the logical length.
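    The delimited versus fixed distinction above can be sketched as follows; the field positions are assumptions matching the sample files in this thread (a 5-character id, two 13-character name fields, then the timestamp):

    ```python
    import csv
    import io

    # Sketch of the two parsing modes the file driver distinguishes.
    # FIXED_FIELDS holds (start, end) positions -- assumptions matching the
    # sample FIXED file above; adjust to your real file descriptor.
    FIXED_FIELDS = [(0, 5), (5, 18), (18, 31), (31, 50)]

    def parse_fixed(line: str) -> list[str]:
        """Cut one fixed-format record by (start, end) byte positions."""
        return [line[s:e].strip() for s, e in FIXED_FIELDS]

    def parse_delimited(text: str) -> list[list[str]]:
        """Parse comma-delimited text; no lengths needed, the separator rules."""
        return list(csv.reader(io.StringIO(text)))

    line = "10".ljust(5) + "Georges".ljust(13) + "Hamilton".ljust(13) + "15/01/2001 00:00:00"
    print(parse_fixed(line))        # ['10', 'Georges', 'Hamilton', '15/01/2001 00:00:00']
    print(parse_delimited("empNo,empName\n1,Deepa"))
    ```

    This is why reverse engineering a delimited file needs no lengths (the separator determines the fields), while a fixed file needs start position and byte count for every column.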

  • Repeat SSMA Data Migration Using Command Line

    OK, I'm using the following script to move a select group of tables from a MySQL DB to SQL Server 2012. Is there a better way? Notice I must write a separate "migrate-data" line for each table I'm moving. I have about 400 tables to move, and this works, but I figure there might be a shortcut somewhere.
    <script-commands>
    <create-new-project overwrite-if-exists="true" project-name="SqlMigration2" project-folder="e:\SSMAProjects\"/>
    <connect-source-database server="source_1"/>
    <connect-target-database server="target_1"/>
    <map-schema sql-server-schema="ntelos_yoda.dbo" source-schema="ntelos_yoda"/>
    <migrate-data write-summary-report-to="e:\SSMAProjects\Reports\ConvertSchemaReport1.xml" object-name="ntelos_yoda.account" object-type="Tables"/>
    <migrate-data write-summary-report-to="e:\SSMAProjects\Reports\ConvertSchemaReport1.xml" object-name="ntelos_yoda.account_email" object-type="Tables"/>
    <migrate-data write-summary-report-to="e:\SSMAProjects\Reports\ConvertSchemaReport1.xml" object-name="ntelos_yoda.account_email_template" object-type="Tables"/>
    <save-project/>
    <close-project/>
    </script-commands>

    Hi,
    Please check this link: https://msdn.microsoft.com/en-us/library/hh313125(v=sql.110).aspx
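    Failing a built-in shortcut, the repeated migrate-data lines can at least be generated rather than hand-written. A small sketch (the table list and report path are placeholders; in practice you would read your 400 table names from a file):

    ```python
    # Generate the repeated <migrate-data> lines for an SSMA console script
    # instead of writing ~400 of them by hand. Substitute your own table
    # list and report path -- these values are placeholders.
    REPORT = r"e:\SSMAProjects\Reports\ConvertSchemaReport1.xml"
    TABLES = ["ntelos_yoda.account", "ntelos_yoda.account_email"]

    def migrate_data_lines(tables, report=REPORT):
        """Return one <migrate-data/> element per table name."""
        return [
            f'<migrate-data write-summary-report-to="{report}" '
            f'object-name="{t}" object-type="Tables"/>'
            for t in tables
        ]

    # Paste the output between <connect-target-database/> and <save-project/>.
    print("\n".join(migrate_data_lines(TABLES)))
    ```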

  • Data Migration using LSMW or BDC ?

    Hi,
    Can anyone provide me with the following information about uploading MM master data using LSMW or BDC?
    1. Which tables are used for uploading Material Master, Vendor Master, Info Record, Open PO, Open PR, RFQ, and Open Contracts/Agreements?
    2. What problems are faced during data upload?
    3. What errors are encountered during upload?
    4. What is the difference between LSMW and BDC? Both can be used for data upload, so what are the differences between them?
    5. Anything else to remember or know during data upload?
    Thanks,
    Lucky

    Hi Lucky,
    Please don't take my response the wrong way, but each and every question you posted is already available in this forum; you just need to search for the individual posts. Whatever you posted, you would definitely get the solutions from others, and remember that most of these answers come from existing threads (60-70%); only a few solutions are original.
    So it is better to search the forum first, and post only if you are not satisfied with the solutions you find.
    This is just friendly advice.
    And reward points for helpful answers.
    Thanks
    Naveen Khan

  • Data load using ODI 11.1.1.5 to Hyperion

    Hi All,
    I have installed ODI, and Hyperion is already installed.
    After installing ODI, I created the master repository and the work repository.
    Now I am planning to load data into Hyperion Planning application dimensions (e.g., Entity).
    1) In the Topology section, I right-clicked on Hyperion Planning and clicked "New Data Server", filled in the required information, and saved it.
    2) Then I right-clicked on the newly created data server and chose "New Physical Schema", but it does not show any applications in the Application (Catalog) and Application (Work Catalog) drop-downs.
    Before these steps, we had created the Planning applications in Hyperion.
    Please let me know what is wrong or missing in the above steps, or share the steps to load data into Hyperion.
    -PM

    Yes, I have entered the values in the text boxes/drop-downs. Afterward I tried to save, but it asked me to fill in "Context". I selected the Context option; there I have only one option, Global, which I chose.
    The second field, Logical Schema, doesn't have any value, and the system doesn't allow saving without these values.
    Can you please guide me through the further steps?
    -PM

  • How to use IDoc. in data migration?

    Dear Expert,
    How can I use the IDoc method in data migration?

    Hi,
    You can use the IDoc method in data migration through LSMW's IDoc method.
    In LSMW, select the IDoc method and give the message type and basic type according to your requirement.
    Before that, you need to maintain IDoc inbound processing. Go to the LSMW initial screen:
    LSMW -> Settings -> IDoc Inbound Processing. Maintain the file port, partner type, and partner number.
    After maintaining the partner number, go to WE20 (maintain partner profiles) and add the message type in the inbound parameters of the partner number. We also need to maintain the process code for the message type.
    After all these settings, we can use the IDoc method for data migration.
    The rest of the steps are the same as for the other methods, up to the Convert Data step.
    Regards,
    Asif Ali Khan

  • Oracle Legacy System to SAP Data Migration

    Hi Experts,
    New to data migration:
    Can you guide me on how Oracle staging is useful for data migration?
    Here are a few doubts:
    1. What is Oracle staging?
    2. How is Oracle staging useful for data migration?
    3. I see a few ETL tools for data migration, such as Informatica and Ascential DataStage, but our requirement is: how can we use Oracle staging for data migration?
    4. What are the benefits of using Oracle staging for data migration?
    Expecting your response to the above queries.
    Thanks,
    --Kishore

    1. What is Oracle staging?
    It is where ODI creates temporary tables; it performs the transformations and, if required, cleanses the data as well.
    2. How is Oracle staging useful for data migration?
    ODI loads source data into temporary (staging) tables and applies all the required mappings, staging filters, joins, and constraints. The staging area is a separate area in the RDBMS (a user/database) where Oracle Data Integrator creates its temporary objects and executes some of the rules (mappings, joins, final filters, aggregations, etc.). When performing the operations this way, Oracle Data Integrator behaves like an E-LT tool: it first extracts and loads the data into temporary tables, and then finishes the transformations in the target RDBMS.
    3./4. For how to use Oracle staging and its benefits, you can refer to:
    https://blogs.oracle.com/dataintegration/entry/designing_and_loading_your_own
    http://docs.oracle.com/cd/E21764_01/integrate.1111/e12643/intro.htm#autoId10
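    The E-LT staging pattern described above can be illustrated with a small self-contained sketch (SQLite stands in for the Oracle staging schema here, and the table and column names are invented; ODI generates the equivalent SQL for you):

    ```python
    import sqlite3

    # Illustration of the E-LT staging pattern ODI automates: load raw
    # source rows into a staging table, then transform into the target
    # entirely with set-based SQL inside the database.
    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.execute("CREATE TABLE C$_CUSTOMER (id INTEGER, name TEXT)")   # staging
    cur.execute("CREATE TABLE TGT_CUSTOMER (id INTEGER, name TEXT)")  # target

    # 1) Extract + Load: copy source rows into the staging table as-is.
    cur.executemany("INSERT INTO C$_CUSTOMER VALUES (?, ?)",
                    [(1, "  deepa "), (2, " patil")])

    # 2) Transform in the RDBMS: mappings/filters/cleansing run as SQL.
    cur.execute("""
        INSERT INTO TGT_CUSTOMER (id, name)
        SELECT id, UPPER(TRIM(name)) FROM C$_CUSTOMER
    """)
    print(cur.execute("SELECT * FROM TGT_CUSTOMER ORDER BY id").fetchall())
    # [(1, 'DEEPA'), (2, 'PATIL')]
    ```

    The point of the pattern is step 2: because the data is already inside the database, the transformation is one SQL statement rather than row-by-row processing in a middle tier.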
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                            

  • Module for data migration from pdf(zip) to pdf(unzip)

    Hi all,
    I am working on data migration using SAP XI without using the IR.
    I want to zip the PDF file in the sender CC and unzip it in the receiver CC.
    I am using PayloadZipBean to zip the PDF.
    Has anyone worked on zipping and unzipping PDF files?
    Kindly let me know the module I need to use for this.
    Thanks,
    Sree

    Hi Sree,
    Please go through this blog and follow the step-by-step procedure for configuring your scenario:
    /people/stefan.grube/blog/2007/02/20/working-with-the-payloadzipbean-module-of-the-xi-adapter-framework
    It's a nice blog, and I have already configured my scenario using it.
    Thanks and Regards,
    Sanjeev.

  • Planning EPMA migration using LCM

    Hi Experts,
    Kindly advise: how can we migrate EPMA Planning data?
    As per the document:
    http://download.oracle.com/docs/cd/E17236_01/epm.1112/epm_lifecycle_management/frameset.htm?ch07s04s03.html
    "Planning data migration is not supported in Lifecycle Management."
    Is it different if we use Essbase data migration using LCM under the Essbase node?
    Version: 11.1.2 (EPMA Planning)
    Regards
    Kumar

    Hi Alp,
    Thanks for the help.
    It means I should follow:
    Step 1: Migrate security and all artifacts of EPMA Planning using LCM, except Planning data - OK
    Step 2: Export/import Essbase data using an EAS/MaxL script – seems OK
    Q1: We have the option of Essbase data under the Essbase node when migrating an Essbase application/database using LCM. Why can't we just migrate Planning data using LCM itself?
    Regards
    Kumar

  • Best practices for data migration

    Hi All,
    This thread is for everyone who would like to share ideas and knowledge on making better and best use of data migration using Business Objects Data Services.

    Hans,
    The current release of SAP Best Practices for Data Migration, v1.32, supports only MS SQL Server. The intent when developing the DM content was to quickly set up a temporary, standardized data migration environment, using tools that are available to everyone. SQL Server Express was chosen to host the repositories because it is easy to set up and can be downloaded for free. Some users have successfully deployed the content on Oracle XE, but as you have found, the MigrationServices web application works only with SQL Server.
    The next release, including the web app, will support SQL Server and Oracle, but not DB2.
    Paul

  • DBCS data migration

    I need to migrate client data from an on-premise Oracle database (11g) to a cloud database. I went through this website, which talks about all possible ways to load DBCS database tables: http://www.ateam-oracle.com/loading-data-in-oracle-database-cloud-service/ . I am mostly interested in using outbound SOAP and REST invocation APIs, but I am not sure how to approach the process. Can someone guide me on the high-level steps needed to transfer data from table A in an Oracle database to a cloud database table using SOAP and REST? I am using the trial version of Oracle cloud database.
    From my initial investigation, I think there has to be PL/SQL code in the on-premise database which reads data from the tables and transfers it to the cloud table. On the cloud side, we need a REST service with PL/SQL code to POST this data to the cloud tables. If this is true, it would also be helpful if you could guide me with pseudo-code for the PL/SQL on the on-premise side.
    Also, I am aware of (and have tested) data migration using SQL Developer, but I don't think it is appropriate for heavy data loads; again, correct me if I am wrong here, or if you are using a better way to migrate enterprise-wide data, please share your experience.
    Regards
    Bond

    Have you looked at this?
    http://docs.oracle.com/cloud/latest/dbcs_schema/CSDBU/GUID-3B14CF7A-637B-4019-AAA7-A6DC5FF3D2AE.htm#CSDBU179
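    As a rough illustration of the REST half of the approach described above (this is not Oracle's API: the endpoint URL and row shape are invented placeholders, and in practice the target would be a service you expose on the cloud side, e.g. via ORDS):

    ```python
    import json
    import urllib.request

    # Sketch only: push rows read from an on-premise table to a REST
    # endpoint on the cloud side. ENDPOINT is a placeholder.
    ENDPOINT = "https://example.invalid/api/table_a/rows"

    def build_payload(rows):
        """Serialize fetched rows (a list of dicts) into a JSON POST body."""
        return json.dumps({"rows": rows}).encode("utf-8")

    def post_rows(rows, endpoint=ENDPOINT):
        """POST one batch of rows; returns the HTTP response object."""
        req = urllib.request.Request(
            endpoint,
            data=build_payload(rows),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        return urllib.request.urlopen(req)  # network call; not run here

    if __name__ == "__main__":
        sample = [{"id": 1, "name": "A"}, {"id": 2, "name": "B"}]
        print(build_payload(sample).decode("utf-8"))
    ```

    The on-premise side would fetch rows in batches (any Oracle client can do this) and call something like post_rows per batch; batching matters because a heavy load sent row-by-row over HTTP will be very slow.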
