Router in ODI

Hi Friends,
I am using ODI 11g.
I have a source table and 3 target tables.
Can I load the data from the source to the 3 targets based on different conditions, just like the Router transformation in Informatica?
Please let me know the steps for this, or point me to any documentation that can help.
Thanks,
Lony

Hi Lony,
It is possible in Oracle Data Integrator too, but it is a little bit tough.
Routing is very easy using variables, but for the transformations part we need to identify the data at the interface level; only then can we achieve this.
Cheers,
Surya
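In practice, the Informatica Router maps to several ODI interfaces sharing the same source, each with its own filter condition. A minimal sketch of the split logic, with hypothetical conditions chosen purely for illustration:

```python
# Router-style conditional split: each branch corresponds to the filter
# of one ODI interface (the conditions here are hypothetical examples).
def route(rows):
    target1, target2, target3 = [], [], []
    for row in rows:
        if row["amount"] > 1000:        # filter on interface for target 1
            target1.append(row)
        elif row["region"] == "EMEA":   # filter on interface for target 2
            target2.append(row)
        else:                           # filter on interface for target 3
            target3.append(row)
    return target1, target2, target3

rows = [{"amount": 2000, "region": "APAC"},
        {"amount": 500, "region": "EMEA"},
        {"amount": 100, "region": "AMER"}]
t1, t2, t3 = route(rows)
```

Unlike a true Router, these elif branches are mutually exclusive; if a row may satisfy several conditions and should land in several targets, each interface's filter is simply written independently.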

Similar Messages

  • Need clustering of ODI 10g

    Hi Gurus,
    We have a requirement in which we need to install Clustering (High Availability) for ODI 10g.
    Can anyone please help me regarding this issue ASAP.
    Any Help is Greatly Appreciable.
    Regards,
    Pavan Kumar.

    ODI 11g by itself does not have any failover mechanism; ODI uses the failover mechanism implemented by the application server under which ODI J2EE agents run, i.e. if we have an ODI J2EE agent deployed on WebLogic, for instance.
    However, we can use the Load Balancing feature in ODI 10g or 11g to attain pseudo high availability, which can be implemented with the following steps:
    1. On a specific host machine, define a "parent" Agent whose only function is to route ODI Scenario start-up commands to the appropriate "child" Agents.
    2. Define more than one "child" Agent.
    3. For each "child" Agent on the cluster, set the Concurrent Session Number parameter to a large value.
    4. Open the Load Balancing tab of the "parent" Agent and activate the checkbox for the "child" Agents on the cluster nodes (and not for the "parent" Agent itself). Limit the number of sessions on the "parent" Agent to (number of "child" Agents + 2).
    5. Route all ODI Scenario start-up commands and schedules to the "parent" Agent. This Agent will then forward ODI Scenarios to the most suitable "child" Agent.
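    The routing decision those steps set up can be pictured roughly as follows; this is a simplified, hypothetical model of the dispatch choice (real agents track their sessions internally), not actual ODI code:

```python
# Simplified sketch of the "parent" agent's load-balancing choice:
# route each scenario start-up to the least-loaded live "child" agent.
def pick_child(children):
    alive = [c for c in children if c["alive"]]
    if not alive:
        raise RuntimeError("no child agent available")
    return min(alive, key=lambda c: c["sessions"] / c["max_sessions"])

agents = [
    {"name": "child1", "alive": True,  "sessions": 40, "max_sessions": 100},
    {"name": "child2", "alive": True,  "sessions": 10, "max_sessions": 100},
    {"name": "child3", "alive": False, "sessions": 0,  "max_sessions": 100},
]
```

    Here pick_child(agents) would select child2, the live agent with the lowest load ratio.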
    Roles of the "parent" Agent:
         The "parent" Agent should be used to dispatch executions to its "child" Agents that are alive and running.
         The "parent" Agent is able to detect that a "child" Agent is no longer running.
         The "parent" Agent is also able to detect a "child" Agent that has just been started.
    Note, however, that once an execution has started on Agent A, if Agent A "dies", the current execution will NOT be moved to another Agent.
    The "dying" execution must be manually pushed back into the queue in order to have the "parent" Agent redistribute it to its still-alive "child" Agents.
    One concern is the "parent" Agent itself:
         If it dies, the already-distributed tasks will continue to be executed on the corresponding "child" Agents.
         If the "parent" Agent dies, it should be restarted as soon as possible in order to keep the flow active.
         Given its very important role, it should be placed on a machine that has a high uptime coefficient. If the machine stops, the Agent should be restarted ASAP.
    Regards,
    Rickson Lewis
    http://www.linkedin.com/in/rickson

  • Which tool is better for ETL?

    I want to transfer data from RLDB to Essbase. There are many methods and tools, and I don't know which is best for my case. What I am doing now is:
    1. Create views in Oracle.
    2. Export the views to flat files.
    3. Import the flat files into Essbase using a rule file and a MaxL import script.
    4. The data size: the flat file has 3-4 million rows; time used: 13 hours.
    I want to use a more powerful ETL tool, such as Informatica PowerCenter, OBIEE, FDM, EIS, etc. I have basic knowledge of the above tools, but I don't know which is better for my case. Please advise.

    First, to streamline your existing process, you could get rid of step 2: add an ODBC connection to the server and a SQL statement to the load rule to pull directly from your Oracle database. Of course, with the number of rows you have, if you use the same extract multiple times, I would create a table to load from the view and pull from that table.
    As for using an ETL (ELT) tool, Informatica is pretty powerful, and there is a version of it called DIM (although it will not be supported in the future). I might go the ELT route with ODI (Oracle Data Integrator). In the future I might use OBIEE, but I don't think it is fully ready for Essbase until the next version, and I don't think it will rank on the same level as a full ETL tool for conversions.
    Edited by: GlennS_2 on Jan 2, 2010 9:58 AM
    As for EIS, I would not use it, but I might use Essbase Studio (it can use OBIEE as a source). Again, this is not an ETL tool; it would just take your existing views and load them.

  • Help using oracle syntax "SUM(col1) over (order by col2)" using ODI

    Hi all
    I want to load data from Oracle to Essbase using ODI. I know Oracle has the syntax sum(col1) over (order by col2, col3), which produces accumulated (running-total) data, e.g.
    Oracle data table
    col1, col2, value
    A 2009-1 10
    A 2009-2 10
    A 2009-3 10
    And Essbase needs
    col1 col2 value
    A 2009-1 10
    A 2009-2 20
    A 2009-3 30
    However, when I try this in ODI, an error occurs:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 32, in ?
    java.sql.SQLException: ORA-00979: not a GROUP BY expression
    and the SQL originally generated by ODI:
    select 'HSP_InputValue' "HSP_Rates",MAP_KMDZ_TABLE.BUD_DYKM "Account",MAP_MONTH.ESS_MONTH "Period",MAP_YEAR.ESS_YEAR "Year",'Actual' "Scenario",'Draft' "Version",TEMP_LIRUN.CURRENCY "Currency",MAP_COMPANYCODE.ESS_COMPCODE "Entity",substr(MAP_KMDZ_TABLE.BUD_BUSINESSOBJECT,1,80) "BusinessObject",'Route_NoRoute' "Route",MAP_TRANSPORT.ESS_TRANSPORT "Transport",substr(MAP_KMDZ_TABLE.BUD_BUSINESSACTIVITY,1,80) "BusinessActivity",substr(MAP_KMDZ_TABLE.BUD_CHANNEL,1,80) "Source",'NoCounterparty' "Counterparty",sum(TEMP_LIRUN.DATAVALUE) over (order by MAP_KMDZ_TABLE.BUD_DYKM,MAP_YEAR.ESS_YEAR,MAP_MONTH.ESS_MONTH,TEMP_LIRUN.CURRENCY,MAP_COMPANYCODE.ESS_COMPCODE,MAP_TRANSPORT.ESS_TRANSPORT,MAP_KMDZ_TABLE.BUD_BUSINESSACTIVITY,MAP_KMDZ_TABLE.BUD_BUSINESSOBJECT,MAP_KMDZ_TABLE.BUD_CHANNEL) "Data" from ETL_DEV.TEMP_LIRUN TEMP_LIRUN, ETL_DEV.MAP_KMDZ_TABLE MAP_KMDZ_TABLE, ETL_DEV.MAP_MONTH MAP_MONTH, ETL_DEV.MAP_YEAR MAP_YEAR, ETL_DEV.MAP_COMPANYCODE MAP_COMPANYCODE, ETL_DEV.MAP_TRANSPORT MAP_TRANSPORT where      (1=1) And (TEMP_LIRUN.COSTELMNT=MAP_KMDZ_TABLE.SAP_ZZKM)
    AND (TEMP_LIRUN.FISCYEAR=MAP_YEAR.SAP_YEAR)
    AND (TEMP_LIRUN.FISCPER3=MAP_MONTH.SAP_MONTH)
    AND (TEMP_LIRUN.COMP_CODE=MAP_COMPANYCODE.SAP_COMPCODE)
    AND (TEMP_LIRUN.WWHC=MAP_TRANSPORT.SAP_WWHC) Group By MAP_KMDZ_TABLE.BUD_DYKM,
    MAP_MONTH.ESS_MONTH,
    MAP_YEAR.ESS_YEAR,
    TEMP_LIRUN.CURRENCY,
    MAP_COMPANYCODE.ESS_COMPCODE,
    substr(MAP_KMDZ_TABLE.BUD_BUSINESSOBJECT,1,80),
    MAP_TRANSPORT.ESS_TRANSPORT,
    substr(MAP_KMDZ_TABLE.BUD_BUSINESSACTIVITY,1,80),
    substr(MAP_KMDZ_TABLE.BUD_CHANNEL,1,80)
    I know ODI thinks sum ... over must be followed by a GROUP BY, but it must not be! How do I solve this problem?
    Thank you all for your attention.
    SOS!
    Ethan
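    The behavior Ethan expects can be reproduced outside ODI to confirm that an analytic SUM needs no GROUP BY; a quick sketch with Python's sqlite3 (assuming the bundled SQLite is ≥ 3.25, which added window functions):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE src (col1 TEXT, col2 TEXT, value INTEGER)")
con.executemany("INSERT INTO src VALUES (?, ?, ?)",
                [("A", "2009-1", 10), ("A", "2009-2", 10), ("A", "2009-3", 10)])

# The analytic SUM yields a running total per row -- no GROUP BY involved.
rows = con.execute("""
    SELECT col1, col2,
           SUM(value) OVER (PARTITION BY col1 ORDER BY col2) AS running
    FROM src
    ORDER BY col1, col2
""").fetchall()
```

    This returns ('A', '2009-1', 10), ('A', '2009-2', 20), ('A', '2009-3', 30), which are exactly the accumulated values the Essbase target needs.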

    Hi Ethan,
    In my experience I have faced a similar kind of situation.
    There are two workarounds:
    1. Write a procedure and execute it using an ODI procedure.
    2. Customize a KM and use that KM in your interface.
    I guess that in your query the GROUP BY clause is not needed. (If this is the case, you can achieve this with a simple customization step in the KM.)
    for example : your current KM will generate a query like this:-
    select x,y, sum(x) over (order by y) as sumx FROM TestTable group by x, y
    and you need a query like this
    select x,y, sum(x) over (order by y) as sumx FROM TestTable
    Go to your KM (duplicate the KM which you are using and rename it _withoutGroup) and
    remove the GROUP BY from the select query
    (remove the API call <%=snpRef.getGrpBy()%> from the "insert into I$ table" step).
    please let me know if you need more help on this
    regards,
    Rathish

  • Problem in converting EBCDIC to ASCII in ODI

    Hi All,
    I am new to ODI.
    I am trying to convert an EBCDIC file to ASCII using ODI. Each of my records is 140 characters long. There is no record separator, but in ODI we need to specify one.
    So how do I convert this file?
    I am using jdbc:snps:dbfile?ENCODING=cp037 in my data server.
    Is there any way I can tell ODI that the next record starts after 140 characters?
    Thanks

    Yeah, I ended up going the Jython route.
    First I created a procedure called "ftpGetEbcdicFile" that takes the following options:
    1 Remote Server
    2 Remote Login
    3 Remote Password
    4 Remote Directory
    5 Remote Filename
    6 Local Directory
    7 Local Filename
    8 Record Size
    This procedure has 2 commands/steps:
    ftpGetBinary:
    import snpsftp
    # Retrieve procedure parameters
    remoteServer = '<%=odiRef.getOption("Remote Server")%>'
    remoteLogin = '<%=odiRef.getOption("Remote Login")%>'
    remotePassword = '<%=odiRef.getOption("Remote Password")%>'
    remoteDirectory = '<%=odiRef.getOption("Remote Directory")%>'
    remoteFilename = '<%=odiRef.getOption("Remote Filename")%>'
    localDirectory = '<%=odiRef.getOption("Local Directory")%>'
    localFilename = '<%=odiRef.getOption("Local Filename")%>'
    # Build full file paths
    remotePath = remoteDirectory + "/" + remoteFilename
    localPath = localDirectory + "/" + localFilename
    # Execute ftp
    ftp = snpsftp.SnpsFTP(remoteServer, remoteLogin, remotePassword)
    try:
         ftp.get(remotePath, localPath, 'BINARY')
    finally:
         ftp.close()
    and insertNewlines:
    import os
    # Retrieve procedure parameters
    localDirectory = '<%=odiRef.getOption("Local Directory")%>'
    localFilename = '<%=odiRef.getOption("Local Filename")%>'
    recordSize = <%=odiRef.getOption("Record Size")%>
    localPath = localDirectory + "/" + localFilename
    # Write to a temporary file, then swap it in below
    # (there is probably a better way to do this)
    tmpPath = localDirectory + "/" + localFilename + "~"
    inputFile = open(localPath, 'rb')
    outputFile = open(tmpPath, 'wb')
    # Read fixed-length records and append an EBCDIC NL (0x15) after each,
    # so the file has the record separator ODI expects
    while 1:
         line = inputFile.read(recordSize)
         if line:
              outputFile.write("%s%c" % (line, 0x15))
         else:
              break
    outputFile.close()
    inputFile.close()
    # Replace the original file with the newline-delimited copy
    os.remove(localPath)
    os.rename(tmpPath, localPath)
    Then I created a package that calls the procedure with the desired options. It still needs some work, but I think it will get the job done for now.
    Any advice or corrections from the ODI gurus out there?

  • Using ODI to load 2 similar Essbase cubes

    Hello all,
    I am trying to create an ODI routine that incorporates the following details:
    -Loads specific Essbase member properties from a single flat file to 2 similar Essbase cubes
    -The Essbase cubes are virtually identical, except that one is ASO and one is BSO, therefore the properties may differ for specific members (i.e. DataStorage, Consolidation, Formula, etc.). There are also other differences in dimensionality, but those details are inconsequential to this routine...
    -For each type of member property, the flat file has 2 sets of columns - one for ASO and one for BSO. Therefore, one column might be called "ASODataStorage" while another one is called "BSODataStorage". They have different values in them respective to the Essbase architecture.
    -There is also a set of columns called "ASOFlag" and "BSOFlag" that flag which members should even be loaded to that respective cube. So, one sub hierarchy may be needed for the ASO cube, but not the BSO cube.
    Earlier, I created 2 contexts for these cubes so I could use the same ODI routine and source metadata to update both cubes with a simple change of context. This part is working correctly. The challenge I'm facing now, however, is how to design a routine that will select the appropriate columns of data based on my context. Therefore, if one context is called "ASO", it should select the "ASO" columns and apply only those properties/members. And a similar situation should happen when the "BSO" context is created.
    I have tinkered with the idea of using a variable that stores the current context information, is evaluated at runtime, and then changes the source columns based on the variable value. However, I'm having trouble getting this to work correctly (I am a newbie to variables). I am also having trouble figuring out how to map all of the source columns to the target columns, as I am going from double property columns in the source to a single set of property columns in the Essbase metadata target.
    Is there a better way to approach this in ODI? Please note that I am aware that EIS can handle this as well...but I am looking for an ODI solution.
    Thanks in advance!
    -OS

    Hi,
    In simple terms you should be able to create a package, drag the variable onto the package and set the value to ASOFlag.
    Then drag your interface onto the diagram to run the ASO load (the interface should have a filter that uses the variable to select the required records).
    Then drag the variable onto the diagram and set the value to BSOFlag.
    Then drag your interface onto the diagram to run the BSO load (the interface should have a filter that uses the variable to select the required records).
    Though, since you say you are changing contexts, you might have to go down the route of generating a scenario from a package for the ASO load, then a scenario for the BSO load, then creating another package that uses the OdiStartScen tool to execute those scenarios with the specified contexts.
    Cheers
    John
    http://john-goodwin.blogspot.com/
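    The variable-driven column selection the original poster describes can be sketched like this; the column names (ASOFlag, ASODataStorage, etc.) are taken from the post, and the function is purely illustrative of the mapping logic, not ODI syntax:

```python
# Pick the ASO or BSO property columns based on the context/variable value;
# return None when the member is not flagged for that cube.
def map_member(row, cube_type):
    if row[cube_type + "Flag"] != "Y":
        return None  # member is not flagged for this cube
    return {
        "Member": row["Member"],
        "DataStorage": row[cube_type + "DataStorage"],
    }

row = {"Member": "Sales", "ASOFlag": "Y", "BSOFlag": "N",
       "ASODataStorage": "StoreData", "BSODataStorage": "DynamicCalc"}
aso = map_member(row, "ASO")   # loaded, with the ASO property values
bso = map_member(row, "BSO")   # skipped: BSOFlag is "N"
```

    In an interface, the same effect comes from a filter on the flag column plus target expressions that reference the variable (or two separate interfaces, as John suggests).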

  • ODI Transformations

    Hi All,
    I am in the process of developing complex interfaces (mappings) using ODI. I have done a lot of development using another ETL tool, Informatica.
    Since I am in the initial stages, could you please help me find ways to implement the following transformations in ODI:
    1) Lookup transformation
    2) Insert/Update transformation
    3) Router transformation, etc.
    Also, please suggest some good reference materials on the above topics.
    Thanks

    Hi,
    You have to work out your transformations in ODI Designer, in the "Diagram" and "Flow" tabs.
    Now, to do:
    1. Lookup -- bring the lookup tables into the datastore by reversing them (just like your source and target tables/views) and drag them into the Designer along with your source for the transformations. Do your transformation either by drag and drop or by putting your expression in the "Expression Editor" after selecting the target column.
    2. Insert/Update -- this is controlled by your ODI IKMs (Incremental Update).
    3. I don't know about the Router transformation.
    Visit http://www.oracle.com/technology/products/oracle-data-integrator/10.1.3/htdocs/1013_support.html for the ODI docs.
    Thanks

  • Lookups in ODI

    Hi ,
    Is there any Lookup transformation or Router (like Informatica) available in ODI?
    Thanks.

    Maybe not in 10G but there is LOOKUP functionality in 11G
    *"Lookups*
    *A wizard is available in the interface editor to create lookups using a source as the driving table and a model or target datastore as the lookup table. These lookups now appear as a compact graphical object in the Sources diagram of the interface. The user can choose how the lookup is generated: as a Left Outer Join in the FROM clause or as an expression in the SELECT clause (in-memory lookup with nested loop). This second syntax is sometimes more efficient on small lookup tables.*
    *This feature simplifies the design and readability of interfaces using lookups, and allows for optimized code for executing lookups."*
    Cheers
    John
    http://john-goodwin.blogspot.com/
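    The two generation modes in that quote can be illustrated outside ODI. A rough sketch with hypothetical data: style 1 mimics the Left Outer Join form, style 2 the in-memory nested-loop form:

```python
orders = [{"id": 1, "cust": "C1"}, {"id": 2, "cust": "C9"}]
customers = [{"cust": "C1", "name": "Acme"}]

# Style 1: left-outer-join lookup -- index the lookup table once,
# keeping driving rows even when no match exists (name becomes None).
index = {c["cust"]: c["name"] for c in customers}
joined = [dict(o, name=index.get(o["cust"])) for o in orders]

# Style 2: in-memory nested loop -- scan the lookup rows for each
# driving row (acceptable for small lookup tables, as the note says).
def lookup_name(cust):
    for c in customers:
        if c["cust"] == cust:
            return c["name"]
    return None

nested = [dict(o, name=lookup_name(o["cust"])) for o in orders]
```

    Both styles produce the same rows; the difference is purely in how the generated SQL (or here, the loop) evaluates the lookup.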

  • OdiStartScen always using "OracleDIAgent" physical agent (ODI-17554)

    Hi. I received an ODI package (A) which contains an invocation to other ODI package (B) through an OdiStartScen call in the flow diagram.
    The OdiStartScen call doesn't set the context or agent codes (it just sets the scenario name, the version as -1, and the invocation mode as asynchronous).
    In my topology, I have three physical agents: OracleDIAgent (for "Production" context), OracleDIAgent_T (for "Test" context) and OracleDIAgent_D (for "Development" context). I have just one logical agent which choose between the physical ones through the context distribution previously described.
    If I execute the "parent" package (A), no matter which context I choose (for example, "Development" or "Test"), the OdiStartScen call to package (B) is always routed to the "OracleDIAgent" physical agent (the "Production" one).
    Moreover, if I delete all my agents (especially "OracleDIAgent"), create just the ones which correspond to "Development" (the physical OracleDIAgent_D and the logical one) and execute the "parent" package, I receive the following error when the execution reaches the OdiStartScen call:
    ODI-1426: Agent OracleDIAgent is not defined in the topology for the master repository.
    Caused By: com.sunopsis.core.SnpsInexistantObjectException: ODI-17554: Agent "OracleDIAgent" does not exist.
    I'm confused.
    Any help in clarifying this behaviour would be much appreciated.
    Notes:
    - the agent which first receives the execution order (for package A) is a Java EE Agent (WebLogic 11g)
    - the project comes from 11.1.1.5 and is now deployed on 11.1.1.6 through export/import.
    Thanks,
    Esteban

    Hi Phanikanth, actdi, thank you for your answers.
    It seems the cause is a missing step in the Java EE Agent WebLogic domain creation. According to the "Fusion Middleware Installation Guide for Oracle Data Integrator" [1], the default Java EE agent created from the static WebLogic template is called "OracleDIAgent"; if you want to use another name, you have to create a new template from ODI Studio and use it to create the Java EE Agent domain. In our case, all the Java EE Agents were created using the default template.
    [1]
    http://docs.oracle.com/cd/E23943_01/core.1111/e16453/configure.htm
    3.3 Configuring Java EE Components
    3.3.1 Declaring the Java EE Agent in Topology

  • What is the error handling mechanism available in ODI?

    What is the error handling mechanism available in ODI ?

    We have something called a CKM in ODI. It provides the options of Static Control and Flow Control.
    What you need to do is define the proper constraints / validation rules on the source and target models. When you execute the interface, any rows that violate a rule go to the E$_ tables, also called error tables.
    Static Control example:
    Before executing the interface, you can check the data health of your source model.
    Flow Control example:
    Say you have 100 rows, out of which 10 rows violate a rule; then 90 rows will go to the target table and 10 will go to the error table. If you correct the values or modify the constraint and re-execute the interface, those 10 rows will also go to the target.
    Other than that, if you mean error handling in the package or load plan, you can use the OK and KO lines appropriately and route the flow as per your requirement. This is all a custom approach, which will vary from design to design.
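    The 100-row flow-control example boils down to partitioning rows by the declared constraints; a minimal model of that split (real CKMs run the checks as SQL against the flow table):

```python
# Flow control as a split: valid rows reach the target,
# violating rows land in the E$_ error table.
def flow_control(rows, is_valid):
    target, errors = [], []
    for row in rows:
        (target if is_valid(row) else errors).append(row)
    return target, errors

# Hypothetical rule: ids that are multiples of 10 violate a constraint.
rows = [{"id": i} for i in range(100)]
target, errors = flow_control(rows, lambda r: r["id"] % 10 != 0)
```

    Here 90 rows end up in target and 10 in errors; correcting the 10 error rows and re-running the interface would move them to the target, as the answer describes.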

  • Hyperion Planning on 64 bit Solaris. ODI on 32 bit Windows.

    Hi,
    I have Hyperion planning on 64 bit Solaris and ODI installed on 32 bit windows server. Could not reverse a planning app in ODI.
    The error message says - Could not connect to Planning instance.
    I have checked the RMI service at port 11333 on Solaris and it's running. I can telnet to this port from the Windows server successfully.
    Has any one come across such an issue before or knows of any issues regarding this?
    Any help will be greatly appreciated.
    Thanks.

    I had a problem with ODI being on Linux and Planning on Windows.
    I could ping / telnet across between the boxes. When reversing, it could connect to Planning but could not return; it turned out to be an issue with a loopback on Linux that had an extra IP of 10.0.0.0.
    I removed the loopback from the host and it started reversing without a problem.
    If you can telnet, then maybe it is some sort of routing / hosts file issue.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • How to import from Oracle to ODI View query definition?

    The scenario:
    I have a View in Oracle database with name 'XV'.
    When I loaded metadata from this Oracle database the 'XV' structure was imported too.
    Is it possible to get, in ODI, the 'XV' query definition in the form 'CREATE VIEW ...' or 'SELECT * FROM ...'?
    Thanks a lot for answers.

    Still cannot connect.
    The Administration Tool help suggests the tnsnames.ora is not required:
    "For Oracle Database data sources, the data source name is either a *full connect string* or a net service name from the tnsnames.ora file. If you enter a net service name, you must ensure that you have set up a tnsnames.ora file within the Oracle Business Intelligence environment, in ORACLE_HOME/network/admin".
    I've tried the tnsnames.ora route, but this still does not work. I created a tnsnames.ora file in <MW_HOME>\Oracle_BI1\network\admin. Is this correct?
    I'm assuming that I don't need an Oracle client install on the same (Windows) PC that I've got Oracle BI 11g installed on. Is this correct?

  • ODI suites to my requirement ??????

    Hi all,
    My requirement is: I will have a root folder, and inside that root folder I will have n number of sub folders.
    The client places different files inside each sub folder. The requirement is: can ODI pick up the files and, based on configuration from the DB, route them dynamically to the destination folder?
    The destination folder name should be read from the config DB, and the file should be routed there.
    Thanks
    Phani

    This is possible. You need to build the filename dynamically from the DB and pass it as a parameter for the Filename.
    http://blogs.oracle.com/dataintegration/2009/04/using_parameters_in_odi_the_dy_1.html
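    The pattern in that blog post amounts to: refresh a variable from the config DB, then use it as the dynamic resource name or move target. A rough sketch of the lookup side, with a hypothetical config table and column names:

```python
import sqlite3

# Hypothetical config DB mapping a sub folder to its destination folder.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE file_routes (subfolder TEXT, destination TEXT)")
con.execute("INSERT INTO file_routes VALUES ('invoices', '/data/out/invoices')")

def destination_for(subfolder):
    """Look up the destination folder for a given sub folder, or None."""
    row = con.execute(
        "SELECT destination FROM file_routes WHERE subfolder = ?",
        (subfolder,)).fetchone()
    return row[0] if row else None
```

    In ODI the equivalent query would sit in a refreshing variable, and the variable would then parameterize the file datastore's resource name or an OdiFileMove step.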

  • ODI: Can I use with Planning BPMA Apps?

    Hi,
    I'd like to using ODI to translate metadata/data for SQL AS to Planning Application.
    My Planning Application created over BPMA (not-classic).
    Can I use ODI to work with BPMA App?
    What a difference between usage of ODI for BPMA and Classic Apps?
    Edited by: Antony NoFog on 19.01.2010 12:17

    No you can't use ODI to directly update EPMA applications.
    You can use ODI to populate EPMA interface tables but expect a lot of transformation if you go down that route.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Router Transformation Error

    Hi,
    I tried doing the Router transformation by following the video.
    But it gives me the following error during package execution:
    ORA-00942: table or view does not exist.
    My project name is DEMO, and my variables are INIT and TRG_NAME.
    When I provide #DEMO.TRG_NAME in the Resource Name, as specified in the video,
    the interface issues an error.

    918554 wrote:
    Hey,
    I am a bit confused.
    Where do I need to refresh the variable, in the pkg or in the variable?
    You can refresh the variable from the pkg by right-clicking the variable and choosing to execute the step.
    Do I need to assign the Refresh Variable property, or do I need to click on the Refreshing tab?
    Not sure of the question; your variable should already be configured.
    In Operator, what should I check?
    After the variable refresh/execution, you can check the variable value in Operator.
    When I write this particular query in an editor it gives me an invalid character error.
    This is expected, as this part of the code is specific to ODI.
    Kindly guide me.
