ODI Questions

I have some ODI questions:
Can I install ODI without having Fusion installed?
Wouldn't ODI without Fusion be merely an ETL tool? (Assuming that you can install it without Fusion.)
Can I use ODI in a pub/sub architectural model for event-driven data changes? I would still need an ESB (or Fusion), correct?
Why would I use ODI instead of Fusion, or vice versa?
Thanks in advance

Can I install ODI without having Fusion installed? - I assume you mean without installing the SOA components of FMW. Note that ODI is also part of Oracle Fusion Middleware.
Wouldn't ODI without Fusion be merely an ETL tool? - Correct, although it is really ELT rather than ETL.
Can I use ODI in a pub/sub architectural model for event-driven data changes? - I guess that depends on the nature of the event, but it's certainly not the core area at which ODI is targeted.
Why would I use ODI instead of Fusion? - Check the white paper referenced in the other thread: Comparison of BPEL and ODI for Integrations
Gerhard

Similar Messages

  • ODI Question(s)

    I have some ODI questions:
    Can I install ODI without having Fusion installed?
    Wouldn't ODI without Fusion be merely an ETL tool? (Assuming that you can install it without Fusion.)
    Can I use ODI in a pub/sub architectural model for event-driven data changes? I would still need an ESB (or Fusion), correct?
    Why would I use ODI instead of Fusion, or vice versa?
    Thanks in advance

    What do you mean by Fusion? Fusion Middleware is a large stack of product suites, and ODI is one of those products. If you are starting with ODI then you may refer to:
    http://www.oracle.com/technetwork/middleware/data-integrator/overview/index.html
    http://download.oracle.com/docs/cd/E21764_01/integrate.1111/e12641/toc.htm
    http://download.oracle.com/docs/cd/E21764_01/core.1111/e16453/toc.htm
    Regards,
    Anuj

  • Basic ODI Question

    All:
    I am new to ODI and have some simple questions:
    I have built several interfaces that bulk load data into an Oracle database. These interfaces each load part of the data into various tables. I have created a package that strings these five interfaces together.
    Here is the question:
    * How can I, on a row-by-row basis, start a transaction, commit when the five interfaces have run correctly, and otherwise roll back?
    * The last interface has to either insert a record into a target table if there isn't one, or update the data that is there. The logic (like creating the primary key) is different in each case. How best to handle this?
    * How can I produce a detailed report on what was loaded into which table?
    If anyone can point me in the correct direction that would help.
    Thanks,
    Matt

    Hi Matt
    Q> How can I, on a row-by-row basis, start a transaction, commit when the five interfaces have run correctly, and otherwise roll back?
    You can set the commit option to false on the target, or in the KM steps set the commit option to "No Commit". Note that if you have a single interface and it executes successfully, the data is committed whether you selected "No Commit" or commit = false; if the interface fails, the data is not committed.
    Here you run five interfaces one after another, so select "No Commit" (or set commit to false) on each interface. If any interface fails, the session is unsuccessful, so nothing is committed. But if all the interfaces execute successfully, the session commits automatically at the end, regardless of the "No Commit"/commit = false settings.
    The moral: ODI issues a commit at the end of a successful session.
    Q> The last interface has to either insert a record into a target table if there isn't one or update the data that is there. The logic (like creating the primary key) is different in each case. How best to handle this?
    You can use IKM Incremental Update, or alternatively IKM Merge; in both cases updates and inserts are decided based on the keys you define, so there is nothing to worry about.
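    To see what such an insert-or-update flow boils down to on an Oracle target, here is a hedged sketch (the table, column and sequence names are invented for illustration; in practice the IKM generates the equivalent SQL from your mappings and update keys):
    MERGE INTO target_customer t
    USING (SELECT cust_code, cust_name FROM stg_customer) s
       ON (t.cust_code = s.cust_code)              -- match on the business key
    WHEN MATCHED THEN
      UPDATE SET t.cust_name    = s.cust_name,     -- update logic for existing rows
                 t.last_updated = SYSDATE
    WHEN NOT MATCHED THEN
      INSERT (cust_id, cust_code, cust_name)       -- insert logic, including a new surrogate key
      VALUES (cust_seq.NEXTVAL, s.cust_code, s.cust_name);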
    Q> How can I produce a detailed report on what was loaded into which table?
    You can use <%=odiRef.getPrevStepLog("STEP_NAME")%>. From it you can get all the details about your interface execution: how many updates, how many inserts, and so on.
    For more details on it, visit
    http://gerardnico.com/doc/odi/webhelp/en/ref_api/getprevsteplog.htm
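    As a minimal sketch (the audit table and its columns are hypothetical, and the property names should be verified against the getPrevStepLog reference above), you could add a procedure step after each interface with an Oracle "Command on Target" such as:
    INSERT INTO etl_load_audit (step_name, rows_inserted, rows_updated, load_date)
    VALUES ('<%=odiRef.getPrevStepLog("STEP_NAME")%>',      -- name of the previous step
            '<%=odiRef.getPrevStepLog("INSERT_COUNT")%>',   -- rows inserted by that step
            '<%=odiRef.getPrevStepLog("UPDATE_COUNT")%>',   -- rows updated by that step
            SYSDATE)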
    That's all, Matt. Hope this helps.
    Thanks

  • ERROR ODI-1134 Agent  encountered an error: ODI-1217: Session PRC_CALL_JAVA_GEO_CDC (472) fails with return code 7000. Caused by: ODI-1226

    ODI-1590: The execution of the script failed. Caused By: org.apache.bsf.BSFException: BeanShell script error: Sourced file: inline evaluation of: ``import oracle.odi.km.exception.OdiKmException; import GeoHub_migrator;  com.Data . . . '' : Typed variable declaration : Class: com.DataMigrator not found in namespace : at Line: 4 : in file: inline evaluation of: ``import oracle.odi.km.exception.OdiKmException; import migrator;  com.Data . . . '' : com .DataMigrator BSF info: PRC_CALL_JAVA_GEO at line: 0 column: columnNo
    at bsh.util.BeanShellBSFEngine.eval(Unknown Source)
    at bsh.util.BeanShellBSFEngine.exec(Unknown Source)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:357)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:173)
    at oracle.odi.runtime.agent.execution.interpreter.SessionTaskScriptingInterpretor.scripting(SessionTaskScriptingInterpretor.java:173)
    at oracle.odi.runtime.agent.execution.SessionTask.scripting(SessionTask.java:117)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:19)
    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
    at oracle.odi.runtime.agent.execution.SessionTask.processTask(SessionTask.java:214)
    at oracle.odi.runtime.agent.execution.SessionTask.doExecuteTask(SessionTask.java:135)
    at oracle.odi.runtime.agent.execution.AbstractSessionTask.execute(AbstractSessionTask.java:856)
    at oracle.odi.runtime.agent.execution.SessionExecutor$SerialTrain.runTasks(SessionExecutor.java:2004)
    at oracle.odi.runtime.agent.execution.SessionExecutor.executeSession(SessionExecutor.java:544)
    at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor$1.doAction(TaskExecutorAgentRequestProcessor.java:709)
    at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor$1.doAction(TaskExecutorAgentRequestProcessor.java:624)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:203)
    at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor.doProcessStartAgentTask(TaskExecutorAgentRequestProcessor.java:789)
    at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor.doProcessStartScenTask(StartScenRequestProcessor.java:570)
    at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$StartScenTask.doExecute(StartScenRequestProcessor.java:1182)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:177)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$1.run(DefaultAgentTaskExecutor.java:64)
    at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:49)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor.executeAgentTask(DefaultAgentTaskExecutor.java:78)
    at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor.doExecuteAgentTask(TaskExecutorAgentRequestProcessor.java:149)
    at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor.process(TaskExecutorAgentRequestProcessor.java:140)
    at oracle.odi.runtime.agent.RuntimeAgent.startScenario(RuntimeAgent.java:884)
    at oracle.odi.runtime.agent.InternalRuntimeAgent.startScenario(InternalRuntimeAgent.java:84)
    at com.sunopsis.dwg.tools.StartScen.startScenOnLocalAgent(StartScen.java:1224)
    at com.sunopsis.dwg.tools.StartScen.actionExecute(StartScen.java:324)
    at com.sunopsis.dwg.function.SnpsFunctionBaseRepositoryConnected.execute(SnpsFunctionBaseRepositoryConnected.java:192)
    at oracle.odi.runtime.agent.execution.SessionTask.execIntegratedFunction(SessionTask.java:942)
    at oracle.odi.runtime.agent.execution.SessionTask.executeOdiCommand(SessionTask.java:575)
    at oracle.odi.runtime.agent.execution.cmd.OdiCommandExecutor.execute(OdiCommandExecutor.java:44)
    at oracle.odi.runtime.agent.execution.cmd.OdiCommandExecutor.execute(OdiCommandExecutor.java:20)
    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
    at oracle.odi.runtime.agent.execution.SessionTask.processTask(SessionTask.java:214)
    at oracle.odi.runtime.agent.execution.SessionTask.doExecuteTask(SessionTask.java:135)
    at oracle.odi.runtime.agent.execution.AbstractSessionTask.execute(AbstractSessionTask.java:856)
    at oracle.odi.runtime.agent.execution.SessionExecutor$SerialTrain.runTasks(SessionExecutor.java:2004)
    at oracle.odi.runtime.agent.execution.SessionExecutor.executeSession(SessionExecutor.java:544)
    at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor$1.doAction(TaskExecutorAgentRequestProcessor.java:709)
    at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor$1.doAction(TaskExecutorAgentRequestProcessor.java:624)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:203)
    at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor.doProcessStartAgentTask(TaskExecutorAgentRequestProcessor.java:789)
    at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor.doProcessStartScenTask(StartScenRequestProcessor.java:570)
    at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$StartScenTask.doExecute(StartScenRequestProcessor.java:1182)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:177)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor

    This is the wrong forum for an ODI question.

  • Struggling with oracle clustered production environment

    Hi Experts,
    I have a requirement to upgrade Oracle Data Integrator (ODI) from 10.1.3.5 to 11.1.1.6.3. We have a clustered production environment where N1 is up when N2 is down, and vice versa.
    Here N1 and N2 are the ODI servers as well as the database (11g Release 2) servers. They both access the shared clustered database.
    From ODI we generally point to the Oracle cluster IP (virtual IP), which internally routes to whichever of N1 or N2 is active.
    On the ODI application side we are clear about the procedure.
    We have some questions about the database-related activities:
    1. Do I definitely have to break the cluster? Can't I do the activity without breaking the cluster?
    2. Do I need to point to N1, N2, or the cluster (virtual) IP while doing the activities?
    3. Since it is a clustered database, do I need to do the database-related activities once or twice? (Twice meaning manually on both servers.)
    4. As the nodes use the same file structures (RAC), if the virtual IP points to N1 by default, assume that I create two new users there and can log in successfully. If I then manually point to N2, what will happen when I try to log in as those two new users from N2? Will it work?
    5. If it will not work, then what would be the solution?
    Could someone please advise on this clustered production environment problem?
    As this is high priority, early response would be highly appreciated.
    I need expert suggestions on the sequence of steps to be carried out for a successful migration in a clustered environment.
    Many thanks in advance!

    This is an ODI question.
    Please discontinue your question here and continue it in your duplicate thread: Struggling with oracle clustered production environment
    Apart from that, ODI is a front-end tool. If you know anything about RAC: for ODI it doesn't matter whether your database is RAC or not; the issue is simply that you installed a front-end tool in a RAC environment.
    Finally, regarding "As this is high priority, early response would be highly appreciated": for high-priority questions there is paid support. This is a forum of volunteers.
    Asking for high priority is IMO insulting and rude.
    Sybrand Bakker
    Senior Oracle DBA

  • To build 2 real time measures in RPD

    Hi,
    I want to know how to build two real-time measures in the RPD by using Session and Repository Variables in the expression.
    With Regards,
    Sumedh

    Hi ,
    Is this an ODI question or an OBIEE one?
    For OBIEE, post your query in the Business Intelligence Suite Enterprise Edition forum.
    Thanks,
    Sutirtha

  • Help understanding ERPi and planning

    Hi All,
    Planning/Essbase. Version is 11.1.2.
    Can you please help me understand the role of the ERPi adapter. I have Hyperion Planning that sits on an Essbase cube. We load data and metadata from E-Business Suite 11. From forms and reports I want to drill back to the data in EBS. I have installed and configured the ERPi adapter and the scenarios in ODI.
    Question 1)
    Do I need to load data and metadata to be able to drill through?
    Question 2)
    I have set my target and source systems - where do I configure the "Drill Through"?
    thanks in advance for any help offered.

    Hi There,
    To answer your questions...
    1. This part of the process, from what I've seen, is integral to the product working - so the answer is yes.
    2. Drill-through is defined in a number of places; one of those is the "Target Application Registration" within ERPi, and there is also a flag within the adapter.
    Oracle has produced some good step-by-step documentation on this; please check My Oracle Support (Metalink) for note ID 951369.1, which will take you through the process.
    HTH
    Mark

  • How can you resolve an M:M relationship in the BI repository?

    HI,
    I want to know how we can resolve an M:M (many-to-many) relationship in the BI repository.
    Regards,
    Sumedh

    Hi ,
    Is this an ODI question or an OBIEE one?
    For OBIEE, post your query in the Business Intelligence Suite Enterprise Edition forum.
    Thanks,
    Sutirtha

  • Advantage of Using Quick Paint Reports

    Hello
    Can anyone please let me know what the advantage of using a Quick Paint report is?
    Thanks
    Regards
    Ramesh Kumar S

    This is more of a design/reporting question than an ETL/ODI question ;).
    The advantage of the date dimension is that you can select or aggregate data per month, year, quarter, weekend, day of week, and so on.
    If you only keep the date, you will have to add all these complex formulas in logical columns or calculated items (assuming you use OBIEE). Performance will suffer and code will be duplicated in many locations.
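    To make it concrete, here is a hedged sketch of what the date dimension buys you (table and column names are invented for illustration) - a single join gives any calendar rollup without per-report formulas:
    SELECT d.calendar_year, d.calendar_quarter, SUM(f.revenue) AS revenue
    FROM   fact_sales f
    JOIN   dim_date   d ON d.date_key = f.date_key
    WHERE  d.is_weekend = 'N'                      -- e.g. exclude weekends
    GROUP  BY d.calendar_year, d.calendar_quarter;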
    Hope it answers the question.
    Jerome

  • BI Apps ODI Load Plan Execution Error and Question on Rerun

    I am following the technetwork Cookbook: Installing and Configuring Oracle BI Applications 11.1.1.7.1 to install and configure my first BI Apps environment with ODI. Everything was smooth until I executed the Load Plan.
    The following errors were found on the first attempt at execution. Without making any change, I re-executed the load plan; it errored again, but this time the message was different (second attempt).
    Question:
    - How should I start to diagnose the ODI error?
    - Apparently something was left behind by the failed execution; how can I clean it up before rerunning the Load Plan?
    Many thanks.
    ODI errors on first execution
    ODI-1519: Serial step "Start Load Plan (InternalID:1500)" failed because child step "Global Variable Refresh (InternalID:2500)" is in error.
    ODI-1519: Serial step "Global Variable Refresh (InternalID:2500)" failed because child step "1 Domain (InternalID:5500)" is in error.
    ODI-1519: Serial step "1 Domain (InternalID:5500)" failed because child step "2 Domain SDE (InternalID:35500)" is in error.
    ODI-1519: Serial step "2 Domain SDE (InternalID:35500)" failed because child step "Serial (InternalID:36500)" is in error.
    ODI-1519: Serial step "Serial (InternalID:36500)" failed because child step "3 SDE General Domain (InternalID:54500)" is in error.
    ODI-1519: Serial step "3 SDE General Domain (InternalID:54500)" failed because child step "Load Target Table (InternalID:55500)" is in error.
    ODI-1519: Serial step "Load Target Table (InternalID:55500)" failed because child step "EBS_12_1_1 - DSN 1000 (InternalID:56500)" is in error.
    ODI-1519: Serial step "EBS_12_1_1 - DSN 1000 (InternalID:56500)" failed because child step "DOMAIN (InternalID:57500)" is in error.
    ODI-1519: Serial step "DOMAIN (InternalID:57500)" failed because child step "Parallel (InternalID:58500)" is in error.
    ODI-1518: Parallel step "Parallel (InternalID:58500)" failed; 5 child step(s) in error, which is more than the maximum number of allowed errors (0) defined for the parallel step.  Failed child steps: COMMON (InternalID:59500), HUMAN_RES (InternalID:90500), FINANCIALS (InternalID:95500), SUPP_CHAIN (InternalID:122500), PROJECTS (InternalID:98500)
    ODI-1518: Parallel step "COMMON (InternalID:59500)" failed; 11 child step(s) in error, which is more than the maximum number of allowed errors (0) defined for the parallel step.  Failed child steps: SDE_ORA_DOMAINGENERAL_PRODUCT_CLASS (InternalID:69500), SDE_ORA_DOMAINGENERAL_FND_LOOKUPS PAY_GROUP (InternalID:88500), SDE_ORA_DOMAINGENERAL_FND_LOOKUPS MARITAL_STATUS (InternalID:85500), SDE_ORA_DOMAINGENERAL_PRODUCT_MASTERORG (InternalID:73500), SDE_ORA_DOMAINGENERAL_PRODUCT_CATEGORYNAME (InternalID:67500), SDE_ORA_DOMAINGENERAL_STATE (InternalID:61500), SDE_ORA_DOMAINGENERAL_FND_LOOKUPS CUSTOMER_CATEGORY (InternalID:83500), PRODCAT (InternalID:75500), SDE_ORA_DOMAINGENERAL_FLEXFIELD (InternalID:89500), SDE_ORA_DOMAINGENERAL_FND_LOOKUPS ORGANIZATION_SIZE (InternalID:82500), UOM - Serial (InternalID:78500)
    ODI-1217: Session SDE_ORAR1211_ADAPTOR_SDE_ORA_DOMAINGENERAL_PRODUCT_CLASS (38500) fails with return code 8000.
    ODI-1226: Step Run SDE_ORA_DomainGeneral_Product_Class fails after 1 attempt(s).
    ODI-1240: Flow Run SDE_ORA_DomainGeneral_Product_Class fails while performing a Integration operation. This flow loads target table W_DOMAIN_MEMBER_GS.
    ODI-1228: Task SDE_ORA_DomainGeneral_Product_Class (Integration) fails on the target ORACLE connection BIAPPS_DW.
    Caused By: java.sql.SQLException: Listener refused the connection with the following error:
    ORA-12516, TNS:listener could not find available handler with matching protocol stack...
    ODI errors on second execution
    ODI-1519: Serial step "Start Load Plan (InternalID:1500)" failed because child step "Global Variable Refresh (InternalID:2500)" is in error.
    ODI-1519: Serial step "Global Variable Refresh (InternalID:2500)" failed because child step "1 Domain (InternalID:5500)" is in error.
    ODI-1519: Serial step "1 Domain (InternalID:5500)" failed because child step "2 Domain SDE (InternalID:35500)" is in error.
    ODI-1519: Serial step "2 Domain SDE (InternalID:35500)" failed because child step "Serial (InternalID:36500)" is in error.
    ODI-1519: Serial step "Serial (InternalID:36500)" failed because child step "3 SDE General Flexfield (InternalID:37500)" is in error.
    ODI-1519: Serial step "3 SDE General Flexfield (InternalID:37500)" failed because child step "Finalize Flexfield (InternalID:50500)" is in error.
    ODI-1519: Serial step "Finalize Flexfield (InternalID:50500)" failed because child step "EXEC_TABLE_MAINT_PROC (InternalID:51500)" is in error.
    ODI-1217: Session EXEC_TABLE_MAINT_PROC (107500) fails with return code 20000.
    ODI-1226: Step TABLE_MAINT_PROC fails after 1 attempt(s).
    ODI-1232: Procedure TABLE_MAINT_PROC execution fails.
    ODI-1228: Task TABLE_MAINT_PROC (Procedure) fails on the target ORACLE connection BIAPPS_DW.
    Caused By: java.sql.SQLException: ORA-20000: Error creating Index/Constraint : W_FLEX_SQL_G_U1 => ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found

    Hi,
    The first error is:
    ORA-12516, TNS:listener could not find available handler with matching protocol stack...
    The above is due to the listener refusing connections while the load plan is executing. This can often be solved by restarting the load.
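    If restarting does not help, this error often means the database ran out of processes/sessions while the load plan opened many parallel connections. A hedged check (and an example-only adjustment, to be validated with your DBA) on the target database:
    -- How close are we to the configured limits?
    SELECT resource_name, current_utilization, max_utilization, limit_value
    FROM   v$resource_limit
    WHERE  resource_name IN ('processes', 'sessions');
    -- Example only: raise the limit (takes effect after a database restart)
    ALTER SYSTEM SET processes = 300 SCOPE = SPFILE;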
    The second error is:
    ORA-20000: Error creating Index/Constraint : W_FLEX_SQL_G_U1 => ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found
    The above is due to duplicate records in the table W_FLEX_SQL_G. You can solve this by deleting the duplicate records in that table, as described in the link below:
    OBIEE, Endeca and ODI: BIApps and ODI 11.1.1.7.1 Full Load
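    If you prefer to clean the table directly, here is a hedged sketch of the usual ROWID-based de-duplication (the columns covered by the W_FLEX_SQL_G_U1 index are not listed in this thread, so key_col1/key_col2 are placeholders you must replace with the real key columns):
    DELETE FROM w_flex_sql_g a
    WHERE  a.ROWID > (SELECT MIN(b.ROWID)
                      FROM   w_flex_sql_g b
                      WHERE  b.key_col1 = a.key_col1   -- placeholder key column
                      AND    b.key_col2 = a.key_col2); -- placeholder key column
    COMMIT;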
    Regards,
    Saro

  • Cluster with 2 linux machines and ODI console - some questions

    Hello,
    I need to set up a domain with the ODI plug-ins (console etc.) in a clustered environment. The OS is Oracle Linux 6.3.
    I've read this documentation: http://docs.oracle.com/cd/E13222_01/wls/docs81/adminguide/createdomain.html#CreateClusteredDomain and I have some additional questions:
    - I know I need to install WebLogic on both machines. Should I install Oracle Data Integrator on both machines as well?
    - Creating domains, starting domains, etc.: I assume I should do this on the first server (for example, via ssh)? Or will I need to log in via the cluster IP address?
    - Multicast address: this is not entirely clear to me. Should this IP already exist in my environment, i.e. should I configure my network interfaces somehow? Or do I simply need to provide any IP from 224.0.0.0 to 239.255.255.255 and it will work?

    MukeshNegi wrote:
    Which version of WebLogic are you using?
    WebLogic 11g, 64-bit.
    MukeshNegi wrote:
    If you are using a shared filesystem between your machines then you don't need to install again on the second server; simply register the ORACLE_HOME for ODI and oracle_common with the oraInventory on the second server.
    What do you mean by "shared file system"? Let's say I have two separate physical machines that exist on the same LAN. And I assume ORACLE_HOME is the WebLogic home directory, but what is "oracle_common"? Can you describe all of this in more detail?
    MukeshNegi wrote:
    Simply go to $ODI_ORACLE_HOME/common/bin on server1
    run config.sh and select the following from the domain templates:
    - Oracle Enterprise Manager Plug-in for ODI
    - Oracle Enterprise Manager Plug-in
    - Oracle Data Integrator Console
    - Oracle Data Integrator Agent
    - Oracle JRF
    Should I do the same on the second server machine? If not, how will WebLogic know about the other physical machine in my network and that it should be available to join my cluster? There is no domain and no admin server set up on the second server; shouldn't I do this? There are a lot of tutorials describing how to set up a cluster via config.sh or the Enterprise Manager console, but:
    - They describe how to add a managed server to my cluster, but I need to know about physical machines. So, should I create a managed server on the second machine somehow? And my domains - should they be re-created the same way on the second server? I can't find any information about this; there are only Enterprise Manager screenshots showing how to create a managed server on the same physical machine and how to join managed servers into one cluster. None of this tells me what I should do to complete my scenario.
    - Cluster IP address: I still don't understand this. The end user should be able to access the ODI console via the cluster address, am I right? So, is any system network configuration required? How is this IP address created?
    - I have set up all of this (ODI/WebLogic/domain) on a single machine, and I have a second server with only the operating system installed (the same Oracle Linux). What is the simplest way to join this second physical machine and make all of this work as a clustered environment? Is there any step-by-step instruction/tutorial describing ALL the steps that should be done?
    Sorry for the basic questions; I'm really a newbie at this and I hope you are patient enough to answer all of them ;)

  • ODI database connection - session question

    Hi,
    I have a package, where I have used a procedure.
    In my procedure, I open a database connection.
    My question is: in my next procedure, will I be able to use the same database connection that I created in the previous step?
    Rephrasing the question: for any number of database transactions happening in a package, does ODI open multiple database connections (one per step), or is there only one single database connection?
    Thank you,
    Paras

    I believe it is one connection per session, so one connection inside the package. I think you should be able to reuse it, although I have not tried it in practice.

  • ODI RFI questions

    Dear all,
    I am new to ODI and would like to seek answers for the RFI questions posted by a prospect:
    Built-in Functions for Data Validation: Does the tool have functions for data validation (like ‘delete duplicates’, ‘missing values’, ‘incorrect data’)? # 100% match
    Splitting Data Streams/Multiple Targets: Is it possible to read a data source once and load the results into two or more tables?
    Conditional Splitting: The same, but in a conditional way, for example, if revenue is higher than 1000 put the results in table 1, else in table 2
    Union: Put rows of different tables with the same structure into one table or dataset
    Pivoting: Is it possible to transform de-normalised data, having data in the column names, into rows
    Depivoting: The other way around, transform (highly) normalised data to de-normalised data, putting data in the columns
    Key Lookups in Memory: Can one load a table completely into internal memory and search the table? (without having to make joins)
    Key Lookups Reusable across Processes: Are these tables reusable across different loading processes in such a way the key lookup table is loaded once into memory?
    Impact Analysis: Is it possible to make an impact analysis of proposed changes (when an attribute or table must change)
    Support for Data Mining Models: Is it possible during the loading process to make use of the results of a data mining process?
    Debugging Support
         Step-by-step running: Can one run the process flow step-by-step?
         Row-by-row running: Can one run the process flow row-by-row?
         Breakpoints: Can one set a breakpoint on a particular process step or a row of data?
         Watches: Can one define watch points, so the system postpones running when a certain condition is met?
         Compiler / Validater: Is it possible to validate the process flow (and/or code) with one click of the mouse and are errors reported and marked?
    Workflow Monitor: Does it provide utility to monitor and manage the runtime environment in real time?
    Powerful Scheduler: Does it support graphical job sequencer and nested ETL session? Can it schedule ETL sessions based on time or the occurrence of a specified event, including support for command-line scheduling
    Supports CWM: Is the ETL tool CWM-compliant, in other words does it support the Common Warehouse Meta Model?
    Server GRID: Does it support grid computing to leverage available computing resources to maximize throughput and fault tolerance?
    High Availability: Does it support HA? How?
    Regards,
    William

    Hi William,
    Let me try to contribute a little.
    It is possible to meet all of these requirements, but some need customization; I have already solved almost all of these points through customization.
    Anyway, I would recommend contacting an Oracle representative, who can give you better help...
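    For example, conditional splitting of a single source read into two targets maps naturally to an Oracle multi-table insert; a hedged sketch with invented table and column names:
    INSERT FIRST
      WHEN revenue > 1000 THEN
        INTO sales_high (order_id, revenue) VALUES (order_id, revenue)
      ELSE
        INTO sales_low  (order_id, revenue) VALUES (order_id, revenue)
    SELECT order_id, revenue
    FROM   sales_src;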
    Cezar

  • QUESTION:  Essbase data extraction and Installing ODI Agent??

    For extracting data from Essbase cubes, ODI has "LKM Hyperion Essbase DATA to SQL".
    We can use (1) a Report Script, (2) an MDX query, or (3) a Calc Script.
    For data extraction using a Calc Script, the ODI agent must be running on the same server as the Essbase server.
    Does anyone know if there is a need for the ODI agent on the Essbase machine if we use the MDX query method for data extraction?
    We would like to avoid installing ODI Agent for Essbase data-extraction.

    Thanks John.
    One related question: to move data from one Essbase cube to another Essbase cube using an ODI interface, can we do it efficiently through an MDX query?
    We want to avoid replicated partitioning or Calc Scripts, if possible.
    BTW... Your ODI/Hyperion blog is a bible for us.

  • Questions about ODI Agent

    Being new to ODI, I have some simple questions.
    ODI is installed on a Windows 2008 machine. Its repository is in a MS SQL Server database, which is also on a Windows 2008 machine.
    A non-ODI process is creating files on a Unix server, which need to be picked up by ODI and loaded into a MS SQL Server table.
    (Q 1): Do we need to install an ODI agent on Unix so that ODI can "see" the Unix files? Or is there another, simpler way?
    (Q 2): Is there an Oracle link about installing the ODI 11g agent on Unix? I only see documents talking about installing the ODI agent on Windows.
    (Q 3): What minimum ODI components need to be installed on Unix so that the ODI agent runs on Unix?

    Nasar Ali-Khan wrote:
    Being new to ODI, I have some simple questions.
    ODI is installed on a Windows 2008 machine. Its repository is in a MS SQL Server database, which is also on a Windows 2008 machine.
    A non-ODI process is creating files on a Unix server, which need to be picked up by ODI and loaded into a MS SQL Server table.
    (Q 1): Do we need to install an ODI agent on Unix so that ODI can "see" the Unix files? Or is there another, simpler way?
    You need to install an ODI agent on Unix, or else access the Unix files from a Windows application, e.g. via Samba.
    (Q 2): Is there an Oracle link about installing the ODI 11g agent on Unix? I only see documents talking about installing the ODI agent on Windows.
    The agent installation process is basically the same.
    (Q 3): What minimum ODI components need to be installed on Unix so that the ODI agent runs on Unix?
    Go for an agent-only installation.
    Thanks,
    Sutirtha
