Moving data with ODI in a 24-7 shop
Anybody have experience moving data in a 24-7 IT shop?
Yes you can help, Cezar. Did you have any problem with locking tables when you were transferring data?
Similar Messages
-
Slow performance while integrating data with ODI when a report is open
In ODI I have a scheduled package that runs every 4 hours; this package loads data from a DB table to BAM.
Normally each execution loads about 4,000 records and takes about 7 seconds to complete.
However, in some executions the same 4,000 records take about 30 minutes to load into BAM; this normally happens when a user has a BAM report open in the browser.
Regards
Hugo Calado
Edited by: user12198856 on 23/Aug/2011 3:45
Hi Manisha,
When previewing a report, Report Designer needs to connect to the report data sources, run the dataset queries, cache the data on the local computer, process the report to combine data and layout, and render the report. The report processor also runs all the queries for the datasets in the report using the current parameter defaults, and saves the results as a local data cache (.rdl.data) file.
The first preview of a report that relies on a server reference is therefore the slowest. In your scenario, you can try to improve preview performance through the link below:
http://technet.microsoft.com/en-us/library/ee240846.aspx
Hope this helps.
Regards,
Heidi Duan
TechNet Community Support -
HFM DATA LOAD WITH ODI HANGS LONG TIME
Hi all,
There's a very strange problem when I load data from MS SQL Server to HFM with ODI. Specifically, there are 2 interfaces to 2 applications on the same HFM server, with data volumes of about 1,300,000 and 650,000 records respectively.
The strange thing is that when I execute each interface individually, it sometimes works well. However, when I execute the package that contains the 2 interfaces, the larger one hangs for about 10+ hours almost every time, whether I use an agent or not.
After some research, it seems the session hangs because it cannot get return info from HFM even though loading the data has already completed. I found some similar problems on OTN, like a 64-bit driver and JRE compatibility error, or a deadlock on a table, but they differ from this one. So, can anyone help with this? Much appreciated in advance!
BTW, ODI and HFM are on the same server, but the ODI repository and the source of the interface are on another MS SQL data server. The versions are as below:
HFM 11.1.1.3.0.956
ODI 11.1.1.6.0
win server 2003 x86
MS SQLServer 2008 R2
win server 2008 x64
Regards,
Steve
Hi SH,
The source is MS SQL Server 2008 R2, the staging area is on the source side, and the target is HFM 11.1.1.3.0.956 based on SQL Server.
The KM is the standard 'IKM SQL to Hyperion Financial Management Data'.
There is no transformation logic, only a filter to select data in the current year.
Besides, I have done some performance tuning as the guide suggests:
REM #
REM # Java virtual machine
REM #
set ODI_JAVA_HOME=D:\oracle\Java\jdk1.6.0_21
REM #
REM # Other Parameters
REM #
set ODI_INIT_HEAP=512m
set ODI_MAX_HEAP=1024m
set ODI_JMX_PROTOCOL=rmi
In Regedit:
EnableServerLocking: 1
MaxDataCacheSizeinMB :1000
MaxNumDataRecordsInRAM: 2100000
MultiServerMaxSyncDelayForApplicationChanges:300
MultiServerMaxSyncDelayForDataChanges:300
After some research, I think the problem is located in the HFM-ODI adapter or on the HFM side (maybe HFM cannot send a completion message back to ODI). Do you have any idea? Thanks in advance -
How to export data from DB to flat file with ODI?
we want to export DB2 tables to flat files.
Who can tell me how to do it with ODI?
Please give me a simple example or steps.
Thank you.
There are two ways:
Either use IKM Sql to File Append
(or)
OdiSqlUnload
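As a rough sketch (the driver, URL, credentials, file path, and query below are all illustrative, not from the original thread), an OdiSqlUnload call in a package step might look like this:

```
OdiSqlUnload "-FILE=/data/export/customers.txt" "-DRIVER=com.ibm.db2.jcc.DB2Driver"
  "-URL=jdbc:db2://db2host:50000/MYDB" "-USER=odi_user" "-PASS=<encoded_password>"
  "-FILE_FORMAT=VARIABLE" "-FIELD_SEP=;"
  "SELECT * FROM MYSCHEMA.CUSTOMERS"
```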
For OdiSqlUnload you can use this technique to provide the connection parameters - http://odiexperts.com/?p=1985 -
Issues with ODI connection to Hyperion Financial Management
Hi,
I am having some issues with ODI connection to Hyperion Financial Management. ODI and Financial management are setup on different machines.
The ‘Cluster(Data Server)’ name in Topology is given as ‘hfm03cl’. On reverse-engineering a model it gives the following error:
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last):
File "<string>", line 38, in ?
com.hyperion.odi.common.ODIHAppException: Error occurred in driver while connecting to Financial Management application [HFM1] on [hfm03cl] using user-name [admin].
at com.hyperion.odi.hfm.ODIHFMAppConnection.<init>(ODIHFMAppConnection.java:58)
When HFM is installed on the same machine, it works fine. How does ODI identify a cluster on a different machine? Do I have to give the machine name along with the cluster name? If so, what is the format?
Thanks,
Su
You will need to have the HFM client installed on any machine that will be running ODI (i.e. Designer or an Agent) and executing connections to an HFM server.
Once you have the HFM client installed, see if you can actually connect to HFM from that machine using the HFM client.
A couple of other things: in the Physical Schema, have you set the Application Catalog and Work Catalog to the same HFM application?
Also, when you are reversing, are you using the local agent or one you have installed using agentservice?
geeo -
Which CKM is used for moving data from Oracle to delimited file ?
Hi All
Please let me know which CKM is used for moving data from Oracle to a delimited file.
Also, is there a need to define each column beforehand in the target datastore? Can't ODI take it from the Oracle table itself?
Addy,
A CKM is a Check KM, which is used to validate data and log errors. It is not going to assist you with data movement. You will need the IKM SQL to File Append, as answered in another thread.
Assuming that you have a one to one mapping, to make things simpler you can duplicate the Oracle based model and create a file based model. This will take all the column definitions from the Oracle based model.
Alternatively, you can also use an ODI tool odiSQLUnload to dump the data to a file
HTH -
Unable to install the Demonstration Environment with ODI 11g
Hi,
Can someone please help me with Demonstration environment installation with ODI 11g on a windows system.
I have downloaded the zipped file from the OTN site.
I am unable to understand the installation steps in the user guide. Posting them for your reference.
"To manually install the Demonstration environment, do the following:
Unzip oracledi-demo.zip in the ODI_HOME folder.
Verify that the JAVA_HOME environment variable is set and contains the path of a JVM suitable for Oracle Data Integrator.
If this variable is not set correctly, set it to a valid java machine location.
For example:
On UNIX operating systems:
setenv JAVA_HOME /usr/local/java
On Windows operating systems:
Set the JAVA_HOME variable graphically "
Regards,
Lovey Saxena
Hi;
Did you try to start the installation? Your JAVA_HOME may already be available in your environment settings, so you don't need to set it for the installation. If it fails, please try this syntax:
Set JAVA_HOME=C:\xxx
Regards,
Helios -
How to Read file name which we are dealing with ODI File tool
Hi,
We are using ODI 10g and we have a requirement to move files from one place to another. We are using the OdiFileMove utility, but we also want to read the file name.
Any help.
Thanks in advance.
You can accomplish this with a fairly simple Jython script. Use the os.listdir(<directory>) command to get the names of the files in a given directory.
You can then (still in the Jython script) loop through the files and move them to a desired location (bypassing the OdiFileMove tool) OR use the Jython script to write the file names to a SQL table. Then, use an ODI procedure to loop through the newly inserted records and store the file name in an ODI variable that you can then use in your OdiFileMove tool etc.
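A minimal sketch of the first approach, with hypothetical source and target directories (shown as plain Python; in ODI it would run as a Jython step in a procedure):

```python
import os
import shutil

def move_files(src_dir, dst_dir):
    """List every file in src_dir and move it to dst_dir.
    Returns the list of file names that were moved."""
    moved = []
    for name in os.listdir(src_dir):
        src_path = os.path.join(src_dir, name)
        if os.path.isfile(src_path):  # skip subdirectories
            shutil.move(src_path, os.path.join(dst_dir, name))
            moved.append(name)
    return moved
```

Each name in `moved` could then be written to a SQL table or an ODI variable, as described above, instead of bypassing OdiFileMove entirely.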
I often refer to this blog entry from Gurcan Orhan as a starting point for this kind of task: Loading multiple files with ODI | Gurcan Orhan's Oracle Data Integrator Blog -
How to use odiRef.getJDBCConnection("WORKREP") to store data in ODI Vars
[blog |http://bahchis.com/2011/03/03/odi-repositories/] Sergey Bahchissaraitsev
Hi Everyone,
I saw in this blog the new 11g functionality to get the JDBC Connection Object for the Work Repository. I really want to use this in an ODI variable. Specifically I want to read the current Operator logs in the Work Repository and store this one value in an ODI Variable.
Previously I used topology to guide me to the correct Work Repository but this is cumbersome. I want to go directly to the current Work Repository for my connection and the new parameter "WORKREP" allows for this.
Here's my old code in an ODI variable. This code requires a Schema to be setup.
select count(*)
from snp_sess_task_log
where sess_no = <%=odiRef.getSession("SESS_NO")%>
How can I get the same result in an ODI variable using the odiRef.getJDBCConnection("WORKREP") function to point me to the correct Work repository? Using this would be a huge improvement to how I retrieve operator data in ODI. Appreciate your thoughts.
Sincerely,
Chris Rothermel
Hi Sergey,
Thanks for responding here as well! As I wrote on your blog...
Thanks for writing back. I have been using the logical and physical schemas, but there are limitations and it becomes overly complex.
Thanks for your post pointing out the new 'WORKREP' parameter. With it I'm now able to write some Jython code to pull in the values from 'WORKREP' regardless of the topology settings. Once that is done, I write them to the temporary in-memory DB and then read the values into my ODI variables.
Your blog, and specifically the 'WORKREP' parameter, is the key to this solution. It took me most of the day, but I got it working.
Yes, I hear your concerns about the In-Memory Engine. I got this tip from ODI Experts and it is actually serving my needs very well. I did have a hiccup with it: when I CLOSED the connection, I couldn't access the values in my session from the temporary table. I think closing it may have wiped the memory. Something to look at further.
The good news is I only need the In-Memory Engine for temporary storage. All of the permanent values are in the Operator repository; I just use the In-Memory Engine as a way to populate ODI variables, since we can't use JDBC to populate ODI variables directly.
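As a rough sketch of that pattern, with sqlite3 standing in for both the WORKREP connection and the In-Memory Engine (in ODI the first connection would come from odiRef.getJDBCConnection("WORKREP"); snp_sess_task_log is the real Operator log table, but the helper names and the odi_vars staging table are illustrative):

```python
import sqlite3

def count_session_tasks(workrep_conn, sess_no):
    """Run the Operator-log count over the work repository connection,
    mirroring: select count(*) from snp_sess_task_log where sess_no = ?"""
    cur = workrep_conn.cursor()
    cur.execute("SELECT COUNT(*) FROM snp_sess_task_log WHERE sess_no = ?",
                (sess_no,))
    return cur.fetchone()[0]

def stash_for_odi_variable(memdb_conn, name, value):
    """Write the value to the in-memory DB so a later ODI variable
    refresh step can SELECT it back (keep this connection open,
    closing it wipes the in-memory data)."""
    memdb_conn.execute(
        "CREATE TABLE IF NOT EXISTS odi_vars (name TEXT, value INTEGER)")
    memdb_conn.execute("INSERT INTO odi_vars VALUES (?, ?)", (name, value))
```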
It was a good day and I appreciate your blog as well as the ODI Experts website where I found a nice Jython for ODI Beginners document.
Have a great weekend! -
Problem using SQL Loader with ODI
Hi,
I am having problems using SQL*Loader with ODI. I am trying to fill an Oracle table with data from a txt file. At first I used the "File to SQL" LKM, but due to the size of the source txt file (700 MB), I decided to use the "File to Oracle (SQLLDR)" LKM.
The error that appears in myFile.txt.log is: "SQL*Loader-101: Invalid argument for username/password"
I think the problem could be in the definition of the data server (physical architecture in Topology), because I have left Host, User and Password blank.
Is this the problem? What host and user should I use? "File to SQL" works fine leaving these blank, but takes too much time.
Thanks in advance
I tried to use your code, but I couldn't make it work (I don't know Jython). I think the problem could be with the use of quotes.
Here is what I wrote:
import os
retVal = os.system(r'sqlldr control=E:\Public\TXTODI\PROFITA2/Profita2Final.txt.ctl log=E:\Public\TXTODI\PROFITA2/Profita2Final.txt.log userid=MYUSER/myPassword @ mySID')
if retVal == 1 or retVal > 2:
raise 'SQLLDR failed. Please check the for details '
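As an aside, the spaces around '@' split the userid argument into several tokens, which is exactly the kind of thing that produces "SQL*Loader-101: Invalid argument for username/password". A corrected sketch (paths, credentials, and SID are illustrative; raising an Exception object also avoids Jython's deprecated string exceptions):

```python
import os

def build_sqlldr_command(ctl, log, user, password, sid):
    """Build the sqlldr command line. The userid argument must be one
    token with no spaces: userid=user/password@sid."""
    return ('sqlldr control=%s log=%s userid=%s/%s@%s'
            % (ctl, log, user, password, sid))

def run_sqlldr(ctl, log, user, password, sid):
    """Invoke sqlldr and fail loudly on any nonzero exit status."""
    retval = os.system(build_sqlldr_command(ctl, log, user, password, sid))
    if retval != 0:
        raise Exception('SQLLDR failed. Please check the log for details.')
```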
And the error message is:
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last):
File "<string>", line 5, in ?
SQLLDR failed. Please check the for details
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.i(e.java)
at com.sunopsis.dwg.cmd.h.y(h.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source) -
I am a z/OS system programmer; our company uses IMS as its main OLTP database. We are investigating moving data off the mainframe for data warehousing and online fraud detection. One option is using IBM InfoSphere CDC and DB2; another option is using IMS Connect and writing our own program. I am wondering what the Oracle solution is for this kind of issue?
I am not an Oracle technician, but I googled and found that Oracle has some products like Oracle Legacy Adapters, OracleAS CDC Adapter and Oracle Connect on z/OS; however, I didn't find them on the Oracle site (https://edelivery.oracle.com/), so I don't know whether these products are deprecated or not.
I would very much appreciate any help or guidance you are able to give me.
Thank you for responding.
I've considered dumping the data into a flat file and using SQL*Loader to import it, as you suggest, but this would require some scripting on a per-table basis. Again: all I want to do is copy the contents of a table from one database to another. I do not think I should have to resort to creating my own dump and load scripts in order to do that. However, I agree with you that this type of solution may end up being my final solution.
I've tried the db link solution. It was just as slow as the 'imp' solution for some reason; I don't know why. The tables are rather large (3 tables of a few GB each) and therefore require intermediate commits when loading; otherwise the rollback segment will run out of space. So the 'db link solution' is really a PL/SQL script with a commit for every x records.
I think Oracle is making it a bit difficult for me to copy the contents of a table from one database to another and to do it efficiently. Perhaps I'm missing something here? -
Hi,
Is Hyperion EPM version 11.1.1.2 compatible with ODI 10.1.3.5.0 for Planning/HFM metadata loading and Essbase data loading?
Thanks,
Naveen Suram
Hi,
They are certified for version 11, though there is still an issue when you use the calc script method for exporting data.
There is also a problem with loading data that exists on all versions; it is due to be fixed in the next EPM release later this year.
Cheers
John
http://john-goodwin.blogspot.com/ -
How to do file validation with ODI 10.1.3.5
Hi Team,
Please help me: how do I handle file validation with ODI?
My source is files.
Requirement:
With ODI I have to do the file validation and load the file data successfully. Once that is done, I have to move the particular file to a different directory.
Is there any internal tool in an ODI package to watch for the file?
Regards,
Suresh
Hi Suresh,
You can do your file loading and validation in an ODI interface.
Once you are done with the interface, add it to a package; you can then set up subsequent steps in the package to move the file to another location using the built-in ODI file utilities.
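For example, a package could chain the OdiFileWait and OdiFileMove tools; the directory names, file pattern, and timeout values below are purely illustrative:

```
OdiFileWait "-DIR=/data/inbox" "-PATTERN=orders_*.csv" "-TIMEOUT=60000" "-POLLINT=1000"
OdiFileMove "-FILE=/data/inbox/orders_*.csv" "-TODIR=/data/archive" "-OVERWRITE=YES"
```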
Terrence. -
How to configure OBIA 11.1.1.8.1 with ODI
Hi forum,
I want to configure Oracle Business Intelligence Apps 11.1.1.8.1 installed on Linux with ODI ver 11.1.1.7.
Can anybody please provide:
1) A tutorial about how to setup ODI for OBIA 11.1.1.8.1?
2) Than how to perform functional configuration using Oracle Data Integrator, Oracle BI Applications Configuration Manager and Functional Setup Manager?
3) How to Create, Run and Customize ETL in Oracle Data Integrator?
Thanks
a) Do what most people do and read the documentation (Oracle Business Intelligence Applications 11g Release 1 (11.1.1.8.1) Documentation Library) that comes with the product; the installation details are documented there. No one is going to go through all the installation steps for you in these forums. The links I provided before are good starting points. If you are still struggling, hire a consultant to help you out.
b) ODI Studio is just a client, so you can install it on whatever OS you like. The repositories are simply databases and can be installed on a database whether it is hosted on Windows or Linux, and the agent is Java, so it too is platform independent. You can configure the setup distribution however you want, as long as the repositories can be connected to via JDBC from the Studio client. -
I want to export data from Essbase v9 cubes using ODI. ODI and the Essbase server are on different servers.
I am using a calculation script in the LKM part, but the exported files end up on the Essbase server and ODI is not able to recognize them.
A report script takes too much time to export the data, hence the calculation script.
Is this something related to the agent? I am not able to create an agent on the Essbase server; I am only successful in creating an agent on the local system where ODI is installed.
Please suggest.
Regards,
Murali
Are you on 10.1.3.6.x? Then: http://john-goodwin.blogspot.com/2008/09/odi-series-part-2-agent.html
Are you on 11g? Then: http://john-goodwin.blogspot.com/2010/12/managing-odi-11g-standalone-agents.html
I will say with only a mild amount of shame and a large amount of gratitude that I installed both releases' agents through John's blog posts.
Regards,
Cameron Lackpour
Edited by: CL on Jun 4, 2012 5:48 PM
Whoops, had the same link in there twice.