Journalizing in ODI
Hi,
Can someone tell me what journalizing is used for in ODI, and the steps to use it, with an example?
Hello Friend,
In the User's Guide, under the Designer section, there is an entire chapter dedicated to "Changed Data Capture", which explains the concepts of journalizing and gives step-by-step examples of how it works.
Just search for "Changed Data Capture" in the User's Guide.
Best regards,
Luiz
Similar Messages
-
ODI CDC - Getting Duplicate Records in Journal Data
Dear Gurus
I am getting the following issues in CDC using Oracle LogMiner:
01) All operations on the source data (insert, update) are shown only as inserts in Model -> CDC -> Journal Data.
02) The records in Model -> CDC -> Journal Data are duplicated.
03) These records are not travelling to the destination table, and I want to load the destination table.
04) Is it possible to have both the last value and the new value available on the same screen as output in ODI? I want to see what data changed before actually populating the tables.
Hi Andreas, Mayank.
Thanks for your reply.
I created my own DSO, but it is giving an error. I tried with the standard DSO too, and it still gives the same "could not activate" error.
The error mentions the function module RSB1_OLTPSOURCE_GENERATE.
I searched in R/3 but could not find it.
I even tried creating a DSO on a trial basis, and it gives the same problem.
I think it is a problem on the BASIS side.
Please help if you have any idea.
Thanks. -
ODI-20120: Error while Journalizing
Hello All,
While I was trying to "start journal", I received the error: ODI-20120: Error while Journalizing.
I am not sure about the reason for the error. I have configured the Journalizing tab of the Model and opted for simple journalizing.
Can anyone help me resolve this error?
Regards,
User1672911
Hi,
Following are the details.
The error message is:
"ODI-20120: Error while Journalizing". Nothing more appears that would allow further investigation.
I am using ODI 11g.
The JKM name is: JKM Oracle Simple.
Regards,
User1672911 -
Not able to see IKM Oracle Incremental Update and IKM Oracle Slowly Changing Dimension under the PHYSICAL tab in ODI 12c
But I'm able to see the other IKMs. Please help me: how can I see them?
Nope, it has not been altered.
COMPONENT NAME: LKM Oracle to Oracle (datapump)
COMPONENT VERSION: 11.1.2.3
AUTHOR: Oracle
COMPATIBILITY: ODI 11.1.2 and above
Description:
- Loading Knowledge Module
- Loads data from an Oracle Server to an Oracle Server using external tables in the datapump format.
- This module is recommended when developing interfaces between two Oracle servers when DBLINK is not an option.
- An External table definition is created on the source and target servers.
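A minimal sketch of the external-table mechanism this LKM relies on, in plain SQL (the directory object, dump file, table and column names here are hypothetical; the DDL the KM actually generates will differ):

```sql
-- On the source server: unload data to a datapump-format file through an
-- external table. DP_DIR is a hypothetical Oracle directory object pointing
-- to a filesystem location reachable by both servers.
CREATE TABLE c$_emp_ext
  ORGANIZATION EXTERNAL (
    TYPE ORACLE_DATAPUMP
    DEFAULT DIRECTORY dp_dir
    LOCATION ('c$_emp.dmp'))
AS SELECT empno, ename, sal FROM emp;

-- On the target server: read the same dump file back through a matching
-- external table definition, then load from it with ordinary SQL.
CREATE TABLE c$_emp_src (
  empno NUMBER,
  ename VARCHAR2(30),
  sal   NUMBER)
  ORGANIZATION EXTERNAL (
    TYPE ORACLE_DATAPUMP
    DEFAULT DIRECTORY dp_dir
    LOCATION ('c$_emp.dmp'));
```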
- When using this module on a journalized source table, the journalizing table is first updated to flag the records consumed, and these records are then removed at the end of the interface. -
Hi all,
I was trying to use the 'Changed Data Capture' feature in ODI. I was able to start the journal for one of my models, but in the Operator the process failed with the error:
java.sql.SQLException: ORA-00439: feature not enabled: Streams Capture
Then I thought the problem might be that the db user did not have privileges, so I executed the following PL/SQL block at the SQL prompt:
BEGIN
DBMS_STREAMS_AUTH.GRANT_ADMIN_PRIVILEGE(GRANTEE => 'DATA2');
END;
/
as per the instructions in the Designer (DATA2 is the db user from which the model takes its data). The same error still came. Then I found that in the V$OPTION view the value of the parameter 'Streams Capture' was FALSE. Now I am trying to set 'Streams Capture' to TRUE. The UPDATE command didn't seem to work; the error I got was:
ORA-02030: can only select from fixed tables/views
How do I set the 'Streams Capture' parameter to 'TRUE'?
And am I on the right track? Please help.
P.S : I am using the Oracle 10g Express Edition.
Regards,
Divya
I'm not sure that Express Edition has the LogMiner functionality available. I think this may be an Enterprise Edition feature.
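As a quick check, the feature's availability can be confirmed directly. Note that V$OPTION is a read-only fixed view (hence the ORA-02030 above): its values reflect what the installed edition supports and cannot be changed with an UPDATE.

```sql
-- V$OPTION lists which features the installed Oracle edition supports.
-- A value of 'FALSE' means Streams Capture is simply not available in this
-- edition; it is not a switch that can be turned on.
SELECT parameter, value
  FROM v$option
 WHERE parameter = 'Streams Capture';
```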
-
ODI can not capture RRN (rowid) in DB2/iSeries
Hi all. I found that we cannot configure ODI to get the RRN (rowid) when doing CDC by reading the journal. Any ideas?
Hi, thanks for your reply, it is a great help. Please help me with these concerns:
1. If ODI can capture missed transaction journal entries, it must know the last journal information for a table. Where is this information stored? In the ODI repository?
2. For CDC, if the source database is restored from yesterday's backup set, can ODI re-synchronize the changes to the target database, or do we have to re-run the load?
3. For journaling in DB2, if the changes are created by a mirror tool from another DB2 database, are these changes recorded in the native journal?
4. We have a target table partitioned by date-time range; data in the target table is OFFLINE if it is older than sysdate-90. The source table in DB2 is not partitioned and its data is fully ONLINE. If some DML happens on data older than sysdate-90 in the source table, how can these changes be reflected in the target table?
5. Can CDC be installed on MySQL? We have several MySQL databases.
6. We would like to use the native journal, not triggers, for CDC on the DB2 source database. We found that the native journal has some limitations, such as the number of primary key columns must not exceed 16. A given physical file (a table) in DB2 has 18 primary key columns (due to a design created in 2000), and we get an error when declaring the primary key on the source table in ODI Designer. How can we overcome this limitation?
7. Can ODI capture the TRUNCATE command in a DB2 source database? -
Using journalized data in an interface with an aggregate function
Hi
I am trying to use the journalized data of a source table in one of my interfaces in ODI. The trouble is that one of the mappings on the target columns involves an aggregate function (SUM). When I run the interface I get an error saying "not a group by expression". I checked the code and found that the JRN_SUBSCRIBER, JRN_FLAG and JRN_DATE columns are included in the SELECT statement but not in the GROUP BY statement (the GROUP BY only contains the remaining two columns of the target table).
Is there a way around this? Do I have to manually modify the KM? If so, how would I go about doing it?
Also, I am using the Oracle GoldenGate JKM (Oracle to Oracle OGG).
Thanks, and I really appreciate the help.
Ajay
'ORA-00979' When Using The ODI CDC (Journalization) Feature With Knowledge Modules Including SQL Aggregate Functions [ID 424344.1]
Modified 11-MAR-2009 Type PROBLEM Status MODERATED
In this Document
Symptoms
Cause
Solution
Alternatives :
This document is being delivered to you via Oracle Support's Rapid Visibility (RaV) process, and therefore has not been subject to an independent technical review.
Applies to:
Oracle Data Integrator - Version: 3.2.03.01
This problem can occur on any platform.
Symptoms
After having successfully tested an ODI Integration Interface using an aggregate function such as MIN, MAX, SUM, it is necessary to set up Changed Data Capture operations by using Journalized tables.
However, during execution of the Integration Interface to retrieve only the Journalized records, problems arise at the Load Data step of the Loading Knowledge Module and the following message is displayed in ODI Log:
ORA-00979: not a GROUP BY expression
Cause
Using both CDC - Journalization and aggregate functions gives rise to complex issues.
Solution
Technically, there is a workaround for this problem (see below).
WARNING: Oracle engineers issue a severe warning that this type of setup may give results that are not what is expected. This is related to the way ODI journalizing is implemented, with dedicated journalizing tables. In this case, the aggregate function will only operate on the subset of rows referenced in the journalizing table, and NOT over the entire source table.
We recommend avoiding such Integration Interface setups.
Alternatives :
1. The problem is due to the missing JRN_* columns in the generated SQL GROUP BY clause.
The workaround is to duplicate the Loading Knowledge Module (LKM) and, in the clone, alter the "Load Data" step by editing the "Command on Source" tab and replacing the following instruction:
<%=odiRef.getGrpBy()%>
with
<%=odiRef.getGrpBy()%>
<%if ((odiRef.getGrpBy().length() > 0) && (odiRef.getPop("HAS_JRN").equals("1"))) {%>
,JRN_FLAG,JRN_SUBSCRIBER,JRN_DATE
<%}%>
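With hypothetical table and column names, the effect of this change on the generated SQL is that the JRN_* columns selected by the journalized interface also appear in the GROUP BY clause, which is what ORA-00979 demands (and which also illustrates the warning above: the aggregate now runs per journal entry, not over the whole source table):

```sql
-- Shape of the SQL produced by the modified "Load Data" step
-- (j$emp, dept_id and salary are hypothetical names):
SELECT dept_id,
       SUM(salary) AS total_salary,
       JRN_FLAG, JRN_SUBSCRIBER, JRN_DATE
  FROM j$emp
 GROUP BY dept_id,
       JRN_FLAG, JRN_SUBSCRIBER, JRN_DATE;
```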
2. It is possible to develop two alternative solutions:
(a) Develop two separate and distinct Integration Interfaces:
* The first Integration Interface loads data into a temporary Table and specify the aggregate functions to be used in this initial Integration Interface.
* The second Integration Interface uses the temporary Table as a Source. Note that if you create the Table in the Interface, it is necessary to drag and drop the Integration Interface into the Source panel.
(b) Define two connections to the Database so that the Integration Interface references two distinct and separate Data Server Sources (one for the Journal, one for the other Tables). In this case, the aggregate function will be executed on the Source Schema.
Related Products
* Middleware > Business Intelligence > Oracle Data Integrator (ODI) > Oracle Data Integrator
Keywords
ODI; AGGREGATE; ORACLE DATA INTEGRATOR; KNOWLEDGE MODULES; CDC; SUNOPSIS
Errors
ORA-979
Please find above the content from OTN.
It should show you this if you search for this ID in the Knowledge Base.
Cheers
Sachin -
Need helps with getting ODI CDC to work
Hi, I'm new to ODI. I'm trying to get the ODI CDC to work and for now I'm only interested in seeing that the changes are captured correctly.
I've set the d/b to archivelog mode and granted all the rights I can think of to the d/b user. I've defined the CDC in Consistent Mode for the model, defined the CDC for my tables, started the journal, etc.
When I right-click on the table and do Change Data Capture > Journal Data... I get an ORA-00942 table or view does not exist error (stack trace below).
What is missing? Thanks for your assistance.
See com.borland.dx.dataset.DataSetException error code: BASE+62
com.borland.dx.dataset.DataSetException: Execution of query failed.
at com.borland.dx.dataset.DataSetException.a(Unknown Source)
at com.borland.dx.dataset.DataSetException.queryFailed(Unknown Source)
at com.borland.dx.sql.dataset.QueryProvider.a(Unknown Source)
at com.borland.dx.sql.dataset.JdbcProvider.provideData(Unknown Source)
at com.borland.dx.dataset.StorageDataSet.refresh(Unknown Source)
at com.borland.dx.sql.dataset.QueryDataSet.refresh(Unknown Source)
at com.sunopsis.graphical.frame.a.jb.dj(jb.java)
at com.sunopsis.graphical.frame.a.jb.<init>(jb.java)
at com.sunopsis.graphical.frame.a.jd.<init>(jd.java)
at com.sunopsis.graphical.frame.a.je.<init>(je.java)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at com.sunopsis.graphical.frame.bb.b(bb.java)
at com.sunopsis.graphical.tools.utils.swingworker.v.call(v.java)
at edu.emory.mathcs.backport.java.util.concurrent.FutureTask.run(FutureTask.java:176)
at com.sunopsis.graphical.tools.utils.swingworker.l.run(l.java)
at edu.emory.mathcs.backport.java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:665)
at edu.emory.mathcs.backport.java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:690)
at java.lang.Thread.run(Unknown Source)
Chained exception:
java.sql.SQLException: ORA-00942: table or view does not exist
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:125)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:316)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:282)
at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:639)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:185)
at oracle.jdbc.driver.T4CPreparedStatement.execute_for_describe(T4CPreparedStatement.java:503)
at oracle.jdbc.driver.OracleStatement.execute_maybe_describe(OracleStatement.java:965)
at oracle.jdbc.driver.T4CPreparedStatement.execute_maybe_describe(T4CPreparedStatement.java:535)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1051)
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:2984)
at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3026)
at com.borland.dx.sql.dataset.o.f(Unknown Source)
at com.borland.dx.sql.dataset.QueryProvider.e(Unknown Source)
at com.borland.dx.sql.dataset.JdbcProvider.provideData(Unknown Source)
at com.borland.dx.dataset.StorageDataSet.refresh(Unknown Source)
at com.borland.dx.sql.dataset.QueryDataSet.refresh(Unknown Source)
at com.sunopsis.graphical.frame.a.jb.dj(jb.java)
at com.sunopsis.graphical.frame.a.jb.<init>(jb.java)
at com.sunopsis.graphical.frame.a.jd.<init>(jd.java)
at com.sunopsis.graphical.frame.a.je.<init>(je.java)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at com.sunopsis.graphical.frame.bb.b(bb.java)
at com.sunopsis.graphical.tools.utils.swingworker.v.call(v.java)
at edu.emory.mathcs.backport.java.util.concurrent.FutureTask.run(FutureTask.java:176)
at com.sunopsis.graphical.tools.utils.swingworker.l.run(l.java)
at edu.emory.mathcs.backport.java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:665)
at edu.emory.mathcs.backport.java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:690)
at java.lang.Thread.run(Unknown Source)
Update...
I traced it to the Start Journal step. The Operator shows that step 8 - Journalizing - xxxx - Create Change Set produces Oracle error ORA-00600. What does this mean? The SQL that it tries to execute is:
==============================================
BEGIN
DBMS_CDC_PUBLISH.CREATE_CHANGE_SET(
change_set_name => 'TID_SOURCE',
description => 'Sunopsis change set for model : TID_SOURCE',
change_source_name => 'HOTLOG_SOURCE',
begin_date => sysdate);
END;
==============================================
The stack trace is as follows:
600 : 60000 : java.sql.SQLException: ORA-00600: internal error code, arguments: [kcbgcur_9], [8388665], [23], [25165824], [8388608], [], [], []
ORA-06512: at "SYS.DBMS_CAPTURE_ADM_INTERNAL", line 121
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_CDC_PUBLISH", line 560
ORA-06512: at line 1
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:125)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:316)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:282)
at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:639)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:185)
at oracle.jdbc.driver.T4CPreparedStatement.execute_for_rows(T4CPreparedStatement.java:633)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1086)
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:2984)
at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:3057)
at com.sunopsis.sql.SnpsQuery.executeUpdate(SnpsQuery.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execStdOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.i(e.java)
at com.sunopsis.dwg.cmd.g.y(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source) -
What are all the built-in templates available in ODI?
What is the use of the built-in templates?
What are all the built-in templates available in ODI?
Hi Harmeet,
ODI's biggest asset is the Knowledge Module. These are the built-in templates of ODI.
Knowledge Modules (KMs) are code templates. Each KM is dedicated to an individual task in the overall data integration process.
There are six types of KMs that come with ODI:
Reverse-engineering knowledge modules (RKM) are used for reading the table and other object metadata from source databases.
Journalizing knowledge modules (JKM) record the new and changed data within either a single table or view or a consistent set of tables or views.
Loading knowledge modules (LKM) are used for efficient extraction of data from source databases and include database-specific bulk unload utilities where available.
Check knowledge modules (CKM) check data integrity against constraints defined on a datastore, rejecting invalid records into a dynamically created error table.
Integration knowledge modules (IKM) are used for efficiently transforming data from staging area to the target tables, generating the optimized native SQL for the given database.
Service knowledge modules (SKM) provide the ability to expose data as Web services.
You can see the list of KMs in the /impexp directory of your installation folder.
Thanks,
Guru -
Hello,
I am facing a problem with the upgrade of the master repository from ODI 10.1.3.5.0 to 10.1.3.6.1 using odi_patch_10.1.3.6.1 (p9377717_101360_Generic).
I followed the steps as described in the upgrade procedure (remove the content of the oracledi/lib/scripts/ sub-directory of your Oracle Data Integrator installation directory, then
copy the content of the oracledi sub-directory of the temporary directory to your Oracle Data Integrator installation directory; the temporary directory content should overwrite the Oracle Data Integrator installation directory content).
When running the upgrade script (./mupgrade.sh), NO Oracle technologies appear in the wizard...
Only Hypersonic SQL and Informix are left.
It is the same on Windows XP and Linux... I just downloaded and unpacked odi_patch_10.1.3.6.1 (p9377717_101360_Generic.zip) from support.oracle.com.
How can I check? Here is the structure of the patch:
bin
demo
doc
drivers
impexp
lib
tools
./bin:
startcmd.bat
./demo:
xml
./demo/xml:
personal.xsd
./doc:
index_km.htm
km
webhelp
./doc/km:
odiafm_93110_readme.pdf
odiap_93110_readme.pdf
odiess_readme.pdf
odiess_users.pdf
odigs_sapabapbw.pdf
odigs_sapabap.pdf
odi_km_ref_guide v1.3.pdf
./doc/webhelp:
en
./doc/webhelp/en:
index.hhc
index.hhk
printable
ref_tools
release_snps.htm
setup
usermanual
whgdata
whxdata
./doc/webhelp/en/printable:
snps_ref_tools.pdf
snps_setup.pdf
snps_users.pdf
./doc/webhelp/en/ref_tools:
odiftpget.htm
odiftpput.htm
odiscpget.htm
odiscpput.htm
odisftpget.htm
odisftpput.htm
snpsfiledelete.html
./doc/webhelp/en/setup:
setup.htm
sunopsis.log
./doc/webhelp/en/usermanual:
technos
./doc/webhelp/en/usermanual/technos:
how_to.htm
jms
jms_xml
./doc/webhelp/en/usermanual/technos/jms:
creating_a_jms_data_server.htm
creating_a_physical_schema_for_jms.htm
defining_a_jms_model.htm
choosing_the_right_kms_for_jms.htm
jms_standard_properties.htm
using_jms_properties.htm
./doc/webhelp/en/usermanual/technos/jms_xml:
creating_a_jms_xml_data_server.htm
creating_and_reverse-engineering_a_jms_xml_model.htm
creating_a_physical_schema_for_jms_xml.htm
choosing_the_right_kms_for_jms_xml.htm
./doc/webhelp/en/whgdata:
whlstfl0.htm
whlstfl11.htm
whlstfl16.htm
whlstfl18.htm
whlstfl20.htm
whlstfl21.htm
whlstfl22.htm
whlstfl23.htm
whlstfl24.htm
whlstfl25.htm
whlstfl26.htm
whlstfl3.htm
whlstfl4.htm
whlstfl7.htm
whlstfl8.htm
whlstf0.htm
whlstf1.htm
whlstf10.htm
whlstf11.htm
whlstf12.htm
whlstf13.htm
whlstf14.htm
whlstf15.htm
whlstf16.htm
whlstf17.htm
whlstf18.htm
whlstf19.htm
whlstf2.htm
whlstf20.htm
whlstf21.htm
whlstf22.htm
whlstf23.htm
whlstf24.htm
whlstf25.htm
whlstf26.htm
whlstf27.htm
whlstf28.htm
whlstf29.htm
whlstf3.htm
whlstf30.htm
whlstf31.htm
whlstf32.htm
whlstf33.htm
whlstf34.htm
whlstf35.htm
whlstf36.htm
whlstf37.htm
whlstf38.htm
whlstf39.htm
whlstf4.htm
whlstf40.htm
whlstf41.htm
whlstf42.htm
whlstf43.htm
whlstf44.htm
whlstf45.htm
whlstf46.htm
whlstf47.htm
whlstf48.htm
whlstf49.htm
whlstf5.htm
whlstf50.htm
whlstf51.htm
whlstf52.htm
whlstf53.htm
whlstf54.htm
whlstf55.htm
whlstf56.htm
whlstf57.htm
whlstf58.htm
whlstf59.htm
whlstf6.htm
whlstf60.htm
whlstf61.htm
whlstf62.htm
whlstf63.htm
whlstf64.htm
whlstf65.htm
whlstf66.htm
whlstf67.htm
whlstf68.htm
whlstf69.htm
whlstf7.htm
whlstf70.htm
whlstf71.htm
whlstf72.htm
whlstf8.htm
whlstf9.htm
whlsti0.htm
whlsti1.htm
whlsti2.htm
whlstt0.htm
whlstt1.htm
whlstt10.htm
whlstt100.htm
whlstt101.htm
whlstt102.htm
whlstt103.htm
whlstt104.htm
whlstt11.htm
whlstt12.htm
whlstt13.htm
whlstt14.htm
whlstt15.htm
whlstt16.htm
whlstt17.htm
whlstt18.htm
whlstt19.htm
whlstt2.htm
whlstt20.htm
whlstt21.htm
whlstt22.htm
whlstt23.htm
whlstt24.htm
whlstt25.htm
whlstt26.htm
whlstt27.htm
whlstt28.htm
whlstt29.htm
whlstt3.htm
whlstt30.htm
whlstt31.htm
whlstt32.htm
whlstt33.htm
whlstt34.htm
whlstt35.htm
whlstt36.htm
whlstt37.htm
whlstt38.htm
whlstt39.htm
whlstt4.htm
whlstt40.htm
whlstt41.htm
whlstt42.htm
whlstt43.htm
whlstt44.htm
whlstt45.htm
whlstt46.htm
whlstt47.htm
whlstt48.htm
whlstt49.htm
whlstt5.htm
whlstt50.htm
whlstt51.htm
whlstt52.htm
whlstt53.htm
whlstt54.htm
whlstt55.htm
whlstt56.htm
whlstt57.htm
whlstt58.htm
whlstt59.htm
whlstt6.htm
whlstt60.htm
whlstt61.htm
whlstt62.htm
whlstt63.htm
whlstt64.htm
whlstt65.htm
whlstt66.htm
whlstt67.htm
whlstt68.htm
whlstt69.htm
whlstt7.htm
whlstt70.htm
whlstt71.htm
whlstt72.htm
whlstt73.htm
whlstt74.htm
whlstt75.htm
whlstt76.htm
whlstt77.htm
whlstt78.htm
whlstt79.htm
whlstt8.htm
whlstt80.htm
whlstt81.htm
whlstt82.htm
whlstt83.htm
whlstt84.htm
whlstt85.htm
whlstt86.htm
whlstt87.htm
whlstt88.htm
whlstt89.htm
whlstt9.htm
whlstt90.htm
whlstt91.htm
whlstt92.htm
whlstt93.htm
whlstt94.htm
whlstt95.htm
whlstt96.htm
whlstt97.htm
whlstt98.htm
whlstt99.htm
./doc/webhelp/en/whxdata:
whftdata0.xml
whftdata1.xml
whftdata2.xml
whftdata3.xml
whfts.xml
whfwdata0.xml
whfwdata1.xml
whfwdata10.xml
whfwdata11.xml
whfwdata12.xml
whfwdata13.xml
whfwdata14.xml
whfwdata15.xml
whfwdata16.xml
whfwdata17.xml
whfwdata18.xml
whfwdata19.xml
whfwdata2.xml
whfwdata20.xml
whfwdata21.xml
whfwdata22.xml
whfwdata23.xml
whfwdata24.xml
whfwdata25.xml
whfwdata26.xml
whfwdata27.xml
whfwdata28.xml
whfwdata29.xml
whfwdata3.xml
whfwdata30.xml
whfwdata31.xml
whfwdata32.xml
whfwdata4.xml
whfwdata5.xml
whfwdata6.xml
whfwdata7.xml
whfwdata8.xml
whfwdata9.xml
whidata0.xml
whidata1.xml
whidata2.xml
whidata3.xml
whidata4.xml
whidata5.xml
whidata6.xml
whidx.xml
whtdata0.xml
whtdata7.xml
./drivers:
ess_es_server.jar
ess_japi.jar
HFMDriver.dll
HFMDriver64.dll
odihapp_common.jar
odihapp_essbase.jar
odi_hfm.jar
odi-sap.jar
snpsfile.jar
snpsxmlo.jar
./impexp:
GAC_Hypersonic SQL Default.xml
GAC_Informix Default.xml
KM_CKM Teradata.xml
KM_IKM File to Teradata (TTU).xml
KM_IKM Oracle Slowly Changing Dimension.xml
KM_IKM SQL to Hyperion Essbase (DATA).xml
KM_IKM SQL to Teradata (TTU).xml
KM_IKM Teradata Control Append.xml
KM_JKM DB2 400 Simple (Journal).xml
KM_JKM Oracle to Oracle Consistent (OGG).xml
KM_LKM DB2_400 Journal to SQL .xml
KM_LKM File to Netezza (NZLOAD).xml
KM_LKM File to Oracle (SQLLDR).xml
KM_LKM File to Teradata (TTU).xml
KM_LKM MSSQL to Oracle (BCPSQLLDR).xml
KM_LKM SQL to Teradata (TTU).xml
KM_RKM MSSQL.xml
KM_RKM Oracle Olap (Jython).xml
KM_RKM Oracle.xml
KM_ RKM Salesforce.com.xml
KM_RKM Teradata.xml
LANG_SQL.xml
PROF_DESIGNER.xml
PROF_METADATA ADMIN.xml
PROF_NG DESIGNER.xml
PROF_NG METADATA ADMIN.xml
PROF_NG REPOSITORY EXPLORER.xml
PROF_NG VERSION ADMIN.xml
PROF_OPERATOR.xml
PROF_REPOSITORY EXPLORER.xml
PROF_SECURITY ADMIN.xml
PROF_TOPOLOGY ADMIN.xml
PROF_VERSION ADMIN.xml
TECH_Attunity.xml
TECH_BTrieve.xml
TECH_DBase.xml
TECH_Derby.xml
TECH_File.xml
TECH_Generic SQL.xml
TECH_Hyperion Essbase.xml
TECH_Hyperion Financial Management.xml
TECH_Hyperion Planning.xml
TECH_Hypersonic SQL.xml
TECH_IBM DB2 UDB.xml
TECH_IBM DB2400.xml
TECH_Informix.xml
TECH_Ingres.xml
TECH_Interbase.xml
TECH_JMS Queue.xml
TECH_JMS Queue XML.xml
TECH_JMS Topic.xml
TECH_JMS Topic XML.xml
TECH_LDAP.xml
TECH_Microsoft Access.xml
TECH_Microsoft Excel.xml
TECH_Microsoft SQL Server.xml
TECH_MySQL.xml
TECH_Netezza.xml
TECH_Operating System.xml
TECH_Oracle BAM.xml
TECH_Oracle.xml
TECH_Paradox.xml
TECH_PostgreSQL.xml
TECH_Progress.xml
TECH_Salesforce.com.xml
TECH_SAP ABAP.xml
TECH_SAP Java Connector.xml
TECH_SAS.xml
TECH_Sunopsis Engine.xml
TECH_Sybase AS Anywhere.xml
TECH_Sybase AS Enterprise.xml
TECH_Sybase AS IQ.xml
TECH_Teradata.xml
TECH_Universe.xml
TECH_XML.xml
./lib:
scripts
snpshelp.zip
snpsws.zip
sunopsis.zip
./lib/scripts:
DERBY
HYPERSONIC_SQL
IBM_DB2_UDB
IBM_DB2_400
INFORMIX
MICROSOFT_SQL_SERVER
ORACLE
POSTGRESQL
SYBASE_AS_ANYWHERE
SYBASE_AS_ENTERPRISE
SYBASE_AS_IQ
xml
./lib/scripts/DERBY:
E_CREATE.xml
E_DROP.xml
M_CREATE.xml
M_DROP.xml
patches
W_CREATE.xml
W_DROP.xml
./lib/scripts/DERBY/patches:
E_04.02.02.01_04.02.03.01.xml
E_410201_420101.xml
E_420101_420201.xml
M_04.02.02.01_04.02.03.01.xml
M_410201_420101.xml
M_420101_420201.xml
W_04.02.02.01_04.02.03.01.xml
W_410201_420101.xml
W_420101_420201.xml
./lib/scripts/HYPERSONIC_SQL:
E_CREATE.xml
E_DROP.xml
M_CREATE.xml
M_DROP.xml
patches
W_CREATE.xml
W_DROP.xml
./lib/scripts/HYPERSONIC_SQL/patches:
E_04.02.02.01_04.02.03.01.xml
E_300101.xml
E_300101_300102.xml
E_300102_300103.xml
E_300103_300104.xml
E_300104_310101.xml
E_310101_320301.xml
E_320301_400101.xml
E_400101_410101.xml
E_410101_410201.xml
E_410201_420101.xml
E_420101_420201.xml
M_04.02.02.01_04.02.03.01.xml
M_300101.xml
M_300101_300102.xml
M_300102_300103.xml
M_300103_300104.xml
M_300104_300105.xml
M_300105_310101.xml
M_310101_310201.xml
M_310201_320301.xml
M_320301_400101.xml
M_400101_410101.xml
M_410101_410201.xml
M_410201_420101.xml
M_420101_420201.xml
W_04.02.02.01_04.02.03.01.xml
W_300101.xml
W_300101_300102.xml
W_300102_300103.xml
W_300103_310101.xml
W_310101_310201.xml
W_310201_320301.xml
W_320301_400101.xml
W_400101_410101.xml
W_410101_410201.xml
W_410201_420101.xml
W_420101_420201.xml
./lib/scripts/IBM_DB2_UDB:
E_CREATE.xml
E_DROP.xml
M_CREATE.xml
M_DROP.xml
patches
W_CREATE.xml
W_DROP.xml
./lib/scripts/IBM_DB2_UDB/patches:
E_04.02.02.01_04.02.03.01.xml
E_300101.xml
E_300101_300102.xml
E_300102_300103.xml
E_300103_300104.xml
E_300104_310101.xml
E_310101_320301.xml
E_320301_400101.xml
E_400101_410101.xml
E_410101_410201.xml
E_410201_420101.xml
E_420101_420201.xml
M_04.02.02.01_04.02.03.01.xml
M_300101.xml
M_300101_300102.xml
M_300102_300103.xml
M_300103_300104.xml
M_300104_300105.xml
M_300105_310101.xml
M_310101_310201.xml
M_310201_320301.xml
M_320301_400101.xml
M_400101_410101.xml
M_410101_410201.xml
M_410201_420101.xml
M_420101_420201.xml
W_04.02.02.01_04.02.03.01.xml
W_300101.xml
W_300101_300102.xml
W_300102_300103.xml
W_300103_310101.xml
W_310101_310201.xml
W_310201_320301.xml
W_320301_400101.xml
W_400101_410101.xml
W_410101_410201.xml
W_410201_420101.xml
W_420101_420201.xml
./lib/scripts/IBM_DB2_400:
E_CREATE.xml
E_DROP.xml
M_CREATE.xml
M_DROP.xml
patches
W_CREATE.xml
W_DROP.xml
./lib/scripts/IBM_DB2_400/patches:
E_04.02.02.01_04.02.03.01.xml
E_300101.xml
E_300101_300102.xml
E_300102_300103.xml
E_300103_300104.xml
E_300104_310101.xml
E_310101_320301.xml
E_320301_400101.xml
E_400101_410101.xml
E_410101_410201.xml
E_410201_420101.xml
E_420101_420201.xml
M_04.02.02.01_04.02.03.01.xml
M_300101.xml
M_300101_300102.xml
M_300102_300103.xml
M_300103_300104.xml
M_300104_300105.xml
M_300105_310101.xml
M_310101_310201.xml
M_310201_320301.xml
M_320301_400101.xml
M_400101_410101.xml
M_410101_410201.xml
M_410201_420101.xml
M_420101_420201.xml
W_04.02.02.01_04.02.03.01.xml
W_300101.xml
W_300101_300102.xml
W_300102_300103.xml
W_300103_310101.xml
W_310101_310201.xml
W_310201_320301.xml
W_320301_400101.xml
W_400101_410101.xml
W_410101_410201.xml
W_410201_420101.xml
W_420101_420201.xml
./lib/scripts/INFORMIX:
E_CREATE.xml
E_DROP.xml
M_CREATE.xml
M_DROP.xml
patches
W_CREATE.xml
W_DROP.xml
./lib/scripts/INFORMIX/patches:
E_04.02.02.01_04.02.03.01.xml
E_300101.xml
E_300101_300102.xml
E_300102_300103.xml
E_300103_300104.xml
E_300104_310101.xml
E_310101_320301.xml
E_320301_400101.xml
E_400101_410101.xml
E_410101_410201.xml
E_410201_420101.xml
E_420101_420201.xml
M_04.02.02.01_04.02.03.01.xml
M_300101.xml
M_300101_300102.xml
M_300102_300103.xml
M_300103_300104.xml
M_300104_300105.xml
M_300105_310101.xml
M_310101_310201.xml
M_310201_320301.xml
M_320301_400101.xml
M_400101_410101.xml
M_410101_410201.xml
M_410201_420101.xml
M_420101_420201.xml
W_04.02.02.01_04.02.03.01.xml
W_300101.xml
W_300101_300102.xml
W_300102_300103.xml
W_300103_310101.xml
W_310101_310201.xml
W_310201_320301.xml
W_320301_400101.xml
W_400101_410101.xml
W_410101_410201.xml
W_410201_420101.xml
W_420101_420201.xml
./lib/scripts/MICROSOFT_SQL_SERVER:
E_CREATE.xml
E_DROP.xml
M_CREATE.xml
M_DROP.xml
patches
W_CREATE.xml
W_DROP.xml
./lib/scripts/MICROSOFT_SQL_SERVER/patches:
E_04.02.02.01_04.02.03.01.xml
E_300101.xml
E_300101_300102.xml
E_300102_300103.xml
E_300103_300104.xml
E_300104_310101.xml
E_310101_320301.xml
E_320301_400101.xml
E_400101_410101.xml
E_410101_410201.xml
E_410201_420101.xml
E_420101_420201.xml
M_04.02.02.01_04.02.03.01.xml
M_300101.xml
M_300101_300102.xml
M_300102_300103.xml
M_300103_300104.xml
M_300104_300105.xml
M_300105_310101.xml
M_310101_310201.xml
M_310201_320301.xml
M_320301_400101.xml
M_400101_410101.xml
M_410101_410201.xml
M_410201_420101.xml
M_420101_420201.xml
W_04.02.02.01_04.02.03.01.xml
W_300101.xml
W_300101_300102.xml
W_300102_300103.xml
W_300103_310101.xml
W_310101_310201.xml
W_310201_320301.xml
W_320301_400101.xml
W_400101_410101.xml
W_410101_410201.xml
W_410201_420101.xml
W_420101_420201.xml
./lib/scripts/ORACLE:
E_CREATE.xml
E_DROP.xml
M_CREATE.xml
M_DROP.xml
patches
W_CREATE.xml
W_DROP.xml
./lib/scripts/ORACLE/patches:
E_04.02.02.01_04.02.03.01.xml
E_300101.xml
E_300101_300102.xml
E_300102_300103.xml
E_300103_300104.xml
E_300104_310101.xml
E_310101_320301.xml
E_320301_400101.xml
E_400101_410101.xml
E_410101_410201.xml
E_410201_420101.xml
E_420101_420201.xml
M_04.02.02.01_04.02.03.01.xml
M_300101.xml
M_300101_300102.xml
M_300102_300103.xml
M_300103_300104.xml
M_300104_300105.xml
M_300105_310101.xml
M_310101_310201.xml
M_310201_320301.xml
M_320301_400101.xml
M_400101_410101.xml
M_410101_410201.xml
M_410201_420101.xml
M_420101_420201.xml
W_04.02.02.01_04.02.03.01.xml
W_300101.xml
W_300101_300102.xml
W_300102_300103.xml
W_300103_310101.xml
W_310101_310201.xml
W_310201_320301.xml
W_320301_400101.xml
W_400101_410101.xml
W_410101_410201.xml
W_410201_420101.xml
W_420101_420201.xml
./lib/scripts/POSTGRESQL:
E_CREATE.xml
E_DROP.xml
M_CREATE.xml
M_DROP.xml
patches
W_CREATE.xml
W_DROP.xml
./lib/scripts/POSTGRESQL/patches:
E_04.02.02.01_04.02.03.01.xml
E_320301_400101.xml
E_400101_410101.xml
E_410101_410201.xml
E_410201_420101.xml
E_420101_420201.xml
M_04.02.02.01_04.02.03.01.xml
M_320301_400101.xml
M_400101_410101.xml
M_410101_410201.xml
M_410201_420101.xml
M_420101_420201.xml
W_04.02.02.01_04.02.03.01.xml
W_320301_400101.xml
W_400101_410101.xml
W_410101_410201.xml
W_410201_420101.xml
W_420101_420201.xml
./lib/scripts/SYBASE_AS_ANYWHERE:
E_CREATE.xml
E_DROP.xml
M_CREATE.xml
M_DROP.xml
patches
W_CREATE.xml
W_DROP.xml
./lib/scripts/SYBASE_AS_ANYWHERE/patches:
E_04.02.02.01_04.02.03.01.xml
E_300101.xml
E_300101_300102.xml
E_300102_300103.xml
E_300103_300104.xml
E_300104_310101.xml
E_310101_320301.xml
E_320301_400101.xml
E_400101_410101.xml
E_410101_410201.xml
E_410201_420101.xml
E_420101_420201.xml
M_04.02.02.01_04.02.03.01.xml
M_300101.xml
M_300101_300102.xml
M_300102_300103.xml
M_300103_300104.xml
M_300104_300105.xml
M_300105_310101.xml
M_310101_310201.xml
M_310201_320301.xml
M_320301_400101.xml
M_400101_410101.xml
M_410101_410201.xml
M_410201_420101.xml
M_420101_420201.xml
W_04.02.02.01_04.02.03.01.xml
W_300101.xml
W_300101_300102.xml
W_300102_300103.xml
W_300103_310101.xml
W_310101_310201.xml
W_310201_320301.xml
W_320301_400101.xml
W_400101_410101.xml
W_410101_410201.xml
W_410201_420101.xml
W_420101_420201.xml
./lib/scripts/SYBASE_AS_ENTERPRISE:
E_CREATE.xml
E_DROP.xml
M_CREATE.xml
M_DROP.xml
patches
W_CREATE.xml
W_DROP.xml
./lib/scripts/SYBASE_AS_ENTERPRISE/patches:
E_04.02.02.01_04.02.03.01.xml
E_300101.xml
E_300101_300102.xml
E_300102_300103.xml
E_300103_300104.xml
E_300104_310101.xml
E_310101_320301.xml
E_320301_400101.xml
E_400101_410101.xml
E_410101_410201.xml
E_410201_420101.xml
E_420101_420201.xml
M_04.02.02.01_04.02.03.01.xml
M_300101.xml
M_300101_300102.xml
M_300102_300103.xml
M_300103_300104.xml
M_300104_300105.xml
M_300105_310101.xml
M_310101_310201.xml
M_310201_320301.xml
M_320301_400101.xml
M_400101_410101.xml
M_410101_410201.xml
M_410201_420101.xml
M_420101_420201.xml
W_04.02.02.01_04.02.03.01.xml
W_300101.xml
W_300101_300102.xml
W_300102_300103.xml
W_300103_310101.xml
W_310101_310201.xml
W_310201_320301.xml
W_320301_400101.xml
W_400101_410101.xml
W_410101_410201.xml
W_410201_420101.xml
W_420101_420201.xml
./lib/scripts/SYBASE_AS_IQ:
E_CREATE.xml
E_DROP.xml
M_CREATE.xml
M_DROP.xml
patches
W_CREATE.xml
W_DROP.xml
./lib/scripts/SYBASE_AS_IQ/patches:
E_CREATE.xml
E_DROP.xml
E_04.02.02.01_04.02.03.01.xml
E_300101.xml
E_300101_300102.xml
E_300102_300103.xml
E_300103_300104.xml
E_300104_310101.xml
E_310101_320301.xml
E_320301_400101.xml
E_400101_410101.xml
E_410101_410201.xml
E_410201_420101.xml
E_420101_420201.xml
M_CREATE.xml
M_DROP.xml
M_04.02.02.01_04.02.03.01.xml
M_300101.xml
M_300101_300102.xml
M_300102_300103.xml
M_300103_300104.xml
M_300104_300105.xml
M_300105_310101.xml
M_310101_310201.xml
M_310201_320301.xml
M_320301_400101.xml
M_400101_410101.xml
M_410101_410201.xml
M_410201_420101.xml
M_420101_420201.xml
W_CREATE.xml
W_DROP.xml
W_04.02.02.01_04.02.03.01.xml
W_300101.xml
W_300101_300102.xml
W_300102_300103.xml
W_300103_310101.xml
W_310101_310201.xml
W_310201_320301.xml
W_320301_400101.xml
W_400101_410101.xml
W_410101_410201.xml
W_410201_420101.xml
W_420101_420201.xml
./lib/scripts/xml:
CONN_SecurityConnection.xml
CONVDT_CONVDATATYPESLST.xml
DT_DATATYPESLST.xml
FIELD_FIELD_LST.xml
FLOOK_LOOKUP_LST.xml
LANG_SQL.xml
LOCREP_MASTERREPOSITORY.xml
OBJ_Column.xml
OBJ_DataServer.xml
OBJ_Datastore.xml
OBJ_Model.xml
OBJ_OBJ_SNPOPENTOOL_7800.xml
OBJ_OBJ_SNPSCENFOLDER_7700.xml
OBJ_Solution.xml
PROF_DESIGNER.xml
PROF_METADATAADMIN.xml
PROF_NGDESIGNER.xml
PROF_NGMETADATAADMIN.xml
PROF_NGREPOSITORYEXPLORER.xml
PROF_NGVERSIONADMIN.xml
PROF_OPERATOR.xml
PROF_REPOSITORYEXPLORER.xml
PROF_SECURITYADMIN.xml
PROF_TOPOLOGYADMIN.xml
PROF_VERSIONADMIN.xml
TECH_Attunity.xml
TECH_File.xml
TECH_HyperionEssbase.xml
TECH_HyperionFinancialManagement.xml
TECH_HypersonicSQL.xml
TECH_Informix.xml
TECH_SAPABAP.xml
TECH_SAPJavaConnector.xml
TECH_SAS.xml
./tools:
cdc_iseries
web_services
./tools/cdc_iseries:
SAVPGM0110
./tools/web_services:
odi-public-ws.aar -
ODI updates all rows even when no change exists
We use database triggers to update the created_by, creation_date, last_updated_by, last_updated_date columns on the records. From looking at these columns I can tell that ODI is updating every target record with the same values that are already present.
I have tried using Journalizing, but I can't get it working, and there are cases where I am not going to have the luxury of adding journal columns and triggers to my source database.
Is there a way to have the interface only update records where the values have actually changed?
Thanks,
Kt
In the interface diagram window, you will need to mark the columns created_by, creation_date, last_updated_by, last_updated_date to be executed on "Target".
In addition,
For created_by, creation_date, mark them for "Insert" only.
For last_updated_by, last_updated_date, mark them for "Update" only.
Hope that helps -
Capture change on certain columns only in ODI,
Hi experts:
Need your help to suggest the proper way to meet the following requirements.
Requirements:
(a) Source – Oracle 11g; need to track changes to 2 columns only - [Table A, Column A1] & [Table B, Column B1]. These 2 columns may be changed many times during a day.
(b) Every 4 hours, based on what is captured from (a), find out the distinct rows changed (i.e. based on the primary keys of Table A and Table B), and only ship the most current row images for these rows to the target.
(c) Target – Sybase; need to perform transformations on the 2 column values (i.e. Column A1, Column B1) to map to the target table [Table C, Column C1].
From my limited understanding, I think trigger-based CDC is to be used. However, I am not sure how to do this, especially (a) and (b). For example, where do we configure the trigger logic to achieve (a) from ODI Studio?
Your help is much appreciated.
Regards
Hi,
887999 wrote:
Hi experts:
Need you help to suggest the proper way to meet the following requirements.
Requirements”
(a) Source– Oracle 11g, need to track changes to 2 columns only - [Table A, Column A1] & [Table B, Column B1]. These 2 columns may be changed many times during a day.
(b) Every 4 hours, based on what are captured from (a), find out the distinct rows changed (i.e. based on primary keys of Table A, Table B), and only ship the most current row images for these rows to target
-- Check out the JV$ and JVD$ views that are created when you start the ODI journal; they do exactly what you have described in (b) - you get the latest update based on the SCN number when it occurred.
(c) Target – Sybase, meet to perform transformations on the 2 column values (i.e. Column A1, Column B1) to map to target table [Table C, Column C1]
To be honest, the target is irrelevant; you just need to decide if you want to do the transformation on the source on the way out of Oracle (set the staging area different to the target and choose your source logical schema).
I would design an interface that uses Table A and Table B as the sources, do the join and transformation on Oracle, and map to target Table C; choose your staging area based on where you want the joins / transformations to take place, and pick a Knowledge Module based on technology and how you want to update the target.
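The "latest row image per key, ordered by SCN" behaviour that the JV$ view provides can be sketched outside the database. This is a hypothetical simulation in plain Python, not ODI-generated code; the field names are illustrative:

```python
# Hypothetical sketch of what the JV$ journalizing view delivers:
# from a journal of change records tagged with SCNs, keep only the
# most recent image per primary key.

def latest_changes(journal):
    """journal: iterable of dicts with 'scn', 'pk', and row data."""
    latest = {}
    for entry in sorted(journal, key=lambda e: e["scn"]):
        latest[entry["pk"]] = entry  # a later SCN overwrites an earlier one
    return list(latest.values())

journal = [
    {"scn": 101, "pk": 1, "a1": "old"},
    {"scn": 205, "pk": 1, "a1": "new"},  # same row updated later
    {"scn": 150, "pk": 2, "a1": "x"},
]
print(latest_changes(journal))
```

Running this yields one row per primary key, with pk 1 carrying the value from the higher SCN.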
>
From my limited understanding, I think trigger-based CDC is to be used. However, not sure how to do this, especially (a) and (b). For example, where do we configure trigger logic to achieve (a) from the ODI studio?
You can use Synchronous (trigger-based) or Asynchronous (LogMiner / Streams-based) journalizing to perform what you want; see this nice guide on setting up CDC and consuming the changes:
http://soainfrastructure.blogspot.ie/2009/02/setting-up-oracle-data-integrator-odi.html
If you're friendly with your source system DBA, then I prefer asynchronous CDC; it's less intrusive than triggers. It does, however, need a bit of knowledge on how to monitor it - Metalink / Support has plenty of info.
>
Your help is much appreciated.
You're welcome. Have a play with it in a demo environment and get a feel for how you consume the captured changes (Lock Subscriber, Extend Window, Consume, Purge + Unlock, Loop), etc.
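The consumption cycle just mentioned (Lock Subscriber, Extend Window, Consume, Purge + Unlock, Loop) can be sketched as a plain loop. The step names below mirror the ODI package steps, but the bodies are illustrative stand-ins, not the real ODI tools:

```python
# Illustrative sketch of one pass of the CDC consumption cycle.
# Each numbered step corresponds to an ODI package step; the logic
# here is a simulation over an in-memory journal list.

def run_cdc_cycle(subscriber, journal):
    consumed = []
    # 1. Lock the subscriber so the window is stable while we read
    lock = {"subscriber": subscriber, "locked": True}
    # 2. Extend the window: snapshot the pending changes
    window = list(journal)
    # 3. Consume: process every change in the window
    for change in window:
        consumed.append(change)
    # 4. Purge the consumed changes and unlock the subscriber
    del journal[: len(window)]
    lock["locked"] = False
    return consumed

journal = [{"pk": 1, "op": "I"}, {"pk": 2, "op": "U"}]
done = run_cdc_cycle("SUB1", journal)
print(len(done), len(journal))  # prints "2 0"
```

In a real package this pass would either loop on an "Wait for Data" event or, as suggested above, simply run on a 4-hour schedule.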
The guide I've linked to uses an ODI Wait for Data tool to trigger the consumption of changes; you have stated every 4 hours, so I would skip the Wait for Data and simply schedule your package to run every 4 hours. -
What are the Relations between Journalizing and IKM?
What is the best method to use in the following scenario:
I have about 20 source tables with large amount of data.
I need to create interfaces that join the source tables into target tables.
The source tables receive inserts every few seconds, on the order of hundreds to thousands of rows.
There can be a gap of a few seconds between the inserts into the different tables that should be joined.
The source and target tables are on the same Oracle instance and schema.
I want to understand the roles of 'Journalizing CDC' and 'IKM - Incremental Update', and how I can use them in my scenario.
In general What are the relations between 'Journalizing' and 'IKM'?
Should I use both of them? Or maybe it is better to delete and insert into the target tables?
I want to understand what is the role of 'Journalizing CDC'?
Can 'IKM - Incremental Update' work without 'Journalizing'?
Does 'Journalizing' need to have PK on the tables?
What should I do if I can't put a PK on them (there can be multiple identical rows)?
Thanks in advance
Yael
Hi Yael,
I will try and answer as many of your points as I can in one post :-)
Journalizing is a way of tracking only changed data in your source system. If your source tables had a date_modified column, you could always use this as a filter when scanning for changes rather than CDC. Log-based CDC (Asynchronous in ODI - LogMiner/Streams or GoldenGate, for example) removes the overhead of placing a trigger on the source table to track changes, but be aware that it doesn't fully remove the need to scan the source tables.
In answer to your question about primary keys: Oracle CDC with ODI will create an unconditional log group on the columns that you have defined in ODI as your PK. The PK columns are tracked by the database and presented in a journal table (J$<source_table_name>); this journal table is joined back to the source table via a journalizing view (JV$<source_table_name>) to get the rest of the row (i.e. the non-PK columns). So be aware that when ODI comes around to get all data in the journalizing view (i.e. inserts, updates and deletes), the source database performs a join back to the source table. You can negate this by specifying ALL source table columns in your PK in ODI - this forces all columns into the unconditional log group, the journal table, etc. You will then need to tweak the JKM to change the syntax sent to the database when starting the journal. I have done this in the past, using a flexfield in the datastore to toggle 'Full Column' / 'Primary Key Cols' in the JKM set-up (there are a few E-Business Suite tables with no primary key, so we had to do this). The only problem with this approach is that with no PK you need to make sure you only get the 'last' update and apply changes in the right order to your target tables; otherwise you might process the update before the insert, for example, and be out of sync.
So JKMs provide a mechanism for 'changed data only' to be presented to ODI. If you want to handle deletes in your source table, CDC is useful (otherwise you don't capture the delete with a normal LKM / IKM set-up).
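The J$ / JV$ pattern described above can be sketched with plain data structures. This is a hypothetical simulation of the idea, not ODI-generated trigger code; the table and function names are illustrative:

```python
# Hypothetical sketch of the J$/JV$ pattern: the journal records only
# the operation and the PK, and the journalizing view joins back to
# the source table to pick up the non-PK columns.

source_table = {1: {"name": "alice"}, 2: {"name": "bob"}}
journal = []  # plays the role of J$<table>: (op, pk) pairs

def on_update(pk, new_row):
    """Stand-in for the trigger a synchronous JKM places on the source."""
    source_table[pk] = new_row
    journal.append(("U", pk))

def journalizing_view():
    """Stand-in for JV$<table>: join J$ back to the source table."""
    return [{"op": op, "pk": pk, **source_table[pk]} for op, pk in journal]

on_update(1, {"name": "alice2"})
print(journalizing_view())  # prints [{'op': 'U', 'pk': 1, 'name': 'alice2'}]
```

This makes the point in the reply concrete: reading the view necessarily touches the source table again, unless all columns are forced into the journal.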
IKM Incremental Update can be used with or without JKMs; it is for integrating data into your target table. Typically it will do a NOT EXISTS or a MINUS when loading the integration table (I$<target_table_name>) to ensure you only get 'changed' rows on the load into the target.
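The NOT EXISTS / MINUS step just described can be sketched as a set difference. A hypothetical illustration of the idea, not the actual IKM SQL:

```python
# Illustrative sketch of the incremental-update filter: only rows that
# are new, or whose values differ from the target, go into the
# integration (I$) table.

def build_integration_table(source_rows, target_rows):
    target_by_pk = {r["pk"]: r for r in target_rows}
    changed = []
    for row in source_rows:
        existing = target_by_pk.get(row["pk"])
        if existing != row:  # new PK, or same PK with changed values
            changed.append(row)
    return changed

source = [{"pk": 1, "v": "a"}, {"pk": 2, "v": "b"}]
target = [{"pk": 1, "v": "a"}]
print(build_integration_table(source, target))  # prints [{'pk': 2, 'v': 'b'}]
```

With a JKM in place, `source` would already contain only journalized changes, which is why the two features compose well.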
user604062 wrote:
I want to understand the role of: 'Journalizing CDC' and 'IKM - Incremental Update' and
how can i use it in my scenario?
Hopefully I have explained it above. It's the type of thing you really need to play around with, and thoroughly review the Operator logs to see what is actually going on (I think this is a very good guide to setting it up: http://soainfrastructure.blogspot.ie/2009/02/setting-up-oracle-data-integrator-odi.html).
In general What are the relations between 'Journalizing' and 'IKM'?
A JKM simply presents (only) changed data to ODI; it removes the need for you to decide 'how' to get the updates and removes the need for costly scans on the source table (full source-to-target table comparisons, scanning for updates based on a last-update date, etc.).
Should i use both of them? Or maybe it is better to delete and insert to the target tables?
Delete and insert into the target is fine, but ask yourself how you identify which rows to process. Inserts and updates are generally OK; to spot a delete you need to compare the tables in full - target table minus source table = deleted rows. Do you want to copy the whole source table every time to perform this? Are they in the same database?
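The "target minus source" delete detection mentioned above can be sketched as a set difference over primary keys. Again a hypothetical illustration, not the actual generated SQL:

```python
# Sketch of full-compare delete detection: rows present in the target
# but missing from the source must have been deleted at the source.

def find_deleted_pks(source_rows, target_rows):
    source_pks = {r["pk"] for r in source_rows}
    return [r["pk"] for r in target_rows if r["pk"] not in source_pks]

source = [{"pk": 1}, {"pk": 3}]
target = [{"pk": 1}, {"pk": 2}, {"pk": 3}]
print(find_deleted_pks(source, target))  # prints [2]
```

Note the cost implied in the reply: without CDC, this comparison needs the whole source table available on the target side every run.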
I want to understand what is the role of 'Journalizing CDC'?
It's the ODI mechanism for configuring, starting and stopping the change data capture process in the source systems. There are different KMs for separate technologies, and a few to choose from for Oracle (Triggers (Synchronous), Streams / LogMiner (Asynchronous), GoldenGate, etc.).
Can 'IKM - Incremental Update' work without 'Journalizing'?
Yes, of course. Without CDC your process would look something like:
Source table ----< LKM >---- Collection table (C$) ----< IKM >---- Integration table (I$) ----< IKM >---- Target table
With CDC your process looks like :
Source Journal (J$ table with JV$ view) ----< LKM >---- Collection table (C$) ----<IKM>---- Integration table (I$) -----< IKM >---- Target table
As you can see, it's the same process after the source table (there is an option in the interface to enable the J$ source; the IKM step changes with CDC, as you can use 'Synchronise Journal Deletes').
Does 'Journalizing' need to have PK on the tables?
Yes - at least a logical PK in the datastore; see my reply at the top for the reasons why (log groups, joining the J$ table back to the source table, etc.).
What should i do if i can't put PK (there can be multiple identical rows)?
Either talk to the source system people about adding one, or be prepared to change the JKM (and maybe the LKM and IKMs); you can try putting all columns in the PK in ODI. Ask yourself this: if you have 10 identical rows in your source and target tables, and one row gets updated, how can you identify which row in the target table to update?
>
Thanks in advance Yael
A lot to take in. As I advised, I would recommend you get a little test area set up and also read the Oracle database documentation on CDC, as it covers a lot of the theory that ODI is simply implementing.
Hope this helps!
Alastair -
We have a requirement where ODI interface should be triggered whenever there is an update on a particular column of a table.
We have around 1 million records in the source tables, so polling through BPEL takes a while. Is there a way it can be done entirely in BPEL?
That tool is commonly used for the journal process... If you use the ODI journal process, then that will be the API to check for data changes.
If you are able to deal with LogMiner, there is an excellent set of KMs and ODI tools that can help you detect an update at the column level without needing to create a trigger... and with NO QUERY at the source. On systems with high transactional access this means a very important gain in performance.
Does it make sense to you? -
Hi,
We use Oracle's ODI and are trying to use its Changed Data Capture capability.
It does not capture the changed data.
Originally everything was working, and I tried testing the same way many times.
Initially it worked well, then at some point it stopped.
There is one point I cannot tell whether it is the problem:
comparing the logs of the successful and failed runs, they do not differ.
So I reinstalled Oracle, and once again it succeeded.
I think the journal is created with DBMS_CDC_PUBLISH.CREATE_CHANGE_SET,
but the values I get from this table are not what I expect.
Please help.
Hi,
Are you able to find out any info on this?
Appreciate your help on any leads..
Regards
Naveen