ODI 10g issue while implementing CDC on a table
Hi All,
I am new to ODI 10g.
I am getting the following error when selecting the 'Journalized data only' option in the interface:
Error:
This data source is not journalized in this context.
com.sunopsis.tools.core.exception.SnpsSimpleMessageException: null
Thanks
Edited by: vam on Dec 24, 2012 1:03 AM
Hi and welcome,
You first have to journalize your datastore.
You should follow a tutorial like this one: http://odiexperts.com/changed-data-capture-cdc/
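For background, journalizing in ODI's simple CDC mode creates a journal table and a trigger on the source datastore; the 'Journalized data only' option then reads changed rows through that journal. A rough sketch of the mechanism using Python's sqlite3 (the table, trigger and column names here are illustrative, not ODI's actual J$ naming):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# The source datastore
cur.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")

# A journal table recording the key of every changed row
cur.execute("CREATE TABLE j_customer (id INTEGER, change_type TEXT, changed_at TEXT)")

# A trigger that journalizes inserts, similar in spirit to what ODI
# generates when you start the journal on a datastore
cur.execute("""
    CREATE TRIGGER trg_customer_ins AFTER INSERT ON customer
    BEGIN
        INSERT INTO j_customer VALUES (NEW.id, 'I', datetime('now'));
    END
""")

cur.execute("INSERT INTO customer VALUES (1, 'Alice')")
cur.execute("INSERT INTO customer VALUES (2, 'Bob')")

# 'Journalized data only' = join the journal back to the source table
rows = cur.execute("""
    SELECT c.id, c.name, j.change_type
    FROM j_customer AS j JOIN customer AS c ON c.id = j.id
    ORDER BY c.id
""").fetchall()
print(rows)  # the two journalized inserts
```

Until the journal and its trigger exist, there is nothing for 'Journalized data only' to read, which is exactly what the error above is saying.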
Hope it helps.
Regards,
JeromeFr
Similar Messages
-
Issue while filling Inventory setup tables.
Hi,
We are facing a critical issue while filling the Inventory setup tables. We have completed the BX setup successfully, but we are not able to complete BF and UM because of performance issues.
We filled about 40% of BF and UM, but since we don't have system downtime we stopped the setup at that point. Can we do the BF and UM setup without R/3 system downtime? Is there any other way we can fill the BF and UM setup tables?
Your help is really appreciated.
Thanks
BK
Hi BK,
Ideally you need the downtime when doing the setup for the current and previous periods, as these are the only open ones where you can expect changes to happen due to user postings. What selection did you give when doing the setup? One idea is to break up the setup job so that each job loads one week, and then run the jobs in parallel. This would work for BF; for UM, of course, you need to run it per company code. Are you using an ODS in your data model? -
I'm facing performance issue while accessing the PLAF Table
Dear all,
I'm facing performance issue while accessing the PLAF Table.
The START-OF-SELECTION of the report starts with the following select query.
SELECT plnum pwwrk matnr gsmng psttr FROM plaf
INTO CORRESPONDING FIELDS OF TABLE it_tab
WHERE matnr IN s_matnr
AND pwwrk = p_pwwrk
AND psttr IN s_psttr
AND auffx = 'X'
AND paart = 'LA' .
When executing the report in the Quality system there is no performance issue...
When it comes to the Production system, the above SELECT query alone takes 15-20 minutes before the report moves any further.
Kindly help me to overcome this problem...
Regards,
Jessi
Hi,
"Just implement its primary key:
WHERE PLNUM BETWEEN '0000000001' AND '9999999999'" By this you are implementing the primary key.
That statement has nothing to do with performance here: the system either cannot use the primary key for this selection, or ends up reading every row anyway.
Jessica, your query uses a secondary index created by SAP:
index 1 (Material, plant), which uses the fields MANDT, MATNR and PLWRK,
but it is not suitable in your case.
You can consider adding a new one containing all the fields: MANDT, MATNR, PWWRK, PSTTR, AUFFX, PAART.
Alternatively, depending on the number of rows meeting and not meeting the (auffx = 'X' AND paart = 'LA') condition,
it could speed up performance to create a secondary index on the fields MANDT, MATNR, PWWRK, PSTTR
and do as Ramchander suggested: remove AUFFX and PAART from the index and the WHERE clause, and remove the unwanted rows
after the query using a DELETE statement.
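The index-plus-DELETE idea can be sketched generically: index only the selective fields, restrict the WHERE clause to them, then drop the rows failing the low-selectivity conditions afterwards. An illustration with Python's sqlite3 (optimizer behavior is database-specific, and the data here is made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE plaf (
    plnum TEXT, matnr TEXT, pwwrk TEXT, psttr TEXT, auffx TEXT, paart TEXT)""")

# Index only the selective fields, as suggested above
cur.execute("CREATE INDEX idx_plaf ON plaf (matnr, pwwrk, psttr)")

cur.executemany("INSERT INTO plaf VALUES (?, ?, ?, ?, ?, ?)", [
    ("001", "M1", "P1", "20121224", "X", "LA"),
    ("002", "M1", "P1", "20121224", "",  "LA"),
    ("003", "M1", "P1", "20121224", "X", "NB"),
])

# Query on the indexed fields only; the plan confirms the index is used
plan = cur.execute("""EXPLAIN QUERY PLAN
    SELECT plnum, auffx, paart FROM plaf
    WHERE matnr = 'M1' AND pwwrk = 'P1' AND psttr = '20121224'""").fetchall()
assert any("idx_plaf" in step[-1] for step in plan)

rows = cur.execute("""
    SELECT plnum, auffx, paart FROM plaf
    WHERE matnr = 'M1' AND pwwrk = 'P1' AND psttr = '20121224'
    ORDER BY plnum""").fetchall()

# ... then drop the unwanted rows afterwards (the DELETE step suggested above)
rows = [r for r in rows if r[1] == "X" and r[2] == "LA"]
print(rows)
```

Whether the post-filter DELETE beats putting AUFFX and PAART into the index depends, as noted above, on how many rows fail the condition.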
Regards,
Przemysław
Please check how many rows are in the production system. -
Performance issue while accessing SWWLOGHIST & SWWWIHEAD tables
Hi,
We have applied a workflow function in Purchase Orders (PO) for release strategy purposes. We have two release levels.
The performance issue arises when a user updates an existing PO that has already been released: after the user amends the PO and presses the "SAVE" button on the PO screen, the release strategy is reset and the system reads the SWWLOGHIST table, which takes from a few seconds up to minutes to complete the save process.
My workflow's schedule job details as below:
SWWERRE - every 20 minutes
SWWCOND - every 30 minutes
SWWDHEX - every 3 minutes
Table entries:
SWWWIHEAD - 6 million entries
SWWLOGHIST - 25 million entries
Should we do data archiving on the above workflow tables?
Is that the only solution?
Kindly advise,
Thanks,
Regards,
Thomas
Hi,
The sizes for both tables as below:-
SWWLOGHIST - 3GB
SWWWIHEAD - 2GB
I run REORGCHK on all tables on a weekly basis. On DB2/DB6, do I need to manually reorg the tables or rebuild the indexes?
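For reference, REORGCHK only reports fragmentation; it does not reorganize anything. If it flags these tables, a manual reorg plus fresh statistics would look roughly like the following (the SAPR3 schema name is an assumption; substitute your actual SAP schema, and run this in a quiet period):

```
db2 "REORG TABLE SAPR3.SWWLOGHIST ALLOW READ ACCESS"
db2 "REORG INDEXES ALL FOR TABLE SAPR3.SWWLOGHIST"
db2 "RUNSTATS ON TABLE SAPR3.SWWLOGHIST WITH DISTRIBUTION AND DETAILED INDEXES ALL"
```

The same applies to SWWWIHEAD. Archiving, as asked above, remains the better long-term answer for tables of this size.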
You can refer the attached screenshots for both tables.
Thanks,
Regards,
Thomas -
Performance Issue while Joining two Big Tables
Hello Experts,
We have the following scenario, wherein we need the Sales Rep associated with a Sales Order. This information is available in the VBPA table, with Sales Order, Sales Order Item and Partner Function being the key.
Now I'm interested in only one Partner Function, e.g. 'ZP'. This table has around 120 million records.
I tried both options:
Option 1 - Join this table (VBPA) with the Sales Order Item table (VBAP) within the Data Foundation layer of the Analytic View, and do the filtering on Partner Function
Option 2 - Create an Attribute View for VBPA with a filter on Partner Function, and then join this Attribute View in the Logical Join layer with the Data Foundation table
Both these options are killing the performance.
Is there any way to achieve this?
Your expert opinion is greatly appreciated!!
Thanks & regards,
Jomy
Hi,
Lars is correct. You may have to spend a little bit more time and give a bigger picture.
I have used this join. It takes about 2 to 3 seconds to execute this join for me. My data volume is less than yours.
You must have used a left outer join when joining the attribute view (with the constant filter ZP as specified in your first post) to the data foundation. Please cross-check once again, as sometimes my fat finger inadvertently changed the join type and I had to go back and fix it. If this is a left outer join or a referential join, HANA does not perform the join if you are not requesting any field from the attribute view on table VBPA. This used to be a problem due to a bug in SP4 but got fixed in SP5.
However, if you have performed this join in the data foundation, it does enforce the join even if you did not ask for any fields from the VBPA table. The reason is that you have put a constant filter ZR (LIPS->VBPA join in the data foundation, as specified in one of your later replies).
If any measure you are selecting in the analytic view is a restricted measure or a calculated measure that needs some field from VBPA, then the join will be enforced as you would agree. This is where I had most trouble. My join itself is not bad but my business requirement to get the current value of a partner attribute on a higher level calculation view sent too much data from analytic view to calculation view.
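The pruning behavior described here rests on a simple property: when the right-hand side is at most 1:1 per join key, a left outer join from which no fields are selected cannot change the result, so an engine is allowed to skip it. A small illustration of why that is safe, using Python's sqlite3 (table and field names merely echo this thread):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE vbap (vbeln TEXT, posnr TEXT, netwr REAL)")
# PRIMARY KEY makes the right side at most 1:1 per join key
cur.execute("CREATE TABLE vbpa (vbeln TEXT PRIMARY KEY, parvw TEXT, pernr TEXT)")
cur.executemany("INSERT INTO vbap VALUES (?, ?, ?)",
                [("A1", "10", 100.0), ("A2", "10", 250.0)])
cur.execute("INSERT INTO vbpa VALUES ('A1', 'ZP', 'REP01')")

# Left outer join, but no field from vbpa is requested ...
joined = cur.execute("""
    SELECT v.vbeln, v.netwr
    FROM vbap AS v LEFT JOIN vbpa AS p ON p.vbeln = v.vbeln
""").fetchall()

# ... so the result is identical to not joining at all, which is why
# an engine may prune the join entirely
plain = cur.execute("SELECT vbeln, netwr FROM vbap").fetchall()
print(sorted(joined) == sorted(plain))  # prints True
```

An inner join, or any reference to a vbpa field (including in a restricted or calculated measure), removes that guarantee, which matches the behavior described above.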
Please send the full diagram of your model and vizplan. Also, if you are using a front end (like analysis office), please trap the SQL sent from this front end tool and include it in the message. Even a straight SQL you have used in which you have detected this performance issue will be helpful.
Ramana -
Issue while checking data in table
I have written a program to check the data in the table. The program is:
REPORT ZTEST3.
tables zempdata.
data wa_zempdata type zempdata.
loop at zempdata into wa_zempdata.
WRITE: / 'Runtime 1', wa_zempdata-employee_number.
WRITE: / 'Runtime 2', wa_zempdata-employee_name.
ENDSELECT.
It gives me an error saying 'VERSION expected after ZEMPDATA'.
Is there any way to resolve it, or any other simple way to check the data in the table through a report program, as I don't have access to SE16?
Hi,
Try with the below code
DATA : it_empdata TYPE TABLE OF zempdata,
       wa_empdata TYPE zempdata.
SELECT * FROM zempdata INTO TABLE it_empdata.
LOOP AT it_empdata INTO wa_empdata.
  WRITE :/ wa_empdata-employee_number, wa_empdata-employee_name.
ENDLOOP.
Regards
Bala Krishna -
Hi,
I am facing an issue while filling the setup table for the Quality Management datasource.
Error message:
No summarization result is selected,
Message no. MCEX105
Thanks,
Ram
Hi Ram,
I hope you are using the right transaction - go to transaction OLIQBW,
give the name of the run, and execute it in the background.
I hope you have done the same thing; if not, please let me know.
Regards,
Kiran -
Getting Error Out Of Memory while importing the work repository in ODI 10g
I exported the work repository from the topology of ODI 10g on one DB and tried importing it into another ODI 10g topology on another DB. While importing I got the error 'Out of Memory'.
Can somebody suggest how to solve the heap-size out-of-memory issue while importing in ODI 10g?
Thanks in advance.
Hi,
you have to post your question in ODI forum
Data Integrator
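For the memory error itself: in ODI 10g the client JVM heap is configured in odiparams.bat (Windows) or odiparams.sh (Unix) in the ODI bin directory. Raising the maximum heap before re-running the import usually resolves the OutOfMemory error; the values below are examples, to be tuned to your machine:

```
rem in oracledi/bin/odiparams.bat (use the export syntax in odiparams.sh on Unix)
set ODI_INIT_HEAP=256m
set ODI_MAX_HEAP=1280m
```

Restart the ODI client after changing these values so the new heap settings take effect.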
Suresh -
ODI 10g - session keep a table locked
Hi,
We have a random issue with an ODI session that keeps a lock on a table even after replication is finished and the session becomes inactive.
It generates deadlocks, as a trigger has to update the target table.
What happened:
- the user application creates rows (13 rows)
- an ODI scenario replicates the rows (contract table)
- a 2nd scenario, based on the same source with another subscriber, runs a stored procedure to create records in another table (around 30 rows, positions table)
This 2nd scenario locked the target table, and when the run of the procedure finished and committed, the lock was not released.
- ODI replicates another table (price) 30 minutes later; a trigger on the target updates the positions table with new values
---> the trigger fails with a deadlock (ORA-00060)
---> ODI fails as the trigger raises the error back
This issue happened after 10 hours of the same activity without problems, chained many times, but suddenly the lock became persistent (more than 4 hours).
what can I do?
We use ODI 10g 10.1.3.5.0 - RDBMS 10.2.0.4
Hi!
For small tables which are mostly accessed with a full table scan you can use:
ALTER TABLE <table_name> STORAGE (BUFFER_POOL KEEP);
The KEEP pool should be properly sized; with this setting, once the table is read, Oracle will avoid flushing it out of the pool.
T -
ODI 10g - Load a table having XMLTYPE as one of the data types
I'm trying to load data from source to target (both Oracle), same table. The table has XMLTYPE as one of its data types. I can load the data if I exclude that particular column.
Is there any way to load the data from the XMLTYPE column as well? Please provide your input.
Source and target have the same table structure.
Version: ODI 10G
below is the sample table structure.
CREATE TABLE APPLICATION (
  APPLID VARCHAR2(20 BYTE) NOT NULL,
  JOBTYPE VARCHAR2(20 BYTE),
  USRID VARCHAR2(20 BYTE) NOT NULL,
  APPDOC SYS.XMLTYPE,
  APPLSTATUS VARCHAR2(30 BYTE),
  APPLDATE DATE
);
Thanks
Hi Pankaj,
Remember that 1 character is no longer equal to 1 byte, while numbers/hexadecimals still are.
So, if you have structures that are a mixture of characters and integers/hexadecimals etc., you cannot move data from structure to structure as easily as before.
In code you mentioned:
LOOP AT L_TAB INTO L_REC.
APPEND L_REC TO LIST_TAB.
ENDLOOP.
Avoid moving directly from L_REC to LIST_TAB. You can try to define a work area (WA_TAB) for LIST_TAB and use a MOVE-CORRESPONDING from L_REC to the WA_TAB. Then append WA_TAB.
If the field names from L_REC do not match the field names in LIST_TAB, you can manually move each field value to the correct field in the work area.
Reward points if this solves your issue -
Issue while insert and update data to DB tables
Hello all,
I am having an issue while inserting data into a DB table.
My scenario is DB1 to DB2. I have a sender channel with a SELECT query which fetches data from DB1 and inserts it into DB2.
The SELECT query fetches the records that were INSERTED into DB1 and the records that were UPDATED in DB1; these need to be inserted/updated in the DB2 table.
Now the issue is that I am able to insert the records but not able to update them in the DB2 table, due to a primary key issue.
In my message mapping, the sender message type is as follows:
<src_message1>
----<row>
-------<fieldA>
-------<fieldB>
-------<fieldC>
The receiver message type is as follows:
<trgt_message1>
----<STATEMENT_1>
----<TABLE_NAME>
----<ACTION> INSERT
----<TABLE>
----<ACCESS>
----<field1> primary key
----<field2>
----<field3>
----<field4>
----<KEY>
----<field1>
----<field2>
----<field3>
----<field4>
My query in the sender channel is: select fieldA, fieldB, fieldC from test_table where createdate=sysdate or updatedate=sysdate
It fetches the data from DB1 and inserts into DB2, but does not update the records in DB2 due to the primary key issue.
Please suggest how to solve this. Will using UPDATE_INSERT for the action solve it?
Best regards, SARAN
Hi Nagarjuna,
I have made the following changes to the target mapping structure:
1. Action set to UPDATE_INSERT.
2. In the Access tab, I mapped fieldDate to field4.
3. In the Key tab, I assigned sysdate to field4.
But the issue still exists. Could you please check whether my changes above are correct? If they are wrong, please provide the details of what needs to be done.
Thanks in advance.
I'm providing the error details again:
My query in the sender channel is: select fieldA, fieldB, fieldC, fieldDate from TEST_TABLE where fieldDate=sysdate or updatedate=sysdate
It returns 4 records as follows:
fieldA--fieldB--fieldC--fieldDate
1001----EU------1-------2011-11-10
1002----CN------0-------2011-11-10
1003----AP------1-------2008-03-15 (already exists in DB2)
1004----JP------1-------2007-04-12 (already exists in DB2)
The first two records were created today, and for the remaining 2 records fieldC was updated from 0 to 1 (in DB1).
While inserting these 4 records into DB2, we get the following error: "java.sql.SQLException: ORA-00001: unique constraint (data.TEST_TABLE_PK) violated".
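On the action itself: UPDATE_INSERT in the receiver JDBC structure corresponds to an upsert, that is, update by key and insert only when no row matches, which avoids the ORA-00001 above. The same idea sketched directly in SQL with Python's sqlite3 (the table name and data follow this thread; the real target is Oracle, whose equivalent is MERGE):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE test_table (
    fieldA INTEGER PRIMARY KEY, fieldB TEXT, fieldC INTEGER, fieldDate TEXT)""")

# Rows already present in DB2, with the old fieldC values
cur.executemany("INSERT INTO test_table VALUES (?, ?, ?, ?)", [
    (1003, "AP", 0, "2008-03-15"),
    (1004, "JP", 0, "2007-04-12"),
])

# The four rows fetched from DB1
incoming = [
    (1001, "EU", 1, "2011-11-10"),
    (1002, "CN", 0, "2011-11-10"),
    (1003, "AP", 1, "2008-03-15"),
    (1004, "JP", 1, "2007-04-12"),
]

# Upsert: insert new keys, update existing ones instead of raising
# a unique-constraint error like ORA-00001
for row in incoming:
    cur.execute("""
        INSERT INTO test_table VALUES (?, ?, ?, ?)
        ON CONFLICT(fieldA) DO UPDATE SET
            fieldB = excluded.fieldB,
            fieldC = excluded.fieldC,
            fieldDate = excluded.fieldDate""", row)

rows = cur.execute(
    "SELECT fieldA, fieldC FROM test_table ORDER BY fieldA").fetchall()
print(rows)  # all four keys present, fieldC now 1 where it changed
```

Note the key used for matching must be the table's real primary key (fieldA here); keying on a date column, as in change 3 above, will not match the existing rows.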
Best Regards,SARAN -
Error while creating publisher change tables in CDC
Hi,
I am implementing Change Data Capture. I am getting the following error while creating publisher change tables in the staging database. My database version is 10.2.0.2.0.
I appreciate your help.
ERROR at line 1:
ORA-29540: class oracle/CDC/PublishApi does not exist
ORA-06512: at "SYS.DBMS_CDC_PUBLISH", line 611
ORA-06512: at line 2
Thanks,
Venkat.
This problem got fixed when I ran the script below!!
@$ORACLE_HOME/rdbms/admin/initcdc.sql; -
Issue while deleting a row from a table
Dear friends,
I am getting an issue while deleting a row from a table; please check the screenshots. The first screenshot shows my table contents.
When I delete row 2, the second row is deleted properly, as in the screenshot below.
But I want the result to be like the last screenshot: the Col1 contents should stay as in pic 1. Could anyone please let me know how to solve this issue?
Thanks
Vijaya
Hi Vijaya,
please try this code; it should help you.
DATA : it_rows TYPE wdr_context_element_set,
       wa_rows LIKE LINE OF it_rows.
DATA lo_nd_table TYPE REF TO if_wd_context_node.
DATA lt_table TYPE wd_this->elements_table.
DATA ls_table TYPE wd_this->element_table.
DATA ld_index TYPE i.
DATA value TYPE sy-index.
* navigate from <CONTEXT> to <table> via lead selection
lo_nd_table = wd_context->get_child_node( name = wd_this->wdctx_table ).
* read the current table contents
lo_nd_table->get_static_attributes_table( IMPORTING table = lt_table ).
* selected rows and lead selection index
it_rows = lo_nd_table->get_selected_elements( ).
value = lo_nd_table->get_lead_selection_index( ).
LOOP AT it_rows INTO wa_rows.
  wa_rows->get_static_attributes( IMPORTING static_attributes = ls_table ).
  READ TABLE lt_table INTO ls_table WITH KEY col1 = ls_table-col1.
  ld_index = value.
ENDLOOP.
* clear everything except Col1, then write the row back
CLEAR ls_table-col2.
MODIFY lt_table INDEX ld_index FROM ls_table.
lo_nd_table->bind_table( new_items = lt_table set_initial_elements = abap_true ). -
An issue while installing oracle 10g express edition on Ubuntu 10.04-64bit
Hi,
I'm running into an issue while installing Oracle 10g Express Edition on my Ubuntu 10.04 64-bit machine using the oracle-xe-universal_10.2.0.1-1.0_i386.deb package. The issue actually came up when I was trying to reinstall the package after uninstalling it due to some problem I had with my login credentials. To elaborate: the /etc/init.d/oracle-xe script does not get created when I execute the Debian package, and thus I'm unable to configure the required credentials and settings after the installation. I suspect this has something to do with the procedure I followed when uninstalling the previous package, but what I did was just what's mentioned in the following user guide published by Oracle: [http://download.oracle.com/docs/cd/B25329_01/doc/install.102/b25144/toc.htm]
Another important clue would be that the platform of my machine is amd64, although I'm dealing with an installation package built for i386. But I believe it's the only thing available in the Oracle software download section, hence I had to use the --force-architecture option to install it. After reading some posts published on this forum, I've figured out that this does not affect the usual execution of the software. Anyway, I would greatly appreciate some help on this matter.
[1] http://download.oracle.com/docs/cd/B25329_01/doc/install.102/b25144/toc.htm
Thanks and Regards,
Prabath Abeysekara
Edited by: 829281 on Jan 18, 2011 7:52 PM
Hi,
Finally I've managed to find a solution for this issue. Thanks a lot for all the replies.
Here's how I got rid of it.
Addressing all the facts one by one,
1.The problem of not getting the /etc/init.d/oracle-xe script generated was avoided
by removing all the configuration files that were previously created while installing
the product. The following command can be used to accomplish that task.
*$ dpkg --purge oracle-xe*
2. Initially, the password I used during the configuration process was "admin".
Although I tried installing the product a couple of times with that
password, the web UI kept prompting an "Invalid Credentials" message, not allowing me
to log in. This particular issue could be avoided by giving some other simple
password.
Finally, I got it working, and once again thanks a bunch for taking the trouble of
replying to this thread.
Cheers,
Prabath -
Issues while starting HTTP Server for a new installation of 10g AS
I installed 10g AS version 10.1.3.1. I am having this issue while running the installation:
The Configuration assistants are failing with the following error:
ias-component/process-type/process-set:
HTTP_Server/HTTP_Server/HTTP_Server/
Error
--> Process (index=1,uid=621769602,pid=30999)
failed to start a managed process after the maximum retry limit
Log:
/opt/dba/oracle/product/10.1.3.1/OracleAS_1/opmn/logs//HTTP_Server~1.log
Configuration assistant "Oracle Process Management and Notification Configuration Assistant" failed
I am using a staticports.ini file with the following values
Oracle HTTP Server port =81
Oracle HTTP Server Listen port = 81
Oracle HTTP Server SSL port = 444
Oracle HTTP Server Listen (SSL) port = 444
I am not sure what the following are for or what values I should use for them, so I commented them out:
#Oracle Notification Server Request port = port_num
#Oracle Notification Server Local port = port_num
#Oracle Notification Server Remote port = port_num
#ASG port = port_num
Has anyone encountered this issue??
Port numbers less than 1024 are privileged ports on Unix/Linux. In order for OHS to run on one of the privileged ports (e.g. 81 and 444 in your case) you must enable OHS to run as root. By default OHS runs as a non-root user, the user that installed Oracle Application Server.
So to start with, do not specify custom (static) ports in the installer and complete the installation. OHS will be assigned a non-privileged port by the installer. Post-installation, follow the instructions here to change the ports to 81 and 444:
http://download.oracle.com/docs/cd/B32110_01/core.1013/b32196/ports.htm#CIHJEEJH
Thanks
Shail