Journalised data is not loaded in Target??
Hi All,
I am new to ODI.
In the Journalizing tab, I am using "Simple" as the journalizing mode and "JKM Oracle Simple" as the KM.
Using CDC, I am capturing the new and changed data (viewed through Journal Data).
But the data is not being loaded into the target.
Please give me the correct steps to follow.
Thanks in advance.
Hi Imus,
Are you sure the subscriber you created is the same one typed in the interface? (It is case-sensitive...)
Cezar Santos
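Since the subscriber name is case-sensitive, one quick way to see what is actually registered is to query the J$ table directly. This is only a sketch: the ODI_WORK schema and J$CUSTOMER table are hypothetical names following the JKM Oracle Simple convention (J$ prefixed to the datastore name); substitute your own work schema and journal table.

```sql
-- Hedged sketch: list the subscribers registered in a journal table.
-- ODI_WORK and J$CUSTOMER are illustrative; use the J$ table that
-- Start Journal created in your work schema.
SELECT DISTINCT JRN_SUBSCRIBER
FROM   ODI_WORK.J$CUSTOMER;
```

The value returned must match the subscriber used in the interface character for character, including case.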
Similar Messages
-
CDC Error (journalized data is not loading into target database)
HI,
I have enabled the source database for CDC and got a green mark on the source database model tables.
While inserting data into the source, the J$ tables are updated with JRN_SUBSCRIBER and the other fields.
When I run the package/interface, the journalized data is not loaded into the target database.
I have implemented CDC for 7 source tables,
using JKM MSSQL Simple,
with "Journalized data only" enabled at the interface level.
Source database: MSSQL Server
Target database: Oracle 11g.
Please advise me.
thanks in advance.
Zakeer Hussain
Zakeer, look into this link: http://odiexperts.com/?p=1096 . Hope this helps.
Also, before running, can you right-click the source datastore, click Journal Data, and confirm you can see the data? If the data is still not passing through, set "temporary objects" to Yes in the LKM and IKM, debug, and see at which step the data stops flowing; also look for any filter or condition that is stopping it.
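Before digging into the LKM/IKM steps, it can also help to look at the journal table itself and confirm that unconsumed rows exist for your subscriber. A sketch, assuming the column names used by the simple JKM (JRN_SUBSCRIBER, JRN_CONSUMED, JRN_FLAG) and a hypothetical journal table name:

```sql
-- Hedged sketch (MSSQL source): check for captured-but-unconsumed changes.
-- dbo.J$SRC_ORDERS is an illustrative journal table name.
SELECT JRN_SUBSCRIBER, JRN_FLAG, COUNT(*) AS pending_rows
FROM   dbo.J$SRC_ORDERS
WHERE  JRN_CONSUMED = '0'        -- not yet consumed by any subscriber
GROUP  BY JRN_SUBSCRIBER, JRN_FLAG;
```

If this returns nothing for your subscriber, the problem is on the capture side (triggers/journalizing); if rows are present but the interface moves nothing, suspect the interface filter or a mismatched subscriber name.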
If you are still not able to figure it out, please tell us at which step the data is not flowing through and we will try to guide you. -
Data is not loading into the target table in BODS
The BODS job is executing successfully, but data is not loading.
Can anyone please let me know what the problem is?
Regards,
Deva
Hi Deva,
The information provided is very limited. What are your source and target databases?
What is the structure of the job design?
If the job completed without error, review the job logs to confirm that rows were indeed processed.
Regards
Charles -
PSA data change not loaded to the target.
Hi all,
We have the data in the PSA (it is master data). While transferring the data from the PSA to the target, the system throws an error. On analysing the error, we found that some characters in the data were wrong; these were corrected in the PSA. On reloading to the target after the correction, the system ignores the corrected records and loads only the other records.
What could be the reason for this behaviour?
Can experts help?
Regards,
M.M
Dear Mr. Goud/Murali,
Thank you for your immediate reply and instructions.
Mr. Goud,
The PSA QM status is green, but even then the corrected record is not loaded to the target.
Mr. Murali,
As per your information, we do not have an error DTP; I have only one DTP and we are on BI 7.00. I have deleted the request from the target, corrected the erroneous record, and am trying to load only that record using the filter option in the DTP, but the data is still not loaded.
Can you suggest a method to adopt? I am not able to understand why the corrected record is not loaded. Is it that a manually corrected record in the PSA will not be loaded into the target?
Kindly clarify.
Regards,
M.M -
Urgent: Data is not loaded for a particular InfoObject in ODS
Hi All,
We have loaded data into an ODS (0PRC_DS01) in the development server. All InfoObjects were loaded successfully, and reports were working well.
When we transported it to production, data loading was taking a very long time (15,000 records in 9 hours), so we did two things to improve the loading speed.
1) We created an index on the fields in the WHERE clause (RCLNT, RLDNR, RYEAR, DOCNR, POPER) in the JVSO1 table.
(JVSO1 is an R/3 table from which key data comes to datasource 0EN_JVA_2.)
2) We updated the optimizer statistics.
Now the problem is that data is not loaded to one particular InfoObject, JOINT VENTURE PARTNER, in the DSO, which loaded successfully in development.
Please help us. We will assign points for helpful answers.
Hi,
Check in the transfer and update rules whether you mapped the field to the target, and also check whether you have a routine. Also check whether data is coming for that object from the source.
Khaja -
Page data does not load with goButton? (new window of identical page)
Hi guys. I ran into a problem. Essentially, I need to print a .jsff page but before I do, I have to alter its format a little bit. The solution I decided to use was to create a new .jspx page containing almost the exact same content. I even copied the data bindings and everything. Finally, I created a goButton on my original page like so:
<af:goButton text="Print as BOSSMON" destination="SubmitPrint.jspx?org.apache.myfaces.trinidad.agent.email=true"
targetFrame="_blank">
</af:goButton>
This opens my page nicely in the format I want. But... there is no data on the page! It just opens a frame of the page. Can someone help me resolve this?
Otherwise, is there any other technique I could use to accomplish my goal?
Hi guys. Thanks to everyone for their suggestions. I finally got a working solution, something totally different from the suggestions.
My goal was to print a .jsff page nicely. But before I could do that, I needed to modify some data on the page. I also wanted to do this in as few clicks as possible. I tried the goButton technique but the binding data would not load; I couldn't figure out why.
Nonetheless, I finally managed to make my own version (albeit, not that neat!). I accomplished this by essentially making a template which automatically opens in showprintablebehavior format. I tried to follow a bunch of online tutorials/blogs but nothing worked for me. Here are the steps I used:
1) Creating a new .jspx page.
2) Copy the entire source of the .jsff page (the one which needs to be printed) into our new template .jspx page. (You do have to modify the beginning by adding <af:document>, <af:form>, etc.)
3) Create a pageDef for the template page and copy the entire pageDef as well (you do have to change the name of the pageDef).
4) Modify your print template to your liking.
5) Add a taskflow rule that links the original .jsff page to the new template page
6) Add a commandButton on the original .jsff page which opens this printing page in a new window (remember to use "dialog:....."). Here is mine:
<af:commandButton text="Open Printer Friendly Version" id="printer_popup" action="dialog:edit_print" useWindow="true"
windowEmbedStyle="window" inlineStyle="display:block;"
windowHeight="100" windowWidth="100">
</af:commandButton>
7) Now, in the printing template page, add a command button which simply has the showPrintablePageBehavior tag within it, as well as a client listener to invoke some JavaScript. Here is what I used:
<af:commandButton text="Printer Page" id="sha_print">
<af:clientListener method="do_loading" type="focus"/>
<af:showPrintablePageBehavior/>
</af:commandButton>
8) At the beginning of this page, you will have an <af:document> tag. Modify it so it sets focus to our print button when the page loads. Like so:
<af:document id="d2" initialFocusId="sha_print">
9) Now add a resource tag to your page which has the "do_loading" JavaScript method.
<f:facet name="metaContainer">
<af:resource type="javascript">
function do_loading(event) {
var target = document.getElementById('sha_print');
target.click();
top.close();
//alert("got focus sir!");
}
</af:resource>
</f:facet>
That's it! What this does: when you click the command button to open the printer template, the template page opens, and as it opens it automatically sets focus to the printer button via the document tag. The printer button has a listener which activates when it receives focus (in our case, as soon as the page loads). This listener invokes the JavaScript method, which programmatically clicks the printer button and closes the window. What you're left with is a printer-friendly page of your modified page!
All the data is present there too!
If anyone has any questions/comments, please do ask. -
Impact on transactional Loads if Master data is not loaded
Scenario for your reference:
The loads to ZPO_LINE from PC3 had been failing for the past 20 days. The fix was applied on April 21.
I need help to assess the effects of not loading ZPO_LINE for the past 20 days, and to create a detailed plan for the required data loads.
If the master data is not loaded for 20 days, will it affect the transaction loads that happened during those days?
And how can I find out the impact and correct the transaction data if it does?
Can anyone help me with this?
Hi,
If I understand your scenario, the master data has not been updated for the last 20 days, but the transaction data was loaded without any interruption.
In such a scenario, the transactional loads will only be affected if some field undergoes a transformation that looks up the master data.
So, first load the master data and run the attribute change run process.
After this, if the transaction data is in full update mode, you don't need to do anything, as the data will be refreshed with correct values in the next load.
If there are delta loads, you might need to perform a full repair.
Regards,
Rahul -
AWM - After successful cube maintenance, data does not load
Hi All,
After creating the cube, when I maintain the cube the data does not load. I went back and looked at the mapping at the cube level and the dimension level, and everything looks fine. Any clues why the data does not get loaded? Also, I did not receive any error while maintaining the cube.
Thanks
Hi,
I was able to resolve this issue. I had not mapped the cube properly, especially the time dimension. Because of this, the data did not get loaded into the cube. Remapping fixed the problem.
Hope this helps other newbies.. -
Embedded Database - data does not load
Hello.
I am currently using NetBeans 6.5 and am experimenting with the CarsApp tutorial.
Whenever I create an application using the server database, the data loads automatically when the application starts.
However, when I create the application using an embedded database, the data does not load and I just get a blank table.
For more information, go to: http://forums.sun.com/thread.jspa?threadID=5366394&tstart=0
Thank you in advance!
I'm sorry, I didn't know it was so looked down upon.
I thought that since this topic pertained to both Destop Applications and JDBC, I could get the best and quickest answer if I posted in both.
Apparently, that was a mistake.
Thanks, anyways! -
Data is not replicated on target database - oracle stream
I have set up Streams replication on 2 databases running Oracle 10.1.0.2 on Windows.
I followed the Metalink document's steps for setting up one-way replication between two Oracle databases using Streams at the schema level.
I entered a few records in the source db, and the data is not getting replicated to the destination db. Could you please guide me on how to analyse this problem to reach a solution?
Steps for configuration, as followed from the Metalink doc:
==================
Set up ARCHIVELOG mode.
Set up the Streams administrator.
Set initialization parameters.
Create a database link.
Set up source and destination queues.
Set up supplemental logging at the source database.
Configure the capture process at the source database.
Configure the propagation process.
Create the destination table.
Grant object privileges.
Set the instantiation system change number (SCN).
Configure the apply process at the destination database.
Start the capture and apply processes.
Section 2 : Create user and grant privileges on both Source and Target
2.1 Create Streams Administrator :
connect SYS/password as SYSDBA
create user STRMADMIN identified by STRMADMIN;
2.2 Grant the necessary privileges to the Streams Administrator :
GRANT CONNECT, RESOURCE, AQ_ADMINISTRATOR_ROLE,DBA to STRMADMIN;
In 10g :
GRANT CONNECT, RESOURCE, AQ_ADMINISTRATOR_ROLE,DBA to STRMADMIN;
execute DBMS_STREAMS_AUTH.GRANT_ADMIN_PRIVILEGE('STRMADMIN');
2.3 Create streams queue :
connect STRMADMIN/STRMADMIN
BEGIN
DBMS_STREAMS_ADM.SET_UP_QUEUE(
queue_table => 'STREAMS_QUEUE_TABLE',
queue_name => 'STREAMS_QUEUE',
queue_user => 'STRMADMIN');
END;
Section 3 : Steps to be carried out at the Destination Database PLUTO
3.1 Add apply rules for the Schema at the destination database :
BEGIN
DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
schema_name => 'SCOTT',
streams_type => 'APPLY',
streams_name => 'STRMADMIN_APPLY',
queue_name => 'STRMADMIN.STREAMS_QUEUE',
include_dml => true,
include_ddl => true,
source_database => 'REP2');
END;
3.2 Specify an 'APPLY USER' at the destination database:
This is the user who would apply all DML statements and DDL statements.
The user specified in the APPLY_USER parameter must have the necessary
privileges to perform DML and DDL changes on the apply objects.
BEGIN
DBMS_APPLY_ADM.ALTER_APPLY(
apply_name => 'STRMADMIN_APPLY',
apply_user => 'SCOTT');
END;
3.3 Start the Apply process :
DECLARE
v_started number;
BEGIN
SELECT decode(status, 'ENABLED', 1, 0) INTO v_started
FROM DBA_APPLY WHERE APPLY_NAME = 'STRMADMIN_APPLY';
if (v_started = 0) then
DBMS_APPLY_ADM.START_APPLY(apply_name => 'STRMADMIN_APPLY');
end if;
END;
Section 4 :Steps to be carried out at the Source Database REP2
4.1 Move LogMiner tables from SYSTEM tablespace:
By default, all LogMiner tables are created in the SYSTEM tablespace.
It is a good practice to create an alternate tablespace for the LogMiner
tables.
CREATE TABLESPACE LOGMNRTS DATAFILE 'logmnrts.dbf' SIZE 25M AUTOEXTEND ON
MAXSIZE UNLIMITED;
BEGIN
DBMS_LOGMNR_D.SET_TABLESPACE('LOGMNRTS');
END;
4.2 Turn on supplemental logging for DEPT and EMPLOYEES table :
connect SYS/password as SYSDBA
ALTER TABLE scott.dept ADD SUPPLEMENTAL LOG GROUP dept_pk(deptno) ALWAYS;
ALTER TABLE scott.EMPLOYEES ADD SUPPLEMENTAL LOG GROUP dep_pk(empno) ALWAYS;
Note: If the number of tables is large, supplemental logging can be
set at the database level.
4.3 Create a database link to the destination database :
connect STRMADMIN/STRMADMIN
CREATE DATABASE LINK PLUTO connect to
STRMADMIN identified by STRMADMIN using 'PLUTO';
Test the database link to be working properly by querying against the
destination database.
Eg : select * from global_name@PLUTO;
4.4 Add capture rules for the schema SCOTT at the source database:
BEGIN
DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
schema_name => 'SCOTT',
streams_type => 'CAPTURE',
streams_name => 'STREAM_CAPTURE',
queue_name => 'STRMADMIN.STREAMS_QUEUE',
include_dml => true,
include_ddl => true,
source_database => 'REP2');
END;
4.5 Add propagation rules for the schema SCOTT at the source database.
This step will also create a propagation job to the destination database.
BEGIN
DBMS_STREAMS_ADM.ADD_SCHEMA_PROPAGATION_RULES(
schema_name => 'SCOTT',
streams_name => 'STREAM_PROPAGATE',
source_queue_name => 'STRMADMIN.STREAMS_QUEUE',
destination_queue_name => 'STRMADMIN.STREAMS_QUEUE@PLUTO',
include_dml => true,
include_ddl => true,
source_database => 'REP2');
END;
Section 5 : Export, import and instantiation of tables from
Source to Destination Database
5.1 If the objects are not present in the destination database, perform
an export of the objects from the source database and import them
into the destination database
Export from the Source Database:
Specify the OBJECT_CONSISTENT=Y clause on the export command.
By doing this, an export is performed that is consistent for each
individual object at a particular system change number (SCN).
exp USERID=SYSTEM/manager@rep2 OWNER=SCOTT FILE=scott.dmp
LOG=exportTables.log OBJECT_CONSISTENT=Y STATISTICS = NONE
Import into the Destination Database:
Specify STREAMS_INSTANTIATION=Y clause in the import command.
By doing this, the streams metadata is updated with the appropriate
information in the destination database corresponding to the SCN that
is recorded in the export file.
imp USERID=SYSTEM@pluto FULL=Y CONSTRAINTS=Y FILE=scott.dmp IGNORE=Y
COMMIT=Y LOG=importTables.log STREAMS_INSTANTIATION=Y
5.2 If the objects are already present in the destination database, there
are two ways of instantiating the objects at the destination site.
1. By means of Metadata-only export/import :
Specify ROWS=N during Export
Specify IGNORE=Y during Import along with above import parameters.
2. By manually instantiating the objects
Get the Instantiation SCN at the source database:
connect STRMADMIN/STRMADMIN@source
set serveroutput on
DECLARE
iscn NUMBER; -- Variable to hold instantiation SCN value
BEGIN
iscn := DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER();
DBMS_OUTPUT.PUT_LINE ('Instantiation SCN is: ' || iscn);
END;
Instantiate the objects at the destination database with
this SCN value. The SET_TABLE_INSTANTIATION_SCN procedure
controls which LCRs for a table are to be applied by the
apply process. If the commit SCN of an LCR from the source
database is less than or equal to this instantiation SCN,
then the apply process discards the LCR. Else, the apply
process applies the LCR.
connect STRMADMIN/STRMADMIN@destination
BEGIN
DBMS_APPLY_ADM.SET_SCHEMA_INSTANTIATION_SCN(
SOURCE_SCHEMA_NAME => 'SCOTT',
source_database_name => 'REP2',
instantiation_scn => &iscn );
END;
Enter value for iscn:
<Provide the value of SCN that you got from the source database>
Note: In 9i, you must instantiate each table individually.
In 10g, the recursive=>true parameter of DBMS_APPLY_ADM.SET_SCHEMA_INSTANTIATION_SCN
is used for instantiation.
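For example, the 10g schema-level call with the recursive parameter could look like the following sketch (same SCOTT/REP2 names as above; with recursive set, the instantiation SCN is also recorded for each table in the schema):

```sql
connect STRMADMIN/STRMADMIN@destination
BEGIN
  DBMS_APPLY_ADM.SET_SCHEMA_INSTANTIATION_SCN(
    source_schema_name   => 'SCOTT',
    source_database_name => 'REP2',
    instantiation_scn    => &iscn,
    recursive            => true);  -- 10g: also instantiates each table in the schema
END;
/
```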
Section 6 : Start the Capture process
begin
DBMS_CAPTURE_ADM.START_CAPTURE(capture_name => 'STREAM_CAPTURE');
end;
/
You must have imported a JKM; after that, these are the steps:
1. Go to source datastrore and click on CDC --> Add to CDC
2. Click on CDC --> Start Journal
3. Now go to the interface, choose the source table, select "Journalized data only", and then click OK
4. Now execute the interface
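When "Journalized data only" is checked, ODI effectively sources the interface from the JV$ view and filters on the subscriber, along the lines of this sketch (schema, view, and column names are illustrative, following simple-JKM conventions):

```sql
-- Hedged sketch of what "Journalized data only" adds to the source query.
SELECT CUST_ID, CUST_NAME, JRN_FLAG      -- JRN_FLAG: 'I' insert/update, 'D' delete
FROM   ODI_WORK.JV$CUSTOMER              -- JV$ view created by Start Journal
WHERE  JRN_SUBSCRIBER = 'SUNOPSIS';      -- must match your subscriber exactly
```

If the subscriber name in the interface does not match the one created at Add Subscriber time (it is case-sensitive), the interface returns zero rows even though Journal Data shows changes.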
If it still doesn't work, are you using transactions in your interface? -
Hi,
I have loaded data to the cube. But when I check the Withdrawal Qty field, I found it does not match the source tcode.
I think some of the records are missing.
My source tcode is MCRE.
Cube: 0PP_C06.
I found that the data is not matching for one plant only; for the remaining plants, the Withdrawal Qty is coming through correctly.
Can you please tell me how I should find which records are missing?
Regards
Naresh
Hi Naresh,
First compare the data in RSA3 with the source tcode MCRE for the Withdrawal Qty field against the plant.
Thx
Edited by: satya prasad on May 21, 2010 7:17 AM -
The new iTunes update will not download; it says I have an older version and Bonjour cannot be removed. I tried removing Bonjour with Revo Uninstaller, which takes Bonjour off, but then my internet will not work.
iOS: Unable to update or restore
-
XML data is not loading inside buttons !
Hi there... it is an urgent issue: my application only loads the XML data into the "main" dynamic text fields, but all the rest, which are "inside" a button, do not load it! Why? Is there any special way to load them?
The way I loaded the "main" text fields is:
promo01title_txt.text =
obj_xml.firstChild.childNodes[0].firstChild.nodeValue;
I do the same for the buttons but it does not work.. am I
supposed to type a prefix or something like that before the text
field? I will appreciate your answer. Thanks in advance.
Jiten
If I understand your post, you have dynamic text fields on the
stage that you are loading string values from an XML file into the
text property. It sounds like you are trying to take XML strings
and then load them to a text field that is part of another movie
clip. Based on your code it would be simply
myButton.mytextfield.text =
xml.firstChild.childNodes[0].firstChild.nodeValue;
You need to give the complete movie clip path of the
text field. If it is part of another movie clip, then you need to
start with that movie clip's name, then the name of the
text field. -
The data is not loading in second level ODS
Hi Gurus,
I am loading the data from the first level to the second. The data loads properly to the first level, but the second level is picking up 0 records despite there being 19,604 records. This problem arose after my changes went live; they had been tested in the quality server and worked correctly.
Please throw some light.
Many thanks,
Sunil Morwal.
Hi Sunil,
1) In the monitor screen, is it green? In Processing, does it say "No data"?
2) Check the status tab of the monitor. What is the status?
If the second-level request is green, available for reporting, and has been activated, and the datamart status is checked in the first-level ODS, then there is no problem.
The error is because there are no records that were changed:
the records which came into the ODS were overwritten, and there are no records for the delta.
So there is no problem;
just do the loading for the next deltas.
Everything looks okay.
regards
AK
points=thanks -
Setups for new operating units not loading to target
If there is a new operating unit in the source, setups for that operating unit do not seem to load into the target if that new operating unit does not exist in the target.
Is there a script to address this issue?
Any additional advice will be appreciated.
Edited by: user11071661 on Apr 16, 2009 1:34 PM
Hi,
There are a lot of dependencies between the setup data you are trying to load. An operating unit depends on the accounting setup, which in turn depends on setups like calendar, currencies, and chart of accounts. So you have to make sure you have loaded all prerequisite data before loading the operating unit.
Thanks
Mugunthan.