GoldenGate extract from multiple Oracle databases on different servers
Hello,
I am new to GoldenGate and I want to know whether it is possible to extract data from an Oracle database that is on a different server. Below is the server list:
Linux server 1: has an Oracle database (11.2.0.4) (a1db) and GoldenGate (11.2.1.0.3) installed.
Linux server 2: has an Oracle database (11.2.0.4) (b1db).
a1db and b1db are not clustered; they are two separate instances on two different servers.
Is it possible to capture change data on b1db from the GoldenGate installation on Linux server 1? I am planning to use classic capture.
Can an architecture like the one below be done? If so, which option would I use in the Extract?
Thanks,
Arun
Here is something from my personal notes; hope this helps:
Standby or Off-Host Environment
GoldenGate Extracts, data pumps, and Replicats can all work with database environments accessed over TNS. When one of these processes needs to connect to a database over TNS, then instead of the following USERID specification:
setenv (ORACLE_SID = "GGDB")
USERID ggsuser, PASSWORD encrypted_password_cipher_text
The following USERID specification would be used:
USERID ggsuser@GGDB, PASSWORD encrypted_password_cipher_text
When this specification is used the setenv line is not required since the process will connect over TNS.
When a data pump or Replicat is running in a standby or otherwise off-host environment, the USERID specification above is the only special requirement. It is recommended that the TNS entry contain the necessary failover and service-name configuration so that if or when a switchover or failover occurs, the process can continue once the environment is available again. If the data pump is using the PASSTHRU parameter, then a USERID specification is not required: a data pump operating in PASSTHRU mode does not need a database connection to evaluate the metadata.
When a source Extract is running in a standby or otherwise off-host environment, the USERID specification above is required, as is Archive Log Only (ALO) mode. The same failover-aware TNS configuration is recommended. The source Extract requires a database connection in order to evaluate the metadata that occurs in the archived redo logs. Since the source Extract runs in an environment separate from the source database environment, it is unable to read the online redo logs, and therefore must be configured in Archive Log Only mode. If the environment the source Extract is running in is a standby environment, it will continue to evaluate the archived redo logs through a switchover.
The standby or off-host environment has minimal requirements: Oracle software availability and storage for the archived redo logs. If the environment where GoldenGate will be running is a standby database environment, GoldenGate can use the required shared libraries from that standby environment. However, if GoldenGate is being executed from a server that does not contain a database environment, a client installation is required at a minimum; this provides GoldenGate with the shared libraries needed to satisfy its dynamically linked library dependencies. The archived redo logs must also be available for GoldenGate to read, either via a shared storage solution or a dedicated one. A standby database environment works well for this purpose, as it receives archived redo logs on a regular basis; GoldenGate can leverage those logs without imposing any additional infrastructure requirements to evaluate and capture the data changes from the source database. For archived-redo-log access, only a minimal standby database is required: specifically, the standby database needs to be mountable so that it can accept archived redo logs. Since GoldenGate will connect to the primary database to evaluate the metadata contained in the archived redo logs, a complete standby database is not required.
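As a concrete illustration of the off-host source Extract described above, a minimal classic-capture parameter file might look like the following sketch. The group name, TNS alias, archive-log path, and schema are assumptions for illustration, not values from the thread:

```
EXTRACT extoff
-- Connect over TNS to the source database for metadata lookups;
-- no setenv (ORACLE_SID) line is needed with this form of USERID.
USERID ggsuser@b1db_svc, PASSWORD encrypted_password_cipher_text
-- An off-host Extract cannot read the online redo logs, so it must
-- run in Archive Log Only mode against a local copy of the archives.
TRANLOGOPTIONS ARCHIVEDLOGONLY
TRANLOGOPTIONS ALTARCHIVELOGDEST /u01/arch/b1db
EXTTRAIL ./dirdat/ea
TABLE app_schema.*;
```

The key pairing is the TNS-style USERID (for metadata) plus ARCHIVEDLOGONLY with a destination the local server can actually read.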
Similar Messages
-
Oracle Golden Gate - Extract DDL only
Hi. We are working on a GoldenGate proof of concept. The way our configuration is set up, we want to pull changes from our standby to keep load low on the primary. This will require us to extract DDL changes from the primary, so my current plan is to put the DML on a 5-minute delay and get the DDL immediately from the primary. I am running into two issues: (1) I cannot figure out the setting to get DDL only from the primary (right now I am getting both DDL and DML from the primary), and (2) I get the following error when retrieving data from the standby: 2015-04-16 20:08:40 ERROR OGG-00303 Oracle GoldenGate Capture for Oracle, ext1.prm: Invalid destination "+DATA/dgdemo/archivelog" specified for TRANLOGOPTION ALTARCHIVELOGDEST option. I also get an error when I do specify an archivelog destination. Can anyone point me to the appropriate settings? Below is the parameter file:
extract ext1
userid ggate password ggate
--TRANLOGOPTIONS ASMUSER sys@ASM ASMPASSWORD password
TRANLOGOPTIONS DBLOGREADER
TRANLOGOPTIONS DBLOGREADERBUFSIZE 2597152,ASMBUFSIZE 28000
TRANLOGOPTIONS ARCHIVEDLOGONLY
TRANLOGOPTIONS ALTARCHIVELOGDEST primary "+DATA/dgdemo/archivelog" RECURSIVE
discardfile ./dirrpt/ext1.dsc,purge
reportcount every 15 minutes, rate
exttrail ./dirdat/t1
table SCOTT.*;
OGG does not support the ALTARCHIVELOGDEST parameter in ALO mode until OGG version 12c.
Does GoldenGate Parameter ALTARCHIVELOGDEST Support ASM Diskgroups ? (Doc ID 1393059.1)
Also, in order to run Extract in ALO mode when archived logs are stored in ASM, the original database configuration must have a complete file specification in the log_archive_dest_n setting. An incomplete file specification leads ASM to ignore log_archive_format; an incomplete file spec contains only the diskgroup name, like +ASMDISK1.
Users should ensure log_archive_dest is set using a complete file specification. In that case, log_archive_format is honored by ASM, and Extract will work correctly.
For example:
alter diskgroup asmdisk2 add directory '+ASMDISK2/archivedir'; -
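Continuing the incomplete-versus-complete file specification point above, the difference can be sketched as follows; the diskgroup and directory names are illustrative assumptions:

```
-- Incomplete spec: diskgroup only, so ASM ignores log_archive_format
-- log_archive_dest_1 = 'LOCATION=+ASMDISK1'

-- Complete spec: diskgroup plus directory, so log_archive_format is honored
alter system set log_archive_dest_1='LOCATION=+ASMDISK2/archivedir' scope=both;
```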
Help on Oracle GoldenGate [SQL Server to Oracle Database]
Hi,
I am planning to use Oracle GoldenGate to replicate data from SQL Server 2008 to an Oracle database. I have a few queries I wanted to get clarity on before making a decision. It would be very helpful if anyone could clarify:
- Can I set up OGG to synchronize only a subset of tables within a schema? (e.g., 10 tables out of 50 tables in a schema)
- Once synchronization is set up after the initial load, can it be paused (for a few days) and then resumed again seamlessly?
Any help/input would be greatly appreciated.
Regards
Rajashekar
- Can I set up OGG to synchronize only a subset of tables within a schema? (e.g., 10 tables out of 50 tables in a schema)
Yes, the TABLE option in the Extract needs to specify each table name (not a wildcard ( * )). The Replicat needs to be done the same way.
Example:
--Extract--
TABLE <SCHEMA>.<TABLE_NAME>
--Replicat--
MAP <SCHEMA>.<TABLE_NAME>, TARGET <SCHEMA>.<TABLE_NAME>
- Once synchronization is set up after the initial load, can it be paused (for a few days) and then resumed again seamlessly?
Once the initial load is done, you can stop the Replicat for a few days. The only thing is that you need to make sure you have space for the trail files that will be generated, and access to any backups/archive logs that you may need later. -
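The pause-and-resume flow described above comes down to a few GGSCI commands on the target side; the group name is an illustrative assumption:

```
-- Leave the source Extract (and any pump) running so that changes
-- keep accumulating in the trail files during the pause.
STOP REPLICAT rep1

-- ...days later: the Replicat resumes from its checkpoint in the trail.
START REPLICAT rep1
INFO REPLICAT rep1, DETAIL
```

One caveat: if the Manager uses a PURGEOLDEXTRACTS rule, make sure it purges trail files only past processed checkpoints, so the paused Replicat's unread trail data is not deleted.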
GoldenGate between MySQL and Oracle not stable
Dear All,
I have configured replication between a MySQL database and an Oracle database using GoldenGate.
Extract data from MySQL is replicated to Oracle.
MySQL is on Linux and version is 5.5
Oracle is 11gR2 RAC on Linux (I am using one node for replication)
Most of the time replication is smooth, but occasionally the Extract process gets ABENDED, and when I try to restart it, it won't start.
The errors I have found in the log file and report file are like this:
From Report:
2013-03-20 15:39:42 ERROR OGG-00542 Unexpected threading library failure. Error code 35 (Resource deadlock avoided).
From Error Log File:
VAM function VAMRead returned unexpected result: error 600 - VAM Client Report <CAUSE OF FAILURE : Failed to Query Metadata from Table : kannel.sent_sms WHEN FAILED : While Sending Insert and Delete Record WHERE FAILED : MySQLBinLog Reader Module CONTEXT OF FAILURE : No Information Available!>.
2013-03-17 15:46:53 ERROR OGG-01668 Oracle GoldenGate Capture for MySQL, kannel3.prm: PROCESS ABENDING.
2013-03-20 10:45:51 ERROR OGG-00146 Oracle GoldenGate Capture for MySQL, kannel3.prm: VAM function VAMControl returned unexpected result: error 600 - VAM Client Report <CAUSE OF FAILURE : Sanity Check Failed for events WHEN FAILED : While reading log event from binary log WHERE FAILED : MySQLBinLog Reader Module CONTEXT OF FAILURE : No Information Available!>.
2013-03-20 10:45:51 ERROR OGG-01668 Oracle GoldenGate Capture for MySQL, kannel3.prm: PROCESS ABENDING.
2013-03-20 15:39:42 ERROR OGG-01414 Oracle GoldenGate Capture for MySQL, kannel3.prm: CACHEMGR: tran id: 0 length memtran: 0xb44b6dd8.
2013-03-20 15:39:42 ERROR OGG-00542 Oracle GoldenGate Capture for MySQL, kannel3.prm: Unexpected threading library failure. Error code 35 (Resource deadlock avoided).
2013-03-20 15:39:42 ERROR OGG-01668 Oracle GoldenGate Capture for MySQL, kannel3.prm: PROCESS ABENDING.
This is my extract param:
EXTRACT KANNEL3
DBOPTIONS HOST 10.168.20.241, CONNECTIONPORT 14421
SOURCEDB [email protected]:14421, USERID "kannel", PASSWORD "kannel"
RMTHOST 10.168.20.31, MGRPORT 7809
RMTTRAIL /u01/app/oracle/oradata/GG/dirdat/k3
TRANLOGOPTIONS ALTLOGDEST /root/sandboxes/multi_msb_5_5_19/node2/data/mysql-bin.index
TABLE kannel.sent_sms;
And this is my Replicat:
REPLICAT KREP3
USERID ggs_owner, PASSWORD warsaw
ASSUMETARGETDEFS
HANDLECOLLISIONS
SOURCEDEFS /u01/app/oracle/oradata/GG/dirdef/kannel2.def
DISCARDFILE /u01/app/oracle/oradata/GG/dirrpt/krep3.dsc, PURGE
MAP "kannel.sent_sms", TARGET kannel2.sent_sms, COLMAP (usedefaults,COMPRESS_=compress,SERVICE=@STRCAT(service,"_sms"));
However, my replicat keeps on running fine with no errors.
I look forward to your kind help in this regard.
Regards, Imran
I have now used a data pump to configure this replication.
I am still monitoring the situation.
So far I have faced one problem: when I stop the Extract myself and then try to restart it, it abends with no error in the report or log file.
This is in my log file:
Host Connection: 10.168.20.241 via TCP/IP
Protocol Version: 10
2013-03-27 12:01:58 INFO OGG-01055 Recovery initialization completed for target file /root/sandboxes/GG/dirdat/l1000002, at RBA 546054.
2013-03-27 12:01:58 INFO OGG-01478 Output file /root/sandboxes/GG/dirdat/l1 is using format RELEASE 10.4/11.1.
2013-03-27 12:01:58 INFO OGG-01026 Rolling over remote file /root/sandboxes/GG/dirdat/l1000002.
2013-03-27 12:01:58 INFO OGG-01053 Recovery completed for target file /root/sandboxes/GG/dirdat/l1000003, at RBA 1005.
2013-03-27 12:01:58 INFO OGG-01057 Recovery completed for all targets.
2013-03-27 12:01:58 INFO OGG-00182 VAM API running in single-threaded mode.
2013-03-27 12:01:58 INFO OGG-01513 Positioning to Log Number: 1915444
Record Offset: 2207.
2013-03-27 12:01:58 INFO OGG-00975 Oracle GoldenGate Manager for MySQL, mgr.prm: EXTRACT SMPP2 starting.
2013-03-27 12:01:58 INFO OGG-00992 Oracle GoldenGate Capture for MySQL, smpp2.prm: EXTRACT SMPP2 starting.
2013-03-27 12:01:58 INFO OGG-00993 Oracle GoldenGate Capture for MySQL, smpp2.prm: EXTRACT SMPP2 started.
2013-03-27 12:01:58 INFO OGG-01055 Oracle GoldenGate Capture for MySQL, smpp2.prm: Recovery initialization completed for target file /root/sandboxes/GG/dirdat/l1000002, at RBA 546054.
2013-03-27 12:01:58 INFO OGG-01478 Oracle GoldenGate Capture for MySQL, smpp2.prm: Output file /root/sandboxes/GG/dirdat/l1 is using format RELEASE 10.4/11.1.
2013-03-27 12:01:58 INFO OGG-01026 Oracle GoldenGate Capture for MySQL, smpp2.prm: Rolling over remote file /root/sandboxes/GG/dirdat/l1000002.
2013-03-27 12:01:58 INFO OGG-01053 Oracle GoldenGate Capture for MySQL, smpp2.prm: Recovery completed for target file /root/sandboxes/GG/dirdat/l1000003, at RBA 1005.
2013-03-27 12:01:58 INFO OGG-01057 Oracle GoldenGate Capture for MySQL, smpp2.prm: Recovery completed for all targets.
2013-03-27 12:01:58 INFO OGG-00182 Oracle GoldenGate Capture for MySQL, smpp2.prm: VAM API running in single-threaded mode.
2013-03-27 12:01:58 INFO OGG-01513 Oracle GoldenGate Capture for MySQL, smpp2.prm: Positioning to Log Number: 1915444 Record Offset: 2207.
Kindly help me figure out how to start this Extract.
Regards, Imran -
GoldenGate extract and replicat processes are not running.
All,
I am trying to replicate data between two Oracle databases using GoldenGate.
I am trying this scenario on a single machine (the two databases and GoldenGate are on the same Windows machine).
1. I have two databases PROD, UAT both are running on 11.2 oracle home.
2. Created the ggate user in both databases, and enabled supplemental logging.
3. Ran the following scripts in both databases.
SQL> @marker_setup.sql
SQL> @ddl_setup.sql
SQL> @role_setup.sql
SQL> grant GGS_GGSUSER_ROLE to ggate;
SQL> @ddl_enable.sql
4. Connected the source database (PROD) in ggsci prompt
GGSCI (home-c07402bbc5) 79> add extract ext1, tranlog, begin now
add exttrail C:\app\Bhanu\Goldengate\lt, extract ext1
edit params ext1
EXTRACT ext1
USERID ggate@PROD, PASSWORD 123456
RMTHOST home-c07402bbc5, MGRPORT 7840
rmttrail C:\app\Bhanu\Goldengate\lt
ddl include mapped objname bhanu.* -- bhanu is a schema in the PROD database.
TABLE bhanu.*;
5. Connected the target database(UAT) in ggsci prompt
add checkpointtable ggate.checkpoint
edit params ./GLOBALS
GGSCHEMA ggate
CHECKPOINTTABLE ggate.checkpoint
add replicat rep1, exttrail C:\app\Bhanu\Goldengate\Mt,checkpointtable ggate.checkpoint
edit params rep1
replicat rep1
ASSUMETARGETDEFS
userid ggate@UAT, password 123456
discardfile C:\app\Bhanu\Goldengate\rep1_discard.txt, append, megabytes 10
map bhanu.*, target kiran.*;
After that started the extract, replicat using
start extract ext1
start replicat rep1
Now the status.
GGSCI (home-c07402bbc5) 103> info all
Program Status Group Lag Time Since Chkpt
MANAGER RUNNING
EXTRACT STOPPED EXT1 00:00:00 00:11:43
REPLICAT STOPPED REP1 00:00:00 00:21:16
Can you please help me figure out what is wrong in my setup, and why the extract and replicat processes are not running?
Edited by: user12178861 on Nov 19, 2011 11:22 AM
Thanks for your quick reply.
I have made a few changes, but the extract and replicat processes are still not running.
A couple of points I would like to share with you regarding my setup:
1. I am using a single GoldenGate instance to replicate the data between the PROD and UAT databases.
2. GGSCI (home-c07402bbc5) 1> dblogin userid ggate@PROD,PASSWORD 123456
Successfully logged into database.
GGSCI (home-c07402bbc5) 2> info all
Program Status Group Lag Time Since Chkpt
MANAGER RUNNING
EXTRACT STOPPED EXT1 00:00:00 01:23:29
REPLICAT STOPPED REP1 00:00:00 01:33:02
GGSCI (home-c07402bbc5) 3> VIEW REPORT EXT1
ERROR: REPORT file EXT1 does not exist.
GGSCI (home-c07402bbc5) 4> start er *
Sending START request to MANAGER ...
EXTRACT EXT1 starting
Sending START request to MANAGER ...
REPLICAT REP1 starting
GGSCI (home-c07402bbc5) 5> VIEW REPORT EXT1
ERROR: REPORT file EXT1 does not exist.
GGSCI (home-c07402bbc5) 6> info all
Program Status Group Lag Time Since Chkpt
MANAGER RUNNING
EXTRACT STOPPED EXT1 00:00:00 01:24:10
REPLICAT STOPPED REP1 00:00:00 01:33:44
Target :
GGSCI (home-c07402bbc5) 1> dblogin ggate@UAT,PASSWORD 123456
ERROR: Unrecognized parameter (GGATE@UAT), expected USERID.
GGSCI (home-c07402bbc5) 2> dblogin userid ggate@UAT,PASSWORD 123456
Successfully logged into database.
GGSCI (home-c07402bbc5) 5> add replicat rep1, exttrail C:\app\Bhanu\Goldengate/lt,checkpointtable ggate.checkpoint
ERROR: REPLICAT REP1 already exists.
GGSCI (home-c07402bbc5) 6> delete replicat rep1
Deleted REPLICAT REP1.
GGSCI (home-c07402bbc5) 7> add replicat rep1, exttrail C:\app\Bhanu\Goldengate/lt,checkpointtable ggate.checkpoint
REPLICAT added.
GGSCI (home-c07402bbc5) 8> edit params rep1
GGSCI (home-c07402bbc5) 9> start er *
Sending START request to MANAGER ...
EXTRACT EXT1 starting
Sending START request to MANAGER ...
REPLICAT REP1 starting
GGSCI (home-c07402bbc5) 10> info all
Program Status Group Lag Time Since Chkpt
MANAGER RUNNING
EXTRACT STOPPED EXT1 00:00:00 01:29:46
REPLICAT STOPPED REP1 00:00:00 00:00:48
3. Is it mandatory to have two GoldenGate instances running, one on each side?
Thanks for spending your time on this problem. -
Data Extraction from Multiple data sources into a single Infoprovider
Hi Experts,
Can anyone send me links or examples on how to extract data from multiple data sources into 1 Cube/DSO.
Can anyone send me example scenarios for extracting data from 2 data sources into a single Cube/DSO.
Thanks
Kumar
Hi Ashok,
Check the following link from SAP Help. This is probably what you are looking for.
[ Multiple data sources into single infoprovider|http://help.sap.com/saphelp_nw70/helpdata/en/e3/e60138fede083de10000009b38f8cf/frameset.htm]
Data from multiple data sources which are logically related and technically have the same key can be combined into a single record.
For example, if you have Sales order item and sales order item status .
These both have the same key - sales order and item and are logically related.
The Sales order item - provides information on Sales order item - material , price, quantity, plant etc.
The item status provides information on delivery status, billing status, rejection status, and so on.
These are two different data sources: 2LIS_11_VAITM and 2LIS_11_VASTI.
In case you have a few master data attributes coming from different systems, i.e., two data sources in different systems that together completely define your master data, then you could use a DSO or InfoObject to combine the records.
If you want to see aggregated data you use a cube.
I.e., say you want to analyze the customer revenue on a particular material and a particular project.
The project details would come from Project System and the sales details from the sales flow.
The data is then combined in the DSO with, say, customer and sales area as the key.
Then it is aggregated at cube level by not using the sales order in the cube model, so all sales orders of the same customer add up while loading the cube, giving direct customer spend values for your sales area at an aggregated level.
Hope this helps,
Best regards,
Sunmit. -
Archive Manager Fails to Extract from multiple achives
Just as the title says: up until a few nights ago my archive manager extracted multi-part RAR files perfectly, but now whenever I try to extract one, it only extracts from the RAR I select.
I've tried reinstalling and looking in Archive Manager's settings, to no avail.
Has anybody else ever had this happen?
Or any ideas on what I could try to fix this?
Thanks in advance
Make sure they're all named correctly (.rar, .r00, .r01, etc.) and not corrupt.
What might work is to extract each RAR individually and then use cat to join the extracted files (using a CLI app like unrar would be a lot faster, with a little bash magic).
Does this occur with only one archive set? Have you tried using something simple like unrar? Find out whether it's the app or the archive set that's bad.
Last edited by Ranguvar (2009-01-29 18:37:10) -
How many GoldenGate extracts should I create for 300 schemas?
I have a question. I have 300 schemas, and each schema has at least 120-130 tables, which are not too big. I have to replicate all the schemas. What would be the best way to do it? We also have to consider memory: we don't have much memory available on the source. How many extracts should I create? It is also possible to create only one extract and add all the schemas to it. What would be the best approach: should I create multiple extracts, say 10-20, and divide the schemas between them, or would a single extract be OK?
Can anyone suggest something?
Thanks in advance...
The limit of 300 processes (extracts and replicats) per manager was raised to 5000 in v11.2.1.
You could certainly have 10 extracts:
Extract 1
Schema1.*;
Schema2.*;
etc
Extract 2
Schema31.*;
Schema32.*;
etc
The main things to think of:
Are there transactions that span schemas? Those should be in the same extract.
Can the extract keep up with the volume from those 30 (or more) schemas?
If I have to do maintainence on one schema and have to bring down the extract... are the other 29 schemas ok with this?
Maybe the volume is such that you can put all 300 in one extract? -
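A sketch of one such Extract group splitting the schemas across a handful of processes; the group name, trail path, and schema names are illustrative assumptions:

```
EXTRACT extgrp1
USERID ggsuser, PASSWORD encrypted_password_cipher_text
EXTTRAIL ./dirdat/a1
-- Schemas whose transactions may span one another belong in the same group
TABLE schema01.*;
TABLE schema02.*;
-- ...remaining schemas assigned to this group listed the same way...
TABLE schema30.*;
```

Each additional group (extgrp2, extgrp3, ...) gets its own trail and its own slice of the 300 schemas, which also limits the blast radius when one group must be stopped for maintenance.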
Hi,
I need to extract data from my leagcy source system which is on oracle.
I need to extract 50 records from each table in my schema to a flat file with the same name as the table.
I have around 500 tables in my schema. Is there any way to do this?
Thanks,
Yeswanth
Possibly you also want to consider referential integrity; otherwise, when you import into your new solution, it's unlikely any of the data will match up.
If you do want to consider this, then it will be very difficult for someone who doesn't know your data structure to write the queries for you.
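For the raw 50-rows-per-table request itself, one common sketch is a script-generating query in SQL*Plus: a SELECT over user_tables emits a second script that spools each table to its own flat file. This is an untested outline under the assumption that SQL*Plus is the extraction client; adjust formatting options to your data widths:

```
set pagesize 0 linesize 200 feedback off verify off trimspool on
spool gen_extracts.sql
select 'spool ' || lower(table_name) || '.txt' || chr(10) ||
       'select * from "' || table_name || '" where rownum <= 50;' || chr(10) ||
       'spool off'
  from user_tables;
spool off
-- then run the generated script:
-- @gen_extracts.sql
```

Note that ROWNUM picks an arbitrary 50 rows per table, which is exactly why referential integrity across tables is not preserved by this approach.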
Consider using a data migration tool of some sort, perhaps? -
Logical Data Extract from SAP/Oracle
Hi,
We have been asked by our client to develop a data extraction tool for SAP/Oracle. For example, we would need to extract user/employee information.
What is the best way to go about this? Ideally we would like to allow the user to connect to the DB and select the data they want through a UI. The problem is that they will not know the schema, and there are obviously too many tables to navigate through.
Are there any legal concerns here?
Regards,
Justin
Justin Todd wrote:
Hi,
>
> We have been asked by our client to develop a data extraction tool for SAP/Oracle. For example, we would need to extract user/employee information.
>
> What is the best way to go about this? Ideally we would like to allow the user to connect to the DB and select the the data that we would like through a UI. The problem is that they will not know the schema and obviously too many tables to navigate through.
>
> Are there any legal concerns here?
>
> Regards,
> Justin
Hi Justin,
Use BAPI functions, instead of directly accessing tables on the database.
Best regards,
Orkun Gedik -
Golden Gate: Extract going down with below warnings
2011-01-19 15:57:11 INFO OGG-00975 Oracle GoldenGate Manager for Oracle, mgr.prm: EXTRACT EXT1 starting.
2011-01-19 15:57:11 INFO OGG-00992 Oracle GoldenGate Capture for Oracle, ext1.prm: EXTRACT EXT1 starting.
2011-01-19 15:57:11 WARNING OGG-01573 Oracle GoldenGate Capture for Oracle, ext1.prm: br_validate_bcp: failed in call to: ggcrc64valid.
2011-01-19 15:57:11 WARNING OGG-01573 Oracle GoldenGate Capture for Oracle, ext1.prm: br_validate_bcp: failed in call to: br_validate_bcp.
Please find below the result of it:
GGSCI (orkxd1dmodidb01.espdev.aurdev.national.com.au) 1> view report ext1
Oracle GoldenGate Capture for Oracle
Version 11.1.1.0.0 Build 078
Linux, x64, 64bit (optimized), Oracle 11 on Jul 28 2010 14:58:37
Copyright (C) 1995, 2010, Oracle and/or its affiliates. All rights reserved.
Starting at 2011-03-15 14:53:13
Operating System Version:
Linux
Version #1 SMP Tue Feb 10 15:14:47 EST 2009, Release 2.6.18-128.1.1.0.1.el5
Node: orkxd1dmodidb01.espdev.aurdev.national.com.au
Machine: x86_64
soft limit hard limit
Address Space Size : unlimited unlimited
Heap Size : unlimited unlimited
File Size : unlimited unlimited
CPU Time : unlimited unlimited
Process id: 16444
Description:
** Running with the following parameters **
EXTRACT ext1
SETENV (ORACLE_HOME = "/oracle/app/ora11gdb/database")
Set environment variable (ORACLE_HOME=/oracle/app/ora11gdb/database)
SETENV (ORACLE_SID = "DEVODI1")
Set environment variable (ORACLE_SID=DEVODI1)
SETENV (NLS_LANG = "AMERICAN_AMERICA.AL32UTF8")
Set environment variable (NLS_LANG=AMERICAN_AMERICA.AL32UTF8)
USERID ggs_owner, PASSWORD *********
EXTTRAIL /goldengate/dirdat/lt
GETENV (ORACLE_HOME)
ORACLE_HOME = /oracle/app/ora11gdb/database
GETENV (ORACLE_SID)
ORACLE_SID = DEVODI1
GETENV (NLS_LANG)
NLS_LANG = AMERICAN_AMERICA.AL32UTF8
TRANLOGOPTIONS ASMUSER sys@+ASM, ASMPASSWORD ******
DISCARDFILE /goldengate/dirdat/capt.dsc, PURGE
RMTHOST orkxd1dmodidb01, MGRPORT 7841
TABLE SRC_FC.ch_acct_cust_xref;
TABLE SRC_FC.ci_custmast;
TABLE SRC_FC.ci_custrel;
TABLE SRC_FC.cop_applicant_details;
TABLE SRC_FC.cop_cust_changed_information;
TABLE SRC_FC.cop_applicant_state;
TABLE SRC_FC.CS_HO_CUSTACCTXREF;
Bounded Recovery Parameter:
BRINTERVAL = 4HOURS
BRDIR = /goldengate
2011-03-15 14:53:13 WARNING OGG-01573 br_validate_bcp: failed in call to: ggcrc64valid.
2011-03-15 14:53:13 WARNING OGG-01573 br_validate_bcp: failed in call to: br_validate_bcp. -
Spooling Extracts from Multiple SQL statements in 1 File
Hi all,
I am trying to spool the extract results of 3 separate SQL statements into one single file. I wrote a SQL block similar to the one below. However, the results of the statements overwrite each other: 3 overwrote 2, which overwrote 1. Any suggestion how to combine the extracted results in one file?
spool c:\test.txt
<SQL statement 1>
<SQL statement 2>
<SQL statement 3>
/spool OFF
Thanks in advance
Jason
Please paste your SQL file here. There is no way one statement should overwrite another.
Eric -
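On the spool question above: if each statement was in fact run under its own spool command to the same file name (a common cause of this symptom), each SPOOL starts a fresh file, which makes the later results appear to overwrite the earlier ones. Two sketches; the APPEND option requires SQL*Plus 10g or later, and the statement placeholders are kept from the original post:

```
-- Option 1: one spool session around all three statements
spool c:\test.txt
<SQL statement 1>
<SQL statement 2>
<SQL statement 3>
spool off

-- Option 2: reopen the same file in append mode for each statement
spool c:\test.txt append
<SQL statement 1>
spool off
```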
Help for BI Content extraction from multiple datasources
Hello,
I am working on a new installation of BI, having just come out of the BW310 and BW350 classes, so please be patient with me. :~) I only have a development instance running, and for my first project I'm working on the Cross Application Time Sheet. I have the InfoCube set up for Time Sheet (0CATS_1), plus the other associated InfoCubes (0CATS_C01, 0CATS_C02, 0CATS_MC1), and I am getting data from the appropriate DataSources (0CA_TS, 0CA_TS_IS_1, 0CA_TS_IS_2).
The problem I am getting, is that the transformations from the datasource(s) into the DTP leaves out several fields which are found in the Employee datasource. How would I go about combining the information from the employee datasource into the Time Sheet infocubes along with the timesheet data?
Thanks.
(Points will be awarded, per usual)
SRM doesn't use an LO-style "cockpit" extraction mechanism like the one in ECC.
Overall picture:
http://help.sap.com/saphelp_srm40/helpdata/en/b3/cb7b401c976d1de10000000a1550b0/content.htm
If you set up data flows according to the Business Content of SRM:
http://help.sap.com/saphelp_nw2004s/helpdata/en/3a/7aeb3cad744026e10000000a11405a/frameset.htm
Then just perform an init load and schedule the deltas. -
Oracle GoldenGate: Extract process error.
As soon as I start the Extract process, I get the following error message pop-up:
The procedure entry point longjmp could not be located in the dynamic link library orauts.dll.
Can anyone help with this?
Thanks in advance.
Hi,
I am facing the same issue. Can you please share how you solved it?
Thanks in advance. -
GoldenGate extract not starting
The Extract process is not starting, giving the below error in the report file. What can we understand from the below error?
2013-05-09 04:03:25 ERROR OGG-00446 Checkpointing was not selected during initialization.
2013-05-09 04:03:25 ERROR OGG-01668 PROCESS ABENDING.
As I can't move your thread, please open a new thread in the GG forum: GoldenGate