Oracle Streams and CLOB column

Hi there,
We are using "Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit". My question is: does Oracle Streams capture, propagate (source capture method), and apply CLOB column changes?
If yes, is this the default behavior? Can we tell Streams to exclude the CLOB column from the whole (capture-propagate-apply) process?
Thanks in advance!

CLOBs are captured; this is the default behavior.
You can exclude columns via a rule-based transformation (dbms_streams_adm.delete_column).
http://download.oracle.com/docs/cd/E11882_01/server.112/e17069/strms_capture.htm#i1006263
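A minimal sketch of excluding a CLOB column with a declarative rule-based transformation (the rule, table, and column names below are hypothetical; DBMS_STREAMS_ADM.DELETE_COLUMN is available from 10.2 onward):

BEGIN
  DBMS_STREAMS_ADM.DELETE_COLUMN(
    rule_name   => 'strmadmin.docs_capture_rule',  -- hypothetical capture rule
    table_name  => 'app.documents',                -- hypothetical table
    column_name => 'doc_body',                     -- the CLOB column to exclude
    value_type  => '*');                           -- strip both old and new values
END;
/

Attaching the transformation to the capture rule keeps the CLOB out of the entire capture-propagate-apply path, rather than filtering it at each stage.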

Similar Messages

  • Setup between Oracle Streams and MQ

    Hi All,
    I'm trying to create the setup between Oracle Streams and Message Queue (MQ). I have already installed the MQ client on my machine and the messaging gateway is also working fine. Following the setup docs, I created a user with all the grants they list, created the queue table, queue, DML handler, and table rules, and started the queue. After that we configured the capture and apply processes for that user; in other words, the entire setup. But our data is not propagating messages to MQ.
    So if anyone has specific setup code, please provide it, or give any suggestions you may have.
    Thanks & regards,
    Sanjeev

    Hi Sanjeev,
    There is a special forum dedicated to Oracle Streams: Streams
    You may want to try there.
    Extra tip: post the code you used. That way it is easier for people to see what you might have done wrong.
    Regards,
    Rob.
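    For reference, a minimal sketch of creating the Streams queue that the capture and apply processes rely on (all names below are hypothetical); if any piece of your setup differs from this pattern, that is a good place to start comparing:

    BEGIN
      DBMS_STREAMS_ADM.SET_UP_QUEUE(
        queue_table => 'strmadmin.streams_queue_table',  -- hypothetical names
        queue_name  => 'strmadmin.streams_queue',
        queue_user  => 'strmadmin');                     -- user granted enqueue/dequeue
    END;
    /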

  • Difference between Oracle Streams and Oracle Replication

    Hi all,
    Can anyone tell me the difference between Oracle Streams and Oracle Replication?
    Regards

    Refer to the link: Difference Between Oracle Replication & Oracle Streams.
    Oracle Replication is designed to replicate exact copies of data sets to various databases. Oracle Streams is designed to propagate individual data changes to various databases. Thus, Replication is probably easier if the end goal is to maintain identical copies of data, whereas Streams is easier if the end goal is to allow different databases to react differently to data changes.
    Oracle Replication is a significantly more mature product-- it is quite usable with older databases. Oracle Streams is a much newer technology and is only usable among 9i databases. Most competent Oracle developers and DBAs are familiar with Oracle Replication, while many fewer have any real experience with Streams.
    The Streams architecture strikes me as a lot more flexible than Oracle Replication's. This leads me to suspect that Oracle will be pushing Streams over Replication in subsequent releases, so I would expect new features in Streams, like DDL changes, that aren't in Oracle Replication. Realistically, though, I don't expect any serious movement away from Replication for at least a few releases, so I wouldn't tend to be overly concerned on this front.
    answered by Justin
    Distributed Database Consulting, Inc.

  • Oracle Streams and Dataguard

    Does anyone know how to configure Oracle Streams and Data Guard to work together? I've set up an environment which successfully captures and applies records in a Streams environment using
    log_archive_config=SEND, RECEIVE, NODG_CONFIG
    As soon as I try to introduce the Streams setup into a database which already has Data Guard set up, it becomes convoluted: setting DG_CONFIG requires you to then set db_unique_name, and it seems that Streams log mining will not work without being fully incorporated into the Data Guard configuration.
    Has anyone set up Streams in a Data Guard environment?

    Let's see what's missing from your request:
    1. Oracle version number?
    2. Physical or logical Data Guard?
    3. Which Data Guard mode? Synch or Asynch?
    4. Which Data Guard protection level?
    5. ARCH or LGWR?
    6. Which streams mode? Synch or Asynch?
    7. Hotlog or Autolog?
    8. Streams on the production server or the standby server?
    I'm all out of guesses tonight. Please provide enough information for someone to help you.
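    For reference, when Streams must coexist with an existing Data Guard setup, the usual direction is to move from NODG_CONFIG to an explicit DG_CONFIG that names every database; a minimal sketch (the unique names below are placeholders):

    -- on each database; DB_UNIQUE_NAME is static, so SCOPE=SPFILE plus a restart
    ALTER SYSTEM SET DB_UNIQUE_NAME='prim' SCOPE=SPFILE;
    -- list every DB_UNIQUE_NAME participating in redo transport
    ALTER SYSTEM SET LOG_ARCHIVE_CONFIG='DG_CONFIG=(prim,stby)';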

  • Oracle Streams and Database Encryption

    I am looking for an encryption method for an OLTP database.
    Oracle Streams will be used to synchronize the data between the source and remote databases.
    The business process wants all remote databases to be fully encrypted.
    What can I use?
    - TDE is not supported with Oracle Streams,
    - I can't use DBMS_CRYPTO since it needs big database modifications.
    Are there any other Oracle-supported techniques, or do I need to consider hardware encryption (like hard disk encryption) or OS-level encryption?
    Thanks,

    TDE is not supported with LogMiner-based technologies such as Streams and Data Guard (logical standby). Therefore you cannot use TDE to replicate encrypted content; you will get the exception "Unsupported data type".
    But you can replicate decrypted data into an encrypted destination. This means your destination can have TDE-encrypted columns.
    Unofficially, Oracle will support TDE with LogMiner-based technologies in the next database release, 11g.
    I am waiting for this.
    Regards,
    Mike
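    For the destination-side approach Mike describes, a minimal sketch of TDE column encryption (table and column names are hypothetical, and an encryption wallet must already be created and open at the destination):

    -- encrypt a sensitive column at the destination only;
    -- Streams applies cleartext changes and the column is encrypted at rest
    ALTER TABLE customers MODIFY (ssn ENCRYPT USING 'AES192');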

  • Oracle Streams and Oracle Apps 11i

    Hi,
    I am looking for an Oracle solution to build a reporting instance off an E-Business Suite 11i instance and offload Discoverer reporting to the reporting instance. I have tried Data Guard logical standby, but there were too many issues and I cannot use it. I am wondering if anyone has tried using Oracle Streams to build a reporting instance from an Oracle Apps instance?
    Thanks

    Hi,
    Is Streams supported in an E-Business Suite database? We once had a scenario where some parameters for Streams conflicted with parameters required for E-Business Suite performance improvement. I could not find any documents from Oracle verifying whether Streams on an E-Business Suite database is a supported configuration or not.
    So if you have any such document, could you send the link to me?
    Also please provide the link to your white paper/presentation.
    Regards,
    Sujoy

  • Help required in handling of Oracle BLOB and CLOB in OC4J

    Hi,
    I am in the process of porting my application from WebLogic 6 to OC4J.
    I have a problem handling CLOBs and BLOBs in Oracle. WebLogic has classes defined to do this (weblogic.jdbc.common.OracleClob).
    Does anyone know how I can achieve the same?
    Thanks in advance.
    Regards,
    Moin

    We have an application running on OC4J accessing an Oracle 8.1.7 db, and have no problems using BLOBs and CLOBs. It is explained quite clearly in the Oracle documentation.
    If you require further assistance let me know :-)
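    Independent of the client library, one portable pattern is to insert an empty LOB locator and write to it server-side; a minimal PL/SQL sketch (table and column names are hypothetical):

    DECLARE
      v_clob CLOB;
    BEGIN
      INSERT INTO documents (id, body)
      VALUES (1, EMPTY_CLOB())
      RETURNING body INTO v_clob;                -- grab the locator just inserted
      DBMS_LOB.WRITEAPPEND(v_clob, 5, 'hello');  -- append 5 characters
      COMMIT;
    END;
    /

    From JDBC the same idea applies: select the locator (FOR UPDATE) and stream into it, rather than binding the whole value.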

  • Oracle XMLType and clobs

    Hi,
    I've been struggling for days trying to figure out how to populate an XMLType field in Oracle 9i. I'm using the Oracle thin driver, WL 7.0, the WebLogic Oracle extensions, and a WL connection pool, but to no avail.
    Has anyone done this? Is it possible? I keep getting an invalid LOB when I do a select on the XMLType field, cast it to an OracleThinClob type, and attempt to write to it.
    Any help is much appreciated.
    Thx.
    Rachel

    We do not support the XMLType Oracle extension in 7.0. That feature is available from the next major release (8.0?).
    Mitesh
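    One driver-independent workaround is to send the document as a plain string bind and let the database construct the XMLType server-side; a minimal sketch (table name is hypothetical):

    -- the XMLType() constructor converts a VARCHAR2/CLOB bind into the column type
    INSERT INTO xml_docs (doc)
    VALUES (XMLType('<note><to>Rachel</to></note>'));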

  • Database copy and CLOB columns

    Hi,
    I have tried to copy a database to another. It works fine.
    But if a table has a column with the CLOB datatype, the generated SQL is wrong.
    Example of generated SQL :
    insert into my_table(id, clob_col) values (1, (CLOB));
    Any workaround ?
    Stephan
    Oracle 10g

    You can not "copy a database" with SQL Developer. Are you confusing Oracle with a Microsoft product?
    A database is Oracle is not a schema and a database is not a table ... a database is a collection of physical files ... datafiles, tempfiles, control files, log files.
    When you define, in Oracle terms, what you are actually trying to do someone will likely help you.
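    That said, for the CLOB problem itself: LOBs cannot be scripted as SQL literals, so one common workaround is to pull them over a database link instead of through generated INSERT statements; a minimal sketch (link and table names are hypothetical):

    -- CLOB literals cannot be generated, but INSERT ... SELECT
    -- over a database link carries LOB data across
    INSERT INTO my_table (id, clob_col)
      SELECT id, clob_col FROM my_table@source_db;
    COMMIT;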

  • Oracle Streams and Heterogeneous Environment

    Hi,
    Since we can use Streams in a heterogeneous environment, let's say we implement Streams for replication between MS SQL Server and Oracle, and data has to be replicated from Oracle to SQL Server; what sort of configuration do we need on the SQL Server side?
    And if we want to replicate both data and user messages, what should we do?
    Regards,
    Abbasi

    Viacheslav Ostapenko wrote:
    Sorry, Aman, I couldn't find any info about replication to MS SQL. Is it possible at all? Could you provide a link where we can read about this? It could be very interesting.

    Sorry Viacheslav, even I couldn't find anything on this. I am not sure whether it can be done or not; I haven't heard of anyone in my contacts doing so. The only place where I have seen Streams being used around me is within Oracle databases only. Maybe someone else can help if he or she has done it.
    Aman....
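    For what it's worth, the documented route for applying Streams changes to a non-Oracle database is an apply process that works through a heterogeneous gateway database link; a minimal sketch (the link name is hypothetical and assumes a configured Transparent Gateway for SQL Server):

    BEGIN
      DBMS_APPLY_ADM.CREATE_APPLY(
        queue_name          => 'strmadmin.streams_queue',
        apply_name          => 'apply_to_mssql',
        apply_database_link => 'mssql.example.com',  -- gateway DB link to SQL Server
        apply_captured      => TRUE);                -- apply captured LCRs
    END;
    /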

  • Oracle Streams and ASM

    Does Oracle 11g Streams run together with ASM?

    They are independent products, working at different layers for different purposes.
    You can have Streams with ASM.
    You can have Streams without ASM.
    You can have ASM with Streams.
    You can have ASM without Streams.
    Hemant K Chitale

  • Oracle Streams and standby database

    I want to implement Streams (table level) in the following scenario.
    What must I set up for the standby so that, when I switch over to the standby database, Streams keeps working exactly as it did when the primary was active?
    DB1--------------------stream---------------------DB2
    DB1standby .......................................DB2standby
    Narges.

    Your diagram above does not tell us whether this is Streams master/master. It is also important to know whether you are doing full schema replication or only some tables.
    For the db_name, think global name: how do you set up a propagation when the source DB and the target DB have the same TNS entry, whose name is driven by the rule dblink name = global_name = TNS entry?
    Here it is a slightly different case: Data Guard will succeed the source and appear as the same DB to the remote site even though they are on different hostnames. The only way is failover in the TNS entry, where you set multiple hosts for the same service.
    The rest should be OK, except for the data lost from the source, which requires a re-sync.
    On the Streams side, problems remain and I am still skeptical about the feasibility: on paper there are no problems, but problems come fast from the data lost during the switch to Data Guard (and this is to be expected with async Data Guard; the last SCNs will be missing on the standby but not necessarily on the remote target site).
    Then both sites will start suffering from the loss of this data, most probably with ORA-01403 "no data found" errors or pending Streams transactions, and there is no way to predict the extent of this in advance. The re-sync of this state is crucial and I am working on it; I will come back to the community with a generic procedure for master-master re-sync. One of the main problems is identifying the remote table correspondents: these are easy to determine for declarative transformations, but for transformations done within an apply handler or a transformation function associated with a context, I have not found anything better than going back to the DBA and asking, pitifully, which the remote target tables are. This correspondence is precious data that does not appear in any data dictionary view; it exists only in the text of a PL/SQL block.
    As for my comments on the other threads: when the former source comes back, I had not considered the 11g possibility of re-syncing the former master with the Data Guard standby being the new master, and it sounds like I should have.
    But this is all new for me too and I need to test it. If it works, then you don't have the problem of setting up a downstream capture between two DBs having the same db_name.
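    The failover TNS entry mentioned above might look like this minimal sketch (host and service names are placeholders):

    # hypothetical entry: client-side failover across primary and standby hosts
    DB1 =
      (DESCRIPTION =
        (ADDRESS_LIST =
          (ADDRESS = (PROTOCOL = TCP)(HOST = primary-host)(PORT = 1521))
          (ADDRESS = (PROTOCOL = TCP)(HOST = standby-host)(PORT = 1521))
          (FAILOVER = ON))
        (CONNECT_DATA = (SERVICE_NAME = db1svc)))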

  • Using OWB mappings with Oracle CDC/Streams and LCRs

    Hi,
    Has anyone worked with Oracle Streams and OWB? We're looking to leverage Streams to update our data warehouse, applying changes from the transactional/source DB. At some point we seem to remember hearing that OWB could leverage Streams, perhaps even using the Logical Change Records (LCRs) from Streams as input to mappings?
    Any thoughts much appreciated.
    Thanks,
    Jim Carter

    Hi Jim,
    We've built a fairly complex solution based on streams. We wanted to break up the various components into separate entities so that any network failure or individual component failure wouldn't cause issues for the other components. So, here goes:
    1) The OLTP source database is streaming LCRs to our data warehouse, where we keep an operational copy of production, updated daily from those streams. This allows for various operational reports to be run/rerun in a given day with the end-of-yesterday picture without impacting the performance on the source system.
    2) Our apply process on the datamart side actually updates TWO copies of data. It does a default apply to our operational copy of production, and each of those tables have triggers that put a second copy of the data into daily partitioned tables. So, yesterday's partitions has only the data that was actually changed yesterday. After the default apply, we walk the Oracle dependency tree to fill in all of the supporting information so that yesterday's partition includes all the data needed to run our ETL queries for that day.
    Example: Suppose yesterday an address for a customer was updated. Streams only knows about the change to the address record, so the automated process would only put that address record into the daily partition. The dependency walk fills in the associated customer, date of birth, etc. data into that partition so that the partition holds all of the related data to that address record for updates, without having to query against the complete tables. By the same token, a change to some other customer info will backfill the address record for this customer too.
    Now, our ETL queries run against views created against these partitioned tables so that they are only looking at the data for that day (the view s_address joins from our control tables to the partitioned address table so that we are only seeing one day's address records). This means that the ETL is running against the minimal subset of data required to update dimensions and create facts. It also means that, for example, if there is a problem with the ETL we can suspend running ETL while we fix it, and the streaming process will just go on filling partitions until we are ready to re-launch ETL and catch up - one day at a time. We also back up the data mart after each load so that, if we discover an error in ETL logic and need to rebuild, we can restore the datamart to a given day and then reprocess the daily partitions in order very simply.
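    A rough sketch of the kind of day-scoped view described above (all table, column, and control names are hypothetical):

    -- expose only the rows belonging to the day currently being processed
    CREATE OR REPLACE VIEW s_address AS
      SELECT a.*
      FROM   address_daily a
      JOIN   etl_control  c
        ON   a.load_day = c.current_load_day;   -- control table drives the day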
    We have added control fields in those partitioned tables that show which record was inserted/updated/deleted in production, and which was added by the dependency walk so, if necessary, our ETL can determine which data elements were the ones that changed. As we do daily updates to the data mart as our finest grain, this process may update a given record in a given partition multiple times, so that the status of that record at the end of the day in that daily partition shows the final version for the day. So, for example, if you add an address record and then update it on the same day, the partition for that day will show the final updated version of the record, and the control field will show it to be a new inserted record for the day.
    This satisfies our business requirements. Yours may be different.
    We have a set of control tables which manage what partition is being loaded from streams, and which have been loaded via ETL to the datamart. The only limitation is that, of course, the ETL load can only go as far as the last partition completely loaded and closed from streams. And we manage the sizing of this staging system by pruning partitions.
    Now, this process IS complex, and requires a fair chunk of storage, but it provides us with the local daily static copy of the OLTP system for running operational reports against without impacting production, and a guaranteed minimal subset of the OLTP system for speedy ETL runs.
    As for referencing LCRs themselves, we did not go that route due to the dependency issues (one single LCR will almost never include all of the dependent data from which to update a dimension record or build a fact record, so we would have had to constantly link each one with the full data set to get all of that other info).
    Anyway - just thought our approach might give you some ideas as you work out your own approach.
    Cheers,
    Mike

  • Oracle Streams Update conflict handler not working

    Hello,
    I've been working on Oracle Streams, and this time we have to come up with an update conflict handler.
    We are using Oracle 11g in a Solaris 10 environment.
    So far, we have implemented bi-directional Oracle Streams replication and it is working fine.
    Now, when I try to implement the update conflict handler, it executes successfully but does not provide the desired functionality.
    Here are the steps I performed:
    Step 1:
    create table jas23.test73 (first_name varchar2(20), last_name varchar2(20), salary number(7));
    ALTER TABLE jas23.test73 ADD (time TIMESTAMP WITH TIME ZONE);
    insert into jas23.test73 values ('gugg','qwer',2000,SYSTIMESTAMP);
    insert into jas23.test73 values ('papa','sdds',2050,SYSTIMESTAMP);
    insert into jas23.test73 values ('jaja','xzxc',2075,SYSTIMESTAMP);
    insert into jas23.test73 values ('kaka','cvdxx',2095,SYSTIMESTAMP);
    insert into jas23.test73 values ('mama','rfgy',1900,SYSTIMESTAMP);
    insert into jas23.test73 values ('tata','jaja',1950,SYSTIMESTAMP);
    commit;
    Step 2:
    conn to strmadmin/strmadmin to server1:
    SQL> ALTER TABLE jas23.test73 ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
    Step 3:
    SQL>
    DECLARE
      cols DBMS_UTILITY.NAME_ARRAY;
    BEGIN
      cols(1) := 'first_name';
      cols(2) := 'last_name';
      cols(3) := 'salary';
      cols(4) := 'time';
      DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
        object_name       => 'jas23.test73',
        method_name       => 'MAXIMUM',
        resolution_column => 'time',
        column_list       => cols);
    END;
    /
    Step 4:
    conn to strmadmin/strmadmin to server2
    SQL>
    DECLARE
      cols DBMS_UTILITY.NAME_ARRAY;
    BEGIN
      cols(1) := 'first_name';
      cols(2) := 'last_name';
      cols(3) := 'salary';
      cols(4) := 'time';
      DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
        object_name       => 'jas23.test73',
        method_name       => 'MAXIMUM',
        resolution_column => 'time',
        column_list       => cols);
    END;
    /
    Step 5:
    And now, if I try to update the value of salary, it is not handled by the update conflict handler.
    update jas23.test73 set salary = 1500,time=SYSTIMESTAMP where first_name='papa'; --server1
    update jas23.test73 set salary = 2500,time=SYSTIMESTAMP where first_name='papa'; --server2
    commit; --server1
    commit; --server2
    Note: the two servers are in different time zones (I hope that won't be a problem).
    Now, after performing all these steps, the data is not the same at both sites.
    Error (DBA_APPLY_ERROR):
    ORA-26787: The row with key ("FIRST_NAME", "LAST_NAME", "SALARY", "TIME") = (papa, sdds, 2000, 23-DEC-10 05.46.18.994233000 PM +00:00) does not exist in table JAS23.TEST73
    ORA-01403: no data found
    Please help.
    Thanks.
    Edited by: gags on Dec 23, 2010 12:30 PM

    Hi,
    When I tried to do the same on Server 2:
    SQL> ALTER TABLE jas23.test73 ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
    it throws an error:
    ERROR at line 1:
    ORA-32588: supplemental logging attribute all column exists
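    The ORA-32588 error just means that ALL-column supplemental logging is already enabled on Server 2, which is harmless. To check that the MAXIMUM handler is actually registered on both sites, a dictionary query like this minimal sketch can help:

    -- confirm the conflict handler registration for the table on each site
    SELECT object_owner, object_name, method_name, resolution_column
    FROM   dba_apply_conflict_columns
    WHERE  object_owner = 'JAS23' AND object_name = 'TEST73';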

  • Error when trying to use (GUI Wizards) on Table with BLOB, CLOB columns.

    I enjoyed watching the demo of the ODP.NET/VS 2005 environment, where you can just drag in the new command builders, data sets, etc., but when I try this with a table that contains a BLOB and a CLOB column, it returns an error and basically cannot build the SQL needed to process these field types.
    I would assume it could do this natively, since these are somewhat primitive types. I also tried to point it at a stored procedure that returned a BLOB, and it caused an error when building the SQL.
    So it appears it is not possible to use the GUI data access wizards for these column types. I can use them in code with ODP.NET with no problems, but it would be nice to be able to utilize the GUI command builders, etc.

    BLOB and CLOB are not supported, I guess. Oracle is a bit sluggish when it comes to VS and MS products!
