Capturing data every 2 mins - Streams

I am using Oracle 11gR2 on RHEL.
We are planning to configure Streams from an 11gR2 OLTP environment to the 11gR2 DW. This would be schema-level replication.
Once the Streams destination schema is populated with the data, we need to schedule the ETL processes to run every 2 minutes, as we are dealing with real-time reporting. The data flow would be around 5,000 records per minute.
So my question is: if I need to look for the most recent data, should I be using a TIMESTAMP column in the source tables, or does Streams have any built-in columns I can query to get the most recent data (the last 2 minutes of data)?
I went through the Oracle documentation, which specifies that these attributes can be added using DBMS_CAPTURE_ADM.INCLUDE_EXTRA_ATTRIBUTE:
row_id (row LCRs only)
serial#
session#
thread#
tx_name
username
But none of these is related to a timestamp, so does that mean the source tables must have a timestamp column if I need to query the most recent 2 minutes' worth of data from the destination schema?
I know CDC can help here, as I can create subscribers and extend the window to retrieve the latest data, but I am not sure how this can be accomplished with Streams. Any thoughts?
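For concreteness, here is a rough sketch of the two approaches I am weighing (the capture process, schema, table and column names are made up for illustration):

```sql
-- Extra LCR attributes are added per capture process; note that none of
-- them carries a change timestamp.
BEGIN
  DBMS_CAPTURE_ADM.INCLUDE_EXTRA_ATTRIBUTE(
    capture_name   => 'oltp_capture',    -- hypothetical capture process
    attribute_name => 'username',
    include        => TRUE);
END;
/

-- With a maintained timestamp column on the source tables, the 2-minute
-- ETL would just be an ordinary range query on the destination schema:
SELECT *
  FROM streams_dest.orders              -- hypothetical destination table
 WHERE last_update_ts >= SYSTIMESTAMP - INTERVAL '2' MINUTE;
```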

Sorry, I don't think I was clear. I am talking about the destination tables here.
I don't have control over the source tables in the OLTP database; they already exist and will hold data for a few years.
The DW is what I have control over, and the tables I was talking about are the Streams destination tables in the DW database.
So my concern is this: the presentation layer will hold all the reporting data, and if the Streams destination tables continue to grow, my database size will double due to the duplicate data in the reporting schema and the Streams schema.
The reason I can't purge the data in the Streams tables (in the DW database) is that the rows are not frozen; even after a year some rows might get updated, so if I purge them, I suppose Streams will complain and throw errors.
In the case of CDC, since the DML activity is applied as inserts, updates and deletes at the destination tables, even if the change table at the destination is truncated, I would still receive updates. For example:
In CDC, if there are 100 rows in the source table which have been replicated to the destination, then I would have 100 rows in the destination table with operation I (I stands for Insert).
If I truncate the destination table, that obviously results in 0 rows in the destination table.
Now if all 100 rows in the source table are updated, I would receive 200 rows in the destination table: 100 as UO (update old value) and 100 as UN (update new value). So what I mean is that CDC won't complain if I truncate the destination tables as part of a purge operation for maintenance. I can therefore control the size of the table, and keep it simple by purging the change table (destination table) every day.
In the case of Streams, will I be able to do similar purge operations (on the destination side) even on rows that would be updated later?
Let me know if I am clear.
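To make the CDC side of the comparison concrete, the daily purge I have in mind would be something like the following (the publisher schema and change table names are made up). As far as I understand there is no Streams equivalent: the apply process performs plain DML, so a later update to a purged row would simply fail to apply.

```sql
-- Remove rows that all subscribers have already consumed from one
-- change table; scheduling this daily keeps the table small.
BEGIN
  DBMS_CDC_PUBLISH.PURGE_CHANGE_TABLE(
    owner             => 'CDCPUB',      -- hypothetical publisher schema
    change_table_name => 'ORDERS_CT');  -- hypothetical change table
END;
/
```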

Similar Messages

  • Best way to capture data every 5 ms (milli-seconds) in the .vi diagram when using "Time between Points, and Small Loop Delay tools" ?

    - Using LabView version 6.1, is there anyway to change the "Time Between Points" indicator of (HH.MM.SS) to only (mm.ss), or to perhaps only (.ss) ?
    - Need to set the data sampling rate to capture every 5 milliseconds, but the default is always 20 or greater, even when the "Small Loop Delay" variable is adjusted down.
    Thank you in advance.

    I have no idea what "Time between Points, and Small Loop Delay tools" is. If this is some code you downloaded, you should provide a link to it. And if you want to acquire analog data every 5 milliseconds from a DAQ board, that is possible with just about every DAQ board and is not related to the version of LabVIEW. You simply have to set the sample rate of the DAQ board to 200 samples/sec. If it's digital data, then there will be a problem getting consistent 5 ms data.

  • Touch screen capturing data frm hardware device

    Hi,
    What LabVIEW Real-Time module is required to efficiently use a touch screen that captures data every minute from a hardware device?
    Thanks in advance.

    I think you do not need a real-time system for data capture every minute. A touch panel running Windows CE and LabVIEW should do the trick, and at the same time be stable enough. But if you need something rugged and stable, perhaps a CompactRIO is the right thing: http://www.ni.com/compactrio/ But a RIO device does not have a display.

  • Capturing data of the previous time interval with Oracle Streams (HotLog)

    I read in the Oracle 10g manual that Oracle Streams (CDC) can capture data within a specified time interval using the begin_date and end_date options.
    For example:
    BEGIN
      DBMS_CDC_PUBLISH.CREATE_CHANGE_SET(
        change_set_name    => 'set_cns',
        description        => 'set_cns...',
        change_source_name => 'HOTLOG_SOURCE',
        stop_on_ddl        => 'y',
        begin_date         => sysdate,
        end_date           => sysdate + 1);
    END;
    However, if I set begin_date to a time in the past, Oracle doesn't capture data any more (HotLog method).
    (I set begin_date => sysdate - 1/24
    and end_date => sysdate + 1/24.)
    Does anybody know how to capture a previous time interval with Oracle Streams?

    Change C2 to:
    cursor c2(passing_date IN date) IS
      SELECT MONITOR_ID, SAMPLE_ID,
             COLL_TIME, DEW_POINT
        FROM ARCHIVE_DATA
       WHERE COLL_TIME < passing_date
       ORDER BY COLL_TIME desc;
    And rather than populating a table with the three records, you could just select the three records using: where COLL_TIME between Prev3_time and Prev1_time
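    For reference, the subscription-window cycle alluded to in the question looks roughly like this (the subscription and subscriber-view names are illustrative):

    ```sql
    -- Advance the window to expose newly captured changes, read them
    -- from the subscriber view, then purge the consumed range.
    BEGIN
      DBMS_CDC_SUBSCRIBE.EXTEND_WINDOW(subscription_name => 'my_sub');
    END;
    /

    SELECT * FROM my_sub_view;  -- hypothetical subscriber view

    BEGIN
      DBMS_CDC_SUBSCRIBE.PURGE_WINDOW(subscription_name => 'my_sub');
    END;
    /
    ```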

  • Using streams to capture data from physical standby database.

    Does anybody know if it is possible to use Streams to capture data from a physical standby database instead of the PROD database? The standby database is in read-only mode. We use Oracle 11gR2.
    Thanks in advance.

    A physical standby is closed: how would it manage the capture queues and spilled (overflow) queues when the target is not present? Also, the data dictionary must reflect the primary, but if you run capture on the standby you introduce rules that are not on the primary. How would that work?

  • How to mix audio data from multiple streams without increasing in size?

    For example, two clients use JMF to capture audio in linear format (the content type is raw) and both send the captured data to a third client, which performs mixing to generate one data stream. Suppose each source's data size is S; the mixed stream (using JMF "merge") has size S+S, which is not what I want. Is there any function to generate a mixed stream of size S? I don't need the capability to reverse the mixed stream back into the two individual source streams.
    Thanks.

    S+S, ahh an academic.
    This is actually a forest-and-trees issue.
    What you're doing is appending the streams, which results in a size of 2S. What you want is something of size S. To do this you need a normalized merge. I say normalized merge because that's going to give you the best result while still avoiding out-of-range values.
    This is easy enough to do by hand if both streams S1 and S2 are at the same frequency. If they are not the same frequency, the result will be the size of the larger file when converted to whatever uniform frequency you choose.

  • Imported iMovie HD project clips have incorrect capture date - add-on info

    People converting their .dv files from the iMovie HD style into iMovie 09 style report some frustration at having to rename their clips to view them properly. For discussion, see:
    http://discussions.apple.com/thread.jspa?threadID=1344628
    Fortunately, within the thread, a kind soul has posted a solution - many thanks. This posting is an add-on to that thread, posted here because that thread is archived.
    Unfortunately, I could not understand the instructions fully, so I spent some time working out how to make it go, since the solution requires something beyond the usual drag-and-drop to work.
    *To Convert the Dates and Names of DV Files from iMovie HD to the Event-Date Format of iMovie 09*
    1. Download and unzip the DVRecordingDate file. It produces a folder with four items: COPYING.txt, DVRecordingDate.class, README.txt and DVRecordingDate.java. None of these should be double-clicked.
    2. Identify the folder that has your iMovie 6 .dv files. These files will be entirely renamed, so if they are used in any other projects, work only on a copy of them.
    3. Launch the program Terminal - it's in the Utilities folder of the Applications folder
    4. When you see a prompt ending in a $, type +cd /Users/yourname/Downloads/DVrecordingDate-1-1.0+,
    where /Users/yourname/Downloads/DVrecordingDate-1-1.0 is the folder containing the downloaded DVRecordingDate file. This will change the prompt that appears on every line, but it will still end with a $.
    (HINT: you can get the path name of this folder by selecting it in the Finder, and choosing copy. All you then need do is type cd in Terminal, and paste the rest in.)
    Now is when we rename the files - as stated elsewhere, the Finder creation date is NOT changed - it is only the file name that changes. However, this is enough for iMovie 09 to understand when the file was recorded, and where it should be stored within the events system.
    5. At the $ prompt, type +java DVRecordingDate -rename /Users/yourname/Movies/iMovie\ Events.localized/myholiday/Clip\ *.dv+, where +/Users/yourname/Movies/iMovie\ myholiday/+ is the name of the folder that has your iMovie 6 .dv files. Again, this can be copied from the Finder and pasted in after typing java DVRecordingDate -rename. The *.dv part is to ensure the program converts all the files in the folder.
    A list appears:
    Renaming file(s) from, to:
    /Users/yourname/Movies/myholiday/Clip 01.dv clip-2003-08-31 20;37;41.dv
    /Users/yourname/Movies/myholiday/Clip 02.dv clip-2003-09-04 18;43;26.dv
    /Users/yourname/Movies/myholiday/Clip 03.dv clip-2003-09-04 19;00;50.dv
    etc
    and in the Finder all the files are renamed.
    And that's it. You can now import the .dv files into iMovie 09 and it will give the files their proper dates within the system.
    Hope this saves someone some time...

    No, I mean that when I import old iMovie projects into iMovie 08, the 'capture date' of the footage is not correctly imported (the .dv file creation date is read in instead). Sorry to confuse the issue with the comment about the camera: I just meant that all the equipment in my setup is fairly standard, yet there is a glaring bug in this fundamental functionality.
    Regarding re-import: it took me ages to originally import my DV tapes to the computer, and it's not really practical to go through that again.
    FWIW, I've not yet imported anything from dv directly into 08

  • How to save captured data to a log file to prevent lag?

    Here is my VI.
    I read in a string of this format (there are only 32 channels now, but I would like to increase that to more than 10000 later):
    Time\sStamp\tChannel:00\tChannel:01\tChannel:02\tChannel:03\tChannel:04\tChannel:05\tChannel:06\tChannel:07\tChannel:10\tChannel:11\tChannel:12\tChannel:13\tChannel:14\tChannel:15\tChannel:16\tChannel:17\tChannel:20\tChannel:21\tChannel:22\tChannel:23\tChannel:24\tChannel:25\tChannel:26\tChannel:27\tChannel:30\tChannel:31\tChannel:32\tChannel:33\tChannel:34\tChannel:35\tChannel:36\tChannel:37\tIP\sAddress\t\n
    The problem I am having is that the data comes in through UDP. The program starts off capturing all the data normally, but after data has been captured and saved to the .log file for an hour or more, some of the data starts to be missed.
    I guess this is because the .log file is becoming larger, and at a rate of 10 Hz (for now) the file open and close can't keep up, so some data is missed?
    Could anyone advise on or amend the VI so that I have a better way to save the captured data? A buffer, or whatever? Hope someone could help =]
    Thanks.

    milkbottlec wrote:
    Just found some errors when the data are saved.
    At some point, the data just copies this whole header and saves it in the middle of the Excel file:
    ""Time\sStamp\tChannel:00\tChannel:01\tChannel:02\tChannel:03\tChannel:04\tChannel:05\tChannel:06\tChannel:07\tChannel:10\tChannel:11\tChannel:12\tChannel:13\tChannel:14\tChannel:15\tChannel:16\tChannel:17\tChannel:20\tChannel:21\tChannel:22\tChannel:23\tChannel:24\tChannel:25\tChannel:26\tChannel:27\tChannel:30\tChannel:31\tChannel:32\tChannel:33\tChannel:34\tChannel:35\tChannel:36\tChannel:37\tIP\sAddress\t\n""
    I don't know which part of the error handling is wrong.
    If possible, could you help me delete the whole auto-naming function (move it aside) and just add a prompt for the saving directory?
    Thanks!
    Here is the file, and I added the Excel file in.
    Hi milkbottlec,
    Regarding the headings in the middle of your file, I imagine the filename changed temporarily. When the name changed back to an existing name, the headings were added (anticipating a new file). I'm more curious about the "lost" data problem. Do you still see holes in the data? If so, were any bad filenames recorded?
    Sure! Why not present a file dialog instead of automatically generating filenames? Looks like you've mastered the "True/False" case, so why not build on what you know? Experiment with the File Dialog express VI on the File > Advanced File Functions palette. Personally, I'm a fan of auto-generated filenames, though, not necessarily built from the data.
    Cheers!

  • Capturing Data from forms before it is stored in the table

    Hi... I would like to capture data from a form before the data is stored in the database. That is, I would like to access whatever data is entered into a field right after the user pushes the Insert button. I would like to do some processing on the data and then store it in the table along with the data from the other fields. Is it possible to access it through a bind variable or something? Please tell me how to go about it. Thanks.

    Hi,
    You can make use of the session variables to access the values. Every field in the form has a corresponding session variable with the name "A_<FIELD_NAME>". For example, for deptno the session variable will be "A_DEPTNO".
    Here is a sample.
    declare
      flightno number;
      ticketno varchar2(30);
      tdate    date;
      persons  number;
      blk      varchar2(10) := 'DEFAULT';
    begin
      flightno := p_session.get_value_as_varchar2(
        p_block_name     => blk,
        p_attribute_name => 'A_FLIGHT_NO');
      ticketno := p_session.get_value_as_varchar2(
        p_block_name     => blk,
        p_attribute_name => 'A_TICKET_NO');
      tdate := p_session.get_value_as_date(
        p_block_name     => blk,
        p_attribute_name => 'A_TRAVEL_DATE');
      persons := p_session.get_value_as_number(
        p_block_name     => blk,
        p_attribute_name => 'A_NOF_PERSONS');
      -- clear the fields after reading them; the attribute names must
      -- match the ones used above
      p_session.set_value(
        p_block_name     => blk,
        p_attribute_name => 'A_FLIGHT_NO',
        p_value          => to_char(NULL));
      p_session.set_value(
        p_block_name     => blk,
        p_attribute_name => 'A_TICKET_NO',
        p_value          => to_char(NULL));
      p_session.set_value(
        p_block_name     => blk,
        p_attribute_name => 'A_TRAVEL_DATE',
        p_value          => to_char(NULL));
    end;
    In the above example the field values are read into temporary variables, and the session variables are then set to null.
    Thanks,
    Sharmil

  • Capture Date Editing

    Hi!
    Though I read the other messages about dates in LR, I didn't find any valuable answer for my problem:
    Editing the capture date with the shift process works fine. Editing to a specific date is another story:
    I have a lot of TIFF files that come from negative scans.
    If I select only one picture in the library grid, everything goes right.
    If I select more than one picture (Ctrl-A, or using shift-click or control-click), I get very surprising results.
    Sometimes all the capture dates are updated in the right way. Fine.
    BUT sometimes only a few of them (one, two or more at the beginning of the selection) are set to the right specific capture date, the other files getting various dates and times in a quite unexpected manner. The dates can be set in any year (sometimes 2008...), sometimes only the day is shifted by one step, and the time may be set to any value (nothing matches the DLT shift).
    I tried to check for any relation between the files that are updated correctly and those that are not, but I didn't find anything that could make sense.
    Of course I can edit all my files one by one, but I'd like to find something else. Because I read in a message that the "Edit Capture Date" feature of LR is a mess, I tried some extra tools working on EXIF and IPTC information, but it seems they can only change things for JPEG, not for TIFF files.
    I use a PC, Win XP SP2, Pentium P4, 1 GB RAM, and a lot of free space on my hard drives. I use LR 1.2, but I don't remember if I had those problems with LR 1.1.
    Thanks for any help or information.
    Jerome

    That could really be a good answer.
    I perhaps misunderstood the "edit capture time" functionality. I thought that if I specified the date and the time, those values would replace what was recorded before (and not shift everything by the same amount of time). That would make sense especially for files that have only one date field before editing: there is nothing to "shift", only a field to fill.
    Anyway, I will check whether the files had the same date/time values before editing or not, and I'll look closely at the values before and after. And I'll come back (tomorrow, I think) with, I hope, the end of the story.
    Thanks a lot for your help.
    Jerome.

  • Retrieve raw captured data after report generation crash

    Hi everyone
    Yesterday we were performing load tests on a server, during which I created and scheduled a user-defined data collector set based on the System Performance collector set. The data capture ran for 2 hours and then proceeded to generate the report. During this process I observed that Tracerpt.exe slowly and consistently consumed memory. The report generation was taking a while, so I came back 15 minutes later to discover that the process had ended, but no report could be found.
    I'm assuming the process crashed.
    Is there a way to retrieve the raw captured data and attempt to regenerate the report? Where is the raw data temporarily stored?
    Is there any way to retrieve the missing information?
    Any help is appreciated.
    Ernie Prescott

    Suppressing result rows can be done in Query Designer itself, one time. You need not do it after you get the report every time.
    If you always expect some 10 free characteristics, I suggest you insert them in the Rows pane in Query Designer rather than in Analyzer, so that your report layout is taken care of with all the settings you want applied after the report generates in Excel. I mean you will get the desired layout the first time you run the report.
    Suppressing repeated key values can also be done once in Query Designer.
    By doing the above, your report execution will not be tedious and it will minimize your effort.
    Overall, what I have understood is that you are doing all sorts of settings in Analyzer after getting a huge volume of data, which gives you the crash error.
    With all the above steps, you can run your report with all the settings pre-defined.

  • Osx Yosemite doesn't recognize shot capture date photos

    Apparently OS X Yosemite doesn't recognize the shot (capture) date of photos.
    The problem was highlighted after installing the new Photos app, in which every photo is imported using the creation date of the file, not the shot date.

    Try trashing the following file:
    com.apple.ImageCaptureNotifications.DeviceDiscoveryDatabase.501
    from HD/Library/Caches
    and restart your Mac.

  • OSX Yosemite doesn't recognize shot (capture) date of the photos

    Apparently OS X Yosemite doesn't recognize the shot (capture) date of photos.
    The problem was highlighted after installing the new Photos app, in which every photo is imported using the creation date of the file, not the shot date.

    Use the disks that came with your Mac to do a custom install of the software you need, i.e. iPhoto.  The disk should look like #4 in this screenshot:

  • Flushing the captured data uppon system failure

    I'm creating a JMF application and saving the captured data to a DataSink. During recording, the output file keeps growing with the data being stuffed into it. However, when some failure occurs, like a power outage, the file still plays, but its speed is slow and there is no audio, as if not all the data has been flushed yet. I need to know how to flush the data so that upon a power failure the file is complete up to that point. Where do I start?
    Thank you!

    Lupan wrote:
    Hi EJP, thanks for replying.
    Indeed it may be impossible to recover everything, but maybe there is some mechanism to make save points at regular intervals so that, in case of a power failure, I can recover at least up to that point. Is it possible to do this seamlessly for the user? Thank you!
    Isn't it possible to refuel my car without my noticing?
    No, because you have to shut down the engine and park the car while you put gas in.
    It takes some time to finalize a file when you're done writing it. It takes some time to get the stream ready to record again after you've stopped it. You can automate the process so the user doesn't have to do anything, but I think you'd have to play around with a bunch of stuff before you could even approach doing it without losing data.
