Processing records

Hi,
My question is not specifically about Oracle, but how to design/implement a solution against a particular use case.
The use case is as follows. A server engine populates data into an Oracle 9i database continuously and asynchronously. This data is transferred from one Oracle 9i database to another (over DB links) for processing every ten minutes.
This data contains authorization requests and responses. Because the database is populated asynchronously, a response can appear in the database before its request has been inserted, and in this use case a response cannot be processed before its request. This gives rise to two scenarios that make the implementation more complex: first, a response can be transferred before its request; second, a request and its response can arrive in the same transfer batch, but with the request behind the response in processing order.
We are still using Oracle 9i, and the number of records to be processed is huge, so performance is important.
Any suggestions on how to implement this will be much appreciated.
Thank you.
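
To make the ordering problem concrete, here is a minimal sketch of the "defer orphan responses" idea, independent of whatever solution is finally chosen. It assumes hypothetical staging tables AUTH_REQUESTS and AUTH_RESPONSES keyed by AUTH_ID with a PROCESSED flag; responses whose request has not yet been processed are simply left for the next ten-minute run.

-- Sketch only: table names, columns and the PROCESSED flag are assumptions.
DECLARE
  CURSOR c_resp IS
    SELECT r.auth_id
      FROM auth_responses r
     WHERE r.processed = 'N'
       AND EXISTS (SELECT 1
                     FROM auth_requests q
                    WHERE q.auth_id   = r.auth_id
                      AND q.processed = 'Y');   -- request already handled
BEGIN
  -- 1. Process every request that has arrived.
  FOR req IN (SELECT auth_id FROM auth_requests WHERE processed = 'N') LOOP
    -- ... business processing of the request ...
    UPDATE auth_requests SET processed = 'Y' WHERE auth_id = req.auth_id;
  END LOOP;

  -- 2. Process only responses whose request is already processed;
  --    orphan responses stay untouched until the next run.
  FOR resp IN c_resp LOOP
    -- ... business processing of the response ...
    UPDATE auth_responses SET processed = 'Y' WHERE auth_id = resp.auth_id;
  END LOOP;

  COMMIT;
END;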

Thank you for your helpful answer. Oracle Streams seems to be the way forward for us.
And, thank you for making me feel like a small mosquito.

Similar Messages

  • Post Processing records in BAPI_PRODORDCONF_CREATE_TT

    Hi All,
I am using BAPI_PRODORDCONF_CREATE_TT for production confirmation with the Auto Goods Receipt (GR) feature.
However, if any error occurs in the goods movement, the system creates a postprocessing record (visible in transaction COGI).
We don't want entries in COGI. If any error occurs, the system should terminate the session. The same control is maintained in SPRO, but the BAPI overrules it.
Kindly let me know which settings to use in the BAPI to avoid COGI.
In the 'Post Wrong Entries' field of the BAPI, I have used a blank value.
    Please suggest.

    Hi Stuti,
What I suggest is to use two BAPIs instead of one.
First use BAPI_GOODSMVT_CREATE to carry out the goods movements.
Only if this BAPI is successful, execute the BAPI you are using to carry out the confirmations.
This is the best way to control confirmations for failed goods movements.
    Regards,
    Yogesh

  • Post processing records deletion log in MF47 -reg

    Hi...
How can we find out which users deleted postprocessing records in MF47?
Some of the postprocessing records in MF47 are being deleted by users without being processed in time.
We would like to know where this log is kept, so that we can see
which user deleted which records on which date.
    regards,
    madhu kiran

    hi,
I posted earlier about the deletion of MF70 records (backdated backlogs that could not be processed).
Now I am asking about tracking the deletion of postprocessing records in MF47.
If a record is deleted, is there really no way to track when and by whom it was deleted?
    regards,
    madhu kiran

  • Post processing records in MF47 and COGI

    Hi All,
    I have a query....
In standard SAP, will the postprocessing records created by repetitive manufacturing be visible in transaction COGI?
And will the postprocessing records created by discrete manufacturing be visible in transaction MF47?
    Regards,
    Vinayak.

    Hi ,
In general, for discrete manufacturing the postprocessing records are checked and cleared in transaction COGI, whereas for repetitive manufacturing (REM) they are handled in MF47.
You will be able to view the REM postprocessing records in MF47; this is standard SAP behaviour, so there is no bug in your system.
Hope this helps.
    Regards
    radhak mk

  • Process records in a transaction table in real time

We are currently designing a new system which basically needs to process records from a transaction table. We envisage approximately 100,000 records per hour. There is no need to process each transaction independently, as the external process will handle all records in the table that have not yet been processed.
    We are basically looking at various options:
    1) have the external process run continuously, select all records in the table, process them and delete them and then start the process again
    2) have the external process run continuously, select all records in the table, process them, update a status flag and then start the process again processing only those records with their status not yet updated
    3) fire a trigger for each record launching the external process (if it is not running yet)
    4) have a separate table containing a timestamp which is updated via trigger for every transaction that is inserted in the transaction table. Have the external process run continuously and only process those records which have exceeded the previous timestamp.
We would appreciate any ideas on how to tune this process, and your views on the options above (or any others you might suggest).
    Thanks a lot.

    user9511474 wrote:
    We are currently designing a new system which basically needs to process records from a transaction table. We are envisaging to have approximately 100000 records per hour. There is no need to process each transaction independently as the external process will process all records residing in the table which have not been processed as yet.My busiest table collects up to 50 million rows per hour (peak periods in the day) that also needs to be processed immediately (as a batch) after the hour. I use partitioning. It is a very flexible and there are very few (if any) performance knocks.
    The entire data set has to be processed. With a partition that means a full scan of the table partition - and the ability to do it using parallel query. No additional predicates are needed, except the to have the CBO apply partition pruning. In other words, the predicate enables the CBO to narrow down the SQL to the previous hour's partition only.
    No additional predicates needed like a STATUS flag to differentiate between processed and unprocessed rows - as the entire data set in the partition is unprocessed at that time. (such a flag approach will not be very scalable in any case)
    Also, I do not use external processes. Too expensive performance wise to ship data all the way from the Oracle buffer cache to some external process. And parallel query is also not an option with an external process as the OCI does not provide the external process with a threading interface in order to hook into each of the data output streams provided by the parallel query clients.
    I stay inside PL/SQL to perform the data processing. PL/SQL is even more capable than ProC/C++ and Java and .Net in this regard.
    The execution interface to drive the scheduling of processing is DBMS_JOB. Straight forward and simple to use.
    The basic principles of processing large data volumes in Oracle is to effectively use I/O. Why use indexes when an entire data set needs to be processed? Why perform updates (e.g. updating a status flag) when the data model and physical implementation of that can eliminate it?
    I/O is the most expensive operation. And when dealing with a large volume, you need to make sure that every single I/O is actually required to achieve the end result. There's no room to waste I/O as the performance penalties are hefty.
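
    As a rough illustration of the approach described above (not the poster's actual code), the sketch below assumes a hypothetical TXN_LOG table that is range-partitioned by hour on TXN_TIME. The procedure full-scans only the previous hour's partition (the predicate on the partition key lets the CBO prune to that partition), and DBMS_JOB schedules it shortly after every hour.

    -- Sketch only: TXN_LOG, TXN_TIME and PROCESS_HOUR are assumed names.
    CREATE OR REPLACE PROCEDURE process_hour AS
    BEGIN
      -- Predicate on the partition key -> partition pruning to the previous hour;
      -- the PARALLEL hint allows the full partition scan to use parallel query.
      FOR rec IN (SELECT /*+ PARALLEL(t) */ t.*
                    FROM txn_log t
                   WHERE t.txn_time >= TRUNC(SYSDATE, 'HH24') - 1/24
                     AND t.txn_time <  TRUNC(SYSDATE, 'HH24')) LOOP
        NULL;  -- business processing of one row goes here
      END LOOP;
    END process_hour;
    /

    -- Schedule the procedure at one minute past every hour with DBMS_JOB.
    DECLARE
      l_job BINARY_INTEGER;
    BEGIN
      DBMS_JOB.SUBMIT(job       => l_job,
                      what      => 'process_hour;',
                      next_date => TRUNC(SYSDATE + 1/24, 'HH24') + 1/1440,
                      interval  => 'TRUNC(SYSDATE + 1/24, ''HH24'') + 1/1440');
      COMMIT;
    END;
    /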

  • IView for Activating Compensation Process Records (IT 759)

Does anyone know the iView for activating compensation process records (IT 759)? There are iViews for compensation planning and compensation plan approval in the standard MSS package, but I don't see one for activation.
    Regards,
    Sanjay Gera

    Hi Sanjay,
Unfortunately, we do not have an iView for activation. You may need to develop a custom iView to pull in report RHECM_CHANGE_PROC_STATUS. Since that report is intended to run from R/3 and in bulk (read: in the background), you would want to consider data volume, browser compatibility to display the results, and last but not least, security.
    Donnie

  • TECO status for Prod order for which Post processing record exists

    Dear PP Gurus,
We use BAPI_PRODORD_COMPLETE_TECH to TECO production orders. The program does not TECO an order if postprocessing records exist for it, giving the message "Postprocessing records for order 1000068 prevent technical closing". I think this is a standard BAPI message.
When the same BAPI is run in foreground mode, it gives a confirmation prompt, 'Order 1000068: There are still reprocessing records. Set order to Technically Complete?', with Yes/No/Cancel options. You can save the TECO by selecting Yes.
Is there a way to achieve this in background mode?
    Thank you much in advance for help,
    Regards,
    Jatin

    Hello Jatin,
Call function DIALOG_SET_NO_DIALOG before the BAPI; the system then handles the BAPI as in non-dialog mode and the pop-up does not appear.
    Refer KBA 1986661 - PP-SFC: BAPI_PRODORD_COMPLETE_TECH Popup
    Best Regards,
    R.Brahmankar

  • How to process records in a block without navigating (forms45)

    Hi.
I use a multi-record block with check boxes to let the operator select the rows they want to update. How can I process this task? Do I have to use something like:
First_Record;
LOOP
  EXIT WHEN /* record doesn't exist */;
  IF /* checkbox selected */ THEN
    process_record;
  END IF;
  Next_Record;
END LOOP;
But after that I have to navigate back to the row I was on before, and I don't know how to do that.
Or maybe (it would be much better) there is some way to silently process all rows in a block without using navigation at all. Any advice?
    Thanks,
    Gregor

DECLARE
  currentRecord NUMBER := get_block_property('YourBlock', CURRENT_RECORD);
BEGIN
  First_Record;
  LOOP
    EXIT WHEN /* record doesn't exist */;
    IF /* checkbox selected */ THEN
      process_record;
    END IF;
    Next_Record;
  END LOOP;
  go_record(currentRecord);
END;

  • Udf mapping: any way to detect last processed record?

    Hi !
I need to detect, inside my Java user-defined function in the graphical mapping, whether I am processing the last record of my input message, for example to add a trace message about how many records were processed. I know I can detect the first one by mapping a function to the root node; how about the last one? Are there any internal mapping variables available to use inside a UDF?
    thanks,
    Matias

    Hi,
One way would be to use a UDF of type 'queue',
which will receive all the values from your message.
This way you can get the total number of records,
and if you have the total, then you know which one is the last one.
    Regards,
    michal
XI / PI FAQ - Frequently Asked Questions: /people/michal.krawczyk2/blog/2005/06/28/xipi-faq-frequently-asked-questions

  • How to process records one by one(not in parallel) with LSMW(BAPI)

We loaded a batch of records (around 800) in LSMW using the business object method (BAPI).
The records were processed in parallel, and this caused some locking issues.
Is there any way to make sure that records loaded in LSMW with the business object method (BAPI) are processed one by one, to avoid this kind of locking issue?
    Any reply is appreciated.

    Hi,
I think there must be a way to increase the lead time so that locking doesn't happen; it should either be part of customizing or something that can be coded. I am not sure exactly what code, but in SAP GUI Scripting there is also a way to increase the lead time to avoid locking issues.
    Regards,
    Rahul

  • Process records in LOOP

    My logic is
cursor c1 is
  SELECT col1, col2, ...
    FROM tab1;
BEGIN
  FOR c1_rec IN c1 LOOP
    IF /* some condition is satisfied */ THEN
      -- delete the record
      -- exit out if the condition is satisfied and go to the next record in the loop
    END IF;
    BEGIN
      SELECT /* some date */
        INTO variable
        FROM table2;
      IF variable = /* some value */ THEN
        -- DELETE operation
        -- UPDATE operation
        -- exit out if the condition is satisfied and go to the next record in the loop
      END IF;
    END;
  END LOOP;
END;
What I am trying to achieve: after I enter the loop, I am working with a set of records based on the declared explicit cursor. Suppose I am on my first record and it satisfies the condition of the first DELETE statement; I then want to skip the rest of the logic and move on to the next record in the set. How do I do this?
    Thanks in advance.

This is the code I have, with the END LOOP at the end. I don't want to bail out of the loop; I need to go to the next record as soon as my first delete is done, because the second delete/update is based on a different logic check from the first one. If I use EXIT, I exit the whole loop, which I don't want to do; I want to stay in the loop and process the next record. I hope that explains it. Sorry for the confusion.
    Thanks
cursor c1 is
  SELECT col1, col2, ...
    FROM tab1;
BEGIN
  FOR c1_rec IN c1 LOOP
    IF /* some condition is satisfied */ THEN
      -- delete the record; nothing else runs, so the loop moves on to the next record
    ELSE
      BEGIN
        SELECT /* some date */
          INTO variable
          FROM table2;
        IF variable = /* some value */ THEN
          -- DELETE operation
          -- UPDATE operation
        END IF;
      END;
    END IF;
  END LOOP;
END;
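
    For reference, a minimal runnable sketch of the same pattern, using hypothetical tables ORDERS_STAGE and ORDER_DATES and made-up conditions; the point is only that the second check sits in the ELSE branch, so a record that matches the first condition naturally falls through to the next loop iteration.

    DECLARE
      v_created DATE;
    BEGIN
      FOR rec IN (SELECT id, status FROM orders_stage) LOOP
        IF rec.status = 'CANCELLED' THEN
          -- First check: delete and let the loop move on to the next record.
          DELETE FROM orders_stage WHERE id = rec.id;
        ELSE
          -- Second check only runs for records that did not match the first one.
          SELECT created_on INTO v_created
            FROM order_dates
           WHERE order_id = rec.id;
          IF v_created < SYSDATE - 30 THEN
            DELETE FROM orders_stage WHERE id = rec.id;
            UPDATE order_dates SET archived = 'Y' WHERE order_id = rec.id;
          END IF;
        END IF;
      END LOOP;
      COMMIT;
    END;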

  • Processing records from co1p

    Hi
We have configured automatic production order confirmation and goods movement through XI. In the process, the goods movement failed for one of the orders because the material number was incorrect. The confirmation is OK, as seen from CO14.
When I attempt to process the same from CO1P, I get the error 'Material number ** does not exist'. I intend to cancel this confirmation/goods movement so that it can be processed afresh.
How can I achieve this?
    Regards

    Hi Rajesh
I would ideally want to cancel the confirmation from CO13, so that both the confirmation and the associated goods movement are reverted.
Currently, when I attempt to do this, I get the message 'Future change records for background processing exist for order ......'
This, I believe, means that records are held in CO1P.
So the next alternative is that I wish to clear the record from CO1P.
Hope I am clear.
    Regards

  • Auto Invoice Import Program not processing records

    Hi,
I wrote a back-end procedure to submit the Auto Invoice Import program. The concurrent program gets submitted, but none of the records are processed.
    Please find the code given below for reference.
    APPS Version : 11.5.10.2
DECLARE
  l_request NUMBER;
BEGIN
  fnd_global.apps_initialize(2709, 50325, 222);
  l_request := FND_REQUEST.SUBMIT_REQUEST
               (application => 'AR',
                program     => 'RAXTRX',
                description => 'Auto',
                start_time  => NULL,               -- start immediately
                sub_request => FALSE,
                argument1   => 'MAIN',
                argument2   => 'T',
                argument3   => '24',               -- batch_source_id
                argument4   => 'AR Batch Source',  -- batch_source_name
                argument5   => to_char(SYSDATE, 'YYYY/MM/DD HH24:MI:SS'), -- default date; verify the format RAXTRX expects (e.g. RR-MON-DD)
                argument6   => '',
                argument7   => '',
                argument8   => '',
                argument9   => '',
                argument10  => '',
                argument11  => '',
                argument12  => '',
                argument13  => '',
                argument14  => '',
                argument15  => '',
                argument16  => '',
                argument17  => '',
                argument18  => '',
                argument19  => '',
                argument20  => '',
                argument21  => '',
                argument22  => '',
                argument23  => '',
                argument24  => '',
                argument25  => '',
                argument26  => 'Y',
                argument27  => 'Y',
                argument28  => '',
                argument29  => '155',              -- org_id
                argument30  => chr(0));            -- chr(0) marks the end of the parameters
  COMMIT;
END;

Please post the log of the concurrent request. Please see if MOS Doc 1089172.1 (Troubleshooting Autoinvoice Import - Execution Report Errors (Request Status = Completed)) can help.
    HTH
    Srini
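
    For what it is worth, a small sketch of how the submitting procedure could wait for the request and print its phase, status and completion message before pulling the log. It assumes the standard FND_CONCURRENT.WAIT_FOR_REQUEST API and a request id such as the one returned by FND_REQUEST.SUBMIT_REQUEST above; the id used here is only an example value.

    DECLARE
      l_request    NUMBER := 12345;   -- replace with the id returned by SUBMIT_REQUEST
      l_phase      VARCHAR2(80);
      l_status     VARCHAR2(80);
      l_dev_phase  VARCHAR2(80);
      l_dev_status VARCHAR2(80);
      l_message    VARCHAR2(2000);
      l_done       BOOLEAN;
    BEGIN
      l_done := fnd_concurrent.wait_for_request
                  (request_id => l_request,
                   interval   => 10,    -- poll every 10 seconds
                   max_wait   => 600,   -- give up after 10 minutes
                   phase      => l_phase,
                   status     => l_status,
                   dev_phase  => l_dev_phase,
                   dev_status => l_dev_status,
                   message    => l_message);
      dbms_output.put_line('Phase: '   || l_phase || ', Status: ' || l_status);
      dbms_output.put_line('Message: ' || l_message);
    END;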

  • Processing records with in time

In the case of a JDBC sender, if I want to process all the records within a given time window, how do I set this up?

Hi,
That is possible with ATP.
Availability Time Planning (ATP) enables scheduling of all the XI channels in automatic mode. Using ATP, you plan availability times for communication channels so that they can be started and stopped automatically instead of manually. You can schedule channels on a daily, weekly, monthly or one-time basis, so that an XI communication channel can be scheduled on a particular date and at a particular time of a day, week, month or year, which eliminates long polling intervals.
Configuring ATP:
1. Navigate to Runtime Workbench --> Component Monitoring --> Adapter Engine --> Communication Channel Monitoring ...
    please refer link:
    http://www.****************/Tips/XI/PollingvsATP/Index.htm
    Edited by: bhavanisankar.solasu on Jan 13, 2012 1:08 PM

  • Processing Record by batch

    Hi Experts,
I have a scenario: suppose I have 50,000 records (fifty thousand) in an internal table and I want to process them in batches of 5,000 (five thousand); then I need 10 passes to process all 50,000 records.
Or, if I need to process the 50,000 records in batches of 2,000, then I need to process them 25 times.
The number of records per batch will be given by the user.
But how do I calculate the number of iterations and process all the records in ABAP?
    Thanks in Advance...
    Regards,
    IFF
    Edited by: IFF on Feb 18, 2008 6:15 PM
First iteration: 1-5000
Second iteration: 5001-10000
Last iteration: 45001-50000
    The batch size may be directly given by the user
    Regards,
    IFF

    hi
After the user inputs the number of records per batch,
send that many records to the batch using the usual code and delete them from the internal table.
If you put this in a loop, the process continues until all the records are processed (that is, until the internal table becomes empty). The number of iterations is simply the total number of records divided by the batch size, rounded up (for example, CEIL( 50,000 / 5,000 ) = 10).
    Regards,
    K.Tharani.
