Process records in a transaction table in real time

We are currently designing a new system that needs to process records from a transaction table. We expect approximately 100,000 records per hour. There is no need to process each transaction independently, as the external process will process all records in the table that have not yet been processed.
We are looking at various options:
1) have the external process run continuously: select all records in the table, process them, delete them, and then start again
2) have the external process run continuously: select all records in the table, process them, update a status flag, and then start again, processing only those records whose status has not yet been updated
3) fire a trigger for each record, launching the external process (if it is not already running)
4) have a separate table containing a timestamp that is updated via a trigger for every transaction inserted into the transaction table; have the external process run continuously and only process those records newer than the previous timestamp
Would appreciate any ideas you may have on how to tune this process, and your views on the options above (or any others you might have).
Thanks a lot.
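As a concrete illustration of option 2, here is a minimal sketch of the status-flag polling pattern, using SQLite in Python as a stand-in for the real database. The table name (`txn`), column names, and status values are illustrative, not taken from the actual system:

```python
import sqlite3

# Stand-in transaction table with a status flag (option 2 above).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE txn (id INTEGER PRIMARY KEY, payload TEXT, "
    "status TEXT DEFAULT 'NEW')"
)
conn.executemany("INSERT INTO txn (payload) VALUES (?)", [("a",), ("b",), ("c",)])
conn.commit()

def process_batch(conn):
    """Select all NEW rows, process them, then flag them PROCESSED as one batch."""
    rows = conn.execute("SELECT id, payload FROM txn WHERE status = 'NEW'").fetchall()
    for _id, payload in rows:
        pass  # real per-record processing of `payload` would go here
    conn.executemany("UPDATE txn SET status = 'PROCESSED' WHERE id = ?",
                     [(r[0],) for r in rows])
    conn.commit()  # one commit per batch, not per row
    return len(rows)

processed = process_batch(conn)
```

Note that the flag is updated once per batch and committed in a single transaction; at roughly 100,000 rows per hour, per-row commits would be the first thing to avoid.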

user9511474 wrote:
We are currently designing a new system which basically needs to process records from a transaction table. We are envisaging to have approximately 100000 records per hour. There is no need to process each transaction independently as the external process will process all records residing in the table which have not been processed as yet.
My busiest table collects up to 50 million rows per hour (peak periods in the day) that also need to be processed immediately (as a batch) after the hour. I use partitioning. It is very flexible, and there are very few (if any) performance knocks.
The entire data set has to be processed. With a partition, that means a full scan of the table partition, with the ability to do it using parallel query. No additional predicates are needed, except one that lets the CBO apply partition pruning. In other words, the predicate enables the CBO to narrow the SQL down to the previous hour's partition only.
No additional predicates are needed, such as a STATUS flag to differentiate between processed and unprocessed rows, as the entire data set in the partition is unprocessed at that time. (Such a flag approach would not be very scalable in any case.)
Also, I do not use external processes. It is too expensive, performance-wise, to ship data all the way from the Oracle buffer cache to some external process. And parallel query is also not an option with an external process, as the OCI does not provide the external process with a threading interface to hook into each of the data output streams provided by the parallel query processes.
I stay inside PL/SQL to perform the data processing. PL/SQL is even more capable than Pro*C/C++, Java, and .Net in this regard.
The execution interface that drives the scheduling of processing is DBMS_JOB. Straightforward and simple to use.
The basic principle of processing large data volumes in Oracle is to use I/O effectively. Why use indexes when an entire data set needs to be processed? Why perform updates (e.g. updating a status flag) when the data model and its physical implementation can eliminate them?
I/O is the most expensive operation. And when dealing with a large volume, you need to make sure that every single I/O is actually required to achieve the end result. There is no room to waste I/O, as the performance penalties are hefty.
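The partition-per-hour principle can be mimicked in a few lines of Python (a sketch only; real Oracle range/interval partitioning is declared in DDL, not application code): rows land in per-hour buckets, and the batch job consumes the previous hour's bucket wholesale, so no per-row status flag and no index lookup is needed, because everything in that bucket is, by construction, unprocessed.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# hour -> list of rows; stands in for hourly table partitions
buckets = defaultdict(list)

def insert(row, ts):
    """Route a row to its hourly bucket, like a partition key would."""
    buckets[ts.replace(minute=0, second=0, microsecond=0)].append(row)

def process_previous_hour(now):
    """Consume the previous hour's bucket wholesale (full 'partition scan'),
    then drop it -- analogous to truncating or dropping a processed partition."""
    hour = (now - timedelta(hours=1)).replace(minute=0, second=0, microsecond=0)
    rows = buckets.pop(hour, [])
    return len(rows)

now = datetime(2024, 1, 1, 10, 30)
insert({"amount": 10}, now - timedelta(hours=1))
insert({"amount": 20}, now - timedelta(hours=1))
insert({"amount": 5}, now)           # current hour: left untouched
count = process_previous_hour(now)   # consumes only the 09:00 bucket
```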

Similar Messages

  • Can you create cluster and pooled tables in real time

    hi
        Can you create cluster and pooled tables in real time? Can you send the database table names for the above?

    Hi Anil,
    To create cluster tables, you first have to create a table pool.
    Create a table, specify the fields and other technical settings, then
    go to Extras --> Change Table Category, select Pooled Table, and activate it.
    Then create another table, specify the required fields and settings, then
    go to Extras --> Change Table Category, select Cluster Table; in the Delivery and
    Maintenance properties, specify the pooled table that you created, and activate it.
    Regards
    Sreeni

  • Locking multiple records in a database table at a time

    Hi Experts,
       I have a requirement to lock multiple records in a database table for writing. I have created a lock object in SE11, but with this there are only two possibilities: lock the entire table, or lock a single record at a time.
       My requirement: the table's key field is PROJECTID. I want more than one project to be locked, but not the complete table. I don't have any other field in the table to compare on.
    Thanks in advance..
    Regards,
    Asrar

    Hi ,
    Try FOR UPDATE in the SELECT statement.
    SELECT FOR UPDATE *
        INTO Internal_Table
      FROM Table
    WHERE Conditions.
    UPDATE "Table_name" FROM TABLE  Internal_Table.
    COMMIT WORK.
    This statement locks only the records that satisfy the WHERE conditions (not the entire table) and releases the locks after COMMIT WORK.
    Hope this information helps you.
    Regards,
    José
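The SELECT ... FOR UPDATE above locks exactly the rows matching the WHERE clause. The same granularity (lock exactly the keys you need, never the whole table) can be sketched application-side with one lock per PROJECTID; the class and names below are illustrative, not part of the SAP lock-object API:

```python
import threading

class KeyLocks:
    """One lock per key, so callers can hold several keys without
    blocking access to any other key."""

    def __init__(self):
        self._guard = threading.Lock()   # protects the lock registry
        self._locks = {}                 # key -> threading.Lock

    def acquire(self, keys):
        # Acquire in sorted order so two callers locking overlapping
        # key sets can never deadlock on each other.
        for key in sorted(keys):
            with self._guard:
                lock = self._locks.setdefault(key, threading.Lock())
            lock.acquire()

    def release(self, keys):
        for key in sorted(keys):
            self._locks[key].release()

locks = KeyLocks()
locks.acquire(["PROJ_B", "PROJ_A"])  # only these two projects are locked
# a second, non-blocking attempt on a held key fails immediately:
busy = not locks._locks["PROJ_A"].acquire(blocking=False)
locks.release(["PROJ_A", "PROJ_B"])
```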

  • How to insert records dynamically in a table at run time

    hi, all
      please help me out,
      My problem is: how can I insert records from one table into another table at run time, dynamically? Initially the records come from the R/3 backend.
    regards

    Hi,
    One way is to first create a value node (NewNode) with the same structure binding as the model node. Then iterate through the model node, create NewNode elements, and set the values from the model node elements into them.
    IPrivate<view>.I<model node>Element mele;
    IPrivate<view>.I<NewNode>Element nele;
    for (int i = 0; i < wdContext.node<output>().node<record>().size(); i++) {
        mele = wdContext.node<output>().node<record>().get<record>ElementAt(i);
        nele = wdContext.node<NewNode>().create<NewNode>Element();
        wdContext.node<NewNode>().addElement(nele);
        nele.set<attr>(mele.get<attr>());
    }
    Second way is to create that NewNode inside the model node and create a supply function.
    Regards,
    Piyush.

  • Post Processing records in BAPI_PRODORDCONF_CREATE_TT

    Hi All,
    I am using BAPI_PRODORDCONF_CREATE_TT for Production Confirmation with Auto Goods receipt(GR) Feature.
    But if any error occurs in the goods movement, the system creates a postprocessing record (visible in transaction COGI).
    We don't want entries in COGI. If any error occurs, the system should terminate the session. The same control is maintained in SPRO, but the BAPI overrules it.
    Kindly let me know which settings to use in the BAPI to avoid COGI.
    In the 'Post wrong entries' field of the BAPI, I have used ' ' (blank).
    Please suggest.

    Hi Stuti,
    What I suggest is to use two BAPIs instead of one.
    First use BAPI_GOODSMVT_CREATE to carry out the goods movements.
    Only if this BAPI is successful, execute the BAPI you are using to carry out the confirmations.
    This is the best way to control confirmations for failed goods movements.
    Regards,
    Yogesh

  • Appending a record in an internal table while debugging

    Hi gurus,
              I am trying to add one record to an internal table at run time in debugging mode. I am adding the same record that already exists in that internal table, to test some scenarios. After adding it, once I click Save, all the data I entered disappears and only the document number stays. Can't we add a record while debugging? If we can, where did I make a mistake? Thanks in advance,
                        Santosh.

    hi
    Yes, you can add records to an internal table in debugging mode.
    Switch to the classical debugger, double-click the internal table, and press the Append button.
    Then add the first field's value and press Enter. Then double-click the parallel line below the next field, which lets you enter the next field's value; press the pencil icon. You need not press Save.
    To append the next line, press the Append button again and append it in the same way.

  • Can an Acrobat plugin be involved in real-time processing?

    Hi,
    I am implementing an Acrobat plug-in solution for color separation and generation of a PRN file for a given PDF file.
    Is the plugin architecture good for real-time processing and RIPping? My doubt is that, since the Acrobat process
    is involved in executing each and every plugin HFT function, would my plug-in be able to respond
    in real time?
    Please advise.
    Thanks & Regards,
    Abdul Rasheed.

    First and foremost, Acrobat can NOT be used on a server, right?  So this is going to be a user-invoked process, correct?
    Beyond that, what do you think would be a problem?

  • RDA (real-time data acquisition)

    Can anyone please tell me how the process goes in RDA (real-time data acquisition), i.e. the internal steps?

    Hi,
    Real-Time Data Acquisition – BI 2004s
    Real-time data acquisition supports tactical decision-making. It also supports operational reporting by allowing you to send data to the delta queue or PSA table in real time.
    You might have complex reports in your BI system that help in making decisions based on data from your transactional system. Sometimes (quarter close, month end, year end...) a single change in the transactional data can change your decision, so it is very important that each record of the company's transactional data reaches the BI system at the same time as it is updated in the transactional system.
    Using the new Real-time Data Acquisition (RDA) functionality of the NetWeaver BI 2004s system, we can now load transactional data into the SAP BI system every single minute. If your business is demanding real-time data in SAP BI, you should start exploring RDA.
    The source system for RDA can be an SAP system or any non-SAP system. SAP provides most of the standard DataSources as real-time enabled.
    The other alternative for RDA is Web services. Web services are generally intended for non-SAP systems, but for testing purposes I am implementing a Web service (RFC) in an SAP source system.
    An example would be a production line where the business wants information about defective products in real time, so that production can be stopped before more defective goods are produced.
    In the source system, the BI Service API must be at least version Plug-In-Basis 2005.1, or for 4.6C source systems, Plug-In 2004.1 SP10.
    Refer:
    Real-Time Data Acquisition -BI@2004s
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/230d95df-0801-0010-4abb-ace1b3d197fd
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f80a3f6a983ee4e10000000a1553f7/content.htm
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3db14666-0901-0010-99bd-c14a93493e9c
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3cf6a212-0b01-0010-8e8b-fc3dc8e0f5f7
    http://help.sap.com/saphelp_nw04s/helpdata/en/52/777e403566c65de10000000a155106/content.htm
    https://www.sdn.sap.com/irj/sdn/webinar?rid=/library/uuid/230d95df-0801-0010-4abb-ace1b3d197fd
    Regards
    Tg

  • Real time job doesn't receive automatically changes in SAP ECC 6

    Hi Experts,
    I'm trying (using Data Integrator) to automatically retrieve data from SAP ECC and store it in a local database table. I followed the steps in this article: http://wiki.sdn.sap.com/wiki/display/BOBJ/Receiving+IDOCs. I'm modifying some records in the Cost Center master data (CSKS) and using COSMAS01 as the IDOC with which I'm trying to send the information.
    All seemed to be OK: I manually sent a cost center data row using the /nbd16 transaction, and SAP ECC displayed a message saying that an IDOC had been generated. Also, when I checked the IDOC status in BD87, it had status 03, meaning it had been sent correctly.
    In Data Services, when I clicked 'View Data' on the local database table that I put in the dataflow, I could see the rows I had manually sent from ECC. However, after truncating the local table, when I tried to manually send a second IDOC with another row from the cost center master data table, my real-time job didn't receive the request and consequently didn't insert the desired rows. I didn't change anything in the configuration, so my first question is: do you know what could cause my real-time job not to receive the request from ECC? I tried to make a third attempt, but my company had to shut down the ECC server, so it will be a while before I can try again.
    Now, the REAL question of this post: after I sent the first IDOC successfully (and before manually sending the second one mentioned above), I changed a record in the CSKS table directly in ECC, hoping it would automatically generate a COSMAS01 IDOC and send the data in the table to Data Integrator. This didn't happen and no IDOC was generated. Do any of you know why the change didn't automatically trigger the sending of the IDOC with the data?
    As I said before, I configured ECC and DI following the steps in the link at the start of the post, and the setup tested OK by successfully sending a COSMAS01 IDOC once, manually, using bd16.
    I thank you all in advance for your cooperation. This is my first thread in the SDN forum, so please excuse any mistakes in my English.
    Best regards...

    You do not need to schedule this job every 10 min.  Why?
    If we can't schedule this job every 10 minutes, how can I retrieve the delta records into the queue (RSA7)?
    What is your advice? I mean, how should the job be scheduled in order to get the delta?
    Thanks

  • Hi All, please help me...Real time cube planning layout creation...

    Hi All,
    Could you help me on below process please.
    I am working with BI 7.0. I have a real-time cube (daily sales), and the user needs to access and update/add records accordingly.
    So for this real-time cube I have to create a planning layout with the data. The user can see that data and update/add to it.
    I don't know how to create a planning layout (I have heard about transaction RSPLAN) or how to show the data.
    I also need some help with ABAP: I need to map some fields from a table.
    From a flat file I am uploading data into an ODS. The ODS has 3 extra InfoObjects.
    Example: from the shopping mall code, tenant number, and outlet number, I have to pick the contract number from a table with a routine and pass that contract number into one ODS field. That is, the contract number will not come from the flat file, which is why I need the table and the routine.
    So I have to create one table (SE11) with all the information, and I have to write the routine.
    I have to read 2 fields from the flat file (say A and B), check the values in a table, get value C, and put it into one ODS field, all via routines.
    The table holds the A, B, and C values; I also have to create the table, so please tell me the steps for that too.
    Could you please help with this, and also with the process chains for it.
    Thanks in advance!
    anil

    Hi Anil,
    If you go to RSPLAN, you can go to the Portal link there (it must be configured properly in the system).
    In the portal there is a wizard where you can easily go step by step and create planning functions and sequences, like below:
    1. You choose the appropriate InfoProvider.
    2. You create one or more aggregation levels.
    3. You create one or more filters.
    4. You create one or more planning functions.
    5. You create a planning sequence.
    6. You test the planning model.
    This is clearly explained in the link below:
    http://help.sap.com/saphelp_nw70/helpdata/en/43/1d2440301a06f4e10000000a422035/frameset.htm
    After doing this, you can create a query on the aggregation levels you created above, and then make the query input-ready (i.e. planning-enabled) for the key figures you want, as described in the link below:
    http://help.sap.com/saphelp_nw70/helpdata/en/43/1d023a41130bd5e10000000a422035/frameset.htm
    Then you can run the query in the Analyzer and save it as a workbook after a proper drilldown, which can be used as a template.
    Hope you understood the basic idea; as you work through it, everything will become clear.
    Ravi

  • Real-Time Data Acquisition

    What is real-time data acquisition?
    What are real-time queries?
    What is a daemon update, and how do we extract and load the data?
    Please explain in detail...
    regards
    GURU

    Hi,
    Real-Time Data Acquisition – BI 2004s
    Real-time data acquisition supports tactical decision-making. It also supports operational reporting by allowing you to send data to the delta queue or PSA table in real time.
    You might have complex reports in your BI system that help in making decisions based on data from your transactional system. Sometimes (quarter close, month end, year end...) a single change in the transactional data can change your decision, so it is very important that each record of the company's transactional data reaches the BI system at the same time as it is updated in the transactional system.
    Using the new Real-time Data Acquisition (RDA) functionality of the NetWeaver BI 2004s system, we can now load transactional data into the SAP BI system every single minute. If your business is demanding real-time data in SAP BI, you should start exploring RDA.
    The source system for RDA can be an SAP system or any non-SAP system. SAP provides most of the standard DataSources as real-time enabled.
    The other alternative for RDA is Web services. Web services are generally intended for non-SAP systems, but for testing purposes I am implementing a Web service (RFC) in an SAP source system.
    An example would be a production line where the business wants information about defective products in real time, so that production can be stopped before more defective goods are produced.
    In the source system, the BI Service API must be at least version Plug-In-Basis 2005.1, or for 4.6C source systems, Plug-In 2004.1 SP10.
    Real-Time Data Acquisition -BI@2004s
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f80a3f6a983ee4e10000000a1553f7/content.htm
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/230d95df-0801-0010-4abb-ace1b3d197fd
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3db14666-0901-0010-99bd-c14a93493e9c
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3cf6a212-0b01-0010-8e8b-fc3dc8e0f5f7
    http://help.sap.com/saphelp_nw04s/helpdata/en/52/777e403566c65de10000000a155106/content.htm
    https://www.sdn.sap.com/irj/sdn/webinar?rid=/library/uuid/230d95df-0801-0010-4abb-ace1b3d197fd
    Thanks,
    JituK

  • Real time infopackage

    Hello
    Could you please let me know which DSO RDA is used with?
    Which DSO does a real-time InfoPackage support, and why?
    Regards

    Hi........
    We use a standard DSO in RDA. Real-time data acquisition supports tactical decision-making. In terms of data acquisition, it supports operational reporting by allowing you to send data to the delta queue or PSA table in real time. You use a daemon to transfer the data, at frequent regular intervals, to DataStore objects that have been released for reporting. The data is stored persistently in BI.
    You use real-time data acquisition if you want to transfer data to BI at short intervals (every hour or minute) and access this data in reporting frequently or regularly (several times a day, at least). The DataSource has to support real-time data acquisition; this support is a property of the DataSource.
    Real-time data acquisition can be used in two primary scenarios:
    • via the Service API (SAPI): this uses an InfoPackage for real-time data acquisition (source to PSA), and then a data transfer process for real-time data acquisition (PSA to DataStore object);
    • via a Web service: a Web service populates the PSA, and a real-time DTP then transfers the data to the DataStore object.
    Regards,
    Debjani........

  • Cannot check real-time access while creating an InfoPackage

    Hi all,
      I am getting an error while trying to create a real-time InfoPackage. I made my DataSource real-time by changing the real-time flag in ROOSOURCE, but while creating the InfoPackage, the real-time checkbox is greyed out.
    I had read in another message on SDN that you can turn a non-real-time DataSource into a real-time one by changing the flag in ROOSOURCE; any ideas?
    Also, we are on content 7.0.3 and I don't see any Business Content DataSources, either in ECC or BI, in the ROOSOURCE table with the real-time flag marked 'X'. Does SAP deliver any content that is real-time, or am I missing something here?
    thanks for your input

    Hi Raj,
    You can create the real-time InfoPackage only after creating the InfoPackage for the init load. The real-time checkbox is not enabled if the initialization load has not been run. This check should be made only after creating the real-time DataSource correctly.
    Go to RSO2, enter the DataSource name, and open it in edit mode. Choose the option 'Generic Delta'. Enter the field on which you want the delta pointer to be set (the field you enter here should NOT be hidden from SAP BW). Importantly, check the option 'Real-Time Enabl'. Save the DataSource; it becomes real-time. Replicate the DataSource.
    Hope this helps.
    Thanks,
    Praneeth.

  • Export journalized data in real time

    Hello,
    I need to pull journalized data out of an Oracle DB source in real time.
    My package has an OdiWaitForData step and an interface that triggers only when there is journalized data in my source table. The interface loads the data into the target Oracle table in real time, and it works.
    Is it possible to send journalized data out of the Oracle database in real time? Perhaps as insert/update statements in an SQL file, as a CSV file, or through an OdiOSCommand step somehow?
    Regards
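One way to sketch the "changes out as a file" idea is a small polling exporter: read unconsumed journal rows, write one CSV line per change with its operation type, and mark them consumed. SQLite in Python stands in for the Oracle source here, and the journal layout is illustrative, not ODI's actual J$ table format:

```python
import csv
import io
import sqlite3

# Stand-in journal table: one row per change, with an operation type
# ('I' insert / 'U' update) and a consumed flag.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE journal (id INTEGER, op TEXT, name TEXT, "
    "exported INTEGER DEFAULT 0)"
)
conn.executemany("INSERT INTO journal (id, op, name) VALUES (?, ?, ?)",
                 [(1, "I", "alice"), (2, "U", "bob")])
conn.commit()

def export_changes(conn, out):
    """Write unconsumed journal rows to `out` as CSV, then mark them consumed."""
    writer = csv.writer(out)
    rows = conn.execute(
        "SELECT id, op, name FROM journal WHERE exported = 0"
    ).fetchall()
    for row in rows:
        writer.writerow(row)
    conn.execute("UPDATE journal SET exported = 1 WHERE exported = 0")
    conn.commit()
    return len(rows)

buf = io.StringIO()
n = export_changes(conn, buf)   # CSV text is now in buf
```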

    Hi,
    Did you already look at LogMiner?
    Cezar Santos
    www.odiexperts.com

  • Moving 80 million records from the conversion database to the system test database (just one transaction table) is taking too long

    Hello Friends,
    The background: I am working as a conversion manager. We move the data from Oracle to SQL Server using SSMA, then apply the conversion logic, and then move the data to system test, UAT, and production.
    Scenario:
    Moving 80 million records from the conversion database to the system test database (for just one transaction table) is taking too long. Both databases are on the same server.
    The questions are:
    What is the best option?
    If we use SSIS, it is very slow, taking 17 hours (sometimes it gets stuck and won't allow us to do any other work).
    Using my own script (a stored procedure), it takes only 1 hour 40 minutes. I would like to know whether there is a better process to speed this up, and why SSIS is taking so long.
    When we move the data using SSIS, does it commit after a particular row count, or does Microsoft commit all the records together after writing to the transaction log?
    Thanks
    Karthikeyan Jothi
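The commit-interval question is easy to demonstrate: committing once per chunk bounds transaction-log growth, while one commit per row is slow and one giant transaction bloats the log. A minimal sketch, with SQLite in Python as a stand-in for SQL Server; the 10,000-row chunk size is an illustrative figure, not an SSIS default:

```python
import sqlite3

def copy_in_chunks(src_rows, conn, chunk_size=10_000):
    """Insert rows into `dest` in chunks, committing once per chunk."""
    batch, committed = [], 0
    for row in src_rows:
        batch.append(row)
        if len(batch) >= chunk_size:
            conn.executemany("INSERT INTO dest (id) VALUES (?)", batch)
            conn.commit()              # one commit per chunk, not per row
            committed += len(batch)
            batch.clear()
    if batch:                          # flush the final partial chunk
        conn.executemany("INSERT INTO dest (id) VALUES (?)", batch)
        conn.commit()
        committed += len(batch)
    return committed

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dest (id INTEGER)")
total = copy_in_chunks(((i,) for i in range(25_000)), conn, chunk_size=10_000)
```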

    http://www.dfarber.com/computer-consulting-blog.aspx?filterby=Copy%20hundreds%20of%20millions%20records%20in%20ms%20sql
    Processing hundreds of millions of records can be done in less than an hour.
    Best regards, Uri Dimant, SQL Server MVP
    http://sqlblog.com/blogs/uri_dimant/
