Advice with High Data Volume

I have to build a process that fetches about 100,000 records from a database at once and then processes them one by one. The method I use after executing a query puts all the data in a ResultSet object.
I would like to know whether a ResultSet can handle that much data quickly, or whether there is a better class for this kind of program. The process has to be as fast as possible.
I am also wondering whether it might be faster to read from the database several times, getting less data each time, rather than reading only once and getting all the data.
I really need to program this process in the most optimal way possible.
If someone can help me, or knows where I can find information about this topic, I would really appreciate it.
Thanks

You can make it faster if you avoid transporting data between the database and the Java program: do the processing entirely inside the database.
Define your loop inside the database as a complex SQL statement or a stored procedure written in the database's procedural language. For example, if you are working with SQL Server, use a stored procedure written in Transact-SQL.
A stored procedure lets you call a program, with parameters, that resides in the database; the database engine executes it.
The statement to execute through JDBC is something like:
"execute sp_my_stored_procedure param1, param2, ... , paramN"
where param1 ... paramN are all the parameters your process needs. Inside the database there must then be a stored procedure; it could look something like this:
create procedure sp_my_stored_procedure
    @param1 type,
    @param2 type,
    @param3 type
AS
-- HERE THE CODE of the procedure
-- declare local variables for the columns and results
declare @column1 type, @column2 type, @column5 type, @result1 type

declare my_cursor insensitive cursor for
    SELECT column1, column2 ... columnN
    FROM your_table  -- the table holding the 100,000 rows
    WHERE ...
open my_cursor
-- get a row
fetch my_cursor into @column1, @column2 ... @columnN
while ( @@fetch_status = 0 ) begin
    -- process the row:
    -- do operations on the column values with the params, e.g.
    set @result1 = @column1 - @column2 * @param1
    -- do operations with the results of your ops (update, insert, ...)
    update anotherTable set value = @result1
        where id = @column5
    -- get the next row
    fetch my_cursor into @column1, @column2 ... @columnN
end
close my_cursor
deallocate my_cursor
P.S.: use transactions if needed.
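
From the Java side, the call can go through JDBC's stored-procedure escape syntax. Below is a minimal sketch, assuming a SQL Server JDBC driver on the classpath and a hypothetical procedure taking two integer parameters; the URL, credentials, and parameter values are placeholders.

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class RunProcedure {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details: substitute your own URL, user, and password.
        try (Connection con = DriverManager.getConnection(
                "jdbc:sqlserver://localhost:1433;databaseName=mydb", "user", "password")) {
            con.setAutoCommit(false); // run the whole process as one transaction
            // JDBC escape syntax for a procedure call; the driver translates it
            // into the engine's native "execute" statement.
            try (CallableStatement cs = con.prepareCall("{call sp_my_stored_procedure(?, ?)}")) {
                cs.setInt(1, 42); // param1 (hypothetical value)
                cs.setInt(2, 7);  // param2 (hypothetical value)
                cs.execute();     // all rows are processed inside the database
            }
            con.commit();
        }
    }
}

If the rows do have to come into the Java program instead, Statement.setFetchSize() hints the driver to stream the ResultSet in batches rather than materializing all 100,000 rows at once.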

Similar Messages

  • Anyone using durable topics with high data volumes?

    We're evaluating JMS implementations, and our requirements call for durable subscribers, where subscribers can go down for several hours, while the MQ server accumulates a large number of messages.
    Is anyone using Sun MQ in a similar scenario? How is it holding up?
    Sun folks, do you know of production installations that use durable topics with high data volumes?
    thanks,
    -am

    We are using a cluster of Sun's JMS MQ 3.6 with durable message queues and persistent topics. In a 4-hour window each night we run over 20,000 messages through a queue. The cluster sits on two Windows servers, the producer client is on an AIX box, and the consumer is running on an iSeries. Within the 20,000 messages are over 400,000 transactions; each message can have many transactions. Yes, the iSeries client has gone down twice and the producer continued, with the message queue piling up, as it should. We just use the topic to send and receive command and status inquiries to the clients. So everything works fine.
    We have only had a couple of issues with a client locking up, and that may be fixed with SP3, which we are in the process of installing. The only other issue we have had is that once in a while the producer tries to send an object message with too many transactions and it throws a JMS exception. So we put a cap on the size of the messages: if a message is over a set number of transactions, the producer sends each transaction separately; otherwise it sends all the transactions in one object-type (linked list of transactions) message.
    Compare the cost of this JMS system with Tibco or Sonic and you're looking at big savings.
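
    For illustration, here is a minimal sketch of the capping strategy described above, assuming an already-configured javax.jms Session and MessageProducer; the cap value and class name are hypothetical.

      import java.io.Serializable;
      import java.util.LinkedList;
      import javax.jms.JMSException;
      import javax.jms.MessageProducer;
      import javax.jms.Session;

      public class CappedSender {
          private static final int MAX_TX_PER_MESSAGE = 500; // hypothetical cap

          public static void send(Session session, MessageProducer producer,
                                  LinkedList<Serializable> transactions) throws JMSException {
              if (transactions.size() <= MAX_TX_PER_MESSAGE) {
                  // Under the cap: one ObjectMessage carrying the whole linked list.
                  producer.send(session.createObjectMessage(transactions));
              } else {
                  // Over the cap: send each transaction as its own message.
                  for (Serializable tx : transactions) {
                      producer.send(session.createObjectMessage(tx));
                  }
              }
          }
      }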

  • Performance: How to manage large reports with high data volume

    Hi everybody,
    we actually ran some tests on our BO server system to determine limitations and opportunities. Among other things we constructed a large report with a high data volume (about 250,000 data records).
    When executing the query in SAP Query Designer it takes about 10 minutes to display it. In Crystal Reports we rebuilt the row and column structure of the query. The data retrieval in Crystal Reports Designer takes about 9 minutes, even faster than in the query.
    Unfortunately, in BO InfoView the report is not displayed at all. After 30 minutes of loading time we get timeout error RCIRAS0244:
    com.crystaldecisions.sdk.occa.managedreports.ras.internal.ManagedRASException:
    Cannot open report document. ---
    The request timed out because there has been no reply from the server for 600.000 milliseconds.
    A refresh of a report with saved data is also not possible.
    Now we are asking ourselves some questions:
    1. Where can we set the timeout for InfoView to a value larger than 30 minutes?
    2. Why is InfoView so slow compared to Crystal Designer? Where is the bottleneck?
    3. What is the impact of SAP single sign-on on performance, compared to Enterprise logon?
    Thanks for any helps and comments!
    Sebastian

    Hi Ingo,
    thank you for your reply.
    I will check the servers and maybe change the time limits.
    Unfortunately we have a rather slow server system, which is probably causing this timeout. In CR Designer we have no problems; it is really quick. Should we expect CR Designer and InfoView to have almost the same performance?
    Another interesting point: when we execute the query in SAP BEx Query Designer it takes about 10 minutes to open, while Crystal Designer needs only about 5-6 minutes. We included exactly the same fields in the report that exist in the SAP BEx query.
    What might cause the difference?
    - Exceptions and conditions in the query?
    - Free characteristics in the query?
    - anything else?
    Best regards,
    Sebastian

  • Using RSCRM_BAPI with Huge Data Volume

    Hi,
    I am using RSCRM_BAPI to extract a query output into a database table, but the query returns a large volume of data. I am not sure whether RSCRM_BAPI works fine when the data volume is huge. Please suggest whether this is a good design, or whether another method is available to handle such a scenario.
    Regards,
    Dibyendu

    I have used RSCRM_BAPI when the records exceeded 65,000 (the Excel limit) and it worked for me...
    I think it should work for you also...
    But there are some limitations...
    For example, you cannot see texts, etc.
    Assign points if it helps,
    Ajay

  • High Data Volume Design request

    Hi All,
    I have a request from my business partners for a high-volume data transmission from an SAP BI system to an SAP ECC system.
    The volume of data is approximately 4 million records.
    What is the recommended design approach for this kind of load via SAP PI?
    What needs to be taken care of if we do an IDoc-to-IDoc scenario or an IDoc-to-Proxy scenario? (I am not an ABAP developer.)
    This is going to be an async scenario. I assume IDoc is the best approach for handling this kind of load.
    What needs to be taken care of during the IDoc development in the sender system (as I said, I am not an ABAP guy), so that the load in each IDoc is acceptable for SAP PI (max 5 MB) and all the data is streamed to the SAP ECC system?
    PS: we are running on an SAP PI 7.0 SP15 system.
    Regards,
    Prakash.

    Since your volume of records is really high, you can try a design using proxies. IDoc is not really a good approach considering the volume of records. If IDoc is the only option, then limit the number of records in each message transaction. Hope that helps.
    Refer these links for idoc handling
    /people/michal.krawczyk2/blog/2009/05/21/xipi-collecting-idocs--possible-ways-with-pros-and-cons--5-ways
    /people/michal.krawczyk2/blog/2007/12/02/xipi-sender-idoc-adapter-packaging
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/877c0d53-0801-0010-3bb0-e38d5ecd352c?quicklink=index&overridelayout=true

  • Update rule not working with high data load.

    Hi all,
    I have a problem with an update rule: it is an update loop on a DSO. In the start routine I do 3 SELECTs FOR ALL ENTRIES in DATA_PACKAGE on the active table of another structure; then I read those tables to update some values in my update rule.
    I did some tests and it seemed to work well (I tried it, for example, for just one plant), but when I launched it for all the records in the DSO the result was different and, for the same plant, the values were not updated correctly (they were blank).
    The routine is really the same, so it seems strange to me that launching the InfoPackage without filters does not give the same correct result as my previous test, but I was wondering what the reason for this error could be...
    Anyone can help?
    The start routine is this:
      REFRESH i_tab.
      " FOR ALL ENTRIES selects every row when the driver table is empty,
      " so guard against an empty DATA_PACKAGE first.
      IF NOT DATA_PACKAGE[] IS INITIAL.
        SELECT field1 field2 field3
          INTO TABLE i_tab
          FROM target_dso
          FOR ALL ENTRIES IN DATA_PACKAGE
          WHERE deliv_numb = DATA_PACKAGE-deliv_numb
            AND deliv_item = DATA_PACKAGE-deliv_item
            AND act_gi_dte <> l_blank_date.
      ENDIF.
    Then I read this table in the other routines...

    It is hard to say. What does the update rule look like?
    After the READ statement, you could check the return code. If it is not zero, go into an infinite loop, and debug it via SM37:
    READ TABLE ....
    IF sy-subrc <> 0.
      WHILE 1 = 1.
        " Debug in SM37.
      ENDWHILE.
    ENDIF.

  • New TC as router locking up with high data streams

    I bought a new 1TB Time Capsule to replace the router (wireless) supplied by my ISP because I thought it might be faster, and it would of course provide a way to back up all my data. Setup was flawless. Then it really started acting up. When we started streaming a video it would play well, then it would stop. I'd go over to the TC and it would be flashing amber. The AirPort Utility that I set up on my Windows Vista PC would pop up and say "the device is reporting problems". I would have to go through the setup process and things would be normal again for a while. I eventually just had to put the old wireless router back on. I've been reading about how to use the Time Capsule as a wireless device without using it as the router, but still be able to use it for backup, but I don't understand. So, first, why won't it work as a router without so many problems? Second, how can I use it as more than a $300 paperweight?

    It was a surprisingly simple solution that Apple was able to help me with via their "Express Lane" service. I only needed to change channels in the 2.4 GHz band (and leave the 5 GHz channel alone). I restarted after a couple of things they helped me tune up, and my TC has not had a single hiccup since. Thanks, Apple.

  • There is something wrong with the volume buttons on my MacBook Pro; every time I press the one that raises the volume, it takes me to a screen (I do not know what it is called) where the background is black with the date and time and a calculator.

    There is something wrong with the volume buttons on my MacBook Pro. Every time I press the one that raises the volume, it takes me to a screen (I do not know what it is called) where the background is black with the date and time and a calculator. However, when I lower the volume, my Safari tab goes off the screen. What do you guys think I should do? I'm getting very nervous.

    hey HAbrakian!
    You may want to try using the information in this article to adjust the behavior of your function keys to see if that resolves the behavior:
    Mac OS X: How to change the behavior of function keys
    http://support.apple.com/kb/ht3399
    Take care, and thanks for visiting the Apple Support Communities.
    -Braden

  • I am receiving bills from my carrier with very high data usage. I read books from the Apple Store. Do iBooks use GB once purchased?

    I am receiving bills from my carrier with very high data usage. I read books from the Apple Store. Do iBooks use GB once purchased?

    To reduce data usage, you should put the iPad in Aeroplane Mode to stop all background activity when you are not using it.

  • Select data from database tables with high performance

    hi all,
    how do I select data from different database tables with high performance?
    I'm using FOR ALL ENTRIES instead of inner joins; even so, the load on the database tables is getting very high (90% in SE30).
    How can I increase the performance?
    Kindly reply.
    Thanks

    Also check that you are not overusing Open SQL features such as DISTINCT, ORDER BY, and GROUP BY; use ABAP techniques on internal tables to achieve the same results.
    Also, don't use SELECT ... ENDSELECT.
    If possible, use the UP TO n ROWS clause;
    that will limit the database hits.
    Also, don't run a SELECT inside any loops.
    I guess these are some of the tricks you can use to avoid frequent database hits and reduce the database load.

  • How to avoid losing data when communicating with a high speed motor?

    I connect to a high speed servo motor via RS232. To avoid losing data, I thought I would set up a receive buffer and only read the buffer once it has collected all the bytes. Is this possible?

    Hi,
    If you know the number of bytes you are trying to read, you can set a viRead call to return information once the particular number of bytes have been read.  For more information on this, take a look at the KnowledgeBase article on a Serial VISA Read to read a requested number of bytes. 
    Even if you read before all bytes have been collected, you should not lose data.  When the specified number of bytes are stored in the buffer, the viRead call will send the information to the program, and new data coming in will be stored in the buffer until the byte count is reached again.
    I hope this helps,
    Lauren L.
    Applications Engineering
    National Instruments
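
    The same read-until-byte-count idea can be sketched in plain Java, assuming your serial library exposes the port as an InputStream; the class and method names here are hypothetical (VISA does this counting for you natively):

      import java.io.IOException;
      import java.io.InputStream;

      public final class FixedLengthReader {
          // Blocks until exactly 'count' bytes have been collected, mirroring
          // a viRead call configured to return after a requested byte count.
          public static byte[] readExactly(InputStream in, int count) throws IOException {
              byte[] buf = new byte[count];
              int filled = 0;
              while (filled < count) {
                  int n = in.read(buf, filled, count - filled); // blocks for more data
                  if (n < 0) {
                      throw new IOException("stream closed after " + filled + " bytes");
                  }
                  filled += n;
              }
              return buf; // returned only once all bytes are in
          }
      }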

  • Is there a Translate AppId with high volume?

    I am using the Translate APIs and I encounter some exceptions like:
    TranslateApiException: AppId is over the quota : ID=1035.V2_Soap.Detect.30AB79E9
    or
    TranslateApiException: IP is over the quota
    It seems I am calling the service too fast, so I am wondering if I can get an AppId with a higher volume quota. It is fine if I need to pay for it. Thanks.

    Thank you for your question
    There are service limits in place to allow for fairness among all our users:
    You are currently able to translate a maximum of 10000 characters per request, and we recommend keeping each request between 2000 and 5000 characters to optimize response times.  
    The hourly limit is 20 million characters, the daily limit is 480 million characters.
    There is no limit to the number of requests per minute. 
    The Translator API is available through Windows Azure Marketplace (www.aka.ms/TranslatorADM) as a monthly subscription model. For all paid tiers, you can choose to enable the Auto-refill feature, which allows Marketplace to automatically re-subscribe you to the same monthly tier if you prematurely exhaust your monthly volume limit.
    Thanks,
    Tanvi Surti
    Program Manager, Microsoft Translator
    Microsoft Translator team - www.microsoft.com/Translator
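
    To stay inside the per-request limit described above, a client can split long text into chunks before calling the service. Here is a minimal, hypothetical sketch of that splitting step in Java (the Translator call itself is omitted; the 5000-character cap follows the recommended range):

      import java.util.ArrayList;
      import java.util.List;

      public final class ChunkedTranslation {
          private static final int MAX_CHUNK = 5000; // per-request size from the answer above

          // Splits text into pieces of at most MAX_CHUNK characters, preferring
          // to break at a space so words are not cut in half.
          public static List<String> split(String text) {
              List<String> chunks = new ArrayList<>();
              int start = 0;
              while (start < text.length()) {
                  int end = Math.min(start + MAX_CHUNK, text.length());
                  if (end < text.length()) {
                      int lastSpace = text.lastIndexOf(' ', end);
                      if (lastSpace > start) {
                          end = lastSpace; // break at a word boundary when possible
                      }
                  }
                  chunks.add(text.substring(start, end));
                  start = end;
              }
              return chunks;
          }
      }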

  • I have an HTC 8M with no data connection other than WiFi??? Any advice?

    I have an HTC 8M with no data connection other than WiFi??? Any advice?

    Try removing then reinserting the SIM card; it may just not be connecting fully. If that doesn't work, the SIM card may be corrupt (it happens), and you can get a new one at no charge at a Verizon corporate store, or by calling customer service (800) 922-0204.

  • Oracle Business Intelligence with big data

    Has anyone implemented OBIEE 10g utilizing a denormalized data model for a very large transactional data set? There is a potential to generate reports in the 100s of millions of rows, with the data warehouse storing fact tables that have ~1.5 billion rows with a data consumption rate of over 200GB per month.
    Does anyone have any best practices, tips, or advice to determine the feasibility of implementing OBIEE 10g for such a large volume? The data is transactional and there are no current requirements for aggregate data sets. Is it feasible to use OBIEE Answers to generate these types of reports? Thus far I've seen OBIEE Answers hanging/crashing on reports that are > 10MB in size. Any configuration tips or feedback would be appreciated.
    Thanks,
    John

    I think that with a Big Data environment you need not worry about caching, runtime aggregation, and processing if your configuration is right. The hardware, along with distributed processing, would take care of most of the load, since Big Data databases are designed to be in-memory and highly responsive.
    The thing you should consider is that the final output should not be too large. What I mean is, your request can process 100 million rows in the database, but the final output can be 1,000 rows, since BI is all about summarization.
    If large data sets are thrown at the Presentation Server, it could make the presentation service unstable.

  • How do I create an interactive PDF file with variable data

    We would like to basically do a 'mail merge' of our list of customers with an interactive PDF file (including videos, menus, etc - not just form fill out and web links) to create a single PDF file that contains multiple mail pieces ... one for each customer ... with each mail piece being customized for that customer.  Customizations would include different greetings (Dear Bob, Dear Dana, etc), as well as different charts based on data unique to the customer, different photographs, etc.
    I've seen that InDesign and Acrobat Professional can be used to create an interactive PDF (such as from http://tv.adobe.com/watch/ask-the-adobe-ones/14-calling-rufus-about-interactive-pdf-making). However, I don't understand how I can insert data from a database, CSV file, Excel file, etc. into the PDF file so that each page, or each set of pages, within the PDF can be customized.
    Can anyone point me to a tool to use for this?
    Thanks,
    Bob Kendall

    For that kind of volume and unattended operation, you want InDesign Server – which is the server/high volume edition of INDD.
