Need advice on large file processing with good performance

Hi All,
I am working on a program in which I have to read millions of records from application server file.For this, I am reading 1 million records each time and uploading into the DB-table.
what is the best approach to process the millions of records.what I am currently doing is, I read 1 million records one by one , modify each and every record based on some conditions and store them in one internal table and update the DB table.
I am also thinking alternate approach is,read 1 million into one internal table and after that within the loop, modify each and every records for given condition and update the DB table.
which approach is the best one?
you can advise me any other alternate approches with good performance.
Regards,
Nivas
Edited by: Nivas4081 on Jul 24, 2008 2:55 PM
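
A minimal sketch of the package-wise pattern in question: read a fixed-size block of records, modify each one, write the block to the database in a single batch, and clear the buffer before the next block. The original program is ABAP; this illustration uses Java with JDBC, and the connection URL, file layout, table, and transformation are placeholder assumptions:

import java.io.BufferedReader;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BlockLoader {
    private static final int PACKAGE_SIZE = 1_000_000; // records per DB round trip

    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection("jdbc:h2:mem:demo"); // placeholder URL
             PreparedStatement ins = con.prepareStatement(
                     "INSERT INTO target_table (payload) VALUES (?)");         // placeholder table
             BufferedReader in = Files.newBufferedReader(Paths.get("records.txt"))) {
            con.setAutoCommit(false);
            String line;
            int buffered = 0;
            while ((line = in.readLine()) != null) {
                ins.setString(1, transform(line)); // condition-based modification per record
                ins.addBatch();
                if (++buffered == PACKAGE_SIZE) {  // flush one full package
                    ins.executeBatch();
                    con.commit();
                    buffered = 0;                  // the "clear the internal table" step
                }
            }
            if (buffered > 0) {                    // flush the final partial package
                ins.executeBatch();
                con.commit();
            }
        }
    }

    private static String transform(String record) {
        return record.trim(); // stand-in for the modifications described in the post
    }
}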

Hi Joshi,
Thanks for your reply reply. I have tested both ways as I mentioned in my query but reading record by reocrd and update data packets takes less time than reading into iternal table,then modify and update the DB table.
Hi Ralph,
Thanks for the reply.
The modifications are similar in all the lines. I get related data from another class/method, do some calculations, and modify each record.
Are there any performance tricks to be followed when processing a large amount of data? By the way, I am reading a certain number of records at a time, say 400K, and updating the DB table using parallel processing.
Apart from this, any suggestions?
Regards,
Nivas
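
The parallel variant mentioned above follows the same shape: cut the data into packages of roughly 400K records and let a small pool of workers write them concurrently, each in its own transaction. A hedged Java sketch (the pool size and the writePackage body are assumptions; in an ABAP program this role is typically played by asynchronous RFC tasks):

import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ParallelLoader {
    private static final int PACKAGE_SIZE = 400_000; // ~400K records per task, as in the post

    public static void load(List<String> records) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(4); // worker count is an assumption
        for (int from = 0; from < records.size(); from += PACKAGE_SIZE) {
            List<String> pkg = records.subList(from,
                    Math.min(from + PACKAGE_SIZE, records.size()));
            pool.submit(() -> writePackage(pkg)); // one independent DB transaction per package
        }
        pool.shutdown();                          // stop accepting work, let tasks finish
        pool.awaitTermination(1, TimeUnit.HOURS);
    }

    private static void writePackage(List<String> pkg) {
        // stand-in: open a connection, batch-insert pkg, commit (see the JDBC sketch above)
    }
}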

Similar Messages

  • Large file processing in file adapter

    Hi,
    We are trying to process a large file of ~280 MB and are getting timeout errors. I have applied all the required tuning for memory and heap sizes and the problem still exists. I want to know whether installing a decentral adapter engine just for this large file processing might solve the problem, which I doubt.
    Based on my personal experience, there may be a file-size limit in XI of around 100 MB with minimal mapping and no BPM.
    Any comments on this would be appreciated.
    Thanks
    Steve

    Dear Steve,
    This might help you,
    Topic #3.42
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/70ada5ef-0201-0010-1f8b-c935e444b0ad#search=%22XI%20sizing%20guide%22
    /people/sap.user72/blog/2004/11/28/how-robust-is-sap-exchange-infrastructure-xi
    This sizing guide and its memory calculations will be useful for dealing further with this issue.
    http://help.sap.com/bp_bpmv130/Documentation/Planning/XISizingGuide.pdf#search=%22Message%20size%20in%20SAP%20XI%22
    File Adapter: size of your processed messages
    Regards
    Agasthuri Doss

  • Large file folder with question mark is flashing on screen.

    Just turned on computer for the first time and a large file folder with a question mark is flashing on the screen. I am unable to do anything and have restarted and it does the same thing.  Please help!

    It does this when it can't see the operating system. You didn't give much information but I'm assuming it's a brand new mini and you aren't using any external drives for the operating system. If that's the case then yes, reinstall OS X. Put your install DVD into the optical drive and reboot. As soon as you hear the chime, hold down the "C" key on your keyboard (or the Option key until the Install Disk shows up). That will force your Mac to boot from the install DVD in the optical drive.
    If reinstallation isn't possible, or if the problem continues after reinstallation, then you need to exchange it for another, because something is keeping the operating system on the drive from being read.

  • Functionality of Files Processed with Photoshop CC

    Adobe has stated that Photoshop CC users will have access to all files saved on their computers.
    What will be the functionality of files processed with Photoshop CC and saved on my computer if I no longer subscribe to Photoshop CC?
    Adobe will introduce new functions such as Camera Shake Reduction to Photoshop CC.  Let's say that I use a new function on a layer or as a Smart Object, and then save it on my computer, and then end my subscription to Photoshop CC.  Will I be able to open the file with my layers intact and editable in any program other than Photoshop CC?
    If I save the files with the layers merged as TIFFs or JPEGs I assume that I will be able to open them with most photo programs. Let's say that I do not save merged files and I am willing to sacrifice the layers.  Will Photoshop CC be required to merge the layers and save as files that can be edited in other programs?  What happens with other file types such as PSDs?
    Let's say I own CS6 or previous versions with perpetual licenses.  Will I be able to open and edit files in CS6 with layers created in Photoshop CC?

    Eric, you need to clear the URL field before pasting another into it:
    http://davecrossworkshops.com/2013/05/10/moving-files-backwards-photoshop-cc/
    [EDIT]
    Having read the link, it is not quite the disaster Dave Cross makes it out to be. While the new features would not be directly editable in CS6 or earlier, you could absolutely still use the image data in those layers and make further edits using the features and tools in whatever copy of Photoshop you were using. The Shape layer that used the new radius corners tool in CC is still a shape layer, and even if it had been rasterised, you could select it, turn that selection into a path, and edit that path.
    I am not so sure about the ACR filter layer. I'd like to think that the results of using that adjustment layer would be apparent in whatever it had been applied to in Photoshop CC, so you'd still have the benefit of the edit. Dave made that overly complex by nesting the ACR layer in a smart object, and I could only guess at the consequences of that. But the bottom line is that you could still open a PSD file created in CC and have all the layers and most of the information. You might even have the outcome of all or most of the edits made in CC. If you are worried, put a copy-merged layer at the top of the stack while CC is still working for you, or just keep subscribing to what will continue to be the world's best raster image editing program.

  • Need to get large file sizes down to downloadable sizes - how?

    I am using Compressor 4 and have several professional videos I produced with Final Cut Studio in QuickTime/.MOV format using ProRes 422. They are all 720p and, of course, all have large file sizes. I need to convert them so they are compatible with iTunes as well as Apple devices (Apple TV, iPad/iPod/iPhone, etc.). They also need to be downloadable in a reasonable time, and I of course want to maintain a quality image. Several of the videos are about an hour long, some a little more.
    I am unable to get most of these files small enough for practical download while still maintaining acceptable quality. For example:
    I have one video (QuickTime, rendered from FCPS) that runs 54 minutes: 1280x720, ProRes 422, Linear PCM, with a bit rate of 72.662. The best I seem to be able to get out of Compressor is the setting options for "HD for Apple Devices (5 Mbps)", which gives me a file that's just over 2 GB. Not bad, and the quality is still very good, but the file size is just too big for people to download. I get reports of people seeing estimates of 77 hours to download this one file.
    I did try the setting under Video Sharing Services / Large 540p Video Sharing just to see what it would give me, but it really wasn't that much smaller and the quality was noticeably degraded.
    I'm confused about what to do here. I have downloaded things from iTunes before that are good quality and about an hour long, and their file sizes are simply not this big.
    Can anyone give me some advice? Thank you. Grateful for anything you can offer.
    MC

    If you're not concerned with precise average bitrate, use h.264 codec, set bitrate to "automatic" and use the quality slider. Be sure to have "keyframes" on "automatic"; every 24 frames or so makes h.264 quite inefficient and is only useful for live-streaming, where you want people to be able to resynchronise as quickly as possible. Having multi-pass enabled might help, but I'm not sure what it does when using constant-quality encoding.
    Do a test encoding of a representative clip of maybe about two minutes or so, play with the quality slider, and see how far down you can go and still be satisfied with the quality. Using "automatic" bitrate uses as many bits to encode each frame as is necessary to get the required quality, and file sizes will vary depending on the type of material. Complex textures and high motion will use more bits; still frames and slow pans will use very little.
    A light chroma-channel denoising may help improve perceived quality, as may very light sharpening in case you scaled down the material.
    If you're still not happy with the bitrate vs. quality tradeoff, try reducing the resolution, but generally you should get 720p down to around 2.5 to 3.0 Mbps with adequate quality.
    On the other hand, if someone needs 77 hours to download 2 GB, they have another problem anyway. That sounds like a 56k modem connection. What's he doing trying to download HD content anyway?
    Bernd

  • Need help exporting large file to bitmap format

    I have a large file that I need to export to a bitmap format, and I am running into unknown "IMer" errors plus memory and file-size limitations.
    The file contains only one artboard, around 8000 by 6000 pixels. I can successfully export a PNG at 72 dpi, giving me a 100% image of my AI file. However, I need to export an image at 200% of the AI file. If I export as a PNG at 144 dpi to achieve this, I get the following errors:
    "The operation cannot complete because of an unknown error (IMer)" or "Could not complete this operation. The rasterised image exceeded the maximum bounds for saving to the PNG format"
    I have tried exporting as TIFF, BMP, and JPEG, but all give "not enough memory" or "dimensions out of range" errors.
    Does anyone know the limitations with AI for saving large files to bitmap format?
    Does anyone have a workaround I could try?

    You could try printing to Adobe Postscript files.
    This will actually save a .ps file and you can scale it to 200% when doing so. You'll need to "print" for every image change.
    Then open a Photoshop document at your final size (16000x12000 pixels) and place or drag the .ps files into the Photoshop document.
    I just tested this and it seems to work fine here. But, of course, I don't have your art, so complexity of objects might factor in and I'm not seeing that here.
    And you are aware that Photoshop won't allow you to save a PNG at those dimensions either, right? The image is too large for the PNG format. Perhaps you could explain why you specifically need a PNG that large?

  • Need to link 2 files, one with a numeric field and one with an alphanumeric

    I need to link 2 files in my report by employee number. In the employee file it is an 8-character numeric field, like 1505087. In the GL file, it is an 8-character text field, like '01505087'. I have done a TONUMBER on the employee number. Where do I link these files? I have done it in my data selection, but the report now takes hours to run. The report does work. Any suggestions for linking the files together?
    Thanks,
    Rick

    Hi Rick,
    Linking numeric and non-numeric fields at report level takes a long time because for each detail record the report has to convert your non-numeric field to numeric and then link.
    To avoid this, try using the Add Command for your non-numeric table and convert the field to numeric in free-hand SQL itself, for example:
    SELECT gl_account, CAST(emp_no AS INTEGER) AS emp_no FROM gl_file  -- column and table names here are placeholders; CAST is the ANSI form, and the exact conversion function depends on your database
    Now both tables have numeric fields and you can link them. It will not take much time to retrieve records.
    Thanks,
    Sastry

  • Airport Extreme, Airdisk, and Vista - large file problem with a twist

    Hi all,
    I'm having a problem moving large-ish files to an external drive attached to my Airport Extreme SOMETIMES. Let me explain.
    My system - MacBook Pro, 4 GB RAM, 10 GB free HD space on the MacBook, running the latest updates for Mac and Vista; the external hard drive on the AE is an internal WD in an enclosure with 25 GB free (it's formatted for PC, but I've used it directly connected to multiple computers, Mac and PC, without fail); the AE is on firmware 7.3.2, and I'm only allowing 802.11n. The problem occurs on the Vista side - I haven't checked the Mac side yet.
    The Good - I have BitTorrent set up, using uTorrent, to automatically copy files over to my AirDisk once they've completed. This works flawlessly. If I connect the hard drive directly to my laptop (MacBook running Boot Camp with Vista, all updates applied), large files copy over without a hitch as well.
    The Bad - For the past couple of weeks (could be longer, but I've only just noticed it being a problem - is that a firmware clue?), if I download the files to my Vista desktop and copy them over manually, the copy just sits for a while and eventually fails with a "try again" error. If I try to copy any file over 300 MB, the same error occurs.
    What I've tried - well, not a lot. The first thing I did was make sure my hard drive was error free and worked when physically connected - it is and it did. I've read a few posts about formatting the drive for Mac, but this really isn't a good option for me since all of my music is on this drive and iTunes pulls from it (which also works without a problem). I do, however, get the hang in iTunes if I try to import large files (movies). I've also read about going back to an earlier AE firmware, but the posts were outdated. I can try the Mac side and see if large files move over, but again, I prefer to do this in Windows Vista.
    This is my first post, so I'm sure I'm leaving out vital info. If anyone wants to take a stab at helping, I'd love to discuss. Thanks in advance.

    Hello,
    Just noticed the other day that I am having the same problem. I have two Vista machines attached to a TC with a Western Digital 500 GB drive attached via USB. I can write to the TC (any file size) with no problem; however, I cannot write larger files to the attached WD drive. I can write smaller folders of music and such, but I cannot back up larger video files. Yet if I attach the drive directly to my laptop I can copy over the files, no problem. I could not find any setting in the AirPort Utility with regard to file size limits or anything of the like. Any help on this would be much appreciated.

  • Large file processing in XI 3.0

    Hi,
    We are trying to process a large file of ~280 MB and are getting timeout errors. I have applied all the required tuning for memory and heap sizes and the problem still exists. I want to know whether installing a decentral adapter engine just for this file processing might solve the problem, which I doubt.
    Based on my personal experience, there may be a file-size limit in XI of around 100 MB with minimal mapping and no BPM.
    Any comments on this would be appreciated.
    Thanks
    Steve

    Hi Debnilay,
    We do have a 64-bit architecture and still have the file processing problem. Currently we are splitting the file into smaller chunks and processing them, but we want to process the file as a whole.
    Thanks
    Steve

  • File Splitting for Large File processing in XI using EOIO QoS.

    Hi
    I am currently working on a scenario to split a large file (700 MB) using the sender file adapter's "Recordset Structure" property (e.g. Row, 5000). As the files are split and mapped, they are appended to a destination file. In an example scenario, a 700 MB file comes in (say with 20000 records); the destination file should then have 20000 records.
    To ensure no records are missed while processing through XI, EOIO QoS is used. A trigger record is appended to the incoming file (the trigger record structure is the same as the main payload recordset) using a UNIX shell script before the file is read by the sender file adapter.
    XPATH conditions are evaluated in the receiver determination to either append the records to the main destination file or create a trigger file containing only the trigger record.
    The problem we are facing is that the "Recordset Structure" (e.g. Row, 5000) splits in chunks of 5000, and when the remaining records of the main payload number fewer than 5000 (say 1300), those remaining 1300 lines get grouped with the trigger record and written to the trigger file instead of the actual destination file.
    For the sake of this forum I have listed a sample XML file representing the inbound file, with the last record (Duns = "9999") as the trigger record that marks the end of the file after splitting and appending.
    <?xml version="1.0" encoding="utf-8"?>
    <ns:File xmlns:ns="somenamespace">
    <Data>
         <Row>
              <Duns>"001001924"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001925"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001926"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001927"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001928"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001929"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"9999"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
    </Data>
    </ns:File>
    In the sender file adapter I have, for test purposes, changed the "Recordset Structure" to "Row,5" for the sample inbound XML file above.
    I have two XPATH expressions in the receiver determination to take the last recordset, with Duns = "9999", and send it to the receiver (communication channel) that creates the trigger file.
    In my test case the first 5 records get appended to the correct destination file, but the last two records (the 6th and 7th) get sent to the receiver channel that is only supposed to take the trigger record (the last record, with Duns = "9999").
    Destination file (this is where all the records with Duns NE "9999" are supposed to get appended):
    <?xml version="1.0" encoding="UTF-8"?>
    <R3File>
         <R3Row>
              <Duns>"001001924"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
         <R3Row>
              <Duns>"001001925"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
         <R3Row>
              <Duns>"001001926"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
               <Extract_Code>"A"</Extract_Code>
         </R3Row>
          <R3Row>
              <Duns>"001001927"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
          <R3Row>
              <Duns>"001001928"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
    </R3File>
    Trigger File:
    <?xml version="1.0" encoding="UTF-8"?>
    <R3File>
          <R3Row>
              <Duns>"001001929"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Ccr_Extract_Code>"A"</Ccr_Extract_Code>
         </R3Row>
          <R3Row>
              <Duns>"9999"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Ccr_Extract_Code>"A"</Ccr_Extract_Code>
         </R3Row>
    </R3File>
    I've tested the XPATH condition in XMLSpy and it works fine. My doubts are about the "Recordset Structure" property set as "Row,5".
    Any suggestions on this would be very helpful.
    Thanks,
    Mujtaba
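
    For what it's worth, the behaviour described is exactly what fixed-size chunking produces: records are grouped strictly in arrival order, so the appended trigger record always lands in whatever partial chunk remains at the end. A toy Java illustration of the grouping (record values are placeholders):

    import java.util.List;

    public class ChunkDemo {
        public static void main(String[] args) {
            List<String> records = List.of("r1", "r2", "r3", "r4", "r5", "r6", "TRIGGER");
            int chunkSize = 5; // mirrors "Recordset Structure" = Row,5
            for (int from = 0; from < records.size(); from += chunkSize) {
                List<String> chunk = records.subList(from,
                        Math.min(from + chunkSize, records.size()));
                System.out.println(chunk); // prints [r1..r5], then [r6, TRIGGER]
            }
        }
    }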


  • java.io.IOException during large file processing on PI 7.1

    Hello Colleagues,
    for a large-file scenario on our PI 7.1 system we have to verify what file sizes we are able to process through PI.
    While handing the large file (200 MB XML) over from the Adapter Framework (file adapter) to the Integration Engine, we receive the following error:
    Transmitting the message to endpoint http://<host>:<port>/sap/xi/engine?type=entry using connection File_http://sap.com/xi/XI/System failed, due to: com.sap.engine.interfaces.messaging.api.exception.MessagingException: Error transmitting the message over HTTP. Reason: java.io.IOException: Error writing to server.
    Message processing stops and the message remains in the Adapter Framework. Large files up to 100 MB we are able to process successfully.
    Could you please tell me why this happens and how we can solve it?
    Since it is an IOException and not a java.lang.OutOfMemoryError, could it still be a memory issue?
    Many thanks in advance!
    Regards,
    Jochen

    Hi Jochen,
    Indeed the error is an IO error, and it occurs because the adapter engine was not able to send the message to the Integration Server. It happens due to memory/heap size issues.
    Look at these threads; they deal with the same problem. Please try the remedy measures suggested in them:
    Mail to Proxy scenario with attachment. AF channel error.
    Error with huge file
    problem with big file in file-to-proxy scenario
    Is there any additional information in the adapter messaging tool?
    Regards
    Suraj
    Edited by: S.R.Suraj on Oct 1, 2009 8:55 AM

  • Bottleneck in Large file processing

    Hi,
    We are experiencing timeout and memory issues in large file processing. I want to know whether the J2EE Adapter Engine or the Integration Engine is the bottleneck when processing large messages, such as files over 300 MB, without splitting them.
    Thanks
    Steve

    Hi Mario,
    We are testing a scenario to find out the maximum file size that XI can handle, based on the blog
    ( /people/william.li/blog/2006/09/08/how-to-send-any-data-even-binary-through-xi-without-using-the-integration-repository ), without any mapping. Up to 20 MB it works OK; after that we get a timeout error.
    Data from Moni:
    com.sap.engine.services.httpserver.exceptions.HttpIOException: Read timeout. The client has disconnected or a synchronization error has occurred. Read [1704371] bytes. Expected [33353075].
         at com.sap.engine.services.httpserver.server.io.HttpInputStream.read(HttpInputStream.java:186)
         at com.sap.aii.af.service.util.ChunkedByteArrayOutputStream.write(ChunkedByteArrayOutputStream.java:181)
         at com.sap.aii.af.ra.ms.transport.TransportBody.<init>(TransportBody.java:99)
         at com.sap.aii.af.ra.ms.impl.core.transport.http.MessagingServlet.doPost
    This could be due to ICM timeout settings which we are planning to increase.
    I would like to hear about others' experience with the maximum file size they could process. Of course, I do know that it depends on the environment.
    Thanks
    Steve

  • Need help in setting file name with special characters in attachment

    Hi
    We have a requirement where we need to set a file name that contains special characters (Russian, for example) and send mail using JavaMail.
    If we set the file name as such, the attachment in the email has a garbled filename.
    Can you please let me know how to resolve this?
    We have to use the file name as the attachment name, and it will contain special characters. The receiver who gets the mail should see the correct attachment name.
    One important point: the attachments are opened from MS Outlook.
    Thanks and regards
    Ram
    Edited by: 884910 on 13 Sep, 2011 5:00 AM

    Read the FAQ carefully. You don't need to call encodeText unless you're using a really old version of JavaMail.
    And it depends on whether the mail reader you're using handles encoded parameters according to the (new) MIME standard or according to the (old) non-standard hack.
    Sadly, without knowing what mail reader the recipient is using, it's impossible to use encoded filenames that will work everywhere.
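
    A hedged sketch of the two options mentioned above, using the JavaMail system properties that control filename encoding (mail.mime.encodeparameters for the RFC 2231 standard, mail.mime.encodefilename for the older encoded-word hack that Outlook historically expected). The session setup, addresses, and filename are placeholders:

    import java.io.File;
    import java.util.Properties;
    import javax.activation.DataHandler;
    import javax.activation.FileDataSource;
    import javax.mail.Message;
    import javax.mail.Session;
    import javax.mail.Transport;
    import javax.mail.internet.InternetAddress;
    import javax.mail.internet.MimeBodyPart;
    import javax.mail.internet.MimeMessage;
    import javax.mail.internet.MimeMultipart;

    public class AttachmentNameDemo {
        public static void main(String[] args) throws Exception {
            // RFC 2231 parameter encoding (the "new" MIME standard):
            System.setProperty("mail.mime.encodeparameters", "true");
            // Encoded-word filenames (the "old" non-standard hack many mail
            // readers, including older Outlook versions, expect):
            System.setProperty("mail.mime.encodefilename", "true");

            Session session = Session.getInstance(new Properties()); // SMTP config omitted
            MimeMessage msg = new MimeMessage(session);
            msg.setFrom(new InternetAddress("sender@example.com"));  // placeholder address
            msg.setRecipients(Message.RecipientType.TO, "receiver@example.com");
            msg.setSubject("Attachment with non-ASCII name", "UTF-8");

            MimeBodyPart attachment = new MimeBodyPart();
            File file = new File("отчёт.pdf");                       // Russian filename
            attachment.setDataHandler(new DataHandler(new FileDataSource(file)));
            attachment.setFileName(file.getName()); // encoded per the properties above

            MimeMultipart content = new MimeMultipart();
            content.addBodyPart(attachment);
            msg.setContent(content);
            Transport.send(msg); // needs real SMTP settings to actually send
        }
    }

    Normally only one of the two properties should be set; which one depends on the recipient's mail reader, exactly as the answer above says.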

  • Large file uploads with Jersey client

    Hello All,
    I am getting a heap space issue while uploading large files via the Jersey client; the JVM runs out of memory.
    Can anyone help me understand how I can send a large file of about 300 MB via the Jersey client?
    The code that I have written is:
    WebResource webResource = client.resource(url);
    MultiPart multiPart = new MultiPart();
    FileDataBodyPart filePart = new FileDataBodyPart("file", file,
            MediaType.APPLICATION_OCTET_STREAM_TYPE);
    multiPart.bodyPart(filePart);
    multiPart.bodyPart(new FormDataBodyPart("code", code));
    ClientResponse response = webResource.type(MediaType.MULTIPART_FORM_DATA)
            .post(ClientResponse.class, multiPart);
    Thanks

    Thanks for your reply, but I have tried it using:
    client.setChunkedEncodingSize(10000);
    WebResource webResource = client.resource(url);
    // file to multipart request
    MultiPart multiPart = new MultiPart();
    FileDataBodyPart filePart = new FileDataBodyPart("file", file,
            MediaType.APPLICATION_OCTET_STREAM_TYPE);
    multiPart.bodyPart(filePart);
    multiPart.bodyPart(new FormDataBodyPart("code", code));
    ClientResponse response = webResource.type(MediaType.MULTIPART_FORM_DATA)
            .post(ClientResponse.class, multiPart);
    Now the files are easily sent chunked, but the problem is that I am no longer able to receive the value of code.
    On the other server, where I am sending this file along with code, I have:
    public void upload(@RequestParam("file") MultipartFile uploadedFile,
                       @RequestParam("code") String code,
                       Writer responseWriter) throws IOException {
        // method body omitted in the original post
    }
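
    One thing worth trying, as a hedged sketch: Jersey 1.x also provides FormDataMultiPart, which writes an explicit Content-Disposition name for every part, so the "code" field keeps its form-field name when the request is streamed. Whether this resolves the lost parameter under chunked encoding is an assumption to verify; class and parameter names follow the snippets above:

    import java.io.File;
    import javax.ws.rs.core.MediaType;
    import com.sun.jersey.api.client.Client;
    import com.sun.jersey.api.client.ClientResponse;
    import com.sun.jersey.api.client.WebResource;
    import com.sun.jersey.multipart.FormDataMultiPart;
    import com.sun.jersey.multipart.file.FileDataBodyPart;

    public class ChunkedUpload {
        public static ClientResponse upload(Client client, String url, File file, String code) {
            client.setChunkedEncodingSize(10000);  // stream the body instead of buffering it
            WebResource webResource = client.resource(url);
            FormDataMultiPart multiPart = new FormDataMultiPart()
                    .field("code", code);          // named form field
            multiPart.bodyPart(new FileDataBodyPart("file", file,
                    MediaType.APPLICATION_OCTET_STREAM_TYPE));
            return webResource.type(MediaType.MULTIPART_FORM_DATA)
                    .post(ClientResponse.class, multiPart);
        }
    }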

  • Copy a small version into a larger file - problem with typing

    Hi,
    I have pasted a copy of a Captivate file into a file that is 10 px larger on each side. Now I have the problem that the typing animation is not in the correct place. Take a look:
    the original:
    the pasted version in the 10px larger file:
    Is this a known problem? Is there a way to fix this?

    Hi there,
    I would suggest that you use a screen capture utility such as Snagit to capture your Request Membership box and then insert this into your Captivate slide as an image. This will then enable you to accurately position the graphic and should address the typing problem.
    Best - Mark
    Visit the macrofireball blog
