Reading huge files

I have a simple Java program that reads huge files and writes them to a database. I am frequently getting an out-of-memory error. What memory arguments do I have to add to make this work?

Without seeing the code that produces the blob, I can't say which options you have. Raising the heap with a memory argument such as -Xmx only postpones the problem; the real fix is to avoid holding the whole file in memory. For example, you could use the setBinaryStream method of PreparedStatement and pass it an InputStream pointing at your data, which then obviously doesn't have to be in memory. A sketch follows below.
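
A minimal sketch of that streaming approach, assuming a table named documents with a name column and a BLOB content column and a made-up JDBC URL; only the setBinaryStream call is the point here.

    // Sketch: stream a large file into a BLOB column without loading it into the heap.
    // Table/column names and the JDBC URL are assumptions for illustration.
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class BlobStreamExample {
        public static void main(String[] args) throws Exception {
            File file = new File(args[0]);
            Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password"); // assumed URL
            try {
                PreparedStatement ps = con.prepareStatement(
                        "INSERT INTO documents (name, content) VALUES (?, ?)");
                InputStream in = new FileInputStream(file);
                try {
                    ps.setString(1, file.getName());
                    // The driver pulls from the stream in chunks; the file never sits in memory.
                    ps.setBinaryStream(2, in, (int) file.length());
                    ps.executeUpdate();
                } finally {
                    in.close();
                    ps.close();
                }
            } finally {
                con.close();
            }
        }
    }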

Similar Messages

  • File Adapter-Problem Reading Huge Files

    Hi,
    Here is the issue I am facing: when reading a large file (a CSV file of up to 6-8 MB), the communication channel configured as a File Adapter with a polling interval of 7 minutes (420 seconds) is inconsistent in reading the complete file. Sometimes it reads the complete 6 MB file, and sometimes it reads only part of it, say 3 MB of the 6 MB. Can this inconsistent behaviour be resolved?
    Your suggestions are highly appreciated.
    Regards
    Pradeep

    Hi Pradeep!
    8 MB is not a huge file for XI; I would call it a small one. Maybe your problem is not the size. Please check whether XI is starting to read the file before it has been completely written to the source folder. If you are creating that CSV file from another application directly in the poll source directory specified in the file adapter, and your poll interval is short, XI could start reading the file while you are still writing it. If that is the case, try writing the file with a different extension or filename than the one specified in the file adapter communication channel, and only rename it to its final filename once it is completely written; then check whether the misbehaviour persists.
    Alternatively, you can write the file to a temp directory and then move it to the XI directory once it is finished (see the sketch below).
    Regards,
    Matias.
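
    A minimal sketch of that write-then-rename idea on the producing side, in plain Java; the paths and the sample record are assumptions for illustration only:

        // Sketch: write the CSV under a temporary name, then rename it to the name the
        // adapter polls for, so the poller never sees a half-written file.
        // Paths and the sample record are made up for illustration.
        import java.io.BufferedWriter;
        import java.io.File;
        import java.io.FileWriter;

        public class SafeFileDrop {
            public static void main(String[] args) throws Exception {
                File tmp = new File("/data/outbound/orders.csv.tmp"); // not matched by the adapter
                File fin = new File("/data/outbound/orders.csv");     // final name the adapter expects

                BufferedWriter out = new BufferedWriter(new FileWriter(tmp));
                try {
                    out.write("4711;2010-05-04;42.00");               // sample record
                    out.newLine();
                    // ... write the remaining records ...
                } finally {
                    out.close();
                }

                // Rename only after the file is fully written and closed.
                if (!tmp.renameTo(fin)) {
                    throw new IllegalStateException("rename failed: " + tmp);
                }
            }
        }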

  • Reading huge flat file through SOAP adapter

    Hi Everybody,
    In one of our interfaces we need to read a big flat file into XI using the SOAP adapter on the sender side, and we are using a Java mapping to convert it into XML. Before that, I need to split this flat file into multiple files in the first message mapping, and in the second mapping we have to write a Java mapping to do the flat-file-to-XML conversion. But I got stuck reading this big flat file into XI, as I need to declare some datatype to read the entire file. Can anybody tell me how I can do this? Is it possible at all with the SOAP adapter?
    Thanks
    raj

    hi vijay,
      Thanks for your prompt reply. Due to some reasons i am not allowed to use file adapter . i can use only JMS adapter or SOAP adapter. we tried few scenarios with JMS content conversion but what ever scenario i am asking here is complex at multilevel i can't even use JMS in this case. so we are thinking to read whole file using SOAP adapter and then we are planning to split the file into multiple files, as file can be huge size ,using java mapping and in next level we want to use another mapping to do content conversion. SO I have to do experiements whether this is a feasible solution or not. because when u declare at sender side
    <ffdata_MT>
    <Recordset>
        <ROW>    String type
    when u declare like this and when u sent the flat file using SOAP adapter at sender side we are getting whole file which we sent at part of "ROW" as string. but inside java mapping i need to see whenther i can split this in XI ,so that i can use these split files in next mapping for content conversion. Hope i am clear now. I want to know whether it is a feasible solution or not.
    I really appreciate if sombody give some idea on this
    Thanks
    raj
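
    A rough sketch of the splitting step in plain Java, outside of any XI mapping API; in a real Java mapping you would read from the stream the mapping framework hands you rather than from a file, and the file names and chunk size here are assumptions:

        // Sketch: split a large flat file into smaller files of a fixed number of lines,
        // so that each piece can be handled by a later mapping step.
        import java.io.BufferedReader;
        import java.io.BufferedWriter;
        import java.io.FileReader;
        import java.io.FileWriter;

        public class FlatFileSplitter {
            public static void main(String[] args) throws Exception {
                int linesPerChunk = 10000;                      // assumed chunk size
                BufferedReader in = new BufferedReader(new FileReader("big-input.txt"));
                try {
                    BufferedWriter out = null;
                    String line;
                    int lineNo = 0;
                    int chunkNo = 0;
                    while ((line = in.readLine()) != null) {
                        if (lineNo % linesPerChunk == 0) {      // start a new chunk file
                            if (out != null) out.close();
                            out = new BufferedWriter(new FileWriter("chunk-" + (chunkNo++) + ".txt"));
                        }
                        out.write(line);
                        out.newLine();
                        lineNo++;
                    }
                    if (out != null) out.close();
                } finally {
                    in.close();
                }
            }
        }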

  • Reading huge XML files in OSB 11gR1 (11.1.1.6.0)

    Hi,
    I want to read a huge XML file, about 1 GB in size, in OSB (11.1.1.6.0).
    I will be creating a (JCA) file adapter in JDeveloper and importing the artifacts into OSB.
    Please let me know the maximum file size that can be handled in OSB.
    Thanks in advance.
    Regards,
    Suresh

    Depends on what you intend to do after reading the file.
    Do you want to parse the file contents and maybe do some transformation? Or do you just have to move the file from one place to another, for example reading from the local system and moving it to a remote system using FTP?
    If you just have to move the file, I would suggest using the JCA File/FTP adapter's Move operation.
    If you have to parse and process the file contents within OSB, then it may be possible, depending on the file type and the logic you need to implement. For example, for very large CSV files you can use JCA File Adapter batching to read a few records at a time (the sketch below illustrates the idea).
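
    To illustrate the batching idea in plain Java (this is not the JCA adapter's API; the batch size, file name and processBatch method are made up for illustration):

        // Sketch: process a large CSV a fixed number of records at a time instead of
        // loading it whole, so memory use stays flat regardless of file size.
        import java.io.BufferedReader;
        import java.io.FileReader;
        import java.util.ArrayList;
        import java.util.List;

        public class BatchedCsvReader {
            static void processBatch(List<String> batch) {
                System.out.println("processing " + batch.size() + " records"); // placeholder
            }

            public static void main(String[] args) throws Exception {
                int batchSize = 500;                            // assumed batch size
                BufferedReader in = new BufferedReader(new FileReader("large.csv"));
                try {
                    List<String> batch = new ArrayList<String>();
                    String line;
                    while ((line = in.readLine()) != null) {
                        batch.add(line);
                        if (batch.size() == batchSize) {
                            processBatch(batch);                // hand off a small slice
                            batch.clear();
                        }
                    }
                    if (!batch.isEmpty()) processBatch(batch);  // last partial batch
                } finally {
                    in.close();
                }
            }
        }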

  • Reading huge flat file in OSB 11gR1

    Hi,
    I want to read a flat file in OSB. The flat file may be fairly large, up to 1 MB.
    As far as I know, OSB provides the following approaches to read a flat file:
    1. JCA (creating a file adapter in JDeveloper and importing the artifacts into OSB)
    2. MFL transformation
    3. Java callout
    Please let me know which is the best way to read the flat file. Also, is there any other way to do the same?
    Thanks in advance.
    Regards,
    Seemant

    Which option is the best one to convert a flat file to XML - the File Adapter or MFL? Well, it's a topic of debate and it usually comes down to your preference. Manoj has explained clearly above why one may prefer the File Adapter over MFL. It also depends on your familiarity with the product. If you are an Oracle developer dealing mostly with BPEL/Mediator, then you will prefer the File Adapter in this situation, even with OSB; but if you have been an OSB developer since the time it was known as ALSB, you will prefer MFL over the adapter.
    It's just a matter of choice and comfort. Remember that in different cases the two solutions may perform differently, so it is better to test them from a performance perspective and then choose.
    "Such flexibility of optional tags can only be handled by MFL." I don't think so. The File Adapter should also be able to handle this use case. Have you checked this:
    http://download.oracle.com/docs/cd/E17904_01/integration.1111/e10231/nfb.htm#CHDDHEAI
    "Also, up to what size of input file is supported by MFL?" That is an unanswerable question; it depends entirely on your system. I personally don't like translating huge files in OSB because it hurts OSB's performance.
    "But I think such a feature will not be supported when the file adapter artifacts are imported and used in OSB 11.1.1.3/11.1.1.4." Correct. From the OSB Developer's Guide:
    25.2.1.1 Oracle JCA Adapter Limitations
    Following are limitations when using some JCA adapters with Oracle Service Bus:
    •Oracle JCA Adapter for AQ – Streamed payload is not supported with Oracle Service Bus.
    •Oracle JCA Adapter for FTP and Files – Attachments (large payload support), pre- and post-processing of files, using a re-entrant valve for processing ZIP files, content streaming, and file chunked read are not supported with Oracle Service Bus.
    http://download.oracle.com/docs/cd/E17904_01/doc.1111/e15866/jca.htm#BABBICIA
    Regards,
    Anuj

  • When I open iPhoto 9.2.1 it says my library needs to be upgraded, but after upgrading the library (33 GB of pictures from 2006) the new application says it cannot read the files; how do I find them on the system to reimport?

    When I open iPhoto 9.2.1 it says that my library needs to be upgraded, but when I upgrade the library I am trying to open (33 GB worth of pictures from 2006), the new application says it cannot read the files. How do I find them on the system to reimport? I'd then like to erase the original files, since the space requirement is huge. Why is upgrading iPhoto such a pain? I've got to get a presentation done and all I get for my money is a roadblock!

    Hello, it sounds like the library is damaged.
    Download iPhoto Library Manager and use its rebuild function. This will create a new library based on data in the albumdata.xml file. Not everything will be brought over - no slideshows, books or calendars, for instance - but it should get all your albums and keywords back.
    Because this process creates an entirely new library and leaves your old one untouched, it is non-destructive, and if you're not happy with the results you can simply return to your old one. 

  • I have had to buy a GRAID 4 TB drive to back up my images, as I am a photographer with huge files. Can I use my old 2 TB Time Capsule as storage for, say, 4 years of photography? My iMac is getting full.

    I am a photographer and have huge files, so I have had to buy a GRAID 4 TB drive for backup, and now I have an old 2 TB Time Capsule not doing anything. Could I use the Time Capsule as storage and transfer, say, a few years of images over to it to free up space on my iMac? I do have the images copied onto other hard drives, but I feel I need more copies in case of external hard drive failure.

    Could I use the Time Capsule as storage and transfer say a few years of images over to that and free up space on my iMac?
    I can't give you a definite answer as I don't use a Time Capsule, but if it is not being used for anything, why not try it out? My guess is you could use it for miscellaneous storage if you wanted to.

  • How to export a video at a moderate format and size in Premiere CC 2014, without huge files as a result

    I have the latest version of Adobe Premiere, CC 2014, but whenever I want to render and export a video that is typically 70 MB in size, I can't always see how big the output file will end up being for each format, and I don't want to end up with a video larger than 250-300 MB just for the very highest quality, even though I want the video to be well polished and of better quality than I usually get. These videos are usually imported as MP4. What MP4 (or other) export settings would turn an imported 70 MB video into an output of no more than around 300 MB, or even less, without compromising video quality? I don't want a huge file, because I am uploading it to my YouTube channel via their web interface on another person's machine; that person doesn't have Premiere and only has a DSL connection. So what output settings should I use for a good-quality video that isn't smaller than the original, but about the same size or a bit larger, and not extremely large, as MOV files and sometimes MP4s can be depending on how they are produced?

    Export
    Format: H.264
    Preset: YouTube (make sure this matches the dimensions of your sequence)
    You could further adjust the Target Bitrate, but the preset is already set up for a good-quality file for upload.
    You can also do further reading on Video Bitrates.

  • Reading text file to table

    Hi,
    I'm using Oracle 9i on one server, and Microsoft IIS 6.0 is running on another server.
    I want to create a stored procedure that reads a log file on the IIS server and inserts the data into a table on the Oracle server.
    I want this procedure to run on a daily basis, updating the table's contents; then I will use Discoverer to create reports.
    How can I do that?

    "The best way to load data from a text file into the database is SQL*Loader." Actually, in 9i the best way might be to use an external table. It is much simpler to use than SQL*Loader because we can just issue a SELECT statement to get the data. SQL*Loader is really only necessary when we have huge amounts of data and performance is critical.
    Cheers, APC

  • Unable to process huge files in SFTP BASED PROXY SERVICE

    Hi,
    I created an OSB project which has an SFTP-based proxy and an SFTP-based business service. After completing the development I tested the file transfer through WinSCP. My SFTP-based proxy is picking up the files, but for huge files I cannot see them in the output directory; even small files only arrive after quite a long time. Please assist me.

    If you enable content streaming then you can't access the file/message content within the proxy service, and you cannot perform actions like Replace or apply transformations to the content; use streaming for pass-through scenarios only. If you want to read a large file and also transform its content, I would recommend using the JCA adapter for FTP and reading records from the file in batches, a few records at a time. Just so we understand the requirement better: what is the file size, and what is the format of the content?

  • HUGE file size for Pages (iWork '06) documents

    I'm in the process of evaluating Pages in the hope of dumping the last of my Microsoft programs (i.e., Word). I'm impressed with the ease of use, especially when it comes to being a little creative.
    My great concern is the file size. I created a Pages document of 6 pages. It had text, images and tables. This relatively small document is a whopping 17.7 MB. When I export this document to a Word doc and open it, it looks identical. The file size of the Word doc is only 1.4 MB - a HUGE difference. Is this a known issue, or is there some sort of compression setting I'm missing?
    I typically work with large docs and I don't want Pages files hogging all of my drive space.

    The package shows image1.tif, which I added on the first page. There's also an image1_filtered-9.tiff which is HUGE - about 3 MB larger than the original file. It looks like Pages possibly makes a copy of an image when you apply an effect like a drop shadow? So if you don't want huge files, I guess you have to stay away from the image effects... which is actually what made Pages interesting.
    It is the adjustment tool that is causing the ballooning of your files.
    If you use the levels adjustment, you will end up with an image labelled ~levelled.tiff. If you use the various other parts of the adjustment screen you will get files labelled ~-filtered.tiff, with or without version numbers.
    There is no optimization of the various files as they are saved, and no user control over their generation. It would be much nicer if the adjustments were done on the fly, or at least if a size-specific thumbnail were made for the on-screen representation.
    For now, however, if you are concerned with the size of your files in Pages (and Keynote, since it also happens there), do the adjustments elsewhere. I suggest iPhoto, if that is where you have stored your items. You can always make your placed photo an image placeholder and replace it with alternative adjustments later.
    (I tested this with a 4 KB file made for LiveJournal, saved as a JPEG. The resulting filtered and levelled TIFFs were 40 KB - a considerable increase.)

  • Huge file size reduced by unlocking

    I ended up with a 2.7 GB file consisting of 8 tracks that had been locked and unlocked many times. I knew that was about ten times the real size. The only way to get rid of whatever was causing the huge file size was to unlock the tracks and Save As Archive, which reduced the size to 400 MB.
    Why did this happen?
    Also, is there a detailed manual available that tells you all the tricks about GarageBand? Things like: exactly what does locking a track do (Help is not very informative), and how do Save, Save As and Save As Archive differ in their treatment of purple, orange and other regions? I have had problems saving everything when I back up to an external drive, so I'd like to find out exactly how these saves work.

    Christoph,
    Yes, I just did that. I saved a copy of the song and then trashed the freeze files that should not have been there. The song played fine, no problem. The file went from 1.6 GB to 176 MB.
    I have found this situation in a few other files as well.
    These are my saving habits when I'm working on a song: I work for a while using "Save". Then after a while I start to save major changes as "Save as Archive", and I always end a session that way. Usually I'm not concerned about disk space. I use "Save as Archive" all the time because I would rather have a file with everything I'm using contained within it. Last year I accidentally trashed 10,000 loops, and I have since found that "Save as Archive" is a helpful backup when I'm careless around the Trash.
    I may be one of the few folks who uses "Save as Archive" so often, and somehow it seems that every once in a while a hiccup occurs and the freeze files are not erased when the tracks are unlocked.
    Since this discovery I have gone through many other files and found this to be the case every now and then. Some of the files didn't hint at being overstuffed but had a few errant freeze files. Normally I would never go and check a file's contents, but that one file at 1.6 GB grabbed my attention.
    Just thought I would post to see if anyone else has this problem.

  • Read Large Files Using BPEL File Adapter

    Hi,
    I have a scenario where large text files are to be read and sent to a third party, and an MTOM policy has to be attached. Files of 3 MB or less are polled, but files larger than 3 MB are not retrieved. How can I resolve this issue? Do I have to break the file up and send the data in pieces? If so, how?
    Thanks
    Ranga

    You have to use the streaming feature of the JCA file adapter to handle huge files.
    You can go through the following link:
      http://docs.oracle.com/cd/E23943_01/integration.1111/e10231/adptr_file.htm#CIAHHEBF

  • How to read pdf file using file adapter

    Hi..
    How can I read a PDF file using the file adapter?
    regards
    Arun

    Hi
    This may help you
    /people/sap.user72/blog/2005/07/27/xi-generate-pdf-file-out-of-file-adapter
    /people/alessandro.guarneri/blog/2007/02/21/sap-xi-acting-as-a-huge-file-mover
    ---Ram

  • How to locate a phrase in a file and read the file after that location

    I have a text file with the following contents
    Previous date data.....[10,000 lines ]
    same as below but with earlier dates
    * ABC  AutoJournal Utility - Version 2.00 *
    Beginning Reportournal at (time): Tue May 04 18:07:46 IST 2010
    AutoJournal assigned updated records a timestamp of (server time):  *2010-05-04*  // THIS IS THE PHRASE I am looking for
    #  Posted Date,           Policy,  Reason, Amount, Insured Name, Agent Name
    1) 2010-04-26 00:00:00.0, 7363496, GRSS, 10.0, KRASSNER W, BROOKS STEVEN E
    2) 2010-04-23 00:00:00.0, 4768200, GRSS, 0.0, STECKLER M, STECKLER GIBSON
    3) 2010-04-20 00:00:00.0, 7328358, GRSS, 4.0, LYON GREGO, WEST MICHAEL J
    4) 2010-04-20 00:00:00.0, 4754236, GRSS, 0.0, PEEBLES JA, HOFFMANN GABRIE
    5) 2010-04-22 00:00:00.0, 7363793, GRSS, 2.0, LAHAYE NAN, GONYEA MICHAEL
    6) 2010-04-26 00:00:00.0, 7360935, GRSS, 35.0, AMITAY NOA, WIESEL HENRY
    I need to locate the date and start reading the data for the current date only, i.e. today's date.
    Thus far I have done this (it's just an outline):
    // Outline only - requires java.io.* and java.util.regex.* imports (both available in Java 1.4).
    String srchDate = "2010-05-04";
    BufferedReader bf = null;
    try {
        bf = new BufferedReader(new FileReader("C:\\File.log"));
        int lineID = 0;
        Pattern pattern = Pattern.compile(srchDate);
        Matcher matcher = null;
        String line = null;
        boolean found = false;                  // becomes true once the timestamp line is seen
        while ((line = bf.readLine()) != null) {
            lineID++;
            if (!found) {
                matcher = pattern.matcher(line);
                if (matcher.find()) {           // the line contains srchDate
                    found = true;               // everything after this line is today's data
                }
            } else {
                System.out.println(line);       // process/collect the current day's records here
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        if (bf != null) {
            try { bf.close(); } catch (Exception e) { /* ignore */ }
        }
    }
    Some help on this would be very helpful...
    Thanks :)
    P.S.: This needs to be done using Java 1.4 only.

    kevinaworkman wrote:
    "If I were you, I would split your problem up into smaller pieces. Write a method that reads a file into a String."
    RainaV: This file is actually quite huge; I am not sure if this would work, as the file will only grow in size in the future...
    kevinaworkman: How big is "quite huge"? I wouldn't worry about optimization until you've actually encountered a problem.
    RainaV: FYI, the file size is 20 MB.
    kevinaworkman also wrote:
    "Write a method that figures out where in a String a substring occurs (hey, that one might be done for you already...).
    Write a method that returns a substring of a String, based on a start index and a length (something tells me this one might exist already too).
    Write a method that converts a String to the correct datatype."
    RainaV: Regarding these steps I am not sure I follow - I mean the lines you see numbered 1), 2), ... down to the last line, say 800.
    kevinaworkman: I don't understand your question.
    RainaV: I was wondering whether your approach would be complicated, but I will give it a try... Actually I thought I would just read one complete line, store it in a list and then write it to an Excel sheet.
    kevinaworkman: Okay, and the steps I outlined would work for reading in one line at a time as well. It's just one more step.
    RainaV: I am unable to locate this phrase using regex, so can you suggest something along those lines?
    kevinaworkman: You're trying to find today's date? Why does that require a regular expression?
    RainaV: I want to read the data after I have located today's date in the text file, so I need to locate it first, but the code I posted doesn't work, so I was asking whether you could advise on a correction, as per the approach in my original post. But if I do it your way, then there is definitely no need for regex; I would need to store all of it in a String and work with substrings, and so on...
