Huge file problem

Hi All,
I have an input XML file that consists of 60,000 records and is over 10 MB in size. I want XI to pick up and process this file, but only 10,000 records at a time, and without splitting the file.
It's XML, not a flat file. As far as I know, for flat files this can be achieved in FCC with "Recordsets per Message", but that doesn't apply here.
Regards,
Sameer

Hi Sameer,
You can try this scenario with BPM by looping 10,000 times, but there is a point of contention: BPM is a memory hogger, and looping 10,000 times will seriously affect the performance of the system.
Regards
joel

Similar Messages

  • File Adapter-Problem Reading Huge Files

    Hi,
    Here is the issue that i am facing
When reading a huge file (a CSV file of 6-8 MB), the communication channel configured as a File Adapter with a polling interval of 7 min (420 sec) is inconsistent in reading the complete file. Sometimes it reads the complete 6 MB file and sometimes it reads only part of it, say 3 MB of 6 MB. Can this inconsistent behaviour be resolved?
    Your suggestions are highly appreciated.
    Regards
    Pradeep

    Hi Pradeep !
8 MB is not a huge file for XI; I think it is a small one. Maybe your problem is not the size. Please check whether XI is starting to read the file before it is completely written to the source folder. If you are creating that CSV file from another application directly in the poll source directory specified in the file adapter, and your poll interval is small, XI could start reading the file while you are still writing it. If this is the case, try writing the file with a different extension or filename than the one specified in the file adapter communication channel, and once the file is completely written, rename it to its final filename and check whether the misbehaviour persists.
    Alternatively, you can write the file to a temp directory and then move it to the XI directory once it is finished.
    Regards,
    Matias.
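    The write-then-rename approach Matias describes can be sketched as follows (a minimal illustration; the directory and file names are made up, and the `.tmp` suffix stands in for whatever pattern your adapter is configured to ignore):

    ```java
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.StandardCopyOption;

    public class SafeFileDrop {
        // Write the payload under a temp name first, then atomically rename it,
        // so a polling adapter never sees a half-written file.
        static Path drop(Path pollDir, String finalName, byte[] payload) throws IOException {
            Path tmp = pollDir.resolve(finalName + ".tmp");   // adapter ignores *.tmp
            Files.write(tmp, payload);
            Path target = pollDir.resolve(finalName);
            // ATOMIC_MOVE makes the file appear in the poll dir in a single step
            return Files.move(tmp, target, StandardCopyOption.ATOMIC_MOVE);
        }

        public static void main(String[] args) throws IOException {
            Path dir = Files.createTempDirectory("polldir");
            Path result = drop(dir, "orders.csv", "a,b,c\n".getBytes());
            System.out.println(Files.exists(result));                        // true
            System.out.println(Files.exists(dir.resolve("orders.csv.tmp"))); // false
        }
    }
    ```

    The same effect can be had with two directories instead of two names: write into a staging directory on the same filesystem, then move into the poll directory.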

  • Remote huge file load problem

    ---Following is the code to load a huge image file (50-100 MB):
    InputStream in = new FileInputStream(readFile);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    byte[] buffer = new byte[4096];
    int i;
    while ((i = in.read(buffer)) != -1) {
        out.write(buffer, 0, i);
    }
    in.close();
    byte[] result = out.toByteArray();
    ---It will crash (or hang) on toByteArray() when the file size exceeds 18 MB.
    Is there any way to load large files?
    Thanks

    1) I am new to the Java forum. May I know where the documentation for Java is? http://java.sun.com/docs/index.html
    2) Also, may I know where I can find topics about processing a file without loading the whole of it into memory?
    3) Actually, we are loading this file into a byte array in Java on the client side (PC or Mac) and using RPC to transfer it to the server (UNIX); the C code on the server side takes this byte buffer and uses fwrite to create the huge file. Is there a better way than this?
    a) Break the file into smaller chunks and send the chunks, together with a chunk identifier, to the server. The server will then reconstruct the file from the chunks.
    b) Install an FTP server on the server machine and use FTP to send the file.
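    The chunking idea in (a) can be sketched roughly like this. It is a simplified, single-process illustration: `CHUNK_SIZE` and the in-memory map standing in for the "server" are assumptions, not the real RPC transport.

    ```java
    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import java.util.Random;

    public class ChunkedTransfer {
        static final int CHUNK_SIZE = 4096;  // assumed chunk size

        // Client side: split the payload into numbered chunks.
        static List<byte[]> split(byte[] data) {
            List<byte[]> chunks = new ArrayList<>();
            for (int off = 0; off < data.length; off += CHUNK_SIZE) {
                chunks.add(Arrays.copyOfRange(data, off, Math.min(off + CHUNK_SIZE, data.length)));
            }
            return chunks;
        }

        // Server side: reconstruct the file from chunks keyed by their identifier.
        static byte[] reassemble(Map<Integer, byte[]> received, int totalLength) {
            byte[] out = new byte[totalLength];
            int off = 0;
            for (int idx = 0; idx < received.size(); idx++) {
                byte[] chunk = received.get(idx);
                System.arraycopy(chunk, 0, out, off, chunk.length);
                off += chunk.length;
            }
            return out;
        }

        public static void main(String[] args) {
            byte[] original = new byte[10_000];
            new Random(42).nextBytes(original);
            List<byte[]> chunks = split(original);
            Map<Integer, byte[]> received = new HashMap<>();
            // "send" each chunk with its identifier
            for (int i = 0; i < chunks.size(); i++) received.put(i, chunks.get(i));
            System.out.println(Arrays.equals(original, reassemble(received, original.length))); // true
        }
    }
    ```

    In a real setup each chunk would be one RPC call, and the server would append to the file (or buffer per index) instead of holding a map in memory.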

  • Upload huge file size: "The page cannot be displayed" browser

    What are precautions to be take for uploading a huge file
    I have an upload file operation in the web application. The web server is OC4J.
    //UploadForm.jsp
    <FORM NAME="InputForm" ACTION="UploadAction.jsp" METHOD="POST" enctype="multipart/form-data">
    <input type="file" name="fileName">
    </FORM>
    After I deploy the application to the web server, uploading a small file works fine.
    But if I upload a huge amount of data, the data uploads, but when any action button is clicked,
    after 30 seconds the error "The page cannot be displayed" is shown in the web browser.

    Hi All,
    I have a similar timeout problem; it will be great if someone provides a solution.
    The problem I'm facing is because of the timeout setting in the proxy server, which is 1.5 minutes. The app server does send the response properly, but only after some 2 minutes or so, because the processing is a bit complicated (3 API calls done intermittently for different validations).
    I can't request a change to the proxy server timeout setting, so I have to handle this within my application.
    I would be grateful if anyone could help us out with this.
    Thanks
    Noufal
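    One common way to live within a fixed proxy timeout is to make the slow work asynchronous: the first request starts the job and returns a token immediately, and the client polls for the result. A minimal sketch of that pattern with plain Java concurrency (the job registry and token scheme are assumptions for illustration, not a specific framework API):

    ```java
    import java.util.Map;
    import java.util.UUID;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class AsyncJobs {
        private final ExecutorService pool = Executors.newFixedThreadPool(4);
        private final Map<String, Future<String>> jobs = new ConcurrentHashMap<>();

        // "POST": start the slow work and return a token right away,
        // well inside the proxy timeout.
        public String submit(Callable<String> slowWork) {
            String token = UUID.randomUUID().toString();
            jobs.put(token, pool.submit(slowWork));
            return token;
        }

        // "GET": poll; returns null while still running, the result when done.
        public String poll(String token) throws Exception {
            Future<String> f = jobs.get(token);
            return (f != null && f.isDone()) ? f.get() : null;
        }

        public static void main(String[] args) throws Exception {
            AsyncJobs jobs = new AsyncJobs();
            String token = jobs.submit(() -> { Thread.sleep(200); return "validated"; });
            while (jobs.poll(token) == null) Thread.sleep(50);   // client keeps polling
            System.out.println(jobs.poll(token));                // validated
            jobs.pool.shutdown();
        }
    }
    ```

    In a web application the two methods would be two endpoints; each individual request then completes quickly, so the proxy never times out.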

  • Huge file size reduced by unlocking

    I ended up with a 2.7 GB file consisting of 8 tracks that had been locked and unlocked lots of times. I knew that was about ten times the real size. The only way to get rid of whatever was causing the huge file size was to unlock the tracks and Save As Archive, which reduced the size to 400 MB.
    Why did this happen?
    Also, is there a detailed manual available that tells you all the tricks about GB? Things like: exactly what does locking a track do (Help is not very informative); and how do Save, Save As and Save As Archive differ in their treatment of purple, orange and other regions. I have had problems saving everything when I backup to an external drive, so I'd like to find out exactly how these Saves work.

    Christoph,
    Yes, I just did that. I saved a copy of the song and then trashed the freeze files that should not have been there. The song played fine, no problem. The file went from 1.6 GB to 176 MB.
    I have found this situation in a few other files also.
    These are my saving habits when I'm working on a song. I work for a while using "Save". Then after a while I start to save major changes as "Save as Archive". I always end a session this way. Usually I'm not concerned about disc space. I use "Save as Archive" all the time because I would rather have a file with everything I'm using contained within it. Last year I accidentally trashed 10,000 loops, and I have since found that "Save as Archive" is a helpful backup when I'm acting stupid around the trash.
    I may be one of the few folks who uses "Save as Archive" so often, and somehow it seems that every once in a while a hiccup occurs and the freeze files are not erased when the tracks are unlocked.
    Since this discovery I have gone through many other files and found this to be the case every now and then. Some of the files didn't hint at being overstuffed but had a few errant freeze files. Normally I would never go and check a file's contents, but that one file at 1.6 GB grabbed my attention.
    Just thought I would post to see if anyone has this problem.

  • How to upload Huge file?

    Hi all,
    I am wondering how we can handle huge file uploads at the servlet/JSP level, say an upload of 70 MB. I am using Struts FormFile but there are a lot of limitations there. Any suggestions? Thanks in advance.

    Vishal,
    The main problem with CR txt-file creation is that a record in a flat file should have length = sum of all field lengths as they are set in the transfer structure in BW. It means that if, for example, your file has a field value = "1234", but in BW the InfoObject which is to accept this value has length = 8, then in the CR file you should have value = "____1234" or "1234____", where the _ sign represents a space.
    You don't have to separate fields in a record by delimiters. They are counted as usual symbols, not delimiters. Only the CR symbol is taken as a record delimiter. BW settings won't help either.
    Actually, you need to write a program to create such a file. There is a VBA program in the link I provided, but it takes an Excel file as its source. You can use it for the logic.
    In short, you need to open your source file, concatenate all fields of each record into one record without delimiters, padding each field with leading or trailing spaces.
    Unfortunately, I have to leave for a week.
    If you haven't solved your problem after 6 days, you can post back and I'll help you create a program for the CR txt-file creation.
    Best regards,
    Eugene
    Message was edited by: Eugene Khusainov
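    Eugene's fixed-width layout (every field padded to its transfer-structure length, no delimiters, CR ending each record) can be sketched like this; the field names and widths below are made up for illustration:

    ```java
    public class FixedWidthRecord {
        // Pad a value with trailing spaces (or truncate) to the exact field width,
        // so the record length equals the sum of the field lengths.
        static String pad(String value, int width) {
            if (value.length() >= width) return value.substring(0, width);
            StringBuilder sb = new StringBuilder(value);
            while (sb.length() < width) sb.append(' ');
            return sb.toString();
        }

        public static void main(String[] args) {
            // Hypothetical transfer structure: a material field of width 8, a quantity of width 6
            String record = pad("1234", 8) + pad("42", 6);  // no delimiters; CR ends the record
            System.out.println("[" + record + "]");  // [1234    42    ]
            System.out.println(record.length());     // 14
        }
    }
    ```

    Leading-space padding would use the mirror-image loop (insert at index 0); which side to pad depends on how the field is defined in BW.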

  • XML parsing huge file

    Hi,
    I have a 36 MB XML file I need to parse; I'm new to XML.
    I usually get a 200 KB file in CSV format from most of my clients that they transfer into their account, and I then simply update the MSSQL database with the CSV file at midnight on my server. But now I have 74 clients that have been regrouped, and they send me one XML file.
    When I run it using the sample they gave me it works fine, but on the 36 MB file I get a JRun error. Then I found out that:
    <CFFile action="READ" variable="xmlfile" file="c:\mypath\#clientfile#.xml" charset="utf-8">
    <cfset xmlObj = xmlParse(#xmlfile#)>
    doesn't work on big files because it runs out of memory.
    I need a way to parse that file using Java. I downloaded xmlsax.js but I don't know how to use it to parse and then get my parsed variable back from it. Can anyone help me, please?
    I got the file here :  http://xmljs.sourceforge.net/website/sampleApplications-sax.html
    Thank you

    In response to Owain Norths' comments about DOM parsing.
    I'm not sure if the memory issues are the fault of the DOM parsing method being used, or if the problem is in how CF converts the XML text into the CF objects (arrays, structs) that the XML represents.  It's possible that the CF objects are responsible for using excessive amounts of memory.  Either way, it sounds like CF's XML parsing capabilities aren't appropriate for larger (large being a relative term) XML files.
    It might be an interesting experiment to use third party Java components (such as Xerces2) to parse some XML files and see what the performance and memory usage look like.
    I will re-state my original advice.  The poster needs to import data from XML files into tables on MS SQL Server.  Bulk import tasks, such as from XML or CSV files, are generally better handled in MS SQL Server.  Some options include: a job that executes T-SQL, an Integration Services package, or the Bulk Copy Program (BCP) utility. 
    From: Owain North [email protected]
    Sent: Fri 11/12/2010 8:57 AM
    To:
    Subject: XML parsing huge file
    Couldn't agree more, and to be honest I can't believe this hasn't come up before. To me, the thought that something like CF should have to be bypassed when you get to files of a few megs is utterly ridiculous. I haven't looked into the different methods of parsing XML as it's really not my thing, but are we saying that DOM parsing is necessary for CF to be able to perform the functions it does on  the resulting XML object? Or does one create the same result, just through a different method?
    Owain North
    Code Monkey
    Titan Internet Ltd
    http://www.titaninternet.co.uk
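    As the replies suggest, a streaming (SAX) parser avoids building the whole document in memory, which is what DOM parsing (and CF's `xmlParse`) does. A minimal Java sketch of the idea; the element name `client` and attribute `id` are hypothetical stand-ins for the poster's actual schema:

    ```java
    import java.io.ByteArrayInputStream;
    import java.util.ArrayList;
    import java.util.List;
    import javax.xml.parsers.SAXParser;
    import javax.xml.parsers.SAXParserFactory;
    import org.xml.sax.Attributes;
    import org.xml.sax.helpers.DefaultHandler;

    public class SaxDemo {
        // Collect the id of each <client> element as it streams past,
        // without ever holding the whole document in memory.
        static List<String> clientIds(byte[] xml) throws Exception {
            List<String> ids = new ArrayList<>();
            SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
            parser.parse(new ByteArrayInputStream(xml), new DefaultHandler() {
                @Override
                public void startElement(String uri, String local, String qName, Attributes attrs) {
                    if (qName.equals("client")) ids.add(attrs.getValue("id"));
                }
            });
            return ids;
        }

        public static void main(String[] args) throws Exception {
            byte[] xml = "<clients><client id=\"1\"/><client id=\"2\"/></clients>".getBytes();
            System.out.println(clientIds(xml));  // [1, 2]
        }
    }
    ```

    For the database-import use case, each callback could insert (or batch) a row instead of collecting into a list, keeping memory flat regardless of file size.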

  • HUGE files this year versus last year to burn

    Last year I made my husband a CD for Christmas and had 1.1 hours of music on it. This year the 1.1 hours of music from 2009 is a HUGE file and it won't burn. Why are this year's downloads so big and is there a way I can filter out extraneous info to just burn a CD of the music?? Thanks for any help!!

    1.1 hours of music should burn to a normal audio CD without a problem, no matter what the sizes of the underlying files are. Please explain what you are doing, and somebody here will try to help.

  • Due to huge file message failed in Moni

    Hi All
    Scenario is SQL to BI
    My sender adapter successfully picked up data from SQL. Both the SELECT query and the UPDATE query have done their operations: all the records were selected, and their flags were updated successfully.
    Now the problem is that when the message came to the Integration Server, it failed with the error "CLIENT_SEND_FAILED"; this is due to the huge file.
    Now how can I roll back? I cannot resend this particular message because of the huge file, and all the record flags are already updated, so I cannot run the query again.
    Please help
    Regards
    Dheeraj Kumar

    Hi Dheeraj,
    >>Can I see which records got updated by this transaction?
    Check the inbound payload in MONI/the communication channel and see which records were there. Alternatively, check the modified records for that specific time window at the database level.
    >>When I am checking the payload, it is not showing the whole file; at the end it says "Solve the error and then refresh the page" and "unable to write onto server".
    Where are you seeing this? Right-click and try to view the source of the page.
    >>Can you suggest a workaround I can use to resolve this problem?
    As suggested, read fewer records (your SELECT query needs to be altered for this), or use a sender stored procedure which reads fewer records at a time and sends them to PI.
    Regards
    Suraj

  • Capturing in FCE, making it a HUGE file called "av"

    When capturing in FCE, it has been locking up because it creates a file called "AV" that takes up about three times more space. This is an intermittent problem that does not happen with every capture, but after capturing for a few minutes it recurs. Very frustrating. I have never had any problems with FCE since I got it with my Power Mac G5. When I try to open the file as a stand-alone file, it says it's "not a video file". What does that mean? Help please...

    Yes I checked the formatting, using only internal drive, no externals.
    First of all, if it "has no timecode", then why does Final Cut say "locating timeocode break" when it stops capturing?? Plus, it makes sense that if it is skipping around in timecode breaks, that's why the system is locking up because if it thinks it's a huge file that takes up my remaining hard drive space, it kicks off as to save the remaining 2gb of space.
    Your other comment about setting to DV converter. I know that there are a few options for settings. One is DV converter, another is Firewire. I have noticed there is no difference between capturing video with the two. I wish what you said was true, that if set on DV converter, there is no timecode. But if that was true, then why all of the problems capturing from VHS? When I capture from my 8mm video camera, it only stops capturing when there is a hard break in control track.
    What I ended up doing to capture this entire video I wanted from VHS was use iMovie of all things. It's pretty sad when you have to use the lower grade, novice video program when you have paid for the professional version just to capture some video. This went beautifully with NO PROBLEMS at all. My opinion: because iMovie DOES NOT READ TIMECODE, it just captures whatever you put in, thank goodness.
    Another question for you. When you have recorded things on your old VCR, don't you remember the numbers on the read out of the VCR or on the screen when it rewinds for fast forwards? That is timecode.

  • Sorting file content of a huge file and less RAM

    Hi,
    I want to sort the content of a big file. In the file, each line ended by \n is one record. I want to sort the lines alphabetically.
    For example, the file test.txt contains the following data:
    pqr xyz \n
    abc xyz \n
    The sorted file should contain:
    abc xyz \n
    pqr xyz \n
    Also, the file is very big, around 20 GB, and RAM is very limited, so it is not even possible to load such a huge file into memory.
    How do I solve the above problem? Can anyone suggest a good solution?
    Thanks

    > Getting rid of the file in favor of a database sounds pretty good to me.
    Have you any clue how much time it will take to insert 20 GB into a relational database?
    I find szgy's solution suitable for me, but I think that before merging (step 5) the data in the separate files (comparing sets of two files) should be compared and written to a new file, and these resultant files compared again until it becomes a single file.
    > You can merge all the files at once using a heap, hence avoiding unnecessary file writes.
    The above step is required since I'm reading data from a file. I'll be taking the first few lines, sorting them, and writing them into a new file. Each such file is sorted, but if you merge all the sorted files as they are, the resultant file will not have sorted data, since there was no comparison. Now I'll have to think about sorting algorithms and about merging the files after comparing data between two files.
    > You can email me at [email protected] to get full working code that does all this for integers, including merging the results with the help of a heap.
    > For 10,000,000 integers (400 MB) it takes (in ms): other 2579, read 16655, write 35407, sort 37422.
    > Total is 80 sec, which is fairly ok for 400 MB of data I think (5 MB/sec), including reads and writes.
    > Gil
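    The external merge sort being discussed (sort chunks that fit in memory, spill each to a temp file, then k-way merge with a heap in a single pass) can be sketched as follows. This is a simplified illustration, not Gil's actual code; chunk size and file handling are deliberately minimal:

    ```java
    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.Comparator;
    import java.util.List;
    import java.util.PriorityQueue;

    public class ExternalSort {
        // Phase 1: read the input a chunk at a time, sort each chunk in memory,
        // and spill it to its own temp file.
        static List<Path> sortChunks(Path input, int linesPerChunk) throws IOException {
            List<Path> chunks = new ArrayList<>();
            try (BufferedReader r = Files.newBufferedReader(input)) {
                List<String> chunk = new ArrayList<>();
                String line;
                while ((line = r.readLine()) != null) {
                    chunk.add(line);
                    if (chunk.size() == linesPerChunk) { chunks.add(spill(chunk)); chunk.clear(); }
                }
                if (!chunk.isEmpty()) chunks.add(spill(chunk));
            }
            return chunks;
        }

        static Path spill(List<String> chunk) throws IOException {
            Collections.sort(chunk);
            Path f = Files.createTempFile("chunk", ".txt");
            Files.write(f, chunk);
            return f;
        }

        // Phase 2: merge all sorted chunk files at once with a heap,
        // avoiding the pairwise re-merging the poster described.
        static void merge(List<Path> chunks, Path output) throws IOException {
            // each heap entry = { next unconsumed line, its reader }
            PriorityQueue<Object[]> heap =
                new PriorityQueue<>(Comparator.comparing((Object[] e) -> (String) e[0]));
            for (Path p : chunks) {
                BufferedReader r = Files.newBufferedReader(p);
                String first = r.readLine();
                if (first != null) heap.add(new Object[]{first, r});
            }
            try (BufferedWriter w = Files.newBufferedWriter(output)) {
                while (!heap.isEmpty()) {
                    Object[] e = heap.poll();
                    w.write((String) e[0]);
                    w.newLine();
                    String next = ((BufferedReader) e[1]).readLine();
                    if (next != null) heap.add(new Object[]{next, (BufferedReader) e[1]});
                    else ((BufferedReader) e[1]).close();
                }
            }
        }

        public static void main(String[] args) throws IOException {
            Path in = Files.createTempFile("in", ".txt");
            Files.write(in, List.of("pqr xyz", "abc xyz", "mno xyz", "def xyz"));
            Path out = Files.createTempFile("out", ".txt");
            merge(sortChunks(in, 2), out);
            System.out.println(Files.readAllLines(out));  // [abc xyz, def xyz, mno xyz, pqr xyz]
        }
    }
    ```

    Memory use is bounded by one chunk plus one line per chunk file, which is why this approach scales to files far larger than RAM.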

  • Huge general problem - stability

    Hi all,
    I've been experiencing huge stability problems for the past few weeks.
    Most of my programs (Excel, iPhoto, Safari...) are crashing without any apparent reason. I keep sending reports to Apple, but I thought that if I post something here, someone might have an idea of what's going on.
    I think it started after a Safari crash (due to Java and/or Flash)... after that, I can't work anymore. I have to reboot my computer and hope for the best.
    I have saved a few crash reports if that can be of any help.

    You need to follow up your crashes with repairing your disk drive.
    Install OS X to an external drive - so you know it is safe. And then run Disk Utility to repair your drive and permissions. I would not rely solely on DU though to repair drives, I'd invest in Alsoft Disk Warrior.
    Before you install or update your system or add software, always boot from another drive and do repairs, and make sure you have a current working system.
    You may need to update your backups and do a fresh install. Or try using a new test user account.
    At the least use the Reset for Safari to empty and delete the cache files and folders that Safari creates.
    ~/Library/Caches/MetaData
    ~/Library/Caches/Safari
    ~/Library/Safari/ - everything except the bookmarks.plist (might want to move everything to another location temporarily).
    Same for java, flash cache folders.
    Then look into why and what is causing this. Could be bad memory.

  • Trash icon in console is not highlighted and I cannot move to trash a huge file

    In Console I have a huge file that I cannot move to trash because the trash icon is not highlighted and I cannot highlight any messages.
    Help.

    Hi Niel:
    This solved my problem.  It took hours to empty the trash but eventually the problem was solved and I now have tons of room on my drive.  It seems that there were thousands of error messages in the console file from an application that I had downloaded a while back.  Thanks a lot for your help. 

  • DW CS5 Missing Related Files Problem

    Hello
    Hoping somebody may be able to help me with a missing related files problem in CS5. I've tried Adobe phone support but they couldn't solve the problem.
    *Some* of my sites (but not all) are not showing all the files related to that page. My pages are typically in .asp vb server model and contain .asp virtual includes (header, nav etc) plus .js and .css linked files
    However, DW is only showing the .asp includes as related files, and not the .css or .js ones. In design view the CSS displays correctly, and all styles appear in the CSS inspector, but they do not show as related files in the bar. In CS4 it works fine.
    I've tried deleting the site and recreating it, copying the files to a new folder and setting up a fresh site, and creating a new folder and downloading all files from the remote server, but no joy. I've also tried refreshing related files. If I click on the filter in the bar it only shows the .asp includes and nothing else.
    Any help would be gratefully appreciated, this is a great feature when it works
    Cheers
    MB

    UPDATE:
    Managed to solve this problem. This is a new issue in DW CS5; previous versions do not seem to exhibit it.
    It occurs when you use site-root-relative rather than document-relative paths for js/css etc. files AND don't have the remote site URL (web URL) specified in the Site Setup | Local Info dialog, or do not have it fully qualified with http://.
    i.e. ../js/file.js works fine without a remote web URL, but /js/file.js does not show the related file in the bar unless a remote URL is defined.
    Hope that helps anyone else that encounters the same issue!
    MB

  • With conversion to Leopard, file problems with networked Windows computer

    Last night I did an Archive & Install from Tiger to Leopard on my Intel MacBook Pro. Today, I had trouble finding the other computers at my office. Once I finally got them to show up, I opened a Word file found on another computer, made some changes, and when I tried to save it, I got this message: "This is not a valid file name. Try one or more of the following: *Check the path to make sure it was typed correctly. *Select a file from the list of files and folders." Since this file already existed and I wasn't changing the name, I thought this was odd, but I changed the name from "Seating Chart 3-8-08" to "SeatingChart3-8-08" in case Leopard didn't like spaces when talking to Windows, but I got the same error message. Finally I gave up, not knowing what to do, then discovered that it had in fact saved my file. Still, every time I try to save ANY Word document from the shared folder of the Windows computer, I get the same error message endlessly until I choose "Don't Save."
    When I try to open an Excel file from that computer, it won't even open; it says " 'File Name.xls' cannot be accessed. This file may be Read-Only, or you may be trying to access a Read-Only location. Or, the server the document is stored on may not be responding." As with the Word file problem above, I did not have any problem accessing the files until I converted to Leopard.
    The Windows machine is Windows XP using Microsoft Office 2003; I have Microsoft Office 2004 on my machine.

    See if this Link, by Pondini, offers any insight to your issue...
    Transfer from Old  to New
    http://pondini.org/OSX/Setup.html
    Also, See here if you haven't already...
    http://www.apple.com/support/switch101/     Switching from PC
