Regarding breaking a large file into chunks

Dear all,
We have a file of about 500 MB. To send this file, we have concluded that we need to break it into smaller chunks with the help of a shell script on Unix.
Can you please send me the procedure to do this?
Also send me the Unix script to
1) break the file into chunks
2) combine these chunks back into a single file.
Let me know if there is any alternate approach for the same.
Please send the above details to [email protected]
Thanks,
Srinivasa
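
On Unix itself, the standard split utility can break a file into fixed-size pieces and cat can concatenate those pieces back into one file, so no custom script is strictly required. Since the post also asks about alternate approaches, below is a minimal, illustrative Java sketch of the same idea; the class name, the 50 MB chunk size, and the ".partN" naming convention are assumptions for the example, not part of any product.

    import java.io.*;

    // Minimal sketch (not production code): split a large file into fixed-size
    // chunks and join the chunks back into one file. Names and sizes are assumed.
    public class FileChunker {

        static final int CHUNK_SIZE = 50 * 1024 * 1024; // 50 MB per chunk (assumed)

        // Writes <source>.part0, <source>.part1, ... next to the source file.
        static void split(File source) throws IOException {
            byte[] buffer = new byte[64 * 1024];
            try (InputStream in = new BufferedInputStream(new FileInputStream(source))) {
                int part = 0;
                while (true) {
                    File chunk = new File(source.getPath() + ".part" + part++);
                    long written = 0;
                    int read = 0;
                    try (OutputStream out = new BufferedOutputStream(new FileOutputStream(chunk))) {
                        while (written < CHUNK_SIZE
                                && (read = in.read(buffer, 0, (int) Math.min(buffer.length, CHUNK_SIZE - written))) != -1) {
                            out.write(buffer, 0, read);
                            written += read;
                        }
                    }
                    if (read == -1) {                     // reached end of the source file
                        if (written == 0) chunk.delete(); // drop an empty trailing chunk
                        break;
                    }
                }
            }
        }

        // Concatenates the chunks, in the order given, back into a single file.
        static void join(File target, File... chunks) throws IOException {
            byte[] buffer = new byte[64 * 1024];
            try (OutputStream out = new BufferedOutputStream(new FileOutputStream(target))) {
                for (File chunk : chunks) {
                    try (InputStream in = new BufferedInputStream(new FileInputStream(chunk))) {
                        int read;
                        while ((read = in.read(buffer)) != -1) {
                            out.write(buffer, 0, read);
                        }
                    }
                }
            }
        }
    }

Whichever way the file is split, the chunks must be re-joined in the order they were produced, and a checksum of the original file is a cheap way to verify the reassembled copy.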

Large file handling issue
ref: the thread above, where multiple ways to handle such a situation are mentioned.

Similar Messages

  • Processing Large Files using Chunk Mode with ICO

    Hi All,
    I am trying to process large files using an ICO. I am on PI 7.3 and am using the new PI 7.3 feature to split the input file into chunks.
    I know that we cannot use mapping while using chunk mode.
    While trying this, I noticed the points below:
    1) I created a Data Type, Message Type, and interfaces in the ESR and used them in my scenario (no mapping was defined). The sender and receiver data types were the same.
    Result: the scenario did not work. It created only one chunk file (.tmp file) and terminated.
    2) I used a dummy interface in my scenario and it worked fine.
    So please confirm whether we should always use dummy interfaces in the scenario while using chunk mode in PI 7.3, or is there something that I am missing?
    Thanks in Advance,
    - Pooja.

    Hello,
    According to this blog:
    File/FTP Adapter - Large File Transfer (Chunk Mode)
    The following limitations apply to chunk mode in the File Adapter.
    As per the screenshots in that blog, the split never considers the payload; it is just a binary split. So the following limitations apply:
    Only for File Sender to File Receiver
    No Mapping
    No Content Based Routing
    No Content Conversion
    No Custom Modules
    You are probably doing content conversion; that is why it is not working.
    Hope this helps,
    Mark

  • WiFi breaks when transferring large file over LAN

    Hi guys,
    My WiFi mostly behaves itself; however, if I try to transfer a large file (the last one was ~1GB) over the LAN (standard SMB file copy), WiFi drops out. It gets roughly 30MB through before connectivity is lost. The WiFi continues to show as connected, but I cannot even ping my router. Only disconnecting and reconnecting WiFi re-establishes the connection. Any idea what's up? (My router is the black 'home hub' style. I can't find any version information in its admin panel.) (And before someone asks, no, I'm not telling you the contents of the file, as it isn't relevant. With the greatest of respect, if you just ask unrelated questions to pry, please don't waste everyone's time.)

    Didn't think of that, thanks. Great workaround. I'm going to call Sky and see if they have any firmware updates or another solution. Surely they wouldn't leave such a big known issue unfixed. I'll report back with what they say.
    Edit: Just got off the phone from Sky technical support. This issue was first reported Nov/Dec 2014. The chap I spoke to does not know of a fix, whether a fix is being worked on, or if a fix will ever be worked on. He has no way of contacting the development team to find out any of this information. He said firmware updates are pushed out to routers every now and again; he does not know when this is going to happen and does not have a way to notify me about firmware updates installed on my router. (This terrifies me. Sky have a backdoor into my router and can install any code they wish, without even telling me what they changed/installed.) There is no way to turn these auto-updates off either.
    Here's a key thing he did say, though: he said he was not at liberty to provide my ADSL credentials, but if I chose to extract them from the router and install my own router instead, they would not have any problem with that. When I told him that this was against their TOS and they could terminate my contract for doing this, he went silent and just said "errr..."
    Allowing Sky, their subcontractors and anyone else they choose to install any code they wish to my router whenever they please and have free open access to my entire LAN is a total deal breaker for me. That router is getting unplugged today and I'm putting a new router on the credit card. I told tech support this. I said if they have a problem with that, let's go the legal route. I literally have a recorded call where they give me permission to install my own router.

  • How can I read a large file and not break it up into bits

    Hi,
    How do I read a large file and not get the file cut into bits, so that each piece has its own beginning and ending?
    like
    1.aaa
    2.aaa
    3.aaa
    4....
    10.bbb
    11.bbb
    12.bbb
    13.bbb
    If the file was read at line 11 and I wanted to read at line 3 and then read again at line 10,
    how do I specify the byte position within the large file? The read function has the signature read(byte[] b, index, bytesToRead),
    and that index only refers to a position within the byte array itself.
    Thanks
    San Htat

    tjacobs01 wrote:
    Peter__Lawrey wrote:
    Try RandomAccessFile. Not only do I hate RandomAccessFiles because of their inefficiency and limited use in today's computing world... The one dominated by small devices with SSDs? Or the one dominated by large database servers and B-trees?
    I would also like to hate on the name 'RandomAccessFile'; almost always, there's nothing 'random' about the access. I tend to think of the tens of thousands of databases users were found to have created on local drives in one previous employer's audit. Where's the company's mission-critical software? It's in some random Access file.
    Couldn't someone have come up with a better name, like NonlinearAccessFile? I guess the same goes for RAM too... Non-linear would imply access times other than O(N), but typically not constant, whereas RAM is nominally O(1), except that it is highly optimised for consecutive access, as are spinning-disk files, except that RAM is fast in either direction.
    [one of these things is not like the other|http://www.tbray.org/ongoing/When/200x/2008/11/20/2008-Disk-Performance#p-11] silicon disks are much better at random access than rust disks, and [Machine architecture|http://video.google.com/videoplay?docid=-4714369049736584770] at about 1:40 - RAM is much worse at random access than sequential.
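
    Since the original question was how to read at a specific byte offset (the index argument of read(byte[], index, length) only addresses the byte array, not the file), here is a small illustrative sketch of the RandomAccessFile approach suggested above; the file name and the offsets are placeholder assumptions:

        import java.io.IOException;
        import java.io.RandomAccessFile;

        public class SeekDemo {
            public static void main(String[] args) throws IOException {
                // "data.txt" and the offsets below are placeholders for the example.
                try (RandomAccessFile raf = new RandomAccessFile("data.txt", "r")) {
                    byte[] buf = new byte[128];

                    raf.seek(1024);                        // position the file pointer at byte 1024 of the file
                    int n = raf.read(buf, 0, buf.length);  // the 0 here indexes the buffer, not the file
                    if (n > 0) System.out.println(new String(buf, 0, n));

                    raf.seek(0);                           // jump back to the start and read again
                    n = raf.read(buf, 0, buf.length);
                    if (n > 0) System.out.println(new String(buf, 0, n));
                }
            }
        }

    getFilePointer() reports the current offset, so the same object can step backwards and forwards through the file without re-reading it sequentially.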

  • Handling Large File

    Hi all,
    We need to handle a large file of about 880 MB. Is there any provision at the adapter level to break the file into smaller chunks?
    Can we avoid using shell scripts and OS-level commands?
    Thanks,
    Srinivas

    Hi Srinivas,
       if it is a text file then you could break up the file into multiple recordsets,
      e.g.,
    [Converting Text Format in the Sender File/FTP Adapter to XML   |http://help.sap.com/saphelp_nwpi711/helpdata/en/44/658ac3344a4de0e10000000a1553f7/frameset.htm]
    and
    [#821267 File Adapter FAQ|http://service.sap.com/sap/support/notes/821267]
    14. Memory Requirements
    Regards
      Kenny

  • Splitting large file in XI

    Can we split the incoming file in XI? We are getting a large file of 80 MB and want to cut it down to 40 MB chunks.
    The sender system is sending the 80 MB file in a single shot; they cannot change it.
    It has become mandatory for me to break it up in XI. (The scenario is File to Proxy.)

    Hi Viswanath,
    Handling large files, say anything above 100 MB, is always a problem with the File adapter, as the data has to be moved from the Adapter Engine to the Integration Engine and vice versa.
    Third party tools are generally used for that. Conversion Agent by Itemfield is one of the best approaches.
    Also, on the Advanced tab of the file sender adapter, select the check box next to Advanced Mode. There you can specify Maximum File Size (Bytes) option.
    Huge processing of files
    Night Mare-Processing huge files in SAP XI
    Step-by-Step Guide in Processing High-Volume Messages Using PI 7.1's Message Packaging
    SAP XI acting as a (huge) file mover
    Managing bulky flat messages with SAP XI (tunneling once again) - UPDATED
    Regards,
    Vinod.

  • How large a file can FTP take?

    Hi Experts,
    Can anyone please tell me how large a file FTP can pull? What is the maximum file size for FTP?
    Thanks
    narendran

    Hi Narendran,
    If you read question 14 (Q: Which memory requirements does the File Adapter have? Is there a restriction on the maximum file size it can process?) in note 821267 - FAQ: XI 3.0 / PI 7.0 / PI 7.1 / PI 7.3 File Adapter, you will see that the size restriction depends on several factors; I recommend that you read it.
    Check also this thread Size recommendation for placing file from FTP to Application Server??
    Also, I suggest that you read this blog if you need to work with big files: File/FTP Adapter - Large File Transfer (Chunk Mode)
    Regards.

  • Problem while processing large files

    Hi
    I am facing a problem while processing large files.
    I have a file which is around 72 MB. It has more than one lakh (100,000) records. XI is able to pick up the file if it has 30,000 records. If the file has more than 30,000 records, XI picks up the file (once it picks it up, it deletes the file), but I don't see any information under SXMB_MONI - no error, no success, no processing status. It is simply picking up and ignoring the file. If I process these records separately, it works.
    How do I process this file? Why is it simply ignoring the file? How can I solve this problem?
    Thanks & Regards
    Sowmya.

    Hi,
    XI picks up the file based on the maximum processing limit as well as the memory and resource consumption of the XI server.
    Processing a 72 MB file is on the higher side. It increases the memory utilization of the XI server, which may fail to process it at the peak point.
    You should divide the file into small chunks and allow multiple instances to run. It will be faster and will not create any problems.
    Refer
    SAP Network Blog: Night Mare-Processing huge files in SAP XI
    /people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
    /people/michal.krawczyk2/blog/2005/11/10/xi-the-same-filename-from-a-sender-to-a-receiver-file-adapter--sp14
    Processing huge file loads through XI
    File Limit -- please refer to SAP note: 821267 chapter 14
    File Limit
    Thanks
    swarup

  • Reading large file with JCA Adapter in OSB

    Hello,
    We are searching for a solution for reading a large file (>50 MB) from a network drive and delivering it to a queue via OSB 11gR4 (10.3.4). The problem is reading the file with the JCA File Adapter: it seems it cannot handle files as large as ours. The documentation provides a way to bypass the file size limitation by using Chunked Read, but that seems to require BPEL process execution, which is not possible in our environment. Does anyone know of a way to implement this without a BPEL process?
    Our use case:
    read file from network drive -> transfer with OSB -> deliver to MQ
    Options other than the JCA File Adapter can be considered, if anyone can advise...

    If it's a plain routing use case and no message processing is required, then you may simply use OSB's FILE transport instead of the JCA adapter. Create a messaging-type proxy service and select the request message type as "binary". Also enable content streaming (disk buffer, compression).
    From OSB Dev guide -
    Oracle JCA Adapter for FTP and Files – Attachments (large payload support), pre- and post-processing of files, using a re-entrant valve for processing ZIP files, content streaming, and file chunked read are not supported with Oracle Service Bus.
    http://download.oracle.com/docs/cd/E17904_01/doc.1111/e15866/jca.htm#BABBICIA
    You may also refer -
    Reading huge flat file in OSB 11gR1
    Regards,
    Anuj

  • Tips or tools for handling very large file uploads and downloads?

    I am working on a site that has a document repository feature. The documents are stored as BLOBs in an Oracle database, and for reasonably sized files it is no problem to stream the files out directly from the database. For file uploads, I am using the Struts module to get them onto disk and then putting the BLOB in the database.
    We are now being asked to support very large files of 250MB+. I am concerned about problems I've heard of with HTTP not being reliable for files over 256MB. I'd also like a solution that would give the user a status bar and allow for restarts of broken uploads or downloads.
    Does anyone know of an off-the-shelf module that might help in this regard? I suspect an ActiveX control or Applet on the client side would be necessary. Freeware or Commercial software would be ok.
    Thanks in advance for any help/ideas.

    Hi. There is nothing wrong with HTTP handling 250MB+ files (per se).
    However, connections can get reset.
    Consider offering the files via FTP. Most FTP clients are good about resuming transfers.
    Or if you want to keep using HTTP, try supporting chunked encoding. Then a user can use something like 'GetRight' to auto resume HTTP downloads.
    Hope that helps,
    Peter
    http://rimuhosting.com - JBoss EJB/JSP hosting specialists
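
    As a footnote to the resume point: resumable HTTP downloads are generally built on Range requests (the client asks for "bytes=N-" and the server answers 206 Partial Content). A rough client-side sketch, with the URL and file name as placeholder assumptions:

        import java.io.*;
        import java.net.HttpURLConnection;
        import java.net.URL;

        public class ResumeDownload {
            public static void main(String[] args) throws IOException {
                // Placeholder URL and file name for the example.
                URL url = new URL("http://example.com/big.zip");
                File target = new File("big.zip.partial");
                long already = target.exists() ? target.length() : 0;

                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                if (already > 0) {
                    // Ask the server only for the bytes we do not have yet.
                    conn.setRequestProperty("Range", "bytes=" + already + "-");
                }

                // 206 Partial Content means the server honoured the Range header.
                boolean resuming = conn.getResponseCode() == HttpURLConnection.HTTP_PARTIAL;

                try (InputStream in = new BufferedInputStream(conn.getInputStream());
                     RandomAccessFile out = new RandomAccessFile(target, "rw")) {
                    if (resuming) {
                        out.seek(already);   // append to what is already on disk
                    } else {
                        out.setLength(0);    // server sent the whole file; start over
                    }
                    byte[] buf = new byte[64 * 1024];
                    int n;
                    while ((n = in.read(buf)) != -1) {
                        out.write(buf, 0, n);
                    }
                } finally {
                    conn.disconnect();
                }
            }
        }

    On the server side the prerequisite is simply honouring Range and advertising Accept-Ranges: bytes, which most web servers and servlet containers already do for static content.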

  • Unable to transfer large files from MB to External HDs

    Hi,
    This may be an unusual one. My 1st edition Macbook won't let me transfer large files to my external hard-drives, either by USB, Ethernet or wirelessly through my home wi-fi network.
    I've started video editing so I'm importing DVcam tapes through firewire from the camera. A full DV tape translates to about 8-12gb I think.
    Using iMovie and saving the import to my Freecom 3.5-inch network hard drive via USB or Ethernet, it fails saying 'Unexpected error, error code 1309'.
    When I try QuickTime Pro instead to import the tape to the external HD, I get 'Operation could not be completed, An attempt to add a resource to the file failed'. This happens too with my WD 2.5-inch external drive.
    However, there is no such problem when I import to the MB's internal hard drive. When I then try to shift the large file off the laptop to an external drive, via USB, Ethernet or wirelessly, for editing and safekeeping, it fails halfway through.
    I have the same problem when I try to back up the 17GB virtual Windows machine I use with VMware Fusion off the laptop to an external drive.
    I am currently burning an 18GB iMovie project from the MB's internal HD over 5 DVDs using Toast's disc-spanning, and will reimport them to the external drive, which is really time-consuming.
    Also to note, Final Cut Express doesn't have these problems, as it seems to break up what would be an 18GB movie file into several smaller files, which my MB is quite happy to let go onto an external drive. However, the problem re-emerges in Final Cut Express when I try to import an HDV tape. Possibly FCE isn't breaking them up into small enough pieces for my MB to handle?
    Any ideas why this is happening and what I can do to fix it? Or does this mean a new laptop?

    I'm a bit late to the party, but specifically there's a 4GB file size limit in the FAT32 format that is standard on many external hard drives. I've run across this problem before, and usually it copies over most of the file and then quits when it reaches the limit.

  • SharePoint Foundation 2013 Optimization For Large File Transfer?

    We are considering upgrading from  WSS 3.0 to SharePoint Foundation 2013.
    One of the improvements we want to see after the upgrade is a better user experience when downloading large files.  It can be done now, but it is not reliable.
    Our document library consists of mostly average sized Office documents, but it also includes some audio and video files and software installer package zip files ranging from 100MB to 2GB in size.
    I know we can change the settings to "allow" larger than default file downloads, but how do we optimize the server setup to make these large file transfers work as seamlessly as possible? More RAM on the SharePoint Foundation server? Other Windows, SharePoint or IIS optimizations? The files will often be downloaded from the Internet, so we will not have control over the download speed.

    SharePoint is capable of sending large files; it is a stateless HTTP system like any other website in that regard. Given that your server is sized appropriately for the amount of concurrent traffic you expect, I don't see any special optimizations required.
    Trevor Seward
    I see information like this posted warning against doing it, as if large files are going to cause your SharePoint server and SQL to crash.
    http://blogs.technet.com/b/praveenh/archive/2012/11/16/issues-with-uploading-large-documents-on-document-library-wss-3-0-amp-moss-2007.aspx
    "Though SharePoint is meant to handle files that are up to 2 gigs in size, it is not practically feasible and not recommended as well."
    "Not practically feasible" sounds like a pretty dire warning to stay away from large files.
    I had seen some other links warning that large files in the SharePoint database cause problems with fragmentation and large amounts of wasted space that don't go away when files are removed, or that the server may run out of memory because downloaded files are held in RAM.

  • Q: HD players and carving up large files

    Greetings,
    First of all, this is what I'm using:
    Intel(R) Core(TM)2 Duo CPU
    8 gigs of RAM
    External 2TB drive, internal 1TB drive (for media only)
    Windows Vista, 64-bit
    Adobe Premiere Pro CS4
    First question, what is a good media player to play back MTS and M2TS files? I know there are several of them that come recommended on various sites, but I'm interested in knowing which ones to trust, which ones have the best options, and which ones some of you have had good luck with.
    But my biggest problem is that a recent client gave me some very large files with which to work, and it's been like trying to shove an elephant through a drinking straw. They gave me an external hard drive with a ton of AVCHD footage on it (1440 x 1080, 29.97). No problem...I thought. They wanted to use the footage for a time lapse sequence of a big construction job of a unique dome they're building. The way they had described it to me, they were going to record the site over a long period of time, programming the camera to capture a few seconds every few hours.
    Well, someone screwed the pooch. The camera was turned on almost every day, but they let the thing run all day! So, some of these files are over 30 gigs, and it's been murder trying to import them and get them to play. As I expected, it was pointless to try to use the external hard drive as a source since it's not even firewire but USB, and getting the files to play back in PP was not a good experience; I could practically hear the drive choking to death. But even when I transferred a couple of large files to my internal terabyte, they still played back terribly in PP, in both the viewer and on the timeline. When I moved the sequence marker, it would hang up the system for long periods of time, and then playback was choppy at best.
    I imagine it could be processor speed - or lack thereof. I know a quad would be better, but I've played back smaller MTS files with no problem, and right now getting more processor speed isn't an option - if that's even the main problem. The thing that adds insult to injury is that this project is going to mix SD (DVCAM, I think) with this AVCHD, so I don't even think shooting the timelapse stuff in HD was even necessary (they had a small Sony camera they wanted to use). I know I could convert the m2ts files to AVIs or something like that, but that would take forever since I'm going to need to use a little chunk of each file (to edit the time lapse).
    So, my second question is, is it possible to carve up these large files somehow, so that I can import smaller files into PP and use the (source) viewer in PP to scan the footage more easily? Like I said, it's been difficult to scan a 7-hour, 35 gig clip without missing something and/or hanging up the system. I don't have that problem with smaller source files. Unfortunately, I have a feeling carving up these m2ts isn't possible without converting the files to another format....?
    I don't know, maybe it isn't the files and it has something else to do with my computer. But I've been editing SD without much trouble at all, and also some HD (with smaller files), so something tells me these files are just too big. My drive is 7200 rpm, and I don't have anything else on this computer except some other Adobe stuff (PowerPoint, Reader); the "garbage" is minimal. Anybody have any ideas? Thanks in advance.
    Paul

    I would be declining to take on this project on the basis of what you have told us about what you have been supplied with... unless they are prepared to acknowledge the scale of the task (problem) and pay on an hourly basis.
    It's going to take a lot of time!

  • 4.2.3/.4 Data load wizard - slow when loading large files

    Hi,
    I am using the data load wizard to load CSV files into an existing table. It works fine with small files of up to a few thousand rows. When loading 20k rows or more, the loading process becomes very slow. The table has a single numeric column as its primary key.
    The primary key is declared at "shared components" -> logic -> "data load tables" and is recognized as "pk(number)" with "case sensitive" set to "No".
    While loading data, this configuration leads to the execution of the following query for each row:
    select 1 from "KLAUS"."PD_IF_CSV_ROW" where upper("PK") = upper(:uk_1)
    which can be found in the v$sql view while loading.
    This makes the loading process slow: because of the UPPER function, no index can be used.
    It seems that the "case sensitive" setting is not evaluated.
    Dropping the numeric index for the primary key and using a function-based index does not help.
    The explain plan shows an implicit TO_CHAR conversion:
    UPPER(TO_CHAR(PK)) = UPPER(:UK_1)
    This is missing in the query, but maybe it is necessary for the function-based index to work.
    Please provide a solution or workaround for the data load wizard to work with large files in an acceptable amount of time.
    Best regards
    Klaus

    Nevertheless, a bulk loading process is what I would really like to have as part of the wizard.
    If all of the CSV files are identical:
    use the Excel2Collection plugin ( - Process Type Plugin - EXCEL2COLLECTIONS )
    create a VIEW on the collection (makes it easier elsewhere)
    create a procedure (in a Package) to bulk process it.
    The most important thing is to have, somewhere in the Package (i.e. your code that is not part of APEX), information that clearly states which columns in the Collection map to which columns in the table, the view, and the variables (APEX_APPLICATION.g_fxx()) used for Tabular Forms.
    MK

  • Loading large files in Java Swing GUI

    Hello Everyone!
    I am trying to load large files (more than 70 MB of XML text) in a Java Swing GUI. I tried several approaches:
    1) Byte-based loading, with a loop similar to:
        // Read the file in 4 KB blocks and append each block to the text pane.
        pane.setText("");
        InputStream file_reader = new BufferedInputStream(new FileInputStream(file));
        int BUFFER_SIZE = 4096;
        byte[] buffer = new byte[BUFFER_SIZE];
        int bytesRead;
        String line;
        while ((bytesRead = file_reader.read(buffer, 0, BUFFER_SIZE)) != -1) {
             line = new String(buffer, 0, bytesRead);
             pane.append(line);
        }
        file_reader.close();
    But this gives me unacceptable response times for large files and runs out of Java heap memory.
    2) I read in several places that I could load only small chunks of the file at a time, and when the user scrolls upwards or downwards the next/previous chunk is loaded. To achieve this I am guessing that extensive manipulation of the scroll bar in the JScrollPane will be needed, or perhaps adding an external JScrollBar? Can anyone provide sample code for that approach? (Bear in mind that I am writing code for an editor, so I will need to interact via clicks, mouse wheel rotation, keyboard buttons and so on...)
    If anyone can help me, post sample code or point me to useful links that deal with this issue or with writting code for editors in general I would be very grateful.
    Thank you in advance.

    Hi,
    I'm replying to your question from another thread.
    To handle large files I used the new IO library. I'm trying to remember off the top of my head, but the classes involved were RandomAccessFile, FileChannel and MappedByteBuffer. The MappedByteBuffer was the best way for me to read and write to the file.
    When opening the file I had to scan through the contents of the file using a Swing worker thread and a progress monitor. Whilst doing this I indexed the file into manageable chunks. I also created a cache to further optimise file access.
    In all it worked really well and I was surprised by the performance of the new IO libraries. I remember loading 1GB files, and whilst having to wait a few seconds for the indexing, you wouldn't know that the data for the JList was being retrieved from a file whilst the application was running.
    Good Luck,
    Martin.
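
    A compressed sketch of the kind of NIO access Martin describes: map one window of the file at a time instead of loading the whole thing into the heap. The file name, the 16 MB window size, and the way the text is handed to Swing are assumptions for illustration:

        import java.io.RandomAccessFile;
        import java.nio.MappedByteBuffer;
        import java.nio.channels.FileChannel;
        import java.nio.charset.StandardCharsets;

        public class MappedWindowReader {
            public static void main(String[] args) throws Exception {
                // "big.xml" and the 16 MB window size are placeholder assumptions.
                long windowSize = 16L * 1024 * 1024;

                try (RandomAccessFile raf = new RandomAccessFile("big.xml", "r");
                     FileChannel channel = raf.getChannel()) {

                    long fileSize = channel.size();
                    long offset = 0;  // move this as the user scrolls through the document

                    // Map only one window of the file into memory at a time.
                    long length = Math.min(windowSize, fileSize - offset);
                    MappedByteBuffer window = channel.map(FileChannel.MapMode.READ_ONLY, offset, length);

                    byte[] bytes = new byte[window.remaining()];
                    window.get(bytes);
                    // A real editor would avoid cutting a multi-byte character at the window boundary.
                    String text = new String(bytes, StandardCharsets.UTF_8);

                    // Hand "text" to the Swing text component on the EDT, e.g. via SwingUtilities.invokeLater.
                    System.out.println(text.length() + " characters mapped from offset " + offset);
                }
            }
        }

    Moving the offset as the user scrolls, and keeping a small cache of recently mapped windows, gives roughly the behaviour Martin describes without holding the whole file in heap memory.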
