Another large file processing question.

I have a file that's about 500,000 lines long, and I need to comma-delimit it. I can't do it in Excel for obvious reasons, so my last resort is Linux. The file contains fixed-width columns, and I need to put a comma between the columns. I know how wide each column is and its position. How can I add commas to this file in the specific places I need? Do I need a script? Can I use 'sed'?

sed or awk may be the tools you want to try for this task; see the sketch below. Documentation can be found through the man command or using your favorite search engine.
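For example, a minimal sketch assuming three columns of widths 10, 5, and 8 (hypothetical widths and file names; substitute your own and extend the pattern for as many columns as you have). With awk:

awk '{ print substr($0,1,10) "," substr($0,11,5) "," substr($0,16,8) }' input.txt > output.csv

Or with GNU sed, which inserts the commas in place:

sed -E 's/^(.{10})(.{5})(.{8})/\1,\2,\3/' input.txt > output.csv

Both stream the file line by line, so 500,000 lines is not a problem.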
C.

Similar Messages

  • Large file processing in file adapter

    Hi,
    We are trying to process a large file of ~280 MB and we are getting timeout errors. I followed all the required tunings for memory and heap sizes, and still the problem exists. I want to know if installing a decentral adapter engine just for this large file processing might solve the problem, which I doubt.
    Based on my personal experience, there might be a file-size limit in XI of maybe up to 100 MB with minimal mapping and no BPM.
    Any comments on this would be appreciated.
    Thanks
    Steve

    Dear Steve,
    This might help you:
    Topic #3.42
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/70ada5ef-0201-0010-1f8b-c935e444b0ad#search=%22XI%20sizing%20guide%22
    /people/sap.user72/blog/2004/11/28/how-robust-is-sap-exchange-infrastructure-xi
    This sizing guide and the memory calculations in it will be useful for dealing further with this issue.
    http://help.sap.com/bp_bpmv130/Documentation/Planning/XISizingGuide.pdf#search=%22Message%20size%20in%20SAP%20XI%22
    File Adapter: size of your processed messages
    Regards
    Agasthuri Doss

  • Large file processing in XI 3.0

    Hi,
    We are trying to process a large file of ~280 MB and we are getting timeout errors. I followed all the required tunings for memory and heap sizes, and still the problem exists. I want to know if installing a decentral adapter engine just for this file processing might solve the problem, which I doubt.
    Based on my personal experience, there might be a file-size limit in XI of maybe up to 100 MB with minimal mapping and no BPM.
    Any comments on this would be appreciated.
    Thanks
    Steve

    Hi Debnilay,
    We do have a 64-bit architecture and still we have the file processing problem. Currently we are splitting the file into smaller chunks and processing them, but we want to process the file as a whole.
    Thanks
    Steve

  • File Splitting for Large File processing in XI using EOIO QoS.

    Hi
    I am currently working on a scenario to split a large file (700 MB) using the sender file adapter "Recordset Structure" property (e.g. Row,5000). As the files are split and mapped, they are appended to a destination file. In an example scenario, if a file of 700 MB comes in (say with 20000 records), the destination file should have 20000 records.
    To ensure no records are missed during the pass through XI, the EOIO QoS is used. A trigger record is appended to the incoming file (the trigger record's structure is the same as the main payload recordset) using a UNIX shell script before the file is read by the sender file adapter.
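    A minimal sketch of such an append (hypothetical path and flat record layout; the real record must match the main payload recordset that content conversion expects):

    echo '9999,,3NQN1,A' >> /data/inbound/extract_file.txt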
    XPATH conditions are evaluated in the receiver determination to either append the records to the main destination file or create a trigger file with only the trigger record in it.
    The problem we face is that the "Recordset Structure" (e.g. Row,5000) splits in chunks of 5000, and when the remaining records of the main payload number fewer than 5000 (say 1300), those remaining 1300 lines get grouped with the trigger record and written to the trigger file instead of the actual destination file.
    For the sake of this forum I have listed a sample XML file representing the inbound file, with the last record, Duns = "9999", as the trigger record that marks the end of the file after splitting and appending.
    <?xml version="1.0" encoding="utf-8"?>
    <ns:File xmlns:ns="somenamespace">
    <Data>
         <Row>
              <Duns>"001001924"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001925"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001926"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001927"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001928"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001929"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"9999"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
    </Data>
    </ns:File>
    In the sender file adapter I have, for test purposes, changed the "Recordset structure" to "Row,5" for the sample XML inbound file above.
    I have two XPATH expressions in the receiver determination to take the last recordset with Duns = "9999" and send it to the receiver (communication channel) that creates the trigger file.
    In my test case the first 5 records get appended to the correct destination file, but the last two records (the 6th and 7th) get sent to the receiver channel that is only supposed to take the trigger record (the last record, with Duns = "9999").
    Destination file (this is where all the records with Duns NE "9999" are supposed to get appended):
    <?xml version="1.0" encoding="UTF-8"?>
    <R3File>
         <R3Row>
              <Duns>"001001924"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
         <R3Row>
              <Duns>"001001925"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
         <R3Row>
              <Duns>"001001926"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
         <R3Row>
              <Duns>"001001927"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
         <R3Row>
              <Duns>"001001928"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
    </R3File>
    Trigger File:
    <?xml version="1.0" encoding="UTF-8"?>
    <R3File>
         <R3Row>
              <Duns>"001001929"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Ccr_Extract_Code>"A"</Ccr_Extract_Code>
         </R3Row>
         <R3Row>
              <Duns>"9999"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Ccr_Extract_Code>"A"</Ccr_Extract_Code>
         </R3Row>
    </R3File>
    I've tested the XPATH condition in XML Spy and it works fine. My doubts are about the "Recordset structure" property set to "Row,5".
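    For reference, the two conditions might look like the following sketch (hypothetical XPath, based on the payload structure above; note that the Duns values in this data carry literal quote characters):

    /ns:File/Data/Row/Duns = '"9999"' (route to the trigger-file receiver)
    /ns:File/Data/Row/Duns != '"9999"' (route to the main destination receiver)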
    Any suggestions on this will be very helpful.
    Thanks,
    Mujtaba

  • Java.io.IOException during large file processing on PI 7.1

    Hello Colleagues,
    for a large file scenario on our PI 7.1 system, we have to verify what big file sizes we are able to process through PI.
    During the handover of the large file (200 MB XML) from the Adapter Framework (file adapter) to the Integration Engine, we receive the following error:
    Transmitting the message to endpoint http://<host>:<port>/sap/xi/engine?type=entry using connection File_http://sap.com/xi/XI/System failed, due to: com.sap.engine.interfaces.messaging.api.exception.MessagingException: Error transmitting the message over HTTP. Reason: java.io.IOException: Error writing to server.
    The message processing stopped and the message still lies at the Adapter Framework. Large files up to 100 MB we are able to process successfully.
    Please, could you tell me why this happens and how we can solve it?
    Since it is an IOException rather than a java.lang.OutOfMemoryError, I am unsure: could it still be a memory issue?!
    Many thanks in advance!
    Regards,
    Jochen

    Hi Jochen,
    Indeed the error is an IO error, and it occurs because the Adapter Engine was not able to send the message to the Integration Server. But it happens due to memory/heap size issues.
    Look at these threads; they describe the same problem. Please try the remedies suggested in them:
    Mail to Proxy scenario with attachment. AF channel error.
    Error with huge file
    problem with big file in file-to-proxy scenario
    Is there any additional information in the adapter messaging tool?
    Regards
    Suraj
    Edited by: S.R.Suraj on Oct 1, 2009 8:55 AM

  • Bottleneck in Large file processing

    Hi,
    We are experiencing timeout and memory issues in large file processing. I want to know whether the J2EE Adapter Engine or the Integration Engine is the bottleneck when processing large messages, e.g. files over 300 MB, without splitting the files.
    Thanks
    Steve

    Hi Mario,
    We are testing a scenario to find out the maximum file size that XI can handle, based on the blog
    ( /people/william.li/blog/2006/09/08/how-to-send-any-data-even-binary-through-xi-without-using-the-integration-repository ) without any mapping. Up to 20 MB it works OK, and after that we are getting a timeout error.
    Data from Moni:
    com.sap.engine.services.httpserver.exceptions.HttpIOException: Read timeout. The client has disconnected or a synchronization error has occurred. Read [1704371] bytes. Expected [33353075].
         at com.sap.engine.services.httpserver.server.io.HttpInputStream.read(HttpInputStream.java:186)
         at com.sap.aii.af.service.util.ChunkedByteArrayOutputStream.write(ChunkedByteArrayOutputStream.java:181)
         at com.sap.aii.af.ra.ms.transport.TransportBody.<init>(TransportBody.java:99)
         at com.sap.aii.af.ra.ms.impl.core.transport.http.MessagingServlet.doPost
    This could be due to ICM timeout settings which we are planning to increase.
    I would like to hear about others' experience of the maximum file size they could process. Of course I know that it depends on the environment.
    Thanks
    Steve

  • Large file processing issue

    Hi,
    A 2 MB source file is found to be generating an output file of over 180 MB, causing it to fail in pre-prod and production. It processes successfully in the development box, where there is no web dispatcher or restriction on size.
    The recommendation from SAP is that we try to reduce the output file size.
    Kindly look into the issue ASAP.
    Appreciate your help.
    Thanks,
    Satya Kumar

    Hi Satya,
    There are many ways available; check the links below:
    /people/stefan.grube/blog/2007/02/20/working-with-the-payloadzipbean-module-of-the-xi-adapter-framework
    /people/aayush.dubey2/blog/2007/10/10/zip-transfer-unzip-increase-the-performance-of-your-java-abap-applications
    /people/pooja.pandey/blog/2005/10/17/number-formatting-to-handle-large-numbers
    /people/alessandro.guarneri/blog/2007/02/21/sap-xi-acting-as-a-huge-file-mover
    /people/alessandro.guarneri/blog/2006/03/05/managing-bulky-flat-messages-with-sap-xi-tunneling-once-again--updated
    One more way is to develop a ZIP adapter and send the file zipped; after processing, we have to unzip the file again.
    Regards
    Ramesh

  • Need advise on large file processing with good performance

    Hi All,
    I am working on a program in which I have to read millions of records from an application server file. For this, I am reading 1 million records at a time and uploading them into the DB table.
    What is the best approach to process these millions of records? What I am currently doing is: I read the 1 million records one by one, modify each record based on some conditions, store them in one internal table, and update the DB table.
    The alternative approach I am considering is to read 1 million records into one internal table and then, within a loop, modify each record for the given condition and update the DB table.
    Which approach is the best one?
    You can also advise me of any other alternative approaches with good performance.
    Regards,
    Nivas
    Edited by: Nivas4081 on Jul 24, 2008 2:55 PM

    Hi Joshi,
    Thanks for your reply. I have tested both ways as I mentioned in my query, but reading record by record and updating in data packets takes less time than reading into an internal table, then modifying and updating the DB table.
    Hi ralph,
    Thanks for the reply.
    The modifications are similar in all the lines. I get related data from another class/method, do some calculation, and modify each record.
    Are there any performance tricks to be followed when processing large amounts of data? By the way, I am reading a certain number of records, say 400K, and updating the DB table using parallel processing.
    Apart from this, any suggestions on this?
    Regards,
    Nivas

  • OutOfMemory error on large file processing in sender file adapter

    Hi Experts,
    I have a file-to-IDoc scenario; on the sender side we are using a remote adapter engine. Files process fine when the size is below 5 MB; if the file size is more than 5 MB, we get java.lang.OutOfMemoryError: Java heap space. Can anyone suggest which parameter I need to change in order to process more than 5 MB?

    Hi Praveen,
    The suggestion from SAP is not to process huge files in one go. Instead, you can break the file into chunks and process them; see the sketch below.
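    A minimal sketch of such pre-splitting at OS level with the standard Unix split utility (hypothetical names and chunk size; this is a generic workaround, not an SAP-specific mechanism):

    split -l 50000 /data/in/bigfile.txt /data/in/chunks/part_

    Each resulting part_aa, part_ab, ... file then stays under the size the adapter can handle.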
    To increase the heap memory:
    For Java heap size and other memory requirements, refer to the following OSS note for details:
    Note 723909 - Java VM settings for J2EE 6.40/7.0
    Also check out OSS Note 862405 - XI Configuration Options for Lowering Memory Requirements.
    There are OSS notes available for Java heap size settings specific to the OS environment, too.
    Thanks,

  • Large File Processing Problem

    Hi Group,
    I am facing a problem in XI while processing a 48 MB file through the file adapter; I have used content conversion in the design.
    I am using a normal 64-bit operating system with a maximum 2 GB heap size, and I am still facing the problem. Can anybody tell me how much heap size I need to process a 48 MB file through XI?

    Hi,
    Refer to the following SAP notes (for these, go to www.service.sap.com/notes):
    File adapter FAQ - 821267
    Java heap - 862405
    Java settings - 722787
    These blogs may give some insight:
    /people/sap.user72/blog/2004/11/28/how-robust-is-sap-exchange-infrastructure-xi
    /people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
    By the way, if the error says that a trailer is missing, where exactly are you getting this error?
    Regards,
    Moorthy

  • Yet another Applescript file renaming question...

    Hi all,
    Well, having been a full-time AppleScripter for five years at Apple, I thought it would be easy to dust off my chops and write a simple script to get a file name from one folder and paste that name onto a file in another folder... but I guess 15 years of not scripting has proven too much of a challenge for my aging brain, and I am on a time crunch to get this done for my client.
    So, here is what I'm hoping to get help with. I have no doubt it is pretty simple.
    File "A" is in a folder.
    File "B" is in a different folder.
    I need to GET the name of File A and use it to rename File B.
    Can one of you youngin's take 30 seconds and help an old dude out?
    Thanks a bunch,
    matt

    Niel... almost there.
    The script works perfectly (with me placing the path instead of just the folder)... thank you so much.
    One extra thing though... I didn't realize that the files are different file types (jpg and psd).
    So what do I need to do to change the name without changing the file type?
    THANKS again!
    Matt

  • BAI SWIFT,  bank key , file processing question

    Current setup:
    System has Bank key 12345, Bank number 12345 setup. No SWIFT code
    BAI file has SWIFT key "CITIBRBR" for Citibank in 02 header
    EBS my bank account number "5555555" is tied to Bank key "12345"
    Statement processing works successfully
    Future setup:
    System has Bank key 12345, Bank number 12345, and now SWIFT code CITIBRBRXXX
    BAI file has SWIFT key "CITIBRBR" in 02 Header
    EBS Transaction type: my bank account number "5555555" is tied to Bank key "12345"
    Now statement does not process.
    The house bank account table T012K contains an entry with account number 5555555. There is no entry in the house bank table T012 with bank key CITIBRBR for this entry.
    EBS Transaction type: my bank account number "5555555" is tied to Bank key "12345"
    What I did:
    1. Changed the EBS setup so I have 5555555 and CITIBRBR tied to the transaction type.
    2. Created a new bank key CITIBRBR and tied it to the house bank. That worked, but the data was incorrect: Citibank says the data contains wrong information in the bank key section.
    How can this be fixed?

  • Remote Desktop Connection Drops when opening a large file or Transferring a large file

    I am running a Dell R720 Windows 2008 R2 server. When I open a large PDF or transfer a large file to the server, the server drops the remote desktop connection. I do not see any errors and no events are reported. I can access the server via iDRAC 7 Enterprise and the server is still up and functioning properly; however, the remote desktop connection can only be restored after the server is rebooted. I have read the article below and do not see any conflicts.
    http://support.microsoft.com/kb/2477133/en-us
    That said, the issue happens when:
    1. Opening a large PDF
    2. Using a UNC path to transfer a large file
    3. Using Hyper-V to import a .VHD (another large file)
    Any help is appreciated - Thanks in advance

    Hi,
    Thank you for posting in Windows Server Forum.
    Does this issue occur for a single user or for multiple users?
    Have you tried another system? Is it facing the same issue?
    From the description it seems to be a network issue. Please check whether the network connection drops or whether it only works on low bandwidth; you need to verify there is no loss from a bandwidth perspective. There are other reasons which can drop the connection or hurt performance, as remote desktop depends on many different points.
    As a test, you can disable TCP autotuning and check, using the following command:
    netsh interface tcp set global autotuninglevel=disabled
    To re-enable it, use the command below:
    netsh interface tcp set global autotuninglevel=normal
    When you connect to the remote server over Remote Desktop, please set the connection speed accordingly to optimize performance; this may resolve your case of the dropping connection.
    More information.
    Announcing the Remote Desktop Protocol Performance Improvements in Windows Server 2008 R2 and Windows 7 white paper
    http://blogs.msdn.com/b/rds/archive/2010/02/05/announcing-the-remote-desktop-protocol-performance-improvements-in-windows-server-2008-r2-and-windows-7-white-paper.aspx
    Hope it helps!
    Thanks.
    Dharmesh Solanki
    TechNet Community Support

  • Processing large files on Mac OS X Lion

    Hi All,
    I need to process large files (a few GB) from a measurement. The data files contain lists of measured events. I process them event by event, and the result is relatively small and does not occupy much memory. The problem I am facing is that Lion "thinks" I want to use the large data files again later and puts them into the cache (inactive memory). The inactive memory grows while reading the data files, up to the point where the whole memory is full (8 GB on a MacBook Pro mid 2010) and it starts swapping a lot. That of course slows down the computer considerably, including the process that reads the data.
    If I run the "purge" command in Terminal, the inactive memory is cleared and the machine becomes more responsive again. The question is: is there any way to prevent Lion from pushing running programs out of memory into swap for the sake of a useless hard-drive cache?
    Thanks for suggestions.

    It's been a while, but I recall using the "dd" command ("man dd" for info) to copy specific portions of data from one disk, device, or file to another (in 512-byte increments). You might be able to use it in a script to fetch parts of your larger file as you need them, and dd can read from and/or write to standard input/output, so it's easy to grab data and store it in a temporary container like a file or even a variable.
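    For example, a hypothetical sketch that copies a 1 MiB chunk starting 512 MiB into the file (all names and numbers are placeholders):

    dd if=/path/to/bigfile.dat of=/tmp/chunk.dat bs=512 skip=1048576 count=2048

    (bs=512 copies in 512-byte blocks, skip=1048576 skips the first 512 MiB, count=2048 copies 1 MiB.)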
    Otherwise, if you can afford it (and you might with 8 GB of RAM), you could try to disable swapping (paging to disk) altogether and see if that helps...
    To disable paging, run the following command (in one line) in Terminal and reboot:
    sudo launchctl unload -w /System/Library/LaunchDaemons/com.apple.dynamic_pager.plist
    To re-enable paging, run the following command (in one line) in Terminal:
    sudo launchctl load -w /System/Library/LaunchDaemons/com.apple.dynamic_pager.plist
    Hope this helps!

  • Large file folder with question mark is flashing on screen.

    Just turned on the computer for the first time, and a large file folder with a question mark is flashing on the screen. I am unable to do anything; I have restarted and it does the same thing. Please help!

    It does this when it can't see the operating system. You didn't give much information but I'm assuming it's a brand new mini and you aren't using any external drives for the operating system. If that's the case then yes, reinstall OS X. Put your install DVD into the optical drive and reboot. As soon as you hear the chime, hold down the "C" key on your keyboard (or the Option key until the Install Disk shows up). That will force your Mac to boot from the install DVD in the optical drive.
    If reinstallation isn't possible, or if the problem continues after reinstallation, then you need to exchange it for another, because something is keeping the operating system on the drive from being read.

Maybe you are looking for

  • Table events lost after opening a popup window

    In an advanced table, I have a column which shows a link. If this link is clicked, a new window opens which displays a popup window. If the link is not clicked, I can traverse through the advanced table. If I click on the link, I cannot navigate t

  • Page not navigating into next page

    Hi, I have two screens, where in the first screen if you give a partner number it should take you to the next screen and show relevant details in a table control. But this is not happening... where are the settings to be checked... Any reason....

  • What if I have 2 firefox sync accounts?

    What if I have 2 separate firefox sync accounts from different computers? If I log out of one and try to log on the other on the same computer, it says it will merge the 2. How does that work?

  • Consumer endpoints and remote operations/input message types formation

    Hello, Couple of questions; 1) In case of a BC, a consumer endpoint (i.e. a proxy for an external consumer) doesn't need to be activated with the NMR -- is this understanding correct? 2) Moreover, as the SU for this component defines a "consumes" ele

  • IPod Touch 2.0.1 Reboots and shuts down randomly

    I have an 8 GB iPod touch on the 2.0.1 (5B108) firmware that now reboots and shuts down randomly while just sitting idle. I've tried restoring and resetting, and it still does this. In terms of the shutting down randomly, it seems like my hold button is