Large file conversion

Has anyone out there had to deal with a large file conversion, i.e., where the data to be loaded into R/3 is too large to fit in an Excel file? Originally we suggested that the conversion team put their data in an Excel file and we would flag all the records that were not loaded so they could find them easily and correct the data. However, now they tell us the number of records is too large for an Excel file, and they don't want to split the file up into manageable pieces.
They also don't want to use a BDC session to correct the errors.
Any sharing of past experiences would be appreciated.
Thanks.
Nic

Depends on the skills in your team, perhaps; when we've hit Excel limits in the past, Access has been the next tool of choice (often because it's already installed on the desktop). This means you could, if you wanted, knock up a bit of RFC code in Access VB plus a "Call Transaction" function module in ABAP to do the file conversion, and the team could correct any failures / bad data from within Access. It's not going to perform as well as an ABAP program doing the usual "load dataset into memory, do lots of call transactions, write the errors to a BDC session", though (and run times could be critical to your project).
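
For illustration only: a minimal sketch of that RFC-client idea in Java using SAP JCo 3 instead of Access VB (same pattern, different client). The destination name R3_DEST, the wrapper function module Z_LOAD_RECORD, and its RECORD / MESSAGE parameters are all hypothetical stand-ins for whatever "Call Transaction" wrapper you build:

import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoException;
import com.sap.conn.jco.JCoFunction;

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.PrintWriter;

// Reads a flat file record by record, calls a (hypothetical) RFC-enabled
// wrapper around CALL TRANSACTION for each one, and writes every failed
// record to a side file so the conversion team can correct and resubmit it.
public class LargeFileLoader {
    public static void main(String[] args) throws Exception {
        // "R3_DEST" must be configured as a JCo destination (R3_DEST.jcoDestination)
        JCoDestination dest = JCoDestinationManager.getDestination("R3_DEST");
        try (BufferedReader in = new BufferedReader(new FileReader(args[0]));
             PrintWriter errors = new PrintWriter(args[0] + ".errors")) {
            String line;
            while ((line = in.readLine()) != null) {
                // fetch a fresh function instance so parameter state is clean
                JCoFunction fn = dest.getRepository().getFunction("Z_LOAD_RECORD");
                fn.getImportParameterList().setValue("RECORD", line);
                try {
                    fn.execute(dest);
                    String msg = fn.getExportParameterList().getString("MESSAGE");
                    if (msg != null && !msg.isEmpty()) {
                        errors.println(line + "\t" + msg);  // rejected by R/3
                    }
                } catch (JCoException e) {
                    errors.println(line + "\t" + e.getMessage());  // RFC failure
                }
            }
        }
    }
}

As noted above, run times will still be dominated by the per-record CALL TRANSACTION, so this buys correction convenience rather than speed.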

Similar Messages

  • 4.2.3/.4 Data load wizard - slow when loading large files

    Hi,
    I am using the data load wizard to load CSV files into an existing table. It works fine with small files of up to a few thousand rows. When loading 20k rows or more, the loading process becomes very slow. The table has a single numeric column as its primary key.
    The primary key is declared at "Shared Components" -> Logic -> "Data Load Tables" and is recognized as "pk(number)" with "Case Sensitive" set to "No".
    While loading data, this configuration leads to the execution of the following query for each row:
    select 1 from "KLAUS"."PD_IF_CSV_ROW" where upper("PK") = upper(:uk_1)
    which can be found in the v$sql view while loading.
    This makes the loading process slow: because of the UPPER function, no index can be used.
    It seems that the "Case Sensitive" setting is not evaluated.
    Dropping the numeric index for the primary key and using a function-based index does not help.
    Explain plan shows an implicit TO_CHAR conversion:
    UPPER(TO_CHAR("PK")) = UPPER(:UK_1)
    The TO_CHAR is missing from the query itself, but it is probably necessary for the function-based index to work.
    Please provide a solution or workaround for the data load wizard to work with large files in an acceptable amount of time.
    Best regards
    Klaus

    Nevertheless, a bulk loading process is what I would really like to have as part of the wizard.
    If all of the CSV files are identical:
    use the Excel2Collection plugin (Process Type Plugin - EXCEL2COLLECTIONS)
    create a VIEW on the collection (makes it easier elsewhere)
    create a procedure (in a package) to bulk process it.
    The most important thing is to have, somewhere in the package (i.e. your code that is not part of APEX), information that clearly states which columns in the collection map to which columns in the table, the view, and the variables (APEX_APPLICATION.g_fxx()) used for tabular forms.
    MK

  • Handling Large files in PI scenarios?

    Hello,
    We have a lot of scenarios (almost 50) where we deal with file interfaces on at least the sender or receiver side. Some of them are plain file transfers where we use AAE, and some are ones where we have to do message mapping (sometimes very complex ones).
    The interfaces work perfectly fine with a normal file that doesn't have many records, but recently we started testing big files with over 1000 records and they are taking a lot of time to process. This also causes other messages queued behind them to wait for however long the first message takes to process.
    This must be a very common scenario where PI has to process large files, especially files coming from banks. What is the best way to handle their processing? Apart from having better system hardware (we are currently in the test environment; production will definitely be better), is there any technique which might help us improve the processing of large files without data loss and without holding up other messages?
    Thanks,
    Yash

    Hi Yash,
    Check these blogs for the structure you are mentioning:
    /people/shabarish.vijayakumar/blog/2006/02/27/content-conversion-the-key-field-problem
    /people/shabarish.vijayakumar/blog/2005/08/17/nab-the-tab-file-adapter
    Regards,
    ---Satish

  • Processing Large Files using Chunk Mode with ICO

    Hi All,
    I am trying to process large files using an ICO. I am on PI 7.3 and I am using a new feature of PI 7.3 to split the input file into chunks.
    And I know that we cannot use mapping while using chunk mode.
    While trying I noticed below points:
    1) I created a Data Type, Message Type and Interfaces in the ESR and used them in my scenario (no mapping was defined); sender and receiver data types were the same.
    Result: the scenario did not work. It created only one chunk file (.tmp file) and terminated.
    2) I used a dummy interface in my scenario and it worked fine.
    So, please confirm whether we should always use dummy interfaces in the scenario when using chunk mode in PI 7.3, or is there something I am missing?
    Thanks in Advance,
    - Pooja.

    Hello,
    According to this blog:
    File/FTP Adapter - Large File Transfer (Chunk Mode)
    The following limitations apply to chunk mode in the File Adapter.
    As per the screenshots in that blog, the split never considers the payload; it is just a binary split, so these limitations follow:
    Only for File Sender to File Receiver
    No Mapping
    No Content Based Routing
    No Content Conversion
    No Custom Modules
    Probably you are doing content conversion; that is why it is not working.
    Hope this helps,
    Mark

  • Handling large files with FTP in XI

    Hi All,
    I have a file scenario where I have to post a file of more than 500 MB, with up to 700 fields in each line.
    Another scenario worked fine with a file size below 70 MB and fewer fields.
    Could anyone help me in handling such a scenario, with a large file size and without splitting the file?
    1) From your previous experience, did you use any tools to help with the development of the FTP interfaces?
    2) The client looked at Itemfield, but is not willing to use it due to the licensing costs. Did you use any self-made pre-processors?
    3) Please let me know the good and bad experiences you had when using the XI File Adapter.
    Thanks & Regards,
    Raghuram Vedagiri.

    500 MB is huge. XI will not be able to handle such a huge payload, for sure.
    Are you using XI as a mere FTP, or are you using content conversion with mapping etc.?
    1. Either use splitting logic to split the file outside XI (using scripts) and then let XI handle the resulting files,
    2. or size your hardware accordingly (Java heap etc.) to make sure that XI can handle this file (not recommended, though). SAP recommends 5 MB as the optimum size.
    Regards
    Bhavesh
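
    For reference, here is a minimal sketch of the kind of external splitting script option 1 above refers to, written in Java. It assumes newline-delimited records; the chunk size and the command-line path are placeholders to adapt:

    import java.io.*;

    // Splits a large newline-delimited file into fixed-size chunks that the
    // XI file adapter can then pick up one by one from the same directory.
    public class FileSplitter {
        public static void main(String[] args) throws IOException {
            int recordsPerChunk = 40000;  // placeholder: tune to your payload size
            int chunk = 0, count = 0;
            PrintWriter out = null;
            try (BufferedReader in = new BufferedReader(new FileReader(args[0]))) {
                String line;
                while ((line = in.readLine()) != null) {
                    if (out == null || count == recordsPerChunk) {
                        if (out != null) out.close();
                        // name the pieces .part0, .part1, ... so the adapter's
                        // file name mask can match them all
                        out = new PrintWriter(new FileWriter(args[0] + ".part" + chunk++));
                        count = 0;
                    }
                    out.println(line);
                    count++;
                }
            } finally {
                if (out != null) out.close();
            }
        }
    }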

  • Splitting large file in XI

    Can we split an incoming file in XI? We are getting a large file of 80 MB and want to cut it down to 40 MB pieces.
    The sender system sends the 80 MB file in a single shot; they cannot change that.
    It has become mandatory for me to break it up in XI. (The scenario is File to Proxy.)

    Hi Viswanath,
    Handling large files, say anything above 100 MB, is always a problem with the file adapter, as the data has to be moved from the Adapter Engine to the Integration Engine and vice versa.
    Third-party tools are generally used for that. Conversion Agent by Itemfield is one of the best approaches.
    Also, on the Advanced tab of the file sender adapter, select the check box next to Advanced Mode. There you can specify the Maximum File Size (Bytes) option.
    Huge processing of files
    Night Mare-Processing huge files in SAP XI
    Step-by-Step Guide in Processing High-Volume Messages Using PI 7.1's Message Packaging
    SAP XI acting as a (huge) file mover
    Managing bulky flat messages with SAP XI (tunneling once again) - UPDATED
    Regards,
    Vinod.

  • How to send a large file in XI ?

    How to send a large file in XI ?

    hi,
    use a splitting mechanism to convert large files into smaller files and process them as if they were independent.
    You can use the "Recordsets Per Message" parameter in the file sender communication channel (with File Content Conversion) to create a new message for every 1,000, 10,000, etc. records in the source file... this could be one way of splitting.
    If your scenario works OK for small files, maybe you can develop another scenario that runs before the current one and only splits files (e.g. File-XI-File), then puts the resulting smaller files in the directory that your current scenario monitors.
    Check this blog for huge file processing
    /people/alessandro.guarneri/blog/2007/02/21/sap-xi-acting-as-a-huge-file-mover
    Thanks,
    Vijaya

  • Video File Conversion and Adobe AIR

    Hello guys
    I want to work on a simple app which will convert large video files to small video files. It will be multi-format (and at least support .flv).
    But I have no idea where to start!
    I mean, how do I achieve this file conversion?
    Any blog, library or resource will be highly appreciated.
    Bunch of Thanks

    Thanks a lot for the reply.
    I have not yet tried Alchemy. Is it available to download?
    Secondly, I want to do all the conversion locally, so MERAPI might be a good choice. Let me try it.
    Thanks once again

  • How do I convert a large file which shows an error after converting?

    I've tried to export a PDF to Excel and Word files and have not been able to; it shows an error after downloading, and shows converting in progress.
    The file is less than 100 MB and is 936 pages.

    Hi Large Files to Convert,
    Even when a PDF file is under the 100-MB file size limit, it could be that the large number of pages is adding a level of complexity that causes the ExportPDF service to timeout before it can finish the conversion.
    If the PDF file wasn't created from a scanned document, you can simplify the conversion somewhat by disabling OCR as described in this document: How to disable Optical Character Recognition (O... | Adobe Community
    Otherwise, you might want to consider using a free trial of Acrobat to convert that large file. For more information, see www.adobe.com/products/acrobat.html.
    Best,
    Sara

  • Does SAP XI (PI 7.0) support streaming to support large file/Idoc

    Does SAP XI (PI 7.0) support streaming to support large file/Idoc/Message Structure transfers?

    AFAIK, that is possible with flat files, when you use File Content Conversion.
    Check this blog: /people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
    Regards,
    Henrique.

  • Using XI to FTP large files

    Hi Folks,
    I have a scenario in which I need to transfer a flat file to an external system. No mapping is required. We are not using BPM. I read Michael's comments on using Java proxies to transfer large files. I know that we can use standard Java IO APIs to copy the file over; however, I don't know how to implement this.
    In my scenario an SAP transaction will create the file. I just need XI to pick it up and FTP it to another server. Can you point me in the right direction as to how I should go about implementing this?
    1. I assume I will still have to use the file adapter to pick up the file, right?
    2. Then I use a Java server proxy to FTP it to the target system?
    3. In order to generate the proxy I need a message interface. Should I use a dummy message interface as my inbound and outbound that points to a dummy message type?
    Thanks,
    Birla

    Hi Nilesh,
    Thanks for the reply and the link. However, the blog doesn't provide a solution to my problem. I was asking whether XI can pick up a large file (say 200 MB) and FTP it to an external system without doing content conversion.
    I already read these blogs.
    FTP_TO_FTP
    /people/william.li/blog/2006/09/08/how-to-send-any-data-even-binary-through-xi-without-using-the-integration-repository
    /people/shabarish.vijayakumar/blog/2006/04/03/xi-in-the-role-of-a-ftp
    These blogs suggest that I can use a Java proxy to achieve better performance.
    I just don't know how to implement it.
    Any help would be much appreciated.
    Thanks,
    Birla.
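
    For what it's worth, a minimal sketch of the transfer core such a Java proxy could delegate to, using Apache Commons Net (the host, credentials, and file paths are placeholders, and the XI Java proxy wiring around it is omitted). Streaming straight from disk to the FTP connection keeps memory use flat regardless of file size:

    import java.io.BufferedInputStream;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import org.apache.commons.net.ftp.FTP;
    import org.apache.commons.net.ftp.FTPClient;

    // Streams a local file to an FTP server without loading it into memory.
    public class FtpPush {
        public static void main(String[] args) throws Exception {
            FTPClient ftp = new FTPClient();
            try {
                ftp.connect("target.example.com");   // placeholder host
                ftp.login("xiuser", "secret");       // placeholder credentials
                ftp.enterLocalPassiveMode();
                ftp.setFileType(FTP.BINARY_FILE_TYPE);
                try (InputStream in = new BufferedInputStream(
                        new FileInputStream("/data/outbound/bigfile.dat"))) {
                    if (!ftp.storeFile("bigfile.dat", in)) {
                        throw new IOException("Upload failed: " + ftp.getReplyString());
                    }
                }
            } finally {
                if (ftp.isConnected()) {
                    ftp.logout();
                    ftp.disconnect();
                }
            }
        }
    }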

  • Handling large files in scope of WSRP portlets

    Hi there,
    just wanted to ask if there are any best practices with respect to handling large file uploads/downloads when using WSRP portlets (apart from bypassing WebCenter altogether for these use cases, that is). We continue to get OutOfMemoryErrors and TimeoutExceptions as soon as the file being transferred becomes larger than a few hundred megabytes. The portlet is happily streaming the file as part of its javax.portlet.ResourceServingPortlet.serveResource(ResourceRequest, ResourceResponse) implementation, so the problem must somehow lie within WebCenter itself.
    Thanks in advance,
    Chris
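
    For comparison, the constant-memory streaming loop described above would look roughly like this (the file path is illustrative). If the portlet side really is written this way, the buffering behind the OutOfMemoryErrors has to be happening in the WSRP/WebCenter layer in between:

    import java.io.*;
    import javax.portlet.*;

    public class DownloadPortlet extends GenericPortlet {
        @Override
        public void serveResource(ResourceRequest request, ResourceResponse response)
                throws PortletException, IOException {
            response.setContentType("application/octet-stream");
            // Fixed-size buffer: memory use stays constant however big the file is.
            try (InputStream in = new BufferedInputStream(
                     new FileInputStream("/data/exports/large-file.bin"));
                 OutputStream out = response.getPortletOutputStream()) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
            }
        }
    }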


  • MTS File Conversion for iMovie

    Dear Apple Support Community,
    I am struggling with movie file conversions, and with getting optimum results from the process before starting my editing work in iMovie.
    I have a Canon HD camcorder which records movies according to the AVCHD standard. Before using iMovie, I need to convert the .MTS files to a format which is compatible with iMovie, and I have tried the following:
    Copy the files from the SD card onto my HDD.
    Using Toast 11 (Titanium), I have selected different profiles to convert the files to formats such as MOV, M4V, MP4 etc.
    The converted files yield a size that does not match the original, i.e. a 300 MB file after being converted is now either 791 MB or 96 MB.
    When importing the converted file into iMovie, iMovie still wants to optimize the file for some reason (I believe using the Apple Intermediate Codec resolves the optimization issue).
    Import the files directly from the SD card into iMovie, provided I have not altered the original file structure; i.e., when inserting the SD card, iMovie recognizes it the same way it would a connected camera. When I select the videos to be imported, iMovie does so without hassle, but the files again turn out to be bigger than the originals by a factor of between 2 and 3, depending on the original file.
    What is the best way to convert files to preserve quality as close as possible to the original?
    Is it normal for converted files to be larger than the original after conversion? (Smaller makes sense, but bigger just baffles me.)
    Thanks for any feedback!

    Thanks for the great feedback.
    I do have struggles, though, in general, part of which originate from my camera itself.
    My camera still has a tendency to record an interlaced effect even though my settings are firmly set to progressive (I have 50i and 25p to choose from; 25p is my selection).
    However, when I use Toast Titanium, I have an option to have the source video deinterlaced, but Toast does not do a good job at deinterlacing compared to Handbrake. The benefit of my Toast preset is that I can convert the video to the Apple Intermediate Codec, which means that when the files are imported into iMovie, the process is rather quick.
    When I convert using Handbrake, the software does a pretty good job at deinterlacing, but the downside is that I cannot find a preset that makes the import into iMovie satisfactory.
    So, in light of this long discussion, my question really is: what is the best file converter for iMovie to convert MTS files that need to be deinterlaced?
    Thanks for any help!
    Adios

  • Large File Time Outs

    I tend to need larger files extracted to Excel. I've found that these time out. Are there settings to extend the conversion time? Or is there an option to select specific pages, to break the conversion into separate pieces?


  • BT Cloud - large file ( ~95MB) uploads failing

    I am consistently getting upload failures for any files over approximately 95MB in size.  This happens with both the Web interface, and the PC client.  
    With the Web interface the file upload gets to a percentage that would be around the 95MB amount, then fails showing a red icon with a exclamation mark.  
    With the PC client the file gets to the same percentage equating to approximately 95MB, then resets to 0%, and repeats this continuously.  I left my PC running 24/7 for 5 days, and this resulted in around 60GB of upload bandwidth being used just trying to upload a single 100MB file.
    I've verified this on two PCs (Win XP, SP3), one laptop (Win 7, 64 bit), and also my work PC (Win 7, 64 bit).  I've also verified it with multiple different types and sizes of files.  Everything from 1KB to ~95MB upload perfectly, but anything above this size ( I've tried 100MB, 120MB, 180MB, 250MB, 400MB) fails every time.
    I've completely uninstalled the PC Client, done a Windows "roll-back", reinstalled, but this has had no effect.  I also tried completely wiping the cloud account (deleting all files and disconnecting all devices), and starting from scratch a couple of times, but no improvement.
    I phoned technical support yesterday and had a BT support rep remote control my PC, but he was completely unfamiliar with the application and, after fumbling around for over two hours, had no suggestion other than waiting longer to see if the failure would clear itself!
    Basically I suspect my Cloud account is just corrupted in some way and needs to be deleted and recreated from scratch by BT.  However I'm not sure how to get them to do this as calling technical support was futile.
    Any suggestions?
    Thanks,
    Elinor.

    Hi,
    I too have been having problems uploading a large file (362 MB) for many weeks now, and as this topic is marked as SOLVED I wanted to let BT know that it isn't solved for me.
    All I want to do is share a video with a friend and thought that BT cloud would be perfect!  Oh, if only that were the case :-(
    I first tried web upload (as I didn't want to use the PC client's Backup facility) - it failed.
    I then tried the PC client Backup.... after about 4 hrs of "progress" it reached 100% and an icon appeared.  I selected it and tried to Share it by email, only to have the share fail and no link.   Cloud backup thinks it's there but there are no files in my Cloud storage!
    I too spent a long time on the phone to Cloud support during which the tech took over my PC.  When he began trying to do completely inappropriate and irrelevant  things such as cleaning up my temporary internet files and cookies I stopped him.
    We did together successfully upload a small file and sharing that was successful - trouble is, it's not that file I want to share!
    Finally he said he would escalate the problem to next level of support.
    After a couple of weeks of hearing nothing, I called again and went through the same farce again with a different tech.  After which he assured me it was already escalated.  I demanded that someone give me some kind of update on the problem and he assured me I would hear from BT within a week.  I did - they rang to ask if the problem was fixed!  Needless to say it isn't.
    A couple of weeks later now and I've still heard nothing and it still doesn't work.
    Why can't Cloud support at least send me an email to let me know they exist and are working on this problem.
    I despair of ever being able to share this file with BT Cloud.
    C'mon BT Cloud surely you can do it - many other organisations can!
