Large file processing issue
Hi,
A 2 MB source file is generating an output file of over 180 MB, causing the interface to fail in pre-production and production. It processes successfully in the development box, where there is no web dispatcher or size restriction.
The recommendation from SAP is that we try to reduce the output file size.
Kindly look into the issue ASAP.
Appreciate your help.
Thanks,
Satya Kumar
Hi Satya,
There are several approaches available; check the links below:
/people/stefan.grube/blog/2007/02/20/working-with-the-payloadzipbean-module-of-the-xi-adapter-framework
/people/aayush.dubey2/blog/2007/10/10/zip-transfer-unzip-increase-the-performance-of-your-java-abap-applications
/people/pooja.pandey/blog/2005/10/17/number-formatting-to-handle-large-numbers
/people/alessandro.guarneri/blog/2007/02/21/sap-xi-acting-as-a-huge-file-mover
/people/alessandro.guarneri/blog/2006/03/05/managing-bulky-flat-messages-with-sap-xi-tunneling-once-again--updated
One more option is to develop a ZIP adapter module: zip the file, send it, and unzip it again after processing.
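As a rough illustration of the zip idea, here is a minimal Python sketch using the standard zipfile module. This is not the XI module itself, just the principle; all names are made up:

```python
import io
import zipfile

def zip_payload(filename: str, data: bytes) -> bytes:
    """Compress a payload into an in-memory ZIP archive."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(filename, data)
    return buf.getvalue()

def unzip_payload(zipped: bytes) -> bytes:
    """Extract the first entry from an in-memory ZIP archive."""
    with zipfile.ZipFile(io.BytesIO(zipped)) as zf:
        return zf.read(zf.namelist()[0])

# Repetitive XML payloads (like a bloated 180 MB output file) compress very well.
payload = b"<Row><Duns>001001924</Duns></Row>" * 1000
zipped = zip_payload("payload.xml", payload)
assert unzip_payload(zipped) == payload
assert len(zipped) < len(payload) // 10
```

Within XI itself, the PayloadZipBean from the first link above appears to provide exactly this inside the adapter module chain, so a fully custom adapter may not be necessary.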
Regards
Ramesh
Similar Messages
-
Large file processing in file adapter
Hi,
We are trying to process a large file of ~280 MB and we are getting timeout errors. I applied all the recommended tunings for memory and heap sizes, and the problem still exists. I want to know whether installing a decentral adapter engine just for this large-file processing might solve the problem, which I doubt.
Based on my personal experience, there may be a file-size limit in XI of roughly 100 MB with minimal mapping and no BPM.
Any comments on this would be appreciated.
Thanks
Steve
Dear Steve,
This might help you,
Topic #3.42
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/70ada5ef-0201-0010-1f8b-c935e444b0ad#search=%22XI%20sizing%20guide%22
/people/sap.user72/blog/2004/11/28/how-robust-is-sap-exchange-infrastructure-xi
This sizing guide and its memory calculations will be useful for dealing with this issue further.
http://help.sap.com/bp_bpmv130/Documentation/Planning/XISizingGuide.pdf#search=%22Message%20size%20in%20SAP%20XI%22
File Adapter: size of your processed messages
Regards
Agasthuri Doss -
Java.io.IOException during large file processing on PI 7.1
Hello Colleagues,
for a large-file scenario on our PI 7.1 system, we have to verify which file sizes we are able to process through PI.
During handing over the large file (200 MB XML) form the Adapter Frame Work (File Adapter) to the Integration Engine we receive following error:
Transmitting the message to endpoint http://<host>:<port>/sap/xi/engine?type=entry using connection File_http://sap.com/xi/XI/System failed, due to: com.sap.engine.interfaces.messaging.api.exception.MessagingException: Error transmitting the message over HTTP. Reason: java.io.IOException: Error writing to server.
Message processing stops and the message remains in the Adapter Framework. Files up to 100 MB we are able to process successfully.
Please, could you tell me why this happened and how we are able to solve it?
Since there is no java.lang.OutOfMemoryError but an IOException, could it still be a memory issue?
Many thanks in advance!
Regards,
Jochen
Hi Jochen,
Indeed, the error is an IO error: the adapter engine was not able to send the message to the Integration Server. It typically happens due to memory/heap-size issues.
Look at these threads; they deal with the same problem. Please try the remedies suggested in them:
Mail to Proxy scenario with attachment. AF channel error.
Error with huge file
problem with big file in file-to-proxy scenario
Is there any additional information in the adapter messaging tool?
Regards
Suraj
Edited by: S.R.Suraj on Oct 1, 2009 8:55 AM -
Bottleneck in Large file processing
Hi,
We are experiencing timeout and memory issues during large-file processing. I want to know whether the J2EE Adapter Engine or the Integration Engine is the bottleneck when processing large messages, e.g. files of over 300 MB, without splitting the files.
Thanks
Steve
Hi Mario,
We are testing a scenario to find out what is the maximum file size that XI can handle based on the blog
( /people/william.li/blog/2006/09/08/how-to-send-any-data-even-binary-through-xi-without-using-the-integration-repository ) without any mapping. Up to 20 MB it works OK; after that we get a timeout error.
Data from Moni:
com.sap.engine.services.httpserver.exceptions.HttpIOException: Read timeout. The client has disconnected or a synchronization error has occurred. Read [1704371] bytes. Expected [33353075].
    at com.sap.engine.services.httpserver.server.io.HttpInputStream.read(HttpInputStream.java:186)
    at com.sap.aii.af.service.util.ChunkedByteArrayOutputStream.write(ChunkedByteArrayOutputStream.java:181)
    at com.sap.aii.af.ra.ms.transport.TransportBody.<init>(TransportBody.java:99)
    at com.sap.aii.af.ra.ms.impl.core.transport.http.MessagingServlet.doPost
This could be due to ICM timeout settings which we are planning to increase.
I would like to hear about others' experience with the maximum file size they could process. Of course, I do know that it depends on the environment.
Thanks
Steve -
Large file processing in XI 3.0
Hi,
We are trying to process a large file of ~280 MB and we are getting timeout errors. I applied all the recommended tunings for memory and heap sizes, and the problem still exists. I want to know whether installing a decentral adapter engine just for this file processing might solve the problem, which I doubt.
Based on my personal experience, there may be a file-size limit in XI of roughly 100 MB with minimal mapping and no BPM.
Any comments on this would be appreciated.
Thanks
Steve
Hi Debnilay,
We do have a 64-bit architecture and still have the file-processing problem. Currently we are splitting the file into smaller chunks and processing them, but we want to process the file as a whole.
Thanks
Steve -
File Splitting for Large File processing in XI using EOIO QoS.
Hi
I am currently working on a scenario to split a large file (700 MB) using the sender file adapter's "Recordset Structure" property (e.g. Row,5000). As the files are split and mapped, they are appended to a destination file. In an example scenario, if a 700 MB file comes in with, say, 20000 records, the destination file should end up with 20000 records.
To ensure no records are missed during processing through XI, the EOIO quality of service is used. A trigger record (with the same structure as the main payload recordset) is appended to the incoming file by a UNIX shell script before the sender file adapter reads it.
XPath conditions are evaluated in the receiver determination to either append the records to the main destination file or create a trigger file containing only the trigger record.
The problem we face is that "Recordset Structure" (e.g. Row,5000) splits in chunks of 5000, and when the remaining records of the main payload number fewer than 5000 (say 1300), those remaining 1300 lines get grouped with the trigger record and written to the trigger file instead of the actual destination file.
For the sake of this forum I have listed a sample XML file representing the inbound file, with the last record (Duns = "9999") as the trigger record that marks the end of the file after splitting and appending.
<?xml version="1.0" encoding="utf-8"?>
<ns:File xmlns:ns="somenamespace">
<Data>
<Row>
<Duns>"001001924"</Duns>
<Duns_Plus_4>""</Duns_Plus_4>
<Cage_Code>"3NQN1"</Cage_Code>
<Extract_Code>"A"</Extract_Code>
</Row>
<Row>
<Duns>"001001925"</Duns>
<Duns_Plus_4>""</Duns_Plus_4>
<Cage_Code>"3NQN1"</Cage_Code>
<Extract_Code>"A"</Extract_Code>
</Row>
<Row>
<Duns>"001001926"</Duns>
<Duns_Plus_4>""</Duns_Plus_4>
<Cage_Code>"3NQN1"</Cage_Code>
<Extract_Code>"A"</Extract_Code>
</Row>
<Row>
<Duns>"001001927"</Duns>
<Duns_Plus_4>""</Duns_Plus_4>
<Cage_Code>"3NQN1"</Cage_Code>
<Extract_Code>"A"</Extract_Code>
</Row>
<Row>
<Duns>"001001928"</Duns>
<Duns_Plus_4>""</Duns_Plus_4>
<Cage_Code>"3NQN1"</Cage_Code>
<Extract_Code>"A"</Extract_Code>
</Row>
<Row>
<Duns>"001001929"</Duns>
<Duns_Plus_4>""</Duns_Plus_4>
<Cage_Code>"3NQN1"</Cage_Code>
<Extract_Code>"A"</Extract_Code>
</Row>
<Row>
<Duns>"9999"</Duns>
<Duns_Plus_4>""</Duns_Plus_4>
<Cage_Code>"3NQN1"</Cage_Code>
<Extract_Code>"A"</Extract_Code>
</Row>
</Data>
</ns:File>
In the sender file adapter I have, for test purposes, changed the "Recordset Structure" to "Row,5" for the sample inbound XML file above.
I have two XPath expressions in the receiver determination; the one matching Duns = "9999" is supposed to route only the trigger recordset to the receiver (communication channel) that creates the trigger file.
In my test case the first 5 records get appended to the correct destination file, but the last two records (the 6th and 7th) get sent to the receiver channel that is only supposed to take the trigger record (the last record, with Duns = "9999").
Destination file (this is where all the records with Duns NE "9999" are supposed to get appended):
<?xml version="1.0" encoding="UTF-8"?>
<R3File>
<R3Row>
<Duns>"001001924"</Duns>
<Duns_Plus_4>""</Duns_Plus_4>
<Extract_Code>"A"</Extract_Code>
</R3Row>
<R3Row>
<Duns>"001001925"</Duns>
<Duns_Plus_4>""</Duns_Plus_4>
<Extract_Code>"A"</Extract_Code>
</R3Row>
<R3Row>
<Duns>"001001926"</Duns>
<Duns_Plus_4>""</Duns_Plus_4>
<Extract_Code>"A"</Extract_Code>
</R3Row>
<R3Row>
<Duns>"001001927"</Duns>
<Duns_Plus_4>""</Duns_Plus_4>
<Extract_Code>"A"</Extract_Code>
</R3Row>
<R3Row>
<Duns>"001001928"</Duns>
<Duns_Plus_4>""</Duns_Plus_4>
<Extract_Code>"A"</Extract_Code>
</R3Row>
</R3File>
Trigger File:
<?xml version="1.0" encoding="UTF-8"?>
<R3File>
<R3Row>
<Duns>"001001929"</Duns>
<Duns_Plus_4>""</Duns_Plus_4>
<Ccr_Extract_Code>"A"</Ccr_Extract_Code>
</R3Row>
<R3Row>
<Duns>"9999"</Duns>
<Duns_Plus_4>""</Duns_Plus_4>
<Ccr_Extract_Code>"A"</Ccr_Extract_Code>
</R3Row>
</R3File>
I've tested the XPath condition in XML Spy and it works fine. My doubts are about the "Recordset Structure" property set to "Row,5".
Any suggestions on this will be very helpful.
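A small sketch (in Python, entirely outside XI, with made-up names) of why the trigger ends up in the wrong file: fixed-size chunking always leaves the appended trigger record in the final, partial chunk together with the leftover data records, so a routing condition that matches the trigger chunk carries those leftovers with it.

```python
def chunk_records(records, size):
    """Split records into consecutive chunks of at most `size` records,
    mimicking the sender file adapter's Recordset Structure = Row,N."""
    return [records[i:i + size] for i in range(0, len(records), size)]

# Six data records plus the appended trigger record ("9999"),
# chunked in fives as in the "Row,5" test case from the post.
records = ["001001924", "001001925", "001001926",
           "001001927", "001001928", "001001929", "9999"]
chunks = chunk_records(records, 5)

# The first chunk holds five data records; the last chunk holds the
# remaining data record *and* the trigger. A condition keyed on
# Duns = "9999" therefore routes that whole chunk to the trigger file.
assert chunks[0] == ["001001924", "001001925", "001001926",
                     "001001927", "001001928"]
assert chunks[1] == ["001001929", "9999"]
```

One possible direction, under that reading, is to route per record rather than per recordset, or to pad the input so the trigger always lands alone in its own recordset.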
Thanks,
Mujtaba
Hi Debnilay,
We do have a 64-bit architecture and still have the file-processing problem. Currently we are splitting the file into smaller chunks and processing them, but we want to process the file as a whole.
Thanks
Steve -
Help with Aperture/T2i/iMovie/large file ( 2GB) issue.
I am having an issue with a very large file (3.99 GB) that I shot with my Rebel T2i. The file imports fine into Aperture, but then it isn't recognized at all by iMovie. I found out that iMovie can only handle files that are 2 GB or smaller.
So now, I am trying to figure out how to chop my mega file neatly into a pair of 2GB halves. When I use the trim function, that does not seem to do the trick -- this may be a case of Aperture's non-destructive nature actually working against me.
Does anyone have a solution for this? My intuition suggests that this may be a job for QuickTime Pro -- but I wasn't sure how that works now that we all have QuickTime X.
Much appreciated.
The file may well be in the wrong format; can you tell us more about it? See This
-
CF8.01 Large Files Upload issue
We are having an issue with posting large files to the server through CFFile. Our server runs on Windows 2003 R2 SP2 with 2 GB of RAM. The average upload size is 800 MB, and we may run into multiple simultaneous uploads at that size. So we adjusted "Maximum size of post data" to 2000 MB and "Request Throttle Memory" to 5000 MB in the ColdFusion admin settings, hoping to allow up to 5 simultaneous uploads.
However, when we tried to launch two 800 MB uploads at the same time from different machines, only one upload got through. The other returned a "Cannot connect to the server" error after a few minutes. No errors can be found in the W3C log or the ColdFusion logs (coldfusion-out.log and exception.log), but it is reported in HTTPErr1.log with the following message:
2008-04-18 08:16:11 204.13.x.x 3057 65.162.x.x 80 HTTP/1.1 POST /testupload.cfm - 1899633270 Timer_EntityBody DefaultAppPool
Can anyone shed some light on it? Thanks!
quote:
Originally posted by:
Newsgroup User
Don't forget that your web server (IIS, Apache, etc.) can have upload throttles as well.
We did not throttle our IIS to limit upload/download bandwidth.
Is there a maximum limit for the "Request Throttle Memory" setting? Can we set it over the available physical RAM size? -
When we are downloading a large file (100 MB+) with Internet Explorer, the workstation will at some point report that the connection was reset by the server, and we lose the download. This is somewhat random and I've only seen it in IE. I have and use Firefox at my desk and have no trouble like this. But our campus standard is IE and I really would like to pin this down.
My fear is the packet filters. I went and looked at the packet filters we have set up and was shocked to see we had 77 filter exceptions. My last look only showed 45, so I need to sit down, review every filter exception, and thin the herd. Anyway, below are the exceptions for HTTP, which is what I believe IE uses for downloading files. Right? I also wanted a second opinion from you guys.
We have NW 6.5 SP5, BM 3.8 SP4 IR3, and I use Craig's tuneup and proxy.cfg.
The packet type HTTP is defined as:
Protocol TCP
SRC Port ALL
DEST Port 80
ACK bit Filtering disabled
Stateful Filtering enabled
The filter exceptions are:
#15
Packet Type: HTTP
Source: B57_1_EII
All Circuits
Any Address
Destination: B57_2_EII
All Circuits
66.89.73.96/255.255.255.224
Comment:
#22
Packet Type: HTTP
Source: <All Interfaces>
All Circuits
Any Address
Destination: CE1000_1
All Circuits
Any Address
Comment:
#24
Packet Type: HTTP
Source: B57_2_EII
All Circuits
Any Address
Destination: <All Interfaces>
All Circuits
Any Address
Comment: Added by BRDCFG to allow HTTP proxy.
#67
Packet Type: HTTP
Source: B57_2_EII
All Circuits
12.22.25.0/255.255.255.240
Destination: <All Interfaces>
All Circuits
192.168.3.7
Comment: Inbound http for powerschool
#75
Packet Type: HTTP
Source: B57_1_EII
All Circuits
192.168.3.6
Destination: <All Interfaces>
All Circuits
Any Address
Comment:
Thanks
David
When I looked at the version level of the Proxy.cfg I use (the one you wrote), it was version 21; I did download the current version. Would the differences between the versions cause any of the behavior I've seen? I didn't see much in there that looked too different.
Other Comments inside the threaded message:
> On your stateful HTTP exceptions, you are:
> #15 Allowing all HTTP to network 66.89.73.96/255.255.255.224
This is to allow private-LAN access to the public IP subnet range, for managing routers etc. that are outside this firewall.
> #22 Allowing all HTTP to cross interface CE1000_1 - if this is the public
> interface, you are allowing all outbound HTTP without using a proxy. If
> this is the private interface, you are allowing all inbound HTTP through
> any static NAT connection.
CE1000_1 is the interface that is connected to our DMZ. We host 3 webserver
in the DMZ for public access. One is our Groupwise Webaccess.
> #24 Allowing all HTTP as long as it originates from the B57_2 interface.
> If the public IP address were also specified, it would be one of the
> default exceptions intended to allow proxy to send HTTP requests out from
> the server to the Internet. Because source/destination IP address is
> missing, it effectively allows traffic into the B57_2 interface as well,
> assuming the B57_2 interface is public.
#24 here is the default added by BM; I don't think I ever touched this one. This is a set of filters imported from our earlier BM versions when we built this box. It sounds like it should have the public IP address listed as the source IP. Would you suggest adding that?
> #67 Assuming B57_2 is public, you are allowing HTTP traffic to come from
> a specific network, into a host at 192.168.3.7, via static NAT.
Correct. We have a turnkey box with a Windows application that is managed by the vendor via HTTP; that is their IP range, and the target IP is 192.168.3.7, which holds a static NAT in the 66.89.73 range on the BM server.
> #75 Hard for me to tell what this is doing, as it appears the B57
> interface is your public interface. It will allow any HTTP from IP
> address 192.168.3.6 to anywhere, but only if that address is on the B57
> side of the server, which sounds incorrect. I'm guessing that the
> 192.168.3.x network is on the CE1000 side of the server, in which case
> this is a useless filter exception.
The interface is B57_1_EII, and it is the private LAN interface. The intent is to allow the spam-filter box HTTP access to the Internet without requiring a proxy. The box is unable to accept a proxy setting, and the way it gets its updates from the vendor is via HTTP.
Thanks
David -
Need advice on large file processing with good performance
Hi All,
I am working on a program in which I have to read millions of records from an application server file. For this, I read 1 million records at a time and upload them into the DB table.
What is the best approach to processing millions of records? What I currently do is read the 1 million records one by one, modify each record based on some conditions, store them in one internal table, and update the DB table.
An alternative approach I am considering is to read 1 million records into one internal table, then, within a loop, modify each record for the given condition and update the DB table.
Which approach is the best one?
You can also advise me of any other alternative approaches with good performance.
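Both variants boil down to the same chunked read-transform-update loop. A language-neutral sketch in Python (the 1-million chunk size comes from the question; everything else here is hypothetical, not the actual ABAP program):

```python
from itertools import islice

def process_in_chunks(reader, transform, writer, chunk_size=1_000_000):
    """Read records in fixed-size chunks, transform each record, and
    hand every transformed chunk to the writer (e.g. a DB update)."""
    total = 0
    while True:
        chunk = list(islice(reader, chunk_size))
        if not chunk:
            break
        # Transform the whole chunk in memory, then write it in one go,
        # so at most one chunk of records is held at any time.
        writer([transform(rec) for rec in chunk])
        total += len(chunk)
    return total

# Tiny demonstration with an in-memory "file" and "table".
table = []
count = process_in_chunks(iter(range(10)), lambda r: r * 2,
                          table.extend, chunk_size=4)
assert count == 10
assert table == [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

The trade-off between the two approaches in the question is mainly peak memory per chunk versus the number of DB round trips, so measuring both on real data, as done below, is the right call.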
Regards,
Nivas
Edited by: Nivas4081 on Jul 24, 2008 2:55 PM
Hi Joshi,
Thanks for your reply. I have tested both ways, as I mentioned in my query, but reading record by record and updating in data packets takes less time than reading into an internal table, then modifying and updating the DB table.
Hi ralph,
Thanks for the reply.
The modifications are similar in all the lines. I get related data from another class/method, do some calculation, and modify each record.
Are there any performance tricks to follow when processing large amounts of data? By the way, I am reading a certain number of records, say 400K, and updating the DB table using parallel processing.
Apart from this, any suggestions on this?
Regards,
Nivas -
OutOfMemory error on large file processing in sender file adapter
Hi Experts,
I have a file-to-IDoc scenario; on the sender side we are using a remote adapter engine. Files process fine when the size is below 5 MB; if the file size is more than 5 MB, we get java.lang.OutOfMemoryError: Java heap space. Can anyone suggest which parameter I need to change in order to process more than 5 MB?
Hi Praveen,
SAP's suggestion is not to process huge files in one go. Instead, you can split the file into chunks and process them.
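The split-into-chunks advice can be sketched as follows (a minimal Python illustration with hypothetical names; in practice an OS-level script before the adapter, or the adapter's recordset settings, would do this):

```python
import os
import tempfile

def write_chunk(path, index, lines):
    """Write one numbered chunk file and return its name."""
    out = f"{path}.part{index:03d}"
    with open(out, "w") as dst:
        dst.writelines(lines)
    return out

def split_file(path, lines_per_chunk):
    """Split a file into numbered chunk files, holding at most one
    chunk of lines in memory at a time."""
    chunk_paths, chunk, index = [], [], 0
    with open(path) as src:
        for line in src:
            chunk.append(line)
            if len(chunk) == lines_per_chunk:
                chunk_paths.append(write_chunk(path, index, chunk))
                chunk, index = [], index + 1
        if chunk:  # final partial chunk
            chunk_paths.append(write_chunk(path, index, chunk))
    return chunk_paths

# Demonstration: 10 lines split into chunks of 4 gives 3 chunk files.
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "big.txt")
    with open(src, "w") as f:
        f.writelines(f"record {i}\n" for i in range(10))
    parts = split_file(src, 4)
    assert len(parts) == 3
```

Each resulting chunk file then stays under whatever size the adapter and heap settings can comfortably handle.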
To increase the heap Memory:
For Java heap-size and other memory requirements, Refer to the following OSS note for details:
Note 723909 - Java VM settings for J2EE 6.40/7.0
Also check out the OSS Note 862405 - XI Configuration Options for Lowering Memory Requirements
There are oss notes available for the Java Heap size settings specific to the OS environment too.
Thanks, -
File processing issue -File- PI- RFC
Dear Experts,
I am stuck with a source file while processing the scenario FTP (File) -> PI -> RFC.
I need to take the data from a CSV file
and push the first row's data to the CHAR field and the second row's data to the MEAN field in the RFC.
FCC structure:
The source data is simple, but I am stuck on some logic. Kindly guide me on how to process this data using FCC, or whether it can be handled in mapping. Also please check whether I have built the source structure correctly.
Best Regards,
Monikandan
Hi Amit,
Exactly: I need to send the entire first row's data to the CHAR description field and the second row's data to the MEAN field.
While testing the message mapping with the source file, I get the output below.
The exact output I need is: the "Process -001,Date" row data goes to the CHAR field, and the "10000,20-02-05" row data goes to the MEAN field in the RFC.
How do I split it? Kindly guide me.
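Outside PI, the row-to-field assignment being described looks roughly like this (Python sketch; the sample values are the ones quoted in the thread, the variable names are just labels):

```python
import csv
import io

# Hypothetical two-row CSV matching the description: the first row
# feeds the CHAR description field, the second row feeds the MEAN field.
sample = "Process -001,Date\n10000,20-02-05\n"

rows = list(csv.reader(io.StringIO(sample)))
char_field = ",".join(rows[0])  # entire first row, re-joined
mean_field = ",".join(rows[1])  # entire second row, re-joined

assert char_field == "Process -001,Date"
assert mean_field == "10000,20-02-05"
```

In PI this split would typically be expressed through the FCC recordset settings or a small UDF in the mapping that concatenates each row's fields.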
Best Regards,
Monikandan. -
Hi Group,
I am facing a problem in XI while processing a 48 MB file through the file adapter; I have used content conversion in the design.
I am using a normal 64-bit operating system with a maximum heap size of 2 GB, and I still face the problem. Can anybody tell me how much heap I need to process a 48 MB file through XI?
Hi,
Refer to the following SAP notes (go to www.service.sap.com/notes):
File adapter FAQ - 821267
Java heap - 862405
Java settings - 722787
This blog may give some insight: /people/sap.user72/blog/2004/11/28/how-robust-is-sap-exchange-infrastructure-xi
/people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
By the way, if the error says a trailer is missing, where are you getting this error?
Regards,
Moorthy -
All,
I am using a file-to-file scenario. The file adapter is picking up all my files from the source directory, and I see them successfully processed in transaction SXMB_MONI.
But the files are not delivered to my target location.
Any ideas.
Thanks,
Nandini
Hi Nandini,
To find the exact error, check the CC monitoring on the receiver side and the log; it will tell you what the problem could be.
I suspect you have a problem with the receiver CC's FCC parameters. Double-check them.
You can see the CC monitoring via RWB -> Adapter -> CC Monitoring.
Warm Regards,
Vijay -
Another Large file processing question.
I have a file that's about 500,000 lines long. I need to comma-delimit it. I can't do it in Excel for obvious reasons. My last resort is Linux. It contains fixed-width columns and I need to put commas between the columns. I know how wide each column is and its position. How can I add commas to this file in the specific places I need? Do I need a script? Can I use 'sed'?
sed or awk may be the tools you want to try for this task. Documentation can be found through the man command or using your favorite search engine.
C.
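A minimal sketch of the fixed-width-to-CSV conversion (Python; the column widths here are made up, so substitute the real layout):

```python
def fixed_width_to_csv(line, widths):
    """Slice one fixed-width line into fields of the given widths
    and join them with commas, trimming trailing padding."""
    fields, pos = [], 0
    for w in widths:
        fields.append(line[pos:pos + w].rstrip())
        pos += w
    return ",".join(fields)

# Hypothetical layout: a 10-char, a 5-char, and a 1-char column.
line = "001001924 3NQN1A"
assert fixed_width_to_csv(line, [10, 5, 1]) == "001001924,3NQN1,A"

# Converting a 500,000-line file is then a simple streaming loop:
# with open("in.txt") as src, open("out.csv", "w") as dst:
#     for row in src:
#         dst.write(fixed_width_to_csv(row.rstrip("\n"), [10, 5, 1]) + "\n")
```

With sed, something like `sed -E 's/^(.{10})(.{5})/\1,\2,/' in.txt` inserts commas at the same hypothetical positions, though it does not trim the padding inside each column.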