SAP ISR - XI - SAP POS: file size more than 11 KB failing in inbound

Hi All,
We are implementing an SAP ISR - XI - POS Retail scenario using the
standard content Store Connectivity 2.0, GM Store Connectivity 1.0, and
other content.
In our inbound File-RFC scenario, we pick up sales and totals data files from an FTP server. If the size of a sales data file (format *_XI_INPUT.DAT) is greater than 11 KB, it fails at the XI mapping level with: Exception occurred during XSLT mapping "GMTLog2IXRetailPOSLog" of the application. We have tried and tested at the mapping level and found no error, as files below 11 KB are processed successfully with the same mappings; also, this is a standard mapping delivered by SAP in the XI content Store Connectivity 2.0.
On the XI side we have processed a 40 KB file, for example, by splitting the record data into files of less than 11 KB each, and those were processed successfully, but the unsplit 40 KB file fails.
XI server: AIX.
There may be some memory setting missing, or this may be a Basis problem. Kindly let me know how to proceed.
Regards,
Surbhi Bhagat

Hi,
It is hard to believe that such small files cannot be processed.
Do your XI mappings work for any other flows with payloads larger than 11 KB?
Let me know about that and then we will know some more, as this is really a very small size.
Maybe your XI was installed on a PocketPC.
Regards,
Michal Krawczyk

Similar Messages

  • Load and Read XML file size more than 4GB

    Hi All,
    My environment is Oracle 10.2.0.4 on Solaris, and I have PL/SQL processes that work with an XML file as detailed below:
    1. I read the XML file over an HTTP port into an XMLTYPE column in a table.
    2. I read the value from step 1 from the table and extract it to insert into another table.
    On the test DB everything works, but I get the error below when I use the production XML file:
         ORA-31186: Document contains too many nodes
    The current XML size is about 100 MB, but the procedure must support XML files larger than 4 GB in the future.
    Below are some parts of my code for your info.
    1. Read the XML in chunks into a variable and insert it into the table:
    BEGIN
      LOOP
        UTL_HTTP.read_text(http_resp, v_resptext, 32767);
        DBMS_LOB.writeappend(v_clob, LENGTH(v_resptext), v_resptext);
      END LOOP;
    EXCEPTION
      WHEN UTL_HTTP.end_of_body THEN  -- raised when the response body is exhausted
        UTL_HTTP.end_response(http_resp);
    END;
    INSERT INTO XMLTAB VALUES (XMLTYPE(v_clob));
    2. Read the cell values from the XML column and extract them to insert into another table:
    DECLARE
      v_TempValue VARCHAR2(50);
      CURSOR c_xml IS
        SELECT TRIM(y.cvalue)
          FROM XMLTAB xt,
               XMLTable('/Table/Rows/Cells/Cell' PASSING xt.XMLDoc
                        COLUMNS cvalue VARCHAR2(50) PATH '.') y;
    BEGIN
      OPEN c_xml;
      LOOP
        FETCH c_xml INTO v_TempValue;
        EXIT WHEN c_xml%NOTFOUND;
        -- <Generate insert statement into another table>
      END LOOP;
      CLOSE c_xml;
    END;
    One more problem is performance: when the XML file is big, the first step, loading the XML content into the XMLTYPE column, is slow.
    Could you please suggest a solution to read large XML files and improve performance?
    Thank you in advance.
    Hiko      

    See Mark Drake's (Product Manager Oracle XMLDB, Oracle US) response in this old post: ORA-31167: 64k size limit for XML node
    The "in a future release" reference means that this 64K-per-node boundary was lifted in 11g and onwards.
    So first of all, if only for the performance improvements, I would strongly suggest upgrading to a database version which is supported by Oracle (see My Oracle Support). In short, Oracle 10.2.x was in extended support up to summer 2013, if I am not mistaken, and is currently not supported anymore.
    If you are able to upgrade, please use the much, much better performing XMLType SecureFile Binary XML storage option instead of the XMLType (BasicFile) CLOB storage option.
    HTH
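    For reference, a minimal sketch of that storage option (the table and column names are hypothetical; this syntax requires 11g or later):
    CREATE TABLE xmltab (xmldoc XMLTYPE)
      XMLTYPE COLUMN xmldoc STORE AS SECUREFILE BINARY XML;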

  • Broadcasting results not transferring to AL11 if file size is more than 3 MB

    Hi All,
    I am broadcasting workbook results to AL11 (the application server) using a standard SAP program. If the result file is larger than 3 MB, precalculation works fine but the file is not transferred to the application server. Could you please let me know if there is a setting to increase the transfer limit to AL11? In fact I am in touch with Basis.
    Thanks in advance.
    Regards,
    J B

  • File size more than 50 MB

    Hi
    I have a scenario where my file size is 50 MB. I have to split the file and then do the processing. What is the best approach?
    venkat

    Hi
    Splitting the file before it enters XI would considerably reduce the load on XI; a minimal pre-splitter is sketched below.
    Please refer to the following links:
    /people/alessandro.guarneri/blog/2006/03/05/managing-bulky-flat-messages-with-sap-xi-tunneling-once-again--updated
    /people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
    Also check this valuable thread where the size-related issue is discussed: Re: Reg: The Maximum size of a file that can be processed
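    A minimal sketch of such a pre-splitter (plain Java; the class, method, and chunk size are assumptions, not anything SAP ships) that cuts a large flat file into fixed-size line chunks before the XI sender file adapter polls the directory:

    import java.io.*;

    public class FileSplitter {
        // Split a large flat file into chunks of at most maxLines lines each, so
        // the file adapter picks up several small files instead of one big one.
        public static void split(File input, File outDir, int maxLines) throws IOException {
            try (BufferedReader in = new BufferedReader(new FileReader(input))) {
                PrintWriter out = null;
                String line;
                int lineCount = 0, part = 0;
                while ((line = in.readLine()) != null) {
                    if (out == null || lineCount == maxLines) {
                        if (out != null) out.close();
                        out = new PrintWriter(new FileWriter(new File(outDir, input.getName() + ".part" + part++)));
                        lineCount = 0;
                    }
                    out.println(line);
                    lineCount++;
                }
                if (out != null) out.close();
            }
        }
    }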

  • XML to Idoc- XML size more than 200 MB

    Hi Experts,
    I have an interface whose source XML file is larger than 200 MB. I need to map it to an IDoc and send it to ERP. I am planning to change the WSDL of the IDoc (changing the occurrence from 1..1 to 0..99999) to send multiple IDocs in one message.
    I want to split the XML file somehow and send it. What options do I have to achieve that? I am short of ideas.
    I know that if we had a flat file, we could use 'Recordsets per Message' in the sender file adapter.
    OR
    The client is willing to send it as a SOAP request, but I do not want to use SOAP for such a big file.
    Regards
    Inder

    Hi Inder,
    As per SAP recommendation we can handle up to 100 MB; you need to tune your server by increasing the parameters so that you can handle messages with a big payload.
    By default the parameter icm/HTTP/max_request_size_KB is 102400, which can handle a file size of 100 MB; if you increase the parameter value by tuning your system, you can process a file bigger than that.
    Please refer to the links below for reference:
    [link1|http://help.sap.com/saphelp_nw04s/helpdata/en/58/108b02102344069e4a31758bc2c810/content.htm]
    [link2|http://help.sap.com/saphelp_nwpi71/helpdata/de/95/1528d8ca4648869ec3ceafc975101c/content.htm]
    As per the above suggestions, the best practice is to split the payload into chunks and send it as multiple IDocs; one way to do the splitting is sketched below.
    Cheers!
    Naveen.
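    A minimal sketch of such a splitter (plain Java; it assumes each <Record> element occupies one line of the export and that the root element is <Records>, both purely illustrative assumptions) that re-wraps fixed-size chunks of records as standalone XML files:

    import java.io.*;
    import java.nio.file.*;
    import java.util.*;

    public class XmlChunker {
        // Split <Records><Record>...</Record>...</Records> into files holding
        // at most maxRecords records each, assuming one <Record> per line.
        public static void chunk(Path input, Path outDir, int maxRecords) throws IOException {
            List<String> buffer = new ArrayList<>();
            int part = 0;
            try (BufferedReader in = Files.newBufferedReader(input)) {
                String line;
                while ((line = in.readLine()) != null) {
                    if (!line.trim().startsWith("<Record>")) continue; // skip prolog and root tags
                    buffer.add(line.trim());
                    if (buffer.size() == maxRecords) writeChunk(outDir, part++, buffer);
                }
            }
            if (!buffer.isEmpty()) writeChunk(outDir, part, buffer);
        }

        private static void writeChunk(Path outDir, int part, List<String> records) throws IOException {
            try (PrintWriter w = new PrintWriter(Files.newBufferedWriter(outDir.resolve("chunk" + part + ".xml")))) {
                w.println("<?xml version=\"1.0\"?>");
                w.println("<Records>");
                records.forEach(w::println);
                w.println("</Records>");
            }
            records.clear(); // reuse the buffer for the next chunk
        }
    }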

  • Handling XML messages of size more than 100 MB in SAP PI 7.1

    Dear Experts,
    Is it possible for PI to pick up and process an XML message larger than 100 MB in PI 7.1 EHP 1?
    If yes, can you please let me know how to handle it?
    Thank  you.

    Hi Saravana,
    It is not best practice to process more than 100 MB.
    You can increase the parameters below so that you can process as well as possible:
    • UME parameters: we may need to look into the pool size and pool max wait parameters - UME recommended parameters (like poolmaxsize=50, poolmaxwait=60000)
    • Tuning parameters: we may need to look at/define the message size limit (like EO_MSG_SIZE_LIMIT = 0000100) under the tuning category
    • ICM parameters: we may need to consider ICM parameters (e.g. icm/conn_timeout = 900000, icm/HTTP/max_request_size_KB = 2097152)
    Thanks and Regards,
    Naveen

  • OutOfMemory while creating a new object for a file with size more than 100 MB

    I have created an application which generates a report by getting the data from our archived files (.zip files). By the time the application reaches a file larger than 100 MB, it runs out of memory while creating the object for that particular file. Can someone help me by telling me if there is a way to resolve this issue?
    Thanks in advance

    If you're getting OutOfMemoryError, the simplest thing to try is to give the VM more memory at startup. For Sun's VM, I believe the default is 64 MB. You can increase this by using the -X args that control the heap size. For example: java -Xms128m -Xmx256m ... etc. ... This says start with 128 MB of heap, and allow it to grow up to 256 MB.
    One thing to consider, though, is: do you need that much stuff in memory at once? A more intelligent approach might be to unpack the archive to the file system, and then read a file at a time, a line at a time, or whatever unit is appropriate for the processing you need to do; a sketch of that approach follows.
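    As an illustration of that streaming approach, a minimal sketch (class and method names are hypothetical; it assumes text entries) that reads the archive one entry and one line at a time, so no file is ever fully held in memory:

    import java.io.*;
    import java.util.zip.*;

    public class ZipReportReader {
        public static void process(File archive) throws IOException {
            try (ZipInputStream zip = new ZipInputStream(new FileInputStream(archive))) {
                ZipEntry entry;
                while ((entry = zip.getNextEntry()) != null) {
                    // Read the current entry line by line; ZipInputStream signals
                    // end-of-entry as end-of-stream, so readLine() returns null.
                    BufferedReader reader = new BufferedReader(new InputStreamReader(zip));
                    String line;
                    while ((line = reader.readLine()) != null) {
                        // accumulate report figures here instead of materialising
                        // the whole 100+ MB file as one object
                    }
                }
            }
        }
    }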

  • Couldn't Upload file with size more than 2K

    Hi,
    I am developing a web application in which I have to upload a file. We are using Hibernate for data storage and retrieval. The problem comes when I upload a file larger than 2K bytes; if the file size is less than 2K it works fine. I found that this is a bug in the thin JDBC driver version 9, but I am using the 10g JDBC driver and the problem still exists. I am not supposed to use OCI drivers.
    Working on:
    OS: Windows XP
    App server: WebLogic 8.1
    DB: Oracle 9i
    If anyone has a solution please mail me at [email protected]

    I'm not sure where the issue would be. Are you saying that you are using a 9i driver to access a 10g database? If so, download the newer driver and add it to your WEB-INF/lib directory. Remove the existing driver.
    If, on the other hand, you are using a 10g driver for a 9i database, I would not expect problems. However, you could always download the older driver and try it out.
    Or am I missing something?
    - Saish
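    If the limit turns out to be the old thin-driver restriction on binding large values, the usual workaround is to stream the content rather than bind it in one piece. A minimal plain-JDBC sketch (the table and column names are hypothetical; with Hibernate the same idea applies to how the property is bound):

    import java.io.*;
    import java.sql.*;

    public class LobUpload {
        public static void upload(Connection conn, long id, File file)
                throws SQLException, IOException {
            String sql = "INSERT INTO uploads (id, content) VALUES (?, ?)";
            try (InputStream in = new FileInputStream(file);
                 PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setLong(1, id);
                // Stream the file instead of calling setBytes(...), which old
                // thin drivers limited to a few KB for LONG/LOB columns.
                ps.setBinaryStream(2, in, (int) file.length());
                ps.executeUpdate();
            }
        }
    }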

  • Adobe Flash CS5 generating SWF file size more than the FLA file?

    Hi,
    I have developed a video player in Adobe Flash CS4 using all vector art, but when I open this FLA file in Adobe Flash CS5 and publish it, the SWF file size is more than the FLA file size.
    Please see the details below -
    Adobe Flash CS4 -
    Index.fla file size: 523 KB
    Index.swf file size: 55 KB
    Adobe Flash CS5 (same file when opened in CS5 and published)
    Index.fla file size: 183 KB
    Index.swf file size: 215 KB
    Please suggest.
    Thanks & regards
    Sunil Kumar

    Not working!
    Thanks
    Sunil Kumar

  • Problem increasing max open files to more than 1024

    Hello,
    I tried to increase the maximum number of open files to more than 1024 in the following way:
    $ sudo nano /etc/security/limits.conf
    mysql hard nofile 8192
    mysql soft nofile 1200
    However, after reboot I am still not able to start MariaDB:
    $ sudo systemctl start mysqld
    Job for mysqld.service failed. See 'systemctl status mysqld.service' and 'journalctl -xn' for details.
    $ sudo systemctl status mysqld
    ● mysqld.service - MariaDB database server
    Loaded: loaded (/usr/lib/systemd/system/mysqld.service; enabled)
    Active: activating (start-post) since Tue 2014-09-02 13:08:20 EST; 53s ago
    Main PID: 6504 (mysqld); Control: 6505 (mysqld-post)
    CGroup: /system.slice/mysqld.service
    ├─6504 /usr/bin/mysqld --pid-file=/run/mysqld/mysqld.pid
    └─control
    ├─6505 /bin/sh /usr/bin/mysqld-post
    └─6953 sleep 1
    Sep 02 13:08:20 acpfg mysqld[6504]: 140902 13:08:20 [Warning] Could not increase number of max_open_files to more than 1024 (request: 4607)
    I am using the following /etc/mysql/my.cnf
    [mysql]
    # CLIENT #
    port = 3306
    socket = /home/u/tmp/mysql/mysql.sock
    [mysqld]
    # GENERAL #
    user = mysql
    default-storage-engine = InnoDB
    socket = /home/u/tmp/mysql/mysql.sock
    pid-file = /home/u/tmp/mysql/mysql.pid
    # MyISAM #
    key-buffer-size = 32M
    myisam-recover = FORCE,BACKUP
    # SAFETY #
    max-allowed-packet = 16M
    max-connect-errors = 1000000
    skip-name-resolve
    sql-mode = STRICT_TRANS_TABLES,ERROR_FOR_DIVISION_BY_ZERO,NO_AUTO_CREATE_USER,NO_AUTO_VALUE_ON_ZERO,NO_ENGINE_SUBSTITUTION,NO_ZERO_DATE,NO_ZERO_IN_DATE,ONLY_FULL_GROUP_BY
    sysdate-is-now = 1
    innodb = FORCE
    innodb-strict-mode = 1
    # DATA STORAGE #
    datadir = /home/u/tmp/mysql/
    # BINARY LOGGING #
    log-bin = /home/u/tmp/mysql/mysql-bin
    expire-logs-days = 14
    sync-binlog = 1
    # CACHES AND LIMITS #
    tmp-table-size = 32M
    max-heap-table-size = 32M
    query-cache-type = 0
    query-cache-size = 0
    max-connections = 500
    thread-cache-size = 50
    open-files-limit = 65535
    table-definition-cache = 1024
    table-open-cache = 2048
    # INNODB #
    innodb-flush-method = O_DIRECT
    innodb-log-files-in-group = 2
    innodb-log-file-size = 128M
    innodb-flush-log-at-trx-commit = 1
    innodb-file-per-table = 1
    innodb-buffer-pool-size = 2G
    # LOGGING #
    log-error = /home/u/tmp/mysql/mysql-error.log
    log-queries-not-using-indexes = 1
    slow-query-log = 1
    slow-query-log-file = /home/u/tmp/mysql/mysql-slow.log
    How is it possible to increase this limit?
    Thank you in advance.

    Change/add in the my.cnf file, under [mysqld]:
    max_allowed_packet=2048M
    Reference:
    * http://dev.mysql.com/doc/refman/5.5/en/ … wed_packet
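    Note that on systemd-based systems the limits in /etc/security/limits.conf apply only to login sessions, not to services, which is why the warning persists after a reboot. A minimal sketch of the usual fix, a drop-in override for the unit (the 65535 value mirrors the open-files-limit already set in my.cnf):
    $ sudo mkdir -p /etc/systemd/system/mysqld.service.d
    $ sudo tee /etc/systemd/system/mysqld.service.d/limits.conf <<'EOF'
    [Service]
    LimitNOFILE=65535
    EOF
    $ sudo systemctl daemon-reload
    $ sudo systemctl restart mysqld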

  • How can I search for files with more than one keyword?

    I've created some keywords, and some files in my folder are tagged with two, three or more keywords.
    Is there a way to search for files using more than one keyword in the search field?
    Thanks!

    Use the Find command (Edit menu); at the right side of the criteria row is a plus sign to add another criterion, which you can set as you wish.
    Make a choice in Results and you should be OK.

  • Is there a way to open CSV files with more than 255 columns?

    I have a CSV file with more than 255 columns of data. It's a fairly standard export of social media data that shows the volume of posts by day for the past year, from which I can analyze the data and publish customized charts. This is very easy in Excel, but I'm hitting the Numbers limit of 255 columns per table. Is there a way to work around the limitation? Perhaps splitting the CSV in two? The data shows up in the CSV file when I open it via TextEdit, so it's there; I just can't access it in Numbers, and it's not very usable/useful for me in TextEdit.
    Regards,
    Tim

    You might be better off with Excel. Even if you could find a way to easily split the CSV file into two tables (one way is sketched below), it would be two tables when you want only one. You said you want to make charts from this data. While a series on a chart can be constructed from data in two different tables, doing so takes a few extra steps for each series on the chart.
    For a test to see if you want to proceed, make two small tables with data spanning the tables and make a chart from that data.  Make the chart the normal way using the data in the first table then repeat the following steps for each series
    Select the series in the chart
    Go to Format sidebar
    Click in the "Value" box
    Add a comma, then select the data for this series from the second table
    Press Return
    If there is an easier way to do this, maybe someone else will chime in with that info.
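    For the splitting itself, a small sketch (plain Java; the column cutoff and file handling are assumptions, and it naively assumes no quoted commas in the data) that writes the first 255 columns to one file and the remainder, re-keyed by the first column, to another:

    import java.io.*;
    import java.util.Arrays;

    public class CsvColumnSplitter {
        // Split a wide CSV into two narrower ones; the first column (e.g. the date)
        // is repeated in both halves so the rows can still be matched up.
        public static void split(File in, File left, File right, int maxCols) throws IOException {
            try (BufferedReader r = new BufferedReader(new FileReader(in));
                 PrintWriter lw = new PrintWriter(new FileWriter(left));
                 PrintWriter rw = new PrintWriter(new FileWriter(right))) {
                String line;
                while ((line = r.readLine()) != null) {
                    String[] cells = line.split(",", -1);
                    int cut = Math.min(maxCols, cells.length);
                    lw.println(String.join(",", Arrays.copyOfRange(cells, 0, cut)));
                    StringBuilder rest = new StringBuilder(cells[0]);
                    for (int i = cut; i < cells.length; i++) rest.append(',').append(cells[i]);
                    rw.println(rest);
                }
            }
        }
    }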

  • How can I transfer my iPhone video to my computer? The video size is more than 1.5 GB

    How do I transfer my iPhone 4 video to my computer? The video size is more than 1.5 GB.

    Follow the instructions here or here, and then sync the iPhone with the iTunes library containing the content.
    (112570)

  • Converting document files of more than 100 Mb?

    Is it true that you cannot convert document files of more than 100 MB to PDF because of that limit in Adobe CreatePDF?
    Is there a way of solving this problem, e.g. for document files of 285 MB?

    Have you tried compressing the images in the original document? You can probably compress them to 200 dpi if they are only for printing.

  • Processing Multiple Files for more than 100 Receive Locations - File Size 25 MB Each, File Type DML

    Hi everybody,
    Please suggest.
    For one of our BizTalk interfaces, we have around 120 receive locations. We are just moving (*.dml) files from the source to the destination without doing any processing.
    We receive lots of files in different receive locations, and in a few cases the file size is around 25 MB. While moving these large files, the CPU usage varies between 10% and 90+%, and the time consumed for a single huge file is around 10 to 20 minutes. This solution was already in place and was designed by the previous vendor for moving the files to their clients. Each client has 2 receive locations and there are around 60 clients. Is there any better solution for implementing this within BizTalk or outside BizTalk? Please suggest.
    I am also looking for how to control the number of files which get picked up from the BizTalk receive location. For example, if we have say 1000 files in a receive location and we want to pick up only 50 files at a time (a batch of 50), is that possible? Currently it picks up all the files available in the source location, and one of the processes drops thousands of files into the source location, so we want to control the number of files getting picked up (or even the number of KBs). Please guide us on how we can control the number of files.

    Hi Rajeev,
    25 MB per file, 1000 files: certainly you have got to revisit the reason for choosing BizTalk.
    "The time consumed for this single huge file is around 10 to 20 minutes" - this is a problem.
    You could consider other file transfer options like XCopy or Robocopy etc. if you want to transfer to another local/shared drive. Or you can consider using SSIS, which comes with many adapters to send to the destination system, depending on the destination transfer protocol.
    But in your case, you have some of the advantages that come with BizTalk. For your scenario, you have many source systems (many receive locations); with BizTalk it's always easier to manage these configurations: you can easily enable and disable them when a need arises, easily configure tracking, and configure host instances based on load. So you can consider the following design for your requirement. This design would suit you well since you're not processing the message and just pass it through from source to destination:
    Use a custom pipeline component in the receive locations which receives the large file.
    Store the received file on disk and create a small XML metadata message that contains the information about where the large file is stored (a hypothetical example is sketched below).
    The small XML message is then published into the message box DB instead of the large file. Let the metadata file also contain the same context properties as the received file.
    In the send port, use another custom pipeline component that processes the metadata XML file, retrieves the location on disk where the file is stored, accesses the file and sends it to the destination.
    Read the following article on this design:
    http://www.codeproject.com/Articles/180333/Transfer-Large-Files-using-BizTalk-Send-Side
    This way you don't need to publish the whole message into the message box DB, which considerably reduces the processing time and lets the host instances process more files. This way you can still get the advantages of BizTalk and still process large files.
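    By way of illustration only, the small metadata message might look like this (the element names are hypothetical; the linked article defines its own schema):
    <?xml version="1.0"?>
    <LargeFilePointer>
      <OriginalFileName>client42_sales.dml</OriginalFileName>
      <StoredPath>D:\LargeFiles\client42_sales.dml</StoredPath>
      <SizeBytes>26214400</SizeBytes>
    </LargeFilePointer>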
    And regarding your question about restricting the receive location to a certain number of files: no, it's not possible.
    Regards,
    M.R.Ashwin Prabhu
