Broadcasting results not transferring to AL11 if file size is more than 3 MB

Hi All,
I am broadcasting workbook results to AL11 (the application server) using the SAP standard program. If the result file is larger than 3 MB, precalculation works fine but the file is not transferred to the application server. Could you please let me know if there is a setting to increase the transfer limit to AL11? In fact, I am already in touch with Basis.
Thanks in advance.
Regards,
J B

Hi Inder,
As per the SAP recommendation we should be able to handle 100 MB; you need to tune your server by increasing the parameters so that you can handle messages with a big payload.
By default the parameter icm/HTTP/max_request_size_KB is set to 102400, which corresponds to 100 MB. If you increase the parameter value by tuning your system, you can process a file bigger than that.
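If you do raise the limit, the change goes into the instance profile; a minimal sketch (the value is illustrative only, and the ICM must be restarted before a changed static parameter takes effect):
# instance profile, e.g. maintained via transaction RZ10
icm/HTTP/max_request_size_KB = 204800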
Please refer to the links below for reference:
[link1|http://help.sap.com/saphelp_nw04s/helpdata/en/58/108b02102344069e4a31758bc2c810/content.htm]
[link2|http://help.sap.com/saphelp_nwpi71/helpdata/de/95/1528d8ca4648869ec3ceafc975101c/content.htm]
As per the above suggestions, the best practice is to split the payload into chunks and send it as multiple IDocs.
Cheers!!!!
Naveen.

Similar Messages

  • Load and read an XML file of more than 4 GB

    Hi All
    My environment is Oracle 10.2.0.4 on Solaris, and I have PL/SQL processes that work with XML files as detailed below:
    1. I read an XML file over an HTTP port into an XMLTYPE column in a table.
    2. I read the value from step 1 from the table and extract it to insert into another table.
    On the test DB everything works, but I got the error below when I used the production XML file:
         ORA-31186: Document contains too many nodes
    The current XML size is about 100 MB, but the procedure must support XML files of more than 4 GB in the future.
    Below are some parts of my code for your info.
    1. Read the XML line by line into a variable and insert it into the table:
    BEGIN
      LOOP
        UTL_HTTP.read_text(http_resp, v_resptext, 32767);
        DBMS_LOB.writeappend(v_clob, LENGTH(v_resptext), v_resptext);
      END LOOP;
    EXCEPTION
      WHEN UTL_HTTP.end_of_body THEN   -- raised when the response is exhausted
        UTL_HTTP.end_response(http_resp);
    END;
    INSERT INTO XMLTAB VALUES (XMLTYPE(v_clob));
    2. Read cell values from the XML column and extract them to insert into another table:
    DECLARE
      v_TempValue VARCHAR2(50);
      CURSOR c_xml IS
        SELECT TRIM(y.cvalue)
          FROM XMLTAB xt,
               XMLTable('/Table/Rows/Cells/Cell' PASSING xt.XMLDoc
                        COLUMNS cvalue VARCHAR2(50) PATH '.') y;
    BEGIN
      OPEN c_xml;
      LOOP
        FETCH c_xml INTO v_TempValue;
        EXIT WHEN c_xml%NOTFOUND;
        -- <Generate insert statement into another table>
      END LOOP;
      CLOSE c_xml;
    END;
    And one more problem is performance: when the XML file is big, the first step, loading the XML content into the XMLTYPE column, is slow.
    Could you please suggest a solution for reading large XML files and improving performance?
    Thank you in advance.
    Hiko      

    See Mark Drake's (Product Manager Oracle XMLDB, Oracle US) response in this old post: ORA-31167: 64k size limit for XML node
    The "in a future release" reference, means that this boundary 64K / node issue, was lifted in 11g and onwards...
    So first of all, if not only due to performance improvements, I would strongly suggest to upgrade to a database version which is supported by Oracle, see My Oracle Support... In short Oracle 10.2.x was in extended support up to summer 2013, if I am not mistaken and is currently not supported anymore...
    If you are able to able to upgrade, please use the much, much more performing XMLType Securefile Binary XML storage option, instead of the XMLType (Basicfile) CLOB storage option.
    HTH
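    For reference, a minimal sketch of that storage option (11g and later), assuming the single-column XMLTAB table from the question:
    -- store the XMLType column as SecureFile Binary XML instead of the
    -- default BasicFile CLOB storage
    CREATE TABLE XMLTAB (
      XMLDoc XMLTYPE
    )
    XMLTYPE COLUMN XMLDoc STORE AS SECUREFILE BINARY XML;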

  • SAP ISR - XI - SAP POS: file size more than 11 KB failing in inbound

    Hi All,
    We are implementing an SAP ISR - XI - POS retail scenario using the
    standard content Store Connectivity 2.0, GM Store Connectivity 1.0, and
    other content packages.
    In our inbound File-to-RFC scenario we pick up sales and totals data files from an FTP server. If the size of a sales data file (format *_XI_INPUT.DAT) is greater than 11 KB, it fails at the XI mapping level, saying: Exception occurred during XSLT mapping "GMTLog2IXRetailPOSLog" of the application. We have tried and tested at the mapping level and found no error, as files below 11 KB are processed successfully with the same mappings; this is also a standard SAP mapping delivered with the XI content Store Connectivity 2.0.
    On the XI side we have successfully processed a file of e.g. 40 KB by splitting the record data into files of less than 11 KB each, but the unsplit 40 KB file fails.
    XI server: AIX.
    There may be some memory setting missing, or some Basis problem. Kindly let me know how to proceed.
    Regards,
    Surbhi Bhagat

    Hi,
    It is hard to believe that such small files cannot be processed.
    Do your XI mappings work for any other flows with payloads of more than 11 KB?
    Let me know about that and then we will know some more, as this is really a very small size.
    Maybe your XI was installed on a PocketPC
    Regards,
    Michal Krawczyk

  • File size more than 50 MB

    Hi
    I have a scenario where my file size is 50 MB. I have to split the file and then do the processing. What is the best approach?
    Venkat

    Hi
    Splitting the file before it enters XI would considerably reduce the load on XI.
    Please refer to the following links:
    /people/alessandro.guarneri/blog/2006/03/05/managing-bulky-flat-messages-with-sap-xi-tunneling-once-again--updated
    /people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
    Also check this valuable thread where the size-related issue is discussed: Re: Reg: The Maximum size of a file that can be processed
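    If the payload is a plain flat file, the split can also be done outside XI before the sender adapter picks the file up; a minimal sketch (file names are made up, and -l assumes one record per line):
    # split a large flat file into 100000-line pieces named chunk_aa, chunk_ab, ...
    split -l 100000 input.dat chunk_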

  • Why does iPhoto (9.0/11) not retain the Event name when exporting more than one event? (using File - Export - Album name with number).

    Why does iPhoto (9.0/11) not retain the Event name when exporting more than one event? (using File -> Export -> Album name with number).
    Exporting a single Event retains the Event name, which is what I'd expect. But highlighting more than one event and exporting renames the images to Event 001.JPG, Event 002.JPG, etc.
    I was recently on holidays and had all my events nicely split on Dad's computer, but when I went to export them I couldn't retain any of this information. Now I have to replicate this all again on my computer.
    It wasn't possible to export the entire library, as the external drive was FAT32-formatted and I didn't want all of it. It would be nice to export a bunch of events to someone and have them retain their names.
    Does anyone have a work around or will this be fixed at some point by Apple?


  • Downloaded instructional videos in MP4 format (~500 MB on average) will not play in QuickTime. Are there file size restrictions?

    I recently downloaded a series of instructional videos. They are in MP4 format and rather large (~500 MB on average). When I try to open them with QuickTime (the default), they will not play. Are there file size restrictions? I have gone through all of the most recent software updates. Any suggestions would be great. Thanks.

    Try VLC Media Player.  It has a reputation for playing just about anything you throw at it.

  • How can I search for files with more than one keyword?

    I've created some keywords, and some files in my folder are tagged with two, three or more keywords.
    Is there a way to search for files using more than one keyword in the search field?
    Thanks!

    Use the Find command (Edit menu); in the criteria, on the right side, there is a plus sign to add another criterion, which you can set as you wish.
    Make a choice in the results and you should be OK.

  • Is there a way to open CSV files with more than 255 columns?

    I have a CSV file with more than 255 columns of data.  It's a fairly standard export of social media data that shows volume of posts by day for the past year, from which I can analyze the data and publish customized charts. Very easy in Excel but I'm hitting the Numbers limit of 255 columns per table. Is there a way to work around the limitation? Perhaps splitting the CSV in two? The data shows up in the CSV file when I open via TextEdit, so it's there. Just can't access it in Numbers. And it's not very usable/useful for me in TextEdit.
    Regards,
    Tim

    You might be better off with Excel. Even if you could find a way to easily split the CSV file into two tables, it would be two tables when you want only one.  You said you want to make charts from this data.  While a series on a chart can be constructed from data in two different tables, to do so takes a few extra steps for each series on the chart.
    For a test to see if you want to proceed, make two small tables with data spanning the tables and make a chart from that data. Make the chart the normal way using the data in the first table, then repeat the following steps for each series:
    Select the series in the chart
    Go to Format sidebar
    Click in the "Value" box
    Add a comma, then select the data for this series from the second table
    Press Return
    If there is an easier way to do this, maybe someone else will chime in with that info.
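    If you do try the split, here is a quick command-line sketch (it assumes a plain comma-separated file named export.csv with no quoted commas inside fields; the first column is kept in both halves so each table retains its row labels):
    cut -d, -f1-255  export.csv > part1.csv
    cut -d, -f1,256- export.csv > part2.csv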

  • Couldn't Upload file with size more than 2K

    Hi,
    I am developing a web application in which I have to upload a file. We are using Hibernate for data storage and retrieval. The problem comes when I upload a file of more than 2K bytes; if the file size is less than 2K it works fine. I found that this is a bug in the thin JDBC driver version 9, but I am using the 10g JDBC driver and the problem still exists. I am not supposed to use OCI drivers.
    Working on:
    OS: Windows XP
    App server: WebLogic 8.1
    DB: Oracle 9i
    If anyone has a solution, please mail me at [email protected]

    I'm not sure where the issue would be. Are you saying that you are using a 9i driver to access a 10g database? If so, download the newer driver and add it to your WEB-INF/lib directory. Remove the existing driver.
    If, on the other hand, you are using a 10g driver for a 9i database, I would not expect problems. However, you could always download the older driver and try it out.
    Or am I missing something?
    - Saish
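    If swapping the driver alone does not help, the usual workaround for the old thin-driver limits is to bind the content as a stream instead of a String or byte array. A minimal sketch (Java 7+ syntax for brevity; the table and column names are made up):
    import java.io.File;
    import java.io.FileInputStream;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    public class LobUpload {
        // Insert a file into a BLOB column by streaming it, avoiding the
        // size limits of setBytes()/setString() in old Oracle thin drivers.
        public static void upload(Connection con, File f) throws Exception {
            String sql = "INSERT INTO uploads (name, content) VALUES (?, ?)";
            try (FileInputStream in = new FileInputStream(f);
                 PreparedStatement ps = con.prepareStatement(sql)) {
                ps.setString(1, f.getName());
                ps.setBinaryStream(2, in, (int) f.length());
                ps.executeUpdate();
            }
        }
    }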

  • Problem increasing max open files to more than 1024

    Hello,
    I tried to increase the max open files limit to more than 1024 in the following way:
    $ sudo nano /etc/security/limits.conf
    mysql hard nofile 8192
    mysql soft nofile 1200
    However, after a reboot I am still not able to start MariaDB:
    $ sudo systemctl start mysqld
    Job for mysqld.service failed. See 'systemctl status mysqld.service' and 'journalctl -xn' for details.
    $ sudo systemctl status mysqld
    ● mysqld.service - MariaDB database server
    Loaded: loaded (/usr/lib/systemd/system/mysqld.service; enabled)
    Active: activating (start-post) since Tue 2014-09-02 13:08:20 EST; 53s ago
    Main PID: 6504 (mysqld); Control: 6505 (mysqld-post)
    CGroup: /system.slice/mysqld.service
    ├─6504 /usr/bin/mysqld --pid-file=/run/mysqld/mysqld.pid
    └─control
    ├─6505 /bin/sh /usr/bin/mysqld-post
    └─6953 sleep 1
    Sep 02 13:08:20 acpfg mysqld[6504]: 140902 13:08:20 [Warning] Could not increase number of max_open_files to more than 1024 (request: 4607)
    I am using the following /etc/mysql/my.cnf
    [mysql]
    # CLIENT #
    port = 3306
    socket = /home/u/tmp/mysql/mysql.sock
    [mysqld]
    # GENERAL #
    user = mysql
    default-storage-engine = InnoDB
    socket = /home/u/tmp/mysql/mysql.sock
    pid-file = /home/u/tmp/mysql/mysql.pid
    # MyISAM #
    key-buffer-size = 32M
    myisam-recover = FORCE,BACKUP
    # SAFETY #
    max-allowed-packet = 16M
    max-connect-errors = 1000000
    skip-name-resolve
    sql-mode = STRICT_TRANS_TABLES,ERROR_FOR_DIVISION_BY_ZERO,NO_AUTO_CREATE_USER,NO_AUTO_VALUE_ON_ZERO,NO_ENGINE_SUBSTITUTION,NO_ZERO_DATE,NO_ZERO_IN_DATE,ONLY_FULL_GROUP_BY
    sysdate-is-now = 1
    innodb = FORCE
    innodb-strict-mode = 1
    # DATA STORAGE #
    datadir = /home/u/tmp/mysql/
    # BINARY LOGGING #
    log-bin = /home/u/tmp/mysql/mysql-bin
    expire-logs-days = 14
    sync-binlog = 1
    # CACHES AND LIMITS #
    tmp-table-size = 32M
    max-heap-table-size = 32M
    query-cache-type = 0
    query-cache-size = 0
    max-connections = 500
    thread-cache-size = 50
    open-files-limit = 65535
    table-definition-cache = 1024
    table-open-cache = 2048
    # INNODB #
    innodb-flush-method = O_DIRECT
    innodb-log-files-in-group = 2
    innodb-log-file-size = 128M
    innodb-flush-log-at-trx-commit = 1
    innodb-file-per-table = 1
    innodb-buffer-pool-size = 2G
    # LOGGING #
    log-error = /home/u/tmp/mysql/mysql-error.log
    log-queries-not-using-indexes = 1
    slow-query-log = 1
    slow-query-log-file = /home/u/tmp/mysql/mysql-slow.log
    How is it possible to increase the number?
    Thank you in advance.

    Change/add in the my.cnf file, under [mysqld]:
    max_allowed_packet=2048M
    Reference:
    * http://dev.mysql.com/doc/refman/5.5/en/ … wed_packet
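    Note also that /etc/security/limits.conf is applied by pam_limits to login sessions only, so it has no effect on services started by systemd. For a systemd-managed MariaDB the usual fix is a unit drop-in (a sketch; the unit name follows the systemctl output above):
    # /etc/systemd/system/mysqld.service.d/limits.conf
    [Service]
    LimitNOFILE=65535
    followed by:
    sudo systemctl daemon-reload
    sudo systemctl restart mysqld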

  • OutOfMemory while creating a new object for a file of more than 100 MB

    I have created an application which generates a report by getting the data from our archived files (.zip files). By the time the application reaches a file of more than 100 MB, it runs out of memory while creating the object for that particular file. Can someone help me by telling me if there is a way to resolve this issue?
    Thanks in advance

    If you're getting OutOfMemoryError, the simplest thing to try is to give the VM more memory at startup. For Sun's VM, I believe the default is 64 MB. You can increase this by using the -X args that control the heap size. For example: java -Xms128m -Xmx256m ... etc. ... This says start with 128 MB of heap, and allow it to grow up to 256 MB.
    One thing to consider, though, is do you need that much stuff in memory at once? A more intelligent approach might be to unpack the archive to the file system, and then read a file at a time or a line at a time or a whatever at a time is appropriate for the processing you need to do.
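    As a sketch of the "a line at a time" approach (Java 7+ syntax; the archive name is made up), entries can be streamed from the zip instead of materializing each file in memory:
    import java.io.BufferedReader;
    import java.io.FileInputStream;
    import java.io.InputStreamReader;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipInputStream;
    public class ArchiveReport {
        public static void main(String[] args) throws Exception {
            try (ZipInputStream zip = new ZipInputStream(new FileInputStream("archive.zip"))) {
                ZipEntry entry;
                while ((entry = zip.getNextEntry()) != null) {
                    // Read the current entry line by line. The reader is deliberately
                    // not closed: closing it would close the underlying zip stream.
                    BufferedReader r = new BufferedReader(new InputStreamReader(zip));
                    int lines = 0;
                    while (r.readLine() != null) {
                        lines++; // replace with the real report logic
                    }
                    System.out.println(entry.getName() + ": " + lines + " lines");
                }
            }
        }
    }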

  • Converting document files of more than 100 MB?

    Is it true that you cannot convert document files of more than 100 MB to PDF because of that limit on Adobe CreatePDF?
    Is there a way of solving this problem, e.g. for document files of 285 MB?

    Have you tried compressing the images in the original document? You can probably compress them to 200 dpi if they are only for printing.

  • Lync BToE - USB HeartBeat is not passed from Lync for long duration (more than 20 seconds)

    Lync BToE - USB HeartBeat is not passed from Lync for long duration (more than 20 seconds)

    Hi,
    Did the issue happen only for you or for multiple users?
    Make sure the phone is updated to the latest firmware.
    Best Regards,
    Eason Huang
    TechNet Community Support

  • Adobe Flash CS5 generating a SWF file larger than the FLA file?

    Hi,
    I have developed a video player in Adobe Flash CS4 using all vector art, but when I open this FLA file in Adobe Flash CS5 and publish it, the SWF file comes out larger than the FLA file.
    Please see the details below:
    Adobe Flash CS4 -
    Index.fla file size: 523 KB
    Index.swf file size: 55 KB
    Adobe Flash CS5 (same file opened in CS5 and published):
    Index.fla file size: 183 KB
    Index.swf file size: 215 KB
    Please suggest.
    Thanks & regards
    Sunil Kumar

    Not working!
    Thanks
    Sunil Kumar

  • Importing from an Excel file will not import assignments correctly where there is more than one assignment per task

    I am trying to import an Excel file; see below. It appears that if you select both the Tasks and Assignments tables, it will not allow you to do this and says the file is not in a Project-recognized format. So I import the Tasks tab first, then run the wizard again and import the Assignments tab while selecting the merge-with-project option. This works with one exception:
    if there is more than one assignment per task, it does not merge in the 2nd assignment, only the first.
    Is there a way to import a task with 2 or more assignments?
    Background
    To figure out what the format for the file needed to be, I started with an MPP file, saved it from Project as Excel selecting the "Project Template" file, and am now trying to import that XLS to see how it works. The two tabs are shown below.
    Tasks:
    ID  Active  Task Mode       Name   Duration  Start                  Finish                 Predecessors  Outline Level
    1   Yes     Auto Scheduled  DS001  3 days    June 17, 2014 8:00 AM  June 19, 2014 5:00 PM                1
    2   Yes     Auto Scheduled  MT001  3 days    June 17, 2014 8:00 AM  June 19, 2014 5:00 PM                2
    3   Yes     Auto Scheduled  CT001  1 day     June 17, 2014 8:00 AM  June 17, 2014 5:00 PM                3
    4   Yes     Auto Scheduled  CT002  2 days    June 18, 2014 8:00 AM  June 19, 2014 5:00 PM  3             3
    5   Yes     Auto Scheduled  DS002  1 day     June 20, 2014 8:00 AM  June 20, 2014 5:00 PM                1
    6   Yes     Auto Scheduled  MT002  1 day     June 20, 2014 8:00 AM  June 20, 2014 5:00 PM                2
    7   Yes     Auto Scheduled  CT003  1 day     June 20, 2014 8:00 AM  June 20, 2014 5:00 PM  4             3

    Assignments:
    Task Name  Resource Name  % Work Complete  Work  Units
    CT001      Engineer1      0                8h    100%
    CT002      Engineer2      0                16h   100%
    CT003      Engineer1      0                8h    100%
    CT003      Engineer2      0                8h    100%
    Andrew Payze

    Andrew,
    I did a quick test using your example although I didn't include the Predecessor or Outline Level fields to keep things a little simpler. I imported the Excel data into a new Project file with a single import of task and assignment data. It is very important
    to set up the Excel Workbook correctly. On Sheet 1 I entered the task data (Name, Duration, Start). I did not include the ID or finish date as that is redundant - Project generates its own ID and will calculate the finish date based on the start date and duration.
    On Sheet 2 I entered the assignment data basically just as you show it except I didn't include the Units.
    The data imported as expected.
    John
