Serializable String size limits

I have a remote EJB deployed on WebSphere Application Server 5.1.
The EJB client invokes this EJB, passing it a large string as the input parameter. The size of the string can be up to 20 MB.
The question is: what is the size limit of a serialized object (in this case a String) that can be passed during an RMI/IIOP invocation?
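
As far as I know, Java serialization itself has no fixed per-object cap short of Integer.MAX_VALUE bytes (long Strings are written as long-string records), so the practical limits are the client/server heaps and any ORB buffer or fragment settings rather than the String size as such. A rough, hedged way to see what a ~20 MB String costs on the wire (plain Java serialization; RMI/IIOP marshalling differs in format but should be in the same ballpark) is sketched below; run it with a reasonably large heap:

import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;
import java.util.Arrays;

public class SerializedSizeCheck {
    public static void main(String[] args) throws Exception {
        // Build a ~20 MB string (20 * 1024 * 1024 'x' characters).
        char[] chars = new char[20 * 1024 * 1024];
        Arrays.fill(chars, 'x');
        String big = new String(chars);

        // Serialize it and report the byte count. ASCII characters serialize
        // to roughly one byte each, so expect a result close to 20 MB.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ObjectOutputStream oos = new ObjectOutputStream(bos);
        oos.writeObject(big);
        oos.close();
        System.out.println("Serialized size: " + bos.size() + " bytes");
    }
}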

Similar Messages

  • String size limitations, HTML table

    Post Author: Jeff Kulbeth
    CA Forum: General
    I am currently using CR 8.5 and am aware of string size limitations... 254 characters max.  I'm looking at CR XI and can't find any data on string size limitations.  A few questions:
    1) We're considering database fields larger than 254 characters, and would like to use CR to parse the field.  In CR XI, what's the maximum string length that can be returned from a table and still be used in a formula?  Can a Memo field be used in a formula in CR XI?
    2) Once I've parsed the fields in the database table, I'll need to return formatted strings back to the report.  The strings will be larger than 254 characters... What's the maximum string size allowed in CR XI?
    3) Somewhat related, but not entirely:  Have any patches been released that provide HTML table interpretation in CR XI?
    Thanks for the help,
    Jeff Kulbeth

    Post Author: V361
    CA Forum: General
    The maximum length of a String constant, a String value held by a String variable, a String value returned by a function or a String element of a String array is 65,534 characters.
    The maximum size of an array is 1000 elements.
    The maximum number of arguments to a function is 1000. (This applies to functions that can have an indefinite number of arguments such as Choose).
    Not sure about the HTML.

  • Connection string size limits

    Hi,
    Does the old client have a size limit on the connection string?
    If I create a tnsnames entry with 3 VIPs, the old client generates an error.
    Is there any workaround?

    842366 wrote:
    Hi,
    Does the old client have a size limit on the connection string?
    If I create a tnsnames entry with 3 VIPs, the old client generates an error.
    Is there any workaround?

    Upgrade the client, or use EZCONNECT, which does not require a tnsnames.ora.
    How do I ask a question on the forums?
    SQL and PL/SQL FAQ
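
    For reference, EZCONNECT builds the connect descriptor from host, port and service name directly, so no tnsnames.ora entry is needed on the client. A hedged sketch of the equivalent syntax from thin JDBC, with placeholder host and service names (it targets a single address, so it does not by itself replace a multi-VIP address list):

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class EzConnectDemo {
        public static void main(String[] args) throws Exception {
            // EZConnect-style URL: //host:port/service_name (placeholders below).
            String url = "jdbc:oracle:thin:@//dbhost.example.com:1521/ORCLSVC";
            Connection con = DriverManager.getConnection(url, "scott", "tiger");
            System.out.println("Connected: " + !con.isClosed());
            con.close();
        }
    }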

  • JDBC:KPRB string size limits-NEED HELP!

    Hello All.
    Please, I need your help.
    I have a problem in Oracle 9.2.0.6 with Java stored procedures and the jdbc:kprb internal driver.
    I am trying to put a string (more than 32K) into a LONG-type column using the following Java class:
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class LongTest {
        public static void insertLong() throws Exception {
            Connection con = DriverManager.getConnection("jdbc:default:connection:");
            // generate a large string (600,000 characters)
            StringBuffer stringBuffer = new StringBuffer();
            for (int i = 0; i < 100000; i++) {
                stringBuffer.append("qwerty");
            }
            String st = stringBuffer.toString();
            // put the large string into the LONG-type column
            PreparedStatement pst = null;
            try {
                pst = con.prepareStatement("insert into TEST_LONG (ldata) values (?)");
                pst.setString(1, st);
                pst.execute();
            } finally {
                if (pst != null) { pst.close(); }
            }
        }
    }
    But I get an error from jdbc:kprb about the data size limitation for this type (LONG).
    It works fine in Oracle 10.2.0.1, though...
    Does anybody have a solution for how to do this in Oracle 9.2.0.6?
    Thanks!

    Hi,
    This is a known limitation in the 9.2 RDBMS, solved in 10g.
    Kuassi http://db360.blogspot.com
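
    If upgrading is not an option, one workaround sometimes suggested for older drivers is to bind the value as a character stream instead of calling setString, so the driver streams the data into the LONG column. A hedged sketch reusing the TEST_LONG table from the post above (whether this actually gets past the 9.2.0.6 kprb limit would need to be verified):

    import java.io.StringReader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class LongStreamInsert {
        public static void insertLong(String st) throws Exception {
            Connection con = DriverManager.getConnection("jdbc:default:connection:");
            PreparedStatement pst = null;
            try {
                pst = con.prepareStatement("insert into TEST_LONG (ldata) values (?)");
                // Stream the value rather than binding one big String; streaming
                // is the usual way around per-bind size limits on LONG columns.
                pst.setCharacterStream(1, new StringReader(st), st.length());
                pst.execute();
            } finally {
                if (pst != null) { pst.close(); }
            }
        }
    }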

  • Nio ByteBuffer and memory-mapped file size limitation

    I have a question/issue regarding ByteBuffer and memory-mapped file size limitations. I recently started using NIO FileChannels and ByteBuffers to store and process buffers of binary data. Until now, the maximum individual ByteBuffer/memory-mapped file size I have needed to process was around 80MB.
    However, I now need to begin processing larger buffers of binary data from a new source. Initial testing with buffer sizes above 100MB results in IOExceptions (java.lang.OutOfMemoryError: Map failed).
    I am using 32-bit Windows XP; 2GB of memory (typically 1.3 to 1.5GB free); Java version 1.6.0_03; with -Xmx set to 1280m. Decreasing the Java max heap size to 768m does make it possible to memory-map larger buffers to files, but never larger than roughly 500MB. However, the application that uses this code contains other components that require the -Xmx option to be set to 1280.
    The following simple code segment executed by itself will produce the IOException for me when executed using -Xmx1280m. If I use -Xmx768m, I can increase the buffer size up to around 300MB, but never to a size that I would think I could map.
    try {
        String mapFile = "C:/temp/" + UUID.randomUUID().toString() + ".tmp";
        FileChannel rwChan = new RandomAccessFile( mapFile, "rw" ).getChannel();
        ByteBuffer byteBuffer = rwChan.map( FileChannel.MapMode.READ_WRITE,
                                            0, 100000000 );
        rwChan.close();
    } catch( Exception e ) {
        e.printStackTrace();
    }
    I am hoping that someone can shed some light on the factors that affect the amount of data that may be memory mapped to/in a file at one time. I have investigated this for some time now and based on my understanding of how memory mapped files are supposed to work, I would think that I could map ByteBuffers to files larger than 500MB. I believe that address space plays a role, but I admittedly am no OS address space expert.
    Thanks in advance for any input.
    Regards- KJ

    See the workaround in http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4724038
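
    Beyond that bug report, a common way to stay under the 32-bit address-space ceiling is to map the file as a series of smaller windows instead of one huge region, so no single mapping needs a large run of contiguous virtual address space. A hedged sketch with an assumed 64 MB window size and a placeholder file path:

    import java.io.RandomAccessFile;
    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;

    public class ChunkedMapDemo {
        public static void main(String[] args) throws Exception {
            final long CHUNK = 64L * 1024 * 1024;   // 64 MB window (assumption)
            final long TOTAL = 500L * 1024 * 1024;  // total region to cover
            RandomAccessFile raf = new RandomAccessFile("C:/temp/big.tmp", "rw");
            FileChannel ch = raf.getChannel();
            try {
                for (long pos = 0; pos < TOTAL; pos += CHUNK) {
                    long len = Math.min(CHUNK, TOTAL - pos);
                    // Each call maps only a small window of the file.
                    MappedByteBuffer window = ch.map(FileChannel.MapMode.READ_WRITE, pos, len);
                    // ... work with 'window', then drop the reference so the mapping
                    // can eventually be released (the API has no explicit unmap).
                }
            } finally {
                ch.close();
                raf.close();
            }
        }
    }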

  • HTML Editor size limitation?

    Hi,
    I'm using APEX 3.1 on Oracle 10g...
    Is there a size limitation on the number of characters in the HTML Editor?
    We have a Report/Form combo for a column. Clicking on the "Edit" link in the Report takes you to the Form where you can edit it.
    The corresponding database column is of type LONG. I see the data in the report, but clicking on Edit gives me an error:
    ORA-20001: Error fetching column value: ORA-06502: PL/SQL: numeric or value error: character string buffer too small
    I've tried switching the column's datatype between CLOB and LONG, but both fail to open the Form.
    If I change the Item type to "Text Area", the LONG column works, but I do need it as an HTML Editor...
    Thanks for any help!

    Scott (sspafado),
    Before you ask, my name is Viji
    :)

  • Maximum string size???

    Hello everybody,
    I need to know how much data can be stored in a String.
    In the following topic I read that the String size is limited to 32677 bytes: http://forum.java.sun.com/thread.jsp?forum=31&thread=57885
    In another forum I read that String itself has no size limitation, but that the amount of data that can be stored is related to the VM heap size.
    I'd be very glad to receive any help from you.
    THX...
    Benjamin Gutmann

    However, looking through the source of String, there is no limit imposed on the length.
    Actually there is a limit, because the String data is stored internally in an array of characters, and there is an upper bound on the size of an array: as the Java Language Specification says, "Arrays must be indexed by int values". That's where Integer.MAX_VALUE comes from.
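
    To make both answers concrete: String.length() returns an int, so the hard ceiling is Integer.MAX_VALUE characters, but on any realistic heap you run out of memory long before that. A small illustrative sketch (the amount of character data built here is just an example figure):

    public class StringLimitDemo {
        public static void main(String[] args) {
            // Hard limit: a String is backed by a char[], and arrays are indexed by int.
            System.out.println("Theoretical max length: " + Integer.MAX_VALUE);

            // Practical limit: the heap. On a small heap this fails with
            // OutOfMemoryError long before the int limit is reached.
            try {
                StringBuilder sb = new StringBuilder();
                for (int i = 0; i < 32; i++) {
                    sb.append(new char[1024 * 1024]); // one million chars (2 MB) per pass
                }
                System.out.println("Built " + sb.length() + " characters without exhausting the heap.");
            } catch (OutOfMemoryError e) {
                System.out.println("Ran out of heap first: " + e);
            }
        }
    }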

  • XMLType field Size Limitation

    Hi,
    I am using Oracle DB 9.2.0 version and Toplink 9.0.3 version.
    Tables in the database have some fields defined as XMLType, and I am using TopLink's mapping to insert values into those fields. (Note: this was not directly possible, but there was a patch provided by Oracle which made this work.) The mapped object's XMLType variable is defined as a String.
    Now I wanted to know if there are any size limitations on the XML content to be inserted into the database XMLType field. If so, please let me know the exact size.
    Thanks,

    Good afternoon,
    The folks on the Oracle XDB forum might be able to better answer your question.
    Oracle XDB Forum:
    XML DB
    -Blaise

  • Oracle (Apache) HTTP Server - Default Header Size limitation

    Does the Oracle HTTP Server have a http header size limitation? If so, what is the maximum size allowed for http headers? Can it be changed, and how?
    Which version of Apache shipped with AS 9.0.4? I've been trying to find the answer to my questions in the Apache 1.3 documentation, but I'm not having any luck.
    Let me explain my problem. We're using Vintela's Single Sign-On library for authentication in our Java applications. Recently we ran into a problem where a user was not being granted access to the application. Much debugging occurred; eventually we had to open a trouble ticket with Vintela. They suggested it might be an HTTP header size limitation and to check the configuration for the web cache and Apache. We easily found the configuration options for the web cache, but are still looking for Apache's.
    We've bypassed the web cache and accessed the HTTP server directly, and we are still experiencing the same problem.
    To keep this message short and concise, I've omitted most of our troubleshooting, we're pretty sure the problem is related to a HTTP header size limitation.

    One trick I saw the Oracle guys do is telnet to the httpd port and manually type in an HTTP request.
    Perhaps ask the Vintela people for a test string that you can paste in a telnet window to test if the server handles it correctly.
    Also, you can put Apache in a sort of debug mode as well using the AS Console. I can't remember, but I think this may show the entire http request including the headers.
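
    One way to script that telnet trick so the header size is controlled precisely is to open a raw socket and write the request by hand. A rough sketch with a placeholder host/port and an artificially long custom header (around 10 KB here, just as an example):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.OutputStreamWriter;
    import java.io.Writer;
    import java.net.Socket;

    public class HeaderSizeProbe {
        public static void main(String[] args) throws Exception {
            // Placeholder host/port for the HTTP server under test.
            Socket s = new Socket("webhost.example.com", 7777);
            Writer out = new OutputStreamWriter(s.getOutputStream(), "US-ASCII");

            // Build a single header value of roughly 10 KB.
            StringBuilder big = new StringBuilder();
            while (big.length() < 10 * 1024) {
                big.append("ABCDEFGHIJ");
            }
            out.write("GET / HTTP/1.0\r\n");
            out.write("Host: webhost.example.com\r\n");
            out.write("X-Probe: " + big + "\r\n");
            out.write("\r\n");
            out.flush();

            // The status line shows whether the server accepted or rejected the header.
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(s.getInputStream(), "US-ASCII"));
            System.out.println(in.readLine());
            s.close();
        }
    }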

  • What is the size limitation of Oracle JDBC Statement?

    I want to know how large a complete SQL insert string placed into a JDBC Statement can be and still have the record inserted successfully. I tried up to 13K and JDBC still accepted it. Can some experienced guru tell me what the size limitation is for an insert or update SQL string that JDBC will accept?
    Thanks a lot.

    None known. More than 10 GB. Please note Acrobat is not for server use (I mention that because you mention C# and experience shows it is often preferred for backend development).
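
    On the original question, a common way to sidestep any limit on SQL text length is to keep the statement short and bind the large value as a parameter, so only the bind payload grows, not the SQL string itself. A hedged sketch with placeholder connection details and a hypothetical BIG_ROWS table:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class BindLargeValue {
        public static void main(String[] args) throws Exception {
            // Placeholder URL/credentials; BIG_ROWS(id, data) is a hypothetical table.
            Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost.example.com:1521/ORCLSVC", "scott", "tiger");

            // Build a ~20 KB payload; the SQL text below stays the same size regardless.
            StringBuilder payload = new StringBuilder();
            while (payload.length() < 20 * 1024) {
                payload.append("0123456789");
            }
            PreparedStatement pst = con.prepareStatement(
                    "insert into BIG_ROWS (id, data) values (?, ?)");
            pst.setInt(1, 1);
            pst.setString(2, payload.toString());
            pst.executeUpdate();
            pst.close();
            con.close();
        }
    }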

  • MessageTextInput Size Limitations

    Does anyone know the size limits of this UIX control? I have a field that allows 4000 characters. The control will allow the 4000 characters to be entered, and displayed, but the value cannot be saved.
    Is this a limitation of the browser (I am using IE 6) or of the component?
    Thank you
    Kelly Burton
    Object CTalk Inc.

    Does anyone know the size limits of this UIX control? I have a field that allows 4000 characters. The control will allow the 4000 characters to be entered, and displayed, but the value cannot be saved.

    I suppose that the "value" attribute of your UIXInput is bound to a String field server-side.
    In this circumstance, I have tried to enter a string of 5000 characters via a Mozilla browser and I have verified that it was saved on the server without problems.
    What kind of error do you get server-side?

  • Running out of heap with size-limited cache

    I'm experimenting with creating a size-limited cache using Coherence 3.6.1 and not having any luck. I'm creating four 1GB storage-enabled cache nodes and then a non-storage-enabled process to populate them with test data. When I monitor the cache nodes with VisualVM and run my test case to populate the cache, I just see the heaps get larger and larger until the processes either stop responding or I get an OutOfMemoryError. I've tried slowing down and even stopping, waiting for a while, and then restarting the test case that populates the caches to see if that helps, but it doesn't seem to make a difference.
    A related question: since Coherence doesn't seem to be able to recover from the OOME, is there any way to configure it to end the process when that happens instead of just sitting there?
    Thanks.
    Stack Trace:
    SRVCoherence[93191:9953 0] 2011/06/29 00:39:04 1,015.5 MB/191.38 KB ERROR Coherence   -2011-06-29 00:39:04.679/449.883 Oracle Coherence GE 3.6.1.0 <Error> (thread=DistributedCache:PartitionedPofCache, member=4):
    java.lang.OutOfMemoryError: Java heap space
         at com.tangosol.util.SegmentedHashMap.grow(SegmentedHashMap.java:893)
         at com.tangosol.util.SegmentedHashMap.grow(SegmentedHashMap.java:840)
         at com.tangosol.util.SegmentedHashMap.ensureLoadFactor(SegmentedHashMap.java:809)
         at com.tangosol.util.SegmentedHashMap.putInternal(SegmentedHashMap.java:724)
         at com.tangosol.util.SegmentedHashMap.put(SegmentedHashMap.java:418)
         at com.tangosol.util.SimpleMapIndex.insertInternal(SimpleMapIndex.java:236)
         at com.tangosol.util.SimpleMapIndex.insert(SimpleMapIndex.java:133)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$Storage.updateIndex(PartitionedCache.CDB:20)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ResourceCoordinator.processEvent(PartitionedCache.CDB:74)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ResourceCoordinator.finalizeInvokeSingleThreaded(PartitionedCache.CDB:56)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ResourceCoordinator.finalizeInvoke(PartitionedCache.CDB:9)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.processChanges(PartitionedCache.CDB:3)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onPutAllRequest(PartitionedCache.CDB:68)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$PutAllRequest.onReceived(PartitionedCache.CDB:85)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onMessage(Grid.CDB:11)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:33)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onNotify(PartitionedService.CDB:3)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onNotify(PartitionedCache.CDB:3)
         at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
         at java.lang.Thread.run(Thread.java:680)
    Cache Config:
    <?xml version="1.0"?>
    <!DOCTYPE cache-config SYSTEM "cache-config.dtd">
    <cache-config>
        <caching-scheme-mapping>
            <cache-mapping>
                <cache-name>Event</cache-name>
                <scheme-name>EventScheme</scheme-name>
            </cache-mapping>
        </caching-scheme-mapping>
        <caching-schemes>
            <distributed-scheme>
                <scheme-name>EventScheme</scheme-name>
                <service-name>PartitionedPofCache</service-name>
                <partition-count>16381</partition-count>
                <serializer>
                    <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
                </serializer>
                <backing-map-scheme>
                    <local-scheme>
                        <high-units>300m</high-units>
                        <low-units>200m</low-units>
                        <unit-calculator>BINARY</unit-calculator>
                        <eviction-policy>LRU</eviction-policy>
                    </local-scheme>
                </backing-map-scheme>
                <autostart>true</autostart>
            </distributed-scheme>
            <invocation-scheme>
                <service-name>InvocationService</service-name>
                <thread-count>5</thread-count>
                <autostart>true</autostart>
            </invocation-scheme>
        </caching-schemes>
    </cache-config>

    Hi Timberwolf,
    Some tips to keep in mind when configuring a cache to be size limited:
    - High units are per storage node. For instance, 5 storage nodes with a high-units configuration of 10mb can theoretically store up to 50mb. However, this implies a perfect distribution, which is normally not the case.
    - High units are per named cache. If two caches are mapped to a scheme and the scheme specifies a high units of 10mb, then each cache will have a high units of 10mb, meaning 20mb of storage total. This is especially important to keep in mind when using a wildcard (*) naming convention.
    - For backing maps, high units only take primary storage into account. With a backup count of 1 (the default), a high-units setting of 10mb really means that up to 20mb of data will be stored on that node for that cache.
    - The entire heap of a dedicated storage node cannot be used for storage. In other words, a 1gb heap cannot hold 1gb of cache data. At the very most, ~75% can be used for cache storage (around 750mb). If the storage node is performing grid functions (such as entry processors, filters, or aggregations) then the total high-units setting should be less than 75%.
    A sample configuration could look like this: 1gb JVM with 700mb dedicated storage. Half of the storage is primary and half is backup. Therefore, the total high-units for that node should not exceed 350mb.
    Hope it helps.
    Thanks,
    Cris
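
    Cris's rule of thumb can be turned into a quick back-of-the-envelope calculation. A hedged sketch that just does the arithmetic for one storage node (the ~75% usable-heap figure comes from the reply above; treat it as an assumption and adjust for your environment):

    public class HighUnitsEstimate {
        public static void main(String[] args) {
            long heapBytes     = 1024L * 1024 * 1024; // 1 GB storage node
            double usableRatio = 0.75;                // share of heap safe for cache data (assumption)
            int backupCount    = 1;                   // Coherence default

            // Usable cache storage on the node is split across primary + backup copies.
            long usable  = (long) (heapBytes * usableRatio);
            long primary = usable / (1 + backupCount);

            // 'primary' is roughly the ceiling for the sum of BINARY high-units of
            // all caches whose backing maps live on this node.
            System.out.println("Usable for cache data : " + usable / (1024 * 1024) + " MB");
            System.out.println("Suggested high-units  : " + primary / (1024 * 1024) + " MB");
        }
    }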

  • DLP Attachment Scanning - Size Limitations

    Is it documented anywhere what the attachment file size limitations are for DLP scanning?  In the ESA Configuration documentation I read:
    "To scan attachments, the content scanning engine extracts the attachment for the RSA Email DLP scanning engine to scan." 
    Can you identify what scanning engine is referenced by "content scanning engine" and what is the maximum attachment size it can process?  Also, are those settings modifiable, and is there some indication of the performance impact if they are increased to a maximum of 50 MB per attachment?
    I know you can make some modifications in the DLP policy; however, it is our desire to DLP-scan every document sent, up to our allowable maximum email size.
    If large attachments cannot be scanned, we may be forced to reduce our maximum message/attachment file size.
    We are currently using AsyncOS 7.5.1-102 and will be moving to 7.6.0 when it goes GA.

    Hello David,
    The content scanning engine in reference is the same AsyncOS scanning engine responsible for Message and Content Filter scanning. The maximum size of attachment to scan for this scanning engine is controlled by your 'scanconfig' settings, as configured in the IronPort CLI. The default 'maximum size of attachment to scan' is 5MB.
    IronPort1.example.com>scanconfig
    There are currently 6 attachment type mappings configured to be SKIPPED.
    Choose the operation you want to perform:
    - NEW - Add a new entry.
    - DELETE - Remove an entry.
    - SETUP - Configure scanning behavior.
    - IMPORT - Load mappings from a file.
    - EXPORT - Save mappings to a file.
    - PRINT - Display the list.
    - CLEAR - Remove all entries.
    - SMIME - Configure S/MIME unpacking.
    []> setup
    1. Scan only attachments with MIME types or fingerprints in the list.
    2. Skip attachments with MIME types or fingerprints in the list.
    Choose one:
    [2]>
    Enter the maximum depth of attachment recursion to scan:
    [5]>
    Enter the maximum size of attachment to scan:
    [5242880]>
    <...>
    Any message that is larger than this limit will be skipped by the scanning engine. This would mean that pertinent DLP policies and filters would not match that same message. Naturally, allowing larger messages to be scanned will result in performance risks, as more system resources would be required to complete the content scanning.
    Regards,
    -Jerry

  • Exchange 2013 Mail Size Limits

    I am having an issue with setting the max send and receive size on Exchange 2013.  I keep getting the following error when I attempt to send a 20 MB file from the server to an internal Exchange account, or if I attempt to send a 20 MB file from the Exchange server to an external account:
    #550 5.3.4
    ROUTING.SizeLimit; message size exceeds fixed maximum size for route ##
    I have checked the mail sizes and below is the report.  I currently have both send and receive set to 100MB.  Is there some other setting in 2013 that I am not aware of?
    AnonymousSenderToRecipientRatePerHour                        : 1800
    ClearCategories                                              : True
    ConvertDisclaimerWrapperToEml                                : False
    DSNConversionMode                                            : UseExchangeDSNs
    ExternalDelayDsnEnabled                                      : True
    ExternalDsnDefaultLanguage                                   :
    ExternalDsnLanguageDetectionEnabled                          : True
    ExternalDsnMaxMessageAttachSize                              : 100 MB (104,857,600 bytes)
    ExternalDsnReportingAuthority                                :
    ExternalDsnSendHtml                                          : True
    ExternalPostmasterAddress                                    :
    GenerateCopyOfDSNFor                                         :
    HygieneSuite                                                 : Standard
    InternalDelayDsnEnabled                                      : True
    InternalDsnDefaultLanguage                                   :
    InternalDsnLanguageDetectionEnabled                          : True
    InternalDsnMaxMessageAttachSize                              : 100 MB (104,857,600 bytes)
    InternalDsnReportingAuthority                                :
    InternalDsnSendHtml                                          : True
    InternalSMTPServers                                          :
    JournalingReportNdrTo                                        : <>
    LegacyJournalingMigrationEnabled                             : False
    LegacyArchiveJournalingEnabled                               : False
    LegacyArchiveLiveJournalingEnabled                           : False
    RedirectUnprovisionedUserMessagesForLegacyArchiveJournaling  : False
    RedirectDLMessagesForLegacyArchiveJournaling                 : False
    MaxDumpsterSizePerDatabase                                   : 18 MB (18,874,368 bytes)
    MaxDumpsterTime                                              : 7.00:00:00
    MaxReceiveSize                                               : 100 MB (104,857,600 bytes)
    MaxRecipientEnvelopeLimit                                    : 500
    MaxRetriesForLocalSiteShadow                                 : 2
    MaxRetriesForRemoteSiteShadow                                : 4
    MaxSendSize                                                  : 100 MB (104,857,600 bytes)
    MigrationEnabled                                             : False
    OpenDomainRoutingEnabled                                     : False
    RejectMessageOnShadowFailure                                 : False
    Rfc2231EncodingEnabled                                       : False
    SafetyNetHoldTime                                            : 2.00:00:00
    ShadowHeartbeatFrequency                                     : 00:02:00
    ShadowMessageAutoDiscardInterval                             : 2.00:00:00
    ShadowMessagePreferenceSetting                               : PreferRemote
    ShadowRedundancyEnabled                                      : True
    ShadowResubmitTimeSpan                                       : 03:00:00
    SupervisionTags                                              : {Reject, Allow}
    TLSReceiveDomainSecureList                                   : {}
    TLSSendDomainSecureList                                      : {}
    VerifySecureSubmitEnabled                                    : False
    VoicemailJournalingEnabled                                   : True
    HeaderPromotionModeSetting                                   : NoCreate
    Xexch50Enabled                                               : True

    Hello Landfish,
    Good Day...
    The output shows that the send and receive size limits are set to 100 MB, but a limit configured elsewhere may still apply. You can follow the steps below to track down and resolve the issue.
    There are basically three places where you can configure default message size limits on Exchange:
    Organization transport settings
    Send/receive connector settings
    User mailbox settings.
    To check your server's current limits you can open the Exchange Management Shell.
    Try the commands below to check the message size limits:
    get-transportconfig | ft maxsendsize, maxreceivesize
    get-receiveconnector | ft name, maxmessagesize
    get-sendconnector | ft name, maxmessagesize
    get-mailbox Administrator |ft Name, Maxsendsize, maxreceivesize
    To change the above size limits based on your requirement:
    Set-TransportConfig -MaxSendSize 200MB -MaxReceiveSize 500MB (Size is based on your requirement)
    Attachment size limit
    To set up the rule you can use the PowerShell cmdlet below; the method is quite simple:
    New-TransportRule -Name LargeAttach -AttachmentSizeOver 20MB -RejectMessageReasonText "Message attachment size over 20MB - email rejected."
    For more info:
    https://technet.microsoft.com/en-us/library/bb124708(v=exchg.150).aspx
    Remember to mark as helpful if you find my contribution useful, or as an answer if it does answer your question. That will encourage me - and others - to take time out to help you. Check out my latest blog posts @ Techrid.com

  • How to Get Around the Memo Size Limitations in CR ?

    I am using Crystal Reports 2008, a SQL database, and ASP.NET Visual Studio 2010 for Team Foundation with the Crystal Viewer embedded in a web page.  All current updates and patches are installed.
    The database has memo fields up to 164,000 characters in length. They display fine in the Viewer, but on the reports designed to print this information I only see part of the memo field.
    This happens with RTF, Text and HTML formatted data from within the database field.
    I have read that there is a limitation on the size of a memo field that Crystal Reports will print (65,534 characters).
    I actually received a Crystal Reports error box when I tried to concatenate multiple substring fields in a formula.
    Does anyone have any suggestions or ideas for a work-around?
    Due to legal considerations, this data has to be output as it was input, so it can't be hacked. It can be parsed and merged again, but I really don't want to try to write SQL procedures that parse HTML code into multiple readable pieces based on variable-length tags within large memo fields.
    Please offer any and every suggestion,
    Thanks to all!
    Edited by: Ludek Uher on Oct 21, 2010 1:31 PM

    Yes sir,
    I already did, but I didn't receive any answers... Memo Field Size Limitations with Crystal Reports 2008?
    Thanks for your help.
