When is a span with class lov generated for a Date Picker/LOV?

Hi,
can anybody tell me when a surrounding
<span class="lov">
is generated around the field and the icon of a date picker/LOV, and in which situations it is not?
Normally this span is always generated, but I have seen cases in the past where APEX didn't generate it, and I'm unable to reproduce that anymore. Is it theme dependent? But item rendering can't be changed through the templates...
Thanks
Patrick

Hi Carl,
thanks for your response! I just got confirmation that with Application Express 2.1.0.00.39 it isn't rendered for page item date pickers. That's why I can't reproduce it with my 2.2 installation. I had already started to think it was some setting of the page item/... that I just didn't see.
Thanks
Patrick

Similar Messages

Using a partitioned cache with off-heap storage for backup data

    Hi,
Is it possible to define a partitioned cache (with primary data on the heap) that uses off-heap storage for the backup data?
I think it could be worthwhile to do so, as backup data have a different access pattern.
If so, what are the impacts of such off-heap storage for backup data?
In particular, what are the impacts on performance?
    Thanks.
    Regards,
    Dominique

    Hi,
It seems that using a scheme for the backup-store is broken in the latest version of Coherence; I got an exception using your setup.
    2010-07-24 12:21:16.562/7.969 Oracle Coherence GE 3.6.0.0 <Error> (thread=DistributedCache, member=1): java.lang.NullPointerException
         at com.tangosol.net.DefaultConfigurableCacheFactory.findSchemeMapping(DefaultConfigurableCacheFactory.java:466)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$Storage$BackingManager.isPartitioned(PartitionedCache.java:10)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$Storage.instantiateBackupMap(PartitionedCache.java:24)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$Storage.setCacheName(PartitionedCache.java:29)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ServiceConfig$ConfigListener.entryInserted(PartitionedCache.java:17)
         at com.tangosol.util.MapEvent.dispatch(MapEvent.java:266)
         at com.tangosol.util.MapEvent.dispatch(MapEvent.java:226)
         at com.tangosol.util.MapListenerSupport.fireEvent(MapListenerSupport.java:556)
         at com.tangosol.util.ObservableHashMap.dispatchEvent(ObservableHashMap.java:229)
         at com.tangosol.util.ObservableHashMap$Entry.onAdd(ObservableHashMap.java:270)
         at com.tangosol.util.SafeHashMap.put(SafeHashMap.java:244)
         at com.tangosol.coherence.component.util.ServiceConfig$Map.put(ServiceConfig.java:43)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$StorageIdRequest.onReceived(PartitionedCache.java:45)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onMessage(Grid.java:11)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.java:33)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onNotify(PartitionedService.java:3)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onNotify(PartitionedCache.java:3)
         at com.tangosol.coherence.component.util.Daemon.run(Daemon.java:42)
     at java.lang.Thread.run(Thread.java:619)
Tracing in the debugger has shown that the problem is in the PartitionedCache$Storage#setCacheName(String) method: it calls instantiateBackupMap(String) before setting the __m_CacheName field.
    It is broken in 3.6.0b17229
PS: using an asynchronous wrapper around disk-based backup storage should reduce the performance impact.
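To illustrate that last point, here is a minimal, Coherence-agnostic sketch of an asynchronous wrapper around a disk-based backup store, assuming the store can be modeled as a plain, thread-safe java.util.Map. The class and its methods are hypothetical and not part of the Coherence API:

import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Hypothetical sketch, not Coherence API: writes to a slow, disk-based
// store are handed to a single background thread so the caller does not
// block on disk I/O. Assumes diskStore is thread-safe.
public class AsyncBackupStore<K, V> {
    private final Map<K, V> diskStore;
    private final ExecutorService writer = Executors.newSingleThreadExecutor();

    public AsyncBackupStore(Map<K, V> diskStore) {
        this.diskStore = diskStore;
    }

    public void put(K key, V value) {
        writer.execute(() -> diskStore.put(key, value)); // asynchronous write
    }

    public void remove(K key) {
        writer.execute(() -> diskStore.remove(key)); // asynchronous remove
    }

    public V get(K key) {
        return diskStore.get(key); // backup reads happen only on failover
    }

    public void close() {
        writer.shutdown(); // previously queued writes still complete
    }
}

The single writer thread keeps the write order intact; in practice a bounded queue would also be needed to avoid unbounded memory growth if the disk cannot keep up.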

  • Issue with LDT file generated for the XML Template

    Hi,
I have defined a template in an 11i instance and attached an .rtf file to it. Now I have to migrate the template definition, along with the .rtf assignment to the template, to an R12 instance.
I was able to download the template definition using the FNDLOAD command. The LDT file got generated in the same directory where I executed the FNDLOAD DOWNLOAD command.
But I was trying to download the .rtf assignment using the command: java oracle.apps.xdo.oa.util.XDOLoader DOWNLOAD.
I specified an .ldt file along with the command, but the .ldt file doesn't get generated. It says the default file is xdotmpl.ldt.
Where does the .ldt file get generated for this command?
    Thanks in advance.
    Kiran

I gave the command as follows:
    java oracle.apps.xdo.oa.util.XDOLoader DOWNLOAD \
    -DB_USERNAME apps \
    -DB_PASSWORD apps \
    -JDBC_CONNECTION server1:1525:instance02 \
    -LOB_TYPE TEMPLATE \
    -APPS_SHORT_NAME IBY \
    -LCT_FILE $XDO_TOP/patch/115/import/xdotmpl.lct \
    -LDT_FILE AP_FORMAT.ldt
I was expecting an LDT file, AP_FORMAT.ldt, but in fact I get an AP_FORMAT.drvx file.
Also, when the above command is executed, I get an error message:
    Calling FNDLOAD: FNDLOAD apps/apps@server1:1525:instance02 0 Y DOWNLOAD /backup1/anliv02/apps/apps_st/appl/xdo/12.0.0/patch/115/import/xdotmpl.lct AP_FORMAT.ldt XDO_DS_DEFINITIONS APPLICATION_SHORT_NAME=IBY
    APP-FND-01564: ORACLE error 12514 in AFPCOA
    Cause: AFPCOA failed due to ORA-12514: TNS:listener does not currently know of service requested in connect descriptor
    The SQL statement being executed at the time of the error was: and was executed from the file .
    Generating DRVX file: xdotmpl.drvx
    Any help on this appreciated.
    Thanks in advance.
    Kiran

HT4539 My 3G iPhone has lost all data when trying to sync with iTunes; put iCloud on for iPad and iPhone but it didn't allow me to sync with the computer as I'm not very techy! Thought if I messed up it wouldn't affect the computer! Now the phone is not responding to iCloud to collect da

My iPhone has lost all data when trying to sync with my computer.
I have an iPad with most of the info on it, but I'm not keen to sync with that in case that data gets lost!
I joined iCloud but cannot retrieve from there, as it's not accepting my iCloud account! Help!
If I sync the phone with the iPad, is it likely to work safely to retrieve the info, as both are Apple devices?

PS: when syncing it jumps through steps 1 - 4 really fast. I seem to remember the iPhone showing the number of tracks transferring and their names, but I see nothing. Then it sits on step 5 saying "waiting for changes to be applied".

Stock report with value and quantity for a given date, not month-wise

    Hi gems,
can anybody give me the standard report for stock value and quantity for a given date (not month-wise) at storage location level?

    Hi
check report S_P00_07000139 with the option "inventory and raw material report - detail" and a selection date (from and to date the same). The list will give opening and closing balances with goods movements and their values.
    Thanks

When a service PO is saved, no XML is generated for MM-SUS Service Procurement

    Hi,
We have SRM 7.0 & EHP 6.0 (Service Pack 4).
    The settings are in place as per the below SAP Notes:
    Note 1286936 - PI configuration for SRM - additional information
    Note 1268336 - Business Suite 2008: Synchronous peer-to peer Services
For Service Procurement (MM-SUS), in XI we have configured the standard integration scenario "SE_Services_Procurement", which is available in the Integration Repository (proxy-to-proxy communication).
We are facing an issue while creating the service purchase order in the MM-SUS scenario.
The PO should be transferred through XML, since in the XI configuration we have used proxies for both ECC and SRM, but when the PO is created no XML is generated in SXI_MONITOR in ECC.
Please guide us on the configuration required in ECC to generate the XML.
Thanks in advance,
Regards,
Balu

    Hi Balu,
SAP Note 1268336 - Business Suite 2008: Synchronous peer-to-peer Services - the configuration in SOAMANAGER that you have done is for:
1. creation of SRM RFx from within the ERP Web Dynpro application "Collective Processing of Purchase Requisitions" (CPPR)
2. Central Contract Management (CCM)
This will not help us in enabling service procurement.
Have you found any documentation on enabling the generation of the XML message in ERP once a service PO is created?
Any progress from your side? Could you help us with this issue?
    Best regards,
    Nikhil

  • I have problem with daq..when it is connected with laptop it asks for all the options like sampling rate etc..It displays building VI and it stops..it is not processing further..cau u plz help to solve this problem

    i hav problem with daq initialisation...plz help to solve the above mentioned issue

Hi Muthu,
we also have a problem: too little information…
- What is connected to your laptop?
- What is "it" in "it displays building VI"? Do you use the DAQ Assistant Express VI?
- What does "plz" mean?
    And could you please put less text in the title of your message and more text (with relevant information) into the message body?
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome

  • OIM 11g - Issue with Bulk Load Utility for Account Data

    Hi,
We are trying to load account data for users in OIM 11g using the bulk load utility.
We are trying to load the account data for the resource "iPlanet". For testing purposes, we made one account entry in the CSV file and ran the bulk load utility. After the bulk load process completed, we noticed that the resource was provisioned to the user multiple times and multiple entries had been created in the process form table.
We have tried to run the utility multiple times with a different user record each time.
The output of the below SQL query:
    SELECT MSG FROM OIM_BLKLD_LOG
    WHERE MODULE = 'ACCOUNT' AND LOG_LEVEL = 'PROGRESS_MSG'
    ORDER BY MSG_SEQ_NO;
    is coming as follows:
    MSG
    Number of Records Loaded: 126
    Number of Records Loaded: 252
    Number of Records Loaded: 504
    Number of Records Loaded: 1008
    Number of Records Loaded: 2016
    Number of Records Loaded: 4032
We have noticed that with each run the number of records loaded doubles from the previous run, even though the CSV file contains only one record.
    Provided below are the parent and child csv file entries.
    Parent file:
    UD_IPNT_USR_USERID,UD_IPNT_USR_FIRST_NAME,UD_IPNT_USR_LAST_NAME,UD_IPNT_USR_COMMON_NAME,UD_IPNT_USR_NSUNIQUEID
    KPETER,Peter,Kevin,Peter Kevin,
    Child file 1:
    UD_IPNT_USR_USERID,UD_IPNT_GRP_GROUP_NAME
    KPETER,group1
    Child file 2:
    UD_IPNT_USR_USERID,UD_IPNT_ROL_ROLE_NAME
    KPETER,role1
Can you please share some insight on what could be the potential cause of this issue and how it could be resolved?
    Thanks
    Deepa

    Hi Deepa,
I know from the 'user load' perspective that it is required to restart Oracle Identity Manager when you need to reload data that was not loaded during the first run.
So my suggestion is to restart it before reloading.
    Reference: http://docs.oracle.com/cd/E21764_01/doc.1111/e14309/bulkload.htm#CHDEICEH
    I hope this helps,
    Thiago Leoncio.
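As an aside, the OIM_BLKLD_LOG progress query from the question can be run from any SQL client between utility runs to watch the counts. Below is a minimal JDBC sketch of the same, assuming the Oracle JDBC driver is on the classpath; the connection URL and credentials are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class BulkLoadProgress {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; point these at the OIM schema.
        String url = "jdbc:oracle:thin:@//dbhost:1521/OIMDB";
        try (Connection con = DriverManager.getConnection(url, "oim_user", "secret");
             PreparedStatement ps = con.prepareStatement(
                 "SELECT MSG FROM OIM_BLKLD_LOG " +
                 "WHERE MODULE = 'ACCOUNT' AND LOG_LEVEL = 'PROGRESS_MSG' " +
                 "ORDER BY MSG_SEQ_NO");
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                // Prints lines such as "Number of Records Loaded: 126"
                System.out.println(rs.getString("MSG"));
            }
        }
    }
}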

  • Urgent help with simple BPEL process for reading data from database

    Hello there,
I need help with a BPEL project.
I have created a table Employee in the database.
I created the application, the BPEL project, and the connection to the database using the Database Adapter.
I need to read the records from the database, convert them into XML format, and send them for approval to the BPM worklist.
Can someone please describe step by step what I need to do?
    Thx,
    Dps

I have created a table in the database with data like Empno, Name, Salary, Comments.
I created a database connection in a JSP page and connect to the BPEL process.
It initiates the process, and it goes automatically for approval.
Please refer to the code I created.
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
"http://www.w3.org/TR/html4/loose.dtd">
<%@page import="java.util.Map" %>
<%@page import="com.oracle.bpel.client.Locator" %>
<%@page import="com.oracle.bpel.client.NormalizedMessage" %>
<%@page import="com.oracle.bpel.client.delivery.IDeliveryService" %>
<%@page import="javax.naming.Context" %>
<%@page import="java.util.Hashtable" %>
<%@page import="java.util.HashMap" %>
<%@ page import="java.sql.*"%>
<%@ page import= "jspprj.DBCon"%>
<html>
<head>
<title>Invoke CreditRatingService</title>
</head>
<body>
<%
DBCon dbcon = new DBCon();
Connection conn = dbcon.createConnection();
Statement st = null;
Hashtable env = new Hashtable();
ResultSet rs = null;
try {
    env.put(Context.INITIAL_CONTEXT_FACTORY, "com.evermind.server.rmi.RMIInitialContextFactory");
    env.put(Context.PROVIDER_URL, "opmn:ormi://localhost:port:home/orabpel"); // BPEL server
    env.put("java.naming.security.principal", "username");
    env.put("java.naming.security.credentials", "password"); // BPEL Console
    Locator locator = new Locator("default", "password", env);
    IDeliveryService deliveryService =
        (IDeliveryService) locator.lookupService(IDeliveryService.SERVICE_NAME);
    st = conn.createStatement();
    out.println("connected");
    String query1 = "Select * from EMPLOYEE";
    rs = st.executeQuery(query1);
    /* Read the data from the database and convert it into XML format,
       so there is no need to go to the BPEL Console and enter the details. */
    while (rs.next()) {
        String xml1 = "<AsynchBPELProcess1ProcessRequest xmlns='http://xmlns.oracle.com/AsynchBPELProcess1'>" +
            "<Empno>" + rs.getString(1) + "</Empno>" +
            "<EmpName>" + rs.getString(2) + "</EmpName>" +
            "<Salary>" + rs.getString(3) + "</Salary>" +
            "<Comments>" + rs.getString(4) + "</Comments>" +
            "</AsynchBPELProcess1ProcessRequest>";
        out.println(xml1);
        // construct the normalized message and send to Oracle BPEL Process Manager
        NormalizedMessage nm = new NormalizedMessage();
        nm.addPart("payload", xml1);
        // EmployeeApprovalProcess is the BPEL process in which the human task is implemented
        deliveryService.post("EmployeeApprovalProcess", "initiate", nm);
        out.println("BPELProcess CreditRatingService executed!<br>");
    }
} catch (Exception ee) {
    // In case there is an exception while invoking the first server, invoke the second server, i.e. lsgpas13.
    out.println("BPEL Server invoking error.\n" + ee.toString());
}
%>
</body>
</html>
It's working fine, and I want it to do bulk approvals. Please help me with a step-by-step procedure, or any other way to implement this.
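For the bulk-approval part, one possible direction (only a sketch: it assumes the BPEL process and its XSD are reworked to accept a list of employees, and the <Employees>/<Employee> wrapper elements are hypothetical) is to aggregate all rows into one payload and post a single message, reusing the rs, NormalizedMessage, and deliveryService objects from the code above:

// Sketch: build one request for all employee rows instead of one post per row.
// The <Employees>/<Employee> wrapper is hypothetical; the BPEL process must be
// changed to accept it. Field values should also be XML-escaped in real code.
StringBuilder xml = new StringBuilder(
    "<Employees xmlns='http://xmlns.oracle.com/AsynchBPELProcess1'>");
while (rs.next()) {
    xml.append("<Employee>")
       .append("<Empno>").append(rs.getString(1)).append("</Empno>")
       .append("<EmpName>").append(rs.getString(2)).append("</EmpName>")
       .append("<Salary>").append(rs.getString(3)).append("</Salary>")
       .append("<Comments>").append(rs.getString(4)).append("</Comments>")
       .append("</Employee>");
}
xml.append("</Employees>");
NormalizedMessage bulk = new NormalizedMessage();
bulk.addPart("payload", xml.toString());
// One post means one process instance, so the worklist can present a single
// bulk approval task instead of one task per employee.
deliveryService.post("EmployeeApprovalProcess", "initiate", bulk);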

  • Issue with CIF Integration model for Transaction data

    Hi Gurus,
I have activated the integration model for POs & PReqs location-wise, and I assumed that transaction data transfer is online, so we would not need to reactivate it for any new product/location combination created in the system.
But the issue is that every time a new material is created, we need to deactivate and reactivate the integration model for transaction data to get the transaction data into APO.
Is there any way to avoid this exercise, as the activation takes a very long time?
Please guide me, as it is very urgent.
    Thanks for help in advance...
    Thanks & Regards,
    Jagadeesh

I assume the 160,000 location products are spread across different locations.
Rather than one Integration Model, it is better to have multiple Integration Models.
For example: one for each region, like North America, one for Europe, one for Asia (assuming you have countries spread across the world). Or create Integration Models by country.
This way you reduce the number of products in an Integration Model.
It is very important to have a manageable set of Integration Models. Let me give an example: you have some problem, hence the single Material Master Integration Model is down (inactive). At that time any PP or PDS transfer will not have the header or component products transferred to APO (in effect the PDS/PPM cannot be transferred). If you are creating or converting Planned Orders, they will not transfer to R/3 (as the header product is not part of an active Integration Model).
But if you have country-specific or region-specific Integration Models, only that country is affected, not all.
In fact you should structure the other integration models (like PDS/PPM, Procurement Relationships, Planned/Production Orders, Sales Orders, Stocks) in the same manner, i.e. either country-specific or grouped by region. The risk of models becoming inactive, or taking too much time to activate after regeneration, easily outweighs the management of a larger number of Integration Models (compared to one global Integration Model per object).
    Hope this gives you some direction.
    Somnath

Iterate DB using DBcursor->get with DB_DBT_USERMEM flag set for DBT

We have BDB running in TDS mode and want to iterate over a complete database using a DBcursor from start to end. We set the DB_DBT_USERMEM flag on the DBT structure, with data pointing to a fixed-size, user-allocated memory block to hold the contents of a single record read. Currently cursor->get fails with DB_BUFFER_SMALL. I assume that this is because cursor->get retrieves more than one record.
Is it possible to iterate over the DB using the said cursor while allocating user memory for only one (1) database record? Each call to cursor->get with DB_NEXT / DB_PREV / DB_FIRST / DB_LAST etc. would update the single record entry.

    Hi Kedar,
    No, DBcursor->get() retrieves multiple key/data items if you're using the DB_MULTIPLE or DB_MULTIPLE_KEY flags. See "Bulk Retrieval":
    [http://www.oracle.com/technology/documentation/berkeley-db/db/programmer_reference/am_misc_bulk.html#am_misc_bulk_get]
    You only want to retrieve a single record per call, hence are not using the aforementioned flags. In this case the DB_BUFFER_SMALL error indicates that the length of the requested/retrieved item is larger than that specified for the DBT via its "ulen" field.
    [http://www.oracle.com/technology/documentation/berkeley-db/db/api_reference/C/dbt.html#dbt_DB_DBT_USERMEM]
    If you want to iterate over all the records in the database (including duplicates, if the database is configured to support them) you should use the DB_NEXT flag.
Note that when the DB_BUFFER_SMALL error is returned, the "size" field of the DBT is set to the length needed for the requested item; you can inspect that value to decide how to size your supplied buffer (or you may know in advance the size of the data items in the database).
    Here is an excerpt from the example code in "Retrieving records with a cursor" with the necessary adjustments for the data DBT:
    [http://www.oracle.com/technology/documentation/berkeley-db/db/programmer_reference/am_cursor.html#am_curget]
     DB *dbp;
     DBC *dbcp;
     DBT key, data;
     int close_db, close_dbc, ret;

     /* Acquire a cursor for the database. */
     if ((ret = dbp->cursor(dbp, NULL, &dbcp, 0)) != 0) {
          dbp->err(dbp, ret, "DB->cursor");
          goto err;
     }
     close_dbc = 1;

     /* Initialize the key/data return pair. */
     memset(&key, 0, sizeof(key));
     memset(&data, 0, sizeof(data));

     /* Retrieve the data item in a user-supplied buffer. */
#define BUFFER_LENGTH 1024
     if ((data.data = malloc(BUFFER_LENGTH)) == NULL)
          return (errno);
     data.ulen = BUFFER_LENGTH;
     data.flags = DB_DBT_USERMEM;
     /* You can supply your own buffer for the key as well. */

     /* Iterate through the database, one record per c_get() call. */
     while ((ret = dbcp->c_get(dbcp, &key, &data, DB_NEXT)) == 0) {
          /* Operate on the retrieved key/data items here. */
     }
     if (ret != DB_NOTFOUND) {
          dbp->err(dbp, ret, "DBcursor->get");
          goto err;
     }

err:
     /* ... */

Regards,
Andrei

  • Class Type 019 for operation classification in Routing

    Hi,
I need to assign class type 019 for classifying operations.
It's giving the below error: "Data for class type 019 has not been converted yet"
The details of the help are as below.
Can anyone explain the implications of these steps, or how I can proceed with this?
    Diagnosis
    You want to define resource selection conditions. In this case, you need to classify the operations with class type 019. The data for class type 019 must be converted first.
    Procedure
    1. In Customizing for the Classification System, call the classifiable object types.
    2. Make a note of the settings for the following objects in object table CRHD (work center) and class type 019:
    AFVC
    CRHD
    PLPO
    3. Delete these objects and save your settings.
    4. In the main menu, choose Tools -> ABAP Workbench, and then Development -> ABAP Editor.
    5. Enter the program name RCCLUKA2 and choose Execute.
    6. Enter class type 019 and choose Execute.
    The system converts the data of this class type.
    7. Call the classifiable object types in Customizing for the Classification System again.
8. Use the settings you have written down to re-create the objects that you deleted before.
    Thanks
    Amlan C

    Dear,
    The classification data of the resource has not yet been converted.
    So that the same classes can be used for defining the selection conditions as for classifying resources, a conversion must be carried out for class type 019.
Please start report RCCLUKA2 in SE38 with the following parameters:
Object table: CRHD
Class type: 019
    Regards,
    R.Brahmankar

  • When to set delta index flag for master data?

    In transaction RSDDBIAMON2 option "Set Delta Flag" shows:
Table Name     | Table Size | Delta Index
/BIC/FTSGCSGMC | 10,155,000 | checked
/BIC/DTSGCSGM1 |  5,000,000 | checked
/BI0/SVERSION  |        700 | checked
When should I check the "Delta Index" box for fact, dimension, and master data tables? I believe I need to check it for fact and dimension tables so that a delta index update occurs during roll-up and users can see the newly loaded requests, but I am uncertain about when and why I should check this box for master data.
    Thanks for your input.

    Vitaliy,
Thanks for answering another of my questions. Yes, we load master data a couple of times a day, so based on your answer I should also check the "Delta Index" box for the master data tables. Thanks for helping me understand this point.
If the below is our approach, the box in step 2 should be checked for all tables, correct?
    1)  Create/fill BIA index for cube "A"
    2)  Check the box "Delta Index" for all the tables which includes fact, dimension, and master tables for cube "A" .
    3)  Run process chain to roll up cube "A" daily
    4)  Run process chain to merge cube "A" weekly
    Many thanks,
    Thao

Class not found for class generated by autotype when deploying

    I'm getting a ClassNotFoundException when I attempt to deploy a webservice that was originally generated from an XSD.
    1) I create the stubs using "autotype" and the schemaFile attribute. Class files are generated and placed in the location specified by destDir.
2) I then run source2wsdd to generate the web-services.xml and WSDL files. source2wsdd specifies the "javaSource" attribute, which points to the Java file the web service is to call.
I also take all the generated Java classes and put them into a JAR file for step 3.
3) I run wspackage to create the EAR file. For "webAppClasses" I specify the class files that go with the "javaSource" specified in step 2. I also take the JAR file created in step 2 (which contains all the code generated by step 1) and use "utilJars" to put it in the /lib directory of the WAR file.
    4) When I deploy, I get a "class not found" for one of the classes that is in the utility jar file I created in step 2. I have verified that the jar file is in the "web-inf/lib" directory.
    What am I not seeing that is causing WebLogic (WLS 8.0 SP2) to not find the class?
    Thanks in advance...
    Wayne

    Bruce: Got it working. (Bad news is I'm not sure what I did to fix it. :-( )
    My service now deploys, but I'm having problems doing a "source2wsdd" from inside
    Eclipse. Found another customer in this group with the same issue, and added my
    info to his message.
    For now I have to run the build outside Eclipse, but at least my service is deployable.
    Thanks for taking the time to reply to my message!
    Wayne
    Bruce Stephens <[email protected]> wrote:
    I think attachments are working again, so you might post a short example
    and we can take a look...
    Bruce Stephens wrote:
    Hi Wayne,
Could it be something simple, like "WEB-INF" not being capitalized?
...probably not, but who knows. Also, in step 1, is your destDir going
to WEB-INF/classes? You might review the working example Manoj generated
[1].
    HTH,
    Bruce
    [1]
    http://www.manojc.com/?sample43
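For anyone else hitting this: before digging into the packaging, it can help to confirm the utility JAR really contains a loadable class. Here is a small standalone check that can be run on a modern JDK; the JAR path and class name below are placeholders:

import java.net.URL;
import java.net.URLClassLoader;

public class JarClassCheck {
    public static void main(String[] args) throws Exception {
        // Placeholders: point at the utility JAR and a generated class name.
        URL jar = new URL("file:/path/to/generated-types.jar");
        // A null parent isolates the check to this one JAR (plus the bootstrap).
        try (URLClassLoader cl = new URLClassLoader(new URL[] { jar }, null)) {
            Class<?> c = cl.loadClass("com.example.generated.SomeType");
            System.out.println("Loaded OK: " + c.getName());
        } catch (ClassNotFoundException e) {
            // Either the class is missing from the JAR, or it depends on
            // classes that are not visible to this isolated loader.
            System.out.println("Not loadable from the JAR: " + e);
        }
    }
}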

  • XML alignment problem when opening with notepad - XML generated from SAP

    Hi all,
I am sending a mail with an attachment in XML format using the function module SO_NEW_DOCUMENT_ATT_SEND_API1.
I receive the XML file in the mail, and it is properly aligned when I open it. But when I open it with Notepad, the alignment changes.
Below is the code for sending the mail. The content of the XML file is populated in internal table lt_attachment.
    lt_attachment[] = pt_attachment[].
      LOOP AT lt_attachment INTO ls_attachment.
        ls_objtxt-line = ls_attachment-container.
        APPEND ls_objtxt TO lt_objtxt.
        CLEAR : ls_attachment, ls_objtxt.
      ENDLOOP.
* Creating the document to be sent
      ls_mailsubject-obj_name     = 'MAILATTCH'.
      ls_mailsubject-obj_langu    = sy-langu.
      ls_mailsubject-obj_descr    = 'You have got mail'.
      ls_mailsubject-sensitivty   = 'F'.
      lv_cnt = LINES( lt_objtxt ).
      ls_mailsubject-doc_size     = ( lv_cnt - 1 ) * 255 + STRLEN( ls_objtxt ).
      DESCRIBE TABLE lt_objtxt LINES lv_tab_lines.
* Header of the email
      ls_objpack-transf_bin = space.
      ls_objpack-head_start = 1.
      ls_objpack-head_num   = 0.
      ls_objpack-body_start = 1.
      ls_objpack-body_num   = lv_tab_lines.
      ls_objpack-doc_type   = lc_raw.
      APPEND ls_objpack TO lt_objpack.
      CLEAR ls_objpack.
      ls_objpack-transf_bin = lc_x.
      ls_objpack-head_start = 1.
      ls_objpack-head_num   = 1.
      ls_objpack-body_start = 1.
      ls_objpack-body_num   = lv_tab_lines.
      ls_objpack-doc_type   = 'XML'.
      ls_objpack-obj_name   = 'data'.
      ls_objpack-obj_descr  = 'data'.
      ls_objpack-doc_size   = ls_objpack-body_num * 255.
      APPEND ls_objpack TO lt_objpack.
      CLEAR ls_objpack.
* Add Recipients
      ls_reclist-rec_type = 'U'.
      ls_reclist-com_type  = 'INT'.
      ls_reclist-receiver =  pv_mail.
      APPEND  ls_reclist TO lt_reclist.
* Mail Contents
      ls_mailtxt-line = 'Please find attached your XML doc.'.
      APPEND ls_mailtxt TO lt_mailtxt.
      CLEAR ls_mailtxt.
      ls_mailtxt-line = lc_regards.
      APPEND ls_mailtxt TO lt_mailtxt.
      CLEAR ls_mailtxt.
      ls_mailtxt-line = lc_dewa.
      APPEND ls_mailtxt TO lt_mailtxt.
      CLEAR ls_mailtxt.
* Sending the document
      CALL FUNCTION 'SO_NEW_DOCUMENT_ATT_SEND_API1'
        EXPORTING
          document_data              = ls_mailsubject
         put_in_outbox              = lc_x
         commit_work                = lc_x
        TABLES
          packing_list               = lt_objpack
          contents_bin               = lt_objtxt
          contents_txt               = lt_mailtxt
          receivers                  = lt_reclist
        EXCEPTIONS
          too_many_receivers         = 1
          document_not_sent          = 2
          document_type_not_exist    = 3
          operation_no_authorization = 4
          parameter_error            = 5
          x_error                    = 6
          enqueue_error              = 7
          OTHERS                     = 8.
      IF sy-subrc EQ 0.
        COMMIT WORK.
        SUBMIT rsconn01 WITH mode = 'INT' AND RETURN.
        pv_return = 'Success'.
      ELSE.
        pv_return = 'Failed'.
      ENDIF.
      CLEAR: ls_objtxt,ls_reclist, ls_objpack ,ls_doc_chng.
      REFRESH: lt_objtxt,lt_reclist,lt_objpack.
Could anyone help me with this?
    Thanks & Regards,
    Vineel.

    Can't you PREVIEW your question before posting? Here you have more than 2500 characters, so we can't read your code.
    Your "problem" seems normal to me (you shouldn't open an XML file with Notepad). Could you give an example how it looks like and how you'd like to make it appear?

Maybe you are looking for

How to set the current row in a table automatically when tabbing out from one row to the next

Hi, I'm using JDev 11.1.2.0.0. How can we set the row (where the focus is) as the current row in the table? If we create a table with single row selection, then whichever row we select becomes the current row because of the selection listener (#{b

  • Ultra 24 FAN speed

My Ultra 24 fan is spinning very fast and generating a lot of noise. From my diagnosis, I found that the system is not able to read the temperature, fan speed, and voltages. I've also replaced the motherboard, with BIOS 1.5; still my problem is not resolved.

  • Getting a 400 error code when I try to sign into my elements 9 account

    I can't access my Adobe account online now and I need to get some help with how to do things. It keeps telling me the server is not working or there might be a 400 error code. Does anyone at Adobe take care of these things?

  • Is Lightroom a useful addition to Elements 13?

I've just upgraded from PSE10 to 13 - very happy with my purchase so far. One of the next tasks on my agenda is to scan a load of old photos and 35mm slides. Would Lightroom be a useful addition to PSE13 for its batch editing capabilities, whilst

  • Duotone presets folder empty

In the Scott Kelby book on CS3 for digital photographers, in the section on quadtoning, he describes duotone presets that are supposed to be already loaded in the presets folder. On my CS3 Extended, the presets folder is empty. Is this something that I n