Export Chinese data to an XLS file using TOAD

Hi,
I have a table that contains Chinese data, and the data itself is correct.
When I export the Chinese data from the Oracle table, I run into a problem:
     I export the data through TOAD into a flat file
     and then import it into an XLS file using the UTF character set.
After importing, all the characters are converted to ???? marks,
so I cannot see the data properly in the XLS file.
My guess is that the database is not exporting the file as Unicode, and hence the characters get converted.
How can I export the data so that the Chinese characters display correctly?

Hi,
I am using TOAD version 8.5. I tried with SQL Developer as well, but to no avail.
Can I set something like this?
ALTER SESSION SET NLS_LANGUAGE='SIMPLIFIED CHINESE';
ALTER SESSION SET NLS_TERRITORY='CHINA';
ALTER SESSION SET NLS_CHARACTER_SET='AMERICAN_AMERICA.UTF8';
My NLS_LANG in Windows for the 10g client is AMERICAN_AMERICA.WE8MSWIN1252.
Please suggest.
My NLS_SESSION_PARAMETERS in the database are as follows:
PARAMETER     VALUE
NLS_LANGUAGE     AMERICAN
NLS_TERRITORY     AMERICA
NLS_CURRENCY     $
NLS_ISO_CURRENCY     AMERICA
NLS_NUMERIC_CHARACTERS     .,
NLS_CALENDAR     GREGORIAN
NLS_DATE_FORMAT     DD-MON-RR
NLS_DATE_LANGUAGE     AMERICAN
NLS_SORT     BINARY
NLS_TIME_FORMAT     HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT     DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT     HH.MI.SSXFF AM TZR
NLS_TIMESTAMP_TZ_FORMAT     DD-MON-RR HH.MI.SSXFF AM TZR
NLS_DUAL_CURRENCY     $
NLS_COMP     BINARY
NLS_LENGTH_SEMANTICS     BYTE
NLS_NCHAR_CONV_EXCP     FALSE
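
A minimal diagnostic sketch, assuming the query is run from SQL*Plus or the TOAD editor: the character-set portion of NLS_LANG comes from the client environment (environment variable or registry) rather than from ALTER SESSION, so with a WE8MSWIN1252 client setting the Chinese characters are converted to '?' on the way out. The query below uses standard data dictionary views to show what the database itself stores:
-- Check the database and national character sets. For the Chinese data to
-- survive a flat-file export, the client NLS_LANG (e.g. AMERICAN_AMERICA.AL32UTF8,
-- set before starting TOAD) and the editor/Excel import must use the same
-- Unicode character set.
SELECT parameter, value
  FROM nls_database_parameters
 WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');
If the database (or national) character set is already Unicode, exporting with a UTF-8 client setting and importing the flat file into Excel as UTF-8 usually keeps the characters intact.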

Similar Messages

  • Export batch data into CSV file using SQL SP

    Hi,
    I have created a WCF-Custom receive adapter to poll a SQL stored procedure (WITH XMLNAMESPACES(DEFAULT 'Namespace') and FOR XML PATH(''), TYPE). I get the result properly in batches while polling, but I get an error while converting it to CSV using a map.
    Can anyone please give me some idea of how to export SQL data into a CSV file using an SP?

    How are you doing this?
    You would have got an XML representation of the XML batch received from SQL.
    You should have a flat-file schema representing the CSV file which you want to send.
    Map the received XML representation of the data from SQL to the flat-file schema.
    Have a custom pipeline with a flat-file assembler in the assemble stage of the send pipeline.
    In the send port, use the map which converts the received XML from SQL to the flat-file schema, and use the above custom flat-file send pipeline.
    If this answers your question please mark it accordingly. If this post is helpful, please vote as helpful by clicking the upward arrow next to my reply.
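    For context, the polling procedure described above usually looks roughly like the sketch below; the procedure name, table, columns and namespace are made-up placeholders rather than the poster's objects, with only the WITH XMLNAMESPACES / FOR XML PATH(''), TYPE shape taken from the question:
    -- Hypothetical polling procedure; dbo.OrderBatch and its columns are illustrative only.
    CREATE PROCEDURE dbo.usp_PollOrderBatch
    AS
    BEGIN
        SET NOCOUNT ON;
        WITH XMLNAMESPACES (DEFAULT 'http://example.com/orders')
        SELECT  OrderId  AS "Order/Id",
                Customer AS "Order/Customer",
                Total    AS "Order/Total"
        FROM    dbo.OrderBatch
        FOR XML PATH(''), ROOT('Orders'), TYPE;
    END
    The XML this returns is what the receive adapter hands to BizTalk, and that shape is the starting point for the map to the flat-file (CSV) schema.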

  • Problem while writing data to an xls file using the jxl API

    Hi,
    I am getting a problem while writing data to an Excel file using the jxl API.
    When I write data to the file and all handles associated with the file are closed, the file size increases; but when I open the file nothing is written in it, and when the file is closed manually from the Excel window, the file size decreases back to what it was before the data was written.
    Here is the code:
              // Create the output file fresh (not in append mode); a jxl workbook must be
              // written to a new file, not appended to an existing one.
              this.os = new FileOutputStream(this.dirPath + this.fileName);
              this.workbook = Workbook.createWorkbook(this.os);
    After writing the data, the following handles are closed:
              this.workbook.write();   // write the workbook content to the stream
              this.workbook.close();   // finalise the workbook before closing the stream
              this.os.close();
              this.os = null;
    Can anybody help me?
    Thanks in advance.

    Err, I did help you. I did understand your problem; and I solved it for you. What was missing was that you apparently made no effort to understand what you were being told. Or even consider it. You just argued about it, as though you were the one with the solution, instead of the one whose code didn't work.
    And the other thing that was missing was the part where you said 'thank you' to me for solving your problem. Somewhat more appropriate than biting the hand that fed you, frankly. I do this for nothing, on my own gas, and it's extremely irritating when people keep asking about problems I have already solved for them. I am entitled to discourage that. It's part of making them more efficient actually.
    But it happens often enough that it also makes me think I'm just wasting my time. Probably I am.

  • How to get a header in a downloaded .xls file using the GUI_DOWNLOAD function

    How do I get a header in the downloaded .xls file using the GUI_DOWNLOAD function?
    How do I use the header parameter available in GUI_DOWNLOAD?

    Hi,
    See this sample code:
    DATA: BEGIN OF t_header OCCURS 0,
            name(30) TYPE c,
          END OF t_header.
    DATA: BEGIN OF itab OCCURS 0,
            fld1 TYPE char10,
            fld2 TYPE char10,
            fld3 TYPE char10,
          END OF itab.
    DATA: v_pass_path TYPE string.
    " Fill a few sample data rows.
    itab-fld1 = 'Hi'.
    itab-fld2 = 'hello'.
    itab-fld3 = 'welcome'.
    APPEND itab.
    APPEND itab.
    APPEND itab.
    " Fill the header line; it becomes the first row of the downloaded file.
    t_header-name = 'Field1'.
    APPEND t_header.
    t_header-name = 'Field2'.
    APPEND t_header.
    t_header-name = 'Field3'.
    APPEND t_header.
    CALL FUNCTION 'GUI_FILE_SAVE_DIALOG'
      EXPORTING
        default_extension = 'XLS'
      IMPORTING
        fullpath          = v_pass_path.
    CALL FUNCTION 'GUI_DOWNLOAD'
      EXPORTING
        filename   = v_pass_path
        filetype   = 'DBF'
      TABLES
        data_tab   = itab
        fieldnames = t_header.
    Cheers,
    jose.

  • Spool SQL data into a text file using dynamic SQL

    Hi,
    I am spooling output data into a text file using a command like:
    select 'select t.mxname,bo.lxtype,t.mxrev'||chr(10)||'from mx_1234567'||chr(10)||
    'where <condition>';
    Here mxname, lxtype and mxrev are all VARCHAR(128) columns. I want the output in the format
    e.g. Part|1211121313|A
    but because of the column widths, the output I am getting contains padding spaces:
    "Part then blank spaces |1211121313 then blank spaces |A"
    How can I remove these spaces between the columns? I used SET SPACE 0 but it is not working.
    Thanks in advance.
    Your help will be appreciated.

    Hi Frank,
    I have seen your reply about the SET LINESIZE setting, but I could not understand it.
    I am facing a similar kind of issue in my present project.
    I am trying to spool more than 50 columns from a table into a flat file. Because of the large column lengths of a few columns, I am getting spaces; many columns have the same issue. I want to remove those spaces so that the data fits on one line in the .txt file without any wrapped text.
    Below is my sample query.sql. Please let me know the syntax. My mail id: [email protected]
    --Created : Sep 22,2008, Created By : Srinivasa Bojja
    --Export all Fulfillments
    --Scheduled daily after 1:00am and should complete before 3:30am
    WHENEVER SQLERROR EXIT SQL.SQLCODE
    SET LINESIZE 800
    SET WRAP OFF
    SET PAGESIZE 800
    SET FEEDBACK OFF
    SET HEADING ON
    SET ECHO OFF
    SET CONCAT OFF
    SET COLSEP '|'
    SET UNDERLINE OFF
    SPOOL C:\Fulfillment.txt;
    SELECT SRV.COMM_METHOD_CD AS Method,
    SRV.SR_NUM AS "Fulfillment Row_Id",
    CON.LAST_NAME AS "Filled By",
    SRV.SR_TITLE AS Notes,
    SRVXM.ATTRIB_04 AS "Form Description"
    FROM SIEBEL.S_SRV_REQ SRV,
    SIEBEL.S_SRV_REQ_XM SRVXM,
    SIEBEL.S_USER USR,
    SIEBEL.S_CONTACT CON
    WHERE SRV.ROW_ID = SRVXM.PAR_ROW_ID AND
    SRV.OWNER_EMP_ID = USR.ROW_ID AND
    CON.ROW_ID= SRV.CST_CON_ID;
    SPOOL OFF;
    EXIT;
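    One way to get rid of the per-column padding (a sketch only, reusing the first three columns of the script above; SET TRIMSPOOL ON and the single-concatenated-column approach are standard SQL*Plus techniques, not something from Frank's reply) is to select one concatenated string per row, so SQL*Plus has no fixed column widths left to pad:
    SET TRIMSPOOL ON
    SET LINESIZE 800
    SET PAGESIZE 0
    SET FEEDBACK OFF
    SPOOL C:\Fulfillment.txt
    -- One string per row: the '|' separators are part of the data, so no
    -- column padding is added by SQL*Plus.
    SELECT SRV.COMM_METHOD_CD || '|' ||
           SRV.SR_NUM         || '|' ||
           CON.LAST_NAME      AS line
      FROM SIEBEL.S_SRV_REQ SRV,
           SIEBEL.S_CONTACT CON
     WHERE CON.ROW_ID = SRV.CST_CON_ID;
    SPOOL OFF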

  • Merge xls files using PowerShell

    Is there any way to merge .xls files using PowerShell?

    What does "merge" mean?  Copy cells?  Copy Sheets?  Copy formulas?  Copy Data?
    "Merge"is way too vague.
    ¯\_(ツ)_/¯

  • Not able to extract performance data from an .ETL file using xperf commands; getting error "Events were lost in this trace. Data may be unreliable ..."

    I am not able to extract performance data from an .ETL file using xperf commands.
    Xperf command:
    xperf -i C:\TempFolder\Test.etl -o C:\TempFolder\BootData.csv -a process
    I am getting the following error after executing the above command:
    "33288636 Events were lost in this trace. Data may be unreliable.
    This is usually caused by insufficient disk bandwidth for ETW logging.
    Please try increasing the minimum and maximum number of buffers and/or the buffer size. Doubling these values would be a good first attempt.
    Please note, though, that this action increases the amount of memory reserved for ETW buffers, increasing memory pressure on your scenario.
    See "xperf -help start" for the associated command line options."
    I changed the page file size but it did not work for me.
    Does anyone have an idea how to solve this problem and extract the data from the ETL file?

    I want to mention one point here: I have a total of 4 machines, and on 3 of them the above commands work properly. Only one machine has this problem.
    Hi,
    I suggest you use xperf to collect the trace ETL file again and see if it can be extracted on this computer.
    Refer to the following articles:
    start
    http://msdn.microsoft.com/en-us/library/windows/hardware/hh162977.aspx
    Using Xperf to take a Trace (updated)
    http://blogs.msdn.com/b/pigscanfly/archive/2008/02/16/using-xperf-to-take-a-trace.aspx
    Kate Li
    TechNet Community Support

  • Display data in log file using PL/SQL procedure

    Just as srw.message is used in Oracle RDF Reports to write data to the log file in Oracle Apps, how can data be written to the log file from a PL/SQL procedure?
    Please also mention the syntax.

    Please post details of your OS, database and EBS versions.
    You will need to invoke the seeded FND_LOG procedure - see previous discussions on this topic:
    Enable debug for pl/sql
    https://forums.oracle.com/forums/search.jspa?threadID=&q=FND_LOG&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
    HTH
    Srini
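    A typical FND_LOG call from a PL/SQL procedure looks roughly like the sketch below; the module string is a made-up placeholder, and FND_LOG only writes anything when the AFLOG debug profile options are enabled for the session, so treat this as an outline rather than a drop-in solution:
    BEGIN
      -- Guard with the current runtime level so the message is only built when logging is on.
      IF fnd_log.level_statement >= fnd_log.g_current_runtime_level THEN
        fnd_log.string(log_level => fnd_log.level_statement,
                       module    => 'xxcust.plsql.my_proc',   -- hypothetical module name
                       message   => 'Processing started');
      END IF;
    END;
    /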

  • Creating data and .par files using SWIFT Integration Package

    Hi,
    I have a requirement to generate a data file and a .par file using the SAP PI Integration Package for SWIFT.
    I am following SAP Note 1303428.
    I have created 1 sender and 2 receiver communication channels (one for the payload and the other for the par file). I used the operation mapping SWIFT_payload_parFile_split in the interface determination.
    I am using the adapter module localejbs/swift/FileActConversionToSWIFTModule and setting the parameter DetachParameters to true. This adapter module is used in all three channels (1 sender and 2 receivers).
    I have used ASMA and have set the FileName checkbox.
    Now, after placing the file in the input directory, a file with the same name gets created in the output directory, but it is exactly the same as the input and no .par file is created. I have set Empty file handling to Ignore, so it appears there is no data with which to create a .par file; only the payload file is created, and it is identical to the input.
    Also, if I use the adapter module localejbs/swift/FileActConversionToSWIFTModule in only the sender communication channel, a payload file gets created like the one below.
    <?xml version="1.0" encoding="UTF-8"?>
    -<ns1:SWIFT_payload xmlns:ns1="http://sap.com/xi/SWIFT"><payload>PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz4KPEZpbGU+CiA8UGFyYW1ldGVycyB4bWxucz0idXJuOnN3aWZ0OnNhZzp4c2Q6ZnRhLnBhcmFtLjEuMCIgeG1sbnM6eHNpPSJodHRwOi8vd3d3LnczLm9yZy8yMDAxL1hNTFNjaGVtYS1pbnN0YW5jZSI+CiAgPE92ZXJyaWRlcz4KICAgPFJlc3BvbmRlcj5jbj1jZngsb3U9bmEsbz1jaXRpZ2IybCxvPXN3aWZ0PC9SZXNwb25kZXI+CiAgIDxTZXJ2aWNlPnN3aWZ0LmNvcnAuZmEhcDwvU2VydmljZT4KICAgPFJlcXVlc3RUeXBlPnBhaW4uMDAxLjAwMS4wMzwvUmVxdWVzdFR5cGU+CiAgIDxUcmFuc2ZlckRlc2NyaXB0aW9uPkIwNDExMC1CYXRjaDE3ODc8L1RyYW5zZmVyRGVzY3JpcHRpb24+CiAgIDxUcmFuc2ZlckluZm8+QjA0MTEwLUJhdGNoMTc4NzwvVHJhbnNmZXJJbmZvPgogICA8RmlsZURlc2NyaXB0aW9uPjIwNzY4PC9GaWxlRGVzY3JpcHRpb24+CiAgIDxGaWxlSW5mbz5Td0NvbXByZXNzaW9uPW5vbmU8L0ZpbGVJbmZvPgogICA8Tm9uUmVwdWRpYXRpb24+VFJVRTwvTm9uUmVwdWRpYXRpb24+CiAgIDxTaWduPlRSVUU8L1NpZ24+CiAgIDxQcmlvcml0eT5Ob3JtYWw8L1ByaW9yaXR5PgogIDwvT3ZlcnJpZGVzPgogPC9QYXJhbWV0ZXJzPgogPERvY3VtZW50IHhtbG5zPSJ1cm46aXNvOnN0ZDppc286MjAwMjI6dGVjaDp4c2Q6cGFpbi4wMDEuMDAxLjAzIiB4bWxuczp4c2k9Imh0dHA6Ly93d3cudzMub3JnLzIwMDEvWE1MU2NoZW1hLWluc3RhbmNlIj4KICA8Q3N0bXJDZHRUcmZJbml0bj4KICAgPEdycEhkcj4KICAgIDxNc2dJZD4xMDAwMzI4MTE1PC9Nc2dJZD4KICAgIDxDcmVEdFRtPjIwMTQtMDMtMjhUMTk6MjY6Mzc8L0NyZUR0VG0+CiAgICA8TmJPZlR4cz4xPC9OYk9mVHhzPgogICAgPEN0cmxTdW0+NTkwLjAwPC9DdHJsU3VtPgogICAgPEluaXRnUHR5PgogICAgIDxObT5BTEVYSU9OIElOVC4gU0FSTDwvTm0+CiAgICAgPElkPgogICAgICA8T3JnSWQ+CiAgICAgICA8QklDT3JCRUk+QUxYTlVTMjBYWFg8L0JJQ09yQkVJPgogICAgICA8L09yZ0lkPgogICAgIDwvSWQ+CiAgICA8L0luaXRnUHR5PgogICA8L0dycEhkcj4KICAgPFBtdEluZj4KICAgIDxQbXRJbmZJZD4xMDAwMzI4MTE1PC9QbXRJbmZJZD4KICAgIDxQbXRNdGQ+VFJGPC9QbXRNdGQ+CiAgICA8QnRjaEJvb2tnPmZhbHNlPC9CdGNoQm9va2c+CiAgICA8TmJPZlR4cz4xPC9OYk9mVHhzPgogICAgPEN0cmxTdW0+NTkwLjAwPC9DdHJsU3VtPgogICAgPFBtdFRwSW5mPgogICAgIDxJbnN0clBydHk+Tk9STTwvSW5zdHJQcnR5PgogICAgIDxTdmNMdmw+CiAgICAgIDxDZD5TRVBBPC9DZD4KICAgICA8L1N2Y0x2bD4KICAgIDwvUG10VHBJbmY+CiAgICA8UmVxZEV4Y3RuRHQ+MjAxNC0wMy0yOTwvUmVxZEV4Y3RuRHQ+CiAgICA8RGJ0cj4KICAgICA8Tm0+QUxYTiBCRU5FTFVYIEJWIE5MIEJSQU5DSDwvTm0+CiAgICAgPFBzdGxBZHI+CiAgICAgIDxTdHJ0Tm0+U3RyYWF0PC9TdHJ0Tm0+CiAgICAgIDxUd25ObT5OZXRoZXJsYW5kczwvVHduTm0+CiAgICAgIDxDdHJ5Pk5MPC9DdHJ5PgogICAgIDwvUHN0bEFkcj4KICAgICA8Q3RyeU9mUmVzPk5MPC9DdHJ5T2ZSZXM+CiAgICA8L0RidHI+CiAgICA8RGJ0ckFjY3Q+CiAgICAgPElkPgogICAgICA8SUJBTj5OTFhYQU5CQTEyMzAwNDU2NzY3ODkwPC9JQkFOPgogICAgIDwvSWQ+CiAgICAgPENjeT5FVVI8L0NjeT4KICAgIDwvRGJ0ckFjY3Q+CiAgICA8RGJ0ckFndD4KICAgICA8RmluSW5zdG5JZD4KICAgICAgPEJJQz5BQk5BTkwyWFhYWDwvQklDPgogICAgICA8UHN0bEFkcj4KICAgICAgIDxDdHJ5Pk5MPC9DdHJ5PgogICAgICA8L1BzdGxBZHI+CiAgICAgPC9GaW5JbnN0bklkPgogICAgPC9EYnRyQWd0PgogICAgPENocmdCcj5TTEVWPC9DaHJnQnI+CiAgICA8Q2R0VHJmVHhJbmY+CiAgICAgPFBtdElkPgogICAgICA8RW5kVG9FbmRJZD5OTDEyMzAwNDAwMDAwMDwvRW5kVG9FbmRJZD4KICAgICA8L1BtdElkPgogICAgIDxBbXQ+CiAgICAgIDxJbnN0ZEFtdCBDY3k9IkVVUiI+NTkwLjAwPC9JbnN0ZEFtdD4KICAgICA8L0FtdD4KICAgICA8Q2R0ckFndD4KICAgICAgPEZpbkluc3RuSWQ+CiAgICAgICA8QklDPkFCTkFOTFhYWFhYPC9CSUM+CiAgICAgICA8Q2xyU3lzTW1iSWQ+CiAgICAgICAgPE1tYklkPjAwMzwvTW1iSWQ+CiAgICAgICA8L0NsclN5c01tYklkPgogICAgICAgPE5tPkFCTiBBbXJvPC9ObT4KICAgICAgIDxQc3RsQWRyPgogICAgICAgIDxDdHJ5Pk5MPC9DdHJ5PgogICAgICAgPC9Qc3RsQWRyPgogICAgICA8L0Zpbkluc3RuSWQ+CiAgICAgPC9DZHRyQWd0PgogICAgIDxDZHRyPgogICAgICA8Tm0+QUxYTiBOTCBEb21lc3RpYyBWZW5kb3I8L05tPgogICAgICA8UHN0bEFkcj4KICAgICAgIDxDdHJ5Pk5MPC9DdHJ5PgogICAgICA8L1BzdGxBZHI+CiAgICAgIDxJZD4KICAgICAgIDxPcmdJZD4KICAgICAgICA8T3Rocj4KICAgICAgICAgPElkPjAwMTUwMDAxOTc8L0lkPgogICAgICAgIDwvT3Rocj4KICAgICAgIDwvT3JnSWQ+CiAgICAgIDwvSWQ+CiAgICAgPC9DZHRyPgogICAgIDxDZHRyQWNjdD4KICAgICAgPElkPgogICAgICAgPElCQU4+TkwxMjAwMzA0NTY3ODkxMjAwMDA8L0lCQU4+CiAgIC
AgIDwvSWQ+CiAgICAgIDxDY3k+RVVSPC9DY3k+CiAgICAgIDxObT5BTFhOIE5MIERvbWVzdGljIFZlbmRvcjwvTm0+CiAgICAgPC9DZHRyQWNjdD4KICAgICA8Um10SW5mPgogICAgICA8VXN0cmQ+L1BNREQvVEVTVDY4MSw1OTAuMDAsRVVSLDIwMTQwMzI8L1VzdHJkPgogICAgIDwvUm10SW5mPgogICAgPC9DZHRUcmZUeEluZj4KICAgPC9QbXRJbmY+CiAgPC9Dc3RtckNkdFRyZkluaXRuPgogPC9Eb2N1bWVudD4KPC9GaWxlPgo=</payload></ns1:SWIFT_payload>
    But while creating the par file, it reports that the file could not be overwritten, and so the .par file is not created.
    I need to understand:
    1) How do I configure the two receiver channels, i.e. what should be different between them? Currently I am setting the
    same output directory in both, with file name scheme *, and I have used ASMA with the FileName parameter. So files with the same name are created and one of them gets discarded. How do I get a .par file created?
    2) Is the file above the correct file required by SWIFT?
    3) When I check in SXMB_MONI, I can see that after running the adapter module the same payload as shown above goes into both the payload and the par file; even if I use different names in my receiver communication channels, two files are created but both contain the same payload. So what exactly should be created?
    Kindly guide me on this implementation.

    Hi,
    I am able to generate the .par file by setting localsecurity to true and supplying the KeyId from the key manager.
    Now two files are created, .xml and .par. The .xml file, which is the payload file, is identical to the input file and contains both the overrides and the data parameters. The .par file contains the Algorithm and the Value.
    Is this correct?
    Also, suppose the input file name is SEPA.xml; then the payload file is created with the name
    SEPA.xml and the par file is created with the name SEPA.xml.par. I need only SEPA.par. How can I achieve this?

  • Sun IDM 8.1.1P2: Export user records to xls file Functionality Issue

    Hi All,
    This is my first post on this forum; please guide me to the right path.
    We implemented custom functionality to search user records from AD and LDAP from the IDM user console. After searching the records, we provided export functionality to export the resulting user records to an xls file.
    The issue is that the number of user records exported to the xls file is not the same as the number of user records returned by the search.
    This functionality works correctly in our Development and VAL environments but not in the Production environment.
    I checked the custom jsp file and the calling rule in all three environments and they are the same.
    In the VAL and PROD server.log I see the following:
    PWC1406: Servlet.service() for servlet jsp threw exception
    java.lang.IllegalStateException: PWC3991: getOutputStream() has already been called for this response
    But this error did not stop VAL from exporting the same number of records to the xls file.
    # of records searched from PROD is 16809
    # of records searched from VAL is 10312
    They are constant all the time.
    # of records exported to xls from PROD is 168 or 1274 (it varies; each export shows a different number)
    # of records exported to xls from VAL is 10312 (always the same as the search)
    We are on Glassfish V2.1.1P8.
    I checked the file sizes from VAL and PROD and both are the same.
    It would be great if anyone could point me in the right direction as to where else I should look for a possible cause.
    Thanks,
    Ravi Mangalagiri

    Hi Arjun,
    Thanks for responding to my post.
    The search works as expected in all 3 environments: DEV, VAL and PROD.
    The search and alignment are performed by the rule, whereas the DB connection and saving to XLS are performed by the custom JSP file.
    Since the search works fine, I don't think there is any permissions issue with AD or LDAP.
    A couple of things I noticed in the server.log of all environments:
    SEVERE|sun-appserver2.1.1|javax.enterprise.system.container.web|_ThreadID=297;_ThreadName=httpSSLWorkerThread-9084-102;_RequestID=5efa3ecb-0ec9-4695-ab51-8049257b9d57;|StandardWrapperValve[jsp]: PWC1406: Servlet.service() for servlet jsp threw exception
    java.lang.IllegalStateException: PWC3991: getOutputStream() has already been called for this response
    and
    WARNING|sun-appserver2.1.1|javax.enterprise.system.stream.err|_ThreadID=78;_ThreadName=Provisioner;_RequestID=531d32b0-6d9a-43e-bd74-0bc9478ffdae;|org.xml.sax.SAXParseException: XML document structures must start and end within the same entity.
    at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source)
    This is logged while the custom jsp is executing:
    getOutputStream() has already been called for this response.
    I am not sure if this is the root cause, since it is also logged in DEV and VAL.
    Other things I noticed:
    Yesterday I ran 10 tests and they all took 6 min 18 sec, 6 min 19 sec or 6 min 22 sec.
    I also noticed that the number of user records exported to xls depends on the transfer rate.
    For example,
    if the file download transfer rate is 1.50 KB, the exported user records number between 1200 and 1800, whereas the searched user records are 16590;
    if the file download transfer rate is 800 B, the exported user records number between 200 and 600, whereas the searched user records are 16590.
    I am not sure where to check this time value (attribute) of 6 min 18 sec.
    Please give me some info on where else I need to check.
    Thanks,
    Ravi.

  • How to write data to text file using external tables

    Can anybody tell me how to write data to a text file using the external tables concept?

    Hi,
    Using an external table you can load the data into a local table in the database;
    then, using your local db table and the UTL_FILE package, you can write the data to a text file.
    External table
    ~~~~~~~~~~~
    http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14200/statements_7002.htm#i2153251
    UTL_FILE
    ~~~~~~~~~
    http://download-east.oracle.com/docs/cd/B19306_01/appdev.102/b14258/u_file.htm#sthref14093
    Message was edited by:
    Nicloei W
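    A minimal sketch of the UTL_FILE side, assuming a directory object named EXPORT_DIR has already been created and granted; the directory, file name, table and columns here are illustrative placeholders, not something from the thread:
    -- Assumes: CREATE DIRECTORY export_dir AS '/some/os/path'; plus write privilege on it.
    DECLARE
      l_file UTL_FILE.FILE_TYPE;
    BEGIN
      l_file := UTL_FILE.FOPEN('EXPORT_DIR', 'emp_data.txt', 'w');
      FOR rec IN (SELECT empno, ename FROM emp) LOOP
        UTL_FILE.PUT_LINE(l_file, rec.empno || '|' || rec.ename);
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
    END;
    /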

  • How to write data to an Excel file using Java

    Hi,
    Can anybody help me write data to an Excel file
    using Java code?
    Thanks in advance

    How much are you willing to pay for that?
    If you want it for free, http://jexcelapi.sourceforge.net/

  • How to write non-XML data to a file using an OSB FTP Routing?

    Hi --
    Situation ... I need to write non-XML data to a file using FTP. A proxy service retrieves XML and transforms it with XSLT into CSV format, then gives it to a business service to write it out using FTP. Simple.
    Problem ... OSB sends the contents of $body to any service it calls. Because $body is a SOAP document, it has to contain XML. So I have to put my CSV data inside an XML element in order to put it into $body, and this inner element then gets written to the file, which I don't want. But if I don't enclose my CSV content in a tag, I get "Unexpected CDATA encountered" when trying to assign it to a variable.
    There has to be a way around this!
    Thanks for your help.
    John Taylor

    Solved. Steps:
    -- Transform the XML to CSV using an XSL transform. Put the CSV data inside enclosing XML elements, and use a Replace action to put the XML element + CSV contents back into *$body*.
    -- Define an MFL transform that only knows about the enclosing XML elements. Use a delimiter of "\n" (hard return).
    -- Route from the proxy service to a Biz service that has Service Type = Messaging Service and Request Message Type = MFL; specify the MFL transform, which will receive the incoming *$body* variable, strip off the enclosing XML element within it, and pass the CSV contents to the FTP service.
    Edited by: DunedainRanger on Nov 29, 2011 9:03 AM

  • Appending XML data to a file using the file adapter

    Hi,
    I am trying to append data to a file in XML format using the file adapter (the objective is to store the data as XML messages in the file). I understand it is possible to append data to an existing file by making the appropriate changes in the WSDL manually. However, my issue is that every time XML data is appended to the file, the XML header is also appended. As a result, the file data no longer adheres to the defined schema structure.
    Is there a way to save XML data to a file in append mode?
    Thanks, Riz

    I am having the same issue as well, which makes the output XML invalid. I have an XSD with a hierarchy of elements; whenever I write the child elements, the header and the namespace are also written. For example:
    <?xml version="1.0" ?><Parent xmlns="http://testSchema.com/outputSchema">
    <StartDate>01/01/2001</StartDate>
    <EndDate>01/30/2001</EndDate>
    <Child/>
    </Parent>
    <?xml version="1.0" ?><Child xmlns:ns1="http://testSchema.com/outputSchema">
    <ns1:Id>20012981</ns1:Id>
    <ns1:Value/>
    <ns1:Date>01/15/2001</ns1:Date>
    </Child>
    <?xml version="1.0" ?><Child xmlns:ns1="http://testSchema.com/outputSchema">
    <ns1:Id>20012981</ns1:Id>
    <ns1:Value/>
    <ns1:Date>01/15/2001</ns1:Date>
    </Child>
    I am using one file adapter partner link to write the parent and another one to write the child elements, since the child elements repeat more than once. One other way to fix this issue is to write the data using a Java embedded activity, but that would need a lot of boilerplate code for writing/reading/appending the same file and for handling all those IOExceptions and buffered I/O. I am curious whether someone else has had this issue and how they resolved it.

  • Data conversion while exporting data into flat files using the Export Wizard in SSIS

    Hi,
    While exporting data to a flat file through the Export Wizard, the source table has NVARCHAR columns.
    Could you please help me with how to handle the data conversion when using the Export Wizard?
    Thanks.

    Hi Avs sai,
    By default, the columns in the destination flat file will be non-Unicode columns, i.e. the data type of the columns will be DT_STR. If you want to keep the original DT_WSTR data type of the input columns when writing to the destination file, you can check
    the Unicode option on the "Choose a Destination" page of the SQL Server Import and Export Wizard. Then, on the "Configure Flat File Destination" page, you can click the "Edit Mappings..." button to check the data types.
    Regards,
    Mike Yin
    TechNet Community Support
