Issue while generating a CSV file through Oracle.

Hi,
I am generating a CSV file using an Oracle stored procedure.
In the procedure I need to create a CSV file containing the data of a select query, and then email this file. The problem is that one of the fields contains a comma, e.g. 'ABC,DE'. In the CSV file, 'ABC' then appears in one cell and 'DE' is shifted into the adjacent cell, which is not what is required.
Thanks.

Hi Welcome,
If your data contains commas, then make ~ your delimiter,
or replace all the commas in your data:
select replace(column_name,',','') from table_name; /* before writing to the file */
Regards,
Bhushan
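
A third option that keeps the commas in the data is the standard CSV convention: enclose each field in double quotes, doubling any embedded quotes. A minimal sketch, assuming hypothetical table and column names:

    -- Quote each field so embedded commas survive in Excel and other CSV readers;
    -- a double quote inside the data is escaped by doubling it (CSV convention).
    select '"' || replace(customer_name, '"', '""') || '"' || ',' ||
           '"' || replace(city,          '"', '""') || '"' as csv_line
    from   customers;

Most spreadsheet tools will then show ABC,DE in a single cell instead of splitting it across two.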

Similar Messages

  • Issues while generating Schema DAT files

    We are facing two types of issue when generating schema ".dat" files from an Informix database on Solaris OS using the
    "IDS9_DSML_SCRIPT.sh" file.
    We are executing the command at the Solaris prompt as follows:
    "IDS9_DSML_SCRIPT.sh <DBName> <DB Server Name>".
    The first issue is that after the command is executed, while generating the ".dat" files, the following error occurs. This error occurs for many tables:
    19834: Error in unload due to invalid data : row number 1.
    Error in line 1
    Near character position 54
    Database closed.
    This happens randomly for some schemas, so we shift the script to a different folder in Unix and execute it again.
    Can we get a solution for avoiding this error?
    2. The second issue is as follows:
    When the ".dat" files are generated without any errors using the script, these .dat files are provided to the OMWB tool to load the Source Model.
    The issue here is that sometimes OMWB is not able to complete the process of creating the Source Model from the .dat files and gets stuck.
    Sometimes the tables are loaded, but with wrong names.
    For example, the .dat file has the table name s/ysmenus for the sysmenus table,
    and when loaded into Oracle the table is created with the name s_ysmenus.
    Based on our analysis and understanding, this error occurs due to the "delimiter".
    For example, this is a snippet from a .dat file generated by the IDS9_DSML_SCRIPT.sh script. The table name sysprocauthy is generated as s\ysprocauthy.
    In Oracle this table is created with the name s_ysprocauthy.
    s\ysprocauthy║yinformixy║y4194387y║y19y║y69y║y4y║y2y║y0y║y2005-03-31y║y65537y║yT
    y║yRy║yy║y16y║y16y║y0y║yy║yy║y╤y
    Thanks & Regards
    Ramanathan KrishnaMurthy

    Hello Rajesh,
    Thanks for your prompt reply. Please find my findings below:
    *) Have there been any changes in the extractor logic causing it to fail before the write out to file, since the last time you executed it successfully? - I am executing only the standard extractors out of the extractor kit, so presumably this shouldn't be an issue.
    *) Can this be an issue with changed authorizations? - I will check this today, but again this does not seem possible, as the same object executed fine for a different test project I created and a file was created.
    *) Has the export folder been locked or write protected at the OS level? Have the network settings (if it's a virtual directory) changed? - It does not seem so, for the same reason as above.
    I will do some analysis today and get back to you.
    Regards
    Gundeep

  • Column heading missing while generating a .csv file from a .sql file

    Hi all,
    Now that I am able to generate a .csv file by executing my .sql file, the column heading is missing in it. Please advise. I have used the following parameters in my query:
    set linesize 1000
    set colsep ','
    set echo off
    set feedback off
    set pagesize 0
    set trimspool on
    spool /path/file.csv
    select ...... from .... where .....;
    spool off
    exit

    set pagesize 0 <-- your problem
    You must set it to a high value (maximum 50000).
    see:
    SQL> select * from dual;
    D
    -
    X
    SQL> set pagesize 0
    SQL> select * from dual;
    X
    SQL> set pagesize 50000
    SQL> select * from dual;
    D
    -
    X
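
    Putting it together, the original spool script only needs the pagesize change (and, if headings were switched off elsewhere, set heading on). A sketch, with the path and query left as placeholders:

        set linesize 1000
        set colsep ','
        set echo off
        set feedback off
        set pagesize 50000
        set heading on
        set trimspool on
        spool /path/file.csv
        select ...... from .... where .....;
        spool off
        exit

    With a non-zero pagesize, SQL*Plus prints the column headings, so they appear at the top of the spooled file.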

  • Issue while loading a csv file using sql*loader...

    Hi,
    I am loading a csv file using sql*loader.
    On the number columns where data is populated (decimal numbers/integers), the row errors out with the error:
    ORA-01722: invalid number
    I tried checking the value picked up from the Excel file,
    and found chr(13), chr(32) and chr(10) characters in the value.
    e.g. select length('0.21') from dual gives a value of 7.
    When I checked each character, e.g.
    select ascii(substr('0.21',5,1)) from dual, it returns a value of 9, etc.
    I tried the following expression,
    "to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
    to remove all the non-numeric special characters, but I am still facing the error.
    Please let me know of any solution for this error.
    Thanks in advance.
    Kiran

    control file:
    OPTIONS (ROWS=1, ERRORS=10000)
    LOAD DATA
    CHARACTERSET WE8ISO8859P1
    INFILE '$Xx_TOP/bin/ITEMS.csv'
    APPEND INTO TABLE XXINF.ITEMS_STAGE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
    (
    ItemNum                    "trim(replace(replace(:ItemNum,chr(9),''),chr(13),''))",
    cross_ref_old_item_num               "trim(replace(replace(:cross_ref_old_item_num,chr(9),''),chr(13),''))",
    Mas_description               "trim(replace(replace(:Mas_description,chr(9),''),chr(13),''))",
    Mas_long_description               "trim(replace(replace(:Mas_long_description,chr(9),''),chr(13),''))",
    Org_description               "trim(replace(replace(:Org_description,chr(9),''),chr(13),''))",
    Org_long_description               "trim(replace(replace(:Org_long_description,chr(9),''),chr(13),''))",
    user_item_type                    "trim(replace(replace(:user_item_type,chr(9),''),chr(13),''))",
    organization_code               "trim(replace(replace(:organization_code,chr(9),''),chr(13),''))",
    primary_uom_code               "trim(replace(replace(:primary_uom_code,chr(9),''),chr(13),''))",
    inv_default_item_status          "trim(replace(replace(:inv_default_item_status,chr(9),''),chr(13),''))",
    inventory_item_flag               "trim(replace(replace(:inventory_item_flag,chr(9),''),chr(13),''))",
    stock_enabled_flag               "trim(replace(replace(:stock_enabled_flag,chr(9),''),chr(13),''))",
    mtl_transactions_enabled_flag          "trim(replace(replace(:mtl_transactions_enabled_flag,chr(9),''),chr(13),''))",
    revision_qty_control_code          "trim(replace(replace(:revision_qty_control_code,chr(9),''),chr(13),''))",
    reservable_type               "trim(replace(replace(:reservable_type,chr(9),''),chr(13),''))",
    check_shortages_flag               "trim(replace(replace(:check_shortages_flag,chr(9),''),chr(13),''))",
    shelf_life_code               "trim(replace(replace(replace(replace(:shelf_life_code,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    shelf_life_days               "trim(replace(replace(replace(replace(:shelf_life_days,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    lot_control_code               "trim(replace(replace(:lot_control_code,chr(9),''),chr(13),''))",
    auto_lot_alpha_prefix               "trim(replace(replace(:auto_lot_alpha_prefix,chr(9),''),chr(13),''))",
    start_auto_lot_number               "trim(replace(replace(:start_auto_lot_number,chr(9),''),chr(13),''))",
    negative_measurement_error          "trim(replace(replace(replace(replace(:negative_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    positive_measurement_error          "trim(replace(replace(replace(replace(:positive_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    serial_number_control_code          "trim(replace(replace(:serial_number_control_code,chr(9),''),chr(13),''))",
    auto_serial_alpha_prefix          "trim(replace(replace(:auto_serial_alpha_prefix,chr(9),''),chr(13),''))",
    start_auto_serial_number          "trim(replace(replace(:start_auto_serial_number,chr(9),''),chr(13),''))",
    location_control_code               "trim(replace(replace(:location_control_code,chr(9),''),chr(13),''))",
    restrict_subinventories_code          "trim(replace(replace(:restrict_subinventories_code,chr(9),''),chr(13),''))",
    restrict_locators_code               "trim(replace(replace(:restrict_locators_code,chr(9),''),chr(13),''))",
    bom_enabled_flag               "trim(replace(replace(:bom_enabled_flag,chr(9),''),chr(13),''))",
    costing_enabled_flag               "trim(replace(replace(:costing_enabled_flag,chr(9),''),chr(13),''))",
    inventory_asset_flag               "trim(replace(replace(:inventory_asset_flag,chr(9),''),chr(13),''))",
    default_include_in_rollup_flag          "trim(replace(replace(:default_include_in_rollup_flag,chr(9),''),chr(13),''))",
    cost_of_goods_sold_account          "trim(replace(replace(:cost_of_goods_sold_account,chr(9),''),chr(13),''))",
    std_lot_size                    "trim(replace(replace(replace(replace(:std_lot_size,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    sales_account                    "trim(replace(replace(:sales_account,chr(9),''),chr(13),''))",
    purchasing_item_flag               "trim(replace(replace(:purchasing_item_flag,chr(9),''),chr(13),''))",
    purchasing_enabled_flag          "trim(replace(replace(:purchasing_enabled_flag,chr(9),''),chr(13),''))",
    must_use_approved_vendor_flag          "trim(replace(replace(:must_use_approved_vendor_flag,chr(9),''),chr(13),''))",
    allow_item_desc_update_flag          "trim(replace(replace(:allow_item_desc_update_flag,chr(9),''),chr(13),''))",
    rfq_required_flag               "trim(replace(replace(:rfq_required_flag,chr(9),''),chr(13),''))",
    buyer_name                    "trim(replace(replace(:buyer_name,chr(9),''),chr(13),''))",
    list_price_per_unit               "trim(replace(replace(replace(replace(:list_price_per_unit,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    taxable_flag                    "trim(replace(replace(:taxable_flag,chr(9),''),chr(13),''))",
    purchasing_tax_code               "trim(replace(replace(:purchasing_tax_code,chr(9),''),chr(13),''))",
    receipt_required_flag               "trim(replace(replace(:receipt_required_flag,chr(9),''),chr(13),''))",
    inspection_required_flag          "trim(replace(replace(:inspection_required_flag,chr(9),''),chr(13),''))",
    price_tolerance_percent          "trim(replace(replace(replace(replace(:price_tolerance_percent,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    expense_account               "trim(replace(replace(:expense_account,chr(9),''),chr(13),''))",
    allow_substitute_receipts_flag          "trim(replace(replace(:allow_substitute_receipts_flag,chr(9),''),chr(13),''))",
    allow_unordered_receipts_flag          "trim(replace(replace(:allow_unordered_receipts_flag,chr(9),''),chr(13),''))",
    receiving_routing_code               "trim(replace(replace(:receiving_routing_code,chr(9),''),chr(13),''))",
    inventory_planning_code          "trim(replace(replace(:inventory_planning_code,chr(9),''),chr(13),''))",
    min_minmax_quantity               "trim(replace(replace(replace(replace(:min_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    max_minmax_quantity               "trim(replace(replace(replace(replace(:max_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    planning_make_buy_code               "trim(replace(replace(:planning_make_buy_code,chr(9),''),chr(13),''))",
    source_type                    "trim(replace(replace(:source_type,chr(9),''),chr(13),''))",
    mrp_safety_stock_code               "trim(replace(replace(:mrp_safety_stock_code,chr(9),''),chr(13),''))",
    material_cost                    "trim(replace(replace(:material_cost,chr(9),''),chr(13),''))",
    mrp_planning_code               "trim(replace(replace(:mrp_planning_code,chr(9),''),chr(13),''))",
    customer_order_enabled_flag          "trim(replace(replace(:customer_order_enabled_flag,chr(9),''),chr(13),''))",
    customer_order_flag               "trim(replace(replace(:customer_order_flag,chr(9),''),chr(13),''))",
    shippable_item_flag               "trim(replace(replace(:shippable_item_flag,chr(9),''),chr(13),''))",
    internal_order_flag               "trim(replace(replace(:internal_order_flag,chr(9),''),chr(13),''))",
    internal_order_enabled_flag          "trim(replace(replace(:internal_order_enabled_flag,chr(9),''),chr(13),''))",
    invoice_enabled_flag               "trim(replace(replace(:invoice_enabled_flag,chr(9),''),chr(13),''))",
    invoiceable_item_flag               "trim(replace(replace(:invoiceable_item_flag,chr(9),''),chr(13),''))",
    cross_ref_ean_code               "trim(replace(replace(:cross_ref_ean_code,chr(9),''),chr(13),''))",
    category_set_intrastat               "trim(replace(replace(:category_set_intrastat,chr(9),''),chr(13),''))",
    CustomCode                    "trim(replace(replace(:CustomCode,chr(9),''),chr(13),''))",
    net_weight                    "trim(replace(replace(replace(replace(:net_weight,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    production_speed               "trim(replace(replace(:production_speed,chr(9),''),chr(13),''))",
    LABEL                         "trim(replace(replace(:LABEL,chr(9),''),chr(13),''))",
    comment1_org_level               "trim(replace(replace(:comment1_org_level,chr(9),''),chr(13),''))",
    comment2_org_level               "trim(replace(replace(:comment2_org_level,chr(9),''),chr(13),''))",
    std_cost_price_scala               "to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
    supply_type                    "trim(replace(replace(:supply_type,chr(9),''),chr(13),''))",
    subinventory_code               "trim(replace(replace(:subinventory_code,chr(9),''),chr(13),''))",
    preprocessing_lead_time          "trim(replace(replace(replace(replace(:preprocessing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
    processing_lead_time                "trim(replace(replace(replace(replace(:processing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
    wip_supply_locator               "trim(replace(replace(:wip_supply_locator,chr(9),''),chr(13),''))"
    )
    Sample data from the csv file:
    "9901-0001-35","390000","JMKL16 Pipe bend 16 mm","","JMKL16 Putkikaari 16 mm","","AI","FJE","Ea","","","","","","","","","","","","","","","","","","","","","","","","","21-21100-22200-00000-00000-00-00000-00000","0","21-11100-22110-00000-00000-00-00000-00000","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","0.1","Pull","AFTER PROD","","","Locator for Production"
    The load errors out especially on two columns:
    1) std_cost_price_scala
    2) list_price_per_unit
    Both are number columns. When there is data provided in them, the load errors out; if they hold null values, the records go through fine.
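
    As a side note (a sketch, not from the original thread): rather than chaining replace() calls for each control character, a single REGEXP_REPLACE that strips everything except digits, the decimal point and a sign is often easier to maintain in the control file:

        -- Keep only characters that can legally appear in a number;
        -- chr(9)/chr(10)/chr(13)/chr(32) and any other junk go in one pass.
        std_cost_price_scala "to_number(regexp_replace(:std_cost_price_scala, '[^0-9.+-]', ''))",
        list_price_per_unit  "to_number(regexp_replace(:list_price_per_unit,  '[^0-9.+-]', ''))",

    If ORA-01722 still appears after that, dumping the raw bytes of a failing value (select dump(col) ... on a staging copy) will reveal characters such as a non-breaking space, chr(160), that the chr(9)/chr(13)/chr(32)/chr(10) replacements do not cover.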

  • Error while generating a flat file from oracle database

    I have imported three knowledge modules for the interface:
    LKM SQL to SQL
    IKM SQL to FILE Append
    CKM oracle
    I have tried executing the interface in the following ways:
    1. Checked "staging area different from target" and chose SUNOPSIS_MEMORY_ENGINE. Three boxes appeared in the flow, but I could not see any of the knowledge modules in the flow and could not select them from the drop-down either.
    All three boxes were showing the error.
    2. Checked "staging area different from target" and chose an Oracle logical schema. My flow has two boxes.
    For the source I have given LKM SQL to SQL and for the target I have given IKM SQL to FILE Append.
    But the interface errors out while creating the load table; the error message is "missing parameter".
    Is there anything I have to do apart from this?

    If your output file has a Date or Numeric field, then there is a chance of failure. Try making all the output fields Varchar2.
    -app
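
    In practice that means converting the non-character source columns explicitly in the interface mapping before IKM SQL to FILE Append writes them out. A minimal sketch with hypothetical column names:

        -- Cast dates and numbers to text so the file-side datastore can treat every field as a string.
        select to_char(order_id)                  as order_id,
               to_char(order_date, 'YYYY-MM-DD')  as order_date,
               to_char(amount, 'FM999999990.00')  as amount
        from   orders;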

  • Facing issue while running form on desktop through Oracle Forms Builder 10g

    Dear All,
    I am facing a "No plugin to show content" error while trying to run an Oracle form on the desktop through Forms Builder 10g. Please help.
    Thanks,
    Pradeep

    "I am facing a No plugin to show content error while trying to run an Oracle form on the desktop through Forms Builder 10g." What is your OS version?
    What Java version are you using?
    Have you configured your Oracle Developer Suite (ODS) 10g to run forms from the Forms Builder?
    Craig...

  • Deadlock with thread issues while generating reports with Crystal Report XI

    We are facing a deadlock with thread issues while generating a report with Crystal Reports XI.
    The version number is 11.0 and the database used is Oracle.
    In the log file at line number 74350, at 2008/12/16 13:35:54, there is a deadlock: Thread '4' is waiting to acquire the lock for 'com.crystaldecisions.reports.queryengine.av@15214b9', which is held by Thread '0'.
    And a deadlock: Thread '0' is waiting to acquire the lock for 'com.crystaldecisions.reports.queryengine.av@15214b9', which is held by Thread '4'.
    Exactly 10 minutes later, at 2008/12/16 13:45:54, we can see that threads 4 and 0 are declared as STUCK.
    Is this an existing issue with Crystal Report?
    Is there some solution for this problem?
    THE LOG FILE INFORMATION IS GIVEN BELOW
    [deadlocked thread] [ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)':
    Thread '[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)'' is waiting to acquire lock 'com.crystaldecisions.reports.queryengine.av@15214b9' that is held by thread '[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)''
    Stack trace:
         com.crystaldecisions.reports.queryengine.av.V(Unknown Source)
         com.crystaldecisions.reports.queryengine.av.do(Unknown Source)
         com.crystaldecisions.reports.queryengine.as.if(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.datainterface.j.c(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.datainterface.j.a(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.datainterface.j.a(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.cy.b(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.cy.long(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.a1.o(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.a1.a(Unknown Source)
         com.crystaldecisions.reports.common.ab.a(Unknown Source)
         com.crystaldecisions.reports.common.ab.if(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.a1.if(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.a1.o(Unknown Source)
         com.crystaldecisions.reports.reportengineinterface.a.a(Unknown Source)
         com.crystaldecisions.reports.reportengineinterface.a.a.b.a(Unknown Source)
         com.crystaldecisions.reports.sdk.ReportClientDocument.open(Unknown Source)
         com.sysarris.aris.crystalreports.RepServlet.generateReport(RepServlet.java:65)
         com.sysarris.aris.crystalreports.RepServlet.doPost(RepServlet.java:40)
         javax.servlet.http.HttpServlet.service(HttpServlet.java:763)
         javax.servlet.http.HttpServlet.service(HttpServlet.java:856)
         weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:225)
         weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:127)
         weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:272)
         weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:165)
         weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3153)
         weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121)
         weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:1973)
         weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:1880)
         weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1310)
         weblogic.work.ExecuteThread.execute(ExecuteThread.java:207)
         weblogic.work.ExecuteThread.run(ExecuteThread.java:179)
    [deadlocked thread] [ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)':
    Thread '[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'' is waiting to acquire lock 'com.crystaldecisions.reports.queryengine.av@12e0415' that is held by thread '[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)''
    Stack trace:
         com.crystaldecisions.reports.queryengine.av.V(Unknown Source)
         com.crystaldecisions.reports.queryengine.av.do(Unknown Source)
         com.crystaldecisions.reports.queryengine.as.if(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.datainterface.j.c(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.datainterface.j.a(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.datainterface.j.a(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.cy.b(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.cy.long(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.a1.o(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.a1.a(Unknown Source)
         com.crystaldecisions.reports.common.ab.a(Unknown Source)
         com.crystaldecisions.reports.common.ab.if(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.a1.if(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.a1.o(Unknown Source)
         com.crystaldecisions.reports.reportengineinterface.a.a(Unknown Source)
         com.crystaldecisions.reports.reportengineinterface.a.a.b.a(Unknown Source)
         com.crystaldecisions.reports.sdk.ReportClientDocument.open(Unknown Source)
         com.sysarris.aris.crystalreports.RepServlet.generateReport(RepServlet.java:65)
         com.sysarris.aris.crystalreports.RepServlet.doPost(RepServlet.java:40)
         javax.servlet.http.HttpServlet.service(HttpServlet.java:763)
         javax.servlet.http.HttpServlet.service(HttpServlet.java:856)
         weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:225)
         weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:127)
         weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:272)
         weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:165)
         weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3153)
         weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121)
         weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:1973)
         weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:1880)
         weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1310)
         weblogic.work.ExecuteThread.execute(ExecuteThread.java:207)
         weblogic.work.ExecuteThread.run(ExecuteThread.java:179)
    Can you please suggest any workaround for this?

    I'm not referring to Servlet threading issues.
    I'll clarify.
    You have two threads, both entering the ReportClientDocument.open(...) method.
    Thread 4 is waiting to acquire 'com.crystaldecisions.reports.queryengine.av@15214b9'.
    Thread 0 is waiting to acquire 'com.crystaldecisions.reports.queryengine.av@12e0415'.
    So I'm wondering: are they the same objects?
    My specific question concerning the ReportClientDocument is that both are calling open - i.e., trying to open a new report. You wouldn't be trying to open different reports using the same ReportClientDocument - so I was wondering if you've cached the RCD and are trying to open two different reports at the same time on the same instance via different threads.
    You'd normally tie a ReportClientDocument instance to an HTTP Session, to ensure each user gets their own copy.
    Sincerely,
    Ted Ueda

  • Getting "duplicate object existing" issue while deploying the BIAR file

    Hi All,
    We are trying to deploy a BIAR file with the XI R2 command-line tool InstallEntSdkWrapper, but we are getting a "duplicate object existing" issue while deploying the BIAR file.
    Error Message:
    [report] [InstallEntSdkWrapper.main] Connecting to CMS plmdevapp31:6400 as administrator
       [report] [InstallEntSdkWrapper.CmsImportFile] Exception: An error occurred at the server :
       [report] Failed to commit objects to server : Duplicate object name in the same folder.
       [report]
       [report] [InstallEntSdkWrapper.main] BIAR File could not be imported
    If we do any promotion with the Import Wizard, we have an "Overwrite object contents" option to overwrite existing objects. It would be very helpful if anyone could suggest how we can achieve this through InstallEntSdkWrapper.
    Unfortunately there is no documentation available on InstallEntSdkWrapper.
    Cheers!

    That's a limitation with the XI Release 2 InstallEntSdkWrapper.jar tool.
    Sincerely,
    Ted Ueda

  • Can we create .CSV files through documaker

    Hi,
    Is it possible to create .csv files through Documaker, apart from PDFs?
    Thanks

    With apologies, it is still unclear (to me) what you are asking.
    It is possible to create files using DAL, but to offer more specific guidance requires a better understanding of what it is that you want.
    A CSV is a Comma-Separated Values file. A PDF file is a printed document file containing formatted text content, graphics, lines, boxes, etc. - in other words, a fully composed document. I'm not exactly sure how a PDF and a CSV file could contain the same data.
    Are you perhaps asking to get an index of the created PDF files written as a CSV file? Something that contains a row of key transaction data along with the file name of the generated PDF? Perhaps something akin to the batch (BCH) files that can be produced per transaction recipient?
    Are you asking for an export file of all the field data that went into the creation of the document and included in the PDF? Normally a CSV has a header that describes the columns and then each row of data would be consistent with that header. Without knowing that every PDF file you create will have exactly the same number of fields defined, it is difficult to imagine that a single CSV file would be sufficient. And depending upon the number of fields and the size of the data assigned to each, this could be quite a long record per PDF in the resulting CSV file.
    Or perhaps you had in mind to get a name / value pair written for each field, with the data written on separate lines? This would not be a true CSV file, but it could have the names and values separated by commas if that is what you require.
    You have something specific in mind and yet, we are not in your mind.  We need more specific information on exactly what you are trying to accomplish.

  • How to import data from excel or csv files to Oracle table

    hello everybody,
    I am new here and new to Oracle. I would like to know the steps to import data from Excel or csv files into an Oracle table.
    Let's say I already have the table inside Oracle. Then my user gives me the sets of data inside an Excel worksheet.
    So, how can I import the Excel data into the Oracle table?
    Thank you in advance.
    cheers,
    shima

    Even easier: download JDeveloper 11g from this site.
    Set up the database connection, right-click on the table, select Import->Excel and specify your file to load. On the import pop-up, you must view and update each tab: Columns, Data Types, and DML.
    Columns -- move the columns that you want to load to the box on the right
    Data Types -- for each column of the import file, select the table column into which its data should load
    DML -- click this tab to generate the INSERT SQL
    Once done, click 'Insert'.
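
    If you would rather script the load than use a GUI, an external table is another common route (a sketch, assuming the worksheet has been saved from Excel as a csv into a directory the database server can read, and using hypothetical table and column names):

        -- Directory object pointing at the folder that holds the csv file
        create directory data_dir as '/u01/app/loads';

        -- External table that reads the csv directly
        create table emp_ext (
          emp_id   number,
          emp_name varchar2(100)
        )
        organization external (
          type oracle_loader
          default directory data_dir
          access parameters (
            records delimited by newline
            fields terminated by ',' optionally enclosed by '"'
          )
          location ('emp.csv')
        );

        -- Copy the rows into the real table in one statement
        insert into emp select * from emp_ext;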

  • Issue while Processing the Huge File in BPEL

    Hi,
    We are facing an issue while processing a huge file in a BPEL process (more than 1 MB). When I test files with more than 1500 transactions (more than 1 MB), the BPEL process automatically goes into OFF mode, or the messages go to the Manual Recovery queue.
    We are facing this issue in production as well, so we are using a UNIX script to split the file before placing it in the BPEL input directory. Any pointers to resolve this issue would be helpful.
    Thanks,
    Saravana

    Hi,
    Please find the answers:
    1. Currently we are using SOA 10.1.2 and JDeveloper 10g.
    2. We are using the File Adapter.
    3. Yes, we used debatching.
    4. Yes, I am able to recover from the Manual Recovery queue.
    5. Please find the error message:
    <2009-05-21 04:32:38,461> <DEBUG> <ESIBT.collaxa.cube.engine.dispatch> <Dispatcher::adjustThreadPool> Allocating 1 thread(s); pending threads: 1, active threads: 0, total: 83
    <2009-05-21 04:32:44,077> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> File : /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B is ready to be processed.
    <2009-05-21 04:32:44,077> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Processing file : /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B
    <2009-05-21 04:32:44,077> <DEBUG> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> onBatchBegin: Batch 'bpel://localhost/ESIBT/BPELProcess_810~1.0//Input5162009.B2B_1242894594000' (/harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B) starting...
    <2009-05-21 04:32:44,077> <DEBUG> <ESIBT.collaxa.cube.translation> <TranslatorFactory::log> Inside TranslatorFactory
    <2009-05-21 04:32:44,078> <DEBUG> <ESIBT.collaxa.cube.translation> <TranslatorFactory::log> using version attribute = NXSD
    <2009-05-21 04:32:44,078> <DEBUG> <ESIBT.collaxa.cube.translation> <TranslatorFactory::log> loading xlator class...oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl
    <2009-05-21 04:32:44,081> <DEBUG> <ESIBT.collaxa.cube.translation> <TranslatorFactory::log> class loaded
    <2009-05-21 04:32:44,081> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Created translator : oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl@46908ae8
    <2009-05-21 04:32:44,098> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Setting up Control dir for debatching error recovery
    <2009-05-21 04:32:44,121> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Control dir for debatching error recovery : /opt01/app/ESIBT/oracle/esibt10gR2iAS/BPEL10gR2/iAS/j2ee/home/fileftp/controlFiles/localhost_ESIBT_BPELProcess_810~1.0_/inbound
    <2009-05-21 04:32:44,121> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Invoking inbound translation for : Input5162009.B2B
    <2009-05-21 04:32:44,121> <DEBUG> <ESIBT.collaxa.cube.translation> <NXSDTranslatorImpl::log> Starting translateFromNative
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.translation> <NXSDTranslatorImpl::log> Done with translateFromNative
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Completed inbound translation for : Input5162009.B2B
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> isTextFile : true
    <2009-05-21 04:32:44,139> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Translated inbound batch index 1 of file {Input5162009.B2B} with corrupted message count = 1
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Error Reader created using charset :ASCII
    <2009-05-21 04:32:44,139> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Sending message to Adapter Framework for rejection to user-configured rejection handlers : {
    fileName=/harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B, startLine=1, startColumn=1, endLine=-1, endCol=-1, Exception=ORABPEL-11167
    Error while reading native data.
    [Line=1, Col=70] Expected "\t" at the specified position in the native data, while trying to read the data for "element with name HDR_STORE_NUM", using "style" as "terminated" and "terminatedBy" as "\t", but not found.
    Ensure that "\t", exists at the specified position in the native data.
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Setting batchId in NativeRecord to bpel://localhost/ESIBT/BPELProcess_810~1.0//Input5162009.B2B_1242894594000
    <2009-05-21 04:32:44,139> <WARN> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> [Read_ptt::Read(Object)] - onReject: The resource adapter 'File Adapter' requested handling of a malformed inbound message. However, the following bpel.xml activation property has not been defined: 'rejectedMessageHandlers'. Please define it and redeploy the business process. Will use the default Rejection Directory file:///opt01/app/ESIBT/oracle/esibt10gR2iAS/BPEL10gR2/iAS/integration/orabpel/domains/ESIBT/archive/jca/BPELProcess_810/rejectedMessages for now.
    <2009-05-21 04:32:44,140> <WARN> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> [Read_ptt::Read(Object)] - onReject: Sending invalid inbound message to Exception Handler:
    <2009-05-21 04:32:44,140> <INFO> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> Handing rejected message to DEFAULT rejection handler: file:///opt01/app/ESIBT/oracle/esibt10gR2iAS/BPEL10gR2/iAS/integration/orabpel/domains/ESIBT/archive/jca/BPELProcess_810/rejectedMessages since none of the configured rejection handlers [] succeeded.
    <2009-05-21 04:32:44,140> <DEBUG> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> Finished persisting rejected message to file system under the name: /opt01/app/ESIBT/oracle/esibt10gR2iAS/BPEL10gR2/iAS/integration/orabpel/domains/ESIBT/archive/jca/BPELProcess_810/rejectedMessages/INVALID_MSG_BPELProcess_810_Read_20090521_043244_0140.dat
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Setting last error record to : -1
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Translator has failed to translate any message from batch number: 1
    <2009-05-21 04:32:44,141> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Message not published as translation failed: {
    File=/harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B, batchIndex=1, PublishSize=1
    <2009-05-21 04:32:44,141> <ERROR> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> onBatchFailure: Batch 'bpel://localhost/ESIBT/BPELProcess_810~1.0//Input5162009.B2B_1242894594000' (/harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B) has failed due to: ORABPEL-11167
    Error while reading native data.
    [Line=1, Col=70] Expected "\t" at the specified position in the native data, while trying to read the data for "element with name HDR_STORE_NUM", using "style" as "terminated" and "terminatedBy" as "\t", but not found.
    Ensure that "\t", exists at the specified position in the native data.
    <2009-05-21 04:32:44,141> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Deleting file : /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B after processing.
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Deleting file : Input5162009.B2B
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Deleted file : true
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Removing file /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B from files to be processed Map.
    <2009-05-21 04:32:44,141> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Done processing File : /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B
    <2009-05-21 04:33:09,698> <DEBUG> <ESIBT.collaxa.cube.engine.data> <ConnectionFactory::getConnection> GOT CONNECTION 4 Autocommit = false
    This error message shows that the file is rejected because of a missing \t, but I am facing the same issue for all files where the load is huge.
    Thanks,
    Saravana

  • 7303030 patch - an error occurred while generating forms library files

    Hi,
    While applying the 12.1.1 upgrade patch 7303030, I got an error at the A1000 phase:
    "An error occurred while generating forms library files" - continue as if it were successful?
    As we can generate forms after completion of the patch, I selected Yes.
    Later, I got "An error occurred while generating oracle forms files"; here also I selected to continue.
    Please tell me, is this the right way to continue the patch?
    I didn't find any errors in the worker log files; the error was reported after all workers had quit.
    What necessary steps do I need to perform after completion of the patch?
    Thanks,

    Hi,
    The errors are due to the missing prerequisite patch 6400501:
    Forms 10.1.2.3 Compilation against an 11.2 Fails with DB PL / SQL ERROR 801 ... internal error [60603] [ID 1065020.1]
    After applying the patch, forms were generated successfully.
    I applied patch 6400501 but still get internal error [60603]; is there something more to be done?
    Thank you.

  • How to get output generated as a csv file by reading with a buffered reader and writer

    How do I get output generated as a csv file by reading it with a buffered reader and writer?

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.net.URL;

    // Note: getResource() resolves against the classpath; for a plain file path,
    // new BufferedReader(new FileReader(file_location)) would be the more direct choice.
    String file_location = "C:\\temp\\csv.txt";
    try {
        URL fileURL = getClass().getResource(file_location);
        if (fileURL != null) {
            BufferedReader br = new BufferedReader(new InputStreamReader(fileURL.openStream()));
            String s = br.readLine();
            while (s != null) {
                if (!s.equals("")) {
                    System.out.println(s);
                }
                s = br.readLine();
            }
            br.close();
        } else {
            // error: resource not found
        }
    } catch (IOException ex) {
        ex.printStackTrace();
    }

  • Create CSV file through a program in background

    Hello everyone!
    I'm facing a problem using class CL_GUI_FRONTEND_SERVICES to generate a csv file.
    I have a report in SCM that will export data from APO to some csv files.
    At the moment I'm using the GUI_DOWNLOAD method and it works perfectly! The problem is that this program is supposed to run in a background job, and as far as I have read and searched, GUI_DOWNLOAD and GUI_UPLOAD cannot run in background because of a "client-server" problem... well... ok...
    So I tried OPEN DATASET and it worked fine as well, but now comes another issue: OPEN DATASET only writes/reads files located on the application server (is that right?), and my file will be located somewhere else on the intranet.
    What should I use to generate these csv files in a background job?
    Thanks,
    Martin

    There isn't a "client-server" problem... background jobs have no connectivity to a GUI/presentation server and therefore no ability to upload/download to a desktop target.
    You can save, as noted, to any server visible to the SAP instance you're executing on. I would recommend that you look at a multi-step process:
    create your internal table;
    store that internal table in a file on your application server (comma-delimited is okay, but tab-delimited would be a much better choice);
    if you need to place the file where another system can access it,
    utilize the FTP capability in SAP to "copy" or "send" that file to an FTP target, after you have successfully written the file to the application server;
    if the file is to be retrieved by an SAP user,
    write or utilize an existing piece of code to grab the file, upload it into SAP and then utilize GUI_DOWNLOAD to transfer it to the user's desktop.
    This allows you to have a background process but still have the file accessible to an SAP user after it has been created. I have several processes that do this, and retrieval by the user is both easy and convenient, since the user is notified by my background job that the file has been created, etc.

  • Performance issue while generating Query

    Hi BI Gurus.
    I am facing a performance issue while generating a query on 0IC_C03.
    It has a (from & to) variable for generating the report for a particular time duration.
    If the (from & to) variable fields are filled, then after taking a long time it shows a runtime error.
    If the query is executed without the variable (which is optional), then the data is extracted from the beginning up to the current date, and this takes less time to execute.
    After that, the period has to be selected manually via the "keep filter value" option. Please suggest how I can solve the error.
    Regards
    Ritika

    Hi Ritika,
    Welcome to SDN.
    You have to check the following runtime segments using the ST03N tcode:
    High Database Runtime
    High OLAP Runtime
    High Frontend Runtime
    If it is high Database Runtime:
    - check the aggregates, or create aggregates on the cube; this will help you.
    If it is high OLAP Runtime:
    - check the user exits, if any.
    - check whether hierarchies are used and fetched at a deep level.
    If it is high Frontend Runtime:
    - check whether a very high number of cells and formattings are transferred to the frontend (use "All data" to get the value of "No. of Cells"), which causes high network and frontend (processing) runtime.
    For the From and To date variables, create one more set, use it, and try again.
    Regards,
    VACHAN
