Issue while loading a csv file using sql*loader...

Hi,
I am loading a CSV file using SQL*Loader.
On the number columns that have data populated in them (decimals/integers), the row errors out with the error -
ORA-01722: invalid number
I checked the values coming from the Excel source
and found chr(13), chr(32) and chr(10) characters embedded in them.
Ex: select length('0.21') from dual (run against the actual field value, which displays as 0.21) gives a value of 7.
When I checked each character, as in
select ascii(substr('0.21',5,1)) from dual, it returns a value of 9, etc.
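For reference, DUMP shows every byte of a value at once. This is only a diagnostic sketch; the literal below simply simulates a field that displays as 0.21 but carries trailing tab/CR/LF characters:
select dump('0.21' || chr(9) || chr(13) || chr(10)) as bytes
from dual;
-- returns something like: Typ=1 Len=7: 48,46,50,49,9,13,10
-- (48,46,50,49 spell out 0.21; the trailing 9,13,10 are the invisible tab/CR/LF)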
I tried the following expression in the control file,
"to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
to remove all the non-numeric special characters, but I am still facing the error.
Please let me know of any solution for this error.
Thanks in advance.
Kiran

control file:
OPTIONS (ROWS=1, ERRORS=10000)
LOAD DATA
CHARACTERSET WE8ISO8859P1
INFILE '$Xx_TOP/bin/ITEMS.csv'
APPEND INTO TABLE XXINF.ITEMS_STAGE
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
(
ItemNum                    "trim(replace(replace(:ItemNum,chr(9),''),chr(13),''))",
cross_ref_old_item_num               "trim(replace(replace(:cross_ref_old_item_num,chr(9),''),chr(13),''))",
Mas_description               "trim(replace(replace(:Mas_description,chr(9),''),chr(13),''))",
Mas_long_description               "trim(replace(replace(:Mas_long_description,chr(9),''),chr(13),''))",
Org_description               "trim(replace(replace(:Org_description,chr(9),''),chr(13),''))",
Org_long_description               "trim(replace(replace(:Org_long_description,chr(9),''),chr(13),''))",
user_item_type                    "trim(replace(replace(:user_item_type,chr(9),''),chr(13),''))",
organization_code               "trim(replace(replace(:organization_code,chr(9),''),chr(13),''))",
primary_uom_code               "trim(replace(replace(:primary_uom_code,chr(9),''),chr(13),''))",
inv_default_item_status          "trim(replace(replace(:inv_default_item_status,chr(9),''),chr(13),''))",
inventory_item_flag               "trim(replace(replace(:inventory_item_flag,chr(9),''),chr(13),''))",
stock_enabled_flag               "trim(replace(replace(:stock_enabled_flag,chr(9),''),chr(13),''))",
mtl_transactions_enabled_flag          "trim(replace(replace(:mtl_transactions_enabled_flag,chr(9),''),chr(13),''))",
revision_qty_control_code          "trim(replace(replace(:revision_qty_control_code,chr(9),''),chr(13),''))",
reservable_type               "trim(replace(replace(:reservable_type,chr(9),''),chr(13),''))",
check_shortages_flag               "trim(replace(replace(:check_shortages_flag,chr(9),''),chr(13),''))",
shelf_life_code               "trim(replace(replace(replace(replace(:shelf_life_code,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
shelf_life_days               "trim(replace(replace(replace(replace(:shelf_life_days,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
lot_control_code               "trim(replace(replace(:lot_control_code,chr(9),''),chr(13),''))",
auto_lot_alpha_prefix               "trim(replace(replace(:auto_lot_alpha_prefix,chr(9),''),chr(13),''))",
start_auto_lot_number               "trim(replace(replace(:start_auto_lot_number,chr(9),''),chr(13),''))",
negative_measurement_error          "trim(replace(replace(replace(replace(:negative_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
positive_measurement_error          "trim(replace(replace(replace(replace(:positive_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
serial_number_control_code          "trim(replace(replace(:serial_number_control_code,chr(9),''),chr(13),''))",
auto_serial_alpha_prefix          "trim(replace(replace(:auto_serial_alpha_prefix,chr(9),''),chr(13),''))",
start_auto_serial_number          "trim(replace(replace(:start_auto_serial_number,chr(9),''),chr(13),''))",
location_control_code               "trim(replace(replace(:location_control_code,chr(9),''),chr(13),''))",
restrict_subinventories_code          "trim(replace(replace(:restrict_subinventories_code,chr(9),''),chr(13),''))",
restrict_locators_code               "trim(replace(replace(:restrict_locators_code,chr(9),''),chr(13),''))",
bom_enabled_flag               "trim(replace(replace(:bom_enabled_flag,chr(9),''),chr(13),''))",
costing_enabled_flag               "trim(replace(replace(:costing_enabled_flag,chr(9),''),chr(13),''))",
inventory_asset_flag               "trim(replace(replace(:inventory_asset_flag,chr(9),''),chr(13),''))",
default_include_in_rollup_flag          "trim(replace(replace(:default_include_in_rollup_flag,chr(9),''),chr(13),''))",
cost_of_goods_sold_account          "trim(replace(replace(:cost_of_goods_sold_account,chr(9),''),chr(13),''))",
std_lot_size                    "trim(replace(replace(replace(replace(:std_lot_size,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
sales_account                    "trim(replace(replace(:sales_account,chr(9),''),chr(13),''))",
purchasing_item_flag               "trim(replace(replace(:purchasing_item_flag,chr(9),''),chr(13),''))",
purchasing_enabled_flag          "trim(replace(replace(:purchasing_enabled_flag,chr(9),''),chr(13),''))",
must_use_approved_vendor_flag          "trim(replace(replace(:must_use_approved_vendor_flag,chr(9),''),chr(13),''))",
allow_item_desc_update_flag          "trim(replace(replace(:allow_item_desc_update_flag,chr(9),''),chr(13),''))",
rfq_required_flag               "trim(replace(replace(:rfq_required_flag,chr(9),''),chr(13),''))",
buyer_name                    "trim(replace(replace(:buyer_name,chr(9),''),chr(13),''))",
list_price_per_unit               "trim(replace(replace(replace(replace(:list_price_per_unit,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
taxable_flag                    "trim(replace(replace(:taxable_flag,chr(9),''),chr(13),''))",
purchasing_tax_code               "trim(replace(replace(:purchasing_tax_code,chr(9),''),chr(13),''))",
receipt_required_flag               "trim(replace(replace(:receipt_required_flag,chr(9),''),chr(13),''))",
inspection_required_flag          "trim(replace(replace(:inspection_required_flag,chr(9),''),chr(13),''))",
price_tolerance_percent          "trim(replace(replace(replace(replace(:price_tolerance_percent,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
expense_account               "trim(replace(replace(:expense_account,chr(9),''),chr(13),''))",
allow_substitute_receipts_flag          "trim(replace(replace(:allow_substitute_receipts_flag,chr(9),''),chr(13),''))",
allow_unordered_receipts_flag          "trim(replace(replace(:allow_unordered_receipts_flag,chr(9),''),chr(13),''))",
receiving_routing_code               "trim(replace(replace(:receiving_routing_code,chr(9),''),chr(13),''))",
inventory_planning_code          "trim(replace(replace(:inventory_planning_code,chr(9),''),chr(13),''))",
min_minmax_quantity               "trim(replace(replace(replace(replace(:min_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
max_minmax_quantity               "trim(replace(replace(replace(replace(:max_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
planning_make_buy_code               "trim(replace(replace(:planning_make_buy_code,chr(9),''),chr(13),''))",
source_type                    "trim(replace(replace(:source_type,chr(9),''),chr(13),''))",
mrp_safety_stock_code               "trim(replace(replace(:mrp_safety_stock_code,chr(9),''),chr(13),''))",
material_cost                    "trim(replace(replace(:material_cost,chr(9),''),chr(13),''))",
mrp_planning_code               "trim(replace(replace(:mrp_planning_code,chr(9),''),chr(13),''))",
customer_order_enabled_flag          "trim(replace(replace(:customer_order_enabled_flag,chr(9),''),chr(13),''))",
customer_order_flag               "trim(replace(replace(:customer_order_flag,chr(9),''),chr(13),''))",
shippable_item_flag               "trim(replace(replace(:shippable_item_flag,chr(9),''),chr(13),''))",
internal_order_flag               "trim(replace(replace(:internal_order_flag,chr(9),''),chr(13),''))",
internal_order_enabled_flag          "trim(replace(replace(:internal_order_enabled_flag,chr(9),''),chr(13),''))",
invoice_enabled_flag               "trim(replace(replace(:invoice_enabled_flag,chr(9),''),chr(13),''))",
invoiceable_item_flag               "trim(replace(replace(:invoiceable_item_flag,chr(9),''),chr(13),''))",
cross_ref_ean_code               "trim(replace(replace(:cross_ref_ean_code,chr(9),''),chr(13),''))",
category_set_intrastat               "trim(replace(replace(:category_set_intrastat,chr(9),''),chr(13),''))",
CustomCode                    "trim(replace(replace(:CustomCode,chr(9),''),chr(13),''))",
net_weight                    "trim(replace(replace(replace(replace(:net_weight,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
production_speed               "trim(replace(replace(:production_speed,chr(9),''),chr(13),''))",
LABEL                         "trim(replace(replace(:LABEL,chr(9),''),chr(13),''))",
comment1_org_level               "trim(replace(replace(:comment1_org_level,chr(9),''),chr(13),''))",
comment2_org_level               "trim(replace(replace(:comment2_org_level,chr(9),''),chr(13),''))",
std_cost_price_scala               "to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
supply_type                    "trim(replace(replace(:supply_type,chr(9),''),chr(13),''))",
subinventory_code               "trim(replace(replace(:subinventory_code,chr(9),''),chr(13),''))",
preprocessing_lead_time          "trim(replace(replace(replace(replace(:preprocessing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
processing_lead_time                "trim(replace(replace(replace(replace(:processing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
wip_supply_locator               "trim(replace(replace(:wip_supply_locator,chr(9),''),chr(13),''))"
)
Sample data from csv file.
"9901-0001-35","390000","JMKL16 Pipe bend 16 mm","","JMKL16 Putkikaari 16 mm","","AI","FJE","Ea","","","","","","","","","","","","","","","","","","","","","","","","","21-21100-22200-00000-00000-00-00000-00000","0","21-11100-22110-00000-00000-00-00000-00000","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","0.1","Pull","AFTER PROD","","","Locator for Production"
The load errors out especially on two columns:
1) std_cost_price_scala
2) list_price_per_unit
Both are number columns.
When data is provided in them, the rows error out; if they hold null values, the records go through fine.

Similar Messages

  • Export batch data into CSV file using SQL SP

    Hi,
    I have created a WCF-Custom receive adapter to poll a SQL SP (WITH xmlnamespaces(DEFAULT 'Namespace') and FOR XML PATH(''), TYPE). I get the result properly in batches while polling, but I get an error while converting it into CSV using a map.
    Please can anyone give me an idea of how to export SQL data into a CSV file using an SP.

    How are you doing this?
    You would have got an XML representation of the batch received from SQL.
    You should have a flat-file schema representing the CSV file which you want to send.
    Map the received XML representation of the SQL data to the flat-file schema.
    Have a custom pipeline with a flat-file assembler in the Assemble stage of the send pipeline.
    In the send port, use the map which converts the received XML from SQL to the flat-file schema, together with that custom flat-file send pipeline.

  • Issue while generating a CSV file through Oracle.

    Hi,
    I am generating a CSV file using oracle stored procedure.
    In an Oracle procedure I need to create a CSV file, which contains the data of a select query, and then email this file. The problem is that one of the fields contains a comma, e.g. 'ABC,DE'. In the CSV file 'ABC' then shows in one cell and 'DE' is shifted to the adjacent cell, which is not what is required.
    Thanks.

    Hi, welcome.
    If your data contains commas then make ~ your delimiter,
    or replace all the commas in your data:
    select replace(column_name,',','') from table_name; /* before writing to the file */
    Regards,
    Bhushan
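    Another option (a sketch, not part of the original reply) is to keep the comma and enclose the text column in double quotes when building each CSV line, so spreadsheet tools keep 'ABC,DE' in a single cell:
    select '"' || replace(column_name, '"', '""') || '"' as quoted_col
    from table_name; /* double up any embedded quotes so the CSV stays well-formed */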

  • Column heading is not coming while generating a .csv file from .sql file

    Hi all,
    Now that I am able to generate a .csv file by executing my .sql file, the column heading is missing in it. Please advise. I have used the following parameters in my query:
    set linesize 1000
    set colsep ','
    set echo off
    set feedback off
    set pagesize 0
    set trimspool on
    spool /path/file.csv
    select ...... from .... where .....;
    spool off
    exit

    set pagesize 0 <-- your problem
    You must set it to a high value (max value 50000).
    see:
    SQL> select * from dual;
    D
    -
    X
    SQL> set pagesize 0
    SQL> select * from dual;
    X
    SQL> set pagesize 50000
    SQL> select * from dual;
    D
    -
    X
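    Putting the fix into the original script (only the pagesize line changes; everything else is kept from the post above):
    set linesize 1000
    set colsep ','
    set echo off
    set feedback off
    set pagesize 50000
    set trimspool on
    spool /path/file.csv
    select ...... from .... where .....;
    spool off
    exit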

  • Unknown issue while loading .dbf file by sql loader

    Hi guys,
    I am having an unknown issue while loading a .dbf file with SQL*Loader.
    I need to load the .dbf data into an Oracle table. For this I converted the .dbf file by just changing its extension to .csv. File structure after changing .dbf to .csv --
    C_N_NUMBER,COMP_CODE,CPT_CODE,C_N_AMT,CM_NUMBER
    1810/4,LKM,30,45,683196
    1810/5,LKM,30,45,683197
    1810/6,LKM,30,45,683198
    1810/7,LKM,30,135,683200
    1810/8,LKM,30,90,683201
    1810/9,LKM,1,45,683246
    1810/9,LKM,2,90,683246
    1810/10,LKF,1,90,683286
    2810/13,LKJ,1,50.5,680313
    2810/14,LKJ,1,50,680316
    1910/1,LKQ,1,90,680344
    3910/2,LKF,1,238.12,680368
    3910/3,LKF,1,45,680382
    3910/4,LKF,1,45,680395
    7910/5,LKS,1,45,680397
    7910/6,LKS,1,90,680400
    7910/7,LKS,1,45,680401
    7910/8,LKS,1,238.12,680414
    7910/9,LKS,1,193.12,680415
    7910/10,LKS,1,45,680490
    Then I am loading it with SQL*Loader, but I always get the error below ...
    Record 1: Rejected - Error on table C_N_DETL_TAB, column CPT_CODE.
    ORA-01438: value larger than specified precision allowed for this column
    Record 2: Rejected - Error on table C_N_DETL_TAB, column CPT_CODE.
    ORA-01438: value larger than specified precision allowed for this column
    table structure-
    create table C_N_DETL_tab
    (
    "C_N_NUMBER" VARCHAR2(13),
    "COMP_CODE" VARCHAR2(3),
    "CPT_CODE" NUMBER(4),
    "C_N_AMT" NUMBER(20,18),
    "CM_NUMBER" NUMBER(7)
    )
    control file-
    options(skip=1)
    load data
    infile '/softdump/pc/C_N_DETL.csv'
    badfile '/softdump/pc/C_N_DETL.bad'
    discardfile '/softdump/pc/C_N_DETL.dsc'
    into table C_N_DETL_tab
    truncate
    FIELDS TERMINATED BY ","
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
    C_N_NUMBER CHAR,
    COMP_CODE CHAR,
    CPT_CODE INTEGER,
    C_N_AMT INTEGER,
    CM_NUMBER INTEGER
    )
    But guys, when I increase the size of all the table columns up to their max values, the data is loaded; yet when I check the column max length after the data is loaded, it is far smaller..
    changed table structure-
    create table C_N_DETL_tab
    (
    "C_N_NUMBER" VARCHAR2(130),
    "COMP_CODE" VARCHAR2(30),
    "CPT_CODE" NUMBER(32), ---- max value of number
    "C_N_AMT" NUMBER(32,18), ---- max value of number
    "CM_NUMBER" NUMBER(32) ---- max value of number
    )
    Now I am running ...
    sqlldr express/express control=C_N_DETL.ctl log=C_N_DETL.log
    o/p-
    Table C_N_DETL_TAB, loaded from every logical record.
    Insert option in effect for this table: TRUNCATE
    TRAILING NULLCOLS option in effect
    Column Name                     Position   Len  Term Encl Datatype
    ------------------------------  ---------  ---  ---- ---- ---------
    C_N_NUMBER                      FIRST      *    ,    O(") CHARACTER
    COMP_CODE                       NEXT       *    ,    O(") CHARACTER
    CPT_CODE                        NEXT       4              INTEGER
    C_N_AMT                         NEXT       4              INTEGER
    CM_NUMBER                       NEXT       4              INTEGER
    Table C_N_DETL_TAB:
    20 Rows successfully loaded.
    0 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    select max(length(CPT_CODE)) from C_N_DETL_tab ---> 9
    Can you tell me why I need to increase the size of the table columns up to the max value, although the length of the data is so much smaller?
    Kindly check it. Thanks in advance...
    rgds,
    pc

    No database version given, of course. Unimportant.
    If I recall correctly, it is 'integer external', and you would do best to double-quote the alphanumerics (which you didn't).
    Try changing INTEGER to INTEGER EXTERNAL in the ctl file.
    Sybrand Bakker
    Senior Oracle DBA
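    For reference, a sketch of the posted control file with that suggestion applied (only the datatypes change; INTEGER EXTERNAL makes SQL*Loader read the digits as text instead of as 4-byte binary integers, which is what inflated the values and caused ORA-01438):
    options(skip=1)
    load data
    infile '/softdump/pc/C_N_DETL.csv'
    badfile '/softdump/pc/C_N_DETL.bad'
    discardfile '/softdump/pc/C_N_DETL.dsc'
    into table C_N_DETL_tab
    truncate
    FIELDS TERMINATED BY ","
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
    C_N_NUMBER CHAR,
    COMP_CODE CHAR,
    CPT_CODE INTEGER EXTERNAL,
    C_N_AMT INTEGER EXTERNAL,
    CM_NUMBER INTEGER EXTERNAL
    )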

  • Loading a CSV file to Sql Server

    Hi,
    How can I load a CSV file, which is in a shared location, into SQL Server using BODS?
    Where can I create the data source pointing to the CSV file?

    Hi,
    Use a file format to load the CSV file into SQL Server; the details are below.
    A file format defines a connection to a file. Therefore, you use a file format to connect to source or target data when the data is stored in a file rather than a database table. The object library stores file format templates that you use to define specific file formats as sources and targets in data flows.
    To work with file formats, perform the following tasks:
    • Create a file format template that defines the structure for a file.
    • Create a specific source or target file format in a data flow. The source or target file format is based on a template and specifies connection information such as the file name.
    File format objects can describe files of the following types:
    • Delimited: Characters such as commas or tabs separate each field.
    • Fixed width: You specify the column width.
    • SAP transport: Use to define data transport objects in SAP application data flows.
    • Unstructured text: Use to read one or more files of unstructured text from a directory.
    • Unstructured binary: Use to read one or more binary documents from a directory.
    More details, refer the designer guide
    http://help.sap.com/businessobject/product_guides/sbods42/en/ds_42_designer_en.pdf

  • Issue while archiving the processed file in sender communication channel using SFTP adapter

    Hi All,
    In one of my scenarios (File to IDOC), we are using an SFTP sender communication channel.
    We are facing an issue while archiving the processed file. Sometimes PI processes the file successfully but is unable to archive it, and in the next poll PI processes & archives the same file again, which creates duplicate orders in ECC.
    Please let us know how to resolve this issue.

    Hi Anil,
    Refer Archiving concepts in below links.
    http://help.sap.com/saphelp_nw73/helpdata/en/44/682bcd7f2a6d12e10000000a1553f6/content.htm?frameset=/en/44/6830e67f2a6d12e10000000a1553f6/frameset.htm
    http://scn.sap.com/docs/DOC-35572
    Warm Regards,
    DNK Siddhardha.

  • Problem loading XML-file using SQL*Loader

    Hello,
    I'm using 9.2 and trying to load an XML file using SQL*Loader.
    Loader control-file:
    LOAD DATA
    INFILE *
    INTO TABLE BATCH_TABLE TRUNCATE
    FIELDS TERMINATED BY ','
    (
    FILENAME char(255),
    XML_DATA LOBFILE (FILENAME) TERMINATED BY EOF
    )
    BEGINDATA
    data.xml
    The BATCH_TABLE is created as:
    CREATE TABLE BATCH_TABLE (
    FILENAME VARCHAR2 (50),
    XML_DATA SYS.XMLTYPE ) ;
    And the data.xml contains the following lines:
    <?xml version="2.0" encoding="UTF-8"?>
    <!DOCTYPE databatch SYSTEM "databatch.dtd">
    <batch>
    <record>
    <data>
    <type>10</type>
    </data>
    </record>
    <record>
    <data>
    <type>20</type>
    </data>
    </record>
    </batch>
    However, the sqlldr gives me an error:
    Record 1: Rejected - Error on table BATCH_TABLE, column XML_DATA.
    ORA-21700: object does not exist or is marked for delete
    ORA-06512: at "SYS.XMLTYPE", line 0
    ORA-06512: at line 1
    If I remove the first two lines
    "<?xml version="2.0" encoding="UTF-8"?>"
    and
    "<!DOCTYPE databatch SYSTEM "databatch.dtd">"
    from data.xml, everything works and the contents of data.xml are loaded into the table.
    Any idea what I'm missing here? Likely the problem is with the special characters.
    Thanks in advance,

    I'm able to load your file just by removing the second line <!DOCTYPE databatch SYSTEM "databatch.dtd">. I don't have your dtd file, so I skipped that line. Can you check whether it's a problem with your DTD?

  • Need suggestions on loading 5000+ files using sql loader

    Hi Guys,
    I'm writing a shell script to load more than 5000 files using sql loader.
    My intention is to load the files in parallel. When I checked the maximum number of sessions in v$parameter, it is around 700.
    Before starting the data load, programmatically I am getting the number of current sessions and maximum number of sessions and keeping free 200 sessions without using them (max. no. of sessions minus 200 ) and utilizing the remaining ~300 sessions to load the files in parallel.
    Also I am using a "wait" option to make the shell to wait until the 300 concurrent sql loader process to complete and moving further.
    Is there any way to make it more efficient? Also is it possible to reduce the wait time without hard coding the seconds (For Example: If any of those 300 sessions becomes free, assign the next file to the job queue and so on..)
    Please share your thoughts on this.
    Thanks.

    Manohar wrote:
    I'm writing a shell script to load more than 5000 files using sql loader.
    My intention is to load the files in parallel. When I checked the maximum number of sessions in v$parameter, it is around 700.
    Concurrent load, you mean? Parallel processing implies taking a workload, breaking it up into smaller workloads, and doing those in parallel. This is what the Parallel Query feature does in Oracle.
    SQL*Loader does not do that for you. It uses a single session to load a single file. To make it run in parallel, requires manually starting multiple loader sessions and perform concurrent loads.
    Have a look at Parallel Data Loading Models in the Oracle® Database Utilities guide. It goes into detail on how to perform concurrent loads. But you need to parallelise that workload yourself (as explained in the manual).
    Before starting the data load, programmatically I am getting the number of current sessions and maximum number of sessions and keeping free 200 sessions without using them (max. no. of sessions minus 200 ) and utilizing the remaining ~300 sessions to load the files in parallel.
    Also I am using a "wait" option to make the shell to wait until the 300 concurrent sql loader process to complete and moving further.
    Is there any way to make it more efficient? Also is it possible to reduce the wait time without hard coding the seconds (for example: if any of those 300 sessions becomes free, assign the next file to the job queue and so on..)
    Consider doing it the way that Parallel Query does (as I've mentioned above). Take the workload (all files). Break the workload up into smaller sub-workloads (e.g. 50 files to be loaded by a process). Start 100 processes in parallel and provide each one with a sub-workload to do (100 processes each loading 50-odd files).
    This is a lot easier to manage than starting, for example, 5000 load processes and then trying some kind of delay method to ensure that they do not all hit the database at the same time.
    I'm loading about 100+ files (3+ million rows) every 60 seconds 24x7 using SQL*Loader. Oracle is quite scalable and SQL*Loader quite capable.
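    As a rough sketch of that chunked approach (hypothetical paths, credentials and control-file name; not from the original thread):
    # split the list of input files into chunks of ~50 and load the chunks concurrently
    ls /data/incoming/*.csv | split -l 50 - /tmp/chunk_
    for chunk in /tmp/chunk_*; do
      (
        while read f; do
          sqlldr userid=scott/tiger control=items.ctl data="$f" log="$f.log" bad="$f.bad"
        done < "$chunk"
      ) &            # each chunk runs as one background loader process
    done
    wait             # block until every chunk has finished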

  • Load .JAR file using SQL Developer

    Hi,
    I have a couple of .JAR files on my local machine which have to be loaded onto the Oracle server (11g). I tried using the Load Java functionality in SQL Developer but realised one could load a Java class only if you had the source code.
    Is my understanding correct, or is there a way to load a JAR file using SQL Developer?
    Thanks

    K,
    Thanks for the answer.
    Do you know if you have to restart Oracle after loading .jar files? I ask this because I am not aware whether Oracle/the JVM automatically knows that the jar files have been uploaded.
    Thanks

  • Special character issue while loading data from SAP HR through VDS

    Hello,
    We have a special character issue, while loading data from SAP HR to IdM, using a VDS and following the standard documentation: http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e09fa547-f7c9-2b10-3d9e-da93fd15dca1?quicklink=index&overridelayout=true
    French accents like é, à, è, ù are loaded correctly, but Turkish ones (like Ş, İ, ł) are transformed into “#” in IdM.
    The question is: does someone know of any special setting to make in the VDS or in IdM for special characters, to solve this issue?
    Our SAP HR version is ECC6.0 (ABA/BASIS7.0 SP21, SAP_HR6.0 SP54) and we are using a VDS 7.1 SP5 and SAP NW IdM 7.1 SP5 Patch1 on oracle 10.2.
    Thanks

    We are importing directly to the HR staging area, using the transactions/programs "HRLDAP_MAP", "LDAP" and "/RPLDAP_EXTRACT"; then we have a job which extracts data from the staging area to a CSV file.
    So before the import the character appears correctly in SAP HR, but by the time it comes through the VDS to IdM's temporary table, it becomes "#".
    Yes, our data is coming from a Unicode system.
    So, could it be a java parameter to change / add in the VDS??
    Regards.

  • How to get DocSet property values in a SharePoint library into a CSV file using Powershell

    Hi,
    How to get DocSet property values in a SharePoint library into a CSV file using Powershell?
    Any help would be greatly appreciated.
    Thank you.
    AA.

    Hi AOK,
    Would you please post your current script and the issue, for more efficient support.
    In addition, to manage document sets in SharePoint, please refer to this script to start:
    ### Load SharePoint SnapIn
    if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null)
    {
        Add-PSSnapin Microsoft.SharePoint.PowerShell
    }
    ### Load SharePoint Object Model
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

    ### Get web and list
    $web = Get-SPWeb http://myweb
    $list = $web.Lists["List with Document Sets"]

    ### Get Document Set Content Type from list
    $cType = $list.ContentTypes["Document Set Content Type Name"]

    ### Create Document Set Properties Hashtable (add all the columns for your Document Set in this one table)
    [Hashtable]$docsetProperties = @{
        "DocumentSetDescription" = "A Document Set";
        "CustomColumn1"          = "Value 1";
        "CustomColum2"           = "Value2"
    }

    ### Create new Document Set
    $newDocumentSet = [Microsoft.Office.DocumentManagement.DocumentSets.DocumentSet]::Create($list.RootFolder,"Document Set Title",$cType.Id,$docsetProperties)
    $web.Dispose()
    http://www.letssharepoint.com/2011/06/document-sets-und-powershell.html
    If there is anything else regarding this issue, please feel free to post back.
    Best Regards,
    Anna Wang

  • Issue while Processing the Huge File in BPEL

    Hi,
    We are facing an issue while processing a huge file in a BPEL process (more than a 1 MB file). When I test with files of more than 1500 transactions (more than 1 MB), the BPEL process automatically goes into OFF mode, or it goes to the Perform Manual Recovery queue.
    We are facing this issue in Production as well, so we are using a UNIX script to split the file before placing it in the BPEL input directory. Any pointers to resolve this issue will be helpful.
    Thanks,
    Saravana

    Hi,
    Please find the answers.
    1. Currently we are using SOA 10.1.2 Version and JDev10g
    2. We are using the File Adapter.
    3. Yes, we used debatching.
    4. Yes, I am able to recover from the Manual Recovery queue.
    5. Please find the error message
    <2009-05-21 04:32:38,461> <DEBUG> <ESIBT.collaxa.cube.engine.dispatch> <Dispatcher::adjustThreadPool> Allocating 1 thread(s); pending threads: 1, active threads: 0, total: 83
    <2009-05-21 04:32:44,077> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> File : /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B is ready to be processed.
    <2009-05-21 04:32:44,077> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Processing file : /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B
    <2009-05-21 04:32:44,077> <DEBUG> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> onBatchBegin: Batch 'bpel://localhost/ESIBT/BPELProcess_810~1.0//Input5162009.B2B_1242894594000' (/harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B) starting...
    <2009-05-21 04:32:44,077> <DEBUG> <ESIBT.collaxa.cube.translation> <TranslatorFactory::log> Inside TranslatorFactory
    <2009-05-21 04:32:44,078> <DEBUG> <ESIBT.collaxa.cube.translation> <TranslatorFactory::log> using version attribute = NXSD
    <2009-05-21 04:32:44,078> <DEBUG> <ESIBT.collaxa.cube.translation> <TranslatorFactory::log> loading xlator class...oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl
    <2009-05-21 04:32:44,081> <DEBUG> <ESIBT.collaxa.cube.translation> <TranslatorFactory::log> class loaded
    <2009-05-21 04:32:44,081> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Created translator : oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl@46908ae8
    <2009-05-21 04:32:44,098> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Setting up Control dir for debatching error recovery
    <2009-05-21 04:32:44,121> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Control dir for debatching error recovery : /opt01/app/ESIBT/oracle/esibt10gR2iAS/BPEL10gR2/iAS/j2ee/home/fileftp/controlFiles/localhost_ESIBT_BPELProcess_810~1.0_/inbound
    <2009-05-21 04:32:44,121> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Invoking inbound translation for : Input5162009.B2B
    <2009-05-21 04:32:44,121> <DEBUG> <ESIBT.collaxa.cube.translation> <NXSDTranslatorImpl::log> Starting translateFromNative
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.translation> <NXSDTranslatorImpl::log> Done with translateFromNative
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Completed inbound translation for : Input5162009.B2B
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> isTextFile : true
    <2009-05-21 04:32:44,139> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Translated inbound batch index 1 of file {Input5162009.B2B} with corrupted message count = 1
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Error Reader created using charset :ASCII
    <2009-05-21 04:32:44,139> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Sending message to Adapter Framework for rejection to user-configured rejection handlers : {
    fileName=/harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B, startLine=1, startColumn=1, endLine=-1, endCol=-1, Exception=ORABPEL-11167
    Error while reading native data.
    [Line=1, Col=70] Expected "\t" at the specified position in the native data, while trying to read the data for "element with name HDR_STORE_NUM", using "style" as "terminated" and "terminatedBy" as "\t", but not found.
    Ensure that "\t", exists at the specified position in the native data.
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Setting batchId in NativeRecord to bpel://localhost/ESIBT/BPELProcess_810~1.0//Input5162009.B2B_1242894594000
    <2009-05-21 04:32:44,139> <WARN> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> [Read_ptt::Read(Object)] - onReject: The resource adapter 'File Adapter' requested handling of a malformed inbound message. However, the following bpel.xml activation property has not been defined: 'rejectedMessageHandlers'. Please define it and redeploy the business process. Will use the default Rejection Directory file:///opt01/app/ESIBT/oracle/esibt10gR2iAS/BPEL10gR2/iAS/integration/orabpel/domains/ESIBT/archive/jca/BPELProcess_810/rejectedMessages for now.
    <2009-05-21 04:32:44,140> <WARN> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> [Read_ptt::Read(Object)] - onReject: Sending invalid inbound message to Exception Handler:
    <2009-05-21 04:32:44,140> <INFO> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> Handing rejected message to DEFAULT rejection handler: file:///opt01/app/ESIBT/oracle/esibt10gR2iAS/BPEL10gR2/iAS/integration/orabpel/domains/ESIBT/archive/jca/BPELProcess_810/rejectedMessages since none of the configured rejection handlers [] succeeded.
    <2009-05-21 04:32:44,140> <DEBUG> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> Finished persisting rejected message to file system under the name: /opt01/app/ESIBT/oracle/esibt10gR2iAS/BPEL10gR2/iAS/integration/orabpel/domains/ESIBT/archive/jca/BPELProcess_810/rejectedMessages/INVALID_MSG_BPELProcess_810_Read_20090521_043244_0140.dat
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Setting last error record to : -1
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Translator has failed to translate any message from batch number: 1
    <2009-05-21 04:32:44,141> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Message not published as translation failed: {
    File=/harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B, batchIndex=1, PublishSize=1
    <2009-05-21 04:32:44,141> <ERROR> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> onBatchFailure: Batch 'bpel://localhost/ESIBT/BPELProcess_810~1.0//Input5162009.B2B_1242894594000' (/harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B) has failed due to: ORABPEL-11167
    Error while reading native data.
    [Line=1, Col=70] Expected "\t" at the specified position in the native data, while trying to read the data for "element with name HDR_STORE_NUM", using "style" as "terminated" and "terminatedBy" as "\t", but not found.
    Ensure that "\t", exists at the specified position in the native data.
    <2009-05-21 04:32:44,141> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Deleting file : /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B after processing.
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Deleting file : Input5162009.B2B
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Deleted file : true
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Removing file /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B from files to be processed Map.
    <2009-05-21 04:32:44,141> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Done processing File : /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B
    <2009-05-21 04:33:09,698> <DEBUG> <ESIBT.collaxa.cube.engine.data> <ConnectionFactory::getConnection> GOT CONNECTION 4 Autocommit = false
    For this error message it shows that it is not picking up the file because of a missing \t, but I am facing the same issue for all the files where the load is huge.
    Thanks,
    Saravana

  • Issue while loading berkeley database

    Hi,
    I have an issue while loading data into the Berkeley Database. When I load the xml files into the Berkeley database, some files are created; they are named something like db.001, db.002, log.00000001 etc. I have a server running which tries to access these files. When I try to reload the Berkeley DB while my server is running, I am unable to load the Berkeley database. I have to stop the server, load the Berkeley DB and restart the server again. Is there any way in which I can reload the database without having to restart the server? Your response would help me find a solution to this issue. I am currently using Berkeley Database version 2.2.13.
    Thanks,
    Priyadarshini

    Hi Priyadarshini,
    The db.001 and db.002 are the environment's region files and the log.00000001 is one of the environment's transactional logs. The region files are created when you use an environment, and their size and number depend on the subsystems that you configure on your environment (memory pool, logging, transactions, locking). The log files reflect the modifications that you perform on your environment's database(s), and they are used along with the transaction subsystem to provide recoverability, ACID capabilities and protect against application or system failures.
    Is there a reason why that server tries to access these files? The server process that runs while you load your database should not interfere with those files, as they are used by Berkeley DB.
    Regards,
    Andrei

  • Exception thrown while reading an XML file using Properties

    Exception thrown while reading an XML file using Properties.
    Exception in thread "main" java.util.InvalidPropertiesFormatException: org.xml.sax.SAXParseException:
    The processing instruction target matching "[xX][mM][lL]" is not allowed.
    at java.util.XMLUtils.load(Unknown Source)
    <?xml version="1.0"?>
    <Config>
         <Database>
              <Product>FX.O</Product>
              <URL>jdbc:oracle:thin:@0.0.0.0:ABC</URL>
         </Database>
    </Config>
    Am I missing anything?

    Thanks a lot for all your help. I have got the basics of
    the DTD.
    However, when I say this:
    <!DOCTYPE Config SYSTEM "c:/PartyConfig">
    <?xml version="1.0"?>
    <Config>
         <Database>
              <Product>FX</Product>
              <URL>jdbc:oracle:thin:@0.0.0.0:ABC</URL>
         </Database>
    </Config>
    I get the error
    Invalid system identifier: file:///c:/ParyConfig.dtd
    Please advise?
    For goodness sake, how many times do I have to tell you that you can't just expect the Properties class to accept arbitrary XML in any format? It reads XML in the format I linked to, and nothing else. Properties is not a general-purpose XML binding. You can't do what you're trying to do with it. Read this, and in particular pay attention to the following:
    The XML document must have the following DOCTYPE declaration:
    <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
    Furthermore, the document must satisfy the properties DTD described above.
    You are not reading my answers properly, are you? Kindly do so.
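    For comparison, a minimal sketch of the format Properties.loadFromXML actually accepts (the entry keys here are made up; only the DOCTYPE and the properties/entry structure are mandated by that DTD):
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
    <properties>
         <entry key="db.product">FX.O</entry>
         <entry key="db.url">jdbc:oracle:thin:@0.0.0.0:ABC</entry>
    </properties>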
