Issue Loading CSV file using OPENROWSET

Hi All,
I have a CSV file containing a value like -200. When I open the file in SQL using OPENROWSET, that value is read as NULL.
select * from
OPENROWSET('Microsoft.ACE.OLEDB.12.0',
           'Text;Database=C:\TEST\;',
           'SELECT * FROM ABC.csv')
When I save the file as .xlsx and open it using the OPENROWSET syntax for xlsx files, it shows the correct value.
I am not sure whether this is a behaviour of the file or of SQL. Can someone tell me what the possible reasons for this are?
Thanks in advance.
Raghav

Check this link and sample:
http://www.databasejournal.com/features/mssql/article.php/10894_3331881_2
select * from
OpenRowset('MSDASQL',
           'Driver={Microsoft Text Driver (*.txt; *.csv)};DefaultDir=C:\External;',
           'select top 6 * from MyCsv.csv')
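One likely cause of the NULLs is type guessing: both the ACE provider and the ODBC text driver sample the first rows of a CSV to decide each column's data type, and values that do not fit the guessed type come back as NULL. You can usually pin the column types with a schema.ini file placed in the same folder as the CSV. A minimal sketch (the column names and types are illustrative, adjust them to your file):
[ABC.csv]
ColNameHeader=True
Format=CSVDelimited
Col1=ItemCode Text
Col2=Amount Double
With the types declared the driver stops guessing, which also fits the symptom that the .xlsx route shows the correct value, since Excel cells carry their own type information.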
Best Wishes, Arbi. Please vote if you found this posting helpful, or mark it as answered.

Similar Messages

  • Issue loading CSV file into HANA

    Hi,
    For the last couple of weeks I have been trying to load a CSV file into a HANA table, but I have been unable to succeed.
    I am getting the error "Cannot open Control file, /dropbox/P1005343/CRM_OBJ_ID.CTL". I have followed every step described on SDN, but I still could not load data into my HANA table.
    FTP: /dropbox/P1005343
    SQL Command:
    IMPORT FROM '/dropbox/P1005343/crm_obj_id.ctl'
    Error:
    Could not execute 'IMPORT FROM '/dropbox/P1005343/crm_obj_id.ctl''
    SAP DBTech JDBC: [2]: general error: Cannot open Control file, /dropbox/P1005343/crm_obj_id.ctl
    Please help me on this
    Regards,
    Praneeth

    Hi All,
    I have successfully uploaded the file to the HANA server in folder P443348, but while importing the file I get the following error message:
    SAP DBTECH JDBC: [2]  (at 13) : general error: Cannot open Control file, "/P443348/shop_facts.ctl"
    This is happening while I am executing the following import statement
    IMPORT FROM '/P443348/shop_facts.ctl';
    I have tried several options, including changing the permissions of the folders and files, with no success. As of now the folder has full access (777).
    Any help would be greatly appreciated so that I can proceed further.
    Thanks
    Vamsidhar
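    A note on both errors above: the path in the IMPORT statement must be readable by the HANA server process itself (typically the <sid>adm operating system user), not just by the FTP client used to upload the file. If the control-file route keeps failing, HANA can also import a CSV directly, without a control file. A minimal sketch, assuming a comma-delimited file (the path, schema and table names are placeholders):
    IMPORT FROM CSV FILE '/dropbox/P443348/shop_facts.csv'
    INTO "MYSCHEMA"."SHOP_FACTS"
    WITH RECORD DELIMITED BY '\n'
         FIELD DELIMITED BY ',';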

  • Issue while loading a csv file using sql*loader...

    Hi,
    I am loading a CSV file using SQL*Loader.
    On the number columns that actually contain data (decimals/integers), the rows error out with:
    ORA-01722: invalid number
    I tried checking the value as it comes from Excel,
    and found chr(13), chr(32) and chr(10) characters embedded in the value.
    e.g. select length('0.21') from dual gives 7 (here '0.21' stands for the raw field value).
    When I checked each character, e.g.
    select ascii(substr('0.21',5,1)) from dual, it returned 9, etc.
    I tried the following expression to remove all the non-numeric special characters:
    "to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
    but I am still facing the error.
    Please let me know, any solution for this error.
    Thanks in advance.
    Kiran

    control file:
    OPTIONS (ROWS=1, ERRORS=10000)
    LOAD DATA
    CHARACTERSET WE8ISO8859P1
    INFILE '$Xx_TOP/bin/ITEMS.csv'
    APPEND INTO TABLE XXINF.ITEMS_STAGE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
    ItemNum                    "trim(replace(replace(:ItemNum,chr(9),''),chr(13),''))",
    cross_ref_old_item_num               "trim(replace(replace(:cross_ref_old_item_num,chr(9),''),chr(13),''))",
    Mas_description               "trim(replace(replace(:Mas_description,chr(9),''),chr(13),''))",
    Mas_long_description               "trim(replace(replace(:Mas_long_description,chr(9),''),chr(13),''))",
    Org_description               "trim(replace(replace(:Org_description,chr(9),''),chr(13),''))",
    Org_long_description               "trim(replace(replace(:Org_long_description,chr(9),''),chr(13),''))",
    user_item_type                    "trim(replace(replace(:user_item_type,chr(9),''),chr(13),''))",
    organization_code               "trim(replace(replace(:organization_code,chr(9),''),chr(13),''))",
    primary_uom_code               "trim(replace(replace(:primary_uom_code,chr(9),''),chr(13),''))",
    inv_default_item_status          "trim(replace(replace(:inv_default_item_status,chr(9),''),chr(13),''))",
    inventory_item_flag               "trim(replace(replace(:inventory_item_flag,chr(9),''),chr(13),''))",
    stock_enabled_flag               "trim(replace(replace(:stock_enabled_flag,chr(9),''),chr(13),''))",
    mtl_transactions_enabled_flag          "trim(replace(replace(:mtl_transactions_enabled_flag,chr(9),''),chr(13),''))",
    revision_qty_control_code          "trim(replace(replace(:revision_qty_control_code,chr(9),''),chr(13),''))",
    reservable_type               "trim(replace(replace(:reservable_type,chr(9),''),chr(13),''))",
    check_shortages_flag               "trim(replace(replace(:check_shortages_flag,chr(9),''),chr(13),''))",
    shelf_life_code               "trim(replace(replace(replace(replace(:shelf_life_code,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    shelf_life_days               "trim(replace(replace(replace(replace(:shelf_life_days,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    lot_control_code               "trim(replace(replace(:lot_control_code,chr(9),''),chr(13),''))",
    auto_lot_alpha_prefix               "trim(replace(replace(:auto_lot_alpha_prefix,chr(9),''),chr(13),''))",
    start_auto_lot_number               "trim(replace(replace(:start_auto_lot_number,chr(9),''),chr(13),''))",
    negative_measurement_error          "trim(replace(replace(replace(replace(:negative_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    positive_measurement_error          "trim(replace(replace(replace(replace(:positive_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    serial_number_control_code          "trim(replace(replace(:serial_number_control_code,chr(9),''),chr(13),''))",
    auto_serial_alpha_prefix          "trim(replace(replace(:auto_serial_alpha_prefix,chr(9),''),chr(13),''))",
    start_auto_serial_number          "trim(replace(replace(:start_auto_serial_number,chr(9),''),chr(13),''))",
    location_control_code               "trim(replace(replace(:location_control_code,chr(9),''),chr(13),''))",
    restrict_subinventories_code          "trim(replace(replace(:restrict_subinventories_code,chr(9),''),chr(13),''))",
    restrict_locators_code               "trim(replace(replace(:restrict_locators_code,chr(9),''),chr(13),''))",
    bom_enabled_flag               "trim(replace(replace(:bom_enabled_flag,chr(9),''),chr(13),''))",
    costing_enabled_flag               "trim(replace(replace(:costing_enabled_flag,chr(9),''),chr(13),''))",
    inventory_asset_flag               "trim(replace(replace(:inventory_asset_flag,chr(9),''),chr(13),''))",
    default_include_in_rollup_flag          "trim(replace(replace(:default_include_in_rollup_flag,chr(9),''),chr(13),''))",
    cost_of_goods_sold_account          "trim(replace(replace(:cost_of_goods_sold_account,chr(9),''),chr(13),''))",
    std_lot_size                    "trim(replace(replace(replace(replace(:std_lot_size,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    sales_account                    "trim(replace(replace(:sales_account,chr(9),''),chr(13),''))",
    purchasing_item_flag               "trim(replace(replace(:purchasing_item_flag,chr(9),''),chr(13),''))",
    purchasing_enabled_flag          "trim(replace(replace(:purchasing_enabled_flag,chr(9),''),chr(13),''))",
    must_use_approved_vendor_flag          "trim(replace(replace(:must_use_approved_vendor_flag,chr(9),''),chr(13),''))",
    allow_item_desc_update_flag          "trim(replace(replace(:allow_item_desc_update_flag,chr(9),''),chr(13),''))",
    rfq_required_flag               "trim(replace(replace(:rfq_required_flag,chr(9),''),chr(13),''))",
    buyer_name                    "trim(replace(replace(:buyer_name,chr(9),''),chr(13),''))",
    list_price_per_unit               "trim(replace(replace(replace(replace(:list_price_per_unit,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    taxable_flag                    "trim(replace(replace(:taxable_flag,chr(9),''),chr(13),''))",
    purchasing_tax_code               "trim(replace(replace(:purchasing_tax_code,chr(9),''),chr(13),''))",
    receipt_required_flag               "trim(replace(replace(:receipt_required_flag,chr(9),''),chr(13),''))",
    inspection_required_flag          "trim(replace(replace(:inspection_required_flag,chr(9),''),chr(13),''))",
    price_tolerance_percent          "trim(replace(replace(replace(replace(:price_tolerance_percent,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    expense_account               "trim(replace(replace(:expense_account,chr(9),''),chr(13),''))",
    allow_substitute_receipts_flag          "trim(replace(replace(:allow_substitute_receipts_flag,chr(9),''),chr(13),''))",
    allow_unordered_receipts_flag          "trim(replace(replace(:allow_unordered_receipts_flag,chr(9),''),chr(13),''))",
    receiving_routing_code               "trim(replace(replace(:receiving_routing_code,chr(9),''),chr(13),''))",
    inventory_planning_code          "trim(replace(replace(:inventory_planning_code,chr(9),''),chr(13),''))",
    min_minmax_quantity               "trim(replace(replace(replace(replace(:min_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    max_minmax_quantity               "trim(replace(replace(replace(replace(:max_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    planning_make_buy_code               "trim(replace(replace(:planning_make_buy_code,chr(9),''),chr(13),''))",
    source_type                    "trim(replace(replace(:source_type,chr(9),''),chr(13),''))",
    mrp_safety_stock_code               "trim(replace(replace(:mrp_safety_stock_code,chr(9),''),chr(13),''))",
    material_cost                    "trim(replace(replace(:material_cost,chr(9),''),chr(13),''))",
    mrp_planning_code               "trim(replace(replace(:mrp_planning_code,chr(9),''),chr(13),''))",
    customer_order_enabled_flag          "trim(replace(replace(:customer_order_enabled_flag,chr(9),''),chr(13),''))",
    customer_order_flag               "trim(replace(replace(:customer_order_flag,chr(9),''),chr(13),''))",
    shippable_item_flag               "trim(replace(replace(:shippable_item_flag,chr(9),''),chr(13),''))",
    internal_order_flag               "trim(replace(replace(:internal_order_flag,chr(9),''),chr(13),''))",
    internal_order_enabled_flag          "trim(replace(replace(:internal_order_enabled_flag,chr(9),''),chr(13),''))",
    invoice_enabled_flag               "trim(replace(replace(:invoice_enabled_flag,chr(9),''),chr(13),''))",
    invoiceable_item_flag               "trim(replace(replace(:invoiceable_item_flag,chr(9),''),chr(13),''))",
    cross_ref_ean_code               "trim(replace(replace(:cross_ref_ean_code,chr(9),''),chr(13),''))",
    category_set_intrastat               "trim(replace(replace(:category_set_intrastat,chr(9),''),chr(13),''))",
    CustomCode                    "trim(replace(replace(:CustomCode,chr(9),''),chr(13),''))",
    net_weight                    "trim(replace(replace(replace(replace(:net_weight,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    production_speed               "trim(replace(replace(:production_speed,chr(9),''),chr(13),''))",
    LABEL                         "trim(replace(replace(:LABEL,chr(9),''),chr(13),''))",
    comment1_org_level               "trim(replace(replace(:comment1_org_level,chr(9),''),chr(13),''))",
    comment2_org_level               "trim(replace(replace(:comment2_org_level,chr(9),''),chr(13),''))",
    std_cost_price_scala               "to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
    supply_type                    "trim(replace(replace(:supply_type,chr(9),''),chr(13),''))",
    subinventory_code               "trim(replace(replace(:subinventory_code,chr(9),''),chr(13),''))",
    preprocessing_lead_time          "trim(replace(replace(replace(replace(:preprocessing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
    processing_lead_time                "trim(replace(replace(replace(replace(:processing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
    wip_supply_locator               "trim(replace(replace(:wip_supply_locator,chr(9),''),chr(13),''))"
    Sample data from the csv file:
    "9901-0001-35","390000","JMKL16 Pipe bend 16 mm","","JMKL16 Putkikaari 16 mm","","AI","FJE","Ea","","","","","","","","","","","","","","","","","","","","","","","","","21-21100-22200-00000-00000-00-00000-00000","0","21-11100-22110-00000-00000-00-00000-00000","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","0.1","Pull","AFTER PROD","","","Locator for Production"
    The load errors out especially on two columns:
    1) std_cost_price_scala
    2) list_price_per_unit
    Both are number columns.
    When data is provided in them, the load errors out; if they hold null values, the records go through fine.
    Message was edited by:
    KK28
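    A note on the ORA-01722 above: when a replace chain like this still fails, the field usually contains yet another invisible character that the chain does not cover; a non-breaking space, chr(160), pasted in from Excel is a common culprit and is not removed by chr(32). A quick diagnostic sketch that lists every character of a suspect value together with its code point (paste the raw value into the literal):
    SELECT LEVEL AS pos,
           SUBSTR(val, LEVEL, 1) AS ch,
           ASCII(SUBSTR(val, LEVEL, 1)) AS code
    FROM  (SELECT '<suspect value here>' AS val FROM dual)
    CONNECT BY LEVEL <= LENGTH(val);
    Whatever code shows up (160, say) can then be added to the replace chain, e.g. replace(..., chr(160), ''). If no stray characters appear, it is also worth checking NLS_NUMERIC_CHARACTERS: a session that expects a comma as the decimal separator raises ORA-01722 on values like '0.21' as well.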

  • Load .csv file data with OWb Process flow using Web

    Hi,
    I have a file on my local machine (there are multiple users' machines), and the data needs to be loaded through a web user interface.
    Let's say we have a web page with radio buttons for the different sources; clicking a button passes the path of the .csv file to the application (via an API or a Java programming interface), which then executes an OWB process flow, taking the file path as an input parameter, to perform the load.
    It should also let users view and update the data through the web on request.
    I need your guidance on how I can implement this with OWB 11g R2, assuming plain web browser functionality. Please confirm whether it is feasible and, if so, shed some light on the implementation steps.
    Thanks

    Hi David,
    Thanks for your reply.
    I understand your proposed solution, but my requirement is as follows.
    1. We are currently considering a web page, likely implemented in Java, that allows users to load .csv file data into a staging area (loading a flat file into a database table).
    Case 1: OWB software is not installed on the user's machine (I assume it is not). Is it possible from the web page (in this case a Java page) to trigger a Java procedure, a PL/SQL procedure, or a combination of both to load the data into the staging area? If yes, how would loading a 1 GB file perform?
    Case 2: OWB client software is installed on the user's machine. Does passing parameters at runtime mean passing them manually? If it is automated, how should I pass the machine name and path to the OWB runtime from the web browser?
    Could you please give me some guidance on how I should achieve this functionality with the APEX customization part?
    Thanks again for your support.
    Anil

  • How to get DocSet property values in a SharePoint library into a CSV file using Powershell

    Hi,
    How to get DocSet property values in a SharePoint library into a CSV file using Powershell?
    Any help would be greatly appreciated.
    Thank you.
    AA.

    Hi AOK,
    Would you please post your current script and describe the issue, for more efficient support.
    In addition, to manage document sets in SharePoint, please refer to this script to get started:
    ### Load SharePoint SnapIn
    if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null)
    {
        Add-PSSnapin Microsoft.SharePoint.PowerShell
    }
    ### Load SharePoint Object Model
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
    ### Get web and list
    $web = Get-SPWeb http://myweb
    $list = $web.Lists["List with Document Sets"]
    ### Get Document Set Content Type from list
    $cType = $list.ContentTypes["Document Set Content Type Name"]
    ### Create Document Set properties hashtable; build it as one table so that
    ### later assignments do not overwrite the earlier ones
    [Hashtable]$docsetProperties = @{
        "DocumentSetDescription" = "A Document Set";
        "CustomColumn1" = "Value 1";
        "CustomColumn2" = "Value2"
        ### Add all your columns for your Document Set here
    }
    ### Create new Document Set
    $newDocumentSet = [Microsoft.Office.DocumentManagement.DocumentSets.DocumentSet]::Create($list.RootFolder, "Document Set Title", $cType.Id, $docsetProperties)
    $web.Dispose()
    http://www.letssharepoint.com/2011/06/document-sets-und-powershell.html
    If there is anything else regarding this issue, please feel free to post back.
    Best Regards,
    Anna Wang
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Support, contact tnmff@microsoft.com.

  • Loading csv files from JDBC

    What is the best way to load a CSV file over a JDBC link? Right now I basically read a line and insert it; batching doesn't seem to speed it up either.
    Thanks
    Brian Timothy

    Hi!
    I think I can help you with this, as I have done almost the same thing.
    For importing data from other databases, Oracle has a bulk-insert command, SQL*Loader (SQLLDR). The command takes a control file (*.ctl) and a data file (*.csv) as parameters; you can get more help on the site if you search for SQL Loader.
    I don't think it can be executed from stored procedures, so what I am doing is running the command from my Java application using Runtime.exec("SQLLDR...................") ; a minimal control file example is sketched below.
    I think this will help you. You are always welcome to ask for any more clarifications.
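    For reference, a minimal control file for a three-column CSV might look like this (the table, file and column names are placeholders):
    LOAD DATA
    INFILE 'mycsv.csv'
    APPEND INTO TABLE my_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (col1, col2, col3)
    It would then be invoked with something like Runtime.exec("sqlldr userid=scott/tiger control=mycsv.ctl log=mycsv.log").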
    Regards
    Brijesh Kumar
    SSI-Technologies

  • What is limit to load csv file for Apex Utility?

    What is the limit when loading a CSV file with the APEX load utility?
    How many records can I load through the APEX load utility?
    I am loading CSV files.
    Thank you.

    Hi,
    The report you download from APEX is limited to 65k rows. There may be some way of creating a custom file based on the table rather than using APEX's built-in export-to-csv option, but I don't know of one that has been released. Denes produced an export-to-.xls option, but I don't know how many rows it caters to: http://htmldb.oracle.com/pls/otn/f?p=31517:108:1512181285516724::NO:: -- I assume 65k too.
    Mike

  • Load multiple files using the same data load location

    Has anybody tried loading multiple files using the same load location? I need to do this as the data in these multiple files will need to be exported from FDM as a single export file. The problem I am facing is more user-related: since these files will be received at different points in time, users will need a way to tell what has been loaded and what is yet to be loaded.
    Is it possible to show a window in the web browser with OK and Cancel buttons from an event script?
    any pointers to possible solutions will be helpful

    I was able to resolve this. The implementation method is as follows:
    take a backup of the previously imported data in the BefClearData event script, then in the BefFileImport event append the data to the import file. There are many other intricacies, but this is the broad implementation logic. It allowed my users to load multiple files without worrying about append-or-replace import type choices.

  • How to get Document Set property values in a SharePoint library in to a CSV file using Powershell

    Hi,
    How to get Document Set property values in a SharePoint library into a CSV file using Powershell?
    Any help would be greatly appreciated.
    Thank you.
    AA.

    Hi,
    According to your description, my understanding is that you want to get document set property values in a SharePoint library and then export them into a CSV file using PowerShell.
    You can get the document set properties with a PowerShell command like the one below:
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
    $siteurl = "http://sp2013sps/sites/test"
    $listname = "Documents"
    $mysite = New-Object Microsoft.SharePoint.SPSite($siteurl)
    $myweb = $mysite.OpenWeb()
    $list = $myweb.Lists[$listname]
    ### Loop over list items; the original snippet was missing its braces
    foreach ($item in $list.Items)
    {
        if ($item.ContentType.Name -eq "Document Set")
        {
            ### This sample only reports empty document sets; drop the inner check to list all of them
            if ($item.Folder.ItemCount -eq 0)
            {
                Write-Host $item.Title
            }
        }
    }
    Then you can use Export-Csv PowerShell Command to export to a CSV file.
    More information:
    Powershell for document sets
    How to export data to CSV in PowerShell?
    Using the Export-Csv Cmdlet
    Thanks
    Best Regards
    TechNet Community Support
    Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact tnmff@microsoft.com.

  • Reading csv file using file adapter

    Hi,
    I am working on SOA 11g. I am reading a csv file using a file adapter. Below are the file contents and the XSD generated by JDeveloper.
    .csv file:
    empid,empname,empsal
    100,Ram,20000
    101,Shyam,25000
    xsd generated by the Jdev:
    <?xml version="1.0" encoding="UTF-8" ?>
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd"
                xmlns:tns="http://TargetNamespace.com/EmpRead"
                targetNamespace="http://TargetNamespace.com/EmpRead"
                elementFormDefault="qualified"
                attributeFormDefault="unqualified"
                nxsd:version="NXSD"
                nxsd:stream="chars"
                nxsd:encoding="ASCII"
                nxsd:hasHeader="true"
                nxsd:headerLines="1"
                nxsd:headerLinesTerminatedBy="${eol}">
      <xsd:element name="Root-Element">
        <xsd:complexType>
          <xsd:sequence>
            <xsd:element name="Child-Element" minOccurs="1" maxOccurs="unbounded">
              <xsd:complexType>
                <xsd:sequence>
                  <xsd:element name="empid" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="," nxsd:quotedBy="&quot;" />
                  <xsd:element name="empname" minOccurs="1" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="," nxsd:quotedBy="&quot;" />
                  <xsd:element name="empsal" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="${eol}" nxsd:quotedBy="&quot;" />
                </xsd:sequence>
              </xsd:complexType>
            </xsd:element>
          </xsd:sequence>
        </xsd:complexType>
      </xsd:element>
    </xsd:schema>
    For empname I have added minOccurs="1". Now when I remove the empname column, the csv file still gets read from the server without any error being raised.
    Next, I created the following XML file and read it through the file adapter:
    <?xml version="1.0" encoding="UTF-8" ?>
    <Root-Element xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                  xsi:schemaLocation="http://TargetNamespace.com/EmpRead xsd/EmpXML.xsd"
                  xmlns="http://TargetNamespace.com/EmpRead">
      <Child-Element>
        <empid>100</empid>
        <empname></empname>
        <empsal>20000</empsal>
      </Child-Element>
      <Child-Element>
        <empid>101</empid>
        <empname>Shyam</empname>
        <empsal>25000</empsal>
      </Child-Element>
    </Root-Element>
    When I removed the value of empname, it threw the proper error for the above XML file.
    Please tell me why the behaviour of the file adapter differs between the csv file and the XML file in this case.
    Thanks

  • How to upload a .CSV file using GUI_UPLOAD

    Hi Experts,
    In my report I need to upload a .CSV file using GUI_UPLOAD. How do I do this? Please provide a solution.

    Hi prashanthishetty,
    this has already been answered many times in this forum!
    Use the forum search or wiki search:
    http://wiki.sdn.sap.com/wiki/display/Snippets/uploadcsvfilesintointernal+table
    regards
    rea

  • Is it possible to monitor State change of a .CSV file using powershell scripting ?

    Hi All,
    I would just like to know whether it is possible to monitor a state change in a .CSV file using PowerShell scripting. We have SCOM, which has that capability, but some drawbacks prevent us from using it here, so I would like to know whether this is possible with PowerShell.
    Specifically, if there is any number above 303 in the .CSV file, I need an email alert/notification.
    Gautam.75801

    Hi Jrv,
    Thank you very much. I modified the above and it worked.
    Import-Csv C:\SCOM_Tasks\GCC2010Capacitymanagement\CapacityMgntData.csv | ? { $_.Mailboxes -gt 303 } | Export-Csv -Path C:\SCOM_Tasks\Mbx_Above303.csv
    Send-MailMessage -Attachments "C:\SCOM_Tasks\Mbx_Above303.csv" -To "[email protected]" -From "abc@xyz" -SmtpServer [email protected] -Subject "Mailboxes are above 303 in Exchange databases" -Body "Mailboxes are above 303 in Exchange databases"
    Mailboxes is the column I want to monitor for values above 303. The first command extracts all rows above 303 into another CSV file, and the second command emails me that extract as an attachment.
    Gautam.75801

  • I am  unable to load GRAPHIC files using the transaction SFP.

    I am unable to load GRAPHIC files using the transaction SFP.
    The error message says that there is no connection to the URL given below:
       http://<hostname:8000>/sap/bc/fp/
    Do I have to activate this service in transaction SICF?

    Try http://<hostname:8000>/sap/bc/fp/!
    (note the ! at the end)
    Regards
    Juan

  • Adding row into existing CSV file using C#

    How do I add a row to an existing CSV file using .NET code? The file should not be overwritten; another row with data needs to be appended. How do I implement this scenario?

    Hi BizQ,
    If you just need to write some data to a CSV file, please follow A.Zaied's and Magnus's replies. In general we use CSV files to import or export data; see the following thread and a good article on CodeProject:
    Convert a CSV file to Excel using C#
    Writing a DataTable to a CSV file
    Best regards,
    Kristin

  • Can I load gMax files using Java3D?

    Can I load gMax files using Java3D?
    I need to know this if I am going to use Java for my project.

    What format are the files in? Can you export them as .3ds? If so, there are loaders for those files.
    There are also loaders available for VRML, Obj, etc. See what formats you can export the file as.
