Loading specific records from a worksheet into main memory (C#)

Hello
I used this code and I do not get any errors, but my program takes 55 seconds to run because my data is large.
Currently all records are loaded into RAM. How can I load only specific records into RAM? For example, I want to load record 2 from Book1.xlsx into RAM.
Microsoft.Office.Interop.Excel.Application app_main = new Microsoft.Office.Interop.Excel.Application();
Microsoft.Office.Interop.Excel.Workbooks workbooks_main = app_main.Workbooks;
Microsoft.Office.Interop.Excel.Workbook workbook_main = workbooks_main.Open(Application.StartupPath + @"\Book1.xlsx");
Microsoft.Office.Interop.Excel.Worksheet worksheet_main = workbook_main.Worksheets[1];
var cellValue_main = (worksheet_main.Cells[2, 1] as Microsoft.Office.Interop.Excel.Range).Value;
workbook_main.Close();
workbooks_main.Close();
System.Runtime.InteropServices.Marshal.ReleaseComObject(workbook_main);
System.Runtime.InteropServices.Marshal.ReleaseComObject(workbooks_main);
app_main.Quit();
System.Runtime.InteropServices.Marshal.ReleaseComObject(app_main);

Hi ARZARE,
I made a test with your code, checked the value of “cellValue_main”, and modified your code as below:
Microsoft.Office.Interop.Excel.Application app_main = new Microsoft.Office.Interop.Excel.Application();
Microsoft.Office.Interop.Excel.Workbooks workbooks_main = app_main.Workbooks;
Microsoft.Office.Interop.Excel.Workbook workbook_main = workbooks_main.Open(@"D:\Backup\Desktop\Test.xlsx");
Microsoft.Office.Interop.Excel.Worksheet worksheet_main = workbook_main.Worksheets[1];
var cellValue_main = (worksheet_main.Cells[2, 1] as Microsoft.Office.Interop.Excel.Range).Value;
var rowValue_main = (worksheet_main.Rows[1] as Microsoft.Office.Interop.Excel.Range).Value2; //row value
workbook_main.Close();
workbooks_main.Close();
System.Runtime.InteropServices.Marshal.ReleaseComObject(workbook_main);
System.Runtime.InteropServices.Marshal.ReleaseComObject(workbooks_main);
app_main.Quit();
System.Runtime.InteropServices.Marshal.ReleaseComObject(app_main);
If you want to get the values of row 2, you could store “rowValue_main” and query from it. You could also loop over the cell values of the row to get all of its data.
In addition, if you want to query data from the sheet, you could do as Joel suggested in his reply: store the data in a DataSet or DataTable and query from that.
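As an aside (not Interop, just to illustrate the idea): an .xlsx file is a zip archive of XML, so a single row can be streamed out without materializing the whole sheet. A minimal Python sketch, assuming the first sheet lives at xl/worksheets/sheet1.xml and holds plain numeric cells (real workbooks usually route text through a shared-strings table, which this sketch ignores):

```python
import io
import zipfile
import xml.etree.ElementTree as ET

NS = "{http://schemas.openxmlformats.org/spreadsheetml/2006/main}"

def read_row(xlsx_bytes, wanted_row):
    """Stream the sheet XML and return only the cell values of one row,
    instead of materializing every record in memory."""
    with zipfile.ZipFile(io.BytesIO(xlsx_bytes)) as z:
        with z.open("xl/worksheets/sheet1.xml") as f:
            for _, elem in ET.iterparse(f):
                if elem.tag == NS + "row":
                    if elem.get("r") == str(wanted_row):
                        return [v.text for v in elem.iter(NS + "v")]
                    elem.clear()  # discard rows we are not interested in
    return None

# Build a tiny in-memory stand-in for Book1.xlsx (numeric cells only).
sheet = (
    '<worksheet xmlns="http://schemas.openxmlformats.org/spreadsheetml/2006/main">'
    "<sheetData>"
    '<row r="1"><c r="A1"><v>10</v></c><c r="B1"><v>11</v></c></row>'
    '<row r="2"><c r="A2"><v>20</v></c><c r="B2"><v>21</v></c></row>'
    "</sheetData></worksheet>"
)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("xl/worksheets/sheet1.xml", sheet)

print(read_row(buf.getvalue(), 2))  # ['20', '21']
```

With Interop the slow part is usually starting Excel and crossing the COM boundary per cell, so reading only the range you need (as in the code above) is the main lever; streaming the XML directly avoids Excel entirely.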
If you have any further questions, please feel free to post in this forum.
Best Regards,
Edward

Similar Messages

  • Upload pictures to PC from BlackBerry's main memory - no microSD memory card

    How do I upload pictures to PC from BlackBerry's main memory (no microSD memory card)?
    I could do this with the original PC software I got in November 2009. I got a new PC and loaded the latest Blackberry Desktop Software and it only looks for the memory card.
    Thanks
    Dave Lent

    Hello, please see these helpful articles, which discuss the various ways to manage user files on a BB. While they may have been written for earlier versions of DTM and/or the BB OS, you should still be able to use them to configure your BB properly as a set of USB drives on your PC, thereby enabling any Windows Explorer methodology:
    http://supportforums.blackberry.com/t5/BlackBerry-Desktop-Software/HOWTO-use-your-blackberry-as-a-US...
    http://www.blackberryfaq.com/index.php/How_do_I_add_media_files_to_my_microSD_card%3F
    Good luck and let us know!

  • Performance issue loading 4000 records from XML

    Hello, I'm trying to load records into a table with the SQL statements below, from an XML file with content of this type:
    <?xml version="1.0" encoding="UTF-8"?>
    <custom-objects xmlns="http://www.mysite.com/xml/impex/customobject/2006-10-31">
        <custom-object type-id="NEWSLETTER_SUBSCRIBER" object-id="[email protected]">
      <object-attribute attribute-id="customer-no"><value>BLY00000001</value></object-attribute>
      <object-attribute attribute-id="customer_type"><value>registered</value></object-attribute>
            <object-attribute attribute-id="title"><value>Mr.</value></object-attribute>
            <object-attribute attribute-id="first_name"><value>Jean paul</value></object-attribute>
            <object-attribute attribute-id="is_subscribed"><value>true</value></object-attribute>
            <object-attribute attribute-id="last_name"><value>Pennati Swiss</value></object-attribute>
            <object-attribute attribute-id="address_line_1"><value>newsletter ADDRESS LINE 1 data</value></object-attribute>
            <object-attribute attribute-id="address_line_2"><value>newsletter ADDRESS LINE 2 data</value></object-attribute>
            <object-attribute attribute-id="address_line_3"><value>newsletter ADDRESS LINE 3 data</value></object-attribute>
            <object-attribute attribute-id="housenumber"><value>newsletter HOUSENUMBER data</value></object-attribute>
            <object-attribute attribute-id="city"><value>newsletter DD</value></object-attribute>
            <object-attribute attribute-id="post_code"><value>6987</value></object-attribute>
            <object-attribute attribute-id="state"><value>ASD</value></object-attribute>
            <object-attribute attribute-id="country"><value>ES</value></object-attribute>
            <object-attribute attribute-id="phone_home"><value>0044 1234567 newsletter phone_home</value></object-attribute>
            <object-attribute attribute-id="preferred_locale"><value>fr_CH</value></object-attribute>
            <object-attribute attribute-id="exported"><value>true</value></object-attribute>
            <object-attribute attribute-id="profiling"><value>true</value></object-attribute>
            <object-attribute attribute-id="promotions"><value>true</value></object-attribute>
            <object-attribute attribute-id="source"><value>https://www.mysite.com</value></object-attribute>
            <object-attribute attribute-id="source_ip"><value>10.10.1.1</value></object-attribute>
            <object-attribute attribute-id="pr_product_serial_number"><value>000123345678 product serial no.</value></object-attribute>
            <object-attribute attribute-id="pr_purchased_from"><value>Store where product to be registered was purchased</value></object-attribute>
            <object-attribute attribute-id="pr_date_of_purchase"><value></value></object-attribute>
            <object-attribute attribute-id="locale"><value>fr_CH</value></object-attribute> 
        </custom-object>
        <custom-object type-id="NEWSLETTER_SUBSCRIBER" object-id="[email protected]">
       <object-attribute attribute-id="customer-no"><value></value></object-attribute>
       <object-attribute attribute-id="customer_type"><value>unregistered</value></object-attribute>
            <object-attribute attribute-id="title"><value>Mr.</value></object-attribute>
            <object-attribute attribute-id="first_name"><value>Jean paul</value></object-attribute>
            <object-attribute attribute-id="is_subscribed"><value>true</value></object-attribute>
            <object-attribute attribute-id="last_name"><value>Pennati Swiss</value></object-attribute>
            <object-attribute attribute-id="address_line_1"><value>newsletter ADDRESS LINE 1 data</value></object-attribute>
            <object-attribute attribute-id="address_line_2"><value>newsletter ADDRESS LINE 2 data</value></object-attribute>
            <object-attribute attribute-id="address_line_3"><value>newsletter ADDRESS LINE 3 data</value></object-attribute>
            <object-attribute attribute-id="housenumber"><value>newsletter HOUSENUMBER data</value></object-attribute>
            <object-attribute attribute-id="city"><value>newsletter CASLANO</value></object-attribute>
            <object-attribute attribute-id="post_code"><value>6987</value></object-attribute>
            <object-attribute attribute-id="state"><value>TICINO</value></object-attribute>
            <object-attribute attribute-id="country"><value>CH</value></object-attribute>
            <object-attribute attribute-id="phone_home"><value>0044 1234567 newsletter phone_home</value></object-attribute>
            <object-attribute attribute-id="preferred_locale"><value>fr_CH</value></object-attribute>
            <object-attribute attribute-id="exported"><value>true</value></object-attribute>
            <object-attribute attribute-id="profiling"><value>true</value></object-attribute>
            <object-attribute attribute-id="promotions"><value>true</value></object-attribute>
            <object-attribute attribute-id="source"><value>https://www.mysite.com</value></object-attribute>
            <object-attribute attribute-id="source_ip"><value>85.219.17.170</value></object-attribute>
            <object-attribute attribute-id="pr_product_serial_number"><value>000123345678 product serial no.</value></object-attribute>
            <object-attribute attribute-id="pr_purchased_from"><value>Store where product to be registered was purchased</value></object-attribute>
            <object-attribute attribute-id="pr_date_of_purchase"><value></value></object-attribute>
            <object-attribute attribute-id="locale"><value>fr_CH</value></object-attribute> 
        </custom-object>
    </custom-objects>
    I use the following sequence of queries below to do the insert (XML_FILE is passed to the procedure as XMLType) 
    INSERT INTO DW_CUSTOMER.NEWSLETTERS (
       BRANDID,
       CUSTOMER_EMAIL,
       DW_WEBSITE_TAG)
    Select
    p_brandid as BRANDID,
    CUSTOMER_EMAIL,
    p_website
    FROM
    (select XML_FILE from dual) p,
    XMLTable(
    xmlnamespaces(default 'http://www.mysite.com/xml/impex/customobject/2006-10-31'),
    '/custom-objects/custom-object' PASSING p.XML_FILE
    COLUMNS
    customer_email PATH '@object-id'
    ) CUSTOMER_LEVEL1;
    INSERT INTO DW_CUSTOMER.NEWSLETTERS_C_ATT (
       BRANDID, 
       CUSTOMER_EMAIL,
       CUSTOMER_NO, 
       CUSTOMER_TYPE,
       TITLE,
       FIRST_NAME,
       LAST_NAME,
       PHONE_HOME,
       BIRTHDAY,
       ADDRESS1,
       ADDRESS2,
       ADDRESS3,
       HOUSENUMBER,
       CITY,
       POSTAL_CODE,
       STATE,
       COUNTRY,
       IS_SUBSCRIBED,
       PREFERRED_LOCALE,
       PROFILING,
       PROMOTIONS,
       EXPORTED,
       SOURCE,
       SOURCE_IP,
       PR_PRODUCT_SERIAL_NO,
       PR_PURCHASED_FROM,
       PR_PURCHASE_DATE,
       LOCALE,
       DW_WEBSITE_TAG)
        with mainq as (
            SELECT
            CUST_LEVEL1.customer_email as CUSTOMER_EMAIL,
            CUST_LEVEL2.*
            FROM
            (select XML_FILE from dual) p,
            XMLTable(
            xmlnamespaces(default 'http://www.mysite.com/xml/impex/customobject/2006-10-31'),
            '/custom-objects/custom-object' PASSING p.XML_FILE
            COLUMNS
            customer_email PATH '@object-id',
            NEWSLETTERS_C_ATT XMLType PATH 'object-attribute'
            ) CUST_LEVEL1,
            XMLTable(
            xmlnamespaces(default 'http://www.mysite.com/xml/impex/customobject/2006-10-31'),
            '/object-attribute' PASSING CUST_LEVEL1.NEWSLETTERS_C_ATT
            COLUMNS
            attribute_id PATH '@attribute-id',
            thevalue PATH 'value'
            ) CUST_LEVEL2
        )
        select
        p_brandid
        ,customer_email
        ,nvl(max(decode(attribute_id,'customer-no',thevalue)),SET_NEWSL_CUST_ID) customer_no
        ,max(decode(attribute_id,'customer_type',thevalue)) customer_type
        ,max(decode(attribute_id,'title',thevalue)) title
        ,substr(max(decode(attribute_id,'first_name',thevalue)) ,1,64)first_name
        ,substr(max(decode(attribute_id,'last_name',thevalue)) ,1,64) last_name
        ,substr(max(decode(attribute_id,'phone_home',thevalue)) ,1,64) phone_home
        ,max(decode(attribute_id,'birthday',thevalue)) birthday
        ,substr(max(decode(attribute_id,'address_line_1',thevalue)) ,1,100) address_line1
        ,substr(max(decode(attribute_id,'address_line_2',thevalue)) ,1,100) address_line2
        ,substr(max(decode(attribute_id,'address_line_3',thevalue)) ,1,100) address_line3
        ,substr(max(decode(attribute_id,'housenumber',thevalue)) ,1,64) housenumber
        ,substr(max(decode(attribute_id,'city',thevalue)) ,1,128) city
        ,substr(max(decode(attribute_id,'post_code',thevalue)) ,1,64) postal_code
        ,substr(max(decode(attribute_id,'state',thevalue)),1,256) state
        ,substr(max(decode(attribute_id,'country',thevalue)),1,32) country
        ,max(decode(attribute_id,'is_subscribed',thevalue)) is_subscribed
        ,max(decode(attribute_id,'preferred_locale',thevalue)) preferred_locale
        ,max(decode(attribute_id,'profiling',thevalue)) profiling
        ,max(decode(attribute_id,'promotions',thevalue)) promotions
        ,max(decode(attribute_id,'exported',thevalue)) exported   
        ,substr(max(decode(attribute_id,'source',thevalue)),1,256) source   
        ,max(decode(attribute_id,'source_ip',thevalue)) source_ip       
        ,substr(max(decode(attribute_id,'pr_product_serial_number',thevalue)),1,64) pr_product_serial_number
        ,substr(max(decode(attribute_id,'pr_purchased_from',thevalue)),1,64) pr_purchased_from   
        ,substr(max(decode(attribute_id,'pr_date_of_purchase',thevalue)),1,32) pr_date_of_purchase
        ,max(decode(attribute_id,'locale',thevalue)) locale
        ,p_website   
        from
        mainq
        group by customer_email, p_website
    I CANNOT MANAGE TO INSERT 4000 records in less than 30 minutes!
    Can you help or advise how to reduce this to reasonable timings?
    Thanks

    Simplified example on a few attributes :
    -- INSERT INTO tmp_xml VALUES ( xml_file );
    INSERT ALL
      INTO newsletters (brandid, customer_email, dw_website_tag)
      VALUES (p_brandid, customer_email, p_website)
      INTO newsletters_c_att (brandid, customer_email, customer_no, customer_type, title, first_name, last_name)
      VALUES (p_brandid, customer_email, customer_no, customer_type, title, first_name, last_name)
    SELECT o.*
    FROM tmp_xml t
       , XMLTable(
           xmlnamespaces(default 'http://www.mysite.com/xml/impex/customobject/2006-10-31')
         , '/custom-objects/custom-object'
           passing t.object_value
           columns customer_email varchar2(256) path '@object-id'
                 , customer_no    varchar2(256) path 'object-attribute[@attribute-id="customer-no"]/value'
                 , customer_type  varchar2(256) path 'object-attribute[@attribute-id="customer_type"]/value'
                 , title          varchar2(256) path 'object-attribute[@attribute-id="title"]/value'
                 , first_name     varchar2(64)  path 'object-attribute[@attribute-id="first_name"]/value'
                 , last_name      varchar2(64)  path 'object-attribute[@attribute-id="last_name"]/value'
         ) o
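    The speedup in the rewritten query comes from extracting each attribute with a path predicate in a single XMLTable pass instead of exploding every object-attribute into a row and re-aggregating. The same single-pass shaping can be sketched in Python (illustrative data; the element names and namespace are taken from the post):

    ```python
    import xml.etree.ElementTree as ET

    NS = {"co": "http://www.mysite.com/xml/impex/customobject/2006-10-31"}

    xml_doc = """<?xml version="1.0" encoding="UTF-8"?>
    <custom-objects xmlns="http://www.mysite.com/xml/impex/customobject/2006-10-31">
      <custom-object type-id="NEWSLETTER_SUBSCRIBER" object-id="jp@example.com">
        <object-attribute attribute-id="customer-no"><value>BLY00000001</value></object-attribute>
        <object-attribute attribute-id="first_name"><value>Jean paul</value></object-attribute>
      </custom-object>
    </custom-objects>"""

    rows = []
    for obj in ET.fromstring(xml_doc).findall("co:custom-object", NS):
        # '@object-id' in the XMLTable column list:
        row = {"customer_email": obj.get("object-id")}
        # 'object-attribute[@attribute-id="..."]/value' path predicates:
        for attr in obj.findall("co:object-attribute", NS):
            row[attr.get("attribute-id")] = attr.findtext("co:value", None, NS)
        rows.append(row)

    print(rows[0]["customer_email"])  # jp@example.com
    print(rows[0]["customer-no"])     # BLY00000001
    ```

    Note how one pass over each custom-object yields one flat row, which is exactly the shape the INSERT ALL consumes.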

  • Loading 361000 records at a time from csv file

    Hi,
    One of my colleagues loaded 361,000 records from one file. How is this possible, since Excel accepts only 65,536 rows in one file?
    Also, in the InfoPackage the following are selected; what do these mean?
    Data Separator   ;
    Escape Sign      "
    Separator for Thousands   .
    Character Used for Decimal Point   ,
    Please let me know

    hi Maya,
    it is quite possible: besides MS Excel there are editors like TextPad (and Windows Notepad) that support more than 65k rows. The file may have been generated by a program or edited outside Excel, or a newer version of Excel was used; MS Excel 2007 supports more than 1 million rows.
    e.g we have csv file
    customer;product;quantity;revenue
    a;x;"1.250,25";200
    b;y;"5.5";300
    data separator ;
    - the character/delimiter used to separate fields
    escape sign "
    - e.g. "1.250,25";200 then quantity = 1.250,25 (the quotes keep the ; and , inside one field)
    separator for thousands .
    - the . in 1.250,25 groups thousands: one thousand two hundred fifty
    character used for decimal point ,
    - the , in 1.250,25 is the decimal point
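    Those four settings can be sketched with Python's csv module standing in for the InfoPackage parser (the sample values are illustrative):

    ```python
    import csv
    import io

    raw = (
        "customer;product;quantity;revenue\n"
        'a;x;"1.250,25";200\n'
        'b;y;"300,5";300\n'
    )

    def parse_number(s):
        """'.' separates thousands, ',' is the decimal point."""
        return float(s.replace(".", "").replace(",", "."))

    # Data separator ';', escape/quote sign '"'.
    for rec in csv.DictReader(io.StringIO(raw), delimiter=";", quotechar='"'):
        print(rec["customer"], parse_number(rec["quantity"]))
    # a 1250.25
    # b 300.5
    ```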
    check
    http://help.sap.com/saphelp_nw70/helpdata/en/80/1a6581e07211d2acb80000e829fbfe/frameset.htm
    http://help.sap.com/saphelp_nw70/helpdata/en/c2/678e3bee3c9979e10000000a11402f/frameset.htm
    hope this helps.

  • Load multiple records in 1 table from 1 line in text file w/sql loader

    hi guys,
    quick question, perhaps someone can help. searched around and didn't see this question asked before, and can't find any answer in SQL Loader faqs or docs.
    i know i can extract multiple logical records from a single physical record in the input file as two logical records, and then use two into tables clauses to load the data into the table. see oracle 9i sql loader control file reference chapter 5 to see what i am talking about.
    but my question follows:
    cust_id amount1_val amount1_qual amount2_val amount2_qual amount3_val amount3_qual
    123 1500.35 TA 230.34 VZ 3045.50 TW
    basically i want to use one sql loader statement to load these 3 records into 1 table. the issue for me is that i need to re-use the cust_id for all 3 records as the key, along with the qualifier code. the example in the Oracle docs only works for data where the logical records are completely separate -- no shared column values.
    i'm sure this is possible, perhaps using some :cust_id type parameter for the 2nd and 3rd records, or something, but i just don't have enough knowledge/experience with sql loader to know what to do. appreciate any help.
    wayne

    Hi wayne,
    I found an example of exactly what you were looking for in the SQL*Loader documentation. Please see if it is of some help to you.
    EXAMPLE
    The control file is ULCASE5.CTL.
    1234 BAKER 10 9999 101
    1234 JOKER 10 9999 777
    2664 YOUNG 20 2893 425
    5321 OTOOLE 10 9999 321
    2134 FARMER 20 4555 236
    2414 LITTLE 20 5634 236
    6542 LEE 10 4532 102
    2849 EDDS xx 4555
    4532 PERKINS 10 9999 40
    1244 HUNT 11 3452 665
    123 DOOLITTLE 12 9940
    1453 MACDONALD 25 5532
    In the above datafile
    Column1 - Empno
    Column2 - ENAME
    Column3 - Deptno.
    Column4 - MGR
    Column5 - Proj no.
    -- Loads EMP records from first 23 characters
    -- Creates and loads PROJ records for each PROJNO listed
    -- for each employee
    LOAD DATA
    INFILE 'ulcase5.dat'
    BADFILE 'ulcase5.bad'
    DISCARDFILE 'ulcase5.dsc'
    1) REPLACE
    2) INTO TABLE emp
    (empno POSITION(1:4) INTEGER EXTERNAL,
    ename POSITION(6:15) CHAR,
    deptno POSITION(17:18) CHAR,
    mgr POSITION(20:23) INTEGER EXTERNAL)
    2) INTO TABLE proj
    (empno POSITION(1:4) INTEGER EXTERNAL,
    3) projno POSITION(25:27) INTEGER EXTERNAL) -- 1st proj
    Notes:
    REPLACE specifies that if there is data in the tables to be loaded (EMP and PROJ), SQL*loader should delete the data before loading new rows.
    Multiple INTO clauses load two tables, EMP and PROJ. The same set of records is processed three times, using different combinations of columns each time to load table PROJ.
    Regards,
    Murali Mohan
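    For wayne's layout the same mechanism applies: each INTO TABLE clause re-reads the cust_id positions and then picks a different val/qual pair via its own POSITION ranges. The intended result of that fan-out can be cross-checked with a small Python sketch (the sample line is the one from the question):

    ```python
    def explode(line):
        """Fan one physical record out into (cust_id, amount, qualifier) rows,
        repeating cust_id as the shared key -- the effect of multiple INTO TABLE
        clauses whose POSITION ranges all re-read the cust_id columns."""
        fields = line.split()
        cust_id = fields[0]
        return [
            (cust_id, float(fields[i]), fields[i + 1])
            for i in range(1, len(fields), 2)
        ]

    rows = explode("123 1500.35 TA 230.34 VZ 3045.50 TW")
    print(rows)
    # [('123', 1500.35, 'TA'), ('123', 230.34, 'VZ'), ('123', 3045.5, 'TW')]
    ```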

  • Loading Null values from source to target

    Hello all,
    Please help me in this issue....
    I am trying to load data from a source (state column, varchar(50)) into a target (state column, varchar(10)).
    In my source I have 10 records for state: 9 states with length < 10 and 1 NULL value.
    When I try to load the records from source to target, I get this error:
    12899 : 72000 : java.sql.SQLException: ORA-12899: value too large for column "TARGET"."STATE" (actual: 11, maximum: 10)
    Why is it taking the length of NULL as 11?
    Thanks

    There might be some special characters or international characters in the data. Instead of VARCHAR2(10 BYTE), try VARCHAR2(10 CHAR) and try again; some Unicode symbols or characters can take two bytes even though they are single characters.
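    The byte-vs-character distinction behind that advice can be shown directly; a short Python sketch (the state name is illustrative):

    ```python
    # A 10-character value whose accented letter needs 2 bytes in UTF-8.
    s = "Québec-Est"  # illustrative; any value with a multi-byte character behaves the same

    print(len(s))                  # 10 characters -> fits VARCHAR2(10 CHAR)
    print(len(s.encode("utf-8")))  # 11 bytes      -> ORA-12899 in VARCHAR2(10 BYTE)
    ```

    This is exactly the "actual: 11, maximum: 10" pattern in the error: 10 characters, 11 bytes.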

  • Infopackage loads 0 records in process chain as BG job is not running correctly

    Hi Experts,
    I have scheduled a daily process chain for Purchasing.
    The infopackage in the process chain loads 0 records from R/3 and shows green status, but data is available in R/3. It is a FULL update.
    But when I load the data manually, the infopackage fetches the records from R/3.
    When I checked the background job in R/3 that loaded 0 records, the job finished abruptly without fetching any data.
    Please find the job log below:
    Job started
    Step 001 started (program SBIE0001, variant &0000000210024, user ID RFCUSR)
    DATASOURCE = ZPUR_S600
             Current Values for Selected Profile Parameters               *
    abap/heap_area_nondia......... 0                                       *
    abap/heap_area_total.......... 0                                       *
    abap/heaplimit................ 40894464                                *
    zcsa/installed_languages...... ED                                      *
    zcsa/system_language.......... E                                       *
    ztta/max_memreq_MB............ 2047                                    *
    ztta/roll_area................ 3000320                                 *
    ztta/roll_extension........... 2000683008                              *
    Job finished
    But the background job for the request when I run the infopackage manually calls a function BW_BTE_CALL_BW204010_E (BTE) and fetches the record from R/3.
    Please find the job log that fetches the record when run manually.
    Job started
    Step 001 started (program SBIE0001, variant &0000000210036, user ID RFCUSR)
    DATASOURCE = ZPUR_S600
             Current Values for Selected Profile Parameters               *
    abap/heap_area_nondia......... 0                                       *
    abap/heap_area_total.......... 41230008320                             *
    abap/heaplimit................ 40000000                                *
    zcsa/installed_languages...... DE                                      *
    zcsa/system_language.......... E                                       *
    ztta/max_memreq_MB............ 2047                                    *
    ztta/roll_area................ 3000000                                 *
    ztta/roll_extension........... 2000000000                              *
    Call customer enhancement BW_BTE_CALL_BW204010_E (BTE) with 26,596 records
    Result of customer enhancement: 26,596 records
    Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 26,596 records
    Result of customer enhancement: 26,596 records
    Asynchronous send of data package 000001 in task 0002 (1 parallel tasks)
    Call customer enhancement BW_BTE_CALL_BW204010_E (BTE) with 26,596 records
    Result of customer enhancement: 26,596 records
    Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 26,596 records
    Result of customer enhancement: 26,596 records
    Asynchronous send of data package 000002 in task 0004 (1 parallel tasks)
    Call customer enhancement BW_BTE_CALL_BW204010_E (BTE) with 1,062 records
    Result of customer enhancement: 1,062 records
    Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 1,062 records
    Result of customer enhancement: 1,062 records
    Asynchronous send of data package 000003 in task 0006 (1 parallel tasks)
    tRFC: Data Package = 000001, TID = , Duration = 000001, ARFCSTATE =
    tRFC: Start = 20110217 054648, End = 20110217 054649
    tRFC: Data Package = 000003, TID = , Duration = 000000, ARFCSTATE =
    tRFC: Start = 20110217 054652, End = 20110217 054652
    tRFC: Data Package = 000002, TID = , Duration = 000001, ARFCSTATE =
    tRFC: Start = 20110217 054651, End = 20110217 054652
    Job finished
    Kindly help me with this issue: why is the job run through the process chain not fetching any records?
    Regards,
    Indhuja

    Hi Prasanth,
    If the background jobs are not available when the IP is triggered, the BG job will not be executed and the IDocs will get stuck.
    Once we push the IDoc, the BG job will be executed; until then the PC will wait with yellow status.
    Hence this is not an issue with the non-availability of BG work processes.
    Also, this issue had been happening for the last week, but today the job executed successfully through the process chain itself.
    That shows this is not an issue with the PC either.
    The job is executed using RFCUSR.
    Please suggest what would be the possible cause for this issue.
    Regards,
    Indhuja
    Edited by: Induja Rajkamal on Feb 23, 2011 11:48 AM

  • How to load special characters (diacritics)

    When trying to load a record from a csv file into an Oracle 10g database (UTF8 character set) that contains the name Morré (note the accented é), SQL*Loader returns the following error: "no terminator found after TERMINATED and ENCLOSED field".
    Using a backslash (Morr\é) doesn't help.
    Removing the accent (Morre) from the csv file solves it, but that is not really what I want. Any idea how to tackle this?
    Sample record from the csv-file
    AAN0813171,"00979108","Derek Morré","Transactiefinanciering","EXC6654","791742","12-11-08","Open - Bevestigd","1063","8-9-2008
    ","7777"
    ctl-file:
    LOAD DATA
    CHARACTERSET UTF8
    APPEND
    INTO TABLE CNV_PUB_INV_CTRL
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
    (order_no
    ,item_no
    ,fullname
    ,description
    ,no
    ,class_no
    ,start_date "TO_DATE(:start_date, 'DD-MM-RRRR HH24:MI:SS')"
    ,order_item_status
    ,price
    ,updated_on "TO_DATE(:updated_on, 'DD-MM-RRRR HH24:MI:SS')"
    ,bill_no)
    Changing the client NLS_LANG parameter to AMERICAN_AMERICA.WE8ISO8859P1, as suggested in the following link, doesn't help either:
    http://www.akadia.com/services/ora_sql_loader_utf8.html

    Remco,
    Did you ever resolve this? I'm running into a very similar problem with the letter a with a grave accent. Changing it to a standard a works, but I am trying to avoid that, like you. Changing the column definition in the external table to NVARCHAR2 does NOT help.
    Now, interestingly, the instance where this is failing is set to the AL32UTF8 character set. My sandbox instance is set to WE8MSWIN1252, and it works fine there.
    Any ideas?
    Thanks,
    Bob Siegel
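    A common cause of this class of error is that the CHARACTERSET declared in the control file does not match the encoding the csv file was actually saved in, so the loader mis-scans bytes in the middle of a field. A short Python sketch of the mismatch (no Oracle involved):

    ```python
    name = "Morré"

    utf8 = name.encode("utf-8")      # b'Morr\xc3\xa9' -> 6 bytes
    latin1 = name.encode("latin-1")  # b'Morr\xe9'     -> 5 bytes

    # A Latin-1 file declared as CHARACTERSET UTF8: the lone 0xE9 byte is
    # not valid UTF-8, the kind of thing that derails field scanning.
    print(latin1.decode("utf-8", errors="replace"))  # Morr�

    # The reverse mismatch produces the classic mojibake instead:
    print(utf8.decode("latin-1"))  # MorrÃ©
    ```

    So the first thing to verify is the actual encoding of the file (a hex dump of the é is enough: 0xE9 means Latin-1, 0xC3 0xA9 means UTF-8) and make CHARACTERSET match it.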

  • Load an existing Berkeley DB file into memory

    Dear Experts,
    I have created some Berkeley DB (BDB) files onto disk.
    I noticed that when I issue key-value retrievals, the page faults are substantial, and the CPU utilization is low.
    One sample of the time command line output is as follow:
    1.36user 1.45system 0:10.83elapsed 26%CPU (0avgtext+0avgdata 723504maxresident)k
    108224inputs+528outputs (581major+76329minor)pagefaults 0swaps
    I suspect that the bottleneck is the high frequency of file I/O.
    This may be because of page faults of the BDB file, and the pages are loaded in/out of disk fairly frequently.
    I wish to explore how to reduce this page fault, and hence expedite the retrieval time.
    One way I have read is to load the entire BDB file into main memory.
    There are some example programs on docs.oracle.com, under the heading "Writing In-Memory Berkeley DB Applications".
    However, I could not get them to work.
    I enclosed below my code:
    --------------- start of code snippets ---------------
    /* Initialize our handles */
    DB *dbp = NULL;
    DB_ENV *envp = NULL;
    DB_MPOOLFILE *mpf = NULL;
    const char *db_name = "db.id_url";   /* A BDB file on disk, size 66,813,952 */
    u_int32_t open_flags;
    int ret;
    /* Create the environment */
    db_env_create(&envp, 0);
    open_flags =
        DB_CREATE     | /* Create the environment if it does not exist */
        DB_INIT_LOCK  | /* Initialize the locking subsystem */
        DB_INIT_LOG   | /* Initialize the logging subsystem */
        DB_INIT_MPOOL | /* Initialize the memory pool (in-memory cache) */
        DB_INIT_TXN   |
        DB_PRIVATE;     /* Region files are not backed by the filesystem.
                         * Instead, they are backed by heap memory. */
    /* Specify the size of the in-memory cache. */
    envp->set_cachesize(envp, 0, 70 * 1024 * 1024, 1); /* 70 MB, more than the BDB file size of 66,813,952 */
    /* Now actually open the environment. Notice that the environment home
     * directory is NULL. This is required for an in-memory only application. */
    envp->open(envp, NULL, open_flags, 0);
    /* Open the MPOOL file in the environment. */
    envp->memp_fcreate(envp, &mpf, 0);
    int pagesize = 4096;
    if ((ret = mpf->open(mpf, "db.id_url", 0, 0, pagesize)) != 0) {
        envp->err(envp, ret, "DB_MPOOLFILE->open: ");
        goto err;
    }
    int cnt, hits = 66813952 / pagesize;
    void *p = 0;
    for (cnt = 0; cnt < hits; ++cnt) {
        db_pgno_t pageno = cnt;
        mpf->get(mpf, &pageno, NULL, 0, &p);
    }
    fprintf(stderr, "\n\nretrieve %5d pages\n", cnt);
    /* Initialize the DB handle */
    db_create(&dbp, envp, 0);
    /* Set the database open flags. Autocommit is used because we are
     * transactional. */
    open_flags = DB_CREATE | DB_AUTO_COMMIT;
    dbp->open(dbp,        /* Pointer to the database */
              NULL,       /* Txn pointer */
              NULL,       /* File name -- NULL for in-memory */
              db_name,    /* Logical db name */
              DB_BTREE,   /* Database type (using btree) */
              open_flags, /* Open flags */
              0);         /* File mode. Default is 0 */
    DBT key, data;
    int test_key = 103456;
    memset(&key, 0, sizeof(key));
    memset(&data, 0, sizeof(data));
    key.data = (int *)&test_key;
    key.size = sizeof(test_key);
    dbp->get(dbp, NULL, &key, &data, 0);
    printf("%d --> %s ", *((int *)key.data), (char *)data.data);
err:
    /* Close our database handle, if it was opened. */
    if (dbp != NULL)
        dbp->close(dbp, 0);
    if (mpf != NULL)
        (void)mpf->close(mpf, 0);
    /* Close our environment, if it was opened. */
    if (envp != NULL)
        envp->close(envp, 0);
    /* Final status message and return. */
    printf("I'm all done.\n");
    --------------- end of code snippets ---------------
    After compilation, the code output is:
    retrieve 16312 pages
    103456 --> (null) I'm all done.
    However, the test_key input did not get the correct value retrieval.
    I have been reading and trying this for the past 3 days.
    I will appreciate any help/tips.
    Thank you for your kind attention.
    WAN
    Singapore

    Hi Mike
    Thank you for your 3 steps:
    -- create the database
    -- load the database
    -- run you retrievals
    Recall that my original intention is to load in an existing BDB file (70Mbytes) completely into memory.
    So following your 3 steps above, this is what I did:
    Step-1 (create the database)
    I have followed the oracle article on http://docs.oracle.com/cd/E17076_02/html/articles/inmemory/C/index.html
    In this step, I have created the environment, set the cachesize to be bigger than the BDB file.
    However, I have some problem with the code that opens the DB handle.
    The code on the oracle page is as follows:
    /*
     * Open the database. Note that the file name is NULL.
     * This forces the database to be stored in the cache only.
     * Also note that the database has a name, even though its
     * file name is NULL.
     */
    ret = dbp->open(dbp,      /* Pointer to the database */
                    NULL,     /* Txn pointer */
                    NULL,     /* File name is not specified on purpose */
                    db_name,  /* Logical db name. */
                    DB_BTREE, /* Database type (using btree) */
                    db_flags, /* Open flags */
                    0);       /* File mode. Using defaults */
    Note that the open(..) API does not include the BDB file name.
    The documentation says that this is how the API knows that it needs an in-memory database.
    However, how do I tell the API the source of the existing BDB file which I wish to load entirely into memory?
    Do I need to create another DB handle (non-in-memory, with a file name as argument) that reads from this BDB file, and then call DB->put() to insert the records into the in-memory DB?
    Step-2 (load the database)
    My question in this step-2 is the same as my last question in step-1: how do I tell the API to load my existing BDB file into memory?
    That is, should I create another DB handle (non-in-memory) that reads from the existing BDB file, use a cursor to read EVERY key-value pair, and then insert them into the in-memory DB?
    Am I correct to say that by using the cursor to read EVERY key-value pair, I am effectively warming the file cache, so that the BDB retrieval performance can be maximized?
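    Assuming that copy-then-query approach is indeed the intent of steps 1 and 2, the loop could be sketched as below with the libdb C API. The file name "mydb.db", the logical name "in_mem_db", and the pre-opened environments are illustrative assumptions, not from the thread; error checking is omitted for brevity:

```c
#include <string.h>
#include <db.h>   /* Berkeley DB C API; link with -ldb */

/*
 * Sketch: copy every key/value pair from an existing file-based
 * database into a cache-only (in-memory) database. Assumes src_env
 * and mem_env are already-opened environments, with the in-memory
 * cache sized larger than the source file.
 */
int copy_to_memory(DB_ENV *src_env, DB_ENV *mem_env, DB **out)
{
    DB *src, *mem;
    DBC *cur;
    DBT key, data;
    int ret;

    /* Source: a normal handle opened on the existing BDB file. */
    db_create(&src, src_env, 0);
    src->open(src, NULL, "mydb.db", NULL, DB_BTREE, DB_RDONLY, 0);

    /* Destination: the NULL file name forces cache-only storage. */
    db_create(&mem, mem_env, 0);
    mem->open(mem, NULL, NULL, "in_mem_db", DB_BTREE, DB_CREATE, 0);

    /* Walk the source with a cursor and put() each pair.
     * (On older releases the cursor methods are c_get/c_close.) */
    src->cursor(src, NULL, &cur, 0);
    memset(&key, 0, sizeof(key));
    memset(&data, 0, sizeof(data));
    while ((ret = cur->get(cur, &key, &data, DB_NEXT)) == 0)
        mem->put(mem, NULL, &key, &data, 0);

    cur->close(cur);
    src->close(src, 0);
    *out = mem;   /* keep this handle open for the retrievals */
    return ret == DB_NOTFOUND ? 0 : ret;
}
```

    With this pattern the later retrievals use the same get()/cursor calls as a file-based database; only the open() call differs.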
    Step-3 (run your retrievals)
    Are the retrieval APIs, e.g. c_get(..) and get(..), the same for the in-memory DB as for the file-based DB?
    Thank you and always appreciative for your tips.
    WAN
    Singapore

  • What is the best way to load 14 million COPA records from BW into HANA?

    I have been managing a project in which we are attempting to load COPA data from BW into HANA using Open Hub, and we continue to run into memory allocation errors in BW. So far we have been able to load 350,000 records.
    Any suggestions on the best approach, along with the relevant BW memory parameters?
    Your replies are appreciated.
    Rob

    Hello,
    this seems to be an issue in BW caused by the big volume of migrated data. I do not think this is a HANA-related problem. I would suggest posting this message in the BW area - you might get much better support there.
    But to help as much as I can - I found this (see point 7):
    http://help.sap.com/saphelp_nw04/helpdata/en/66/76473c3502e640e10000000a114084/frameset.htm
    7. Specify the number of rows per data package for the data records to be extracted. You can use this parameter to control the maximum size of a data package, and hence also how many main memories need to be made available to structure the data package.
    Hope it helps.
    Tomas

  • HT1202 I have an iPod classic loaded with songs from my old computer that were put on the iPod from CDs. Can I transfer the songs from my iPod to the new computer, or do I have to put the songs back in my library by re-recording them one disc at a time?

    I have an iPod classic loaded with songs from my old computer that were put in my library from CDs. Can I transfer those songs onto my new computer from my iPod, or do I need to re-record them to my library one disc at a time?

    See this excellent user tip from another forum member turingtest2 outlining the different methods and software available to help you copy content from your iPod back to your PC and into iTunes.
    Recovering your iTunes library from your iPod or iOS device
    B-rock

  • How to look at the delta (daily load) records from R/3 system to BW master

    How can I look at the delta (daily load) records from the R/3 system to BW master data on a particular date? Let us say
    today a delta happened from R/3 to 0vendor in BW; how can I view which records came to BW (0vendor) from R/3 in today's delta?
    Regards
    Siva

    Hi,
    you can see the data in the PSA.
    Just go to the request in the monitor tab, and in the monitor click the PSA button at the top left.
    In the next window, enter a selection for the number of records you want to see; if you want to see all the records, enter the number that was loaded in that request.
    This will show you the data loaded in that request.
    Hope it helps
    thanks

  • SQL Loader problem while loading records from txt file to database table.

    I am getting the following error while loading records from a flat txt file into a database table with the 'sqlldr' command. I have executed catldr.sql from the RDBMS folder, but it still shows the same error. I am setting DIRECT=TRUE when issuing the sqlldr command. If I try with DIRECT=FALSE then it works fine. The database is Oracle 8i.
    SQL*Loader-951: Error calling once/load initialization
    ORA-24329: invalid character set identifier
    F1 Please.

    Hello,
    A direct path load can only be used when SQL*Loader and the database are the same version.
    Could you tell us the database version and the SQL*Loader version you are using?
    -Sri
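    As a further hedged suggestion (not from the reply above): ORA-24329 is a character-set error, so it is also worth confirming that NLS_LANG is set to a valid value in the client environment before retrying the direct load. The credentials, control file, and character set below are purely illustrative:

```shell
# Running sqlldr with no arguments prints the client version banner;
# compare it with the database version (e.g. SELECT * FROM v$version).
sqlldr

# Illustrative: set a valid NLS_LANG for the client (replace the
# character set and credentials), then retry the direct path load.
export NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1
sqlldr userid=scott/tiger control=orders.ctl direct=true
```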

  • Sqlldr is loading only 1st record from xml document

    Hi,
    I am trying to load XML doc with multiple records using sql*loader.
    I have registered my XSD perfectly.
    This is my control file
    LOAD DATA
    INFILE *
    INTO TABLE Orders APPEND
    XMLType(xmldata)
    FIELDS(
         xmldata LOBFILE (CONSTANT FULDTL_2.xml)
    TERMINATED BY '???')
    BEGINDATA
    FULDTL_2.xml
    -- Here, what do I have to give for TERMINATED BY '???'
    My xml doc
    <Order ID="146120486" Status="CL" Comments="Shipped On 08/05/2008"/>
    <Order ID="143417590" Status="CL" Comments="Handset/Device has been received at NRC" ShipDate=""/>
    sqlldr is loading only the 1st record from the file.
    How can I make it load all the records from my XML doc?
    Thanks in advance.

    thanks for both the replies above - essentially the same correct solution.
    something worth noting, now that I've written and tested both a SAX solution and a DOM solution, is that there is a significant (4x) time penalty using SAX.
    I am considering dividing the vector I am storing/recovering into chunks and saving each chunk separately using DOM to speed things up...
    any thoughts on this approach?

  • Fetch records from ETL Load control tables in BODS

    Hi,
    Could anyone please tell me how to fetch the records from the ETL Load control tables in BODS?
    (E.g) ETL_BATCH, ETL_JOB, ETL_DATAFLOW, ETL_RECON, ETL_ERROR.
    These are some ETL load tables..
    Thanks,
    Regards,
    Ranjith.

    Hi Ranjith,
    You can ask your administrator for the BODS repository login details.
    Once you have the login details, you will be able to see all the tables.
    Please check following links :
    Data Services Metadata Query Part 1
    http://www.dwbiconcepts.com/etl/23-etl-bods/171-data-services-metadata-query-part-2.html
    http://www.dwbiconcepts.com/etl/23-etl-bods/171-data-services-metadata-query-part-3.html
    I hope this will help.
    If you want more info then please let me know.
    Thanks,
    Swapnil
