CSV file as input?

Hello,
I have a CSV file with voltages. I would like to read this CSV file with a LabVIEW program and send the voltages to an analog output of a DAQ card (one voltage value each microsecond).
Is that possible?
regards
doctorbluex

Read your file using "Read From Spreadsheet File" and wire a comma as the delimiter (transpose if needed).
You'll get an array that you can use for any purpose (display in a graph, send to AO, etc.). Just slice out the desired row or column.
LabVIEW Champion. Do more with less code and in less time.
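Outside LabVIEW, the same read-then-slice pattern can be sketched in Python; the two-column layout and the voltage values below are made up for illustration:

```python
import csv
import io

# Stand-in for the voltage CSV; in practice, replace with open("voltages.csv").
data = "0.00,0.05\n0.10,0.15\n0.20,0.25\n"

# Read the whole file into a 2-D list, then slice out one column,
# the equivalent of "Read From Spreadsheet File" plus Index Array.
rows = [[float(v) for v in line] for line in csv.reader(io.StringIO(data))]
first_column = [row[0] for row in rows]
```

In LabVIEW itself, the same slicing is done with Index Array on the 2-D array returned by Read From Spreadsheet File.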

Similar Messages

  • Reg: .csv file as input parameter in SQL*Loader

    Hi,
    I have a .ctl file. Every time I receive the file under a different name.
    Rather than hardcode the file name, I want to take the .csv file as an input parameter. Please help with this.
    here is the code..
    OPTIONS (SKIP = 1, BINDSIZE=100000)
    LOAD DATA
    CHARACTERSET WE8ISO8859P1
    INFILE '/WOAU1/bkp/pgp_masterkey.csv'
    BADFILE '/WOAU1/bkp/pgp_masterkey.bad'
    DISCARDFILE '/WOAU1/bkp/pgp_masterkey.dsc'
    Thanks
    Atul

    A better alternative would be to avoid using SQL*Loader and instead use External Tables for which you can use an ALTER TABLE statement to change the LOCATION of the table (which details the filename). (A valid reason for using EXECUTE IMMEDIATE in PL/SQL). That way you keep all your control inside the database and you're not messing about with o/s scripts to pass different parameters to an external program.

  • Importing 3 CSV files as input with a Browse button; comparing 3 CSV files

    I have 3 .CSV files as my input:
    1. true file
    2. limit file
    3. reference file
    I want to compare the limit file and the true file, delete some columns from the true file, and for the rest of the columns calculate the mean and standard deviation. How do I approach the problem?

    If possible, the best thing would be to bring the contents into staging tables and do the comparison in SQL. This has the advantage of allowing set-based logic and will perform much better than trying to work directly from the file.
    Please mark this as answer if it helps to solve the issue. Visakh
    http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs
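    The staging-table idea can be sketched with in-memory SQLite; the file contents, table names, and the "exclude keys listed in the limit file" rule below are hypothetical stand-ins:

```python
import csv
import io
import sqlite3

# Hypothetical contents of the "true" and "limit" files.
true_csv = "col,value\nA,1.0\nA,3.0\nB,5.0\n"
limit_csv = "col\nB\n"  # keys to exclude, per the limit file

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE true_file (col TEXT, value REAL)")
conn.execute("CREATE TABLE limit_file (col TEXT)")
for rec in csv.DictReader(io.StringIO(true_csv)):
    conn.execute("INSERT INTO true_file VALUES (?, ?)", (rec["col"], rec["value"]))
for rec in csv.DictReader(io.StringIO(limit_csv)):
    conn.execute("INSERT INTO limit_file VALUES (?)", (rec["col"],))

# Set-based comparison: drop rows whose key appears in the limit file,
# then aggregate the rest (mean here; standard deviation would need an
# extension function or application code in SQLite).
result = conn.execute("""
    SELECT col, AVG(value)
    FROM true_file
    WHERE col NOT IN (SELECT col FROM limit_file)
    GROUP BY col
""").fetchall()
```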

  • Help me in creating a Device Collection - I have a list of machine names (in an Excel or CSV file)

    Hello Guys,
    I have created a Device collection for UK region (2000+ machines)
    Now I have been given a list of 1000 machines to which I need to deploy an application.
    I have to create a device collection for these 1000+ machines; as input I have an Excel or CSV file with a list of machine names.
    Please suggest how I can create a device collection with a CSV file as input. Does my CSV file need to be in a particular format?
    Or is there any other way I can create a collection for these 1000 specific machines.
    Please suggest.

    My previous post was for SCCM 2012; here it is for 2007.
    In the Operating System Deployment section of SCCM, right-click Computer Association and choose
    Import Computer Information.
    When the wizard appears, select Import Computers using a file.
    The file itself must contain the information we need in this (CSV) format:
    COMPUTERNAME,GUID,MACADDRESS
    (sample below)
    Quote
    deployvista,3ED92460-0448-6C45-8FB8-A60002A5B52F,00:03:FF:71:7D:76
    NEWCOMP1,55555555-5555-5555-5555-555555555555,05:06:07:08:09:0A
    NEWPXE,23CA788C-AF62-6246-9923-816CFB6DD39F,00:03:FF:72:7D:76
    w2k8deploy,BFAD6FF2-A04E-6E41-9060-C6FB9EDD4C54,00:03:FF:77:7D:76
    Looking at the last line: the first field is the computer name, the second the GUID and the third the MAC address; separate these values with commas as above.
    w2k8deploy,BFAD6FF2-A04E-6E41-9060-C6FB9EDD4C54,00:03:FF:77:7D:76
    The file can be a standard text file that you create in Notepad; you can rename it to .csv for easier importing into the wizard.
    So click Browse and browse to where you've got your CSV file.
    On the Choose Mapping screen you can select columns and define what to do with each mapping; e.g. you could tell it to ignore the GUID value (we won't, however).
    On the next screen you'll see a Data Preview; this is useful as it highlights any errors it finds with a red exclamation mark. In one test, a typo meant that it correctly flagged the MAC address as invalid.
    So edit your CSV file again to fix the error, click Previous (back) and try again.
    Next choose the target collection where you want these computers to end up in
    review the summary
    in SCCM collections, we can now see the computers we've just imported from File,
    Enjoy
    Nikkoscy
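    A rough pre-import sanity check in the spirit of the wizard's Data Preview can be scripted; the regexes below are approximations of what the wizard accepts, and the second sample row is deliberately invalid:

```python
import csv
import io
import re

guid_re = re.compile(r"^[0-9A-Fa-f]{8}(-[0-9A-Fa-f]{4}){3}-[0-9A-Fa-f]{12}$")
mac_re = re.compile(r"^([0-9A-Fa-f]{2}:){5}[0-9A-Fa-f]{2}$")

sample = ("deployvista,3ED92460-0448-6C45-8FB8-A60002A5B52F,00:03:FF:71:7D:76\n"
          "badrow,not-a-guid,zz:zz\n")

# Collect the 1-based line numbers of rows that fail validation.
bad_rows = []
for lineno, (name, guid, mac) in enumerate(csv.reader(io.StringIO(sample)), 1):
    if not (name and guid_re.match(guid) and mac_re.match(mac)):
        bad_rows.append(lineno)
```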

  • Import-csv how to make powershell ignore a line in a csv file if a column contains a certain value

    Basically I've got a very basic script that uses a CSV file to input the values needed to remove people from a distro list. That part is working fine. I'd like to add to its functionality so it can look at the values in a certain column, and if any of certain values are present, I want PowerShell to skip that line and not process it. For instance, I have a column in the CSV called Group. If a listing under Group says ABC, I want the script to skip that line and not try to process it. What could I insert into the script to achieve this?

    You're welcome. You can add to the if test with -and:
    Import-Csv .\groups.csv | ForEach-Object {
        If ($_.Group -ne 'ABC' -and $_.Group -ne 'DEF') {
            Write-Output "Group is $($_.Group)"
            Write-Output "Name is $($_.Name)"
        }
    }
    Output:
    Group is GHI
    Name is Show Me
    groups.csv:
    Group,Name
    ABC,Skip Me
    DEF,Skip Me Too
    GHI, Show Me

  • Handling CSV files from Crystal 10

    I'm encountering some difficulty with a report that takes a CSV file as input. Crystal is automatically interpreting a column as a numeric data type instead of text. Any ideas how I can force Crystal to interpret it differently? Thanks a lot.

    Hi, thanks for the reply. I'm connecting using ADO for Access/Excel. Also tried using ODBC for text file. Column  number 9 ('Sec. ID num') is the column in question.
    Here's the contents of a sample file (it's not too small unfortunately)
    Record Type,Portfolio Number,Purs/Sale,Trans. Number,Settlement Date,Trade Date,Asset Type,Sec. ID Type,Sec. ID Num,Sec Name,Euroclear Ind,Nominal,Quoted Ccy,Trade Price,Consideration,Comm,Misc. Comm,Inc/Pch Sold,Sett Ccy,Sett Cash,Trd Broker,Trd Broker Nm,Trd Brok Acc #,Sett Broker,Sett Broker Nm,Sett Broker Acc#,Comments,Trade Src ID,Cancel Trade,Rel Trans ID, Port Num,Dim Trans Num,Header 1,Header 2,Header 3,Header 4,Header 5
    DET,AAAA,P,3072779,20070126,20070123,EQ,SD,AAAABBB,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,10100.00,MLILGB3LESF,MERRIL-02,,BKAUATWW,,,,9644303,N,0,A-SUMMBAL,20070124000002,HDR,9644303,16184,20050201,161849
    DET,AAAA,P,3072779,20070126,20070123,EQ,SD,4943111,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,10100.00,MLILGB3LESF,MERRIL-02,,BKAUATWW,,,,9644303,N,0,A-SUMMBAL,20070124000002,HDR,9644303,16184,20070201,161849
    DET,AAAA,S,3072780,20070126,20070123,EQ,SD,5541979,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,9900.00,SBILGB2L,SALBRO-01,,CITTGB2L,,,,9644303,N,0,A-SUMMBAL,20070124000003,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072781,20070126,20070123,EQ,SD,2676302,Stock A,,1000,CAD,10,10000,100.0000000,0.0000000,0.00,CAD,10100.00,SLIIGB2L,LEHBRO-01,T12702711,ROYCCAT2,,/ADEP/RBCT,,9644303,N,0,A-SUMIGLOB,20070124000004,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072782,20070126,20070123,EQ,SD,7103065,Stock A,,1000,CHF,10,10000,100.0000000,0.0000000,0.00,CHF,10100.00,MLILGB3LESF,MERRIL-02,,MLILGB3LESF,,/ADEP/GB101073,,9644303,N,0,A-SUMIGLOB,20070124000005,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072783,20070126,20070123,EQ,SD,5086577,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,10100.00,MLILGB3LESF,MERRIL-02,6500440000,PARBDEFF,,/ADEP/0007259,,9644303,N,0,A-SUMIBAL,20070124000006,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072784,20070126,20070123,EQ,SD,B01CP21,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,10100.00,MLILGB3LESF,MERRIL-02,,PARBESM,,,,9644303,N,0,A-SUMMTECH,20070124000007,HDR,9644303,16184,20070201,161849
    DET,AAAA,S,3072785,20070126,20070123,EQ,SD,5051252,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,9900.00,MLILGB3LESF,MERRIL-02,,NDEAFIHH,,,,9644303,N,0,A-SUMMTECH,20070124000008,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072786,20070126,20070123,EQ,SD,7088429,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,10100.00,ABNAGB22,ABNAMR-03,23667,ABNAGB22,,/ADEP/00611,,9644303,N,0,A-SUMMBAL,20070124000009,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072787,20070126,20070123,EQ,SD,B00G0S5,Stock A,,1000,HKD,10,10000,100.0000000,110.0000000,0.00,HKD,10210.00,MLPFUS31XXX,MERRIL-02,000000438280,MLFEHKHH,,,,9644303,N,0,A-SUMIGLOB,20070124000010,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072788,20070126,20070123,EQ,SD,5013832,Stock A,,1000,EUR,10,10000,100.0000000,101.2500000,0.00,EUR,10201.25,DAVYIE21,DAVBRO-01,,DAVYIE21,,/CRST/189,,9644303,N,0,A-SUMMBAL,20070124000011,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072789,20070126,20070123,EQ,SD,2569286,Stock A,,1000,USD,10,10000,100.0000000,0.0000000,0.00,USD,10100.00,MLPFUS31XXX,MERRIL-02,,MLPFUS31,,/ADTC/161,,9644303,N,0,A-SUMIGLOB,20070124000012,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072790,20070126,20070123,EQ,SD,B0DJNG0,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,10100.00,SBILGB2L,SALBRO-01,,CITIITMX,,,,9644303,N,0,A-SUMMGROW,20070124000013,HDR,9644303,16184,20070201,161849
    DET,AAAA,S,3072791,20070126,20070123,EQ,SD,6435145,Stock A,,1000,JPY,10,10000,100.0000000,0.0000000,0.00,JPY,9900.00,MLPFUS31XXX,MERRIL-02,,MLCOJPJT,,,,9644303,N,0,A-SUMIGLOB,20070124000014,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072792,20070126,20070123,EQ,SD,4942818,Stock A,,1000,USD,10,10000,100.0000000,0.0000000,0.00,USD,10100.00,MSNYUS33,MORGAN-02,,MSNYUS33,,/ADTC/050,,9644303,N,0,A-SUMIGLOB,20070124000015,HDR,9644303,16184,20070201,161849
    DET,AAAA,S,3072793,20070126,20070123,EQ,SD,5727360,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,9900.00,MSLNGB2X,MORGAN-01,47612G,PARBFRPPNLC,,/ADEP/019,,9644303,N,0,A-SUMMGROW,20070124000016,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072794,20070126,20070123,EQ,SD,B11HK39,Stock A,,1000,NOK,10,10000,0.0000000,0.0000000,0.00,NOK,10000.00,MIDLGB22JAC,HSBCGP-01,,ESSENOKX,,,,9644303,N,0,A-SUMIBAL,20070124000017,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072795,20070126,20070123,EQ,SD,5959378,Stock A,,1000,SEK,10,10000,100.0000000,0.0000000,0.00,SEK,10100.00,MLILGB3LESF,MERRIL-02,,ESSESESS,,,,9644303,N,0,A-SUMMTECH,20070124000018,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072796,20070126,20070123,EQ,SD,6916844,Stock A,,100,SGD,10,1000,100.0000000,0.0000000,0.00,SGD,1100.00,MLPFUS31XXX,MERRIL-02,,MLSSSGSG,,,,9644303,N,0,A-SUMIGLOB,20070124000019,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072797,20070126,20070123,EQ,SD,2113382,Stock A,,1000,USD,10,10000,100.0000000,0.0000000,0.00,USD,10100.00,MLPFUS31XXX,MERRIL-02,,MLPFUS31,,/ADTC/161,,9644303,N,0,A-SUMMTECH,20070124000020,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072798,20070126,20070123,EQ,SD,3134865,Stock A,,1000,GBP,10,10000,100.0000000,101.0000000,0.00,GBP,10201.00,CITIGB2L,SANBER-01,,CITIGB2L,,/CRST/899,,9644303,N,0,A-SUMMGROW,20070124000021,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072799,20070126,20070123,EQ,SD,2890005,Stock A,,1000,USD,10,10000,100.0000000,0.0000000,0.00,USD,10100.00,MLPFUS31XXX,MERRIL-02,,MLPFUS31,,/ADTC/161,,9644303,N,0,A-SUMMTECH,20070124000022,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072815,20070126,20070123,EQ,SD,2890005,Stock A,,1000,USD,11,11000,101.0000000,0.0000000,0.00,USD,11101.00,MLPFUS31XXX,MERRIL-02,,MLPFUS31,,/ADTC/161,,9644303,Y,3072799,A-SUMMTECH,20070124000022,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072800,20070126,20070123,FI,SD,7624641,Stock A,Y,1000,EUR,100,1000,0.0000000,0.0000000,21.16,EUR,1021.16,JPMSGB2L,CHAINV-04,,JPMSGB2L,,/AEID/95724,,9644303,N,0,A-SUMMGROW,20070124000023,HDR,9644303,16184,20070201,161849
    DET,AAAA,S,3072801,20070126,20070123,FI,SD,7624641,Stock A,Y,1000,EUR,100,1000,0.0000000,0.0000000,21.16,EUR,1021.16,BARCGB33,BARCLY-03,,BARCGB33,,/ACID/34797,,9644303,N,0,A-SUMMGROW,20070124000024,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072802,20070126,20070123,FI,SD,3258486,Stock A,Y,1000,EUR,100,1000,0.0000000,0.0000000,25.20,EUR,1025.20,RBOSGB2RTCM,RBSGRP-01,,RBOSGB2RTCM,,/AEID/97802,,9644303,N,0,A-SUMMGROW,20070124000025,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072810,20070126,20070123,FI,SD,3258486,Stock A,Y,1000,EUR,100,1000,0.0000000,0.0000000,25.20,EUR,1025.20,RBOSGB2RTCM,RBSGRP-01,,RBOSGB2RTCM,,/AEID/97802,,9644303,Y,3072802,A-SUMMGROW,20070124000025,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072803,20070126,20070123,FI,SD,B0LNX64,Stock A,,1000,GBP,100,1000,0.0000000,0.0000000,16.42,GBP,1016.42,BARCGB33,BARCLY-03,,BARCGB33,,/CRST/034,,9644303,N,0,A-SUMMBAL,20070124000026,HDR,9644303,16184,20070201,161849
    DET,AAAA,P,3072804,20070126,20070123,FI,SD,B06FWG8,Stock A,Y,1000,EUR,100,1000,0.0000000,0.0000000,15.78,EUR,1015.78,DEUTGB2L,DEUTSC-03,,DEUTGB2L,,/AEID/91255,,9644303,N,0,A-SUMMGROW,20070124000027,HDR,9644303,16184,20070201,161849
    TLR,28,219738.17,9644303,,,,,,,,,,,,,,,,,,,,,,,,,,,,,HDR,9644303,16184,20070201,161849

  • Importing several CSV files into Excel

    Hello Everyone,
    I managed to piece together this code which runs several powershell scripts that each output a CSV for every day of the week.
    Next, I located a function that takes a CSV file as input and exports that as a worksheet in an Excel document.
    My problem is, the last column in the Excel file should be the first column, and I just cannot spot why this behavior is occurring.
    #First phase, output CSV files used later in the script.
    Monday.ps1
    Tuesday.ps1
    Wednesday.ps1
    Thursday.ps1
    Friday.ps1
    Saturday.ps1
    Sunday.ps1
    #Now the function to export the CSV from Phase 1 into an Excel Spreadsheet.
    function Export-Excel {
        [CmdletBinding()]
        Param([Parameter(ValueFromPipeline=$true)]$junk)
        begin{
            $header=$null
            $row=1
        }
        process{
            if(!$header){
                $i=0
                # Note: Get-Member lists NoteProperties alphabetically, which is the
                # likely cause of the column-order problem; $_.PSObject.Properties
                # preserves the CSV column order instead.
                $header=$_ | Get-Member -MemberType NoteProperty | Select-Object Name
                $header | ForEach-Object {$Global:ws.Cells.Item(1,++$i)=$_.Name}
            }
            $i=0
            ++$row
            foreach($field in $header){
                $Global:ws.Cells.Item($row,++$i)=$($_."$($field.Name)")
            }
        }
    }
    $xl=New-Object -ComObject Excel.Application
    $wb=$xl.WorkBooks.add(1)
    $Global:ws=$wb.WorkSheets.item(1)
    $Global:ws.Name='Sunday'
    import-csv 'C:\Sunday.csv' | Export-Excel
    $Global:ws=$wb.WorkSheets.Add()
    $Global:ws.Name='Saturday'
    import-csv 'C:\Saturday.csv' | Export-Excel
    $Global:ws=$wb.WorkSheets.Add()
    $Global:ws.Name='Friday'
    import-csv 'C:\Friday.csv' | Export-Excel
    $Global:ws=$wb.WorkSheets.Add()
    $Global:ws.Name='Thursday'
    import-csv 'C:\Thursday.csv' | Export-Excel
    $Global:ws=$wb.WorkSheets.Add()
    $Global:ws.Name='Wednesday'
    import-csv 'C:\Wednesday.csv' | Export-Excel
    $Global:ws=$wb.WorkSheets.Add()
    $Global:ws.Name='Tuesday'
    import-csv 'C:\Tuesday.csv' | Export-Excel
    $Global:ws=$wb.WorkSheets.Add()
    $Global:ws.Name='Monday'
    import-csv 'C:\Monday.csv' | Export-Excel
    $xl.Visible=$true

    That is interesting considering this script I found on Hey Scripting Guy.
    http://blogs.technet.com/b/heyscriptingguy/archive/2010/09/09/copy-csv-columns-to-an-excel-spreadsheet-by-using-powershell.aspx
    I have run your version above and have no issues with order. 
    I still recommend using WorkBook.OpenText($csvfile)
    and
    $wb.Sheets($csvfile).Move($wb2.Sheet(1))
    This is much faster and takes much less code.
    ¯\_(ツ)_/¯
    Could you give a terse example?  I'm trying to put something together but am not having much success.
    $sheets = @(LS D:\Scripts\work | select FullName -ExpandProperty FullName)
    $Excel = New-Object -ComObject excel.application
    $Excel.visible = $false
    $file = $Excel.Workbooks.Open("C:\tmp\mytest.xlsx")
    #$wb2 = $file.Sheets
    for ($i=0; $i -lt $sheets.length; $i++) {
        $workbook = $Excel.WorkBooks.OpenText($sheets[$i])
        $workbook.Sheets($sheets[$i]).Move($file.Sheets(1))
    }
    $Excel.visible = $true

  • B1if Debugging a csv file import?

    Hi All
    I am new to B1if, what are the steps involved in debugging a scenario that takes a csv file as input?
    I.e. How do you specify a given file as input at the step level if this is possible?

    In the Connection Manager properties for the CSV file, set the connection string through an expression, something like below:
    @[User::myfilename] + "/Bensfile" + (DT_STR,4,1252)DATEPART( "yyyy" , getdate() ) + RIGHT("0" + (DT_STR,4,1252)DATEPART( "mm" , getdate() ), 2) + RIGHT("0" + (DT_STR,4,1252)DATEPART( "dd" , getdate() ), 2) +".csv"
    where myfilename will be the variable storing the full file path.
    Surender Singh Bhadauria
    My Blog
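    For reference, the yyyymmdd filename that expression builds can be mirrored in a few lines of Python; the folder variable and "Bensfile" prefix come from the expression above:

```python
from datetime import date

def build_filename(myfilename: str, today: date) -> str:
    # Mirrors: @[User::myfilename] + "/Bensfile" + yyyy + mm + dd + ".csv"
    return f"{myfilename}/Bensfile{today:%Y%m%d}.csv"

name = build_filename("/data/in", date(2024, 1, 5))
```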

  • How to process large input CSV file with File adapter

    Hi,
    could someone recommend the right BPEL way to process a large input CSV file (4 MB or more, with at least 5000 rows) with the File Adapter?
    My idea is to receive data from the file (poll the UX directory for new input files), transform it, and then export it to one output CSV file (the input for another system).
    I developed my process that consists of:
    - File adapter partnerlink for read data
    - Receive activity with checked box to create instance
    - Transform activity
    - Invoke activity for writing to output CSV.
    I tried this with a small input file and everything was OK, but now when I try to use the complete input file, the process doesn't start and automatically goes to the OFF state in the BPEL console.
    Could I use the MaxTransactionSize parameter as in the DB adapter, should I batch the input file, or is there another way?
    Any hint from you? I have to solve this problem by this Thursday.
    Thanks,
    Milan K.

    This is a known issue. Martin Kleinman has posted several issues on the forum here, with a similar scenario using ESB. This can only be solved by completely tuning the BPEL application itself, and throwing in big hardware.
    Also switching to the latest 10.1.3.3 version of the SOA Suite (assuming you didn't already) will show some improvements.
    HTH,
    Bas
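    If batching the input file turns out to be the way forward, splitting it into fixed-size chunks before the adapter polls the directory is one option; a minimal sketch, with chunk size and sample data chosen arbitrarily:

```python
import csv
import io

def split_csv(text: str, rows_per_chunk: int):
    """Split CSV text into chunks, repeating the header line in each chunk."""
    reader = csv.reader(io.StringIO(text))
    header = next(reader)
    rows = list(reader)
    chunks = []
    for start in range(0, len(rows), rows_per_chunk):
        out = io.StringIO()
        writer = csv.writer(out)
        writer.writerow(header)
        writer.writerows(rows[start:start + rows_per_chunk])
        chunks.append(out.getvalue())
    return chunks

chunks = split_csv("a,b\n1,2\n3,4\n5,6\n", 2)
```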

  • Data-Driven test : Compilation should be avoided while running tests in batch when .csv file inputs changed to use them in script

    Hi,
    I am running a data-driven test on different machines with different input values in a .CSV file in batch mode. We are facing the following problem:
    the test does not pick up modified values in the .CSV file until we recompile the test.
    Is there any way to avoid this recompilation after updating the .CSV file?
    Regards,
    Nagasree.

    Assuming the CSV is part of the Visual Studio solution. Open the properties panel for the CSV file from solution explorer. Set "Copy to output directory" to "Copy if newer" or to "Copy always". Some documents recommend
    "Copy if newer" but I prefer "Copy always" as occasionally a file was not copied as I expected. The difference between the two copy methods is a little disk space and a little time, but disks are normally big and the time to copy is normally
    small. Any savings are, in my opinion, far outweighed by being sure that the file will be copied correctly.
    See also
    http://stackoverflow.com/questions/23469100/how-to-run-a-test-many-times-with-data-read-from-csv-file-data-driving/25742114#25742114
    Regards
    Adrian

  • Variables to pass the input CSV file names

    Hi,
    I have created an interface sourced from 2 different CSV files.
    Do I need to generate 2 different scenarios, or one scenario with a variable to pass the 2 file names one after the other?
    The 2 CSVs are of the same type.
    Thanks for the help

    Step 1. Create an interface, load the first file, and test that it works fine.
    Step 2. Change the resource name in the file datastore to a variable.
    Step 3. Set the variable to file 1 and call the interface; then set the variable to file 2, call the interface again, and run the package.
    This way you can use the same interface to load any file with the same structure.
    In case you need to apply this same logic to multiple files,
    declare the variable, call the interface inside a package, and create a scenario. Now trigger the scenario and pass the file name.

  • Split records into Multiple csv files using a Threshold percentage

    Hi Gurus,
    I have a requirement to split the data into two csv file from a table using a threshold value(in Percentage) .
    Assume that If my source select query of interface fetches 2000 records , I will provide a threshold value like 20%.
    I need to generate a csv1 with 400 records(20% of 2000) and the rest of the records into another csv2.
    For implementing this I am trying to use the following process.
    1) Create a procedure with the select query to get the count of records.
    Total Records count: select count(1) from source_table <Joins> <Lookups> <Conditions>;
    2) Calculate the record count for the first CSV using the threshold value.
    CSV1_Count = Total_records_count * threshold_value / 100
    3) Create a view that fetches the CSV1_Count(400) records for CSV1 as follows.
    Create view CSV1_view as select Col1,Col2,Col3 from source_table <Joins> <Lookups> <Conditions>
    Where rownum<=CSV1_Count;
    4) Generate CSV1 file using View 'CSV1_View'
    5) Generate CSV2 File using the Interface with same select statement (with columns ) to generate a CSV.
    select Col1,Col2,Col3 from source_table ST <Joins> <Lookups> <Conditions>
    Left outer join (Select Col1 from CSV1_View ) CS on CS.Col1=ST.Col1 where CS.Col1 is null;
    Which gives the Total records minus the CS1_View records.
    The above process seems a bit complex for something this simple, and if anything changes in my interface I also need to change the procedure (which counts the no. of records).
    Please provide your comments and feedback about this and looking for your inputs for any new simple approach or fine tune the above approach.
    Thanks,
    Arjun

    Arjun,
    These are my thoughts; let's do it in 3 steps.
    Step 1. ODI procedure:
    Drop table Temp_20;
    Create table Temp_20 as select * from table where rownum < ( SELECT TRUNC( COUNT(1) / 5 ) FROM table );
    [** This way I am fetching approx 20% of the table data and loading it into a temp table; 1/5th is 20%, so I divide the count by 5.
    I don't believe a view will help you, especially with ROWNUM: if you run the same query with rownum < N, the row order might differ, so a temp table is better. ]
    Step 2. Use OdiSqlUnload with select columns from Temp_20.
    Step 3. Use OdiSqlUnload again with select columns from table where ( uk keys ) not in ( select uk_keys from Temp_20 ).
    [** This way you pick up the remaining 80%, and the data will not repeat itself across the 20% and 80%, as might happen with a view. ]
    what do you think ?
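    Either way, the split itself reduces to one cutoff computation; a minimal Python sketch of the 20%/80% idea (in-memory, so only suitable when the row count is modest):

```python
def split_by_threshold(rows, threshold_pct):
    """Return (first, rest), where first gets threshold_pct percent of the rows."""
    cutoff = len(rows) * threshold_pct // 100
    return rows[:cutoff], rows[cutoff:]

records = list(range(2000))          # stand-in for the fetched records
csv1_rows, csv2_rows = split_by_threshold(records, 20)
```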

  • Import data from excel/csv file in web dynpro

    Hi All,
    I need to populate a WD table by first importing an Excel/CSV file through a Web Dynpro screen and then reading through the file. I am using the FileUpload element from NW04s.
    How can I read/import data from an Excel/CSV file into the Web Dynpro table context?
    Any help is appreciated.
    Thanks a lot
    Aakash

    Hi,
    Here are the basic steps needed to read data from an Excel spreadsheet using the Java Excel API (jExcel API).
    jExcel API can read a spreadsheet from a file stored on the local file system or from an input stream. Ideally, the following are the steps when reading:
    Create a workbook from a file on the local file system, as illustrated in the following code fragment:
              import java.io.File;
              import java.util.Date;
              import jxl.*;
             Workbook workbook = Workbook.getWorkbook(new File("test.xls"));
    On getting access to the workbook, one can use the following code piece to access individual sheets. These are zero-indexed: the first sheet is 0, the second sheet is 1, and so on. (You can also use the API to retrieve a sheet by name.)
              Sheet sheet = workbook.getSheet(0);
    After getting the sheet, you can retrieve a cell's contents as a string by using the convenience method getContents(). In the example code below, A1 is a text cell, B2 is a numerical value and C2 is a date. The contents of these cells may be accessed as follows:
    Cell a1 = sheet.getCell(0,0);
    Cell b2 = sheet.getCell(1,1);
    Cell c2 = sheet.getCell(2,1);
    String a1Text = a1.getContents();
    String b2Text = b2.getContents();
    String c2Text = c2.getContents();
    // perform operations on the strings
    However, in case we need to access the cell's contents as the exact data type, i.e. as a numerical value or as a date, then the retrieved Cell must be cast to the correct type and the appropriate methods called. The code piece below illustrates how JExcelApi may be used to retrieve a genuine Java double and a java.util.Date object from an Excel spreadsheet. For completeness the label is also cast to its correct type. The snippet also illustrates how to verify that a cell is of the expected type; this can be useful when validating that the spreadsheet contains the correct data types.
      String labelA1 = null;
      double numberB2 = 0;
      Date dateC2 = null;
      Cell a1 = sheet.getCell(0,0);
      Cell b2 = sheet.getCell(1,1);
      Cell c2 = sheet.getCell(2,1);
      if (a1.getType() == CellType.LABEL) {
        LabelCell lc = (LabelCell) a1;
        labelA1 = lc.getString();
      }
      if (b2.getType() == CellType.NUMBER) {
        NumberCell nc = (NumberCell) b2;
        numberB2 = nc.getValue();
      }
      if (c2.getType() == CellType.DATE) {
        DateCell dc = (DateCell) c2;
        dateC2 = dc.getDate();
      }
      // operate on the dates and doubles
    It is recommended to use the close() method (as in the code piece below) when you are done processing all the cells. This frees up any allocated memory used when reading spreadsheets and is particularly important when reading large spreadsheets.
              // Finished - close the workbook and free up memory
              workbook.close();
    The API class files are available in 'jxl.jar', which is available for download.
    Regards
    Raghu

  • Read data from a CSV file

    hi,
    I've managed to make a CSV file which includes logged data of how much my NXT car is turning left (L), turning right (R) or going straight (S). The format is direction/timer value (L/1922 means it went left for 1922 ms).
    Where I got stuck is: how can I read those time and direction values from the CSV file in such a way that they can be used as the motor move input?
    Actually, what I'm trying to do is several VIs which function like the record/play action in the NXT Toolkit.
    (I've attached the VI and the CSV file; the file extension is CSV but it is not actually written in CSV format, it's basically a plain text file.)
    Message Edited by szemawy on 05-23-2008 10:34 AM
    Attachments:
    read_deneme.vi ‏10 KB
    testus.csv ‏1 KB

    After I get the data in 1-D array format I cannot convert it to a string so that I can send it by Write BT Message. I tried Flatten To String but it didn't work out.
    [Never mind the mess, just see the part where the 1-D string array is being converted to a string.]
    Attachments:
    read_test2.JPG ‏184 KB
    read_test2_fp.JPG ‏132 KB
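    The parsing half of the problem is simple once the file is read as text; a Python sketch of the direction/timer format described above (L/1922 meaning left for 1922 ms):

```python
def parse_moves(text: str):
    """Parse lines like 'L/1922' into (direction, milliseconds) pairs."""
    moves = []
    for line in text.splitlines():
        if not line.strip():
            continue  # skip blank lines
        direction, ms = line.strip().split("/")
        moves.append((direction, int(ms)))
    return moves

moves = parse_moves("L/1922\nR/500\nS/1000\n")
```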

  • Error while running bulk load utility for account data with CSV file

    Hi All,
    I'm trying to run the bulk load utility for account data using a CSV file, but I'm getting the following error:
    ERROR ==> The number of CSV files provided as input does not match with the number of account tables.
    Thanks in advance........

    Please check your child table.
    http://docs.oracle.com/cd/E28389_01/doc.1111/e14309/bulkload.htm#CHDCGGDA
    -kuldeep
