Import, search, display and modify a CSV file.

I am new to PowerShell, so please cut me some slack, but this is what I have and what I want to do:
I have a CSV with |customer no.|full name|address| headings
I want to prompt a message box that allows a search for a customer number and then brings up the whole row of data to read. Once this has been done, I need to edit a value on that row to say it is completed.
I am mostly wondering whether this is possible, as I have spent ages searching with little result.
Thanks in advance

thanks jrv,
basically I need to be able to display the rest of the information about the customer by just searching their customer number. Once this has been done I need to edit the customer to say that I have searched for them already. 
Hope that makes it a little clearer.
That is exactly what my demonstration does. You need to implement it. We cannot design and write your code for you. If you need this done, then I recommend hiring a consultant.
The scripting learning resources are linked at the top of the page.
You can also do this very easily in Excel with no script. Just search the customer number column with the Excel search box.
¯\_(ツ)_/¯
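For what it's worth, below is a minimal PowerShell sketch of that workflow; it is not the demonstration referred to above, and the file path and column names (CustomerNo, FullName, Address, Completed) are assumptions to adjust to the real headings.
# Load the CSV (assumes a header row with CustomerNo, FullName, Address and Completed columns)
$path = 'C:\data\customers.csv'
$rows = Import-Csv -Path $path
# Prompt for a customer number with a simple input box
Add-Type -AssemblyName Microsoft.VisualBasic
$number = [Microsoft.VisualBasic.Interaction]::InputBox('Enter a customer number', 'Customer search')
# Find and display the matching row(s), then mark them as completed and save the file
$match = $rows | Where-Object { $_.CustomerNo -eq $number }
if ($match) {
    $match | Format-List
    foreach ($row in @($match)) { $row.Completed = 'Completed' }
    $rows | Export-Csv -Path $path -NoTypeInformation
} else {
    Write-Host "No customer found with number $number"
}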

Similar Messages

  • Read and write a .CSV file containing Cyrillic characters issue

    Hi guys,
    I am a developer on a web application project which uses Oracle Fusion Middleware technologies. We use JDeveloper 11.1.1.4.0 as our development IDE.
    I have a requirement to get a .csv file from WLS to the machine the application is running on. I used a downloadActionListener in the front-end .jspx in order to do that.
    I use the OpenCSV library to read and write .csv files.
    Here is my code to read and write the .csv file:
    public void dwdFile(FacesContext facesContext, OutputStream out) {
        System.out.println("started");
        String[] nextLine;
        try {
            FileInputStream fstream1 = new FileInputStream("Downloads/filetoberead.CSV");
            DataInputStream in = new DataInputStream(fstream1);
            BufferedReader br = new BufferedReader(new InputStreamReader(in, "UTF-8"));
            CSVReader reader = new CSVReader(br, '\n');
            //CSVReader reader = new CSVReader(new FileReader("Downloads/ACTIVITY_LOG_22-JAN-13.csv"),'\n');
            List<String> list = new ArrayList();
            while ((nextLine = reader.readNext()) != null) {
                if (nextLine != null) {
                    for (String s : nextLine) {
                        list.add(s);
                        System.out.println("list size ; " + list.size());
                    }
                }
            }
            OutputStreamWriter w = new OutputStreamWriter(out, "UTF-8");
            CSVWriter writer = new CSVWriter(w, ',', '\u0000');
            for (int i = 0; i < list.size(); i++) {
                System.out.println("list items" + list.get(i));
                String[] entries = list.get(i).split(",");
                writer.writeNext(entries);
                //System.out.println("list items : "+list.get(i));
            }
            writer.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    Say filetoberead.CSV contains the following data:
    0,22012013,E,E,ASG,,O-0000,O,0000,100
    1,111211,LI,0,TABO,B,M002500003593,,,К /БЭ60072715/,КАРТЕНБАЙ
    2,07,Balance Free,3
    1,383708,LI,0,BDSC,B,НЭ63041374,,,Т /НЭ63041374/,ОТГОНБААТАР
    2,07,Balance Free,161
    It reads and writes the numbers and English characters correctly. All Cyrillic characters are printed as "?", as follows:
    0,22012013,E,E,ASG,,O-0000,O,0000,100
    1,111211,LI,0,TABO,B,M002500003593,,,? /??60072715/,?????????
    2,07,Balance Free,3
    1,383708,LI,0,BDSC,B,??63041374,,,? /??63041374/,???????????
    2,07,Balance Free,161
    Can someone please help me to resolve this problem?
    Regards !
    Sameera

    Are you sure that the input file (e.g. "Downloads/filetoberead.CSV") is in UTF-8 character set? You can also check it using some text editor having a view in hex mode. If each Cyrillic character in your input file occupies a single byte (instead of two), then the file is not in UTF-8. Most probably it is in Cyrillic for Windows (CP1251).
    If this is the case, you should modify the line
    BufferedReader br = new BufferedReader(new InputStreamReader(in,"UTF-8"));
    to
    BufferedReader br = new BufferedReader(new InputStreamReader(in,"windows-1251"));
    Dimitar
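    If a Windows machine with PowerShell 5 or later happens to be at hand, a quick way to inspect the raw bytes without a hex editor is Format-Hex (the file name below is taken from the post). In UTF-8 each Cyrillic letter is two bytes starting with D0 or D1, while CP1251 uses a single byte per letter:
    # Dump the first 128 bytes of the file as hex to check the encoding
    Format-Hex -Path 'Downloads/filetoberead.CSV' | Select-Object -First 8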

  • "How Display and Output Acq'd File.VI works?"

    "My present project is somewhat similar to Display and Output Acq'd File.vi. But I don't clearly know how it works. My questions are:
    1. What will happen if the data-reading from file is faster than output-generation from buffered data?
    2. How does it match the pace of reading with that of generation, and in which VI is the matching achieved?
    3. Is it possible to get this control information, like interrupt or DMA status, in a LabVIEW application?
    Thank you for your help
    Steven"

    I attached a bmp that illustrates how to read and write simultaneously from one file. The VI is using notifiers (under the advanced palette) to synchronize reader and writer loops.
    Note that the Wait function in the reader loop is irrelevant because the Wait on Notifier function blocks until data is available. This forces the reader to be at least as slow as the writer.
    Attachments:
    read_and_write.bmp 1014 KB

  • Can I import into Illustrator and modify a drawing in Appleworks?

    Can I import into Illustrator and modify a drawing in Appleworks?

    Depending on the detail/complexity/accuracy of the graphics, you will not likely find back-and-forth workflows between PostScript-compatible drawing programs (Illustrator, FreeHand, Canvas, Draw, Xara) and OS-meta-format (QuickDraw, Quartz, WMF, EMF) based programs ("Works" or "Office" applications) very satisfying.
    You encounter instance-specific caveats stemming from two basic issues:
    PostScript-compatible drawing programs use cubic Bezier curves; two curve handles per path segment. "Works" or "Office" type drawing modules typically use simpler quadratic Bezier curves; one handle per path segment. So going back and forth between the two involves geometric translations and re-translations which wreck the practicality of editing the paths. PostScript drawing programs use cubic curves for good reason: more shape control with fewer anchors.
    The OS-specific meta formats are really not designed for the device-independent resolution fidelity of commercial print. Resolution in vector drawing? Yes. Vector paths are essentially mathematical formulae for *plotting* curves onto a raster grid. (Everything is eventually rasterized; the practical difference between raster and vector graphics is a matter of when.)
    So it's a matter of accuracy; the "fineness" of the theoretical grids the curve-handling format/application is designed to plot upon. Meta formats are more about drawing an acceptable shape onscreen, and you can kind of think of what gets sent to a dumb printing device (one without a PostScript "brain") as a glorified enlarged screenshot. The path objects you draw in a PostScript-compatible program that have smooth curves that scale well both upward and downward often translate to meta formats as paths with ugly flats or extra kinks that become apparent when scaled. These caveats are unpredictable and tedious to correct. The bother ends up outweighing the advantages you are seeking.
    In a nutshell, that's why Illustrator's effective "recommendation" for exporting to Office applications (Save For Office...) is the lowest-common-denominator approach of defaulting to a common raster format (PNG), even though Windows does have a vector-based format (EMF) which Illustrator can export to.
    If what you're trying to do is just a one-way trip from some legacy AppleWorks drawings to Illustrator, you'll need an intermediary translation from .awk to something Illustrator can import. You may be able to find open-source software for doing that.
    JET

  • PowerShell: search one column of a CSV file and replace text in that column

    I have a huge CSV file.
    Column J has number which represents states.
    I would like to search through column J of output.csv and replace the number with the state name.
    Column J, headed "State", currently contains 233, 219, 233, 210
    and should become NC, TN, NC, SC.
    I have tried several methods that seem to do nothing, or erase everything in my csv file, or at best search every column, so that if a phone number has 210 in it, it changes it to SC.
    Can anyone point me in the right direction?
    Thanks!
    R White

    Thanks so much
    I gave it a try using this
    Import-Csv C:\temp\outfile.csv| ForEach-Object {
    if ($_.State.tostring() -like '256') { $_.State.tostring().replace('256', 'Somethingelse256')}
    if ($_.State.tostring() -like '257') { $_.State.tostring().replace('257', 'Somethingelse257xx')}
     } | export-csv C:\temp\outfileNEW.csv
    It produced a C:\temp\outfileNEW.csv
    with only a column A that had this all the way down it:
    #TYPE System.String
    Length
    16
    16
    16
    16
    16
    16
    16
    R White
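    For reference, a minimal sketch of the object-based approach: change the value on each row object and emit the whole object, so Export-Csv keeps every column. The column name State, the code-to-name pairs, and the file paths are assumptions taken from the post.
    $stateMap = @{ '233' = 'NC'; '219' = 'TN'; '210' = 'SC' }   # extend with the remaining codes
    Import-Csv C:\temp\outfile.csv | ForEach-Object {
        # Only touch the State column, so a phone number containing 210 is left alone
        if ($stateMap.ContainsKey($_.State)) { $_.State = $stateMap[$_.State] }
        $_
    } | Export-Csv C:\temp\outfileNEW.csv -NoTypeInformation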

  • Import from dsv files and export to csv files

    Hi everybody,
    how can I create a project in NetBeans which does:
    1- import a .dsv (Delimiter-Separated Values) file content and save it to array
    - the values in this format separated by fixed commas
    example
    "AIG" "Insurance" "64.91" "25/11/06"
    2- export into .csv file (Comma Separated Value)
    -the values in this format separated by commas
    example:
    AIG,Insurance,64.91,25/11/06

    Well, you need to learn Java so you can read files, divide the data by the delimiters, and then write it out as a csv file.
    Can't really give better instructions than that...maybe start by reading the basic tutorials?
    We don't give out full program code here so you'll need to ask more precise questions. Such as what is it that you're having problems with.

  • Collect file count and add to CSV file.

    I have this script, efficiently crafted by Jacques Rioux, and I now want to do a little more with it.
    What it currently does is look at a select number of folders on my desktop. It then looks at the keyword information and returns the results to a csv file.
    It counts all the photographs shot by Matthew, edited by Matthew, etc., with the date appended to the start of the line, and the next time the script is run it adds the new data to the bottom of the last.
    The result looks like this
    19/12/2012,255,412,37,68
    27/12/2012,197,342,16,26
    From the first line you can see that on the 19th of December 2012 I shot 255 images.
    Now what I would like it to do is:
    a) Specifically look in the folders on the desktop whose names begin with BH, BU, DA, DI, DO, FR, IN, NO, MA, TM, WA, PR, SE (these folders may or may not exist at the time, but they are the only folders it should look at)
    b) also do a file count of the contents of the above individual folders and append it to the csv file. Again a folder may not exist. Where it doesn't exist the file count must = 0 so that it can then be added to the CSV file.
    This is how I hope the line in the CSV file will look:
    19/12/2012,255,412,37,68, 5,3,20,25,60,101,25,0,85,5,40,0,0
    From the line above you can see that the folders NO, PR, and SE were all non-existent and therefore a 0 was written in their place in the CSV file.
    Below is the working script that looks for the keywords.
    set spotlightqueryList to {"Shot by Matthew", "Editted by Matthew", "Shot by Shah", "Editted by Shah"}
    set thefolders to {"Desktop"}
    set thekind to "PSD"
    set csvFileName to "ProductivityLog.csv"
    set tHome to path to home folder as string
    set tc to count spotlightqueryList
    set theseCount to {}
    repeat tc times
              set end of theseCount to 0
    end repeat
    repeat with i in thefolders
              set thepath to my existsItem(tHome & i)
              if thepath is not "" then -- exists
                        repeat with j from 1 to tc
                                  set tQuery to item j of spotlightqueryList
                                  do shell script "mdfind -onlyin " & thepath & " " & tQuery & " " & thekind & " | wc -l" -- wc return the number of lines
                                  set item j of theseCount to (item j of theseCount) + (the result as integer) -- add the number of lines
                        end repeat
              end if
    end repeat
    set csvPath to "DCKGEN:Brands:Zoom:Online Photography:" & csvFileName
    set oTID to text item delimiters
    set text item delimiters to "," -- CSV delimiter
    set thisLine to (theseCount as text) -- convert list to text, each number is separated by comma
    set text item delimiters to oTID
    tell (current date) to set tDate to short date string
    set beginning of theseCount to tDate -- insert the date (first column)
    set csvPath to "DCKGEN:Brands:Zoom:Online Photography:" & csvFileName
    set oTID to text item delimiters
    set text item delimiters to "," -- CSV delimiter
    set thisLine to (theseCount as text) -- convert list to text, each number is separated by comma
    set text item delimiters to oTID
    --- append this line to CSV file
    do shell script "echo " & (quoted form of thisLine) & " >>" & quoted form of POSIX path of csvPath
    on existsItem(f)
              try
                        return quoted form of POSIX path of (f as alias) -- exists
              end try
              return "" -- else not exists
    end existsItem
    (* just a way to visually see it working
    set dialog to "Matt Shot: \"" & item 1 of theseCount & "\"" & return & return & "Matt Edit: \"" & item 2 of theseCount & "\"" & return & return & "Shah Shot: \"" & item 3 of theseCount & "\"" & return & return & "Shah Edit: \"" & item 4 of theseCount & "\"" & return & return
    display dialog dialog
    *)
    This is what I began to write, but I really have no idea how I would write the data into the CSV file, and I was also struggling to get a non-existent folder to equal 0.
    tell application "Finder"
              set folderA to (get first folder of desktop whose name starts with "BH")
              set folderB to (get first folder of desktop whose name starts with "Bu")
              set folderC to (get first folder of desktop whose name starts with "Da")
              set folderD to (get first folder of desktop whose name starts with "DI")
              set folderE to (get first folder of desktop whose name starts with "Do")
              set folderF to (get first folder of desktop whose name starts with "Fr")
              set folderG to (get first folder of desktop whose name starts with "In")
              set folderH to (get first folder of desktop whose name starts with "Ma")
              if (exists (get first folder of desktop whose name starts with "No")) is true then
                        set folderI to (get first folder of desktop whose name starts with "No")
              else
                        set folderI to "0"
                        set folderJ to (get first folder of desktop whose name starts with "To")
                        set folderK to (get first folder of desktop whose name starts with "Wa")
                         if (exists (get first folder of desktop whose name starts with "SE")) is true then
                                   set folderL to (get first folder of desktop whose name starts with "SE")
                         else
                                   set folderL to "0"
                                   if (exists (get first folder of desktop whose name starts with "PR")) is true then
                                            set folderM to (get first folder of desktop whose name starts with "PR")
                                  else
                                            set folderM to "0"
                                            set folderM to (get first folder of desktop whose name starts with "PR")
                                  end if
                        end if
              end if
              tell application "System Events"
                        set contentsA to (number of files in folderA)
                        set contentsB to (number of files in folderB)
                        set contentsC to (number of files in folderC)
                        set contentsD to (number of files in folderD)
                        set contentsE to (number of files in folderE)
                        set contentsF to (number of files in folderF)
                        set contentsG to (number of files in folderG)
                        set contentsH to (number of files in folderH)
                        set contentsI to (number of files in folderI)
                        set contentsJ to (number of files in folderJ)
                        set contentsK to (number of files in folderK)
                        set contentsL to (number of files in folderL)
                        set contentsM to (number of files in folderM)
              end tell
    end tell
    I hope someone can help me compile the remaining data.
    Many thanks
    Matt

    OK, I've done my homework and I have been able to get a lot closer. I just need to make the search specific to a number of folders on the desktop.
    Line 7 explains how I would like it to search.
    set spotlightqueryList to {"Shot_by_Matthew", "Editted_by_Matthew", "Shot_by_Shah", "Editted_by_Shah"}
    set spotlightqueryList2 to {"AL70", "BH70", "BH70", "BU40", "ES20", "DV25", "DJ30", "RA30", "FR10", "GT55", "MA65", "MB65", "MC65", "FI65", "MF65", "MH65", "NN_", "TM15", "WA35", "PR_", "SE_"}
    set thefolders to {"Desktop"}
    --Here I need to limit the search so that it only looks in folders of the desktop whose name begins with "BH", "BU", "DA", "DI", "DO", "FR", "IN", "MA", "NO", "TM", "WA", "PR", "SE"
    set thekind to "PSD"
    set csvFileName to "ProductivityLog.csv"
    set tHome to path to home folder as string
    set tc to count spotlightqueryList
    set theseCount to {}
    repeat tc times
              set end of theseCount to 0
    end repeat
    set tc2 to count spotlightqueryList2
    set theseCount2 to {}
    repeat tc2 times
              set end of theseCount2 to 0
    end repeat
    repeat with i in thefolders
              set thepath to my existsItem(tHome & i)
              if thepath is not "" then -- exists
                        repeat with j from 1 to tc
                                  set tQuery to item j of spotlightqueryList
                                  do shell script "mdfind -onlyin " & thepath & " " & tQuery & " " & thekind & " | wc -l" -- wc return the number of lines
                                  set item j of theseCount to (item j of theseCount) + (the result as integer) -- add the number of lines
                        end repeat
              end if
    end repeat
    repeat with i2 in thefolders
              set thepath2 to my existsItem2(tHome & i2)
              if thepath2 is not "" then -- exists
                        repeat with j2 from 1 to tc2
                                  set tQuery2 to item j2 of spotlightqueryList2
                                  do shell script "mdfind -onlyin " & thepath2 & "  -name " & tQuery2 & " " & thekind & " | wc -l" -- wc return the number of lines
                                  set item j2 of theseCount2 to (item j2 of theseCount2) + (the result as integer) -- add the number of lines
                        end repeat
              end if
    end repeat
    set csvPath to "DCKGEN:Brands:Zoom:Online Photography:" & csvFileName
    set oTID to text item delimiters
    set text item delimiters to "," -- CSV delimiter
    set thisLine to (theseCount as text) -- convert list to text, each number is separated by comma
    set thisLine2 to (theseCount2 as text) -- convert list to text, each number is separated by comma
    set text item delimiters to oTID
    tell (current date) to set tDate to short date string
    set beginning of theseCount to tDate -- insert the date (first column)
    set csvPath to "DCKGEN:Brands:Zoom:Online Photography:" & csvFileName
    set oTID to text item delimiters
    set text item delimiters to "," -- CSV delimiter
    set thisLine to (theseCount as text) -- convert list to text, each number is separated by comma
    set thisLine2 to (theseCount2 as text) -- convert list to text, each number is separated by comma
    set text item delimiters to oTID
    --- append this line to CSV file
    do shell script "echo " & (quoted form of thisLine) & (quoted form of thisLine2) & " >>" & quoted form of POSIX path of csvPath
    on existsItem(f)
              try
                        return quoted form of POSIX path of (f as alias) -- exists
              end try
              return "" -- else not exists
    end existsItem
    on existsItem2(f)
              try
                        return quoted form of POSIX path of (f as alias) -- exists
              end try
              return "" -- else not exists
    end existsItem2
    (* just a way to visually see it working
    set dialog to "Matt Shot: \"" & item 1 of theseCount & "\"" & return & return & "Matt Edit: \"" & item 2 of theseCount & "\"" & return & return & "Shah Shot: \"" & item 3 of theseCount & "\"" & return & return & "Shah Edit: \"" & item 4 of theseCount & "\"" & return & return
    display dialog dialog
    *)

  • How to query remote PCs' registries by OU for 2 values and export to a CSV file

    I'm new to scripting and to PowerShell, but this is what I have managed to put together so far. Of course it fails. We have two custom entries in the registry, Monitor 1 and Monitor 2, and I want to query remote workstations for these values and output them to a CSV along with the workstation's name. Because of our AD structure I figured it's just easier to input the OU individually, as seen in the script. That portion of the script seems to work. I get the following error when I run the script. I've Googled and tinkered with this for a week now with no resolution and seem to be going in circles. And yes, I had help to get this far.
    Exception calling "OpenRemoteBaseKey" with "2" argument(s): "The network path was not found.
    At C:\utils\RegMonitor2.ps1:33 char:5
    +     $regKey = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey($Hive,$result.pro ...
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
        + FullyQualifiedErrorId : IOException
    Exception calling "OpenRemoteBaseKey" with "2" argument(s): "The network path was not found.
    At C:\utils\RegMonitor2.ps1:33 char:5
    +     $regKey = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey($Hive,$result.pro ...
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
        + FullyQualifiedErrorId : IOException
    # 1) Searches Active Directory for all Computers under said OU
    # 2) Searches remote registry of those machines for the mentioned Monitor and Monitor2 subkeys.
    # 3) Exports CSV (Can be opened and saved as excel format later) with ordered columns Computername, Monitor1 value, monitor2 value.
    # ================================================================
    $SearchPath = "OU=XXX,OU=XXX,OU=XXX,DC=XXX,DC=XXX,DC=XXX"
    $objSearcher = New-Object System.DirectoryServices.DirectorySearcher
    $objSearcher.SearchRoot = New-Object System.DirectoryServices.DirectoryEntry("LDAP://$SearchPATH")
    $objSearcher.PageSize = 1000
    $objSearcher.Filter = "(objectClass=computer)"
    $objSearcher.SearchScope = "Subtree"
    $colProplist = "name"
    $colResults = $objSearcher.FindAll()
    $Store = @()
    $Hive = [Microsoft.Win32.RegistryHive]"LocalMachine";
    foreach ($result in $colResults)
    {
        # Use $result.properties.name to retrieve ComputerName
        $obj = New-Object PsObject
        $obj | Add-Member -Type NoteProperty -Name "Computername" -Value $result.properties.name
        $regKey = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey($Hive,$result.properties.name);
        $ref = $regKey.OpenSubKey("SYSTEM\CurrentcontrolSet\control\Session Manager\Environment");
        $obj | Add-Member -Type NoteProperty -Name "Monitor1" -Value $ref.OpenSubKey("Monitor")
        $obj | Add-Member -Type NoteProperty -Name "Monitor2" -Value $ref.OpenSubKey("Monitor2")
        $store += $obj
    }
    $store | Select-Object Computername,Monitor1,Monitor2 | Export-Csv -NoTypeInformation -Path "Pathtosave.csv"
    People are always promising the apocalypse. They never deliver.
    OK, I have modified the end of the script a bit, and there's no more error. Instead I get unexpected output.
    foreach ($result in $colResults)
    {
        # Use $result.properties.name to retrieve ComputerName
        $obj = New-Object PsObject
        $obj | Add-Member -Type NoteProperty -Name "Computername" -Value $result.properties.name
        $regKey = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey($Hive,$result.properties.name);
        $ref = $regKey.OpenSubKey("SYSTEM\CurrentcontrolSet\control\Session Manager\Environment");
        $obj | Add-Member -Type NoteProperty -Name "Monitor1" -Value $ref.OpenSubKey("Monitor")
        $obj | Add-Member -Type NoteProperty -Name "Monitor2" -Value $ref.OpenSubKey("Monitor2")
        $store += $obj
    }
    $store | Select-Object Computername,Monitor1,Monitor2 | Export-Csv -NoTypeInformation -Path "C:\Utils\Data.csv"
    Unexpected output:
    "Computername","Monitor1","Monitor2"
    "System.DirectoryServices.ResultPropertyValueCollection",,
    "System.DirectoryServices.ResultPropertyValueCollection",,
    "System.DirectoryServices.ResultPropertyValueCollection",,

    Hi,
    What do your registry values look like in the Monitor and Monitor2 subkeys?
    EDIT: This might help:
    # Retrieve list of computers using Get-ADComputer and process each
    Get-ADComputer -Filter * -SearchBase 'OU=Test PCs,DC=domain,DC=com' | ForEach {
        # Verify PC is alive
        If (Test-Connection $_.Name -Quiet -Count 1) {
            # Connect to registry
            $remoteHive = [Microsoft.Win32.RegistryHive]"LocalMachine";
            $regKey = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey($remoteHive,$($_.Name))
            # Open the Environment key
            $ref = $regKey.OpenSubKey('SYSTEM\CurrentcontrolSet\control\Session Manager\Environment')
            # Create an ordered hashtable with the data from string values named 'String Value One/Two' in the Monitor and Monitor2 subkeys
            # You'll need to adjust these values based on your actual data
            # If you are running v2, remove [ordered] below (so the line reads $props = @{)
            $props = [ordered]@{
                Computer = $_.Name
                Monitor  = $ref.OpenSubKey('Monitor').GetValue('String Value One')
                Monitor2 = $ref.OpenSubKey('Monitor2').GetValue('String Value Two')
            }
            # Create a custom object based on the hashtable above
            New-Object PsObject -Property $props
        }
    } | Sort-Object Computer | Export-Csv .\MonitorRegistryCheck.csv -NoTypeInformation
    # The line above sorts the output object by the computer name and then exports the object to a CSV file
    Don't retire TechNet! -
    (Don't give up yet - 12,575+ strong and growing)
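    A side note on the unexpected output shown above: with DirectorySearcher, $result.properties.name returns a ResultPropertyValueCollection rather than a plain string, which is why Export-Csv printed the type name. A minimal sketch of one way to unwrap it in the original script (the property name is taken from the post):
    # Pull the first value out of the collection so a plain string lands in the CSV
    $computerName = $result.Properties['name'][0]
    $obj | Add-Member -Type NoteProperty -Name 'Computername' -Value $computerName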

  • Importing problem-iCal won't recognize .csv files

    I've recently upgraded to 10.4.9 from 10.3.9 on my Power PC iMac G5. The old version of iCal had no problem seeing downloaded .csv files and importing them. The new version won't see them or import them. Under 10.3.9, all I had to do was double click on the downloaded .csv file and it would import all events into iCal. Now, nothing happens. I'm a basic user and don't know about scripting or anything else. Any help?

    YES! SUCCESS!
    I found two ways of doing this. After searching the net, I found this online app on this guy's website, which worked the best:
    http://manas.tungare.name/projects/yahoo2ical/
    It says it's a "yahoo" csv converter, but it worked fine for my outlook csv file.
    The other option is, if you have an account with google, like a gmail account, you can use their google calendar to import your csv file. Once you've done that, you can subscribe to that calendar you imported via the "share" option. Just get the .ics address. This took a couple tries on google's site to import, but it worked.

  • LabVIEW to read and write into a .csv file only

    I have a security question.
    Is it possible for LabVIEW to read/write a .CSV file and make it inaccessible to the user?
    For instance, LabVIEW writes to the file OUTPUT.CSV.
    With my current system, on Windows XP, I can go to output.csv and use Notepad/Excel to read/write and save the output.csv data.
    I want to make it impossible to write to or modify output.csv (ONLY POSSIBLE IF DONE USING THE LABVIEW PROGRAM).
    How would I do something like that?
    thanks in advance.
    Best regards,
    Krispiekream

    I am not sure if you can do that. What makes this file so important? What does it do?
    Depending on the reason you want to do this, there might be a different/better solution. Maybe save the file as binary, so that if it were opened it would not be easy to read. Or save the file in a location that has nothing to do with LabVIEW?
    Under File I/O >> Advanced, there is a VI called Access Rights that you might be able to use.
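    Outside LabVIEW, one lightweight (and easily reversed) option is to flip the file's read-only attribute between writes. A minimal PowerShell sketch, with a hypothetical path for output.csv; note this only deters casual edits, in line with the caveat above:
    # Mark output.csv read-only after LabVIEW finishes writing it
    Set-ItemProperty -Path 'C:\data\output.csv' -Name IsReadOnly -Value $true
    # Clear the flag again before the next LabVIEW write
    Set-ItemProperty -Path 'C:\data\output.csv' -Name IsReadOnly -Value $false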

  • How do I save and access a CSV file on my iPad when filling in a SmartForm? Currently the only option goes to Photos

    Trying to save a CSV file on my iPad so I can auto-populate a SmartForm. It will only allow me to access Photos when I hit the import button.

    Search the App Store for PDF Writer. I've found a lot that will convert documents to PDF, but none yet that will write within the PDF. One thing you want to avoid is cloud-based apps. Any of them that talk about editing on the cloud, etc., aren't going to be as standalone as you want.
    It's possible, if you can take your template PDF and turn it into a Word document that you can edit, you can then convert that to PDF...kind of a workaround way to do what you want. And apps that convert to PDF are much easier to find.

  • SharePoint 2013 - What are all the required components for People Pickers to list, search, display, and assign users permission

    Hi All
    For the past few months, I have been working with permission issues related to SharePoint 2013 site permission settings, using People Pickers to list, search, and display users in order to assign or check permission.
    Our environment includes multiple domains and a few forests. Our SharePoint farm is installed in one domain, but the good thing is that our AD structure is configured so that all the other domains and forests have two-way trusts with this domain, so domain users are authenticated and can access SharePoint just fine. SharePoint also uses the default claims authentication.
    The problem is that People Picker does not display user accounts from all domains when site owners need to assign permission. To resolve the problem, I provisioned the User Profile service application and imported AD domain user accounts (one way) into SharePoint.
    I also configured stsadm.exe -o setproperty -pn peoplepicker-searchadforests -pv
    for all domains and forests (even though, as mentioned, we do have two-way trusts)
    and sometimes tried a different query (user last name, domain\logonname, email address) if one was not showing.
    With all that added, People Picker now seems to find and display user accounts for all domains.
    My question now is: do UPS and all AD domain users need to be imported into SharePoint, and is the STSADM configuration required, in order to have user accounts from all domains display in People Picker so that site owners can
    find them and assign permission when needed?
    Please share your advices, comments as they are really valuable to me.
    Thanks
    Swanl

    UPS and People Picker are virtually unrelated. The only connection between them has to do with caching and updating user names and emails if they change over time; in other words, it is not relevant to your situation.
    To answer your question directly: no, you do not need to set up synchronisation connections to a domain to be able to pick up a person in a People Picker. As you've seen, you may need to run some STSADM commands to make sure they are checking the right places.
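    For reference, a hedged example of the STSADM command referred to above; peoplepicker-searchadforests is set per web application, and the URL, domain names, and service account below are placeholders, not values from this environment:
    stsadm -o setproperty -url http://sharepoint.contoso.com -pn peoplepicker-searchadforests -pv "forest:contoso.com;domain:child.fabrikam.com,FABRIKAM\svc_peoplepicker,P@ssw0rd"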

  • How to create and modify an XML file from an Oracle Form

    I would like to build an Oracle Form to maintain a small XML file in the file system (i.e. not in the Oracle database but in the operating system).
    I would like the Form to display existing values from the XML file and the user can update and save content back to the XML file.
    Can any one tell me how this can be done? Thanks.

    Does Forms 9i provide any XML Parser Functions?
    Can I insert the XML file into a table column by inserting XML using the XSU Front End rather than using TEXT_IO to maintain the XML file directly?
    Can I use XSU PL/SQL API in Forms to retrieve and modify XML values?
    Any help is appreciated.

  • Data formatting and reading a CSV file without using SQL*Loader

    I am reading a csv file to an Oracle table called sps_dataload. The table is structured based on the record type of the data at the beginning of
    each record in the csv file. But the first two lines of the file are not going to be loaded to the table due to the format.
    Question # 1:
    How can I skip reading the first two lines from my csv file?
    Question # 2:
    There are more fields in the csv file than there are columns in my table. I know I can add FILLER as an option, but there are
    about 150-odd fields which are comma-separated in the file and my table has only 8 columns to load from the file. So, do I really have to use FILLER
    about 140 times in my script, or is there a better way to do this?
    Question # 3:
    This is more of an extension of my question above. The csv file has fields enclosed in quotes - I know this could be achieved in SQL*Loader when we specify OPTIONALLY ENCLOSED BY '"'.
    But can this be done in the insert as written in the code below?
    I am trying to find the "wrap code" button in my post, but do not see it.
    Here's my file layout:
    PROSPACE SCHEMATIC FILE
    ; Version 2007.7.1
    Project,abc xyz Project,,1,,7,1.5,1.5,1,1,0,,0,1,0,0,0,0,3,1,1,0,1,0,0,0,0,2,3,1,0,1,0,0,0,0,3,3,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
    Subproject,25580-1005303.pst,,102,192,42,12632256,1,1,102,192,42,1,12632256,0,6,1,0,32896,1,0,0,0,0,,,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,0,1,1,,,,,,1
    Segment, , , 0, 102, 0, 0, 0, 0, 0, 0, 0, 0, 0, , , , , , , , , , , 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -1, 0, , , 1
    Product,00093097000459,26007,2X4 MF SF SD SOLR,,28.25,9.5,52.3, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.60,0,0,0,0,00,-1,0
    Product,00093097000329,75556,"22""X22"" BZ CM DD 1548",,27,7,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,345.32
    Product,00093097000336,75557,"22""X46"" BZ CM XD 48133",,27,7.5,51, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,0
    Product,00093097134833,75621,"22""X22"" BZ CM/YT DD 12828",,27,9,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,1
    This is my table structure -
    desc sps_dataload;
    File_Name     Varchar2 (50) Not Null,
    Record_Layer Varchar2 (20) Not Null,     
    Level_Id     Varchar2 (20),
    Desc1          Varchar2 (50),
    Desc2          Varchar2 (50),
    Desc3          Varchar2 (50),
    Desc4          Varchar2 (50)
    Here's my code to do this:
    create or replace procedure insert_spsdataloader(p_filepath IN varchar2,
    p_filename IN varchar2,
    p_Totalinserted IN OUT number) as
    v_filename varchar2(30) := p_filename;
    v_filehandle UTL_FILE.FILE_TYPE;
    v_startPos number; --starting position of a field
    v_Pos number; --position of string
    v_lenstring number; --length of string
    v_record_layer varchar2(20);
    v_level_id varchar2(20) := 0;
    v_desc1 varchar2(50);
    v_desc2 varchar2(50);
    v_desc3 varchar2(50);
    v_desc4 varchar2(50);
    v_input_buffer varchar2(1200);
    v_delChar varchar2(1) := ',';
    v_str varchar2(255);
    BEGIN
    v_Filehandle :=utl_file.fopen(p_filepath, p_filename, 'r');
    p_Totalinserted := 0;
    LOOP
    BEGIN
    UTL_FILE.GET_LINE(v_filehandle,v_input_buffer);
    EXCEPTION
    WHEN NO_DATA_FOUND THEN
    EXIT;
    END;
    -- this will read the 1st field from the file --
    v_Pos := instr(v_input_buffer,v_delChar,1,1);
    v_lenString := v_Pos - 1;
    v_record_layer := substr(v_input_buffer,1,v_lenString);
    v_startPos := v_Pos + 1;
    -- this will read the 2nd field from the file --
    v_Pos := instr(v_input_buffer,v_delChar,1,2);
    v_lenString := v_Pos - v_startPos;
    v_desc1 := substr(v_input_buffer,v_startPos,v_lenString);
    v_startPos := v_Pos + 1;
    -- this will read the 3rd field from the file --
    v_Pos := instr(v_input_buffer,v_delChar,1,3);
    v_lenString := v_Pos - v_startPos;
    v_desc2 := substr(v_input_buffer,v_startPos,v_lenString);
    v_startPos := v_Pos + 1;
    -- this will read the 4th field from the file --
    v_Pos := instr(v_input_buffer,v_delChar,1,4);
    v_lenString := v_Pos - v_startPos;
    v_desc3 := substr(v_input_buffer,v_startPos,v_lenString);
    v_startPos := v_Pos + 1;
    -- this will read the 5th field from the file --
    v_Pos := instr(v_input_buffer,v_delChar,1,5);
    v_lenString := v_Pos - v_startPos;
    v_desc4 := substr(v_input_buffer,v_startPos,v_lenString);
    v_startPos := v_Pos + 1;
    v_str := 'insert into table sps_dataload values('||v_filename||','||v_record_layer||','||v_level_id||','||v_desc1||','||v_desc2||','||v_desc3||','||v_desc4||')';
    Execute immediate v_str;
    p_Totalinserted := p_Totalinserted + 1;
    commit;
    END LOOP;
    UTL_FILE.FCLOSE(v_filehandle);
    EXCEPTION
    WHEN UTL_FILE.INVALID_OPERATION THEN
    UTL_FILE.FCLOSE(v_FileHandle);
    RAISE_APPLICATION_ERROR(-20051, 'sps_dataload: Invalid Operation');
    WHEN UTL_FILE.INVALID_FILEHANDLE THEN
    UTL_FILE.FCLOSE(v_FileHandle);
    RAISE_APPLICATION_ERROR(-20052, 'sps_dataload: Invalid File Handle');
    WHEN UTL_FILE.READ_ERROR THEN
    UTL_FILE.FCLOSE(v_FileHandle);
    RAISE_APPLICATION_ERROR(-20053, 'sps_dataload: Read Error');
    WHEN UTL_FILE.INVALID_PATH THEN
    UTL_FILE.FCLOSE(v_FileHandle);
    RAISE_APPLICATION_ERROR(-20054, 'sps_dataload: Invalid Path');
    WHEN UTL_FILE.INVALID_MODE THEN
    UTL_FILE.FCLOSE(v_FileHandle);
    RAISE_APPLICATION_ERROR(-20055, 'sps_dataload: Invalid Mode');
    WHEN UTL_FILE.INTERNAL_ERROR THEN
    UTL_FILE.FCLOSE(v_FileHandle);
    RAISE_APPLICATION_ERROR(-20056, 'sps_dataload: Internal Error');
    WHEN VALUE_ERROR THEN
    UTL_FILE.FCLOSE(v_FileHandle);
    RAISE_APPLICATION_ERROR(-20057, 'sps_dataload: Value Error');
    WHEN OTHERS THEN
    UTL_FILE.FCLOSE(v_FileHandle);
    RAISE;
    END insert_spsdataloader;
    /

    Justin, thanks. I did happen to change my PL/SQL procedure using utl_file.get_line and modifying the instr function based on the position of ',' in the file, but my procedure is getting really big and too complex to debug. So I got motivated to use external tables or SQL*Loader as plan B.
    As I was reading more about external tables as an efficient way to do this, I came to believe I can perhaps build an external table with my varying selection from the file. But I am still unclear whether I can construct my external table by choosing different fields in a record based on a record-identifier string value (which is the first field of any record). I guess I can, but I am looking for the construct: how am I going to select the fields from the file while creating the table?
    PROSPACE SCHEMATIC FILE
    ; Version 2007.7.1
    Project,abc xyz Project,,1,,7,1.5,1.5,1,1,0,,0,1,0,0,0,0,3,1,1,0,1,0,0,0,0,2,3,1,0,1,0,0,0,0,3,3,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
    Subproject,25580-1005303.pst,,102,192,42,12632256,1,1,102,192,42,1,12632256,0,6,1,0,32896,1,0,0,0,0,,,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,0,1,1,,,,,,1
    Segment, , , 0, 102, 0, 0, 0, 0, 0, 0, 0, 0, 0, , , , , , , , , , , 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -1, 0, , , 1
    Product,00093097000459,26007,2X4 MF SF SD SOLR,,28.25,9.5,52.3, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.60,0,0,0,0,00,-1,0
    Product,00093097000329,75556,"22""X22"" BZ CM DD 1548",,27,7,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,345.32
    Product,00093097000336,75557,"22""X46"" BZ CM XD 48133",,27,7.5,51, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,0
    Product,00093097134833,75621,"22""X22"" BZ CM/YT DD 12828",,27,9,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,1
    For example, if I want to create an external table like this -
    CREATE TABLE extern_sps_dataload
    ( record_layer            VARCHAR2(20),
      attr1                   VARCHAR2(20),
      attr2                   VARCHAR2(20),
      attr3                   VARCHAR2(20),
      attr4                   VARCHAR2(20)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY dataload
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY NEWLINE
        BADFILE     dataload:'sps_dataload.bad'
        LOGFILE     dataload:'sps_dataload.log'
        DISCARDFILE dataload:'sps_dataload.dis'
        SKIP 2
        VARIABLE 2 FIELDS TERMINATED BY ','
        OPTIONALLY ENCLOSED BY '"' LRTRIM
        MISSING FIELD VALUES ARE NULL
        +LOAD WHEN RECORD_LAYER = 'PROJECT' (FIELD2, FIELD3,FIELD7,FIELD9)+
        +LOAD WHEN RECORD_LAYER= 'PRODUCT' (FIELD3,FIELD4,FIELD8,FIELD9)+
        +LOAD WHEN RECORD_LAYER= 'SEGMENT' (FIELD1,FIELD2,FIELD4,FIELD5)+
      )
      LOCATION ('sps_dataload.csv')
    )
    REJECT LIMIT UNLIMITED;
    While I was reading the external table documentation, I thought I could achieve something similar by using the position_spec option, but I am not getting the hang of its parameters. I have highlighted in italics the part of the code above (from LOAD WHEN ... FIELDS ...) that I think I am going to use, but I am not sure of its construct.
    Thank you for your help!! Appreciate your thoughts on this..
    Sanders.

  • How to display data from local csv files (in a folder on my desktop) in my flex air application using a datagrid?

    Hello, I am very new to Flex and don't have a programming background. I am trying to create an AIR app with Flex that looks at a folder on the user's desktop where csv files will be dropped by the user. In the AIR app the user will be able to browse and look for a specific csv file in a list container; once it is selected, the information from that file should be displayed in a datagrid below. Finally I will be using AlivePDF to create a pdf from the information in this datagrid, laid out in an invoice format. Below is the source code for my app as a visual reference; it only has the containers with no working code. I have also attached a sample csv file so you can see what I am working with. Can this be done? How do I do this? Please help.
    <?xml version="1.0" encoding="utf-8"?>
    <mx:WindowedApplication xmlns:mx="http://www.adobe.com/2006/mxml" layout="absolute" width="794" height="666">
        <mx:Label x="280" y="19" text="1. Select Purchase Order"/>
        <mx:List y="45" width="232" horizontalCenter="0"></mx:List>
        <mx:Label x="158" y="242" text="2. Verify Information"/>
        <mx:DataGrid y="268" height="297" horizontalCenter="0" width="476">
            <mx:columns>
                <mx:DataGridColumn headerText="Column 1" dataField="col1"/>
                <mx:DataGridColumn headerText="Column 2" dataField="col2"/>
                <mx:DataGridColumn headerText="Column 3" dataField="col3"/>
            </mx:columns>
        </mx:DataGrid>
        <mx:Label x="355" y="606" text="3. Generated PDF"/>
        <mx:Button label="Click Here" horizontalCenter="0" verticalCenter="311"/>
    </mx:WindowedApplication>

    Open the file, parse it, populate an ArrayCollection or XMLListCollection, and make the collection the DataGrid dataProvider:
    http://livedocs.adobe.com/flex/3/html/help.html?content=Filesystem_08.html
    http://livedocs.adobe.com/flex/3/html/help.html?content=12_Using_Regular_Expressions_01.html
    http://livedocs.adobe.com/flex/3/html/help.html?content=dpcontrols_6.html
    http://livedocs.adobe.com/flex/3/langref/mx/collections/ArrayCollection.html
    http://livedocs.adobe.com/flex/3/langref/mx/collections/XMLListCollection.html
    If this post answered your question or helped, please mark it as such.
