Import CSV Is Incorrect - Misinterprets RETURN within Strings as New Row

When importing a CSV file into Numbers 3.0.1, the result is wrong because RETURNs within strings are interpreted as new rows when they should be ignored.
How can this be worked around?

If the CSV is properly formatted and the strings that contain the carriage return(s) are enclosed in quotes, it works fine (at least it does for me). If the strings do not have quotes around them, the returns should be interpreted as a new row.
The workaround would be to somehow get quotes around those strings before importing into Numbers. I'm not sure how to do that except manually in a text editor, or by having the app that created the CSV format it correctly.
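For reference, here is a correctly quoted fragment (illustrative values): a compliant importer should read it as a header row plus two records, keeping the line break inside the second field of the first record.
Name,Notes
"Smith, John","first line
second line"
"Jones, Mary",single line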

Similar Messages

  • Need to send text string to new row in expandable table

    I have a table that is a summary of other choices made throughout the document.  For specific check boxes in the form, there is a defined phrase that needs to be sent to a new row in the summary table.
    The general idea is that for checkbox:
    form1.sfMain.sfContent.tableC2.Row1.cbC2b6
    if (this.rawValue == "1") {
        this.resolveNode('form1.sfApp6.tableApp6A._Row1').addInstance(1);
        this.resolveNode("form1.sfApp6.tableApp6A.Row1[*].ddApp6A1c").rawValue = "Blue bottle";
    }
    This needs to be something I can specify for each applicable checkbox (each will have a different text string), but I don't know what the final count is going to be (depends on choices made by the user) so I can't delete rows in the table and start over every time.  It needs to be additive.
    Any suggestions?

    For anyone needing the same thing, I found exactly what I needed:
    http://www.truetechtroubleshooting.com/2012/02/advanced-expanding-tables-and-script.html
    The hardest part of getting it working was figuring out how many parent levels I needed to add since I have a heavily layered form.

  • How to gracefully import CSV files into Numbers.app v3?

    I've posted this as a question at Stack Exchange, also…
    I have a process, which has worked faithfully for years in Numbers '09, whereby I download my Bank account data in CSV format, then drag that data directly into my Numbers sheet (after creating an appropriate number of blank rows).
    This no longer works, and I can't find an option to import csv! A help search within numbers for 'csv' returns a single result which describes exporting.
    I've tried the old method, no dice. Also, the menu option: Insert > Choose… doesn't permit .csv to be selected.
    At the moment, my workaround is:
    drag the CSV onto the Numbers dock or task-switcher icon to create a temp sheet
    select and copy the content
    paste-and-match-format into my desired location
    close don't save the temp file
    Does anyone know a better way? A hack or hidden flag I can toggle to get the old functionality back?

    Hi David,
    Your CSV file seems to be a hybrid (mongrel). Where did it originate?
    Regions that use a full-stop (.) as the decimal point use comma (,) as the separator in CSV files. But wait, there is more! CSV can also mean Character Separated Values, and that character could be semi-colon (;) or Tab.
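    For illustration (made-up values), here is the same record in the two flavours described above:
    Comma as separator, full-stop as decimal point:   01/08/2014,"Grocery store",-45.10
    Semicolon as separator, comma as decimal point:   01/08/2014;"Grocery store";-45,10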
    Are you able to open that file in TextEdit or Pages or Word and replace the separator? Replacing ; with Tab may work.
    I must admit that I have never had trouble with CSV opening in Numbers with a double-click in Finder. For example, my bank statements download with what works in my Region (strings enclosed with "double quotes", columns separated by commas, and numbers with a full-stop as the decimal point). Maybe I am just lucky.
    Regards,
    Ian.

  • Importing CSV file and parsing it

    First of all I am very new to writing powershell code.  Therefore, my question could be very rudimentary, but I cannot find an answer, so please help.
    I'm trying to read a CSV file and parse it.  I cannot figure out how to access nth element without hardcoding its name.
    $data = Import-Csv $file   #import CSV file
    # read column headers (manually read the first row of the data file, or import it from other source, or ...)
    $file_dump = Get-Content $file  #OK, I'm sure there is another way to get just the first line, but that's not relevant
    $name_list = $file_dump[0].split(",")
    # access element
    $temp = $data[$i].Name  # works - but that's HARDCODING the column name into the script - what if someone changes it?
    #but what I want to do is
    $temp = $data[$i].$name_list[0]
    How do I do this in PowerShell?

    So you're asking how to get the first data point from the first column, no matter what the header is?
    Why won't you know what your input file looks like?
    You can always drop the first line of the file to remove the existing headers and then use the -Header parameter of Import-Csv to give yourself known headers to reference (this will only work if you know how many columns to expect, etc etc etc).
    http://ss64.com/ps/import-csv.html
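    As a minimal sketch (the -Header names 'Col1', 'Col2', 'Col3' are placeholders, not from your file), both approaches look like this:
    $data = Import-Csv $file
    # The property names Import-Csv created, i.e. the column headers:
    $name_list = $data[0].psobject.Properties.Name
    # Read row $i's first column without hardcoding the header name.
    # The parentheses make PowerShell index the array before the property lookup:
    $temp = $data[$i].($name_list[0])
    # Or drop the original header line and supply known headers yourself:
    $data = Get-Content $file | Select-Object -Skip 1 | ConvertFrom-Csv -Header 'Col1','Col2','Col3'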
    Don't retire TechNet! -
    (Don't give up yet - 13,085+ strong and growing)

  • How to import csv data into Oracle using c#

    Hello,
    How do I import CSV data into Oracle using C#? The data to be imported is 3 GB in size with 7,512,263 rows. I've managed to import the CSV data into Oracle, but it takes about 1 hour. How can I speed up the import? Thank you.
    This is my code:
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Text;
    using System.Diagnostics;
    using System.Threading;
    using System.Text.RegularExpressions;
    using System.IO;
    using FileHelpers;
    using System.Data.OracleClient;
    namespace sqlloader
    class Program
    static void Main(string[] args)
    int jum;
    int i;
    bool isFirstLine = false;
    FileHelperEngine engine = new FileHelperEngine(typeof(XL_XDR));
    //Connect To Database
    string constr = "Data Source=(DESCRIPTION=(ADDRESS_LIST="
    + "(ADDRESS=(PROTOCOL=TCP)(HOST= pt-9a84825594af )(PORT=1521 )))"
    + "(CONNECT_DATA=(SERVER=DEDICATED)(SERVICE_NAME=o11g)));"
    + "User Id=xl;Password=rahasia;";
    OracleConnection con = new OracleConnection(constr);
    con.Open();
    // To Read Use:
    XL_XDR[] res = engine.ReadFile("DataOut.csv") as XL_XDR[];
    jum = CountLinesInFile("DataOut.csv");
    FileInfo f2 = new FileInfo("DataOut.csv");
    long s2 = f2.Length;
    int jmlRecord = jum - 1;
    for (i = 0; i < jum; i++)
    ShowPercentProgress("Processing...", i, jum);
    Thread.Sleep(100);
    if (isFirstLine == false)
    isFirstLine = true;
    else
    string sql = "INSERT INTO XL_XDR (XDR_ID, XDR_TYPE, SESSION_START_TIME, SESSION_END_TIME, SESSION_LAST_UPDATE_TIME, " +
    "SESSION_FLAG, VERSION, CONNECTION_ROW_COUNT, ERROR_CODE, METHOD, HOST_LEN, HOST, URL_LEN, URL, CONNECTION_START_TIME, " +
    "CONNECTION_LAST_UPDATE_TIME, CONNECTION_FLAG, CONNECTION_ID, TOTAL_EVENT_COUNT, TUNNEL_PAIR_ID, RESPONSIVENESS_TYPE, " +
    "CLIENT_PORT, PAYLOAD_TYPE, VIRTUAL_TYPE, VID_CLIENT, VID_SERVER, CLIENT_ADDR, SERVER_ADDR, CLIENT_TUNNEL_ADDR, " +
    "SERVER_TUNNEL_ADDR, ERROR_CODE_2, IPID, C2S_PKTS, C2S_OCTETS, S2C_PKTS, S2C_OCTETS, NUM_SUCC_TRANS, CONNECT_TIME, " +
    "TOTAL_RESP, TIMEOUTS, RETRIES, RAI, TCP_SYNS, TCP_SYN_ACKS, TCP_SYN_RESETS, TCP_SYN_FINS, EVENT_TYPE, FLAGS, TIME_STAMP, " +
    "EVENT_ID, EVENT_CODE) VALUES (" +
    "'" + res.XDR_ID + "', '" + res[i].XDR_TYPE + "', '" + res[i].SESSION_START_TIME + "', '" + res[i].SESSION_END_TIME + "', " +
    "'" + res[i].SESSION_LAST_UPDATE_TIME + "', '" + res[i].SESSION_FLAG + "', '" + res[i].VERSION + "', '" + res[i].CONNECTION_ROW_COUNT + "', " +
    "'" + res[i].ERROR_CODE + "', '" + res[i].METHOD + "', '" + res[i].HOST_LEN + "', '" + res[i].HOST + "', " +
    "'" + res[i].URL_LEN + "', '" + res[i].URL + "', '" + res[i].CONNECTION_START_TIME + "', '" + res[i].CONNECTION_LAST_UPDATE_TIME + "', " +
    "'" + res[i].CONNECTION_FLAG + "', '" + res[i].CONNECTION_ID + "', '" + res[i].TOTAL_EVENT_COUNT + "', '" + res[i].TUNNEL_PAIR_ID + "', " +
    "'" + res[i].RESPONSIVENESS_TYPE + "', '" + res[i].CLIENT_PORT + "', '" + res[i].PAYLOAD_TYPE + "', '" + res[i].VIRTUAL_TYPE + "', " +
    "'" + res[i].VID_CLIENT + "', '" + res[i].VID_SERVER + "', '" + res[i].CLIENT_ADDR + "', '" + res[i].SERVER_ADDR + "', " +
    "'" + res[i].CLIENT_TUNNEL_ADDR + "', '" + res[i].SERVER_TUNNEL_ADDR + "', '" + res[i].ERROR_CODE_2 + "', '" + res[i].IPID + "', " +
    "'" + res[i].C2S_PKTS + "', '" + res[i].C2S_OCTETS + "', '" + res[i].S2C_PKTS + "', '" + res[i].S2C_OCTETS + "', " +
    "'" + res[i].NUM_SUCC_TRANS + "', '" + res[i].CONNECT_TIME + "', '" + res[i].TOTAL_RESP + "', '" + res[i].TIMEOUTS + "', " +
    "'" + res[i].RETRIES + "', '" + res[i].RAI + "', '" + res[i].TCP_SYNS + "', '" + res[i].TCP_SYN_ACKS + "', " +
    "'" + res[i].TCP_SYN_RESETS + "', '" + res[i].TCP_SYN_FINS + "', '" + res[i].EVENT_TYPE + "', '" + res[i].FLAGS + "', " +
    "'" + res[i].TIME_STAMP + "', '" + res[i].EVENT_ID + "', '" + res[i].EVENT_CODE + "')";
    OracleCommand command = new OracleCommand(sql, con);
    command.ExecuteNonQuery();
    Console.WriteLine("Successfully Inserted");
    Console.WriteLine();
    Console.WriteLine("Number of Row Data: " + jmlRecord.ToString());
    Console.WriteLine();
    Console.WriteLine("The size of {0} is {1} bytes.", f2.Name, f2.Length);
    con.Close();
    static void ShowPercentProgress(string message, int currElementIndex, int totalElementCount)
    if (currElementIndex < 0 || currElementIndex >= totalElementCount)
    throw new InvalidOperationException("currElement out of range");
    int percent = (100 * (currElementIndex + 1)) / totalElementCount;
    Console.Write("\r{0}{1}% complete", message, percent);
    if (currElementIndex == totalElementCount - 1)
    Console.WriteLine(Environment.NewLine);
    static int CountLinesInFile(string f)
    int count = 0;
    using (StreamReader r = new StreamReader(f))
    string line;
    while ((line = r.ReadLine()) != null)
    count++;
    return count;
    [DelimitedRecord(",")]
    public class XL_XDR
    public string XDR_ID;
    public string XDR_TYPE;
    public string SESSION_START_TIME;
    public string SESSION_END_TIME;
    public string SESSION_LAST_UPDATE_TIME;
    public string SESSION_FLAG;
    public string VERSION;
    public string CONNECTION_ROW_COUNT;
    public string ERROR_CODE;
    public string METHOD;
    public string HOST_LEN;
    public string HOST;
    public string URL_LEN;
    public string URL;
    public string CONNECTION_START_TIME;
    public string CONNECTION_LAST_UPDATE_TIME;
    public string CONNECTION_FLAG;
    public string CONNECTION_ID;
    public string TOTAL_EVENT_COUNT;
    public string TUNNEL_PAIR_ID;
    public string RESPONSIVENESS_TYPE;
    public string CLIENT_PORT;
    public string PAYLOAD_TYPE;
    public string VIRTUAL_TYPE;
    public string VID_CLIENT;
    public string VID_SERVER;
    public string CLIENT_ADDR;
    public string SERVER_ADDR;
    public string CLIENT_TUNNEL_ADDR;
    public string SERVER_TUNNEL_ADDR;
    public string ERROR_CODE_2;
    public string IPID;
    public string C2S_PKTS;
    public string C2S_OCTETS;
    public string S2C_PKTS;
    public string S2C_OCTETS;
    public string NUM_SUCC_TRANS;
    public string CONNECT_TIME;
    public string TOTAL_RESP;
    public string TIMEOUTS;
    public string RETRIES;
    public string RAI;
    public string TCP_SYNS;
    public string TCP_SYN_ACKS;
    public string TCP_SYN_RESETS;
    public string TCP_SYN_FINS;
    public string EVENT_TYPE;
    public string FLAGS;
    public string TIME_STAMP;
    public string EVENT_ID;
    public string EVENT_CODE;
    I hope someone can give me a solution. Thanks

    The fastest way is to use external tables or sql loader (sqlldr). (If you use external tables or sql loader, you don't use C# at all).
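    For reference, a minimal SQL*Loader sketch under the same assumptions as the code above (table XL_XDR, file DataOut.csv, one header row to skip); the control-file name is hypothetical and only the first few columns are listed, the rest would follow in the same CSV order:
    -- load_xl_xdr.ctl (hypothetical name)
    OPTIONS (SKIP=1)
    LOAD DATA
    INFILE 'DataOut.csv'
    APPEND INTO TABLE XL_XDR
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (XDR_ID, XDR_TYPE, SESSION_START_TIME, SESSION_END_TIME, SESSION_LAST_UPDATE_TIME)
    It would then be run with something like: sqlldr xl/rahasia@o11g control=load_xl_xdr.ctl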

  • How to find out what was imported from Import-CSV

    I have an input file that could contain any CSV field names in the headers. After I run Import-CSV, I want to find the names of the fields that were imported. How can I do this?

    Hi Patrick,
    generally speaking, you can figure out a lot about what you are working with by using the PowerShell commandline, not a script. You can inspect an object's 'attributes' - as shown by Adam - using the Get-Member cmdlet (You can do this to the content of any
    variable, not just Csv imports):
    # Get content of Csv file
    $Data = Import-Csv "C:\temp\Input.csv"
    # List properties:
    $Data | Get-Member
    The resulting information is a table with 3 columns:
    - Name: The Name of the attribute
    - MemberType: This tells you what it is. The two Types usually of interest are "Method" and anything containing the word "Property". Method does something (Save a file, disable a Mailbox, etc.). Property is a piece of Information (eg. UserName,
    Email, Address, ...).
    - Definition: The definition-column is in my opinion the most interesting aspect of the output of Get-Member. This tells you how to interact with the objects' 'attribute' in question. In case of methods, it shows how to call it and what it returns (Let's say
    you want to save a file: The Method Save() might then require you to specify a path where to save the file).
    In case of Properties however it tells you what kind of information is stored in the Property. If you import a csv, that's always a string for each column, since all you can store in a csv file is text. In some cases, when the console behaves differently to
    what you'd expect, check this out - it may well be that the property is of a type you didn't expect. All Types can be looked up on MSDN (just copy paste the full type-name to google and it'll send you there).
    Learn to use the Get-Member cmdlet and you can figure out how to work with what the console gives you all on your own (Seriously, it's one of the three most important cmdlets in all of PowerShell). Try reading:
    Get-Help Get-Member -Examples
    Those examples explain the use of Get-Member better than I ever could :)
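    To answer Patrick's original question directly, a minimal sketch (the file path is just an example):
    $Data = Import-Csv "C:\temp\Input.csv"
    # The imported field names are the NoteProperty members Import-Csv created:
    $fieldNames = ($Data | Get-Member -MemberType NoteProperty).Name
    # Equivalent, without Get-Member:
    $fieldNames = $Data[0].psobject.Properties.Name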
    Cheers,
    Fred
    There's no place like 127.0.0.1

  • Address Book -- Importing CSV File created by Numbers

    I'm trying to import a csv file generated from MS Outlook into Address Book. When I attempt to import the csv file that Outlook created, Address Book is able to recognize the file. However, I need to modify the file before importing. To do that, I opened it in Numbers, deleted some columns and renamed some columns. Then, I exported from Numbers as a .CSV file. However, when I try to import the .CSV file that I modified with Numbers, I get the error "Text file import failure".
    Does anybody know why address book can't import a CSV file created by Numbers?

    Many people have run into this problem, including myself. I had worked many hours on a spreadsheet in Numbers, exported to CSV, and received the error message when trying to import into Address Book. The answers to this dilemma are out there, but tidbits of the whole answer are spread around discussion boards. So I hope to consolidate those answers here.
    The main problem for me -- and it's reasonable to assume that others receive the "Text File import error" for the same reason -- is that 1) I had commas in my spreadsheet, and 2) I had hidden RETURNs in the spreadsheet. Once I got rid of those two things, I had no problem importing into Address Book.
    Deleting Commas
    The problem with commas in your spreadsheet is that the CSV format uses commas to distinguish when a new cell is formed. So having other commas will inevitably confuse Address Book during the attempted import. I didn't have too many commas in my spreadsheet, so I was able to quickly remove them. Yet, if you have dozens or hundreds of cells with commas in them, you'll need an easier solution. I imagine opening Number's Inspector>Cell Inspector>Format will have the solution of removing commas from, say, currency amounts or something else. Yet, how many commas arise in contact information? Perhaps the easiest way is to use the same method of removing those hidden RETURNs. Read on...
    Deleting Hidden RETURNs
    You can't see them, and I couldn't find a "show invisibles" option in Numbers. So you need to search and find them. Yet, how do you search for something hidden? Do the Command-F thing to bring up the search field. The search bar should automatically be ready to search what you type. Hitting the Return key will do nothing for you, but hitting Shift-Option-Return will (you'll likely need to hit Return again to actually perform the search). Numbers will point out how many times these hidden returns are in your spreadsheet, which cells contain them, and give you the option to automatically replace them with something else (a space is a safe option). You can do the same search and replace with commas too.
    Save your spreadsheet, export it as CSV, and -- as long as the only errors stemmed from commas and hidden RETURNs -- you shouldn't have a problem having Address Book import it. Good luck!
    PS- After importing my spreadsheet, I noticed that some of my Smart Groups in Address Book were not showing some people who should have automatically been in there. The problem was that the Smart Group search had straight apostrophes in it, but the imported contact had curved apostrophes. The computer saw these as two unrelated characters. After some copying and pasting, the problem was solved.

  • Returning a string from one class to another

    I have a file from which I am creating an object of another class and calling its functions.
    This is how I am doing it:
    public login()
    String input =new String (passwordField.getPassword());
         input= input.trim();
    tempuser= userField.getText();
              db2sqln patientsqln =new db2sqln(); //creating an object
              patientsqln.connectdb2();// calling a connect to database function
          patientsqln.dbselect(tempuser);// passing the login information
    In patientsqln I am getting the password from the database and want to pass it back to the login class so that I can compare the entered password with the database password.
    Here is how I am using patientsqln.dbselect():
    //public void dbselect(String userinput)
              try
              Statement stmt = con.createStatement();
              String select = "Select patientpass from patient where PATIENTUSER="+"'"+userinput+"'";
              ResultSet rs = stmt.executeQuery(select);
              while (rs.next())
              name=rs.getString("PATIENTPASS");
              name = name.trim();
              System.out.println(name+"from db2sqln");
              rs.close();
    My question is that everything else works fine, but I do not get any password back in the login() class; it returns a null value. It looks like I am not returning the string back. How should I do that?
    I am comparing like this
    if(input.equals("name"))
    System.out.println("Correct password");
    else{
    System.out.println("Incorrect password");
    //

  • Using PowerShell to import CSV data from Vendor database to manipulate Active Directory Users

    Hello,
    I have a big project I am trying to automate.  I am working in a K-12 public education IT Dept. and have been tasked with importing data that has been exported from a vendor database via .csv file into Active Directory to manage student accounts. 
    My client wants to use this data to make bulk changes  to student user accounts in AD such as moving accounts from one OU to another, modifying account attributes based on State ID, lunchroom ID, School, Grade, etc. and adding new accounts / disabling
    accounts for students no longer enrolled.
    The .csv that is exported doesn't have headers that match up with what is needed for importing in AD, so those have to be modified in this process, or set as variables to get the correct info into the correct attributes in AD or else this whole project is
    a bust.  He is tired of manually manipulating the .csv data and trying to get it onto AD with few or no errors, hence the reason it has been passed off to me.
    Since this information changes practically daily, I need a way to automate user management by accomplishing the following on a scheduled basis.
    Process must:
    Check to see if Student Number already exists
    If yes, then modify account
    Update {School Name}, {Site Code}, {School Number}, {Grade Level} (Variables)
    Add correct group memberships (School / Grade Specific)
    Move account to correct OU (OU={Grade},OU=Students,OU=Users,OU={SiteCode},DC=Domain,DC=net)
    Remove incorrect group memberships (School / Grade Specific)
    Set account status (enabled / disabled)
    If no, create account
    Import Student #
    Import CNP #
    Import Student name
    Extract First and Middle initial
    If duplicate name exists
    Create log entry for review
    Import School, School Number, Grade Level
    Add to correct Group memberships (School / Grade Specific)
    Set correct OU (OU={Grade},OU=Students,OU=Users,OU={SiteCode},DC=Domain,DC=net)
    Set account Status
    I am not familiar with Powershell, but have researched enough to know that it will be the best option for this project.  I have seen some partial solutions in VB, but I am more of an infrastructure person instead of scripting / software development. 
    I have just started creating a script and already have hit a snag.  Maybe one of you could help.
    #Connect to Active Directory
    Import-Module ActiveDirectory
    # Import iNOW user information
    $Users = import-csv C:\ADUpdate\INOW_export.csv
    #Check to see if the account already exists in AD
    ForEach ( $user in $users )
    #Assign the content to variables
    $Attr_employeeID = $users."Student Number"
    $Attr_givenName = $users."First Name"
    $Attr_middleName = $users."Middle Name"
    $Attr_sn = $users."Last Name"
    $Attr_postaldeliveryOfficeName = $users.School
    $Attr_company = $users."School Number"
    $Attr_department = $users."Grade Level"
    $Attr_cn = $Attr_givenName.Substring(0,1) + $Attr_middleName.Substring(0,1) + $Attr_sn
    IF (Get-ADUser $Attr_cn)
    {Write-Host $Attr_cn already exists in Active Directory

    Thank you for helping me with that before it became an issue later on; however, even when modified to be $Attr_sAMAccountName I still get errors.
    #Connect to Active Directory
    Import-Module ActiveDirectory
    # Import iNOW user information
    $Users = import-csv D:\ADUpdate\Data\INOW_export.csv
    #Check to see if the account already exists in AD
    ForEach ( $user in $users )
    #Assign the content to variables
    $Attr_employeeID = $users."Student Number"
    $Attr_givenName = $users."First Name"
    $Attr_middleName = $users."Middle Name"
    $Attr_sn = $users."Last Name"
    $Attr_postaldeliveryOfficeName = $users.School
    $Attr_company = $users."School Number"
    $Attr_department = $users."Grade Level"
    $Attr_sAMAccountName = $Attr_givenName.Substring(0,1) + $Attr_middleName.Substring(0,1) + $Attr_sn
    IF (Get-ADUser $Attr_sAMAccountName)
    {Write-Host $Attr_sAMAccountName already exists in Active Directory
    PS C:\Windows\system32> D:\ADUpdate\Scripts\INOW-AD.ps1
    Get-ADUser : Cannot convert 'System.Object[]' to the type 'Microsoft.ActiveDirectory.Management.ADUser'
    required by parameter 'Identity'. Specified method is not supported.
    At D:\ADUpdate\Scripts\INOW-AD.ps1:28 char:28
    + IF (Get-ADUser $Attr_sAMAccountName)
    + ~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : InvalidArgument: (:) [Get-ADUser], ParameterBindingException
    + FullyQualifiedErrorId : CannotConvertArgument,Microsoft.ActiveDirectory.Management.Commands.GetAD
    User
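    The error itself points at the likely cause: inside the loop the script reads from $users (the whole collection) instead of $user (the current record), so $Attr_sAMAccountName becomes an array and Get-ADUser receives System.Object[]. A minimal sketch of the corrected check, assuming the same column headers; -Filter is used because -Identity throws a terminating error when the account does not exist:
    Import-Module ActiveDirectory
    $Users = Import-Csv D:\ADUpdate\Data\INOW_export.csv
    ForEach ($user in $Users) {
        # Read from $user, the single record for this iteration
        $Attr_givenName  = $user."First Name"
        $Attr_middleName = $user."Middle Name"
        $Attr_sn         = $user."Last Name"
        $Attr_sAMAccountName = $Attr_givenName.Substring(0,1) + $Attr_middleName.Substring(0,1) + $Attr_sn
        if (Get-ADUser -Filter "sAMAccountName -eq '$Attr_sAMAccountName'") {
            Write-Host "$Attr_sAMAccountName already exists in Active Directory"
        }
    }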

  • How to import csv-file in Numbers 3.2.2.

    I started using Numbers instead of Excel. I would like to import csv-files from my bank, but when I open the csv-file in Numbers, everything is imported into the same cell. I composed a test file: 01/08/2014,”text”,”more text”,”even more text” in Pages, exported to a text file and changed the extension from .txt to .csv. It did not help, everything was in the same cell. What must be changed to become successful in importing csv-files? I am using Numbers 3.2.2. and an iMac with a 2,8 GHz Intel Core i7 processor and 8 GB 1067 MHz DDR3 memory with OS X 10.9.4.
    Thanks, Joan Voormolen

    You can do this using Pages. Without using outside scripts or functions. The Pages Find/Replace function will let you change the delimiter on the data in your file.
    Open the file in Pages. Click Show Invisibles. (this will show you the delimiter used in the file)
    If you see a * as the delimiter, that is a space. Some data files are space delimited. This is a really poor way to delimit numerical data files.
    If you see a fat arrow to the right, the file is Tab delimited
    Obviously, a comma is not a hidden character. Some files are comma delimited
    Whatever else might have been used as a delimiter (for example a semi colon is sometimes used) will be apparent.
    The delimiter should be something that is not used anywhere else in the "data"... text, numbers, etc., you want to delimit. Numbers considers a comma as a valid delimiter for files with the suffix .csv. It considers a tab as a valid delimiter for files with the suffix .txt. It does not consider spaces a valid delimiter with any file suffix. But some programs use odd delimiters (semicolon, colon, double spaces, etc.).
    Use the Find command, then Find/Replace as you need, to create a delimiter Numbers recognizes. Let's say a semicolon was used as a delimiter. Enter the current delimiter (semicolon) into the Find box. Pages should highlight all the instances of your entry. Enter a comma (to create a comma delimited data file) in the Replace box. You should now see a comma as the delimiter.
    Important: Don't forget, any other comma used in the file will also be considered a delimiter (a comma in 1,000 for example). So check the data. If you see a comma used another way you will want to eliminate that BEFORE you do the "comma as delimiter" replacement. If you have 1,000, do a find/replace with comma as the find, nothing as the replace, first. THEN do the replacement of the semicolon.
    Now comes the "tricky" part from what I could see. You want to save this new file with a suffix of .csv (Export the file). Numbers will only open a comma delimited file with the data separated (by comma) if its suffix is .csv. Pages only gives you limited export options and puts the file suffix on for you automatically. CSV is not one of the options!
    Choose Text. Pages will name the file .txt. Quit Pages. Go to the file on your desktop (or wherever you saved it). Change the file suffix from .txt to .csv.
    That's it. Open the file with Numbers. Numbers will create a separate column for everything between the commas.
    You can use this same method to alter your data file before you import it into Numbers. For example, one file I wanted to import had time=xxx . I only wanted the actual time, not the text attached to it, in my spreadsheet. I did a find/replace with "time=" as the find. A comma as the replace. Even though "time=xxx" is one "word", Pages identified the "time=" within the word to allow the replacement.
    Numbers does not provide a "choose delimiter" function when opening a file. Instead it automatically uses the standard delimiter based on the file suffix. CSV means Comma, so if the file is named .csv it will only look for and use a comma as the delimiter to put the data into separate columns. I believe .txt uses only a tab as the delimiter. In the above example you could find/replace to a Tab. Then Export to Text. And numbers will open the data into columns the way you want, without the extra step of renaming the file on your desktop.
    While some files use a second space (ie two in a row) as a delimiter that's a nasty way to delimit. You always want a specific delimiter that is not used within the data element.
    The above is to import numerical data into separate columns. You could use the same method to manipulate a file that contains text. Let's say you had a file with the suffix .txt. In the file are names and addresses.  John Smith 246 Rose Road . You want Name in one column. Address in another.  Look at all the spaces, which ones should be delimiters which not? Are there any delimiters in the file?
    If you open with Pages and choose show invisibles you can see. You might see John Smith --> 246 Rose Road. (the --> will look like a fat arrow in Pages). Numbers will open this file, IF it has .txt as the suffix, based on the Tab,  with name in one column, Address in another.
    Or you might see John*Smith**246*Rose*Road. Even though the creator of this intended two spaces to be a delimiter Numbers does not recognize that. Numbers will put everything into one column. The fix? In Pages, put a tab between name and address. Find/replace two spaces with Tab. Export, as Text.
    Based on what you see (with show invisible active) in Pages, you can use the Find/Replace function to create the specific delimiter you want (tab or comma). You can use that function to manipulate the file easily so the data you want shows up in separate columns. You may need to get clever to accomplish the unique delimiters. You might even need to do two passes with Find/Replace.
    In the instance above, if there was only one space between each element (not two as a pseudo-delimiter), you could replace all spaces with a tab in Pages. Export as Text. Numbers will open that file with a column for each word (one for John, one for Smith). Then "Merge" the two cells (columns) you want to put back together.

  • Returning XML String From Servlet

              Is there a simple way to disable the HTML character escaping when returning
              a string from a servlet. The returned string contains well formed XML, and
              I don't want the tags converted to &gt; and &lt; meta characters in the
              HTTP reply.
              The code is basically "hello world", version 7.0 SP2.
              Thanks
              > package xxx.servlet;
              >
              > import weblogic.jws.control.JwsContext;
              >
              >
              > /**
              > * @jws:protocol http-xml="true" form-get="false" form-post="false"
              > */
              > public class HelloWorld
              > {
              > /** @jws:context */
              > JwsContext context;
              >
              > /**
              > * @jws:operation
              > * @jws:protocol http-xml="false" form-post="true" form-get="false" soap-s
              tyle="document"
              > * @jws:return-xml xml-map::
              > * <HelloWorldResponse xmlns="http://www.xxx.com/">
              > * {return}
              > * </HelloWorldResponse>
              >
              > * ::
              > * @jws:parameter-xml xml-map::
              > * <HelloWorld xmlns="http://www.xxx.com/">
              > * <ix>{ix}</ix>
              > * <contents>{contents}</contents>
              > * </HelloWorld>
              >
              > * ::
              > */
              > public String HelloWorld(String s)
              > {
              > return "<a> xyz </a>";
              > }
              > }
              

              Radha,
              We have a client/server package which speaks SOAP over a
              streaming HTTP channel for which we are writing a WebLogic
              servlet. For reasons of efficiency, we want to deserialize
              only the very top-level tags of the messages as they pass
              through the servlet. Yes, in theory, we should probably
              deserialize and validate the entire message contents...
              When we add support for other clients, we will fully
              deserialize inside those servlets.
              I have not looked any further into how to stop the inner
              tags from being escaped yet -- it is an annoyance more than
              a disaster, since the client handles meta escapes.
              My current guess is to use ECMAScript mapping...
              -Tony
              "S.Radha" <[email protected]> wrote:
              >
              >"Tony Hawkins" <[email protected]> wrote:
              >>
              >>Is there a simple way to disable the HTML character escaping when returning
              >>a string from a servlet. The returned string contains well formed XML,
              >>and
              >>I don't want the tags converted to &gt; and &lt; meta characters in the
              >>HTTP reply.
              >>
              >>The code is basically "hello world", version 7.0 SP2.
              >
              >>
              >>Thanks
              >>
              >>> package xxx.servlet;
              >>>
              >>> import weblogic.jws.control.JwsContext;
              >>>
              >>>
              >>> /**
              >>> * @jws:protocol http-xml="true" form-get="false" form-post="false"
              >>> */
              >>> public class HelloWorld
              >>> {
              >>> /** @jws:context */
              >>> JwsContext context;
              >>>
              >>> /**
              >>> * @jws:operation
              >>> * @jws:protocol http-xml="false" form-post="true" form-get="false"
              >>soap-s
              >>tyle="document"
              >>> * @jws:return-xml xml-map::
              >>> * <HelloWorldResponse xmlns="http://www.xxx.com/">
              >>> * {return}
              >>> * </HelloWorldResponse>
              >>>
              >>> * ::
              >>> * @jws:parameter-xml xml-map::
              >>> * <HelloWorld xmlns="http://www.xxx.com/">
              >>> * <ix>{ix}</ix>
              >>> * <contents>{contents}</contents>
              >>> * </HelloWorld>
              >>>
              >>> * ::
              >>> */
              >>> public String HelloWorld(String s)
              >>> {
              >>> return "<a> xyz </a>";
              >>> }
              >>> }
              >>
              >>
              >Hi Tony,
              >
              > Can you let me know for what purpose you want to disable the
              >HTML character
               >escaping. In case you have tried this using some way, please let me know.
              >
              >rgds
              >Radha
              >
              >
              

  • Queue returns blank strings from template VI

    I have created a master VI which creates a named queue; this VI then spawns multiple copies of a template VI. These sub VIs all gain access to the queue and try writing data to it. I understood that the queue VI was protected, so this should not cause a problem. In fact this does not work at all: when I extract elements within the master VI it only finds a few of the elements and returns blank strings in between. Help, what is going on? I am not destroying the queue or anything like that; it seems to be a problem with the protection of the queue when using template VIs.
    Cheers Tom.

    I have not been able to reproduce the problem on LV 6.0.2 Win98. Do you check the error cluster when an empty element is returned? Posting your code might help to find the problem.
    LabVIEW, C'est LabVIEW

  • Import-CSV and Takeown.

    I'm currently trying to come up with a way to search an entire folder directory to find all objects that a particular user is the owner of, export that list to csv, then import that csv and use takeown to grant ownership to local Administrators.
    Scouring the internet, I've been able to come up with this Powershell script to scan a directory and export the findings to a csv.
    param(
        [string]$username = "NameofUserSearchingFor",
        [string]$logfile
    )
    Set-ExecutionPolicy Unrestricted
    if ($logfile -eq "") {
        $logfile = "c:\" + $username + "-Owner.csv"
        Write-Host "Setting log file to $logfile"
    }
    #Path to search in
    [String]$path = "c:\TestFolder"
    [String]$AD_username = "Domain\" + $username
    #check that we have a valid AD user
    if (!(Get-ADUser $AD_username)){
        Write-Host "ERROR: Not a valid AD User: $AD_username"
        Exit 0
    }
    Write-Output "$username" | Out-File -FilePath $logfile
    $files = Get-ChildItem $path -Recurse
    ForEach ($file in $files) {
        $f = Get-Acl $file.FullName
        if ($f.Owner -eq $AD_username) {
            Write-Output $file.FullName | Out-File -FilePath $logfile -Append
        }
    }
    exit 0
    That script exports data in the form of:
    NameofUserSearchingFor
    C:\TestFolder\TestFolder1
    C:\TestFolder\TestFolder2
    C:\TestFolder\TestFolder1\test1.txt
    C:\TestFolder\TestFolder2\test2.txt
    I'd like to read each line of text and use takeown to take ownership for local Administrators.
    The script I'm trying to use doesn't do anything though.
    #Local Administrator Take Ownership
    $rows = Import-Csv "c:\NameofUserSearchingFor-Owner.csv"
    ForEach ($row in $rows) {
        takeown /A /F $row
    }
    Perhaps I'm going about this all wrong.  I'm relatively new to Powershell and have been trying to come up with a way to do this for the past 3 days.
    Any assistance would be greatly appreciated!
    *Update*
    If I reconfigure the takeown portion a bit to this:
    #Local Administrator Take Ownership
    $Path = "c:\NameofUserSearchingFor-Owner.csv"
    $rows = Import-Csv $Path
    ForEach ($line in $rows) {
        takeown /A /F $Path
    }
    The result is:
    SUCCESS: The file (or folder): "c:\NameofUserSearchingFor-Owner.csv" now owned by the administrators group.
    But it will repeat as many times as there are lines of text in the csv. So if there are 4 lines of text in the csv, that line will repeat 4 times. I find it interesting that it knows how many lines there are, but instead of granting local Administrators ownership of the path specified in each line, it grants local Administrators ownership of the csv file itself.

    That is because you are taking ownership of the CSV.
    ForEach ($line in $rows) {
        takeown /A /F $Path
    }
    should be:
    ForEach ($line in $rows) {
        takeown /A /F $line
    }
    ¯\_(ツ)_/¯
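    One more wrinkle worth noting: because the first line of the log file is the username, Import-Csv treats it as the header, so each $line is an object rather than a plain path string. A minimal sketch that avoids the header issue altogether, assuming the log layout shown above:
    # Skip the username header line, then pass each remaining path to takeown
    Get-Content "c:\NameofUserSearchingFor-Owner.csv" |
        Select-Object -Skip 1 |
        ForEach-Object { takeown /A /F $_ }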

  • Mav Contacts does not import csv file

    Trying to import my Outlook Exchange contacts to Mavericks Contacts. Exported from Outlook to Excel. Saved the Excel file to csv format. Tried to import the csv file to Mavericks Contacts: error message says the csv file is not compatible. Also imported the csv file to Numbers and saved again as another csv file. Same error message. Suggestions?

    The following was copied from the Contacts Help.
    Import contact information saved or exported from other apps. The files can be in LDAP Interchange Format (LDIF) or text format (tab-delimited or comma-separated value (CSV)).
    If you’re importing a tab-delimited or CSV file, make sure it’s formatted correctly using a text editor such as TextEdit.
    Remove any line breaks within a contact’s information.
    Make sure that fields in a tab-delimited file are separated by a tab, instead of by another character. Don’t include spaces before or after tabs.
    Make sure that fields in a CSV file are separated by a comma, instead of by another character. Don’t include spaces before or after commas.
    Make sure all addresses have the same number of fields. Add empty fields as needed.
    In Contacts, choose File > Import, select the file, change the encoding if necessary, then click Open.
    If you’re importing a text file, review the field labels.
    If the first card contains headers, make sure the headers are correctly labeled or marked “Do not import.” Any changes you make to this card are made to all cards in the file. To not import the headers card, select “Ignore first card.”
    To change a label, click the arrows next to the label and choose a new label. If you don’t want to import a field, choose “Do not import.”

  • Skip Blank Values in Import-CSV

    I am attempting to do a mass import of user attributes (phone number, address, city, state, zip code, title, company). All goes well until I hit a blank value in the CSV. Here is the Powershell script I am trying to use.
    Import-Csv IA-Test2.csv |
    ForEach-Object {
        $record = $_
        get-ADUser -LDAPFilter "(samaccountname=$($record.samaccountname))" |
        Set-ADUser -city $record.l -postalcode $record.postalcode -OfficePhone $record.OfficePhone -company $record.company -streetaddress $record.streetaddress -title $record.title
    }
    How can I modify this to skip blank values?

    Okay.  They're not really blank, but null.
    $properties = @(
        'city',
        'postalcode',
        'OfficePhone',
        'company',
        'streetaddress',
        'title'
    )
    Import-Csv IA-Test2.csv |
    ForEach-Object {
        $params = @{}
        foreach ($property in $properties) {
            if ($_.$property) {
                $params.$property = $_.$property
            }
        }
        get-ADUser -LDAPFilter "(samaccountname=$($_.samaccountname))" |
        Set-ADUser @params
    }
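    For anyone new to the @params syntax: that is PowerShell splatting. The hashtable keys become parameter names, so only the attributes that actually have a value are passed to Set-ADUser. A tiny illustration with made-up values:
    $params = @{ City = 'Des Moines'; Title = 'Teacher' }   # hypothetical values
    Set-ADUser jdoe @params   # equivalent to: Set-ADUser jdoe -City 'Des Moines' -Title 'Teacher'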
