Importing CSV dates - wrong format

Hi,
I have some CSV data I want to import into Numbers and the CSV has dates. I work with "British" style dd/mm/yy dates but the CSV is "American" mm/dd/yy. How can I persuade Numbers to translate between the two styles?
Thanks,
Dave

Run this script, or drag & drop the csv file onto the icon of the script (saved as an application).
The embedded dates will be changed from the US format (mm/dd/yy) to the English/French one (dd/mm/yy).
The edited csv file will be saved in the Temporary Items folder and opened by Numbers.
--[SCRIPT csvUStocsvUSEN]
--=====
-- Yvan KOENIG (VALLAURIS, France)
-- 2011/01/04
--=====
property theApp : "Numbers"
property permitted : {"public.comma-separated-values-text", "public.csv"}
--=====
on run
if my parle_anglais() then
set myPrompt to "Choose a csv file…"
else
set myPrompt to "Choisir un fichier csv…"
end if -- parleAnglais
set allowed to my permitted
if 5 = (system attribute "sys2") then (*
it's Mac OS X 10.5.x with a buggy Choose File *)
set allowed to {}
end if -- 5 = (system…
my commun(choose file with prompt myPrompt of type allowed without invisibles) (* an alias *)
end run
--=====
on open (sel)
my commun(item 1 of sel) (* an alias *)
end open
--=====
on commun(le_csv)
local uti, en_texte, nom_fichier, p2t
tell application "System Events"
try
set uti to type identifier of disk item ("" & le_csv)
on error
set uti to "???"
end try
end tell
if uti is not in permitted then
if my parle_anglais() then
error "The document “" & le_csv & "” isn’t a csv file!"
else
error "Le document « " & le_csv & " » n’est pas un fichier csv !"
end if
end if
set les_lignes to paragraphs of (read le_csv)
repeat with l from 1 to count of les_lignes
set une_ligne to item l of les_lignes
if une_ligne contains "/" then
set une_ligne to my decoupe(une_ligne, ",")
repeat with c from 1 to count of une_ligne
set maybe to my decoupe(item c of une_ligne, "/")
if (count of maybe) = 3 then
set item c of une_ligne to my recolle({item 2 of maybe, item 1 of maybe, item 3 of maybe}, "/")
end if -- count of maybe…
end repeat
set item l of les_lignes to my recolle(une_ligne, ",")
end if
-- set item l of les_lignes to my remplace(item l of les_lignes, ",", ";") -- Useful for French systems
end repeat
set en_texte to my recolle(les_lignes, return)
set nom_fichier to (do shell script "date +%Y%m%d%H%M%S.csv")
set p2t to (path to temporary items from user domain)
set lenouveaucsv to ("" & p2t & nom_fichier)
tell application "System Events" to make new file at end of p2t with properties {name:nom_fichier}
set la_longueur to count of en_texte
write en_texte to file lenouveaucsv
tell application "System Events"
repeat while size of file lenouveaucsv < la_longueur
delay 0.2
end repeat
end tell
tell application "Numbers" to open file lenouveaucsv
end commun
--=====
on decoupe(t, d)
local oTIDs, l
set oTIDs to AppleScript's text item delimiters
set AppleScript's text item delimiters to d
set l to text items of t
set AppleScript's text item delimiters to oTIDs
return l
end decoupe
--=====
on recolle(l, d)
local oTIDs, t
set oTIDs to AppleScript's text item delimiters
set AppleScript's text item delimiters to d
set t to l as text
set AppleScript's text item delimiters to oTIDs
return t
end recolle
--=====
-- replaces every occurrence of d1 with d2 in the text t
on remplace(t, d1, d2)
local oTIDs, l
set oTIDs to AppleScript's text item delimiters
set AppleScript's text item delimiters to d1
set l to text items of t
set AppleScript's text item delimiters to d2
set t to l as text
set AppleScript's text item delimiters to oTIDs
return t
end remplace
--=====
on parle_anglais()
return (do shell script "defaults read 'Apple Global Domain' AppleLocale") does not start with "fr_"
end parle_anglais
--=====
-- [/SCRIPT]
Yvan KOENIG (VALLAURIS, France) Tuesday, 4 January 2011 09:39:38
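
For anyone who prefers to do the same conversion outside AppleScript, here is a minimal Python sketch of the same idea (the file names are placeholders and this is not part of Yvan's script): any field made of exactly three slash-separated parts gets its first two parts swapped, mirroring the decoupe/recolle handlers above.

#!/usr/bin/env python3
# Sketch only: swap mm/dd/yy fields to dd/mm/yy in a CSV, then open the result in Numbers by hand.
# "us_dates.csv" and "uk_dates.csv" are placeholder names.
import csv

def swap_date(field):
    parts = field.split("/")
    if len(parts) == 3:  # treat a three-part slash value as a date, like the AppleScript does
        return "/".join([parts[1], parts[0], parts[2]])
    return field

with open("us_dates.csv", newline="") as src, open("uk_dates.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    for row in csv.reader(src):
        writer.writerow([swap_date(value) for value in row])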

Similar Messages

  • I'm getting a 'The Management Pack element is not declared' error when trying to import CSV data into my *extended* WindowsComputer class

    Background:
    I have a class called SUS_WindowsComputerMP, that is an extension of the Microsoft class, Microsoft.Windows.Computer
    I'm trying to import CSV data into this extended class and to the base class as well.
    Question:
    What am I doing wrong? I have a feeling that the Import CSV Format file is different for importing data into *extended* classes like mine, because the XML structure below would work for non-extended classes.
    "...Creating new CSVImporter
    Data Filename: D:\Peter\CMDB II\Exported MPs\TestMPs\SUS_WindowsComputer.csv
    Format Filename: D:\Peter\CMDB II\Exported MPs\SUS_WindowsComputerMP.xml
    Validating against XSD schema...
    The 'ManagementPack' element is not declared.
    Validation completed.
    Format file D:\Peter\CMDB II\Exported MPs\SUS_WindowsComputerMP.xml contains an invalid root element. Expected: root node with name \"CSVImportFormat\"
    Could not initialize a Management Object Creator from format file D:\Peter\CMDB II\Exported MPs\SUS_WindowsComputerMP.xml. Import thread exiting.
    My import format XML is this:
    <CSVImportFormat>
    <Class Type="ClassExtension_a3ae3e0f_d578_43dc_aa3e_9037a094763c" >
    <Property ID="WindowsServerID" />
    <Property ID="PrincipalName" />
    <Property ID="NetbiosComputerName" />
    <Property ID="IPAddress" />
    <Property ID="NetbiosDomainName" />
    <Property ID="DNSName" />
    <Property ID="OSVersionDisplayName" />
    <Property ID="SerialNo" />
    <Property ID="ServerDescription" />
    <Property ID="AssetTagNo" />
    <Property ID="ServerNameRow" />
    <Property ID="ChassisType" />
    <Property ID="InstallDate" />
    <Property ID="IsVirtualMachine" />
    <Property ID="BusinessUnitCustomersEnum" />
    <Property ID="RegionLocationEnum" />
    <Property ID="OtherFunctionRoleEnum" />
    <Property ID="ProductTypeEnum" />
    <Property ID="ObjectStatus" />
    <Property ID="AssetStatus" />
    <Property ID="CriticalityEnum" />
    <Property ID="EnvironmentEnum" />
    <Property ID="CostCodeClassEnum" />
    <Property ID="DataClassificationEnum" />
    <Property ID="Manufacturer" />
    </Class>
    </CSVImportFormat>

    Hello,
    Can anyone please help me out with this weird issue.
    thanks,
    orton

  • How to import csv data into Oracle using c#

    Hello,
How do I import CSV data into Oracle using C#? The data to be imported is 3 GB in size, with 7,512,263 rows. I've managed to import the CSV data into Oracle, but it takes about an hour. How can I speed up the import? Thank you.
    This is my code:
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Text;
    using System.Diagnostics;
    using System.Threading;
    using System.Text.RegularExpressions;
    using System.IO;
    using FileHelpers;
    using System.Data.OracleClient;
namespace sqlloader
{
class Program
{
static void Main(string[] args)
{
    int jum;
    int i;
    bool isFirstLine = false;
    FileHelperEngine engine = new FileHelperEngine(typeof(XL_XDR));
    //Connect To Database
    string constr = "Data Source=(DESCRIPTION=(ADDRESS_LIST="
    + "(ADDRESS=(PROTOCOL=TCP)(HOST= pt-9a84825594af )(PORT=1521 )))"
    + "(CONNECT_DATA=(SERVER=DEDICATED)(SERVICE_NAME=o11g)));"
    + "User Id=xl;Password=rahasia;";
    OracleConnection con = new OracleConnection(constr);
    con.Open();
    // To Read Use:
    XL_XDR[] res = engine.ReadFile("DataOut.csv") as XL_XDR[];
    jum = CountLinesInFile("DataOut.csv");
    FileInfo f2 = new FileInfo("DataOut.csv");
    long s2 = f2.Length;
    int jmlRecord = jum - 1;
for (i = 0; i < jum; i++)
{
ShowPercentProgress("Processing...", i, jum);
Thread.Sleep(100);
if (isFirstLine == false)
{
isFirstLine = true;
}
else
{
    string sql = "INSERT INTO XL_XDR (XDR_ID, XDR_TYPE, SESSION_START_TIME, SESSION_END_TIME, SESSION_LAST_UPDATE_TIME, " +
    "SESSION_FLAG, VERSION, CONNECTION_ROW_COUNT, ERROR_CODE, METHOD, HOST_LEN, HOST, URL_LEN, URL, CONNECTION_START_TIME, " +
    "CONNECTION_LAST_UPDATE_TIME, CONNECTION_FLAG, CONNECTION_ID, TOTAL_EVENT_COUNT, TUNNEL_PAIR_ID, RESPONSIVENESS_TYPE, " +
    "CLIENT_PORT, PAYLOAD_TYPE, VIRTUAL_TYPE, VID_CLIENT, VID_SERVER, CLIENT_ADDR, SERVER_ADDR, CLIENT_TUNNEL_ADDR, " +
    "SERVER_TUNNEL_ADDR, ERROR_CODE_2, IPID, C2S_PKTS, C2S_OCTETS, S2C_PKTS, S2C_OCTETS, NUM_SUCC_TRANS, CONNECT_TIME, " +
    "TOTAL_RESP, TIMEOUTS, RETRIES, RAI, TCP_SYNS, TCP_SYN_ACKS, TCP_SYN_RESETS, TCP_SYN_FINS, EVENT_TYPE, FLAGS, TIME_STAMP, " +
    "EVENT_ID, EVENT_CODE) VALUES (" +
    "'" + res.XDR_ID + "', '" + res[i].XDR_TYPE + "', '" + res[i].SESSION_START_TIME + "', '" + res[i].SESSION_END_TIME + "', " +
    "'" + res[i].SESSION_LAST_UPDATE_TIME + "', '" + res[i].SESSION_FLAG + "', '" + res[i].VERSION + "', '" + res[i].CONNECTION_ROW_COUNT + "', " +
    "'" + res[i].ERROR_CODE + "', '" + res[i].METHOD + "', '" + res[i].HOST_LEN + "', '" + res[i].HOST + "', " +
    "'" + res[i].URL_LEN + "', '" + res[i].URL + "', '" + res[i].CONNECTION_START_TIME + "', '" + res[i].CONNECTION_LAST_UPDATE_TIME + "', " +
    "'" + res[i].CONNECTION_FLAG + "', '" + res[i].CONNECTION_ID + "', '" + res[i].TOTAL_EVENT_COUNT + "', '" + res[i].TUNNEL_PAIR_ID + "', " +
    "'" + res[i].RESPONSIVENESS_TYPE + "', '" + res[i].CLIENT_PORT + "', '" + res[i].PAYLOAD_TYPE + "', '" + res[i].VIRTUAL_TYPE + "', " +
    "'" + res[i].VID_CLIENT + "', '" + res[i].VID_SERVER + "', '" + res[i].CLIENT_ADDR + "', '" + res[i].SERVER_ADDR + "', " +
    "'" + res[i].CLIENT_TUNNEL_ADDR + "', '" + res[i].SERVER_TUNNEL_ADDR + "', '" + res[i].ERROR_CODE_2 + "', '" + res[i].IPID + "', " +
    "'" + res[i].C2S_PKTS + "', '" + res[i].C2S_OCTETS + "', '" + res[i].S2C_PKTS + "', '" + res[i].S2C_OCTETS + "', " +
    "'" + res[i].NUM_SUCC_TRANS + "', '" + res[i].CONNECT_TIME + "', '" + res[i].TOTAL_RESP + "', '" + res[i].TIMEOUTS + "', " +
    "'" + res[i].RETRIES + "', '" + res[i].RAI + "', '" + res[i].TCP_SYNS + "', '" + res[i].TCP_SYN_ACKS + "', " +
    "'" + res[i].TCP_SYN_RESETS + "', '" + res[i].TCP_SYN_FINS + "', '" + res[i].EVENT_TYPE + "', '" + res[i].FLAGS + "', " +
    "'" + res[i].TIME_STAMP + "', '" + res[i].EVENT_ID + "', '" + res[i].EVENT_CODE + "')";
OracleCommand command = new OracleCommand(sql, con);
command.ExecuteNonQuery();
}
}
Console.WriteLine("Successfully Inserted");
Console.WriteLine();
Console.WriteLine("Number of Row Data: " + jmlRecord.ToString());
Console.WriteLine();
Console.WriteLine("The size of {0} is {1} bytes.", f2.Name, f2.Length);
con.Close();
}
static void ShowPercentProgress(string message, int currElementIndex, int totalElementCount)
{
if (currElementIndex < 0 || currElementIndex >= totalElementCount)
{
throw new InvalidOperationException("currElement out of range");
}
int percent = (100 * (currElementIndex + 1)) / totalElementCount;
Console.Write("\r{0}{1}% complete", message, percent);
if (currElementIndex == totalElementCount - 1)
{
Console.WriteLine(Environment.NewLine);
}
}
static int CountLinesInFile(string f)
{
int count = 0;
using (StreamReader r = new StreamReader(f))
{
string line;
while ((line = r.ReadLine()) != null)
{
count++;
}
}
return count;
}
}
[DelimitedRecord(",")]
public class XL_XDR
{
    public string XDR_ID;
    public string XDR_TYPE;
    public string SESSION_START_TIME;
    public string SESSION_END_TIME;
    public string SESSION_LAST_UPDATE_TIME;
    public string SESSION_FLAG;
    public string VERSION;
    public string CONNECTION_ROW_COUNT;
    public string ERROR_CODE;
    public string METHOD;
    public string HOST_LEN;
    public string HOST;
    public string URL_LEN;
    public string URL;
    public string CONNECTION_START_TIME;
    public string CONNECTION_LAST_UPDATE_TIME;
    public string CONNECTION_FLAG;
    public string CONNECTION_ID;
    public string TOTAL_EVENT_COUNT;
    public string TUNNEL_PAIR_ID;
    public string RESPONSIVENESS_TYPE;
    public string CLIENT_PORT;
    public string PAYLOAD_TYPE;
    public string VIRTUAL_TYPE;
    public string VID_CLIENT;
    public string VID_SERVER;
    public string CLIENT_ADDR;
    public string SERVER_ADDR;
    public string CLIENT_TUNNEL_ADDR;
    public string SERVER_TUNNEL_ADDR;
    public string ERROR_CODE_2;
    public string IPID;
    public string C2S_PKTS;
    public string C2S_OCTETS;
    public string S2C_PKTS;
    public string S2C_OCTETS;
    public string NUM_SUCC_TRANS;
    public string CONNECT_TIME;
    public string TOTAL_RESP;
    public string TIMEOUTS;
    public string RETRIES;
    public string RAI;
    public string TCP_SYNS;
    public string TCP_SYN_ACKS;
    public string TCP_SYN_RESETS;
    public string TCP_SYN_FINS;
    public string EVENT_TYPE;
    public string FLAGS;
    public string TIME_STAMP;
    public string EVENT_ID;
public string EVENT_CODE;
}
}
    I hope someone can give me a solution. Thanks

The fastest way is to use external tables or SQL*Loader (sqlldr). (If you use external tables or SQL*Loader, you don't use C# at all.)
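
For reference, a SQL*Loader run for this table might look roughly like the sketch below. The control file is only a guess based on the column names in the posted code (column list abridged), and the command-line options are assumptions; the credentials and service name are taken from the posted connection string.

-- load_xdr.ctl (hypothetical; list the remaining columns in CSV order)
LOAD DATA
INFILE 'DataOut.csv'
APPEND
INTO TABLE XL_XDR
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
XDR_ID,
XDR_TYPE,
SESSION_START_TIME,
SESSION_END_TIME
-- ... remaining columns ...
)

Invoked with something like: sqlldr userid=xl/rahasia@o11g control=load_xdr.ctl skip=1 direct=true (skip=1 skips the header row; direct=true uses the direct path load).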

  • Using PowerShell to import CSV data from Vendor database to manipulate Active Directory Users

    Hello,
    I have a big project I am trying to automate.  I am working in a K-12 public education IT Dept. and have been tasked with importing data that has been exported from a vendor database via .csv file into Active Directory to manage student accounts. 
    My client wants to use this data to make bulk changes  to student user accounts in AD such as moving accounts from one OU to another, modifying account attributes based on State ID, lunchroom ID, School, Grade, etc. and adding new accounts / disabling
    accounts for students no longer enrolled.
    The .csv that is exported doesn't have headers that match up with what is needed for importing in AD, so those have to be modified in this process, or set as variables to get the correct info into the correct attributes in AD or else this whole project is
    a bust.  He is tired of manually manipulating the .csv data and trying to get it onto AD with few or no errors, hence the reason it has been passed off to me.
    Since this information changes practically daily, I need a way to automate user management by accomplishing the following on a scheduled basis.
    Process must:
    Check to see if Student Number already exists
    If yes, then modify account
    Update {School Name}, {Site Code}, {School Number}, {Grade Level} (Variables)
    Add correct group memberships (School / Grade Specific)
    Move account to correct OU (OU={Grade},OU=Students,OU=Users,OU={SiteCode},DC=Domain,DC=net)
    Remove incorrect group memberships (School / Grade Specific)
    Set account status (enabled / disabled)
    If no, create account
    Import Student #
    Import CNP #
    Import Student name
    Extract First and Middle initial
    If duplicate name exists
    Create log entry for review
    Import School, School Number, Grade Level
    Add to correct Group memberships (School / Grade Specific)
    Set correct OU (OU={Grade},OU=Students,OU=Users,OU={SiteCode},DC=Domain,DC=net)
    Set account Status
    I am not familiar with Powershell, but have researched enough to know that it will be the best option for this project.  I have seen some partial solutions in VB, but I am more of an infrastructure person instead of scripting / software development. 
    I have just started creating a script and already have hit a snag.  Maybe one of you could help.
    #Connect to Active Directory
    Import-Module ActiveDirectory
    # Import iNOW user information
    $Users = import-csv C:\ADUpdate\INOW_export.csv
    #Check to see if the account already exists in AD
    ForEach ( $user in $users )
    #Assign the content to variables
    $Attr_employeeID = $users."Student Number"
    $Attr_givenName = $users."First Name"
    $Attr_middleName = $users."Middle Name"
    $Attr_sn = $users."Last Name"
    $Attr_postaldeliveryOfficeName = $users.School
    $Attr_company = $users."School Number"
    $Attr_department = $users."Grade Level"
    $Attr_cn = $Attr_givenName.Substring(0,1) + $Attr_middleName.Substring(0,1) + $Attr_sn
    IF (Get-ADUser $Attr_cn)
    {Write-Host $Attr_cn already exists in Active Directory

Thank you for helping me with that before it became an issue later on. However, even when modified to be $Attr_sAMAccountName, I still get errors.
    #Connect to Active Directory
    Import-Module ActiveDirectory
    # Import iNOW user information
    $Users = import-csv D:\ADUpdate\Data\INOW_export.csv
    #Check to see if the account already exists in AD
    ForEach ( $user in $users )
    #Assign the content to variables
    $Attr_employeeID = $users."Student Number"
    $Attr_givenName = $users."First Name"
    $Attr_middleName = $users."Middle Name"
    $Attr_sn = $users."Last Name"
    $Attr_postaldeliveryOfficeName = $users.School
    $Attr_company = $users."School Number"
    $Attr_department = $users."Grade Level"
    $Attr_sAMAccountName = $Attr_givenName.Substring(0,1) + $Attr_middleName.Substring(0,1) + $Attr_sn
    IF (Get-ADUser $Attr_sAMAccountName)
    {Write-Host $Attr_sAMAccountName already exists in Active Directory
    PS C:\Windows\system32> D:\ADUpdate\Scripts\INOW-AD.ps1
    Get-ADUser : Cannot convert 'System.Object[]' to the type 'Microsoft.ActiveDirectory.Management.ADUser'
    required by parameter 'Identity'. Specified method is not supported.
    At D:\ADUpdate\Scripts\INOW-AD.ps1:28 char:28
    + IF (Get-ADUser $Attr_sAMAccountName)
    + ~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : InvalidArgument: (:) [Get-ADUser], ParameterBindingException
    + FullyQualifiedErrorId : CannotConvertArgument,Microsoft.ActiveDirectory.Management.Commands.GetAD
    User
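
The 'System.Object[]' error happens because the assignments inside the loop read from $users (the whole collection) instead of $user (the current row), so $Attr_sAMAccountName ends up as an array. A minimal sketch of the likely fix, using -Filter so a non-existing account doesn't throw (property names taken from the script above):

# Sketch only - note $user, not $users, inside the loop
Import-Module ActiveDirectory
$Users = Import-Csv D:\ADUpdate\Data\INOW_export.csv
ForEach ($user in $Users)
{
    $Attr_givenName  = $user."First Name"
    $Attr_middleName = $user."Middle Name"
    $Attr_sn         = $user."Last Name"
    $Attr_sAMAccountName = $Attr_givenName.Substring(0,1) + $Attr_middleName.Substring(0,1) + $Attr_sn
    $existing = Get-ADUser -Filter "SamAccountName -eq '$Attr_sAMAccountName'"
    if ($existing)
    {
        Write-Host "$Attr_sAMAccountName already exists in Active Directory"
    }
}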

  • Import CSV data to an existing populated XLS file

    Hi Guys,
Looking for some assistance with some PowerShell I have never done before... I don't know where to start.
    I have a CSV file that contains some data - Several items exported from a standard powershell command.
    I also have an XLS file that has been manually populated for the last 6 months with data.
What I want to get to (and have started by exporting the data to csv) is for the script to run and add the data from the CSV to the XLS under the already existing column headings.
So in the first row we need to add the date, followed by adding data under each subsequent column that already has a heading.
    Anyone got any suggestions on where to start with this? Any advice greatly appreciated.
    Thanks
    Allan Harding

    I think your question is a bit too broad and vague.
If you are not a scripter then I recommend asking in the Excel forum. They will help you learn how to use Excel to merge data tables.
If you want to script this then you need to start by learning how to write a script. You will want to use Import-Csv and the Excel object model.
If this is critical then I recommend contacting a consultant.
    http://technet.microsoft.com/en-us/scriptcenter/dd742419.aspx
    ¯\_(ツ)_/¯
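
To make the "Import-Csv plus the Excel object model" suggestion concrete, here is a rough sketch; the file paths, the worksheet index, and the assumption that the CSV columns are in the same order as the XLS headings are all made up for illustration.

# Sketch: append today's CSV rows below the existing data in an already-populated workbook.
$csv   = Import-Csv 'C:\Reports\daily.csv'
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$wb = $excel.Workbooks.Open('C:\Reports\history.xls')
$ws = $wb.Worksheets.Item(1)
$row = $ws.UsedRange.Rows.Count + 1          # first empty row under the existing data
foreach ($record in $csv) {
    $ws.Cells.Item($row, 1).Value2 = (Get-Date).ToShortDateString()   # date in the first column
    $col = 2
    foreach ($prop in $record.PSObject.Properties) {
        $ws.Cells.Item($row, $col).Value2 = $prop.Value
        $col++
    }
    $row++
}
$wb.Save()
$excel.Quit()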

  • Help with Importing Excel Data into Formatted Tables

    This is my first post, here, so please be gentle!
    I am a relatively new user of InDesign CS4, and I am creating a 70-pg manufacturer's price book.  A very large portion of each page is going to be size and price information imported from a large Excel spreadsheet.
    I have created the table format that I'd like to use for each page, but the trouble comes when I import the Excel data into that table.  For some reason, when I import, it all dumps into one cell.  Would it be best to import as an unformatted table, and then format the table each time, or is there a way to simply import the data into my pre-formatted table?  I've seen how the former is done, but the latter seems much easier (...although that could be my inexperience talking).
    Any advice would be greatly appreciated!
    Thanks so much,
    Laura (V1500)

    Thank you both so much for your time!  This is exactly what I needed.
    Cheers
    Laura

  • Importing ansi date time format into diadem

    Hi,
I have an ANSI data log txt file with the time channel format 30/03/2006 15:51:08.846 and I am trying to import it into DIAdem using the date/time format dd/mm/yyyy hh:nn:ss.fff, but when the file is imported, the time channel cells appear as NO VALUE. Can anybody help please?
    AdeK

    Hi Adek,
    Here is a DataPlugin which loads your data file into DIAdem 10.2.  There are a number of advantages to using a DataPlugin over the ASCII Import Wizard, so long as your data files follow a similar data file structure.  To register the DataPlugin, just detach the ZIP file, unzip the URI file within, then double-click on the URI file in Windows Explorer.  After that, you will be able to load your TXT data files by dragging&dropping them directly from the NAVIGATOR tree view or Search Results into the Data Portal on the right hand side of DIAdem.  DataPlugins also support selective loading, register loading, reduced loading, and DataFinder indexing, so all of these immediately become available to you with your TXT files.  Note also that header lines 2-4 are now declared as both File and Group properties.
    Ask if you have questions using the DataPlugin,
    Brad Turpin
    DIAdem Product Support Engineer
    National Instruments
    Attachments:
    Adek_TXT.zip ‏2 KB

  • Creating stored procedure to import csv data into tables

I am trying to create a stored procedure to import data from a csv file into a table. I tried using sqlldr but could not run it at school due to access permissions that they will not change. So I looked at the UTL_FILE package, and from what I can see it requires a directory object, which again conflicts with the access permissions at school. So can I create a stored procedure that will just read data from a csv file into a table, without the need to write to or create a file/directory?
    Thankyou

    Satyaki_De wrote:
Since you are talking about so many restrictions, I don't think you will get that.
One alternative solution would be to write a Java-based PL/SQL procedure and try to load the file by parsing it. But for that you need execute privileges on the DBMS_JAVA package.
Do you have that? I doubt it.
Regards.
Satyaki De.

Nope, not allowed to use Java, and I'm new to Oracle. Why can't a stored procedure in PL/SQL just read a file and put the data into a table I created beforehand?

  • Importing CSV data

    Hi!
    I'm trying to import data into a table in my APEX application.
Everything was fine until I tried to import decimal numbers.
    This is a problematic row:
    770;6;1;20130930;AA;20130930;NP;MB;03234495;;R;00;HR;0000;B;P1111;;00000000000000000000;HRK;;;;;XXX;;;;;;-157.492.093,16;-157.492.093,16;01;92963223473;;
    Number is -157.492.093,16.
    Table column is defined as number (17,2)
    My import parameters are:
    Separator ;
    Charset UTF-8
    Group separator .
    Decimal separator ,
    I've used the default Data Loading page...
    Does anyone know how to fix this?
    Regards,
    Ivan

    Ivan,
    put up a sample app on apex.oracle.com
    what you have SEEMS correct.
    You have properly identified the 'group separator' as a period and a 'decimal separator' as a comma.
    The only idea I have is:
    You may need to 'declare' the number format on the next page after the initial "load data" page.
    I have had to add a format for my DATE columns.
    eg FORMAT for the number:
    S999G999G999G999G990D00
    MK
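
If it helps, the mask and the separator settings can be sanity-checked directly in SQL Workshop or SQL*Plus with something along these lines (the value and format are the ones from this thread):

-- Quick check of the format mask against the problematic value,
-- declaring ',' as the decimal separator and '.' as the group separator.
SELECT TO_NUMBER('-157.492.093,16',
                 'S999G999G999G999G990D00',
                 'NLS_NUMERIC_CHARACTERS='',.''') AS parsed_value
FROM dual;
-- expected result: -157492093.16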

  • Import CSV Data from a server into SQLite  - Help me do this or talk me out of it?

    There is a portion of a MySQL database (no more than 500 records, 6 fields) that I would like to download into a local SQLite db every time a user signs in.
    The reason I want to do this is because there are a lot of weird  group by statements I want to chart, and I don't want to have to go back to the server over and over again.
    So what I've done is created an empty local SQLite db that matches my MySQL table. 
    Is there a way to import all these records without having to do a "foreach" statement? 

You will need to read the data into an ArrayCollection, likely with HTTPService, and then use a for loop to build and run the INSERT ... VALUES() SQL statements.
Preferably wrapped in a transaction.
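
A rough sketch of that loop using AIR's local SQL classes is below; the table name, column names and the "records" variable are placeholders, not from the original post.

// Sketch only: write every downloaded record into the local SQLite table inside one transaction.
import flash.data.SQLConnection;
import flash.data.SQLStatement;
import flash.filesystem.File;

var conn:SQLConnection = new SQLConnection();
conn.open(File.applicationStorageDirectory.resolvePath("local.db"));
conn.begin();

var stmt:SQLStatement = new SQLStatement();
stmt.sqlConnection = conn;
stmt.text = "INSERT INTO mirror_table (id, name, amount) VALUES (:id, :name, :amount)";

for each (var rec:Object in records)   // records = the ArrayCollection filled by HTTPService
{
    stmt.parameters[":id"] = rec.id;
    stmt.parameters[":name"] = rec.name;
    stmt.parameters[":amount"] = rec.amount;
    stmt.execute();
}
conn.commit();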

  • Problem importing specific data from .csv file

    Hello!
I'm using JDev 11.1.3.0 with a JSF UI and I've been following Mr. Bors's excellent example about importing comma-separated values from a .csv file, but I have a specific problem...
    1) When i try to import a date value (format dd/mm/yyyy) to a date field of my adf table, the date won't be accepted and it's not shown (I'm using Row.setAttribute("Field", textValue) for that purpose). I guess I have to enter it in a specific format but don't know how!
2) Same thing with double values. Tried 1.50 and it's showing 0.02 in the ADF table. Tried 1,50 (with a different text separator) and it's throwing me an error and doesn't show anything in the table (again I'm using Row.setAttribute("Field", textValue) for that one too)...
    Any help would be appreciated!
    Thank you
Edited by: Nikolas Saridakis on 6 Nov 2010 11:10 AM

I tried your suggestion before I posted my question but didn't know if I was looking in the right direction (I tried some things but they didn't work out). After a bit of poking around, everything was fine...
    So I used
    Row.setAttribute("Field_name", oracle.jbo.domain.Date.fromText(value, "dd/mm/yyyy", null));
    and it did the trick...
The weird thing about the numbers was that after that, they were entered properly in the table with a normal Row.setAttribute("Field_name", value); with value being a String... anyway!
    Thank you Timo!

  • Excel issues with importing CSV or HTML table data from URL - Sharepoint? Office365?

    Greetings,
    We have a client who is having issues importing CSV or HTML table data as one would do using Excel's Web Query import from a reporting application.  As the error message provided by Excel is unhelpful I'm reaching out to anyone who can help us begin to
    troubleshoot problems affecting what is normal standard Excel functionality.  I'd attach the error screenshot, but I can't because my account is not verified....needless to say it says "Microsoft Excel cannot access  the file https://www.avantalytics.com/reporting_handler?func=wquery&format=csv&logid=XXXX&key=MD5
    Where XXXX is a number and MD5 is an md5 code.  The symptoms stated in the error message are:
    - the file name or path does not exist
    -The file is being used by another program
    -The workbook you are trying to save has the same name as a currently open workbook.
    None of these symptoms are the case, naturally. The user encountered this with Excel2010, she was then upgraded to Excel2013 and is still experiencing the same issue. The output of this URL in a browser (IE, Chrome, Firefox) is CSV data for the affected
    user, so it is not a network connectivity issue.  In our testing environment using both Excel2010 or 2013 this file is imported successfully, so we cannot replicate.  The main difference I can determine between our test environment and the end-user
    is they have a Sharepoint installation and appear to have Office365 as well.
    So,  my question might more appropriately be for Sharepoint or Office365 folks, but I can't be sure they're  a culprit.  Given this - does anyone have any knowledge of issues which might cause this with Sharepoint or Office365 integrated with
    Excel and/or have suggestions for getting more information from Excel or Windows other than this error message?  I've added the domain name as a trusted publisher in IE as I thought that might be the issue, but that hasn't solved anything.  As you
    can see its already https and there is no authentication or login - the md5 key is the authentication.  The certificate for the application endpoint is valid and registered via GoDaddy CA.
    I'm at a loss and would love some suggestions on things to check/try.
    Thanks  -Ross

    Hi Ross,
    >> In our testing environment using both Excel 2010 and 2013 this file is imported successfully, so we cannot replicate.
    I suspect it is caused by the difference of web server security settings.
    KB: Error message when you use Web query to a secure Web page (HTTPS://) in Excel: "Unable to open"
    Hope it will help.
By the way, this forum is mainly for discussing questions about Office development (VSTO, VBA and Apps for Office, etc.). For Office product feature-specific questions, you could consider posting them on
the Office IT Pro forum or the Microsoft Office Community.
    Regards,
    Jeffrey

  • Importing csv with dates

    Hi all,
I am importing csv files that contain (among others) two columns with dates into Numbers. The dates are all uniformly formatted as dd.mm.yyyy, which is the usual date format in Germany. However, the import changes them into numbers, e.g.
    26.10.2009 becomes 38650
    13.11.2009 becomes 38668
So, there seems to be some logic behind it, but how can I conveniently get the correct date back?
    Thanks for any help.
    Best regards.

Unfortunately, I wasn't able to change the cell format type to 'date' *after* importing the data; it just remained 'number'.
However, I found a way to work around the problem: introducing a new column and using the EDATE function there gave me the original date back! The function takes two arguments: the first I point to the cell with the strange number, the second I point to an empty cell.
EDATE might not be meant to work that way and this feature is not documented; anyway, it worked for me.

  • IPhoto imports photos with wrong dates even if the dates are fine on the camera

    Hi!
When I import photos with iPhoto, sometimes it imports them with wrong dates, even though the dates are fine on the camera. It puts dates such as 2032. Does anyone know how I can fix that? As far as I know there is no way to change the dates of the photos.
    Thanks!

Well, that is very confusing, since if the date is correct on the camera it will be correct in iPhoto.
    and as to
    As far as I know there is no way to change dates of the photos.
Try looking through your iPhoto menus - two commands - Adjust Time and Date, and Batch Change Time and Date. Adjust is used to correct incorrect dates, like a camera setting; Batch Change is for missing dates, like with scans.
    LN

  • Adding data file to the existing tablespace given wrong format file name

    Hi
While adding a data file to an existing tablespace, I gave the wrong format '/dev/oracle/data/user_data_02' - I missed the .dbf extension for this data file.
    Alter tablespace ts2 add datafile '/dev/oracle/data/user_data_02' size 200m
The data file was created.
So I want to know: what will happen if the data file has the wrong format?
    Thanks in advance

It doesn't matter what you name the file; Oracle will be able to use it. If you need to change the name to meet your naming conventions, then you will need to take the DB down, rename the file at the OS level, and then STARTUP MOUNT to rename the file at the DB level.
Why are you putting files in /dev? That's for devices.
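
A sketch of that sequence (the original path is from the post; the .dbf target name is just an example):

-- Rename/move the file at the OS level while the database is down, then:
SHUTDOWN IMMEDIATE
STARTUP MOUNT
ALTER DATABASE RENAME FILE '/dev/oracle/data/user_data_02'
                        TO '/dev/oracle/data/user_data_02.dbf';
ALTER DATABASE OPEN;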
