Import CSV Data from a server into SQLite - Help me do this or talk me out of it?

There is a portion of a MySQL database (no more than 500 records, 6 fields) that I would like to download into a local SQLite db every time a user signs in.
The reason I want to do this is that there are a lot of weird GROUP BY queries I want to chart, and I don't want to have to go back to the server over and over again.
So what I've done is created an empty local SQLite db that matches my MySQL table. 
Is there a way to import all these records without having to do a "foreach" statement? 

You will need to read the data into an ArrayCollection, most likely with HTTPService, and then use a for loop to build and run the INSERT ... VALUES() SQL statements, preferably wrapped in a transaction.
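In SQL terms, the loop just issues something like the following against the local SQLite db (table and column names here are invented placeholders for your six-field table):

BEGIN TRANSACTION;

INSERT INTO sales_local (id, region, product, qty, amount, sold_on)
VALUES (1, 'North', 'Widget', 3, 29.97, '2011-06-01');

INSERT INTO sales_local (id, region, product, qty, amount, sold_on)
VALUES (2, 'South', 'Gadget', 1, 9.99, '2011-06-01');

-- ...one INSERT per downloaded record, generated inside the for loop...

COMMIT;

Running all ~500 inserts inside a single transaction is what keeps this fast; committing after every row would be much slower.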

Similar Messages

  • Periodic Importing of data from SQL Server into Oracle Database

    I would like to know how I can use the SQL Developer tool to import data from a SQL Server database across the internet into an Oracle Database on a periodic basis. I understand there is a Migration Workbench, but it seems to be only a one-off activity.
    What I would really like is to be able to import data, for example every 30 minutes, from multiple tables/views in a SQL Server database into an Oracle database (overwriting or modifying the existing data) without the need for Heterogeneous Connectivity, as my Oracle database is not on my LAN.
    Thank you for your kind ideas and suggestions.

    To be honest, the best way to do it is DG4ODBC or DG4MSQL. All other solutions need manual interaction.
    The gateways DG4ODBC and DG4MSQL are independent of the Oracle database and also of the SQL Server location. In a worst-case scenario you have an Oracle database on machine A, the gateway on machine B and the SQL Server on machine C.
    Gateways were built to do exactly what you want. As you have access to the Oracle database, the only part you need to define within the Oracle database is a db link. If you specify the full TNS connect identifier, there is even no need to edit the tnsnames.ora file.
    Even though you excluded the gateway in your last sentence, the best solution would still be the gateway.
    Another option you can use is the SQL Server linked server mechanism. It allows you to link another server using an OLEDB driver. In your case you can then link the Oracle db into your SQL Server and then insert/update/delete and select Oracle tables from the SQL Server. A trigger can then replicate the data immediately to the Oracle db.
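    As a rough sketch (all object names below are placeholders, and this assumes a DG4MSQL/DG4ODBC gateway is already installed and registered with a listener), the Oracle side only needs a db link with the full connect descriptor; the SQL Server tables can then be queried directly and refreshed on a schedule:

    -- db link using a full connect descriptor, so no tnsnames.ora edit is needed
    CREATE DATABASE LINK mssql_link
      CONNECT TO "sqluser" IDENTIFIED BY "sqlpassword"
      USING '(DESCRIPTION=
                (ADDRESS=(PROTOCOL=TCP)(HOST=gateway-host)(PORT=1521))
                (CONNECT_DATA=(SID=dg4msql))
                (HS=OK))';

    -- query the SQL Server table through the gateway
    SELECT * FROM "dbo"."Customers"@mssql_link;

    -- optional: refresh a local copy every 30 minutes (10g or later, DBMS_SCHEDULER)
    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'REFRESH_CUSTOMERS',
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'BEGIN
                              DELETE FROM customers_copy;
                              INSERT INTO customers_copy
                                SELECT * FROM "dbo"."Customers"@mssql_link;
                              COMMIT;
                            END;',
        start_date      => SYSTIMESTAMP,
        repeat_interval => 'FREQ=MINUTELY;INTERVAL=30',
        enabled         => TRUE);
    END;
    /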

  • How can I import data from multiple sources into a single RPD in OBIEE 11g

    How can I import data from multiple sources into a single RPD in OBIEE 11g?

    Hi,
    To import from multiple data sources, first configure ODBC connections for the respective data sources; then you can import data from each of them. When you import the data, a connection pool will be created automatically.
    tnx

  • Using Oracle Forms Importing Data From SQL Server into Oracle Tables.

    Dear All,
    We are using Oracle Forms 10g on Windows XP, with OAS 10g and Oracle database 9i.
    How can we import data from SQL Server 2005 into Oracle tables using Oracle Forms?
    Thanks & Regards
    Eidy

    I have no idea what "Oracle Hetrogenius Services" is, so I can't help you with that, sorry.
    SQL Developer might also assist you. SQL Developer can connect to SQL Server as well as Oracle and has some tools for migration. See the documentation for details:
    http://download.oracle.com/docs/cd/E12151_01/doc.150/e12156/toc.htm
    For additional help on using SQL Developer for this task, please consult Support or the SQL Developer forum.
    Hope this helps,
    Jacob

  • Import data from SQL Server into MS Word document for Mail Merge purpose ?

    Hi,
    Is it possible to import contacts from SQL Server into MS Word for mail merge purposes, or, if retrieving data from MS Excel, can we update the data in the Excel sheet without opening it?
    Note: Remember that when you open a Word document already set up for mail merge, it asks you to run the query to return all records from the Excel sheet it is connected to.
    Khurram

    Word and the current data source dialog do not really give you any help with that.
    You either have to be able to create a View in SQL Server that performs the query you need, then connect to that, or you have to be able to create the correct query manually (or perhaps using some other query tool that can help you), then use VBA to connect using that query.
    For example, if you have been through the connection process once (connecting to a single table) then you will have a .odc (Office Data Connection) file which has the info needed to connect to the correct server and database. It's a text file with some HTML and XML inside. You can copy/rename it. Let's say it is called "c:\a\myodc.odc". Then in VBA you can use something like
    ActiveDocument.OpenDataSource Name:="c:\a\myodc.odc", _
    SQLStatement:="put your SQL statement in here, and if it is long,...", _
    SQLStatement1:="put the second part in here"
    You get a maximum of either 255 or around 511 characters in the SQL statement, and Word tends to impose some syntax requirements that Transact-SQL does not, so e.g. you may need to quote all your table names.
    You can also use an empty .odc file and provide connection info in the Connection:= parameter of OpenDataSource.
    As background, until Word 2000, by default you would use MS Query to create your SQL query, and MS Query does have facilities that can help you build your query (a bit like the ones in MS Access). That may still be possible (it is a bit harder to find the MS Query option now, and I am not sure it works with the latest versions of Word). MS Query only works for ODBC queries, and they do not always work correctly when you actually issue the query using ODBC from Word, because of a Word problem to do with Unicode fields in SQL Server. But you could probably still use MS Query to help you construct your SQL. (It's probably easier to do that in Excel, though.)
    Peter Jamieson

  • Using PowerShell to import CSV data from Vendor database to manipulate Active Directory Users

    Hello,
    I have a big project I am trying to automate. I am working in a K-12 public education IT Dept. and have been tasked with importing data that has been exported from a vendor database via .csv file into Active Directory to manage student accounts.
    My client wants to use this data to make bulk changes to student user accounts in AD, such as moving accounts from one OU to another, modifying account attributes based on State ID, lunchroom ID, School, Grade, etc., and adding new accounts / disabling accounts for students no longer enrolled.
    The .csv that is exported doesn't have headers that match up with what is needed for importing into AD, so those have to be modified in this process, or set as variables to get the correct info into the correct attributes in AD, or else this whole project is a bust. He is tired of manually manipulating the .csv data and trying to get it into AD with few or no errors, hence the reason it has been passed off to me.
    Since this information changes practically daily, I need a way to automate user management by accomplishing the following on a scheduled basis.
    Process must:
    - Check to see if the Student Number already exists
      - If yes, modify the account:
        - Update {School Name}, {Site Code}, {School Number}, {Grade Level} (variables)
        - Add correct group memberships (School / Grade specific)
        - Move account to correct OU (OU={Grade},OU=Students,OU=Users,OU={SiteCode},DC=Domain,DC=net)
        - Remove incorrect group memberships (School / Grade specific)
        - Set account status (enabled / disabled)
      - If no, create the account:
        - Import Student #
        - Import CNP #
        - Import Student name
          - Extract First and Middle initial
          - If a duplicate name exists, create a log entry for review
        - Import School, School Number, Grade Level
        - Add to correct group memberships (School / Grade specific)
        - Set correct OU (OU={Grade},OU=Students,OU=Users,OU={SiteCode},DC=Domain,DC=net)
        - Set account status
    I am not familiar with PowerShell, but I have researched enough to know that it will be the best option for this project. I have seen some partial solutions in VB, but I am more of an infrastructure person than a scripting / software development person.
    I have just started creating a script and have already hit a snag. Maybe one of you could help.
    #Connect to Active Directory
    Import-Module ActiveDirectory
    # Import iNOW user information
    $Users = import-csv C:\ADUpdate\INOW_export.csv
    #Check to see if the account already exists in AD
    ForEach ( $user in $users )
    #Assign the content to variables
    $Attr_employeeID = $users."Student Number"
    $Attr_givenName = $users."First Name"
    $Attr_middleName = $users."Middle Name"
    $Attr_sn = $users."Last Name"
    $Attr_postaldeliveryOfficeName = $users.School
    $Attr_company = $users."School Number"
    $Attr_department = $users."Grade Level"
    $Attr_cn = $Attr_givenName.Substring(0,1) + $Attr_middleName.Substring(0,1) + $Attr_sn
    IF (Get-ADUser $Attr_cn)
    {Write-Host $Attr_cn already exists in Active Directory

    Thank you for helping me with that before it became an issue later on; however, even when modified to use $Attr_sAMAccountName, I still get errors.
    #Connect to Active Directory
    Import-Module ActiveDirectory
    # Import iNOW user information
    $Users = import-csv D:\ADUpdate\Data\INOW_export.csv
    #Check to see if the account already exists in AD
    ForEach ( $user in $users )
    #Assign the content to variables
    $Attr_employeeID = $users."Student Number"
    $Attr_givenName = $users."First Name"
    $Attr_middleName = $users."Middle Name"
    $Attr_sn = $users."Last Name"
    $Attr_postaldeliveryOfficeName = $users.School
    $Attr_company = $users."School Number"
    $Attr_department = $users."Grade Level"
    $Attr_sAMAccountName = $Attr_givenName.Substring(0,1) + $Attr_middleName.Substring(0,1) + $Attr_sn
    IF (Get-ADUser $Attr_sAMAccountName)
    {Write-Host $Attr_sAMAccountName already exists in Active Directory
    PS C:\Windows\system32> D:\ADUpdate\Scripts\INOW-AD.ps1
    Get-ADUser : Cannot convert 'System.Object[]' to the type 'Microsoft.ActiveDirectory.Management.ADUser'
    required by parameter 'Identity'. Specified method is not supported.
    At D:\ADUpdate\Scripts\INOW-AD.ps1:28 char:28
    + IF (Get-ADUser $Attr_sAMAccountName)
    + ~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : InvalidArgument: (:) [Get-ADUser], ParameterBindingException
    + FullyQualifiedErrorId : CannotConvertArgument,Microsoft.ActiveDirectory.Management.Commands.GetAD
    User

  • Load data from FTP server into Oracle

    I have a zipped file on an FTP server. I need to unzip the file and load it into an Oracle table with SQL*Loader.
    Does Oracle have any utilities with which I can do a "get" directly from the FTP server?
    Does Oracle have any zip/unzip features?
    I am using 10g.

    user6794035 wrote:
    I have a zipped file on an FTP server. I need to unzip the file and load it into an Oracle table with SQL*Loader.
    Does Oracle have any utilities with which I can do a "get" directly from the FTP server?
    Does Oracle have any zip/unzip features?
    I am using 10g.
    Oracle does not support these features directly. You'll have to write an OS script to perform the tasks and insert the data.
    Depending on your output format you can use UTL_FILE, SQL*Loader, or external tables to load the data.
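    For the load step itself, once an OS script has done the FTP "get" and the unzip, an external table is often the least-code option. A minimal sketch, assuming the unzipped file is comma-delimited text and using placeholder directory, table and column names:

    -- one-time setup: point an Oracle directory object at the OS directory the script writes to
    CREATE DIRECTORY data_dir AS '/u01/app/incoming';

    -- external table over the unzipped file (columns are examples only)
    CREATE TABLE staging_ext (
      cust_id    NUMBER,
      cust_name  VARCHAR2(100),
      order_amt  NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('extracted_file.csv')
    );

    -- load into the real table
    INSERT INTO target_table SELECT * FROM staging_ext;
    COMMIT;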

  • Need a script to import the data from flat file

    Hi Friends,
    Does anyone have any scripts to import data from flat files into an Oracle database (Linux OS)? I have to automate the script to run every 30 minutes, check for any flat files in the incoming directory, and process them without user interaction.
    Thanks.
    Srini

    Here is my init.ora file
    # $Header: init.ora 06-aug-98.10:24:40 atsukerm Exp $
    # Copyright (c) 1991, 1997, 1998 by Oracle Corporation
    # NAME
    # init.ora
    # FUNCTION
    # NOTES
    # MODIFIED
    # atsukerm 08/06/98 - fix for 8.1.
    # hpiao 06/05/97 - fix for 803
    # glavash 05/12/97 - add oracle_trace_enable comment
    # hpiao 04/22/97 - remove ifile=, events=, etc.
    # alingelb 09/19/94 - remove vms-specific stuff
    # dpawson 07/07/93 - add more comments regarded archive start
    # maporter 10/29/92 - Add vms_sga_use_gblpagfile=TRUE
    # jloaiza 03/07/92 - change ALPHA to BETA
    # danderso 02/26/92 - change db_block_cache_protect to dbblock_cache_p
    # ghallmar 02/03/92 - db_directory -> db_domain
    # maporter 01/12/92 - merge changes from branch 1.8.308.1
    # maporter 12/21/91 - bug 76493: Add control_files parameter
    # wbridge 12/03/91 - use of %c in archive format is discouraged
    # ghallmar 12/02/91 - add global_names=true, db_directory=us.acme.com
    # thayes 11/27/91 - Change default for cache_clone
    # jloaiza 08/13/91 - merge changes from branch 1.7.100.1
    # jloaiza 07/31/91 - add debug stuff
    # rlim 04/29/91 - removal of char_is_varchar2
    # Bridge 03/12/91 - log_allocation no longer exists
    # Wijaya 02/05/91 - remove obsolete parameters
    # Example INIT.ORA file
    # This file is provided by Oracle Corporation to help you customize
    # your RDBMS installation for your site. Important system parameters
    # are discussed, and example settings given.
    # Some parameter settings are generic to any size installation.
    # For parameters that require different values in different size
    # installations, three scenarios have been provided: SMALL, MEDIUM
    # and LARGE. Any parameter that needs to be tuned according to
    # installation size will have three settings, each one commented
    # according to installation size.
    # Use the following table to approximate the SGA size needed for the
    # three scenarious provided in this file:
    # -------Installation/Database Size------
    # SMALL MEDIUM LARGE
    # Block 2K 4500K 6800K 17000K
    # Size 4K 5500K 8800K 21000K
    # To set up a database that multiple instances will be using, place
    # all instance-specific parameters in one file, and then have all
    # of these files point to a master file using the IFILE command.
    # This way, when you change a public
    # parameter, it will automatically change on all instances. This is
    # necessary, since all instances must run with the same value for many
    # parameters. For example, if you choose to use private rollback segments,
    # these must be specified in different files, but since all gc_*
    # parameters must be the same on all instances, they should be in one file.
    # INSTRUCTIONS: Edit this file and the other INIT files it calls for
    # your site, either by using the values provided here or by providing
    # your own. Then place an IFILE= line into each instance-specific
    # INIT file that points at this file.
    # NOTE: Parameter values suggested in this file are based on conservative
    # estimates for computer memory availability. You should adjust values upward
    # for modern machines.
    # You may also consider using Database Configuration Assistant tool (DBCA)
    # to create INIT file and to size your initial set of tablespaces based
    # on the user input.
    # replace DEFAULT with your database name
    db_name=DEFAULT
    db_files = 80 # SMALL
    # db_files = 400 # MEDIUM
    # db_files = 1500 # LARGE
    db_file_multiblock_read_count = 8 # SMALL
    # db_file_multiblock_read_count = 16 # MEDIUM
    # db_file_multiblock_read_count = 32 # LARGE
    db_block_buffers = 100 # SMALL
    # db_block_buffers = 550 # MEDIUM
    # db_block_buffers = 3200 # LARGE
    shared_pool_size = 3500000 # SMALL
    # shared_pool_size = 5000000 # MEDIUM
    # shared_pool_size = 9000000 # LARGE
    log_checkpoint_interval = 10000
    processes = 50 # SMALL
    # processes = 100 # MEDIUM
    # processes = 200 # LARGE
    parallel_max_servers = 5 # SMALL
    # parallel_max_servers = 4 x (number of CPUs) # MEDIUM
    # parallel_max_servers = 4 x (number of CPUs) # LARGE
    log_buffer = 32768 # SMALL
    # log_buffer = 32768 # MEDIUM
    # log_buffer = 163840 # LARGE
    # audit_trail = true # if you want auditing
    # timed_statistics = true # if you want timed statistics
    max_dump_file_size = 10240 # limit trace file size to 5 Meg each
    # Uncommenting the line below will cause automatic archiving if archiving has
    # been enabled using ALTER DATABASE ARCHIVELOG.
    # log_archive_start = true
    # log_archive_dest = disk$rdbms:[oracle.archive]
    # log_archive_format = "T%TS%S.ARC"
    # If using private rollback segments, place lines of the following
    # form in each of your instance-specific init.ora files:
    # rollback_segments = (name1, name2)
    # If using public rollback segments, define how many
    # rollback segments each instance will pick up, using the formula
    # # of rollback segments = transactions / transactions_per_rollback_segment
    # In this example each instance will grab 40/5 = 8:
    # transactions = 40
    # transactions_per_rollback_segment = 5
    # Global Naming -- enforce that a dblink has same name as the db it connects to
    global_names = TRUE
    # Edit and uncomment the following line to provide the suffix that will be
    # appended to the db_name parameter (separated with a dot) and stored as the
    # global database name when a database is created. If your site uses
    # Internet Domain names for e-mail, then the part of your e-mail address after
    # the '@' is a good candidate for this parameter value.
    # db_domain = us.acme.com      # global database name is db_name.db_domain
    # FOR DEVELOPMENT ONLY, ALWAYS TRY TO USE SYSTEM BACKING STORE
    # vms_sga_use_gblpagfil = TRUE
    # FOR BETA RELEASE ONLY. Enable debugging modes. Note that these can
    # adversely affect performance. On some non-VMS ports the db_block_cache_*
    # debugging modes have a severe effect on performance.
    #_db_block_cache_protect = true # memory protect buffers
    #event = "10210 trace name context forever, level 2" # data block checking
    #event = "10211 trace name context forever, level 2" # index block checking
    #event = "10235 trace name context forever, level 1" # memory heap checking
    #event = "10049 trace name context forever, level 2" # memory protect cursors
    # define parallel server (multi-instance) parameters
    #ifile = ora_system:initps.ora
    # define two control files by default
    control_files = (ora_control1, ora_control2)
    # Uncomment the following line if you wish to enable the Oracle Trace product
    # to trace server activity. This enables scheduling of server collections
    # from the Oracle Enterprise Manager Console.
    # Also, if the oracle_trace_collection_name parameter is non-null,
    # every session will write to the named collection, as well as enabling you
    # to schedule future collections from the console.
    # oracle_trace_enable = TRUE
    # Uncomment the following line, if you want to use some of the new 8.1
    # features. Please remember that using them may require some downgrade
    # actions if you later decide to move back to 8.0.
    #compatible = 8.1.0
    Thanks.
    Srini

  • Standard report which imports personnel numbers from one server to another

    Dear HCM Gurus,
    Does any standard report exist in SAP that imports personnel numbers from one server to another, together with all the contents maintained in the source server?
    e.g. an employee is hired in "Server A" and the personnel number is generated internally, while we want to import the same employee data, with the same personnel number, into "Server B".
    Is it possible? If so, kindly help and clarify which standard report imports complete data from one server to another with all its infotypes.
    barket

    Hi,
    a standard way would be to set up an ALE scenario between the two systems to transfer employee data and keep it updated. This needs a bit more setup than a single report to be run, but it is a common way to create HR landscapes where different local HR systems consolidate data into a centralized system.
    The other options are valid, too, but I know them more for transferring data for test or validation reasons.
    If you could add some more information on the business background of your requirement, we could give a better suggestion as to which solution to head for.
    Kind Regards
    Roman

  • How do I import email messages from another server?

    Heya,
    So I can't figure out how to import mail messages from another server into this one. I had [email protected] pointing to the old one; when I repointed it, obviously all my email "disappeared" because there is none on the new server. So how do I move it from the old server to the new one? Usually I'd just drag and drop the email message files over FTP, but SLS doesn't seem to give me permission to do that on this one.
    Thanks!

    Search for discussions of the imapsync tool.

  • Attempt to read data from the server failed...

    I'm getting the following error on one of my email accounts:
    There may be a problem with the mail server or network. Verify the settings for account “[my account]” or try again.
    The server returned the error: The attempt to read data from the server “[my account]” failed.
    I'd like to find out more about the error, so that I can start to track this down and see if there is a problem with the mail server settings, or if the problem is on my end.
    I don't see any reference to this error in the logs – would it be somewhere else?

    I've just solved a similar issue.
    I have a dedicated server running Plesk 9.5, and when I upgraded to iLife 11 and Snow Leopard this error appeared. I could quickly click "Get Mail" and I'd get all my mail, but only 3-4 of my 9 mail accounts would connect. The others would have the error:
    The server error encountered was: The attempt to read data from the server...
    I found solutions for those using IMAP mail:
    Modify the /etc/courier-imap/imapd configuration file and change MAXDAEMONS from 40 to 80 and MAXPERIP from 4 to 40. This allows all the machines behind my home firewall to connect to multiple accounts on the e-mail server with mailbox caching enabled.
    I'd made this change on my server but it didn't seem to have any effect. It dawned on me that I'm using POP, not IMAP. So I found the same settings in /etc/courier-imap/pop3d. I changed MAXDAEMONS from 40 to 80 and MAXPERIP from 4 to 40 and voila, all my connections worked concurrently.
    This has taken me more than two days to fix and I hope posting this helps someone else with the same issue.

  • Extract the data from SQL Server and Import into Oracle

    Hi,
    I would like to run a daily job that will export the table data from a SQL Server table (it will be only one or two tables) and import it back into an Oracle table (it might be one or two tables).
    Could you please guide me on how I can do this using either SQL Server or Oracle?
    We have Oracle 9.2 and SQL Server 2005.
    Normally I work from a flat file generated by the source system, which I load into Oracle using SQL*Loader, but this time I have to extract/export the data directly from MS SQL Server and load it into an Oracle table. Mostly it will be a full reload, so I shouldn't need to do much massaging of the data during the load.
    If you can show me the detailed approach, it will be really appreciated.
    I have access to SQL Server, but I don't know how to use SQL Server or Oracle to do this, and I also need to schedule it as a daily job.
    Thanks,
    poratips

    Unless you can find an open source ODBC driver for SQL Server that runs on Solaris (and I wouldn't be overly hopeful there) Heterogeneous Services would require that you license something-- a third party ODBC driver, a new Oracle instance, or an Oracle Transparent Gateway.
    As I stated below, you could certainly use SQL Server's ETL tool, DTS. Oracle's ETL tools would require additional licensing since you're just on 9.2. You could also write a small application (Java or otherwise) that connected to both databases and transferred the data. If you're particularly enterprising, you could load the SQL Server Type 4 JDBC driver into Oracle's JVM and write a Java stored procedure that connected to the SQL Server database via JDBC, but that's a pretty convoluted approach.
    Justin

  • Import data from a file into oracle db

    Hello!
    The problem is the following: data within a file must be placed into several tables on an Oracle server. For that, a program has to be written. It's not enough to execute a program once, because the data in the file can vary, and depending on different flags the import into Oracle must be done differently. We have tried the oo4o classes to access the db, but this only works for integer values; the samples from Oracle also only work for integers. We don't need to use oo4o; is there a better / other way to write the program? Does anybody have a free sample?
    Thanx

    Your problem is to import the data from a file with different conditions, which need to be handled programmatically.
    Please mail me your file and your logic.
    My ID is [email protected]

  • How to import CSV data into Oracle using C#

    Hello,
    How do I import CSV data into Oracle using C#? The data to be imported is 3 GB in size and has 7,512,263 rows. I've managed to import the CSV data into Oracle, but it takes about 1 hour. How can I speed up the import? Thank you.
    This is my code:
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Text;
    using System.Diagnostics;
    using System.Threading;
    using System.Text.RegularExpressions;
    using System.IO;
    using FileHelpers;
    using System.Data.OracleClient;
    namespace sqlloader
    class Program
    static void Main(string[] args)
    int jum;
    int i;
    bool isFirstLine = false;
    FileHelperEngine engine = new FileHelperEngine(typeof(XL_XDR));
    //Connect To Database
    string constr = "Data Source=(DESCRIPTION=(ADDRESS_LIST="
    + "(ADDRESS=(PROTOCOL=TCP)(HOST= pt-9a84825594af )(PORT=1521 )))"
    + "(CONNECT_DATA=(SERVER=DEDICATED)(SERVICE_NAME=o11g)));"
    + "User Id=xl;Password=rahasia;";
    OracleConnection con = new OracleConnection(constr);
    con.Open();
    // To Read Use:
    XL_XDR[] res = engine.ReadFile("DataOut.csv") as XL_XDR[];
    jum = CountLinesInFile("DataOut.csv");
    FileInfo f2 = new FileInfo("DataOut.csv");
    long s2 = f2.Length;
    int jmlRecord = jum - 1;
    for (i = 0; i < jum; i++)
    ShowPercentProgress("Processing...", i, jum);
    Thread.Sleep(100);
    if (isFirstLine == false)
    isFirstLine = true;
    else
    string sql = "INSERT INTO XL_XDR (XDR_ID, XDR_TYPE, SESSION_START_TIME, SESSION_END_TIME, SESSION_LAST_UPDATE_TIME, " +
    "SESSION_FLAG, VERSION, CONNECTION_ROW_COUNT, ERROR_CODE, METHOD, HOST_LEN, HOST, URL_LEN, URL, CONNECTION_START_TIME, " +
    "CONNECTION_LAST_UPDATE_TIME, CONNECTION_FLAG, CONNECTION_ID, TOTAL_EVENT_COUNT, TUNNEL_PAIR_ID, RESPONSIVENESS_TYPE, " +
    "CLIENT_PORT, PAYLOAD_TYPE, VIRTUAL_TYPE, VID_CLIENT, VID_SERVER, CLIENT_ADDR, SERVER_ADDR, CLIENT_TUNNEL_ADDR, " +
    "SERVER_TUNNEL_ADDR, ERROR_CODE_2, IPID, C2S_PKTS, C2S_OCTETS, S2C_PKTS, S2C_OCTETS, NUM_SUCC_TRANS, CONNECT_TIME, " +
    "TOTAL_RESP, TIMEOUTS, RETRIES, RAI, TCP_SYNS, TCP_SYN_ACKS, TCP_SYN_RESETS, TCP_SYN_FINS, EVENT_TYPE, FLAGS, TIME_STAMP, " +
    "EVENT_ID, EVENT_CODE) VALUES (" +
    "'" + res.XDR_ID + "', '" + res[i].XDR_TYPE + "', '" + res[i].SESSION_START_TIME + "', '" + res[i].SESSION_END_TIME + "', " +
    "'" + res[i].SESSION_LAST_UPDATE_TIME + "', '" + res[i].SESSION_FLAG + "', '" + res[i].VERSION + "', '" + res[i].CONNECTION_ROW_COUNT + "', " +
    "'" + res[i].ERROR_CODE + "', '" + res[i].METHOD + "', '" + res[i].HOST_LEN + "', '" + res[i].HOST + "', " +
    "'" + res[i].URL_LEN + "', '" + res[i].URL + "', '" + res[i].CONNECTION_START_TIME + "', '" + res[i].CONNECTION_LAST_UPDATE_TIME + "', " +
    "'" + res[i].CONNECTION_FLAG + "', '" + res[i].CONNECTION_ID + "', '" + res[i].TOTAL_EVENT_COUNT + "', '" + res[i].TUNNEL_PAIR_ID + "', " +
    "'" + res[i].RESPONSIVENESS_TYPE + "', '" + res[i].CLIENT_PORT + "', '" + res[i].PAYLOAD_TYPE + "', '" + res[i].VIRTUAL_TYPE + "', " +
    "'" + res[i].VID_CLIENT + "', '" + res[i].VID_SERVER + "', '" + res[i].CLIENT_ADDR + "', '" + res[i].SERVER_ADDR + "', " +
    "'" + res[i].CLIENT_TUNNEL_ADDR + "', '" + res[i].SERVER_TUNNEL_ADDR + "', '" + res[i].ERROR_CODE_2 + "', '" + res[i].IPID + "', " +
    "'" + res[i].C2S_PKTS + "', '" + res[i].C2S_OCTETS + "', '" + res[i].S2C_PKTS + "', '" + res[i].S2C_OCTETS + "', " +
    "'" + res[i].NUM_SUCC_TRANS + "', '" + res[i].CONNECT_TIME + "', '" + res[i].TOTAL_RESP + "', '" + res[i].TIMEOUTS + "', " +
    "'" + res[i].RETRIES + "', '" + res[i].RAI + "', '" + res[i].TCP_SYNS + "', '" + res[i].TCP_SYN_ACKS + "', " +
    "'" + res[i].TCP_SYN_RESETS + "', '" + res[i].TCP_SYN_FINS + "', '" + res[i].EVENT_TYPE + "', '" + res[i].FLAGS + "', " +
    "'" + res[i].TIME_STAMP + "', '" + res[i].EVENT_ID + "', '" + res[i].EVENT_CODE + "')";
    OracleCommand command = new OracleCommand(sql, con);
    command.ExecuteNonQuery();
    Console.WriteLine("Successfully Inserted");
    Console.WriteLine();
    Console.WriteLine("Number of Row Data: " + jmlRecord.ToString());
    Console.WriteLine();
    Console.WriteLine("The size of {0} is {1} bytes.", f2.Name, f2.Length);
    con.Close();
    static void ShowPercentProgress(string message, int currElementIndex, int totalElementCount)
    if (currElementIndex < 0 || currElementIndex >= totalElementCount)
    throw new InvalidOperationException("currElement out of range");
    int percent = (100 * (currElementIndex + 1)) / totalElementCount;
    Console.Write("\r{0}{1}% complete", message, percent);
    if (currElementIndex == totalElementCount - 1)
    Console.WriteLine(Environment.NewLine);
    static int CountLinesInFile(string f)
    int count = 0;
    using (StreamReader r = new StreamReader(f))
    string line;
    while ((line = r.ReadLine()) != null)
    count++;
    return count;
    [DelimitedRecord(",")]
    public class XL_XDR
    public string XDR_ID;
    public string XDR_TYPE;
    public string SESSION_START_TIME;
    public string SESSION_END_TIME;
    public string SESSION_LAST_UPDATE_TIME;
    public string SESSION_FLAG;
    public string VERSION;
    public string CONNECTION_ROW_COUNT;
    public string ERROR_CODE;
    public string METHOD;
    public string HOST_LEN;
    public string HOST;
    public string URL_LEN;
    public string URL;
    public string CONNECTION_START_TIME;
    public string CONNECTION_LAST_UPDATE_TIME;
    public string CONNECTION_FLAG;
    public string CONNECTION_ID;
    public string TOTAL_EVENT_COUNT;
    public string TUNNEL_PAIR_ID;
    public string RESPONSIVENESS_TYPE;
    public string CLIENT_PORT;
    public string PAYLOAD_TYPE;
    public string VIRTUAL_TYPE;
    public string VID_CLIENT;
    public string VID_SERVER;
    public string CLIENT_ADDR;
    public string SERVER_ADDR;
    public string CLIENT_TUNNEL_ADDR;
    public string SERVER_TUNNEL_ADDR;
    public string ERROR_CODE_2;
    public string IPID;
    public string C2S_PKTS;
    public string C2S_OCTETS;
    public string S2C_PKTS;
    public string S2C_OCTETS;
    public string NUM_SUCC_TRANS;
    public string CONNECT_TIME;
    public string TOTAL_RESP;
    public string TIMEOUTS;
    public string RETRIES;
    public string RAI;
    public string TCP_SYNS;
    public string TCP_SYN_ACKS;
    public string TCP_SYN_RESETS;
    public string TCP_SYN_FINS;
    public string EVENT_TYPE;
    public string FLAGS;
    public string TIME_STAMP;
    public string EVENT_ID;
    public string EVENT_CODE;
    I hope someone can give me a solution. Thanks

    The fastest way is to use external tables or SQL*Loader (sqlldr). (If you use external tables or SQL*Loader, you don't use C# at all.)
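    To make that concrete, here is a minimal external-table sketch: the directory name and path are invented, and only a few of the roughly 50 XL_XDR columns are shown; the rest would be listed the same way. A direct-path insert from it replaces the per-row INSERT statements entirely:

    CREATE DIRECTORY csv_dir AS 'C:\data';

    -- external table over DataOut.csv; SKIP 1 ignores the header line
    CREATE TABLE xl_xdr_ext (
      xdr_id              VARCHAR2(50),
      xdr_type            VARCHAR2(50),
      session_start_time  VARCHAR2(50),
      session_end_time    VARCHAR2(50)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY csv_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        SKIP 1
        FIELDS TERMINATED BY ','
      )
      LOCATION ('DataOut.csv')
    );

    -- direct-path load; in practice list all of the XL_XDR columns in both places
    INSERT /*+ APPEND */ INTO xl_xdr (xdr_id, xdr_type, session_start_time, session_end_time)
      SELECT xdr_id, xdr_type, session_start_time, session_end_time FROM xl_xdr_ext;
    COMMIT;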

  • Importing data from SQL Server 2005 database

    Dear Friends
    Scenario:
    We have built an application for which the database is SQL Server 2005.
    Now, we have planned to switch over to Oracle 8i database.
    There is a field with the data type Varchar(Max), which can hold data with a maximum length of 2^31-1 characters (SQL Server 2005).
    In Oracle 8i, the VARCHAR2 max size is 4000 bytes or characters.
    We want to import the data from the SQL Server 2005 database into the Oracle 8i database. In the process, the value of the particular field whose data type is set to Varchar(Max) in SQL Server 2005 and which contains more than 4000 characters gets automatically truncated while importing into Oracle.
    We want the entire contents of the field to be imported into the Oracle database.
    Please suggest solutions.
    Thanks and regards
    Bharath Kumar V

    Use CLOB if the data is purely text data. BLOB otherwise.
    But the question is: why do you want to switch over to an Oracle version that is de-supported by Oracle?
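    For example (table and column names below are invented), the receiving column on the Oracle side would simply be declared as a CLOB, which is not subject to the 4000-byte VARCHAR2 limit:

    -- Oracle target for the SQL Server Varchar(Max) column
    CREATE TABLE documents (
      doc_id    NUMBER PRIMARY KEY,
      doc_text  CLOB   -- a CLOB holds up to 4 GB of character data, so nothing is truncated at 4000 bytes
    );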
