Loading CSV files via JDBC

What is the best way to load a CSV file over a JDBC connection? Right now I basically read a line and insert it; batching doesn't seem to speed it up either.
Thanks
Brian Timothy
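
For reference, a minimal sketch of how a batched JDBC insert is usually structured (the connection URL, table and column names below are placeholders, not from the original post); with most drivers batching only pays off once auto-commit is disabled and the batch is flushed in large chunks:

import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class CsvBatchInsert {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
             BufferedReader in = new BufferedReader(new FileReader("data.csv"));
             PreparedStatement ps = con.prepareStatement(
                 "INSERT INTO my_table (col1, col2, col3) VALUES (?, ?, ?)")) {
            con.setAutoCommit(false);               // commit in chunks, not per row
            String line;
            int count = 0;
            while ((line = in.readLine()) != null) {
                String[] fields = line.split(",");  // naive split: no quoted fields
                ps.setString(1, fields[0]);
                ps.setString(2, fields[1]);
                ps.setString(3, fields[2]);
                ps.addBatch();
                if (++count % 1000 == 0) {          // flush every 1000 rows
                    ps.executeBatch();
                    con.commit();
                }
            }
            ps.executeBatch();                      // flush the remainder
            con.commit();
        }
    }
}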

Hi!
I think I can help you with this, as I have done almost the same thing.
For importing data from external files, Oracle has a bulk-load utility: SQL*Loader (the SQLLDR command). It takes a control file (*.ctl) and a data file (*.csv) as parameters; you can get more help by searching the site for "SQL Loader".
But I think it can't be executed from stored procedures, so what I am doing is running this command from my Java application using Runtime.exec("SQLLDR...................").
I think this will help you. You are always welcome to ask for any more clarification.
Regards
Brijesh Kumar
SSI-Technologies
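
A minimal sketch of what invoking SQL*Loader from a Java application might look like (it assumes sqlldr is on the PATH; the credentials, control file and log file names are placeholders); ProcessBuilder is used instead of Runtime.exec so the utility's output can be captured:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class SqlLoaderRunner {
    public static void main(String[] args) throws Exception {
        // Credentials, control file and log file are placeholders for illustration.
        ProcessBuilder pb = new ProcessBuilder(
            "sqlldr", "userid=scott/tiger@ORCL",
            "control=load_emp.ctl", "log=load_emp.log");
        pb.redirectErrorStream(true);          // merge stderr into stdout
        Process p = pb.start();
        try (BufferedReader out = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = out.readLine()) != null) {
                System.out.println(line);      // echo SQL*Loader output
            }
        }
        int rc = p.waitFor();                  // 0 = success, non-zero = warnings/errors
        System.out.println("sqlldr exit code: " + rc);
    }
}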

Similar Messages

  • Error loading csv file from application server

    Hi all,
    While uploading a CSV file from the application server to the PSA we are getting the following error:
    Error 2 while splitting CSV data record
    Message no. RSDS_ACCESS011
    Diagnosis
    Error 2 occurred while splitting the CSV data record 1
    1 = Could not find a closing escape character
    2 = Invalid escape character
    3 = Conversion error
    4 = Other error
    System Response
    The function was terminated.
    Procedure
    Check the values of the data separator and escape sign, and try again.
    But I've checked the file, and the escape sign and data separator in it are fine. We are able to load the same file successfully in the quality system.
    How can we solve this error?
    Thanks in advance.

    Hi BI consultant:
       Could you please provide more details?
    For example:
    1. Is your P application server a UNIX flavor? (Solaris, AIX, HP-UX, Linux)
       If yes..
             2. Are you able to see the contents of the file correctly with a "cat" or "vi" command? (at operating system level).
                   If no...
                         3. Did you upload the csv flat file to the server via FTP?
                                If yes...
                                     4. Did you use the "binary" or the "ascii" parameter on the FTP command used to upload the file?
    You probably need to upload the CSV file again to your application server and make sure you can see the file contents ("cat" or "vi" command) before trying to execute the InfoPackage.
    Regards,
    Francisco Milán.
    Edited by: Francisco Milan on Jun 3, 2010 11:13 AM
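    Outside the SAP tools, a quick way to check whether an invisible character (a UTF-8 byte-order mark or Windows CR/LF line endings) crept into the file during transfer is to dump its first bytes; a small illustrative Java sketch, with the file name as a placeholder:
    import java.io.FileInputStream;
    public class CsvFirstBytes {
        public static void main(String[] args) throws Exception {
            try (FileInputStream in = new FileInputStream("upload.csv")) {   // placeholder file name
                byte[] buf = new byte[64];
                int n = in.read(buf);
                for (int i = 0; i < n; i++) {
                    // A leading EF BB BF is a UTF-8 BOM; 0D before 0A means CRLF line endings.
                    System.out.printf("%02X ", buf[i] & 0xFF);
                }
                System.out.println();
            }
        }
    }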

  • Load Flat File from App Server

    HI, all.
    I want to load a flat file from the application server. I created a CSV file and uploaded it to the application server with FM ARCHIVEFILE_CLIENT_TO_SERVER. Afterwards I created a DataSource and tried to load the data into the InfoCube. In this procedure I encountered two problems:
    1. When I look at the file in AL11, I don't see the Cyrillic symbols; instead of those symbols I see #.
    2. When I try to load the data with the DataSource, I get the exception RS_EXCEPTION 000 "File don't open".
    Can anybody help me resolve this problem?
    wbr, Fanil.

    Hi, kodanda pani KV.
    2. The file was closed.
    1. Can you explain more clearly what you mean?
    wbr, Fanil
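    The # characters in AL11 usually mean that the file's code page does not match what the server expects. As a generic, non-SAP illustration of the same effect, decoding the same bytes with a matching and a non-matching charset shows the difference (the file name and charsets below are assumptions):
    import java.nio.charset.Charset;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    public class CharsetCheck {
        public static void main(String[] args) throws Exception {
            byte[] raw = Files.readAllBytes(Paths.get("cyrillic.csv"));   // placeholder file name
            // Decoding with the charset the file was actually written in keeps the Cyrillic text intact;
            // decoding with a mismatched single-byte charset turns it into garbage characters.
            System.out.println(new String(raw, Charset.forName("windows-1251")));
            System.out.println(new String(raw, Charset.forName("ISO-8859-1")));
        }
    }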

  • Error Loading the file from application server

    Hi Team,
    I'm trying to load a .CSV file from the application server and I'm getting an error message.
    In the data package
      Update to PSA --Green
      Transfer Rule -- Green
      Update Rule - error: ABORT was set in the customer routine 9998
                           Error 1 in the update
    Please help me to provide the solution.
    But when I try to load other files there are no issues. Thanks.
    Regards,
    Senthil

    Raj,
    check the links below, which may help to solve your issue:
    Flat file data load
    flat file Data loading issue
    Mahesh

  • Infopackage-Load Many Files from Application Server and later Archive/Move

    Hi All..
    I have a doubt. I have a requirement to load many files into BI 7.0. I have used the InfoPackage before with the option:
    Load Binary File From Application server
    I loaded data successfully, but only with one file. If I have to load many files (with different names), like the list below, I think it's not a good idea to modify the file name (path) in the InfoPackage each time:
    All of these files will be on one server that is mapped into AL11, like:
    Infopfw
    BW_LOAD_20090120.txt
    BW_LOAD_20090125.txt
    BW_LOAD_OTHER_1.txt
    ...
    Etc..
    This directory is not on the BW server; it's on another server, but I can load from this location (one file at a time).
    Could you help me with these questions:
    - How can I use an InfoPackage with a routine that takes all the files, one by one, in order of creation date, and loads them into the target? Is that possible? I have some knowledge of ABAP, but I don't know exactly how to express this logic to the system.
    - In addition, is it possible to move these files to another location, like Infopfwarchive, just to keep a history of the files loaded?
    I saw that in the InfoPackage you have an option to create a routine in ABAP code. I'm a little bit confused because I don't know how I can specify the whole path.
    I tried with:
    Infopfw
    InfopfwFile.csv
    Infopfw
    This is the ABAP code that you see automatically and need to modify:
    * Create a routine for the file name.
    * This routine will be called by the adapter
    * when the InfoPackage is executed.
              p_filename =
              p_subrc = 0.
    Thank you for your ideas or recommendations.
    Al

    Hi Reddy, thank you for your answer
    I have some doubts about the option you explained:
    ***All the above files are appending dates at the end of the file name...
    You can load the files through the InfoPackage by using routines and pick the files based on the date at the end of the file name.***
    I need to ask: when you know the date of the file and the InfoPackage picks each file, can this work for many files? Or how is it possible to control this process?
    About this option I also want to ask: when you mention Unix code, where is this code programmed? In the routine of the BW InfoPackage?
    ****Or:
    Create two folders in your BW application server, in AL11 (ask the Basis team).
    I call them the F1 and F2 folders.
    First dump the files into F1. Assume the file name in F1 is "BW_LOAD_20090120.txt"; using a Unix script you rename the file and then either keep it in the same folder F1 or move it to F2.
    Then create the InfoPackage and fix the file name (i.e. the renamed one), so you don't need to change the file name at InfoPackage level every day, because in AL11 the files are overwritten every day.
    So today I get the BW_LOAD_20090120.txt file in F1 and rename it to BW_LOAD.txt and load it into BW; tomorrow I get BW_LOAD_20090125.txt in F1 and rename it to BW_LOAD.txt...
    In this way it will work. You need to schedule the Unix script in AL11.
    This is the way to handle the application server; I'm using the same logic.
    Thank you so much.
    Al
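    As a plain-Java illustration of the rename-and-load pattern described above (in BW this would be an InfoPackage routine or the Unix script mentioned; the directory and file names are placeholders): pick the oldest matching file, rename it to the fixed name the InfoPackage points to, and move it to an archive folder once it has been loaded:
    import java.io.File;
    import java.util.Arrays;
    import java.util.Comparator;
    public class PickAndRename {
        public static void main(String[] args) {
            File dir = new File("/Infopfw");                  // placeholder load directory
            File archive = new File("/Infopfw/archive");      // placeholder archive directory
            File[] loads = dir.listFiles((d, name) -> name.startsWith("BW_LOAD") && name.endsWith(".txt"));
            if (loads == null || loads.length == 0) return;
            // Sort by modification time as a stand-in for creation order.
            Arrays.sort(loads, Comparator.comparingLong(File::lastModified));
            File next = loads[0];
            File fixedName = new File(dir, "BW_LOAD.txt");    // the name the InfoPackage is fixed to
            if (next.renameTo(fixedName)) {
                System.out.println("Ready to load: " + fixedName);
                // ... trigger the load here, then archive the file ...
                fixedName.renameTo(new File(archive, next.getName()));
            }
        }
    }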

  • Reading a CSV file from server

    Hi All,
    I am reading a CSV file from the server, and my internal table has only one field, with length 200. The input CSV file has more than one column, and after splitting a record my internal table should have as many rows as the input record has columns.
    But when I do that, the last field in the internal table is appended with #.
    Can somebody tell me the solution for this?
    You can see my code below.
    data: begin of itab_infile occurs 0,
             input(3000),
          end of itab_infile.
    data: begin of itab_rec occurs 0,
             record(200),
          end of itab_rec.
    data: c_comma(1) value ','.
    open dataset f_name1 for input in text mode encoding default.
    if sy-subrc <> 0.
      write: /, 'FILE NOT FOUND'.
      exit.
    endif.
    do.
      read dataset f_name1 into itab_infile-input.
      if sy-subrc <> 0.
        exit.                                    " stop at end of file
      endif.
      split itab_infile-input at c_comma into table itab_rec.
    enddo.
    Thanks in advance.
    Sunil

    Sunil,
    You do not mention the platform on which the CSV file was created or the platform on which it is read.
    A common problem with CSV files created on MS/Windows platforms and read on Unix is the end-of-record (EOR) characters.
    MS/Windows uses <CR><LF> as the EOR.
    Unix uses just <LF>, so the stray <CR> ends up in the last field.
    If on Unix, open the file using vi in a telnet session to confirm the EOR type.
    The fix options:
    1) Before opening the file in your ABAP program, run the Unix command dos2unix on it.
    2) Transfer the file from the MS/Windows platform to Unix using FTP in ascii mode, not bin. This does the dos2unix conversion on the fly.
    3) Install Samba and share the load directory to the Windows platforms. Samba also handles the dos2unix and unix2dos conversions on the fly.
    Hope this helps,
    David Cooper
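    As a small non-ABAP illustration of the same point: when a Windows-created file is read on Unix and records are split on <LF> only, each record keeps a trailing <CR>, which is exactly what shows up as # in the last field. Stripping it before splitting removes the symptom if dos2unix cannot be run (the file name is a placeholder):
    import java.nio.file.Files;
    import java.nio.file.Paths;
    public class StripCarriageReturn {
        public static void main(String[] args) throws Exception {
            String content = new String(Files.readAllBytes(Paths.get("input.csv")), "UTF-8");
            for (String line : content.split("\n")) {
                // On a Windows-created file each record still ends with \r here;
                // remove it so it does not end up glued to the last column.
                if (line.endsWith("\r")) {
                    line = line.substring(0, line.length() - 1);
                }
                String[] fields = line.split(",");
                System.out.println(fields.length + " fields, last = '" + fields[fields.length - 1] + "'");
            }
        }
    }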

  • Copy CSV file from windows to linux

    Our Oracle database is installed on a UNIX machine, so we have to place the CSV file there.
    We are on Windows, and we use the Putty client to access the UNIX server.
    How can we copy a CSV file from Windows to the UNIX server?
    Thanks

    vis1985 wrote:
    Our Oracle database is installed on a UNIX machine, so we have to place the CSV file there.
    We are on Windows, and we use the Putty client to access the UNIX server.
    How can we copy a CSV file from Windows to the UNIX server?
    Use scp (secure copy).
    It is supported by Putty, and ssh services are available by default on most (if not all) modern Unix servers.
    I don't have a Windows machine close by, but I recall the scp command is supported by the Putty executable called "pscp.exe".
    The syntax is very simple:
    scp <from_destination> <to_destination>
    Example (pushing a file from Windows to the Oracle Unix server):
      scp c:\datafiles\accounts.csv oracle@unix-server:/u01/files/accounts.csv
    This will prompt for a password.
    You also can use ssh trusted certificates (RSA or DSA keys) to automate this. Using Putty you can generate a key pair that consists of a private key and public key.
    You copy and paste the public key into the destination server's $HOME/.ssh/authorized_keys file. This results in the server trusting that client and allows the client to connect (using ssh, scp and sftp) without having to supply a password, as the private part of the key serves as authentication. (I would, however, not use the oracle account for this, but a special scp-copy Unix account that does not allow shell usage.)
    If you want to automate it the other way around - have the Unix server pull the file from the Windows client, then you need to install server software on that Windows machine to allow it to provide such a file copy service.
    Again I suggest using ssh - OpenSSH is available for a variety of platforms, and Windows should be included. If not, there should be ssh service for Windows from Microsoft or another vendor that can be used.
    Any other method than ssh will be either insecure or problematic. FTP is insecure, as usernames and passwords are transmitted in clear text. Using Windows File Sharing requires Samba to be loaded and configured on the Unix server - extra work installing, configuring and testing software that uses low-level security, is proprietary, and could require settings on the Windows client as well to make it work in a somewhat robust fashion.
    The standard today is ssh. It is robust and secure, it makes a lot of sense to use it, and you need a pretty good reason to use something else instead.
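    If the copy has to be automated from Java rather than run by hand with pscp, a sketch along these lines uploads the file over the same ssh service (this assumes the third-party JSch library; the host, user, password and paths are placeholders):
    import com.jcraft.jsch.ChannelSftp;
    import com.jcraft.jsch.JSch;
    import com.jcraft.jsch.Session;
    public class SftpUpload {
        public static void main(String[] args) throws Exception {
            JSch jsch = new JSch();
            Session session = jsch.getSession("oracle", "unix-server", 22);   // placeholder user/host
            session.setPassword("secret");
            session.setConfig("StrictHostKeyChecking", "no");                 // or load a known_hosts file
            session.connect();
            ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
            sftp.connect();
            sftp.put("c:/datafiles/accounts.csv", "/u01/files/accounts.csv"); // local -> remote
            sftp.disconnect();
            session.disconnect();
        }
    }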

  • Accessing CSV File from URL

    Hi Experts,
    I am developing an interface where it needs to access a CSV file from a URL.
    The URL is: http://200.218.208.119/download/fechamento/20100413.csv
    The file is modified every day, so the next day this file will be: 20100414.csv
    My interface generates the following errors:
      <Trace level="1" type="T">---- Plain HTTP Adapter Outbound----</Trace>
      <Trace level="1" type="T">---------------------------------------------</Trace>
    - <Trace level="1" type="B" name="CL_HTTP_PLAIN_OUTBOUND-ENTER_PLSRV">
      <Trace level="3" type="T">Quality of Service BE</Trace>
      <Trace level="1" type="T">Get XML-Dokument from the Message-Objekt</Trace>
      <Trace level="3" type="T">URL http://200.218.208.119:80/download/fechamento/20100413.csv</Trace>
      <Trace level="3" type="T">Proxy Host:</Trace>
      <Trace level="3" type="T">Proxy Service:</Trace>
      <Trace level="3" type="T">~request_method POST</Trace>
      <Trace level="3" type="T">~server_protocol HTTP/1.0</Trace>
      <Trace level="3" type="T">accept: */*</Trace>
      <Trace level="3" type="T">msgguid: A7031081480011DFA526001517D1434C</Trace>
      <Trace level="3" type="T">service: D0B_100</Trace>
      <Trace level="3" type="T">interface namespace: http://bcb.gov.br/xi/OB02</Trace>
      <Trace level="3" type="T">interface name: MI_RFC_OUT</Trace>
      <Trace level="3" type="T">Header-Fields</Trace>
      <Trace level="3" type="T">Prolog conversion Codepage: UTF-8</Trace>
      <Trace level="3" type="T">Epilog conversion Codepage: UTF-8</Trace>
      <Trace level="3" type="T">content-length 133</Trace>
      <Trace level="3" type="T">content-type: text/csv; charset=UTF-8</Trace>
      <Trace level="2" type="T">HTTP-Response :</Trace>
      <Trace level="1" type="T">Method Not Allowed</Trace>
      <Trace level="2" type="T">Code : 405</Trace>
      <Trace level="2" type="T">Reason: Method Not Allowed</Trace>
      <Trace level="2" type="T">HTTP-response content-length 6261</Trace>
    Can anyone help?

    > I am developing an interface where it needs to access a CSV file from a URL.
    This is not supported in standard PI.
    I think the option will be available in PI 7.3, but I cannot promise it.
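    The trace shows the plain HTTP adapter sending a POST, which the web server rejects with 405; the file itself can be retrieved with a plain GET. As an illustration outside PI (for example from a standalone job or a custom module), fetching the day's CSV in Java could look like this; the URL follows the example in the post:
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;
    public class CsvFetch {
        public static void main(String[] args) throws Exception {
            String url = "http://200.218.208.119/download/fechamento/20100413.csv";
            HttpURLConnection con = (HttpURLConnection) new URL(url).openConnection();
            con.setRequestMethod("GET");          // the server answers 405 to POST
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(con.getInputStream(), "UTF-8"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);     // process each CSV record here
                }
            }
        }
    }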

  • Delete a .csv file from desktop system

    Hi All,
    My requirement is to read a .csv file from a desktop system with a shared folder, and to delete the file after it has been read successfully.
    I can read the .csv file from that location using the function RFC_REMOTE_FILE and update the content into an internal table.
    But I can't delete the file from the presentation server (the desktop system).
    Can anyone tell me how to delete the .csv file from the desktop system at a different location?
    Note:
    I followed this link to read file:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/9831750a-0801-0010-1d9e-f8c64efb2bd2&overridelayout=true

    Hi Rob,
    Thanks. I solved this problem myself.
    The solution to delete the file from the remote system is:
    concatenate 'DEL' i_filename i_dirname into v_bkfile separated by space.
    call function 'RFC_REMOTE_EXEC'
      destination  c_dest
      exporting
        command               = v_bkfile
      exceptions
        system_failure        = 1  MESSAGE v_ermsg
        communication_failure = 2  MESSAGE v_ermsg.

  • Download CSV-File from Interactive Report

    Dear all,
    I want to download a CSV file from an interactive report.
    The content of the rows in Excel:
    Erste,"B","C","D","E","F","G","H","I","J","K","L","M","N","O","P"
    IBC ->,"-","1","2","3","4","5","6","7","8","9","10","11","12","13","14" [...]
    How can I solve this problem?
    Thanks for help.

    Your CSV file uses a comma as the separator, but Excel expects the CSV separator to be a semicolon.
    Try setting "CSV Separator" = ";" in the "Report Attributes" - "Download" section.

  • Load XML file from addon domain without cross-domain Policy file

    Hello.
    Assuming that there are two addon domains on the same server: /public_html/domain1.com and /public_html/domain2.com.
    I am trying to load an XML file from domain2.com into domain1.com without using a cross-domain policy file (since it doesn't work on XML files in my case).
    So the idea is to use a PHP file to load the XML and pass it back to Flash.
    I've found an interesting script that seems to do the job, but unfortunately I can't get it to work. In my opinion the problem is somewhere in the AS3 part. Please take a look.
    Here are the AS3/PHP scripts:
    AS3 (.swf in www.domain1.com):
    // location of the xml that you would like to load, full http address
    var xmlLoc:String = "http://www.domain2.com/MyFile.xml";
    // location of the php xml grabber file, in relation to the .swf
    var phpLoc:String = "loadXML.php";
    var xml:XML;
    var loader:URLLoader = new URLLoader();
    var request:URLRequest = new URLRequest(phpLoc+"?location="+escape(xmlLoc) );
    loader.addEventListener(Event.COMPLETE, onXMLLoaded);
    loader.addEventListener(IOErrorEvent.IO_ERROR, onIOErrorHandler);
    loader.load(request);
    function onIOErrorHandler(e:IOErrorEvent):void {
        trace("There was an error with the xml file "+e);
    }
    function onXMLLoaded(e:Event):void {
        trace("the rss feed has been loaded");
        xml = new XML(loader.data);
        // set to string, since it is passed back from php as an object
        xml = XML(xml.toString());
        xml_txt.text = xml;
    }
    PHP (loadXML.php in www.domain1.com):
    <?php
    header("Content-type: text/xml");
    $location = "";
    if(isset($_GET["location"])) {
        $location = $_GET["location"];
        $location = urldecode($location);
    }
    $xml_string = getData($location);
    // pass the url encoded vars back to Flash
    echo $xml_string;
    //cURLs a URL and returns it
    function getData($query) {
        // create curl resource
        $ch = curl_init();
        // cURL url
        curl_setopt($ch, CURLOPT_URL, $query);
        //Set some necessary params for using CURL
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        // Execute the curl call and capture the returned XML string
        $result = curl_exec($ch);
        // close curl resource to free up system resources
        curl_close($ch);
        return $result;
    }
    ?>

    I think you might be right about the permissions/settings on the server for PHP. Unfortunately I'm not allowed to adjust them.
    So I wrote my own script - this time I used a file path instead of the HTTP address of the XML file. It works fine in my case.
    Here it is:
    XML file on domain2.com:
    <?xml version="1.0" encoding="UTF-8"?>
    <gallery>
        <image imagePath="galleries/gallery_1/images/1.jpg" thumbPath="galleries/gallery_1/thumbs/1.jpg" file_name= "1"> </image>
        <image imagePath="galleries/gallery_1/images/2.jpg" thumbPath="galleries/gallery_1/thumbs/2.jpg" file_name= "2"> </image>
        <image imagePath="galleries/gallery_1/images/3.jpg" thumbPath="galleries/gallery_1/thumbs/3.jpg" file_name= "3"> </image>
    </gallery>
    swf  on domain1.com:
    var imagesXML:XML;
    var variables:URLVariables = new URLVariables();
    var varURL:URLRequest = new URLRequest("MyPHPfile.php");
    varURL.method = URLRequestMethod.POST;
    varURL.data = variables;
    var MyLoader:URLLoader = new URLLoader;
    MyLoader.dataFormat =URLLoaderDataFormat.VARIABLES;
    MyLoader.addEventListener(Event.COMPLETE, XMLDone);
    MyLoader.load(varURL);
    function XMLDone(event:Event):void {
        var imported_XML:Object = event.target.data.imported_XML;
        imagesXML = new XML(imported_XML);
        MyTextfield_1.text = imagesXML;
        MyTextfield_2.text = imagesXML.image[0].attribute("thumbPath");  // sample reference to attribute "thumbPath" of the first element
    }
    php file on domain1.com:
    <?php
    $xml_file = simplexml_load_file('../../domain2.com/galleries/gallery_1/MyXMLfile.xml');  // directory to XML file on the same server
    $imported_XML = $xml_file->asXML();
    print "imported_XML=" . $imported_XML;
    ?>
    Regards
    PS: for those who read the above discussion: both the first and the second script work, but you should test which one is better in your situation. The first script will also work between two domains on different servers; no cross-domain policy file is needed.

  • Can I load xml file from SWC without using @embed

    I'm developing a mobile application in which I need to load xml files from File.applicationDirectory. This works great if the xml files are part of the main application swf. But I would like to move these xml files into a SWC so they could be shared across multiple applications.
    Using FlashBuilder 4.5, when a SWC is built, I can specify files to embed in the library that are not assets (Assets Tab of Flex Library Build Path). For various design reasons, I do NOT want to embed the xml files via @embed.
    When the SWC is built and I open it up using a zip utility, I see the xml files in there just fine, so they are being bundled with the SWC. But how can I load these files in my main application without using @embed? When the main application is built, the SWC setting for link type is "merged into code".
    I wouldn't expect the application to automatically pull out the xml files from the swc and place them in the File.applicationDirectory on the mobile device. I've tried loading from there just in case but file.exists is false (as expected).
    I've searched the web (and continue to do so) and all the answers seem to be to use @embed. Is there another way?
    Randy

    It's actually a lot easier than you think.
    Just reference the file like any ol' URL, using a path relative to the SWC's src directory.
    So if you include the file "assets/xml/some.xml", just use that same string like you would any remote resource.
    For example:
    var loader:URLLoader = new URLLoader( new URLRequest("assets/xml/some.xml"));
    I believe it would also work like this: "/assets/xml/some.xml", but I prefer relative paths so the link doesn't break if it is moved out of the SWC.

  • Stored Proc to create .csv file from table

    Hi all,
    I have a table with 4 columns and 10 rows. Now I need a stored procedure to create a .csv file from the table data. Here are the scripts to create the table and insert the rows.
    Create table emp(emp_id number(10), emp_name varchar2(20), department varchar2(20), salary number(10));
    Insert into emp values ('1','AAAAA','SALES','10000');
    Insert into emp values ('2','BBBBB','MARKETING','8000');
    Insert into emp values ('3','CCCCC','SALES','12000');
    Insert into emp values ('4','DDDDD','FINANCE','10000');
    Insert into emp values ('5','EEEEE','SALES','11000');
    Insert into emp values ('6','FFFFF','MANAGER','90000');
    Insert into emp values ('7','GGGGG','SALES','12000');
    Insert into emp values ('8','HHHHH','FINANCE','14000');
    Insert into emp values ('9','IIIII','SALES','20000');
    Insert into emp values ('10','JJJJJ','FINANCE','21000');
    commit;
    Now I need a stored procedure to create a .csv file in my local location. Please let me know if you need any other details.

    Some pointers:
    http://www.oracle-base.com/articles/9i/GeneratingCSVFiles.php
    http://tkyte.blogspot.com/2009/10/httpasktomoraclecomtkyteflat.html
    Also, doing a search on this forum or on http://asktom.oracle.com will give you many clues.
    > .csv file in my local location.
    What is your 'local location'?
    A client machine? The database server machine?
    What database version are you using?
    (the result of: select * from v$version; )
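    If the 'local location' turns out to be a client machine, one option is a small JDBC client program rather than a stored procedure (a sketch only: the connection URL, credentials and output file are assumptions, and no CSV quoting is done); if the file must be written on the database server itself, a server-side PL/SQL approach such as those in the links above is the usual route:
    import java.io.PrintWriter;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.ResultSetMetaData;
    import java.sql.Statement;
    public class EmpCsvExport {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger");
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery("SELECT emp_id, emp_name, department, salary FROM emp");
                 PrintWriter out = new PrintWriter("emp.csv", "UTF-8")) {
                ResultSetMetaData md = rs.getMetaData();
                while (rs.next()) {
                    StringBuilder line = new StringBuilder();
                    for (int i = 1; i <= md.getColumnCount(); i++) {
                        if (i > 1) line.append(',');
                        line.append(rs.getString(i));   // fields here contain no commas or quotes
                    }
                    out.println(line);
                }
            }
        }
    }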

  • How to load jar files from remote location

    Hi all,
    I am trying to load jar files from a remote server, from a servlet running on OC4J.
    To do this, first I get a ClassLoader using ClassLoader.getSystemClassLoader() and then cast it to URLClassLoader.
    But when I do that I get a ClassCastException, because OC4J returns an oracle.classloader.PolicyClassLoader instead of a java.net.URLClassLoader.
    I would appreciate it if anyone could tell me how to load remote jar files using oracle.classloader.PolicyClassLoader.
    Thanks
    Harish

    Hi,
    I suppose you know about this, but just in case:
    I have used JNLP to load jar files from a remote location.
    Rowan
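    Rather than casting whatever loader the container hands back, a common workaround is to build a fresh java.net.URLClassLoader over the remote jar URLs and pass the container's loader in as the parent; a minimal sketch (the jar URL and class name are hypothetical):
    import java.net.URL;
    import java.net.URLClassLoader;
    public class RemoteJarLoader {
        public static void main(String[] args) throws Exception {
            URL[] jars = { new URL("http://remote-server/libs/mylib.jar") };   // placeholder jar URL
            // Use the container's loader (OC4J's PolicyClassLoader) as the parent
            // instead of trying to cast it to URLClassLoader.
            try (URLClassLoader loader = new URLClassLoader(
                     jars, Thread.currentThread().getContextClassLoader())) {
                Class<?> clazz = loader.loadClass("com.example.SomeRemoteClass");
                Object instance = clazz.getDeclaredConstructor().newInstance();
                System.out.println("Loaded: " + instance.getClass().getName());
            }
        }
    }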

  • SSIS 2008 – Read roughly 50 CSV files from a folder, create SQL table from them dynamically, and dump data.

    Hello everyone,
    I've been assigned a requirement where I need to read around 50 CSV files from a specified folder.
    In step 1 I would like to create the schema for these files, meaning take the CSV files one by one and create a SQL table for each, if it does not already exist at the destination.
    In step 2 I would like to append the data of these 50 CSV files into the respective tables.
    In step 3 I would like to purge data older than a given date.
    Please note, the data in these CSV files will be very bulky; I would like to know the best way to insert bulky data into a SQL table.
    Also, in some of the CSV files there will be 4 rows at the top of the file which contain the header details/header rows.
    As far as I know I will be asked to implement this on SSIS 2008, but I'm not 100% sure of it.
    So please feel free to suggest multiple approaches if we can achieve these requirements more elegantly in newer versions like SSIS 2012.
    Any help would be much appreciated.
    Thanks,
    Ankit
    Thanks, Ankit Shah | Inkey Solutions, India | Microsoft Certified Business Management Solutions Professional | http://ankit.inkeysolutions.com

    Hello Harry and Aamir,
    Thank you for the responses.
    @Aamir, thank you for sharing the link. Yes, I'm going to use a Script Task to read the header columns of the CSV files, preparing one SSIS variable that will hold the SQL script to create the required table (with an IF EXISTS check) inside the Script Task itself.
    I will have an "Execute SQL Task" following the Script Task, and this will create the actual table for a CSV.
    Both these components will be inside a Foreach Loop container and will execute for all 50 CSV files one by one.
    Some points to be clarified:
    1. Among these 50 CSV files there will be some exceptions for which we first need to purge the tables and then insert the data. Meaning for 2 files out of 50 we need to first clean the tables and then perform the data insert, while the remaining 48 files should be appended on a daily basis.
    Can you please advise what is the best way to achieve this requirement? Where should we configure such exceptional cases for the package?
    2. For some of the CSV files we will have more than one file with the same name. For example, out of the 50 the 2nd file is divided into 10 different CSV files, so in total we have 60 files, of which 10 have repeated file names. How can we handle this within the same loop? Do we need another Foreach loop inside the parent one? What is the best way to achieve this requirement?
    3. There will be another package, which will be used to purge data from the SQL tables. Unlike the above package, this package will not run on a daily basis. At some point we would like these 50 tables to be purged with an older-than criterion, say remove data older than 1st Jan 2015. What is the best way to achieve this requirement?
    Please note, I'm very new to the SSIS world and would like to develop these packages for the client using best package-development practices.
    Any help would be greatly appreciated.
    Thanks, Ankit Shah | Inkey Solutions, India | Microsoft Certified Business Management Solutions Professional | http://ankit.inkeysolutions.com
    > 1. Among these 50 CSV files there will be some exceptions for which we first need to purge the tables and then insert the data. Meaning for 2 files out of 50 we need to first clean the tables and then perform the data insert, while the remaining 48 files should be appended on a daily basis.
    > Can you please advise what is the best way to achieve this requirement? Where should we configure such exceptional cases for the package?
    How can you identify these files? Is it based on the file name, or is there some information in the file which indicates that it requires a purge? If so, you can pick this information up during the file-name or file-data parsing step and set a boolean variable. Then in the control flow have a conditional precedence constraint which checks the boolean variable and, if it is set, executes an Execute SQL Task to do the purge (you can use TRUNCATE TABLE or DELETE FROM TableName statements).
    > 2. For some of the CSV files we will have more than one file with the same name. For example, out of the 50 the 2nd file is divided into 10 different CSV files, so in total we have 60 files, of which 10 have repeated file names. How can we handle this within the same loop? Do we need another Foreach loop inside the parent one? What is the best way to achieve this requirement?
    The best way to achieve this is to append a sequential value to the file name (perhaps a timestamp) and then process the files in sequence. This can be done prior to the main loop, so that you can use the same loop to process these duplicate file names as well. The best option is to use the file-creation-date attribute value so that the files get processed in the right order. You can use a script task to get this for each file, as described here:
    http://microsoft-ssis.blogspot.com/2011/03/get-file-properties-with-ssis.html
    > 3. There will be another package, which will be used to purge data from the SQL tables. Unlike the above package, this package will not run on a daily basis. At some point we would like these 50 tables to be purged with an older-than criterion, say remove data older than 1st Jan 2015. What is the best way to achieve this requirement?
    You can use a SQL script for this. Just call a SQL procedure with a single parameter called @CutOffDate and then write logic like below:
    CREATE PROC PurgeTableData
    @CutOffDate datetime
    AS
    DELETE FROM Table1 WHERE DateField < @CutOffDate;
    DELETE FROM Table2 WHERE DateField < @CutOffDate;
    DELETE FROM Table3 WHERE DateField < @CutOffDate;
    GO
    @CutOffDate denotes the date before which older data has to be purged.
    You can then schedule this stored procedure in a SQL Agent job to be executed at your required frequency.
    Visakh
