2008R2 FOREACH csv file - move data to db - move file to folder weirdness

Greetings everyone,
I'm facing a weird problem. I'm retrieving files from an FTP server and saving them in a local 'to process' folder. Then I use a Foreach file iterator to process those files and save the data to the database through a data flow task. When this operation finishes, I move the file to one of two possible folders, Processed or Error, depending on the result of the data flow task.
To ensure my loop continues if an error occurs, I've placed an event handler on the OnError event of the DFT. There, in the system variables, I set Propagate to false. This way there is no error escalation that would fire the package OnError event handler. All works well when the content of every file is correct.
For testing purposes I've created a dummy data file that fires a truncation error. When I run the package and the iterator arrives at the bad dummy data file, the file is placed in the Error folder as expected and I get the correct error message
(truncation error). However, when the next file, which is correct, is processed in the DFT, it also produces an error:
Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "DS_STG" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
I don't understand this. It seems like the previous error has a corrupting impact on the DFT. Is there a property that I have to set here? I've tried playing with MaximumErrorCount and tried running with a configuration file to persist the connection manager information, but the behaviour persists.
So in summary: the iteration continues to the end, but after one bad file has been processed in the DFT, all the following good files get the connection manager error...
I can think of ways to work around this issue, but I would think this should work as it is, no? :)
Thanks for the answers in advance.

Hi Visakh,
I specify the folder which holds the CSV files, then assign each file path to a variable.
I use this variable (the path) as the connection string of the DFT's flat file source.
I created a workaround which I don't like, but it seems to do the job. I kept the original Foreach loop to determine which files are valid, BUT I no longer do the data insert in this DFT: I deleted the OLE DB destination, so the DFT only raises an error if a file fails to come through the source.
Then I remove the bad files from the 'to process' folder. I copied the Foreach iterator with all its components, but this time the data flow task does include the OLE DB destination for the insert, and at that point only correct files are left in the 'to process' folder.
This works, but it isn't 'pretty' :)
Do you have an idea what could be wrong? It seems one bad file corrupts the destination connection.
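For what it's worth, the validate-then-load idea can also live outside SSIS, e.g. as a small pre-check script run before the package. A sketch in Python (the column names and widths below are made up for illustration):

```python
import csv
import shutil
from pathlib import Path

# Hypothetical limits: column name -> maximum width of the target table column.
MAX_WIDTHS = {"status": 10, "userid": 20, "title": 255}

def validate_csv(path, max_widths):
    """Return True if every field fits its target column width."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for col, limit in max_widths.items():
                if len(row.get(col, "")) > limit:
                    return False
    return True

def presort(to_process, error_folder, max_widths):
    """Move files that would raise a truncation error into the Error folder."""
    for path in Path(to_process).glob("*.csv"):
        if not validate_csv(path, max_widths):
            shutil.move(str(path), str(Path(error_folder) / path.name))
```

Anything the script moves to Error never reaches the DFT, so a truncation error never fires in the first place.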
When you say bad file is it metadata which is corrupted?
Please Mark This As Answer if it solved your issue
Please Mark This As Helpful if it helps to solve your issue
Visakh
My MSDN Page
My Personal Blog
My Facebook Page

Similar Messages

  • SSIS 2008 - Import all CSV files in folder - each target different table

    I'm new to SSIS 2008, and I'm attempting to import a series of CSV files into a database. I want to loop over every CSV file in a specified folder and import each file into a different table that must be created on the fly (name of table should be base name
    of file, i.e. with path and extension stripped off). The file structures are similar, but not identical, so the single target table solution won't work.
    Using an example I found here:
    http://microsoft-ssis.blogspot.com/2011/02/how-to-configure-foreach-loop-file.html
    I'm able to successfully load all files into ONE table, but I need to load each file into a separate table. Can anyone provide some assistance on exactly how to modify the project to allow for a table to be created, on the fly, for each source file?
    thanks,
    Mark

    Obviously you need the name of the table to equal the name of the file you are processing, and you can do that with an expression: capture the file name and assign it to a package variable. From there you need an Execute SQL Task to create the table, with that variable as the input parameter for the table name. The tricky part is inserting the data into a new table each time; the standard DFT would not work, as you cannot manipulate the OpenRowset object, so you can either code it in a Script Task or use a bulk insert procedure, depending on what you are more comfortable with. Do you need help with any of the items?
    Arthur
    My Blog
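A rough sketch of the table-per-file idea, using Python and SQLite as stand-ins for the Execute SQL Task plus Script Task combination (all names are illustrative, and every column is created as TEXT for simplicity):

```python
import csv
import sqlite3
from pathlib import Path

def load_csv_to_own_table(conn, csv_path):
    """Create a table named after the file's base name and insert all its rows."""
    table = Path(csv_path).stem  # file name with path and extension stripped
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        col_defs = ", ".join(f'"{c}" TEXT' for c in header)
        conn.execute(f'CREATE TABLE IF NOT EXISTS "{table}" ({col_defs})')
        marks = ", ".join("?" for _ in header)
        conn.executemany(f'INSERT INTO "{table}" VALUES ({marks})', reader)
    conn.commit()
```

Looping this over every CSV in a folder gives one table per file, which is the part the fixed-metadata DFT cannot do.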

  • Can somebody help in getting a sample code to upload a csv file

    Can somebody help with sample code to upload a CSV file from a folder every 30 minutes that will update a particular record in CRM On Demand?

    Hi,
    I'm sorry but I do not have a code sample which performs the scenario you have described below. The samples I did provide are meant to illustrate the use of CRM On Demand Web Services and can be used as a guide to those developing integrations.
    If you have specific questions or issues that arise during the development of your integration please don't hesitate to post them, however, I would recommend that you contact Oracle Consulting Services or one of the many system integrators who actively develop applications which integrate with CRMOD if you require more in-depth assistance.
    Thanks,
    Sean

  • Read data from a CSV file

    hi,
    I've managed to make a CSV file which logs how much my NXT car is turning left (L), turning right (R) or going straight (S). The format is direction/timer value (L/1922 means it went left for 1922 ms).
    Where I got stuck is: how can I read those time and direction values from the CSV file in such a way that they can be used as input for the motor move?
    Actually, what I'm trying to do is several VIs which function like the record/play action in the NXT Toolkit.
    (I've attached the VI and the CSV file; the file extension is CSV, but it is not actually written in CSV format, it's basically a plain text file.)
    Message Edited by szemawy on 05-23-2008 10:34 AM
    Attachments:
    read_deneme.vi ‏10 KB
    testus.csv ‏1 KB

    After I get the data in 1-D array format, I cannot convert it to a string so that I can send it by Write BT Message. I tried Flatten To String but it didn't work out.
    [Never mind the mess, just see the part where the 1-D array is being converted to a string.]
    Attachments:
    read_test2.JPG ‏184 KB
    read_test2_fp.JPG ‏132 KB
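In case it helps to see the parsing logic on its own, here is a sketch in Python (the L/1922 format is assumed from the post; the real implementation would of course be a LabVIEW VI):

```python
def parse_log(text):
    """Split entries like 'L/1922' into (direction, milliseconds) pairs."""
    moves = []
    for entry in text.replace("\n", ",").split(","):
        entry = entry.strip()
        if not entry:
            continue
        direction, ms = entry.split("/")
        moves.append((direction, int(ms)))
    return moves
```

Each pair can then drive one motor-move command: direction selects the turn, the integer is the duration.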

  • How to import data from CSV file with columns separated by semicolon?

    I'm migrating a database from MS SQL 2008 to Oracle 11g. I export the data to CSV files from MS SQL, then try to import them into Oracle. Several tables go fine using the Import Data option in SQL Developer: standard CSV files with data separated by commas were imported, and char, date (with a format string) and integer data come through the import wizard without problems.
    The problems start when I try to import a table with non-integer numbers whose decimal part is separated by a comma rather than a dot. Since the comma is the standard column separator in a CSV file, I must change the column separator to a semicolon, and then the import wizard can no longer recognise the columns correctly, because it only uses the standard CSV comma separator :-/
    In SQL Developer 1.5.3, under Tools -> Preferences -> Migration -> Data Move Options, I changed "End of Column Delimiter" to ';' but it doesn't work.
    Is it possible to change the standard column separator for the Import Data wizard in SQL Developer 1.5.3? Or does someone know how to import data in SQL Developer 1.5.3 from a CSV whose column separator is set to a semicolon?

    A new preference has been added in the main code line to customize the import delimiter; it should be available as part of a future release.

  • How to avoid the split problem when uploading the data from csv file

    Dear Friends,
    I have to upload data from a .csv file to my custom table, and I have found a problem when uploading the data.
    I am using the code below; please suggest what I should do in this regard:
    SPLIT wa_raw_csv AT ',' INTO
      wa_empdata_csv-status
      wa_empdata_csv-userid
      wa_empdata_csv-username
      wa_empdata_csv-Title
      wa_empdata_csv-department.
    APPEND wa_empdata_csv TO itab.
    In the flat file, one of the records has the Title field as "Director, Finance - NAR". Through my code, wa_empdata_csv-Title ends up with "Director, and the Department field gets Finance - NAR": even though "Director, Finance - NAR" is one value, it is getting split. That is the problem I am facing. Could anybody let me know how to handle this in my code so that "Director, Finance - NAR" will not be split into two?
    Thanks & Regards,
    Madhuri

    Hi Madhuri,
    The best way to avoid this problem is to use a TAB-delimited file instead of comma-separated data; a TAB does not generally appear in the data.
    If you are generating the file, use tabs instead of commas.
    If you cannot modify the file format and the data in the file has fixed-length fields, you will need to define a fixed-length structure and move the data into it.
    Regards,
    Mohaiyuddin
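Another angle: if the file quotes fields that contain commas (as Excel does when exporting CSV), a CSV-aware parser keeps the field intact, while a plain SPLIT at ',' does not. A quick Python illustration of the difference:

```python
import csv
import io

line = 'A,jsmith,John Smith,"Director, Finance - NAR",Finance'

# A naive split breaks the quoted Title field in two:
naive = line.split(",")
assert len(naive) == 6  # one field too many

# A CSV-aware parser honours the quotes:
row = next(csv.reader(io.StringIO(line)))
assert row[3] == "Director, Finance - NAR"
```

So if the file can be produced with proper quoting, parsing by CSV rules (rather than splitting on every comma) also solves the problem without changing the delimiter.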

  • SQL Server 2014 and VS 2013 - BulkInsert task - truncate a field's data from a CSV file and insert it into a SQL table

    Hello everyone,
    To move data from roughly 50 CSV files to SQL tables (one table per CSV file) in a loop, I've used a BulkInsert task in a For Each Loop container.
    Please note: for all the different columns of all CSV files, the field length was specified as varchar(255).
    It worked well for the first 6 files, but in the 7th file there is data in one of the columns that exceeds the limit of 255 characters. I would like to truncate the data and insert the remainder for this column; in other words, insert the first 255 characters into the table for this field and let the package execute further.
    Also, if I were to use SqlBulkCopy in a Script Task, would that resolve the problem? I believe I would face a similar problem there too, but I would like confirmation from the experts.
    Can you please advise how to get rid of this truncation error?
    Any help would be greatly appreciated.
    Thanks, <b>Ankit Shah</b> <hr> Inkey Solutions, India. <hr> Microsoft Certified Business Management Solutions Professionals <hr> http://ankit.inkeysolutions.com

    Hello! I suggest you add a Derived Column transformation between source and destination and use string functions to extract the first 255 characters of the incoming field, sending that output to the destination field; that way you can get rid of this sort of issue.
    Useful Links:
    http://sqlage.blogspot.in/2013/07/ssis-how-to-use-derived-column.html
    Good Luck!
    Please Mark This As Answer if it solved your issue.
    Please Vote This As Helpful if it helps to solve your issue
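The effect of that Derived Column is just a per-field clip; in Python terms it would look something like this sketch:

```python
def truncate_fields(rows, limit=255):
    """Clip each field to the destination column width, mirroring a Derived
    Column transformation placed between source and destination."""
    for row in rows:
        yield [field[:limit] for field in row]
```

Every field longer than the limit loses its tail; shorter fields pass through unchanged, so nothing else in the pipeline has to change.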

  • SQL server 2014 and VS 2013 - Dataflow task, read CSV file and insert data to SQL table

    Hello everyone,
    I was assigned a work item wherein I have a data flow task in a For Each Loop container in the control flow of an SSIS package. The For Each Loop container reads the CSV files from the specified location one by one and populates a variable with the current file name. Note that the tables into which I would like to push the data from each CSV file have the same names as the CSV files.
    In the data flow task I have a Flat File component as a source; this component uses the above variable to read the data of a particular file. Now here is my question: how can I move the data to the destination SQL table using the same variable?
    I've tried to set up the OLE DB destination component dynamically, but it executes well only the first time: it does not change the mappings to match the columns of the second CSV file. There are around 50 CSV files, each with a different set of columns, and these files need to be migrated to SQL tables in the optimal way.
    Does anybody know the best way to set up the data flow task for this requirement?
    Also, I cannot use the Bulk Insert task here, as we would like to keep a log of corrupted rows.
    Any help would be much appreciated. It's very urgent.
    Thanks, <b>Ankit Shah</b> <hr> Inkey Solutions, India. <hr> Microsoft Certified Business Management Solutions Professionals <hr> http://ankit.inkeysolutions.com

    The standard Data Flow Task supports only static metadata defined at design time. I would recommend you check the commercial COZYROC Data Flow Task Plus. It is an extension of the standard Data Flow Task that supports dynamic metadata at runtime, so you can process all your input CSV files with a single Data Flow Task Plus. No programming skills are required.
    SSIS Tasks Components Scripts Services | http://www.cozyroc.com/

  • SSIS package to order all the CSV files based on their date of creation/modification mentioned in a filename and load the least recent file in oracle Destination

    Hi,
    I need help creating an SSIS package which has a flat file source (specifically, delimited CSV files) placed in a shared folder, with file names like File_1_2015-04-30 08:54:13.900.csv and File_2_2015-04-30 07:54:13.900.csv.
    I want to use a Foreach loop to find the oldest file among them (through a Script Task), transform that file into an Oracle destination, then archive it, and continue until the most recent file has been loaded into the destination.
    Could you guys please guide me?
    Thanks in advance

    I'd say you need two loops: one to cycle through all the files and collect their names, then sort them based on your criterion.
    Perhaps you can start by seeing http://www.sqlis.com/sqlis/post/Looping-over-files-with-the-Foreach-Loop.aspx
    Collect the file names into a Recordset destination, then shred it: http://www.sqlis.com/sqlis/post/Shredding-a-Recordset.aspx
    Perhaps you need an outer loop to iterate over each file and compare the files' dates.
    Do each piece and post here where you need to stop.
    Arthur
    MyBlog
    Twitter
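The sorting step itself is small once the names are collected; a sketch of the Script Task logic in Python (the file-name pattern is assumed from the question):

```python
import re
from datetime import datetime

def file_timestamp(name):
    """Extract the timestamp embedded in names like
    'File_1_2015-04-30 08:54:13.900.csv'."""
    m = re.search(r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+)", name)
    return datetime.strptime(m.group(1), "%Y-%m-%d %H:%M:%S.%f")

def oldest_first(names):
    """Sort file names least recent first, ready for one-at-a-time loading."""
    return sorted(names, key=file_timestamp)
```

Loading then just walks the sorted list from the front, archiving each file as it finishes.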

  • Uploading data to oracle database when a new CSV file arrives to a folder

    Hi,
    I am a student trying to learn Oracle, currently working on an academic project. My problem scenario is as follows.
    I have a folder on my system. Whenever a new CSV file is placed in that folder, its contents should be loaded into the database using SQL*Loader. The problems I am facing are:
    1) How do I schedule this, i.e. what should I write in the batch file (to react to the arrival of a new file)?
    2) How do I tell SQL*Loader to take the contents from the new CSV file every time an upload is scheduled by the batch file?
    I will be glad for any guidance on this. Thanks :)

    You have to do the steps below:
    1. Create a SQL*Loader control file matching your input data file.
    2. Write a shell script which checks whether any new file has arrived in the folder; if one has, load it using the above control file, and once the load is complete move the file to some other folder.
    3. Schedule the shell script using crontab to run every 10 seconds or so.
    Note: this is just one way; there are many methods to solve the above problem, including using the Oracle Scheduler.
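Step 2 could equally be a small Python script instead of a shell script; a sketch (the sqlldr arguments are placeholders, not a working connect string):

```python
import shutil
import subprocess
from pathlib import Path

def load_new_files(incoming, archive, loader=None):
    """Load each CSV waiting in `incoming`, then move it to `archive`.
    By default `loader` shells out to sqlldr; the control file name and
    connect string below are placeholders to adapt."""
    if loader is None:
        loader = lambda path: subprocess.run(
            ["sqlldr", "scott/tiger@db", "control=load.ctl", f"data={path}"],
            check=True)
    for path in sorted(Path(incoming).glob("*.csv")):
        loader(path)
        shutil.move(str(path), str(Path(archive) / path.name))
```

Because processed files are moved out of the incoming folder, rerunning the script from cron only ever sees files that still need loading.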

  • How to export out the date into the csv file?

    Hi, I have been trying to export the value of the date to a CSV file.
    This is the script:
    $strADPath = 'LDAP://dc=test,dc=com'
    function ConvertLargeIntegerToDate([object]$LargeInteger) {
        try {
            $int64 = ConvertLargeIntegerToInt64 ($LargeInteger)  # helper defined elsewhere in the script
            if ($int64 -gt 0) {
                $retDate = [datetime]::FromFileTime($int64)
            }
            else {
                $retDate = $null
            }
            return $retDate
        }
        catch {
            return $null
        }
    }
    $objSearch = New-Object DirectoryServices.DirectorySearcher
    $objSearch.Filter = '(&(objectClass=user)(samaccountname=user1))'
    $objSearch.SearchRoot = $strADPath
    $objSearch.PageSize = 1000
    $objSearch.SearchScope = "subtree"
    $objResults = $objSearch.FindAll()
    $dateAccountExpires = ConvertLargeIntegerToDate $objUser.accountexpires[0]
    Write-Host "date Account expires: " $dateAccountExpires
    $objResults | ForEach-Object {
        $_.GetDirectoryEntry()
    } |
    Select-Object -Property @{Name="sAMAccountName";Expression={$_.sAMAccountName}},
        @{Name="cn";Expression={$_.cn}},
        @{Name="name";Expression={$_.name}},
        @{Name="manager";Expression={$_.manager}},
        @{Name="givenName";Expression={$_.givenName}},
        @{Name="accountExpires";Expression={$_.dateAccountExpires}},
        @{Name="department";Expression={$_.department}} |
    Export-Csv -Path 'D:\test44.csv'
    This is what I get in PowerShell ISE, and this is what I got in the CSV file for the expiry date (screenshots not included):

    hi FWN,
    The code gives an error saying that it could not call a method on a null-valued expression, at each of these lines:
    $temp = $_.Properties
    $properties | %{ $res.$_ = $temp.Item($_) }
    With lots of thanks,
    noobcy

  • Best option to transmit CSV file as POST data to remote site

    I'm quite new to the SAP scene and am tasked with getting some data out of our database and up to a third-party web application.
    Their API requires the data to be formatted as a CSV file and uploaded as an HTTP POST attachment (file upload) to their site.
    What's my best approach to this?
    We have PI, but I just learned about CL_HTTP_CLIENT and am hoping I can do this directly from the ABAP environment; I'm unsure of the technicalities involved with either option.
    Can I set up a "service" in PI that simply posts data to a URL (as opposed to sending a SOAP request)?
    What sort of setup do I need for CL_HTTP_CLIENT to talk to the remote site? I've tested with HTTP_POST and get an SSL error even when posting to a non-SSL URL (http).

    public void Save(IPropertyBag propertyBag, bool clearDirty, bool saveAllProperties)
    {
        object val2 = (object)_event;
        propertyBag.Write("Event", ref val2);
        object val3 = (object)_fullload;
        propertyBag.Write("FullLoad", ref val3);
        object val4 = (object)_sharedsecret;
        propertyBag.Write("SharedSecret", ref val4);
        object val5 = (object)_content;
        propertyBag.Write("Content", ref val5);
        object val6 = (object)_clienttype;
        propertyBag.Write("ClientType", ref val6);
        object val7 = (object)_clientinfo;
        propertyBag.Write("ClientInfo", ref val7);
        object val8 = (object)_clientversion;
        propertyBag.Write("ClientVersion", ref val8);
    }
    #endregion

    #region IComponent
    public IBaseMessage Execute(IPipelineContext pc, IBaseMessage inmsg)
    {
        // Convert the stream to a string
        Stream s = null;
        IBaseMessagePart bodyPart = inmsg.BodyPart;
        string separator = new Guid().ToString();
        inmsg.BodyPart.ContentType = string.Format("multipart/form-data; boundary={0}", separator);
        //inmsg.BodyPart.Charset = string.Format("US-ASCII");

        // NOTE: inmsg.BodyPart.Data is implemented only as a setter in the HTTP adapter API and as
        // a getter and setter for the file adapter. Use GetOriginalDataStream to get the data instead.
        if (bodyPart != null)
        {
            s = bodyPart.GetOriginalDataStream();
        }
        byte[] bytes = new byte[s.Length];
        int n = s.Read(bytes, 0, (int)s.Length);
        string msg = new ASCIIEncoding().GetString(bytes).TrimEnd(null);

        // Get the boundary value from the first line of the message
        string boundry = msg.Substring(2, msg.IndexOf("\r\n") - 2);

        // Create a new start to the message with the MIME requirements
        msg = "MIME-Version: 1.0\r\nContent-Type: text/plain; boundary=\"" + boundry + "\"\r\n" + msg;

        // Convert back to a stream and set the Data property
        inmsg.BodyPart.Data = new MemoryStream(Encoding.UTF8.GetBytes(msg));

        // Reset the position of the stream to zero
        inmsg.BodyPart.Data.Position = 0;
        return inmsg;
    }
    #endregion

  • Comparing SQL Data Results with CSV file contents

    I have the following scenario that I need to resolve and I'm unsure of how to approach it. Let me explain what I am needing and what I have currently done.
    I've created an application that automatically marks assessments that delegates complete by comparing SQL Data to CSV file data. I'm using C# to build the objects required that will load the data from SQL into a dataset which is then compared to the
    associated CSV file that contains the required results to mark against.
    Currently everything is working as expected but I've noticed that if there is a difference in the number of rows returned into the SQL-based dataset, then my application doesn't mark the items at all.
    Here is an example:
    ScenarioCSV contains 4 rows with 8 columns of information; however, let's say the delegate was only able to insert 2 rows of data into the dataset. When this happens, everything is marked wrong, because row 1 in both the CSV and the dataset was correct, but row 2 in the dataset holds the results found in row 4 of the CSV file, and so it is compared against row 2 of the CSV file.
    How can I mark a row that does exist but not in the same position, so the delegate doesn't lose marks just because the row data in the dataset is not in exactly the same order as the row data in the CSV file?
    I'm at a loss and any assistance will be a huge help. I have implemented an ORDER BY clause in the dataset and ensured the same order is set in the CSV file. This helps when the dataset has the right number of rows, but as soon as 1 row is missing from the dataset, the marking doesn't allow any marks for either row even if the data is correct.
    I hope I've made sense!! If not, let me know and I will provide a better description and perhaps examples of the dataset data and the csv data that is being compared.
    Thanks in advance....

    I would read the CSV into a DataTable using OLE DB; below is code I wrote a few weeks ago to do this.
    Then you can compare the two DataTables by a common primary key (like an ID number).
    This webpage shows how to compare two DataTables:
    http://stackoverflow.com/questions/10984453/compare-two-datatables-for-differences-in-c
    You can find lots of examples with the following Google search:
    "c# linq compare two datatables"
    // Creates a CSVReader class
    public class CSVReader
    {
        public DataSet ReadCSVFile(string fullPath, bool headerRow)
        {
            string path = fullPath.Substring(0, fullPath.LastIndexOf("\\") + 1);
            string filename = fullPath.Substring(fullPath.LastIndexOf("\\") + 1);
            DataSet ds = new DataSet();
            try
            {
                if (File.Exists(fullPath))
                {
                    string ConStr = string.Format("Provider=Microsoft.Jet.OLEDB.4.0;Data Source={0};Extended Properties=\"Text;HDR={1};FMT=Delimited\"", path, headerRow ? "Yes" : "No");
                    string SQL = string.Format("SELECT * FROM {0}", filename);
                    OleDbDataAdapter adapter = new OleDbDataAdapter(SQL, ConStr);
                    adapter.Fill(ds, "TextFile");
                    ds.Tables[0].TableName = "Table1";
                    foreach (DataColumn col in ds.Tables["Table1"].Columns)
                        col.ColumnName = col.ColumnName.Replace(" ", "_");
                }
            }
            catch (Exception ex)
            {
                MessageBox.Show(ex.Message);
            }
            return ds;
        }
    }
    jdweng
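For the order-independence part specifically, a key-based comparison can be sketched like this (Python used for brevity; a C# LINQ version would follow the same shape):

```python
def mark_rows(expected_rows, submitted_rows, key=0):
    """Count submitted rows that match the expected row with the same key
    (e.g. an ID column), instead of comparing by position. A missing row
    then no longer shifts every later comparison."""
    expected = {row[key]: row for row in expected_rows}
    return sum(
        1 for row in submitted_rows
        if expected.get(row[key]) == row
    )
```

In the scenario from the question, a submission holding only rows 1 and 4 of the CSV would still earn two marks, because each row is matched by its ID rather than by its position.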

  • How to import data from excel or csv files to Oracle table

    hello everybody,
    I am new here and new to Oracle. I would like to know the steps to import data from Excel or CSV files into an Oracle table.
    Let's say I already have the table in Oracle, and my user gives me sets of data in an Excel worksheet.
    So, how can I import the Excel data into the Oracle table?
    Thank you in advance.
    cheers,
    shima

    Even easier: download JDeveloper 11g from this site.
    Set up the database connection, right-click on the table, select Import -> Excel and specify your file to load. On the import pop-up, view and update each tab (Columns, Data Types, and DML):
    Columns -- move the columns that you want to load to the box on the right
    Data Types -- in the second column, select the table column into which each column of the import file should load
    DML -- click this tab to generate the INSERT SQL
    Once done, click 'Insert'.
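If a GUI isn't available, the same load can be scripted; a minimal sketch that turns a CSV export into INSERT statements (table and column names are just examples, and real code should use bind variables rather than string literals):

```python
import csv
import io

def csv_to_inserts(csv_text, table):
    """Generate one INSERT statement per data row of a header+rows CSV export."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    cols = ", ".join(header)
    for row in reader:
        # Quote values as string literals, doubling embedded single quotes.
        vals = ", ".join("'" + v.replace("'", "''") + "'" for v in row)
        yield f"INSERT INTO {table} ({cols}) VALUES ({vals});"
```

The generated statements can then be run in SQL Developer or SQL*Plus against the existing table.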

  • Date format in CSV file

    Hello,
    I am working on a loading program which downloads a CSV file from the application server into an internal table.
    In the CSV file there is a date field in the format 22/05/2008. When I move this to the date field of my internal table, it is truncated to '22/05/20'. How do I solve this issue?
    Can I load it into the internal table in the same format?
    Please help me.
    Thanks,
    Shreekant

    Hi, Rangrej
    As all have suggested, first assign this date to a character field of length 10, then convert it to type DATE in the following way.
    Sample code below:
    TYPES: BEGIN OF ty_test,
      cdate(10),
      date LIKE sy-datum,
      END OF ty_test.
    DATA: it_test TYPE STANDARD TABLE OF ty_test WITH HEADER LINE.
    it_test-cdate = '22/05/2008'. APPEND it_test.
    it_test-cdate = '23/05/2008'. APPEND it_test.
    it_test-cdate = '24/05/2008'. APPEND it_test.
    it_test-cdate = '25/05/2008'. APPEND it_test.
    it_test-cdate = '26/05/2008'. APPEND it_test.
    LOOP AT it_test.
      it_test-date+0(4) = it_test-cdate+6(4).
      it_test-date+4(2) = it_test-cdate+3(2).
      it_test-date+6(2) = it_test-cdate+0(2).
      MODIFY it_test INDEX sy-tabix.
    ENDLOOP.
    Best Regards,
    Faisal
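The same rearrangement, shown in Python for comparison (DD/MM/YYYY in, ABAP-internal YYYYMMDD out):

```python
def to_internal_date(cdate):
    """Convert '22/05/2008' (DD/MM/YYYY) to ABAP-style internal 'YYYYMMDD'."""
    day, month, year = cdate.split("/")
    return year + month + day
```

The internal form fits an 8-character date field, which is why the raw 10-character external format was being truncated.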
