Cannot change SQL command text in Data Flow Task

I have an SSIS package that extracts data from Teradata into a SQL Server DB.
I'm using SQL Server 2008 R2 EE x64, and Visual Studio 2008 PE (BIDS) supplied with it, accessing Teradata v13.
In the Integration Services project I have a Data Flow Task with an ADO.NET source whose Data access mode is set to “SQL Command”.
This worked for a while after I initially entered a SQL statement to select data.
But when I change the existing SQL command text and save the package, the changes are lost.
It keeps reverting to the original SQL statement.
It is currently “select col1, col2, … col10 from view1”.
When I change it to anything else, such as “select col5 from view1”, save, and then double-click the ADO.NET source again, I find that the SQL command text is still the old one; it has gone back to the previous statement.
I’ve tried other statements, such as “exec macro” for Teradata, but the same thing keeps happening: the changes are not saved.
Does anyone have any ideas on this, or have you seen this before?

This is odd, but it looks like component metadata corruption, so why don't you:
1. Delete the Teradata connection manager and the source component, then re-add them and see if the component now accepts the new SQL statement. If that does not work,
2. Abandon this package and create a new one replicating the functionality, incorporating the new SQL.
You could also open the .dtsx file in a text editor and check whether the source component's SqlCommand property actually changed when you saved; editing it there directly is a possible workaround.
If #1 and #2 both fail, post any errors observed here, and check whether you are missing any updates to either the Teradata provider or SQL Server.
Arthur My Blog

Similar Messages

  • Data Flow Task is not executing in the package

    I have a package in which a Data Flow Task loads data from an Excel file into a SQL table.
    The Data Flow Task executes and loads the data if I execute that particular task on its own.
    But when I run the whole package, the data is not loaded into the table (it does not even show up in the Data Viewers).
    FYI: on every run, the source Excel file is deleted and recreated by the tasks that precede the Data Flow Task.
    Please give me the cause and solution for this.
    Thanks in advance.. 

    Hello Sai -
    Can you please post the structure of your package (screenshot)? Unless the Excel connection changes prior to the data flow, there should be no reason for missing data.
    Since, as you mentioned, the source files are deleted prior to the data load, it could mean:
    1 - The new files have no data.
    2 - The connection of the data flow is dynamically changed to point at the new files.
    Happy to help! Thanks. Regards and good Wishes, Deepak. http://deepaksqlmsbusinessintelligence.blogspot.com/

  • Error at Data Flow Task [OLE DB Source [1]]: "No column information was returned by the SQL command" error

    I have an .xlsx file that I am trying to upload into SQL Server. I use an OLE DB connection manager as the source and pick the OLE DB provider MS Office 12.0 Access Database Engine. I put in the query below, but it gives the following
    error when I run the SSIS package.
    [OLE DB Source [1]] Error: No column information was returned by the SQL command.
    The query I use in the Data Flow Task, as SQL command:
    Select top 1 [Investor #],[Investor Name], CONVERT (VARCHAR(1000),Delegation) AS DELEGATION ,[Date Added],[Date Revised] from [EXCELSHEET1$]
    Any suggestion to fix this error? I have to do a convert, since the Excel data type is DT_TEXT and the SQL table data type is VARCHAR(1000).

    If the target table's column is not a BLOB/CLOB type (e.g. NVARCHAR(MAX)) and you have text longer than the maximum length, it will not fit into the target. If you are allowed to, you need to trim it and then convert; otherwise you are
    sure to get the error, as the target data type is smaller.
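    A minimal sketch of trimming at the source, on the assumption that the query is evaluated by the ACE/Jet engine, where T-SQL's CONVERT is not available and MID plays the role of SUBSTRING (sheet and column names are taken from your post; Jet SQL has no comment syntax, so the caveats stay up here):

        Select top 1 [Investor #], [Investor Name],
            MID([Delegation], 1, 1000) AS DELEGATION,
            [Date Added], [Date Revised]
        from [EXCELSHEET1$]

    The trimmed DT_TEXT column can then be converted to a 1000-character string with a Data Conversion transformation inside the data flow, rather than in the provider's SQL dialect.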
    Arthur My Blog

  • Read from sql task and send to data flow task - [OLE DB Source [1]] Error: A rowset based on the SQL command was not returned by the OLE DB provider.

    I have created an Execute SQL Task.
    In it, I created an 'empidvar' variable of String type and set the SQLStatement to 'select distinct empid from emp', with
    ResultSet: Result Name = 0, Variable Name = empidvar.
    I added a Data Flow Task with an OLE DB source and put this SQL statement under SQL command: exec emp_sp @empidvar=?
    I am getting an error:
    [OLE DB Source [1]] Error: A rowset based on the SQL command was not returned by the OLE DB provider.
    [SSIS.Pipeline] Error: component "OLE DB Source" (1) failed the pre-execute phase and returned error code 0xC02092B4.

    Shouldn't the setting be Result Set = Full result set, as your query returns a result set? Also, the variable to be mapped should be of Object type.
    Then the Data Flow Task needs to go inside a Foreach Loop container based on the ADO.NET recordset enumerator, with your earlier variable mapped inside it, so that it iterates once for every value the SQL task returns.
    Also, if you are using a stored procedure in an OLE DB source, make sure you read this:
    http://consultingblogs.emc.com/jamiethomson/archive/2006/12/20/SSIS_3A00_-Using-stored-procedures-inside-an-OLE-DB-Source-component.aspx
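    As a rough sketch of the source command under that setup, with @empid as a hypothetical parameter name (use your procedure's actual parameter, and map the ? on the source's Parameters page to the loop's current-value variable); prefixing SET FMTONLY OFF is the crude, commonly cited trick for letting the provider derive column metadata from a stored procedure, roughly what the linked article discusses:

        SET FMTONLY OFF;         -- let the provider see the procedure's real result shape
        EXEC dbo.emp_sp @empid = ?;  -- @empid is illustrative; ? is mapped on the Parameters page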
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • SQL 2005 SP2 - Cannot open Data Flow Task in SSIS

    I have just installed Service Pack 2 on my SQL 2005 Standard Edition.
    However, now none of my SSIS packages will let me open their Data Flow Tasks. I get the following error:
    TITLE: Microsoft Visual Studio
    Cannot show the editor for this task.
    ADDITIONAL INFORMATION:
    The task returned an unsupported control editor type. (Microsoft.DataTransformationServices.Design)
    If I try to create a new Data Flow task I get:
    TITLE: Microsoft Visual Studio
    Failed to create the task.
    ADDITIONAL INFORMATION:
    The designer could not be initialized. (Microsoft.DataTransformationServices.Design)
    I have tried installing the latest hotfixes since then, but they had no effect.
    Can anybody help me? Please?

    I have had this same issue: tasks would open fine in an SSIS package until SP2 was installed, and then I got the same error as noted above, i.e.:
    TITLE: Microsoft Visual Studio
    Cannot show the editor for this task.
    ADDITIONAL INFORMATION:
    The designer could not be initialized. (Microsoft.DataTransformationServices.Design)
    If anyone has some ideas on this, it would be greatly appreciated.

  • SQL Query using a Variable in Data Flow Task

    I have a Data Flow Task that I created. The source query is in the file "LPSreason.sql", which is stored on a shared drive, such as
    \\servername\scripts\LPSreason.sql
    How can I use this .sql file as a source in my Data Flow Task? I guess I can use SQL Command as the access mode, but I am not sure how to do that.

    Hi Desigal59,
    You can use a Flat File Source adapter to get the query statement from the .sql file. When creating the Flat File Connection Manager, set the Row delimiter to a character that won’t appear in the SQL statement, such as “Vertical Bar {|}”. That way, the
    Flat File Source outputs only one row with one column. If necessary, change the data type of the column from DT_STR to DT_TEXT so that the Flat File Source can handle SQL statements longer than 8000 characters.
    After that, connect the Flat File Source to a Recordset Destination, so that the column value is stored in an SSIS Object type variable (supposing the variable name is varQuery).
    In the Control Flow, we can use one of the following two methods to pass the value of the Object type variable varQuery to a String type variable QueryStr, which can be used in an OLE DB Source directly.
    Method 1: via Script Task
    1. Add a Script Task under the Data Flow Task and connect them.
    2. Add User::varQuery to ReadOnlyVariables and User::QueryStr to ReadWriteVariables.
    3. Edit the script as follows:
    public void Main()
    {
        // Load the single-row, single-column recordset held in varQuery into a DataTable;
        // OleDbDataAdapter.Fill has an overload that accepts an ADO recordset object.
        System.Data.OleDb.OleDbDataAdapter da = new System.Data.OleDb.OleDbDataAdapter();
        System.Data.DataTable dt = new System.Data.DataTable();
        da.Fill(dt, Dts.Variables["User::varQuery"].Value);
        // Copy the SQL text from the first (only) cell into the string variable.
        Dts.Variables["User::QueryStr"].Value = dt.Rows[0].ItemArray[0].ToString();
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    4. Add another Data Flow Task under the Script Task and connect them. In that Data Flow Task, add an OLE DB Source, set its Data access mode to “SQL command from variable”, and select the variable User::QueryStr.
    Method 2: via Foreach Loop Container
    1. Add a Foreach Loop Container under the Data Flow Task and connect them.
    2. Set the enumerator of the Foreach Loop Container to the Foreach ADO Enumerator, and select User::varQuery as the ADO object source variable.
    3. On the Variable Mappings page, map User::QueryStr to Index 0, so the single collection value lands in the string variable.
    4. Inside the Foreach Loop Container, add a Data Flow Task configured like step 4 of Method 1.
    Regards,
    Mike Yin
    TechNet Community Support

  • Data Flow Task problems after installing SQL Server 2014 SSIS

    I can't create any Data Flow Task. Once I create a (tested and working) connection (it doesn't
    matter whether it is Excel, Flat File, or something else) and try to use it with the Source Assistant, I get the following messages:
    ===================================
    The component could not be added to the Data Flow task.
    Could not initialize the component. There is a potential problem in the ProvideComponentProperties method. (Microsoft Visual Studio)
    ===================================
    Error at Data Flow Task [SSIS.Pipeline]: Component ", clsid {C4D48377-EFD6-4C95-9A0B-049219453431}" could not be created and returned error code 0x80070005 "Access is denied.". Make sure that the component is registered correctly.
    Error at Data Flow Task [ [1]]: The component is missing, not registered, not upgradeable, or missing required interfaces. The contact information for this component is "".
    ===================================
    Exception from HRESULT: 0xC0048021 (Microsoft.SqlServer.DTSPipelineWrap)
    Program Location:
    at Microsoft.SqlServer.Dts.Pipeline.Wrapper.CManagedComponentWrapperClass.ProvideComponentProperties()
    at Microsoft.DataTransformationServices.Design.PipelineTaskDesigner.AddNewComponent(String clsid, Boolean throwOnError, Boolean select)
    ===================================

    See examples here:
    http://www.sqlusa.com/bestpractices/ssis-wizard/
    What is your data source?
    What is your data destination?
    Kalman Toth Database & OLAP Architect
    SQL Server 2014 Database Design
    New Book / Kindle: Beginner Database Design & SQL Programming Using Microsoft SQL Server 2014

  • SQL Error Data Flow Task 1: The buffer manager detected that the system was low on virtual memory...

    I'm relatively new to SQL, and this is the error that appeared when I tried importing my data. I am not sure how to deal with it. Help, please. Thanks a lot!
    Information 0x4004800c: Data Flow Task 1: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 14 buffers were considered and 14 were locked. Either not enough memory is available to the pipeline
    because not enough is installed, other processes are using it, or too many buffers are locked.

    Either reduce the amount of data, e.g. by lowering the Data Flow Task's maximum rows per buffer and buffer size (the DefaultBufferMaxRows and DefaultBufferSize properties),
    or install more RAM, or process the data on a more capable computer.
    Arthur
    MyBlog
    Twitter

  • "Syntax error or access violation" on Data Flow Task OLE DB Data Source

    I am implementing an expression parameter for a SQL Server connection string (like this: http://danajaatcse.wordpress.com/2010/05/20/using-an-xml-configuration-file-and-expressions-in-an-ssis-package/) and it works fine until it reaches the data flow
    task's OLE DB Source. In this task, I execute a stored procedure like this:
    exec SelectFromTableA ?,?,?
    The error message is this:
    0xC0202009 at Data Flow Task, OLE DB Source [2]: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
    An OLE DB record is available.  Source: "Microsoft OLE DB Provider for SQL Server"  Hresult: 0x80004005  Description: "Syntax error or access violation".
    Error: 0xC004706B at Data Flow Task, SSIS.Pipeline: "OLE DB Source" failed validation and returned validation status "VS_ISBROKEN"
    When I change the SQL command above to read from the table directly, it works fine. I should also add that before I changed the connection string of the SQL data source to use the expression, the SSIS package was working fine, and I know that the
    connection string is fine because the other tasks in the package work fine!
    Any idea why?

    Hi AL.M,
    As per my understanding, this problem is due to a mismatch between the source and the destination tables. You can reconfigure each component of the package to check the table schemas and configuration settings, then close BIDS/SSDT,
    reopen it, and see whether the errors persist.
    Besides, to troubleshoot this issue, you can use the Variables window to inspect the variable's value. For more details, please refer to the following blog:
    http://consultingblogs.emc.com/jamiethomson/archive/2005/12/05/2462.aspx
    The following blog about “SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred: Reasons and troubleshooting” is also for your reference:
    http://blogs.msdn.com/b/dataaccesstechnologies/archive/2009/11/10/ssis-error-code-dts-e-oledberror-an-ole-db-error-has-occurred-reasons-and-troubleshooting.aspx
    Hope this helps.
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • Building a data flow task, within a foreach loop for dynamic table name, but ole db source not allowing variable

    In my control flow, I set up a variable for the table name, enumerated by SMO, following the instructions from the link here:
    http://www.bidn.com/blogs/mikedavis/ssis/156/using-a-for-each-loop-on-tables-ssis
    Now I put a data flow task inside the foreach loop. I selected the OLE DB connection manager for my database, set the Data access mode to "Table name or view name variable", and selected my variable name from the drop-down. So far so good. When I click OK,
    it gives me error 0x80040E37, basically saying it can't open the rowset for "my variable"; check that the object exists in the database.
    So I assume I won't be able to do this "that" easily, and I will need to build a "SQL command from variable" or some such thing. Any advice on how to build this source editor so it takes the table name dynamically from the variable?
    Thanks in advance!
    mpleaf

    Hi mpleaf,
    Please try setting "ValidateExternalMetadata" to False in your OLE DB Source properties and the "DelayValidation" property to True; please refer to these similar threads:
    http://social.msdn.microsoft.com/Forums/en-US/sqlintegrationservices/thread/620557d9-41bc-4a40-86d5-0a8d2f910d8c/
    http://social.msdn.microsoft.com/Forums/en-US/sqlintegrationservices/thread/456f2201-447c-41b3-bf27-b3ba47e6b737
    Thanks,
    Eileen
    Eileen Zhao
    TechNet Community Support

  • Need help ASAP with Data Flow Task Flat File Connection

    Hey there,
    I have a Data Flow Task within a ForEach Loop container. The source of the flow is an ADO.NET connection and the destination is a Flat File connection. I loop through a collection of strings in the ForEach loop, and based on the string content,
    I write some data to the same destination file in each iteration, overwriting the previous version.
    I am running into the following errors:
    [Flat File Destination [38]] Warning: The process cannot access the file because it is being used by another process.
    [Flat File Destination [38]] Error: Cannot open the datafile "Example.csv".
    [SSIS.Pipeline] Error: Flat File Destination failed the pre-execute phase and returned error code 0xC020200E.
    I know what's happening, but I don't know how to fix it. The first time through the ForEach loop, the destination file is updated. The second time through is when this error pops up. I think it's because the first iteration is not closing the destination
    file. How do I force the file to close within the Data Flow Task, or through a subsequent Script Task?
    This works within a SQL 2008 package on one server but not within SQL 2012 package on a different server.
    Any help is greatly appreciated.
    Thanks! 

    Thanks for the response, Narsimha. What do you mean by FELC?
    First time poster - what is the best way to show the package here?

  • Data Flow task with error redirection hangs

    I am migrating an SSIS package from 2005 to 2012. The package, among other things, contains a data flow task with error redirects. The source is a pipe-delimited flat file that we receive from an outside source. The file contains a
    bunch of bad data, including empty lines. I redirect the bad rows so I can provide an audit back to the list provider. The file has about 300k rows. Since I completed the migration wizard, the data flow task stalls at 72,173 rows. I
    can change the number of rows that get loaded by changing the DefaultBufferMaxRows and DefaultBufferSize values, but I can't get it anywhere near 300k. I decided to try rebuilding the data flow task from scratch and found that if I set it to ignore all
    errors, the entire file will load; but when I add the redirect, it hangs and does not give me any errors.
    I am currently running it in debug mode from Visual Studio. I have not tried running it from the SQL Server Agent yet.
    Any help would be greatly appreciated. I would like to keep the error redirects if at all possible, for audit reasons.
    Thanks in Advance.
    Alan

    The error says it stopped on row 45,200 and that the column AgentIdentifier returned status value 4, "Text was truncated...". This is one of the errors that I have to trap for. The field preceding AgentIdentifier is a remarks field that typically
    contains embedded pipe characters that throw off the rest of the row. There are some other errors that I typically find in the data file, but that one is the most frequent, and it is why I have to redirect, so I can report back to the client which
    rows they need to fix.
    Thanks for the suggestion.

  • Data flow task error failed validation and return validation status "VS_NEEDSNEWMETADATA"

    I have an ETL with ~800 tables that I am moving from Oracle to SQL Server (Prod Oracle -> Prod SQL).
    Now a new Oracle/SQL version has come from the vendor that I need to test, so I created new DEV environments for Oracle and SQL; the update includes new columns in existing tables as well as new tables (DEV Oracle -> DEV SQL).
    So what I tried to do is take the old ETL (PROD) and change the connections to the DEV servers.
    When I execute the packages from my local laptop, they work, but if I try to execute the packages from the job schedule, I get errors: "Data flow task error failed validation and return validation status "VS_NEEDSNEWMETADATA"".
    I went to each table to check whether any columns differ, and I dropped and recreated some of the tables in the destination, but the error still shows. I also tried setting the package's "DelayValidation" property to True, but without
    success.

    I do not understand the difference between "... if I am going to change the Connection Manager to a new connection" and "didn't change the Connection Manager, only changed the server name / user / pass" across 800 tables.
    What I see is that the schema of some tables your package sees in Dev (on the laptop) is not the schema it sees once deployed, hence the metadata error.
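    If it helps to pin down which tables drifted, a quick INFORMATION_SCHEMA check run against both environments shows where the column metadata diverges (the schema and table names below are placeholders, not from your post):

        SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, IS_NULLABLE
        FROM INFORMATION_SCHEMA.COLUMNS
        WHERE TABLE_SCHEMA = 'dbo'        -- placeholder schema
          AND TABLE_NAME = 'YourTable'    -- placeholder table
        ORDER BY ORDINAL_POSITION;

    Comparing the output from DEV and PROD for a failing table usually points straight at the column that makes the pipeline demand new metadata.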
    Arthur
    MyBlog
    Twitter

  • Dynamically pick the table names in data flow task SSIS

    Hi All,
    I want to create an SSIS package which loads data to tables on another server every day. I have around 250 tables to load each day; the source and destination table names are available in a metadata table, so the table names have to be read from the
    metadata table and the data loaded accordingly. Is there a way to configure the source and destination table names dynamically in a Data Flow Task?
    I am a newbie to SSIS; can anyone help with a solution to this problem?

    You can do that; it is not a big deal. The underlying problem is this: suppose you construct an ETL based on some source x and destination y, with x having 3 columns and y having 3 columns. If you pick the table names dynamically, the ETL might fail
    when you hit the situation below:
    source x (3 columns), destination y (4 columns): there will be no mapping, because it is dynamic. Sometimes the mapping fails even when the source and destination have the same number of columns. If you still want to do it, follow these steps:
    Create two variables:
    1. variable1, data type String
    2. variable2, data type String
    Take one Execute SQL Task and pick your source table name dynamically, as desired, from the metadata table:
    put "SELECT sourcetblname as Res FROM @metadata WHERE ID=1" in the SQL statement, name the result Res (the same as the column alias), and map it to variable1 (see the sketch below).
    In variable2, go to the expression and enter "Select * from " + @[User::variable1]; this will be your dynamically constructed command.
    Then connect that Execute SQL Task to the Data Flow Task, use an OLE DB source with the data access mode "SQL command from variable", and choose variable2.
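    As a loose sketch of that Execute SQL Task query against a concrete metadata table (dbo.TableMetadata, ID, and SourceTblName are illustrative names, not from the post):

        SELECT SourceTblName AS Res
        FROM dbo.TableMetadata     -- placeholder metadata table
        WHERE ID = 1;              -- pick one row per run or loop iteration

    With the Result Set set to "Single row" and Res mapped to User::variable1, the expression on variable2 then evaluates at run time to something like "Select * from dbo.Customers".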
    - please mark correct answers

  • Row-by-Row processing in data flow task

    Hi to all
    I want to ask you: how can I process one row at a time in the transformation components of a data flow task?
    For example, we have the following components in the data flow task:
    Derived Column ----> Ole DB Command_1 ----> Ole DB Command_2
    I want Ole DB Command_1 to receive the first row and execute its SQL command (an INSERT), then Ole DB Command_2 to receive the same row and execute its SQL command (an INSERT);
    then Ole DB Command_1 receives the second row and both commands run again for it, then the third row,
    ... and so on, until the last row.
    Instead, right now Ole DB Command_1 receives n rows and executes n INSERTs, and then Ole DB Command_2 receives the same n rows and executes n INSERTs.
    How do I realize row-by-row processing in Ole DB Command_1 and Ole DB Command_2?
    thanks in advance.

    Why can't the INSERTs be wrapped inside a procedure, with the procedure then called from a single OLE DB Command, so that the inserts get executed in the required sequence for each record in the pipeline?
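    A minimal sketch of that idea, with hypothetical table, column, and procedure names (dbo.TableA, dbo.TableB, Col1, and usp_InsertRow are all illustrative); the single OLE DB Command would then just run EXEC dbo.usp_InsertRow ? once per pipeline row:

        CREATE PROCEDURE dbo.usp_InsertRow
            @Col1 INT    -- map the pipeline column to this parameter
        AS
        BEGIN
            SET NOCOUNT ON;
            -- Both statements run in order for each row, inside one transaction,
            -- which is the per-row sequencing two separate OLE DB Commands cannot guarantee.
            BEGIN TRANSACTION;
                INSERT INTO dbo.TableA (Col1) VALUES (@Col1);  -- what Ole DB Command_1 did
                INSERT INTO dbo.TableB (Col1) VALUES (@Col1);  -- what Ole DB Command_2 did
            COMMIT TRANSACTION;
        END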
    Please Mark This As Answer if it solved your issue
    Please Mark This As Helpful if it helps to solve your issue
    Visakh
    My MSDN Page
    My Personal Blog
    My Facebook Page
