Data Flow Question

I have an AIR app where an employee creates an order. The order information goes into an orders ArrayCollection; the employee also adds items to the order, and those go into an orderItems ArrayCollection.
Because the employee may not always have a network connection, I have made the app cache the data in a local stored object, with an upload button to upload the orders.
Currently I have it upload the order for each item in the orders ArrayCollection and show an alert depending on whether the add status returns true or false.
Does anyone know of a way, or has anyone seen a tutorial, showing how to grab the associated items from the items ArrayCollection based on the order's UID and upload those as well in Flex?
Also, after the items have been uploaded I would like to remove them from the ArrayCollection.
Thanks for any help!

Your post is very vague and I don't think you give enough information to really help.
I am pretty familiar with LabVIEW, but terms like "spit out", "get serviced", and "continuously" have very little meaning without much more detail.
If you only plot the data when they are nonzero, you must already have a case structure. To build an array, all you need is a shift register initialized with an empty array and a Build Array node in one of the cases (the array is wired through unchanged in the other case). Make sure to select the "concatenate inputs" mode for Build Array. Of course, if you continuously append data, you will very quickly run out of memory, so you probably need some mechanism to deal with that scenario. Do you want to plot only the current data or the entire accumulated array?
I have no idea how the consumer loop can hold up the dataflow in the producer loop. Where is the dependency coming from?
You don't need to upload a huge project. I am sure you can reduce it to a simple VI with two small loops and some random number generator. This would be enough to show the basic architecture you are using. Thanks!
LabVIEW Champion. Do more with less code and in less time.

Similar Messages

  • SSIS Data Flow Question

    While working on my current project, a problem came to mind.
    On past projects in SSIS, I was taking data from an Excel or csv source and converting for use in SSMS. Straightforward and fairly simple, as I became more and more acquainted with SSIS.
    But, the stored procedure I'm working on now raised a question on how data flows through a project.
    The procedure updates a couple of tables using SQL's UPDATE command. So, it does a direct modification of a table.
    However, in duplicating that in SSIS - using Derived Column tools - I started to wonder what the data was doing as the flow progressed through the various steps in the Control and Data flows.
    Here's my Control Flow so far:
    In Step 1, the SQL table, sh1, has all records that meet the WHERE criteria of the embedded SQL code deleted, but leaving the rest of the table intact.
    In Step 2, I define a variable for use in the project using an embedded SQL query.
    In Step 3, I encounter the first update task:
    This flow terminates in the lower right corner with Update sh1, Pass 4. Here is where I have my question.
    When this flow terminates, does it return to the Control Flow with the table sh1 updated as required by the
    Update sh1 Data Flow? In other words, does it enter Control Flow Step 4 with the updates, or does the next update task (one in which the stored procedure uses the modified table, sh1) simply reload sh1 from the server, without any updates?
    If the former, then no worries, as that's how I was proceeding until a little bug in my ear said, "Whoa, buckwheat! Maybe the changes aren't stored."
    If the latter, then I'm guessing that I need to have a Data Flow Destination tool dragged to the workspace and connected up to the output of the
    Update sh1, Pass 4 tool.
    Fine, I thought, I'll direct the output to the sh1 table.
    But, that will put all of sh1 into the table, not just the updated rows. I'd have to TRUNCATE sh1 first. But, sh1 is active in the Data Flow. If I truncate it, there'll be nothing to update. Or, if I truncate it after the
    Update sh1, Pass 4 tool, it'll lose all the updates - and everything else.
    I figured that, in the latter case, I'd have to have some sort of interim table to hold the results of the
    Update sh1 Data Flow, truncate sh1 and then put the contents of the interim table into the, now empty, sh1 before returning control back to the Control Flow.
    I'd like to avoid an interim table, if at all possible, but if I have to have one, I have to.
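    In T-SQL terms, the interim-table step just described would look roughly like the sketch below; sh1 is the table from this thread, while the interim table name and the idea of running the reload from an Execute SQL Task are placeholders for illustration only:
        -- Sketch of the interim-table approach described above (not a recommendation):
        -- the Data Flow writes its updated rows into dbo.sh1_interim (hypothetical name),
        -- then an Execute SQL Task empties sh1 and reloads it from the interim table.
        TRUNCATE TABLE dbo.sh1;
        INSERT INTO dbo.sh1
        SELECT * FROM dbo.sh1_interim;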
    Or, is there another way?
    Thanx in advance for any help!

    You can always use a Recordset Destination to hold the results from your first data flow transformation. The recordset variable is retained in memory and is available within the context of your package:
    Object Variables Recordsets
    But then you need to modify your package to read from the recordset. You can either use a Script Task, or loop through each record in your recordset and process the updates for each row:
    Shredding Record Set
    While these are possible options, I do not think they are the best way to go about things.
    I would like to know what you are doing in the Update Sh2 step. You can keep adding data flows that follow Update sh1 Pass 4 and then finally update the table through an OLE DB Command transformation (see the sketch below); then you do not have to store anything in memory, but you might end up with a complex data flow.
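    For illustration, an OLE DB Command transformation runs a parameterized statement once per pipeline row, with the ? markers bound to input columns on its column mappings. A minimal sketch, assuming hypothetical column names on sh1 (they are not from this thread):
        -- Hypothetical per-row statement executed by the OLE DB Command transformation;
        -- order_id and amount are placeholder column names, not from the original post.
        UPDATE dbo.sh1
        SET    amount   = ?   -- mapped to the updated pipeline column
        WHERE  order_id = ?;  -- mapped to the key column from the pipeline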
    The design will depend on the volume of records and the number of transformations you want the data to go through.
    You can avoid transformations by writing queries that retrieve results in the way you want; for example, sorting the results by adding an ORDER BY to your source query, as opposed to adding a Sort transformation.
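    As a small illustration of pushing that work into the source query (table and column names are again hypothetical placeholders):
        -- Filtering and ordering happen in the database, so no Sort or
        -- Conditional Split transformation is needed downstream.
        SELECT order_id, amount, status
        FROM   dbo.sh1
        WHERE  status = 'OPEN'
        ORDER  BY order_id;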
    Regards, Dinesh

  • LO cockpit extraction - data flow question

    This question is related to LO cockpit extraction.
    Are update tables meant to be the same as setup tables, or are they different?
    Can anybody provide a high-level data flow for the logistics data using LO cockpit from R3 to BW?
    Please correct/enhance the data flow below for the different methods of LO delta updates as I assume them,
    and include the setup table where it fits.
    (for direct update)
    R3 DB tables -> delta queue -> extract structure -> transfer structure -> InfoSource -> data target
    (for queued delta)
    R3 DB tables -> extraction queue -> delta queue -> extract structure -> transfer structure -> InfoSource -> data target
    (for unserialized V3 update)
    R3 DB tables -> postings to update tables -> delta queue -> extract structure -> transfer structure -> InfoSource -> data target
    I also want to know where the delta queue resides - in R3 or in BW? Can the content of the delta queue be viewed?
    Thanks for your help - I will give full points for your input.
    ak

    Check the diagram in links below. It clearly identifies the data flow and whether the component is in R3 or BW.
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur

  • Data Flow Question in newly Migrated Business Content Install

    Hi, I just converted my Purchasing Business Content from 3.5 to 7.x. This content loads the InfoCube directly from the DataSource. After my data flow migration I am still left with an InfoSource.
    Here is a picture of my Data Flow
    http://i55.tinypic.com/258b7zs.png
    I thought I would not have an InfoSource after all of this.  Is this correct?
    Thanks for any help with this...

    Hi, Kenneth,
    I believe it's an absolutely correct result after the migration.
    I had the same thing with InfoSources when migrating 0SD_C03 and never had issues with that.
    InfoSources can be used in the 7.x data flow, as sometimes it's a good thing to have an additional transformation between source and target (it allows for more flexibility in data transformation).
    By the way, your InfoSource was also migrated, as it's a 7.x InfoSource now. I believe it has a different object type compared to the old 3.x InfoSource. You can check the object type in SE11, table TADIR, selecting by your InfoSource name.

  • Display Data Flow - Short Dump

    Hi all,
    When I select Display Data Flow for any cube, it results in a short dump.
    I have searched for the answer in previous forum questions. I could only find answers for previous BW versions, but not for BI7.
    Could you please let me know the solution for this issue.
    Thanks & Regards,
    Eswari

    Hi All,
    Thank you very much for all of your responses.
    I am working on Support Package 10.
    Here is the detailed description of the short dump.
    Short text
        The current application triggered a termination with a short dump.
    What happened?
        The current application program detected a situation which really
        should not occur. Therefore, a termination with a short dump was
        triggered on purpose by the key word MESSAGE (type X).
    What can you do?
        Note down which actions and inputs caused the error.
        To process the problem further, contact you SAP system
        administrator.
        Using Transaction ST22 for ABAP Dump Analysis, you can look
        at and manage termination messages, and you can also
        keep them for a long time.
    Error analysis
        Short text of error message:
        GP: Control Framework returned an error; contact system administrator
        Long text of error message:
         Diagnosis
             The Graphical Framework is based on the basis technology known as
             the Control Framework. A method in the Control Framework returned
             an error.
         Procedure
             It probably involves a programming error. You should contact your
             system administrator.
         Procedure for System Administration
             Check the programming of the graphics proxy especially for the
             parameters that were sent and, if necessary, correct your program.
        Technical information about the message:
        Message class....... "APPLG"
        Number.............. 229
        Variable 1.......... " "
        Variable 2.......... " "
        Variable 3.......... " "
        Variable 4.......... " "
    How to correct the error
        Probably the only way to eliminate the error is to correct the program.
        If the error occures in a non-modified SAP program, you may be able to
        find an interim solution in an SAP Note.
        If you have access to SAP Notes, carry out a search with the following
        keywords:
        "MESSAGE_TYPE_X" " "
        "CL_AWB_OBJECT_NET_SAPGUI======CP" or "CL_AWB_OBJECT_NET_SAPGUI======CM005"
        "PBO"
        If you cannot solve the problem yourself and want to send an error
        notification to SAP, include the following information:
        1. The description of the current problem (short dump)
           To save the description, choose "System->List->Save->Local File
        (Unconverted)".
        2. Corresponding system log
           Display the system log by calling transaction SM21.
           Restrict the time interval to 10 minutes before and five minutes
        after the short dump. Then choose "System->List->Save->Local File
        (Unconverted)".
        3. If the problem occurs in a problem of your own or a modified SAP
        program: The source code of the program
           In the editor, choose "Utilities->More
        Utilities->Upload/Download->Download".
       4. Details about the conditions under which the error occurred or which
       actions and input led to the error.
    Thanks,
    Eswari.

  • R/3 data flow is timing out in Data Services

    I have created an R/3 data flow to pull some AP data in from SAP into Data Services.  This data flow outputs to a query object to select columns and then outputs to a table in the repository.  However the connection to SAP is not working correctly.  When I try to process the data flow it just idles for an hour until the SAP timeout throws an error.  Here is the error:
    R/3 CallReceive error <Function Z_AW_RFC_ABAP_INSTALL_AND_RUN: connection closed without message (CM_NO_DATA_RECEIVED)
    I have tested authorizations by adding SAP_ALL to the service account I'm using and the problem persists.
    Also, the transports have all been loaded correctly.
    My thought is that it is related to the setting that controls the method of generating and executing the ABAP code for the data flow, but I can't find any good documentation that describes this, and my trial and error method so far has not produced results.
    Any help is greatly appreciated.
    Thanks,
    Matt

    You can't find any good documentation??? I am working my butt off just.......just kiddin'
    I'd suggest we divide the question into two parts:
    My dataflow takes a very long time, how can I prevent the timeout after an hour? Answer:
    Edit the datastore; there is a flag called "execute in background" to be enabled. With that, the ABAP is submitted as a background spool job and hence does not have the dialog-mode timeout. Another advantage is that you can watch it running by browsing the spool jobs from the SAP GUI.
    The other question seems to be: why does it take that long at all? Answer:
    Either the ABAP takes that long because of the data volume,
    or the ABAP is not performing well, e.g. a join via ABAP loops with the wrong table as the inner one.
    Another typical reason is using direct_download as the transfer method. This is fine for testing, but it takes a very long time to download data via the GUI_DOWNLOAD ABAP function, and the download time would be part of the ABAP execution.
    So my first set of questions would be
    a) How complex is the dataflow, is it just source - query - data_transfer or are there joins, lookups etc?
    b) What is the volume of the table(s)?
    c) What is your transfer method?
    d) Have you had a look at the generated abap? (in the R/3 dataflow open the menu Validation -> Generate ABAP)
    btw, some docs: https://wiki.sdn.sap.com:443/wiki/display/BOBJ/ConnectingtoSAP

  • Error on Data Flow Task MSSQL 2012 Clustered "Description: The version of Lookup is not compatible with this version of the DataFlow. "

    We have an SSIS package that runs on clustered MSSQL 2012 Enterprise nodes that is failing. We use a job to execute the package.
    Environmental information:
    Product - Microsoft SQL Server Enterprise: Core-based Licensing (64-bit)
    Operating System - Microsoft Windows NT 6.1 (7601)
    Platform - NT x64
    Version - MSSQL Version 11.0.3349.0
    Package is set to 32-bit. All permissions verified. Runs in lower environments, same MSSQL version. All environments are clustered. In the failing environment, all nodes are at the same service pack. I have not verified whether all nodes in the failing environment have SSIS installed. Data access is installed. We have other, simpler packages that run in this environment, just not this one. Time to ask the community for help!
    Error:
    Source: Data Flow Task - Data Flow Task (SSIS.Pipeline)     Description: The version of Lookup is not compatible with this version of the DataFlow.  End Error  Error:  Code: 0xC0048020    
    Description: Component "Conditional Split, clsid {7F88F654-4E20-4D14-84F4-AF9C925D3087}" could not be created and returned error code 0x80070005 "Access is denied.". Make sure that the component is registered correctly.  End Error 
    Description: The component is missing, not registered, not upgradeable, or missing required interfaces. The contact information for this component is "Conditional Split;Microsoft Corporation; Microsoft SQL Server; (C) Microsoft Corporation; All Rights
    Reserved; http://www.microsoft.com/sql/support;0".  End Error 
    (Left out shop specific information.  This is the first error in the errors returns by the job history for this package. )
    Thanks in advance.

    Hi DeveloperMax,
    According to your description, the error occurs when you execute the package with Agent job on clustered MSSQL 2012 Enterprise Nodes.
    As per my understanding, this issue can be caused by using SQL Server Agent to schedule a SQL Server Integration Services package in a 64-bit environment while the SSIS package references 32-bit DLLs or drivers that are available only in 32-bit versions, so the job fails.
    To fix this issue, we should use the 32-bit version of the DTExec.exe utility when the 64-bit SQL Server Agent runs the package. To run a package in 32-bit mode from a 64-bit version of SQL Server Agent, we can go to the Job Step dialog box and select "32 bit runtime" on the Advanced tab.
    Besides, we should make sure that SQL Server Integration Services is installed on the failing environment.
    If there are any other questions, please feel free to ask.
    Thanks,
    Katherine Xiong
    Katherine Xiong
    TechNet Community Support

  • Why does a DB2 Data Flow Task query not accept a date coming from a string or datetime variable in SSIS?

    I am trying to compare a DB2 date format to a date variable from SSIS. I have tried to set the variable as datetime and as string. I have also cast the SQL date as a date in the data flow task. I have tried a number of combinations of date formats, but no luck yet. Does anyone have any insights on how to set a date (without the time) variable and be able to use it in the data flow task SQL? There has to be an easy way to accomplish that. I get the following error:
    "An invalid datetime format was detected; that is, an invalid string representation or value was specified. SQLSTATE=22007"
    Thanks!

    Hi Marcel,
    Based on my research, in DB2, we use the following function to convert a string value to a date value:
    Date(To_Date('String', 'DD/MM/YYYY'))
    So, you can set the variable type to String in the package, and try the following query:
    ACCOUNT_DATE BETWEEN '11/30/2013' AND Date(To_Date(?, 'DD/MM/YYYY'))
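    Put together, the source query in the Data Flow Task might look roughly like the sketch below; the ? marker is mapped to the String package variable, and everything except ACCOUNT_DATE and the date expressions is a hypothetical placeholder:
        -- Hypothetical full source query for the data flow task; TO_DATE/DATE
        -- convert the SSIS string parameter (?) into a DB2 date value.
        SELECT *
        FROM   MYSCHEMA.MYTABLE
        WHERE  ACCOUNT_DATE BETWEEN '11/30/2013'
                                AND Date(To_Date(?, 'DD/MM/YYYY'))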
    References:
    http://stackoverflow.com/questions/4852139/converting-a-string-to-a-date-in-db2
    http://www.dbforums.com/db2/1678158-how-convert-string-time.html
    Regards,
    Mike Yin
    TechNet Community Support

  • How can I fix consistent TCP timeout and make data flow simpler?

    Hi!
    I'm acquiring data from a Scanivalve Digital Scanning Array through a TCP/IP connection. I'm having problems with the connection timing out regularly. It will run fine if you take data several times in a row, but if the VI sits for several minutes (while looking at previous data runs or adjusting the test setup), the connection will time out the next time you try to take data. The subVIs (reading the TCP connection, processing the data packet, etc.) were provided by the manufacturer several years ago. The mid-level VIs were written by another engineer, and I adapted them to run with the top-level VI that I needed for this test. The data flow in the VIs is convoluted, and while I can follow the ones that I wrote, I'm not sure how to troubleshoot the others.
    I posted a question about an unrelated problem, and the responses mentioned that race conditions were going to be a problem because of all of the global variables.
    Please be patient with my lack of knowledge in some areas. Any help would be appreciated.
    Thanks so much!
    -Sarah
    Engineering intern
    Techsburg, Inc.
    Attachments:
    DSA_Acquistion_VIs.zip (253 KB)

    I'm not positive about the best solution in this situation, but there is lots of information available regarding error 56 when using TCP/IP communication.
    You might find some of these useful:
    Error 56 Occurred at TCP Open: Windows XP Fails as TCP/IP Server with LabVIEW 6.1.
    Error 56 Occurs When Using TCP Listen.vi
    TCP/IP Error Codes and Related Time-out Issues in LabVIEW

  • SSIS Data Flow task using SharePoint List Adapter Setting SiteUrl won't work with an expression

    Hi,
    I'm trying to populate the SiteUrl from a variable that has been set based on a query to a SQL table that has a URL field.  Here are the steps I've taken and the result.
    Created a table with a url in it to reference a SharePoint Task List.
    Created an Execute SQL Task to grab the URL, putting the result set in a variable called SharePointUrl.
    Created a For Each container and within the collection I use the SharePointUrl as the ADO object source variable and select rows in the first table.
    Still in the For Each container, within the Variable Mappings I have another package variable called PassSiteUrl2, and I set that to Index 0, i.e. the value of the result set.
    I created a Script Task to then display the PassSiteUrl2 variable, and it works great; I see my URL.
    This is where it starts to suck eggs!!!!
    I insert a Data Flow Task into my foreach loop.
    I Insert a SharePoint List Adapter into my Data Flow
    Within my SharePoint List Adapter I set my list to be "Tasks", My list view to be "All Tasks" and then I set the url to be another SharePoint site that has a task list just so there is some default value to start with.
    Now within my Data Flow I create an expression and set the [SharePoint List Source].[SiteUrl] equal to my variable @[User::PassSiteUrl2].
    I save everything and run my SSIS package, and it overlays the default [SharePoint List Source].[SiteUrl] with blanks in the SharePoint List Adapter, then throws an error that it's missing a URL.
    So here is my question: why, if my package variable displays fine in my Control Flow, is it now not seen (or seen as blank) in the Data Flow expression? Anyone have any ideas?
    Thanks
    Donald R. Landry

    Thanks Arthur,
    The scope of the variable is at a package level and when I check to see if it can be moved Package level is the highest level.  The evaluateasexpression property is set to True.  Any other ideas?
    I also tried the following: take the variable that has the URL in it and just assign it to the description of the Data Flow Task, to see if it would show up there (the idea being that the value of my @[User::PassSiteUrl] should just show in the description field when the package is run). That also shows up blank.
    So I'm thinking it's my expression. All I do in the expression is set [SharePoint List Source].[SiteUrl] equal to @[User::PassSiteUrl] by dragging and dropping the variable into the expression box. Maybe the expression should be something else, or is there a way to say @[User::PassSiteUrl] = Dts.Variables("User::PassSiteUrl2").Value.ToString()?
    In my script task I use Dts.Variables("User::PassSiteUrl2").Value.ToString() to display
    the value in the message box and that works fine.
    Donald R. Landry

  • Owb3i data flow connectors problem in mapping editor

    Hi,
    To define mappings between source and target, the manual says:
    steps 1-3 and then,
    "Repeat steps one through three until you have created all the data flow
    connection appropriate for your situation."
    The manual also says:
    "To connect Mapping Operators, you draw lines from output attributes or output
    attribute groups to input attributes or groups between the operators."
    The question:
    When I draw lines for individual connections, it is fine. But I have a source/target with 201 columns, so I dragged my mouse between the groups. This created 201 additional attributes in the target. Am I doing it wrong or is it a 'feature'? How else can I make all 201 connections less painful?
    TIA.
    --Rao.

    sunil kumar wrote:
    Hi,
    >
    > As said above convert transfer rules to transformations (right click on transfer rules -> additional prop -> create transformation)
    >
    > Then right click on datasource -> Migrate
    >
    > After doing this, your datasource will be linked to Infocube directly with one transformations automatically.
    >
    > Regards,
    > Sunil
    Hi Sunil & Sravan,
    Thanks for your quick reply. As you said, I clicked on the transfer rules and created a transformation. As I can see in my system, I have a transformation from the data source to the InfoSource (i.e. starting with RSDS_) and then another transformation from the InfoSource to the InfoCube (i.e. starting with TRCS_). What I did was create a transformation by right-clicking on the InfoCube and giving the data source name as 2LIS_03_BF; this is a manual transformation that I am mapping manually. Is this the correct way? Please tell me how to do it,
    as I have to do this immediately.
    Thanks in advance

  • How can I create a template of a Data Flow Task in SSDT?

    The majority of the Data Flow Tasks that I build have the same five elements in them (OLE DB Source, Conditional Split, Derived Column, Data Conversion, and OLE DB Destination).  I'd love to have a template of a Data Flow Task which already contains
    these elements.  Is that possible for me to create, and how do I do it?
    I'm currently on SSDT 11.1, with Visual Studio Pro 2012.

    Hi Tim_Dorenkamp,
    If I understand correctly, you want to create a custom task and then add it as a template of a Data Flow Task in the SSIS Toolbox, so that you can use it in every package.
    Just as Samir suggested, we can develop a custom task by creating a class that inherits from the Task base class, applying the DtsTaskAttribute attribute to the new class, and overriding the important methods and properties of the base class, including the
    Execute method. A sample of creating a custom task is here for your reference:
    http://microsoft-ssis.blogspot.com/2013/06/create-your-own-custom-task.html
    Besides, as to your requirement, we can simply create a template package that includes this Data Flow Task, then use this template to create new packages. If we want to use the Data Flow Task in an existing package, we can add a new package with the template,
    then copy the Data Flow Task from the new package and paste it into the existing package. For more details about creating a template package in SSIS, please refer to the following thread:
    https://social.technet.microsoft.com/Forums/en-US/b0747467-daff-4fc8-9f81-20d68b38e5ee/change-default-for-package-protection-level?forum=sqlintegratio
    If there are any other questions, please feel free to ask.
    Thanks,
    Katherine Xiong
    Katherine Xiong
    TechNet Community Support

  • Data Flow from TXT to a table error

    Hello,
    I am trying to fill in the data from a .txt file I have into a table in a DB. Previously this worked fine in DTS, and I can still do it when I import the DTS command, but I want to update this to a data flow because the DTS command needs to be run on 32-bit
    and I'm using 64-bit.
    I'm getting 3 errors:
    [OLE DB Destination [322]] Error: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80040E21.
    An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 10.0"  Hresult: 0x80040E21  Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
    [OLE DB Destination [322]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "input "OLE DB Destination Input" (335)" failed because error code 0xC020907B occurred, and the error row disposition on "input "OLE
    DB Destination Input" (335)" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.
    [SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "OLE DB Destination" (322) failed with error code 0xC0209029 while processing input "OLE DB Destination Input" (335). The identified
    component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the
    failure.
    Before I changed the Flat File Source's input and output properties in the Advanced Editor to text stream [DT_TEXT] (because the table has VarChar), I also had another error, but this seems to be resolved. The only problem is that if I look at the mappings, the
    input is text stream [DT_TEXT] but the output is a string, and I am unable to change this in the Advanced Editor of the OLE DB Destination. I can change it, but it changes back on its own.
    Could I please get some help on these errors?
    Thanks

    Hi SQLNewbie101,
    According to your description, when you change column data type in the advanced editor of OLE DB Destination, it always changes back.
    Based on my research, the column data type is already determined by the destination table; it depends on the columns in the table, so we cannot change it.
    To fix this issue, one way, as you said, is to use a Data Conversion Transformation to convert the [DT_TEXT] data type to [DT_STR] after the Flat File Source. Another way is to directly change the column data type in the Advanced tab of the Flat File Connection Manager
    Editor, then double-click the Flat File Source to update the columns.
    If there are any other questions, please feel free to ask.
    Thanks,
    Katherine Xiong
    Katherine Xiong
    TechNet Community Support

  • Are unique GUIDs required in the data flow task in SSIS?

    Hi,
    I have previously used SQL Server 2008 R2 to develop packages. I was told not to copy and paste packages, or create templates with tasks on the data flow, because this leads to multiple packages with the same GUID; the consequence of this might be calamitous
    problems at runtime when the engine can't distinguish between tasks in different packages.
    My process was to create templates with Control Flow tasks only, manually create the data flow tasks, and then use BIDS Helper from CodePlex to regenerate GUIDs (this regenerated the GUIDs at the control flow level only).
    I am now using SQL 2012. I have looked and seen that if I copy and paste a package the GUIDs at the data flow level are also copied.  So my questions are:
    1) Does having the same ID in different dataflows cause problems as I was led to believe?
    2) If it does has this been addressed in some way in SSIS 2012? and what is the best way of working around this given I have many similar packages to develop?
    Thanks,
    Dan

    SSIS doesn't really lend itself well to creating templates.
    SQL 2012 is much improved when it comes to copying and pasting tasks, data flows and transformations between package designs, other than the usual issues with missing connection managers.

  • Tracing down data flow within a DB

    Hello,
    Please, many times I have wondered whether there is a practical way of doing what the subject describes. Namely:
    1) there is an app without source code on a production server, and
    2) a corresponding DB that is open, i.e. it is possible to go through all its objects, but there is a plethora of them.
    Question: upon a CLICK event on a control located on the app's interface, is there a practical way to follow the data entered in the interface controls and to trace down which SP(s) have been fired, etc.?
    The DB itself is very, very complex...
    How does one cope with such tasks?
    thanks
    bye

    On a production server, don't use SQL Profiler due to its performance impact.
    Use server-side tracing with caution as well:
    http://www.sqlusa.com/bestpractices/createtrace/
    BOL: http://msdn.microsoft.com/en-us/library/ms175047.aspx
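    As a lighter-weight complement (a suggestion, not something from the links above), the execution-statistics DMVs can show which stored procedures ran most recently, which helps narrow down what a given click fires. This only sees procedures whose plans are still cached and requires VIEW SERVER STATE permission:
        -- List the most recently executed stored procedures;
        -- 'YourAppDb' is a placeholder for the application's database name.
        SELECT TOP (20)
               OBJECT_NAME(ps.object_id, ps.database_id) AS procedure_name,
               ps.last_execution_time,
               ps.execution_count
        FROM   sys.dm_exec_procedure_stats AS ps
        WHERE  ps.database_id = DB_ID('YourAppDb')
        ORDER  BY ps.last_execution_time DESC;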
    Kalman Toth Database & OLAP Architect
    SQL Server 2014 Design & Programming
    New Book / Kindle: Exam 70-461 Bootcamp: Querying Microsoft SQL Server 2012
