OLEDB Source/Destination Metadata refresh

Hi Everyone,
Is there any way to turn off the case sensitivity of the OLEDB Source/Destination metadata? The package fails whenever the case of a column name changes; e.g., if RecordID becomes Recordid, the metadata needs to be refreshed.
I am aware that the metadata needs to be set before we can execute the DFT, but I am curious whether there is any way to turn off the case sensitivity.
Thanks and regards,
Jatin

Sadly, none that I am aware of.
The thing is, the package captures a snapshot of the metadata as a hash (I assume), so a change in case would invalidate this hash.
Arthur My Blog
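
One common workaround (a sketch, not a package setting — the view and column names below are illustrative) is to point the source at a view that aliases each column explicitly. The alias pins the name and case the package sees, so an upstream rename from RecordID to Recordid no longer invalidates the metadata (SQL Server resolves the identifier either way under a case-insensitive collation, while SSIS compares metadata case-sensitively):

```sql
-- Hypothetical stable-facade view for the OLE DB Source.
-- The alias fixes the exposed column case regardless of how the
-- base table's column is currently cased.
CREATE VIEW dbo.vStableSource
AS
SELECT RecordID     AS RecordID,      -- alias pins name and case
       CustomerName AS CustomerName,
       CreatedDate  AS CreatedDate
FROM dbo.SourceTable;
```

Point the OLE DB Source at the view instead of the table; only the view definition needs attention if the base columns are ever renamed.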

Similar Messages

  • Need to load data from an OLEDB source (SQL Server table) to an ODBC destination table

    I have around 700,000 records that need to be moved from a table in my SQL Server 2012 database to the DB3 database table. I am currently using an OLEDB source and an ADO.NET destination to do this
    transfer, but it is taking more than 2 hours. I need to get this done faster; there should be a way, but I am not able to get it done faster. Can anyone help, please?
    Thank you

    I suspect you are talking about DB2 database. In that case I would recommend you check the commercial COZYROC
    DB2 Destination component. It is 20x faster compared to the standard RBAR insertion.
    SSIS Tasks Components Scripts Services | http://www.cozyroc.com/

  • How to pass a variable for a SQL query in OLEDB source?

    Hi All,
    I am new to SSIS and have been working on it for the past few days. Can anyone please help me through a scenario where I need to pass a variable into the SQL statement of an OLEDB source connection? Please find the details below.
    eg:
    1) I have a SQL table with the columns SerialNumber, Name, IsValid, FileName with multiple rows.
    2) I have the file Name in a variable called Variable1.
    3) I want to read the data from my SQL table filtering based on the FileName (Variable1) within a data flow task and pull that data to the destination table.
    Question: In the data flow task, I added source and destination DB connections with a script component in between to perform my validations. When trying to retrieve the data from the source using the variable (i.e., a SQL query with a variable), I am not able to add
    the query because the SQL statement box is disabled. How do I filter the data based on the variable in the source DB?
    Any help/suggestions would be of great help.
    Thanks,
    Sri

    Just to add to Vaibhav's comment:
    SQL command: a SQL query, either with a SQL variable, any condition, or a simple SQL statement,
    like:
    Select * from dimcustomer
    SQL command from variable:
    Sometimes we design our dynamic query in a variable and use that variable name directly in the OLEDB source.
    If your SQL query needs a condition based on an SSIS variable,
    you can find an example here:
    http://www.toadworld.com/platforms/sql-server/b/weblog/archive/2013/01/17/ssis-replace-dynamic-sql-with-variables.aspx
    http://www.select-sql.com/mssql/how-to-use-a-variable-inside-sql-in-ssis-data-flow-tasks.html
    Thanks
    Please Mark This As Answer or vote for Helpful Post if this helps you to solve your question/problem. http://techequation.com
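
    For the scenario above, a parameterized query is usually the simplest route; the SQL statement box is typically disabled because the data access mode is still "Table or view" — switch it to "SQL command". A sketch, with the columns taken from the question and a hypothetical table name:

```sql
-- OLE DB Source, data access mode "SQL command".
-- The ? placeholder is mapped to User::Variable1 via the
-- Parameters... button in the source editor.
SELECT SerialNumber, Name, IsValid, FileName
FROM dbo.MyTable          -- hypothetical table name
WHERE FileName = ?;
```

    The ? syntax applies to the SQL Server OLE DB provider; other providers may expect different parameter markers.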

  • Please help - Can not use stored procedure with CTE and temp table in OLEDB source

    Hi,
       I am creating a simple package. It has an OLEDB source, a Derived Column transformation, and an OLEDB destination.
    Now, for the OLEDB source, I have a stored procedure with a CTE, and there are many temp tables inside it. When I call it like EXEC <Procedure name>, I get an error like ''The metadata could not be determined because statement with CTE.......uses
    temp table.''
    Please help me resolve this.

    You write to temp tables that get created at the time the procedure runs, I guess.
    Instead, take a staged approach: run an Execute SQL Task to populate them, then pull the data using the source.
    You must set RetainSameConnection to TRUE on the connection manager to be able to use the temp tables.
    Arthur My Blog
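
    A sketch of the staged approach (object names are illustrative). With RetainSameConnection=TRUE on the connection manager, a global temp table created in an Execute SQL Task is still there for the OLE DB Source in the following Data Flow, and — unlike a procedure's internal temp tables — its metadata can be determined:

```sql
-- Step 1: Execute SQL Task (same connection manager, RetainSameConnection=TRUE)
IF OBJECT_ID('tempdb..##Staged') IS NOT NULL
    DROP TABLE ##Staged;

SELECT CustomerID, OrderDate, Amount   -- hypothetical columns
INTO ##Staged
FROM dbo.Orders
WHERE OrderDate >= DATEADD(MONTH, -1, GETDATE());

-- Step 2: OLE DB Source in the Data Flow, data access mode "SQL command":
SELECT CustomerID, OrderDate, Amount
FROM ##Staged;
```

    A global (##) temp table is used so the source can see it across tasks; you may also need DelayValidation=True on the Data Flow so validation does not run before the table exists.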

  • OLEDB Source sql query with parameter

    I have a data flow in a For Each Loop container. The OLEDB source points to an Oracle database. The SQL query to fetch the data should take a variable value.
    Example: select * from tablename where colname = @variablevalue
    The variable value changes each time, since the data flow is in a For Each Loop container.
    Can anyone please help me configure the source?
    When I try to use ? I get the following error message
    Thanks,
    Rahul

    I can't troubleshoot that error as I don't have an Oracle source to test. See this -
    http://microsoftdw.blogspot.com/2005/11/parameterized-queries-against-oracle.html
    Try -
    1. Create an int variable vBatchIndex (this would be the parameter you are trying to pass) and provide the value.
    2. Create another string variable vSQLQuery > go to the variable properties > set the "EvaluateAsExpression" property to "True".
    3. Under the variable properties > go to Expression > and provide this expression:
    "SELECT * FROM fc_batch_report WHERE batch_index = " + (DT_WSTR, 10) @[User::vBatchIndex]
    4. Next, go to the OLE DB Source > select the data access mode "SQL command from variable".
    5. Select the variable name vSQLQuery.
    Narsimha

  • Pass a variable value to SQL Command in OLEDB Source

    Hi,
      I have an OLEdb Source with SQL Command as the Data Access Mode. Below is the sample query I have in it.
    DECLARE @MonthOffSet int = 24
    DECLARE @PaidDate_SK_Low datetime = dateadd(mm,MONTH(getdate())-@MonthOffSet-1,dateadd(year,datediff(year,0,dateadd(YY,0,getdate())),0))
    DECLARE @PaidDate_SK_High datetime = dateadd(dd,-1,dateadd(MM,@MonthOffSet,@PaidDate_SK_Low))
    followed by a select statement which has a where clause.
     Instead of hard-coding the value 24, I am trying to get the value from a variable. I know there is a limitation that parameters can be added only in the where clause. Is there any workaround or solution?

    But I need to use the first four lines of code in a lot of other packages, which have different SQL statements. And if I make the whole query a variable, then I hit the 4000-character length limitation.
    DECLARE @MonthOffSet int = 24
    DECLARE @PaidDate_SK_Low datetime = dateadd(mm,MONTH(getdate())-@MonthOffSet-1,dateadd(year,datediff(year,0,dateadd(YY,0,getdate())),0))
    DECLARE @PaidDate_SK_High datetime = dateadd(dd,-1,dateadd(MM,@MonthOffSet,@PaidDate_SK_Low))
    In my select statement's where clause I am using the date range between @PaidDate_SK_Low and @PaidDate_SK_High.
    Any suggestions
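
    One common workaround (a sketch; the procedure, table, and column names are hypothetical) is to move the shared date logic into a stored procedure that takes the offset as a parameter. The OLE DB Source then uses a plain parameterized EXEC, which sidesteps both the where-clause-only parameter limitation and the 4000-character expression limit, and the four DECLARE lines live in exactly one place:

```sql
-- Hypothetical wrapper procedure; each package passes its own offset.
CREATE PROCEDURE dbo.usp_GetPaidData
    @MonthOffSet int
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @PaidDate_SK_Low datetime =
        dateadd(mm, MONTH(getdate()) - @MonthOffSet - 1,
                dateadd(year, datediff(year, 0, dateadd(YY, 0, getdate())), 0));
    DECLARE @PaidDate_SK_High datetime =
        dateadd(dd, -1, dateadd(MM, @MonthOffSet, @PaidDate_SK_Low));

    -- Package-specific select goes here; illustrative columns/table:
    SELECT PaidDate, Amount
    FROM dbo.Claims
    WHERE PaidDate BETWEEN @PaidDate_SK_Low AND @PaidDate_SK_High;
END
```

    In the OLE DB Source (data access mode "SQL command") the call is then simply `EXEC dbo.usp_GetPaidData ?` with the ? mapped to an SSIS variable.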

  • SSIS 2012 ETL is failing only at one server (no BIDS) but running successfully from BIDS on a different server. In this ETL, I have used a Stored Procedure in the OLEDB Source.

    Hi Guys,
    SSIS 2012 ETL is failing only at one server (no BIDS) but running successfully from BIDS on a different server. In this ETL, I have used a Stored Procedure in the OLEDB Source.
    Note: I have a couple of ETLs developed in 2005 using the same logic and upgraded to 2012, working perfectly.
    I am getting the error message:
    SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
    An OLE DB record is available. Source: "Microsoft OLE DB Provider for SQL Server" Hresult: 0x80004005 Description: "Error converting data type varchar to datetime.".
    Unable to retrieve column information from the data source. Make sure your target table in the database is available.
    "OLE DB Source" failed validation and returned validation status "VS_ISBROKEN".
    I tried the workaround below and found it works perfectly:
    I loaded the data into a table (dbo.TEMP) using the stored procedure, then used this dbo.TEMP table in the OLEDB source, and found no issue.
    My SP details (this is the SP I am calling in the OLEDB source of the ETL): when I run it from one server it works fine, and when I run it from the ETL-dedicated server I get the error. Guys, help me out.
    USE [TEST_DB]
    GO
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    ALTER PROCEDURE [DBO].[SP_TEST]
    --EXEC [DBO].[SP_TEST] '2014-09-30','2014-10-01'
        @FROMDATETIME DATETIME,
        @TODATETIME DATETIME
    AS
    SET NOCOUNT ON
    BEGIN
        DECLARE @FROMDATEKEY INT, @TODATEKEY INT
        SET @FROMDATEKEY = CONVERT(VARCHAR(10), @FROMDATETIME, 112)
        SET @TODATEKEY   = CONVERT(VARCHAR(10), @TODATETIME, 112)
        IF 1 = 1
        BEGIN
            SELECT CAST(NULL AS DATETIME)    AS TXN_DATE
                 , CAST(NULL AS DATETIME)    AS PROCESS_DATE
                 , CAST(NULL AS money)       AS S1_AMT
                 , CAST(NULL AS money)       AS S2_AMOUNT
                 , CAST(NULL AS money)       AS S2_INVALID_AMOUNT
                 , CAST(NULL AS money)       AS INVALID_MOVED_IN_VALID_S2_AMOUNT
                 , CAST(NULL AS VARCHAR(20)) AS SYSTEM_ID
                 , CAST(NULL AS money)       AS S3_AMT
        END
        SELECT TXN_DATE, PROCESS_DATE, S1_AMT, S2_AMOUNT, S2_INVALID_AMOUNT,
               INVALID_MOVED_IN_VALID_S2_AMOUNT, SYSTEM_ID, S3_AMT
        FROM DBO.TABLE_1
        WHERE TNX_DATE_KEY BETWEEN @FROMDATEKEY AND @TODATEKEY
        UNION ALL
        SELECT TXN_DATE, PROCESS_DATE, S1_AMT, S2_AMOUNT, S2_INVALID_AMOUNT,
               INVALID_MOVED_IN_VALID_S2_AMOUNT, SYSTEM_ID, S3_AMT
        FROM DBO.TABLE_2
        WHERE TNX_DATE_KEY BETWEEN @FROMDATEKEY AND @TODATEKEY
        UNION ALL
        SELECT TXN_DATE, PROCESS_DATE, S1_AMT, S2_AMOUNT, S2_INVALID_AMOUNT,
               INVALID_MOVED_IN_VALID_S2_AMOUNT, SYSTEM_ID, S3_AMT
        FROM DBO.TABLE_3
        WHERE TNX_DATE_KEY BETWEEN @FROMDATEKEY AND @TODATEKEY
    END
    Data access mode: SQL command from variable
    "EXEC [DBO].[SP_TEST] '" + (DT_WSTR, 24) @[User::V_EXTRACT_FROM_DT] + "','" + (DT_WSTR, 24) @[User::V_EXTRACT_TO_DT] + "'"
    where the variables @[User::V_EXTRACT_FROM_DT] and @[User::V_EXTRACT_TO_DT] are defined as DATETIME.
    Thanks Shiven:) If Answer is Helpful, Please Vote

    Hi,
    Yes, you are right. On the server where I was getting the error, the DateTime was in the USA format, and where it ran successfully it was in the AUS format.
    I changed from USA to AUS and made another change:
    Data access mode: SQL command
    EXEC [DBO].[SP_TEST] @FROMDATETIME = ?, @TODATETIME = ?
    and it is working fine.
    Thanks Shiven:) If Answer is Helpful, Please Vote

  • Issue with OLEDB Source in BIML

    Hi all,
    I am trying to automate a data flow with BIML. I am using an expression to build my SQL dynamically based on an input parameter. Sparing the
    details of the use case, I need this flexibility in my project. I have one master table which contains the file name and the source query.
    I want to use the "SQL command from variable" data access mode in the OLEDB source.
    Based on the input file, the source query needs to change automatically. However, my query is not being evaluated when the package
    is generated. The query populates after package generation, when I open the source and set the access mode to SQL command, but I cannot seem to get this configured automatically as desired. This is preventing me from doing transformations in the script.
    Please help me achieve the above scenario.
    Below is my BIML script (C#).
    <Biml xmlns="http://schemas.varigence.com/biml.xsd">
      <!-- Database connection managers -->
      <Connections>
        <Connection Name="Archive" ConnectionString="Data Source=RLDEVOLP03.DEVELOPMENT.LOCAL;Initial Catalog=Archive;Provider=SQLNCLI11.1;Integrated Security=SSPI;Auto Translate=False;" />
        <Connection Name="DataStaging" ConnectionString="Data Source=RLDEVOLP03.DEVELOPMENT.LOCAL;Initial Catalog=DataStaging;Provider=SQLNCLI11.1;Integrated Security=SSPI;Auto Translate=False;" />
      </Connections>
      <!-- Name of the package -->
      <Packages>
        <Package Name="LoadArchive Using BIML" ConstraintMode="Linear" ProtectionLevel="EncryptSensitiveWithUserKey">
          <Variables>
            <Variable Name="V_Archive_tablename" DataType="String"></Variable>
            <Variable Name="V_Archivequery" DataType="String" EvaluateAsExpression="true">SELECT a.*, b.BBxKey as Archive_BBxKey, b.RowChecksum as Archive_RowChecksum
    FROM dbo.ImportBBxFbapp a LEFT OUTER JOIN Archive.dbo.ArchiveBBxFbapp b ON a.Col001 = b.BBxKey Where (b.LatestVersion = 1 OR b.LatestVersion IS NULL)</Variable>
            <Variable Name="v_Src_FileName" DataType="String">FBAPP</Variable>
          </Variables>
          <Tasks>
            <!-- Load Data Truncate Staging sequence container -->
            <Container Name="Load Data Truncate Staging" ConstraintMode="Parallel">
              <Tasks>
                <Dataflow Name="Archive Data" DelayValidation="true">
                  <Transformations>
                    <OleDbSource Name="Source" ConnectionName="DataStaging" ValidateExternalMetadata="false">
                      <TableFromVariableInput VariableName="User.V_Archivequery" />
                    </OleDbSource>
                  </Transformations>
                </Dataflow>
              </Tasks>
            </Container>
          </Tasks>
        </Package>
      </Packages>
    </Biml>
    Regards,
    Vipin jha
    Thankx & regards, Vipin jha MCP

    Hi Vipin,
    very few people would respond here, why don't you post to the BIML dedicated forum: http://varigence.com/Forums?forumName=Biml&threadID=0
    Arthur
    MyBlog
    Twitter

  • Pass Parameter to IN Operator in SQL Command in OLEDB source

    Hi,
    I am trying to pass multiple values as one parameter. E.g., I need to get the employees with gender in ('M','F').
    query: select eno, ename, sal, gender from emp where gender in (?)
    I created a variable 'varGender' with the value 'M','F'
    and passed the parameter mapping in the OLEDB Source. But this is not returning any records.
    Please advise.
    Thanks.

    Hi Vaibhav,
    Thanks for your kind reply.
    I need to pass the same where condition to multiple source queries, and we don't want to repeat the same condition in all queries, so we want to replace it with a variable. Also the query is too big to keep in a variable. Please help me.
    Thanks
    Put the required filter values in a staging table and then use it in a join condition with all the other tables in all the other tasks.
    Please Mark This As Answer if it solved your issue
    Please Vote This As Helpful if it helps to solve your issue
    Visakh
    My Wiki User Page
    My MSDN Page
    My Personal Blog
    My Facebook Page
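
    A sketch of the staging-table approach Visakh describes (table names are illustrative). It also explains why IN (?) returned nothing: a ? placeholder is bound as a single scalar, so the query compared gender against the one literal string 'M','F' rather than two values. A join against staged filter values avoids parameter binding entirely:

```sql
-- Execute SQL Task (or one-time setup): stage the filter values.
CREATE TABLE dbo.FilterGender (gender char(1) PRIMARY KEY);
INSERT INTO dbo.FilterGender (gender) VALUES ('M'), ('F');

-- OLE DB Source query: the join replaces WHERE gender IN (...),
-- and the same staging table can be reused by every source query.
SELECT e.eno, e.ename, e.sal, e.gender
FROM emp AS e
INNER JOIN dbo.FilterGender AS f
    ON f.gender = e.gender;
```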

  • Clean solution to use Database Links in OLEDB source

    Hi,
    I have a connection to server A, and from there I have access to a set of views that are behind a database link.
    All the queries I run have the following form:
    SELECT CODE, DESCRIPTION
    FROM STUDENT@DBLINK_DEV
    but then in production it will be
    SELECT CODE, DESCRIPTION
    FROM STUDENT@DBLINK_PRD
    Is there any clean solution to have the dblink in a parameter without having to use a SQL statement inside a package variable?
    Thank you

     I dislike the fact that I cannot simply copy-paste the code into the SQL query of the OLEDB source.
    Why do you even have to do that?
    The variable takes care of the query by itself by means of an expression, so all you do is set the expression
    and map the variable.
    Please Mark This As Answer if it solved your issue
    Please Vote This As Helpful if it helps to solve your issue
    Visakh
    My Wiki User Page
    My MSDN Page
    My Personal Blog
    My Facebook Page
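
    If changing the database side is an option, an alternative clean solution (a sketch; the synonym name is illustrative) is an Oracle synonym per environment, so the package query is literally identical in DEV and PRD and no variable or expression is needed at all:

```sql
-- Run once per environment, pointing at that environment's link:
--   DEV: CREATE SYNONYM STUDENT_LNK FOR STUDENT@DBLINK_DEV;
--   PRD: CREATE SYNONYM STUDENT_LNK FOR STUDENT@DBLINK_PRD;
CREATE SYNONYM STUDENT_LNK FOR STUDENT@DBLINK_DEV;

-- The OLE DB Source query is then environment-independent:
SELECT CODE, DESCRIPTION
FROM STUDENT_LNK;
```

    The environment difference moves out of the package and into a one-time database object, which also keeps the source query copy-pasteable.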

  • Choose CAF source for metadata???

    Hi,
    I’m trying to create my second CAF application on NWDS 7.0, after a successful first application.
    Everything looks fine; I’m able to create Entity Services and Application Services, Generate All Project Code, Validate, Build, and Deploy to the J2EE Engine, all with success. The problem is when testing the developed Application Service.
    Right-clicking on the developed Application Service and selecting Test opens an Internet Explorer window pointing to the URL http://localhost:50100/webdynpro/dispatcher/sap.com/cafUIservicebrowser/ServiceBrowser?cafsource=true&service=sap.com/test/testeapp; after logging on as administrator, it shows a screen displaying:
    “Choose CAF source for metadata or define ABAP parameters and choose ABAP source: …”
    I select the CAF Source button, but an error message saying “Error when getting service façade: Service Manager Initialization failed illegal argument exception: Unable to create javax.ejb.EJBObject” is displayed.
    What’s wrong? After this, I can’t even test my first CAF application again. It looks like the CAF metadata database has been corrupted.
    How can I restore the system without having to install NW04s again?
    Thanks and Regards,
    Paul Croft

    Hi,
    Once again, thanks for the explanation, but maybe I didn't explain well.
    After installing the NW04s Sneak Preview, I was able to create and test a CAF application (the Betting Pool example from SDN) with success.
    Then I tried to create a new one myself, similar to the previous one, and was able to compile, generate code, and deploy with success, but when trying to test the entity service, a screen displaying “Choose CAF source for metadata or define ABAP parameters and choose ABAP source: …” appears.
    Since then, I’m not able to test any CAF application, even the first one, which already worked.
    This means that after deploying the second application, something gets corrupted. That is what I’m trying to find out.
    I confirmed the CAFAdmin and CAFUIAdmin roles are configured as expected, not least because the first CAF application already worked.
    Thanks and Regards,
    Paul Croft

  • Cannot load source/destination schema - on a .BTM file

    I'm converting a project originally developed in VS2008/BTS 2009 to VS2013/BTS2013R2.
    I get this error in my MAP file while building:
    Exception Caught: Cannot load source/destination schema: MyCompany.Schema. Either the file/type does not exist, or if a project dependency exists, the dependent project is not built.
    Why is this? I've tried GACing, re-GACing, reloading the source and target schemas, etc. None of the commonly suggested solutions worked for this error.
    We contacted MSFT support; after trying various approaches, the engineer finally recommended changing the Build Action from BTSCompile to None for the .BTM file. Apparently, this fixed the build error. But I feel like this is just masking the real problem.
    What is the true remedy for the above error occurring in a .BTM file? Thank you for any help.
    -Perennial Newbie-

    Since you have a case open with Microsoft, it would be better to get the root cause from them.
    Any suggestion from us would be speculation and may not be the exact reason for your issue; the more appropriate/relevant solution
    would come from Microsoft support, as they are already on top of your issue, will have taken logs, and know the complete background.
    If this answers your question please mark it accordingly. If this post is helpful, please vote as helpful by clicking the upward arrow mark next to my reply.

  • How to make the Source Destination work correctly from FCP

    Whenever I use FCP6 to export via Compressor 3, it says it will generate the compressed file to my "source" destination. If I use Compressor as a stand-alone app and drag a video file into it, it does indeed place the compressed video in the same directory as the source video. But from FCP it always just puts it on the root of my main system drive!
    I know I can alter the destination by dragging options from the Destination tab, but is there any way to tell Compressor to use the "actual" source of the video file when coming via FCP, that can be permanently set?
    Kevin

    is there any way to tell Compressor to use the "actual" source of the video file when coming via FCP that can be permanently set?
    Not really, because of the many different places that "source" video can reside for FCP sequences. Many projects I work on have many different source locations. So, how does Compressor know exactly what source location to save to? That's why it defaults to the root directory.
    The story is different for an already exported file, Compressor knows where that file is located.
    If you want Compressor to save in a specific location, you've already figured out how to do that, either create a destination and drag it to your job, or set it manually.

  • SDK-UI - Control of the Metadata Refresh Option

    When creating metadata, the SBO client needs to do a metadata refresh in order to see new data. In a developer's world that is no big problem, but to the customer it means that every time something must be changed, they need to be aware of the refresh window (since it closes all windows). Also, we can't trigger the metadata refresh form (we only have the option to suppress it while creating data). We should at least have the option to trigger it, but the best solution would of course be to not need it at all.


  • Weblogic 9.2 distributed source destination problem.

    (Tom, posting this here as you monitor this one.) Answering your query as well.
    We are using WebLogic 9.2 and have configured a Messaging Bridge (to connect to WebLogic 8.1). We have a distributed queue defined and 2 managed servers. jms-xa-rar deployed correctly and passes the test.
    In the Source Bridge Destination, the ConnectionURL is given as "t3://ms1:54032,ms2:54034".
    Initially the bridge was not started and both managed servers were shut down. Then we brought up the 2 managed servers one by one, and found that one member of the source destination on ms2 had a consumer while the destination member on ms1 had no consumers. When turning on debug for MessagingBridge, we found these exceptions.
    <29-Mar-2010 17:07:23 o'clock BST> <Warning> <Connector> <BEA-190032> << eis/jms/WLSConnectionFactoryJNDIXA > ResourceAllocationException thrown by resource adapter on call to ManagedConnectionFactory.createManagedConnection(): "javax.resource.ResourceException: ConnectionFactory: failed to get initial context (InitialContextFactory =weblogic.jndi.WLInitialContextFactory, url = t3://ms1:54032,ms2:54034, user name = null) ">
    <29-Mar-2010 17:07:23 o'clock BST> <Info> <MessagingBridge> <BEA-200042> <Bridge com.bridge.somename failed to connect to the source destination and will try again in 60 seconds. This could be a temporary condition unless the messages repeat several times. (javax.resource.ResourceException: ConnectionFactory: failed to get initial context (InitialContextFactory =weblogic.jndi.WLInitialContextFactory, url = t3://ms1:54032,ms2:54034, user name = null))>
    The whole setup works with just one managed server given in ConnectionURL i.e. "t3://ms1:54032"
    Why is it unable to find initial context when we give a cluster JNDI for ConnectionURL in source bridge destination?
    1) With ONLY ms1 as the connection url for source bridge destination, it works fine and I can see 2 consumers on the member ms1 destination (no consumers on ms2 as expected)
    2) With ONLY ms2 as the connection url for source bridge destination, it works only with MS2. I can see 1 consumer on member destination ms2. Nothing shows up on ms1. when starting ms1, it did show some messages given below. I could not see JMS Server recognizing ms1 as Active Destination either even though server started up.
    3) With ms1 and ms2 in connection url, one seems to connect correctly at times but never both (never gotten consumers listed on both managed servers).
    <31-Mar-2010 07:05:34 o'clock BST> <Info> <MessagingBridge> <BEA-200021> <Bridge "com.somebridge" failed to get one of the adapters from JNDI (javax.naming.NameNotFoundException: Unable to resolve 'eis.jms.WLSConnectionFactoryJNDIXA'. Resolved 'eis.jms'; remaining name 'WLSConnectionFactoryJNDIXA').
    javax.naming.NameNotFoundException: Unable to resolve 'eis.jms.WLSConnectionFactoryJNDIXA'. Resolved 'eis.jms'; remaining name 'WLSConnectionFactoryJNDIXA'
    at weblogic.jndi.internal.BasicNamingNode.newNameNotFoundException(BasicNamingNode.java:1139)
    at weblogic.jndi.internal.BasicNamingNode.lookupHere(BasicNamingNode.java:252)
    at weblogic.jndi.internal.ServerNamingNode.lookupHere(ServerNamingNode.java:171)
    at weblogic.jndi.internal.BasicNamingNode.lookup(BasicNamingNode.java:206)
    at weblogic.jndi.internal.BasicNamingNode.lookup(BasicNamingNode.java:214)
    at weblogic.jndi.internal.BasicNamingNode.lookup(BasicNamingNode.java:214)
    at weblogic.jndi.internal.WLEventContextImpl.lookup(WLEventContextImpl.java:269)
    at weblogic.jndi.internal.WLContextImpl.lookup(WLContextImpl.java:362)
    at weblogic.jms.bridge.internal.MessagingBridge.startInternal(MessagingBridge.java:562)
    at weblogic.jms.bridge.internal.MessagingBridge.run(MessagingBridge.java:1030)
    at weblogic.work.ExecuteRequestAdapter.execute(ExecuteRequestAdapter.java:21)
    at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:145)
    at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:117)
    >
    <31-Mar-2010 07:16:28 o'clock BST> <Info> <MessagingBridge> <BEA-200033> <Bridge "com.somebridge" is obtaining connections to the two adapters.>
    <31-Mar-2010 07:16:28 o'clock BST> <Info> <Common> <BEA-000628> <Created "1" resources for pool "eis/jms/WLSConnectionFactoryJNDIXA", out of which "1" are available and "0" are unavailable.>
    <31-Mar-2010 07:16:28 o'clock BST> <Info> <Common> <BEA-000628> <Created "1" resources for pool "eis/jms/WLSConnectionFactoryJNDIXA", out of which "1" are available and "0" are unavailable.>
    <31-Mar-2010 07:12:13 o'clock BST> <Debug> <JMSBackEnd> <000000> <Starting blocking receive for consumer>
    <31-Mar-2010 07:12:13 o'clock BST> <Debug> <JMSBackEnd> <000000> <Blocking receive request: state = 101>
    <31-Mar-2010 07:12:13 o'clock BST> <Debug> <JMSBackEnd> <000000> <Associating message with transaction>

    Tom,
    Please find responses to your queries below
    Can you help me understand your topology?
    -- Is the bridge running on the same server/cluster as its source destination? Its target destination? Neither?
    Bridge is running on same cluster as the source destination, there are 2 managed servers in cluster.
    -- If neither, is it running on 8.1 or 9.1?
    Bridge running on Weblogic 9.2.2.
    -- Is the source destination distributed? Is it on 8.1 or 9.1?
    source = Weblogic 9.2.2, yes source destination is a distributed queue (uniform distributed with member allocated evenly)
    -- Is the target destination distributed? Is it on 8.1 or 9.1?
    target destination is 8.1 and is distributed target as well. This is on a remote machine.
    -- What is the bridge's URL for the source destination?
    http://ms1:50128,ms2:50130 (I may not be able to provide details on actual IP or DNS, apologies)
    -- What is the bridge's URL for the target destination?
    http://r1:40888,r2:40890
    More inputs
    The connection factory on the message producer side i.e my end is Server Affinity while the connection factory on target side is Load Balanced
    JMSBackendDebug on, I can actually see that 2 consumers are being created on each ms
    Managed Server 1
    <01-Apr-2010 11:32:06 o'clock BST> <Debug> <JMSBackEnd> <000000> <New consumer for jms_module!jms.server.ms1@queue_name>
    <01-Apr-2010 11:32:06 o'clock BST> <Debug> <JMSBackEnd> <000000> <Calling out to DDMember: jms_module!jms.server.ms1@queue_name, hash: 335836273, dispId: managed1_server, backEndId: <6772957712843200991.10>:managed1_server, destinationId: <6772957712843200991.12>, remoteSecurityMode: 11>
    <01-Apr-2010 11:32:06 o'clock BST> <Debug> <JMSBackEnd> <000000> <Created a new consumer with ID <6772957712843200991.16> on queue jms_module!jms.server.ms1@queue_name>
    <01-Apr-2010 11:32:06 o'clock BST> <Debug> <JMSBackEnd> <000000> <Back from DDMember: jms_module!jms.server.ms1@queue_name, hash: 335836273, dispId: managed1_server, backEndId: <6772957712843200991.10>:managed1_server, destinationId: <6772957712843200991.12>, remoteSecurityMode: 11>
    <01-Apr-2010 11:32:06 o'clock BST> <Debug> <JMSBackEnd> <000000> <Calling out to DDHandler: queue_name, hash: 29969233, dd: jms_module!queue_name>
    <01-Apr-2010 11:32:06 o'clock BST> <Debug> <JMSBackEnd> <000000> <Calling out to FEDDHandler: queue_name, hash: 23986467>
    <01-Apr-2010 11:32:06 o'clock BST> <Debug> <JMSBackEnd> <000000> <Back from FEDDHandler: queue_name, hash: 23986467>
    <01-Apr-2010 11:32:06 o'clock BST> <Debug> <JMSBackEnd> <000000> <Calling out to QueueForwardingManager: jms_module!jms.server.ms1@queue_name within jms_module!queue_name, hash: 25321487>
    Managed Server 2
    <01-Apr-2010 11:32:06 o'clock BST> <Debug> <JMSBackEnd> <000000> <New consumer for jms_module!jms.server.ms2@queue_name>
    <01-Apr-2010 11:32:06 o'clock BST> <Debug> <JMSBackEnd> <000000> <Calling out to DDMember: jms_module!jms.server.ms2@queue_name, hash: -112860366, dispId: managed2_server, backEndId: <4215229783845568768.10>:managed2_server, destinationId: <4215229783845568768.12>, remoteSecurityMode: 11>
    <01-Apr-2010 11:32:06 o'clock BST> <Debug> <JMSBackEnd> <000000> <Created a new consumer with ID <4215229783845568768.16> on queue jms_module!jms.server.ms2@queue_name>
    <01-Apr-2010 11:32:06 o'clock BST> <Debug> <JMSBackEnd> <000000> <Back from DDMember: jms_module!jms.server.ms2@queue_name, hash: -112860366, dispId: managed2_server, backEndId: <4215229783845568768.10>:managed2_server, destinationId: <4215229783845568768.12>, remoteSecurityMode: 11>
    <01-Apr-2010 11:32:06 o'clock BST> <Debug> <JMSBackEnd> <000000> <Calling out to DDMember: jms_module!jms.server.ms1@queue_name, hash: 335836273, dispId: managed1_server, backEndId: <6772957712843200991.10>:managed1_server, destinationId: <6772957712843200991.12>, remoteSecurityMode: 11>
    <01-Apr-2010 11:32:06 o'clock BST> <Debug> <JMSBackEnd> <000000> <Back from DDMember: jms_module!jms.server.ms1@queue_name, hash: 335836273, dispId: managed1_server, backEndId: <6772957712843200991.10>:managed1_server, destinationId: <6772957712843200991.12>, remoteSecurityMode: 11>
    <01-Apr-2010 11:32:06 o'clock BST> <Debug> <JMSBackEnd> <000000> <Calling out to DDHandler: jms_module!queue_name, hash: 24464810, dd: jms_module!queue_name>
    <01-Apr-2010 11:32:06 o'clock BST> <Debug> <JMSBackEnd> <000000> <Calling out to FEDDHandler: jms_module!queue_name, hash: 10346403>
    Edited by: user9521457 on Apr 1, 2010 3:49 AM
