Integration Services - Dimension Table

I'm trying to build a dimension table for a star schema to be used with Essbase Integration Services. I need to know how to structure the table when the data in the fact table is at a more granular level than that of the Essbase leaf member. The typical structure of my dimension tables is: [Leaf Node Name], [Leaf Node Alias], [Gen01], [Gen02], etc., where the leaf node name is also the level of detail in my fact table.
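For illustration only (the table name, column names, and data types below are hypothetical, Oracle-style DDL), the structure being described might look roughly like this when the fact table joins at a key one level below the Essbase leaf:

-- Hypothetical dimension table: the fact table joins on fact_detail_key,
-- which is more granular than the Essbase level-0 member (leaf_node_name).
CREATE TABLE dim_product (
    fact_detail_key   VARCHAR2(30) NOT NULL,  -- grain of the fact table
    leaf_node_name    VARCHAR2(80) NOT NULL,  -- Essbase leaf (level-0) member
    leaf_node_alias   VARCHAR2(80),
    gen02             VARCHAR2(80),           -- parent generation
    gen01             VARCHAR2(80)            -- top of the hierarchy
);
-- EIS builds the outline down to leaf_node_name; fact_detail_key is used
-- only for the fact join (and for drill-through back to the detail rows).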

Do remember that bitmap indexes are best on columns that do not have many unique values; everyone has differing opinions on what that number ought to be. In other words, the cardinality, or number of unique elements, should not be extremely high.
In general, bitmap indexes will improve performance, provided that the CBO is actually using them.
Have a look at this post; it is a pretty good exploration of this sort of thing. It tends to be a design process, where you test with your given dataset and optimize from there:
http://www.rittmanmead.com/2007/07/27/playing-around-with-star-transformations-and-bitmap-indexes/
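As an illustration only (the table, column, and index names here are hypothetical), a bitmap index on a low-cardinality dimension column in Oracle, plus a quick check that the CBO actually picks it up, might look like this:

-- Bitmap index on a low-cardinality dimension column (hypothetical names)
CREATE BITMAP INDEX dim_customer_region_bix
    ON dim_customer (region_code);

-- Confirm the optimizer uses it for a star-style query
EXPLAIN PLAN FOR
SELECT SUM(f.sales_amt)
  FROM fact_sales f
  JOIN dim_customer c ON c.customer_key = f.customer_key
 WHERE c.region_code = 'EMEA';

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);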
-Greg

Similar Messages

  • Load hierarchical attribute dimensions with Integration Services?

    Hi everybody,
    I need to load a product dimension which is organized in a relational table like this:
    Product (parent_code, member_code, member_alias, brand, consolidation, formula)
    Every product has a brand and I need to load brand as an attribute dimension. The thing is, Brand is not flat; it has its own hierarchy. For Brand I have another relational table where data is organized parent-child like this:
    Brand (brand_parent_code, brand_child_code, brand_child_alias).
    I have used Integration Services in the past, but with flat attribute dimensions.
    Can I load hierarchical attribute dimensions with Integration Services? If yes, how do I do it and how do I specify the hierarchy?
    Thank you,
    Daniela

  • Attribute Dimensions with Integration Services

    Good evening, I'm creating an ASO Essbase model through Integration Services v9.3.1 and I am having difficulties with creating an Attribute dimension that contains a hierarchy.
    We have a list of stock codes which are grouped by vendor in a dimension; each stock code has an attribute assigned to it for the responsible buyer.
    The business requirement is to start from the top level of the attribute dimension, drill into categories, then into teams, and finally to the buyer, so we effectively require 3 levels within the attribute dimension.
    I have successfully created both the vendor (physical) and buyer (attribute) dimensions through EIS without problems, but when I load to Essbase all of the buyers are added at the top level of the dimension and I can't seem to specify that there should be a parent/child hierarchy to build the attribute dimension from.
    Is this possible, and if so any advice you can give as to how I achieve this would be greatly appreciated.
    Thanks in advance,
    Graham

    Graham,
    This is definitely a supported feature in EIS/9.3.1/ASO. I have many models with this type of structure. How you set it up can vary. Usually my attribute hierarchies are not that deep, only two to three levels, maybe four in a rare case, so I don't usually use a parent-child table to set up the hierarchy (I'm not saying that it won't work; it might, I haven't tried, but the same steps should apply). In a typical model I will have my stock table, which has a buyer field. Then in another table I will have my attribute structure, which will have columns for buyer, team, and category.
    In the EIS OLAP model, you add your attribute hierarchy table and use a join to link it to the main stock table, joining on the buyer field (you are now going from a "star" schema to a "snowflake"). Go into the properties and make sure you define all the columns as "Attributes".
    Then in the Metadata model, drag your categories attribute onto the outline, then drag teams and set it as a child of categories, and finally drag buyer and set it as a child of teams. You only set the attribute association for the buyer back to the base dimension.
    When you run your dimension build, it will set up your attribute dimension correctly.
    Some things to keep in mind: make sure you have a process that ensures that for every stock code you have in the main table, you have a matching one in your attribute dimension table.
    Sometimes, depending on how much manipulation I need to do, instead of joining the tables in EIS I will go back to the relational source and create a view that joins the two tables together. Then in my OLAP model I have one table with three attribute columns: one for the buyer and the other two for team and category. From that point, setting up the metadata model is the same.
    Good luck, let me know if you run into trouble.
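    A minimal sketch of the kind of relational view described above (the view, table, and column names are hypothetical), giving EIS a single table with the buyer plus its team and category attribute columns:
    -- Hypothetical view: one row per stock code with the buyer attribute
    -- and its team/category rollup flattened onto the same row.
    CREATE OR REPLACE VIEW v_stock_attributes AS
    SELECT s.stock_code,
           s.buyer,
           a.team,
           a.category
      FROM stock s
      JOIN buyer_attributes a
        ON a.buyer = s.buyer;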

  • Help On Integration Services

    Hi all.
    I'm trying to load members into the CostCenter dimension using Integration Services [Release: 11.1.1.3.0 (Build EIS111130B021)].
    This dimension has two hierarchies:
    * Hier 1 [stored]
    * Hier 2 [with shared members]
    On the first load, from the relational lookup table to the Hyperion CostCenter dimension, everything goes just fine, including shared member creation.
    This is the result
    hier 1
    ====
    Branch 01
    |-costcenter 20
    |-costcenter 30
    |-costcenter 40
    |-costcenter 50
    Branch 02
    |-costcenter 60
    |-costcenter 70
    |-costcenter 80
    hier 2
    ====
    Chief010 (Mr. Smith)
    |-costcenter 20 [shared]
    |-costcenter 30 [shared]
    |-costcenter 40 [shared]
    Chief030 (Mr. Green)
    |-costcenter 50 [shared]
    |-costcenter 60 [shared]
    |-costcenter 70 [shared]
    |-costcenter 80 [shared]
    After a company cost center reorganization, I need to reload the updated hierarchies into Essbase via Integration Services.
    The new hier 1 and hier 2 structure should be like this:
    hier 1
    ====
    Branch 01 (empty)
    Branch 01.01
    |-costcenter 10 [new]
    |-costcenter 20
    |-costcenter 30
    |-costcenter 40
    Branch 02
    |-costcenter 50
    |-costcenter 60
    |-costcenter 70
    |-costcenter 80
    but what I actually get is:
    hier 1
    ====
    Branch 01
    |-costcenter 20
    |-costcenter 30
    |-costcenter 40
    |-costcenter 50
    Branch 01.01
    |-costcenter 10 [new]
    |-costcenter 20 [shared]
    |-costcenter 30 [shared]
    |-costcenter 40 [shared]
    |-costcenter 50 [shared]
    Branch 02
    |-costcenter 50 [shared]
    |-costcenter 60
    |-costcenter 70
    |-costcenter 80
    hier 2
    ====
    Chief010 (Mr. Smith)
    |-costcenter 10 [shared]
    |-costcenter 20 [shared]
    |-costcenter 30 [shared]
    |-costcenter 40 [shared]
    |-costcenter 50 [shared]
    Chief030 (Mr. Green)
    |-costcenter 50 [shared]
    |-costcenter 60 [shared]
    |-costcenter 70 [shared]
    |-costcenter 80 [shared]
    I suppose it depends on the fact that:
    * in the metaoutline properties, "allow duplicate shared members" is flagged (otherwise hier 2 would not be created);
    * in the dimension properties, "add as shared member" is flagged (otherwise hier 2 would not be created).
    If I unflag these options, the shared members are not created at all.
    Moreover:
    I cannot clear and reload members, or delete and recreate the database, because I must prevent data loss.
    I am forced to use Integration Services in order to create drill-through reports.
    Any help would be appreciated, thanks in advance.
    Agathe

    Check this thread, where people have already given their suggestions on learning SSIS:
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/f2cc1cf3-204d-454a-a189-47df87a3aa23/i-want-to-learn-ssis?forum=sqlintegrationservices
    I would also suggest going to YouTube (search for "learn SSIS" or "begin SSIS step by step"); you will find a lot of good tutorials to start with.
    Happy Learning!!
    If this post answers your query, please click "Mark As Answer" or "Vote as Helpful".

  • Chinese Character Problem in Essbase Integration Services & Oracle9i

    I installed Oracle9i on Windows 2000, whose character set is SIMPLIFIED CHINESE_CHINA.zhs16cgb231280, and I installed Essbase Integration Services 6.5.4 on an HP-UX server. I designed an OLAP model in the Integration Services Console and defined some dimension names using Chinese characters. When I saved the model, the Chinese characters I entered were saved as question marks (?) in the tables in Oracle 9i. After I loaded the dimension data, which includes Chinese characters, that data was also displayed as question marks in the EIS console. I have added the variable NLS_LANG="SIMPLIFIED CHINESE_CHINA.zhs16cgb231280" in the .profile, but it did not work either. How can I solve this problem? Is there any configuration of the EIS files that would solve it? Would you please help me? Thank you very much!

    Check for SETLOCALE in EIS:
    SETLOCALE <LANGUAGE_TERRITORY.CODEPAGE@SORT>
    SETLOCALE .UTF8@default
    This might help you.

  • Need a document about how to move the fact and dimension tables to different servers

    Hello Experts,
    I need a detailed document on how to move the fact and dimension tables to different servers. Please help me out with this.
           Thanks in advance....

    You still haven't told anyone what products besides Essbase you are using, without which this is an impossible question to answer.
    https://forums.oracle.com/thread/2585515
    https://forums.oracle.com/thread/2585171
    Are you connecting to these tables from Essbase with a load rule / ODBC?  Using Studio?  Using Integration Services?  Any Drill-Through reporting set up?
    This may sound harsh, but if you truly don't know how to answer any of these questions you should probably not be anywhere near this task...

  • In Answers I am seeing "Folder is Empty" for Logical Fact and Dimension Tables

    Hi All,
    I am working in OBIEE Answers. All of a sudden, when I clicked on a Logical Fact table it showed "folder is empty". I restarted all the services and tried again; it still shows the same for the Logical Fact and Dimension tables, but I am able to see all my reports in Shared Folders. I restarted the machine too, but no change. Please help me resolve this issue.
    Thanks in Advance.
    Regards,
    Rajkumar.

    First of all, please follow the forum etiquette:
    http://forums.oracle.com/forums/ann.jspa?annID=939
    Reply to or mark as answer the posts that other users gave.
    As for your question, you should check the log for a possible corrupt catalog:
    OracleBIData_Home\web\log\sawlog0.log

  • [Forum FAQ] How do I send multiple rows returned by Execute SQL Task as Email content in SQL Server Integration Services?

    Question:
    There is a scenario in which users want to send multiple rows returned by an Execute SQL Task as email content. With the Execute SQL Task, the Full result set option is used when the query returns multiple rows; it must map to a variable of the Object data type, and the returned result is a rowset object, so we cannot directly send the result variable as email content. Is there a way to extract the table row values stored in the Object variable and send them as email content?
    Answer:
    To achieve this requirement, we can use a Foreach Loop Container to extract the table row values stored in the Object variable into package variables, then use a Script Task to append the data held in those variables to a single string variable, and finally set that variable as the MessageSource of the Send Mail Task.
    Add four variables to the package: User::result (Object data type, to hold the full result set), plus User::Category, User::CntRecords, and User::Message for the Script Task and Send Mail Task.
    Double-click the Execute SQL Task to open the Execute SQL Task Editor, then change the ResultSet property to "Full result set". Assume the SQL statement is something like:
    SELECT   Category, CntRecords
    FROM         [table_name]
    In the Result Set pane, add a result like below (please note that we must use 0 as the result set name when the result set type is Full result set):
    Drag in a Foreach Loop Container and connect it to the Execute SQL Task.
    Double-click the Foreach Loop Container to open the Foreach Loop Editor; in the Collection tab, change the Enumerator to Foreach ADO Enumerator, then select User::result as the ADO object source variable.
    Click the Variable Mappings pane and map the two variables: User::Category to index 0 and User::CntRecords to index 1, matching the column order of the query.
    Drag a Script Task within the Foreach Loop Container.
    The following C# code can be used in the Script Task in SSIS 2008 and above:
    public void Main()
    {
        Variables varCollection = null;
        string message = string.Empty;
        // Lock the variables that are read and the one that is appended to
        Dts.VariableDispenser.LockForWrite("User::Message");
        Dts.VariableDispenser.LockForWrite("User::Category");
        Dts.VariableDispenser.LockForWrite("User::CntRecords");
        Dts.VariableDispenser.GetVariables(ref varCollection);
        // Format the current row's values with tab delimiters
        message = string.Format("{0}\t{1}\n",
                                varCollection["User::Category"].Value,
                                varCollection["User::CntRecords"].Value);
        // Append the row to the running message used by the Send Mail Task
        varCollection["User::Message"].Value = varCollection["User::Message"].Value + message;
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    The following VB code can be used in the Script Task in SSIS 2005 and above. Please note that in SSIS 2005 we should set the PrecompileScriptIntoBinaryCode property to False and the Run64BitRuntime property to False.
    Public Sub Main()
            ' Add your code here
            Dim varCollection As Variables = Nothing
            Dim message As String = String.Empty
            Dts.VariableDispenser.LockForWrite("User::Message")
            Dts.VariableDispenser.LockForWrite("User::Category")
            Dts.VariableDispenser.LockForWrite("User::CntRecords")
            Dts.VariableDispenser.GetVariables(varCollection)
            'Format the query result with tab delimiters
            message = String.Format("{0}" & vbTab & "{1}" & vbLf, varCollection("User::Category").Value, varCollection("User::CntRecords").Value)
            varCollection("User::Message").Value = DirectCast(varCollection("User::Message").Value,String) + message
            Dts.TaskResult = ScriptResults.Success
    End Sub
    Drag a Send Mail Task onto the Control Flow pane and connect it to the Foreach Loop Container.
    Double-click the Send Mail Task to specify the appropriate settings, then in the Expressions tab use the User::Message variable as the MessageSource property.
    The final control flow is the Execute SQL Task, followed by the Foreach Loop Container (with the Script Task inside), followed by the Send Mail Task.
    References:
    Result Sets in the Execute SQL Task
    Applies to:
    Integration Services 2005
    Integration Services 2008
    Integration Services 2008 R2
    Integration Services 2012
    Integration Services 2014
    Please click to vote if the post helps you. This can be beneficial to other community members reading the thread.

    Thanks,
    Is this a supported scenario, or does it use unsupported features?
    For example, can we call exec [ReportServer].dbo.AddEvent @EventType='TimedSubscription', @EventData='b64ce7ec-d598-45cd-bbc2-ea202e0c129d'
    in a supported way?
    Thanks! Josh

  • Multiple columns from the same dimension table as row labels performing slowly

    (Working with SSAS tabular)
    I'm trying to figure out what the approach should be for the following scenario:
    Lets say we have a Customer table. The table has columns such as account number, department number, name, salesperson, account manager, number of customers, delivery route, etc
    A user of the model could want to see any permutation of that information as the row labels. How should that be handled?
    What we've been doing so far is that the user adds each column they want into the "ROWS" section in Excel. This works fine with smaller tables (for example, a "Department" table with a "Department Code" and "Department Name"), but on large tables this quickly chokes. I understand why this is happening; I just haven't found a better way to accomplish the same thing.
    I can add a calculated column to the model through VS, but obviously this is unsupportable and unscalable when each person needs their own permutations of the data. Can something similar be done in Excel? 
    This question seems to be what I need:
    http://social.msdn.microsoft.com/Forums/en-US/97d1157a-1402-4227-b96a-79524401ddcd/mdx-query-performance-when-selecting-multiple-attributes-from-same-dimension?forum=sqlanalysisservices
    However I can't find any information on how to add those properties (is it a multidimensional-only thing?)

    Thanks for the help. Sorry, but I'm a self-taught developer and I may be missing some basics :)
    Anyway, I've done what you suggested but I get this error:
    [nQSError: 15011]The dimension table source Dimension Services.DM_D_SERVIZI_SRV has an aggregate content specification that specifies the level Product. But the source mapping contains column COD_PRODUCT with a functional dependency association on a more detailed level .
    where:
    - DM_D_SERVIZI_SRV is the physical alias for the Service Dimension (and the name of the LTS too)
    - COD_PRODUCT is the leaf of the hierarchy and the physical primary key, but it does not have to be included in the hierarchy
    Do I have to add another level with the primary key and hide it from the users?
    I tried to solve this by going to the logical table source properties, on the Content tab, and setting the "logical level" to null for the hierarchy, but I don't know if this is correct.
    Thanks

  • Best Practice loading Dimension Table with Surrogate Keys for Levels

    Hi Experts,
    how would you load an Oracle dimension table that has a hierarchy of at least 5 levels, with surrogate keys at each level and a unique dimension key for the dimension table?
    With OWB, using surrogate keys at every level of a hierarchy is an integrated feature. You don't have to care about
    the parent-child relation; the load process of the mapping generates the right keys and takes care of the relation between parent and child inside the dimension key.
    I tried to use one interface per level and created a surrogate key with a native Oracle sequence.
    After that I put all the interfaces into one big interface with a union data set per level and added lookups for the right parent-child relation.
    I think it is a bit too complicated to build the interface like that.
    I would be more than happy for any suggestions. Thank you in advance!
    negib
    Edited by: nmarhoul on Jun 14, 2012 2:26 AM

    Hi,
    I do like the level keys feature of OWB; it makes aggregate tables very easy to implement if you're sticking with a star schema.
    Sadly there is nothing off the shelf in the built-in knowledge modules with ODI. It doesn't support creating dimension objects in the database by default, but there is nothing stopping you from coding up your own knowledge module (maybe use flex fields on the datastore to tag column attributes as needed).
    Your approach is what I would have done; possibly use a view (if you don't mind having it external to ODI) to make the interface simpler.
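    For illustration only (the sequence, staging table, and column names are hypothetical), loading one level of such a dimension with an Oracle sequence for the surrogate key and a lookup to the parent level might look like this:
    -- Level-2 rows get a surrogate key from a sequence and look up the
    -- already-loaded level-1 (parent) surrogate key by its natural key.
    INSERT INTO dim_product_l2 (l2_sk, l2_code, l2_name, parent_l1_sk)
    SELECT dim_product_l2_seq.NEXTVAL,
           src.l2_code,
           src.l2_name,
           p.l1_sk
      FROM stg_product_level2 src
      JOIN dim_product_l1 p
        ON p.l1_code = src.parent_l1_code;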

  • [Forum FAQ] How to get SSIS packages XML definition which are stored in SQL Server Integration Services instance

    Introduction
    Integration Services gives you the ability to import and export packages, and by doing this to change the storage format and location of packages. But after importing packages into the package store, how can we get the package XML definition?
    Solution
    As we know, SSIS packages deployed to msdb are stored in the existing SSIS storage table [msdb].[dbo].[sysssispackages]. The "packagedata" column stores the actual SSIS package with the image data type. In order to get the package XML definition, we need to convert the "packagedata" column through varbinary to XML. You can refer to the following steps:
    Use the following query to get the package GUID:
    SELECT [name],
           [id]
      FROM [msdb].[dbo].[sysssispackages]
    Then use the following query to convert the packagedata column to XML, filtering on the GUID returned above:
    SELECT id,
           CAST(CAST(packagedata AS VARBINARY(MAX)) AS XML) AS PackageDataXML
      FROM [msdb].[dbo].[sysssispackages]
     WHERE id = 'ABB264CC-A082-40D6-AEC4-DBF17FA057B2'
    More Information
    sysssispackages (Transact-SQL):
    http://msdn.microsoft.com/en-us/library/ms181582.aspx
    Applies to
    SQL Server 2005
    SQL Server 2008
    SQL Server 2008R2
    SQL Server 2012
    SQL Server 2014
    Please click to vote if the post helps you. This can be beneficial to other community members reading the thread.

    Hi Ketak. Thank you for replying. I already followed your instructions, specifically the article "You do not see the SQL Server Reporting Services service in SharePoint Central Administration after installing SQL Server 2012 SSRS in SharePoint mode".
    I get the following error when I run rssharepoint.msi on the APP server (where Central Admin is installed). I have to run this, otherwise
    Install-SPRSService and Install-SPRSServiceProxy
    are not recognized as commands on that server.
    Failed to call GetTypes on assembly Microsoft.AnalysisServices.SPAddin, Version=11.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91. Could not load file or assembly 'Microsoft.AnalysisServices.SPClient, Version=11.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'
    or one of its dependencies. The system cannot find the file specified.
    macrel

  • SSIS package compiled successfully and executed successfully in SQL Server Integration Services, but fails to run in MS SQL Job Scheduler

    Hi Everyone,
    I am having a problem transferring data from MS SQL 2005 to an IBM AS400. Previously my SSIS package was running perfectly, but there are some changes I need to make in order for the system to work well. The changes are minimal and just for upgrades (but I did include DELETE statements to truncate the AS400 table before I insert fresh data from the MS SQL table into the same AS400 table). I compiled my SSIS package and it ran successfully. I imported it into MS SQL Integration Services as one of the packages and manually executed it, and the result was the same, meaning it was successful again. But when I try to run it from a MS SQL job scheduler, the job fails with the messages shown below, as extracted from the job's View History.
    Date today
    Log Job History (MSSQLToAS400)
    Step ID 1
    Server MSSQLServer
    Job Name MSSQLToAS400
    Step Name pumptoAS400
    Duration 00:00:36
    Sql Severity 0
    Sql Message ID 0
    Operator Emailed
    Operator Net sent
    Operator Paged
    Retries Attempted 0
    Message
    Executed as user: MSSQLServer\SYSTEM. ... 9.00.4035.00 for 32-bit  Copyright (C) Microsoft Corp 1984-2005. All rights reserved.    
    Started:  today time  
    Error: on today time     
    Code: 0xC0202009     Source: SSISMSSQLToAS400 Connection manager "SourceToDestinationOLEDB"     
    Description: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. 
    Error code: 0x80004005.  An OLE DB record is available.  
    Source: "IBMDA400 Session"  
    Hresult: 0x80004005  
    Description: "CWBSY0002 - Password for user AS400ADMIN on system AS400SYSTEM is not correct ".  End Error  
    Error: today     
    Code: 0xC020801C     
    Source: Data Flow Task OLE DB Destination [5160]     
    Description: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "DestinationClearData" failed with error code 0xC0202009.  There may be error messages posted before
    this with more information on why the AcquireConnection method ca...  The package execution fa...  The step failed.
    So I hope somebody can share some hints or tips to help me overcome this problem of mine. Thanks for your help in advance; I have scoured the MSDN forums and found no solution for my problem yet.
    PS: In SQL Server Integration Services, when I deployed the package I set the security of the packages to Rely on server...
    Hope this will help.

    Hi Ironmaidenroxz,
    From the message "Executed as user: MSSQLServer\SYSTEM", we can see that the SQL Server Agent job ran under the Local System account. However, the Local System account does not natively have network rights, so the job failed to communicate with the remote IBM AS400 server.
    To address this issue, you need to create a proxy account for SQL Server Agent to run the job. When creating the credentials for the proxy account, you can use the Windows domain account under which you executed the package manually.
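    A minimal T-SQL sketch of those steps (the credential, account, and proxy names are hypothetical; the job name MSSQLToAS400 is taken from the log above):
    -- Credential holding the Windows account that ran the package manually
    CREATE CREDENTIAL SSISRunCredential
        WITH IDENTITY = 'DOMAIN\ssis_runner',  -- hypothetical domain account
             SECRET = 'StrongPasswordHere';
    -- SQL Server Agent proxy based on that credential, allowed for SSIS steps
    EXEC msdb.dbo.sp_add_proxy
         @proxy_name = N'SSISProxy',
         @credential_name = N'SSISRunCredential',
         @enabled = 1;
    EXEC msdb.dbo.sp_grant_proxy_to_subsystem
         @proxy_name = N'SSISProxy',
         @subsystem_name = N'SSIS';
    -- Point the failing job step at the proxy instead of Local System
    EXEC msdb.dbo.sp_update_jobstep
         @job_name = N'MSSQLToAS400',
         @step_id = 1,
         @proxy_name = N'SSISProxy';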
    References:
    How to: Create a Credential
    How to: Create a Proxy
    Regards,
    Mike Yin
    TechNet Community Support

  • Data service for table in Oracle 8.0.6

    Hi,
    Using WebLogic 8.1.4 and LiquidData 8.5 I am trying to create physical data services for tables in a DB in Oracle 8.0.6. I am aware that that Oracle version is not supported by Oracle anymore, but I need to work with that version anyway (you know how it is sometimes).
    I managed to create a connection pool for this through the WebLogic Server Console by providing the JDBC driver for 8.0.6, but when I try to create a data source using the new connection pool and WebLogic tries to get the metadata, I get pop-up windows with messages like:
    "Bigger type length than maximum"
    and
    "OALL8 in an inconsistent state"
    and
    "Protocol violation"
    One more thing to mention: I also added the Oracle 8.0.6 JDBC driver to the WebLogic Server classpath (Tools -> WebLogic Server -> Server Properties ... -> WebLogic Server: added classes12.zip to Server classpath additions) and restarted WebLogic Workshop and Server. Still I get those error messages.
    Is there a special procedure for providing/configuring a specific driver for a DBMS that is not natively supported by WebLogic?
    Any help is appreciated.
    Thanks,
    Wilko

    Hi Mike,
    Thanks for the quick reply. Below are the contents of the console window from starting Workshop and the Server. I'll try your next hint and let you know the outcome. As far as I can see, no errors were issued by the Server while I tried to connect to Oracle 8.0.6 to upload metadata (I am not sure whether anything was printed out while I started the server). My address is w.eschebach at vsnlinternational dot com.
    Thanks,
    Wilko
    This is what my workshop.cfg looks like:
    C:\bea\weblogic81\workshop
    C:\bea\jdk142_05\jre\bin\java.exe
    -XX:-UseThreadPriorities -Xmx256m -Xms64m -Xss256k -client -Dsun.io.useCanonCaches=false -Dsun.java2d.noddraw=true -Dsun.java2d.d3d=false -Djava.system.class.loader="workshop.core.AppClassLoader" -cp "C:\bea\weblogic81\workshop\wlw-ide.jar" workshop.core.Workshop
    Console output:
    DEBUG: extensions=C:\bea\weblogic81\workshop\\extensions
    INFO: Registering extension com.bea.portal.ide.CommonServices
    INFO: Service com.bea.portal.ide.findrefs.FindRefsSvc registered
    INFO: Handler for urn:com-bea-portal-ide:ref-finders registered
    INFO: Registering extension workshop.control.ControlServices
    INFO: Service com.bea.ide.control.ControlSvc registered
    INFO: Registering extension com.crystaldecisions.integration.weblogic.workshop.report.Bootstrap
    INFO: Registering extension workshop.debugger.DebuggerServices
    INFO: Exit Handler found
    INFO: Service com.bea.ide.debug.DebugSvc registered
    INFO: Handler for urn:com-bea-ide:debugExpressionViews registered
    INFO: Registering extension workshop.jspdesigner.JspDesignerServices
    INFO: Service com.bea.ide.ui.browser.BrowserSvc registered
    INFO: Service com.bea.ide.jspdesigner.PaletteActionSvc registered
    INFO: Handler for urn:com-bea-ide-jspdesigner:tags registered
    INFO: Registering extension workshop.liquiddata.LiquidDataExtension
    INFO: Registering extension workshop.pageflow.services.PageFlowServices
    INFO: Exit Handler found
    INFO: Service workshop.pageflow.services.PageFlowSvc registered
    INFO: Service com.bea.ide.ui.palette.DataPaletteSvc registered
    INFO: Handler for urn:workshop-pageflow-wizard:extension registered
    INFO: Registering extension com.bea.portal.ide.portalbuilder.PortalBuilderServices
    INFO: Service com.bea.portal.ide.portalbuilder.laf.LookAndFeelSvc registered
    INFO: Service com.bea.portal.ide.portalbuilder.laf.css.CssSvc registered
    INFO: Service com.bea.portal.codegen.CodeGenSvc registered
    INFO: Registering extension com.bea.portal.ide.PortalServices
    INFO: Service com.bea.portal.ide.cache.CacheInfoSvc registered
    INFO: Registering extension workshop.process.ProcessExtension
    INFO: Service workshop.process.ProcessSvc registered
    INFO: Service workshop.process.broker.channel.ChannelManagerSvc registered
    INFO: Handler for urn:com-bea-ide-process:process registered
    INFO: Registering extension workshop.shell.ShellServices
    INFO: Exit Handler found
    INFO: Service com.bea.ide.ui.frame.FrameSvc registered
    INFO: Service com.bea.ide.core.datatransfer.DataTransferSvc registered
    INFO: Service com.bea.ide.actions.ActionSvc registered
    INFO: Service com.bea.ide.document.DocumentSvc registered
    INFO: Service com.bea.ide.core.HttpSvc registered
    INFO: Service com.bea.ide.ui.help.HelpSvc registered
    INFO: Service com.bea.ide.ui.output.OutputSvc registered
    INFO: Service com.bea.ide.core.navigation.NavigationSvc registered
    INFO: Service com.bea.ide.filesystem.FileSvc registered
    INFO: Service com.bea.ide.filesystem.FileSystemSvc registered
    INFO: Service com.bea.ide.refactor.RefactorSvc registered
    INFO: Service com.bea.ide.security.SecuritySvc registered
    INFO: Handler for urn:com-bea-ide:actions registered
    INFO: Handler for urn:com-bea-ide:document registered
    INFO: Handler for urn:com-bea-ide:frame registered
    INFO: Handler for urn:com-bea-ide:encoding registered
    INFO: Handler for urn:com-bea-ide:help registered
    INFO: Registering extension workshop.sourcecontrol.SCMServices
    INFO: Service com.bea.ide.sourcecontrol.SourceControlSvc registered
    INFO: Handler for urn:com-bea-ide:sourcecontrol registered
    INFO: Registering extension workshop.sourceeditor.EditorServices
    INFO: Service com.bea.ide.sourceeditor.EditorSvc registered
    INFO: Service com.bea.ide.sourceeditor.compiler.CompilerSvc registered
    INFO: Handler for urn:com-bea-ide:sourceeditor:sourceinfo registered
    INFO: Registering extension com.bea.wls.J2EEServices
    INFO: Service com.bea.wls.ejb.EJBSvc registered
    INFO: Service com.bea.wls.DBSvc registered
    INFO: Registering extension workshop.workspace.WorkspaceServices
    INFO: Exit Handler found
    INFO: Service com.bea.ide.workspace.WorkspaceSvc registered
    INFO: Service com.bea.ide.workspace.ServerSvc registered
    INFO: Service com.bea.ide.workspace.SettingsSvc registered
    INFO: Service com.bea.ide.build.AntSvc registered
    INFO: Service com.bea.ide.workspace.RunSvc registered
    INFO: Handler for urn:com-bea-ide:settings registered
    INFO: Handler for urn:com-bea-ide:project registered
    INFO: Registering extension workshop.xml.XMLServices
    INFO: Service com.bea.ide.xml.types.TypeManagerSvc registered
    INFO: Service com.bea.ide.xml.types.TypeResolverSvc registered
    INFO: Service com.bea.ide.xmlmap.XMLMapSvc registered
    DEBUG: Workshop temp dir: C:\DOCUME~1\TR003137\LOCALS~1\Temp\wlw-temp-18920
    DEBUG: ExtensionsLoaded: 8329ms
    DEBUG: UI Displayed: 11563ms
    DEBUG: Time to load XQuery Functions (in seconds) - 0
    DEBUG: Time to load repository (in seconds) - 0
    DEBUG: LdBuildDriver loaded
    DEBUG: project ProvisioningDataServices activated
    DEBUG: Setting active project to: ProvisioningDataServices
    DEBUG: Workspace Activated: 17126ms
    DEBUG: Document Panel initialized: 17501ms
    DEBUG: *** CompilerProject constructor 1
    DEBUG: WorkspaceLoaded: 17594ms
    DEBUG: getClasspathMapping initiated with 29 item list.
    DEBUG: getClasspathMapping returning 29 item map.
    INFO: Startup Complete
    DEBUG: Time to load repository (in seconds) - 1
    DEBUG: Loading template file wsrp-producer-project.zip
    DEBUG: Loading template file wli-tutorial.zip
    DEBUG: Loading template file wli-schemas.zip
    DEBUG: Loading template file wli-newprocess.zip
    DEBUG: Loading template file wli-helloworld.zip
    DEBUG: Loading template file webflow-project.zip
    DEBUG: Loading template file tutorial-webservice.zip
    DEBUG: Loading template file tutorial-pageflow.zip
    DEBUG: Loading template file tutorial-jbc.zip
    DEBUG: Loading template file tutorial-ejb.zip
    DEBUG: Loading template file portal-project.zip
    DEBUG: Loading template file portal-application.zip
    DEBUG: Loading template file pipeline-application.zip
    DEBUG: Loading template file oag-schemas.zip
    DEBUG: Loading template file netui-webapp.zip
    DEBUG: Loading template file liquiddata-project.zip
    DEBUG: Loading template file liquiddata-application.zip
    DEBUG: Loading template file ejb-template.zip
    DEBUG: Loading template file default-workshop.zip
    DEBUG: Loading template file datasync-template.zip
    DEBUG: Loading template file crystalreports.zip
    DEBUG: Loading template file commerce-project.zip
    DEBUG: Loading template file commerce-application.zip
    DEBUG: URI is null. Delete Version will not show up in the menu
    DEBUG: URI is null. Delete Version will not show up in the menu
    DEBUG: GCThread: performing gc while idle

  • SSIS 2012: Integration Services Catalog not showing data for most recent executions

    Techies--
    Under a previous deployment of an SSIS package, I was able to go to the Integration Services Catalog, look under the folder --> project --> package, then right-click and request a standard report for all executions. The report would display the execution details.
    Today a colleague deployed a newer version of the same package to a test server, and it is failing from a job set up via SQL Agent. I went to look at the root cause by going to the Integration Services Catalog to review the report. No execution data appears. I looked at the filter and compared it to the filter I have set on the production copy of the stable version. The folder name, package name and project name match. Both have a status of 'All'. The date range is the same (2/19 - 2/25).
    In fact, I went back to the test deployment and changed the date range to the date of the last failure (1/30). Now I see data only from 1/30 but none for today's execution.
    Something must have happened in the deployment process--but what? How do I debug this issue with the reports--but even more importantly, how do I get to the raw execution data to see what the story is on the failed execution itself?

    @Arthur, thanks for responding. In a nutshell, I searched through the ssisdb.catalog.event_messages table to determine roughly the same thing you determined--an execution for that package (or for that matter, any misnamed package) with today's timestamp
    simply never occurred--even though the job agent log appeared to indicate it had.
    I re-deployed the package ... forced an execution with an interactive run from the catalog... and voila! The execution info appeared through the standard reports as expected.
    Any idea where I could/should look for a wayward deployment? (BTW, the report for validations showed no data either on that first deployment).
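    For getting at the raw execution data directly, a couple of hedged example queries against the SSISDB catalog views can help (the package name and operation_id below are placeholders; in these views message_type 120 is an error and 110 a warning):
    -- Recent executions recorded for a given package, bypassing the reports
    SELECT execution_id, folder_name, project_name, package_name,
           status, start_time, end_time
      FROM SSISDB.catalog.executions
     WHERE package_name = N'MyPackage.dtsx'
     ORDER BY execution_id DESC;
    -- Warning and error messages for one execution (use an execution_id from above)
    SELECT message_time, message_type, message
      FROM SSISDB.catalog.event_messages
     WHERE operation_id = 12345
       AND message_type IN (110, 120)
     ORDER BY message_time;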

  • SQL Server Integration Services 11.0 not showing

    To whom it may concern,
    I have been using SSIS 2008 R2 without a problem. I then installed VS 2012 and migrated my SSIS packages from VS 2010 to VS 2012. I also installed the BI components (SSDTBI_VS2012_x86_ENU) in VS 2012.
    All the SSIS packages run fine within VS 2012. As soon as I run the packages from the command line (i.e. "C:\Program Files\Microsoft SQL Server\110\DTS\Binn\dtexec.exe"), any package referencing lookup objects produces the following errors:
    Error: 2014-04-21 13:20:15.85
    Code: 0xC000F427
    Source: Update Category SSIS.Pipeline
    Description: To run a SSIS package outside of SQL Server Data Tools you must install Derived Column of Integration Services or higher.
    End Error
    Error: 2014-04-21 13:20:15.85
    Code: 0xC000F427
    Source: Update Category SSIS.Pipeline
    Description: To run a SSIS package outside of SQL Server Data Tools you must install Lookup [Table Name] of Integration Services or higher.
    End Error
    I checked the services on my local machine and the following Integration Services service is the only one available:
    SQL Server Integration Services 10.0 (i.e. SQL Server Integration Services 11.0 is missing).
    I presume this is the problem.
    I would appreciate it if someone could assist with this problem. I have also tried to repair MS SQL 2012 (Express edition), to no avail (Shared Features).
    Kind Regards,

    Hi, I am running SSIS on a local dev machine and I am trying to execute the packages via the "C:\Program Files\Microsoft SQL Server\110\DTS\Binn\dtexec.exe" command line. It looks as though I need to remove my previous version of SSIS and then install the 2012 SSIS.
    I have also downloaded the extended MS SQL 2012 Express version (1.9 GB), but still to no avail. It therefore seems to be related to the previous version of SSIS, which I possibly need to remove before trying to reinstall SSIS for 2012.
    I will report back as soon as I have done these tests.
    As I said before, the Express version doesn't have SSIS.
    MCSE SQL Server 2012 - Please mark posts as answered where appropriate.
