Oracle BI (Siebel Analytics) / Data Warehouse Data Security

Hello,
We are implementing Oracle Siebel Analytics and custom data warehousing on an Oracle 9i database. I would like to know whether we can implement a consistent data security model across different data marts; that is, with one single data warehouse, I want to define the security model and policies in one place and have them applied consistently across all of the data marts.
1. Can Oracle support this kind of scenario?
2. If yes, what would the ideal model be? An example would be helpful. If not, what is the best available approach to this security problem?
Any suggestion or help would be great.
Regards
Girish
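
One mechanism worth evaluating for question 1 is Oracle's Virtual Private Database (fine-grained access control via DBMS_RLS), which is available in 9i: you attach a predicate-generating function to the shared warehouse tables once, and every data mart query against them inherits the policy. Below is a minimal sketch only; the DW.SALES_FACT table, the region_id column, and the dw_ctx application context are hypothetical names, not from this thread.

-- Hypothetical predicate function; dw_ctx is an assumed application
-- context that holds the connected user's region.
CREATE OR REPLACE FUNCTION dw_region_policy (
    p_schema IN VARCHAR2,
    p_object IN VARCHAR2
) RETURN VARCHAR2
IS
BEGIN
    -- Every query against the protected table gets this predicate appended.
    RETURN 'region_id = SYS_CONTEXT(''dw_ctx'', ''region_id'')';
END;
/

BEGIN
    -- Attach the policy once, at the warehouse table; data marts built
    -- on top of it inherit the restriction automatically.
    DBMS_RLS.ADD_POLICY(
        object_schema   => 'DW',
        object_name     => 'SALES_FACT',
        policy_name     => 'SALES_REGION_POLICY',
        function_schema => 'DW',
        policy_function => 'DW_REGION_POLICY',
        statement_types => 'SELECT');
END;
/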

Hi, does the error message also report some other package names, e.g.:
RPE-01012 Cannot deploy PL/SQL maps to the target schema because it is not owned by the Control Center, followed by "SOME WRT.... package names"? I think I faced this once and fixed it by making sure that the repository user had the relevant packages granted through OWBREPOS_OWNER/OWB_REPOS_OWNER to the user you are using to run the mapping.
Regards,
Mohammad Farhan Alam

Similar Messages

  • What are the Disadvantages of Management Data Warehouse (data collection) ?

    Hi All,
We are planning to implement Management Data Warehouse on production servers.
Could you please explain the disadvantages of Management Data Warehouse (data collection)?
    Thanks in advance,
    Tirumala 
     

>We are planning to implement Management Data Warehouse on production servers
    It appears you are referring to production server performance.
    BOL: "You can install the management data warehouse on the same instance of SQL Server that runs the data collector. However, if server resources or performance is an issue on the server being monitored, you can install the management data warehouse
    on a different computer."
    Management Data Warehouse
    Kalman Toth Database & OLAP Architect
    SQL Server 2014 Database Design
    New Book / Kindle: Beginner Database Design & SQL Programming Using Microsoft SQL Server 2014
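
    For reference, pointing the data collector at an MDW hosted on a separate instance, per the BOL guidance quoted above, is a small msdb configuration change. A sketch with illustrative instance and database names:

    USE msdb;
    GO
    -- Tell the collector which instance and database host the warehouse
    -- (run on the monitored server; names below are examples only).
    EXEC dbo.sp_syscollector_set_warehouse_instance
         @instance_name = N'MONITORSRV\MDW';
    EXEC dbo.sp_syscollector_set_warehouse_database_name
         @database_name = N'MDW';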

  • Oracle for a Data Warehouse & Data Mining Project

    Hello,
I am working for a marketing company, and we have a pretty big OLTP database that my supervisor wants to use for decision making. The plan is to create a
data mart (if not a warehouse) and use data mining tools on top of it.
He does not want to buy one of those expensive tools, so I was wondering if we could handle such a project just by downloading OWB and Darwin from the Oracle site. None of us are data warehouse specialists, so it will be very tough for us, but I would like to learn. Actually, I was looking for some example warehouse + mining environment implementations to get the main ideas. I will appreciate any suggestions and comments.
    Thank you

    Go to
    http://www.oracle.com/ip/analyze/warehouse/datamining/
for white papers, demos, etc., as a starting point.
Also, Oracle University offers a course on Oracle Data Mining.

  • Management Data Warehouse Data Collection fails due to login failure

    Hello,
    I am trying to set up a Management Data Warehouse on a server other than the one I want statistics of.  Unfortunately, each upload fails because the data collector upload job cannot log onto the warehouse server.  For some inexplicable reason the process is trying to log on using domain\serverName.  Obviously no such user exists, and the process fails.  Below is the error message I see in the logs:
    Description: An error occurred with the following error message: "An error occurred while verifying the result set schema against the output table schema. The data collector cannot connect to the management data warehouse. : Login failed for user 'domain\server name$'.".
    Any help would be greatly appreciated.
    Thanks,
    Zachary

    http://technet.microsoft.com/en-us/library/bb677211.aspx says
    The data warehouse is installed on a different computer from the data collector. Probable causes are network connectivity problems or an unavailable host server. This error only affects upload packages.
    Handling: Because there is no advance notification about a server shutdown, this error cannot be anticipated and handled automatically. The error is logged and after a brief interval, the upload is restarted. After four unsuccessful upload attempts, the collection set is disabled and its state is written to the execution log.
    Note:
    Any data that is collected while the collection set is running is kept and accumulated. If the upload package can connect to the data warehouse, the accumulated data is uploaded.
    Blog: http://dineshasanka.spaces.live.com
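
    Note that the 'domain\server name$' in the error is the monitored server's machine account, which is what the upload job presents when SQL Agent runs under a built-in account. One hedged fix, run on the warehouse server, is to grant that account write access to the MDW database. This is a sketch only; the account and database names are illustrative:

    -- Server-level login for the monitored machine's computer account.
    CREATE LOGIN [DOMAIN\MONITOREDSRV$] FROM WINDOWS;
    GO
    USE MDW;  -- your management data warehouse database
    GO
    CREATE USER [DOMAIN\MONITOREDSRV$] FOR LOGIN [DOMAIN\MONITOREDSRV$];
    -- mdw_writer is the role the data collector needs for uploads.
    EXEC sp_addrolemember N'mdw_writer', N'DOMAIN\MONITOREDSRV$';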

  • Data Warehouse Date format

    Hello all
I am trying to create a dashboard to be used in a weekly meeting for the services team. One of the requirements is to show the number of incidents by category for the last 7, 14, and 30 days. So I am trying to create a date filter for my pivot table,
but the dates are not coming in in a date format. I read that this is a known "feature" of the data warehouse and heard that there is a fix/update that works around the situation. Has anyone found a fix for this?

    I found this
    http://blogs.technet.com/b/servicemanager/archive/2012/12/07/incidents-or-service-requests-sliced-by-months-quarters.aspx
It seems like what I was looking for. This is cool and works well, but not quite what I am trying to do; I will keep working with it though.
Still looking for how to filter 7/14/30 days...
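
    For the 7/14/30-day buckets, one workaround is to cast the text date once and filter on it. A sketch only: the view and column names below are recalled from the SCSM DWDataMart schema and may differ on your install, so verify them first.

    -- Incidents created in the last N days, despite the date arriving as text.
    DECLARE @days int = 7;   -- swap for 14 or 30

    SELECT  Classification,
            COUNT(*) AS Incidents
    FROM    dbo.IncidentDimvw
    WHERE   CAST(CreatedDate AS datetime) >= DATEADD(day, -@days, GETDATE())
    GROUP BY Classification
    ORDER BY Incidents DESC;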

  • Permanent Job Opportunity - Oracle BI Data Warehouse Developer Chicago, IL

    Submit Resumes to [email protected]
The Business Intelligence Specialist will play a critical role in designing, developing, deploying, and supporting data warehouse/data mart applications. In this role, the person will be responsible for all BI aspects of a data warehouse/data mart application. Primary duties will be to create reporting standards, as well as coach and support power users with the selected Oracle tool. The ideal candidate will have 3+ years of demonstrated experience in data warehousing and Business Intelligence tools, and must also possess excellent communication skills and an outstanding track record with users.
    Principal Duties:
    Participates with internal clients to define software requirements for development, maintenance and/or improvements
    Maintains accuracy, integrity, and availability of the data warehouse
    Tests, monitors, manages, and validates data warehouse activity, including data extraction, transformation, movement, loading, cleansing, and updating processes
    Designs and optimizes data mart models for Oracle Business Intelligence Suite.
    Translates the reporting requirements into data analysis and reporting solutions.
Reviews and signs off on project plans.
Reviews and signs off on technical designs.
    Defines and develops BI reports for accessing/analyzing data in warehouse.
    Customizes BI tools and data sets for different types of users.
Designs and develops UAT (User Acceptance Testing).
    Drives improvement of BI system architecture and development process.
    Develops and maintains internal relationships. Actively champions teamwork. Uses internal resources to enhance knowledge and expertise of industry, research, products and services. Provides information and support to others in the company.
    Required Skills:
    Education and Experience:
    BS/MS in Computer Science or equivalent.
    3+ years of experience with Oracle, PL/SQL Development and Data Warehousing.
Experience with Oracle Business Intelligence Suite and Crystal Reports is a plus.
2-3 years of dimensional modeling experience.
Demonstrated hands-on experience with Unix/Linux and SQL required.
Demonstrated hands-on experience with Oracle reporting tools.
Demonstrated experience with translating business requirements into data analysis and reporting solutions.
Experience in training programs/teaching users to use tools.
Expertise with the software development process.
Effective mediator: able to facilitate constructive and productive discussions with internal customers, external clients, and development personnel pertaining to feature definition, project scope, and status.
Problem solving: identifies and resolves problems in a timely manner, gathers and analyzes information skillfully, and maintains confidentiality.
Planning/organizing: prioritizes and plans work activities and uses time efficiently. Work requires continual attention to detail in composing and proofing materials, establishing priorities, and meeting deadlines. Must be able to work in a fast-paced environment with demonstrated ability to juggle multiple competing tasks and demands.
Quality control: demonstrates accuracy and thoroughness and monitors own work to ensure quality.
Adaptability: adapts to changes in the work environment, manages competing demands, and is able to deal with frequent change, delays, or unexpected events.
    Benefits/Compensation:
    Employees enjoy competitive compensation. We have a full benefits package including medical and dental insurance, long-term disability and life insurance and a 401(k) plan.
    The client operates within the healthcare industry.
    This is a permanent full-time position. After ensuring your availability and qualifications we will put you in direct contact with the client to move forward in the process.

    FORWARD THE UPDATED RESUME AS SOON AS POSSIBLE.

  • Data warehouse database

    Today I came across one very interesting question:
    "A data warehouse can only be deployed in a relational database."
    Is the above statement true or false?
    If we look at it simply, or go back 7-8 years, the answer may well be false, as I found out after doing some research on it:
    "A data warehouse can be normalized or denormalized. It can be a relational database, multidimensional database, flat file, hierarchical database, object database, etc. Data warehouse data often gets changed. And data warehouses often focus on a specific activity or entity." (Larry Greenfield)
    "The data warehouse is normally (but does not have to be) a relational database. It must be organized to hold information in a structure that best supports not only query and reporting, but also advanced analysis techniques, like data mining. Most data warehouses hold information for at least 1 year and sometimes can reach half a century, depending on the business/operations data retention requirement. As a result these databases can become very large." (en.wikipedia.org)
    But when I look at the complexity of the design and the functionality we expect from a data warehouse today, plus the concepts used when designing the data warehouse structure, like star schema, snowflake, etc., I think it cannot be done in any type of database that does not follow the relational database concept. We may call it a multidimensional database or an ORDBMS, and we may talk about cubes, measures, dimensions and so on, but at its base it has to follow the relational model.
    Let me know if anybody has anything to say about it.
    Regards,
    Raj
    www.oraclebrains.com

    Thanks Justin!
    I agree with you that the concept existed before the relational database was invented. As far as I know they called it EIS, then MIS, and now data warehouse. But what I am talking about is the modern data warehouse technique: if you really think that some other type of database can support it without following the relational database concept, let me know which one.
    I am still searching for my answer and have discussed it with a lot of people, but when I ask whether they have seen any implementation of a data warehouse without using a relational database, the answer I get is always negative.
    Raj
    www.oraclebrains.com

  • ERP data warehouse

    Hi all,
What is the need for a data warehouse if an ERP already exists? Since the ERP captures the major transactional data, what are the difficulties in analyzing data directly from the ERP database? What benefits are gained if the data is transformed into a data warehouse data model? And what are the difficulties in transforming data from the ERP to the data warehouse?
    Please help me. These are the research questions of my M.Sc thesis.
    Thanks.
    Swapan

    I'm receiving a similar error when attempting to create data warehouse tables with BIAPPS 7.9.6.
    The "Installing the DAC Platform" documentation states:
    "4.9.4.2 How to Create ODBC Connections for Oracle Databases
    Follow these instructions for creating ODBC connections for Oracle databases on
    Windows. For instructions on creating ODBC connections for Oracle databases on
    UNIX or Linux, see the documentation provided with your database.
    Note: You must use the Oracle Merant ODBC driver to create the ODBC connections.
    The Oracle Merant ODBC driver is installed by the Oracle BI Applications installer.
    Therefore, you will need to create the ODBC connections after you have run the Oracle
    BI Applications installer and have installed the DAC Client."

  • Configuration Dataset = 90% of Data Warehouse - Event Errors 31552

    Hi All,
I'm currently running SCOM 2012 R2 and have recently had some problems with the data warehouse data sync. We currently have around 800 servers in our production environment and no network devices. We use Orchestrator for integration with our call logging system, and I believe this is where our problems started: we had a runbook which got itself into a loop and was constantly updating alerts, and it also contributed to a large number of state changes. We have resolved that problem now, but I started to receive alerts saying SCOM couldn't sync alert data, under event 31552.
    Failed to store data in the Data Warehouse.
    Exception 'SqlException': Timeout expired.  The timeout period elapsed prior to completion of the operation or the server is not responding. 
    One or more workflows were affected by this.  
    Workflow name: Microsoft.SystemCenter.DataWarehouse.StandardDataSetMaintenance 
    Instance name: Alert data set 
    Instance ID: XX
    Management group: XX
I have been researching problems with syncing alert data and came across the queries to manually run the database maintenance. I ran that on the Alert instance and it took around 16.5 hours on the first night; then it ran fast (2 seconds) for most of the day, but when it got to about the same time the next day it took another 9.5 hours, so I'm not sure why it's giving different results.
Initially it appeared all of our datasets were out of sync. After the first night all appear to be in sync bar the hourly Performance dataset, which still has around 161 OutstandingAggregations. When I run the maintenance on Performance it doesn't appear to fix this (it runs in about 2 seconds, successfully).
I recently ran DWDatarp on the database to see how the Alert dataset was looking, and to my surprise I found that the Configuration dataset has blown out to take up 90% of the data warehouse; see the table below. Does anyone have any ideas on what might cause this or how I can fix it?
    Dataset name                   Aggregation name     Max Age     Current Size, Kb
    Alert data set                 Raw data                 400       132,224 (  0%)
    Client Monitoring data set     Raw data                  30             0 (  0%)
    Client Monitoring data set     Daily aggregations       400            16 (  0%)
    Configuration dataset          Raw data                 400   683,981,456 ( 90%)
    Event data set                 Raw data                 100    17,971,872 (  2%)
    Performance data set           Raw data                  10     4,937,536 (  1%)
    Performance data set           Hourly aggregations      400    28,487,376 (  4%)
    Performance data set           Daily aggregations       400     1,302,368 (  0%)
    State data set                 Raw data                 180       296,392 (  0%)
    State data set                 Hourly aggregations      400    17,752,280 (  2%)
    State data set                 Daily aggregations       400     1,094,240 (  0%)
    Microsoft.Exchange.2010.Dataset.AlertImpact                Raw data                   7             0 (  0%)
    Microsoft.Exchange.2010.Dataset.AlertImpact                Hourly aggregations        3             0 (  0%)
    Microsoft.Exchange.2010.Dataset.AlertImpact                Daily aggregations       182             0 (  0%)
    Microsoft.Exchange.2010.Reports.Dataset.Availability       Raw data                 400           176 (  0%)
    Microsoft.Exchange.2010.Reports.Dataset.Availability       Daily aggregations       400             0 (  0%)
    Microsoft.Exchange.2010.Reports.Dataset.TenantMapping      Raw data                   7             0 (  0%)
    Microsoft.Exchange.2010.Reports.Dataset.TenantMapping      Daily aggregations       400             0 (  0%)
    Microsoft.Exchange.2010.Reports.Transport.ActiveUserMailflowStatistics.Data  Raw data               3        84,864 (  0%)
    Microsoft.Exchange.2010.Reports.Transport.ActiveUserMailflowStatistics.Data  Hourly aggregations    7       407,416 (  0%)
    Microsoft.Exchange.2010.Reports.Transport.ActiveUserMailflowStatistics.Data  Daily aggregations   182       143,128 (  0%)
    Microsoft.Exchange.2010.Reports.Transport.ServerMailflowStatistics.Data      Raw data               7         6,088 (  0%)
    Microsoft.Exchange.2010.Reports.Transport.ServerMailflowStatistics.Data      Hourly aggregations   31        20,056 (  0%)
    Microsoft.Exchange.2010.Reports.Transport.ServerMailflowStatistics.Data      Daily aggregations   182         3,720 (  0%)
    I have one other 31553 event showing up on one of the Management servers as follows,
    Data was written to the Data Warehouse staging area but processing failed on one of the subsequent operations.
    Exception 'SqlException': Sql execution failed. Error 2627, Level 14, State 1, Procedure ManagedEntityChange, Line 368, Message: Violation of UNIQUE KEY constraint 'UN_ManagedEntityProperty_ManagedEntityRowIdFromDAteTime'. Cannot insert duplicate key in
    object 'dbo.ManagedEntityProperty'. The duplicate key value is (263, Aug 26 2013  6:02AM). 
    One or more workflows were affected by this.  
    Workflow name: Microsoft.SystemCenter.DataWarehouse.Synchronization.ManagedEntity 
    Instance name: XX 
    Instance ID: XX
    Management group: XX
which from my reading means I'm likely in for an MS support call... :( But I just wanted to see if anyone has any information about the Configuration dataset, as I couldn't find much in my searching.

    Hi All,
The results of the MS support call were as follows. I don't recommend doing these steps without an MS support case (any damage you do is your own fault); these particular actions resolved our problems:
1. Regarding the Configuration dataset being so large.
This was caused by our AlertStage table, which was also very large. We truncated the AlertStage table and ran the maintenance tasks manually to clear this up; as I didn't require any of the alerts sitting in the AlertStage table, we simply did a straight truncation of the table. The document linked by MHG above shows the process of doing a backup & restore of the AlertStage table if you need to. It took a few days of running maintenance tasks to resolve this problem properly. As soon as the truncation had taken place, the Configuration dataset dropped in size to less than a gig.
    2. Error 31553 Duplicate Key Error
This was a problem with duplicate keys in the ManagedEntityProperty table. We identified rows which had duplicate information, which could be gathered from the events being logged on the management server.
We then updated a few of these rows to have a slightly different time from what was already in the database, and noticed that the event kept logging with a different row each time we updated the previous one. We ran the following query to find out how many rows actually had duplicates:
-- Find property rows whose FromDateTime collides with a staged change
-- for the same managed entity (the duplicate-key culprits)
select * from ManagedEntityProperty mep
inner join ManagedEntity me on mep.ManagedEntityRowId = me.ManagedEntityRowId
inner join ManagedEntityStage mes on mes.ManagedEntityGuid = me.ManagedEntityGuid
where mes.ChangeDateTime = mep.FromDateTime
order by mep.ManagedEntityRowId
This returned over 25,000 duplicate rows. Rather than replace the times for all the rows, we removed all duplicates from the database. (Best to have MS check this one out for you if you have a lot of data.)
After doing this there was a lot of data moving around the staging tables (I assume from the management server that couldn't communicate properly), so once again we truncated the AlertStage table, as it wasn't keeping up. Once this was done everything worked properly and all the queues stayed under control.
To confirm things had been cleared up, we checked that the AlertStage table had no entries and the ManagedEntityStage table had no entries. We also confirmed that the 31553 events stopped on the management server.
    Hopefully this can help someone, or provide a bit more information on these problems.
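
    For anyone searching later, the manual maintenance run mentioned above is commonly done like this against OperationsManagerDW. A sketch only; as stated above, involve support and take backups before touching staging tables:

    -- Run standard dataset maintenance by hand for one dataset (here: Alert).
    DECLARE @DatasetId uniqueidentifier;

    SELECT @DatasetId = DatasetId
    FROM   StandardDataset
    WHERE  SchemaName = N'Alert';

    EXEC StandardDatasetMaintenance @DatasetId;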

  • Why do we need SSIS and star schema of Data Warehouse?

If SSAS in MOLAP mode stores the data, what is the role of SSIS, and why do we need a data warehouse and the SSIS ETL process?
I have a SQL Server OLTP database. I am using SSIS to transfer my SQL Server data from the OLTP database to a data warehouse database that contains fact and dimension tables.
After that I want to create cubes with SSAS from the data warehouse data.
I know that MOLAP stores data. Do I need a data warehouse with fact and dimension tables at all?
Isn't it better to avoid creating the data warehouse and create cubes directly from the OLTP database?

Another thing to note is that data stored in a transactional system may not always be in an end-user-consumable format. For example, we may use bit fields/flags to represent some details in the OLTP system, since the storage required is minimal, but presenting them as-is would not make any sense to users, as they would not know what each bit value represents. In such cases we apply transformations and convert the data into information users can understand. This also lives in the warehouse, so that the information there can be used directly for reporting. Also, in many cases a report will merge data from multiple source systems; merging it on the fly in the report would be tedious and would hit the report server. In comparison, bringing the data onto a common layer (the warehouse) and prebuilding aggregates is beneficial for report performance.
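
    As a tiny illustration of the point about flags (table and column names here are hypothetical), the ETL might decode a bit into a label once, so reports never have to:

    -- OLTP keeps a compact bit; the warehouse keeps the decoded meaning.
    SELECT  o.OrderID,
            CASE o.IsExpedited
                 WHEN 1 THEN 'Expedited'
                 ELSE 'Standard'
            END AS ShippingType
    FROM    oltp.Orders AS o;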
I think (not sure) we join tables in SSAS queries and calculate aggregations in it.
I think SSAS stores these values and the joined tables, so we do not need to evaluate those values again, and this behavior is like a data warehouse.
Isn't it?
So if I do not need historical data, can I avoid creating a data warehouse?
On the backend, SSAS uses queries only to extract the data.
By the way, I was not explaining SSAS; I was explaining what happens inside the data warehouse, which is a relational database by itself. SSAS is used to build cubes (OLAP structures) on top of the data warehouse. A star schema is easier for defining relationships and building aggregations inside SSAS, as it is simple and requires minimal lookups. Also, data is held at the lowest granularity level, which can easily be aggregated to the required levels inside OLAP cubes. Cube processing is very resource intensive, and using the OLTP system would have a huge impact on processing performance, as it is not denormalized, and doing transformations etc. on the fly adds complexity. Pre-creating a layer (the data warehouse) holding data in the required format makes cube processing easier and simpler, as it just has to join tables and aggregate data based on the relationships defined and the level needed inside the cube.
Please mark this as answer if it helps to solve the issue. Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • Accessing Data Warehouse with HTML DB

I have a test data warehouse database (10g) comprising seven dimension tables and one fact table. When I access one table at a time, the query runs fine, but when I join two or more dimension tables to the fact table, the result set comes out wrong. The performance is also very poor. Is HTML DB not capable of properly accessing data warehouse data?
    Here is the query I'm having problem with:
    SELECT p.prod_name, s.store_name, pr.week, sl.dollars
    FROM sales sl, product p, period pr, store s
    WHERE p.prodkey = sl.prodkey
    AND pr.perkey = sl.perkey
    AND p.prod_name LIKE 'Assam Gold%'
    OR p.prod_name LIKE 'Earl%'
    AND s.store_name LIKE 'Instant%'
    AND pr.month = 'NOV'
    AND pr.year = 2003
    ORDER BY p.prod_name, sl.dollars DESC
    Your input would be appreciated.

I doubt this was intentional, but you are not joining the store table to anything. You do filter the rows from that table with the AND s.store_name LIKE 'Instant%' predicate, but it is not joined to any of the other 3 tables. Your query will essentially return the number of rows from the other 3 tables multiplied by the number of rows returned from store. You might also think about grouping some of your predicates for readability and possibly for correct logic:

SELECT p.prod_name, s.store_name, pr.week, sl.dollars
  FROM sales sl, product p, period pr, store s
 WHERE p.prodkey = sl.prodkey
   AND pr.perkey = sl.perkey
   -- Add missing predicate here
   -- AND s.something = sl/p/pr.something
   -- end missing predicate
   AND (p.prod_name LIKE 'Assam Gold%'
        OR
        p.prod_name LIKE 'Earl%')
   AND s.store_name LIKE 'Instant%'
   AND pr.month = 'NOV'
   AND pr.year = 2003
ORDER BY p.prod_name, sl.dollars DESC

Hope this helps,
    Tyler

  • Data Warehouse Jobs stuck at running - Since February!

    Folks,
My incidents have not been groomed out of the console since February. I ran Get-SCDWJob and found most of the jobs are disabled; see below. I've tried to enable all of them using PowerShell, but they are never set back to Enabled.
No errors are present in the event log; in fact, the event log shows the jobs starting successfully.
I've restarted the three services and rebooted the server.
    I've been using this blog post as a guide.
    http://blogs.msdn.com/b/scplat/archive/2010/06/07/troubleshooting-the-data-warehouse-data-warehouse-isn-t-getting-new-data-or-jobs-seem-to-run-forever.aspx
    Anyone have any ideas?
Windows 2008 R2 and SQL 2008 R2 SP1.
BatchId Name                                                  Status    CategoryName     StartTime             EndTime               IsEnabled
13810   DWMaintenance                                         Running   Maintenance      3/22/2013 4:26:00 PM                        True
13807   Extract_DW_ServMgr_MG                                 Running   Extract          2/28/2013 7:08:00 PM                        False
13808   Extract_ServMgr_MG                                    Running   Extract          2/28/2013 7:08:00 PM                        False
13780   Load.CMDWDataMart                                     Running   Load             2/28/2013 7:08:00 PM                        False
13784   Load.Common                                           Running   Load             2/28/2013 7:08:00 PM                        False
13781   Load.OMDWDataMart                                     Running   Load             2/28/2013 7:08:00 PM                        False
13809   MPSyncJob                                             Running   Synchronization  2/28/2013 8:08:00 PM                        True
3405    Process.SystemCenterChangeAndActivityManagementCube   Running   CubeProcessing   1/31/2013 3:00:00 AM  2/10/2013 2:59:00 PM  True
3411    Process.SystemCenterConfigItemCube                    Running   CubeProcessing   1/31/2013 3:00:00 AM  2/10/2013 2:59:00 PM  True
3407    Process.SystemCenterPowerManagementCube               Running   CubeProcessing   1/31/2013 3:00:00 AM  2/10/2013 2:59:00 PM  True
3404    Process.SystemCenterServiceCatalogCube                Running   CubeProcessing   1/31/2013 3:00:00 AM  2/10/2013 2:59:00 PM  True
3406    Process.SystemCenterSoftwareUpdateCube                Running   CubeProcessing   1/31/2013 3:00:00 AM  2/10/2013 2:59:00 PM  True
3410    Process.SystemCenterWorkItemsCube                     Running   CubeProcessing   1/31/2013 3:00:00 AM  2/10/2013 2:59:00 PM  True
13796   Transform.Common                                      Running   Transform        2/28/2013 7:08:00 PM                        False

Okay, I've done too much work without writing it down. I've gotten it to show me a new error using Marcel's script; the error is below.
It looks like a cube issue. Not sure how to fix it.
There is no need to wait anymore for job DWMaintenance because there is an error in module ManageCubeTranslations and the error is: <Errors><Error EventTime="2013-07-29T19:03:30.1401986Z">The workitem to add cube translations was aborted because a lock was unavailable for a cube.</Error></Errors>
Also, running the command Get-SCDWJobModule | fl >> c:\temp\jobs290.txt shows the following errors.
    JobId               : 302
    CategoryId          : 1
    JobModuleId         : 6350
    BatchId             : 3404
    ModuleId            : 5869
    ModuleTypeId        : 1
    ModuleErrorCount    : 0
    ModuleRetryCount    : 0
    Status              : Not Started
ModuleErrorSummary  : <Errors><Error EventTime="2013-02-10T19:58:30.6412697Z">The connection either timed out or was lost.</Error></Errors>
    ModuleTypeName      : Health Service Module
    ModuleName          : Process_SystemCenterServiceCatalogCube
    ModuleDescription   : Process_SystemCenterServiceCatalogCube
    JobName             : Process.SystemCenterServiceCatalogCube
    CategoryName        : CubeProcessing
    Description         : Process.SystemCenterServiceCatalogCube
    CreationTime        : 7/29/2013 12:57:39 PM
    NotToBePickedBefore :
    ModuleCreationTime  : 7/29/2013 12:57:39 PM
    ModuleModifiedTime  :
    ModuleStartTime     :
    ManagementGroup     : DW_Freeport_ServMgr_MG
    ManagementGroupId   : f61a61f2-e0fe-eb37-4888-7e0be9c08593
    JobId               : 312
    CategoryId          : 1
    JobModuleId         : 6436
    BatchId             : 3405
    ModuleId            : 5938
    ModuleTypeId        : 1
    ModuleErrorCount    : 0
    ModuleRetryCount    : 0
    Status              : Not Started
ModuleErrorSummary  : <Errors><Error EventTime="2013-02-10T19:58:35.1028411Z">Object reference not set to an instance of an object.</Error></Errors>
    ModuleTypeName      : Health Service Module
    ModuleName          : Process_SystemCenterChangeAndActivityManagementCube
    ModuleDescription   : Process_SystemCenterChangeAndActivityManagementCube
    JobName             : Process.SystemCenterChangeAndActivityManagementCube
    CategoryName        : CubeProcessing
    Description         : Process.SystemCenterChangeAndActivityManagementCube
    CreationTime        : 2/10/2013 7:58:31 PM
    NotToBePickedBefore : 2/10/2013 7:58:35 PM
    ModuleCreationTime  : 2/10/2013 7:58:31 PM
    ModuleModifiedTime  : 2/10/2013 7:58:35 PM
    ModuleStartTime     : 2/10/2013 7:58:31 PM
    ManagementGroup     : DW_Freeport_ServMgr_MG
    ManagementGroupId   : f61a61f2-e0fe-eb37-4888-7e0be9c08593
    JobId               : 331
    CategoryId          : 1
    JobModuleId         : 6816
    BatchId             : 3406
    ModuleId            : 6242
    ModuleTypeId        : 1
    ModuleErrorCount    : 0
    ModuleRetryCount    : 0
    Status              : Not Started
ModuleErrorSummary  : <Errors><Error EventTime="2013-02-10T19:58:38.7064180Z">Object reference not set to an instance of an object.</Error></Errors>
    ModuleTypeName      : Health Service Module
    ModuleName          : Process_SystemCenterSoftwareUpdateCube
    ModuleDescription   : Process_SystemCenterSoftwareUpdateCube
    JobName             : Process.SystemCenterSoftwareUpdateCube
    CategoryName        : CubeProcessing
    Description         : Process.SystemCenterSoftwareUpdateCube
    CreationTime        : 2/10/2013 7:58:35 PM
    NotToBePickedBefore : 2/10/2013 7:58:39 PM
    ModuleCreationTime  : 2/10/2013 7:58:35 PM
    ModuleModifiedTime  : 2/10/2013 7:58:39 PM
    ModuleStartTime     : 2/10/2013 7:58:35 PM
    ManagementGroup     : DW_Freeport_ServMgr_MG
    ManagementGroupId   : f61a61f2-e0fe-eb37-4888-7e0be9c08593
    JobId               : 334
    CategoryId          : 1
    JobModuleId         : 6822
    BatchId             : 3407
    ModuleId            : 6246
    ModuleTypeId        : 1
    ModuleErrorCount    : 0
    ModuleRetryCount    : 0
    Status              : Not Started
ModuleErrorSummary  : <Errors><Error EventTime="2013-02-10T19:58:42.2943950Z">Object reference not set to an instance of an object.</Error></Errors>
    ModuleTypeName      : Health Service Module
    ModuleName          : Process_SystemCenterPowerManagementCube
    ModuleDescription   : Process_SystemCenterPowerManagementCube
    JobName             : Process.SystemCenterPowerManagementCube
    CategoryName        : CubeProcessing
    Description         : Process.SystemCenterPowerManagementCube
    CreationTime        : 2/10/2013 7:58:39 PM
    NotToBePickedBefore : 2/10/2013 7:58:42 PM
    ModuleCreationTime  : 2/10/2013 7:58:39 PM
    ModuleModifiedTime  : 2/10/2013 7:58:42 PM
    ModuleStartTime     : 2/10/2013 7:58:39 PM
    ManagementGroup     : DW_Freeport_ServMgr_MG
    ManagementGroupId   : f61a61f2-e0fe-eb37-4888-7e0be9c08593
    JobId               : 350
    CategoryId          : 1
    JobModuleId         : 6890
    BatchId             : 3410
    ModuleId            : 6299
    ModuleTypeId        : 1
    ModuleErrorCount    : 0
    ModuleRetryCount    : 0
    Status              : Not Started
ModuleErrorSummary  : <Errors><Error EventTime="2013-02-10T19:58:45.8355723Z">Object reference not set to an instance of an object.</Error></Errors>
    ModuleTypeName      : Health Service Module
    ModuleName          : Process_SystemCenterWorkItemsCube
    ModuleDescription   : Process_SystemCenterWorkItemsCube
    JobName             : Process.SystemCenterWorkItemsCube
    CategoryName        : CubeProcessing
    Description         : Process.SystemCenterWorkItemsCube
    CreationTime        : 2/10/2013 7:58:42 PM
    NotToBePickedBefore : 2/10/2013 7:58:46 PM
    ModuleCreationTime  : 2/10/2013 7:58:42 PM
    ModuleModifiedTime  : 2/10/2013 7:58:46 PM
    ModuleStartTime     : 2/10/2013 7:58:42 PM
    ManagementGroup     : DW_Freeport_ServMgr_MG
    ManagementGroupId   : f61a61f2-e0fe-eb37-4888-7e0be9c08593
    JobId               : 352
    CategoryId          : 1
    JobModuleId         : 6892
    BatchId             : 3411
    ModuleId            : 6300
    ModuleTypeId        : 1
    ModuleErrorCount    : 0
    ModuleRetryCount    : 0
    Status              : Not Started
ModuleErrorSummary  : <Errors><Error EventTime="2013-02-10T19:58:49.6887476Z">Object reference not set to an instance of an object.</Error></Errors>
    ModuleTypeName      : Health Service Module
    ModuleName          : Process_SystemCenterConfigItemCube
    ModuleDescription   : Process_SystemCenterConfigItemCube
    JobName             : Process.SystemCenterConfigItemCube
    CategoryName        : CubeProcessing
    Description         : Process.SystemCenterConfigItemCube
    CreationTime        : 2/10/2013 7:58:46 PM
    NotToBePickedBefore : 2/10/2013 7:58:50 PM
    ModuleCreationTime  : 2/10/2013 7:58:46 PM
    ModuleModifiedTime  : 2/10/2013 7:58:50 PM
    ModuleStartTime     : 2/10/2013 7:58:46 PM
    ManagementGroup     : DW_Freeport_ServMgr_MG
    ManagementGroupId   : f61a61f2-e0fe-eb37-4888-7e0be9c08593

  • Data Warehouse performance since changing retention settings?

    Hi,
I don't know if it's a coincidence, but I have managed to get into a bit of a state with regard to our data warehouse.
Firstly, when the server was specced, I don't think anybody actually worked out what size the databases would turn out to be. I started troubleshooting initially because of a lack of disk space. The DW was set to the defaults of 400 days etc. and had grown to around 700GB; our operations DB is around 50GB in size. The disk at this point had around 50GB of space left.
Anyway, I did some work on the retention in the DW to knock a lot of stuff down to, say, 9 months as needed. A week later the data is now 500GB, although the physical size is still 700GB.
Now, I don't know if it's a coincidence, but in the last couple of days I am getting performance alerts such as not being able to store data in the DW in a timely manner, failures to perform maintenance, and visual indications that things have slowed down on the DW. For example, an SLA report for the month for all servers now times out, when before it ran in a few minutes.
So I am wondering if the "blank" space in the DW is now causing issues, as there is perhaps data at both ends of the database. I would like to get this blank space back, but I am no expert on SQL and wonder whether any other considerations need to be taken for SCOM to get this space back.
I also understand that perhaps more disk space is required for the actual grooming, so maybe I need to get down to 6 months of data before this can happen.
The performance part may not be tied to the issue, but I guess either way I would like to get the space back if possible.
    thanks

There are several possible causes:
1) Check for any event or performance collections which generate a huge DB and DW size, using the SQL queries in
http://blogs.technet.com/b/kevinholman/archive/2007/10/18/useful-operations-manager-2007-sql-queries.aspx
2) You can also refer to the following post:
http://deploymentportal.blogspot.ru/2012/08/operations-manager-data-warehouse.html
3) Check the SQL logs on the data warehouse, especially for blocking problems.
Also, check disk I/O on the data warehouse (the Windows MP collects these metrics). If it affects all management servers and the message does say "timeout", then the problem is likely to be at the SQL end. It may not be maxed out on CPU or memory, but there are likely to be other bottlenecks on the SQL box. What is the disk queue for the disks on which the Operations Manager data warehouse data and log files reside? These are on separate physical disks, aren't they?
My guess is that it is a SQL issue: temporary timeouts suggest that SQL is busy doing something else, and I'd tend to concentrate on disk I/O rather than memory or CPU.
Roger
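
    If, after grooming has caught up, you still want the unused space back, a cautious sketch is below. The logical file name varies per install ('MOM_DATA' is only an example), so look it up first; note that shrinking fragments indexes, so plan an index rebuild afterwards and leave working headroom for grooming:

    USE OperationsManagerDW;
    GO
    -- Find the data file's logical name and current size first.
    SELECT name, size / 128 AS size_mb
    FROM   sys.database_files;

    -- Then shrink toward a target that still leaves free space (in MB).
    DBCC SHRINKFILE (N'MOM_DATA', 550000);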

  • Normalized (3NF) VS Denormalized(Star Schema) Data warehouse :

What are the benefits of a normalized data warehouse (3NF) over a denormalized one (star schema)?
If the DW is in 3NF, is it necessary to create a separate physical database containing several data marts (star schemas) with physical tables that feed the cube, or to create views (SSAS data source views) in star-schema form on top of the 3NF warehouse that feed the cube?
Please explain the pros and cons of 3NF and denormalized DWs.
    thanks in advance.
    Zaim Raza.

    Hi Zaim,
Take a look at this diagram:
1) Normally, a 3NF schema is typical for the ODS layer, which is simply used to fetch data from sources and to generalize, prepare, and cleanse data for the upcoming load to the data warehouse.
2) When it comes to the DW layer (the data warehouse itself), the data modeler's general challenge is to build a historical data silo. A star schema with slowly changing facts and slowly changing dimensions is only partially suitable.
Data Vault and other similar specialized methods provide, in my opinion, wider possibilities and flexibility.
3) A star schema is perfectly suitable for data marts. SQL Server 2008 and higher contain numerous query optimizer improvements to handle such workloads efficiently, and SQL Server 2012 introduced columnstore indexes, which make it possible to build robust star-model data marts with SQL query performance comparable to MS OLAP.
So, my suggestion is:
1) Create a solid, consistent DW solution.
2) Create separate data marts on top of the DW for specific business needs.
3) Create the necessary indexes, PK/FK keys, and statistics (on FKs in fact tables) to help the SQL optimizer as much as possible; a small sketch follows below.
4) Forget the approach of defining an SSAS data source view on top of 3NF (or any other DWH modeling method), since that is the road to performance and maintenance issues in the future.
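
    To make point 3 concrete, here is a minimal star-schema sketch with the foreign key indexed for the optimizer (all names are illustrative):

    CREATE TABLE dbo.DimProduct (
        ProductKey  int          NOT NULL PRIMARY KEY,
        ProductName varchar(100) NOT NULL
    );

    -- Fact at the lowest grain, keyed to its dimensions.
    CREATE TABLE dbo.FactSales (
        DateKey     int           NOT NULL,
        ProductKey  int           NOT NULL
            REFERENCES dbo.DimProduct (ProductKey),
        SalesAmount decimal(12,2) NOT NULL
    );

    -- Index the FK so dimension joins and aggregations stay cheap.
    CREATE INDEX IX_FactSales_ProductKey ON dbo.FactSales (ProductKey);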

  • Differences between operational systems data modeling and data warehouse data modeling

    Hello Everyone,
Can anybody help me understand the differences between operational systems data modeling and data warehouse data modeling?
    Thanks

    Hello A S!
What you mean is the difference between modeling in normal forms, as in operational systems (OLTP), e.g. 3NF, and modeling an InfoCube in a data warehouse (OLAP)?
While in an OLTP system you want data tables free of redundancy and ready for transactions, meaning writing and reading few records often, in an OLAP system you need to read a lot of data for every query you run against the database, and often you aggregate these amounts of data.
Therefore you use a different principle for the database schema, called a star schema. This means you have one central table (called the fact table) which holds the key figures, with keys to other tables holding the characteristics. These other tables are called dimension tables; they hold combinations of the characteristics. Normally you design it so that your dimensions are small, so access to the data is more efficient.
The star schema in SAP BI is a little more complex than explained here, but it follows the same concept.
    Best regards,
    Peter

Maybe you are looking for

  • Simple button re-ordering problem

  • Virtual switch internet connection sharing problem

  • Kernel Driver IST in User Mode?

  • Metadata in xi

    In a file-to-IDoc scenario we define metadata in IDX2, but in an IDoc-to-file scenario is it required to import metadata into XI, and if yes, where in XI do we import it?

  • Data load in the Production takes long time