Data Warehouse update

We are currently looking into reporting and have noticed that the Data Warehouse takes 4 hours to update. We do not have any of our DW jobs set to a 4-hour interval, but we have consistently seen that it takes 4 hours for IRs/SRs to show up in the DW. Is there any reason why it takes so long? Is it possible that we have something configured incorrectly?

There are a couple of jobs involved in moving data to the Data Warehouse. These jobs are collectively called ETL, short for Extract, Transform and Load. When all three jobs have finished, the data should be available in your reports in SSRS.
The Extract job executes every 5 minutes.
The Transform job executes every 30 minutes.
The Load job executes every 60 minutes.
So based on this, it shouldn't take more than a maximum of 95 minutes until the data is available in your reports in SSRS.
If we are talking about cube data, there is a set of additional jobs involved, called Process jobs. These jobs run at night, so the data available in your OLAP reports is from yesterday.
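If you want to measure how fresh the data actually is, a quick check directly against the data mart can help. This is a minimal sketch, assuming the standard SCSM DWDataMart database and its IncidentDimvw view; verify the view and column names in your own environment:
-- Sketch: find the most recently created incidents that have reached the DW,
-- to measure the actual latency between SCSM and the Data Warehouse.
SELECT TOP 10 Id, Title, CreatedDate
FROM   DWDataMart.dbo.IncidentDimvw
ORDER  BY CreatedDate DESC;
Comparing CreatedDate here against the same work items in the SCSM console gives a rough measure of the end-to-end ETL latency.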
Regards
//Anders
Anders Asp | Lumagate | www.lumagate.com | Sweden | My blog: www.scsm.se

Similar Messages

  • Service Manager Data Warehouse - Updates

    Hi,
    I have recently built an Analysis Services cube using the DWDataMart database, and we also have the out-of-the-box cubes set up. This all works correctly, with no issues. I had a look in the database today and noticed that the table ResolvedByUserFactView (along with many others) is simply a view over monthly tables combined, e.g. ResolvedByUserFactJan14, Feb14, etc. However, the monthly tables only exist up to October 2014.
    So my question is: are new tables automatically created for new months as part of the daily ETL process if they don't already exist?
    If not, can I create them myself, and will they be populated automatically come November? Or will I need to look into the jobs as well to populate the Data Warehouse? (I cannot currently access those, and I don't know if there is a way to amend them or if I would have to start from scratch.) Any help would be appreciated.

    Hi,
    These tables are automatically created for each new month, so please don't worry.
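    To see for yourself which monthly tables currently exist behind the combined view, you can list them from the system catalog. A minimal sketch, using the ResolvedByUserFact naming pattern from the question; run it against the DWDataMart database:
    -- List the monthly fact tables behind the view, newest last.
    SELECT name, create_date
    FROM   sys.tables
    WHERE  name LIKE 'ResolvedByUserFact%'
    ORDER  BY create_date;
    After the month rolls over, running this again should show the newly created table.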
    Cheers,
    Marat
    Site: www.scutils.com

  • Update data automatically in fact table in Data Warehouse

    Hi,
    I'm working on the creation of a data warehouse that includes several data sources: SQL Server performance (more than one server), Active Directory users, server performance counters, and Exchange Server mailboxes. The problem is that performance data changes frequently (like CPU and memory), so my question is how to update the data in the fact table every 5 seconds automatically with SSIS.
    Thank you for any advice.

    I'm assuming you have already figured out how to capture the data (e.g. PowerShell, Extended Events, MDW, etc.) and just need to know which dimensions and fact tables you need.
    You need to decide how often you are going to capture this data, and based on that you will have dimensions with the appropriate grain. Don't try to cram everything into the same fact table if it is not all of the same granularity. Also, separate processes usually get separate fact tables.
    In addition to the Date dimension, you will need a Time dimension with a grain of 1 second (or maybe 5 seconds, if that is when you get your data); then run the SSIS package every 5 seconds to capture and append that data to the fact table. A sketch of such a Time dimension follows.
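    As an illustration, here is a minimal T-SQL sketch that generates one row per 5-second interval of a day (17,280 rows); the TimeKey/TimeOfDay column names are hypothetical, so adapt them to your own model:
    -- Seed a 5-second-grain time dimension: 86,400 / 5 = 17,280 rows per day.
    ;WITH n AS (
        SELECT TOP (17280)
               ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1 AS i
        FROM   sys.all_objects a CROSS JOIN sys.all_objects b
    )
    SELECT i                                                AS TimeKey,
           DATEADD(SECOND, i * 5, CAST('00:00:00' AS time)) AS TimeOfDay
    FROM   n;
    The SSIS package would then look up TimeKey from the capture timestamp before appending rows to the fact table.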
    - Aalamjeet Rangi | (Blog)

  • Data not updated in business entity after change in architectural object

    Hi,
    A business entity was created from Architectural Object.
    When we modify the address in the architectural object, the data is not updated in the business entity.
    Is there any solution that makes the address in the business entity update automatically after a change in the architectural object?
    Thanks for your help.
    Regards
    Saad

    Hi,
    I have created a new InfoPackage and ran it. Now I have the following message on the monitor:
    "Data not received in PSA Table
    Diagnosis
    Data has not been updated in PSA Table. The request is probably still running, or there was a short dump.
    Procedure
    In the short dump overview in BW, look for the short dump that belongs to your data request. Make sure the correct date and time are specified in the selection screen.
    You can use the wizard to get to the short dump list, or follow the menu path "Environment -> Short dump -> In Data Warehouse".
    Removing errors
    Follow the instructions in the short dump."
    Any more thoughts?
    Thanks,
    Rao.

  • Configuration Dataset = 90% of Data Warehouse - Event Errors 31552

    Hi All,
    I'm currently running SCOM 2012 R2 and have recently had some problems with the Data Warehouse data sync. We currently have around 800 servers in our production environment and no network devices, and we use Orchestrator for integration with our call-logging system; I believe this is where our problems started. We had a runbook which got itself into a loop and was constantly updating alerts, and it also contributed to a large number of state changes. We have resolved that problem now, but I started to receive alerts saying SCOM couldn't sync alert data, under event 31552.
    Failed to store data in the Data Warehouse.
    Exception 'SqlException': Timeout expired.  The timeout period elapsed prior to completion of the operation or the server is not responding. 
    One or more workflows were affected by this.  
    Workflow name: Microsoft.SystemCenter.DataWarehouse.StandardDataSetMaintenance 
    Instance name: Alert data set 
    Instance ID: XX
    Management group: XX
    I have been researching problems with syncing alert data, and came across the queries to manually run the database maintenance. I ran that on the Alert dataset and it took around 16.5 hours on the first night; then it ran fast (2 seconds) for most of the day, but at about the same time the next day it took another 9.5 hours, so I'm not sure why it's giving different results.
    Initially it appeared all of our datasets were out of sync; after the first night all appear to be in sync except the Hourly Performance dataset, which still has around 161 OutstandingAggregations. When I run the maintenance on Performance it doesn't appear to fix it (it runs in about 2 seconds, successfully).
    I recently ran DWDatarp on the database to see how the Alert dataset was looking, and to my surprise I found that the Configuration dataset has blown out to take up 90% of the Data Warehouse; see the table below. Does anyone have any ideas on what might cause this or how I can fix it?
    Dataset name                                                                  Aggregation name      Max Age    Current Size, Kb
    Alert data set                                                                Raw data                  400         132,224 (  0%)
    Client Monitoring data set                                                    Raw data                   30               0 (  0%)
    Client Monitoring data set                                                    Daily aggregations        400              16 (  0%)
    Configuration dataset                                                         Raw data                  400     683,981,456 ( 90%)
    Event data set                                                                Raw data                  100      17,971,872 (  2%)
    Performance data set                                                          Raw data                   10       4,937,536 (  1%)
    Performance data set                                                          Hourly aggregations       400      28,487,376 (  4%)
    Performance data set                                                          Daily aggregations        400       1,302,368 (  0%)
    State data set                                                                Raw data                  180         296,392 (  0%)
    State data set                                                                Hourly aggregations       400      17,752,280 (  2%)
    State data set                                                                Daily aggregations        400       1,094,240 (  0%)
    Microsoft.Exchange.2010.Dataset.AlertImpact                                   Raw data                    7               0 (  0%)
    Microsoft.Exchange.2010.Dataset.AlertImpact                                   Hourly aggregations         3               0 (  0%)
    Microsoft.Exchange.2010.Dataset.AlertImpact                                   Daily aggregations        182               0 (  0%)
    Microsoft.Exchange.2010.Reports.Dataset.Availability                          Raw data                  400             176 (  0%)
    Microsoft.Exchange.2010.Reports.Dataset.Availability                          Daily aggregations        400               0 (  0%)
    Microsoft.Exchange.2010.Reports.Dataset.TenantMapping                         Raw data                    7               0 (  0%)
    Microsoft.Exchange.2010.Reports.Dataset.TenantMapping                         Daily aggregations        400               0 (  0%)
    Microsoft.Exchange.2010.Reports.Transport.ActiveUserMailflowStatistics.Data   Raw data                    3          84,864 (  0%)
    Microsoft.Exchange.2010.Reports.Transport.ActiveUserMailflowStatistics.Data   Hourly aggregations         7         407,416 (  0%)
    Microsoft.Exchange.2010.Reports.Transport.ActiveUserMailflowStatistics.Data   Daily aggregations        182         143,128 (  0%)
    Microsoft.Exchange.2010.Reports.Transport.ServerMailflowStatistics.Data       Raw data                    7           6,088 (  0%)
    Microsoft.Exchange.2010.Reports.Transport.ServerMailflowStatistics.Data       Hourly aggregations        31          20,056 (  0%)
    Microsoft.Exchange.2010.Reports.Transport.ServerMailflowStatistics.Data       Daily aggregations        182           3,720 (  0%)
    I have one other 31553 event showing up on one of the Management Servers, as follows:
    Data was written to the Data Warehouse staging area but processing failed on one of the subsequent operations.
    Exception 'SqlException': Sql execution failed. Error 2627, Level 14, State 1, Procedure ManagedEntityChange, Line 368, Message: Violation of UNIQUE KEY constraint 'UN_ManagedEntityProperty_ManagedEntityRowIdFromDAteTime'. Cannot insert duplicate key in
    object 'dbo.ManagedEntityProperty'. The duplicate key value is (263, Aug 26 2013  6:02AM). 
    One or more workflows were affected by this.  
    Workflow name: Microsoft.SystemCenter.DataWarehouse.Synchronization.ManagedEntity 
    Instance name: XX 
    Instance ID: XX
    Management group: XX
    which, from my reading, means I'm likely in for an MS Support call. :( But I just wanted to see if anyone has any information about the Configuration dataset, as I couldn't find much in my searching.

    Hi All,
    The results of the MS Support call were as follows. I don't recommend doing these steps without an MS Support case; any damage you do is your own fault. These particular actions resolved our problems:
    1. Regarding the Configuration dataset being so large.
    This was caused by our AlertStage table, which was also very large. We truncated the AlertStage table and ran the maintenance tasks manually to clear this up (see the sketch below). As I didn't require any of the alerts sitting in the AlertStage table, we simply did a straight truncation of the table. The document linked by MHG above shows the process of doing a backup & restore on the AlertStage table if you need to keep the data. It took a few days of running maintenance tasks to resolve this problem properly. As soon as the truncation had taken place, the Configuration dataset dropped in size to less than a gig.
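    For reference, the manual maintenance run mentioned above is usually invoked against OperationsManagerDW along these lines. This is a hedged sketch: 'Alert' is just the example schema name, and as stated above, only do this under the guidance of an MS Support case:
    -- Run standard dataset maintenance manually for the Alert dataset.
    USE OperationsManagerDW;
    DECLARE @DataSetId uniqueidentifier;
    SELECT @DataSetId = DatasetId
    FROM   StandardDataset
    WHERE  SchemaName = 'Alert';
    EXEC StandardDataSetMaintenance @DataSetId;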
    2. Error 31553 Duplicate Key Error
    This was a problem with duplicate keys in the ManagedEntityProperty table. We identified rows which had duplicate information, which could be gathered from the events being logged on the Management Server.
    We then updated a few of these rows to have a slightly different time from what was already in the database, and noticed that the event kept logging a different row each time we updated the previous one. We ran the following query to find out how many rows actually had duplicates:
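    -- Find ManagedEntityProperty rows whose FromDateTime collides with a staged ChangeDateTime (the duplicates).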
    select * from ManagedEntityProperty mep
    inner join ManagedEntity me on mep.ManagedEntityRowId = me.ManagedEntityRowId
    inner join ManagedEntityStage mes on mes.ManagedEntityGuid = me.ManagedEntityGuid
    where mes.ChangeDateTime = mep.FromDateTime
    order by mep.ManagedEntityRowId
    This returned over 25,000 duplicate rows. Rather than adjusting the times on all of those rows, we removed the duplicates from the database. (Best to have MS check this one out for you if you have a lot of data.)
    After doing this there was a lot of data moving through the staging tables (I assume from the management server that couldn't communicate properly), so once again we truncated the AlertStage table, as it wasn't keeping up. Once this was done, everything worked properly and all the queues stayed under control.
    To confirm things had been cleaned up, we checked that the AlertStage table and the ManagedEntityStage table had no entries. We also confirmed that the 31553 events had stopped on the Management Server.
    Hopefully this can help someone, or provide a bit more information on these problems.

  • Syntax for WriterLoginName in Data Warehouse DB

    Hello
    I'm having a few issues with our management servers writing to the Data Warehouse DB. I've checked the 'Management Group' table and can see that WriterLoginName is set to DOMAIN\sv-scom-dw; however, I'm wondering whether that field should instead read
    sv-scom-dw
    The account is in fact a domain account. It's listed as the 'Data Warehouse SQL Account' & 'Data Warehouse Action Account' (under Administration > Run As Configuration > Accounts).
    We have two entries in the database security (with rights over OperationsManagerDW): one as DOMAIN\sv-scom-dw and a local SQL login called sv-scom-dw. Both accounts have the following permissions: apm_datareader, apm_datawriter, db_datareader, db_owner, OpsMgrReader, OpsMgrWriter, public.
    We're a SCOM 2012 R2 environment. All servers are 2012 R2, SQL is also 2012 standard. 
    Has anyone faced a similar issue before? I'm seeing a lot of alerts in the Monitoring section for the Data Warehouse. One in particular:
    Data Warehouse failed to discover performance standard data set. Failed to enumerate (discover) Data Warehouse objects and relationships among them. The operation will be retried.
    Exception 'SqlException': Management Group with id ''5F201AB2-4B10-7FCC-C716-B2361102248D'' is not allowed to access Data Warehouse under login ''sv-scom-dw''
    One or more workflows were affected by this.
    Workflow name: Microsoft.SystemCenter.DataWarehouse.Discovery.StandardDataSet
    Instance name: Performance data set
    Instance ID: {B81C47FB-A80D-0FE5-A8DB-DC4544FC8DA6}
    Management group: ******
    As you can see from the alert, the account referenced is 'sv-scom-dw' and not 'DOMAIN\sv-scom-dw', which is why I originally asked whether the field in the management table should be updated.
    Thanks, David.
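    One way to pin down exactly where that value lives (the table and column names vary between versions) is to search the system catalog of the OperationsManager database. A minimal sketch:
    -- Locate every column that stores a writer login name.
    SELECT OBJECT_NAME(object_id) AS TableName,
           name                   AS ColumnName
    FROM   sys.columns
    WHERE  name LIKE 'WriterLoginName%';
    Inspecting the rows of the table(s) returned shows whether the stored value includes the domain prefix.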

    Hi guys.
    Thanks for the responses; I shall provide an event ID shortly. In response to Mai: I've followed the link you posted and I'm now checking the 'data source and related settings', so I've gone to http://localhost/reports on the warehouse server (which also hosts the reporting), and I've got the following error:
    The report server cannot decrypt the symmetric key that is used to access sensitive or encrypted data in a report server database. You must either restore a backup key or delete all encrypted content. (rsReportServerDisabled)
    Keyset does not exist (Exception from HRESULT: 0x80090016)
    Have you come across this before?

  • Table and Index compression in data warehouse - thoughts?

    Hi,
    We have a data warehouse with large fact tables and materialized views of this data.
    There are approximately 3 million inserts per day, and about 12 million at weekends.
    The fact tables are expected to reach 200 million rows, with a couple in the 1-3 billion range.
    The tables are partitioned and have bitmap indexes.
    I just wondered what your thoughts were on compressing large fact tables and mviews, both from the point of view of ETL into them and of reporting from them afterwards.
    I take it we can compress/uncompress accordingly without any problem?
    Many Thanks

    After compression, most SELECT statements will not get slower; in fact, many get faster due to reduced I/O and buffer needs.
    The situation with DML is more complex. It depends on the exact compression options (basic or advanced) and the DML type (INSERT, UPDATE, direct load, ...), but generally DML is negatively affected by compression.
    In a data warehouse (DW), it is usually quite beneficial to compress partitions or tables that contain data that is not supposed to be modified (read-only or read-mostly). Note that in many cases you do not have to compress while you are loading the data; you can do that later.
    You can also consider compressing some of your B-tree indexes (if you use them in your DW system).
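    For example, compressing an already-loaded, read-mostly partition after the fact could look like this (a sketch in Oracle syntax; the table, partition, and index names are made up):
    -- Compress one historical partition, then rebuild its index partition,
    -- since a MOVE leaves the partition's indexes unusable.
    ALTER TABLE sales_fact MOVE PARTITION sales_2010_q1 COMPRESS;
    ALTER INDEX sales_fact_ix REBUILD PARTITION sales_2010_q1;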
    Iordan Iotzov
    http://iiotzov.wordpress.com/

  • Difference between general DB and Data Warehouse DB

    Hi,
    We have a server on which an Oracle database is already installed, and we want to use it as a data warehouse. My question is whether this database would be sufficient for a data warehouse, or whether I would have to create a new database for it. Is it possible to find out whether the installation was General Purpose or Data Warehouse?
    Also, if I go ahead, would it have any impact if I install the new database directly, without uninstalling the previous Oracle database?
    Appreciate your help
    regards,
    Edited by: user10243788 on Mar 23, 2010 2:09 AM

    While installing, you can select either 'General Purpose' or 'Data Warehouse'. The only difference between the two at installation time is that the init.ora parameters get higher values for a data warehouse database, and these can also be updated manually later. So you can go ahead and install a general-purpose database as well, but you will later need to modify the init.ora parameters to specify higher memory values for parameters such as shared_pool_size, java_pool_size, db_cache_size, etc.
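    For instance, those memory parameters could later be raised like this (a sketch; the values are placeholders and should be sized to your server):
    -- Raise DW-oriented memory parameters in the spfile; restart to apply.
    ALTER SYSTEM SET shared_pool_size = 512M SCOPE = SPFILE;
    ALTER SYSTEM SET java_pool_size   = 128M SCOPE = SPFILE;
    ALTER SYSTEM SET db_cache_size    = 2G   SCOPE = SPFILE;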

  • Permanent Job Opportunity - Oracle BI Data Warehouse Developer Chicago, IL

    Submit Resumes to [email protected]
    The Business Intelligence Specialist will play a critical role in designing, developing, deploying, and supporting data warehouse/data mart applications. In this role, the person will be responsible for all BI aspects of a data warehouse/data mart application. The primary duties will be to create reporting standards, as well as to coach and support power users with the selected Oracle tools. The ideal candidate will have 3+ years of demonstrated experience in data warehousing and Business Intelligence tools, and must also possess excellent communication skills and an outstanding track record with users.
    Principal Duties:
    Participates with internal clients to define software requirements for development, maintenance and/or improvements
    Maintains accuracy, integrity, and availability of the data warehouse
    Tests, monitors, manages, and validates data warehouse activity, including data extraction, transformation, movement, loading, cleansing, and updating processes
    Designs and optimizes data mart models for Oracle Business Intelligence Suite.
    Translates the reporting requirements into data analysis and reporting solutions.
    Reviews and sign off on project plan(s).
    Reviews and sign off on technical design(s).
    Defines and develops BI reports for accessing/analyzing data in warehouse.
    Customizes BI tools and data sets for different types of users.
    Designs and develop UAT (User Acceptance Testing).
    Drives improvement of BI system architecture and development process.
    Develops and maintains internal relationships. Actively champions teamwork. Uses internal resources to enhance knowledge and expertise of industry, research, products and services. Provides information and support to others in the company.
    Required Skills:
    Education and Experience:
    BS/MS in Computer Science or equivalent.
    3+ years of experience with Oracle, PL/SQL Development and Data Warehousing.
    Experience with Oracle Business Intelligence Suite and Crystal Reports is a plus.
    2-3 years dimensional modeling experience.
    Demonstrated hands on experience with Unix/Linux, SQL required.
    Demonstrated hands on experience with Oracle reporting tools.
    Demonstrated experience with translating business requirements into data analysis and reporting solutions.
    Experience in training programs/teach users to use tools.
    Expertise with software development process.
    Effective mediator: able to facilitate constructive and productive discussions with internal customers, external clients, and development personnel pertaining to feature definition, project scope, and status.
    Problem solving: identifies and resolves problems in a timely manner; gathers and analyzes information skillfully and maintains confidentiality.
    Planning/organizing: prioritizes and plans work activities and uses time efficiently. The work requires continual attention to detail in composing and proofing materials, establishing priorities, and meeting deadlines. Must be able to work in a fast-paced environment with a demonstrated ability to juggle multiple competing tasks and demands.
    Quality control: demonstrates accuracy and thoroughness and monitors own work to ensure quality.
    Adaptability: adapts to changes in the work environment, manages competing demands, and is able to deal with frequent change, delays, or unexpected events.
    Benefits/Compensation:
    Employees enjoy competitive compensation. We have a full benefits package including medical and dental insurance, long-term disability and life insurance and a 401(k) plan.
    The client operates within the healthcare industry.
    This is a permanent full-time position. After ensuring your availability and qualifications we will put you in direct contact with the client to move forward in the process.

    FORWARD THE UPDATED RESUME AS SOON AS POSSIBLE.

  • R/3 extraction - Missing Messages:No data from Update rules

    Hi SAP Gurus,
    I'm in the process of extracting R/3 data into BW for 0PUR_O01. The DataSource we are using is 2LIS_02_ITM. The scheduled InfoPackage runs for over 12 hours, and we are getting the typical Missing Messages error. It's in yellow status, with one out of 3 data packets showing 'Everything OK' while the other two show 'Warning Received'.
    The PSA and transfer rules have all the records, but the update rules show 11339 -> 0 records. We have checked, and there is no dump in ST22. It also shows 11339 to 0 in the start routine of the update rules. I'm giving all possible hints in the hope of drawing a wide variety of suggestions. Please suggest a way forward.

    Hi,
    Try if this works for you.
    In this case you can execute the LUWs to get the missing data into BW. Here is the procedure.
    1. Go to Manage Data Target -> Monitor, or go directly to the Monitor for the request that failed.
    2. Select: Environment -> Transact RFC -> In the Data Warehouse.
    3. Enter the selection criteria (date, user, TCODE) and execute.
    4. Select Edit -> Execute LUW.
    This will restart the loading process only for the data package that failed because of space issues or server problems, thereby avoiding the need to reload the entire request.
    Hope this info helps you.
    Regards,
    Yogesh

  • Upgrade OM 2012 to SP1 Beta - Version of SQL Server for the Operational Database and the Data Warehouse

    Hello,
    When I try to verify the prerequisites to upgrade my SCOM 2012 UR2 platform to SP1 Beta, I get these errors:
    The installed version of SQL Server is not supported for the operational database.
    The installed version of SQL Server is not supported for the data warehouse.
    But when I execute the query SELECT @@VERSION on my MSSQL instance, the result is:
    Microsoft SQL Server 2008 R2 (SP1) - 10.50.2500.0 (X64)   Jun 17 2011 00:54:03   Copyright (c) Microsoft Corporation  Standard Edition (64-bit) on Windows NT 6.1 <X64> (Build 7600: ) (Hypervisor) 
    But here we can see that the following are supported: SQL Server 2008 R2 SP1, SQL Server 2008 R2 SP2, SQL Server 2012, and SQL Server 2012 SP1.
    Do I need to patch my MSSQL server with a specific cumulative update package?
    Thanks.

    These are the requirements for your SQL:
    SQL Server 2008 and SQL Server 2012 are available in both Standard and Enterprise editions. Operations Manager will function with both editions.
    Operations Manager does not support hosting its databases or SQL Server Reporting Services on a 32-bit edition of SQL Server.
    Using a different version of SQL Server for different Operations Manager features is not supported. The same version should be used for all features.
    SQL Server collation settings for all databases must be one of the following: SQL_Latin1_General_CP1_CI_AS, French_CI_AS, Cyrillic_General_CI_AS, Chinese_PRC_CI_AS, Japanese_CI_AS, Traditional_Spanish_CI_AS, or Latin1_General_CI_AS.  No other collation
    settings are supported.
    The SQL Server Agent service must be started, and the startup type must be set to automatic.
    Side-by-side installation of System Center Operations Manager 2007 R2 reporting and System Center 2012 Service Pack 1 (SP1) Operations Manager reporting on the same server is not supported.
    The db_owner role for the operational database must be a domain account. If you set the SQL Server Authentication to Mixed mode, and then try to add a local SQL Server login on the operational database, the Data Access service will not be able to start.
    For information about how to resolve the issue, see
    System Center Data Access Service Start Up Failure Due to SQL Configuration Change
    If you plan to use the Network Monitoring features of System Center 2012 – Operations Manager, you should move the tempdb database to a separate disk that has multiple spindles. For more information, see
    tempdb Database.
    http://technet.microsoft.com/en-us/library/jj656654.aspx#BKMK_RBF_OperationsDatabase
    Check the SQL Server Agent service and make sure it is set to Automatic AND started. This got me confused at my first SP1 install as well; it is not done by default...
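    As a side note, the exact version, service pack level, and edition that the prerequisite checker sees can be confirmed with SERVERPROPERTY (a minimal check, easier to read than parsing @@VERSION):
    -- Confirm version (e.g. 10.50.2500.0), SP level (e.g. SP1), and edition.
    SELECT SERVERPROPERTY('ProductVersion') AS Version,
           SERVERPROPERTY('ProductLevel')   AS ServicePack,
           SERVERPROPERTY('Edition')        AS Edition;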
    It's doing common things uncommonly well that brings success.

  • Data Warehouse and ETL tools for data verification ?

    How do we do data verification using an ETL tool? And how does this relate to the data warehouse?
    Thanks in advance

    Hi  Shyamal Kumar,
    1) BW itself facilitates the ETL (Extraction, Transformation, Loading) steps. For example:
         Extraction - from SAP or other databases
         Transformation - using transfer rules and update rules
         Loading - into ODS objects, cubes, and master data
    2) The ETL tools typically used in the industry are:
         a) DataStage from Ascential (owned by IBM)
         b) Informatica
         c) Mercator
    For the verification itself, see the sketch after this reply.
    Regards, BB
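    As a concrete starting point for the verification question: the most common ETL check is reconciling row counts (and, where needed, checksums) between source and target after each load. A minimal sketch in SQL; all table names here are placeholders:
    -- Compare source and warehouse row counts for one load.
    SELECT (SELECT COUNT(*) FROM source_db.dbo.orders)      AS source_rows,
           (SELECT COUNT(*) FROM target_dw.dbo.fact_orders) AS target_rows;
    Any difference between the two counts flags a load that needs investigation.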

  • Unable to register my data warehouse in Service Manager

    I have been trying to register my data warehouse, but I keep getting the same error message each time: "Invalid URI: the hostname could not be parsed." I know the issue is on the ServiceManager database side of things, but there is not a lot of information related to this error message, and the info I do find is unrelated to Service Manager. I have gone as far as building a brand-new data warehouse server and reinstalling all of the data warehouse databases from scratch. It doesn't matter whether I try to register with the original data warehouse environment or the new one; I get the same error message. I also received a PowerShell command from Microsoft Support to help clean up any residual entries in the ServiceManager database that might have been left over. At this point I'm at a loss as to how to proceed. Has anyone ever run into this issue?
    Here's the event log message:
    Unable to register Service Manager installation with Data Warehouse installation.
     Data Warehouse Server: DW_SCSM
     Service Manager Management Server: FlowSMRC
     Exception: Microsoft.EnterpriseManagement.UI.Core.Shared.PowerShell.PSServiceException: Invalid URI: The hostname could not be parsed.
       at Microsoft.EnterpriseManagement.UI.Core.Shared.PowerShell.PSHostService.executeHelper(String cmd, Object[] input)
       at Microsoft.EnterpriseManagement.UI.Core.Shared.PowerShell.PSHostService.Invoke(String cmd, Object[] input)
       at Microsoft.EnterpriseManagement.ServiceManager.UI.Administration.DWRegistration.Registration.DWRegistrationHelper.AcceptChanges(WizardMode wizardMode)
    Also, I tried registering the data warehouse using the PowerShell cmdlet Add-SCDWMgmtGroup, and got an error saying: Cannot associate Service Manager installation on scsm_server with Service Manager Data Warehouse installation on dw_server.

    I just got additional information about this error. If you run the following query against the ServiceManager database, you should see the name of the Management Server:
    select * from MT_Microsoft$SystemCenter$ResourceAccessLayer$SdkResourceStore
    There is a column named Server_<some GUID> that should contain the Management Server's NetBIOS name. If it holds some other name, run an update query to ensure the column has the NetBIOS name of the Management Server. Then restart the SCSM console and retry the database registration; it should succeed.
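    If the value does need correcting, the update might look roughly like this. Everything here is a placeholder: the real column name carries a GUID suffix that you must copy from the SELECT above, and 'SCSMMS01' stands in for your Management Server's NetBIOS name. Back up the database first:
    -- Hypothetical corrective update; discover the real Server_<GUID> column first.
    UPDATE MT_Microsoft$SystemCenter$ResourceAccessLayer$SdkResourceStore
    SET    [Server_00000000-0000-0000-0000-000000000000] = 'SCSMMS01';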
    Microsoft has said that this is normally caused by moving the database or the database server to another host without this value ever being updated.
    Try registering the DW Server again and see if you have better luck.

  • Tables between OBAW and Oracle Data Warehouse in OBIA rpd are different

    Hi,
    The tables in the Data Warehouse DB are different from the tables in the physical layer of the 'Oracle Data Warehouse' database in OracleBIAnalyticsApps.rpd.
    When I click Update Row Count, it gives me the error message below:
    There was an error while updating the row count for "Oracle Data Warehouse"."Catalog"."DBO"."W_CTRY_REGN_D":
    [nQSError: 17001] Oracle Error Code: 942, message: ORA-00942: table or view does not exist, at OCI call OCIStmtExecute
    Whereas when I click on some other tables for the row count, there is no error.
    Also, when I look for a specific table which exists in the Data Warehouse (and has data), it does not exist in the rpd.
    Kindly assist me.
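    A quick way to confirm whether the table really exists in the warehouse schema (a sketch in Oracle syntax, run as a user with dictionary access):
    -- ORA-00942 usually means the table is missing or owned by another schema.
    SELECT owner, table_name
    FROM   all_tables
    WHERE  table_name = 'W_CTRY_REGN_D';
    If the table turns up under a different owner, the rpd's connection pool or catalog mapping is pointing at the wrong schema.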

    Hi,
    Have a look at: Not able to View Data in Answers
    Let me know how it goes.
    Thanks,
    saichand.v

  • Sort_area_size in data warehouse applications

    Hi everyone,
    I am working on optimizing a data warehouse application that involves inserts and updates on tables with bitmap indexes. How do I decide the proper value for sort_area_size? I think my application is doing a lot of I/O when inserts/updates are performed on the column that has the bitmap index. I tried various values for sort_area_size with no improvement in performance. The default sort_area_size is 5 MB, and I tried incrementing the value until it reached 100 MB, but I could not determine a value for sort_area_size that gives good performance. A reference to any relevant documentation would also be appreciated.
    Your help is very much appreciated.
    Thanks
    Suresh.

    Suresh,
    Please refer to the following documentation:
    http://otn.oracle.com/docs/products/oracle9i/doc_library/release2/server.920/a96533/memory.htm#39086
    Also, this seems to be a server technology question; please post in the Server Technology forum for more details.
    Regards:
    Igor
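    One more note: from Oracle 9i onwards, manual sort_area_size tuning is usually superseded by automatic PGA management, which could be enabled roughly like this (a sketch; size the target to your server):
    -- Let Oracle size sort work areas automatically instead of per-session tuning.
    ALTER SYSTEM SET workarea_size_policy = AUTO;
    ALTER SYSTEM SET pga_aggregate_target = 1G;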
