Partition on InfoCube F table for loading & query performance

I am thinking of creating partitions on our SalesStatistics (Utility industry) cube. The reason we are not splitting the cube into yearly/monthly cubes is that posting transactions can be backdated, and maintaining all those InfoCubes would be hard.
With the InfoCube partitioned by month, I am thinking of running ABAP code in the process chains to drop the index on only the current partition. I know that a few transactions might get directed to other partitions, but not a whole lot. I am trying to avoid dropping the indexes on the entire cube and then rebuilding them, as that takes 5-6 hours.
Has anyone written/used such code to drop the index on a single partition? Or does anyone have a clue whether this is an okay thing to do? Any ideas?
SYSTEM: BW 3.5, custom Utility Sales Statistics cube, total: 1,000 million records, each delta about 700K records
The cube is extremely slow in loading as well as in queries. I am also planning to split the cube into yearly cubes, with the current year partitioned by month. A MultiCube would sit on top, and an ODS would provide item-level data in front of the cube.
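At the database level, the closest thing I can see to "dropping the index on one partition" would be to mark the local index partition unusable before the load and rebuild just that partition afterwards. An untested sketch (the index and partition names below are placeholders, not the real system-generated names):
-- Placeholder names: the real F-table index and partition names are generated
-- by BW and would have to be looked up in the data dictionary first.
ALTER SESSION SET skip_unusable_indexes = TRUE;
ALTER INDEX "/BIC/FZSALES~010" MODIFY PARTITION P200501 UNUSABLE;
-- ... run the delta load for the current month here ...
ALTER INDEX "/BIC/FZSALES~010" REBUILD PARTITION P200501;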
Thanks in advance...
Bilal

I'll post some of the specific questions later. It seems we do not get much response when the overall design is the concern.
Thanks anyway.

Similar Messages

  • How to create DB partitioning in active data tables for ods?

    hi all,
Can anyone let me know how to create DB partitioning on the active data tables for an ODS? If you have any docs, please share them with me at my email id: [email protected]
    regds
    haritha

    Haritha,
The following steps briefly explain how to improve performance, both in terms of DB partitioning and loading:
transaction RSCUSTA2,
OSS notes 120253, 565725, 670208,
and remove the 'BEx Reporting' setting in the ODS if that ODS is not used for reporting.
    hope this helps.
Note 565725
    Symptom
    This note contains recommendations for improving the load performance of ODS objects in Business Information Warehouse Release 3.0B and 3.1 Content.
    Other terms
    Business Information Warehouse, ODS object, BW, RSCUSTA2, RSADMINA
    Solution
    To obtain a good load performance for ODS objects, we recommend that you note the following:
    1. Activating data in the ODS object
    In the Implementation Guide in the BW Customizing, you can implement different settings under Business Information Warehouse -> General BW settings -> Settings for the ODS object that will improve performance when you activate data in the ODS object.
2. Creating SIDs
    The creation of SIDs is time-consuming and may be avoided in the following cases:
a) You should not set the BEx Reporting indicator if you are only using the ODS object as a data store. Otherwise, SIDs are created for all new characteristic values when this indicator is set.
    b) If you are using line items (for example, document number, time stamp and so on) as characteristics in the ODS object, you should mark these as 'Attribute only' in the characteristics maintenance.
SIDs are created at the same time if parallel activation is activated (see above). They are then created using the same number of parallel processes as those set for the activation. However: if you specify a server group or a special server in Customizing, these specifications only apply to the activation and not to the creation of SIDs. The creation of SIDs runs on the application server on which the batch job is also running.
3. DB partitioning on the table for active data (technical name:
The process of deleting data from the ODS object may be accelerated by partitioning at the database level. Select the characteristic after which you want deletion to occur as a partitioning criterion. For more details on partitioning database tables, see the database documentation (DBMS CD). Partitioning is supported with the following databases: Oracle, DB2/390, Informix.
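As a rough illustration, not taken from the note itself (the table, column, and partition names below are invented), month-wise range partitioning of an active table on Oracle could look like this:
-- Hypothetical active-data table, range-partitioned by calendar month so that
-- whole months can be removed with a fast DROP PARTITION instead of a row-by-row DELETE.
CREATE TABLE zods_active (
   doc_number  VARCHAR2(10),
   customer    VARCHAR2(10),
   calmonth    VARCHAR2(6),      -- e.g. '200501'
   amount      NUMBER
)
PARTITION BY RANGE (calmonth) (
   PARTITION p200501 VALUES LESS THAN ('200502'),
   PARTITION p200502 VALUES LESS THAN ('200503'),
   PARTITION pmax    VALUES LESS THAN (MAXVALUE)
);
-- Deleting one month of data then becomes a metadata operation:
ALTER TABLE zods_active DROP PARTITION p200501;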
4. Indexing
Selection criteria should be used for queries on ODS objects. The existing primary index is used if the key fields are specified. As a result, the characteristic that is accessed more frequently should be left justified. If the key fields are only partially specified in the selection criteria (recognizable in the SQL trace), the query runtime may be optimized by creating additional indexes. You can create these secondary indexes in the ODS object maintenance.
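At the database level such a secondary index is simply an ordinary index on the active table. A hypothetical sketch, reusing the invented table above, with the more frequently restricted characteristic leading:
-- Hypothetical secondary index; the column used most often in selections comes first.
CREATE INDEX zods_active_i1 ON zods_active (customer, calmonth);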
5. Loading unique data records
    If you only load unique data records (that is, data records with a one-time key combination) into the ODS object, the load performance will improve if you set the 'Unique data record' indicator in the ODS object maintenance.
    Hope this helps..
    ****Assign Points****
    Thanks,
    Gattu

  • Does having more LTSs in a logical dimension table hurt query performance?

    Hi,
We have a logical table with around 19 LTSs. Does having more LTSs in a logical dimension table hurt query performance?
    Thanks,
    Anilesh

    Hi Anilesh,
It's kind of both YES and NO. Here is why...
NO:
LTSs are supposed to give the BI Server a logical, optimized way to retrieve the data. So having more LTSs might give the BI Server some good options tailored to a variety of analysis requests.
YES:
Many times we have to bring in multiple physical tables as part of a single LTS (mostly when the physical model is a snowflake), which can cause performance issues. Say there is an LTS with two tables "A" and "B"; for an ad-hoc analysis only on columns in "A", the query would still include the join with table "B" if this LTS is used. We might want to avoid this kind of situation by having multiple LTSs, one for each table and one for both of them.
    Hope this helps.
    Thank you,
    Dhar

  • Standard table for finding query where used list

    Are there any standard tables available to find the list of web templates, bex reports and views using a particular query?
i.e., if I have the query name, I should be able to find out everywhere it is being used.
    Thanks,
    Archna

    Hello,
    You have to join multiple tables to achieve this, here we go
    FOR QUERY Related Information
    RSRREPDIR - Directory of all reports (Query GENUNIID)
    Choose Type of a reporting component -> REP
    Tips :
    Choose Type of a reporting component -> QVW for query view
    SE11 -> RSZ* -> F4 gives you all tables related to queries
    For WORKBOOK Related Information
    Use the FM RRMX_WORKBOOK_QUERIES_GET to get the queries in a workbook by selecting the workbook ID from table RSRWORKBOOK
    Tips :
    SE11 -> RSRWB* -> F4 gives you all tables related to workbooks
    SE37 ->RRMX_WORKBOOKS* -> F4 gives you all the FM related to workbooks
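As a quick illustration of the RSRWORKBOOK route in plain SQL (the field names are the standard ones but should be verified in SE11 before use):
-- RSRWORKBOOK maps a workbook ID to the query GENUNIID,
-- which RSRREPDIR resolves to the query name (COMPID).
SELECT r.COMPID
  FROM RSRWORKBOOK w
  JOIN RSRREPDIR   r ON r.GENUNIID = w.GENUNIID
 WHERE w.WORKBOOKID = '<workbook ID>';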
    For Webtemplate Related Information
    Choose the dataprovider for query / view ID.
    SE11 -> RSZWOBJXREF - Structure of the BW Objects in a Template
    Also see,
SE11 -> RSZWTEMPLATE - Header Table for BW HTML Templates
    Thanks
    Chandran

  • SSP5: Using multiple JVM for load balance performance?

    Sun 12 MAY 2002
    Apps 11.0.3
    SSP5 patchset I
    HP UX 11.0
    db 8.1.7.2 (64-bit)
    Load 60 concurrent sessions, each spawning 2.5 - 3 http connections.
    CPU 3:750Mhz
    RAM 8G
Is anybody using multiple JVMs for load balancing?
    What is your ratio of JVM to concurrent iP sessions?
    Are you running Apache/Jserv on a server with any other applications, or is Apache by itself?
If you are not using multiple JVMs, how many httpd processes do you get before Apache implodes? We were dead in the water at 90 httpd processes, but performance degraded starting at 70 sessions.
    Thx - Don

Using Web Cache to load balance servlet-based Forms (6i and 9i) is unofficially supported. I say "unofficially" because we have actual customers doing it and getting support, but the 2 development teams (Forms and Web Cache) haven't actually done any integration testing of this sort of configuration yet. For your case, please contact your Support rep and ask what was done to use Web Cache as a load balancer for Forms 6i at METRO in Germany. The Forms product management team is writing up a white paper describing how to do it, but until then, you'll need to go through Support. Please contact me if you want more information.

  • SQL*Loader or external table for loading an MSG (email) file

    Hi there!
I'm looking for a way to load an email into an Oracle DB.
I mean, not the entire email body in one column, but "parsing" it into a multi-column/table structure.
    Is it possible to do with a sql*loader script or an external table?
    I think it is not possible, and that I must switch to XML DB.
    Any idea?
    Thanks,
    Antonio

    Hello,
Why don't you just load the entire MSG (email) as a CLOB into one email_body column, or whatever column name you want to use?
To load data up to 32K you can use VARCHAR2(32656), but it is not a good idea to load a CLOB that way because the length can vary, resulting in "string literal too long" errors. So you have two choices: use a procedure or an anonymous block to load the CLOB data, or use SQL*Loader.
    First Method -- I loaded alert.log successfully and you can imagine how big this file can be (5MB in my test case)
CREATE OR REPLACE DIRECTORY DIR AS '/mydirectory/logs';
DECLARE
   clob_data   CLOB;
   clob_file   BFILE;
BEGIN
   INSERT INTO t1clob
        VALUES (EMPTY_CLOB ())
     RETURNING clob_text INTO clob_data;
   clob_file := BFILENAME ('DIR', 'wwalert_dss.log');
   DBMS_LOB.fileopen (clob_file);
   DBMS_LOB.loadfromfile (clob_data,
                          clob_file,
                          DBMS_LOB.getlength (clob_file));
   DBMS_LOB.fileclose (clob_file);
   COMMIT;
END;
Second Method: Use of SQL*Loader
Example of controlfile:
LOAD DATA
INFILE alert.log "STR '|\n'"
REPLACE INTO TABLE t1clob
(
   clob_text CHAR(30000000)
)
Hope this helps

  • Use data set from a SqlDataSource as an input table for another query

I have an ASP.NET Web Forms app with C# code-behind. I am trying to provide a forecasting function that allows users to perform "what if" analysis on parts of the database without changing the underlying database. Currently this forecast works by taking the production schedule from a table and running it against the forecasting query. What I'm trying to accomplish is giving the user a copy of the data in a GridView so they can edit it and see the differences between their schedule and the current one.
My first attempt involved having a SqlDataSource create a temporary table and populate it with data from the main database. This covers the first part (giving the user a copy of the data set to manipulate without changing the original). I then tried to create a second SqlDataSource and reference the temporary table, but it is not valid; it appears the temp table only exists in the session opened by the first SqlDataSource.
    Currently the first data source looks like this
CREATE TABLE #mastersched (
    product [char](16) NULL,
    [date]  [datetime] NULL
);

INSERT INTO #mastersched (
    product,
    [date]
)
SELECT
    product,
    [date]
FROM (Main database table);

SELECT
    product,
    [date]
FROM #mastersched;
    I need to use the table #mastersched in a join statement within the forecasting query to return another dataset to be displayed either in a gridview, chart, report, etc.
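One possible workaround, offered only as a sketch (it is not discussed in the reply below, and the staging table, column names, and @SessionId parameter are all invented), is to stage the copy in a permanent table keyed by a session identifier, so that a second SqlDataSource on its own connection can still see it:
-- Sketch: a permanent staging table keyed per user session instead of a #temp
-- table, so a second SqlDataSource (on its own connection) can still see the copy.
CREATE TABLE dbo.mastersched_staging (
    session_id  uniqueidentifier NOT NULL,
    product     char(16)         NULL,
    sched_date  datetime         NULL
);

DECLARE @SessionId uniqueidentifier = NEWID();   -- in the page this would be a parameter

-- Populate the user's working copy (dbo.MainScheduleTable is a placeholder name):
INSERT INTO dbo.mastersched_staging (session_id, product, sched_date)
SELECT @SessionId, product, [date]
FROM   dbo.MainScheduleTable;

-- The second data source / forecasting query then filters on session_id:
SELECT s.product, s.sched_date
FROM   dbo.mastersched_staging AS s
WHERE  s.session_id = @SessionId;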

Hello REIData,
This issue is more about ASP.NET, so I suggest posting it in the appropriate forum:
http://forums.asp.net/
There are ASP.NET experts there who will help you better.
Thanks for your understanding.
Regards.

  • R12 interface tables for loading data from a 3rd-party payroll system

    Hi All,
Can anyone help me with a list and detailed technical information of the available interface tables in Oracle R12 for importing/loading data from a third-party payroll system? And what would be the best way of importing the data: should it be loaded first into AP and then to GL, or loaded directly into GL?
    Any help much appreciated. Thanks.
    Cyrus

    Hi Cyrus,
Can you please let us know the business requirements of this integration, i.e., what the business wants to achieve out of it?
It depends on your business requirements. If you only need to send accounting information from the payroll system to Oracle GL, then you can integrate the payroll system with Oracle GL directly, by sending the accounting information from your payroll. If your requirement is to create payroll-related invoices in AP, make the payments in Oracle AP, and then pass the accounting information to GL, then integrate your payroll with AP.
    Regards,
    Madhav

  • Table for the query role

    Hi All,
I need the SAP BW table name to find out the role for a particular query.
    Thanks
    Prasanna

    Hi Prasanna,
The tables RSRREPDIR and AGR_HIER give the relationship between the BEx query and the role.
    COMPID in RSRREPDIR is query name.
    AGR_NAME in AGR_HIER is role name.
    Join criteria is:
RSRREPDIR.GENUNIID = AGR_HIER.SAP_GUID
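Expressed as plain SQL, the lookup would be roughly (using the fields given above):
-- COMPID = query technical name, AGR_NAME = role name
SELECT a.AGR_NAME
  FROM RSRREPDIR r
  JOIN AGR_HIER  a ON a.SAP_GUID = r.GENUNIID
 WHERE r.COMPID = '<query technical name>';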
    Hope this helped.
    Thanks
    Vasu

  • Using VBA for loading Query data into Excel workbook

    Hi all.
I simply want to load data from a BEx query into an Excel worksheet using VBA, because of the report formats and a footer section at the end of the results.
Any code examples, tutorials, comments, or suggestions would be appreciated.
    thanx in advance,
    Gediminas

The difficulty is that I don't know the number of rows the report will return. And I need my footer only on the LAST page of the workbook.
Another thing I can't imagine how to do using standard BEx functionality is designing a complex column header set (merged columns, sub-columns, etc.).

  • Table for SAP Query Authors

    Hi,
Is there an SAP table to determine the author of a query from transaction SQ01?

    Jim,
Check the table "AQGDBBG"
Field name: BGUNAM
Don't forget to reward if useful
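For example, to see the entries created by one user (a quick sketch based only on the reply above; 'JSMITH' is a made-up user name and the remaining fields of AQGDBBG are not shown):
-- Sketch: BGUNAM is named above as the author field in AQGDBBG
SELECT * FROM AQGDBBG WHERE BGUNAM = 'JSMITH';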

  • TDE Table encryption SQL Query performance is very very slow

    Hi,
We have encrypted one column of a table using the TDE method with the NO SALT option, and it has impacted the response time of the SQL query, which went up to 32 hours.
Oracle database version is 10.2.0.5.
For example:
alter table abc modify (numberx encrypt no salt);
After the encryption the SQL execution takes much longer; the statement is below.
    ================================
    declare fNumber cardx.numberx%TYPE;
    fCount integer :=0;
    fserno cardx.serno%TYPE;
    fcaccserno cardx.caccserno%TYPE;
    ftrxnfeeprofserno cardx.trxnfeeprofserno%TYPE;
    fstfinancial cardx.stfinancial%TYPE;
    fexpirydate cardx.expirydate%TYPE;
    fpreviousexpirydate cardx.previousexpirydate%TYPE;
    fexpirydatestatus cardx.expirydatestatus%TYPE;
    fblockeddate cardx.blockeddate%TYPE;
    fproduct cardx.product%TYPE;
    faccstmtsummaryind cardx.accstmtsummaryind%TYPE;
    finstitution_id cardx.institution_id%TYPE;
    fdefaultaccounttype cardx.defaultaccounttype%TYPE;
    flanguagecode cardx.languagecode%TYPE;
    froute integer;
begin
   for i in (select c.numberx from cardx c where c.stgeneral = 'NORM')
   loop
      select c.serno, c.caccserno, c.trxnfeeprofserno, c.stfinancial, c.expirydate,
             c.previousexpirydate, c.expirydatestatus, c.blockeddate, c.product,
             c.accstmtsummaryind, c.institution_id, c.defaultaccounttype, c.languagecode,
             (select count(*)
                from caccountrouting ar
               where ar.cardxserno = c.serno
                 and ar.rtrxntype = ISS_REWARDS.GetRewardTrxnTypeserno)
        into fserno, fcaccserno, ftrxnfeeprofserno, fstfinancial, fexpirydate,
             fpreviousexpirydate, fexpirydatestatus, fblockeddate, fproduct,
             faccstmtsummaryind, finstitution_id, fdefaultaccounttype, flanguagecode, froute
        from cardx c
       where c.numberx = i.numberx;
      fCount := fCount + 1;
   end loop;
   dbms_output.put_line(fCount);
end;
    ===============================
Any help would be greatly appreciated.
    Thanks,
    Mohammed.

Still, that's not enough evidence to prove that TDE is indeed the culprit. Can you trace the query before and after enabling TDE using event 10046 and post it here?
    Aman....
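For reference, enabling such a 10046 trace for the session is typically done like this (standard Oracle syntax, not specific to this system):
-- Extended SQL trace (event 10046, level 12 = binds + waits) for the current session
ALTER SESSION SET tracefile_identifier = 'tde_test';
ALTER SESSION SET EVENTS '10046 trace name context forever, level 12';
-- ... run the PL/SQL block here, once with and once without the encrypted column ...
ALTER SESSION SET EVENTS '10046 trace name context off';
-- The raw trace file (in user_dump_dest) can then be formatted with tkprof.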

  • Adding indexes to a table is slowing down query performance.

I am running a query against a table which contains approx. 4 million records. So I created 4 indexes on the table and noticed that the performance of my query decreased drastically. I went back and began removing and creating the indexes in different combinations. It turns out that whenever two of the four indexes are created, the performance worsens. The strange thing about this problem is that when I do an explain plan on the query, the cost is greater when the performance is better and the cost is less when the performance is worse. Also, Oracle only uses one of the four indexes on the table for this query.
    I'd like to try to understand what is going on with the Oracle optimizer to try to fix this problem.

    Mark,
    Below is the information you requested.
    DATABASE: 10.2.0.3.0
    QUERY:
    select distinct object, object_access from betweb_objects
    where instr(object_access,'RES\') = 0
    and object_access_type = 'ADM'
    and object in (select distinct object
    from betweb_objects
    where instr(object_access,'RES\') = 0
    and object_access_type = 'NTK'
    and object not like '%.%'
    and substr(object_access,instr(object_access,'\')+1) in (select distinct substr(object_access,instr(object_access,'\')+1)
    from betweb_objects
    where object_access_type = 'NTK'
    and instr(object_access,'RES\') = 0
    minus
    select distinct upper(id)
    from uamp.ad_users
    where status = 'A'))
    TABLE:
    BETWEB_OBJECTS
    OBJECT                VARCHAR2
    OBJECT_ACCESS           VARCHAR2
    OBJECT_ACCESS_TYPE      VARCHAR2
INDEXES ON BETWEB_OBJECTS:
BETWEB_OBJECTS_IDX1: OBJECT
BETWEB_OBJECTS_IDX2: OBJECT_ACCESS
BETWEB_OBJECTS_IDX3: OBJECT_ACCESS_TYPE
BETWEB_OBJECTS_IDX4: OBJECT_ACCESS, OBJECT_ACCESS_TYPE
    TABLE:
    AD_USERS
    ID                VARCHAR2
    DOMAIN           VARCHAR2
    FNAME           VARCHAR2
    LNAME           VARCHAR2
    INITIALS           VARCHAR2
    TITLE           VARCHAR2
    DN                VARCHAR2
    COMPANY           VARCHAR2
    DEPARTMENT      VARCHAR2
    PHONE           VARCHAR2
    MANAGER           VARCHAR2
    STATUS           VARCHAR2
    DISPLAY_NAME      VARCHAR2
    EXPLAIN PLAN when performance is better:
    SELECT STATEMENT      Rows=13,414      Time=643,641      Cost=53,636,676      Bytes=6,948,452
    HASH UNIQUE           Rows=13,414      Time=643,641      Cost=53,636,676      Bytes=6,948,452
    HASH JOIN           Rows=694,646,835      Time=428           Cost=35,620      Bytes=359,827,060,530
    VIEW VW_NSO_1           Rows=542           Time=42           Cost=3,491           Bytes=163,684
    MINUS
    SORT UNIQUE           Rows=542                               Bytes=9,756
    INDEX FAST FULL SCAN BETWEB_OBJECTS_IDX4 Rows=26,427      Time=40           Cost=3,302           Bytes=475,686
    SORT UNIQUE           Rows=16,228                          Bytes=178,508
    TABLE ACCESS FULL AD_USERS Rows=16,360      Time=2           Cost=113           Bytes=179,960
    HASH JOIN                Rows=128,163,623      Time=322           Cost=26,805      Bytes=27,683,342,568
    TABLE ACCESS FULL BETWEB_OBJECTS Rows=9,161           Time=154           Cost=12,805      Bytes=989,388
    TABLE ACCESS FULL BETWEB_OBJECTS Rows=25,106      Time=154           Cost=12,822          Bytes=2,711,448
    EXPLAIN PLAN when performance is worse:
    SELECT STATEMENT      Rows=13,414      Time=22,614      Cost=1,884,484      Bytes=2,897,424
    HASH UNIQUE           Rows=13,414      Time=22,614      Cost=1,884,484      Bytes=2,897,424
    HASH JOIN                Rows=128,163,623      Time=322           Cost=26,805      Bytes=27,683,342,568
    TABLE ACCESS FULL BETWEB_OBJECTS Rows=9,161           Time=154           Cost=12,805      Bytes=989,388
    TABLE ACCESS FULL BETWEB_OBJECTS Rows=25,106      Time=154           Cost=12,822      Bytes=2,711,448
    MINUS
    SORT UNIQUE NOSORT      Rows=209           Time=40           Cost=3,305           Bytes=3,762
    INDEX FAST FULL SCAN BETWEB_OBJECTS_IDX4 Rows=264           Time=40           Cost=3,304           Bytes=4,752
    SORT UNIQUE NOSORT      Rows=164           Time=2           Cost=115           Bytes=1,804
    TABLE ACCESS FULL AD_USERS Rows=164           Time=2           Cost=114           Bytes=1,804
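One routine first check after adding indexes (offered only as a hedged suggestion, not a confirmed fix for this case) is to refresh the optimizer statistics so the new indexes are costed against current data:
-- Routine check: regather table and index statistics after creating the indexes.
BEGIN
   DBMS_STATS.GATHER_TABLE_STATS(
      ownname => USER,
      tabname => 'BETWEB_OBJECTS',
      cascade => TRUE);
END;
/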

  • Tune Oracle tables for performance

    Hi,
    I am using an application that will dynamically insert network
    traffic data into about a dozen tables on a 9i database. Since I
    am fairly new to Oracle, I would like to know the best method for
    tuning/configuring these tables for best possible performance.
    What I'm looking for is how I can configure Space, Extents, Free
    Lists, etc. for these tables so I can realize the best possible
    performance.
    Thank you,

I know some things:
1) Setting the block size in init.ora
2) Putting rollback segment files and data files on separate disk storage
3) Creating a temp tablespace with the free option
4) Clustering your tables and indexing them
5) Querying your data with query hints
But for exact statistics you should go to asktom.oracle.com. This person, Tom, might have the best solution for you.
If you get a reply, please mail me at [email protected]
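To add one concrete, hedged example on the space/extents/freelists side of the question (all names and sizes below are invented): with a locally managed tablespace and automatic segment space management, which is available in 9i, most of those parameters no longer need hand tuning:
-- Hypothetical sketch: locally managed tablespace with automatic segment space
-- management (ASSM), which removes most manual FREELIST and extent sizing work.
CREATE TABLESPACE traffic_data
   DATAFILE '/u02/oradata/orcl/traffic_data01.dbf' SIZE 2048M
   EXTENT MANAGEMENT LOCAL AUTOALLOCATE
   SEGMENT SPACE MANAGEMENT AUTO;

CREATE TABLE traffic_events (
   captured_at  DATE,
   src_ip       VARCHAR2(39),
   dst_ip       VARCHAR2(39),
   bytes_sent   NUMBER
) TABLESPACE traffic_data;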

  • Help for the query

    hello,
    I have a question again. The tables for the query are "Patient" and "Station".
    Station-Table:
    s_id, station
    Patient-Table:
    p_id, name, s_id, gender
I want to know how many patients are male and female at each station. That means the output should be:
    Station Male Female
    S1 12 10
    S2 6 4

I assume the values in gender are 'M' for Male and 'F' for Female:
    select s.station, sum(decode(p.gender, 'M', 1, 0)) Male , sum(decode(p.gender, 'F', 1, 0)) Female
    from station s, patient p
    where s.s_id=p.s_id
    group by s.station;
