Dimension size more than 50%

Hi Gurus,
May I know what issues we will face if the size of the dimension table is more than 25% of the fact table?
In what way will it affect performance?
Thanks in advance,
Ananya

Hi,
if you focus on query performance (which is, in the end, what the users will appreciate the most), then be very cautious when flagging a dimension as line item...
In my experience, the 20% threshold for flagging a dimension as line item is too low; I only flag it when its cardinality is more than 70 or even 80% of the fact table, and only if I can afford it!
Indeed, when you have many characteristics it is sometimes very expensive to dedicate one dimension to a single characteristic!
It all depends on your data distribution and on how you organize the characteristics in your dimensions. You cannot take for granted that if you are loading billing documents and posting the billing document number in your cube, "you should put the doc number in a line item dimension"... Simply because, in my case, billing documents have on average more than 50 items...
You should first try to analyze where your system is spending most of those 10 minutes...
- could an additional index on one of the dimension tables help? (see the sketch after this list)
- is your report using navigational attributes? Perhaps an additional index on the X table and/or P table could help a lot...
- do you collapse (compress) your cube?
- do you use aggregates?
- are all the indexes created and rebuilt from time to time (to avoid degeneration)?
- are your statistics up to date?
- which RDBMS are you using? Oracle can really be fine-tuned...
- do you use partitioning? Physical and/or logical?
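For instance, on Oracle the first two checks might translate into something like this (a minimal sketch only; the schema SAPBW, the dimension table /BIC/DSALESCUBE2 and the column SID_0MATERIAL are made-up placeholders, not objects from your system):

    -- A bitmap index can help when queries filter a dimension table
    -- on a low-cardinality SID column (made-up names, see above):
    CREATE BITMAP INDEX "/BIC/DSALESCUBE2~Z1"
      ON "/BIC/DSALESCUBE2" ("SID_0MATERIAL");

    -- ...and refresh optimizer statistics so the new index is
    -- actually considered:
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname => 'SAPBW',             -- assumed schema
        tabname => '/BIC/DSALESCUBE2',  -- assumed dimension table
        cascade => TRUE);               -- gather index statistics too
    END;
    /

(In BW you would normally let the system manage indexes via the ABAP dictionary and process chains; the raw SQL above only illustrates the idea.)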
There are so many spots to tune in order to have an efficient BI system... but one thing is for sure: 10 minutes is definitely too long... As a matter of fact, I start to wonder when the average time for a navigation step is more than 7 seconds in my system....
hope this helps...
Olivier.

Similar Messages

  • How can we know that size of dimension is more than fact table?

    How can we know that the size of a dimension table is more than the fact table?
    This was the question asked to me in an interview.

    Hi Reddy,
    This is a common way of estimating the size of a cube, its dimensions, or key figures.
    Each key figure occupies 10 bytes of memory.
    Each characteristic occupies 6 bytes of memory.
    An InfoCube has at most 256 fields, of which 233 are key figures, 16 are dimensions, and 6 are special characteristics.
    So the maximum width of a record in a cube
    = 233 (key figures) * 10 + 16 (dimensions) * 6 + 6 (special chars) * 6
    = 2330 + 96 + 36 = 2462 bytes.
    In general an InfoCube should not exceed 100 GB of data.
    Hope this answers your question.
    Regards,
    Varun
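    To actually answer the interview question, you compare the row counts of the dimension and fact tables. A minimal sketch in SQL (the BW naming convention is /BIC/D<cube><n> for dimension tables and /BIC/F<cube> for the fact table; SALESCUBE and dimension 1 are made-up names here); note that the standard ABAP report SAP_INFOCUBE_DESIGNS lists exactly these ratios for all cubes:

        -- Ratio of one dimension table to the fact table (made-up names):
        SELECT
          (SELECT COUNT(*) FROM "/BIC/DSALESCUBE1") AS dim_rows,
          (SELECT COUNT(*) FROM "/BIC/FSALESCUBE")  AS fact_rows
        FROM dual;

    If dim_rows is a large fraction of fact_rows (the usual rule of thumb is more than 20%), that dimension is a candidate for the line-item flag.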

  • The "Measures" dimension contains more than one hierarchy... Collation issue

    It appears that an Excel query passed through to SSAS contains "measures" with a lowercase "m" where Analysis Services expects an uppercase "M", i.e. "Measures". Is there a fix in Excel to allow
    the correct passing of the "Measures" member name to the cube?
    BTW, I have NO calculations in the cube.
    In Excel 2013, when I pivot with a PivotTable connected to a cube with case-sensitive collation (a non-default configuration)
    and filter by "Keep Only Selected Items", I get the error "The 'Measures' dimension contains more than one hierarchy, therefore the hierarchy must be explicitly specified".
    When I revert to the server-wide case-insensitive setting and perform the exact same pivoting, it works without error. The problem appears to be that Excel does not understand the server collation setting.
    When I ran SQL Server Profiler, I narrowed the MDX statement Excel runs down to this, which gives me the error:
    with
    member measures.__XlItemPath as
        Generate(
            Ascendants([Employee].[Location Code].currentmember),
            [Employee].[Location Code].currentmember.unique_name,
            "|__XLPATHSEP__|"
        )
    member measures.__XlSiblingCount as
        Generate(
            Ascendants([Employee].[Location Code].currentmember),
            AddCalculatedMembers([Employee].[Location Code].currentmember.siblings).count,
            "|__XLPATHSEP__|"
        )
    member measures.__XlChildCount as
        AddCalculatedMembers([Employee].[Location Code].currentmember.children).count
    select { measures.__XlItemPath, measures.__XlSiblingCount, measures.__XlChildCount } on columns,
        [Employee].[Location Code].&[01W]
        dimension properties MEMBER_TYPE
        on rows
    from [Metrics]
    cell properties value
    Playing around with the query, I discovered that if I capitalize the first letter of "measures" in each calculated member definition, the statement works.
    with
    member Measures.__XlItemPath as
        Generate(
            Ascendants([Employee].[Location Code].currentmember),
            [Employee].[Location Code].currentmember.unique_name,
            "|__XLPATHSEP__|"
        )
    member Measures.__XlSiblingCount as
        Generate(
            Ascendants([Employee].[Location Code].currentmember),
            AddCalculatedMembers([Employee].[Location Code].currentmember.siblings).count,
            "|__XLPATHSEP__|"
        )
    member Measures.__XlChildCount as
        AddCalculatedMembers([Employee].[Location Code].currentmember.children).count
    select { measures.__XlItemPath, measures.__XlSiblingCount, measures.__XlChildCount } on columns,
        [Employee].[Location Code].&[01W]
        dimension properties MEMBER_TYPE
        on rows
    from [Metrics]
    cell properties value
    Also, I realise that I could change the collation on just the cube itself to case-insensitive to get this to work, but I really don't want to do an impact analysis of running a mixed-collation environment.
    So, my question is: is there an Excel fix that will allow me to run a case-sensitive cube and still filter by "Keep Only Selected Items" or "Hide Selected Items"? All other filtering works; it's only those two
    filtering options that error for me.
    Here are the versions I'm working with:
    Excel 2013 (15.0.4535.1507) MSO(15.0.4551.1007) 32-bit Part of Microsoft Office Professional Plus 2013
    Microsoft Analysis Server Enterprise 2012 11.0.3000.0
    Any help would be appreciated. Thank you in advance!


  • Handling xml message of size more than 100mb in SAP PI 7.1

    Dear Experts,
    Is it possible for PI to pick up and process an XML message larger than 100 MB in PI 7.1 EHP 1?
    If yes, can you please let me know how to handle it?
    Thank you.

    Hi Saravana,
    it is not best practice to process more than 100 MB...
    but you can increase the parameters below so that you can process it:
    • UME parameters: we may need to look into the pool size and pool max wait parameters - recommended UME values (like poolmaxsize=50, poolmaxwait=60000)
    • Tuning parameters: we may need to look at/define the message size limit (like EO_MSG_SIZE_LIMIT = 0000100) under the tuning category
    • ICM parameters: we may need to consider the ICM parameters (e.g. icm/conn_timeout = 900000, icm/HTTP/max_request_size_KB = 2097152)
    Thanks and Regards,
    Naveen

  • XML to Idoc- XML size more than 200 MB

    Hi Experts,
    I have an interface whose source XML file is larger than 200 MB. I need to map it to IDocs and send them to ERP. I am planning to change the WSDL of the IDoc (changing the occurrence from 1..1 to 0..99999) to send multiple IDocs in one message.
    I want to split the XML file somehow and send it. What options do I have to achieve that? I am short of ideas.
    I know that if we had a flat file, we could use 'Recordsets per Message' in the sender file adapter.
    OR
    The client is willing to send it as a SOAP request, but I do not want to use SOAP for such a big file.
    Regards
    Inder

    Hi Inder,
    As per the SAP recommendation we can handle about 100 MB; beyond that you need to tune your server by increasing the parameters so that you can handle messages with a big payload.
    By default the parameter icm/HTTP/max_request_size_KB is 102400, which can handle a file size of 100 MB. If you increase that parameter value when tuning your system, you can process a file bigger than that.
    Please refer to the below links for reference....
    [link1|http://help.sap.com/saphelp_nw04s/helpdata/en/58/108b02102344069e4a31758bc2c810/content.htm]
    [link2|http://help.sap.com/saphelp_nwpi71/helpdata/de/95/1528d8ca4648869ec3ceafc975101c/content.htm]
    As per the above suggestions, the best practice is to send it as multiple IDocs, splitting the file into chunks.
    Cheers!!!!
    Naveen.

  • Outofmemory while creating a new object of file with size more than 100 MB

    I have created an application which generates a report by getting the data from our archived files (.zip files). By the time the application reaches a file larger than 100 MB, it runs out of memory while creating the object for that particular file. Can someone help me by telling me whether there is a way to resolve this issue?
    Thanks in advance

    If you're getting OutOfMemoryError, the simplest thing to try is to give the VM more memory at startup. For Sun's VM, I believe the default is 64 MB. You can increase this by using the -X args that control the heap size. For example: java -Xms128m -Xmx256m ... etc. ... This says start with 128 MB of heap, and allow it to grow up to 256 MB.
    One thing to consider, though, is do you need that much stuff in memory at once? A more intelligent approach might be to unpack the archive to the file system, and then read a file at a time or a line at a time or a whatever at a time is appropriate for the processing you need to do.

  • Couldn't Upload file with size more than 2K

    Hi,
    I am developing a web application in which I have to upload a file. We are using Hibernate for data storage and retrieval. The problem comes when I upload a file larger than 2K bytes; if the file size is less than 2K, it works fine. I found that this is a bug in version 9 of the thin JDBC driver, but I am using the 10g JDBC driver and the problem still exists. I am not supposed to use the OCI drivers.
    Working on:
    OS: Windows XP
    App server: WebLogic 8.1
    DB: Oracle 9i
    If anyone has a solution, please mail me at [email protected]

    I'm not sure where the issue would be. Are you saying that you are using a 9i driver to access a 10g database? If so, download the newer driver and add it to your WEB-INF/lib directory. Remove the existing driver.
    If, on the other hand, you are using a 10g driver for a 9i database, I would not expect problems. However, you could always download the older driver and try it out.
    Or am I missing something?
    - Saish

  • How to transfer my iPhone video to computer, the video size more than 1.5 GB

    How do I transfer my iPhone 4 video to the computer? The video size is more than 1.5 GB.

    Follow the instructions here or here, and then sync the iPhone with the iTunes library containing the content.
    (112570)

  • Load and Read XML file size more than 4GB

    Hi All
    My environment is Oracle 10.2.0.4 on Solaris, and I have PL/SQL processes that work with an XML file as detailed below:
    1. I read the XML file over an HTTP port into an XMLTYPE column in a table.
    2. I read the value from step 1 and extract it to insert into another table.
    On the test DB everything works, but I got the error below when I used the production XML file:
         ORA-31186: Document contains too many nodes
    The current XML is about 100 MB, but the procedure must support XML files larger than 4 GB in the future.
    Below are some parts of my code for your info.
    1. Read the XML over HTTP in chunks, append to a CLOB, and insert it into the table:
    BEGIN
      LOOP
        UTL_HTTP.read_text(http_resp, v_resptext, 32767);
        DBMS_LOB.writeappend(v_clob, LENGTH(v_resptext), v_resptext);
      END LOOP;
    EXCEPTION
      WHEN UTL_HTTP.end_of_body THEN   -- raised when the response is exhausted
        UTL_HTTP.end_response(http_resp);
    END;
    INSERT INTO XMLTAB VALUES (XMLTYPE(v_clob));
    2. Read cell values from the XML column and extract them to insert into another table:
    DECLARE
      v_TempValue VARCHAR2(50);
      CURSOR c_xml IS
        SELECT TRIM(y.cvalue) AS cvalue
        FROM XMLTAB xt,
             XMLTable('/Table/Rows/Cells/Cell' PASSING xt.XMLDoc
                      COLUMNS cvalue VARCHAR2(50) PATH '.') y;
    BEGIN
      OPEN c_xml;
      LOOP
        FETCH c_xml INTO v_TempValue;
        EXIT WHEN c_xml%NOTFOUND;
        -- <Generate insert statement into another table>
      END LOOP;
      CLOSE c_xml;
    END;
    And one more problem is performance: when the XML file is big, the first step of loading the XML content into the XMLTYPE column is slow.
    Could you please suggest a solution for reading large XML files and improving performance?
    Thank you in advance.
    Hiko      

    See Mark Drake's (Product Manager Oracle XML DB, Oracle US) response in this old post: ORA-31167: 64k size limit for XML node
    The "in a future release" reference means that this 64K-per-node boundary was lifted in 11g and onwards...
    So first of all, if only for the performance improvements, I would strongly suggest upgrading to a database version which is supported by Oracle (see My Oracle Support)... In short, Oracle 10.2.x was in extended support up to summer 2013, if I am not mistaken, and is currently not supported anymore...
    If you are able to upgrade, please use the much, much better performing XMLType SecureFile Binary XML storage option instead of the XMLType (BasicFile) CLOB storage option.
    HTH
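    For illustration, a minimal sketch of the suggested storage option (assumes 11g or later; the table and column names are just examples):

        -- SecureFile Binary XML stores the document pre-parsed, avoiding
        -- the parse-on-every-access overhead of CLOB storage:
        CREATE TABLE xmltab_bin (
          xmldoc XMLTYPE
        )
        XMLTYPE COLUMN xmldoc
          STORE AS SECUREFILE BINARY XML;

    Loading into such a table can use the same INSERT ... XMLTYPE(v_clob) code as before, but XMLTable queries are then evaluated against the binary representation instead of re-parsing a CLOB each time.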

  • SAP ISR -XI - SAP POS. File size more than 11 KB failing in Inbound

    Hi All,
    We are implementing SAP ISR - XI - POS retail using the standard
    content Store Connectivity 2.0, GM Store Connectivity 1.0, and
    other content.
    In our inbound File-RFC scenario we pick up files from an FTP server
    for sales and totals data, and if the size of a sales data file in the
    format *_XI_INPUT.DAT is greater than 11 KB, it fails at the XI
    mapping level, saying 'Exception occurred during XSLT
    mapping "GMTLog2IXRetailPOSLog" of the application'. We have tried and tested at the mapping level and found no error, as files below 11 KB are processed successfully with the same mappings; also, this is a standard mapping delivered by SAP as part of the XI content Store Connectivity 2.0.
    On the XI side we have processed a file of e.g. 40 KB by splitting the record data into files
    smaller than 11 KB, and those are processed successfully, but the 40 KB file itself fails.
    XI server: AIX.
    There may be a memory setting missing, or some Basis problem. Kindly let me know how to proceed.
    Regards,
    Surbhi Bhagat

    hi,
    this is hard to believe, that such small files cannot be processed...
    do your XI mappings work for any other flows with something more than 11 KB?
    let me know about that and then we will know some more,
    as this is really a very small size...
    maybe your XI was installed on a PocketPC ;)
    Regards,
    Michal Krawczyk

  • Broadcasting results not transferring to AL11 if file size more than 3 MB

    Hi All,
    I am broadcasting workbook results to AL11 (the application server) using a standard SAP program. If the result file is more than 3 MB, precalculation works fine but the file is not transferred to the application server. Could you please let me know if there is a setting to increase the transfer limit to AL11? In fact I am in touch with Ba
    Thanks in advance.
    Regards,
    J B


  • Not accepting max cell size more than 1000 in HFM form

    Hi
    in the task list in the user forms I am not able to insert more than 1000 characters on the dev server (11.1.2.1),
    while I can do that on the production server (11.1.1.3).
    Can anyone solve this issue? I think I need to increase the max cell size.


  • OWB10gR2 - Cube with relations to a dimension with more than one hierarchy

    Hi,
    I have defined a time dimension with two hierarchies: the standard hierarchy with the levels year -> month -> day,
    and a week hierarchy with the levels year -> week -> day.
    When I define the cube, I choose DIM_TIME on the dimension tab of the cube and select the level day for that dimension, since this is the lowest level of both hierarchies.
    When I validate the cube, I get the following error message: VLD-0398: In the Cube test_cube, the dimension DIM_TIME has 2 hierarchies but Cube does not have reference to all lowest levels of these hierarchies.
    Validation details: The Cube should refer to a leaf levels of all multiple hierarchies of the Dimension.
    Do I have to add the dimension to the cube several times, or is there maybe another way to fix this? To me it seems a bit messy to add a dimension for each hierarchy.
    And just as a comment: yes, I know OWB now has a time dimension wizard, but I have chosen not to use it at the moment.. ;)
    Regards Ragnar

    Hi, I assume this logic is for a dimension formula?
    If you have multiple hierarchies like ParentH1 and ParentH2, you should use FormulaH1 and FormulaH2 and not the FORMULA column.
    in FORMULAH1
    [Account.H1].[Account_A] / [Account.H1].[Account_B]

  • BLOB size more than doc size.

    Hi All,
    We are migrating PDF documents to an Oracle database. The table into which we are inserting the BLOBs is in the tablespace USER01.
    When we compare the free bytes available before and after the migration, it shows that about 5.23 GB of tablespace has been occupied by
    inserting 1930 BLOBs with a total size of 447 MB.
    Does the BLOB datatype occupy this much space? Am I doing something wrong in calculating the size occupied?
    Is there a more efficient way to store the BLOBs in the Oracle database?
    Any help will be appreciated.
    Thanks,
    Rana

    Hi Daniel,
    Thanks for the response; here is the table structure.
    CREATE TABLE RCONTENT
    (
      DOC_ID      INTEGER NOT NULL,
      PROD_ID     VARCHAR2(60 BYTE) NOT NULL,
      INFO        VARCHAR2(20 BYTE),
      DOC_TYPE    VARCHAR2(20 BYTE),
      DOC_CONTENT BLOB
    );
    This exists in both 9.2.0.4 and 10gR1.
    Thanks,
    Rana.
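    If an upgrade to 11g or later is an option, SecureFile LOBs can reduce the space the PDFs occupy considerably. A hypothetical sketch only (COMPRESS and DEDUPLICATE require the Advanced Compression option; the table mirrors the one above):

        CREATE TABLE RCONTENT (
          DOC_ID      INTEGER NOT NULL,
          PROD_ID     VARCHAR2(60 BYTE) NOT NULL,
          INFO        VARCHAR2(20 BYTE),
          DOC_TYPE    VARCHAR2(20 BYTE),
          DOC_CONTENT BLOB
        )
        LOB (DOC_CONTENT) STORE AS SECUREFILE (
          COMPRESS MEDIUM    -- transparent LOB compression
          DEDUPLICATE        -- store identical documents only once
        );

    On 9i/10g (BasicFile LOBs), also check the CHUNK size and PCTVERSION of the LOB segment: each out-of-row LOB occupies at least one chunk, so a large CHUNK combined with many small documents inflates the space used.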

  • File size more than 50 MB

    Hi
    I have a scenario where my file size is 50 MB. I have to split the file and then do the processing. What is the best approach?
    venkat

    Hi
    Splitting the file before it enters XI would considerably reduce the load on XI.
    Please refer to the following links:
    /people/alessandro.guarneri/blog/2006/03/05/managing-bulky-flat-messages-with-sap-xi-tunneling-once-again--updated
    /people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
    Also check this valuable thread where the size-related issue is discussed: Re: Reg: The Maximum size of a file that can be processed
