Database best practice: max number of columns

I have two questions that I would appreciate comments on...
We have a table titled TRANSACTION with 160 columns and a view titled TRANSACTIONS_VIEW with 233 columns. This was designed by someone a while ago, and I am wondering whether it is against best practice to have this many columns in a table. I have never seen a table with this many columns before, and I feel there must be a way to split the data into multiple tables to make it more manageable.
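If a review of the columns finds that splitting really is warranted, one common pattern is a core table plus a 1:1 "detail" table sharing the same primary key. Here is a minimal sketch only; the column names below are invented for illustration, not the actual TRANSACTION columns.
-- Core table: the columns most queries actually touch
CREATE TABLE transaction_core (
    txn_id           NUMBER        PRIMARY KEY,
    transaction_date DATE          NOT NULL,
    amount           NUMBER(12,2)
);
-- 1:1 detail table for the rarely used columns, keyed on the same id
CREATE TABLE transaction_detail (
    txn_id     NUMBER PRIMARY KEY REFERENCES transaction_core (txn_id),
    notes      VARCHAR2(4000),
    source_ref VARCHAR2(100)
);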
My second question is on partitions. The above TRANSACTION table is partitioned by manually specifying partitions with max values on the transaction date, from August 2008 through January 2010 in 1-month increments. Isn't it much better to specify automatic partitioning using the INTERVAL clause?
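For reference, a minimal sketch of what the INTERVAL clause looks like (11g and later), assuming a DATE partitioning column; the table, column, and partition names below are illustrative, not the real DDL:
-- Only the first range partition is declared; Oracle creates monthly partitions on demand
CREATE TABLE transaction_demo (
    txn_id           NUMBER       NOT NULL,
    transaction_date DATE         NOT NULL,
    amount           NUMBER(12,2)
)
PARTITION BY RANGE (transaction_date)
INTERVAL (NUMTOYMINTERVAL(1, 'MONTH'))
(
    PARTITION p_initial VALUES LESS THAN (DATE '2008-08-01')
);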

kev374 wrote:
thanks for the response, yes there are many columns that violate 3NF, and that is why the column count is so high.
Regarding the partition question, by "better" I meant that by using "interval" the partitions could be created automatically at the specified interval instead of having to be created manually.

The key is to understand the logic behind these tables and columns: why it was designed like this. If it's a business requirement, then 200-some columns are not bad; if it's a design flaw, even 20 columns could be too many. It's not necessarily always good to have a strict 3NF design; sometimes, for various reasons, you denormalize tables to get better performance.
As to the partitioning question, so far you still have to do the rolling-window (drop/add partitions as time goes by) type of partitioning scheme manually.
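To illustrate that manual rolling-window maintenance on a plain range-partitioned table like the one in the question (a sketch only; the partition names are made up):
-- Age out the oldest month and add the next one
ALTER TABLE transaction DROP PARTITION p_2008_08 UPDATE GLOBAL INDEXES;
ALTER TABLE transaction ADD PARTITION p_2010_02
    VALUES LESS THAN (DATE '2010-03-01');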

Similar Messages

  • Max number of columns in a Webi report

    In a worst-case scenario test, I requested about 1700 columns in a Webi report, but only a little more than 300 appeared in the SQL; a TOAD request for the same 1700 columns was successful. Is there an upper limit for the number of objects that can appear in a Webi report? Is there a parameter that could increase the limit?

    My eyes read the first half and my brain assumed the second half.
    I don't think there is a limitation on the number of objects that a WebI report can pull.
    The only limitation I know of is the maximum number of columns that you can display in the report panel; the maximum is 100,000.
    That limit on the number of columns displayed in a WebI report can be translated into the maximum number of columns in the query panel.
    This setting is in CMC > Applications > Web Intelligence > Quick Display mode.

  • Planning Layout error K9162 - max number of columns in planning layout restricted

    Hello collegues,
    our client would like to have 34 columns in the planning layout for cost activity inputs in transaction KP67, but error message K9162 says the max. number of columns is restricted to 30. So I tried to switch off the message with transaction OBA5, application area K9, message number 162, but this does not work: the error comes up again even though the message is switched off.
    Do you have some ideas?
    Thanks and regards,
    Christian

    Hello,
    thanks for your response. I have maintained in OBMSG:
    K9     162     WE     E    switch off selected
    then in OBA5 for dialog and batch I have maintained "-",
    but the error comes up again. So I think this sort of message cannot be switched off, as the planning layout may simply be restricted to 30 columns by design?
    thanks and regards,
    Christian

  • What's the max number of columns a table can have on 32 bit OS & 64 bit OS

    What is the max number of columns a table can have on 32 bit OS & 64 bit OS ?
    Edited by: sameer naik on 02-Jul-2010 02:11

    For the TimesTen 7.0 and 11.2.1 releases, the limit on the number of columns per table is 1000 for both 32-bit and 64-bit TimesTen. This can be found in the documentation under System Limits.
    Regards,
    Chris

  • Best practices on number of pipelines in a single project/app to do forging

    Hi experts,
    I need a couple of clarifications from you regarding Endeca guided search for an enterprise application.
    1) Say, for example, I have a web application iEndecaApp which was created by imitating the JSP reference application. All the necessary presentation APIs are present in the WEB-INF/lib folder.
    1.a) Do I need to configure anything else to run the application?
    1.b) I have created the web app in Eclipse. Will I be able to run it from any third-party Tomcat server? If not, where do I have to put the WAR file to successfully run the application?
    2) For the above web app "iEndecaApp" I have created an application named "MyEndecaApp" using the deployment template, so one generic pipeline is created. I need to integrate 5 different sources of data, to be precise:
    i)CAS data
    ii)Database table data
    iii)Txt file data
    iv)Excel file data
    v)XML data.
    2.a) So what is the best practice for integrating all the data? Do I need to create 5 different pipelines (one for each data source), or should I integrate all 5 data sources in a single pipeline?
    2.b) If I create 5 different pipelines, should they all reside in the single application "MyEndecaApp", or do I need to create 5 different applications using the deployment template?
    Hope you guys will reply back soon. Waiting for your valuable response.
    Regards,
    Hoque

    Point number 1 is very much possible, i.e. running the JSP reference application from a server of your choice. I haven't tried it myself, but will shed some light on it once I do.
    Point number 2 - you must create 5 record adapters in the same pipeline diagram and then join them with the help of joiner components. The result must be fed to the property mapper.
    So 1 application, 1 pipeline, and all 5 data sources within that one application is the ideal case.
    Logically, too, since they are all related data, there must be joining conditions between them, and you can't ask 5 different MDEX engines to serve you a combined result.
    Hope this helps you.
    <PS: This is to the best of my knowledge>
    Thanks,
    Mohit Makhija

  • Max number of columns

    Hi Friends,
    How can I find the table with the maximum number of columns in a 10g database? Please help...
    Thanks

    For a particular schema/user, the table (owned by the user) with the most columns:
    SELECT TABLE_NAME FROM USER_TAB_COLS
    WHERE COLUMN_ID = (SELECT MAX(COLUMN_ID) FROM USER_TAB_COLS)
    --There can be multiple tables having the same max number of columns
    For a particular user/schema, the table with the most columns (including tables owned by other users that the user has access/grants to):
    SELECT OWNER, TABLE_NAME FROM ALL_TAB_COLS
    WHERE COLUMN_ID = (SELECT MAX(COLUMN_ID) FROM ALL_TAB_COLS)
    --There can be multiple tables having the same max number of columns
    Table with the most columns in the whole DB:
    SELECT OWNER, TABLE_NAME FROM DBA_TAB_COLS
    WHERE COLUMN_ID = (SELECT MAX(COLUMN_ID) FROM DBA_TAB_COLS)
    --There can be multiple tables having the same max number of columns
    --This may include tables in SYS or other schemas as well
    HTH,
    ~Yogesh
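    An alternative sketch that counts columns per table instead of relying on COLUMN_ID (note that ALL_TAB_COLS also lists hidden columns; use ALL_TAB_COLUMNS if you want to exclude them):
    SELECT owner, table_name, column_count
    FROM  (SELECT owner, table_name, COUNT(*) AS column_count
           FROM   all_tab_cols
           GROUP  BY owner, table_name
           ORDER  BY column_count DESC)
    WHERE  rownum <= 5;
    --Lists the five widest tables visible to the current user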

  • Best practice for number of result objects in webi

    Hello all,
    I am just wondering if SAP has any recommendation or best practice document regarding the number of fields in the Result Objects area for webi. We are currently running XI 3.1 SP3. One of the end users is running a webi report with close to 20 objects/dimensions and 2 measures in the result objects. The report runs for 45-60 minutes and sometimes times out. The cube which stores the data has around 250K records, and the report returns pretty much all the records from the cube.
    Any recommendations/ best practices?
    On a similar issue - our production system is around 250GB; what would the memory on your server typically be? Currently we have 8GB of memory on the SAP instance server.
    Thanks in advance.

    Hi,
    You mention cubes, so I suspect BW or MS AS. Yes, OLAP data access (ODA) to OLAP datasets is a struggle for WebIntelligence, which is best at consuming relational rowsets.
    Inefficient MDX queries can easily be generated by the webi tool, primarily due to substandard (or excessive) query and document design. Mandatory filters and focused navigation (i.e. targeted BI questions) are the best recipe for success.
    Here's an interesting article about "when is a webi doc too big": https://weblogs.sdn.sap.com/pub/wlg/18706
    Here's a best practice doc about webi report design and tuning on top of BW MDX: https://service.sap.com/~sapidb/011000358700000750762010E
    Optimization of the cube itself, including aggregates and cache warming, is important, but especially the use of Suppress Unassigned Nodes in the BW hierarchy and "query stripping" in the webi document.
    Finally, the patch level of the BW (BW-BEX-OT-MDX) component is critical, i.e. anything lower than 7.01 SP09 is trouble (memory management, MDX optimization, functional correctness).
    Regards,
    H

  • Max number of columns in table for compression

    I was told that 256 columns is the maximum number in a single table for Oracle to do table compression. Can anyone confirm this?

    I can't confirm it off the top of my head, but if it is the case then it is documented at http://tahiti.oracle.com and you can look it up yourself.
    What I do want to weigh in on is the insanity of a table with more than 256 columns. I consider any table with more than 50 columns suspect, and more than 100 a strong indication that someone understands little about normalization. Anyone contemplating 255+ columns should have their fingers removed from their keyboard, by force if necessary.
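    If you want to check up front which tables would be affected, a quick dictionary query along these lines works (a sketch; adjust the threshold to whatever limit the documentation confirms):
    SELECT owner, table_name, COUNT(*) AS column_count
    FROM   dba_tab_columns
    GROUP  BY owner, table_name
    HAVING COUNT(*) > 255
    ORDER  BY column_count DESC;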

  • Database best practices (concurreny)?

    Hey all, I'm new to servlets and am worried about concurrency. As far as I can tell, as long as variables are local to your doGet/doPost, there shouldn't be any issues.
    That being said, I've noticed a lot of sample code creating a database connection in the init() method and destroying it in the destroy() method.
    Are database connections thread-safe? Are there any best practices for database access?
    What is the best way to ensure scalability when accessing a database with multiple users hitting the same servlet?
    Thanks.
    Dave

    Use a connection pool, for one.

  • One-time import from external database - best practices/guidance

    Hi everyone,
    I was wondering if there was any sort of best practice or guideline on importing content into CQ5 from an external data source.  For example, I'm working on a site that will have a one-time import of existing content.  This content lives in an external database, in a custom schema from a home-grown CMS.  This importer will be run once - it'll connect to the external database, query for existing pages, and create new nodes in CQ5 - and it won't be needed again.
    I've been reading up a bit about connecting external databases to CQ (specifically this: http://dev.day.com/content/kb/home/cq5/Development/HowToConfigureSlingDatasource.html), as well as the Feed Importer and Site Importer tools in CQ, but none of it really seems to apply to what I'm doing. I was wondering if there are any guidelines for this kind of process. It seems like something like this would be fairly common, and a requirement in any basic site setup. For example:
    Would I write this as a standalone application that gets executed from the command line? If so, how do I integrate that app with all of the OSGi services on the server? Or,
    do I write it as an OSGi module, or a servlet? If so, how would I kick off the process? Do I create a JSP that posts to a servlet?
    Any docs or writeups that anyone has would be really helpful.
    Thanks,
    Matt

    Matt,
    the vault file format is just an XML representation of what's in the repository, and the same as the package format. In fact, if you work on your projects with Eclipse and Maven instead of CRXDE Lite, you will become quite used to that format throughout your project.
    Ruben

  • Data Load function - max number of columns?

    Hello,
    I was able to successfully create a page with the Data Load Wizard. I've used this page wizard before with no issues; however, in this particular case I want to load a spreadsheet with a lot of columns (99 to be precise). When I run the page, it uploads the spreadsheet into the proper table, but only the first 45 columns; the remaining columns are NULL for all rows. Also, there are 100 rows in the spreadsheet and it doesn't load them all (it loads 39).
    Is there a limit to the number of columns it can handle?
    Also, when I re-upload the same file, the data load results show that it inserted 0 rows, updated 100 rows, and failed on 0 rows. However, there are still only 39 rows in the table.
    Thoughts?
    Steve

    Steve wrote:
    FYI, I figured out why it wasn't loading all 100 rows. Basically I needed to set two dependent columns in the load definition instead of one; it was seeing multiple rows in the spreadsheet and assuming some were the same record based on one column. So that part is solved...
    I still would like feedback on the number of columns the Data Load Wizard can handle, and whether there's a way to handle more than 45.
    The Data Load Wizard can handle a maximum of 46 columns: +{message:id=10107069}+

  • Max number of columns in Alv grid display.

    Is there any limitation on the number of fields that can be displayed using ALV grid display?
    Please tell me how I can display 199 fields using ALV.
    Thanks in advance.

    I am not sure of the maximum number of columns possible.
    If you look at the col_pos field in the field catalog table, it can hold only 2 digits, so I would assume the limit is 99 columns, but I'm not sure.
    Will get you more information soon.
    Thanks,
    Balaji

  • Database best practice theoretical question

    Hi Forums
    I wasn't sure which forum to post this under- general non-specific Database question.
    I have a client-server database app which is currently working well with a single network client. I need to implement some kind of database record-locking scheme to avoid concurrency problems. I've worked on this kind of app before, but never one this complicated. Obviously I've scoured the web looking into this, but found not much good reference material.
    I was thinking of making some kind of class with several lists - one for each table that requires locking - holding record locks while transactions are occurring, then having clients request locks on records and release them when they are finished.
    I'm wondering: if a client requests a lock and then crashes without releasing it, should there be a timeout so the server can release the record? Should the client be polling the server to show it's still working on the record? This sounds like a mess to me!
    If anyone can direct me to some good reference material on this I would appreciate it.
    br
    FM
    EDIT: OK, now I feel silly! There's a great article on [Wiki |http://en.wikipedia.org/wiki/Concurrency_control] showing some often-used methods.
    Edited by: fenderman on Mar 6, 2010 3:13 AM
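    As an illustration of one of those often-used methods, here is a minimal sketch of optimistic locking with a version column; the table and column names are hypothetical, not from FM's application:
    -- Each row carries a row_version counter; the update succeeds only if nobody changed the row since it was read
    UPDATE orders
    SET    status      = 'SHIPPED',
           row_version = row_version + 1
    WHERE  order_id    = :order_id
    AND    row_version = :version_read_earlier;
    -- 0 rows updated means another client got there first: re-read and retry,
    -- with no server-side lock lists or timeouts needed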

  • Best practice for Yes / No column

    Hello,
    When a column stores only Yes/No values, what is the right definition for that column? Is the following right, pertaining to the IS_PAID column:
    CREATE TABLE ABSENCE (
        ID         NUMBER (3) NOT NULL ,
        USER_ID    NUMBER (3) NOT NULL ,
        IS_PAID    VARCHAR2 (1 BYTE) DEFAULT 'N' NOT NULL
    );
    ALTER TABLE ABSENCE ADD CHECK
      (IS_PAID IN ('N', 'Y'));
    Regards,

    >
    Yes, but I wouldn't make too much fuss about the choice between function-based indexes and virtual columns - in either case they're generally seen as an add-on that requires a change to the code to work.
    >
    True - but the following is an example query that includes the 'fuss' that I like to avoid by using a VIRTUAL column.
    From your Oracle Scratchpad of December 16, 2010 'FBI oddities'
    http://jonathanlewis.wordpress.com/2010/12/16/fbi-oddities/#more-5249 (OP's issue is pretty much the one you discuss in that article.)
    select /*+ index(t1) t1_f1 */ v1 from t1 where decode(state,'CLOSED', to_number(null), n1 ) = 100;
    Many developers, especially the less-experienced ones, can't write those 'decodes' properly. And they certainly aren't 'self-documenting' or as straightforward as a simple "WHERE column = 'value'".
    For me simpler is better.
    Edited by: rp0428 on Apr 23, 2013 12:34 PM to add code sample
    These are the two choices when using the sample DDL below (based on your sample from the link above):
    select * from t2 where decode(state,'OPEN', state, null ) = 'OPEN'
    select * from t2 where virtual_state = 'OPEN'
    I prefer to use, and train developers to use, the second choice. It is simpler and much more intuitive.
    create table t2 (
        state       varchar2(10),
        n1      number,
        v1      varchar2(10),
        padding varchar2(100),
        virtual_state varchar2(10) GENERATED ALWAYS AS (
        case when state = 'OPEN' then state else null end ) VIRTUAL
    );
    create index t2_state on t2(decode(state,'OPEN', state, null ));
    create index t2_virtual_state on t2(virtual_state);
    insert into t2 (state, n1, v1, padding)
        select
            decode(mod(rownum,100),0,'OPEN','CLOSED'),
            rownum,
            lpad(rownum,10,0),
            rpad('x',100,'x')
        from
            all_objects
        where
            rownum <= 5000 ;
    select * from t2 where decode(state,'OPEN', state, null ) = 'OPEN';       
    select * from t2 where virtual_state = 'OPEN';

  • Best practice - comparing NUMBER(4, 0) and VARCHAR2(5)

    I have two fields in two different views that I am comparing in a complex query. This is just one small part of the query.
    I want to make sure I am doing this the best way. First, here are the two fields:
    ZAS.ORG_ID NUMBER(4,0)
    ORG.IORG_NBR VARCHAR2(5)
    And here is the part of the query that I select the first field.
    (CASE WHEN ZAS.ORG_ID IS NULL THEN NULL ELSE (TO_CHAR(ZAS.ORG_ID, '0000')) END)
    As you can see, I do the same thing in the second part of the query. To help you understand, note that I am basically pulling back a bunch of data from two sources and comparing them.
    (CASE WHEN org.iorg_nbr IS NULL THEN NULL ELSE (TO_CHAR(org.iorg_nbr, '0000')) END)
    Please let me know if I need to supply any additional information. Any help would be greatly appreciated. Thanks!

    Hello,
    This:
    (CASE WHEN ZAS.ORG_ID IS NULL THEN NULL ELSE (TO_CHAR(ZAS.ORG_ID, '0000')) END)
    could simply be this:
    (TO_CHAR(ZAS.ORG_ID))
    And this:
    (CASE WHEN org.iorg_nbr IS NULL THEN NULL ELSE (TO_CHAR(org.iorg_nbr, '0000')) END)
    can simply be this:
    (org.iorg_nbr)
    No need to do anything with that, since it's already TRIMmed (no spaces), and you're converting ZAS.ORG_ID to a CHAR string, so they both will match without the extra '0's. And there's no need to explicitly handle NULLs at all.
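    A minimal sketch of that comparison in a WHERE clause, assuming IORG_NBR values carry no leading zeros; the view names here are placeholders, only the aliases and columns come from the question. If the VARCHAR2 values are zero-padded, keep a format like TO_CHAR(zas.org_id, 'FM0000') on the NUMBER side instead.
    SELECT zas.org_id, org.iorg_nbr
    FROM   zas_view zas
    JOIN   org_view org
      ON   TO_CHAR(zas.org_id) = org.iorg_nbr;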
