Datatype size

Hi, I want a table which shows the size in bytes of each datatype, for capacity planning.

user637544 wrote:
Hi, I want a table which shows the size in bytes of each datatype, for capacity planning.
It is not as simple as that - a varchar2(2000) column could average 500 bytes in size, since the full datatype size is not used for smaller values. Then there are issues like pctused and pctfree, block sizes, overhead per block, chained rows, nested tables, columns stored out-of-line and so on. Even the position of null columns in relation to not-null columns can determine whether or not storage is needed for a null value. Then there are indexes - how do you accurately plan the size of a bitmap index, for example, without a lot of detail about the exact nature of the data being indexed? What about clusters, partitions, compression and a thousand and one other features that impact space?
Space capacity planning in the modern RDBMS? I'm not sure how much of a real business requirement that is anymore. We deal with terabyte databases - a few GB either way in terms of space planning is totally irrelevant, never mind trying to get a somewhat accurate byte total. TB hard drives for PCs are pretty much common; space is that cheap. You can get a full-blown 20+TB storage array (suited for complex Oracle RAC usage too) pretty inexpensively - at a fraction of the cost of a 1TB network storage system of less than 5 years ago.
There are no i's to dot and t's to cross when it comes to space planning. Byte counting for space planning is, IMO, irrelevant today.
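That said, if someone still wants a rough figure, it is more honest to ask the database to estimate than to add up column widths by hand. Here is a minimal sketch using DBMS_SPACE.CREATE_TABLE_COST (10g and later), which at least factors in PCTFREE and block overhead - the tablespace name, average row size and row count are assumptions for illustration, not values from this thread:
set serveroutput on
declare
  l_used_bytes  number;
  l_alloc_bytes number;
begin
  dbms_space.create_table_cost(
    tablespace_name => 'USERS',    -- assumed tablespace
    avg_row_size    => 500,        -- assumed average row size in bytes
    row_count       => 1000000,    -- assumed number of rows
    pct_free        => 10,
    used_bytes      => l_used_bytes,
    alloc_bytes     => l_alloc_bytes);
  dbms_output.put_line('Estimated used bytes:      ' || l_used_bytes);
  dbms_output.put_line('Estimated allocated bytes: ' || l_alloc_bytes);
end;
/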

Similar Messages

  • Table definition datatype size can affect query execution time.

    Hello Oracle Guru,
    I have one question. Suppose I create a table with more than 100 columns
    and I give every column the datatype varchar2(4000).
    The actual data in every column is never more than 300 characters, so in this case,
    if I execute only a select query,
    does the Oracle cursor internally read up to 4000 characters one by one,
    or does it read character by character and stop at the last character, e.g. at 300?
    If I reduce the varchar2 size to 300 instead of 4000 in the table definition,
    will that affect the select query execution time?
    Thanks in advance.

    When you declare a VARCHAR2 column you specify the maximum size that can be stored in that column. The database stores the actual number of bytes (plus 2 bytes for the length). So if you insert a 300-character string, only 302 bytes will be used (assuming the database character set is a single-byte character set).
    SY.
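    A minimal sketch of the point above, using made-up table and column names - the declared width does not change what is actually stored:
    create table t_len_demo (c varchar2(4000));
    insert into t_len_demo values (rpad('x', 300, 'x'));
    select length(c) as chars_stored, vsize(c) as bytes_stored from t_len_demo;
    -- In a single-byte character set both come back as 300, not 4000.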

  • Impact of specifying max datatype size

    Our database stores multiple vendor versions of the same information. The same attribute could be different lengths for different vendors, e.g. NUMBER(10) for one vendor and NUMBER(13) for another. Same with VARCHAR2 fields. The vendors change the lengths often and we need to minimize the DDL changes. What is the negative impact of specifying, say, VARCHAR2(2000) and NUMBER as datatypes to insulate our database from length changes? There are a total of 1200 attributes.
    When executing a query, does Oracle allocate the maximum length for each datatype as per the table definition in memory, because it does not know how many characters were stored on disk? If so, is this maximum allocation only for the disk-to-intermediate buffer? After the intermediate buffer, does it load the buffer cache and program work area with only the actual length of the data? As they say, memory is cheap these days, so is this a non-issue?
    We need to know the impact before we make this design decision.
    Thanks in anticipation.

    Well, unless you take this to the extreme case of using CLOB's for all your fields (and you are not using CHAR or NCHAR) you should (technically) be fine to go with the higher values.
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:1542606219593
    But that gets away from one purpose of being able to specify a size on data elements: it enforces the business rule that "column X cannot exceed a length of Y". So I would think you'd have to weigh that into your design considerations somewhere; a sketch of that trade-off follows below.
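    A hedged sketch of that trade-off, with made-up table and column names: declare the column wide, but keep the business rule as a named check constraint that is cheap to change when a vendor changes the limit.
    create table vendor_data (
      attr1 varchar2(2000)
        constraint vendor_data_attr1_len check (length(attr1) <= 300)
    );
    -- When the vendor limit changes, only the constraint moves, not the column:
    alter table vendor_data drop constraint vendor_data_attr1_len;
    alter table vendor_data add constraint vendor_data_attr1_len check (length(attr1) <= 400);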

  • Confusion about number datatype size

    I created the following table, in which I gave a field a size of 100.
    As I studied in a book, the range of the Oracle NUMBER datatype is given as 84 to 127, but here it doesn't work and it yields the following error number: ORA-012727.
    I am unable to understand why it shows that error number.
    Thanks in advance.
    Niks

    user8863554 wrote:
    I created the following table, in which I gave a field a size of 100.
    As I studied in a book, the range of the Oracle NUMBER datatype is given as 84 to 127,
    Could you share the reference here?
    but here it doesn't work and it yields the following error number: ORA-012727.
    I can't find this error number in the Oracle error list. If you meant ORA-12727, then
    ORA-12727: invalid back reference in regular expression
    Cause:      A back reference was found before a sub-expression.
    Action:      Ensure a valid sub-expression is being referenced.
    You might want to post what you did exactly.
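    For what it's worth, the "84 to 127" figure usually comes from the NUMBER scale range (-84 to 127); precision is limited to 1 to 38, which is presumably what a field size of 100 violated. A minimal sketch, assuming that is what happened:
    create table t_num_demo (n number(38));      -- works: precision within 1..38
    -- create table t_bad (n number(100));       -- fails with ORA-01727:
    --                                           -- numeric precision specifier is out of range (1 to 38)
    create table t_scale_demo (n number(10,3));  -- the scale may range from -84 to 127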

  • Total size of all columns

    Hi ,
    I created a table as below
    create table my_tab(no number(10), name varchar2(20));
    Now I need to get the total declared size of this table's columns,
    i.e. no size is 10,
    name size is 20,
    total is 30.
    Please give me a query to get the total size as 30.
    Thanks,
    Oracle

    That's not the way to find the size of a table or any other object in the database - you can't get it by assuming or by adding up datatype sizes. The query below gets an object's size from the data dictionary.
    select segment_name
    ,      bytes "SIZE_BYTES"
    ,      ceil(bytes / 1024 / 1024) "SIZE_MB"
    from   dba_segments
    where  segment_name like '&obj_name'
    /
    Edited by: darkStargate on Jan 5, 2012 5:03 PM
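    If the question really is about the declared column widths (10 + 20 = 30) rather than actual storage, here is a hedged sketch against the data dictionary - it reports only the declared sizes, not the space the table uses:
    select sum(nvl(data_precision, char_length)) as declared_total
    from   user_tab_columns
    where  table_name = 'MY_TAB';
    -- data_precision covers NUMBER(10); char_length covers VARCHAR2(20); result is 30.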

  • Data type size in parameters?

    Hi to all,
    why can we not give a datatype size in procedure parameters? Can anyone help me? It's urgent.

    Hi,
    Welcome to the forum!
    Chandra kumar wrote:
    Hi to all,
    why can we not give a datatype size in procedure parameters?
    Datatypes and sizes are given when a variable is declared. Arguments to procedures are not declared in the procedure; they are declared in a calling block, and then an existing variable, with a datatype and size, is passed to the procedure.
    Someone started another thread on the same question just a little earlier: {message:id=10989165}. See that thread for more.
    Can anyone help me? It's urgent.
    If you need help urgently, this is not the place to ask for it. People answer questions when they want to, not necessarily when you command them to. Using terms like "urgent" or "ASAP" is not only rude, it's self-defeating. That is, nobody will refuse to give you an answer because you did not say it was "urgent", but some people may not answer because you did say it.
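    A minimal sketch of the rule described above - parameter declarations take only the type name, never a length (the procedure name is made up):
    create or replace procedure greet (p_name in varchar2) is   -- valid: no size on the parameter
    begin
      dbms_output.put_line('Hello ' || p_name);
    end;
    /
    -- create or replace procedure greet (p_name in varchar2(30)) ...
    -- fails to compile (PLS-00103 at the "(" after varchar2); constrain the length
    -- in the calling variable, a SUBTYPE, or a check inside the procedure instead.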

  • Import limit & char size

    Two quick questions:
    1) Is there any restriction on number of imports per java class?
    2) Is the 'char' datatype size always fixed, or is it platform dependent?
    Thanks in advance.

    Praveen_Forum wrote:
    I am really not sure about my answers but my thinking is
    1) there is no restriction on imports but as the number of imports increases it might have performance issues.
    Nope. Imports only exist at compile time; at runtime everything is referred to by its fully qualified name. A lot of imports might well point to a class that's doing too much and needs decomposing into smaller classes, though.
    2) guess char size is fixed
    Yep.

  • Need to create a block based on procedure

    We can create a block based on a procedure... but what's the need to do so? Please, can anybody explain?

    There are several recent posts on the topic with a lot of good help & documentation.
    The Metalink note: 66887.1
    Re: Form on a procedure
    If you can't access Metalink, search here on that note. Someone posted the full text a couple days ago.
    This is also very good reference:
    Block based on procedure
    Essentially, you have a procedure, returning a table-of-records type variable, called by the block "Query Data Source..." properties.
    Name = procedure name
    Arguments = procedure parameters (IN parameter(s) Value is :block.item)
    Source Columns = the record type columns for your table-of-records
    Datatypes & sizes must match all around.
    Start very simple & get it to work first, then expand.
    You must follow the documentation very carefully. If it doesn't work at first, go through everything to verify. Also, read the Tip & Issues at the end of the Metalink note.
    Have fun. It works awesome once you get the hang of it.
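    A hedged sketch of the kind of package this describes, assuming a SCOTT.EMP-style table and made-up names - the block's Query Data Source Name would be emp_query_pkg.emp_query, and the Arguments list would pass a :block.item value into p_deptno:
    create or replace package emp_query_pkg as
      type emp_rec is record (
        empno  number(4),
        ename  varchar2(10),
        deptno number(2));
      type emp_tab is table of emp_rec index by binary_integer;
      -- First parameter is the IN OUT table of records the block reads from;
      -- further parameters map to :block.item values in the Arguments list.
      procedure emp_query (emp_data in out emp_tab, p_deptno in number);
    end emp_query_pkg;
    /
    create or replace package body emp_query_pkg as
      procedure emp_query (emp_data in out emp_tab, p_deptno in number) is
        i pls_integer := 0;
      begin
        for r in (select empno, ename, deptno from emp where deptno = p_deptno) loop
          i := i + 1;
          emp_data(i).empno  := r.empno;
          emp_data(i).ename  := r.ename;
          emp_data(i).deptno := r.deptno;
        end loop;
      end emp_query;
    end emp_query_pkg;
    /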

  • Controlling structure of view.

    Hi,
    Can we define the column datatype sizes during creation of a view?
    Let us consider the following view creation script.
    create or replace view cmonlftWCr as
    select custcd,custnm,excldlryn,exdlrfrm,to_char(billdate,'YYYY-MM') Month,
    sum(billqty) MthQty,mqtycrt(sum(billqty)) MthQtyCrit
    from billdtl1
    group by custcd,custnm,excldlryn,exdlrfrm,to_char(billdate,'YYYY-MM')
    order by excldlryn,custcd,to_char(billdate,'YYYY-MM')
    This creates the required view, but with the following structure.
    SQL> desc cmonlftWCr;
    Name        Null?     Type
    CUSTCD      NOT NULL  NUMBER(7)
    CUSTNM      NOT NULL  VARCHAR2(50)
    EXCLDLRYN   NOT NULL  CHAR(1)
    EXDLRFRM              DATE
    MONTH                 VARCHAR2(7)
    MTHQTY                NUMBER
    MTHQTYCRIT            VARCHAR2(4000)
    My problem is with the columns mthqty and mthqtycrit.
    I want mthqty to have the datatype number(9,3) and mthqtycrit to have the datatype char(3). The function mqtycrt returns only a character string 3 characters long.
    Can anybody help me out in this regard...?
    Soumen.

    Hello
    You just need to use the CAST operator:
    SQL> create or replace view dt_test_view as
      2  select
      3     '1' as col1,
      4     CAST ('1' as  NUMBER(1)) as col2,
      5     CAST ('1' as  VARCHAR2(10)) as col3,
      6     '01-JAN-05' as col4,
      7     CAST ('01-JAN-05' AS DATE) as col5
      8  from
      9     dual;
    View created.
    SQL> select * from dt_test_view;
    C       COL2 COL3       COL4      COL5
    1          1 1          01-JAN-05 01-JAN-05
    SQL> desc dt_test_view;
    Name                                      Null?    Type
    COL1                                               CHAR(1 CHAR)
    COL2                                               NUMBER
    COL3                                               VARCHAR2(10)
    COL4                                               CHAR(9 CHAR)
    COL5                                               DATE
    HTH
    David
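    Applied to the original view, a hedged sketch (the column, table and function names are taken from the post; number(9,3) and char(3) are the types the poster asked for):
    create or replace view cmonlftWCr as
    select custcd, custnm, excldlryn, exdlrfrm,
           to_char(billdate,'YYYY-MM')                Month,
           cast(sum(billqty) as number(9,3))          MthQty,
           cast(mqtycrt(sum(billqty)) as char(3))     MthQtyCrit
    from   billdtl1
    group by custcd, custnm, excldlryn, exdlrfrm, to_char(billdate,'YYYY-MM')
    order by excldlryn, custcd, to_char(billdate,'YYYY-MM');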

  • Problem with keeping attributes updated

    We have a problem with the server model in Designer.
    We have made an E/R diagram and generated it into a server model diagram. Afterwards we wanted to delete some attributes, which we deleted back in the E/R diagram. Then we made some new attributes instead and generated a new server model diagram, but the attributes which we deleted in the E/R diagram are still present in the new server model diagram. The new attributes are there too. It is as if the server model diagram only picks up the new attributes but doesn't delete the old, wrong attributes.
    What can we do to solve the problem?

    Hi,
    DDT will modify the attribute values (datatype, size etc.) on the target tables.
    DDT doesn't drop the column if its corresponding attribute is deleted in the ERD.
    However, we can achieve this by deleting the table definition from DE and then executing the DDT on the new/modified entity.
    Regards,
    Wilson.

  • Will a sequence return same value for two different sessions?

    Is there a possibility that a sequence will return same value to two different sessions when it is referred exactly at the same instance of time?

    @Justin... Thanks for your insight; indeed, we too feel this shouldn't ever happen and never heard of it either, but there it is. (No, we haven't logged a TAR yet -- whatever that is -- partly because it didn't occur to us and partly because we only recently came across the issue and sensibly want to do some testing before we cry foul.)
    However, the code is pretty straight-forward, much like this (inside a FOR EACH ROW trigger body):
    SELECT <seqname>.NEXTVAL INTO <keyvar> FROM DUAL;
    INSERT INTO <tblname> (<keyfield>, <... some other fields>)
    VALUES(<keyvar>, <... some other values> );
    (where <tblname> is NOT the table on which the trigger is fired). This is the only place where the sequence is ever accessed. The sequence state is way below its limits (either MAXVALUE or <keyfield>/<keyvar> datatype size).
    In this setup, end users sometimes got an out-of-the-blue SQL error to the effect that uniqueness constraint has been violated -- as I said, we used to have a unique index on <keyfield> -- which leads us to assume that the sequence generated a duplicate key (only way for the constraint to be violated, AFAIK). We released the constraint and indeed, using a simple SELECT <keyfield>, COUNT(*) FROM <tblname> GROUP BY <keyfield> HAVING COUNT(*)>1 got us some results.
    Unfortunately, the <tblname> table gets regularly purged by a consumer process so it's hard to trace; now we created a logger trigger, on <tblname> this time, which tracks cases of duplicate <keyfield> inserts... We'll see how it goes.
    @Laurent... (winks at the CYCLE thing) Our sequence is (needless to say) declared as NOCYCLE and the datatype is large enough to hold MAXVALUE.
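    A hedged diagnostic sketch along the same lines, keeping the post's <seqname>/<tblname>/<keyfield> placeholders (substitute the real names): compare the sequence high-water mark with the highest key actually stored. LAST_NUMBER already covers the unwritten cache, so if MAX(<keyfield>) is ever ahead of it, some inserts are not getting their keys from the sequence, which would explain the duplicates.
    select last_number from user_sequences where sequence_name = '<SEQNAME>';
    select max(<keyfield>) from <tblname>;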

  • API to validate a value against a value set

    I have a 'Table' Value Set based on the MTL_CATEGORIES_V view, and need to validate some loaded data against it to check that the loaded values are valid. Rather than use a cursor to perform a lookup of the value in MTL_CATEGORIES_V, is there an API that can be used instead?
    I thought I'd found something when I discovered "FND_FLEX_VAL_UTIL.is_value_valid", but this seems to just check that a value meets the criteria of the value set based on the datatype, size, hi-lo range etc.; it doesn't check whether the value is actually permitted within the set.
    Andy

    Hi,
    you can use the function module FM_DOMAINVALUE_CHECK. Pass parameter values for I_DOMNAME and I_DOMVALUE.
    If the value is invalid, exception VALUE_NOT_ALLOWED will occur.
    Regards,
    Klaus

  • Oracle distributed doc capture - database error

    Hi all,
    I am working on a project (capture, archive) and I implemented ODDC (installation and integration).
    I did all the configuration, including the scanning profiles (the last step).
    Then, when I test the system, I find that no data is saved upon scanning, and after commit there is only data in the commit table I built and mapped with the index fields, plus the ECAudit table, but no data in tables like ECBatches and others.
    I see more than one column of type BLOB (I mean columns in capture system core tables like ECBatches) in many tables (these tables do not contain any data!).
    I use the evaluation period license,
    Oracle 10g Release 2,
    Windows Server 2003 Standard Edition R2 SP2.
    Any idea please?

    It's started! But is there any BLOB field in the capture system's usual database?
    I mean the system database that Capture builds - is there any BLOB field there? Any idea, please?
    What about the check integrity utility - is it stable? What can it do for me?
    I'm tired of this problem... can anyone help me, please?
    Is there any data that must be added to the database before the commit operation?
    I don't have any data beforehand, and not all tables collect data - just ECAUDIT and my commit table where the index fields are saved, which I named dbcommit. Details below:
    database info
    user: capture
    password: capture
    database: capture
    commit table:
    DBCOMMIT
    Column name   Datatype   Size   Description
    PERMIT_ID     integer
    DOC_TYPE      varchar2   500    from 1 to 14, from a pick list
    AREA_ID       integer           default = 09
    SCAN_DATE     date
    USER_ID       integer
    DOC_PATH      varchar2   500    document path on the capture server
    INDEX_DATE    date
    When I run a select statement like:
    select * from ECBATCHES;
    I get the error SP2-0678, which means:
    The "SP2" error messages are messages issued by SQL*Plus. The SP2-0678 error message text is "Column or attribute type can not be displayed by SQL*Plus." What you probably tried to do is to dump a binary data type, i.e. BLOB, to the screen in SQL*Plus. The problem is that SQL*Plus can only handle text data. It cannot handle binary data. Therefore, you are getting an error message telling you that this data type can not be displayed in your SQL*Plus session. Try your SELECT statement again but eliminate the BLOB column(s) from your query.
    The question is why there are some columns of type BLOB in tables like ECBATCHES, while the other columns are not BLOB but varchar2 and integer... and these columns are empty! There is an error somewhere, but I tried reinstalling on a new machine and database and nothing changed.
    Any idea please :(
    Edited by: yos on Nov 21, 2008 3:56 AM
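    A minimal sketch for the SP2-0678 part: list the column types first, then repeat the SELECT naming only the non-BLOB columns (ECBATCHES is the table named above; the exact column list is unknown here):
    select column_name, data_type
    from   all_tab_columns
    where  table_name = 'ECBATCHES'
    order  by column_id;
    -- then: select <only the non-BLOB columns> from ECBATCHES;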

  • How can we store an image in a file when the image is in the form of bytes

    Hi,
    How can we write an image to a file and save it to the hard disk? The image is in the form of bytes. If anybody knows, please send the code for it.
    Thanks for your reply.

    Can you send the code to me?
    My image is in the form of bytes and its size is 78K;
    now I want to store the image in the database using the BLOB type, but my BLOB datatype size is 25K;
    how should I store it?
    Help me if you know anything about this.
    Thanks in advance.

  • Regarding technical attributes in the database tables

    Hi gurus,
    Could you please let me know what will happen if I select "buffering not allowed" but select single-record buffering as the buffering type - is it going to work? Thanks in advance. Please reply in your own words; don't copy and paste from the SAP library.

    These are the Technical Settings of a database table, not Technical Attributes.
    Technical Attributes means DATATYPE, SIZE, etc.
    Buffering status
    Definition
        The buffering status specifies whether or not a table may be buffered.
        This depends on how the table is used, for example on the expected
        volume of data in the table or on the type of access to a table. (mainly
        read or mainly write access to the table. In the latter case, for
        example, one would not select buffering).
        You should therefore select
        - Buffering not allowed if a table may not be buffered.
        - Buffering allowed but not activated if buffering is
          principally allowed for a table, but at the moment no buffering
          should be active. The
          buffering type specified in this case is only
          a suggestion.
        - Buffering allowed if the table should be buffered. In this
          case a buffering type
          must be specified.
    Regards,
    Pavan
