Table size is too big: performance issue

Hi,
Let us assume we have a table with about 160 columns. About 120 of these columns are of the Varchar data type, with declared sizes of roughly 100 to 3000 each.
The table also has about 2 million rows in it. I am not sure whether this is considered a big table.
Are tables like this a good representation of the data? I am in doubt because the table is very wide and queries against it might take a long time. We have about 10 indexes on this table.
What kind of precautions have to be taken when tables like this exist in the database and are required by the application?
The database version is Oracle 10.2.0.4.
I know the question is a bit vague, but I am wondering what needs to be done, and where I should start digging in case I run into performance issues while selecting or updating the data.
I also want to know whether there is an ideal size for tables, and whether anything larger than that needs to be treated differently.
Thanking you
Rocky

Any table with more than about 50 columns should be viewed with suspicion. That doesn't mean there aren't appropriate uses for tables with 120 or 220 columns, but it does mean they are reasonably rare.
What does bother me about your first paragraph is the number of text columns with sizes up to 3K. This is highly indicative of a bad design. One thing is for sure: no one is writing a report against this table and printing it on anything smaller than a plotter.
2M rows is small by almost any definition, so I wouldn't worry about it. Partitioning is an option, but only if partition pruning can be demonstrated to work with your queries; we haven't seen any of them, nor would we have any idea what you might use as a partition key or which type of partitioning, so any intelligent discussion of this option would require far more information from you.
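For illustration only, since we know neither your columns nor your access patterns, a range-partitioning sketch might look like this (BIG_TABLE and CREATED_DT are hypothetical names):
CREATE TABLE big_table (
  id         NUMBER,
  created_dt DATE
  -- ... the remaining columns ...
)
PARTITION BY RANGE (created_dt) (
  PARTITION p2009 VALUES LESS THAN (DATE '2010-01-01'),
  PARTITION p2010 VALUES LESS THAN (DATE '2011-01-01'),
  PARTITION pmax  VALUES LESS THAN (MAXVALUE)
);
Partition pruning would then only help queries that actually filter on CREATED_DT.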
There are no precautions that relate to anything you have written: you've told us nothing about security, usage, transaction volumes, or anything else important to such a consideration.
What needs to be done, going forward, is for someone who understands normalization to look at this table, examine the business rules, examine the purpose to which it will be put and, most importantly, the reports and outputs that will be generated against it, and either justify or change the design. Then, with an assessment of the table completed, you need to run your SQL and examine the plans generated using DBMS_XPLAN, comparing the timings against your Service Level Agreement (SLA) with the system's customers.
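For example (a minimal sketch; substitute one of your real statements — BIG_TABLE and CREATED_DT are the hypothetical names from the sketch above):
EXPLAIN PLAN FOR
SELECT * FROM big_table WHERE created_dt >= DATE '2010-06-01';

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
Compare the plan and the actual elapsed time against your SLA before and after any design change.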

Similar Messages

  • Scanning: The image size is too big. Please reduce the image height, width, resolution, scaling or output type.

    Good Day.
    I have recently discovered an issue with scanning. I have tried scanning straight from the printer, through Image Capture, and through Preview, all yielding the same results, since they are essentially kicking off the same scanning applet. I can scan to a JPEG-formatted file as long as the applet is in "Hide Details" mode. If I click the "Show Details" button, which I need to do to scan to PDF, I immediately receive an error stating: The image size is too big. Please reduce the image height, width, resolution, scaling or output type.
    I'm currently using an HP C5180 printer. The most recently installed software was Adobe Digital Editions (I installed this to read an online e-book). Not sure if there is any correlation. Thanks for any help.
    Mac details are:
      Model Name:          MacBook Pro
      Model Identifier:          MacBookPro8,1
      Processor Name:          Intel Core i5
      Processor Speed:          2.3 GHz
      Number of Processors:          1
      Total Number of Cores:          2
      L2 Cache (per Core):          256 KB
      L3 Cache:          3 MB
      Memory:          8 GB
      Boot ROM Version:          MBP81.0047.B27
    Printer access log:
      Source:          /var/log/cups/access_log
      Size:          174 bytes
      Last Modified:          11/21/12 10:09 PM
      Recent Contents:          localhost - - [21/Nov/2012:22:09:16 -0500] "POST / HTTP/1.1" 200 61628 CUPS-Get-PPDs -
    localhost - - [21/Nov/2012:22:09:20 -0500] "POST / HTTP/1.1" 200 61628 CUPS-Get-PPDs -
    Image Capture Support:
      Path:          /Library/Image Capture/Support/Hewlett-Packard/Devices/HPAiOScan.bundle/Contents/Info.plist
      Version:          2.3.0
      Path:          /Library/Image Capture/Support/Hewlett-Packard/Devices/HPAiOScan.bundle/Contents/Resources/DeviceInfo.plist
      Version:          2.3.0
    Photosmart C5100 series:
      Status:          Idle
      Print Server:          Local
      Driver Version:          4.0.0
      Default:          Yes
      Shared:          No
      URI:          dnssd://Photosmart%20C5100%20series%20%5B960B9E%5D._pdl-datastream._tcp.local./?bidi
      PPD:          HP Photosmart C5100 series
      PPD File Version:          4.0.0
      PostScript Version:          (3011.104) 0
      CUPS Version:          1.6svn (cups-327)
      Scanning support:          Yes
      Scanning app (bundleID path):          -
      Scanning app version:          -
      Scanner UUID:          CC8DD435-CC8D-D435-CC8D-D435CC8DD435
      Printer Commands:          ReportLevels
      CUPS filters:
    Inkjet:
      Path:          /Library/Printers/hp/cups/Inkjet.driver/Contents/MacOS/Inkjet
      Permissions:          rwxr-xr-x
      Version:          4.0.0
    commandtohp:
      Path:          /Library/Printers/hp/cups/filters/commandtohp.filter/Contents/MacOS/commandtohp
      Permissions:          rwxr-xr-x
      Version:          2.1.1
      Fax support:          No
      Printer utility:          /Library/Printers/hp/Utilities/HP Utility.app
      Printer utility version:          5.9.1
      PDEs:
    PDE.plugin:
      Sandbox compliant:          Yes

    Hello Sig
    The scanner works with a Windows computer, which proves the device is functional at a cursory level. The drivers are now distributed by Apple, and this is an Apple computer. There is no scanning software provided by HP; HP's answer is that Mountain Lion takes care of all of this. I suspect that some setting or driver was somehow tweaked, since the scanner had been working with Mountain Lion until a few days ago. The device in question is an Apple product, so I'm pretty confident I'm in the right place.
    Regards

  • Why are some photos in Camera Roll but not in Photo Stream? Unable to back up as Camera Roll size is too big for the free 5 GB.


    I assume you have Photo Stream turned on, correct?
    In order for photos to be moved into Photo Stream, you must exit the Camera app and be on a Wi-Fi connection. Both of these criteria must be met before the sync occurs.

  • HT204266 How to update the app through iTunes if its size is too big?


    If you're actually wanting to update your iPhone to iOS 6 and you don't have enough space available, you still need to take the action Chrisfromgastonia recommended. You will find that moving Movies and TV Shows to your iTunes library will free up the largest amount of space. If iOS 6 is your objective, you may be able to re-download the movies/videos after the update.

  • SQL1139N The total size of the table space is too big.

    Hi
    Our R/3 QA system runs on Solaris 10, using DB6 8.2.2.
    We have now run into a problem where we cannot extend a table. It is 66 GB in size. Because the page size is 8 KB, the limit is apparently 64 GB (we got it to 66 GB).
    It seems we will have to increase the page size to get past the problem, say to 16 KB or 32 KB. The question is how?
    So far we have a framework in mind:
    Create a new table
    Copy the old table into the new table
    Drop the old table
    Recreate the old table with a bigger page size
    Copy the new table back into the old table (now this new/old is getting confusing...)
    Bob's your aunty, or something to that effect...
    Is the thinking correct? If it is, I will need much more detail, as I am not too familiar with DB2.
    Thanks in advance!

    Hi Derik,
    The DB6-specific limits for max tablespace/table size are
    64/128/256/512 GB for 4/8/16/32 KB page sizes.
    For problems like "tablespace/table reaches its pagesize-dependent max size",
    we (the DB6 porting team at SAP) have developed a special tool/ABAP report called DB6CONV.
    DB6CONV takes care of everything concerning a table conversion.
    It is described in depth in OSS note 362325. The report itself is delivered by means of a transport, which is attached to note 362325 and/or can be downloaded via SAPMATS.
    In your case you have to:
    a) get the latest DB6CONV transport and import it into your system
    b) create a new tablespace with a pagesize > 8 KB (see the sketch after this list)
    c) assign a new data class for this tablespace in tables TADB6 (for data tablespaces) and/or IADB6 (for index tablespaces)
    d) run DB6CONV from transaction SE38, as described in note 362325, to convert (transfer) the table that is at/near the size limit, by specifying either target tablespaces or a target data class
    e) DB6CONV will first duplicate the table into the new tablespace, then copy the data into the newly created table. This can be done either 'offline' (the fastest way, but the table is not accessible during the conversion) or 'online' (slower, but the table is accessible the whole time, apart from a short period while the switch from the original to the target table is performed).
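    For step b), a rough sketch (the names BP16K and TS16K and the container path are made up for illustration; DB2 requires a buffer pool with a matching page size before a tablespace of that page size can be created):
    CREATE BUFFERPOOL BP16K IMMEDIATE SIZE 5000 PAGESIZE 16K;
    CREATE TABLESPACE TS16K PAGESIZE 16K
      MANAGED BY DATABASE
      USING (FILE '/db2/ts16k_container_01.dat' 25600)
      BUFFERPOOL BP16K;
    Size the buffer pool and the container to your system; in an SAP installation, follow the SAP naming conventions for tablespaces.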
    Please make yourself familiar with the tool and the documentation,
    and feel free to ask if you need more information or have additional questions/remarks.
    Regards, Frank

  • Table size effect on query performance

    I know this sounds like a very generic question, but how much does table size affect the performance of a query?
    This is a rather unusual case, actually. I am running a query on two tables, say Table1 and Table2. Table1 is roughly 1 million records, whereas for Table2 I tried using different numbers of records.
    The query returns 150,000 records. If I keep Table2 at 500 records, the query takes 2 minutes to execute. But if I increase Table2 to 8,000 records, it takes close to 20 minutes!
    I have checked the "Explain plan" output and note that the indexes on the columns used for joining the two tables are being used.
    Is it normal for table size to have such a big effect on query time, even when the number of records is under 10,000?
    Really appreciate your inputs. Thanks in advance.

    Did you update your statistics when you changed the size of Table2? The CBO will probably choose different plans as the size of Table2 changes. If it thinks there are many more or fewer rows than there actually are, you're likely to have performance issues.
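    For example, a minimal sketch (MYSCHEMA is a placeholder for your actual schema):
    EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => 'MYSCHEMA', tabname => 'TABLE2', cascade => TRUE);
    The cascade => TRUE option also refreshes the statistics on Table2's indexes, which matters here since your joins are index-driven.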
    Justin

  • SQL server error log size is too big to handle

    I am working with a large database on Windows SQL Server 2008 R2. It has to run continuously, 24x7, so it is not possible to restart the server from time to time. It is a kind of monitoring system for big machines. Because of this, the SQL Server error logs grow too big, sometimes up to 60-70 GB, on a limited-size hard drive. I can't keep deleting them manually. Can someone please suggest a way to stop the creation of such error logs, or to recycle them after some time? Most of the entries are of this kind:
    Setting database option RECOVERY to simple for database db_name
    P.S. I have read about limiting the number of error logs to 6, etc., but that didn't help. It would be best if you could suggest some method to disable these logs.

    Hi Mohit11,
    According to your description, your SQL Server error logs are growing too big to handle on a limited-size hard drive, and you want to know how to stop generating such error logs, or recycle them after some time, automatically and without restarting SQL Server, right?
    As others mentioned above, we may not be able to disable SQL Server error log generation. However, we can recycle the error logs automatically by running sp_cycle_errorlog on a fixed schedule (e.g. every two weeks) using a SQL Agent job, so that the error logs are recycled without restarting SQL Server.
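    For example, the job step would simply run:
    EXEC master.dbo.sp_cycle_errorlog; --closes the current error log and opens a new one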
    It is also very important to keep the error log files readable. So we can increase the number of error logs a little and run sp_cycle_errorlog more frequently (e.g. daily); each file will then be smaller and more readable, and the log files are still recycled automatically.
    In addition, to avoid the total size of the log files unexpectedly growing too large (it can happen), we can run the following query in a SQL Agent job to automatically delete all the old log files when their total size is larger than some value we want to keep (e.g. 30 GB):
    --create a temporary table to gather information about the error log files
    CREATE TABLE #ErrorLog
    (
        Archive INT,      --archive number of the log
        Dt DATETIME,      --log date
        FileSize BIGINT   --log file size in bytes (BIGINT so the SUM below cannot overflow)
    );
    GO
    INSERT INTO #ErrorLog
    EXEC xp_enumerrorlogs;
    GO
    --delete all the old log files if the total size of the log files is larger than 30GB
    DECLARE @i int = 1;
    DECLARE @Log_number int;
    DECLARE @Log_Max_Size int = 30*1024; --max total size (MB) of the error log files we want to keep; change the value according to your requirement
    DECLARE @SQLSTR VARCHAR(1000);
    SET @Log_number = (SELECT COUNT(*) FROM #ErrorLog);
    IF (SELECT SUM(FileSize)/1024/1024 FROM #ErrorLog) >= @Log_Max_Size --SUM of the sizes in MB, not COUNT
    BEGIN
        WHILE @i <= @Log_number
        BEGIN
            --quote the path because it contains spaces
            SET @SQLSTR = 'DEL "C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Log\ERRORLOG.' + CONVERT(VARCHAR,@i) + '"';
            EXEC xp_cmdshell @SQLSTR;
            SET @i = @i + 1;
        END
    END
    DROP TABLE #ErrorLog;
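    Note that this sketch shells out through xp_cmdshell, which is disabled by default since SQL Server 2005; if your security policy allows it, enable it first:
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'xp_cmdshell', 1;
    RECONFIGURE;
    Also adjust the ERRORLOG path above to your own instance's Log directory.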
    For more information about How to manage the SQL Server error log, please refer to the following article:
    http://support.microsoft.com/kb/2199578
    If you have any question, please feel free to let me know.
    Regards,
    Jerry Li

  • CFX_Image seems to resize ok but the file size is too big??

    My host uses CFX_IMAGE v1.6.6.11 (
    http://80.79.138.227/cfximage/index.cfm
    ), which does a perfectly good job of resizing images except for the fact that the resultant file size is too large. I found that, compared to resizing images locally, resizing with this tag produced files of double the size.
    I have tried tweaking the "quality" parameter, but without success: I could get the file size down, but then the quality was poor.
    So I am forced to resize images locally using Macromedia Fireworks or MS Paint before uploading them.
    Ideally I need to be able to resize images server side, but without inordinate file sizes.
    Does anyone else use a cfx_image tag that's more efficient? Or does anyone know how to get this tag to perform better?
    Any help would be very gratefully received.
    Thank you

    Hi,
    You may want to take a look at this script:
    Link
    image resize
    Don't forget to read the comments there; they contain some useful information.
    Kind regards,
    Nebu

  • Database size is too big

    Hi, my client's database size is 82 GB after one year of implementation. I have checked the table sizes; the biggest table is AITW, at almost 50 GB.
    I have changed the history/log configuration to 5, so the log is trimmed to 5 entries when I update an item, but that means I would have to update all items one by one. Is there any way to remove all item history/log entries at once?
    Thanks

    Hi,
    You may check this: maintenance of my database
    Thanks,
    Gordon

  • Index size is too big.

    Hi ,
    I am using a combined (3-field) unique index (Normal) on a table which has around 17M records.
    The size of the index is 6074 MB, due to which I/O and cluster waits on the index consume 90% of the time spent on many queries. Please help me find a solution.
    Thanks..

    Besides the suggestions already given, you may want to consider using index compression. If you want to give compression a try, you should first analyze the index to see if compression gives the desired result. This can be accomplished by issuing the following command:
    ANALYZE INDEX index_name VALIDATE STRUCTURE;
    Then you should run the following query:
    SELECT * FROM index_stats;
    and look at the columns OPT_CMPR_COUNT and OPT_CMPR_PCTSAVE. OPT_CMPR_COUNT gives the number of columns Oracle recommends you compress, which is from 0 to n-1 on unique indexes, where n is the number of columns in the index; OPT_CMPR_PCTSAVE gives the expected percentage saving from the compression.
    If this query returns 0 for either of these columns, you won't get any savings from compression, so don't take the trouble. But if it says you could achieve good compression, you can compress your index with the following statement:
    ALTER INDEX index_name REBUILD COMPRESS n;
    where n is the number of columns suggested by OPT_CMPR_COUNT.
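    For instance, an end-to-end sketch with a hypothetical index IDX_ORDERS_UK:
    ANALYZE INDEX idx_orders_uk VALIDATE STRUCTURE;
    SELECT name, opt_cmpr_count, opt_cmpr_pctsave FROM index_stats;
    -- suppose this reports OPT_CMPR_COUNT = 2 and OPT_CMPR_PCTSAVE = 35
    ALTER INDEX idx_orders_uk REBUILD COMPRESS 2;
    Keep in mind that INDEX_STATS holds only the result of the most recent VALIDATE STRUCTURE in your session, and that ANALYZE ... VALIDATE STRUCTURE locks the table against DML, so run it during a quiet period.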
    Edited by: Paulo Petruzalek on 23/07/2012 05:47

  • Image size is too big after "Smart Build Grid"

    The Smart Build Grid feature is awesome. Unfortunately, I have one issue with it: it sometimes zooms the images too much when using the grid feature.
    Specifically, I have a 4 x 3 matrix of rectangular images. With each zoom from the grid, Keynote zooms too much, so that the center of the image is nicely displayed, but because it zooms too much, important info on the sides of the image is cropped out of the magnified image.
    How can I set Keynote to zoom so that the zoomed image is only as big as its longest axis relative to the slide (thereby displaying all information)?
    Keynote 4.01, OS 10.4.11, PPC G4 867 MHz.
    Thank you very much!


  • Export photo but size is too big and magnified for uploading to web

    Hi
    I have tried to export a photo from iPhoto using the Export function, selecting Small, then exporting to the Desktop. When I tried to upload it to the web, the picture was very large and took up the whole screen. Even if I demagnify the picture before saving it, it is still very big. I have tried to crop, but this does not help.
    Please can you advise a quick and simple way to make pictures visually smaller. (When I used Windows, I used Paint, which did the job.)
    Thank you

    Thanks for the link.
    I'm not clear on what the file size ends up being, and also on how to preview how it looks before exporting. I went to the File Export tab; should I wish to select "Size" then "Small", what size will that be on the screen? Or should I go to "Web Page", then to "Image"? I am not sure what to reduce the 640 down to, but say if I halve it, would that be a reasonable size on the screen/web page?
    I'm just trying to find a quick way to resize pictures for uploading to the web... but it's still a bit vague to me.

  • Changelog.data size growing too big in embedded LDAP /weblogic

    Hi Team,
    We have an embedded LDAP.
    We are having issues setting the number of entries for changelog.data.
    Could any of you help with how we can set the threshold for changelog.data?
    We are using a Linux server and WebLogic as the app server.
    Thanks in advance.

    Thanks..
    But that doesn't seem to be working.
    I have set the following parameters in the startup:
    -javaagent:/app/platform/wily/current/Agent.jar -Dcom.wily.introscope.agentProfile=/app/platform/wily/current/IntroscopeAgent.profile -Dcom.wily.introscope.agent.agentName=Cramer-Dev0-RM-Admin -Dweblogic.security.ldap.changeLogThreshold=10 -Dweblogic.security.ldap.maxSize=1048576
    Prior to setting it to 10 I had set it to 30, cleared changelog.data and restarted; it got regenerated at 28 MB.
    After setting it to 10, the changelog I see is still the same size.
    Could you tell me what went wrong?
    Thanks

  • Reader OLE server does not draw PDF-document if paper size is too big

    If the PDF document paper size is, for example, A0 (841 x 1189 mm), an OLE object can be created and the application shows the PDF content correctly. But if the application saves its document and reopens it, Reader no longer draws the content of the embedded PDF document. Decreasing the PDF document size to A1 solves the problem. The embedded object is not corrupted, because it is possible to open it in Reader.
    I'm confused why the Acrobat Reader OLE server is capable of drawing the PDF file content in the creation phase, but not later, when the application opens the same file containing embedded PDF objects.
    "Application" here means any application that has OLE support. I have tried this with three different applications and the behavior is the same in all of them. Because of this, I assume the application I'm developing does not have a bug.


  • Dvd build size way too big...

    Hi,
    I have a standard-definition DVD project, in 16:9 aspect ratio. All of the MPEG-2 and SD2 files come in at 3.9 GB. I am using layered menus, and I have 10 menus using 3 files. There are no animated menus at all. When I do a build using the preset encoding settings in DVD SP, the build comes out at 6.2 GB. An additional 2 GB for a few scripts and the menus seems extreme.
    Any ideas?
    Thanks for any advice.

    So I think I found the problem, but things still don't add up...
    I have 10 tracks used by the main menu. With the DVD you can also access these same tracks from another menu, a biography menu. In order to have a track jump back to the menu it was accessed from, I duplicated the tracks, assigning one set of tracks to the main menu and the other to the biography menu.
    This is adding to the size, but I'm not sure why, as the duplicated tracks reference the same video and audio content, right? If I have 10 tracks all referencing the same content, why would it increase the build tenfold?
