Big File vs Small File Tablespace

Hi All,
I have a doubt and just want to confirm which is better: using a single bigfile tablespace, or a tablespace made of many small datafiles (or a few big datafiles). I think it is better to use a bigfile tablespace.
Kindly help me out on whether I am right or wrong, and why.

GirishSharma wrote:
Aman.... wrote:
Vikas Kohli wrote:
With respect to performance i guess Big file tablespace is a better option
Why ?
If you allow me, I would like to paste the text below from the documentation link in my first reply:
"Performance of database opens, checkpoints, and DBWR processes should improve if data is stored in bigfile tablespaces instead of traditional tablespaces. However, increasing the datafile size might increase time to restore a corrupted file or create a new datafile."
Regards
Girish Sharma
Girish,
I find it interesting that I've never found any evidence to support the performance claims - although I can think of reasons why there might be some truth to them and could design a few tests to check. Even if there is some truth in the claims, how significant or relevant might they be in the context of a database that is so huge that it NEEDS bigfile tablespaces ?
Database opening:  how often do we do this - does it matter if it takes a little longer - will it actually take noticeably longer if the database isn't subject to crash recovery ?  We can imagine that a database with 10,000 files would take longer to open than a database with 500 files if Oracle had to read the header blocks of every file as part of the database open process - but there's been a "delayed open" feature around for years, so maybe that wouldn't apply in most cases where the database is very large.
Checkpoints: critical in the days when a full instance checkpoint took place at the log file switch - but (a) that hasn't been true for years, (b) incremental checkpointing made a big difference to the I/O peak when an instance checkpoint became necessary, and (c) we have had a checkpoint process for years (if not decades) which updates every file header when necessary rather than requiring DBWR to do it.
DBWR processes: why would DBWn handle writes more quickly - the only idea I can come up with is that there could be some code path that has to associate a file id with an operating system file handle of some sort and that this code does more work if the list of files is very long: very disappointing if that's true.
On the other hand I recall many years ago (8i time) crashing a session when creating roughly 21,000 tablespaces for a database because some internal structure relating to file information reached the 64MB hard limit for a memory segment in the SGA. It would be interesting to hear if anyone has recently created a database with the 65K+ limit for files - and whether it makes any difference whether that's 66 tablespaces with about 1,000 files each, or 1,000 tablespaces with about 66 files each.
Regards
Jonathan Lewis
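For anyone who wants to see what they actually have before taking sides, the data dictionary makes the bigfile/smallfile distinction easy to check. Below is a minimal sketch using the python-oracledb driver; the connection details are placeholders and it assumes an account with SELECT privileges on the DBA views:

import oracledb

# Minimal sketch: list each tablespace, whether it is a bigfile tablespace,
# and how many datafiles it currently has. Connection details are placeholders.
conn = oracledb.connect(user="system", password="...", dsn="dbhost/orclpdb")
with conn.cursor() as cur:
    cur.execute("""
        SELECT t.tablespace_name,
               t.bigfile,
               COUNT(f.file_id) AS datafiles
        FROM   dba_tablespaces t
        LEFT JOIN dba_data_files f
               ON f.tablespace_name = t.tablespace_name
        GROUP  BY t.tablespace_name, t.bigfile
        ORDER  BY t.tablespace_name""")
    for name, bigfile, files in cur:
        print(f"{name:<30} bigfile={bigfile:<3} datafiles={files}")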

Similar Messages

  • How to chunk the XML files into smaller files using PeopleCode.

    We are trying to load the Invoices through the XMLDoc PeopleCode that we wrote long back.
    Previously the files were not that big and we were able to load them successfully. But now each file is more than 200MB and we get the below error when the process is initiated:
    Process Request shows status of 'INITIATED' or 'PROCESSING' but no longer running.
    I tried to chunk a file manually to 50MB and it ran properly, but chunking manually is not possible for all the files, as it is a time-consuming process.
    Is there any way we can check on the web server/app server what is happening and how to handle this error?
    Or is there any sample code to divide the big file into smaller files? (See the sketch at the end of this thread.)
    Please let me know your thoughts.

    Good morning Ashok,
    Your data is in a SQL*Loader supported datatype, so basically it should not be a problem.
    Have you checked the Database Utilities guide (for Oracle9i for instance: http://www.lc.leidenuniv.nl/awcourse/oracle/server.920/a96652/toc.htm) on how to handle this?
    I'll also repeat my previous question, have you ever been able to load data into the database using SQL*Loader (either using ASCII values or any other datatype)?
    For the OWB part, have you read "Importing Data Definitions" (typically chapter 4)? More specifically, "Specifying Field Properties" in "About Flat File Sources and Targets"?
    In "SQL*Loader Properties" it says the following:
    Type: Describes the data type of the field for SQL*Loader. You can use the Flat File Sample Wizard to import the following data types: CHAR, DATE, DECIMAL EXTERNAL, FLOAT EXTERNAL, INTEGER EXTERNAL, ZONED, and ZONED EXTERNAL. For complete information on SQL*Loader field and data types, refer to Oracle9i Utilities. Currently, only portable datatypes are supported.
    If you check the Database Utilities guide (chapter 6, Field List Reference) and look for "SQL*Loader Datatypes", you'll find (packed) decimal under the Nonportable Datatypes. This implies that OWB does not support it.
    This does not mean you can't use SQL*Loader; you'll only have to define everything separately from OWB and call it separately as well.
    Good luck, Patrick
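    Coming back to the request in the original question for sample code to divide the big file into smaller files: below is a minimal sketch in Python (outside PeopleCode) that streams a large XML file and rewrites it as several smaller ones. The root tag "InvoiceBatch" and record tag "Invoice" are assumptions; substitute the element names from your actual invoice schema, and note that namespaced tags would need the full "{namespace}name" form.

    import xml.etree.ElementTree as ET

    RECORDS_PER_FILE = 1000   # tune so each output file stays well under the size limit

    def split_xml(source_path, record_tag="Invoice", root_tag="InvoiceBatch"):
        batch, file_no = [], 0

        def flush(records):
            nonlocal file_no
            file_no += 1
            root = ET.Element(root_tag)
            root.extend(records)
            ET.ElementTree(root).write(f"chunk_{file_no:03d}.xml",
                                       encoding="utf-8", xml_declaration=True)
            for rec in records:       # release the parsed subtrees to keep memory flat
                rec.clear()

        for _, elem in ET.iterparse(source_path, events=("end",)):
            if elem.tag == record_tag:
                batch.append(elem)
                if len(batch) == RECORDS_PER_FILE:
                    flush(batch)
                    batch = []
        if batch:                     # leftover records
            flush(batch)

    split_xml("invoices.xml")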

  • How To Divide Source File Into Smaller Files Without Exporting?

    Hello Everyone,
    I'm a beginner with PPro CS4 and to video editing in general. I've done a lot of Googling and searching of these forums to get an answer to this question, but I haven't found a clear answer. Here's my problem:
    I have many hours of HDV footage shot over the past few years with a Canon HV30. This footage is on dozens of DV tape cassettes. I want to capture most of it to disk.
    When I capture a tape to disk, I get a single large mpeg4 file. (PPro captures all HDV footage to mpeg4.) "Scene select" doesn't work with the HV30, so my only other way to make smaller capture files would be to set In/Out points with batch capture. But this would mean hours of fast-forwarding and reversing with the VCR-like controls on the camera to review all the tapes and set the points. I thought it might be FASTER to simply capture an entire tape in one operation, THEN review the captured tape within PPro (it's much faster using the scrubber), and then divide that file into smaller files within PPro, thereby avoiding all the mechanical fast-forwarding and rewinding on the camera itself.
    But the only way I can find to do this seems to involve EXPORTING clips. Am I wrong to think that exporting a clip to a new file involves a degradation of the original mpeg4 source file? Or will there be no loss to the quality if I simply export to mpeg4 format?
    It seems there should be an easy way to simply divide the original capture into smaller source files for later editing, without putting the footage through another generation of processing (exporting), which might entail some loss of quality. Or am I wrong about this?
    (CS4 Master Suite, i7 3630k, 16GB RAM, C: 256GB SSD, D: 1T Setpoint F3, E: 1T Setpoint F3)

    HDV footage is MPEG-2 (what's in a number?). CS4 does not do scene detect; you need to use HDVSplit for that.
    http://strony.aster.pl/paviko/hdvsplit.htm
    You can run your already-captured files through HDVSplit and it will chop the file into clips, and off you go.

  • How can I split a pdf file into smaller files using Acrobat XI

    How can I split a pdf file into smaller files using Acrobat XI?

    Hi laforcej
    Open the PDF in Acrobat.
    Go to Tools -> Pages -> Extract.
    Now select the page numbers you want to extract and save them.

  • Big file or small file

    Hi,
    I'm in the importing phase of a large database, 1.7 TB, distributed over 14 filesystems (/oracle/<SID>/sapdata1-14) of 200 GB each, more or less. Any suggestion for the size of each datafile? Big? Small? Generally we use brtools on disk as the backup procedure.
    Regards.

    Hello Ganimede,
    From the view of Oracle itself it doesn't really matter, so you can use the max size of a data file (MetaLink note #271482.1 -> "4,194,303 multiplied by the value of the DB_BLOCK_SIZE parameter").
    But please think about your backup/restore procedures. If you are using RMAN with block recovery, the recovery would be much faster if your data files are small and each is located in its own backupset, etc...
    We use roughly 10 GB files.
    Regards
    Stefan
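    As a quick sanity check on the numbers above, here is a back-of-the-envelope sketch in Python; the 8 KB DB_BLOCK_SIZE is an assumption, so adjust it for the actual database:

    # Back-of-the-envelope check of the sizes discussed above.
    db_block_size = 8192                               # assumed 8 KB blocks
    max_blocks_per_datafile = 4_194_303                # figure quoted from note 271482.1

    max_datafile = max_blocks_per_datafile * db_block_size
    print(f"max datafile size: {max_datafile / 2**30:.1f} GiB")    # ~32 GiB

    database_size = 1.7 * 2**40                        # the 1.7 TB import
    per_file = 10 * 2**30                              # ~10 GB files as suggested
    print(f"datafiles needed:  {database_size / per_file:.0f}")    # ~174 files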

  • How To Split Large Excel or CSV Files into Smaller Files

    Does anyone know how to split a large Excel or CSV file into multiple smaller files? Or is there an app for the Mac that will do that?

    split [-a suffix_length] [-b byte_count[k|m]] [-l line_count] [-p pattern] [file [name]]
    is a native Terminal command. Read up more at https://developer.apple.com/library/mac/documentation/Darwin/Reference/ManPages/man1/split.1.html
    I prefer to use gsplit, the GNU coreutils version of split.
    You can install GNU coreutils using Homebrew.
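    One thing split will not do is repeat the header row in each piece, which usually matters for CSV. Here is a minimal Python sketch that does keep the header; the file name and the 100,000-line chunk size are just placeholders:

    import csv

    LINES_PER_FILE = 100_000   # placeholder chunk size

    def split_csv(source_path, lines_per_file=LINES_PER_FILE):
        with open(source_path, newline="") as src:
            reader = csv.reader(src)
            header = next(reader)
            out, writer, part = None, None, 0
            for i, row in enumerate(reader):
                if i % lines_per_file == 0:        # start a new output file
                    if out:
                        out.close()
                    part += 1
                    out = open(f"part_{part:03d}.csv", "w", newline="")
                    writer = csv.writer(out)
                    writer.writerow(header)        # repeat the header in every piece
                writer.writerow(row)
            if out:
                out.close()

    split_csv("big_export.csv")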

  • Break large TDMS file into small files

    hello all
    My TDMS file is around 3 GB, and it needs to be split into files of around 10 MB each.
    I ran McRorie's splitFiles.vi (15 KB) from this page and set the number of samples per file to 5,000,000; however, I cannot get the results I need: every small file is only 1 KB, with no data inside. What could be the problem?
    I also tried to write a VI based on the sample VI (Read TDMS File) by adding one "write to measurement file.vi". However, when I set the small file size to 10 MB inside "write to measurement file.vi", the first file comes out around 20 MB, the next few files may be correct at 10 MB, and then it just stops splitting, ending with a file even larger than the original. I uploaded my VI here; maybe someone can help find the mistake for me.
    Thanks very much!
    Wuwei Mao
    Attachments:
    Read TDMS File.vi (54 KB)

    Hi Wuwei,
    After giving the correct data type to the TDMS Read node in splitFiles.vi, it works as expected. (See the two attached VIs: createFile.vi and the modified splitFiles.vi.)
    Because I don't know how you created your TDMS file, I wrote a new 3 GB TDMS file, which has one group and one channel. The data type of the samples is unsigned 16-bit integer. The total number of samples is 1,610,612,736. Then I set the number of samples per file to 5,000,000 as you did, so after splitting, each file's size is 5,000,000 * (16/8) bytes (around 10 MB).
    Please make sure the following steps have been done before you run splitFiles.vi:
    1. The TDMS file to be split has been put at the proper path;
    2. The correct group and channel names have been given;
    3. The correct data type has been given to the TDMS Read node.
    Your second option, using "write to measurement file.vi" to split the TDMS file, will lose some information, such as group and channel names, so I suggest using the method in splitFiles.vi to accomplish your goal.
    Jie Zheng
    NI R&D
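    For anyone wondering how the sample counts above turn into file sizes, the arithmetic in the reply can be checked in a few lines (a sketch only, using the figures quoted above):

    import math

    bytes_per_sample = 16 // 8            # unsigned 16-bit integers
    samples_per_file = 5_000_000
    total_samples = 1_610_612_736

    print(f"each split file: {samples_per_file * bytes_per_sample / 1e6:.0f} MB")   # ~10 MB
    print(f"whole TDMS file: {total_samples * bytes_per_sample / 2**30:.0f} GiB")   # ~3 GB
    print(f"files produced:  {math.ceil(total_samples / samples_per_file)}")        # 323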

  • Large files or small files

    I work with still images and use iMovie to make a slide show with music. It works rather well.
    I'm interested to know whether low-res JPGs would allow me to use twice as many images vs. maximum full-size JPG images; these images will be shown on a 50" flat screen.
    I would also like to know: is there a way of checking the iMovie file size while it is being designed?
    Also, what type of DVD should I use to record this movie if it's going to be played back on a DVD player or a computer?
    Thank you

    Hi
    If you are using the iDVD SlideShow function, then there is a limit to know about: 99 photos per slideshow.
    You can have up to 99 slideshows on a DVD.
    I make mine in any video editor except iMovie '08 through '11, as none of them can deliver the interlaced SD video that iDVD and other DVD authoring programs demand; otherwise quality will suffer severely.
    I use
    • iMovie HD6
    • Final Cut Express or Pro
    • FotoMagico™ (incl. the Roxio Toast™ bundle), burned as Blu-ray when quality is important
    Using HD material will not help; the result will suffer here too due to bad downscaling.
    Small vs. big photo files: yes, there have been reported problems with large photos (whatever "large" is), and it's not the size of the compressed .jpg file that matters but the size of the uncompressed photo.
    Mine are from 2 MB to about 12 MB as .tiff (.tiff is not used in the projects, always .jpg, but it indicates the uncompressed size) and they work fine.
    But when pictures are 50 MB to 500 MB (yes, I have one of those), I guess things might go wrong.
    Yours Bengt W

  • How to cut up a large video file into smaller files?

    I have a 30-minute video file I recorded on my camera. I need to make it into 6 smaller videos so I can work on them separately instead of importing such a large file into Final Cut. Is there a way I can do this with Final Cut Pro?
    Thanks in advance!

    No. FCP is a non-destructive editor. Once you ingest a clip it stays there on your drive, unchanged, until you delete it. FCP is really a database with instructions for playing back clips in order, applying filters, audio adjustments, etc. If you want separate clips you have two options: either log and capture in chunks, or set markers on the clip, export smaller pieces, then re-import them.
    But I guess the question in my mind is, if it's already captured, why bother? You'll do your editing of the segments and export the finished pieces anyway...

  • How do I break up a pdf file into smaller files?

    I have a huge PDF file that I am trying to send as an attachment, but it is too large to send. I need to break it up into about 10 different files. Any help would be appreciated.

    Hi sbills04,
    To split a PDF (or extract pages), you need to use Acrobat. Please see https://acrobatusers.com/tutorials/how-to-break-a-pdf-into-parts
    Please let us know if you have additional questions.
    Best,
    Sara
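    If scripting is an option alongside Acrobat, the same job can be done with the third-party pypdf package; below is a minimal sketch that breaks a PDF into a chosen number of roughly equal parts (the file name and the count of 10 are placeholders taken from the question):

    import math
    from pypdf import PdfReader, PdfWriter   # pip install pypdf

    def split_pdf(source_path, parts=10):
        reader = PdfReader(source_path)
        total = len(reader.pages)
        pages_per_part = math.ceil(total / parts)
        for part in range(parts):
            start = part * pages_per_part
            if start >= total:                # fewer pages than requested parts
                break
            writer = PdfWriter()
            for idx in range(start, min(start + pages_per_part, total)):
                writer.add_page(reader.pages[idx])
            with open(f"part_{part + 1:02d}.pdf", "wb") as out:
                writer.write(out)

    split_pdf("huge_attachment.pdf", parts=10)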

  • Is there a design pattern for splitting up files into smaller files?

    I am developing a project where I have to load very large files (up to 50 MB). Currently I am loading these files completely into (consecutive) memory. This has the advantage that I can very easily change bytes at certain locations, because I do not know the structure of all the bytes.
    However, my intention is to also change the structure, e.g. removing/adding 'chunks'. Now I have the idea to take the 'known' parts out of it, store them in classes with a data chunk only containing those parts, and make a sort of reference list to those chunks.
    E.g.:
    Original file:
    Header
    ChunkA 1
    ChunkA 2
    Intermediate
    ChunkB 1
    Footer
    The result will be:
    ChunkA 1 and ChunkA 2 instances, a ChunkB 1 instance,
    a 'File' instance, and a reference list with base offsets + references to all chunks.
    At the end I have to 'recreate' or write the original file (with changes) back.
    Is this in general a good idea or is there some design pattern helping me in this?

    50 MB is not much in the modern era of 6 GB+ machines. If you want to optimize memory then consider using a memory mapped file.
    But you mentioned making data structure changes. This is generally dangerous as you have to be concerned about things like disaster recovery. What happens if you are in the middle of saving the modified structure when the program dies? You just corrupted your file. A better solution is to stream the existing file using BinaryReader to read in the existing file in parts based upon the structure. Write out the data to a new, temporary file using BinaryWriter. This may be the original data or your modifications, depending upon need. Once you've generated the new file, replace the old file (with optional backup). So even if something catastrophic happens during saving you don't lose the original file.
    Michael Taylor
    http://blogs.msmvps.com/p3net
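    The "stream to a new, temporary file, then swap it in" advice above is language-agnostic; here is a rough illustration of the same pattern as a Python sketch (transform_chunk is a placeholder for whatever structural changes you actually make):

    import os
    import shutil
    import tempfile

    CHUNK_SIZE = 1024 * 1024      # stream in 1 MB pieces instead of loading 50 MB at once

    def transform_chunk(chunk: bytes) -> bytes:
        return chunk              # placeholder: apply your real modifications here

    def rewrite_file(path: str, keep_backup: bool = True) -> None:
        dir_name = os.path.dirname(os.path.abspath(path))
        # Write the modified copy next to the original so the final replace is atomic.
        fd, tmp_path = tempfile.mkstemp(dir=dir_name, suffix=".tmp")
        try:
            with os.fdopen(fd, "wb") as dst, open(path, "rb") as src:
                while chunk := src.read(CHUNK_SIZE):
                    dst.write(transform_chunk(chunk))
            if keep_backup:
                shutil.copy2(path, path + ".bak")
            os.replace(tmp_path, path)     # atomic swap; a crash never corrupts the source
        except BaseException:
            os.unlink(tmp_path)            # on failure, drop the temp file, keep the original
            raise

    rewrite_file("big_input.bin")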

  • How to split a big file into several small files using XI

    Hi,
    Is there any way to split a huge file into small files using XI?
    Thanks
    Mukesh

    There is a how-to guide for using file adapters; an SDN search will get you the document.
    Based on that, read the file into XI, use strings to store the file content, and do the split.
    Here is some code to get you started:
    ===========
    Pseudocode:
    ===========
    // This can be passed in - however many output files to which the source is split
    numOutputFiles = 5;
    // Create the same number of file handles as numOutputFiles specifies
    open file1, file2, file3, file4, file5;
    // Create an array to hold the references to those file handles
    Array[5] fileHandles = {file1, file2, file3, file4, file5};
    // Initialize a loop counter (0-based, so it can index the array directly)
    int loopCounter = 0;
    // A temporary holder to "point" the output stream to the correct output file
    File currentOutputFile = null;
    // Loop through your input file, spreading the lines round-robin across the output files
    while ((line = sourceFile.nextLine) != null) {
        currentOutputFile = fileHandles[loopCounter]; // get the output file at that index of the array
        currentOutputFile.write(line);
        loopCounter++;
        if (loopCounter == numOutputFiles) { // you've reached the last file, loop back to the first
            loopCounter = 0;
        }
    }
    regards
    krishna

  • How to convert a large PDF into smaller files

    How does one convert a large PDF file into smaller files and not get odd page breaks? I need to get the page breaks at the headers. I'm just looking for instructions for an end user of Acrobat Pro who reports getting random page breaks and couldn't get help from the Adobe help desk. I was directed here when I tried to contact support ... figured this forum may have information ... thanks.

    You can't. If you split the file into pages, it will remain in the same layout as the original. PDF files were not meant to be edited in the way you're describing.

  • Small file to big file

    Hi All,
    Can anyone tell me how to convert a smallfile tablespace to a bigfile tablespace? Below is my database version:
    Oracle Database 10g Enterprise Edition Release 10.2.0.2.0 - 64bit Production
    PL/SQL Release 10.2.0.2.0 - Production
    CORE 10.2.0.2.0 Production
    TNS for Solaris: Version 10.2.0.2.0 - Production
    NLSRTL Version 10.2.0.2.0 - Production

    DUPLICATE*
    ===============================================
    small file to big file
    ===============================================

  • Compressing photo size in iPhoto - want file size small but pic big? How?

    Hello
    I hope this is the right section for this post.
    I am building a website and I want photos on it; I am building it in iWeb.
    In iPhoto, is there any way of making my photos a smaller file size?
    I have cropped them a bit, but some are still about 400 KB.
    I want my site to load quickly.
    I want a gallery page with thumbnails to click on to enlarge them, but on my home page I want one big picture. What is compression, and can I do it in iPhoto? Can I still have a picture the same size (to look at) but with a smaller file size? I have heard photos should be about 30-70 KB and mine are 400 KB JPEGs!?
    MacBook, Mac OS X (10.4.8), using iPhoto and iWeb, not hosting on .Mac

    Betsy
    I believe that iWeb optimises the pics when you publish the site, but you could confirm that on the iWeb forum.
    If you want to reduce the size of pics using iPhoto, you do that on export: Select the pics you want and go File -> Export. Use the dialogue to specify the size of the pics you require and export them to a folder on the desktop.
    Regards
    TD
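    If you ever want to do the same thing outside iPhoto with a script, the third-party Pillow package can resize and recompress JPEGs in a few lines. This is only a sketch; the 1280-pixel bound and quality setting are assumptions to tune for your page:

    from pathlib import Path
    from PIL import Image        # pip install Pillow

    def shrink_for_web(folder, max_px=1280, quality=70):
        out_dir = Path(folder) / "web"
        out_dir.mkdir(exist_ok=True)
        for src in Path(folder).glob("*.jpg"):
            with Image.open(src) as img:
                img.thumbnail((max_px, max_px))   # shrink in place, keeping the aspect ratio
                img.save(out_dir / src.name, "JPEG", quality=quality, optimize=True)

    shrink_for_web("exported_photos")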
