Library too Large for Vault?

I just bought an iMac 27" with Mavericks.  I was on an iMac running Snow Leopard and Aperture 3.2.4.  I used Migration Assistant to move everything over and updated to Aperture 3.5.1.  My Aperture Library is 550 GB (40,000 images).  Now I can't use Vault!  Every time I try (to two different external drives), it stalls and when I look I see that Aperture is not responding.  I've let it go for hours but eventually have to Force Quit.  When I look at the Vault file, it has about 33 GB in it.
The Apple Pro support people suggested that the problem could have to do with the size of my library.  Indeed, we created a very small library and it saved to Vault without difficulty.   However, I'm not wild about dividing my library and it seems odd to me that I'm having this problem now when I didn't have it with Snow Leopard and Aperture 3.2.4.
Anyone else having this problem?  Thoughts?
Thanks!

I have found that Aperture vaults (at least up to 100 GB) will transfer reasonably quickly to a directly connected drive.  The same is not true of transferring vaults past about 20 GB to a network drive (to be clear: a drive that is not directly connected by wire and is accessed over Wi-Fi).  Transferring vaults over 20 GB to these network drives can take hours to days, if it succeeds at all.  What is daunting is that Aperture reports it is working (status bar and activity report in the bottom-left corner) but is locked up from any further activity, and the Force Quit panel reports Aperture as not responding.  In this state I have had files still transfer after lengthy intervals, although after a couple of days I usually give up; Force Quit is the only way out.  Do this once or twice and you get the message that you need to rebuild your database.  I have found that rebuilding the database does help and speeds up the transfer to a network drive, so long as you stay under about 50 GB.  It does not seem to have any impact on larger vaults.
I have split up my big vault into smaller vaults by category, and this has helped somewhat, but you end up with all these vaults, which can be a pain.  I still have some larger vaults, and these continue to be problematic on the network drive.
I have a late-model iMac, and all my drives (local and network) are Western Digital with the latest software and firmware (as of April 2014).

Similar Messages

  • HT1229 My iPhoto Library (version 8.1.2) is 280GB and was too large for me to drag to a new hard drive, so the Apple geniuses did it for me; however they did not delete the Library from my iMac

    My iPhoto Library (version 8.1.2) is 280GB (greater than 50% of my 500GB total storage memory on my iMac.  It was too large for me to drag it to a new hard drive so the Apple geniuses did it for me.  However they did not delete the Library from my iMac (that's my responsibility).  I dragged it to Trash and when it started to move I clicked over to the new hard drive to confirm it had indeed been copied.  I became nervous when I didn't see among the few files on this otherwise empty new hard drive anything that resembled a 280GB Library so I cancelled the migration to trash.
    How can I be sure that my iPhoto library has been copied and that all my "metadata" survived intact?

    the new backup drive
    I thought the new drive would be your data drive to host the iPhoto library. Do you also use it for TimeMachine backups?
    I am unable to search either in email as well as Finder.  I AM able to search within iPhoto though, thankfully
    Spotlight may still be busy rebuilding its index.
    You could try rebuilding the Spotlight index if you see no progress:
    Spotlight: How to re-index folders or volumes
    I hope other frequent posters will drop in. I have not used iPhoto 8.x in a long time.

  • My iTunes library is too large for my iPod classic--can I create a separate library so that I can split the current library and sync the distinct libraries onto separate iPods?

    My iTunes library is too large for my iPod classic--can I create a separate library so that I can split the current library and sync the distinct libraries onto separate iPods?

    - If the Gmail account is IMAP, then if the library is erased and you use the account again, the old email will be redownloaded.
    - Erasing the library on the MacBook does not affect what is on the iMac. On the other hand, if you deleted the messages in the Mail app on the MacBook, they will be deleted from all devices.
    - See the middle of the following; it would seem to help in your situation:
    http://www.howtogeek.com/209517/how-to-stop-your-macs-mail-app-from-wasting-gigabytes-of-space/

  • Library too large

    I use photos professionally, but my Aperture library is now becoming too large for the hard drive on my MacBook Air.  I need to be able to take my MacBook with me at times to work on these photos.  What's the best way for me to do this?  I also have this library backed up to a vault on the same hard drive, which seems silly.  What's the best way for me to manage things without losing the ability to work on the photos within a single library?  I thought about splitting things into multiple libraries, but then a recent photo that belongs with an old photo could no longer reside within the same project.

    Get a large, fast external drive and put your library there, and bring it with you. Also have an up-to-date backup that you leave back at your home or office.

  • Library too large error.

    I was running tight on space on both my library and vault drives, so I referenced a bunch of masters to a third drive. The library drive now has plenty of space and the application has actually sped up. However, I'm not able to back up the vault. The drives are the same size and I'm getting a Library too large error message for the vault drive. Is there a trick I'm missing, or do I need to delete and recreate the vault?
    Thanks for the help!
    Steve

    OneDrive for Business/SkyDrive Pro has a 5000 item limit per Library. There is no workaround, other than reducing the number of items.
    Trevor Seward
    This post is my own opinion and does not necessarily reflect the opinion or view of Microsoft, its employees, or other MVPs.

  • "This backup is too large for the backup volume" - Info

    Hi there. I had a problem with my Time Machine and got an error stating "This backup is too large for the backup volume". After logging in I noticed that TM was indexing in the upper-right corner (the magnifier with a flashing dot, i.e. Spotlight) for a few seconds. So then I chose "Back Up Now"; it began preparing and then I got the error message described above. Here is what I did:
    1. If you run an anti-virus app, disable auto-protection, or exclude the Time Machine app and its plist file (location below) in the anti-virus preferences; otherwise backups will take forever. I uninstalled my anti-virus, though that turned out not to be my issue.
    2. Turn off Time Machine and delete this .plist file: Macintosh HD > Library > Preferences > com.apple.TimeMachine.plist. STOP here if this fixed your problem after restarting. With the plist deleted, plugging in your TM drive will automatically ask if you want to use the drive as a TM backup. This did the trick for me.
    3. If needed, erase the Time Machine external drive in Disk Utility, or partition it if you want more than one partition, and rename it. **THESE STEPS WILL ERASE YOUR ENTIRE BACKUPS.** In Disk Utility > Partition tab > Options you must choose GUID for Intel Macs or Apple Partition Map for PowerPC (sorry, a lot of newbies out there).
    To let you guys know, I also used the app Cocktail and used a feature it has to erase my computer's Spotlight index and rebuild it. In Cocktail, when your Time Machine drive is plugged in, you can also erase its index and disable indexing on it altogether. I recommend you first disable Spotlight on the backup drive (before the first initial TM backup) in System Preferences > Spotlight > Privacy tab: click the plus to add the Time Machine drive, which has to be mounted (plugged in) to add it from the window under "Devices".

    http://www.macfixit.com/article.php?story=20090403093528353

  • This backup is too large for the backup volume - ridiculous backup size

    I have had big problems with Time Machine today. I have been successfully using it for 2 months now. I have about 30Gb spare on my iMac hard drive (out of 233Gb) and my Time Machine uses 195Gb out of 466Gb available.
    However, today I have lost all my Time Machine history except for one backup made this morning, and I am now getting an error message saying:
    "Time Machine Error
    This backup is too large for the backup volume. The backup requires 161061280.0 GB but only 270.4 GB are available.
    To select a larger volume, or make backup smaller by excluding files, open System Preferences and choose Time Machine"
    Clearly the required backup volume is wrong - that's 161,000.0TB!!!
    The same message has come up 3 times now.
    Can anyone help/advise on what to do next?

    Thanks Peggy,
    I re-indexed Spotlight and ran Time Machine again, but got exactly the same error message and the same massive storage required.
    I've opened Console as you suggested and these are a set of the messages from one backup attempt yesterday (Console doesn't show anything earlier than 17:32 yesterday, but the problem started earlier than that I believe):
    "16/01/2008 17:32:52 /System/Library/CoreServices/backupd[326] Backup requested by automatic scheduler
    16/01/2008 17:32:52 /System/Library/CoreServices/backupd[326] Starting standard backup
    16/01/2008 17:32:52 /System/Library/CoreServices/backupd[326] Backing up to: /Volumes/Time Machine Backups/Backups.backupdb
    16/01/2008 17:32:53 /System/Library/CoreServices/backupd[326] Event store UUIDs don't match for volume iMac Hard Drive
    16/01/2008 17:32:53 /System/Library/CoreServices/backupd[326] Node requires deep traversal:/ reason:kFSEDBEventFlagMustScanSubDirs|kFSEDBEventFlagReasonEventDBUntrustable|
    16/01/2008 17:44:48 /System/Library/CoreServices/backupd[326] Starting pre-backup thinning: 157286.40 TB requested (including padding), 270.35 GB available
    16/01/2008 17:44:48 /System/Library/CoreServices/backupd[326] No expired backups exist - deleting oldest backups to make room
    16/01/2008 17:44:48 /System/Library/CoreServices/backupd[326] Error: backup disk is full - all 0 possible backups were removed, but space is still needed.
    16/01/2008 17:44:48 /System/Library/CoreServices/backupd[326] Backup Failed: unable to free 157286.40 TB needed space
    16/01/2008 17:44:49 /System/Library/CoreServices/backupd[326] Backup failed with error: Not enough available disk space on the target volume."

  • Backup too large for volume

    I have 2 macbook pro's (120GB & 160GB) backing up to a 500GB TM.
    both were backing up just fine, however in the past month the 160GB
    macbook pro keeps getting this message.....
    "backup too large for volume?"
    and subsequently the backup fails?
    the size of the backup is less than the free space on the TM drive...
    any help?

    dave,
    *_Incremental Backups Seem Too Large!_*
    Open the Time Machine Prefs on the Mac in question. How much space does it report you have "Available"? When a backup is initiated how much space does it report you need?
    Now, consider the following, it might give you some ideas:
    Time Machine performs backups at the file level. If a single bit in a large file is changed, the WHOLE file is backed up again. This is a problem for programs that save data to monolithic virtual disk files that are modified frequently. These include Parallels, VMware Fusion, Aperture vaults, or the databases that Entourage and Thunderbird create. These should be excluded from backup using the Time Machine Preference Exclusion list. You will, however, need to backup these files manually to another external disk.
    One poster observed regarding Photoshop: “If you find yourself working with large files, you may discover that TM is suddenly backing up your scratch disk's temp files. This is useless, find out how to exclude these (I'm not actually sure here). Alternatively, turn off TM whilst you work in Photoshop.” [http://discussions.apple.com/thread.jspa?threadID=1209412]
    If you do a lot of movie editing, unless these files are excluded, expect Time Machine to treat revised versions of a single movie as entirely new files.
    If you frequently download software or video files that you only expect to keep for a short time, consider excluding the folder these are stored in from Time Machine backups.
    If you have recently created a new disk image or burned a DVD, Time Machine will target these files for backup unless they are deleted or excluded from backup.
    *Events-Based Backups*
    Time Machine does not compare file for file to see if changes have been made. If it had to rescan every file on your drive before each backup, it would not be able to perform backups as often as it does. Rather, it looks for EVENTS (fseventsd) that take place involving your files and folders. Moving/copying/deleting/saving files and folders creates events that Time Machine looks for. [http://arstechnica.com/reviews/os/mac-os-x-10-5.ars/14]
    Installing new software, upgrading existing software, or updating Mac OS X system software can create major changes in the structure of your directories. Every one of these changes is recorded by the OS as an event. Time Machine will backup every file that has an event associated with it since the installation.
    Files or folders that are simply moved or renamed are counted as NEW files or folders. If you rename any file or folder, Time Machine will back up the ENTIRE file or folder again no matter how big or small it is.
    George Schreyer describes this behavior: “If you should want to do some massive rearrangement of your disk, Time Machine will interpret the rearranged files as new files and back them up again in their new locations. Just renaming a folder will cause this to happen. This is OK if you've got lots of room on your backup disk. Eventually, Time Machine will thin those backups and the space consumed will be recovered. However, if you really want recover the space in the backup volume immediately, you can. To do this, bring a Finder window to the front and then click the Time Machine icon on the dock. This will activate the Time Machine user interface. Navigate back in time to where the old stuff exists and select it. Then pull down the "action" menu (the gear thing) and select "delete all backups" and the older stuff vanishes.” (http://www.girr.org/mac_stuff/backups.html)
    *TechTool Pro Directory Protection*
    This disk utility feature creates backup copies of your system directories. Obviously these directories are changing all the time. So, depending on how it is configured, these backup files will be changing as well which is interpreted by Time Machine as new data to backup. Excluding the folder these backups are stored in will eliminate this effect.
    *Backups WAY Too Large*
    If an initial full backup or subsequent incremental backup is tens or hundreds of Gigs larger than expected, check to see that all unwanted external hard disks are still excluded from Time Machine backups.
    This includes the Time Machine backup drive ITSELF. Normally, Time Machine is set to exclude itself by default. But on rare occasions it can forget. When your backup begins, Time Machine mounts the backup on your desktop. (For Time Capsule users it appears as a white drive icon labeled something like “Backup of (your computer)”.) If, while it is mounted, it does not show up in the Time Machine Prefs “Do not back up” list, then Time Machine will attempt to back ITSELF up. If it is not listed while the drive is mounted, then you need to add it to the list.
    *FileVault / Boot Camp / iDisk Syncing*
    Note: Leopard has changed the way it deals with FileVault disk images, so it is not necessary to exclude your Home folder if you have FileVault activated. Additionally, Time Machine ignores Boot Camp partitions as the manner in which they are formatted is incompatible. Finally, if you have your iDisk Synced to your desktop, it is not necessary to exclude the disk image file it creates as that has been changed to a sparsebundle as well in Leopard.
    If none of the above seem to apply to your case, then you may need to attempt to compress the disk image in question. We'll consider that if the above fails to explain your circumstance.
    Cheers!

  • Time Machine iTunes Library too big for new Macbook Air

    My Macbook Pro went out of commission some time ago, I was fortunate enough to be able to get a Macbook Air to replace it. I had a Time Machine backup of all my old files on an external hard drive, however I had quite an extensive music library that is far too large for my current storage. I do have a backup computer with Windows 7. Is there some way I can access my full music library (over 200 gb) on a different computer through the Time Machine backup or in some fashion so I can put the music I want onto my new computer?

    NEVERMIND I FOUND A SOLUTION

  • I have a 500 GB hard drive and a 1TB Time Capsule running on a MacBook Pro.  It was all working well until the MacBook went in for a repair a week or so ago.  Since then, TC will not perform a backup;  instead, it says the backup is too large for the disk

    Since having my MacBook Pro repaired (for a video problem) Time Capsule returns the following message:  "This backup is too large for the backup disk. The backup requires 428.08 GB but only 192.14 GB are available."
    I notice that there is also a new sparse bundle.
    Since TC has my ONLY backup (going back about 4 years) I am reluctant to wipe it and start over fresh as I am afraid of losing files. 
    Is there a way of dealing with this?
    I am using Snow Leopard 10.6.8

    The repair shop likely replaced a major circuit board on your MacBook Pro, so Time Machine thinks that you have a "new" computer and it wants to make a new complete backup of your Mac.
    You are going to have to make a decision to either add another new Time Capsule....or USB drive to your existing Time Capsule....and in effect start over with a new backup of your Mac and then move forward again.
    For "most" users, I think this is probably the best plan because you preserve all your old backups in case you need them at some point, and you start over again with a new Time Capsule so you have plenty of room for years of new backups.
    Or, as you have mentioned, you have the option of erasing the Time Capsule drive and starting all over again. The upside is that you start over and have plenty of room for new backups. The downside is that you lose years of backups.
    Another option....trying to manually delete old backups individually....is tricky business....and very time consuming. To get an idea of what is involved here, study this FAQ by Pondini, our resident Time Capsule and Time Machine expert on the Community Support area. In particular, study the pink box.
    http://web.me.com/pondini/Time_Machine/12.html
    Once you look through this, I think you may agree that this type of surgery is not for the faint of heart.  I would suggest that you consider this only if one of the other options just cannot work for you.

  • Cannot decrypt RSA encrypted text : due to : input too large for RSA cipher

    Hi,
    I am in a fix trying to decrypt this RSA encrypted String ... plzz help
    I have the encrypted text as a String.
    This is what I do to decrypt it using the Private key
    - Determine the block size of the Cipher object
    - Get the array of bytes from the String
    - Find out how many block sized partitions I have in the array
    - Encrypt the exact block sized partitions using update() method
    - Ok, now its easy to find out how many bytes remain (using % operator)
    - If the remaining bytes is 0 then simply call the 'doFinal()'
    i.e. the one which returns an array of bytes and takes no args
    - If the remaining bytes is not zero then call the
    'doFinal(byte [] input, int offset, in inputLen)' method for the
    bytes which actually remained
    However, this doesn't work. This is making me go really crazy.
    Can anyone point out what's wrong? Plzz
    Here is the (childish) code
    Cipher rsaDecipher = null;
    // The initialization stuff for rsaDecipher
    // The rsaDecipher Cipher is using 256 bit keys
    // I haven't specified anything regarding padding
    // And, I am using BouncyCastle
    String encryptedString;
    // read in the string from the network
    // this string is encrypted using an RSA public key generated earlier
    // I have to decrypt this string using the corresponding Private key
    byte[] input = encryptedString.getBytes();
    int blockSize = rsaDecipher.getBlockSize();
    int outputSize = rsaDecipher.getOutputSize(blockSize);
    byte[] output = new byte[outputSize];
    int numBlockSizedPartitions = input.length / blockSize;
    int numRemainingBytes = input.length % blockSize;
    boolean hasRemainingBytes = (numRemainingBytes > 0);
    int offset = 0;
    int inputLen = blockSize;
    StringBuffer buf = new StringBuffer();
    for (int i = 0; i < numBlockSizedPartitions; i++) {
      output = rsaDecipher.update(input, offset, blockSize);
      offset += blockSize;
      buf.append(new String(output));
    }
    if (hasRemainingBytes) {
      // This is exactly where I get the "input too large for RSA cipher"
      // which is suffixed with ArrayIndexOutOfBounds
      output = rsaDecipher.doFinal(input, offset, numRemainingBytes);
    } else {
      output = rsaDecipher.doFinal();
    }
    buf.append(new String(output));
    // After having reached this point, will it be wrong if I assume that I
    // have the properly decrypted string?

    Hi,
    "I am in a fix trying to decrypt this RSA encrypted String ... plzz help"
    You're already broken at this point.
    Repeat after me: ciphertext CANNOT be safely represented as a String. Strings have internal structure - if you hand ciphertext to the new String(byte[]) constructor, it will eat your ciphertext and leave you with garbage. Said garbage will fail to decrypt in a variety of puzzling fashions.
    If you want to transmit ciphertext as a String, you need to use something like Base64 to encode the raw bytes. Then, on the receiving side, you must Base64-DEcode back into bytes, and then decrypt the resulting byte[].
    Second - using RSA as a general-purpose cipher is a bad idea. Don't do that. It's slow (on the order of 100x slower than the slowest symmetric cipher). It has a HUGE block size (governed by the keysize). And it's subject to attack if used as a stream-cipher (IIRC - I can no longer find the reference for that, so take it with a grain of salt...) Standard practice is to use RSA only to encrypt a generated key for some symmetric algorithm (like, say, AES), and use that key as a session-key.
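    A minimal sketch of that hybrid approach, assuming the standard JDK javax.crypto and java.util.Base64 APIs (the class name and message text here are illustrative, not from the original post): wrap only the AES session key with RSA, encrypt the payload with AES, and Base64-encode both parts before anything is treated as a String.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.util.Base64;

public class HybridSketch {
    public static void main(String[] args) throws Exception {
        // Sender side: an RSA key pair (normally only the recipient's
        // public key is known) and a fresh AES session key.
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair rsaPair = kpg.generateKeyPair();
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(128);
        SecretKey sessionKey = kg.generateKey();

        // Encrypt the payload with AES (the "AES" default mode keeps this
        // sketch short; prefer AES/GCM in real code). RSA wraps only the key.
        Cipher aes = Cipher.getInstance("AES");
        aes.init(Cipher.ENCRYPT_MODE, sessionKey);
        byte[] ct = aes.doFinal("the actual message".getBytes(StandardCharsets.UTF_8));

        Cipher rsa = Cipher.getInstance("RSA");
        rsa.init(Cipher.WRAP_MODE, rsaPair.getPublic());
        byte[] wrapped = rsa.wrap(sessionKey);

        // Base64 makes both parts safe to carry around as Strings.
        String ctB64 = Base64.getEncoder().encodeToString(ct);
        String keyB64 = Base64.getEncoder().encodeToString(wrapped);

        // Receiver side: Base64-DEcode first, then unwrap the session key
        // with the RSA private key, and finally decrypt the payload.
        rsa.init(Cipher.UNWRAP_MODE, rsaPair.getPrivate());
        SecretKey recovered = (SecretKey) rsa.unwrap(
                Base64.getDecoder().decode(keyB64), "AES", Cipher.SECRET_KEY);
        aes.init(Cipher.DECRYPT_MODE, recovered);
        System.out.println(new String(
                aes.doFinal(Base64.getDecoder().decode(ctB64)),
                StandardCharsets.UTF_8));
    }
}
```

    Note that new String(byte[]) only ever sees decrypted plaintext here; the ciphertext and the wrapped key travel exclusively as Base64 text.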
    At any rate, the code you posted is broken before you get to this line: byte [] input = encryptedString.getBytes(); Go back to the encrypting end and make it stop treating your ciphertext as a String.
    Grant

  • Mac Mini display is too large for screen on only one user account

    Okay, so I left my two-year-olds alone for a minute playing the "alphabet game" on my Mac Mini. They only had the keyboard, no mouse, but managed to muck up my display, leaving me a bit frustrated.  The screen is now too large for my Samsung display. The only way to see everything (dock, top bar, etc.) is to move my mouse arrow to the edge of the display and watch it roll back onto the page.  I've checked the settings there and they are fine. The MacBook Pro plugs right in and is at the proper resolution. So I then wondered if another account on the Mac Mini would do the same thing. I logged out of my Admin account and into another, and everything looks just dandy. I log back into my Admin account and it's too large and blurry again. The resolution is set correctly at 1920x1080 at 60 Hz.
    What button did they push on my keyboard that would do this and how do I get it back?? Aargh! Thanks all!

    Ha, figured it out myself from another discussion forum finally. Thought I'd share in case anyone else runs into this. They must have hit "Zoom" by hitting "Control" and scrolling at the same time.
    Resolution:
    You can zoom out by holding down the Option and Command buttons on the keyboard and, while you hold them down, pressing the - key. 

  • Page header plus page footer too large for the page in crystal report 2008.

    Hi,
    When we select print preview and print after entering parameters, it shows the error: "page header plus page footer too large for the page. Error in File.rpt: page header or page footer longer than page." It also does not show the print layout format; if I connect another printer, it shows the layout design. Sometimes it shows Letter format, and when I print it defaults to landscape, even though we set up a default page size of 10x12 inches for that particular printer. Please guide me on how we can solve these issues.
    regds,
    samapth

    This is a really hard post to read. See if you can take a bit of time to reword it, but one thing I do understand is that you are getting this error:
    page header plus page footer too large for the page.
    Typically, you can trust that if the error is thrown, it is true; this is not one of those errors that says one thing and means another. I suspect that you have some field(s) in the header(s) that grow depending on the data. If there is too much data or the data is too long (a text field, for example), the error will be thrown. To resolve this, see if placing the field(s) into a group footer / header will help.
    Ludek

  • ERROR : OpenDoc CR to PDF - File is too large for attachment.

    We are getting the following error in 3.1 using an OpenDoc call when we call a large Crystal Report to PDF format...
    Error : 52cf6f8f4bbb6d3.pdf File is too large for attachment.
    It runs OK from BOE when given parameters that returned 44 pages. (PDF = 139 KB)
    We get the error on a parameter-set that returns 174 pages when run via CR Desktop or as a SCHEDULED Instance. (PDF = 446 KB).
    Client application can't use the SDKs to SCHEDULE Instances - only configured for OpenDoc calls.....
    The BOE server is running on SOLARIS - and it's is a 2 Server CMS-Cluster.
    The problem is SPORADIC, so I am thinking the issue is related to a specific setting on one of the servers.
    Any thoughts on where to start looking...?

    The problem is _not_ with the number of rows returned; it is an issue with the size of the PDF file that it is trying to move.
    Found a possible WINDOWS solution on BOB - need to find if there is an equivalent for SOLARIS...
    Check the dsws.properties on web server D:\Program Files\Business Objects\Tomcat55\webapps\dswsbobje\WEB-INF\classes
    See if you can change any parameter to remove size limitation.
    #Security measure to limit total upload file size
    maximumUploadFileSize = 10485760

  • SQL Error: ORA-12899: value too large for column

    Hi,
    I'm trying to understand the above error. It occurs when we are migrating data from one oracle database to another:
    Error report:
    SQL Error: ORA-12899: value too large for column "USER_XYZ"."TAB_XYZ"."COL_XYZ" (actual: 10, maximum: 8)
    12899. 00000 - "value too large for column %s (actual: %s, maximum: %s)"
    *Cause:    An attempt was made to insert or update a column with a value
    which is too wide for the width of the destination column.
    The name of the column is given, along with the actual width
    of the value, and the maximum allowed width of the column.
    Note that widths are reported in characters if character length
    semantics are in effect for the column, otherwise widths are
    reported in bytes.
    *Action:   Examine the SQL statement for correctness.  Check source
    and destination column data types.
    Either make the destination column wider, or use a subset
    of the source column (i.e. use substring).
    The source database runs - Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
    The target database runs - Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    The source and target table are identical and the column definitions are exactly the same. The column we get the error on is of CHAR(8). To migrate the data we use either a dblink or oracle datapump, both result in the same error. The data in the column is a fixed length string of 8 characters.
    To resolve the error the column "COL_XYZ" gets widened by:
    alter table TAB_XYZ modify (COL_XYZ varchar2(10));
    -alter table TAB_XYZ succeeded.
    We now move the data from the source into the target table without problem and then run:
    select max(length(COL_XYZ)) from TAB_XYZ;
    -8
    So the maximal string length for this column is 8 characters. To reduce the column width back to its original 8, we then run:
    alter table TAB_XYZ modify (COL_XYZ varchar2(8));
    -Error report:
    SQL Error: ORA-01441: cannot decrease column length because some value is too big
    01441. 00000 - "cannot decrease column length because some value is too big"
    *Cause:   
    *Action:
    So we leave the column width at 10, but the curious thing is - once we have the data in the target table, we can then truncate the same table at source (ie. get rid of all the data) and move the data back in the original table (with COL_XYZ set at CHAR(8)) - without any issue.
    My guess the error has something to do with the storage on the target database, but I would like to understand why. If anybody has an idea or suggestion what to look for - much appreciated.
    Cheers.

    843217 wrote:
    Note that widths are reported in characters if character length
    semantics are in effect for the column, otherwise widths are
    reported in bytes.
    You are looking at character lengths vs byte lengths.
    The data in the column is a fixed length string of 8 characters.
    select max(length(COL_XYZ)) from TAB_XYZ;
    -8
    So the maximal string length for this column is 8 characters. To reduce the column width back to its original 8, we then run:
    alter table TAB_XYZ modify (COL_XYZ varchar2(8));
    varchar2(8 byte) or varchar2(8 char)?
    Use SQL Reference for datatype specification, length function, etc.
    For more info, reference {forum:id=50} forum on the topic. And of course, the Globalization support guide.
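    A quick way to see the byte-vs-character distinction (a hedged illustration, not taken from the thread above; the example strings are made up): an 8-character value can still exceed 8 bytes in a multibyte character set such as UTF-8, which is exactly when a column declared with BYTE semantics raises ORA-12899:

```java
import java.nio.charset.StandardCharsets;

public class CharVsByte {
    public static void main(String[] args) {
        // Both values are 8 characters long, but the accented one needs
        // 9 bytes in UTF-8, so it would not fit a CHAR(8 BYTE) column.
        String plainAscii = "ABCDEFGH";
        String accented = "\u00C5BCDEFGH"; // 'Å' encodes as 2 bytes in UTF-8

        System.out.println(plainAscii.length() + " chars / "
                + plainAscii.getBytes(StandardCharsets.UTF_8).length + " bytes");
        System.out.println(accented.length() + " chars / "
                + accented.getBytes(StandardCharsets.UTF_8).length + " bytes");
    }
}
```

    The "actual: 10, maximum: 8" in the original error is the same effect: the source value fits 8 characters but occupies 10 bytes in the target database's character set.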
