Managing large Aperture library backup and export

I have a 250GB Aperture library with 30,000 photos on my MacBook Pro, which has a 500GB solid-state drive. I bought a 1TB GoFlex external drive with FireWire 800 and copied the entire 250GB Aperture library to it. After 24 hours, the copy showed as only 180GB. Why? I exported the Masters (full size) from Aperture to the external drive into a separate folder, 60GB, as a second backup. I tried to "Relocate Masters" to the external drive, but it did not reduce the Aperture library's size at all. I also exported all of the Versions (full size) from Aperture to the external drive as a third backup. I am terrified of losing my work, which is now all backed up in three forms on a single hard drive. What do I do now?
I must decide whether to use the full Aperture library on the external drive, which is huge and will only get more bloated; or import the versions with metadata into a new library and manage previews better; or do something else entirely.
Thanks

First, make and confirm two back-ups of your Library and of any Referenced Masters.  Your originals and each of your back-up sets should be on separate spindles.  Move one of the back-up sets off-site.  End of terror: it is now virtually impossible to lose any work you've done up to the point you made the back-ups.
A good set-up to end up with is this:
- Library on SSD
- Masters older than 60 days on external FW800 drive (these will be Referenced Masters; let's call this drive your Referenced Masters Drive).
- - - These constitute a complete Aperture "set".
- Back-up Library _and_ Referenced Masters to _another_ external drive (1st Back-up Set)
- Back-up Library _and_ Referenced Masters to a third external drive (2nd Back-up Set)
- Keep one Back-up Set off-site.  Never have all three Sets in the same location.
- Import new files into Library as Managed Masters.  Set Previews to be the size of your laptop screen.
- Every 60 days, using a Smart Folder to identify Images with Managed Masters older than 60 days, relocate those Managed Masters to your Referenced Masters external drive
- Back-up regularly
The above is just a template:  adjust to suit.  (Added:) How you get there will depend on where you are.  I didn't untangle your post.
When Images' Masters are off-line (your laptop is not connected to your Referenced Masters drive), you will be able to do all metadata administration, including moving Images, etc.  You will not be able to make Adjustments, Export, or Print.  (But you can drag the Preview out of Aperture and use it.)
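The "make and confirm" step can be scripted from Terminal. Below is a minimal sketch using rsync; the volume names and library path are hypothetical, so substitute your own:

# Mirror the Library and the Referenced Masters to each back-up drive.
# (Apple's bundled rsync is old; its -E flag copies extended attributes,
# or use Finder / a cloning utility if you prefer.)
rsync -a "$HOME/Pictures/Aperture Library.aplibrary" /Volumes/Backup1/
rsync -a /Volumes/ReferencedMasters/ /Volumes/Backup1/ReferencedMasters/
rsync -a "$HOME/Pictures/Aperture Library.aplibrary" /Volumes/Backup2/
rsync -a /Volumes/ReferencedMasters/ /Volumes/Backup2/ReferencedMasters/

# Confirm a copy: a checksum-only dry run prints a line for anything
# that differs, so no output means the back-up matches the original.
rsync -anci "$HOME/Pictures/Aperture Library.aplibrary" /Volumes/Backup1/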

Similar Messages

  • Large Aperture file backup strategy

    Hi All,
    My Aperture library is just a little north of 800GB, on a 1TB LaCie FW800 HD (the primary).  My previous backup strategy was to copy the library, via a spare Mac server over Gigabit Ethernet, to a 36TB Isilon NAS, and then copy that back to an identical 1TB LaCie (the backup).  This gave me three backups in three locations.
    This worked fine: it only took about 4 to 5 hours to back up the primary disk to the Isilon, and I didn't care how long the LaCie backup took.
    Once I hit 800GB, the time to back up the drive to the Isilon stretched to 12 to 14 hours, with more than half a million files being copied.
    Obviously this isn't working from an efficiency standpoint, and I need to come up with a better backup plan: one that's reasonably easy to use and faster to run, without sacrificing the security of multiple backups in multiple locations.
    Thanks
    Mike


  • Unable to create any files under '/' and /export/home as root user

    Dear Forum,
    Please help me with this problem.
    On my server I can't create any files under / or /export/home, but I can save files under /var, /tmp, and /opt.
    #cd /
    #touch tempfile
    touch: tempfile cannot create
    #cd /export/
    # ls -lp
    total 2
    drwxr-xr-x   9 root     root         512 Oct 20 11:09 home/
    # cd home
    # touch testfile
    touch: testfile cannot create
    # id
    uid=0(root) gid=0(root)
    # df -kh
    Filesystem             size   used  avail capacity  Mounted on
    /dev/md/dsk/d0         9.9G   5.5G   4.2G    57%    /
    /devices                 0K     0K     0K     0%    /devices
    ctfs                     0K     0K     0K     0%    /system/contract
    proc                     0K     0K     0K     0%    /proc
    mnttab                   0K     0K     0K     0%    /etc/mnttab
    swap                    22G   1.2M    22G     1%    /etc/svc/volatile
    objfs                    0K     0K     0K     0%    /system/object
    /dev/md/dsk/d3         9.9G   6.8G   3.0G    70%    /usr
    /platform/sun4u-us3/lib/libc_psr/libc_psr_hwcap2.so.1
                           9.9G   5.5G   4.2G    57%    /platform/sun4u-us3/lib/libc_psr.so.1
    /platform/sun4u-us3/lib/sparcv9/libc_psr/libc_psr_hwcap2.so.1
                           9.9G   5.5G   4.2G    57%    /platform/sun4u-us3/lib/sparcv9/libc_psr.so.1
    fd                       0K     0K     0K     0%    /dev/fd
    /dev/md/dsk/d5         7.9G   2.5G   5.3G    33%    /var
    swap                    22G   400K    22G     1%    /tmp
    swap                    22G    48K    22G     1%    /var/run
    /dev/md/dsk/d4          20G   4.7G    15G    25%    /opt
    /dev/md/dsk/d6          63G    36G    26G    58%    /data
    # more /etc/vfstab
    #device         device          mount           FS      fsck    mount   mount
    #to mount       to fsck         point           type    pass    at boot options
    fd      -       /dev/fd fd      -       no      -
    /proc   -       /proc   proc    -       no      -
    /dev/md/dsk/d1  -       -       swap    -       no      -
    /dev/md/dsk/d0  /dev/md/rdsk/d0 /       ufs     1       no      -
    /dev/md/dsk/d3  /dev/md/rdsk/d3 /usr    ufs     1       no      -
    /dev/md/dsk/d5  /dev/md/rdsk/d5 /var    ufs     1       no      -
    /dev/md/dsk/d6  /dev/md/rdsk/d6 /data   ufs     2       yes     -
    /dev/md/dsk/d4  /dev/md/rdsk/d4 /opt    ufs     2       yes     -
    #10.183.103.11:/backup  -       /backup nfs     -       yes     rw
    /devices        -       /devices        devfs   -       no      -
    ctfs    -       /system/contract        ctfs    -       no      -
    objfs   -       /system/object  objfs   -       no      -
    swap    -       /tmp    tmpfs   -       yes     -
    10.183.103.11:/nfs_backup/NCMDB1 - /rman_backup nfs 0 yes rw,bg,intr,hard,timeo=600,wsize=32768,rsize=32768
    Regards,

    Please post the output of /etc/mnttab:
    /dev/md/dsk/d0  / ufs rw,intr,largefiles,logging,xattr,onerror=panic,dev=1540000 1382602314
    /devices /devices        devfs dev=5400000     1382602306
    ctfs /system/contract ctfs    dev=5440001     1382602306
    proc    /proc proc    dev=5480000     1382602306
    mnttab  /etc/mnttab mntfs   dev=54c0001     1382602306
    swap /etc/svc/volatile       tmpfs xattr,dev=5500001       1382602306
    objfs   /system/object  objfs dev=5540001     1382602306
    /dev/md/dsk/d3  /usr ufs rw,intr,largefiles,logging,xattr,onerror=panic,dev=1540003 1382602315
    /platform/sun4u-us3/lib/libc_psr/libc_psr_hwcap2.so.1 /platform/sun4u-us3/lib/libc_psr.so.1   lofs dev=1540000     1382602313
    /platform/sun4u-us3/lib/sparcv9/libc_psr/libc_psr_hwcap2.so.1 /platform/sun4u-us3/lib/sparcv9/libc_psr.so.1 lofs    dev=1540000     1382602313
    fd      /dev/fd fd      rw,dev=5700001  1382602315
    /dev/md/dsk/d5  /var ufs rw,intr,largefiles,logging,xattr,onerror=panic,dev=1540005 1382602315
    swap    /tmp tmpfs   xattr,dev=5500002 1382602315
    swap /var/run        tmpfs xattr,dev=5500003       1382602315
    /dev/md/dsk/d4  /opt ufs rw,intr,largefiles,logging,xattr,onerror=panic,dev=1540004 1382602318
    /dev/md/dsk/d6  /data ufs rw,intr,largefiles,logging,xattr,onerror=panic,dev=1540006 1382602318
    -hosts  /net    autofs nosuid,indirect,ignore,nobrowse,dev=57c0001     1382602321
    auto_home /home   autofs  indirect,ignore,nobrowse,dev=57c0002 1382602321
    NCM1:vold(pid393) /vol    nfs ignore,noquota,dev=5780001      1382602322
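    A side note on the symptoms: both / and /export/home live on d0 (there is no separate vfstab entry for /export), while /var, /tmp, and /opt, where writes succeed, are separate filesystems, so look for something that affects d0 alone. Since mnttab shows / mounted read/write, one thing df -kh will not reveal is inode exhaustion: a ufs filesystem can have free blocks left but no free inodes, and touch then fails even for root. Two diagnostics worth running (a sketch, assuming Solaris 10):
    # Used/free inodes on the root filesystem (%iused near 100% is the culprit)
    df -F ufs -o i /
    # Double-check the mount options actually in effect for /
    mount -p | grep ' / '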

  • Unable to enter Manage Storage in iCloud backup and storage

    I'm unable to enter Manage Storage in iCloud backup and storage. I click on Manage Storage and nothing happens. I am out of storage and I can't manage it. I have iOS 5.0.1.

    There is an unbelievably simple "solution" to this problem (not getting into Manage Storage, only a spinning wheel).
    All you have to do is tap "Back Up Now" at the bottom and, immediately, while it is calculating the estimated time for the backup, tap "Manage Storage" and voilà! You're in. Believe me, it works; I did it on 2 phones this evening: one 4S on iOS 7 and one 5 on iOS 6.1.3.

  • I want to reinstall an app fresh from iTunes, not from my iCloud. I go to Manage Storage, stop the backup, and delete the app by turning the green button off, but when I ask to download it again from iTunes it only downloads from my iCloud. Help?

    I want to reinstall an app fresh from iTunes, not from my iCloud. I go to Manage Storage, stop the backup, and delete the app by turning the green button off, but when I ask to download it again from iTunes it only downloads from my iCloud.
    Please assist
    Thx

    You don't really delete the app that way; that just stops the backup for that app in iCloud.
    You need to delete it from your home screen; then, although it will still look like a download from iCloud, the app will start fresh.
    However, if the app uses any account sign-in, the data attached to that sign-in will come back, so if you don't want that you need to use a fresh account as well.

  • Not sure about derivatives, file types, and export for Lightroom

    hello:
    I am a new LR user and new to the DAM process. I am confused about something that is probably basic and fundamental to this process. My computer setup is two internal drives (a working drive and a photo archive), as well as external backup and burning to DVD. I am using LR for both image adjustment and catalog management (not using Photoshop).  For raw I will be converting to DNG on ingestion. I will add bulk metadata on ingestion and then bring images into LR to rate them and make basic image adjustments. These images will then be archived as originals.
    Now, if I perform additional work on any of these archived images (additional image adjustments or metadata), would these images be considered derivatives?  If so, would I then use Export in LR to add to the file name (e.g., masterfile, master.dng or .tif) before sending them to, for instance, a working derivatives folder and then off to the archive?
    Also, if I am sending these images to archive, but not sending out for email or web, do I need to change the dng to another file type (e.g., tif)?
    Lastly, I have some old jpegs on media cards that I want to put through this process. Is there anything special that I need to do in handling these images versus dealing with raw?
    I have read the DAM book as well as reading through various online resources, but this whole process is quite confusing for a beginner.
    Please respond anytime.
    Thanks
    joe.

    Lightroom is a non-destructive editor. This may be different from anything you have used in the past.
    With non-destructive editors, there is no concept of original and derivative (unless you choose to use a second program like Photoshop, which it sounds like you're not planning to do). With a non-destructive editor, you make your changes in the editor (the Develop Module), and these instructions are stored by Lightroom as a "recipe", not as a distinct file on your hard disk. The instructions are applied whenever you use Lightroom to view your image (or whenever you print from Lightroom, make a slideshow from LR, or make a web page from LR). The original is completely unchanged, and there is no edited derivative photo exported or saved. Thus, there is no need to segregate original and edited photos, because there is no saved edited photo. Additional edits in Lightroom are handled the same way: the original is unchanged, your edits are stored, and there is no derivative or saved photo.
    The only time you would save a copy of the photo is if you wanted a copy for some non-Lightroom purpose, in which case you use File->Export... to make a file on disk in a "temporary" folder, then you use the file created in whatever non-Lightroom purpose you have, and when completed, you can delete the file in the temporary folder, knowing that LR can re-create it exactly if you ever need it again. If you are going to print from LR, or e-mail from LR, or create a web page from LR, or create a slideshow from LR, you do not need to create an exported photo.
    JPGs, RAWs, and DNGs are all handled the same way by you, the user. Yes, there are internal differences, but they aren't important to your workflow.

  • Large PDF file sizes when exporting from InDesign

    Hi,
    I was wondering if anyone knew why some PDF file sizes are so large when exporting from ID.
    I create black and white user manuals with ID CS3. We post these online, so I try to get the file size down as much as possible.
    There is only one .psd image in each manual. The content does not have any photographs, just Illustrator .eps diagrams and line drawings. I am trying to figure out why some PDF file sizes are so large.
    Also, why the file sizes are so different.
    For example, I have one ID document that is 3MB.
    Exporting it at the smallest file size, the PDF file comes out at 2MB.
    Then I have another ID document that is 10MB.
    Exporting it to PDF gives 2MB (the same size as the smaller ID document's PDF), and this one has many more EPSs in it and a lot more pages.
    Then I have another one where the ID size is 8MB and the PDF is 6MB. Why is this one so much larger than the one from the 10MB ID document?
    Any ideas on why this is happening and/or how I can reduce the file size?
    I've tried adjusting the export compression and other settings but that didn't work.
    I also tried to reduce them after the fact in Acrobat to see what would happen, but it doesn't reduce it all that much.
    Thanks for any help,
    Cathy

    > Though, the sizes of the .eps's are only about 100K to 200K in size and they are linked, not embedded.
    But they're embedded in the PDF.
    > It's just strange though, because our marketing department has an 80-page full-color catalog that, when exported, is only 5MB. Their ID document uses many very large .tif files. So, I am leaning toward it being an .eps/.ai issue?
    Issue implies there's something wrong, but I think this is just the way
    it's supposed to work.
    Line drawings, while usually fairly compact, cannot be lossy compressed.
    The marketing department, though, may compress their very large TIFF
    files as much as they like (with a corresponding loss of quality). It's
    entirely possible to compress bitmaps to a smaller size than the
    drawings those bitmaps were made from. You could test this yourself.
    Just open a few of your EPS drawings in Photoshop, save as TIFF, place
    in ID, and try various downsampling schemes. If you downsample enough,
    you'll get the size of the PDF below a PDF that uses the same graphics
    as line drawing EPS files. But you may have to downsample them beyond
    recognition...
    Kenneth Benson
    Pegasus Type, Inc.
    www.pegtype.com

  • Extract work order data from R/3 system into a flat file (CSV) and export to BI

    Hi,
    I am new in interface.
    I need to extract data regarding actual costs and quantities of work assigned to Service Providers from the SAP system and send it to BI for reporting purposes.
    This interface will extract master data as well as transactional data. Extraction of master data will be a full extract, and that of transactional data will be a delta extraction.
    A custom development will extract the data from the different fields and export it to flat files (CSV format). This program will amalgamate all the flat files created into one big file and export it to the BI system.
    Export of the data to the BI system will be done by the scheduling program Control-M, which will send the data from the SAP system to the BI system in batches. Control-M is expected to run nightly.
    Please provide the step-by-step process to do this.
    Thanks
    Suchi
    Moderator message: anything else? Please work on your requirement yourself first.
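    For the amalgamation step described above (merging the per-extract CSV files into one big file), the operating-system level is often the simplest place to do it. A minimal sketch, assuming every extract shares the same header row; the file names are hypothetical:
    # Keep the header of the first file and skip it in every later file
    # (FNR==1 is each file's first line; NR!=1 excludes only the very first).
    awk 'FNR==1 && NR!=1 {next} {print}' extract_*.csv > master_extract.csv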

    Hi Ravi,
    you've got to set up the message type MDMRECEIPT for the IDoc distribution from R/3 to XI. Check chapter 5.2 in the IT configuration guide available on the MDM Documentation Center (http://service.sap.com/installmdm). It describes the necessary steps.
    BR Michael

  • Can RMAN backup and export datapump executed at the same time?

    Hello,
    I have several databases that I back up using RMAN and Data Pump export every night, starting at 6 PM and ending at midnight. The backup maintenance window doesn't give me enough time to run each database at a different time. I am using crontab to schedule my backups. Since I have so many databases that need to be backed up between 6 PM and midnight, some of the export and RMAN backup scripts will execute almost at the same time. My question is: can my Data Pump export and RMAN backup scripts run at the same time?
    Thank you in advance.
    John

    Needs must. If you don't run expdp in parallel then it doesn't use that much. If it were really killing the system then you could look into setting up a resource plan that knocks that user down, but that is a big step.
    I would also look into using RMAN incrementals and block change tracking to minimize your RMAN time.
    If your shop needs to do both simultaneously then go for it.
    Regards,
    Chris.
    PS: One of my shops has maybe 20-30 RMANs and pumps all kicking off, some simultaneous, some not, from 0000 to 0130. No complaints from users and no problems either. Go for it.
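    On the block-change-tracking suggestion: it is switched on once (Oracle 10g and later), after which level 1 incrementals read only the blocks that changed instead of scanning every datafile. A sketch of both steps; the tracking-file path is hypothetical:
    # One-time setup, as SYSDBA:
    sqlplus -s / as sysdba <<EOF
    ALTER DATABASE ENABLE BLOCK CHANGE TRACKING USING FILE '/u01/oradata/bct.f';
    EOF
    # Nightly cron job: a fast incremental (level 1 needs a level 0 baseline,
    # which RMAN creates automatically the first time).
    rman target / <<EOF
    BACKUP INCREMENTAL LEVEL 1 DATABASE;
    EOF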

  • Binary file import and export in LabWindows/CVI

    Hello,
    I have a standard binary file format (.sie) that is written by the Origin IDE. The .sie extension is a public format, but not one defined by Origin. In the Origin IDE I can import these files and also export them to Excel as CSV. Now I want to do the same thing in LabWindows/CVI: import a .sie file, read it, and export it to Excel. Is this possible? Please help me with this project. Thanks in advance.
    Regards,
    Jayakumar.V

    Reading and writing binary files is indeed possible in CVI; the problem normally lies in the definition of the file format you want to access.
    I have never come across the .SIE extension in my work; a quick search on Google returned a page with a link to a document explaining one such file structure: it's a format dedicated to accounting, so it's possibly not your case, but could those be your files?
    Until you have found a complete description of the file format, you cannot plan any operation on the file itself, unless it is so self-evident that you need little effort to access the content (this normally happens only with text files).
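    Until a format description turns up, a useful first step is to inspect a sample file in a hex dump and look for magic numbers, embedded text, and repeating record sizes. A sketch from a Unix shell (any hex editor on Windows shows the same thing; the file name is hypothetical):
    # Hex view of the first 256 bytes, then any embedded text strings
    od -A x -t x1 sample.sie | head -n 16
    strings sample.sie | head -n 10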

  • Large Schema file registered and using folders to load the data (WebDAV)

    I have a very large schema file that I have registered using binary registration (the schema file is > 30,000 lines).
    My data has been in MarkLogic and it is organized in a folder structure providing reuse of pieces of the xml.
    It seemed the most logical way to load the data was to create the folder structure in XDB and move the files via WebDAV. We also have extensive XQueries written.
    My question is how I query the data back out when it is loaded this way. I have read about and experimented with RESOURCE_VIEW, but that is not going to get me what I need. I would like to make use of the XQueries I have.
    Can I do this, and if so, how?

    Sure. Let me lay it out with some more specific information.
    My schema has an overall root level of "Machine Makeup".
    All of these items are defined under it with a lot of element reuse and tons of attribute groups that are used throughout the xml schema. I can do a lot of things but what I cannot do is change the structure of the schema.
    The data is currently in a "folder structure" that more closely resembles the following. I have tried to annotate the number of files; keep in mind that all of these are working documents and additional "records" (XML files) can be added.
    Composite - contains 12 folders and each of these folders contain xml documents
    compfolder 1
    - compfolder 12
    Most of these folders contain < 200 .xml files (each named id.xml); however, one of these directories currently contains 1,546 files. They all belong together... no real way to split them up further.
    At the same level as Composite there are about 12 more folders. About half of the folders at this level contain > 1000 but < 3000 files. For example:
    PartsUse: > 1000 but < 3000
    transfer of parts: 8000, and it can continue to grow
    people: > 3000; these are used in the transfer of parts (e.g., "who sold it"), and every time someone new is involved, a new one is added, etc.
    Now, the way the system works: when a new person who is not in our list is involved, the users get a "new" empty XML document to fill in, and we assign it an id and place it in the folder.
    So it would something like this
    Composite
    folder1 - 1000 xml file
    folder 2 - 200 xml files
    folder 12 - < 2000
    Locations
    contains < 2000 xml files
    Activities < 3000 xml files
    PartsTransfer 8000 xml files and growing
    materials < 1000 xml files
    setup < 1000 xml files
    people 3000 xml files and growing
    groupUse < 1000
    and so forth.
    All of the folders contain the links I had previously laid out.
    So Activities would have materialLink id = "333" peopleLink ="666"
    And so of and so forth.
    So, because my file counts are greater than the optimum about half the time, how would I get to 3 separate XMLType tables (I understand mine would be more than 3...) from the one schema file that is intertwined with others? Can you show me an example? Would I annotate the schema at just those levels? This schema file is so huge I would not want to have to annotate all elements with a table. Can I pick and choose to mimic the folder structure?
    THANKS SO MUCH for your help. I really want to show that Oracle is the way. I can implement this with a simpler structure, but not at the size and complexity that I need. I haven't been able to find many examples using the new binary registration.

  • Best way to manage large library with AAC and MP3 versions of files

    I have a large library of music. I listen to music through iTunes and iPod/iPhones, but also other devices which only support MP3, and not AAC. I typically convert iTunes purchases to MP3, but I'm left with two copies of each song in the same folder. I'm looking for recommendations on the best way to manage what would essentially be a duplicated music library - a high quality version I use with my iTunes/ipods, and a lower quality MP3 version for the devices that only support that format.

    I have had a similar problem. I have all my music residing on a NAS where I access it from iTunes (for managing my iPods/iPhones) and a Tivo PVR (which is connected to my house stereo system.) The problem is that Tivo does not recognize AAC. So I've used iTunes to create mp3 versions of all my purchased AAC music. Because the NAS is set as my iTunes media folder location, iTunes puts the mp3 copies back into the existing folder structure (which it keeps organised) on the NAS. Tivo accesses the NAS directly, over the network. But iTunes also puts the additional copy into the iTunes library which leaves me with multiple copies of the same songs on iTunes AND my iPods. Having multiple copies on the NAS (mp3 for Tivo and AAC for iPod) is OK: I've got plenty of space on the NAS and the Tivo doesn't 'see' the AAC files.
    The solution I'm planning to implement is to delete the mp3 entries from the library as soon as I create them (leaving the files in the media folder). But what I'm looking for is a way to automatically select the mp3 copies (for deletion) from the existing duplicates list. Any ideas?
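    Since iTunes writes the converted copy next to the original under the same base name, the mp3 duplicates can be picked out mechanically. A sketch from Terminal; the media-folder path is hypothetical, and this only lists the files (the library entries still have to be removed inside iTunes):
    # Print every .mp3 that has a same-named .m4a original beside it
    find "/Volumes/NAS/iTunes Media/Music" -name '*.mp3' | while read -r f; do
        [ -e "${f%.mp3}.m4a" ] && echo "$f"
    done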

  • InDesign file saving and exporting S-L-O-W.

    I am a designer using the InDesign/InCopy layout workflow. After my last post things were working pretty well for a while. But now it is taking more than an hour to save over the network. I tried putting everything in the same folder on the network (links and all), and it still took me more than an hour just to relink a placed InCopy story. I am going mad. I'm a week overdue for my first draft and I can't do anything. So today I decided to just use the package workflow to speed things up. However, it still takes as long to create assignments, even after I have saved the InDesign file and links to my desktop.
    Is there any possibility this is caused by reusing the last document and saving it under a new name? Even if I have unlinked all assignments and stories first?
    I've tried saving as an .inx file, but it locks up.
    Thanks for your help in advance,
    Holly

    It turns out that is the issue. AnneMarie, were you able to find the other thread? I thought my problem was solved by deleting the random 40,000+ lines of garbage using a text editor, but the random lines are re-generated each time I save the file in InDesign. My IT manager thinks it has to do with the InCopy plugin. In analyzing the 'garbage lines', here is what we do know: each time a body of text is saved using the InCopy plugin, it generates 62 unique IDs of random URLs and multiplies those IDs 657 times, thus creating 40,000+ lines of garbage that bog down the file. The URLs are not used anywhere in my InDesign file, so we can't figure out how the plugin is gaining access to them.
    We tested it on a brand new block of text in the file that had never been exported as an InCopy file (working off the desktop the whole time), and the resulting .icml file still had similar URLs generated (all 40,000+). The plugin version is 6.0.0.352; the InDesign application version is CS4 6.0.4.578. Here is one URL pulled from the file: <HyperlinkURLDestination Self="u54e9" Name="URL 275" DestinationURL="http://www.evanta.com/details_popup.php?cmd=speaker&id=6963" Hidden="false" DestinationUniqueKey="694"/>
    Any ideas on how to stop the URLs from generating each time I have to save the file?
    Pelagia
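    Since the stray destinations are one self-closing element per line, the command line makes it easy to measure the problem and, cautiously, to clean it. A sketch (the file name is hypothetical; keep a copy of the original, as .icml is XML and this is a blunt line filter):
    # Count the accumulated hyperlink destinations
    grep -c '<HyperlinkURLDestination' story.icml
    # Write a copy with those lines stripped out
    grep -v '<HyperlinkURLDestination' story.icml > story_cleaned.icml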

  • Backups and exports taking longer after Oracle upgrade 8.1.7 to 9.2.0.6

    We recently upgraded Oracle from 8.1.7 to 9.2 and also applied HP-UX OS patches after the upgrade. Now the backups, exports, and batch jobs are running a little longer (about 1 hour longer). Any known issues?

    For batch jobs: try analyzing the database schemas referenced by the batch jobs. Also check whether you have sufficient OS resources, as recommended by Oracle for running the Oracle 9i software.
    For backups: check the size of your dump file before and after the upgrade to 9i. If you do not find a major difference in size, then it could be a resource issue such as CPU usage, I/O, or memory.
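    On the "analyze your schemas" point: after moving from 8.1.7 to 9.2, the cost-based optimizer needs fresh statistics, and stale statistics are a classic cause of batch jobs slowing down after an upgrade. A sketch of gathering them for one schema; the schema name is hypothetical:
    # Refresh optimizer statistics for the schema the batch jobs use
    sqlplus -s / as sysdba <<EOF
    EXEC DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'BATCH_OWNER', cascade => TRUE);
    EOF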
    Thanks,
    http://askyogesh.com

  • Aperture file names and MobileMe

    Hello,
    I am in the process of figuring out whether there is a way to publish an album to MobileMe and have the version name become the file name. In this particular case the file names are all generated by the camera, and I give the photos custom version names: the titles of the artworks. The point of the gallery is for the art galleries that I work with to have access to large JPEGs for their promotional use.
    The problem I hope to solve is that the displayed version name is not the file name, so the downloaded files are not easily identifiable.
    I suppose I could export and re-import, but I was hoping there was a better way.
    Thanks for any suggestions

    The URL - http://www.marysmith.com/ - is working for me. Empty your browser cache and try again.

Maybe you are looking for

  • Multiple users sharing one itunes library

    I have multiple users on my Mac, but I'd like us all to be able to access the same iTunes library. When I go to iTunes prefs on one of the other users' accounts and redirect their music library to my folder, it says it is doing it, but iTunes still go

  • Problem with refresh_table_display

    Hi there. I've got a code *   INCLUDE ZWOP_LISTING_INCL_UKOLY                                    * *&      Module  STATUS_1024  OUTPUT *       text MODULE STATUS_1024 OUTPUT. *  SET PF-STATUS 'xxxxxxxx'. *  SET TITLEBAR 'xxx'.   DATA: line   LIKE LIN

  • Why won't iTunes open on my Mac mini?

    Recently installed 2GB of memory in my 2007 Mac mini, wiped the hard drive, and installed OS X 10.6.4 (done at the Apple Store). The iTunes icon appears in Applications but won't open when clicked. Help. I'm an old guy and not very computer savvy

  • Is home sharing down?

    Anyone else having trouble with home sharing? It was working fine just yesterday. Today, my AppleTV doesn't see my computer and I can't use AirPlay from my iPhone. I rebooted the AppleTV, which wasn't even able to connect to iTunes. It connected just

  • Drop-down list created by Fireworks MX disappears behind a Flash movie. Is there a way to have the menu list display in front of the Flash movie?

    A drop-down list created by Fireworks MX disappears behind a Flash movie. Is there a way to have the menu list display in front of the Flash movie? Please let me know what I should do to keep the drop-down menu from disappearing behind the Flash movie. Thanks,