Best Practice on Updating From a DB

Hi Everyone,
What are some best practices surrounding getting data from an oracle database into the cache layer when a data change event (insert, update, delete) happens? I've searched far and wide and the best answer I can find is to use Extractor/Replicator -> JMS -> Subscriber -> cache.
Thank you for your help.

You're right, DCN (Database Change Notification) is an interesting idea, but it's another case where a technology works for simple Hello World examples but fails to deliver in the real world.
To me DCN looks like an unfinished Oracle project: lots of marketing, but poor features. It's suitable mostly for student projects or test labs, not for real-world complexity.
Two reasons:
1. DCN has severe limitations on the complexity of joins and queries if you plan to use the query change notification feature.
2. It puts too much pressure on the database by creating tons of events you neither need nor expect, because it's too generic.
Instead of DCN, create ordinary Oracle AQ queues, using a tiny SQL object type event as the payload, then create triggers and/or PL/SQL stored procedures which fill the event with all the primary keys you need and the unique ID of the object you need to extract.
Triggers will filter out unnecessary updates, sending events only when you wish.
If the conditions are too complex for triggers, you may create & place events either by a call from the event source app itself or on a scheduled basis; it's entirely up to you. The technique of creating object views and using an INSTEAD OF trigger on the object view also works pretty well.
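For the app-driven variant, a minimal sketch of placing an event over JDBC is shown below; the enqueue_change_event procedure is a hypothetical thin PL/SQL wrapper around DBMS_AQ.ENQUEUE that builds the SQL object type payload, so treat this as an illustration rather than a finished implementation:

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

// Sketch: the event source app places a tiny change event on the AQ queue.
public class ChangeEventPublisher {
    public static void publish(String url, String user, String pass,
                               long objectId, long primaryKey) throws Exception {
        try (Connection con = DriverManager.getConnection(url, user, pass)) {
            con.setAutoCommit(false);
            try (CallableStatement cs = con.prepareCall("{ call enqueue_change_event(?, ?) }")) {
                cs.setLong(1, objectId);   // unique ID of the object to extract
                cs.setLong(2, primaryKey); // primary key the listener will need
                cs.execute();
            }
            con.commit(); // AQ is transactional: the event becomes visible on commit
        }
    }
}
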
Finally, implement a listener on the Coherence side which reads the event and, based on the event ID and the event's set of primary keys, makes the necessary extracts and assembles a Java object ready to be placed into the cache. Once the Java object is assembled, place it into the cache.
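To make this concrete, here is a minimal sketch of such a listener loop; the dequeue_change_event wrapper (a thin blocking PL/SQL wrapper around DBMS_AQ.DEQUEUE), the orders table and the cache name are all hypothetical placeholders:

import java.io.Serializable;
import java.math.BigDecimal;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Types;

import com.tangosol.net.CacheFactory;
import com.tangosol.net.NamedCache;

// Sketch of the Coherence-side listener: dequeue an event, extract the row
// with plain SQL, assemble the value object, and put it into the cache.
public class CacheLoaderListener {

    // Minimal serializable value object assembled from the extract.
    public static class Order implements Serializable {
        final long id; final String status; final BigDecimal amount;
        Order(long id, String status, BigDecimal amount) {
            this.id = id; this.status = status; this.amount = amount;
        }
    }

    public void run(Connection con) throws Exception {
        NamedCache cache = CacheFactory.getCache("orders");
        while (!Thread.currentThread().isInterrupted()) {
            long orderId;
            // Hypothetical wrapper around DBMS_AQ.DEQUEUE; blocks until an event
            // arrives and returns the primary key carried in the event payload.
            try (CallableStatement cs = con.prepareCall("{ call dequeue_change_event(?) }")) {
                cs.registerOutParameter(1, Types.NUMERIC);
                cs.execute();
                orderId = cs.getLong(1);
            }
            try (PreparedStatement ps = con.prepareStatement(
                    "SELECT status, amount FROM orders WHERE order_id = ?")) {
                ps.setLong(1, orderId);
                try (ResultSet rs = ps.executeQuery()) {
                    if (rs.next()) {
                        cache.put(orderId, new Order(orderId, rs.getString(1), rs.getBigDecimal(2)));
                    } else {
                        cache.remove(orderId); // the row is gone, so the event was a delete
                    }
                }
            }
            con.commit(); // the dequeue is transactional: commit after the cache is updated
        }
    }
}
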
Don't use Hibernate, TopLink or any other object-relational mapping framework; they're too slow and add excessive, unnecessary overhead to the process. Use standard Oracle database features instead; they're much faster and transaction-safe. Using these frameworks with a 10g or 11g database is obsolete and driven mainly by a lack of knowledge among Java developers about the database features in this regard.
To make the whole system fail-safe and scalable, you have to implement the listener in a fail-safe fashion, in the form of a work manager plus slave processes spawned on the other nodes. The work manager has to be auto fail-safe and auto scalable, so that if the node holding the work manager instance fails due to cache cluster member departure, reset, or something else, another work manager is automatically spawned on the first available node.
The work manager should also spread & synchronize the work among the slave listener processes based on the current cache cluster members, automatically re-balancing and recovering work on cache member join/departure.
Out-of-the-box Coherence has a work manager implementation, but it's not fail-safe and does not provide the automatic scale-up/work-recovery features described above, so you have to implement your own.
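Only as a rough starting point, such a failover hook can watch Coherence membership events; the election and rebalancing logic is left as a placeholder, since that is exactly the part you have to design yourself:

import com.tangosol.net.CacheFactory;
import com.tangosol.net.MemberEvent;
import com.tangosol.net.MemberListener;

// Sketch: watch cluster membership and re-distribute listener work when a
// node joins or departs. Election of the work manager itself (e.g. on the
// senior member) and the actual reassignment are placeholders.
public class ListenerFailoverCoordinator implements MemberListener {

    public void start() {
        CacheFactory.getCache("orders").getCacheService().addMemberListener(this);
    }

    @Override
    public void memberJoined(MemberEvent evt) {
        rebalance(); // a new node can take over part of the work
    }

    @Override
    public void memberLeaving(MemberEvent evt) {
        // departure announced; wait until the member has actually left
    }

    @Override
    public void memberLeft(MemberEvent evt) {
        rebalance(); // recover the departed member's share of the work
    }

    private void rebalance() {
        // Placeholder: if this node is now the senior member, spawn/act as the
        // work manager and reassign AQ listener duties across current members.
    }
}
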
All the features I've described are implemented and happily used in a complex OLTP + workflow system backed by a big Oracle RAC cluster under heavy workload, processing millions of transactions per day.

Similar Messages

  • Best practice to update to Snow Leopard

    I just placed my family pack order on Amazon.com for Snow Leopard. But this will be the first time for me doing an OS upgrade on a Mac (all 4 Macs in my house came with Leopard on them so we've only done the "software update" variety). I am a reformed PC guy so humor me!
    What is the best practice to upgrade from 10.5.8 to 10.6? On a PC, my inclination would be to back up my data, reformat the whole drive and install Windows fresh... then all my apps.. then the data. I hate that and it takes hours.
    What is the best practice way to upgrade the Mac OS?

    The best option is Erase and Install. The next best option is Archive and Install. Use the latter if you do not want to or can't erase your startup volume.
    How to Perform an Archive and Install
    An Archive and Install will NOT erase your hard drive, but you must have sufficient free space for a second OS X installation which could be from 3-9 GBs depending upon the version of OS X and selected installation options. The free space requirement is over and above normal free space requirements which should be at least 6-10 GBs. Read all the linked references carefully before proceeding.
    1. Be sure to use Disk Utility first to repair the disk before performing the Archive and Install.
    Repairing the Hard Drive and Permissions
    Boot from your OS X Installer disc. After the installer loads select your language and click on the Continue button. When the menu bar appears select Disk Utility from the Installer menu (Utilities menu for Tiger). After DU loads select your hard drive entry (mfgr.'s ID and drive size) from the left side list. In the DU status area you will see an entry for the S.M.A.R.T. status of the hard drive. If it does not say "Verified" then the hard drive is failing or failed. (SMART status is not reported on external Firewire or USB drives.) If the drive is "Verified" then select your OS X volume from the list on the left (sub-entry below the drive entry), click on the First Aid tab, then click on the Repair Disk button. If DU reports any errors that have been fixed, then re-run Repair Disk until no errors are reported. If no errors are reported, then quit DU and return to the installer.
    2. Do not proceed with an Archive and Install if DU reports errors it cannot fix. In that case use Disk Warrior and/or TechTool Pro to repair the hard drive. If neither can repair the drive, then you will have to erase the drive and reinstall from scratch.
    3. Boot from your OS X Installer disc. After the installer loads select your language and click on the Continue button. When you reach the screen to select a destination drive click once on the destination drive then click on the Option button. Select the Archive and Install option. You have an option to preserve users and network preferences. Only select this option if you are sure you have no corrupted files in your user accounts. Otherwise leave this option unchecked. Click on the OK button and continue with the OS X Installation.
    4. Upon completion of the Archive and Install you will have a Previous System Folder in the root directory. You should retain the PSF until you are sure you do not need to manually transfer any items from the PSF to your newly installed system.
    5. After moving any items you want to keep from the PSF you should delete it. You can back it up if you prefer, but you must delete it from the hard drive.
    6. You can now download a Combo Updater directly from Apple's download site to update your new system to the desired version as well as install any security or other updates. You can also do this using Software Update.

  • Best practice video conversion from download

    I am looking for best practice for video conversions.
    I am downloading adobe recordings via this method:
    http://server.adobeconnect.com/xyz/output/filename.zip?download=zip
    From here, I have been converting the FLVs using either freemake video converter or FLV converter. I have tried converting into AVI (XVID), MOV, WMV, etc. (I need the file to be under 600 MB for an hour of recording, therefore it is going to need some type of compression).
    My goal is to import the video into Sony Vegas Pro 10 for further editing. I have found that whatever method I use, the video and audio do not sync properly about 50% of the time; usually the video runs longer than the audio. Or there are other various errors, such as the video just freezing halfway through.
    I have been using Connect for a few years now, but with each update (Connect 8, 9, etc.) I find that the problems are getting worse. At this point I am just wasting time trying to convert into various formats using various codecs, trying to luck upon one where the video is at least without error.
    What methods are others using to convert the FLV to a workable editable format?

    Can't the FLV files be changed into many different formats through Apple's
    Compressor or Adobe Media Encoder? These formats can then be opened in
    standard video editing software for editing.
    Yes, I use this as a last resort, as the quality of capture this way is significantly lower, and it is a much more time-consuming process. I sometimes have over 50 parts of 1-hour video. To use Camtasia, you need first to record it, then it must be saved in a Camtasia format, and then lastly rendered into AVI or WMV, so it does take a while. Really, I feel there shouldn't be so many errors in the conversion process, but I am finding the FLV recordings themselves have problems. With this last file I am looking at, even the recording playback on Adobe Connect has serious issues with audio and video sync. A problem that is all too common.

  • Best Practice for Updating children UIComponents in a Container?

    What is the best practice for updating children UIComponents in response to a Container being changed? For instance, when a Canvas is resized, I would like to update all the children UIComponents' height and width so the content scales properly.
    Right now I am trying to loop over the children calling invalidateProperties(), invalidateSize(), and invalidateDisplayList() on each. I know some of the Containers such as VBox and HBox have layout managers; is there a way to leverage something like that?
    Thanks.

    you would only do that if it makes your job easier.  generally speaking, it would not.
    when trying to sync sound and animation i think most authors find it easiest to use graphic symbols because you can see their animation when scrubbing the main timeline.  with movieclips you only see their animation when testing.
    however, if you're going to use actionscript to control some of your symbols, those symbols should be movieclips.

  • Best practice for exporting from iMovie '08 to iDVD

    I am looking to find out what is the best practice for exporting from iMovie '08 to iDVD. I have read the other postings that give the basic howto (export to Media Browser then select the video in iDVD). However, my question is a little more technical. I have 1080i HD projects. I am interested in burning them to DVD in the best possible quality. What setting should I be using when I publish to Media Browser?
    I am wondering about quality loss due to more than one conversion/compression. I suspect that when I export to the Media Browser this is occurring. If I am not mistaken, iMovie is using something like H.264 for this. Then, when I run iDVD, I suspect it will do another conversion/compression, I think to get to MPEG2. Not only could this result in a loss of quality but it will also take extra time. I am interested to know what others think about this.
    Finally, I am looking to create DVDs for a lot of video. I am wondering if there are any USB or firewire hardware devices out there that could speed up the compression. I use the Elgato Turbo.264 when I want to encode to H.264 but I wonder if there is something similar for DVD creation.
    Thanks in advance.

    the standards for video DVD are 720x480, and usually MPEG2 encoded..
    so, your HiDef project HAS to be 'downsampled' somehow..
    I would Export with QuickTime/Apple Intermediate => which is the 'format' your project is already in, and you avoid any useless 'in-between encoding'..
    iDVD will 'swallow' this huge export file - don't mind: iDVD cares about length, not size.
    iDVD will then convert into DVD standards..
    you can 'raise' quality by using projects <60min - this sets iDVD automatically to the highest technically possible bitrate
    hint: judge pic quality on a DVD player + TV.. not on your computer (DVDs are meant for TV delivery)

  • Best Practice in Upgrade from ECC 5.0 to ECC 6.0

    Dear All,
    Can someone help me find the Best Practice for an upgrade from ECC 5.0 to ECC 6.0 project, from the functional FI and CO side.
    Thanks

    Moved to a different forum.

  • SQL 2008 R2 Best Practices for Updating Statistics for a 1.5 TB VLDB

    We currently have a ~1.5 TB VLDB (SQL 2008 R2) that services both OLTP and DSS workloads pretty much on a 24x7x365 basis. For many years we have been updating statistics (full scan, 100% sample size) for this VLDB once a week on the weekend, which is currently taking up to 30 hours to complete.
    Somewhat recently we have been experiencing intermittent issues while statistics are being updated, which I doubt is just a coincidence. I'd like to understand exactly why the process of updating statistics can cause these issues (timeouts/errors). My theory is that the optimizer is forced to choose an inferior execution plan while the needed statistics are in "limbo" (stuck between the "old" and the "new"), but that is again just a theory. I'm somewhat surprised that the "old" statistics couldn't continue to be used while the new/current statistics are being generated (like the process for rebuilding indexes online), but I don't know all the facts behind this mechanism yet, so that may not even apply here.
    I understand that we have the option of reducing the sample percentage/size for updating statistics, which is currently set at 100% (full scan). Reducing the sample percentage/size will reduce the total processing time, but it's also my understanding that doing so will leave the optimizer with less than optimal statistics for choosing the best execution plans. This seems to be a classic case of not being able to have one's cake and eat it too.
    So in a nutshell, I'm looking to fully understand why the process of updating statistics can cause access issues, and I'm also looking for best practices in general for updating statistics of such a VLDB. Thanks in advance.
    Bill Thacker

    I'm with you. Yikes is exactly right with regard to suspending all index optimizations for so long. I'll probably start a separate forum thread about that in the near future, but for now let's stick to the best practices for updating statistics.
    I'm a little disappointed that multiple people haven't already chimed in about this and offered up some viable solutions. Like I said previously, I can't be the first person in need of such a thing. This database has 552 tables, with a whole lot more statistics objects than that associated with those tables. The metadata has to be there for determining which statistics objects can go (not utilized much, if at all, so delete them; also produce an actual script to delete the useless ones identified) and what the proper sample percentage/size should be for updating the remaining, utilized statistics (again, also produce a script that can be used for executing the appropriate UPDATE STATISTICS commands for each table based on cardinality).
    The above solution would be much more ideal IMO than just issuing a single update statistics command that samples the same percentage/size for every table (e.g. 10%). That's what we're doing today at 100% (full scan).
    Come on SQL Server Community. Show me some love :)
    Bill Thacker
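    As a rough illustration of the per-table approach described above, a JDBC sketch along these lines could generate and run the commands; the thresholds, sample percentages, and connection details are placeholders, not tuning recommendations:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.ArrayList;
    import java.util.List;

    // Rough sketch: pick a sample percentage per table from its cardinality
    // and issue UPDATE STATISTICS accordingly.
    public class StatsUpdater {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                    "jdbc:sqlserver://dbhost;databaseName=MyVldb", "user", "password")) {
                List<String> commands = new ArrayList<>();
                try (Statement st = con.createStatement();
                     ResultSet rs = st.executeQuery(
                         "SELECT s.name AS sch, t.name AS tbl, SUM(p.rows) AS cnt " +
                         "FROM sys.tables t " +
                         "JOIN sys.schemas s ON s.schema_id = t.schema_id " +
                         "JOIN sys.partitions p ON p.object_id = t.object_id " +
                         "AND p.index_id IN (0, 1) " +
                         "GROUP BY s.name, t.name")) {
                    while (rs.next()) {
                        long rows = rs.getLong("cnt");
                        // Naive placeholder rule: full scan for small tables,
                        // smaller samples as cardinality grows.
                        int pct = rows < 1_000_000 ? 100 : rows < 100_000_000 ? 25 : 5;
                        commands.add("UPDATE STATISTICS [" + rs.getString("sch") + "].[" +
                                     rs.getString("tbl") + "] WITH SAMPLE " + pct + " PERCENT");
                    }
                }
                try (Statement st = con.createStatement()) {
                    for (String cmd : commands) {
                        st.executeUpdate(cmd); // one table at a time, ideally off-peak
                    }
                }
            }
        }
    }
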

  • EBS Supplier best practice to update vendor site code, update or create a new one

    I have a question related to the EBS Supplier vendor site code. The application lets you update the vendor site code, but what is the best practice for updating it? Would you inactivate the existing one and create a new one, or would you just update the existing value?

    Ok,
    My workaround was to put an action in my TaskFlow to commit. After that I put two more actions (execute) and then went back to my page. This way works, but I would like to know if there is a more efficient way to do this just when I am inserting.
    Regards

  • Best practice for moving from a G5 to a new Mac with SL

    I am receiving my new iMac today (27") and am very excited
    However I want to move over using the best practices to assure that I remain excited and not frustrated
    My initial thoughts are to boot it up and do the initial set up - to move my iPhoto library over and to use Migration Assistant to move the rest of my data files
    Then to install all of the extra software that I can find the packages for from the original installation disks
    And then finally to use Migration Assistant again to move over any software that I can not find original disks for (I've moved from Mac to Mac to Mac over and over, and some of the software goes back to OS 9 and won't run anymore I guess)
    Is this a good way
    OR
    will I mess up doing it this way
    OR
    am I spending far too much time worrying about moving old problems over and would be better off to just turn MA loose and let it do its thing from the beginning?
    BTW - mail crashes a lot on my existing system - pretty much everything else seems ok - except iPhoto is slow - hoping that the new Intel dual core will help that
    LN

    Migration Assistant is not a general file moving tool. MA will migrate your Applications and Home folders transferring only your third-party applications. MA will transfer any application support folders required by your applications, your preferences, and network setup. You do not have a choice of what will be migrated other than the above. MA cannot determine whether anything transferred is compatible with Snow Leopard. I recommend you look at the following:
    A Basic Guide for Migrating to Intel-Macs
    If you are migrating a PowerPC system (G3, G4, or G5) to an Intel-Mac be careful what you migrate. Keep in mind that some items that may get transferred will not work on Intel machines and may end up causing your computer's operating system to malfunction.
    Rosetta supports "software that runs on the PowerPC G3, G4, or G5 processor that are built for Mac OS X". This excludes the items that are not universal binaries or simply will not work in Rosetta:
    Classic Environment, and subsequently any Mac OS 9 or earlier applications
    Screensavers written for the PowerPC
    System Preference add-ons
    All Unsanity Haxies
    Browser and other plug-ins
    Contextual Menu Items
    Applications which specifically require the PowerPC G5
    Kernel extensions
    Java applications with JNI (PowerPC) libraries
    See also What Can Be Translated by Rosetta.
    In addition to the above you could also have problems with migrated cache files and/or cache files containing code that is incompatible.
    If you migrate a user folder that contains any of these items, you may find that your Intel-Mac is malfunctioning. It would be wise to take care when migrating your systems from a PowerPC platform to an Intel-Mac platform to assure that you do not migrate these incompatible items.
    If you have problems with applications not working, then completely uninstall said application and reinstall it from scratch. Take great care with Java applications and Java-based Peer-to-Peer applications. Many Java apps will not work on Intel-Macs as they are currently compiled. As of this time Limewire, Cabos, and Acquisition are available as universal binaries. Do not install browser plug-ins such as Flash or Shockwave from downloaded installers unless they are universal binaries. The version of OS X installed on your Intel-Mac comes with special compatible versions of Flash and Shockwave plug-ins for use with your browser.
    The same problem will exist for any hardware drivers such as mouse software unless the drivers have been compiled as universal binaries. For third-party mice the current choices are USB Overdrive or SteerMouse. Contact the developer or manufacturer of your third-party mouse software to find out when a universal binary version will be available.
    Also be careful with some backup utilities and third-party disk repair utilities. Disk Warrior 4.1, TechTool Pro 4.6.1, SuperDuper 2.5, and Drive Genius 2.0.2 work properly on Intel-Macs with Leopard. The same caution may apply to the many "maintenance" utilities that have not yet been converted to universal binaries. Leopard Cache Cleaner, Onyx, TinkerTool System, and Cocktail are now compatible with Leopard.
    Before migrating or installing software on your Intel-Mac check MacFixit's Rosetta Compatibility Index.
    Additional links that will be helpful to new Intel-Mac users:
    Intel In Macs
    Apple Guide to Universal Applications
    MacInTouch List of Compatible Universal Binaries
    MacInTouch List of Rosetta Compatible Applications
    MacUpdate List of Intel-Compatible Software
    Transferring data with Setup Assistant - Migration Assistant FAQ
    Because Migration Assistant isn't the ideal way to migrate from PowerPC to Intel Macs, using Target Disk Mode, copying the critical contents to CD and DVD, an external hard drive, or networking will work better when moving from PowerPC to Intel Macs. The initial section below discusses Target Disk Mode. It is then followed by a section which discusses networking with Macs that lack Firewire.
    If both computers support the use of Firewire then you can use the following instructions:
    1. Repair the hard drive and permissions using Disk Utility.
    2. Backup your data. This is vitally important in case you make a mistake or there's some other problem.
    3. Connect a Firewire cable between your old Mac and your new Intel Mac.
    4. Startup your old Mac in Target Disk Mode.
    5. Startup your new Mac for the first time, go through the setup and registration screens, but do NOT migrate data over. Get to your desktop on the new Mac without migrating any new data over.
    If you are not able to use a Firewire connection (for example, you have a Late 2008 MacBook that only supports USB):
    1. Set up a local home network: Creating a small Ethernet Network.
    2. If you have a MacBook Air or Late 2008 MacBook see the following:
    MacBook (13-inch, Aluminum, Late 2008) and MacBook Pro (15-inch, Late 2008)- Migration Tips and Tricks;
    MacBook (13-inch, Aluminum, Late 2008) and MacBook Pro (15-inch, Late 2008)- What to do if migration is unsuccessful;
    MacBook Air- Migration Tips and Tricks;
    MacBook Air- Remote Disc, Migration, or Remote Install Mac OS X and wireless 802.11n networks.
    Copy the following items from your old Mac to the new Mac:
    In your /Home/ folder: Documents, Movies, Music, Pictures, and Sites folders.
    In your /Home/Library/ folder:
    /Home/Library/Application Support/AddressBook (copy the whole folder)
    /Home/Library/Application Support/iCal (copy the whole folder)
    Also in /Home/Library/Application Support (copy whatever else you need including folders for any third-party applications)
    /Home/Library/Keychains (copy the whole folder)
    /Home/Library/Mail (copy the whole folder)
    /Home/Library/Preferences/ (copy the whole folder)
    /Home/Library/Calendars (copy the whole folder)
    /Home/Library/iTunes (copy the whole folder)
    /Home/Library/Safari (copy the whole folder)
    If you want cookies:
    /Home/Library/Cookies/Cookies.plist
    /Home/Library/Application Support/WebFoundation/HTTPCookies.plist
    For Entourage users:
    Entourage is in /Home/Documents/Microsoft User Data
    Also in /Home/Library/Preferences/Microsoft
    Credit goes to Macjack for this information.
    If you need to transfer data for other applications please ask the vendor or ask in the Discussions where specific applications store their data.
    5. Once you have transferred what you need, restart the new Mac and test to make sure the contents are there for each of the applications.
    Written by Kappy with additional contributions from a brody.
    Revised 1/6/2009
    In general you are better off reinstalling any third-party software that is PPC-only. Otherwise update your software so it's compatible with Snow Leopard.
    Do not transfer any OS 9 software because it's unsupported. You can transfer documents you want to keep.
    Buy an external hard drive to use for backup.

  • Best Practice - Securing Schema from User Access

    Scenario:
    User A requires access to schema called BLAH.
    User A is a developer that built an application using this schema in a separate development environment, although has the same privileges mirrored to production (same roles etc - required for operation of the application built).
    This means that the User has roles that grant Select, Update etc rights for the schema / table in order to use (and maintain) the applications.
    How can we restrict access to the BLAH schema in PRODUCTION, enforcing it to only be accessible via middle tier / application (proxy authentication?)?
    We've looked at using proxy authentication; however, it's not possible to grant roles and rights to the proxy account and NOT have them granted to the user (so they can dive straight in using development tooling and hit prod etc).
    We've tried granting it on a session basis using proxy authentication (i.e. user a connects via proxy, an we ENABLE a disabled role on the user based on this connection), however, it causes performance issues.
    Are we tackling this the wrong way? What's the best practice for securing oracle schemas (and objects in general) for user access where the users actually get oracle user account (or even use SSO) for day to day business as usual.
    To me this feels like a common scenario, especially where SSO comes into play ...

    What about situations where we have Legacy Oracle Forms stuff? In these cases the user must be granted select etc rights to particular objects, as this can't connect via a middle tier.
    The problem we have is that our existing middle tier implementation is built expecting the user credentials to be passed to it during initial authentication and does not use a proxy or super-user style account. We have, historically, been 100% reliant on Oracle rights and controls to validate and restrict access to our underlying data. From what you are saying, we should start to look at using proxy or super-user access and move this control process further up, i.e. into code or packages? If so, does this mean that there is no specific way to restrict schema access to given proxy accounts and then grant normal user accounts the ability to connect through these to get access (kind of a delegated access scenario), without using disabled roles?
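    For the middle-tier path discussed above, here is a minimal sketch of Oracle JDBC proxy authentication; the account names and connect string are made up, and it assumes the DBA has granted the proxy privilege as shown in the comment:

    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.Properties;

    import oracle.jdbc.OracleConnection;

    // Sketch: the middle tier connects as a proxy account and opens a
    // lightweight proxy session per end user. Assumes the DBA has run:
    //   ALTER USER user_a GRANT CONNECT THROUGH app_proxy;
    public class ProxyAuthSketch {
        public static void main(String[] args) throws Exception {
            OracleConnection con = (OracleConnection) DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/PROD", "app_proxy", "proxy_pw");
            Properties props = new Properties();
            props.put(OracleConnection.PROXY_USER_NAME, "user_a"); // the real end user
            con.openProxySession(OracleConnection.PROXYTYPE_USER_NAME, props);
            try (Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery("SELECT USER FROM dual")) {
                rs.next();
                System.out.println("Session user is now: " + rs.getString(1)); // USER_A
            }
            con.close(OracleConnection.PROXY_SESSION); // end only the proxy session
            con.close();
        }
    }
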

  • Best practice when deleting from different tables simultaneously

    Greetings people,
    I have two tables joined with a foreign key constraint. They are written at the same time to keep the constraint happy, but I don't know the best way of deleting them as far as rowsets and datamodels are concerned. Are there "gotchas", like do I delete the row in the foreign key table first?
    I am reading thread: http://swforum.sun.com/jive/thread.jspa?forumID=123&threadID=49918
    and getting my head around it.
    Is there a tutorial which deals with this topic?
    I was wondering the best way to go.
    Many Thanks.
    Phil
    is there a "best practice" method for

    Without knowing many details about your specifics... I can suggest a few alternatives -
    You can definitely build coordinating the deletes into your application - you can automatically delete any FK related entries prior to deleting the master, or, refuse to delete the master until the user goes and explicitly deletes the children... just depends on how you want to manage it.
    Also, in many databases you can build the cascading delete rules into your database tables themselves, so that when you delete the master the deletes automatically cascade. I think this is something you typically declare when creating the FK constraint (delete cascade and update cascade rules).
    hth,
    v
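    For the first approach, here is a minimal JDBC sketch deleting the child rows before the parent inside one transaction; the table and column names (order_line, orders, order_id) are hypothetical:

    import java.sql.Connection;
    import java.sql.PreparedStatement;

    // Sketch: delete child rows before the parent row in a single transaction
    // so the FK constraint is never violated.
    public class MasterDetailDelete {
        public static void delete(Connection con, long orderId) throws Exception {
            boolean oldAutoCommit = con.getAutoCommit();
            con.setAutoCommit(false);
            try (PreparedStatement delChildren = con.prepareStatement(
                     "DELETE FROM order_line WHERE order_id = ?");
                 PreparedStatement delParent = con.prepareStatement(
                     "DELETE FROM orders WHERE order_id = ?")) {
                delChildren.setLong(1, orderId); // children first: they reference the parent
                delChildren.executeUpdate();
                delParent.setLong(1, orderId);   // now the parent can go
                delParent.executeUpdate();
                con.commit();                    // both deletes succeed or neither does
            } catch (Exception e) {
                con.rollback();
                throw e;
            } finally {
                con.setAutoCommit(oldAutoCommit);
            }
        }
    }
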

  • Need Best Practice for Migrating from Solaris to Linux

    Hi Team,
    We are migrating our Data Center from Solaris to Linux and our EBS 11i, database 10g (10.2.0.5) is 6TB. Please let us know the Best Practice to Migrate our EBS 11.5.10.2 from Solaris to Linux RHEL 5.
    We require Database 10g (10.2.0.5) on Linux x86-64 RHEL 5 and the EBS application on Linux x86 RHEL 5. Please let us know if you need any details.
    EBS version: 11.5.10.2
    DB version: 10.2.0.5
    We have checked the certifications in Oracle support.
    Oracle EBS 11.5.10.2 is not certified with Linux x86-64 RHEL 5. 
    Oracle EBS 11.5.10.2 is certified on Linux x86 RHEL 5.
    So we require Database 10g (10.2.0.5) on Linux x86-64 RHEL 5 and Application EBS on Linux x86 RHEL 5.
    Thank You.

    You can use transportable tablespaces for the database tier node.
    https://blogs.oracle.com/stevenChan/entry/10gr2_xtts_ebs11i
    https://blogs.oracle.com/stevenChan/entry/call_for_xtts_eap_participants
    For the application tier node, please see:
    https://blogs.oracle.com/stevenChan/entry/migrate_ebs_apptiers_linux
    https://blogs.oracle.com/stevenChan/entry/migrating_oracle_applications_to_new_platforms
    Thanks,
    Hussein

  • Best practice for updating ATWRT (Characteristic Value) in AUSP

    I've noticed that when we change the characteristic value of a classification, it does not update in the MM record. We have to go into MM02 for each material number that references the characteristic value and manually change it for the row in AUSP to get updated.
    Should I just create a report to loop through and update table AUSP directly? Or is there a better way to do this via a function or BAPI etc.? I want to know what best practice is recommended.

    Hi Scott
    You can use a BAPI to do that.
    Check the following thread:
    BAPI to update characteristics in Material master?
    BR
    Caetano

  • What is the best procedure to update from Leopard (or Snow Leopard) to Mavericks?

    The pacient:
    Macbook Pro A1278
    Running 10.5.8 -- but have an Update DVD to 10.6 (Snow Leopard)
    Got slower in start-up
    Very slow in power-off (takes lot of time on shut-down!)
    What is the BEST procedure to update it from Leopard to Mavericks?
    A) update & clean
    Backup to Time Machine (in Leopard)
    Update from 10.5.8 to 10.6 (Snow Leopard)
    Update 10.6 to 10.6.8
    Download and Update to Mavericks
    Do any kind of cleaning to increase speed in start-up / shut-down - Any suggestions?
    or
    B) fresh install & update & recover Time Machine
    Backup to Time Machine
    Reinstall a fresh 10.5.8 (Leopard) - should increase speed in start-up / shut-down ??
    Update from 10.5.8 to 10.6 (Snow Leopard)
    Update 10.6 to 10.6.8
    Download and Update to Mavericks
    Recover users data & config from Time Machine
    I am thrilled to hear your advice!!
    PS. Additional suggestions to make a faster boot / power-down, before or after updating, are very welcome ;!)

    Go to  Menu > About this Mac > and tell us Version, Processor & Memory specs on your Mac. Also available hard disk space, by choosing your Macintosh HD and "Get Info" (cmd-i)
    If you're having issues now with slow startup and shutdown, it's probably a third-party item. If it is, then it may be limited to your user account. If that's the case and you do a clean install and then migrate, you will wind up migrating it right back.
    The first thing to check would be to boot into your Guest account and test, or try starting in Safe Mode and see if the problem still occurs.
    Restart holding the "shift" key.
    (Expect it to take longer to start this way because it runs a directory check first.)
    If this works look in System Preferences > Users & Groups > Login items and delete any third party login items.
    Also look in /Library/Startup Items. Nothing is put in that folder by default, so anything in there is yours. Then log out and back in or restart and test.
    If the problem is sorted, be sure to make a new backup before proceeding. Since the problem is sorted and you have a backup without the offending items, there's no reason not to use the simple upgrade method you outlined in A, with the exception of "cleaning". The only cleaning your Mac should need is with a soft cloth. Stay away from so-called cleaning/optimizing utilities.

  • Best practice to update inline/publish folio?

    Hi there
    Think all is in my question
    I have an online application with an online folio and I need to update the same folio with a new version.
    What is the best practice to organize my work?
    Do I have to continue working in InDesign with the same ID but not update/republish it in Folio Producer? (this option scares me totally... what if my draft goes online???)
    Do I have to recreate another folio and, after testing it, publish it with the same folio name and description?? (not sure it will update the same file as it is not the same)
    What is the best practice to organise me/my work/my file?
    Thank U

