Best practice for moving images between projects?

Hey all,
I have a project that has an album inside of it. I want to move the album, with all of its photographs, to a new project. If I drag the photos individually to the new project, they move successfully, although they no longer belong to an album in the new project. If I drag the album itself, it moves the album and photographs but leaves the photos in the original project as well.
Does anyone have some best practice ideas for this scenario?
Thanks in advance for any help!

As you have discovered, if the drop target is the project, the images move between projects. If the drop target is an album, the images show up in the album but do not actually move anywhere; moving albums does nothing to move masters. So...
Select all of the images in the album. Drag them to the new project, and then drag the album to the new project. Simple enough.
RB

Similar Messages

  • What are the Best Practices for Optimizing Images in InDesign Files

    Is there a best practice for using images in InDesign to optimize the document before converting to a PDF? Specifically, what I'm asking is, will the PDF file compress better if the images are cropped prior to placing them in InDesign? I'd like to know the answer both for creating PDF files for printing using images that are 300 dpi and for creating PDF files for online delivery using images that are 72 dpi. I have an employee who insists images need to be cropped to actual dimensions before placing them in the InDesign document. I've never done it that way, and I believe her recommended process is far too time-consuming and leaves you no leeway to tweak your page design, since the images are tightly cropped.

    As for absolute cropping, I agree with your stance. Until the layout is fixed, preserving your ability to easily manipulate photo size and positioning is key.
    Some clever image management methods have been described in the discussion forums, and one that appealed most to me was the use of duplicate linked image folders. Having a high-res (CMYK) folder and a low-res (RGB) folder to switch between for different output enables you to use both to your advantage. Use the low-res images for layout, for internal proofing, and for EPUB/online PDF/HTML output. Then it's simply a quick switch to the high-res image folder for print purposes. You can easily prepare the alternate collection of images with a Photoshop batch convert script or with the Photoshop Image Processor. Save your presets!
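    If Photoshop isn't available for the batch step, the low-res duplicate folder can also be generated with a small script. Below is a minimal sketch in Java using the standard ImageIO API; the hires/lowres folder names, the 300-to-72 ppi ratio, and RGB JPEG sources are assumptions for illustration, not part of the workflow above.
    import java.awt.Graphics2D;
    import java.awt.RenderingHints;
    import java.awt.image.BufferedImage;
    import java.io.File;
    import javax.imageio.ImageIO;

    // Creates a low-res copy of every JPEG in hires/ inside lowres/ (assumed folder names).
    public class LowResBatch {
        public static void main(String[] args) throws Exception {
            File hires = new File("hires");
            File lowres = new File("lowres");
            lowres.mkdirs();
            for (File f : hires.listFiles((dir, name) -> name.toLowerCase().endsWith(".jpg"))) {
                BufferedImage src = ImageIO.read(f);
                // Shrink pixel dimensions by 72/300 so the copy matches its layout size at 72 ppi
                int w = Math.max(1, src.getWidth() * 72 / 300);
                int h = Math.max(1, src.getHeight() * 72 / 300);
                BufferedImage dst = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
                Graphics2D g = dst.createGraphics();
                g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                                   RenderingHints.VALUE_INTERPOLATION_BILINEAR);
                g.drawImage(src, 0, 0, w, h, null);
                g.dispose();
                ImageIO.write(dst, "jpg", new File(lowres, f.getName()));
            }
        }
    }
    The same folder-swap trick then applies: relink InDesign to lowres/ for EPUB/online output and back to hires/ for print.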

  • Best Practices for Defining NDS Java Projects...

    We are doing a Proof of Concept on using NDS to develop non-SAP Java applications.  We are attempting to determine if we can replace our current Java development tools with NDS/WAS.
    We are struggling with SAP's terminology and "plumbing" for setting up/defining Java projects.  For example, what is and when do you define Tracks, Software Components, Development Components, etc.  All of these terms are totally foreign to us and do not relate to our current Java environment (at least not that we can see).  We are also struggling with how the DTR and activities tie in to those components.
    If anyone has defined best practices for setting up Java projects, or has struggled with and overcome these same issues, please provide us with some guidance.  This is a very frustrating and time-consuming issue for us.
    Thank you!!

    Hi Peggy,
    In the Component Model we divide software projects into small components. Components can use other components in a well-defined manner.
    A development object is a part of a component that can be changed or developed in some way; it provides the component with a certain part of its functionality. A development object may be a Java class, a Web Dynpro view, a table definition, a JSP page, and so on. Development objects are always stored as “sources” in a repository.
    A development component can be defined as a frame shared by a number of development objects, which are part of the software.
    Software components combine development components (DCs) into larger units for delivery and deployment.
    A track comprises the configurations and runtime systems required for developing software component versions. It ensures stable states of deliverables used by subsequent tracks.
    The Design Time Repository (DTR) provides versioned source code management, distributed development of software in teams, and transport and replication of sources.
    You can also find a lot of support in SDN for the above concepts, with tutorials.
    Refer to this link for an overview of the Java Development Infrastructure (JDI):
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/webas/java/java development infrastructure jdi overview.pdf
    To understand further, see Working with NetWeaver Development Infrastructure:
    http://help.sap.com/saphelp_nw04/helpdata/en/03/f6bc3d42f46c33e10000000a11405a/content.htm
    In the above link you can find all the concepts clearly explained. You can also find the required tutorials for development.
    Regards,
    Vijith

  • Help Please!!  Best Practices for building an NDS Project...

    We are doing a Proof of Concept on using NDS to develop non-SAP Java applications. We are attempting to determine if we can replace our current Java development tools with NDS/WAS.
    We are struggling with SAP's terminology and "plumbing" for setting up/defining Java projects. For example, what is and when do you define Tracks, Software Components, Development Components, etc. All of these terms are totally foreign to us and do not relate to our current Java environment (at least not that we can see). We are also struggling with how the DTR and activities tie in to those components.
    If anyone has defined best practices for setting up Java projects, or has struggled with and overcome these same issues, please provide us with some guidance. This is a very frustrating and time-consuming issue for us.
    Thank you!!

    Hello Peggy,
    this is my first post but I hope it helps you anyway.
    To learn the SAP "language" I additionally used an SAP presentation regarding the SAP JDI.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/documents/a1-8-4/java development infrastructure real world use webinar.pdf
    I think this one is quite useful as an add-on to the other links you already got. Your name also indicates that your mother tongue is German. If so, the German version of the book (Java-Programming with the SAP WAS) is already available for purchase and really useful. You can also use the information provided by the University of Potsdam; they have an introduction on how to set up a track in the SLD and then how to set up SCs.
    http://epic.hpi.uni-potsdam.de/nwlab/SC+Track.html
    Hope this helps...

  • Best practice for increasing image processing speed

    Are there best practices for efficient image processing so that overall performance will be improved? I need to do near real-time image processing (threshold, filtering, particle analysis/cleaning, and measurements) at 10 fps. So far I am not satisfied with my cycle time, so I am wondering if there are documented ways to speed up performance.

    Hi,
    Navigate to the "Vision and Motion" palette, then the "Vision Utilities" and "Image Management" palettes; there you will find the "IMAQ Multi-Core Options" VI.
    You will be able to set or get the number of cores used by the IMAQ functions.
    Regards

  • Best Practices for File Organization/Project Explorer

    So we are finally getting SCC at my organization to manage our LabVIEW development, and that is good! 
    Now, we are starting in on discussions about how we should organize our files on disk and how we should use the Project Explorer. When I started here about 3 years ago, I wasn't very familiar with the project explorer, so I read the article at http://zone.ni.com/devzone/cda/tut/p/id/7197. Two of the main things I took away from that article are:
    1. Organize Files in a logical manner on disk. Whatever that is, it is not a flat file structure.
    2. The top level VI should be separate from other source code. Preferably, it should reside in the application folder.
    Push Back Against These Recommendations
    Before I was hired, most, if not all, LabVIEW development was done using a flat file structure, and the top-level VI lived with the source code. Since we didn't have proper SCC, each individual organized files as he saw fit. So I started using the Project Explorer (even its use isn't totally accepted right now) and began following recommendations 1 and 2 above. I didn't always follow #1 very strictly, but I have been working towards it, and I have always followed #2 religiously.
    Since we are starting these discussions on how we should organize files on disk I'm starting to get some push back to following these two recommendations.
    The arguments I get in favor of using a flat file structure are that you always know where every file is, including the top-level VI. It is also argued that it is a lot of effort to organize and search for VIs when they all reside in different folders. I think the fear is that by getting "clever" and organizing our files in such a manner we'll make things complicated and somehow shoot ourselves in the foot.
    The argument I get against separating the top level VI from the rest of the source code is that it:
    (a) It won't be clear where it is (like it is buried within hundreds of VIs). However, it is argued, you can just put a "!" in front of the file name and then it is always at the top of the flat file structure.
    (b) An extension of argument (a) is that things either look or seem messy when VIs (including the top-level VI) don't live in a sub-folder and are just hanging out with the Project Explorer file.
    (c) I think there may be some fear of breaking the VI by moving it and altering the dependencies for the VI. 
    Convincing Others its Good to Follow These Recommendations
    So, if I want to follow NI's recommendations, I need to come up with reasons we should follow them. I should also state that I care about following these recommendations because it's what NI recommends; they've been around the block a few times, and I'm sure there are good reasons why these are best practices. However, I don't think I've given a very compelling case for why these recommendations should be followed.
    So I'll tell you all what I think good reasons are for these recommendations and perhaps I can get some feedback or additional support? If I'm crazy for wanting to follow these recommendations maybe someone can point out why I'm crazy. 
    (a) Arguments for Following Both
    I. I passed the CLAD a couple of weeks ago, and I have started studying for the CLD. Part of the CLD is following both of these recommendations (see page 6 of http://ftp.ni.com/evaluation/certification/cld/cld_exam_prep_guide_english.pdf). While this isn't a reason in and of itself, it suggests that if it is important for certification, it is important in practice!
    II. If we hire new developers who are familiar with LabVIEW, they will most likely be familiar with these recommendations, especially if they are certified. That will lead to increased productivity out of the gate because they won't have to learn our special way of doing things.
    (b) Arguments for Organized File Structure
    I. Unused VIs are easier to identify and remove. Right now we never remove VIs because we don't know if they are used or not. This leads to a lot of VI bloat.
    II. It is hard to know what a specific VI's function is in a flat file structure by looking at the name.
    (c) Arguments for Separating Top Level VI from Source Code
    I. The application folder is an intuitive place for the top-level VI. As long as it is the only VI in the application folder, there is no mistaking it is the top-level VI, especially once you open it. This makes it easy for new developers to find the top-level VI. I'd argue it isn't very intuitive for new developers to know that a VI in the source code folder prefaced with a "!" is the top-level VI.
    Summary
    So that is what I think so far. Is there anything else I am missing to support following those two recommendations or am I just being inflexible?
    Thanks!

    zenthoef,
    As a CLA, I have struggled with file structure over the years.  Here are my recommendations:
    1.  Put the top level VI and the project in the top-level folder.  This makes it very clear where to begin.
    2.  Put the remaining user interface VIs in a separate folder.  Again, it makes it very clear what the functionality of these VIs is.
    3.  If you are using objects, put each object in a separate folder.  Place the family of objects in one folder, with each object in a subfolder.
    4.  Keep the remaining VIs in a single folder.  This can contain a small number of subfolders if your project is large, but too many folders makes it hard to figure out where your VIs are.  For example, you might have a DAQ subfolder, an Analysis subfolder, and a Report subfolder.  But if you had a Test1 folder and a Test2 folder, and you had a VI that was used by both tests, where would it go?  Keep it simple.
    5.  You mentioned that it is hard to figure out what a VI does by its name.  That implies that 1) you need better names, and 2) your VIs are too complicated.  A VI should do a single function which can be adequately described by its name.  That VI might be something like Analyze Data.vi, which would contain a bunch more subVIs (like Get 1st Harmonics.vi), but each VI would contain a single function.  You wouldn't save the data to a report in Analyze Data.vi, for example.
    The most compelling reason for following these suggestions is that it is easier to figure out what the code is doing after you haven't looked at it for a while.  Once you have an application that is working and bug free, you shouldn't have to touch the code until you want to add features.  If that is even 6 months later, you will probably have forgotten how the code works.  As a consultant, I have had to update other people's code, and just figuring how where to start can be a challenge.
    Tom Brass
    Certified LabVIEW Architect
    Saint Bernard Engineering, Inc.
    www.saintbernardengineering.com

  • Best practice for moving from a G5 to a new Mac with SL

    I am receiving my new iMac today (27") and am very excited
    However I want to move over using the best practices to assure that I remain excited and not frustrated
    My initial thoughts are to boot it up and do the initial setup, move my iPhoto library over, and use Migration Assistant to move the rest of my data files.
    Then install all of the extra software that I can find the packages for from the original installation disks.
    And then finally use Migration Assistant again to move over any software that I cannot find original disks for (I've moved from Mac to Mac to Mac over and over, and some of the software goes back to OS 9 and won't run anymore, I guess).
    Is this a good way?
    OR
    will I mess up doing it this way?
    OR
    am I spending far too much time worrying about moving old problems over and would be better off to just turn MA loose and let it do its thing from the beginning?
    BTW - mail crashes a lot on my existing system - pretty much everything else seems ok - except iPhoto is slow - hoping that the new Intel dual core will help that
    LN

    Migration Assistant is not a general file moving tool. MA will migrate your Applications and Home folders transferring only your third-party applications. MA will transfer any application support folders required by your applications, your preferences, and network setup. You do not have a choice of what will be migrated other than the above. MA cannot determine whether anything transferred is compatible with Snow Leopard. I recommend you look at the following:
    A Basic Guide for Migrating to Intel-Macs
    If you are migrating a PowerPC system (G3, G4, or G5) to an Intel-Mac be careful what you migrate. Keep in mind that some items that may get transferred will not work on Intel machines and may end up causing your computer's operating system to malfunction.
    Rosetta supports "software that runs on the PowerPC G3, G4, or G5 processor that are built for Mac OS X". This excludes the items that are not universal binaries or simply will not work in Rosetta:
    Classic Environment, and subsequently any Mac OS 9 or earlier applications
    Screensavers written for the PowerPC
    System Preference add-ons
    All Unsanity Haxies
    Browser and other plug-ins
    Contextual Menu Items
    Applications which specifically require the PowerPC G5
    Kernel extensions
    Java applications with JNI (PowerPC) libraries
    See also What Can Be Translated by Rosetta.
    In addition to the above you could also have problems with migrated cache files and/or cache files containing code that is incompatible.
    If you migrate a user folder that contains any of these items, you may find that your Intel-Mac is malfunctioning. It would be wise to take care when migrating your systems from a PowerPC platform to an Intel-Mac platform to assure that you do not migrate these incompatible items.
    If you have problems with applications not working, then completely uninstall said application and reinstall it from scratch. Take great care with Java applications and Java-based Peer-to-Peer applications. Many Java apps will not work on Intel-Macs as they are currently compiled. As of this time Limewire, Cabos, and Acquisition are available as universal binaries. Do not install browser plug-ins such as Flash or Shockwave from downloaded installers unless they are universal binaries. The version of OS X installed on your Intel-Mac comes with special compatible versions of Flash and Shockwave plug-ins for use with your browser.
    The same problem will exist for any hardware drivers such as mouse software unless the drivers have been compiled as universal binaries. For third-party mice the current choices are USB Overdrive or SteerMouse. Contact the developer or manufacturer of your third-party mouse software to find out when a universal binary version will be available.
    Also be careful with some backup utilities and third-party disk repair utilities. Disk Warrior 4.1, TechTool Pro 4.6.1, SuperDuper 2.5, and Drive Genius 2.0.2 work properly on Intel-Macs with Leopard. The same caution may apply to the many "maintenance" utilities that have not yet been converted to universal binaries. Leopard Cache Cleaner, Onyx, TinkerTool System, and Cocktail are now compatible with Leopard.
    Before migrating or installing software on your Intel-Mac check MacFixit's Rosetta Compatibility Index.
    Additional links that will be helpful to new Intel-Mac users:
    Intel In Macs
    Apple Guide to Universal Applications
    MacInTouch List of Compatible Universal Binaries
    MacInTouch List of Rosetta Compatible Applications
    MacUpdate List of Intel-Compatible Software
    Transferring data with Setup Assistant - Migration Assistant FAQ
    Because Migration Assistant isn't the ideal way to migrate from PowerPC to Intel Macs, using Target Disk Mode, copying the critical contents to CD and DVD or an external hard drive, or networking will work better. The initial section below discusses Target Disk Mode. It is then followed by a section which discusses networking with Macs that lack Firewire.
    If both computers support the use of Firewire then you can use the following instructions:
    1. Repair the hard drive and permissions using Disk Utility.
    2. Backup your data. This is vitally important in case you make a mistake or there's some other problem.
    3. Connect a Firewire cable between your old Mac and your new Intel Mac.
    4. Startup your old Mac in Target Disk Mode.
    5. Startup your new Mac for the first time, go through the setup and registration screens, but do NOT migrate data over. Get to your desktop on the new Mac without migrating any new data over.
    If you are not able to use a Firewire connection (for example, you have a Late 2008 MacBook that only supports USB):
    1. Set up a local home network: Creating a small Ethernet Network.
    2. If you have a MacBook Air or Late 2008 MacBook see the following:
    MacBook (13-inch, Aluminum, Late 2008) and MacBook Pro (15-inch, Late 2008)- Migration Tips and Tricks;
    MacBook (13-inch, Aluminum, Late 2008) and MacBook Pro (15-inch, Late 2008)- What to do if migration is unsuccessful;
    MacBook Air- Migration Tips and Tricks;
    MacBook Air- Remote Disc, Migration, or Remote Install Mac OS X and wireless 802.11n networks.
    Copy the following items from your old Mac to the new Mac:
    In your /Home/ folder: Documents, Movies, Music, Pictures, and Sites folders.
    In your /Home/Library/ folder:
    /Home/Library/Application Support/AddressBook (copy the whole folder)
    /Home/Library/Application Support/iCal (copy the whole folder)
    Also in /Home/Library/Application Support (copy whatever else you need including folders for any third-party applications)
    /Home/Library/Keychains (copy the whole folder)
    /Home/Library/Mail (copy the whole folder)
    /Home/Library/Preferences/ (copy the whole folder)
    /Home/Library/Calendars (copy the whole folder)
    /Home/Library/iTunes (copy the whole folder)
    /Home/Library/Safari (copy the whole folder)
    If you want cookies:
    /Home/Library/Cookies/Cookies.plist
    /Home/Library/Application Support/WebFoundation/HTTPCookies.plist
    For Entourage users:
    Entourage is in /Home/Documents/Microsoft User Data
    Also in /Home/Library/Preferences/Microsoft
    Credit goes to Macjack for this information.
    If you need to transfer data for other applications please ask the vendor or ask in the Discussions where specific applications store their data.
    5. Once you have transferred what you need, restart the new Mac and test to make sure the contents are there for each of the applications.
    Written by Kappy with additional contributions from a brody.
    Revised 1/6/2009
    In general you are better off reinstalling any third-party software that is PPC-only. Otherwise update your software so it's compatible with Snow Leopard.
    Do not transfer any OS 9 software because it's unsupported. You can transfer documents you want to keep.
    Buy an external hard drive to use for backup.

  • Best practice for maintaining URLs between Dev, Test, Production servers

    We sometimes send order confirmations which include links to other services in requestcenter.
    For example, we might use the link <a href="http://#Site.URL#/myservices/navigate.do?query=orderform&sid=54">Also see these services</a>
    However, the service ID (sid=54) changes between our dev, test, and production environments.  Thus we need to manually go through notifications when we deploy between servers.
    Any best practices out there?

    Your best practice in this instance depends a bit on how much work you want to put into it at the front end and how tied to the idea of a direct link to a service you are.
    If your team uses a decent build sheet and migration checklist, then updating the various URLs can just be part of the process. This is cumbersome, but it's the least "technical" solution if you want to continue using direct links.
    A more technical solution would be to replace your direct links with links to a "broker page". It's relatively simple to create an ASP page that can accept the name of the service as a parameter, execute an SQL query against the DB to return the ServiceID, construct the appropriate link, and pass the user through.
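    For illustration, such a broker page might look like the sketch below, written here as a Java servlet rather than ASP; the JNDI datasource name and the DirService/ServiceName/ServiceID table and column names are hypothetical stand-ins for whatever your RequestCenter schema actually uses.
    import java.io.IOException;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import javax.naming.InitialContext;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.sql.DataSource;

    // Resolves a service name to its environment-specific sid, then redirects.
    public class ServiceBrokerServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            String name = req.getParameter("service"); // e.g. ?service=Order+New+Laptop
            try {
                DataSource ds = (DataSource) new InitialContext()
                        .lookup("java:comp/env/jdbc/RequestCenter"); // hypothetical JNDI name
                try (Connection con = ds.getConnection();
                     PreparedStatement ps = con.prepareStatement(
                             "SELECT ServiceID FROM DirService WHERE ServiceName = ?")) {
                    ps.setString(1, name);
                    try (ResultSet rs = ps.executeQuery()) {
                        if (rs.next()) {
                            // Build the direct link with the sid valid for THIS environment
                            resp.sendRedirect("/RequestCenter/myservices/navigate.do"
                                    + "?query=orderform&sid=" + rs.getInt(1));
                            return;
                        }
                    }
                }
                resp.sendError(HttpServletResponse.SC_NOT_FOUND, "Unknown service: " + name);
            } catch (Exception e) {
                throw new IOException(e);
            }
        }
    }
    Because notifications would then link to the broker with a service name instead of a numeric sid, the same notification text works unchanged in dev, test, and production.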
    A less precise, but typically viable, option would be to use links that take advantage of the built in search query functionality. Your link might display more results than just one service but you can typically tailor your search query to narrow it down. For example:
    If you have a service called Order New Laptop or Desktop and you want to provide a link that will get the user to that service you could use: http://#Site.URL#/RequestCenter/myservices/navigate.do?query=searchresult&&searchPattern=Order%20New%20Desktop%20or%20Laptop
    The above would open the site and present the same results as if the user searched for “Order New Desktop or Laptop” manually. It’s not as exact as providing a direct link but it’s quick to implement, requires no special technical expertise and would be “environment agnostic”.

  • Best practices for placing images in epubs

    Okay, I've read the books, I've watched the tutes and I'm still at a loss on the best way to add images to InDesign 5.5 documents that I will convert to ePub. The images are created in Photoshop at 300 dpi and sized at 800 by 600. And yet when I place them in InDesign and create the ePub, the images display poorly in ADE and other readers. What is the magic formula for adding images to my ePubs?
    Thanks in advance
    Chris

    Steve/Jongware:
    My thanks for your responses.
    Very good point to look at the images within the epub file. They are 72 dpi images, so it would seem that InDesign is indeed lowering the resolution. I had thought I tried every variation of the available settings within InDesign to avoid this but it would seem to not be the case. If you have any specific suggestions about what setting is causing this, that would be great.
    Viewing the images on the target device is of course ideal. I'm creating this for a wide range of online bookstores, so what device it will be viewed on can't be known. Since ADE drives more than 50 devices, I had hoped it would at least prove to be a reliable base of some sort -- disappointing to hear that ADE cannot be trusted in that regard. I had assumed I could proof the book in that before converting to mobi and proofing on a Kindle.
    Why would anyone insert 300 dpi images, you ask? In its publishing guidelines, Amazon says: "To future-proof the content, save images in 300 dpi" Is this then bad advice on their part? Elizabeth Castro echoes this recommendation in her excellent book on the subject, by the way. It's rather difficult to know just what to do, I must admit. But I guess we're still in the early days of eBook creation, with best practices still being in a state of flux.
    Once again, my thanks for sharing your experience.
    Chris

  • Best practices for realtime communication between background tasks and main app

    I am developing (in fact, porting to a WinRT Universal App) an application connecting to Bluetooth medical devices. In order to support background connectivity, it seems best to use background tasks triggered by a device connection. However, some of these devices provide a stream of data which has to be passed to the main app in real time when it is active, i.e. to show an ECG on the screen. So my task ideally should receive and store data all the time (both background and foreground) and additionally let the main app receive it live when it is in the foreground.
    My question is: how do I make the background task pass real-time data to the app when it is active? The documentation talks about using storage, but that does not seem optimal for real-time messaging. Looking for best practices and advice. The platform is Windows 8.1 and Windows Phone 8.1.

    Hi Michael,
    Windows Phone apps have resource quotas. To prevent this from interfering with real-time communication functionality, background tasks using the ControlChannelTrigger and PushNotificationTrigger receive guaranteed resource quotas for every running task. You can find more information at
    https://msdn.microsoft.com/en-us/library/windows/apps/xaml/Hh977056(v=win.10).aspx (see the "Background task resource guarantees for real-time communication" section). ControlChannelTrigger is not supported on Windows Phone, so have a look at the PushNotificationTrigger class:
    https://msdn.microsoft.com/en-us/library/windows/apps/xaml/windows.applicationmodel.background.pushnotificationtrigger.aspx
    Regards,

  • Best practice for moving portal solution using content db from UAT to PROD

    Hi,
    Would like to know: can we back up the database from the UAT environment and restore the same to PROD, given that all of my functionality is working fine in UAT?
    I have event receivers [web-level features], site collection level features, custom web parts, custom permissions, saved site templates, custom discussion forums, etc.
    Assume that I have my custom solution deployed on PROD, which will activate the features for those web parts and my custom application page features.
    Are there any issues I can anticipate in the PROD environment if I perform this activity?
    or
    Is this approach not recommended by Microsoft? If yes, what's the best approach for deploying a portal solution in PROD?
    Should I create the web application, site collections, and everything else in PROD from scratch?
    Any links regarding this approach and the best practices / helpful info are appreciated.

    Thanks Trevor for the reply.
    So, I can go ahead and create the web applications and site collections, deploy my web parts, item event receivers, application pages, and timer jobs in UAT, take the backup of the same, and restore it in the PROD environment.
    But I have a doubt here, as I have a few site pages created in my site template. When I take the backup of this web application's content DB [I think I can take the backup of the web application content DB through PowerShell],
    will the site pages also be part of this backup?
    I had some experience in a previous version of SP, wherein I had a few site pages and a saved site template. I took the backup of the web application, restored it in another farm, and associated the restored content DB to the newly created web application in the targeted farm.
    But when I navigated to those restored site pages, it gave me a "resource not found / file not found" error.
    I had deployed the custom web parts as a custom WSP and added them into those site pages,
    and it failed to load those web parts' UI.
    I was not sure whether this happened because of the backup or restore from the source SP farm to the targeted SP farm.

  • Best practice for adding images to a RH8 HTML project?

    I'm working on a Word to RH8 HTML conversion. The images in Word are Snagit images, but the resolution is poor, and some of the images may need to be recreated from scratch. Going forward I imagine I should work on getting them right in Snagit 9. Any suggestions?

    For example, installation help topics dealing with install/uninstall screenshots for an application's client and server:
    client_dir_sum.gif
    client_dot_net.gif
    client_uninst_wlcm.gif
    server_wlcm.gif
    server_cfg_tomcat.gif
    server_success.gif
    server_uninst_wlcm.gif
    ...or, screenshots dealing with an Accounts window:
    acct_groups.gif
    acct_lookup.gif
    acct_lookup_btn.gif
    acct_templates.gif
    In other words, use cfg for configuration, dir for directory, btn for button, etc., and use either a function or application window as your lead, allowing you to see all related graphics in sequential order. If you've ever done any manual book indexing, think of this as Index entries/subentries, only the "entry" in file naming is your leading prefix (client_, acct_, etc.), and your subentries are the rest of each file name. Such as:
    Accounts
    configuring groups
    using lookup feature
    lookup button
    templates
    Use the same naming conventions for your topic file names as well; using this method eliminates the need for virtual folders. Many users on this forum use the folder structure, but editing/renaming/adding/deleting folders seems to very often create all sorts of grief. I find this naming structure to be less stressful.
    Good luck,
    Leon

  • Best practice for compressing images to uniform size?

    What would be an effective method of compressing all the images (JPEG) used by an application to a uniform contentLength?
    Initial thought is to generate a new copy using processCopy with successively smaller integer values for compressionQuality (in steps of 5 or 10 say) until a result smaller than the target length is achieved, then replace the original with the first suitable copy.

    contentLength is determined by many factors, such as height, width, and compressionQuality.
    If all images have the same height and width, changing compressionQuality will help to reach the expected contentLength. But be careful when using too small a compressionQuality, which may make image quality poor.
    Another way to get a uniform contentLength is to use the TIFF format with NONE compression.
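    If you want to prototype the stepped-quality approach from the question outside the database, here is a minimal sketch in Java using the standard ImageIO JPEG writer; the 90-down-to-10 quality steps and the 100 KB target are assumptions for illustration, not part of the interMedia processCopy API.
    import java.awt.image.BufferedImage;
    import java.io.ByteArrayOutputStream;
    import java.io.File;
    import javax.imageio.IIOImage;
    import javax.imageio.ImageIO;
    import javax.imageio.ImageWriteParam;
    import javax.imageio.ImageWriter;
    import javax.imageio.stream.MemoryCacheImageOutputStream;

    public class UniformSize {
        // Recompresses img at quality 90, 80, ... 10 and returns the first result under maxBytes.
        static byte[] compressToFit(BufferedImage img, long maxBytes) throws Exception {
            for (int q = 90; q >= 10; q -= 10) {
                ByteArrayOutputStream buf = new ByteArrayOutputStream();
                ImageWriter writer = ImageIO.getImageWritersByFormatName("jpeg").next();
                ImageWriteParam p = writer.getDefaultWriteParam();
                p.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
                p.setCompressionQuality(q / 100f); // ImageIO quality runs from 0.0 to 1.0
                MemoryCacheImageOutputStream out = new MemoryCacheImageOutputStream(buf);
                writer.setOutput(out);
                writer.write(null, new IIOImage(img, null, null), p);
                writer.dispose();
                out.close();
                if (buf.size() <= maxBytes) return buf.toByteArray();
            }
            throw new IllegalStateException("cannot fit image under " + maxBytes + " bytes");
        }

        public static void main(String[] args) throws Exception {
            BufferedImage img = ImageIO.read(new File(args[0]));
            byte[] jpeg = compressToFit(img, 100_000); // assumed 100 KB target
            System.out.println("achieved " + jpeg.length + " bytes");
        }
    }
    As the reply notes, watch the low end of the loop: very small quality values will make the image quality poor, so it may be better to stop early and resize instead.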

  • Best Practices for Publishing your Captivate Project

    Come join RealEyes tomorrow for a session on publishing Captivate projects. We will be discussing each method of publishing, and the benefits and drawbacks of publishing that way. We will also cover some of the common things missed that can make a better experience for your viewers when accessing your project.
    Come here to join us: http://cyourteam.com/realeyesconnect/ev … te-project
    The session is from 11A-12P Mountain Time.

    Ok, we finally have a version of this session to post up here.
    Enjoy! http://cyourteam.com/realeyesconnect/ev … and-tricks

  • Best practices for moving from my iMac to my new MacBook Pro without carrying along a lot of junk?

    Some suggestions are welcome!
    I have been using an iMac late 2009 21.5" for the last few years. Recently I received a new MacBook Pro. The MacBook Pro has better specs (better cpu, more RAM, SSD drive) and is overall faster. I got an external 27" IPS monitor to hook up to the HDMI port and going forward I would like to use the new MBP with external monitor as my main desktop computer and keep my iMac as a backup machine if the MBP ever needs repairs.
    Anyway, I have two backups of my iMac: (1) Time Machine, which does its usual incremental backup, but excludes some things, like Parallels VMs and (2) A CCC bootable clone that runs every night at 3:30 am while I'm asleep. And as I'm writing this I'm updating that clone copy.
    The thing is, I don't want to move everything over to my new MBP. Over the years I've accumulated a lot of junk, and currently am using about 450 GB of the 512 GB disk space. My new MBP also has 512 GB of space, but I figure why not just bring over what I think is necessary and later on, if I'm missing something, I can get it from the CCC backup or the Time Machine backup. (I will get new backup drives for the MBP). My guess is I don't really need even half of what I've currently got on my iMac HD.
    So I don't think I want to use the Time Machine migration assistant. And I don't want to restore the new computer from the CCC clone. But I do want to add in enough applications and documents and settings so I can continue running much like I am running on my iMac. I'd like my installed software (BBEdit, Office, Parallels, CCC, etc.) to be licensed and avoid a new install of each app if possible. I'd like the apps with open documents to remember which docs were open, and when I start them again have things open up like they did on my iMac.
    Are there some strategies people can recommend for this? Like for each application, copy over the application, and related preference files from ~/Library? Or are there some applications which just plain have to be re-installed to work right?
    If what I'm thinking of doing is too complicated, the alternative would be to do a complete restore to the MBP and then try an extensive hash-and-slash and attempt to delete old, unneeded things.
    But I have old preference files going back years and I figure why not start afresh, since everything is so incredibly speedy on my new MBP.
    What do people here usually do when they get a new computer that will replace the old one?
    Thanks,
    Doug

    Well, I've been trying this for a while. By my calculations, at this rate, installing all the applications one by one and trying to restore all the data manually will take me about six years. I would finish just in time for the Tokyo 2020 Olympics.
    The main problem is, I don't know where all the bodies are buried. Just one example, I'm using Cornerstone for an svn client. I have working copies of multiple repositories on my computer. I really don't want to set them all up again, and I can't figure out where the Cornerstone preferences and all settings are stored. I couldn't find them in the Library anywhere.
    And that's just one case.
    I haven't done much with my MacBook Pro yet. I think in retrospect the easiest course of action is to reinstall OS X as though it were a new MacBook Pro, start from scratch, use my Time Machine backup with Migration Assistant, and go from there.
    And that's what I'm doing right now. I'm in the middle of reinstalling OS X.
    Afterwards I'll use a clean app uninstaller application to get rid of all the applications that I'm not using. And I'll just hack away at directories that I know I haven't looked at for years, with the confidence that I have a backup both in Time Machine and in CCC, and also on my iMac. That's three backups.
    I would rather start out "light", but I can see it's just going to take forever and a day to get it done. Anyway, fortunately nothing is carved in stone, and I can always try again if I want to. But dealing with all the documents and all the applications one by one separately was just obviously going to take too much time.
    I'll report back on my results. And I appreciate your suggestion.
    Doug
