Percentage and general Question

I just bought a 2.4 MB. I was given some info from Apple staff that differs somewhat from the MB manual. They suggested draining the battery completely, then charging overnight, and also not keeping it plugged in longer than necessary. I was having loads of battery problems with my PowerBook, so I followed this advice. My question is, the time estimate seems pretty accurate, but my percentage is dropping at a steady rate. Is this normal, or should I re-calibrate per the manual's suggestion? I tend to get a little obsessive about battery issues, so I really want it to stay as close to full capacity as possible.

I would calibrate it per the manual.
Also, I leave my MBA plugged in whenever I can. This way I don't use up cycles. There is no problem keeping it plugged in, since the charging circuitry is intelligent and knows when to go into a trickle charge. You probably should use it off battery every now and then, though.
Glor

Similar Messages

  • Performance-testing-videos and general questions

    Hey,
    I've been trying to evaluate if it would be reasonable to use the packager for developing commercial products.
    I would like to show you some videos of my results.
    http://www.youtube.com/watch?v=lsprSZY-HEM   --> iPhone/iPod
    http://www.youtube.com/watch?v=HJNcz58690c   --> iPhone/iPod
    http://www.youtube.com/watch?v=5hH3vnhOWcs  --> iPad
    From my perspective this seems to be the maximum that can be done with the packager in terms of performance.
    -There are 3 parallax scrolling background layers.
    -Scrolling is somewhat smooth
    -Performance is ok, gameplay in the demos doesn't feel sticky.
    -No slowdowns for up to a maximum of 30 enemy sprites on screen.
    -Basic pixel-based collision with an image is possible, though only for 2-3 sprites registered for collision at the same time.
    I am not a very experienced programmer, and I am sure a lot of my code could be optimized.
    What I would like to ask is:
    1) Has somebody been able to squeeze more out of the packager?
    2) Since packager apps have problems with multitasking on iOS, and strategies to save and restore the app's state on reload/focus seem buggy, would it be professional to produce apps with the packager?
    3) If you start (and stop) several different packager apps, device performance (iPhone/iPad) will come to a grinding halt because of memory issues when multitasking. So the question again would be: can you sell software that will only perform under certain circumstances?
    Those of you who have Apps in the store, have there been any negative comments on those issues?
    Please comment freely.
    Thanks,
    Simon

    In my opinion, Flash is a viable option only if the application being developed
    1. is not graphics, memory, or CPU intensive
    2. needs rapid deployment to the iOS/Android/WM7 market with relative ease
    3. benefits from easier programming and a graphical UI (Flash is much more user friendly than Xcode)
    If so, then yes, Flash is your solution.
    However, point 1 varies greatly with the hardware and with what your app needs to achieve. The bottom line is that Flash apps are, and will always be, slower than native apps simply because a Flash player is launched to run your app. Hopefully the next update can address the performance issue rather than just being an API update pack.
    I would rather have better performance.

  • General questions on IDOCs and IDOCs for 2 Accounting Interface BAPIs

    This post involves several questions pertaining to the topic of IDOC creation. I downloaded a couple of PDFs and tried googling for material on that, but things are far from being clear in my mind.
    I tried to put my questions in some order, so we can follow a line of reasoning. Here we go, then:
    I have a piece of code where there are calls to 2 BAPIs:
    - BAPI_ACC_ACT_POSTINGS_REVERSE and
    - BAPI_ACC_GL_POSTING_REV_POST
    I am supposed to prepare/create an IDOC to perform the activities these BAPIs are responsible for, for the sole purpose of providing much more detail on the activities being executed in the system - this is one of the IDOC's features, if I understood correctly: highly detailed logging of everything that is going on behind the scenes.
    Now, the 1st question arises:
    From the material I read, I understood that IDOCs are nothing more than data containers, whose sole purpose is to provide a means of communication between two different systems/parties - one of them would usually be SAP. If this is right, then what sort of IDOC would be this one I am supposed to build, if there is not going to be any inter-system communication? Doesn't it sound strange that pure "data containers" can work as "logging functions"? Please shed some light here.
    The 2nd question - once I understand what an IDOC really is - is then connected to the job I have to do. I found 2 IDOCs which I think have the proper/corresponding basic types for the 2 aforementioned BAPIs. They are, respectively:
    - ACC_DOCUMENT_REVERSE01 and
    - ACC_GL_POSTING_REVERSE01
    Getting back to my understanding of IDOCs, I got that every IDOC is generally made of one control record, data record(s), and status record(s). 3rd question: Where do the segments fit in? Are the segments the definitions of the data records? And why is it that some IDOC types have header segments and others don't? (Header segments are not the same as control records, right?)
    Finally, what is the general process flow for creating/preparing an IDOC? I looked over a couple of forum posts about this, but some of them differ from one another in the order of the steps, and some don't mention this or that step, so I am still confused.
    4th and last question: what comes first ? The definition of a partner, the bonding of a message type with an IDOC basic type, definitions of the inbound/outbound interfaces ?
    Any help here would be highly appreciated.
    Thanks in advance,
    Avraham

    Hi Jaya,
    Answer 1. A class is a template for creating objects. An object can also be called an instance.
    Interfaces allow you to use different classes in a uniform way (polymorphism).
    Answer 2. Normal ABAP is procedural programming, whereas with ABAP Objects we can achieve object-oriented programming.
    Answer 6. Source code:
    In the code below I have created an interface and a class that implements the interface. I have declared a reference variable of the interface type and created an object. Then I have called a method.
    REPORT  ZABAPOBJECTS_INTERF.
    *       INTERFACE I1
    INTERFACE I1.
      METHODS METH1.
    ENDINTERFACE.                    "I1
    *       CLASS C1 DEFINITION
    CLASS C1 DEFINITION.
      PUBLIC SECTION.
        METHODS: METH2.
        INTERFACES: I1.
    ENDCLASS.                    "C1 DEFINITION
    *       CLASS C1 IMPLEMENTATION
    CLASS C1 IMPLEMENTATION.
      METHOD I1~METH1.
        WRITE: / 'This is a method one'.
      ENDMETHOD.                                                "I1~METH1
      METHOD METH2.
        WRITE: / 'This is a method two'.
      ENDMETHOD.                                                "METH2
    ENDCLASS.                    "C1 IMPLEMENTATION
    START-OF-SELECTION.
      DATA : REF1 TYPE REF TO I1.
      CREATE OBJECT REF1 TYPE C1.
      CALL METHOD REF1->METH1.
    Answer 7. Yes, we need to create a class, but most of the time we use existing classes.
    Regarding BAPIs, go through the links below:
    http://www.sapgenie.com/abap/bapi/example.htm
    http://www.sapdevelopment.co.uk/bapirfc/bapirfchome.htm
    Regards,
    Azaz Ali.

  • Report Builder Question - OA AR Aging - and a general question

    I'm sure this is the wrong forum for this question, but I thought there might be someone here who might be using Oracle Applications and Report Builder who'd be kind enough to help me out.
    We've recently implemented Oracle Applications 11.5.10, and I have to use Report Builder to change the Accounts Receivable Aging (7 bucket) report to a 5 bucket report. I've already made some changes to the seeded "ARXAGMW.rdf" report, but I'm not a big Oracle Reports guy. I've stumbled through making some changes in various other reports. But this one is just plain nasty! :)
    I was thinking that I could simply add buckets 6 & 7 to bucket 5, then just hide or delete the 6 & 7 buckets. But I'm not sure where to even start. Any help with this would GUARANTEE a Christmas or other holiday card this year! :)
    I really want to keep this simple as possible, so any help would be very....helpful. :)
    Oh, my general question is: Are there any resources/books for Oracle Reports (Report Builder)? I feel so lost trying to modify existing reports, let alone creating new ones.
    Thanks again!
    Steve

    Hi Steve,
    I am working on the 7-bucket aging report, and I want to add a new field in the data model.
    As the query is built dynamically, I have modified the function BUILD_CUSTOMER_SELECT to meet my requirements.
    But the problem is that the field is not present in my grouping in the data model, and if I try to add the field to the Data Model query (Q_Customer) section,
    I get the following error: ORA-01789: query block has incorrect number of result columns.
    The query is as shown below:
    select rpad('a',50,'-') short_cust_name,
    0 cust_id,
    rpad('a',30,'-') cust_no,
    rpad('a',500,'-') sort_field1,
    rpad('a',40,'-') sort_field2,
    0 payment_sched_id,
    rpad('a',32,'-') class,
    sysdate due_date,
    0 amt_due_remaining,
    0 days_past_due ,
    0 amount_adjusted,
    0 amount_applied,
    0 amount_credited,
    sysdate gl_date,
    'x' data_converted,
    0 ps_exchange_rate,
    0 b0,
    0 b1,
    0 b2,
    0 b3,
    0 b4,
    0 b5,
    0 b6,
    rpad('a',25,'-') bal_segment_value,
    rpad('a',500,'-') inv_tid,
    rpad('a',32,'-') invoice_type
    , 'y' parent_cust --I WANT A NEW FIELD HERE TO BE VIEWED ON THE REPORT LAYOUT LATER
    from dual
    where 1=2
    UNION ALL
    &common_query_cus
    Did I miss something for me to be able to add the field here?

  • General question about iTunes Match and multiple libraries

    Hello to everyone,
    I have a general question about the iTunes Match service, which has been available since yesterday in my country (Italy). Currently my library situation is the following:
    Computer A (desktop, Windows 7): "big" iTunes library (about 20 GB), at the moment not associated with my Apple ID
    Computer B (MacBook Air 2011): "small" iTunes library (about 5 GB), associated with my Apple ID
    At the moment, both my iOS devices (iPhone 4 and iPad 2) are synchronized with the smaller library on the MacBook Air.
    Question is as follows: should I subscribe to iTunes Match, would it be possible to upload the "big" library (provided I associate it with my Apple ID) to iCloud while keeping my devices synchronized with the "small" one?
    Ideally, at the end of the day, the situation should be the following: both iOS devices with music from the small library + possibility of downloading songs from iCloud (coming from the big one). Is this possible?
    Maybe the question sounds stupid, but I want to be sure about this before paying for the service.
    Thanks a lot.

    Yes, you could also associate your larger library with iTunes Match if you associated your Apple ID with it. However, any purchases in the library made from another Apple ID will not be matched with iTunes Match.
    If both libraries are part of iTunes Match, then all your devices will see all of the content from both libraries. Which content you choose to have on those devices and which you have accessible via iTunes Match is entirely up to you.

  • Workflow and General Use Questions

    Hello,
    I'll apologize right off the bat for these novice questions, because I'm sure the information is probably somewhere in the forum, I just haven't been able to find it. I just purchased Aperture after completing the demo, as my library is getting too large to manage using standard file folders. I'm now trying to figure out the best practices for workflow and general use before I invest some serious time into importing and keywording all my pictures.
    1) Store files in their current location, or in the Aperture Library? It seems to me that once they are moved to the Aperture library, you can only access them from within Aperture. I'm thinking I would be better off leaving them in their current location. For one, if I want to quickly grab a picture as an attachment to an email or something, it seems easier to grab it from the standard folders. Second (and more important), I do not have room to keep all my pictures on my MacBook, thus most of them are stored on the Time Capsule.
    So... Keeping photos in their current location appears to be the best choice for me even though it adds an additional step every time I bring in new photos from my camera. Does this sound right?
    2) Is there a way to mark the photos that I have uploaded to my website (Smugmug)? Ideally, I would like to badge photos that have already been uploaded so I can quickly recognize them and ensure I'm not duplicating. I've considered using the rating, or keywords to indicate that a photo has been uploaded but both methods have disadvantages.
    3) Any suggestions for general workflow and organization resources (tutorials, books, websites, etc.)? I've looked at the videos on Apple's site but they obviously didn't get that detailed.
    Thanks for the help, sorry for the length.

    I recommend managing by reference, with master image files stored on external hard drives (note that Aperture defaults to a Managed-Library configuration rather than a Referenced-Masters Library). This is especially important for iMacs and laptops with a single internal drive. The workflow described below, from an earlier post of mine, uses a Referenced-Masters Library.
    I feel pretty strongly that card-to-Aperture or camera-to-Aperture handling of original images puts originals at unnecessary risk. I suggest this workflow, first using the Finder (not Aperture) to copy images from CF card to computer hard drive:
    • Remove the memory card from the camera and insert it into a memory card reader. Faster readers and faster cards are preferable.
    • Finder-copy images from memory card to a labeled folder on the intended permanent Masters location hard drive.
    • Eject memory card.
    • Burn backup hard drive or DVD copies of the original images (an optional but strongly recommended backup step).
    • Eject backup hard drive(s) or DVDs.
    • From within Aperture, import images from the hard drive folder into Aperture selecting "Store files in their current location." This is called "referenced images." During import is the best time to also add keywords, but that is another discussion.
    • Review pix for completeness (e.g. a 500-pic shoot has 500 valid images showing in Aperture).
    • Reformat memory card in camera, and archive originals off site on hard drives and/or on DVDs.
    Note that the "eject" steps above are important in order to avoid mistakenly working on removable media/backups.
    Also note with a Referenced-Masters Library that use of the "Vault" backup routine backs up the Library only, not the Masters. Masters should be separately backed up, IMO a good thing from a workflow and data security standpoint.
    Max out RAM in your MB and keep the internal drive less than 70% full.
    Good luck!
    -Allen Wicks

  • Camileo charging problem (solved) and a general question!

    Hi all!
    First of all, I was going to ask for help as to why the Camileo S10 was not charging (the orange light wasn't flashing), and I'd seen a few people with similar problems.
    The solution?
    Give the contacts on the battery a clean.
    The insulation sticker that comes on it must leave some kind of residue on it, and it's enough to prevent charging. Now it's flashing away happily :]
    So my general question was, is it possible/advisable to use the camera on the mains?
    Rather than constantly draining and charging the battery during long shoots, I'd prefer to just leave it plugged in!
    Thanks very much in advance!
    Paul

    Hi
    I think the battery handling is always the same, no matter what product it is.
    From time to time the battery should be recalibrated.
    This means that the battery should be discharged fully, and then you should charge it again until it reaches 100%.
    I do this with all my batteries: mobile phone battery, digi cam battery, and notebook battery.

  • Update on Bioinformatics WIKI, scripting challenges, and a general question

    I am waiting for my site to go on-line at Oak Ridge National Labs (USA, Tennessee).  Should be another week or so, maybe less.
    When that happens, you will see a veritable explosion of scripting challenges in my wiki (Emerging Technologies->Bioinformatics.)
    One general question in preparation for these challenges.
    There are a number of standard  bioinformatic programs that can be run interactively via the web at various sites, e.g. "BLAST" and "STRIDE". 
    Although these can also be run locally, this requires that you download large databases and keep them updated.
    So here's my question to the scripting experts:
    Are scripting languages powerful enough to submit queries to web pages and then use regex's to parse the html that is returned?
    Bill Mann has used Perl to do some of the required regex parsing, but there is a lot left to do, and his stuff only works when a Perl program is invoking a bioinformatic program locally, not interactively.
    If so, we can all do some beautiful stuff together, if anyone is interested...

    ...my wiki on...
    There is, by its very nature, no such thing as MY WIKI, unless you run your own wiki project in an exclusive mode. Which would be... well... unusual.
    Are scripting languages powerful enough to submit queries to web pages and then use regex's to parse the html that is returned?
    Yes.
    anton
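    As a concrete illustration of that "Yes" (a sketch, not from the original thread, and in Python rather than the Perl mentioned above): the snippet below submits a query to a web form and pulls values out of the returned HTML with a regular expression. The URL, form fields, and pattern are placeholders, not a real BLAST or STRIDE interface.
    import re
    import urllib.parse
    import urllib.request

    def fetch_and_parse(url, params, pattern):
        # POST the query parameters to the given URL and return whatever the regex captures.
        data = urllib.parse.urlencode(params).encode("utf-8")
        with urllib.request.urlopen(url, data=data, timeout=60) as response:
            html = response.read().decode("utf-8", errors="replace")
        return re.findall(pattern, html)

    # Hypothetical usage - the endpoint and field names are assumptions for illustration only.
    hits = fetch_and_parse(
        "https://example.org/blast/submit",
        {"sequence": "MKTAYIAKQR", "program": "blastp"},
        r'<td class="hit">(.*?)</td>',
    )
    print(hits)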

  • General questions about using webservices and xml as opposed to an oracle driver

    Dear Experts; I have a general question which I have yet to test. Is it faster to use an ODBC driver to connect and retrieve data from an Oracle database than to create a .NET web service and use XML to get the data for your web application? Thank you

    At some point in the architecture stack some component will need to access the database in order to get the information from the database. That component will need to use an Oracle client driver.
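    As a rough illustration of that point (not from the original thread): whichever way the application is layered, some tier ends up opening a connection through an Oracle client driver. A minimal sketch using the python-oracledb driver, with placeholder credentials and DSN:
    import oracledb  # Oracle's Python driver; any client driver plays the same architectural role

    # Placeholder connection details - replace with real credentials and DSN.
    connection = oracledb.connect(user="scott", password="tiger", dsn="dbhost/orclpdb")
    try:
        cursor = connection.cursor()
        # The query itself is arbitrary; the point is that this tier talks to the database directly.
        cursor.execute("SELECT sysdate FROM dual")
        print(cursor.fetchone())
    finally:
        connection.close()
    A web service front end would wrap this same kind of call and add XML serialization plus an extra network hop on top of it.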

  • Book printing turnaround time(Canada) and general sharpening question

    Hi,
    I am nearly finished with a book in Aperture and have a couple of questions, one specific to Canada and another, more general question.
    For those in Canada (or anybody else with experience), how long does it take for the book to be printed and sent out to you? My book project is to be a Christmas gift, and I would like to order one copy to proof, but if the turnaround is really long, then I would just order all the copies I need and hope everything turns out satisfactorily.
    The second concerns sharpening. Since the images are resized for the individual use on a page, how do you optimize the sharpening? Is it better to oversharpen, or resize and sharpen each image prior to placing in the book?
    Thanks for any and all help. This is a first book so any other comments or suggestions would be appreciated!
    Scott

    Perhaps I can help with the first question. I tried the same approach and was quite happy with the first book quality; however, when I ordered several of the same book a few weeks later, I was very unhappy with the color casts and how they differed from the original book I had printed. The moral of the story is that the print quality is highly variable between printings.
    Aves

  • SAP CM25 which profile to use and why and some general questions

    Hi there,
    I have a few general questions, if you may share your thoughts on this.
    In a manufacturing environment, is CM25 more used to finite schedule the work centers or labor or both ?
    Is it a good idea to finite schedule even the planned orders, or just production orders? We want to create production orders 1 week ahead of the production schedule, and planned orders get created 3-4 weeks before the production orders get created.
    The graphical tool seems to be slow sometimes when dispatching and doing some actions on the screen; does anyone have any thoughts on this? Do we need better computers on the floor for this to work?
    I have been trying my solutions using CM25 and not CM21 or CM27 using the profile SAPSFCG005, is there any other profile that I need to consider and other transactions to consider like CM21 or CM27 for certain purposes?
    Thank you

    Hello
    In a manufacturing environment, is CM25 more used to finite schedule the work centers or labor or both ?
    It can be used for both, depending upon the bottleneck resource, whether machine or labour.
    Is it a good idea to finite schedule even the planned orders, or just production orders? We want to create production orders 1 week ahead of the production schedule, and planned orders get created 3-4 weeks before the production orders get created.
    It is better to use finite planning for planned orders as well; you can also use MRP with lead time scheduling.
    The graphical tool seems to be slow sometimes when dispatching and doing some actions on the screen, has any one have any say on this. Should we need to have better computers on the floor for this to work ?
    It depends upon the data load and selection:
    Restrict the number of objects to be displayed, for example, select only a small number of work-centers/orders.
    Customize the time profile (transaction OPD2) such that the database read period, the planning period and the evaluation period are as small as possible. As a result, fewer objects are read and displayed.
    Refer to KBA 2038780 - Tips for performance improvement on Capacity Leveling.
    If you want to process more orders, check whether a batch planning (planning in the background on transaction CM40) is possible.
    I have been trying my solutions using CM25 and not CM21 or CM27 using the profile SAPSFCG005, is there any other profile that I need to consider and other transactions to consider like CM21 or CM27 for certain purposes?
    It depends upon your requirements which transaction to use and which profile. You can customize your own profile also.
    Best Regards,
    R.Brahmankar

  • Njawin - jcomgen and some general questions

    Hi.
    I'm currently evaluating njawin, and some questions came up:
    Will there be a command line interface for jcomgen?
    The interactive GUI isn't really suitable for an automated build process.
    njawin compared to jawin:
    njawin seems to be more advanced, but there's no source available.
    Does anybody know if this will change in the future?
    regards,
    roman

    Carl,
    The njawin distribution available for "evaluation" download around the web comes with no license, as it is a version of OLE for Java (OLEJA), a commercial product, which was developed before OLEJA went retail.
    One thing you'll notice is that OLEJA and Njawin have remarkably similar APIs. If your application's functionality does not have very complex requirements, you can simply replace all references to "oleja" with "njawin", and all references to "compose" with "develop". This holds true for the generated .java files, the .xml file, etc. I am not sure if OLEJA introduces or uses more APIs than the ones that appeared in Njawin, but for my application, this approach was successful.
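    For what it's worth, the bulk replacement described above can be scripted rather than done by hand. A rough sketch (Python, plain text search-and-replace over the generated .java and .xml files; the folder name is a placeholder):
    from pathlib import Path

    # Substitutions as described in the post above.
    REPLACEMENTS = {"oleja": "njawin", "compose": "develop"}
    SOURCE_DIR = Path("generated")  # hypothetical folder holding the generated files

    for path in SOURCE_DIR.rglob("*"):
        if path.suffix in (".java", ".xml"):
            text = path.read_text(encoding="utf-8")
            for old, new in REPLACEMENTS.items():
                text = text.replace(old, new)
            path.write_text(text, encoding="utf-8")
            print(f"updated {path}")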
    One other thing to keep in mind is that I don't see why you'd actually need to build your sources into a jar, if you simply include them in a larger package as part of your application's source code. Just put njawin.jar, njawin.dll, and yourdll.dll on the build path and the rest of your problems should be solved for you. Keep in mind that, since OLEJA is commercialized, it's been debugged a lot (as the author has stated in mailing lists) and that means Njawin could have some unexpected problems. In this case you should thoroughly test your usage of Njawin with your application to be sure it runs well in your environment. Errors that could potentially crash your JVM may appear when certain functionality (especially multi-threaded functionality) inside your ActiveX control is executed.
    Regards
    Sean McNamara

  • Lead Page Layout General Question - User Interface - Fields and Color

    A general question about whether this is possible or not: can we make a field appear in a color similar to the red required fields?

    Changing field or text colours is not supported by OnDemand at this time.

  • Intercompany Transaction Module - General Questions

    I do not have any real experience in using the Intercompany Transaction Module and have some general questions relating to the applicability of deploying this module at my site...
    The underlying GL systems capture Intercompany balances - but not the ICP entity detail. Therefore for each ICP Account the Trial Balance load into HFM (via FDM) will load the ICP Account balances to the [ICP None] member. We were originally going to provide a form through which these balances could be cleared from [ICP None] to the valid ICP members.
    However, a need to load multi-currency transactions has emerged putting the Intercompany Transaction Module under consideration.
    I am not sure how the use of this module would work alongside the balances already loaded from the GL Trial Balance. Does this module merely allocate the existing balances, or would we need to write a rule to reverse out of [ICP None] the values loaded via transactions to ICP entities?
    Any advice will be appreciated

    By default, [ICP None] and [ICP Entities] will aggregate to [ICP Top], but there is one application setting: ICPEntitiesAggregationWeight.
    This setting specifies the percentage of intercompany partner entity [ICP Entities] amounts that aggregate to the [ICP Top] member of the Value dimension. By default the value is 1; you can set it to 0 and use a validation to make sure [ICP Entities] = [ICP None], so that [ICP Top] will equal only [ICP None] and will not double count.

  • LR 4.4 (and 5.0?) catalog: a problem and some questions

    Introductory Remark
    After several years of reluctance this March I changed to LR due to its retouching capabilities. Unfortunately – beyond enjoying some really nice features of LR – I keep struggling with several problems, many of which have been covered in this forum. In this thread I describe a problem with a particular LR 4.4 catalog and put some general questions.
    A few days ago I upgraded to 5.0. Unfortunately it turned out to be even slower than 4.4 (discussed, among other places, here: http://forums.adobe.com/message/5454410#5454410), so I fell back to the latter instead of testing the behavior of the 5.0 catalog. Anyway, as far as I understand, this upgrade does not include significant new catalog functions, so my problem and questions below may be valid for 5.0, too. Nevertheless, the incompatibility of the new and previous catalogs suggests a rewriting of the catalog-related parts of the code. I do not know the resulting potential improvements and/or new bugs in 5.0.
    For your information, my PC (running under Windows 7) has a 64-bit Intel Core i7-3770K processor, 16GB RAM, 240 GB SSD, as well as fast and large-capacity HDDs. My monitor has a resolution of 1920x1200.
    1. Problem with the catalog
    To tell you the truth, I do not understand the potential necessity for using the “File / Optimize Catalog” function. In my view LR should keep the catalog optimized without manual intervention.
    Nevertheless, when being faced with the ill-famed slowness of LR, I run this module. In addition, I always switch on the “Catalog Settings / General / Back up catalog” function. The actually set frequency of backing up depends on the circumstances – e.g. the number of RAW (in my case: NEF) files, the size of the catalog file (*.lrcat), and the space available on my SSD. In case of need I delete the oldest backup file to make space for the new one.
    Recently I processed 1500 photos, occupying 21 GB. The "Catalog Settings / Metadata / Automatically write changes into XMP" function was switched on. Unfortunately I had to fiddle with the images quite a lot, so after processing roughly half of them the catalog file reached the size of 24 GB. Until this stage there had been no sign of any failure – catalog optimizations had run smoothly and backups had been created regularly, as scheduled.
    Once, however, towards the end of generating the next backup, LR sent an error message saying that it had not been able to create the backup file, due to lack of enough space on the SSD. I myself found still 40 GB of empty space, so I re-launched the backup process. The result was the same, but this time I saw a mysterious new (journal?) file with a size of 40 GB… When my third attempt also failed, I had to decide what to do.
    Since I needed at least the XMP files with the results of my retouching operations, I simply wanted to save these side-cars into the directory of my original input NEF files on a HDD. Before making this step, I intended to check whether all modifications and adjustments had been stored in the XMP files.
    Unfortunately I was not aware of the realistic size of side-cars, associated with a certain volume of usage of the Spot Removal, Grad Filter, and Adjustment Brush functions. But as the time of the last modification of the XMP files (belonging to the recently retouched pictures) seemed perfect, I believed that all my actions had been saved. Although the "Automatically write changes into XMP" seemed to be working, in order to be on the safe side I selected all photos and ran the “Metadata / Save Metadata to File” function of the Library module. After this I copied the XMP files, deleted the corrupted catalog, created a new catalog, and imported the same NEF files together with the side-cars.
    When checking the photos, I was shocked: Only the first few hundred XMP files retained all my modifications. Roughly 3 weeks of work was completely lost… From that time on I regularly check the XMP files.
    Question 1: Have you collected any similar experience?
    2. The catalog-related part of my workflow
    Unless I am missing an important piece of knowledge, LR catalogs store a lot of data that I do not need in the long run. Having the history of recent retouching activities is useful for me only for a short while, so archiving every little step for a long time, with a huge amount of accumulated data, would be impossible (and useless) on my SSD. In terms of processing, what counts for me are the resulting XMP files, so in the long run I keep only them and get rid of the catalog.
    Out of the 240 GB of my SSD 110 GB is available for LR. Whenever I have new photos to retouch, I make the following steps:
    create a ‘temporary’ catalog on my SSD
    import the new pictures from my HDD into this temporary catalog
    select all imported pictures in the temporary catalog
    use the “File / Export as Catalog” function in order to copy the original NEF files onto the SSD and make them used by the ‘real’ (not temporary) new catalog
    use the “File / Open Catalog” function to re-launch LR with the new catalog
    switch on the "Automatically write changes into XMP" function of the new catalog
    delete the ‘temporary’ catalog to save space on the SSD
    retouch the pictures (while keeping an eye on the due creation and updating of the XMP files)
    generate the required output (TIF or JPG) files
    copy the XMP and the output files into the original directory of the input NEF files on the HDD
    copy the whole catalog for interim archiving onto the HDD
    delete the catalog from the SSD
    upon making sure that the XMP files are all fine, delete the archived catalog from the HDD, too
    Question 2: If we put aside the issue of keeping the catalog for purposes other than saving each and every retouching step (which I address below), is there any simpler workflow to produce only the XMP files and save space on the SSD? For example, is it possible to create a new catalog on the SSD, copying the input NEF files into its directory and re-launching LR 'automatically', in one step?
    Question 3: If this is not the case, is there any third-party application that would ease the execution of the relevant parts of this workflow before and/or after the actual retouching of the pictures? (A rough sketch of such a helper follows after Question 4.)
    Question 4: Is it possible to set general parameters for new catalogs? In my experience most settings of new catalogs (at least the ones that are important for me) are copied from the most recently used catalog, except the use of the "Catalog Settings / Metadata / Automatically write changes into XMP" function. This means that I always have to go there to switch it on… LR does not even ask whether I want to change anything compared with the settings of the most recently used catalog…
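    Regarding Question 3, a small script can at least automate the copy-back step of the workflow above. A rough sketch (not from the original thread; Python, with placeholder paths) that copies the XMP side-cars and the TIF/JPG output from the SSD working folder back to the HDD folder holding the original NEF files:
    import shutil
    from pathlib import Path

    SSD_WORK_DIR = Path("D:/LR_work/current_shoot")    # assumed working folder on the SSD
    HDD_ORIGINALS = Path("E:/Photos/2013/shoot_001")   # assumed folder with the original NEF files

    def copy_back(extensions=(".xmp", ".tif", ".jpg")):
        # Copy side-cars and finished output next to the original NEF files on the HDD.
        for src in SSD_WORK_DIR.rglob("*"):
            if src.suffix.lower() in extensions:
                shutil.copy2(src, HDD_ORIGINALS / src.name)  # copy2 preserves timestamps
                print(f"copied {src.name}")

    if __name__ == "__main__":
        copy_back()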
    3. Catalog functions missing from my workflow
    Unfortunately the above described abandoning of catalogs has at least two serious drawbacks:
    I miss the classification features (rating, keywords, collections, etc.). Anyway, these functions would be really meaningful for me only if they covered all my existing photos, which would require going back to 41k images to classify them. In addition, keeping all the pictures in one catalog would result in an extremely large catalog file, almost surely guaranteeing regular failures. Beyond that, due to the speed problem, tolerable conditions could be established only by keeping the original NEF files on the SSD, which is out of the question. Generating several 'partial' catalogs could somewhat circumvent this trap, but it would require presorting the photos (e.g. by capture time or subject), and by doing this I would lose the essence of having a single catalog covering all my photos.
    Question 5: Is it the right assumption that storing only some parts (e.g. the classification-related data) of catalog files is impossible? My understanding is that either I keep the whole catalog file (with the outdated historical data of all my ‘ancient’ actions) or abandon it.
    Question 6: If such 'cherry-picking' is facilitated after all: Can you suggest any pragmatic description of the potential (competing) ways of categorizing images efficiently, comparing their pros and cons?
    I also lose the virtual copies. Anyway, I am confused regarding the actual storage of the retouching-related data of virtual copies. On some websites one can find relatively old posts stating that the XMP file contains all information about modifying/adjusting both the original photo and its virtual copy/copies. However, when fiddling with a virtual copy I cannot see any change in the size of the associated XMP file. In addition, when I copy the original NEF file and its XMP file, rename them, and import these derivative files, only the retouched original image comes up - I cannot see any virtual copy. This suggests that the XMP file does not contain information on the virtual copy/copies…
    For this reason, whenever multiple versions seem reasonable, I create renamed version(s) of the same NEF+XMP files, import them, and make some changes in their settings. I know, this is far from a sophisticated solution…
    Question 7: Where and how are the settings of virtual copies stored?
    Question 8: Is it possible to generate separate XMP files for both the originally retouched image and its virtual copy/copies and to make them recognized by LR when importing them into a new catalog?

    A part of my problems may be caused by selecting LR for a challenging private project, where image retouching activities result in bigger than average volume of adjustment data. Consequently, the catalog file becomes huge and vulnerable.
    While I understand that something has gone wrong for you, causing Lightroom to be slow and unstable, I think you are combining many unrelated ideas into a single concept and winding up with a mistaken idea. Just because your project is challenging does not mean Lightroom is unsuitable. A bigger than average volume of adjustment data will make the catalog larger (I don't know about "huge"), but I doubt that bigger by itself will make the catalog "vulnerable".
    The causes of instability and crashes may have NOTHING to do with catalog size. Of course, the cause MAY have everything to do with catalog size. I just don't think you are coming to the right conclusion, as in my experience size of catalog and stability issues are unrelated.
    2. I may be wrong, but in my experience the size of the RAW file may significantly blow up the amount of retouching-related data.
    Your experience is your experience, and my experience is different. I want to state clearly that you can have pretty big RAW files that have different content and not require significant amounts of retouching. It's not the size of the RAW that determines the amount of touchup, it is the content and the eye of the user. Furthermore, item 2 was related to image size, and now you have changed the meaning of number 2 from image size to the amount of retouching required. So, what is your point? Lots of retouching blows up the amount of retouching data that needs to be stored? Yeah, I agree.
    When creating the catalog for the 1500 NEF files (21 GB), the starting size of the catalog file was around 1 GB. This must have included all classification-related information (the meaningful part of which was practically nothing, since I had not used rating, classification, or collections). By the time of the crash half of the files had been processed, so the actual retouching-related data (that should have been converted properly into the XMP files) might be only around 500 MB. Consequently, probably 22.5 GB out of the 24 GB of the catalog file contained historical information
    I don't know exactly what you do to touch up your photos, and I can't imagine how you came up with the figure of around 500 MB. But again, to you this problem is entirely caused by the size of the catalog, and I don't think it is. Now, having said that, some of your problem with slowness may indeed be related to the amount of touch-up that you are doing. Lightroom is known to slow down if you do lots of spot removal and lots of brushing, and then you may be better off doing this type of touch-up in Photoshop. Again, just to be 100% clear, the problem is not "size of catalog"; the problem is that you are doing so many adjustments on a single photo. You could have a catalog that is just as large (i.e. that has lots more photos with few adjustments), and I would expect it to run a lot faster than what you are experiencing.
    So to sum up, you seem to be implying that slowness and catalog instability are the same issue, and I don't buy it. You seem to be implying that slowness and instability are both caused by the size of the catalog, and I don't buy that either.
    Re-reading your original post, you are putting the backups on the SSD, the same disk as the working catalog? This is a very poor practice, you need to put your backups on a different physical disk. That alone might help your space issues on the SSD.
