Batch import fails with more than about 250 photos

The batch-import problem (more than about 250 photos) persists with Keynote '09.
I am disappointed, because I know of no alternative to Keynote for making superior slideshows (text, moving graphics, opacity control, etc.).
I have to import 550 slides into Keynote '09. The only way is to move separate batches of 200 slides into the program. Other splits (e.g. 250 + 250 + 50) fail. It does not matter where I import from, iPhoto or Expression Media; importing from the Finder is unusable because it loses the sort order.
I hoped so much ...
Can anyone (besides Apple) help?
Thank You!
Roland Muehlberger, Linz, Austria

This may not be the solution you're looking for but have you tried using iMovie to create your slideshow?
I can tell you that it's super simple using iMovie '08. Nice transitions, nice titling, Ken Burns, etc. Very nice. The only thing you mention that I don't think iMovie can handle is the moving graphics (at least not by just inserting them as in Keynote).

Similar Messages

  • Slow Trash with more than 35,000 photos, is it correct?

    My Trash empties very slowly; I sent more than 35,000 photos to it. Is that normal?

    Relaunch the Finder, then from the Finder menu bar, select
              Finder ▹ Preferences... ▹ Advanced
    and uncheck the box marked
              Empty Trash securely
    Try again to empty the Trash.

  • Broadcasting fails with more than X variables in variable screen

    Hi all,
    We have a few reports which have a variable screen with 10+ variables, typically 3 mandatory and the rest optional.
    Normally a pop-up appears in which you can select the output type (HTML, PDF, Portal, etc.) and fill in the recipients. But for the reports I'm talking about, the pop-up stays blank or gives a "page not found" error.
    First I thought it was an iView setting but I've compared all iviews and the settings are all the same.
    Then I thought the webtemplates could have wrong parameters, but the LAUNCH_BROADCASTER settings are the same for 0ANALYSIS_PATTERN and custom templates.
    Now I'm testing at query level, and I found out that when I cut some optional variables, at some point it works and I get the pop-up I want.
    So I wonder if there is a restriction on the maximum number of variables in the variable screen that has an impact on broadcasting. In one query it fails with 11 or more variables, but in another it fails with 13 variables. It doesn't matter which variable I remove; as long as I stay under the mentioned number of variables, the broadcaster works for that query.
    Is there a parameter or something which can be set?
    We are on:
    BW 7.31 SP9.30
    Thanks for your help!
    Regards,
    Daan Boon

    There is no restriction on the number of query variables when broadcasting. Something else is causing this strange behavior. Can you update your BEx GUI to the latest patches?

  • Query fails with more than one UNION

    Hi,
    My query stops producing results (with the same input) when I try to UNION more than two tables.
    Moreover, it seems to have a "bad memory". See the following snippet:
    -- old events
    SELECT creationTime, eventId
    FROM errorEvents
    UNION
    SELECT creationTime, eventId
    FROM successEvents
    -- new events
    UNION
    SELECT creationTime, eventId
    FROM interactiveSuccessEvents
    UNION
    SELECT creationTime, eventId
    FROM interactiveErrorEvents 
    1. Starting with the above, but considering only the lower SELECTs (a single UNION), produces good results.
    2. Un-commenting it to consider all 4 tables (3 UNIONs) produces no results at all, with the same input data.
    3. Commenting it back again still does not produce results.
    4. I get the expected results only after removing major parts of the query (and then bringing them back).
    Is more than one UNION not supported?
    Thanks,
    Yoram

    The inputs are from different event hubs.
    I have checked the logs and no errors are reported. The logs also report inputs, but no outputs.
    I haven't verified it, but it's possible that I had some outputs after a huge delay - I've noticed some output lines that I suspect are from the same "4-union query". If these lines are indeed from the same query, the delay was a few hours.
    I've never had this with the 2-union version - but that may be a coincidence...
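    A debugging sketch, assuming this is an Azure Stream Analytics job (the event-hub inputs suggest it) and that its WITH step syntax is available: splitting the unions into named steps lets each stage be tested on its own to see which one stops producing output. The step names OldEvents and NewEvents are made up; the inputs and columns are the ones from the query above.
    WITH OldEvents AS (
        SELECT creationTime, eventId FROM errorEvents
        UNION
        SELECT creationTime, eventId FROM successEvents
    ),
    NewEvents AS (
        SELECT creationTime, eventId FROM interactiveSuccessEvents
        UNION
        SELECT creationTime, eventId FROM interactiveErrorEvents
    )
    SELECT creationTime, eventId FROM OldEvents
    UNION
    SELECT creationTime, eventId FROM NewEvents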

  • Importing CDs with more than one disc

    I have used iTunes for 6 years yet it still continues to perplex me.
    When I import a CD with 2 (or more) discs I add "[Disc 1]" and "[Disc 2]" to the Album Name, e.g. Band On the Run [Disc 1], Band On The Run [Disc 2]. For 6 years when I did this, iTunes created separate folders in iTunes Music (now iTunes Media). So if I looked, I would see within the Paul McCartney & Wings folder a folder for Band On The Run [Disc 1] and another folder for Band On The Run [Disc 2].
    That is until recently. Now it will create just one folder and many times the name is not even complete. Within that one folder will be the songs from both discs. I've tried changing the name to include [Disc 1] before I import. I've tried changing the name after I import and it doesn't matter.
    If I manually change the folder names within iTunes Media then the songs can't be located when I try to play them and locating them just restores the original folder name.
    I wonder if this has something to do with Win 7 and libraries vs. actual files...I'm new to Win 7.
    What could have changed to now cause this?

    It's nothing to do with Windows 7. iTunes has its own ideas about naming files and folders from the information in the tags.
    If you have "Keep iTunes organised" checked in Preferences > Advanced, iTunes will make these changes; if you want to change the naming structure, you need to uncheck that box. As you observe, iTunes will lose track of the file if you change the name.
    As far as I can see, the current track-naming arrangement is disc number-track number-track name.
    iTunes isn't a very good choice of player if you care a lot about how things are named and arranged. With iTunes it's best to let it do its thing.

  • PROBLEM - I need to import an Attr with more than 20 characters

    Hi,
    I need to import some text information.
    My first idea was to import them via an Attribute (Attr) field. Unfortunately they are cut off after 20 characters.
    My next workaround was to import into an unused dimension (UD8 - UD15); the problem here is that there is no matching dimension in the target system. FAILED
    Also, trying to import into a dimension set to USE AS LOOKUP (via an integration script) - which does not need a matching dimension in the target system - FAILED
    Are there any workarounds for writing an integration script that imports the text information from a CSV file into, e.g., unused dimension columns?
    Other workarounds?
    HELP - I'm lost :-)
    regards
    Hau

    Tony,
    added additional dimensions (enabled, not "use as lookup") -> done
    unchecking the use list box -> done
    What should the load rule look like to pass them successfully into Essbase?
    Up to now:
    import -> works
    validate (like *) -> works
    export -> :-( doesn't work
    I'm lost again...

  • Word Mail Merge does not accurately import a text field from Excel with more than 15 digits

    Hi, I've looked through some of the discussions regarding importing numbers from Excel into a Word mail merge, and I'm having a problem. In Excel I have a column that includes numbers with more than 15 digits. I have formatted this column as text, so in Excel those long numbers now show up correctly. However, when doing a mail merge in Word, the digits past the 14th again change to zeros. I've read many help articles about this but am still not finding a solution. I even tried going the DDE route and that didn't do it either. I checked out this answer: http://www.techsupportforum.com/forums/f57/mail-merge-data-corruption-429351.html which was the most helpful, but again, DDE seemed to work for this person but not for me. I hope someone can offer a solution.
    I was hoping that I could do a picture switch, but that does not seem to be an option for this particular problem. I don't know why importing it in DDE format did not solve the problem. Thanks for any help!

    Wow! After all my searching, and just after posting this, I figured out the solution! I originally had { MERGEFIELD Field_Name \# # }. Then I removed everything after the field name, as is normal for any other text field, so the field no longer indicates it is a number, and now it shows up correctly even if the number (from a text-formatted field in Excel) is longer than 15 digits. Hope this helps anyone else who has this problem. I did not use DDE to solve it.
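    In field-code terms, the fix was simply to drop the numeric picture switch so Word treats the merged value as plain text (field name as in the post above):
    Before:  { MERGEFIELD Field_Name \# # }
    After:   { MERGEFIELD Field_Name }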

  • When I import a cd with more than one artist iTunes separates each individual artist - how do I get it to show the album as a whole rather than individual artist?

    When I import a CD with more than one artist, iTunes shows each artist's songs separately and won't group the album as a whole. This is incredibly frustrating, and it's something Windows Media Player doesn't do. Does anyone have a simple cure for this annoyance?

    Generally all you need to do is fill in an appropriate Album Artist. For more details see my article on Grouping Tracks Into Albums, in particular the topic One album, too many covers.
    tt2

  • Import AP Invoices with more than one Prepayment Application

    Hi,
    We have a requirement to import AP Invoices from the legacy system with more than one prepayment application to the invoice.
    The AP_INVOICES_INTERFACE table provides only one column for the Prepayment Number. Also, more than one line cannot be created for the same invoice number.
    Is there any solution to the above situation?
    Gajendra

    Dear Srikanth,
    Thanks a lot for your reply.
    We want to add an invoice with the DI API, but the DI API provides only one costing center field (CostingCode) in the Document_Lines object. We have two costing centers: one is the department, the other is the salesperson.
    This is my question.
    It would be highly appreciated if you could give me a solution.
    Thanks!
    Xingjun Han

  • Updating JournalEntries object with more than one currency fails

    I'm using SBO 6.50 (exactly CEE version 7.60.014 SP:01 EF:08).
    I'm trying to update the memo field of existing journal entries. Most of them work OK, but if I try to update an entry with more than one currency, the update fails with error code -5002 and the message:
       Transaction includes more than one currency [OCRN].
    Code (in C++) is quite simple:
    SAPbobsCOM::ICompanyPtr m_company = SAPbobsCOM::ICompanyPtr("SAPbobsCOM.Company");   // connection setup to the company is omitted in this snippet
    SAPbobsCOM::IJournalEntriesPtr pEntry = m_company->GetBusinessObject(SAPbobsCOM::oJournalEntries);
    long m_nId = 1;     // TransId field, any valid value
    try
    {
         if (pEntry->GetByKey(m_nId) != -1)   // GetByKey returns VARIANT_TRUE (-1) when the entry is found; bail out otherwise
              return;
         pEntry->Memo = _bstr_t("Memo");
         if (pEntry->Update() < S_OK)         // Update() returns a negative error code on failure
              throw(0);
    }
    catch (...)
    {
         long lError; BSTR bstrError;
         m_company->GetLastError(&lError, &bstrError);
    }
    What's wrong? Using multiple currencies in a journal entry is valid; it is possible to add them manually, so it should be possible to update a "safe" field such as Memo.

    Hi,
    I have the same problem. If you go to Administration -> System Initialization -> Document Settings, select the "Per Document" tab and choose "Journal Entry" as the document, you will see that by default the options are set to "Block"; this is why you get this message in the DI.
    Try unchecking them and it will work.
    HTH,
    Ribeiro Santos

  • I am upgrading to Calendar Server 4.0 on NT with more than 9 nodes (or I already have) and my server won't start

    The steps are the same whether you have already upgraded or are about to.
    Calendar Server 4.0 on the Windows NT platform can only support up to nine nodes on one server, while 3.51 supported up to 14. If you have a Calendar Server 3.51 with more than 9 nodes that you want to upgrade to Calendar Server 4.0, Netscape recommends that you migrate the extra nodes to another Calendar Server 3.51 on another Windows NT system. To accomplish this:
    1. Install Calendar Server on another Windows NT machine and configure it to use the same directory server as your current Calendar Server.
    2. Stop and back up your current Calendar Server.
    3. Individually zip up the node directories you want to move to the new server (drive:\users\unison\db\Nx, where "x" is the number of the node you want to move).
    4. Stop the new Calendar Server.
    5. Unzip the files into the new Calendar Server in the same place as they were on the old server.
    6. Edit the unison.ini file on the new server to add the nodes you have just migrated.
    7. Edit the unison.ini file on the old server to remove the nodes you have moved to the new server.
    8. Edit the nodes.ini file on the old server to remove the nodes you just moved, then add them back with the new hostname. Keep in mind that you will only be able to modify the node network from the old host.
    9. Run unidbfix -export on both servers for all nodes.
    10. Edit the remotenodes.ini file on both servers to reflect your node topology.
    11. Run unidbfix -import to import the changes into the node databases.
    12. Run unidbfix -c, then -f, then -c again on both servers for all the nodes.
    13. If you don't get any errors, run uninode -test all to test whether your node connections are set up properly. If they are not, do not start either of the servers; instead, fix the errors and try again.
    14. Start both servers.
    15. Log in and check whether you can see people on the remote nodes.
    16. Notify the users on the moved nodes of their new calendar host.
    17. If you did this in preparation for an upgrade, you can now run the upgrade to 4.0, since both servers contain fewer than 9 nodes.

    Use Disk Utility, which is in the Utilities folder.
    Select your boot disk on the left.
    Select Verify Permissions.
    If there are any errors, then do Repair Permissions.
    You might have to repeat the process.
    see this:
    Steve

  • Excel 2007 to SQL Server table. Column with more than 255 characters.

    Hi there,
    I am facing a problem while loading data from Excel 2007 into a SQL Server 2005 table. I am using BIDS 2005. I have an Excel file where one particular column has more than 255 characters. I use an OLE DB connection for the Excel file, as there is no driver for Excel 2007 in BIDS 2005; specifically, I am using the Microsoft Office 12.0 Access Database Engine OLE DB Provider.
    Next, I changed the advanced properties for the column to DT_NTEXT, but I am getting errors on execution. They are:
    [OLE DB Source [1949]] Error: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80040E21.
    [OLE DB Source [1949]] Error: Failed to retrieve long data for column "action".
    [OLE DB Source [1949]] Error: There was an error with output column "action" (2046) on output "OLE DB Source Output" (1959). The column status returned was: "DBSTATUS_UNAVAILABLE".
    [OLE DB Source [1949]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "output column "action" (2046)" failed because error code 0xC0209071 occurred, and the error row disposition on "output column "action"
    (2046)" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.
    Please advise on how I can deal with columns having more than 255 characters in an Excel file.
    Thanks!

    Here is what your connection string should look like for an Excel source:
    Provider=Microsoft.Jet.OLEDB.4.0;Data Source=c:\temp\test.xls;Extended Properties="EXCEL 8.0;HDR=YES";
    http://sqlworkday.blogspot.com/
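    A hedged aside: the string above targets the Jet provider and the legacy .xls format ("EXCEL 8.0"). For an Excel 2007 workbook read through the Access Database Engine (ACE) provider mentioned in the question, the equivalent string would look more like the sketch below; the file path is a placeholder, not from the original post.
    Provider=Microsoft.ACE.OLEDB.12.0;Data Source=c:\temp\test.xlsx;Extended Properties="Excel 12.0 Xml;HDR=YES;IMEX=1";
    IMEX=1 is commonly suggested so that mixed-type columns are read as text; whether the 255-character truncation goes away also depends on the provider's row-sampling (TypeGuessRows) behavior, so treat this as a starting point rather than a guaranteed fix.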

  • WCF Data Service fails when more than 8 properties are in the '$select=' portion

    Hi:
    I am using WCF Data Services with Oracle.
    The EF provider is ODAC 11.2 Release 4.
    The WCF Data Service fails when more than 8 properties are specified in the '$select=' portion of the URI.
    Here is my code:
    var q = from c in this.ctx.SALESORDER_ITEM
            select new
            {
                c.SORDERDETAILID,
                c.IID, c.DMFLAG, c.OWNERID, c.SKUID, c.SKU_ID, c.TRADENO, c.SOURCEID, c.SORDERID   // 9 properties
            };
    The exception:
    InvalidOperationException: An error occurred for this query during batch execution. See the inner exception for details.
    The inner exception is null, but the DataServiceClientException states: "Value cannot be null. Parameter name: value".
    The exception is thrown in the overridden base.OnStartProcessingRequest(args) method.
    Here is the call stack as well:
    at System.Data.Services.WebUtil.CheckArgumentNull[T](T value, String parameterName)
    at System.Data.Services.Internal.ProjectedWrapper.set_PropertyNameList(String value)
    at lambda_method(Closure , Shaper )
    at System.Data.Common.Internal.Materialization.Coordinator`1.ReadNextElement(Shaper shaper)
    at System.Data.Common.Internal.Materialization.Shaper`1.SimpleEnumerator.MoveNext()
    at System.Data.Services.Internal.ProjectedWrapper.EnumeratorWrapper.MoveNext()
    at System.Data.Services.DataService`1.SerializeResponseBody(RequestDescription description, IDataService dataService)
    at System.Data.Services.DataService`1.HandleNonBatchRequest(RequestDescription description)
    at System.Data.Services.DataService`1.HandleRequest()
    Is there a maximum number of properties allowed in a $select statement?
    I think it may be the Oracle provider's problem, but I don't know how to debug it. Can anyone help me?
    Any help is greatly appreciated.

    I believe the null/empty string issue is unrelated to the 8 column issue, at least for ODP.NET. For example, let's take the original query in the OBE:
    http://.../yoursvcfile.svc/EMPLOYEES?$select=EMPLOYEE_ID,FIRST_NAME,LAST_NAME,SALARY,DEPARTMENT_ID,DEPARTMENT,EMAIL,PHONE_NUMBER,MANAGER_ID
    Let's make all the columns selected not nullable. You can do this with the Oracle Dev Tools. Specifically, PHONE_NUMBER and FIRST_NAME are the only nullable fields. I make them non-nullable and re-run the query and the same error occurs. Thus, these values should never be made null. Moreover, in all 107 rows, none of these row values consist of empty strings anyway.
    Looking into the problem further, WCF DS is calling methods in the System.Data.Services.Internal namespace.
    http://msdn.microsoft.com/en-us/library/system.data.services.internal.aspx
    Specifically, we see your issue when ProjectedWrapperMany is used. You will notice that ProjectedWrapper0, ProjectedWrapper1, ..., ProjectedWrapper8 are also present in the same namespace. As soon as the number of columns exceeds 8, ProjectedWrapperMany is used and we see the error. We're going to ask MS to help analyze the issue, since this is a .NET-internal type being called.
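    As a stopgap grounded in the behavior described above (the ProjectedWrapper0 through ProjectedWrapper8 path works, while ProjectedWrapperMany fails), keeping the projection at 8 or fewer properties should avoid the error until the underlying issue is resolved. Assuming the entity set is exposed as SALESORDER_ITEM, the request URI would look something like:
    http://.../yoursvcfile.svc/SALESORDER_ITEM?$select=SORDERDETAILID,IID,DMFLAG,OWNERID,SKUID,SKU_ID,TRADENO,SOURCEID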

  • How can we create a table with more than 64 fields in the default DB?

    Dear sirs,
    I am taking part in migrating a J2EE application from JBoss to the SAP server. I have imported the EJB project.
    I have an entity bean with 79 CMP fields. I have created the bean and also created the table for it, but when I try to build the dictionary I get the error message given below:
    "Dictionary Generation: DB2:checkNumberOfColumns (primary key of table IMP_MANDANT): number of columns (79) greater than allowed maximum (64) IMP_MANDANT.dtdbtable MyAtlasDictionary/src/packages"
    Does this mean that we cannot create tables with more than 64 fields?
    How can I create tables with more than 64 fields?
    Kindly help,
    Thankyou,
    Sudheesh...

    Hi,
      I found a link on the help site which says the limit is 1024 columns (1023 without the key).
    http://help.sap.com/saphelp_nw04s/helpdata/en/f6/069940ccd42a54e10000000a1550b0/content.htm
      Not sure about any limit of 64 columns.
    Regards,
    S.Divakar

  • Owa_text.vc_arr: can't it handle strings with more than 4000 characters?

    The Oracle Web Application Server 4.0 documentation says about owa_text.vc_arr: Type vc_arr is table of varchar2(32767) index by binary_integer.
    I am using PL/SQL with Oracle8i and the OWA 4.0 web server. I want to use owa_text.vc_arr to pass the multi-line text from my form.
    If the text length is less than 4000 characters, everything works fine. However, when the text is longer than 4000 characters but less than the maximum length of 32767 characters, I get this error message:
    OWS-05101: Execution failed due to Oracle error 2005
    ORA-02005: implicit (-1) length not valid for this bind or define
    datatype.
    Owa_text.vc_arr is supposed to handle strings with more than 4000 characters; is that true? Could anyone tell me why this happens? Any help will be greatly appreciated!
    Thanks very much.
    Helena Wang
    Here is the PL/SQL procedure that creates my form:
    PROCEDURE myform
    IS
    BEGIN
    htp.p('
    <form action="'||service_path||'helena_test.saveform3"
    method=post>
    <input type=hidden name=tdescription value="X">
    Input1: <textarea name=tdescription rows=50 cols=70
    WRAP=physical></textarea>
    Input2: <textarea name=tdescription rows=50 cols=70
    WRAP=physical></textarea>
    <input type=submit name=WSave value="Save">
    </form>');
    END;
    /***** here is the pl_sql procedure which I use to save the
    form***/
    procedure saveform3(tdescription in owa_text.vc_arr,
    WSave in varchar2 default 'No')
    is
    len pls_integer;
    begin
    for i in 2..tdescription.count loop
    len := length(tdescription(i));
    htp.p(len);
    htp.p(tdescription(i));
    end loop;
    end;

    Helena, I think you might get a better response either from the SQL-PL/SQL forum, or perhaps the Portal Applications forum - both of these tend to have folks very familiar with PL/SQL and the OWA packages.
    This forum is on Web services based on SOAP, WSDL and UDDI. These can be PL/SQL based, but they typically don't use the mod_plsql or OWA web solution.
    As a pointer (you may already be familiar with it, but just in case), you can always take a look at chapter 3 of the OAS documentation, "Developer's Guide: PL/SQL and ODBC Applications", where they go through a number of examples using parameters. See:
    http://technet.oracle.com/doc/appsrvr4082/guides/plsql.pdf
    Hope this or folks from the other list can help.
    Mike.
