Submitted batch disappears

That's what happens now. I used to get Compressor to work... not now.
I submit a batch (from Final Cut Express or standalone), set a preset and destination, and click Submit. The progress bar shows for a few seconds in the monitor window and then disappears - and so does the batch in the batch window.
This is Compressor v. 1.2.1
I find myself spending half my working day trying to fix Apple software, and it's breaking me down... It seems like such a crock that the developers couldn't make stuff that works.
Any help from any of you out there is appreciated. Something has to be reset or whatever. Frankly, I don't expect any help is going to come of this question... I don't see many questions answered for this sorry app, and what answers there are are so complex - go to Terminal, replace Qmaster, etc. etc. - and I've tried all that... Apple, if you're listening - but of course, they don't care about you... they just want your money... buy our new improved products... what a miserable thing - people are reduced to being slaves to machines... well, this is my livelihood, or it was supposed to be, before it sucked up all my time with troubleshooting. Shoot, I can't afford expensive classes either. What a waste of human potential. Think Different, my ---!

I was going to ask you to post the logs here but I think they'd be far too long. Try looking through them and see if there are any errors listed towards the bottom. Also, try opening the Console and see if there are any relevant errors listed there.
Yes, I have updated QuickTime, but who hasn't? Every other time there's a software update, there's a new QT version. None of which warns you that applications will be affected.
If that's the case, QuickTime could well be the culprit. As you're running Compressor 1.2.1, I will assume you are also running FCP 4.5. This was designed to run on QT 6. Some users have managed to get it running on QT 7 but they are on borrowed time and any subsequent updates could break it.
And yes, you're right - they should say applications will be affected but unfortunately they don't.

Similar Messages

  • HT1819 My first Podcast that I submitted has disappeared from iTunes.  Where did it go?

My first Podcast that I submitted has disappeared from iTunes.  It is still in my RSS feed.  How can I get it back?  I would like them all available for download.
    Thanks,
    Dan Z

Downloaded films (and those synced from your computer) should be in the Videos app (iTunes on the iPad is just the iTunes store):

  • Problem submitting batch request for sales order creation

    Hello experts,
    I have created a gateway service, implementing the CREATE_DEEP_ENTITY for order creation. I have tested my service with the Chrome Advanced Rest Client and it works fine with the following XML request:
    <?xml version="1.0" encoding="UTF-8"?>
    <atom:entry xmlns:atom="http://www.w3.org/2005/Atom" xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices" xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
      <atom:content type="application/xml">
      <m:properties>
      <d:OrderId>0</d:OrderId>
      <d:DocumentType>TA</d:DocumentType>
      <d:CustomerId>C6603</d:CustomerId>
      <d:SalesOrg>S010</d:SalesOrg>
      <d:DistChannel>01</d:DistChannel>
      <d:Division>01</d:Division>
      <d:DocumentDate m:null="true" />
      <d:OrderValue m:null="true" />
      <d:Currency m:null="true" />
      </m:properties>
      </atom:content>
      <atom:link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/SOItems" type="application/atom+xml;type=feed" title="SALESORDERTSCH.SOHeader_SOItems">
      <m:inline>
      <atom:feed>
      <atom:entry>
      <atom:content type="application/xml">
      <m:properties>
      <d:OrderId>0</d:OrderId>
      <d:Item>000010</d:Item>
      <d:Material>C20013</d:Material>
      <d:Plant m:null="true" />
      <d:Quantity m:Type="Edm.Decimal">100.000</d:Quantity>
      <d:Description m:null="true" />
      <d:UoM m:null="true" />
      <d:Value m:null="true" />
      </m:properties>
      </atom:content>
      </atom:entry>
      <atom:entry>
      <atom:content type="application/xml">
      <m:properties>
      <d:OrderId>0</d:OrderId>
      <d:Item>000020</d:Item>
      <d:Material>C20014</d:Material>
      <d:Plant m:null="true" />
      <d:Quantity m:Type="Edm.Decimal">200.000</d:Quantity>
      <d:Description m:null="true" />
      <d:UoM m:null="true" />
      <d:Value m:null="true" />
      </m:properties>
      </atom:content>
      </atom:entry>
      </atom:feed>
      </m:inline>
      </atom:link>
    </atom:entry>
    Now that my service is working, I want to be able to call it from a SAP UI5/Javascript application. In order to process multiple items for one order header, I use the OData batch request. Here is my Javascript method that is being processed:
    executeOrderCreation : function() {
      // Retrieve model from controller
      var oModel = sap.ui.getCore().getModel();
      oModel.setHeaders({
        "Access-Control-Allow-Origin" : "*",
        "Content-Type" : "application/x-www-form-urlencoded",
        "X-CSRF-Token" : "Fetch"
      });
      // Define data to be created
      var headerData = {
        OrderId : "0",
        DocumentType : "TA",
        CustomerId : "C6603",
        SalesOrg : "S010",
        DistChannel : "01",
        Division : "01",
        DocumentDate : null,
        OrderValue : null,
        Currency : null
      };
      var varItemData1 = {
        OrderId : "0",
        Item : "000010",
        Material : "C20013",
        Plant : null,
        Quantity : "100.000",
        Description : null,
        UoM : null,
        Value : null
      };
      var varItemData2 = {
        OrderId : "0",
        Item : "000020",
        Material : "C20014",
        Plant : null,
        Quantity : "200.000",
        Description : null,
        UoM : null,
        Value : null
      };
      var batchChanges = [];
      oModel.refreshSecurityToken(function(oData, oResponse) {
        alert("Refresh token OK");
      }, function() {
        alert("Refresh token failed");
      }, false);
      oModel.read('/SOHeaders/?$Batch', null, null, false, function(oData, oResponse) {
        // Create batch data
        batchChanges.push(oModel.createBatchOperation("SOHeaders", "POST", headerData));
        batchChanges.push(oModel.createBatchOperation("SOHeaders", "POST", varItemData1));
        batchChanges.push(oModel.createBatchOperation("SOHeaders", "POST", varItemData2));
        oModel.addBatchChangeOperations(batchChanges);
        // Submit changes and refresh the model
        oModel.submitBatch(function(oData) {
          oModel.refresh();
        }, function(oError) {
          alert("Submit failed: " + oError);
        }, false);
      }, function() {
        alert("Read failed");
      });
    }
    The result is that when I submit the batch, I get an error saying: The following problem occurred: no handler for data -
    Am I doing the batchChanges creation right (header, then items)?
    Why am I facing this error?
    Any help would be greatly appreciated.
    Thanks and regards,
    Thibault

    Hi,
    you should also have a '/' before the collection name, so that it is '/SOHeaders', as below:
      batchChanges.push(oModel.createBatchOperation("/SOHeaders", "POST", headerData));
      batchChanges.push(oModel.createBatchOperation("/SOHeaders", "POST", varItemData1));
      batchChanges.push(oModel.createBatchOperation("/SOHeaders", "POST", varItemData2));
    Regards,
    Chandra
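    One more thought, offered as a sketch rather than a tested fix: three separate POSTs to SOHeaders create three independent entities, so they may never reach CREATE_DEEP_ENTITY at all. Since the XML request earlier in the thread nests the items under the SOItems navigation property, the equivalent JSON approach is a single "deep insert" payload with the items nested in the header. The helper name below is illustrative; only the property names (SOItems, etc.) come from the thread.

    ```javascript
    // Build one deep-insert payload (header with nested items) instead of
    // three flat batch operations. This mirrors the <atom:link ... SOItems>
    // <m:inline> structure of the working XML request.
    function buildDeepOrderPayload(header, items) {
      // Copy the header and nest the items under the SOItems navigation property.
      var payload = Object.assign({}, header);
      payload.SOItems = items;
      return payload;
    }

    var headerData = { OrderId: "0", DocumentType: "TA", CustomerId: "C6603",
                       SalesOrg: "S010", DistChannel: "01", Division: "01" };
    var itemData1  = { OrderId: "0", Item: "000010", Material: "C20013", Quantity: "100.000" };
    var itemData2  = { OrderId: "0", Item: "000020", Material: "C20014", Quantity: "200.000" };

    var deepPayload = buildDeepOrderPayload(headerData, [itemData1, itemData2]);

    // In the UI5 controller, a single create call would then carry the whole order:
    // oModel.create("/SOHeaders", deepPayload, { success: onSuccess, error: onError });
    ```

    Whether the gateway service accepts JSON deep inserts depends on its implementation, so treat this as one option to try alongside Chandra's fix.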

  • SAPROUTER SNC - ERROR in Batch

    Hello,
    We have our SNC connection working between our site and SAPSERV2 as long as our call to saprouter is running in interactive mode. When we submit the job to batch using a CL (copied from 4soi.de), the connection does not work. In the DEV_ROUT file we're seeing the following error:
    ERROR       GSS-API(maj): No credentials were supplied
    It seems like the job isn't seeing either the environment variables or the SIDOFR STMF in the SECUDE directory.
    Any thoughts on why this works in the foreground from a command line, but not in batch as an autostart job, or just a submitted batch job?
    Thanks,
    Philip

    Hello Philip,
    if you follow the instructions at
    http://www.easymarketplace.de/snc-iseries-setup.php
    you set SECUDIR within the CL and it is not (necessarily) in directory /secude.
    And you have to run the CL as SIDOFR.
    Regards
    Guido

  • Batch Monitor - No Progress

    I've read on these boards that there are problems submitting batches in 10.5 with compressor 3.0.2 - which is what I'm running. My problem is slightly different...I am able to submit a batch and call up the batch monitor, where the job is listed as "processing" - but the progress bar never updates - and I've let it run on a short clip for WAY over the time that it should take and it never completes...I've already deleted compressor, trashed the prefs. and re-installed...any ideas or am I seriously just stuck until Apple gets their act together and fixes this?

    For the most part Apple doesn't monitor these discussions. Report the problem to the feedback page:
    http://www.apple.com/feedback/compressor.html

  • .sql scripts running in batch mode

    Hi all. Is there any way to run .sql scripts in a batch mode where the job would be time scheduled and run automatically when the specified time arrived? I am using Oracle 10.
    Thank you for any assistance.
    Robert Smith

    Hi Dimitri. A coworker of mine uses DBMS_SCHEDULER and he has submitted batch jobs with it. However, I would like to be able to spool to a file, then send the file as an attachment in an email. My friend says that the "spool" command will not work when you are inside of PL/SQL. He writes to a file using a cursor and loop, but he hasn't been able to send the file as an attachment in an email yet. Maybe there is something we don't know about DBMS_SCHEDULER yet.
    Robert Smith
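    To make the DBMS_SCHEDULER suggestion concrete, here is a minimal sketch (all object names are illustrative, not from the thread). Note that SPOOL is a SQL*Plus client command, so a database-side job can't use it; inside the database you would write the file with UTL_FILE, which matches the cursor-and-loop approach described above.

    ```sql
    -- Hypothetical example: run a stored procedure every night at 02:00.
    -- REPORT_PKG.RUN_NIGHTLY is a placeholder for your own procedure.
    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'NIGHTLY_REPORT_JOB',
        job_type        => 'STORED_PROCEDURE',
        job_action      => 'REPORT_PKG.RUN_NIGHTLY',
        start_date      => SYSTIMESTAMP,
        repeat_interval => 'FREQ=DAILY; BYHOUR=2; BYMINUTE=0',
        enabled         => TRUE);
    END;
    /
    ```

    An alternative for the spool-plus-email requirement is an OS-level schedule (cron) running a SQL*Plus script, where SPOOL does work and the resulting file can be mailed by the shell.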

  • Cannot submit batch - Unable to connect to background process?

    Trying to export a 60-minute high-quality MPEG and I get the following message: "Cannot submit batch - Unable to connect to background process"
    I use FCP 4.5 HD

    You don't give many details but Jon Chappell of DigitalRebellion has produced a free utility to help with Compressor problems.
    He posted these details recently:-
    "It seems like a number of people on this forum are having problems submitting batches from Compressor, so I created a utility to diagnose and fix some of those problems.
    Compressor Repair will check all necessary files exist on your hard drive and that permissions are set correctly. It also resets Qmaster, trashes the Qmaster spool directory and launches qmasterd."
    http://www.digitalrebellion.com/downloads/CompressorRepair.zip

  • Render disappears before completion

    I have a QT reference movie, 2h18m, in HDV, which I'm trying to get converted to MPEG2 for delivery on dual layer DVD.
    Whether from the FCP timeline, or direct into Compressor, the render begins, then just disappears after around 2000 frames.
    The files are on an external eSATA drive, connected to the MBP via a Sonnet Tempo Sata Express 34, or a Sonnet card in the Mac Pro.
    On the MBP, the file plays in QuickTime, although it hung completely at 2h2m. The mouse and keyboard were non-responsive, while the audio stuttered. I had to use the hard switch. I restarted and tried again at 1:58, and it played through to the end no problem.
    I moved the drive to the Mac Pro in order to leverage the processors. It still disappears even before the barber pole starts moving, although the estimation timer is running.
    I installed and ran Compressor Repair, just in case. No Help.
    I guess there's some issue with the file itself.
    Any ideas?

    Well, the hits just keep on rollin'
    So I read somewhere maybe it was a bad render. Deleted all renders and immediately exported to a self-contained QT-HDV. Still got a General Error and bad file.
    Next, I began the reconstruction. The final piece consists of three sections. The first, a three-minute intro, was exported to a reference QT-AIC file and imported back; I don't expect any trouble there. Next is a nested file built of source clips and multiclips. I collapsed the multiclips and exported to self-contained QT-AIC in 5-minute segments - no crashes and no General Errors. I imported each segment and placed it on the Master T/L above its corresponding nested sequence. Finally I moved to the last piece, about 1h40m, where I deleted the multiclip and replaced it with the source clip, then exported in 10-minute segments, importing those and placing them on the Master T/L. This whole process took less than 4 hours, which I found surprising since exporting either nested piece had always taken 4-8 hours.
    Once complete, it flowed smoothly, as it always did in FCP (there never were any crashes when playing from any timeline in FCP, although there were hiccups until everything was rendered).
    So now I have an AIC 1440x1080i60 sequence, filled with media with the same format. I exported to a self-contained QT with current settings and markers for DVDSP. That took longer than expected, around 3 hours. I played the first 10 minutes in QT, and it looked fine.
    I imported that into Compressor for conversion to MPEG2. It was chugging along, foretelling about 3 hours to process. I expanded everything in the Batch Monitor, and saw 12 segments. At first 6 video and 1 audio streams were processing, which I also saw in Leopard's activity monitor. One process hung, so I forced it to Quit. The other continued. Eventually the audio finished successfully. Now we have 6 segments processing the first of the two passes (I'd selected a 2-pass transcode at around 6Mbps--the setting for Best Quality, 90 minute DVD, modified to 16:9), while the other six showed 'Waiting'. The total completion strip showed about 70% complete the last I looked. This seemed a bit much since there were several segments that had not yet successfully completed.
    The next thing I know, the entire Batch disappeared from the Batch Monitor. The Activity Monitor still shows qmasterd with 8 threads, one of which is Not Responding, two show zero, five show CPU around 150 (what does that mean, I thought it was percent of CPU usage, not to exceed 100?) and memory usage around 90MB. The CPU graph is almost all green, showing around 80% User and 15% System. There is still Read and Write activity going to the disks.
    If anyone can shed light on this, I'd be SO GRATEFUL.
    Should I let the processes run or kill them? Why does Batch Monitor show no activity? The destination folder shows segments 6,7,8,9,10, averaging around 500MB apiece. There's no sign of the final complete conversion, nor of segments 1-5,12.
    If this gets any more interesting, I'll... I'll... well, I'll just...
    Apologies for the long involved post. What did I overlook? I didn't try reconnecting the files since none showed up missing when opening the project...
    OK, Activity Monitor still shows qmasterd, but all processes show zero under CPU usage. Still can't find the final file, nor the AC3 file. Get Info shows 46GB free on the drive.
    Any ideas?

  • After encoding, out of application memory: problem & solution

    Shortly after completing a batch in Compressor 4.1 running in OS 10.9.1, I started receiving an error message that the system was out of application memory (despite the fact that I have 16 GB of RAM!). I had Activity Monitor open and there were numerous instances of "qmasterd not responding" in red. The error message made me force quit both Compressor and Activity Monitor. I shut the computer all the way down, rebooted, and reopened both programs. All of the qmasterd not responding threads had disappeared in AM. I resubmitted my batch in Compressor. As soon as it completed, all of the qmasterd not responding threads reappeared in AM. Also, I noticed that "application memory" went from about 2 GB to 15.99! I had to reboot again.
    Mystery solved: I remembered that I also have FCS 3 installed on my hard drive (in its separate folder). As an experiment, I cloned my HD to an external FireWire drive. Then, on the clone, I uninstalled Compressor 3.5, Apple Qmaster, and Compressor 4.1 using Digital Rebellion's FCS uninstaller. I rebooted the clone, emptied the trash, reinstalled Compressor 4.1 from the App Store, and submitted a batch as above - the problems went away. I suspect that the standalone Qmaster program in FCS also became active when submitting a batch in Compressor 4.1, which now has Qmaster incorporated into it. This caused it to seize resources while competing with the new version. Uninstalling the old program removed this conflict. I went back to my original HD and uninstalled Compressor 3.5 and Apple Qmaster. No problems since doing that. And BTW, it was no great loss losing Compressor 3.5 on my Mavericks system, since it didn't work nearly as well as when I was running it in Lion!

    Thanks for that helpful info, even though I haven't (knock on wood) run into that. I also have 3.5 on Mavericks and so far it coexists well with 4.1. But I'm sure your experience will help others trying to troubleshoot similar problems.
    Russ

  • After 6210 Navigator 5.16 update, mark with # feat...

    I updated my nokia 6210 navigator to version 5.16. The function where you can mark items by pressing hash (#) has disappeared. Any workarounds?

  • Adding attachments to PDF within ISR and possibly link to BDS in ESS

    I have a requirement where the user needs to attach documentation to a request (ISR) for approval.  From my research on this forum I see that currently SAP does not support attachments in a PDF within ISR, as only a small subset of the XML data used to render the form is actually saved as part of the request.  A person has the ability to add attachments to the form in the process, but once the form is recalled from the work list and displayed, the attachments are missing.

    Hello Guys,
    We are using Adobe forms with the ISR framework and facing problems attaching files from within the PDF using the paper clip icon.
    Once the files are attached to the PDF form and the form is submitted, the attachment disappears from the form. Instead we have to use the "Attachment Manager" on the UWL to attach the files. It is quite confusing to have a "Paper Clip" button inside the form and also an attachment manager on the UWL.
    If anyone has faced a similar issue and has a solution to fix this, please let me know.
    We are on ECC6 with component SAP_BASIS - 701- 006- SAPKB70106.
    Thanks in advance.
    Regards
    Sandy
    Edited by: Sandy on Sep 6, 2011 5:18 PM

  • UPDATE SQL statement has poor performance

    Hi All,
    We have set up a regular background process (submitted using DBMS_SCHEDULER) to "throttle" user-submitted batch requests into our batch processing system. The purpose of this throttle process is to check the currently active requests and then, based on the prevailing system load, inject new batch requests accordingly.
    This background process is scheduled to run every minute.
    We find that the UPDATE statement below performs well against the table being updated (FRM_BPF_REQUEST) even when the table has up to 1 million rows (the expected production volume); the UPDATE takes at most a few seconds (< 10 secs) to execute.
    However, we find that when there is a burst of INSERTs happening to the same table (FRM_BPF_REQUEST) via another database session, the UPDATE statement suffers severe degradation. The same UPDATE which used to perform in a matter of a few seconds takes up to 40 minutes when heavy INSERTs are happening to the table. We are trying to understand why performance gets severely degraded when INSERTs on the table are heavy.
    Any thoughts or insights into the issue would be greatly appreciated.
    We are using Oracle DB 11.2.0.3.4 (on Linux).
    CREATE OR REPLACE PROCEDURE BPF_DISPATCH_REQUEST_SP AS
    --    Change History
    --001 -Auro    -10/09/2013  -Initial Version
        v_throttle_size    NUMBER DEFAULT 600;
        v_active_cnt       NUMBER DEFAULT 0;
        v_dispatched_cnt   NUMBER DEFAULT 0;
        v_start_time       TIMESTAMP := SYSTIMESTAMP;
        v_end_time         TIMESTAMP;
        v_subject_str      VARCHAR2(100) := '';
        v_db_name          VARCHAR2(20) := '';
    BEGIN
        -- Determine Throttle Size
        SELECT THROTTLE_SIZE
        INTO   v_throttle_size
        FROM   FRM_BPF_REQUEST_CONTROL;
        -- Determine BPF Active Request Count
        SELECT COUNT(*)
        INTO   v_active_cnt
        FROM   FRM_BPF_REQUEST
        WHERE  STATUS IN ('rm_pending','rm_ready','processing','worker_ready','failed','dependency_failed','recover_ready');
        IF v_active_cnt < v_throttle_size THEN
            UPDATE FRM_BPF_REQUEST
            SET    STATUS = 'dispatched'
            WHERE  ID IN (
                       SELECT ID FROM (
                           SELECT ID
                           FROM   FRM_BPF_REQUEST
                           WHERE  STATUS = 'new'
                           ORDER BY ID
                       ) WHERE ROWNUM <= (v_throttle_size - v_active_cnt)
                   );
            v_dispatched_cnt := SQL%ROWCOUNT;
            COMMIT;
        END IF;
        v_end_time := SYSTIMESTAMP;
        INSERT INTO FRM_BPF_REQUEST_DISPATCH_LOG
        VALUES (v_start_time, v_active_cnt, v_dispatched_cnt, v_end_time, NULL);
        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            ROLLBACK;
            v_end_time := SYSTIMESTAMP;
            INSERT INTO FRM_BPF_REQUEST_DISPATCH_LOG
            VALUES (v_start_time, v_active_cnt, v_dispatched_cnt, v_end_time, NULL);
            COMMIT;
            SELECT ORA_DATABASE_NAME INTO v_db_name FROM DUAL;
            v_subject_str := v_db_name || ' DB: Fatal Error in BPF Request Dispatch Process';
            -- Alert Support
            DBA_PLSQL.SEND_MAIL(P_RECIPIENTS => '[email protected]',
                                P_CC         => '[email protected]',
                                P_BCC        => '[email protected]',
                                P_SUBJECT    => v_subject_str,
                                P_BODY       => SUBSTR(SQLERRM, 1, 250));
    END;
    /
    show errors
    Thanks
    Auro

    What the heck is this:
    EXCEPTION
        WHEN OTHERS THEN
            ROLLBACK;
            v_end_time := SYSTIMESTAMP;
            INSERT INTO FRM_BPF_REQUEST_DISPATCH_LOG
            VALUES (v_start_time, v_active_cnt, v_dispatched_cnt, v_end_time, NULL);
            COMMIT;
            SELECT ORA_DATABASE_NAME INTO v_db_name FROM DUAL;
            v_subject_str := v_db_name || ' DB: Fatal Error in BPF Request Dispatch Process';
            -- Alert Support
            DBA_PLSQL.SEND_MAIL(P_RECIPIENTS => '[email protected]',
                                P_CC         => '[email protected]',
                                P_BCC        => '[email protected]',
                                P_SUBJECT    => v_subject_str,
                                P_BODY       => SUBSTR(SQLERRM, 1, 250));
    Why are you programming for failure to succeed - accepting time-consuming rollbacks, committing afterwards, fiddling with transactions, and swallowing/hiding all errors, all of it 'nice and safely hidden' in the notorious WHEN OTHERS exception NOT followed by a RAISE?
    Only catch errors you expect.
    Programming to let a program fail is to fail.
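    For what it's worth, a minimal sketch of the safer pattern (the log table name comes from the thread; the specific expected exception is only an illustration): handle what you expect, and for everything else record what you must and then RAISE so the failure stays visible to the caller and the scheduler.

    ```sql
    EXCEPTION
        WHEN NO_DATA_FOUND THEN      -- an error you expect and can handle meaningfully
            v_dispatched_cnt := 0;
        WHEN OTHERS THEN             -- anything unexpected: record it, then fail loudly
            ROLLBACK;
            INSERT INTO FRM_BPF_REQUEST_DISPATCH_LOG
            VALUES (v_start_time, v_active_cnt, v_dispatched_cnt, SYSTIMESTAMP, NULL);
            COMMIT;
            RAISE;                   -- re-raise instead of swallowing the error
    ```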

  • Leading zeros disappearing in the Excel download

    I have programmed an ALV report using function module REUSE_ALV_GRID_DISPLAY.
    I have declared the batch field of my final output internal table as CHARG_D, which is 10 characters.
    I have passed field_catelog-no_zero = 'X' for the batch field.
    However, when I download to an Excel sheet, the leading zeros of the batch values disappear in the Excel sheet.
    Is there any way to retain the leading zeros for batch values and other similar field values in the Excel sheet after download?
    THANKS IN ADVANCE.

    There are a few variations on how to get the data from ALV into Excel and I don't have a system right now, so I can only give a general comment. If the direct export to Excel (OLE) doesn't work, I'd recommend exporting the data as a local file using format Spreadsheet and giving it an extension like ".csv".
    When saving the file this way, it really is just a tab-separated file (a text file - you can view it in Notepad and check for the leading zeros), which you can open in Excel. However, you have to be careful, because when you double-click the file, Excel will usually open it directly and format your batch (CHARG_D) field with type General. As a result, anything that looks like a number will be stripped of its leading zeros. Therefore you should instead open the data via the import wizard (e.g. in Excel 2007 it's on the Data ribbon under Get External Data, choosing From Text) and reformat the field CHARG_D from General to Text. This will ensure that you keep the leading zeros.
    There are lots of other ways to do it and my version is probably not the most efficient. However, it works in general as long as you see the leading zeros in the ALV, and it is easy to remember (at least for me)...
    Good luck, harald

  • After install 10.5.7 update QMaster/Compressor hangs/not responding??

    I find it funny that the 10.5.7 update was supposed to "improve OS stability". Well it looks like all it has done is made my Mac a worthless pile of pooh.
    I have an older Mac Pro (2007) with two dual-core processors (4 CPUs). I set up QMaster as a QuickCluster with services (this is the only way Compressor will see it when I submit a batch) and selected 2 CPU instances. I hit "Submit" in Compressor and open the batch window. I see it "try" to parse out the video segments to the 2 CPU instances, but I get Audio successful and Video "Unknown" with a Service controller "error", and a progress bar for the main batch stuck at 50% with the time duration increasing (never decreasing).
    So I bring up Activity Monitor and find that coreaudio is "not responding" compressor is "not responding" and final cut pro is "not responding" ... oh joy!!
    All this "used" to work before the 10.5.7 update, so what gives???
    How to reproduce on my hardware:
    1. Open FCP
    2. Create a Sequence
    3. Send Sequence to Compressor
    4. In Compressor I have a predefined H.264 setting and destination with "Job segmenting" checked; hit Submit
    5. Open batch window, look at 50% mark with errors mentioned above
    NOT very happy.
    Rob

    Compressor repair? Can you give me more details on what this is?
    I had FCP to Compressor using QMaster (2 CPU instances) working in 10.5.6. It created some bizarre project name under FCP (i.e. 13DAA64E-8A34-4476-A12B-86) but would work. With 10.5.7 the same steps just hang/fail -- fail as in they Submitted batch never completes (I can leave it for 10 hours on a H.264 small duration clip of 30 seconds).
    So now (with 10.5.7) the only way I can get Compressor and Qmaster to work is if I export sequence to QuickTime movie, close FCP, open Compressor create a new batch using the exported QT movie, set my settings/destination then submit. Which defeats the purpose of using a 2 Instance QMaster in the first place since I've had to introduce a Export to Quicktime in FCP. So essentially any benefit using 2 CPUs has been negated by the extra export step.
    Sorry for the frustration/tone, but if I wanted this much hassle with Mac software/OS, I would have just stayed exclusively on a Windows based platform -- as of right now I see very little difference between the two companies, both are not responsive to correct obvious problems and both create more work than solutions. I gotta wonder why I even selected to use a Mac and why I successfully converted at 11 friends over from Windows to Mac - SSDD.
    Frustrated.
    Rob

  • Compressor doesn't work anymore....

    Hello,
    I got a really strange bug,
    I can not use Compressor anymore...
    I can't see the "project" window that lists all my video sequences...
    I can only see these windows:
    • "Settings / Destinations"
    • "Inspector"
    • "Preview" (but the preview window looks very buggy, i think...
    • "History"
    So I can't add any new video to compress/export... and I can't add any settings to export the videos...
    I tried twice to erase all the Compressor applications, application support files and prefs,
    and to reinstall from the original DVD,
    but it doesn't work, again and again...
    PLEASE HELP !!!
    now, I don't know what to do...
    Perhaps I didn't erase & re-install the applications and compressor files properly,
    but, CAN ANYONE HELP ME PLEASE ?!!
    Thanks in advance.

    keyman: Try following the steps Compressor Zealot links to. Read the instructions carefully, and make sure you remove all the Compressor/Qmaster files before reinstalling.
    Eyebite wrote:
    don't plan on using Virtual Clustering. As far as I can determine, no one has that working yet.
    Uhmmm... Virtual Clustering works fine, but you cannot use virtual clustering when sending your sequence directly from FCP to Compressor.
    Compressor: Don't export from Final Cut Pro using a Virtual Cluster
    To transcode your FCP-sequence using Virtual Clustering with Compressor 3, you can export your sequence as a self-contained ProRes file, and then bring that file into Compressor.
    Keep in mind that job segmenting is not always good for compression.
    Job Segmenting and Two-Pass (or Multi-Pass) Encoding
    If you choose the two-pass or the multi-pass mode, and you have distributed processing enabled, you may have to make a choice between speedier processing and ensuring the highest possible quality.
    The Apple Qmaster distributed processing system speeds up processing by distributing work to multiple processing nodes (computers). One way it does this is by dividing up the total amount of frames in a job into smaller segments. Each of the processing computers then works on a different segment. Since the nodes are working in parallel, the job is finished sooner than it would have been on a single computer. But with two-pass VBR and multi-pass encoding, each segment is treated individually so the bit-rate allocation generated in the first pass for any one segment does not include information from the segments processed on other computers. First, evaluate the encoding difficulty (complexity) of your source media. Then, decide whether or not to allow job segmenting (with the “Allow Job Segmenting” checkbox at the top of the Encoder pane). If the distribution of simple and complex areas of the media is similar throughout the whole source media file, then you can get the same quality whether segmenting is turned on or not. In that case, it makes sense to allow segmenting to speed up the processing time.
    However, you may have a source media file with an uneven distribution of complex scenes. For example, suppose you have a 2-hour sports program in which the first hour is the pregame show with relatively static talking heads, and the second hour is high-action sports footage. If this source media were evenly split into 2 segments, the bit rate allocation plan for the first segment would not be able to “donate” some of its bits to the second segment because the segments would be processed on separate computers. The quality of the more complex action footage in the second segment would suffer. In this case, if your goal were ensuring the highest possible quality over the entire 2-hour program, it would make sense to not allow job segmenting by deselecting the checkbox at the top of the Encoder pane. This forces the job (and therefore, the bit-rate allocation) to be processed on a single computer.
    Note: The “Allow Job Segmenting” checkbox only affects the segmenting of individual jobs (source files). If you are submitting batches with multiple jobs, the distributed processing system will continue to speed up processing by distributing (unsegmented) jobs, even with job segmenting turned off.
    From the Compressor User Manual

Maybe you are looking for

  • Display PDF document in Flex Windows application

    Hi, I am creating a flex windows application using action script and mxml script.I need help in displaying a PDF document in that windows application.I tried google search it is giving me some open source projects with IFrames.But,they can be only us

  • Not Capturing All Open Windows

    When recording and a new window is open (within the same app or screen capture), Captivate doesn't always capture the newly open window. Any suggestions.

  • If I have PowerPivot Connection with 8 tables in the data model, can I extract the Command Text for each with VBA?

    I have a data model with a single PowerPivot connection to an Oracle database. This model had eight tables in it which all use the same "Existing Connection".  I can get the Query from the first Table connection using Command Text, however, I do not

  • Exception creating new Poolable object

    Hi All I have installed jdeveloper 12.1.2 and when running the hello world page iam getting this error....pls advice *500 Internal Server Error* oracle.apps.fnd.common.AppsException: oracle.apps.fnd.common.PoolException: Exception creating new Poolab

  • New asset class not showing in the list (AS01)

    Dear Experts, I have created a new asset class through the following configuration (IMG): - FA - Asset Accounting - Organization Structure-Asset Class-Specify Account Determination - Financial accounting --> Asset accounting --> Organizational Struct