Getting FCPX to export to multiple machines in Compressor

We have three 2013 Mac Pro towers (grey), one new Late 2013 Mac Pro (black), two 2013 27-inch iMacs, and two 2013 Mac minis. All are running the latest versions of FCPX (10.1.1) and Compressor (4.1.1) on OS X 10.9.2.
The Mac Pro (trash can) and the iMacs are connected via Thunderbolt, and all of the other machines are connected over a gigabit network. We are exporting to a 12 TB Thunderbolt RAID.
We have 100 90-minute edits to export to the MKV file format each week, and we are trying to spread these exports across multiple machines. We have set up Compressor on each machine and added them as "Shared Computers" in the Compressor preferences. When we finish an edit and use "Send to Compressor", the other machines do not pick up the export and share the work.
All computers are set as "Shared" with no passwords required, and "All Interfaces" is selected in the network sharing preferences.
Every machine edits in FCPX and exports through Compressor fine on its own. The part we are trying to sort out is getting the machines to share the work in Compressor.
Any help or suggestions would be appreciated.

Try the Compressor forum: https://discussions.apple.com/community/professional_applications/compressor_4
Al

Similar Messages

  • Export to CSV from Tracker crashes Acrobat on multiple machines (XML works though)

    I have encountered a crash exporting data to CSV from a distributed PDF in the Tracker, yet exporting to XML works. This occurs on multiple machines.
    I can email the dump file if requested.
    Key details from the crash report:
    OS: Windows 7 Ultimate SP1, x86 (build 7601.17727.x86fre.win7sp1_gdr.111118-2330)
    Parent process: C:\Windows\explorer.exe (PID 3124)
    Event type: APPCRASH
    Application: Acrobat.exe, version 9.5.0.270
    Faulting module: AcroForm.api, version 9.5.0.270
    Exception code: c0000005, offset 0011222d
    Locale ID: 1033
    I have a zipped dump file if someone at Adobe wants it.

    Yes, the application is loaded and then closed for each file. I agree it's not ideal, but that's just how it works. So 45 minutes for 200 files comes down to about 13 seconds per file, which is reasonable considering it has to open Word, open the file, convert it to a PDF and then close Word.
    You can request this feature to be improved here: https://www.adobe.com/cfusion/mmform/index.cfm?name=wishform
    By the way, you should make sure that you have the latest version of Acrobat available, to ensure maximum compatibility.

  • Separate Distribution Monitor Export and Import Processes on Multiple Machines

    Hi,
    Would you kindly let me know whether it is possible (i.e., an officially supported approach) to run the Distribution Monitor export and import processes on different machines?
    As per SAP Note 0001595840, "Using DISTMON and MIGMON on source and target systems", it says the following:
    > 1. DISTMON expects the export and import to be carried out on the same server
    I think this means that the export and import processes for the same tables must be run on the same machine; is that correct? If so, exporting on machine A and then importing that exported data on another machine B is not officially supported... (though I know it is technically possible.)
    Kind regards,
    Yutaka

    Hi Yutaka,
    Point nos. 2 & 3 clarify the confusion, but let me explain it briefly:
    Distribution Monitor is used primarily for migrations of large SAP systems (databases). It increases the parallelism of export and import by distributing the processes across the available systems.
    You have to prepare the system for using DistMon. A common directory ("commDir") needs to be created, and if you use multiple systems to run additional export and import processes, that commDir must be shared across all of those systems. This is what point no. 1 in KBA 1595840 refers to. Distribution Monitor runs both the export and import processes from the machine prepared for DistMon, and DistMon itself controls the other processes (i.e. MigMon), so there is no need to start a separate MigMon.
    For example: you are migrating an SAP system from OS AIX / DB DB2 to OS HP-UX / DB Oracle. You need to perform the export using DistMon, and you have four Windows servers that can be used for parallel export/import. Once you have prepared the system that hosts the commDir for DistMon, you provide the details of the involved host machines in the "distribution_monitor_cmd.properties" file. When DistMon is executed, it automatically distributes the export and import processes across the systems defined in that file.
    Best regards,
    SUJIT

  • Multiple Machine Render question

    Alright, so I just realized that After Effects (I'm running CS3) can use multiple-machine rendering. At first I thought this would be great, since my typical renders are anywhere from 45 minutes to 2 hours. But after much playing around, I realized it only works for image sequences, which isn't much help because I export the majority of my work as DV streams or MOV files! Is there any way I can get it to output formats other than image sequences, or am I SOL?
    I also considered buying a few moderate-end machines and putting together a small render farm in case I couldn't, but I wouldn't know where to start with something like that (I'd imagine determining what software works, but I can't find much info on that, really).

    For PCs only =====
    I wrote my own DOS scripts to set up network rendering (remotely starting After Effects render engines, copying projects to a watch folder, setting up render control files (RCFs), copying and installing fonts, etc.). They took me a while to write and set up, and they are really the property of the company I work for, but the basics are below, taken from my in-house help file. This works better for us than collecting footage. Basically, substitute local drive volume names with UNC names, save the project, submit it to the watch folder in its own directory, and add an RCF text file (try collecting footage once; an RCF file will be generated so you can see what's in it). The thing that took me longest to debug was installing the required fonts remotely. Anyway, read the basics.
    THE FOUR BASIC REQUIREMENTS
    The basic requirements for any software that can network render are that:
    (1) all participating computers are able to access the original source footage (a small reachability-check sketch follows this list),
    (2) any required fonts are loaded and active on the participating renderers,
    (3) any plugins and codecs used are also installed and available, and
    (4) all rendered frames, i.e. the output, are sent to a common storage area. Single-frame sequences are the best output format; a single streamed format, like an AVI, MPEG, or QuickTime movie, cannot be created by more than one computer at a time. However, if your render queue has multiple single-file streams queued, multiple render slaves can render the queued streams, with one slave rendering one stream at a time; two or more render slaves cannot render the same stream at the same time.
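    As an aside (not part of the original write-up), here is a minimal PowerShell sketch for checking requirement (1) from any given render slave; the list file name is hypothetical:

    # Illustration only: verify that every footage path in a list is a UNC path and is
    # reachable from this machine. 'footage-paths.txt' is a hypothetical file name.
    $paths = Get-Content 'C:\render\footage-paths.txt'
    foreach ($p in $paths) {
        if ($p -notmatch '^\\\\') {
            Write-Warning "Not a UNC path: $p"
        }
        elseif (-not (Test-Path -LiteralPath $p)) {
            Write-Warning "Unreachable from this machine: $p"
        }
    }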
    SETTING UP YOUR PROJECT TO NETWORK RENDER
    Source Footage Paths
    The first problem in the example below is that all the source footage is referenced using local paths. The default network rendering method Adobe uses is really best suited to archiving, wherein the footage is collected from the source computer and copied to a common server in a hierarchical tree structure. In this example the source footage is comparatively short, but still over 900 MB, so there is a time issue involved, a network traffic issue, a storage issue at the server end, and a server management issue; and if the network render fails, or needs to be redone, this step will need to be repeated or the process hacked by an appropriately skilled person.
    To network render, all source imagery should use a file path that can be seen by other computers on the network. The best method is to use UNC (Universal Naming Convention) paths, i.e. \\computername\sharedvolume\foldername\foldername\imagename.
    In the above example, a badly pathed file, N:\TV1 Promos\sas\sas1.aif, could be made legal by re-pathing it to \\regan\zone2\TV1 Promos\sas\sas1.aif, assuming that zone2 is an explicitly shared volume.
    TIP: you can create as many share points as you want. You could, for example, create an additional share point for N:\TV1 Promos\sas and call it SAS, so it appears as the share \\regan\SAS. Just right-click on the folder you want to share, select SHARING from the menu, create the new share, and give it a meaningful name; on XP systems, you also need to specify read/write access permissions.
    You can create a shortcut to this or any other share and leave it on your desktop, and that will save you time navigating to that point or typing out \\regan\SAS.
    Alternatively, you can highlight a network path in Explorer, copy that text, and keep pasting it when subsequently specifying paths.
    There are no limitations on how many share points you can create.
    A further alternative method would be to use the administrative share path, \\regan\N$\TV1 Promos\sas\sas1.aif. Here the drive letter N: is replaced by \\regan\N$. This method is particularly useful where the volumes are not explicitly shared.
    One other method is to map shared volumes or folders to a drive letter, but for this to work the drive mapping must be consistent across all participating renderers, so you would have to go around those computers mapping network shares to drives on a per-job basis as required, so it is not really very practical.
    After Effects 6.5 introduced a smarter way: if the project is opened on another computer on the network, it will attempt to replace the locally specified footage paths with network paths. Mostly it works okay; sometimes it doesn't. Footage that is sourced from mapped network drives, for example where the W: drive refers to \\Gfxserver2\graphics2, will have problems.
    The best way is to use UNC paths from the beginning of the project's construction; then there is never an issue and the project will always be portable. But if you forget, can't be bothered, or think that using network rather than local paths is slowing you down:
    1) save your project,
    2) open your project on another machine
    3) reload one of the missing footage files ... the rest should fill themselves in; if they don't, they are probably on another drive
    4) check the output is network pathed
    5) save the project, and submit that version to the render farm
    Ultimately, for this project to network render successfully, the source footage needs to be specified in a manner where it is accessible by any machine on the network (a small path-conversion sketch follows the list below), so use must be made of
    \\computername\VolumeLetterDollarSign\foldername\foldername\imagename, or
    \\computername\sharedvolume\foldername\foldername\imagename, or
    \\computername\sharedfolder\imagename methods.
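    As an illustration only (not from the original write-up), here is a minimal PowerShell sketch of the drive-letter-to-administrative-share rewrite described above; the function name is hypothetical, and 'regan' and the N: drive are just the example values used earlier:

    # Illustration only: rewrite a drive-letter path to an administrative-share UNC path,
    # e.g. N:\TV1 Promos\sas\sas1.aif -> \\regan\N$\TV1 Promos\sas\sas1.aif
    function Convert-ToAdminShareUnc {
        param(
            [Parameter(Mandatory)] [string]$LocalPath,     # path as seen on the machine that owns the drive
            [Parameter(Mandatory)] [string]$ComputerName   # name of the machine that owns the drive
        )
        if ($LocalPath -match '^([A-Za-z]):\\(.*)$') {
            # The drive letter becomes <letter>$ on the owning computer
            '\\{0}\{1}$\{2}' -f $ComputerName, $Matches[1].ToUpper(), $Matches[2]
        }
        else {
            throw "Expected a path starting with a drive letter: $LocalPath"
        }
    }

    Convert-ToAdminShareUnc -LocalPath 'N:\TV1 Promos\sas\sas1.aif' -ComputerName 'regan'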
    Output Footage Paths and Render Settings
    Once the source files have been amended to include a full network pathname, the parameters for the render can be set.
    STEP ONE
    In the Render Queue window, first choose a single-frame Output Module, such as TARGA or JPEG; a single-file streamed format, like an AVI, MPEG, or QuickTime movie, CANNOT be created by more than one computer at a time. However, if your render queue has multiple single-file streams queued, multiple render slaves can render the queued streams, with one slave rendering one stream at a time; two or more render slaves cannot render the same stream at the same time.
    STEP TWO
    In the Render Settings, make sure you turn off storage overflow and turn on "skip existing frames", so that frames that have already been network rendered will not be rendered again by another machine.
    STEP THREE
    The destination for the output animation should include the full network path name, so that all the output files end up in the same place (a small frame-audit sketch appears at the end of this section).
    Warning: if local paths are inadvertently used here, the rendered files could be scattered over the local hard drives of the machines rendering the animation, and each rendering machine would not be able to determine which frames had been rendered by the other machines, so each machine could end up rendering the entire sequence itself, if the drive letter exists on that rendering machine.
    Again, it doesn't really matter which of the network naming methods is employed; either
    \\computername\VolumeLetterDollarSign\foldername\foldername\imagename, or
    \\computername\sharedvolume\foldername\foldername\imagename, or
    \\computername\sharedfolder\imagename will work.
    STEP FOUR
    The project, with its network-pathed source and output files specified and its render queue set up, should then be saved.
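    As a further illustration (not from the original write-up), here is a minimal PowerShell sketch of the frame audit implied by "skip existing frames" and STEP THREE; the output folder and file-name pattern are hypothetical:

    # Illustration only: report missing frame numbers in a rendered single-frame sequence.
    # '\\regan\renders\promo' and the 'comp_#####.tga' naming pattern are hypothetical.
    $outDir   = '\\regan\renders\promo'
    $expected = 0..999                      # frame range the comp is supposed to cover
    $rendered = Get-ChildItem -Path $outDir -Filter 'comp_*.tga' |
                ForEach-Object { [int]($_.BaseName -replace '^comp_', '') }
    $missing  = $expected | Where-Object { $rendered -notcontains $_ }
    if ($missing) { "Missing frames: $($missing -join ', ')" } else { 'All frames rendered.' }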

  • Managing a single iTunes library across multiple machines on the go, with database & media on an external drive?

    Hi, fellows. I'm trying to organize & revamp my iTunes library & wish to achieve a particular configuration, but lack the expertise to achieve it. My objectives are:
    Objectives
    1. for my multimedia content to be stored on / accessed from an external hard disk
    2. to keep the entire iTunes multimedia & all necessary operating files as self-contained as possible
    3. to allow multiple machines to manage importing new content into the library, i.e. download new episodes of podcasts, rip new music & movies into the library, and sync the iPod even when the machine in question is nowhere near my external hard disk (this is the tricky one)
    Justification for my objectives
    For objective #3, it's not that I'm trying to be funny. My use case: I have 1 laptop (main machine, always at home) & 1 netbook (brought to school daily). Most of my iTunes usage will be via the stay-at-home laptop, but I may wish to add/manage new content in my iTunes library/iPod while I'm out (hence, decoupled from my external hard disk at home).
    I'm not trying to be funny with objective #1 either. I wish to have a streamlined process for retaining & reinstating my iTunes library metadata across system reformats, i.e. where the iTunes library db files' location may change (the most obvious case being the new "My Music" folder path in Win7 versus WinXP).
    What I'm well aware of
    I understand that storing/relocating multimedia on an external hard disk is trivial to an iTunes veteran. I also understand that after a new installation of iTunes, I can move the fresh copy of the "iTunes_library.itl" & "iTunes_library.xml" files to my external volume & point iTunes to their new location from the 2nd launch onwards. I also understand that any new machine I introduce into my environment can be directed to that exact copy of the iTunes library database files by installing iTunes & pressing the "Shift" key upon launch. Lastly, I also understand that I will have no access to my multimedia files (for playback) while I'm out, but I can still add new content, download new podcasts & remove old content from my library/iPod.
    The question
    However, what I have difficulty achieving is launching iTunes on my on-the-go machine(s) without a local copy of the iTunes library database files. I doubt iTunes will be able to launch at all, & if I create a new library (just to launch iTunes), it will not contain the content from my main media collection & is likely to wipe out my iPod.
    My novice attempt & its inadequacies
    My 1st instinct is to have the iTunes library database files stored in the cloud. More specifically, I can store those files in my DropBox folder/account & have every machine update to the single latest version of the iTunes library database files upon Internet connection. However, this method depends on:
    * the DropBox application existing forever,
    * the installation path of DropBox (or any application, for that matter) persisting/remaining unchanged across future versions of operating systems,
    * and it will corrupt any un-synced version of my iTunes library database files if I launch iTunes without first updating the local cached copy of the database files (suppose I manage my iPod playlists via iTunes on a plane).
    Open for recommendations
    So here I am, all ears for any fine recommendations you iTunes veterans have to offer. Thanks in advance.

    In case anyone is interested in this sort of thing:
    UPDATE #4: Unfortunately, if you discover you tagged something wrongly after importing the songs, then even when you correct the tags, iTunes will not re-organize the folder hierarchy as long as "Keep iTunes Music folder organized" is UNchecked, even if the songs were imported via the "Automatically add to iTunes folder" channel.
    Example: I imported a folder of Andy Lau songs into my library; unbeknownst to me, some of the songs had their "Artist" tagged as "Andy Lau", while some had theirs tagged as "Liu De Hua". Naturally, the 1st time round I imported them, 2 separate folders were created since, to iTunes, those were the names of 2 distinct artists (this is expected & acceptable behaviour). The downside is, even if I re-tag all those songs with "Andy Lau" as the artist, iTunes will no longer re-organize & consolidate all those songs into a single "Andy Lau" artist folder (because I'd kept "Keep iTunes Music folder organized" UNchecked, remember?).
    Well, you may suggest that I check that option when managing my future library & uncheck it right before launching/using my main, current library. The thing is, iTunes preferences persist across libraries, & one day I will forget to uncheck that option, launch my main library & lose all my current music folder structure to iTunes' re-organization (which I don't want, as of yet). Which brings me to another update:
    UPDATE #5: My future iTunes library has "I:/iTunes_Media" as iTunes' "Home" folder. This setting persists even when I switch back to my main, current library. Which means, if I had my external volume connected while launching/using my main, current library, all new content would end up in that folder on the external volume! Which messes up the whole scheme of things! Arghz!
    UPDATE #6: Here are some updates with regard to ID3 tags:
    * the "Album Artist" field in Windows (7) Explorer = the "Album Artist" field in iTunes 9
    * modifying an ID3 tag field using Windows (7) Explorer will update BOTH the v1 & v2.3 versions of the ID3 tags on the song (if the song has both versions of tags embedded & if the field exists in both tag versions). If you update the tags via iTunes, only the v2.3 tags get updated. This is surprising.
    * the "Contributing Artist" field in Windows (7) Explorer = the "Artist" field in iTunes 9 & the "Artist" field in ID3v1 tags (changes in any of the 3 fields will affect the other 2 respectively)
    o However, iTunes Media folder organization will only adhere to the value saved in the "Album Artist" field [if you change the value of "Contributing Artist" in Windows (7) Explorer, the "Artist" field in iTunes, or the "Artist" field in a song's ID3v1 tags, no folder re-organization will occur]
    UPDATE #7: Some updates with regard to workflow:
    I decided to retain all non-music content within a sub-folder named "non-music" beneath the main album folder (& take strict precautions to import songs only). The exception is "m3u" playlist files. I have read elsewhere that if you import a folder of songs & there's an "m3u" playlist inside, your library may end up with duplicates and/or the tags of the songs you just imported could be screwed up, etc.

  • Run invoke-command on multiple machines at the same time

    Hey all, so I read that if I store my New-PSSession in a variable and then use that in my Invoke-Command, it would run on all the computers at the same time.
    $a = Get-Content "C:\Users\cody-horton\Desktop\list.txt"
    $session
    for($i=0; $i -lt $a.Length; $i++){
        if(!(Test-Connection -Cn $a[$i] -BufferSize 16 -Count 1 -ea 0 -quiet)){
            Write-Host $a[$i] -ForegroundColor Red
        }
        else{
            $session = New-PSSession $a[$i]
            Invoke-Command -Session $session -FilePath "\\My computer\C`$\Users\public\Documents\zip folder.ps1"
        }
    }
    What exactly am I doing wrong? I just need to run this script on multiple machines at the same time.
    Thanks.
    Edit: Also, what would be the best way to close all the sessions? Thanks.

    Hi there,
    So what I think you are doing wrong here is that you are overwriting the value in $session every time the code inside the for loop executes. Try the below:
    $a = Get-Content "C:\Users\cody-horton\Desktop\list.txt"
    $session = @() # define this as an array
    for($i=0; $i -lt $a.Length; $i++){
        if(!(Test-Connection -Cn $a[$i] -BufferSize 16 -Count 1 -ea 0 -quiet)){
            Write-Host $a[$i] -ForegroundColor Red
        }
        else{
            $session += New-PSSession $a[$i] # add the new session to the array; at the end it will be a collection of sessions
        }
    }
    # Run the script once against the whole collection of sessions.
    # Note: I think the -FilePath call below won't work as-is; first you need to copy the script
    # locally on each machine and then execute it, because of second-hop authentication.
    Invoke-Command -Session $session -FilePath "\\My computer\C`$\Users\public\Documents\zip folder.ps1"
    Have put comments where I edited your code.
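    Regarding the edit about closing the sessions: a minimal sketch, assuming the $session array built above, would be:

    Remove-PSSession -Session $session   # close just the sessions created above
    # Or tear down every session in the current PowerShell instance:
    Get-PSSession | Remove-PSSession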
    Hope this helps
    Knowledge is Power{Shell}. http://dexterposh.blogspot.com/

  • How to run JMS on multiple machines?

    Hello all,
    I am new to JMS, so please let me know how to run JMS across multiple machines.
    I mean, on one machine the publisher and the queue/topic are running, and on another machine the listener/client. So the client and the publisher must each be running in an application server; does that mean only two app servers can send and receive messages?
    Then what is the advantage of JMS?
    I think the sender will send messages and the listener can get them at a later time; is my understanding correct?
    Please let me know how it works across multiple machines, and how I should start now.
    Please give me the details...
    Thanks in advance.
    charan

    Hi Charan!
    If I understand your questions right, yes, you can use JMS on several physical machines. They connect using JNDI.
    However, you cannot communicate asynchronously through JMS alone. The benefit of JMS is that you have a common interface to use when talking to a messaging provider, like WS MQ, Sonic MQ, etc. Most vendors support JMS, and those who do are called JMS providers. They are required to implement either the Queue functionality or the Topic functionality (or both). When using a JMS provider, you really see the benefit of JMS. How else would you ensure that your system is loosely coupled? Also, by using JMS and Message-Driven Beans you get the benefits of container-managed transactions, object pooling, etc.
    If you do not have a JMS provider, you still get the benefit of a loosely coupled system by using JMS to integrate your applications, but in a large system you should consider a tool for asynchronous messaging (again, WS MQ, Sonic MQ, etc.).
    I would recommend you start by reading the JMS section of the J2EE tutorial provided by Sun.
    -K-

  • Any way to export with multiple presets at once?

    Hi! Is there any way to export with multiple presets at once? I do a lot of product photography; usually clients send me around 5 to 10 pieces to photograph, and each photo needs to be exported to about 7 different formats and sizes for different uses. I have a preset for each export, but this requires multiple clicks per export (select the preset, the destination folder, etc.), and in the end I spend about an hour just exporting files, whereas if I could select a folder of presets and the destination folder just once, it would take no time at all to get all of those exports.
    Thanks!

    Some of those other photographers have assistants they can task with doing the drudgery.
    I would describe what you're wanting as scripted multiple exporting, not multiple export presets, because multiple exports don't necessarily rely on presets, just initiating an Export and clicking on various things for each one.
    I assume you have a preset set up for each of the 7 format-size-use variations, and then the most time-consuming part is choosing the destination folder that gets mirrored to DropBox? You can copy/paste most of the path into the folder address area after clicking Choose.
    Without knowing your master-photo folder and dropbox-mirror folder names and organization, it's hard to know whether you've thought of all the shortcuts you might use or whether things are organized in the most efficient manner.
    If you're on Windows, maybe something like AutoHotKey macros would help with what you're doing.

  • Unexpected problem; recently upgraded to version 3. Export of multiple tracks makes each file length the same as the longest file. Didn't do that in version 2

    Unexpected problem; we recently upgraded to version 3. Exporting multiple tracks in a multitrack session makes each file the same length as the longest file. It didn't do that in version 2 of Soundtrack.

    Hmm... OK, so I took that project and tried importing it into a new project, still with no luck. Then I tried taking that project and creating a new timeline by dragging one of the stills to the new item icon, so that I'd get whatever PrPro decided was the best timeline, and still no luck. Then I tried creating a new project and importing different footage, and it worked. That tells me perhaps there's something wrong with the stills? I know not to use overly large stills in a project, but they're all around 4K resolution or smaller... so surely that's not too big, right? 412 stills total 1.5 GB, which means an average of roughly 3.6 MB per file... certainly not too big. I even had a couple of video clips in the timeline that were shot with a point-and-shoot still camera, and I tried taking those out just in case it was something in those, and still no success.
    So, ultimately this project got exported without CUDA, and I'm okay... but I'm also wondering what caused the problem. I do a fair amount with stills (not a lot, but enough) and I need it to work... do I just go on with life, assuming that it was some weird fluke? Or do I try a complete wipe and reinstall of the OS and programs?
    Oh, one other tidbit... whenever PrPro crashed and closed, the application itself wouldn't quit. I had to go into the program manager and force quit it...
    Thoughts? Recommendations?

  • Need to get this project exported asap (CS6 (6.0.2)) (all details provided)

    I've been working for a while with a slow workflow; I didn't feel like I needed to upgrade anything, because I always got the work done (somehow).
    The file size is close to 900 MB, but there is no way I'll be able to export this project in good quality (with low/average settings it freezes 5% in).
    I'm going to get a new machine in the near future, but this project needs to be delivered now. What steps should I take? Thanks.

    ATI Radeon HD 4670 256 MB
    Mac OS X Lion 10.7.5 (11G63b)
    Yes, there are a few open gaps (intended; they can obviously be filled with "black video". Will that help?).
    No third-party effects.
    Several effects have been applied; too many to mention ("transform", "mirror", color changes, contrast changes, etc.).
    I can render and export each nest separately. I think the encoder "gets overwhelmed" when it has to read the whole project (which is why I think it would be useless to try to get a proper result with my current system). Thanks.

  • Networked Music Library - How to refresh multiple machines' library window?

    I want to have my iTunes music library on a network hard drive, accessed by multiple machines.
    But, if I add / rip new tracks into the library, only the machine that ripped the tracks knows they are there. The others are oblivious to the new tracks.
    How do I get the other machines to automatically check the library to see if any new tracks have been added, and add them to their respective library windows if they find any?
    Thanks - Dave G.

    Interesting question! And one that I will use often, so I did a test.
    My main library is on my iMac; my travel library is on my MBP. Once back home, simply choose File > Import > Library/Project. Now browse to the MBP (file sharing must be enabled), select the travel library, and import. I did NOT have Aperture running on the MBP during the import.
    It works quite well.
    Steve

  • How to set up a print server on multiple machines?

    Hi,
    Can someone please let me know the process to set up a print server on multiple machines?
    This is what I have been thinking:
    1. Install Ghostscript on all the machines.
    2. Set up the print server on all the machines.
    3. I guess I have to point the IP address somewhere as a final step?
    Thanks in advance
    PM

    The steps you have mentioned are correct.
    When you set up the print service, you define the URL for the Financial Reporting web application server in the .properties file, and this gets registered in the HSS registry (this is the step where you run FRprintserversetup.cmd). So when you run the setup from each print server, entries will be created for each of them in the HSS registry. There is then internal logic that decides which server should take which process.
    Thx

  • Why do I get a popup message in multiple languages requiring a shutdown and restart

    Why do I keep getting a popup message in multiple languages requiring a shutdown and restart? This has been going on for a few months. Because of the shutdowns, reports have been sent to Apple upon restart.

    After it restarted, did you see the crash reporter dialog (the one with a Report... button)?
    If so, next time it occurs click Report...
    Before you send it to Apple, copy the text of the report. Paste it in a reply. Remove or obscure any personal information, should it appear.
    If problems continue to occur, please determine whether they also occur in "Safe Mode":
    Safe Mode or "Safe Boot" is a troubleshooting mode that bypasses all third party system extensions and loads only required system components. Read about it: Starting up in Safe Mode
    Starting your Mac in Safe Mode will take longer than usual, graphics will not render smoothly, audio is disabled on some Macs, and some programs (iTunes for example) may not work at all.
    To end Safe Mode restart your Mac normally. Shutdown will take longer as well.

  • Hyperlinks from other placed indd documents or pdfs getting lost after export to PDF

    After upgrading from CS 5.5 to CS 6, hyperlinks from other placed INDD documents or PDFs are getting lost after export to PDF.
    I tried all of the export options that were known to work before.

  • InDesign crashes while exporting because of a bad PDF inside the file.

    Hi,
    My InDesign is crashing while exporting to PDF. There is a link to a PDF file inside the InDesign file; when I export that InDesign file to PDF, InDesign crashes without showing any error message. If I place another PDF file in the same location, then it goes fine and I am able to get the output PDF file. Please suggest how I can check and correct that PDF file, and whether there is any solution to this. Please help me; I have thousands of InDesign pages with the same issue. If you want the PDF, I can send it to your mail ID. I need a fix for this problem.
    Thanks
    Kiran

    For all of you lurkers, Kiran sent me a copy of the PDF, and it looks like I found the problem (and learned a new technique which may help others in the same boat).
    I tried re-frying the file by creating an EPS and distilling, but the result was the same. Then I tried exporting from CS3, just for fun, and it worked, but I realized the PDF had been cropped, so I started trying different sections and isolated an area that failed, though that wasn't particularly helpful.
    Acrobat Preflight didn't find anything that looked odd or problematic, so I decided to run Examine Document. It came up with metadata (no surprise) and also two items under a heading of Deleted and Cropped Objects, with no preview or description. I unselected the metadata from the list of found items, then clicked the Remove button in the Examine Document panel, and resaved (and while you have the opportunity to give it a new name, doing so doesn't remove the stuff until you do the whole procedure again and save with the same name). I placed the new version of the PDF in a new file and it exported with no problem.
