Best practice to unfreeze FCPX

Besides a hard shutdown (holding down the power button), are there any keystrokes, tricks like standing on one foot while spinning the computer on one finger, or any other suggestions? Thanks.

Whenever this has happened to me, I force quit FCP X, then just wait... Sometimes it takes a few minutes, other times much longer.

Similar Messages

  • Best Practices for sharing media with iMovie and FCPX

    So I have a large iMovie Events directory and would like to use that media with both iMovie and FCPX projects.
    I'd rather not duplicate the media, so I would prefer to import it into FCPX as references.
    The dilemma is that it's possible to modify or move media from within the iMovie application, and thereby break FCPX's reference to that media.
    I only see two options: (1) never modify the location/name of media in the iMovie Events folder (even from within the iMovie app), since I would break an FCPX link if that media is referenced, or (2) always import (copy) the iMovie events into the FCPX Event Library, making an independent original so that I can confidently operate on those media files in either application.
    I'd surely rather not have to do (2) (e.g., doubling my storage demands) to gain the flexibility of using either application to edit the video, but I really don't want to live with the restrictions of (1).
    Thoughts / Solutions?  What might you consider as options or best practices?


  • FCPX 10.1: Best practice moving libraries (files)?

    In the past, FCPX encouraged us to move files WITHIN the application, through the UI.
    Does 10.1 now encourage us to move/delete files (libraries) "under the hood" in the Finder? Given a choice, what is the best practice?

    Hi dostoelk,
    Welcome to the Support Communities!
    Clips, projects, and events should still be managed within Final Cut Pro X.  For more information, read our Managing Media with Final Cut Pro X Libraries white paper (PDF) at: www.apple.com/final-cut-pro/docs/Media_Management.pdf
    I hope this information helps ....
    Happy Holidays!
    - Judy

  • HT4664 What is the best graphics card for FCPX?

    In the nonstop anti-FCPX propaganda is an article of interest — posted 7/9/12 — comparing the benchmarks of FCPX and PP6.
    http://www.streamingmedia.com/Producer/Articles/ReadArticle.aspx?ArticleID=83582&PageNum=1
    The system used was a 2 x 2.93 GHz Quad-Core Mac Pro from early 2009 running MacOS X version 10.7.4 with 12 GB of RAM and an NVIDIA Quadro FX 4800 graphics card with 1.5 GB of onboard RAM.
    In most cases PP6 outperformed FCPX with this configuration. However, in the comments Ben Balser pointed out that the NVIDIA card wasn't ideal for FCP X's A/VFoundation engine:
    "Quadro is actually not the best card for FCP X's A/VFoundation engine, but great for CS6's Mercury engine, so the test is amazingly flawed right there. Try both on a 5780 card and watch things drastically change. I've done that test myself. Exporting to Compressor uses a MUCH more sophisticated encoding engine meant for higher level, professional transcoding, not simple outputs, which are faster using Export Media…"
    Apple lists this card on its support page: http://support.apple.com/kb/HT4664
    So,
    What is the best graphics card for FCPX?

    Ben,
    Thanks for chiming in on that article. It would be good to have a benchmark comparison between the two systems, each with its preferred card.
    I'm hoping to see some other comparisons on this thread. Also, some links to other articles about best practices and configurations.

  • Best Practice: Export unrendered for internet

    Hi
    As I recall, in previous versions of FCPX it was considered best practice, at least for web exports, to export unrendered.
    Is that still true? Was it ever true for other kinds of exports?
    I want to post a rough of a film on youtube, unlisted, for a few friends to comment on.
    best
    elmer
    Btw, it always seems like when I open my browser while FCPX is open, I get problems and have to delete my prefs to get back to normal. Any reason why? Just curious.


  • Best Practices for Export

    I have recently begun working with a few AIC-encoded home movie files in FCPX. My goal is to compress them using h.264 for viewing on computer screens. I had a few questions about the best practices for exporting these files, as I haven't worked with editing software in quite some time.
    1) Is it always recommended that I encode my video at the same resolution as its source? For example, some of my video was shot at 1440x1080, which I can only assume is anamorphic. I originally tried to export at 1920x1080 but then changed my mind, as I assumed the 1440x1080 would just stretch naturally. Does this sound right?
    2) FCPX is telling me that a few of my files are in 1080i. I'd like to encode them in 1080p, as it tends to look better on computer screens. In FCPX, is it as simple as dragging my interlaced footage into a progressive timeline and then exporting? I've heard about checking the "de-interlace" box under clip settings and then doubling the frame rate, but that seemed to make my video look worse.
    3) I've heard that it might be better practice to export my projects as master files and then encode H.264 in Compressor. Is there any truth to this? Might it be better for the interlaced-to-progressive conversion as well?
    Any assistance is greatly appreciated.

    1) Yes. The 1440x1080 anamorphic footage will display as 1920x1080.
    2) Put everything in a 1080p project.
    3) Compressor will give you more options for control. The H.264 export from FCP uses a very high data rate and makes large files.
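
    For anyone who wants to perform the same interlaced-to-progressive conversion and anamorphic stretch outside FCPX, here is a minimal sketch in Python driving the ffmpeg command-line tool. This is an assumption-laden illustration, not what FCPX or Compressor does internally: it assumes ffmpeg is installed separately, and the file names are hypothetical.

        import subprocess

        # Deinterlace (yadif), stretch anamorphic 1440x1080 to full-width 1920x1080,
        # and encode to H.264 -- the same transformation discussed above.
        subprocess.run([
            "ffmpeg", "-i", "home_movie_aic.mov",   # hypothetical AIC source file
            "-vf", "yadif,scale=1920:1080",         # deinterlace, then scale to 1080p
            "-c:v", "libx264", "-crf", "18",        # quality-targeted H.264 encode
            "-c:a", "aac", "-b:a", "256k",          # re-encode audio to AAC
            "home_movie_1080p.mp4",
        ], check=True)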

  • Best Practices for Keeping Library Size Minimal

    My library file sizes are getting out of control. I have multiple libraries that are now over 3TB in size!
    My question is, what are the best practices for keeping these to a manageable size?
    I am using FCPX 10.2. I have three cameras (2x Sony Handycam PJ540 and 1x GoPro Hero 4 Black).
    When I import my PJ540 videos, they come in as compressed MP4 files, and I have always chosen to transcode them on import. Obviously this is why my library sizes are getting out of hand. How do people deal with this? Do they import the videos without transcoding and only transcode the ones they need to work on? Or do they leave the files compressed, work with them that way, and let transcoding happen when exporting the final project?
    How do you deal with this?
    As for getting my existing library sizes down, should I just "Show Package Contents" on my libraries and start deleting the transcoded versions, or is there a safer way to do this within FCPX?
    Thank you in advance for your help.

    No. Video isn't compressed like compressing a document. When you compress a document, you're just packing the data more tightly. When you compress video, you do it basically by throwing information away. Once a video file is compressed (and all video files are heavily compressed in the camera), that information is gone and not recoverable. The best you can do is convert it into a format that will not deteriorate further as the file is recompressed. Every time you compress a file, especially to heavily compressed formats, more and more information is discarded, and the more you do this the worse the image gets. Transcoding converts the media to a high-resolution, high-data-rate format that can be manipulated considerably without loss and can go through multiple generations without loss. You can't go to second-generation H.264/MPEG-4 without discernible loss in quality.
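
    To put rough numbers on the storage trade-off, here is a back-of-the-envelope sketch. The bit rates are approximate nominal figures (about 28 Mbit/s for camera MP4/AVCHD, about 117 Mbit/s for ProRes 422 at 1080p29.97), so treat the results as rough estimates only.

        # Rough storage cost per hour of footage at a given video bit rate.
        def gb_per_hour(mbit_per_second: float) -> float:
            return mbit_per_second * 3600 / 8 / 1000  # Mbit/s -> GB per hour

        print(gb_per_hour(28))    # camera MP4 originals: ~12.6 GB/hour
        print(gb_per_hour(117))   # ProRes 422 optimized media: ~52.7 GB/hour

    That roughly fourfold difference is why transcoding everything on import makes libraries balloon.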

  • Importing/Transcoding best practices

    Hello
    Apologies if this is a very basic question: I just returned from Africa and I have many hours of video as a result. All videos are 1080p, high quality and whatnot, and vary from 30 seconds to 30 minutes in length. Therefore, some files are a few megabytes, others a few gigabytes.
    The output for my project is no more than 7 minutes or so, therefore I have to cut a lot.
    My question: what is the best practice when you have this much video? Should I import and transcode everything (all 73GB of video), or is it better to cut what you need first and then transcode only that?
    I am using a MBPr 15" with 16GB and 512GB, so I've unchecked the "Copy files to Final Cut Events Folder" in order to not eat up my local HD.
    Anyhow, any advice would be really appreciated, thanks again
    Rob

    First, thank you for your quick reply and being very nice to me seeing as my questions are probably very basic. I have purchased a book on FCPx but it didn't deal with workflows very well, especially with what I'm dealing with.
    "I'd suggest you acquire a large external drive":
    Done. I'm using a 1TB USB3 drive specifically for this project. All videos are loaded onto that drive, and when I began importing, I checked the option to "Copy files to FCPx Events folder" in order to centralize my content on that drive. That said, the Events folder on the local HD DOES have content (pointers, I believe). Should I be backing those up to the external HD?
    "Before capturing footage create a CDL (capture decision list) and capture what you intend to use."
    Not done. Some of the footage I have didn't lend itself to this, unfortunately. For example, I had a GoPro camera mounted on my head and another mounted on the head of a local tribesman while we went hunting for small game (their food, of course). So the videos are long, and I'd like to include portions of them in the final video. Is the only option for me to import and optimize the whole thing, or can I import without optimizing, review, cut, save the portions I like, and then optimize just those sections?
    I'm hoping you can spare a little more patience for me. I'm a photog so my workflow there is solid. I'm very new at this and I'd like to get better. The management of files for me is key so I want to get off on the right foot.
    Cheers
    Rob

  • Logical level in Fact tables - best practice

    Hi all,
    I am currently working on a complex OBIEE project/solution where I am going straight to the production tables, so the fact (and dimension) tables are pretty complex, since I am using multiple sources in the logical tables to increase performance. Anyway, what I often struggle with is the Logical Levels (in the Content tab), where the level of each dimension is to be set. In a star schema (one-to-many) this is pretty straightforward and easy to set up, but when the Business Model (and physical model) gets more complex, I sometimes struggle with the aggregates - getting them to work/appear with different dimensions. (Using the menu "More" - "Get Levels" does not always give the best solution... far from it.) I have some combinations of left and right outer joins as well, making it even more complicated for the BI server.
    For instance - I have about 10-12 different dimensions - should all of them always be connected to each fact table? Either at Detail or Total level. I can see the use of the logical levels when using aggregate fact tables (on quarter, month, etc.), but is it better just to skip the logical level setup when no aggregate tables are used? Sometimes it seems like that is the easiest approach...
    Does anyone have a best practice concerning this issue? I have googled for this but I haven't found anything good yet. Any ideas/articles are highly appreciated.

    Hi User,
    For instance - I have about 10-12 different dimensions - should all of them always be connected to each fact table? Either at Detail or Total level.
    It is not necessary to connect to all dimensions; it depends on the report you are creating. But as a best practice, you should maintain all of them at the Detail level when you specify join conditions in the physical layer.
    For example, for the sales table: if you want to report at the ProductDimension.ProductName level, you should use the Detail level; otherwise use the Total level (at the Product or Employee level).
    Get Levels (available only for fact tables) changes aggregation content. If joins do not exist between fact table sources and dimension table sources (for example, if the same physical table is in both sources), the aggregation content determined by the Administration Tool will not include the aggregation content of this dimension.
    Source: admin guide (Get Levels definition)
    thanks,
    Saichand.v

  • Best practices for setting up users on a small office network?

    Hello,
    I am setting up a small office and am wondering what the best practices/steps are to set up and manage the admin, user logins, and sharing privileges for the setup below:
    Users: 5 users on new iMacs (x3) and upgraded G4s (x2)
    Video Editing Suite: Want to connect a new iMac and a Mac Pro, on an open login (multiple users)
    All machines need to be able to connect to the network, peripherals, and the external hard drive. I would also like to set up drop boxes to easily share files between the computers (I was thinking of using the external hard drive for this).
    Thank you,

    Hi,
    Thanks for your posting.
    When you install AD DS in the hub or staging site, disconnect the installed domain controller, and then ship the computer to the remote site, you are disconnecting a viable domain controller from the replication topology.
    For more detailed information, please refer to:
    Best Practices for Adding Domain Controllers in Remote Sites
    http://technet.microsoft.com/en-us/library/cc794962(v=ws.10).aspx
    Regards.
    Vivian Wang

  • Add fields in transformations in BI 7 (best practice)?

    Hi Experts,
    I have a question regarding transformation of data in BI 7.0.
    Task:
    Add new fields in a second level DSO, based on some manipulation of first level DSO data. In 3.5 we would have used a start routine to manipulate and append the new fields to the structure.
    Possible solutions:
    1) Add the new fields to first level DSO as well (empty)
    - Pro: Simple, easy to understand
    - Con: Disc space consuming, performance degrading when writing to first level DSO
    2) Use routines in the field mapping
    - Pro: Simple
    - Con: Hard to performance optimize (we could of course fill an internal table in the start routine and then read from this to get some performance optimization, but the solution would be more complex).
    3) Update the fields in the End routine
    - Pro: Simple, easy to understand, can be performance optimized
    - Con: We need to ensure that the data we need also exists (i.e. if we have one field in DSO 1 that we only use to calculate a field in DSO 2, this would also have to be mapped to DSO 2 in order to exist in the routine).
    Does anybody know what the best practice is? Or do you have any experience regarding what you see as the best solution?
    Thank you in advance,
    Mikael

    Hi Mikael.
    I like the 3rd option and have used it many, many times. In answer to your question:
    Update the fields in the End routine
    - Pro: Simple, easy to understand, can be performance optimized - Yes, I have read and tested this, and it works faster. An OSS consulting note is out there indicating the speed of the end routine.
    - Con: We need to ensure that the data we need also exists (i.e. if we have one field in DSO 1 that we only use to calculate a field in DSO 2, this would also have to be mapped to DSO 2 in order to exist in the routine). - Yes, but by using the result package, the manipulation can be done easily.
    Hope it helps.
    Thanks,
    Pom

  • Temp Tables - Best Practice

    Hello,
    I have a customer who uses temp tables all over their application.
    This customer is a novice and the app has its roots in VB6. We are converting it to .NET.
    I would really like to know the best practice for using temp tables.
    I have seen code like this in the app.
    CR2.Database.Tables.Item(1).Location = "tempdb.dbo.[##Scott_xwPaySheetDtlForN]"
    That seems to work, though I do not know why the full tempdb.dbo.[##...] form is required.
    However, when I use this in the new report I am working on, I get runtime errors.
    I also tried this
    CR2.Database.Tables.Item(1).Location = "##Scott_xwPaySheetDtlForN"
    I did not get errors, but I was returned data I did not expect.
    Before I delve into different ways to do this, I could use some help with a good pattern to use.
    thanks

    Hi Scott,
    Are you using the RDC still? It's not clear, but it looks like it.
    We had an API that could piggyback the HDBC handle in the RDC (craxdrt.dll), but that API is no longer available in .NET. Also, the RDC is not supported in .NET, since .NET uses the framework and the RDC is COM.
    The workaround is to copy the temp data into a data set and then set the location to the data set. There is no way that I know of to get to tempdb from .NET. The reason is that there is no CR API to set the owner of the table to the user; MS SQL Server locks tempdb so that the user has exclusive rights on it.
    Thank you
    Don
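
    For what it's worth, the "copy the temp data into an in-memory data set" workaround can be sketched outside Crystal Reports. Here is a minimal Python illustration using pyodbc; the connection details are hypothetical, and it assumes the global temp table from the question still exists in another live session.

        import pyodbc

        # Connect to the SQL Server instance that holds the global temp table.
        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;"
            "DATABASE=tempdb;Trusted_Connection=yes"  # hypothetical connection details
        )
        cursor = conn.cursor()

        # Global (##) temp tables are visible to other sessions, which is why the
        # fully qualified tempdb.dbo.[##...] name can be resolved here at all.
        cursor.execute("SELECT * FROM tempdb.dbo.[##Scott_xwPaySheetDtlForN]")
        columns = [d[0] for d in cursor.description]
        rows = [dict(zip(columns, row)) for row in cursor.fetchall()]

        # 'rows' is now an ordinary in-memory data set that can be handed to a
        # reporting layer, instead of pointing the report at tempdb directly.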

  • Best Practice for Significant Amounts of Data

    This is basically a best-practice/concept question and it spans both Xcelsius & Excel functions:
    I am working on a dashboard for the US Military to report on some basic financial transactions that happen on bases around the globe.  These transactions fall into four categories, so my aggregation is as follows:
    Year,Month,Country,Base,Category (data is Transaction Count and Total Amount)
    This is a rather high level of aggregation, and it takes about 20 million transactions and aggregates them into about 6000 rows of data for a two year period.
    I would like to allow the users to select a Category and a country and see a chart which summarizes transactions for that country ( X-axis for Month, Y-axis Transaction Count or Amount ).  I would like each series on this chart to represent a Base.
    My problem is that 6000 rows still appears to be too many rows for an Xcelsius dashboard to handle.  I have followed the Concatenated Key approach and used SUMIF to populate a matrix with the data for use in the Chart.  This matrix would have Bases for row headings (only those within the selected country) and the Column Headings would be Month.  The data would be COUNT. (I also need the same matrix with Dollar Amounts as the data). 
    In Excel this matrix works fine and seems to be very fast.  The problem is with Xcelsius.  I have imported the spreadsheet, but have NOT even created the chart yet, and Xcelsius is CHOKING (and crashing).  I changed Max Rows to 7000 to accommodate the data.  I placed a simple combo box and a grid on the Canvas - BUT NO CHART yet - and the dashboard takes forever to generate and is REALLY slow to react to a simple change in the Combo Box.
    So, I guess this brings up a few questions:
    1)     Am I doing something wrong and did I miss something that would prevent this problem?
    2)     If this is standard Xcelsius behavior, what are the Best Practices to solve the problem?
    a.     Do I have to create 50 different Data Ranges in order to improve performance (i.e. Each Country-Category would have a separate range)?
    b.     Would it even work if it had that many data ranges in it?
    c.     Do you aggregate it as a crosstab (Months as Column headings) and insert that crosstabbed data into Excel.
    d.     Other ideas that I'm missing?
    FYI:  These dashboards will be exported to PDF and distributed.  They will not be connected to a server or data source.
    Any thoughts or guidance would be appreciated.
    Thanks,
    David

    Hi David,
    I would leave your query
    "Am I doing something wrong and did I miss something that would prevent this problem?"
    to the experts/gurus out here on this forum.
    From my end, you can follow
    TOP 10 EXCEL TIPS FOR SUCCESS
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/204c3259-edb2-2b10-4a84-a754c9e1aea8
    Please follow the Xcelsius Best Practices at
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a084a11c-6564-2b10-79ac-cc1eb3f017ac
    In order to reduce the size of xlf and swf files follow
    http://myxcelsius.com/2009/03/18/reduce-the-size-of-your-xlf-and-swf-files/
    Hope this helps to a certain extent.
    Regards
    Nikhil
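
    As an illustration of the pre-aggregation idea from questions 2a/2c above, here is a sketch in Python with pandas that builds one Base-by-Month crosstab for a single Country/Category selection before anything touches Xcelsius. The file name, column names, and filter values are all hypothetical.

        import pandas as pd

        # The ~6000-row aggregate: Year, Month, Country, Base, Category, Count, Amount.
        df = pd.read_csv("transactions_agg.csv")

        # Pre-filter to one Country/Category pair so the dashboard only binds a
        # small matrix instead of scanning all 6000 rows with SUMIF at runtime.
        subset = df[(df["Country"] == "Germany") & (df["Category"] == "Payroll")]

        # Bases as row headings, months as column headings, transaction counts as data.
        matrix = subset.pivot_table(index="Base", columns="Month",
                                    values="Count", aggfunc="sum", fill_value=0)
        print(matrix)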

  • Best-practice for Catalog Views ? :|

    Hello community,
    A best practice question:
    The situation: I have several product categories (110), several items in those categories (4000), and 300 end-users. I would like to know the best practice for segmenting the catalog. I mean, some users should only see categories 10, 20 & 30; other users only category 80, etc. The problem is how I can implement this.
    My first idea is:
    1. Create 110 procurement catalogs (one for every product category). Each catalog should contain only its product category.
    2. In my org model, assign at the user level all the "catalogs" that the user should access.
    Do you have any ideas for improving this?
    Saludos desde Mexico,
    Diego

    Hi,
    Your way of doing this will work, but you'll get maintenance issues (too many catalogs, and a catalog link to maintain for each user).
    The other way is to build your views in CCM and assign these views to the users, either on the roles (PFCG) or on the user (SU01). The problem is that with CCM 1.0 this is limited, because you'll have to assign the items to each view one by one (no dynamic or mass processes); it has been enhanced in CCM 2.0.
    My advice:
    - Challenge your customer about views, and try to limit the number of views, for example to strategic and non-strategic
    - With CCM 1.0, stick to the procurement catalogs, or implement BAdIs to assign items to the views (I have experienced this; it works, but it is quite difficult), but with a limited number of views
    Good luck.
    Vadim

  • Best practice on sqlite for games?

    Hi Everyone, I'm new to building games/apps, so I apologize if this question is redundant...
    I am developing a couple of games for Android/iOS and was initially using a regular (unencrypted) SQLite database. I need to populate the database with a lot of info for the games, such as levels, store items, etc. Originally, I was creating the database with SQLite Manager (Firefox), and then when I installed a game on a device, it would copy that pre-populated database to the device. However, if someone were able to access the app's database, they could feasibly add unlimited coins to their account, unlock every level, etc.
    So I have a few questions:
    First, can someone access that data in an APK/IPA app once it's downloaded from the app store, or is the method I've been using secure and good practice?
    Second, is the best solution to go with an encrypted database? I know Adobe Air has the built-in support for that, and I have the perfect article on how to create it (Ten tips for building better Adobe AIR applications | Adobe Developer Connection) but I would like the expert community opinion on this.
    Now, if the answer is to go with encryption, that's great. But in doing so, is it possible to still use the copy function at the beginning, or do I need to include all of the script to create the database tables and then populate them with everything? That would be quite a bit of script to handle the initial setup, and if the user were to abandon the app halfway through that population, it might mess things up.
    Any thoughts / best practice / recommendations are very appreciated. Thank you!

    I'll just post my own reply to this.
    What I ended up doing was creating a script that self-creates the database and then populates the tables (unencrypted... the encryption portion is commented out until store publishing). It's a tremendous amount of code, completely repetitive except for the values I'm entering, but you can't do an insert loop or a multi-line insert statement in AIR's SQLite, so the best move is to create everything line by line.
    This creates the database, and since it's not encrypted, it can be tested using Firefox's SQLite Manager or some other database program. Once you're ready for deployment to the app stores, you simply modify the above setup to use encryption instead of the unencrypted method used for testing.
    So far this has worked best for me. If anyone needs some example code, let me know and I can post it.
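
    The same first-run "create the schema, then populate it" pattern looks like this in Python's sqlite3 module, as a language-neutral sketch of what the AIR script does line by line. The table name and seed values are made up; note that unlike the AIR SQLite limitation described above, Python can batch the inserts with executemany.

        import sqlite3

        conn = sqlite3.connect("game.db")  # hypothetical database file

        # Create the schema on first run; IF NOT EXISTS makes the script idempotent,
        # so an interrupted first launch can simply be re-run.
        conn.execute(
            "CREATE TABLE IF NOT EXISTS store_items ("
            "  id INTEGER PRIMARY KEY, name TEXT NOT NULL, price INTEGER NOT NULL)"
        )

        # Seed data that would otherwise ship as a pre-populated .db file.
        items = [(1, "Sword", 100), (2, "Shield", 80), (3, "Potion", 25)]
        conn.executemany("INSERT OR IGNORE INTO store_items VALUES (?, ?, ?)", items)
        conn.commit()
        conn.close()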

Maybe you are looking for

  • How do I change the number of emails in my inbox with iOS 7

    I upgraded to iOS 7 and now I have soooo many emails loaded. I feel like that may be part of the cause of my battery drain.  Do I seriously have to delete all these emails or is there a way to limit the number of emails in my inbox?  All I have to say

  • How to preserve 2-byte characters when "Edit Link" ?

    Keynote 6.2.2 will remove any 2-byte characters present in the hyperlink of an object. However, the previous version didn't have this problem. Since internationalized domain names are getting popular, I hope Apple can fix the bug soon.

  • Audio configuration error message

    After upgrading to 7.3.1, I get an error message that reads "iTunes cannot run because it has detected a problem with your audio configuration", and, consequently, iTunes does not open. I have reset my computer to previous settings and have attempted

  • SSRS configuration in multi server environment with SSL

    Hi, I have read numerous articles on configuring Reporting Services in SharePoint integrated mode but am not able to figure out what I'm missing. Environment description: SharePoint Enterprise 2010, 2 web servers, 2 app servers, 1 DB server. A

  • Keeping track of deleted app updates

    Hi, sometimes I keep an app just to know when there will be an update (I don't use it because I'm waiting for some feature to be developed). Do you know of any app where I can keep track of purchased app updates without keeping them on my device?