Best practice when dealing with long clips

I have some long clips that I want to use more than once with different in and out points. I know this can't be done, so is the best practice to make duplicates of the clips and set different edit points, or to roughly split the large clip into smaller clips and name them accordingly?

Not certain what you think you can't do. Take a long clip and open it in the Viewer. Set In and Out points. Drop that into the Timeline. Now you can move along in the Viewer clip and set new Ins and Outs and drop that into the Timeline. Clips in the Timeline are created from the Ins and Outs you set in the Viewer.
Is that what you want to do? If it is, I don't see where making copies of the clip would help you.
Later, if you want to match up a clip in the Timeline to that master clip, just use Match Frame (find) in the Timeline to find where it correlates to your main clip.
You can have FCE automatically create subclips at camera cut points by using DV Start/Stop Detect, if that is what you're looking for.

Similar Messages

  • Best practice to deal with computer of a departed employee

    Dear All,
    I would like to inquire about the best practice for dealing with the computer and computer account of a departed employee. Should they be disabled, reset, deleted, or just kept as they are until needed by another user?
    Regards,
    Hiam

    Ultimately your needs for their identities and equipment after they leave are what dictate how you should design this policy.
    First off, I recommend disabling the account immediately following the employee's departure. This prevents the user from using their credentials to log on again. Personally, I have a "Disabled Users" OU in Active Directory; when I disable accounts I move them there for easy future retrieval.
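    If you want to script that step, here is a minimal JNDI sketch; every host name, DN, and credential in it is hypothetical, and in practice you would more likely use ADUC or a PowerShell/VBScript tool:

    import java.util.Hashtable;
    import javax.naming.Context;
    import javax.naming.directory.BasicAttribute;
    import javax.naming.directory.DirContext;
    import javax.naming.directory.InitialDirContext;
    import javax.naming.directory.ModificationItem;

    public class DisableAccount {
        // userAccountControl bits: normal account (0x0200) + disabled (0x0002)
        private static final int UF_NORMAL_ACCOUNT = 0x0200;
        private static final int UF_ACCOUNTDISABLE = 0x0002;

        public static void main(String[] args) throws Exception {
            Hashtable<String, String> env = new Hashtable<String, String>();
            env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
            env.put(Context.PROVIDER_URL, "ldap://dc01.example.com:389"); // hypothetical DC
            env.put(Context.SECURITY_AUTHENTICATION, "simple");
            env.put(Context.SECURITY_PRINCIPAL, "admin@example.com");     // hypothetical admin
            env.put(Context.SECURITY_CREDENTIALS, "secret");

            DirContext ctx = new InitialDirContext(env);
            try {
                String userDn = "CN=Departed User,OU=Staff,DC=example,DC=com";
                // Set the ACCOUNTDISABLE bit so the credentials stop working...
                ModificationItem[] mods = { new ModificationItem(
                        DirContext.REPLACE_ATTRIBUTE,
                        new BasicAttribute("userAccountControl",
                                Integer.toString(UF_NORMAL_ACCOUNT | UF_ACCOUNTDISABLE))) };
                ctx.modifyAttributes(userDn, mods);
                // ...then move the account into the "Disabled Users" OU.
                ctx.rename(userDn, "CN=Departed User,OU=Disabled Users,DC=example,DC=com");
            } finally {
                ctx.close();
            }
        }
    }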
    It is possible the user may return, or if they had access to certain systems you may need the account again. I would keep the accounts for a specific amount of time (e.g., 6 months, though this depends on your needs) and delete them after that period.
    If the employee knows the passwords to any shared accounts (not a good idea, though many organizations have them) or has accounts in other systems that do not use Active Directory authentication, change the passwords to those accounts immediately following the employee's departure.
    If the employee had administrative access to their computer (also not a good idea, though it is the reality in most cases), you should disable the computer account and remove the machine from the network. This will prevent the employee from remotely accessing it until you are able to rebuild it or inspect it for unapproved changes.
    Ask the user's manager, team members, and subordinates if there are any files that the employee would have stored on their computer. Back these up as necessary.
    Most likely you will reuse the computer for another employee. For best results, use an image so you can re-image the machine and not have to worry about whether they had installed any unwanted software (backdoors, viruses, illegal software, etc.).
    Hope this helps.
    Jason Warren
    @jaspnwarren
    jasonwarren.ca
    habaneroconsulting.com/Insights

  • Best practice for dealing with Recordsets

    Hi all,
    I'm wondering what is best practice for dealing with data retrieved via JDBC as ResultSets without involving third-party products such as Hibernate. I've been told NOT to use ResultSets throughout my applications, since they hold resources open and are expensive. I'm wondering which collection type is best to convert ResultSets into. The apps I'm building are web-based, using JSPs as the presentation layer, plus beans and servlets.
    Many thanks
    Erik

    There is no requirement that DAOs have a direct mapping to database tables. One of the advantages of the DAO pattern is that the business layer isn't directly aware of the persistence layer. If the joined data is used in the business code as if it were an unnormalized table, then you might want to provide a DAO for the joined data. If the joined data provides a subsidiary object within some particular object, you might add the access method to the DAO for the outer object.
    E.g., in a user permissioning system where:
    1 user has many userRoles
    1 role has many userRoles
    1 role has many rolePermissions
    1 permission has many rolePermissions
    i.e. there is a many-to-many relationship between users and roles, and between roles and permissions.
    The administrator needs to be able to add and delete permissions for roles and roles for users, so the CRUD for the rolePermissions table is probably most useful in the RoleDAO, and the CRUD for the userRoles table in the UserDAO. DAOs can also call each other.
    During operation the system needs to be able to get all permissions for a user at login, so the UserDAO should provide a readPermissions method that does a rather complex join across the user, userRole, rolePermission, and permission tables.
    Note that if the system I just described were done with LDAP, a hierarchical database, or an object database, the userRoles and rolePermissions tables wouldn't even exist; these are RDBMS artifacts, since relational databases don't understand many-to-many relationships. This is a good reason to avoid providing DAOs that give access to those tables.
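    To make that concrete, a plain-JDBC sketch of the readPermissions method described above; the table and column names are hypothetical:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.ArrayList;
    import java.util.List;
    import javax.sql.DataSource;

    public class UserDAO {
        private final DataSource ds;

        public UserDAO(DataSource ds) {
            this.ds = ds;
        }

        /** Reads all permission names for a user by joining across the link tables. */
        public List<String> readPermissions(long userId) throws SQLException {
            String sql =
                "SELECT DISTINCT p.name " +
                "FROM users u " +
                "JOIN user_roles ur ON ur.user_id = u.id " +
                "JOIN role_permissions rp ON rp.role_id = ur.role_id " +
                "JOIN permissions p ON p.id = rp.permission_id " +
                "WHERE u.id = ?";
            List<String> permissions = new ArrayList<String>();
            Connection con = ds.getConnection();
            try {
                PreparedStatement ps = con.prepareStatement(sql);
                ps.setLong(1, userId);
                ResultSet rs = ps.executeQuery();
                while (rs.next()) {
                    permissions.add(rs.getString(1));
                }
            } finally {
                con.close(); // also closes the statement and result set
            }
            return permissions;
        }
    }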

  • Color correction/effects when working with long clips

    Hi,
    First time poster!
    I'm a beginner with Adobe Premiere Pro 5.5. I'm using it right now to do some color correction on badly produced DVDs in my film collection! I was wondering whether it is possible, when doing color correction on a clip, to apply different effects to different parts of the clip. For example: at the beginning of a clip the scene is really bright, but some 30 minutes later the clip is really dark. Is it possible to darken the beginning and lighten the ending? How would you folks deal with a situation like this? Would you edit the film down to smaller parts? If so, how would you go about that?
    Thanks for the help!
    Howard

    Howard,
    Welcome to the forum.
    Along with Jon-M-Spear's suggestion (the one that I use most often too), many Effects can be Keyframed, so that they change over time. However, this does take some time, and can involve a lot of fiddling.
    If you have a long Clip, and know that you will be adding the same set of Effects, but just with different attributes, you can do one of two things, and both rely on Jon-M's method:
    You can create an Effect Preset and apply it to each of your cuts; let's call them "sub-clips," even though that term has other meanings, so be careful when using it. Apply that Preset to all of those sub-clips, then tweak each application of the Preset as needed for the individual sub-clip.
    In a similar fashion, you can apply all necessary Effects to the first of your sub-clips, tweak it as necessary, then Rt-click on it, choosing Copy. Now select all of the rest of the sub-clips to which you wish to apply that set of Effects, Rt-click on them, choosing Paste Attributes, and then tweak as needed.
    A variation on the above would be to apply the Effects to the original Clip, and THEN make the Cuts. Go back and tweak each individual sub-clip.
    Good luck,
    Hunt

  • Best practice for dealing with Recordsets, JDBC and JSP?

    I've spent the last three years developing web apps using JSP, Struts and Kodo JDO for persistence. All of the content for the apps was created as Java objects using model classes and saved to an Oracle db. Thus, data retrieved from the db came back as instances of the model classes and was then put into Struts form beans, etc.
    I changed jobs last month and am now having to use servlets with JDBC to retrieve records from db tables and return them as ResultSets. Oh, and I can't use Struts in my JSPs either. I'm beginning to think that I had it easy at my previous job, but maybe that's just because I was used to it.
    So here are my problems/questions:
    I have two tables with a one to many relationship that I need to retrieve data from, show in a jsp and be able to update eventually.
    So here's what I am doing:
    a) In a servlet, I use a SQL statement to join the tables and retrieve the results into a ResultSet.
    b) I created a class with a bunch of String attributes to copy the ResultSet data into, one ResultSet row per instance of the bean, and then close the ResultSet.
    c) I then add the beans to an ArrayList and save the ArrayList into the session.
    d) Then, in the JSP, I retrieve the ArrayList from the session and iterate over each bean instance, printing the data out to the jsp. There are some logic statements to determine when not to print redundant data caused by the one to many join.
    e) I have not written the code to update the data yet but was planning on having separate jsps for updating the (one) table and the (many) table.
    Would most of you do something similar? Would you use one SQL statement to retrieve all of the data for display, and use logic to avoid printing the redundant part of the data? Or would you have used separate SQL queries, one for each table? Would you have saved the results into something other than an instance of a bean class that represents one record in the ResultSet? Would you have had a bean class with attributes other than Strings, like a collection attribute to hold the results from the "many" table? The way that I am doing everything just seems so cumbersome and difficult compared to using Struts and JDO before.
    Your help/opinion will be greatly appreciated!

    "Would you use one SQL statement to retrieve all of the data for display?" Yes.
    "and use logic to avoid printing the redundant part of the data?" No.
    I believe in minimising the number of queries. If it is a simple one-to-many join on a db table, then one query is better than 1 + n queries.
    However, I prefer to store the objects in a bean class with attributes other than Strings, i.e. one object with a collection attribute to hold the related "many" records.
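    As an illustration of that single-query approach, here is a sketch that collapses a one-to-many join into parent beans, each holding a collection of its "many" records, so the JSP never sees the redundant rows; the orders/order_items tables are hypothetical:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.ArrayList;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    /** Parent bean holding a collection for the "many" side, as suggested above. */
    class Order {
        long id;
        String customer;
        List<String> items = new ArrayList<String>();
    }

    class OrderLoader {
        /** Maps one joined ResultSet into parent beans, one per distinct parent row. */
        static List<Order> load(Connection con) throws SQLException {
            String sql = "SELECT o.id, o.customer, i.description "
                       + "FROM orders o LEFT JOIN order_items i ON i.order_id = o.id "
                       + "ORDER BY o.id";
            Map<Long, Order> byId = new LinkedHashMap<Long, Order>();
            PreparedStatement ps = con.prepareStatement(sql);
            try {
                ResultSet rs = ps.executeQuery();
                while (rs.next()) {
                    long id = rs.getLong(1);
                    Order order = byId.get(id);
                    if (order == null) {           // first row for this parent
                        order = new Order();
                        order.id = id;
                        order.customer = rs.getString(2);
                        byId.put(id, order);
                    }
                    String item = rs.getString(3); // null when the LEFT JOIN found no child
                    if (item != null) {
                        order.items.add(item);
                    }
                }
            } finally {
                ps.close();
            }
            return new ArrayList<Order>(byId.values());
        }
    }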
    Does the fact you are not using Struts mean that you have to use scriptlet code? (shudder)
    Or are you using JSTL, or other custom tags?
    How about tools like Ant? Junit testing?
    "The way that I am doing everything just seems so cumbersome and difficult compared to using Struts and JDO before." Anything different takes adjusting to. It sounds like you know what you're doing for the most part. I agree: in terms of best practices, what you have described so far sounds like a step backwards from what you were previously doing.
    However, I wouldn't go complaining about it too loudly, too quickly. If you're new on the block, there's nothing like making a pain of yourself and complaining about how backwards the existing work is to put your new workmates' backs up.
    Look on it as a challenge. Maybe discuss it quietly with a team leader, to see if they understand how much easier/better/less error prone such approaches can be?
    Struts, cumbersome as it can be, definitely has the advantage of pushing you to follow good MVC practice.
    Good luck,
    evnafets

  • What is best practice for dealing with Engineering Spare Parts?

    Hello All,
    I am after some advice regarding the process for handling engineering spare parts in PM. (We run ECC 5)
    Our current process is as follows:
    All materials are set up as HIBEs
    Each material is batch managed
    The Batch field is used for the Bin location
    We are now looking to roll out PM to a site that has in excess of 50,000 spare parts and want to make sure we use best practice for handling them. We are now considering using a basic WM setup to handle the movement of parts.
    Please can you provide me with some feedback on what you feel the best practice is for dealing with these parts?
    We are looking to set up a solution that will allow us to generate pick lists etc. and implement a scanning solution to move parts in and out of stores.
    Regards
    Chris

    Hi,
    I hope all the 50,000 spare parts are maintained as stock items.
    1. Based on the usage of those spare parts, try to define safety stock and define MRP as "Reorder Point Planning". By this, you can avoid petty cash purchases.
    2. By keeping the spare parts (at least the critical components) in stock, planned as well as unplanned maintenance will not get delayed.
    3. By doing goods issue against a reservation, quantities can be tracked against the order and the equipment.
    As this question is MM- and WM-related, those forums can give better clarity on this.
    Regards,
    Maheswaran.

  • Best practices for dealing with Exceptions on storage members

    We recently encountered an issue where one of our DistributedCaches was terminating itself and restarting due to a RuntimeException being thrown from our code (see below). As usual, the issue was in our own code, and we have updated it to not throw a RuntimeException under any circumstances.
    I would like to know if there are any best practices for Exception handling, other than catching Exceptions and logging them. Should we always trap Exceptions and ensure that they do not bubble back up to code that is running from the Coherence jar? Is there a way to configure Coherence so that our DistributedCaches do not terminate even when custom Filters and such throw RuntimeExceptions?
    thanks, Aidan
    Exception below:
    2010-02-09 12:40:39.222/88477.977 Oracle Coherence GE 3.4.2/411 <Error> (thread=DistributedCache:StyleCache, member=48): An exception (java.lang.RuntimeException) occurred reading Message AggregateFilterRequest Type=31 for Service=DistributedCache{Name=StyleCache, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=1021, BackupCount=1, AssignedPartitions=201, BackupPartitions=204}
    2010-02-09 12:40:39.222/88477.977 Oracle Coherence GE 3.4.2/411 <Error> (thread=DistributedCache:StyleCache, member=48): Terminating DistributedCache due to unhandled exception: java.lang.RuntimeException

    Bob - Here is the full stacktrace:
    2010-02-09 13:04:22.653/90182.274 Oracle Coherence GE 3.4.2/411 <Error> (thread=DistributedCache:StyleCache, member=47): An exception (java.lang.RuntimeException) occurred reading Message AggregateFilterRequest Type=31 for Service=DistributedCache{Name=StyleCache, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=1021, BackupCount=1, AssignedPartitions=205, BackupPartitions=204}
    2010-02-09 13:04:22.653/90182.274 Oracle Coherence GE 3.4.2/411 <Error> (thread=DistributedCache:StyleCache, member=47): Terminating DistributedCache due to unhandled exception: java.lang.RuntimeException
    2010-02-09 13:04:22.653/90182.274 Oracle Coherence GE 3.4.2/411 <Error> (thread=DistributedCache:StyleCache, member=47):
    java.lang.RuntimeException: java.lang.ClassNotFoundException: com.edmunds.vehicle.Style$PublicationState
         at com.edmunds.common.coherence.EdmundsEqualsFilter.readExternal(EdmundsEqualsFilter.java:84)
         at com.tangosol.io.pof.PortableObjectSerializer.initialize(PortableObjectSerializer.java:153)
         at com.tangosol.io.pof.PortableObjectSerializer.deserialize(PortableObjectSerializer.java:128)
         at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3284)
         at com.tangosol.io.pof.PofBufferReader.readAsObjectArray(PofBufferReader.java:3328)
         at com.tangosol.io.pof.PofBufferReader.readObjectArray(PofBufferReader.java:2168)
         at com.tangosol.util.filter.ArrayFilter.readExternal(ArrayFilter.java:243)
         at com.tangosol.io.pof.PortableObjectSerializer.initialize(PortableObjectSerializer.java:153)
         at com.tangosol.io.pof.PortableObjectSerializer.deserialize(PortableObjectSerializer.java:128)
         at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3284)
         at com.tangosol.io.pof.PofBufferReader.readAsObjectArray(PofBufferReader.java:3328)
         at com.tangosol.io.pof.PofBufferReader.readObjectArray(PofBufferReader.java:2168)
         at com.tangosol.util.filter.ArrayFilter.readExternal(ArrayFilter.java:243)
         at com.tangosol.io.pof.PortableObjectSerializer.initialize(PortableObjectSerializer.java:153)
         at com.tangosol.io.pof.PortableObjectSerializer.deserialize(PortableObjectSerializer.java:128)
         at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3284)
         at com.tangosol.io.pof.PofBufferReader.readObject(PofBufferReader.java:2599)
         at com.tangosol.io.pof.ConfigurablePofContext.deserialize(ConfigurablePofContext.java:348)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.readObject(Service.CDB:4)
         at com.tangosol.coherence.component.net.Message.readObject(Message.CDB:1)
         at com.tangosol.coherence.component.net.message.requestMessage.distributedCacheRequest.partialRequest.FilterRequest.read(FilterRequest.CDB:8)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache$AggregateFilterRequest.read(DistributedCache.CDB:4)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:117)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache.onNotify(DistributedCache.CDB:3)
         at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:37)
         at java.lang.Thread.run(Thread.java:619)
    Caused by: java.lang.ClassNotFoundException: com.edmunds.vehicle.Style$PublicationState
         at java.lang.Class.forName0(Native Method)
         at java.lang.Class.forName(Class.java:169)
         at com.edmunds.common.coherence.EdmundsEqualsFilter.readExternal(EdmundsEqualsFilter.java:82)
         ... 25 more
    2010-02-09 13:04:23.122/90182.743 Oracle Coherence GE 3.4.2/411 <Info> (thread=Main Thread, member=47): Restarting Service: StyleCache
    Our code was doing something simple like:
    catch (Exception e) {
        throw new RuntimeException(e);
    }
    Would using the ensureRuntimeException call do anything for us here?
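    For the broader question about custom Filters, one defensive option is a delegating wrapper that traps exceptions in evaluate. This is only a sketch against the com.tangosol.util.Filter interface, and note it would not have helped with the failure above, since the ClassNotFoundException was thrown while deserializing the filter (the com.edmunds class appears to be missing from that storage member's classpath), before evaluate could run:

    import java.io.Serializable;
    import com.tangosol.util.Filter;

    /**
     * Delegating filter that traps RuntimeExceptions from the wrapped filter
     * so they never bubble up into the Coherence service thread; a failing
     * entry is treated as a non-match. In a POF-configured cluster this class
     * would also need to be a PortableObject registered in the POF config.
     */
    public class SafeFilter implements Filter, Serializable {
        private final Filter delegate;

        public SafeFilter(Filter delegate) {
            this.delegate = delegate;
        }

        public boolean evaluate(Object o) {
            try {
                return delegate.evaluate(o);
            } catch (RuntimeException e) {
                // Log and move on instead of killing the cache service.
                System.err.println("Filter " + delegate + " failed on " + o + ": " + e);
                return false;
            }
        }
    }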

  • Best way to deal with AVI clips taken with stills cam?

    We have a Nikon S70 consumer camera which also takes video clips; these clips are in .AVI format.
    I thought it would be smart to convert them to QT format, but a 1.5 GB clip became a 38 GB monster!
    What would be the best way to get these .avi clips into my FC Studio workflow so that I can edit, color correct, etc., and then burn them to DVD?
    This little cam offers several video formats, like HD (720) and a couple of formats it calls "TV" which appear to be 640x480; at least some of them are those dimensions.
    thanks!
    rc

    Like the ubiquitous QuickTime .mov file type, AVI is just a wrapper or container; the video contained within could be made from any number of codecs.
    Convert the AVI files to the codec that will match your FCP Sequence settings - prior to importing it into your FCP project.
    -DH

  • Question: Best practices for dealing with multiple AM configurations

    Hello all,
    I have a project using ADF Business Components and ADF Faces. I would like to set up multiple configurations for the Application Modules to support the following scenarios:
    1). Local testing and debugging - using a connection defined in JDeveloper and AM Pooling turned off.
    2). Testing and debugging on an application server - using a JDBC Data Source and AM Pooling turned off
    3). Production deployment - using a JDBC Data Source and AM Pooling turned on.
    It is no problem to create multiple AM configurations to reflect this scenario. In order for the web part of the application to use the correct configurations, the DataBindings.cpx file must specify the correct ones. I was thinking to have 3 different DataBindings.cpx files and to change the CpxFileName context-param in the web.xml file as needed.
    My questions:
    1). Does this make sense as an approach? It should be better than having to change a single AM configuration every time I deploy or test. Is there any easy way to keep multiple DataBindings.cpx files in sync, given that we may add new pages from time to time? Alternatively, can we do some type of "include" processing to include just the dataControlUsages section into a common DataBindings.cpx file?
    2). How would you manage the build-and-deploy process? For the most part, in JDev we would be using configuration #1. The only time to switch to configuration #2 or #3 would be to build an EAR file for deployment. Is this something that it would make sense to accomplish with ANT? I'm not an ANT expert at all. The ANT script would have "build-test-ear" and "build-prod_ear" targets which would swap in a correct web.xml file, recompile everything, build the EAR, then put the development web.xml file back. I'm relatively sure this is possible... comments?
    3). Is there some other recommended approach?
    I appreciate any insights from experience, or even just ideas or thoughts that I can test out.
    Best regards,
    John

    Hi K,
    Sorry for the long, long delay in responding; I've been traveling. And thanks for the e-mail tickler, too...
    To answer your question in short, I do think that ANT is the right way to go; there is an extra ANT task called XMLTask that I was able to download and play with, and it seems it would make this manipulation of the cpx file (or the xcfg file, for that matter) pretty straightforward. I don't have any code to post; it's just in the conceptual stage for me right now. I didn't see anything magical in JDev 11 TP3 that solved this problem for me either.
    Having said all of that, it's more complicated than it might appear. In addition to the DataBindings.cpx file (stores, among other things, which AM configuration to use for each data control), it's certainly possible to programmatically access an AM (specifying the configuration either directly in the code or via a properties file/etc). I'm not sure what the most common use case for AM configurations is, but in my case, I have a Test configuration and a Prod configuration. The Test config, among other things, disables AM pooling. When I am developing/testing, I always use the Test config; in Production, I always use the Prod config. Perhaps the best way for me to do this would be to have an "Active" config and use ANT tasks to copy either Test or Prod to "Active." However, our Subversion repository is going to have a few complaints about this.
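    For reference, the programmatic route looks roughly like this; a sketch only, and the module and configuration names are hypothetical:

    import oracle.jbo.ApplicationModule;
    import oracle.jbo.client.Configuration;

    public class AmConfigDemo {
        public static void main(String[] args) {
            // The second argument selects the AM configuration to use
            // (e.g. a "Test" configuration with pooling off vs. a "Prod" one).
            ApplicationModule am = Configuration.createRootApplicationModule(
                    "model.services.AppModule", "AppModuleLocal");
            try {
                // ... use the AM: view objects, transactions, etc.
            } finally {
                Configuration.releaseRootApplicationModule(am, true);
            }
        }
    }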
    John

  • What is the best solution when dealing with user settings

    We have a webapp; when a user logs in, all the user and application settings are read (only once) from the database and stored in a session bean (for quick access).
    But when some application setting is changed, I want to tell all the session beans that they need to reload the settings.
    Is there a good solution for this?
    Thanks in advance.

    There are several ways to do this:
    1) Put a timestamp on the application settings tables whenever they are changed. At the beginning of each request, send a small query just for this timestamp and compare it to a timestamp for when the session was created. If it is newer than the session, force a full refresh of the session data; otherwise keep using the pre-loaded session data. This keeps normal operation quick, since the SQL needed for the initial test is simple and fast, but will get the user the new information when they need it. (A sketch follows this list.)
    2) Collect your sessions into an external collection stored inside application scope. Have a trigger so that when the application settings change, the session objects are iterated through and updated. This way the settings may be updated without the user making a request, and the updates may be invisible to the user. But you could get into situations where you are updating settings at the same time you are reading them, so you will need to synchronize your access.
    3) Don't put application settings in the session. The session should be for user settings only. The application scope (ServletContext) should be used to store the application settings. As such, when application-level changes to the DB are made, the application-level object is reloaded. You only have one copy in memory, it updates just once, there is no need to iterate through sessions, and users who use the application settings are always using the most recent copy.
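    A sketch of option 1 as a servlet filter; SettingsDao and the attribute names are hypothetical stubs standing in for the real queries:

    import java.io.IOException;
    import java.util.Date;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpSession;

    /** Hypothetical DAO stubs standing in for the real settings queries. */
    class SettingsDao {
        static Date lastChanged() { return new Date(0); }  // e.g. SELECT MAX(updated_at) ...
        static Object load()      { return new Object(); } // full settings reload
    }

    public class SettingsRefreshFilter implements Filter {
        public void init(FilterConfig config) {}
        public void destroy() {}

        public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
                throws IOException, ServletException {
            HttpSession session = ((HttpServletRequest) req).getSession();
            Date loadedAt = (Date) session.getAttribute("settingsLoadedAt");
            // One small query per request; refresh only when the DB copy is newer.
            if (loadedAt == null || SettingsDao.lastChanged().after(loadedAt)) {
                session.setAttribute("settings", SettingsDao.load());
                session.setAttribute("settingsLoadedAt", new Date());
            }
            chain.doFilter(req, res);
        }
    }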

  • Best way to deal with multi-track clips?

    Hi,
    I'm editing a documentary using pre-edited program masters and have been supplied the audio stems. Can anyone recommend the best way to deal with them, so I can access any of the tracks within the main edit?
    I've tried creating a compound clip and adding the stems, but I only have access to a mono track when using the clip in my main project. I've also tried creating a multicam clip, but as my stems need to be split and spaced to line up with the main ProRes file, I don't seem to be able to insert slugs into the multicam clip.
    Has anyone found an elegant way to do this?
    Many thanks,
    Leigh

    Hi leiski,
    Happy New Year, and Welcome to the Support Communities!
    Compound clips are still the best way to approach this, because you have access to the stems at any time with a double-click. 
    The following resources may provide additional information:
    Audio editing overview
    http://help.apple.com/finalcutpro/mac/10.1/#verdcd1cc1
    Advanced multichannel audio editing
    Final Cut Pro automatically groups channels into audio components according to how the channels are configured for the clip. You can expand the audio portion of clips to reveal and edit audio components down to the individual channel level. This allows you to apply different effects to different components and streamlines the process for making quick sound cutouts to a single microphone input or other fine adjustments. 
    Important:  Many digital audio file formats, such as AAC and MP3, use interleaved stereo files, which do not contain separate left and right channels. These files appear as a single audio component unless you change the clip’s channel configuration. 
    Keep in mind the following when editing audio components in Final Cut Pro: you view and change the audio channel configuration of your clips in the Audio inspector. You can change audio component names, add or remove audio components, and configure channels in mono, stereo, and surround formats. See Configure Audio Channels.
    Configure Audio Channels
    http://help.apple.com/finalcutpro/mac/10.1/#verc1fab5f6
    Using Roles to organize clips and export audio files
    http://help.apple.com/finalcutpro/mac/10.1/#verc1faa7ce
    Cheers,
    - Judy

  • Best practice when using Tangosol with an app server

    Hi,
    I'm wondering what is the best practice when using Tangosol with an app server (Websphere 6.1 in this case). I've been able to set it up using the resource adapter, tried using distributed transactions and it appears to work as expected - I've also been able to see cache data from another app server instance.
    However, it appears that cache data vanishes after a while. I've not yet been able to put my finger on when, but garbage collection is a possibility I've come to suspect.
    Data in the cache survives the removal of the EJB, but somewhere later down the line it appears to vanish. I'm not aware of any expiry settings for the cache that would explain this (to the best of my understanding the default is "no expiry"), so GC came to mind. Would this be the explanation?
    If that would be the explanation, what would be a better way to keep the cache from being subject to GC - to have a "startup class" in the app server that holds on to the cache object, or would there be other ways? Currently the EJB calls getCacheAdapter, so I guess Bad Things may happen when the EJB is removed...
    Best regards,
    /Per

    Hi Gene,
    I found the configuration file embedded in coherence.jar. Am I supposed to replace it and re-package coherence.jar?
    If I put it elsewhere (in the "classpath"), is there a way I can be sure that it has been found by Coherence (like a message in the standard output stream)? My experience with Websphere is that "classpath" is a rather vague concept; we use the J2CA adapter, which most probably has a different class loader than the EAR that contains the EJB, and I would rather avoid a lot of trial-and-error corrections to a file just to find that it's not actually being used.
    Anyway, at this stage my tests are still focused on distributed transactions/2PC/commit/rollback/recovery, and we're nowhere near 10,000 objects. As a matter of fact, we haven't had more than 1,024 objects in these app servers. In the typical scenario where I've seen objects "fade away", there have been only one or two objects in the test data. And they both disappear...
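    One way to sidestep the class-loader guesswork might be to point Coherence at the configuration explicitly; a sketch against the 3.x API, with a hypothetical file name:

    import com.tangosol.net.DefaultConfigurableCacheFactory;
    import com.tangosol.net.NamedCache;

    public class ExplicitConfigDemo {
        public static void main(String[] args) {
            // Load the cache configuration from an explicit resource instead of
            // relying on whichever coherence-cache-config.xml the class loader
            // happens to find first.
            DefaultConfigurableCacheFactory factory =
                    new DefaultConfigurableCacheFactory("my-cache-config.xml");
            NamedCache cache = factory.ensureCache("test-cache", null);
            cache.put("key", "value");
            System.out.println(cache.get("key"));
        }
    }

    Coherence also honors the tangosol.coherence.cacheconfig system property for this, and its startup log notes which configuration file was loaded.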
    Still confused,
    /Per

  • What's best strategy for dealing with 40+ hours of footage

    We have been editing a documentary with 45+ hours of footage and presently have captured roughly 230 GB. Needless to say, it's a lot of files. What's the best strategy for dealing with so much captured footage? It's almost impossible to remember it all, and labeling it while logging seems inadequate, as it is difficult to actually read comments in dozens and dozens of folders.
    Just looking for suggestions on how to deal with this problem for this and future projects.
    G5 Dual Core 2.3   Mac OS X (10.4.6)   2.5 g ram, 2 internal sata 2 250gb

    Ditto, ditto, ditto on all of the previous posts. I've done four long form documentaries.
    First I listen to all the sound bites and digitize only the ones that I think I will need. I will take in much more than I use, but I like to transcribe bites from the non-linear timeline. It's easier for me.
    I had so many interviews in the last doc that I gave each interviewee a bin. You must decide how you want to organize the sound bites: do you want a bin for each interviewee, or do you want to do it by subject? That will depend on your documentary and subject matter.
    I then have b-roll bins. Sometimes I base them on location and sometimes I base them on subject matter. This last time I based them on location, because I would have a good idea of what was in each bin by remembering where and when it was shot.
    Perhaps you weren't at the shoot and do not have this advantage. It's crucial that you organize your b-roll bins in a way that makes sense to you.
    I then have music bins and bins for my voice over.
    Many folks recommend that you work in small sequences and nest. This is a good idea for long form stuff. That way you don't get lost in the timeline.
    I also make a "used" bin. Once I've used a shot I pull it out of the bin and put it "away" That keeps me from repeatedly looking at footage that I've already used.
    The previous posts are right. If you've digitized 45 hours of footage you've put in too much. It's time to start deleting some media. Remember that when you hit the edit suite, you should be one the downhill slide. You should have a script and a clear idea of where you're going.
    I don't have enough fingers to count the number of times that I've had producers walk into my edit suite with a bunch of raw tape and tell me that that "want to make something cool." They generally have no idea where they're going and end up wondering why the process is so hard.
    Refine your story and base your clip selections on that story.
    Good luck
    Dual 2 GHz Power Mac G5   Mac OS X (10.4.8)  

  • How to get around a performance issue when dealing with a lot of data

    Hello All,
    This is an academic question really, I'm not sure what I'm going to do with my issue, but I have some options.  I was wondering if anyone would like to throw in their two cents on what they would do.
    I have a report; the users want to see all agreements and all conditions related to the updating of rebates and the affected invoices. From a technical perspective: ENT6038-KONV-KONP-KONA-KNA1. These are the tables I have to hit. The problem is that when they retroactively update rebate conditions they can hit thousands of invoices, which blossoms out to thousands of conditions... you see the problem. I simply have too much data to grab; it times out.
    I've tried everything around the code.  If you have a better way to get price conditions and agreement numbers off of thousands of invoices, please let me know what that is.
    I have a couple of options.
    1) Use shared memory to preload the data for the report. This would work, but I'm not going to know what data needs to be loaded until report run time. They put in a date; I simply can't preload everything. I don't like this option much.
    2) Write a function module to do this work. When the user clicks the button to get this particular data, it will launch the FM in the background and e-mail them the results. As you know, a background job won't time out. So far this is my favored option.
    Any other ideas?
    Oh... nope, BI is not an option, we don't have it. I know, I'm not happy about it. We do have a data warehouse, but the prospect of working with that group makes me wince.

    My two cents: firstly, I totally agree with Derick that it's probably a good idea to go back to the business and justify their requirement in regards to reporting and "whether any user can meaningfully process all those results in an aggregate". But having dealt with customers across industries over a long period of time, it would probably be a bit fanciful to expect them to change their requirements too much, as in my experience they neither understand (too much) technology nor want to hear about technical limitations of a system etc. They want what they want, if possible yesterday!
    So, about dealing with performance issues within ABAP: I'm sure you must already be using efficient programming techniques like hashed internal tables with unique keys and accessing rows of the table using field symbols, but what I was going to suggest is to look at using [Extracts|http://help.sap.com/saphelp_nw04/helpdata/en/9f/db9ed135c111d1829f0000e829fbfe/content.htm]. I've had to deal with this a couple of times in the past when working with massive amounts of data, and I found them to be very efficient in regards to performance. A good point to remember when using Extracts, and I quote from SAP Help: "The size of an extract dataset is, in principle, unlimited. Extracts larger than 500KB are stored in operating system files. The practical size of an extract is up to 2GB, as long as there is enough space in the filesystem."
    Hope this helps,
    Cheers,
    Sougata.

  • Best way to deal with a lot of text in a presentation

    I'm creating a 2-3 minute medical presentation in FCP and Motion that contains a lot of bullet text. I'm using FCP 6 and Motion 3. Is it best to create several individual Motion projects and import them into FCP, or should I have a single 3-minute Motion project imported into FCP (which I'm currently doing)? I'm having issues matching up the timing with objects in FCP and I'm constantly shifting the text in Motion. How do others deal with a lot of Motion elements, particularly text?

    Pacific Man wrote:
    All of those are great ideas. I hadn't considered bringing the completed edit into Motion. I would think that it would be slow, though. Is it a bad idea to create a single 2-minute Motion project and import it into FCP? I'm trying to figure out what is a "best practice" when doing text-intensive corporate work in FCP.
    I don't think that I will have enough control over the assets in Keynote, although that's not a bad idea either.
    You would use the completed edit as the underlying layer for timing. You might or might not render in Motion. You could turn off or delete the timing layer, export the Motion project as a movie, or just drop the .motn file on top of the FCP timeline. One way or another you must render everything together into a single flattened movie.
    You have total control over text attributes and assets in Keynote. But you do not have all of the gee-whiz effects available in Motion, which are inappropriate for a medical presentation unless this is about new designer drugs.
    Your presentation equipment may dictate your construction methods more than anything else. Video has a limited number of pixels; Keynote really doesn't. Your text will be shown in crisp vectors. If your text is converted to a video codec, you will lose that.
    bogiesan
