Compositing, Stitching, Mixing?

Hello
Wondering if FMS/Encoder could handle the following scenario in a relatively 'out of the box' way:
Given:
- Source video media on the server/web storage, either as solid assets or pre-fragmented into defined 'clip' segments
- Source 'overlay' media on the server/web storage, as video-with-alpha assets
- Source audio media on the server/web storage
Given a '3 layer' playlist where the desired outcome is to:
- Stitch together the selected clips (regions from within the source video media) from various sources into a single, solid video asset.
- Composite the 'overlay' media using normal alpha compositing into that asset at variable times (defined by the user).
- Mix the audio media into the solid asset (mixing with any audio contained in the original source clips).
Producing a cached, solid video asset that can be viewed/transcoded/retrieved from the server.
I don't see anything about compositing videos or mixing audio into a stream. The desired output format would be high quality (as a source for later transcoding, so ... as close to HD as possible). Realtime compositing/rendering is not a requirement.
The following scenarios come to mind:
1)
Write our own DirectShow device that provides the pre-composited input as frames to Flash Video Encoder. That is, assuming the encoder/server does not provide a vehicle for compositing/mixing.
2)
We see that there is a way to use 'DVR' like/on-demand functions of the streaming server to stitch. That solves part of our problem (stitching together clip fragments from multiple origin source videos). But it doesn't solve the problem of compositing video frames on top of them, or mixing different audio with them.
3)
Is there any way to use SSAS to essentially replicate the behavior of our 'client side' player, which actually does all this compositing in realtime (it composites the 2 layers into a single BitmapData, and we could technically have it mix the various audio channels into a single byte array of stereo audio output per frame if we needed to)? I.e., can we write our own 'provider' for encoding that would render frame data purely within ActionScript and then provide it as the equivalent of a 'camera object' or whatever the encoder needs to do its thing?...
From a high level POV what we are trying to do is develop a rendering solution for a realtime UGC video toolset that we've developed on the Flash client platform. Our current solution requires a custom player (that 'performs' all the user's edits/choices in realtime) and we are trying to eliminate that in favor of a flattened media asset that can be pushed to any CDN (or streamed from our own service).
Thanks,
Neil Voss
alinear LLC

Well, from what I can see, Flash (Interactive) Media Server has a fair bit more capability than conventional serving ... not that I see anything dealing with compositing or mixing data beyond maybe hacking the live encoder by feeding it our own input. We've found a way to do what was described using a mix of other stuff (ffmpeg, ImageMagick, SoX), but it seemed worth it to explore FMS to see if it could be leveraged. This is for a rendering situation; realtime availability of a render would be nice but not expected. The client player can still perform the mixes in realtime (so within the normal player and environment the user sees no delay between saving and publishing)... this is just a way to syndicate the result, allow it to be downloaded, and offer it up for devices that don't support Flash. Thx N
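For reference, the ffmpeg route mentioned above maps fairly directly onto the 3-layer playlist. A minimal sketch, assuming two pre-trimmed clips, one video-with-alpha overlay and one extra audio track (all file names, the 5-second overlay start time and the codec settings are invented for illustration; this is not anything FMS provides):

import subprocess

clips = ["clip_a.mp4", "clip_b.mp4"]    # pre-trimmed source segments (assumed names)
overlay = "overlay_with_alpha.mov"      # video-with-alpha asset (assumed name)
music = "soundtrack.wav"                # extra audio layer (assumed name)

filter_graph = (
    # 1) stitch the clips (video + audio) into one continuous stream
    "[0:v][0:a][1:v][1:a]concat=n=2:v=1:a=1[base_v][base_a];"
    # 2) alpha-composite the overlay on top, starting at t=5s (a user-defined time)
    "[base_v][2:v]overlay=enable='gte(t,5)'[out_v];"
    # 3) mix the extra audio with the audio already in the stitched clips
    "[base_a][3:a]amix=inputs=2[out_a]"
)

subprocess.run([
    "ffmpeg",
    "-i", clips[0], "-i", clips[1], "-i", overlay, "-i", music,
    "-filter_complex", filter_graph,
    "-map", "[out_v]", "-map", "[out_a]",
    "-c:v", "libx264", "-crf", "18",    # high-quality intermediate for later transcodes
    "render.mp4",
], check=True)

The concat, overlay and amix filters cover the stitch, composite and mix steps respectively; the result is a flat asset that can then be transcoded or pushed to a CDN.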

Similar Messages

  • Compositing - Channel Mixer Help

    I've been using AE now for a few years, (since v5.5) and mostly use it for motion graphics. Using CS3 now. But right now, I'm beginning to work through Steve Wright's book, Digital Compositing for Film and Video (http://www.amazon.com/Digital-Compositing-Film-Video-Second/dp/024080760X), to focus more on true compositing.
    The author's aim is to keep the lessons application-independent and present concepts in a way that would be applicable to most software packages. (However, considering the author's frequent references to "nodes" and his well-known status as a Shake user & Shake instructor (http://movielibrary.lynda.com/html/modPage.asp?ID=408), it's apparent that using Shake, while reading this book, would lead to the least bumps in the road, when attempting to follow along.)
    But I'm not using Shake, I'm using AE and have encountered a small bump in the road. Regarding luma keys he mentions how luma values are represented in the individual colors R,G,B. To cut to the chase, he says, "One standard equation that mixes the right proportions of RGB values to create a luminance image on a color monitor is:
    Luminance = 0.30R + 0.59G + 0.11B"
    Great. But I can't figure exactly how to mix the RGB values with the above proportions in AE. It's quite easy & straight-forward in Photoshop, using the Channel Mixer (example here: http://www.geocities.com/kevincbrock/channel_mixer.png). But AE's Channel Mixer does not seem to work quite the same way as its counterpart in PS. There are values there, for instance, "Red-Red", but they seem to run in values from -200 to 200, not percentages, like PS. So my question is, how do I achieve the same channel mixing results in AE that I can in PS?
    I'll add this: I was able to achieve the desired result in AE, but by using a round-a-bout sort of tactic. Instead of using the "Channel Mixer", I applied the "Set Channels" effect to my sample shot. By changing the "Set Red to Source 1's LUMINANCE", then doing the same for Green and Blue, setting them all to LUMINANCE, the resulting image is identical to the sample shot that was tweaked in Photoshop. Again, it worked, but I'd like to know how to change the proportions of RGB to other values than those represented in the formula above.
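    (As a quick sanity check on the weighting quoted above, here it is in plain Python rather than anything AE-specific; the pixel value is invented:)

    # Rec. 601 luma weighting from the book: Y = 0.30R + 0.59G + 0.11B
    def luminance(r, g, b):
        """Luma of an 8-bit RGB pixel using the 0.30/0.59/0.11 weights."""
        return 0.30 * r + 0.59 * g + 0.11 * b

    # Example: a saturated orange pixel
    print(luminance(255, 128, 0))   # roughly 152 -> a mid-bright grey in the luma image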
    Hope this all makes sense.
    Oh, and while I'm here.....I'll throw this in: In AE, when you apply the "Curves" effect, can you input precise values for input/output? Again, this is simple in Photoshop, but unless I don't have some display items turned on in the Curves editor in AE, I don't see a way to enter precise values. Is this possible?
    Thanks in advance for any input.
    EDIT: for some reason, the links I've posted aren't showing up properly. So if you're interested in following any of them you'll have to copy/paste on your own. Don't know why......

    >Great. But I can't figure exactly how to mix the RGB values with the
    >above proportions in AE.
    You could use expressions... You create sliders for each input, then feed their output to the actual effects. Something like
    //----begin expression
    linear(value,0,100,-200,200)
    //----end expression
    should do in most cases. Still, be careful. Not everything written in that book directly applies to AE. It does not factor in any specifics of color management or different gammas, which could massively skew results because the source footage will already be pre-adjusted and the comp will add another level of adjustments.
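    (If it helps to see what that expression computes, here is the same remap in plain Python; the function just mirrors the arithmetic of AE's linear(), it is not AE code:)

    def linear(value, in_min, in_max, out_min, out_max):
        """Map value from [in_min, in_max] to [out_min, out_max], clamping at the ends."""
        value = max(in_min, min(in_max, value))
        t = (value - in_min) / (in_max - in_min)
        return out_min + t * (out_max - out_min)

    # A 0-100 "Photoshop style" slider driving AE's -200..200 Channel Mixer range:
    print(linear(0, 0, 100, -200, 200))    # -200.0
    print(linear(59, 0, 100, -200, 200))   # about 36
    print(linear(100, 0, 100, -200, 200))  # 200.0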
    >Oh, and while I'm here.....I'll throw this in: In AE, when you apply >the "Curves" effect, can you input precise values for input / output.
    Nope. AE's Curves effect will adapt to the project bit depth, as do a lot of other effects, presenting you with the problem of which values to actually use. This would only make sense if the effect itself were independent from AE, processed in float all the time, and only the inputs were presented as different normalized ranges. There was/should have been an alternative Curves effect from Frischluft, but at this point development has reached a dead stop, so it's doubtful it will ever be unleashed upon the world.
    >Don't know why......
    HTML-Code functionality had to be disabled thanks to spammers. :-|
    Mylenium

  • Multi-camera IMAQdx systems: shortcuts for stitched composite image

    Imagine a system using, for example, multiple GigE cameras through the IMAQdx interface where we wish to form a composite stitched image from the multiple camera views. The stitching principle is naive, straightforward concatenation, one next to another.
    The problem is that while it is trivial to build such a composite image, it's difficult to do it very efficiently. The image sizes are large, tens of megapixels, so every copy matters. Alternative hardware configurations would open up a lot of options, but say we're stuck using GigE cameras and (at least initially) the IMAQdx interface. What tricks, or even hacks, can you guys imagine facing this challenge?
    I've seen some talk about the IMAQdx grab buffers and it appears to me that one cannot manually assign those buffers or access them directly. The absolute optimal scenario would of course be to hack your way around to stream the image data directly next to each other in the memory, sort of as shown below in scenario1.png:
    The above, however, doesn't seem to be too easily achieved. Second scenario then would be to acquire into individual buffers and perform one copy into the composite image. See illustration below:
    Interfaces usually allow this with relative ease. I haven't tested it yet but based on the documentation using ring buffer acquisition and "IMAQdx Extract Image.vi" this should be possible. Can anyone confirm this? The copying could be performed by external code as well. The last scenario, without ring buffer, using "IMAQdx Get Image2.vi" might look like this:
    The second copy is a waste so this scenario should be out of the question.
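    (A rough NumPy sketch of scenario 2, one copy per camera into a preallocated composite; the frame sizes and camera count are invented, and the fake frames stand in for whatever IMAQdx hands back rather than the real driver API:)

    import numpy as np

    CAM_COUNT = 4
    FRAME_H, FRAME_W = 4096, 5120          # per-camera frame size (illustrative)

    # Preallocate the composite once; cameras are laid out side by side.
    composite = np.empty((FRAME_H, FRAME_W * CAM_COUNT), dtype=np.uint8)

    def place_frame(cam_index, frame):
        """Copy one camera's frame into its slot in the composite (a single copy)."""
        x0 = cam_index * FRAME_W
        composite[:, x0:x0 + FRAME_W] = frame

    # Example: fake frames standing in for the per-camera acquisition buffers
    for cam in range(CAM_COUNT):
        fake_frame = np.full((FRAME_H, FRAME_W), cam, dtype=np.uint8)
        place_frame(cam, fake_frame)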
    I hope this made some sense. What do you wizards say about it?

    Hi,
    Sorry, the constraints are not really well documented as they are dependent on platform, camera type, camera capabilities, and how the driver handles things. All of these are subject to change, so we decided instead to try to make the errors very self-descriptive about how to satisfy any requirements.
    You are correct that these fundamentally come down to making sure that the image buffer specified is able to be directly transferred to by the driver. The largest requirement is that the image data type is the same and doesn't need any decoding/conversion step. The other requirements are more flexible and change depending on many factors:
    - No borders, since this adds a discontinuity between each line. This error doesn't apply to GigE Vision (since the CPU moves the data into the buffer) or to USB3 Vision cameras that have a special "LinePitch" feature that can allow them to pad the image lines. The USB drivers of more modern OSes (like Win8+) have more advanced DMA capabilities so it is possible/likely that this also can be ignored in the future.
    - Line width must be a multiple of 64 bytes (the native image line alignment on Windows) - same caveat as the border requirement
    So, if you end up using GigE Vision cameras, this should just work. If you want to use USB3 Vision you have a few more constraints to work with.
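    (For example, the 64-byte line-pitch rule boils down to a check like this; a rough illustration only, with one byte per pixel assumed for a Mono8 camera:)

    def line_is_aligned(width_px, bytes_per_pixel=1):
        """True if one image line is a whole multiple of 64 bytes."""
        return (width_px * bytes_per_pixel) % 64 == 0

    print(line_is_aligned(5120))   # True  (5120 bytes/line = 80 * 64)
    print(line_is_aligned(5000))   # False (5000 bytes/line is not 64-byte aligned)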
    Eric

  • Student Film Composer in need of mockup mixing tips! What do you do?

    Hello everyone,
    First off, I understand that its impossible to duplicate a professional film score mix/master with a home, sample based system. However, the technology has advanced to where people are achieving AMAZINGLY realistic midi mockups! In fact, frequent posters like Seven Von Kampen, Rohan, iSchwartz, and others are achieving GORGEOUS results with their mockup mixes, and inquiring minds like mine would love some tips to achieve a similar sound/space.
    I'm about to start a student film which FINALLY calls for a "hollywood orchestral score" and need some tips to place my VSL Opus orchestra in a "hollywood sound stage." I've recently realized the power reverb has in making samples really come ALIVE, and thus am looking for as much information as possible to aid in this project. I know topics on reverb & mixing may sound basic to you pros, but being a student composer without much mixing experience (or classes) to learn techniques, discovering the power of reverb came as a huge surprise! However, with this new perspective I now feel like I'm floundering in a sea with no direction to dry land! Thus I'm hoping to find some guidance from those in the know.
    As I said above, I'm currently just starting on composing the score, so perhaps it's premature to be thinking about mixing already. However, I find it hard to separate composition from mixing, especially when effects like reverb could mean the difference between a realistic-sounding, and therefore acceptable, musical idea and one that isn't.
    Any tips/tutorials you can provide would be GREATLY appreciated. I've already found a great tutorial by Beat Kaufman that has proven very beneficial (though I'm still not happy with the results and need to experiment more): http://www.beat-kaufmann.com/tipspcmusic/howtousevslaudio/index.php#50946996180042f2e
    Thank you all in advance for your help and reply. I'm EXTREMELY impressed with the quality and level of professionalism my fellow Logic users are achieving, and hope a few tips and plenty of experimentation will lead me in the right direction. I feel I have decent music composition chops (for a student) but poor mixing skills, which makes my demos suffer. In fact, I'm even looking for a mixing engineer rather than a composer to intern with next semester, in the hopes of picking up some tips. Man, I really wish CSUN offered mixing classes!!
    Thanks again for your help! It is much appreciated!
    Jon

    Hi Jon,
    I'm no expert on composing for film, but have mixed a number of recordings and a few records.
    Reverb can be your friend or your worst enemy. I love your words, "floundering in a sea with no dry land." Those words - your words - will help you in understanding what to avoid with reverb. If your music sounds less direct and like it's now floundering in a sea without dry land ... you're using reverb incorrectly or simply using too much verb.
    There are as many approaches to mixing as there are people mixing music. And no matter the approach it takes time to get it right .. time and practice.
    My thoughts ...
    Write/compose without any verb. Begin your mix without any verb. Reverb can become like wallpaper in that you will lose perspective if it is introduced too early. Once the mix is more or less set, begin adding reverb.
    Try with one or two reverbs and use aux sends to route audio to the reverb (don't insert several reverbs on each track). Remember, reverb is often used in an unnatural way for pop and rock - and that can be cool. But with jazz and orchestral stuff - and oftentimes pop - reverb is used to emulate a space (large hall, small room, airplane hangar, bathroom). If you want your orchestra to sound like they all recorded in the same room at the same time, pick one very good reverb and route elements to it.
    If elements sound like they are floundering for dry land, back off the verb tail or turn down aux fader or add an EQ to strip and mess with that. To test the tail - to really see how dense your verb is - hit play and stop at a fairly dense section of music and listen to how long verb takes to die out. That is a good way to keep perspective of how much verb.
    Generally ... less is more. Watch elements too. Elements such as big orchestral bass drums don't work too well if they are too "wet" (too wet is too much verb). Some elements will react better to verb than others. I am rambling here now, but often by adding only choice elements to the reverb bus and leaving other elements out of verb mix - it will create the illusion that all elements are in the same space - but will keep some of the clarity of a dry mix.
    Final rambling point. Although reverb can cover up the sound of bad samples or mask it, try using the best samples you can get your hands on rather than masking with verb.
    Good luck.
    Darrel, Ottawa

  • How do i uninstall ichat effects?

    How do I uninstall iChat effects? I just want the bug out ones. They show up in Photo Booth but not iChat.

    HI,
    Do you mean MoreiChatEffects ?
    This is "known" to have some issues, as the Quartz files that are used are from an older version of Quartz Composer and some will not play in later versions of the OS.
    This installs the Quartz files in the "wrong" place.
    The Apple ones are in Hard Drive Name/System/Library/Compositions
    Ones to be used by all Users should be placed in Hard Drive Name/Library/Compositions
    Ones for the User only should be put in Hard Drive name/Users/(Account name)/Library/Compositions.
    The MoreiChatEffects ones get put in the System/Library/Compositions getting mixed up with the Apple ones.
    There should be an "Uninstall" option if you re-run the Installer.
    Support messages on the site list issues trying to run in 64-bit mode, but they end in 2009.
    iChatFX is something similar  but expands on what Quartz can really do.
    It takes what ChatFX did and updates it for later versions.
    ChatFX (the second of those two links) worked up until OS X 10.4.9. At OS X 10.4.10 Apple changed the way the video was produced for iChat (and Photo Booth), which meant the access part of the Quartz files did not work.
    iChatFX  (Note the leading "i") was produced by a different company.
    The Demo does not have all the 200 options but does have many (with Demo written all over them).
    However these also have been caught out by changes in the way Apple grabs the Video from the Camera.
    iChatFX also puts files in the "Apple" System/Library/Compositions folder instead of the "All Users" /Library/Compositions folder.
    The Good News
    In the Hard Drive Name/System/Library/Compositions folder there is also a file called "DefaultOrder.plist", which can be opened with Quick Look and/or made to Open With TextEdit, and it lists all the Apple ones that should be in the folder.
    The Mountain Lion ones are here  (TextEdit Copy)
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <array>
        <string>/space alien</string>
        <string>/nose twirl</string>
        <string>/chipmunk</string>
        <string>/lovestruck</string>
        <string>/dizzy</string>
        <string>/blockhead</string>
        <string>/bug out</string>
        <string>/frog</string>
        <string>/sepia</string>
        <string>/blackandwhite</string>
        <string>/glow</string>
        <string>/comicbook</string>
        <string>/colorpencil</string>
        <string>/thermalcamera</string>
        <string>/xray</string>
        <string>/popart</string>
        <string>/bulge</string>
        <string>/dent</string>
        <string>/twirl</string>
        <string>/squeeze</string>
        <string>/mirror</string>
        <string>/lighttunnel</string>
        <string>/fisheye</string>
        <string>/stretch</string>
        <string>/clouds</string>
        <string>/color dots</string>
        <string>/earthrise</string>
        <string>/eiffel tower</string>
        <string>/fish</string>
        <string>/rollercoaster</string>
        <string>/sunset</string>
        <string>/yosemite</string>
    </array>
    </plist>
    You can then drag the ones you do not want to the Trash (or at least place them outside the Compositions folder(s)).
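    (If you would rather check programmatically which compositions are third-party add-ons, a rough sketch along these lines would do it; it only reads the plist and prints names, nothing is moved or deleted:)

    import plistlib
    from pathlib import Path

    # Apple's own effects live here, together with DefaultOrder.plist (see above).
    system_comps = Path("/System/Library/Compositions")

    with open(system_comps / "DefaultOrder.plist", "rb") as fh:
        # Entries look like "/bug out"; strip the leading slash to get the effect name.
        apple_names = {entry.lstrip("/").lower() for entry in plistlib.load(fh)}

    # Anything in the folder whose name is not in the plist was added by an installer
    # such as MoreiChatEffects or iChatFX and could be moved out or trashed by hand.
    for comp in sorted(system_comps.glob("*.qtz")):
        if comp.stem.lower() not in apple_names:
            print("third-party:", comp.name)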

  • Photo Stitching (i.e. compositing) with Aperture

    Hi, I'm a new Mac user using Aperture 1.5 and want to stitch some pictures together to create a panoramic view.
    I thought Aperture would do this, but after using it, I don't think it has the capability, so if I'm wrong, please correct me. Although it doesn't appear to stitch, I'm still happy with the s/w.
    I know Shake 4.0 can stitch, but that is more of a s/w solution than I'm looking for. My new iMac took most of my money.
    So, can anyone recommend a third party s/w (free or cheap) to stitch photos on an iMac using Aperture 1.5???

    The main Mac stitching programs are:
    PTGui (.com) - which started out as a Windows app but has recently been ported to Mac. Intel-native. Bit of a learning curve but very powerful.
    PTMac - originally the mac equivalent to PTGui, but has fallen quite a bit behind in terms of speed and features compared to PTGui. Written by the same person who wrote Calico, as linked above. NOT Intel-native
    Calico - much easier to use than either of the 'PT' apps, but no real control if it doesn't do what you expect. Intel-native.
    Autopano Pro - similar to Calico in ease of use, but with a bit more control, and more geared towards people who shoot quite a lot of panoramas.
    Canon Photostitch - free with your camera is a bonus. Pretty easy to use, but can't cope with wide-angle lenses and very limited output options.
    Photoshop & Elements - if you already have a copy it might be worth trying the photomerge function, but generally it's useless in comparison to a proper stitching program, even PhotoStitch.
    Lastly, Realviz Stitcher - Express costs about the same as the other apps but is relatively limited in output options. The full version costs as much as all the other apps put together, and is fairly easy to use, powerful, but rather buggy. NOT Intel-native.
    As Scott says, always apply the same adjustments to all the images in a panorama - lift-n-stamp is a WONDERFUL tool for the panoramic photographer. In fact, when shooting you should always try and lock focus, zoom, white balance and exposure, or you will get rather frustrated when trying to put the images together.
    So says your friendly neighbourhood panoramic photographer...
    Ian (to see some panos, just stick "Ian Wood" into Google...)

  • Problems with getting signal to my Video Mixer with DVI to Comp adapter

    Hi, I have a MacBook Pro and a G5 Quad-core with an NVIDIA 7800GT card. I am trying to get my G5 to detect my Edirol V4 video mixer. The MBP sees it immediately. The G5 cannot detect it!
    what i have tried so far:
    - swapped the DVI-Comp adapters i have on the MBP with the G5
    - tried the DVI to VGA adapter - works fine on the G5
    - swapped out cables (even though they are brand new)
    I have a show on the weekend and need to solve this ASAP. Can anyone help me figure out why the G5 won't detect the V4??
    many thx in advance
    fj

    Well, normally I run using a dual output, where the external monitor is a VGA switcher or a video mixer. I run the video mixing software on my desktop monitor and the second monitor is the video mixed output.
    Up to now I have only used DVI to VGA and it always worked. But I just picked up a better mixer and its inputs are composite, so I bought the DVI to composite adapters. As stated, the MBP sees the V4 immediately. The G5 screen flashes but there is no detection.

  • DVI --- S-Video/Composite - Blinking?

    I have a DVI to S-Video/Composite adapter that I'm plugging into a video mixer for Keynote presentations, that displays on various resolution monitors (some are regular TVs while others are flat screen LCDs). Seemingly without reason, all the monitors begin blinking until I unplug / plug the connection back in again. From that point on, sometimes my presentation has no issues at all, while other times it seems to immediately start happening again. Has anyone experienced this issue as well?
    Running 800 x 600, 60hz (NTSC)

    Apple's video adapters have a ROM chip inside them with EDID data contained within the ROM. That is one thing that will definitely be different between the two. The NTSC and PAL version (M9267G/A) gives priority to NTSC. This could cause problems in a PAL country. If you hooked a PAL TV up using this adapter, you might get a rolling (non-synced) screen and you'd have no good way to recover since the image on the screen would be unusable.
    I don't think one adapter would have any advantage over the other in terms of PAL picture quality. It would just be a question of compatibility as outlined above. That's my best guess.

  • Understanding compositions with different frame rate

    Hey everyone,
    I am working with AVCHD footage with a frame rate of 25.
    If I drop my footage into a new composition, After Effects sets everything up automatically.
    I'd now like to know what happens to my AVCHD clip if I change the frame rate of my composition from 25 to 24. What exactly happens to the missing frame? Does AE just get rid of it?
    Does it make a difference if I just change my comp settings from 25 to 24, or if I drop my 25 comp into a new one at 24?
    Thinking of it the other way around, if I change my frame rate from 25 to 50, does that mean that AE doubles the frames of my files?
    In general, do the changes have any impact on the speed of my footage?
    I'm looking forward to your reply. Much appreciated.
    Best,
    Alex
    PS: Does this count for Premiere Pro as well?

    When you create a layer in a composition, and that layer is based on a video footage item, the composition samples the data from the footage item at each composition frame. If there isn't a single frame of data from the footage item that exactly lines up with the frame in the composition, then the composition samples from the footage frames on either side and does one of several things: it can either just pick one of the two frames, or it can mix them together, or it can do something much smarter and try to reconstruct the image that would have existed between the two frames. What it does depends on what frame blending setting you're using.
    When you're using a footage item that has a higher frame rate than your composition, and both frame rates are relatively high, you don't need to worry about this at all, as the default settings tend to work just fine.
    To sum up, changing the frame rate of the composition just changes how often the composition samples the image data from the footage item. It does nothing to the footage item itself and doesn't affect how fast the video from that footage item plays in a layer.
    To interpret a footage item as being at another frame rate, or to conform a footage item to a frame rate, you have to use the Interpret Footage dialog box.
    For more information, see "Frame rate".
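    (To make the "sampling" idea concrete, here is a toy nearest-frame illustration of the arithmetic; it is not how AE is implemented and it ignores frame blending:)

    def source_frame_for(comp_frame, comp_fps, footage_fps):
        """Which footage frame is nearest to a given composition frame's time."""
        t = comp_frame / comp_fps          # time of the comp frame in seconds
        return round(t * footage_fps)      # nearest footage frame at that time

    # 25 fps footage sampled by a 24 fps comp over one second:
    print([source_frame_for(f, 24, 25) for f in range(24)])
    # Source frame 13 never appears - roughly one 25 fps frame is skipped per second,
    # but the footage itself still plays at its real speed.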

  • Dynamic link confusing two separate AE compositions in PPro from different projects

    I am using CS5 Master Collection for Windows 7 (64 bit) on an HP Pro desktop (Intel core 2 duo, 6Gb RAM). The tower has two hard drives installed (no RAID). CS5 is installed on the original C drive along with the OS, but the working files, footage etc are all on the larger, secondary F drive.
    I am working on a DVD project consisting of several separate PPro projects. Each project has been colour corrected in AE using dynamic linking. Recently two of my projects suddenly became mixed up, by which I mean the video (not the audio) from one AE composition in one project has taken the place of another in another project. The clip remains the same length, but the link appears to be referencing the wrong clip during playback. Interestingly, there are two clips on the track above the affected clip that fade out and in at either end; during the dissolves between clips, the correct footage plays. It is only when the clip plays by itself that the problem occurs. The next clip also has the same error, though the problem only occurs for half the clip and then suddenly corrects itself. Just to be clear, all clips have been rendered, there are no more than two video tracks on the sequence, and I have tried replacing the composition from the asset menu (in which the clip plays back correctly), but the problem remains in the sequence.
    I hope that description is clear enough. My guess is some kind of file path error in the dynamic link, perhaps due to the separate hard drive. But I really wouldn't know. I'm very pressed for time right now and I'd prefer not to go against workflow and re-edit a new composition in AE; I would also like to understand what is going on in case it happens again.
    Any help with this problem would be appreciated,
    J

    Thanks Colin, I tried what you suggested with both the AE project that it should be referencing and the AE project it is erroneously referencing, but both failed to fix the issue. It's as if a hidden clip were laid over the top of the one I want to play. Frustrating.
    (Quoting the earlier reply from Colin Brougham in Premiere Pro CS5 & CS5.5:)
    It's not the comps that are the problem; it's the project file names themselves. You can have all the comps in a project named the same thing, and it'll be fine, but naming projects incrementally is an issue. After Effects includes a versioning function (Save and Increment) that will create a new AE project file with a serial number, e.g. AE Project 01, AE Project 02, etc. This is all well and good if you're just working in After Effects, but Dynamic Link gets confused and will usually start looking at the later project file for comps. That will just create a big mess, as you've found. Changing the names of the project files won't help things relink automatically properly, but it can help you fix things. Make all your DL AE comps offline in your Pr projects, and name your AE projects in a manner that is not serial (give them more random names). Select your offline AE comps, and right-click > Link Media, and point to the new AE project. Things should link up correctly then--but if they don't, you'll just have to remove all the AE comps from your Pr project, reimport the comps from the newly-named AE project, and replace them. It's a pain but it'll probably just keep getting worse if you don't.

  • The editable rectangles of my still photos in the composition panel suddenly stopped showing up.

    Hi There, Adobe Community!
    I am relatively new to After Effects CC and am using it to create a stop motion film with a large amount of photos.
    Usually, when importing the individual images into my composition panel, a sequence of editable rectangles appears in the composition panel allowing me to edit the frame length of the still photos.
    After around 950 imported images, the little rectangles suddenly stopped showing up. But when looking at a preview of the project, it's as if the photos are in there, in the sequence they should be. I'm just unable to edit them like the previous ones with rectangles showing.
    I probably am not describing this too well, so here is a screen shot which hopefully makes my problem somewhat clearer. The area of issue is where you see the descending sequence of boxes in the comp panel. The red tracking bar is there to show that there are still photos continuing the sequence, I just can't see the boxes for them...
    thank you so much for any help in the matter!

    So I guess you are talking about the layers in the timeline going missing, not the handles in the Comp Window.
    I think you have run into a display bug. Try selecting the top hundred or so layers and pre-compose them. If the missing layers appear then that's the problem. I have had it happen a couple of times before in previous versions of AE..
    Here's a workflow hint for you, break your comps up into short sequences instead of trying to do a 4 or 5 minute piece with 900 layers in a single comp. You'll find editing and changing a lot easier.
    I would also pre-mix your music. AE is a really terrible music mixer....

  • Manual stitching of photos in Photoshop CS5

    In Photoshop CS5, is there a way to manually stitch non-overlapping photos together, rather than using the auto layouts in photomerge?

    Put the photos on separate layers, align them by moving/rotating the upper layer, then use a layer mask or the eraser tool to remove parts of the upper photo where they meet up with those same parts from below (partial opacity can help you judge where this is).  You can even distort images using the Transform tool as needed.
    Besides matching brightness and color, if you mask the upper layer so that it's not on a straight boundary, but rather the transition follows natural lines in the image, you can make the transition between photos practically impossible to see.
    As an example, I did up a composite photo of the moon from many high magnification exposures that way.
    -Noel

  • Error message while trying to encode a video with afterFX composition involved

    OK, so here's the thing.  I wanna encode this video I made, in CS3, but every time I try, it says,
    "the source and output color space are not compatible or a conversion does not exist"
    This only happened after I tried adding an After Effects composition to the mix.
    The After Effects comp is nothing more than a high res JPG spinning, which I made in illustrator and exported as a JPG (obviously).
    I searched around the internet for this error and the help menu but cannot find anything that even comes close.  The videos in the Premiere project are YUV 422 10-bit CCIR 601 non-compressed.
    Anyone have any ideas?

    it doesn't solve the issue
    Maybe not.  But it will get the job done, which is the higher goal here, no?

  • IPad video to car DVD - has no sound with composite cable

    Hi,
    I'm trying to watch iPad movies on my in-car DVD system. Through a genuine Apple composite cable I can get video without sound. When I connect the USB from the composite I get sound but no video.
    Any ideas??
    I have iPad 2.
    Stereo in car is JVC 846.
    Thanks

    Hi,
    I do have the same problem with my F10.
    I can capture TV with sound, but not S-Video or FBAS with sound. In ShowBiz I can see that when recording TV I can choose audio from the built-in capture device - no other sound device. Recording either S-Video or FBAS, I cannot choose ANY audio recording device. But if I want to record audio, I CAN choose an audio device other than the built-in TV-capturing device - namely the audio device.
    It seems that when capturing video via the built-in capturing device, you HAVE TO capture audio from the SAME device - and that using the S-Video or FBAS input there is no audio there - so capturing video from anything other than TV cannot be combined with audio captured from another source.
    The only option is to capture video and audio separately (one after the other) and mix them together in WinDVD Creator or ShowBiz.
    Thanx a lot,
    Hann

  • Anyone recommend a  Mixer/FW interface with L8?

    A few months ago I replaced my analogue mixing desk + original MOTU 828 with a MOTU896 Mk3
    I can't hear the difference between the Mk3 and the original setup (through powered DynAudio BM5A monitors).
    The 896 is packed with features which I would never use and is counter-intuitive, with cheap, small, fiddly knobs. There is no simple way to get 3 separate, simple, independent mixes: i) to headphones, ii) to the computer for recording, and iii) a third mix purely to the monitors.
    MOTU UK admit this is not possible without a third piece of gear, which involves assigning a separate stereo mix from Logic to the MOTU and then to a separate analogue fader which directly controls the monitor level.
    My studio is designed for overdubs and composition, not for mixing/mastering etc., and I have concluded this mixerless setup was a mistake.
    My question is: has anybody used a smallish mixer with FireWire built in that is compatible with/had good results with Logic 8?
    So far the M-Audio NRV10 looks like the answer. It has the added advantage of being able to send 10 outputs from Logic back into the mixer for outboard mixing.
    thanks in advance for any comments
    m s

    Hi Musicspirit,
    It all depends on the number of channels you need. Inputs/outputs?
    I could recommend the TC Electronic Studio Konnekt 48 if 4 mic inputs are enough. It offers you 2 headphone mixes straight from the box through 2 aux busses. This unit also has some great features rarely seen on other units, like fantastic effects that can be used in the low-latency monitor/headphone mix or as a VST plugin! Check it out at http://www.tcelectronic.com/StudioKonnekt48.asp
    Anyway, I am still waiting for mine, I just ordered one for the tiny studio I have at home.
    Another choice could be the Presonus StudioLive 16.4.2 desk, which is awesome. In need of a live mixer, studio mixer, FireWire interface, 16 state-of-the-art class A mic pres and channel strips, accompanied by loads of compressors, noise gates and effects, all packed in one sturdy 19" box? This is it. Retail est. $2k, same in Euros. Pound? Very fluctuating lately, not? Well, I am seriously thinking of getting this desk too when it is available and the price is nice. Look here http://www.presonus.com/products/Detail.aspx?ProductId=52 There is also a nice blog here http://presonusaudio.blogspot.com
    Hope this helps,
    Picard
