Alpha channel weirdness with hardware MPE

To begin, I'm using a GTX 480 with the hack, so I'm not going to complain too loudly if this is a result of using as-yet unsupported hardware. However, I just want to check with other hardware MPE users (both legit and unsupported) whether this issue is happening on other systems.
I've noticed some oddities with imported PSD files and the way their alpha channels are rendered with hardware MPE versus software MPE. I'm putting up a couple of frame grabs with GPU acceleration turned on and off at a couple of different points where I'm using the PSDs. Forgive the PNG compression; you should be able to see the difference nevertheless.
First up, 100% opaque text over a background. The edges of the text are... fuzzy, maybe?
Software:
Hardware:
Second, a couple of layers with Fast Blur applied, fading in. The logo is 100% opaque, and there is a separate "glow" layer (rasterized in PS) behind it. Pretty obvious, here...
Software:
Hardware:
Finally, a mostly-transparent logo bug. The hardware version is not as transparent.
Software:
Hardware:
The only difference between each of these examples is that I toggled GPU acceleration in the Project Settings; it's the same PSD for each grab. I've also noticed that standard cross dissolves are a little chunky when dissolving to or from a graphic (even a flattened version); the opacity change is not as linear as it usually is. In software mode, this goes away.
Has anyone witnessed similar results? Again, I want to believe that this is just a result of using the GTX 480 and the hack, without official support. It could very well be the nVidia driver, too, I suppose, but I haven't tried rolling back to check that (I'm running the latest versions).
Thoughts?

I can confirm this.
I do not think it's the PSD, but alpha channels in general.
The colours are off when in MPE (nVidia GTX 285).
I filed a bug report.

Similar Messages

  • CS5: Bug when exporting directly from PPro with hardware MPE?

    For the record, I'm using:
    A GTX-480 (unsupported)
    The hardware MPE-enabling hack (unsupported)
    The latest version of the nVidia drivers (257.whatever)
I discovered an issue, of sorts, when exporting directly from Premiere CS5 versus using the "Queue" option to send to AME. I was exporting from an NTSC DV Standard sequence to a 320x240 H.264, in which I had enabled cropping and trimmed a bit off the sides, top, and bottom of the source; the Output tab was then set to "Scale to Fit". With hardware MPE enabled in the Project Settings, the export would be the proper final dimensions (320x240), but any area that was cropped off was instead padded with black. The actual video content was then squeezed and malformed into the area not occupied by the black bars.
I found that, if I disabled hardware MPE or sent the export through AME by use of the Queue button (instead of Export), then the export was performed properly--the video was cropped correctly and then stretched to the borders of the exported frame, maintaining the desired aspect ratio.
    As I mentioned at the outset, I'm breaking a couple rules and perhaps tempting fate with others, but before I go about rolling back my drivers and the like, has anyone noticed a similar issue?
    ADDENDUM: By the way, the export doesn't need to come from a sequence. If I load up a source clip into the Source Monitor, and export from there using the process outlined above, I get the same issue. This would seem to indicate, to me, that this is due to some incompatibility with my card, its driver and hardware MPE...

    Did he demo renders or exports?
    About 11 minutes in, he [Dave] demos accelerated H.264 renders.
    Harm Millaard wrote:
    AFAIK MPE only works with renders, not with encoding.
    Well, he did both, actually--but the quote above is a typo on my part. I meant to say "exports" or "encodes." The last couple minutes of the aforementioned demo video are dedicated to exporting a sequence directly from Premiere, and having the benefit of hardware MPE. That's where I'm getting the issue I've outlined above; with hardware MPE enabled, and using the "Export" versus AME "Queue" button/function, I get the strange black outline. With software MPE or through AME, all is well.

  • Finding special effects with alpha channel compatible with final cut studio

Does anyone know of gun flash and explosion special effects with alpha channels compatible with Final Cut Studio? I would like to be able to buy these special effects. Thanks.

    ArtBeats.com
    Depending on your needs, it's usually better to create muzzle flashes in Motion or After Effects.
    bogiesan

  • Alpha channel problem with Pixel Bender blendShaders

I'm using Pixel Bender to try and create a blend shader for a Flex 4.0 app.
    It looks like this:
<languageVersion : 1.0;>
kernel PixelReverse
<   namespace : "com.abc.def.filters";
    vendor : "EdAlive";
    version : 1;
>
{
    input image4 foreground;
    input image4 background;
    output pixel4 dst;

    void evaluatePixel()
    {
        pixel4 fgPixel = sampleNearest(foreground, outCoord());
        pixel4 bgPixel = sampleNearest(background, outCoord());
        // Treat fully transparent black background pixels as white
        if ((bgPixel.r == 0.0) && (bgPixel.g == 0.0) && (bgPixel.b == 0.0) && (bgPixel.a == 0.0)) {
            bgPixel.r = bgPixel.g = bgPixel.b = 1.0;
        }
        dst.r = 1.0 - abs(fgPixel.r - bgPixel.r);
        dst.g = 1.0 - abs(fgPixel.g - bgPixel.g);
        dst.b = 1.0 - abs(fgPixel.b - bgPixel.b);
        dst.a = fgPixel.a;
    }
}
    It looks correct in the Pixel Bender preview window.
When I use it in a Flex project with proper 32-bit PNG images, the filter seems to work correctly, except it sets any partial alpha value on the images to fully opaque (like a 1-bit alpha channel).
    Because the image preview looks correct in Pixel Bender, I'm assuming it's a problem with the bitmap data that Flex is passing to the blend shader filter.
I tried changing the filter to simply output a copy of the foreground first, and the background second, to test.
When returning a copy of the foreground, the images weren't visible on stage, either because the pixels were transparent or because it was really returning the background pixels, camouflaging the images.
    When returning a copy of the background, the images were just black rectangles.
The images are being placed on stage programmatically inside an mx Module, and having their depth property set.
    They also have draggable behaviour, where the current drag object is layered above all else during the drag.
I'm actually using a subclass of mx.controls.Image, but am simply setting the source property with preloaded bitmap data, so I am fairly sure the subclass is not a factor.
    I hope this is enough information for someone to be able to provide some help.
    Thanks

    Please make sure you have installed the latest graphics drivers, from http://www.nvidia.com/Download/index.aspx?lang=en-us, for your graphics card.

  • Alpha channel data with layered format plugin

I have a layered format plugin and I am trying to write the alpha channel information separately. I have set up channelPortProcs and I am attempting to read the pixel data with readPixelsProc. The issue is that even though the readPixelsProc function doesn't return any errors, the rectangle that is supposed to report what was actually read comes back as 0x0. I can't seem to find why this would be or what I might be doing wrong. Here's what I have so far:
// set up the pixel buffer
int32 bufferSize = imSize.h * imSize.v;
Ptr pixelData = sPSBuffer->New( &bufferSize, bufferSize );
if (pixelData == NULL)
{
     *gResult = memFullErr;
     return;
}
// Define the source and destination rectangles
ReadChannelDesc *alphaChannel = gFormatRecord->documentInfo->alphaChannels;
PIChannelPort port = alphaChannel->port;
PSScaling scale;
scale.sourceRect = scale.destinationRect = alphaChannel->bounds;
VRect destRect = scale.sourceRect;
// Set up the descriptor for reading pixel data
VRect wroteRect = {0};
PixelMemoryDesc dest = {0};
dest.rowBits = imSize.h * alphaChannel->depth;
dest.colBits = alphaChannel->depth;
dest.depth = alphaChannel->depth;
dest.data = pixelData;
// Put the pixel data into the buffer; wroteRect reports the rectangle actually read
*gResult = gFormatRecord->channelPortProcs->readPixelsProc(port, &scale, &destRect, &dest, &wroteRect);
if (*gResult != noErr)
     return;
    The alpha channel gives me the proper name, but I just can't get the data associated with it. Thanks.

    I am still trying to find a solution to this.  The propChannelName is read only and the documentInfo structure is NULL when reading.  Any suggestions?

  • Extract layer with alpha channel or with whole layer size?

First of all, sorry about my English, but I'll try my best to describe what I mean.
The text with transform:
The extract window of the text:
Is it possible to make the extraction produce a result at the whole layer size?
What I expect:

But that's a lot of layers I need to extract.
Almost 21 layers, and I need additional scaled levels, "x/2" and "x/4".
21 * 4 = 84, so I must do 84 Save As operations. That's not good.

  • Battling Hardware MPE, Episode 2: Chunky Blurs

    Round Two of my hardware acceleration MPE tests...
When using direct export with hardware MPE, any effect that renders a blurred alpha channel (Fast Blur, Gaussian Blur, etc.) creates an extremely ugly/chunky/unusable result. The source footage and sequence do not matter, nor does the destination format. The following examples are of an animated Gaussian Blur (from 0 to something) on a title clip, wiped off with a Gradient Wipe transition (toggling the GradWipe makes no difference). As with my previous thread (Battling Hardware MPE, Episode 1: Cropping on Export), I tested four variations of exporting: with hardware acceleration on and off, and with direct export and sending to the AME queue:
    GPU Acceleration off, sent to AME queue:
    GPU Acceleration off, direct export:
    GPU Acceleration on, sent to AME queue:
    GPU Acceleration on, direct export:
    So we've got good, good, good, bad. As before, the direct export method using hardware MPE seems to throw a wrench in the works. Any effect that blurs like this (including soft shadows or glows) suffers this ugly banding and harsh falloff. What's curious is that the last example is how the Program Monitor appears while GPU acceleration is enabled; if I disable it, it looks like it does in the first three. What I can't figure out, then, is where hardware MPE is actually at work! Is it in the direct export (bad) or in the queue (same as non-GPU accelerated)? It's not making much sense to me.
    Now, I had a chance to have a brief email exchange with one of the engineers regarding a similar issue a few months ago. In response to similar observations and questions, here is his reply:
You are correct that compositing with alpha can give different results. This is caused by processing in linear color so that blending is more like natural light. With MPE GPU acceleration, all compositing is always done in linear color. This has nothing to do with the hack, but is an intentional design decision to never compromise on quality. In software, compositing is only done in linear color when rendering at maximum render quality, because on the CPU it takes a lot longer. This probably also explains why you occasionally saw this with software. In the monitors we never show anything at maximum quality with unrendered footage. With software you thus need to export with max render quality specified, or set max render quality in the sequence settings and render previews. For consistent results when switching between software and GPU acceleration, I suggest enabling both max render quality and max bit depth.
    Either I'm not understanding this, or it's confirming the bug. I get that hardware acceleration is supposed to enable "linear color processing;" that's fine, if that's better than how it's usually done (whatever that is--I'm not an engineer), but based on what I'm seeing with the hardware direct export, it's WORSE than any software render or encode. Ultimately, I don't care what is technically superior if it looks aesthetically inferior. With the GPU on, a direct export is not usable, and when rendering through the queue, it looks visually no different than when not using the GPU.
    So based on the response above, I just did some more tests, this time with the Maximum Render Quality and Maximum Bit Depth options. I did not change the MRQ and MBD settings for the sequence itself--only in the export window--as it is my understanding that those check boxes will enable or disable those features. Using the same example above, I found some interesting results:
    So, this would appear to largely bear out what the engineer explained. My observations now:
    Hardware acceleration, at least as it pertains to this linear color processing issue, is fundamentally equivalent to Maximum Render Quality in software rendering mode.
    Maximum Render Quality does nothing to soften the chunky blurs, shadows or glows. Instead, Maximum Bit Depth must be enabled.
    In my initial tests, GPU On + Queue resulted in the same visual effect as GPU Off; in this test, GPU On + Queue resulted in the same effect as GPU On + Export (???)
    Setting the Maximum Bit Depth option for your sequence in hardware mode will display smooth blurs with soft falloff in the Program Monitor.
    Setting the Maximum Bit Depth and/or the Maximum Render Quality option in software mode has no effect on the Program Monitor display.
    Regardless of sequence settings, failure to set either the MRQ or MBD option in the export window will result in those settings not being applied.
    Setting either the MRQ or MBD option in the export window will always result in those settings being applied, regardless of sequence settings.
After going through all this, I may be willing to concede that everything is working correctly, more or less. However, my complaint is now, "WHY does this have to be as complicated as this?" There are simply too many combinations that have to be set properly to get the desired quality of output, and I firmly believe that this needs to be simplified. When exporting or rendering in hardware/GPU mode, I believe that MRQ and MBD should be on by default; as it is, even with the promise of "linear color processing" with hardware acceleration, I still have to remember to tick another box to make sure that blurs, shadows, and glows don't look like stair steps. The jury is still out on how "good" linear color processing is; maybe I just got used to software rendering of these soft alpha channels, but I'm having difficulty seeing the benefit of more "realistic" light processing at the moment. With hardware acceleration on, you're basically stuck with how those soft elements look; with hardware acceleration off, I can opt for the more subtle look if I like, even if it means I give up the other presumed benefits of linear color processing. When I design graphics in Photoshop, I expect them to look at least reasonably similar in Premiere; with hardware acceleration on, all bets are off.
    I realize this is a new technology and will, hopefully, continue to mature and improve, but I'm hoping that this at least sparks some conversation about this issue. Casual users may not care too much, but anyone using this product for broadcast or other commercial work should be aware of the complications associated with the technology, and should demand that it work consistently and at an expected level of quality. Maybe I'm expecting too much of this, but I certainly hope not.
    Your comments are requested and appreciated.

    According to your last picture, the one thing consistent with all the problem footage is MRQ.  It seems that feature is actually causing the problem.
    Correct; that was basically a test to see if what was described by the engineer (namely, enabling MRQ to produce better alpha channel compositing) held true. Well, it's sort of six of one, a half dozen of the other. In essence:
    Hardware MPE = Software MPE + Maximum Render Quality = Linear Color Processing
Enabling MRQ in hardware mode does nothing; enabling MRQ in software mode makes software mode act like hardware mode. In either of those scenarios, the alpha channel is composited using linear color, which causes soft shadows, glows, etc. to balloon. Additionally, the soft shadows show banding, due to 8-bit processing. Only by enabling Maximum Bit Depth can the banding be eliminated, but the bulbous glow remains. Setting Maximum Bit Depth alone in software mode has no apparent effect; the alpha channel may not be composited "more realistically" thanks to linear color processing, but it doesn't grow grotesquely or reveal banding, either.
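To make the "ballooning" concrete, here's a minimal worked sketch. I'm assuming a plain 2.2 gamma curve (the real sRGB transfer function is piecewise, so the numbers are approximate): a 50%-alpha white pixel, like the fringe of a glow, composited over black comes out around 128/255 with the classic gamma-space blend, but around 186/255 when blended in linear light. Every semi-transparent fringe pixel gets brighter, so the glow reads as fatter.

// A toy compositing comparison (Java). The 2.2 curve is my assumption, not Adobe's exact math.
public class LinearCompositeDemo {
    static double toLinear(double v) { return Math.pow(v, 2.2); }     // decode gamma to linear light
    static double toGamma(double v)  { return Math.pow(v, 1 / 2.2); } // re-encode for display

    public static void main(String[] args) {
        double alpha = 0.5, fg = 1.0, bg = 0.0; // 50%-alpha white glow fringe over black

        // Classic gamma-space blend (software mode without MRQ): a plain weighted average
        double gammaBlend = alpha * fg + (1 - alpha) * bg; // 0.50, i.e. ~128/255

        // Linear-light blend (hardware MPE, or software + MRQ): decode, blend, re-encode
        double linearBlend = toGamma(alpha * toLinear(fg) + (1 - alpha) * toLinear(bg)); // ~0.73, i.e. ~186/255

        System.out.printf("gamma-space: %.2f   linear-light: %.2f%n", gammaBlend, linearBlend);
    }
}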
    The tests above were conducted with the Gaussian Blur effect within Premiere. I decided to try a test using a Photoshop document; the PSD is a simple text layer with an Outer Glow layer style applied. I tried importing the layer with the layer style as it is, I tried flattening the layer, and I tried merging the layered document on import into Premiere; all resulted in the same effect. The results are more troubling than the examples above, as I use Photoshop for the vast majority of my graphics work for Premiere. Here's the test:
    This is text composited against the background video in Photoshop, so that I can see how the glow should be rendered:
    Nothing fancy, but the glow has a nice, subtle fall-off. When I import the PSD into Premiere, edit the alpha text layer over the background clip, and render out in software mode, this is the result:
    Visually identical to what I created in Photoshop; so far, so good. Now, if I enable MRQ and MBD on export, still in software mode, this is the result:
    Um, yuck. Enabling MRQ has had the same ballooning effect as in the tests above, but since the graphic and background video are 8-bit, and I'm not using a 32-bit effect (like the Premiere Gaussian Blur effect), enabling Maximum Bit Depth has no effect. The glow shows banding, and is just downright ugly.
    So what happens in hardware accelerated mode? No prizes for guessing correctly:
    No combination of Maximum Render Quality or Maximum Bit Depth has any effect in this case; the glow always looks like this in hardware accelerated mode.
    This is a HUGE problem, as far as I am concerned. Not only can I not use hardware MPE to export if I want a decent-looking glow, but I can't even use it in the design and build phase because the composited result looks NOTHING like what I'm building in Photoshop. I have a keyboard shortcut set up now to launch the Project Settings panel so I can toggle back and forth between software and hardware mode, so I can at least enjoy some of the acceleration of hardware for general editing, but for detail work and for export, hardware MPE is proving to be an Achilles heel.
    This is proving to be really depressing. Until I did these tests, I didn't quite realize the depth of the problem. Granted, no one is forcing me to use hardware acceleration, but am I wrong in calling this a serious flaw?

  • Firewire output of Hardware MPE - I give up

Ok... I'll forget about IEEE1394 output of hardware MPE. I suspect it isn't likely to be resolved in CS.anything. The technical challenge of routing a DV signal from the CPU/GPU back through a FireWire channel may be an insurmountable hurdle. But it's really tough to let go of your pet peeve.
If nVidiAdobe were to develop a companion card to the primary GPU that allowed realtime SDI and Component output of hardware MPE to a broadcast monitor and/or deck, I would gladly buy one in a second... regardless of cost.
Using third-party hardware and sequence presets, or employing the suggested hinky workaround using a consumer-grade monitor that requires monthly attention to maintain its color calibration using third-party hardware and software, or any other suggested approach I have read here are, IMHO, all flimsy second-tier dead ends.
Multiple monitors and/or Reference monitor in Premiere Pro
http://forums.adobe.com/thread/744683
With CS6 perched on the horizon, it seems Adobe is in the perfect position to continue its inexorable march toward taking charge of the NLE market. Most serious FCP editors are Ae users already. Give them a good taste of expanded native file support along with Dynamic Linking, throw in some new capabilities and reliability to Pr, and they will wonder why they were doing things the hard way for so long.
Fellow pros used to snicker and roll their eyes at me when I said I was using PPro2. But when I mention CS5, they always have a few questions about how well things work... and the snicker is long gone.
However, I feel the lack of realtime uncompromised external monitoring is an issue that must be natively resolved before Pr can take the lead. Adobe and nVidia should find a way to work this out.
I think a rhetorical question can be found here somewhere.

    SCAPsinger wrote:
    I've posted this before, and at the risk of sounding like a clanging gong, I'll post it again JUST IN CASE it's as useful a solution for anyone else.
    You can easily output the program monitor of PPro through a 2nd (or 3rd?) video port. In the Playback options, you can select an additional monitor as the output for previews (same place where you would select DV output for previews). There is no extending of the PPro interface or other modification required other than to select the monitor in the Playback options.
    When you hit play in PPro, it activates full screen playback on the monitor. Pause playback and it pauses on the monitor. If/when you ALT+TAB out of PPro to another application, the fullscreen goes away (and - for me - reveals my beautiful company logo on the desktop) but it comes right back as soon as you hit play.
    It works on all timelines, no codecs or special setups needed.
    I previously connected via a DVI > HDMI adapter cable, but now I use the HDMI out of the video card straight to the HDMI port of the external HDTV.
I'm sure this solution won't be satisfactory for everybody, but for me, it gives a very good representation of what my projects look like on the end user's HDTV set. AND it's very easy to set up. AND it works with all sequence settings. AND it works with hardware MPE active. AND....well...guess that's about it.
    Thanks for your input...
    That is pretty much what I am doing (except I am using two video cards) but it is not accurate when you are using DV.
    I need to output to an NTSC monitor not a progressive computer monitor.  There is a big difference when viewing.

  • How do i create an alpha channel to place into edge animate?

How do I create an alpha channel compatible with Edge Animate?

I don't use Edge, but since it is a web tool it stands to reason it would use standard web techniques, meaning it would rely on built-in transparency functions of formats like PNG and GIF, which you can easily produce by using Save for Web after creating normal transparency on a layer in Photoshop. No extra alpha channel or other extra steps required. Perhaps Edge even has some stuff that does the conversion on the fly by allowing you to open a native PSD, like in Dreamweaver, but beyond that I don't see what else it could/would do - all the features it can provide are limited by standard specifications for HTML, CSS and JavaScript. There is simply no way to do something sensible with a TIFF in a browser, if you get my meaning.
    Mylenium

  • Better Alpha channel handling

    I've asked for this a lot: alpha channel saving with PS is terrible, and could use an update. And today, I was embarrassed by a bug that caused our game engine to perform poorly as a result of the way Photoshop handles the alpha channel.
    The bug: make a 1024x1024 canvas, give it an opaque alpha channel (100% white, every pixel), then Image Size it down to something smaller, like 128x128. Now look at your alpha channel: grey pixels all along the border. ARRGGGHHH!!!1!1! It interpolated, I'm sure, using black alpha outside the canvas that doesn't actually exist, changing, in a powerful, fundamental way, the nature of the alpha channel. Video cards care about this stuff: 99.9% opaque is not 100% opaque. This is terrible.
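Here's a toy model of what I assume the resampler is doing (illustration only; this is not Photoshop's actual filter): if the filter footprint for an edge output pixel extends past the canvas, and those out-of-bounds taps read as alpha 0 instead of clamping to the edge value, a 100% opaque canvas downsamples to a border that is no longer opaque.

// Toy 1D model of the 1024 -> 128 downsample over an all-opaque alpha channel (Java).
// The 9-tap box filter is made up for illustration; the point is the edge handling.
public class AlphaEdgeDemo {
    public static void main(String[] args) {
        int[] alpha = new int[1024];
        java.util.Arrays.fill(alpha, 255); // every real pixel is 100% opaque

        // Output pixel 0: the filter window reaches 4 taps past the left edge.
        int taps = 9, sum = 0;
        for (int k = -4; k <= 4; k++)
            sum += (k < 0) ? 0 : alpha[k]; // out-of-canvas taps treated as alpha 0, not clamped
        System.out.println("edge alpha: " + (sum / taps)); // 141 -- grey, not 255
    }
}

With clamping instead (sum += alpha[Math.max(k, 0)]), the same edge pixel stays at 255, which is the behavior I'm asking for.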
    The request: That PS intelligently offer better defaults when saving an image with an alpha. Right now, it offers, "What was the last bit depth you saved an image to?" Regardless of the fact that the previous image may have no relationship to the following image being saved, the default offering is always "what I did last time". This enables very easy pruning of alpha channels that should be there, or adding opaque alpha channels that shouldn't be there, bloating the file size.
I'd like PS to determine if an alpha channel is present, and base its default choice on that. If one is present, default to 32-bits. If an alpha channel is not present, default to 24-bits. That easy.
    If multiple alpha channels are present, I'd like a dropdown menu to pick which one I want. Right now, if you save a PSD with multiple alpha channels to a 32-bit TGA, it throws all alpha channels away, and saves the image with a solidly opaque alpha channel, the choice no one asked for.
    For texture work, or works where the graphics card is the destination, not the printed page, this cavalier handling of alpha channels is definitely not sustainable.
    I'd love to never have to ask for this again.

    Generally speaking the resize of an image on a layer in which the canvas is exactly the size of the data will result in transparency peeking in around the edge (i.e., I'm agreeing with you here, just using Photoshop terms).  I've always thought this was kind of poorly thought-out too.  As you say, the algorithm must default to using 0 vs., say, replicating (or "clamping") to the alpha of the pixels right on the edge.
    I suppose theoretically, the thinking is that if you were to EXPAND the canvas, the area around the image would be transparent anyway, and a subsequent resampling would then have the same result as the above.
    Knowing this, one way to work around the problem would be to create a slightly larger image, then Canvas Size it down to your intended resolution.  That way there's layer data beyond the edges with which the resizing algorithm can work.  I realize that's probably not a practical solution in general, but a trick to keep up your sleeve if you really do need that 128 x 128 image with alpha solid to the edge.
    -Noel

  • Photoshop CS6 using javaScript to truncate alpha channel name

    Hello,
    I'm a production artist and I work with PSD files that were created in Adobe Scene7 Image Authoring Tool. These PSDs contain a background layer along with 1-20 alpha channels. My script has to make a new blank layer for every alpha channel in the document. Then it fills the new layer with light gray. So far, my code accomplishes this. However, I'd like to apply the name of the alpha channel to the layer, but I need the name to be truncated. Every alpha channel starts with one or more characters followed by a backslash and then finishes with one or more characters. Here's an example:
    An alpha channel might be named:  Floor\floor
In this example I need my layer name to be just: floor. This means all characters to the left of the backslash, including the backslash itself, need to be discarded. I was using the subString() statement to do this. When I try to step through the code, line by line in ExtendScript, I immediately get an error that says Unterminated String Constant, and line 31 of my code is highlighted. I suspect it doesn't like the way I wrote the backslash character, although I surrounded it in double quotes to define it as a string.
    Can anyone tell me why I'm getting this error?
    Below is my code with lots of comments to walk you through the process. I wrote where the error occurs in red type.
    I'm new to JavaScript so I'm not sure my while loop is accurate.
    #target photoshop
    // The #target photoshop makes the script run in PS.
    // declare variable to contain the active document
    var myDoc=app.activeDocument;
    // declare variable to contain the number of alpha channels, excluding the RGB channels
    var alphaChan = myDoc.channels.length - 3;
    alert(alphaChan + " alpha channels exist");
    // create loop to make new layers based on number of alpha channels, fill layer with gray and apply alpha channel name to new layer
    for (a=0 ; a<alphaChan ; a+=1){
    // make new blank layer
    myDoc.artLayers.add();
    // fill blank layer with gray
    var color = new SolidColor();
    color.rgb.red = 161;
    color.rgb.green = 161;
    color.rgb.blue= 161;
    myDoc.selection.fill(color);
    //variable stores alpha channel name
    var alphaName = myDoc.channels[3+a];
// variable stores length of alpha channel name
    var lz = alphaName.length;
    // declare index variable to initialize position of 1st  character of alpha channel name
    var x= 0 ;
    // truncate alpha channel name by removing all characters preceding the "\" symbol
    while (alphaName.subString(x) != "\"){          (ExtendScript gives an error for this line and highlights the backslash and surrounding quotation marks)
        alphaName = alphaName.subString((x+1),z);
        x+=1;
        z-=1;
    return alphaName;
    // remove the backslash from alpha channel name
    alphaName = alphaName.subString((x+1),z);
    //  apply truncated alpha channel name to corresponding layer
    myDoc.artLayers[a].name = alphaName;

while (alphaName.subString(x) != "\"){
should be
while (alphaName.substring(x, x + 1) != "\\"){
A lone backslash inside a string literal escapes the closing quote, which is why ExtendScript reports an unterminated string constant; "\\" is how you write a literal backslash. (Note too that the JavaScript method is substring, all lowercase, and giving it both indices compares just the single character at x.)

  • Can no longer import QT files with alpha channel

I have been using these client-supplied QuickTime files with alpha channels just fine for weeks; then all of a sudden, after a crash the other day, I was unable to open sequences with these files and export them; the exporter would just freeze. The clips played in the timeline, but VERY sluggishly.
On the recommendation of a few posts around here, I removed the clips from the project, and now I cannot import them back in; I get a "the importer reported a generic error" message. I am able to open them in QT Pro and export to a new file that will import to PPro, but I lose the alpha channel.
    As always, I'm up against a deadline and any help would be greatly appreciated.
    I have installed QT 7.07, no help.
I'm on a Windows 7 machine running CS Master Collection 5.0, quad-core AMD with 8 gigs of RAM.
    These files worked fine just days ago!!!!
BTW, I just switched over to Premiere a few months ago and to be honest I can't understand how anyone would stick with this buggy software; as a professional I've never used anything this bad before.

    Welcome to the forum.
    Try these.  Attempt to re-import after each one:
1. Clean the media cache database via Edit | Preferences | Media.
2. If step 1 doesn't work, then find all the .qtindex, .mpgindex, .cfa and .pek files associated with the media that's supposed to be in your project and delete them.  Then clean the media cache database again.
3. Launch Pr and, while it's launching, hold down the Alt + Shift keys until the Welcome screen appears.  Alt resets your preferences and Shift resets the plug-in cache.
    To address the other issues you say you've been having with Pr, you should start a different thread (or threads).  Coming from other editors, there may be a difference in the way Pr does things that produce unexpected results that may be seen as bugs.  More serious issues, such as crashes, can often be caused by 3rd-party hardware like AJA, Matrox or BlackMagic and the associated drivers.  Outdated or incorrect drivers for audio and video cards can also cause problems.  I recommend that you start troubleshooting those areas first.
    Other issues may have workarounds.  If you have serious, reproducible problems that have no workaround, then please file bug reports here:
    Adobe - Feature Request/Bug Report Form
    -Jeff

  • Can't export with alpha channel

    PROBLEM: No matter what export settings I use or manipulate I can't get the exported movie to have an alpha channel.
    I've read several threads on this, and none of them solved my issue. Temporarily I just imported the Motion 3 project and used it as my overlay video - it's a title intro that slides out to reveal the video underneath it.
    However, I'd prefer to create self-contained QT's as there are 7 sessions.
In After Effects I'd typically choose Animation and select 'using Straight Alpha'. Motion has a preset of Lossless + Alpha Movie, or you can go to Advanced, select the Output tab, and deselect Pre-multiply Alpha. Neither works. The exported video still has black. What's totally insane is that I was using these settings up until today and everything was fine!
    And yes, I've chosen 'Transparent' under the view options and still black is always included.

    We do have an answer, it's usually a simple user error: not being able to tell the alpha is really there. If you're watching in QTPlayer, you will see black. If you open the movie in FCP, you should be able to change the alpha interp to suit.
Even if OPs claim to have tried this, it's still the first thing that comes to mind and, since the OPs rarely return to tell us what REALLY happened, we are left with the assumption that this was the problem all along.
If you set your render controls to Animation with Millions of Colors + (many of us simply forget the PLUS), and we presume there is no solid shape or opaque layer (easily missed in Motion's weird timeline display), it is fundamentally impossible not to get an alpha.
    Not to say it can't happen but I've been on support forums for AE, Media 100, FCP and others for more than a decade and I have never heard of the Animation codec actually failing. It is always user error or a broken application.
    I hope you folks find a new and unique solution that we can add to the list. But this is the best we know to give you: Check and recheck the easy stuff.
    bogiesan

  • No Method of Batch Export for Clips with Alpha Channels?

    Good morning,
    As many a flustered editor has eventually discovered, in order for FCP to export sequences with alpha channels to a 32-bit format, the timeline has to be un-rendered at the time of export, or else the transparent parts will appear black in the outputted file. This sort-of makes sense if you know how FCP and render files work, but in a perfect world I think I'd have designed the export interface a bit differently. Now that I think about it, I'm actually working in an Animation (Millions of Colors +) sequence, so converting transparent areas to black makes no logical sense at all.
    Anyway, I have several sequences that I would like to export as 32-bit TGA QuickTime files, preserving their transparency. If I Export Using Compressor, the process results in pre-rendering of the sequence, turning the transparent areas black. The same problem occurs if I export QuickTime reference movies from FCP and open them directly with Compressor.
    Does anyone know of a way to avoid this silly phenomenon or am I stuck individually exporting each sequence from FCP, one...at.......a................time?
    Thanks,
    Zap

    Thanks, Andy, "Batch Export" eventually did the trick!
    I forgot about that tool because I've never actually had to use it before! After playing around with it for a while, I found that as long as the sequence settings for each sequence in the batch are set to a codec with an activated alpha channel, it works just fine.
    Thanks again,
    Zap

  • ImageIO PNG Writing Slow With Alpha Channel

I'm writing a project that generates images with alpha channels, which I want to save in PNG format. Currently I'm using javax.imageio to do this, using statements such as:
    ImageIO.write(image, "png", file);
    I'm using JDK 1.5.0_06, on Windows XP.
    The problem is that writing PNG files is very slow. It can take 9 or 10 seconds to write a 640x512 pixel image, ending up at around 300kb! I have read endless documentation and forum threads today, some of which detail similar problems. This would be an example:
http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=6215304
This surely must be resolvable, but after much searching I've yet to find a solution. If it makes any difference, I ONLY want to write PNG images, and ONLY with an alpha channel (not ever without), in case there are optimisations that that makes possible.
    If anyone can tell me how to address this problem, I'd be very grateful.
    Many thanks, Robert Redwood.

    This isn't a solution, but rather a refinement of the issue.
    Some of the sources I was reading were implying that the long save time might be due to a CPU heavy conversion process that had to take place before the BufferedImage could be saved. I decided to investigate:
    I loaded back in one of the (slowly) saved PNG images using ImageIO.read(file). Sure enough, the BufferedImage returned differed from the BufferedImage I had created. The biggest difference was the color model, which was DirectColorModel on the image I was generating, and was ComponentColorModel on the image I was loading back in.
    So I decided to manually convert the image to be the same as how it seemed to end up anyway. I wrote the following code:
/**
 * Takes a BufferedImage object, and if the color model is DirectColorModel,
 * converts it to be a ComponentColorModel suitable for fast PNG writing. If
 * the color model is any other color model than DirectColorModel, a
 * reference to the original image is simply returned.
 * @param source The source image.
 * @return The converted image.
 */
public static BufferedImage convertColorModelPNG(BufferedImage source)
{
    if (!(source.getColorModel() instanceof DirectColorModel))
        return source;
    ICC_Profile newProfile = ICC_Profile.getInstance(ColorSpace.CS_sRGB);
    ICC_ColorSpace newSpace = new ICC_ColorSpace(newProfile);
    ComponentColorModel newModel = new ComponentColorModel(newSpace, true, false, ComponentColorModel.TRANSLUCENT, DataBuffer.TYPE_BYTE);
    PixelInterleavedSampleModel newSampleModel = new PixelInterleavedSampleModel(DataBuffer.TYPE_BYTE, source.getWidth(), source.getHeight(), 4, source.getWidth() * 4, new int[] { 0, 1, 2, 3 });
    DataBufferByte newDataBuffer = new DataBufferByte(source.getWidth() * source.getHeight() * 4);
    ByteInterleavedRaster newRaster = new ByteInterleavedRaster(newSampleModel, newDataBuffer, new Point(0, 0));
    BufferedImage dest = new BufferedImage(newModel, newRaster, false, new Hashtable());
    int[] srcData = ((DataBufferInt) source.getRaster().getDataBuffer()).getData();
    byte[] destData = newDataBuffer.getData();
    int j = 0;
    byte argb = 0;
    for (int i = 0; i < srcData.length; i++)
    {
        j = i * 4;
        argb = (byte) (srcData[i] >> 24); // top byte of the packed ARGB int is the alpha
        destData[j] = argb;
        destData[j + 1] = 0; // red, green and blue assumed black
        destData[j + 2] = 0;
        destData[j + 3] = 0;
    }
    //Graphics2D g2 = dest.createGraphics();
    //g2.drawImage(source, 0, 0, null);
    //g2.dispose();
    return dest;
}
My apologies if that doesn't display correctly in the post.
    Basically, I create a BufferedImage the hard way, matching all the parameters of the image I get when I load in a PNG with alpha channel.
The last bit (for simplicity) just makes sure I copy over the alpha channel of the old image to the new image, and assumes the color was black. This doesn't make any real speed difference.
    Now that runs lightning quick, but interestingly, see the bit I've commented out? The alternative to setting the ARGB values was to just draw the old image onto the new image. For a 640x512 image, this command (drawImage) took a whopping 36 SECONDS to complete! This may hint that the problem is to do with conversion.
    Anyhow, I got rather excited. The conversion went quickly. Here's the rub though, the image took 9 seconds to save using ImageIO.write, just the same as if I had never converted it. :(
    SOOOOOOOOOOOO... Why have I told you all this?
Well, I guess I think it narrows down the problem, but eliminates some solutions (to save people suggesting them).
    Bottom line, I still need to know why saving PNGs using ImageIO is so slow. Is there any other way to fix this, short of writing my own PNG writer, and indeed would THAT fix the issue?
    For the record, I have a piece of C code that does this in well under a second, so it can't JUST be a case of 'too much number-crunching'.
    I really would appreciate any help you can give on this. It's very frustrating.
    Thanks again. Robert Redwood.
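One avenue worth trying, as a hedged sketch rather than a confirmed fix: generate the image as TYPE_4BYTE_ABGR from the start. That is the same byte-interleaved ComponentColorModel layout that comes back from ImageIO.read(), so if the writer's slowness really is a conversion problem, an image born in that layout should skip the conversion entirely.

import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class PngSaveDemo {
    public static void main(String[] args) throws IOException {
        // Byte-interleaved ABGR with alpha: matches the layout ImageIO.read() produces,
        // rather than the int-packed DirectColorModel of TYPE_INT_ARGB.
        BufferedImage image = new BufferedImage(640, 512, BufferedImage.TYPE_4BYTE_ABGR);

        Graphics2D g2 = image.createGraphics();
        // ... draw the generated content, with its alpha, here ...
        g2.dispose();

        ImageIO.write(image, "png", new File("out.png"));
    }
}

If the generation code is tied to the int-packed layout, rendering into a TYPE_4BYTE_ABGR image this way at least makes for a quick experiment to confirm or rule out the conversion theory.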
