FlasCC and video encoder

Hi,
I don't know a lot about FlasCC, but I think it might be possible to write a fast video encoder with it from a C++ library?
I want a cross-platform solution to encode BitmapData to video (MP4 or another standard format). Right now I use a native extension, and it's a pain in the a** to port to Android, Windows, Mac, etc.
Is there anything I'm not aware of that can encode BitmapData to video, like an ffmpeg port to SWC via FlasCC or something like that?
I know there is an FLV writer that was made with Alchemy some years ago, but it only creates FLV, and FLV is not a standard format (iOS and Android can't open it).
Any help on this matter would REALLY help me, as my current project creates video from BitmapData.
Regards,
Paul-André Belle-Isle

Hi Paul-André,
The easiest solution would be to use ffmpeg as an SWC. But so far I haven't gotten an answer here and don't understand why ffmpeg keeps reporting a stack overflow when I export it as an SWC (compiled as a SWF it works fine with their example). If that could be resolved, it would be pretty easy to convert to MPEG-4 or other ffmpeg-supported formats, with the ffmpeg SWC library wrapped in a custom wrapper. I poked at ffmpeg some more, and when compiling with -O4 my example ran without any stack overflow. The sample is trivial (encoding some MPEG-1 dummy frames), but the library didn't crash with a stack overflow. The only downside is that the SWF size jumps by about 1 MB. So I guess the remaining tasks would be to write a simple AS3 <-> ffmpeg wrapper and to understand/implement their muxing examples.
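For reference, the trivial MPEG-1 sample being described has roughly this shape. This is not the code from the post, just a minimal sketch against FFmpeg's public libavcodec API (written here with the newer send_frame/receive_packet calls; the FlasCC-era API used avcodec_encode_video2, but the overall structure is the same), and the function name and parameters are only illustrative:

/* Minimal libavcodec encode loop producing a raw MPEG-1 elementary stream.
 * Error handling is mostly omitted to keep the sketch short. */
#include <stdint.h>
#include <stdio.h>
#include <libavcodec/avcodec.h>

static void encode_dummy_mpeg1(const char *path, int w, int h, int nframes)
{
    const AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_MPEG1VIDEO);
    AVCodecContext *ctx  = avcodec_alloc_context3(codec);
    ctx->width     = w;
    ctx->height    = h;
    ctx->time_base = (AVRational){1, 25};
    ctx->framerate = (AVRational){25, 1};
    ctx->gop_size  = 12;
    ctx->pix_fmt   = AV_PIX_FMT_YUV420P;
    ctx->bit_rate  = 2000000;
    avcodec_open2(ctx, codec, NULL);

    AVFrame *frame = av_frame_alloc();
    frame->format = ctx->pix_fmt;
    frame->width  = w;
    frame->height = h;
    av_frame_get_buffer(frame, 0);

    AVPacket *pkt = av_packet_alloc();
    FILE *out = fopen(path, "wb");

    for (int i = 0; i < nframes; i++) {
        av_frame_make_writable(frame);
        /* A real AS3 wrapper would convert the BitmapData pixels to YUV420P
         * here and copy them into frame->data[0..2]; for dummy frames we just
         * leave whatever is in the buffers. */
        frame->pts = i;
        avcodec_send_frame(ctx, frame);
        while (avcodec_receive_packet(ctx, pkt) == 0) {
            fwrite(pkt->data, 1, pkt->size, out);
            av_packet_unref(pkt);
        }
    }

    /* Flush the encoder and write the MPEG-1 sequence end code. */
    avcodec_send_frame(ctx, NULL);
    while (avcodec_receive_packet(ctx, pkt) == 0) {
        fwrite(pkt->data, 1, pkt->size, out);
        av_packet_unref(pkt);
    }
    const uint8_t endcode[] = {0, 0, 1, 0xb7};
    fwrite(endcode, 1, sizeof(endcode), out);

    fclose(out);
    av_packet_free(&pkt);
    av_frame_free(&frame);
    avcodec_free_context(&ctx);
}

The pieces the AS3 <-> ffmpeg wrapper would still have to add are the per-frame BitmapData-to-YUV conversion and the muxing step that turns the raw stream into an MP4.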
If the ffmpeg SWC doesn't work, I'd say it would have to be the same route I took with Ogg, e.g. getting a specialized encoder like http://www.videolan.org/developers/x264.html and attaching a custom Flash wrapper for addFrame, encode, etc. By the way, I did manage to make my Ogg encoder async, basically calling addFrame every frame like the FLV encoder does. x264 of course produces only a raw video stream, not an MP4 container; libtheora produces an OGV file, so that wasn't an issue there.
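For the x264 route, the C side of such an addFrame-style wrapper might look roughly like the sketch below. It assumes an 800x450 I420 input (the capture size mentioned further down); the BitmapData-to-I420 conversion and the AS3 glue are left out, and the X264Writer name and writer_* functions are made up for illustration:

/* Sketch of an x264-based addFrame encoder producing a raw Annex B .264
 * stream; an MP4 muxer would still be needed afterwards. */
#include <stdint.h>
#include <stdio.h>
#include <x264.h>

typedef struct {
    x264_t        *enc;
    x264_picture_t pic;   /* reused input picture (I420 planes) */
    FILE          *out;
    int64_t        pts;
} X264Writer;

static int writer_open(X264Writer *w, const char *path, int width, int height, int fps)
{
    x264_param_t p;
    x264_param_default_preset(&p, "veryfast", "zerolatency");
    p.i_width   = width;
    p.i_height  = height;
    p.i_fps_num = fps;
    p.i_fps_den = 1;
    p.i_csp     = X264_CSP_I420;
    x264_param_apply_profile(&p, "baseline");
    w->enc = x264_encoder_open(&p);
    x264_picture_alloc(&w->pic, X264_CSP_I420, width, height);
    w->out = fopen(path, "wb");
    w->pts = 0;
    return (w->enc && w->out) ? 0 : -1;
}

/* Called once per frame, after the caller has filled w->pic.img.plane[0..2]
 * with the Y, U and V planes converted from the BitmapData pixels. */
static void writer_add_frame(X264Writer *w)
{
    x264_nal_t *nals;
    int n_nals;
    x264_picture_t pic_out;
    w->pic.i_pts = w->pts++;
    int size = x264_encoder_encode(w->enc, &nals, &n_nals, &w->pic, &pic_out);
    if (size > 0)
        fwrite(nals[0].p_payload, 1, size, w->out);  /* NAL payloads are contiguous */
}

static void writer_close(X264Writer *w)
{
    x264_nal_t *nals;
    int n_nals;
    x264_picture_t pic_out;
    while (x264_encoder_delayed_frames(w->enc)) {    /* flush buffered frames */
        int size = x264_encoder_encode(w->enc, &nals, &n_nals, NULL, &pic_out);
        if (size > 0)
            fwrite(nals[0].p_payload, 1, size, w->out);
    }
    x264_encoder_close(w->enc);
    x264_picture_clean(&w->pic);
    fclose(w->out);
}

The output is only the raw H.264 stream, which is exactly why the MP4 muxer discussed next is needed.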
So additionally this means getting a specialized MP4 muxer. I googled around and happened to find http://gpac.wp.mines-telecom.fr/mp4box/ - it's a command-line tool, and from the source code it appears to do the job. But of course, that's another lib to compile for FlasCC (edit: this wouldn't be necessary if ffmpeg works without other issues, as I wrote above).
Finally, about speed: it depends on what you need. In my app I'm replaying an animation and capturing an 800x450 container. Compared to the trivial Alchemy FLV lib I'd been using before, it's a bit slower but still fine - on my machine (a three-year-old, pretty cheap laptop):
FLV encoding: ~11fps
OGG encoding: ~7.3fps
So all in all, it depends on how you want to use it, but it surely wouldn't be an easy job. I've also got some paid/personal projects ongoing; I'd be willing to look into this further out of curiosity as well, but only for good compensation - this is a bit hardcore (either solving the stack overflow, creating a wrapper for ffmpeg and adapting their muxing example, or making MP4Box work). If you want to discuss this further, it's probably better to drop me an email at [email protected]

Similar Messages

  • HuffYUV problem and video encoding

    I have a dilemma. Adobe Premiere CS6 and Adobe Media Encoder do not support variable frame rate. So now I have to convert my videos to a constant frame rate without losing any quality.
    I'm aware of Lagarith, but nothing encodes to it. ALL of the encoders I mainly use support HuffYUV, and they encode it way faster than something like VirtualDub. It's many hours of video. I thought Adobe Media Encoder would solve it, but it turns out it doesn't support converting variable frame rate to constant, so my audio and video are out of sync in Premiere.
    I can't really get it into Lagarith, so I probably just need to somehow use HuffYUV. Does anybody know of a HuffYUV plugin for Premiere or something?

    You shouldn't need a plug-in for that to work, as long as the codec is installed on the system.

  • IDVD video encode error when burning to DVD

    I am trying to burn a new copy of a project I successfully burned to DVD 2 years ago.  I have not changed the source material.  After the menu renders and video encoding begins I immediately get an error message "video encoding error."  There are no menu warnings.  All video is successfully previewed.  Nothing has changed from the previous burn.  The project info box shows all video and audio properly encoded and present.
    What gives?

    The way from FCE or FCP to iDVD has been the same for me for lots of years, since FinalCut 1.2.5 I think (there was no Express or Pro then).
    And QuickTime Conversion never worked for me - I guess You were lucky two years ago.
    FCE/P to iDVD
    Several things
    • How to go from FCE/P to iDVD
    • Free space on Start-up hard disk
    • Encoding
    • Brand and type of DVDs used
    • Burn speed set
    • iDVD BUG
    • Chapters
    How to go from FCE/P to iDVD - I do:
    • Disable Screen and Energy saver
    • IMPORTANT --> FIRST in FinalCut - Mix Down Audio under Sequence Menu / Render Only / Mix-down
    • Export out as a QuickTime .mov file
    • Select with Mark - Chapter Mark
    • Not as Self-Contained (not important but saves time and space)
    • NO QUICKTIME CONVERSION (IMPORTANT)
    This QT.mov file I import into iDVD from within iDVD.
    Free space on Start-up hard disk
    I set a minimum of 25GB (for Mac OS and iDVDs temp files)
    Just double click on the Hard Disk icon (top right on the Desktop) and read the bottom part of the window frame, e.g.
    ( in English - 1 of 14 selected, 38.24 GB FREE)
    Encoding
    • I use Pro Quality encoding if the material (video + menu) exceeds 40 minutes! If less, I use Best Performance
    Brand and type of DVDs used
    • I use Verbatim
    • I use DVD-R
    Burn speed set
    • I set this down to x4 (or x1) - BEST IS NOT BEST at all, only fastest, and results in the maximum amount of BURN ERRORS!
    iDVD BUG
    • One can not go back to movie-project for any alterations and then go back to
    the iDVD project. It will notice and ask You to either Up-date or Cancel. Neither
    of them will work.
    Medicine - Start a brand new iDVD project.
    Use of Chapters
    • I only use a to z and 0 to 9 in naming them. NO other symbol/letter !
    • NO Chapter-mark at the very beginning - iDVD NEEDS TO set this by itself
    • No Chapter marks in or within two seconds from a transition
    (Way around this last one - Export movie as QT full quality and NO Chapter marks
    Import this into a new Movie-project and now You are free to set C-Ms where You want
    them except at very beginning - still)
    Material used to build movie
    • video - I use streamingDV (or convert all other to this e.g. .mp4, .avi, .wmv etc)
    • audio - I use .aiff 16-bit 48kHz or from Audio-CD (44.1kHz) - no .mp3 or direct from iTunes
    • photos - I use .jpg - no .bmp etc
    Trash iDVD pref. file and run Repair Permissions - and have a re-try.
    from post ??
    May not be relevant, but I had the same problem with iDVD, where burned DVDs showed a green screen. It was cured by quitting Quicksilver and Quickeys as well as disabling sleep and screen-saving
    Yours Bengt W

  • Qmaster and Flash Video Encoder 8

    Can I use Qmaster to set up a render farm to encode QuickTime movies to FLVs using Flash Video Encoder? If so, does anyone know where I can find the command-line samples to set up the Macs?

    http://manuals.info.apple.com/en/AppleQmaster3_UserManual.pdf
    Pages 18 and 29 should be helpful.
    However, it might just be easier to encode to H.264.

  • CS3 Video Encoder w/ .m2v and .wav files

    Hello - I use the CS3 video encoder to prepare YouTube videos. I usually input an AVI file and it works great. However, I have a few videos which are separate video (.m2v) and audio (.wav) files. What is the easiest way, or the best program to use, to convert these files for use with the video encoder?
    TIA

    Premiere & After Effects can do that. If you don't have these tools, you can also use some free tools; check www.videohelp.com for more information on this.

  • Flash video encoder and OS X

    I have Adobe Flash CS3 Video Encoder on my Mac OS X and am trying to convert movies to .flv files. They encode OK, but on playback, the movies are consistently cut off in the middle. It seems other folks are having this problem and are getting no answers from Adobe. Some have said that Flash Video Encoder won't work correctly with Mac OS X at all and that you have to encode videos using it on Windows. That seems ridiculous to me. Does anyone know anything about this or possible workarounds? Apparently, the issue also occurs with Flash 8's Video Encoder. I need to make Flash videos!


  • Adobe Video Encoder and FLash 9

    Hi,
    Any idea when Adobe Flash CS3 Video Encoder will get an
    update to have Flash Player 9 available as an output option?
    cheers

    David,
    I am afraid you will find that industry-wide in computer software. And the funny thing about it is that many applications, including Acrobat, got their start on the Mac platform, because unlike the Windows crowd, which was unwilling to use a new product with new ideas, the Mac crowd was willing to experiment.
    Acrobat originated on the Mac platform. Excel, if you dig really deep into history, got its start as the spreadsheet/database part of AppleWorks for the Apple II. There are many others. Then, once the PC crowd saw what the Mac people were using, they wanted it too. And because that user base is larger, Mac R&D lagged in keeping up, because the money isn't there.
    So if you're a new Mac user, be prepared to be frustrated by the disparities. Not until the Mac user base gains a 50/50 share will things change.

  • DV Video Encoder -- Type 2 and AVI OpenDML Uncompressed

    Dear all,
    I've been converting a number of edited 1280 x 720 videos to DV PAL Standard with Adobe Premiere Elements 9.  However, when I was importing the output files into another DVD editor, some of the files were rejected for the following reasons, highlighted in red.  Do you know how to fix it?
    100 - DV Video Encoder--type 2, 24bits 720 x 576 4:3, PCM 48kHz 16bit Stereo
    101 - AVI OpenDML Uncompressed, 24bits 720 x 576    , PCM 48kHz 16bit Stereo
    102 - AVI OpenDML Uncompressed, 24bits 720 x 576    , PCM 48kHz 16bit Stereo
    104 - AVI OpenDML Uncompressed, 24bits 720 x 576    , PCM 48kHz 16bit Stereo
    105 - AVI OpenDML Uncompressed, 24bits 720 x 576    , PCM 48kHz 16bit Stereo
    106 - DV Video Encoder--type 2, 24bits 720 x 576 4:3, PCM 48kHz 16bit Stereo
    100P- DV Video Encoder--type 2, 24bits 720 x 576 4:3, PCM 48kHz 16bit Stereo
    101P- DV Video Encoder--type 2, 24bits 720 x 576 4:3, PCM 48kHz 16bit Stereo
    102P- DV Video Encoder--type 2, 24bits 720 x 576 4:3, PCM 48kHz 16bit Stereo
    104P- DV Video Encoder--type 2, 24bits 720 x 576 4:3, PCM 48kHz 16bit Stereo
    105P- DV Video Encoder--type 2, 24bits 720 x 576 4:3, PCM 48kHz 16bit Stereo
    106P- DV Video Encoder--type 2, 24bits 720 x 576 4:3, PCM 48kHz 16bit Stereo
    Thanks in advance.
    Best regards,
    Anson

    Unfortunately, although you can apply all kinds of styles in the forum editor, it throws most of them away when you post. So, as you will have seen, you cannot display in colour. If I recall correctly, the only universally accepted features are normal, bold and italic. Depending on your browser, smileys may or may not work.
    What other editor gave you the errors?
    Are all those lines you've posted error messages?
    Can you load the offending clips into GSpot v2.60 and post the screenshots here?
    Cheers,
    Neale
    Insanity is hereditary, you get it from your children

  • How to get audio and video in one file

    Hi... I am a Premiere beginner and I have a simple question, I think. :) Every time I try to export media from Premiere (using Media Encoder) I get a video file and some audio files, but what I need is only ONE file that contains both audio and video. For example, I have edited a film in HD quality and I want to export it; I choose the H.264 format (make some settings, PAL, audio quality...), and when it's "all done", the only "result" I get is one video file and two audio files, and they are not together. :) So please help me, anyone. As you can see I am not from an English-speaking country, so sorry for my English... thanks for the help.

    There is no indication that something is out-of-sync or the ability to move/slip into sync like in FCE/FCP.
    To the best of my knowledge, using match frame and overwriting the video with video+audio (or attaching it as a connected clip if you don't want to overwrite the video, which might have effects applied) is the best way to do this.
    You can use the blade tool first in the timeline to get just the section you want to get back into sync. Make cuts so that you have a single clip that needs the audio resynced, select it, and use shift-F to select the original synced clip in the Event Browser. Make sure your play head is at the beginning of this video clip in the timeline. Press option-2, then 'q', and now you have the audio back in your timeline, synced with the video.  Delete the old, no-longer-synced audio and you should be set.

  • DO i need some extra hardware interface for receving both Audio and video

    Hi, I am doing an e-learning project. I have to capture video from a webcam and voice from a headset and send them to a client,
    but my code works for only one of them at a time.
    Do I need some extra hardware interface for receiving both audio and video? I'm using the AVTransmit and AVReceive code found on this site.
    After running Tx,
    if I give Dsound:// & vfw://0 in Media Locator, only sound is received and no video,
    and when I give vfw://0 in Media Locator, only live video is transmitted.
    I'm using JMF 1.1.2e.
    If anyone knows how to run this, or the cause of it, please reply soon. I will be very thankful.
    Transmitter/server-side code. First run Tx on the server:
    import java.io.*;
    import java.awt.*;
    import java.awt.event.*;
    import java.net.*;
    import java.util.*;
    import javax.media.rtp.*;
    import javax.swing.*;
    import javax.swing.event.*;
    import javax.swing.border.*;
    public class Tx extends JFrame implements ActionListener, KeyListener,
    MouseListener, WindowListener {
    Vector targets;
    JList list;
    JButton startXmit;
    JButton rtcp;
    JButton update;
    JButton expiration;
    JButton statistics;
    JButton addTarget;
    JButton removeTarget;
    JTextField tf_remote_address;
    JTextField tf_remote_data_port;
    JTextField tf_media_file;
    JTextField tf_data_port;
    TargetListModel listModel;
    AVTransmitter avTransmitter;
    RTCPViewer rtcpViewer;
    JCheckBox cb_loop;
    Config config;
    public Tx() {
    setTitle( "JMF/RTP Transmitter");
         config= new Config();
         GridBagLayout gridBagLayout= new GridBagLayout();
         GridBagConstraints gbc;
         JPanel p= new JPanel();
         p.setLayout( gridBagLayout);
         JPanel localPanel= createLocalPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 0;
         gbc.gridy= 0;
         gbc.gridwidth= 2;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( localPanel, gbc);
         p.add( localPanel);
         JPanel targetPanel= createTargetPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 1;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( targetPanel, gbc);
    p.add( targetPanel);
         JPanel mediaPanel= createMediaPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 2;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( mediaPanel, gbc);
    p.add( mediaPanel);
    JPanel buttonPanel= new JPanel();
    rtcp= new JButton( "RTCP Monitor");
    update= new JButton( "Transmission Status");
         update.setEnabled( false);
         rtcp.addActionListener( this);
         update.addActionListener( this);
         buttonPanel.add( rtcp);
         buttonPanel.add( update);
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 3;
    gbc.gridwidth= 2;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( buttonPanel, gbc);
         p.add( buttonPanel);
    getContentPane().add( p);
         list.addMouseListener( this);
         addWindowListener( this);
    pack();
    setVisible( true);
    private JPanel createMediaPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         JLabel label= new JLabel( "Media Locator:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         tf_media_file= new JTextField( 35);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_media_file, gbc);
         p.add( tf_media_file);
         tf_media_file.setText( config.media_locator);
         cb_loop= new JCheckBox( "loop");
         startXmit= new JButton( "Start Transmission");
         startXmit.setEnabled( true);
         startXmit.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 2;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( cb_loop, gbc);
         p.add( cb_loop);
         cb_loop.setSelected( true);
         cb_loop.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( startXmit, gbc);
         p.add( startXmit);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Source");
         p.setBorder( titledBorder);
         return p;
    private JPanel createTargetPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         targets= new Vector();
         for( int i= 0; i < config.targets.size(); i++) {
         targets.addElement( config.targets.elementAt( i));
    listModel= new TargetListModel( targets);
    list= new JList( listModel);
         list.addKeyListener( this);
         list.setPrototypeCellValue( "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx");
    JScrollPane scrollPane= new JScrollPane( list,
    ScrollPaneConstants.VERTICAL_SCROLLBAR_AS_NEEDED,
    ScrollPaneConstants.HORIZONTAL_SCROLLBAR_NEVER);
         gbc= new GridBagConstraints();
         gbc.gridx= 0;
         gbc.gridy= 0;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( scrollPane, gbc);
         p.add( scrollPane);
    JPanel p1= new JPanel();
         p1.setLayout( gridBagLayout);
         JLabel label= new JLabel( "IP Address:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
         p1.add( label);
         tf_remote_address= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_address, gbc);
         p1.add( tf_remote_address);
         label= new JLabel( "Data Port:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
         p1.add( label);
         tf_remote_data_port= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_data_port, gbc);
         p1.add( tf_remote_data_port);     
    JPanel p2= new JPanel();
    addTarget= new JButton( "Add Target");     
    removeTarget= new JButton( "Remove Target");
         p2.add( addTarget);
         p2.add( removeTarget);
         addTarget.addActionListener( this);
         removeTarget.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 2;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.gridwidth= 2;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 20,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( p2, gbc);
         p1.add( p2);
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 0;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( p1, gbc);
         p.add( p1);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Targets");
         p.setBorder( titledBorder);
         return p;
    private JPanel createLocalPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         JLabel label= new JLabel( "IP Address:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         JTextField tf_local_host= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_local_host, gbc);
         p.add( tf_local_host);
         try {
    String host= InetAddress.getLocalHost().getHostAddress();     
         tf_local_host.setText( host);
         } catch( UnknownHostException e) {
         label= new JLabel( "Data Port:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         tf_data_port= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_data_port, gbc);
         p.add( tf_data_port);
         tf_data_port.setText( config.local_data_port);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Local Host");
         p.setBorder( titledBorder);
         return p;
    public void actionPerformed( ActionEvent event) {
    Object source= event.getSource();
         if( source == addTarget) {
         String ip= tf_remote_address.getText().trim();
         String port= tf_remote_data_port.getText().trim();
         String localPort= tf_data_port.getText().trim();
         addTargetToList( localPort, ip, port);
         if( avTransmitter != null) {
         avTransmitter.addTarget( ip, port);
         } else if( source == removeTarget) {
         int index= list.getSelectedIndex();
         if( index != -1) {
              Target target= (Target) targets.elementAt( index);
              if( avTransmitter != null) {
         avTransmitter.removeTarget( target.ip, target.port);
              targets.removeElement( target);
              listModel.setData( targets);          
         } else if( source == startXmit) {
         if( startXmit.getLabel().equals( "Start Transmission")) {          
         int data_port= new Integer( tf_data_port.getText()).intValue();
              avTransmitter= new AVTransmitter( this, data_port);
         avTransmitter.start( tf_media_file.getText().trim(), targets);          
              avTransmitter.setLooping( cb_loop.isSelected());
         startXmit.setLabel( "Stop Transmission");
         } else if( startXmit.getLabel().equals( "Stop Transmission")) {
              avTransmitter.stop();
              avTransmitter= null;
              removeNonBaseTargets();
              listModel.setData( targets);
         startXmit.setLabel( "Start Transmission");          
         } else if( source == rtcp) {
         if( rtcpViewer == null) {
         rtcpViewer= new RTCPViewer();
         } else {
              rtcpViewer.setVisible( true);
              rtcpViewer.toFront();
         } else if( source == cb_loop) {
         if( avTransmitter != null) {
              avTransmitter.setLooping( cb_loop.isSelected());
    private void removeNonBaseTargets() {
         String localPort= tf_data_port.getText().trim();
         for( int i= targets.size(); i > 0;) {
         Target target= (Target) targets.elementAt( i - 1);
         if( !target.localPort.equals( localPort)) {
    targets.removeElement( target);
         i--;
    public void addTargetToList( String localPort,
                             String ip, String port) {     
    ListUpdater listUpdater= new ListUpdater( localPort, ip,
                                  port, listModel, targets);
    SwingUtilities.invokeLater( listUpdater);           
    public void rtcpReport( String report) {
         if( rtcpViewer != null) {
         rtcpViewer.report( report);
    public void windowClosing( WindowEvent event) {
         config.local_data_port= tf_data_port.getText().trim();
         config.targets= new Vector();
         for( int i= 0; i < targets.size(); i++) {
         Target target= (Target) targets.elementAt( i);
         if( target.localPort.equals( config.local_data_port)) {
              config.addTarget( target.ip, target.port);
         config.media_locator= tf_media_file.getText().trim();
         config.write();
    System.exit( 0);
    public void windowClosed( WindowEvent event) {
    public void windowDeiconified( WindowEvent event) {
    public void windowIconified( WindowEvent event) {
    public void windowActivated( WindowEvent event) {
    public void windowDeactivated( WindowEvent event) {
    public void windowOpened( WindowEvent event) {
    public void keyPressed( KeyEvent event) {
    public void keyReleased( KeyEvent event) {
    Object source= event.getSource();
         if( source == list) {
         int index= list.getSelectedIndex();
    public void keyTyped( KeyEvent event) {
    public void mousePressed( MouseEvent e) {
    public void mouseReleased( MouseEvent e) {
    public void mouseEntered( MouseEvent e) {
    public void mouseExited( MouseEvent e) {
    public void mouseClicked( MouseEvent e) {
    Object source= e.getSource();
         if( source == list) {
         int index= list.getSelectedIndex();
         if( index != -1) {
              Target target= (Target) targets.elementAt( index);
              tf_remote_address.setText( target.ip);
              tf_remote_data_port.setText( target.port);
         int index= list.locationToIndex( e.getPoint());
    public static void main( String[] args) {
    new Tx();
    class TargetListModel extends AbstractListModel {
    private Vector options;
    public TargetListModel( Vector options) {
         this.options= options;
    public int getSize() {
         int size;
         if( options == null) {
         size= 0;
         } else {
         size= options.size();
         return size;
    public Object getElementAt( int index) {
    String name;
    if( index < getSize()) {
         Target o= (Target)options.elementAt( index);
    name= o.localPort + " ---> " + o.ip + ":" + o.port;
         } else {
         name= null;
         return name;
    public void setData( Vector data) {
         options= data;
         fireContentsChanged( this, 0, data.size());
    class ListUpdater implements Runnable {
    String localPort, ip, port;
    TargetListModel listModel;
    Vector targets;
    public ListUpdater( String localPort, String ip, String port,
                   TargetListModel listModel, Vector targets) {
         this.localPort= localPort;
         this.ip= ip;
         this.port= port;
         this.listModel= listModel;
         this.targets= targets;
    public void run() {
    Target target= new Target( localPort, ip, port);
         if( !targetExists( localPort, ip, port)) {
         targets.addElement( target);
    listModel.setData( targets);
    public boolean targetExists( String localPort, String ip, String port) {
         boolean exists= false;
         for( int i= 0; i < targets.size(); i++) {
         Target target= (Target) targets.elementAt( i);
         if( target.localPort.equals( localPort)
         && target.ip.equals( ip)
              && target.port.equals( port)) {          
              exists= true;
         break;
         return exists;
    // ================= AVTransmitter.java =================
    import java.awt.*;
    import java.io.*;
    import java.net.InetAddress;
    import java.util.*;
    import javax.media.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import javax.media.control.TrackControl;
    import javax.media.control.QualityControl;
    import javax.media.rtp.*;
    import javax.media.rtp.event.*;
    import javax.media.rtp.rtcp.*;
    public class AVTransmitter implements ReceiveStreamListener, RemoteListener,
    ControllerListener {
    // Input MediaLocator
    // Can be a file or http or capture source
    private MediaLocator locator;
    private String ipAddress;
    private int portBase;
    private Processor processor = null;
    private RTPManager rtpMgrs[];
    private int localPorts[];
    private DataSource dataOutput = null;
    private int local_data_port;
    private Tx tx;
    public AVTransmitter( Tx tx, int data_port) {
         this.tx= tx;
         local_data_port= data_port;
    * Starts the transmission. Returns null if transmission started ok.
    * Otherwise it returns a string with the reason why the setup failed.
    public synchronized String start( String filename, Vector targets) {
         String result;
         locator= new MediaLocator( filename);
         // Create a processor for the specified media locator
         // and program it to output JPEG/RTP
         result = createProcessor();
         if (result != null) {
         return result;
         // Create an RTP session to transmit the output of the
         // processor to the specified IP address and port no.
         result = createTransmitter( targets);
         if (result != null) {
         processor.close();
         processor = null;
         return result;
         // Start the transmission
         processor.start();
         return null;
    * Use the RTPManager API to create sessions for each media
    * track of the processor.
    private String createTransmitter( Vector targets) {
         // Cheated. Should have checked the type.
         PushBufferDataSource pbds = (PushBufferDataSource)dataOutput;
         PushBufferStream pbss[] = pbds.getStreams();
         rtpMgrs = new RTPManager[pbss.length];
         localPorts = new int[ pbss.length];
         SessionAddress localAddr, destAddr;
         InetAddress ipAddr;
         SendStream sendStream;
         int port;
         SourceDescription srcDesList[];
         for (int i = 0; i < pbss.length; i++) {
         // for (int i = 0; i < 1; i++) {
         try {
              rtpMgrs[i] = RTPManager.newInstance();     
              port = local_data_port + 2*i;
              localPorts[ i]= port;
              localAddr = new SessionAddress( InetAddress.getLocalHost(),
                                  port);
               rtpMgrs[i].initialize( localAddr);          
              rtpMgrs[i].addReceiveStreamListener(this);
              rtpMgrs[i].addRemoteListener(this);
         for( int k= 0; k < targets.size(); k++) {
              Target target= (Target) targets.elementAt( k);
              int targetPort= new Integer( target.port).intValue();
              addTarget( localPorts[ i], rtpMgrs[ i], target.ip, targetPort + 2*i);
              sendStream = rtpMgrs[i].createSendStream(dataOutput, i);          
              sendStream.start();
         } catch (Exception e) {
              e.printStackTrace();
              return e.getMessage();
         return null;
    public void addTarget( String ip, String port) {
         for (int i= 0; i < rtpMgrs.length; i++) {
         int targetPort= new Integer( port).intValue();
         addTarget( localPorts[ i], rtpMgrs[ i], ip, targetPort + 2*i);
    public void addTarget( int localPort, RTPManager mgr, String ip, int port) {
         try {
         SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
                                  new Integer( port).intValue());
         mgr.addTarget( addr);
         tx.addTargetToList( localPort + "", ip, port + "");
         } catch( Exception e) {
         e.printStackTrace();
    public void removeTarget( String ip, String port) {
         try {     
         SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
                                  new Integer( port).intValue());
         for (int i= 0; i < rtpMgrs.length; i++) {
         rtpMgrs[ i].removeTarget( addr, "target removed from transmitter.");
         } catch( Exception e) {
         e.printStackTrace();
    boolean looping= true;
    public void controllerUpdate( ControllerEvent ce) {
         System.out.println( ce);
         if( ce instanceof DurationUpdateEvent) {
         Time duration= ((DurationUpdateEvent) ce).getDuration();
         System.out.println( "duration: " + duration.getSeconds());
         } else if( ce instanceof EndOfMediaEvent) {
         System.out.println( "END OF MEDIA - looping=" + looping);
         if( looping) {
         processor.setMediaTime( new Time( 0));
              processor.start();
    public void setLooping( boolean flag) {
         looping= flag;
    public void update( ReceiveStreamEvent event) {
         String timestamp= getTimestamp();
         StringBuffer sb= new StringBuffer();
         if( event instanceof InactiveReceiveStreamEvent) {
         sb.append( timestamp + " Inactive Receive Stream");
         } else if( event instanceof ByeEvent) {
         sb.append( timestamp + " Bye");
         } else {
         System.out.println( "ReceiveStreamEvent: "+ event);
         tx.rtcpReport( sb.toString());     
    public void update( RemoteEvent event) {     
         String timestamp= getTimestamp();
         if( event instanceof ReceiverReportEvent) {
         ReceiverReport rr= ((ReceiverReportEvent) event).getReport();
         StringBuffer sb= new StringBuffer();
         sb.append( timestamp + " RR");
         if( rr != null) {
              Participant participant= rr.getParticipant();
              if( participant != null) {
              sb.append( " from " + participant.getCNAME());
              sb.append( " ssrc=" + rr.getSSRC());
              } else {
              sb.append( " ssrc=" + rr.getSSRC());
              tx.rtcpReport( sb.toString());
         } else {
         System.out.println( "RemoteEvent: " + event);
    private String getTimestamp() {
         String timestamp;
         Calendar calendar= Calendar.getInstance();
         int hour= calendar.get( Calendar.HOUR_OF_DAY);
         String hourStr= formatTime( hour);
         int minute= calendar.get( Calendar.MINUTE);
         String minuteStr= formatTime( minute);
         int second= calendar.get( Calendar.SECOND);
         String secondStr= formatTime( second);
         timestamp= hourStr + ":" + minuteStr + ":" + secondStr;     
         return timestamp;
    private String formatTime( int time) {     
         String timeStr;
         if( time < 10) {
         timeStr= "0" + time;
         } else {
         timeStr= "" + time;
         return timeStr;
    * Stops the transmission if already started
    public void stop() {
         synchronized (this) {
         if (processor != null) {
              processor.stop();
              processor.close();
              processor = null;
         for (int i= 0; i < rtpMgrs.length; i++) {
         rtpMgrs[ i].removeTargets( "Session ended.");
              rtpMgrs[ i].dispose();
    public String createProcessor() {
         if (locator == null) {
         return "Locator is null";
         DataSource ds;
         DataSource clone;
         try {
         ds = javax.media.Manager.createDataSource(locator);
         } catch (Exception e) {
         return "Couldn't create DataSource";
         // Try to create a processor to handle the input media locator
         try {
         processor = javax.media.Manager.createProcessor(ds);
         processor.addControllerListener( this);     
         } catch (NoProcessorException npe) {
         return "Couldn't create processor";
         } catch (IOException ioe) {
         return "IOException creating processor";
         // Wait for it to configure
         boolean result = waitForState(processor, Processor.Configured);
         if (result == false)
         return "Couldn't configure processor";
         // Get the tracks from the processor
         TrackControl [] tracks = processor.getTrackControls();
         // Do we have atleast one track?
         if (tracks == null || tracks.length < 1)
         return "Couldn't find tracks in processor";
         // Set the output content descriptor to RAW_RTP
         // This will limit the supported formats reported from
         // Track.getSupportedFormats to only valid RTP formats.
         ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
         processor.setContentDescriptor(cd);
         Format supported[];
         Format chosen;
         boolean atLeastOneTrack = false;
         // Program the tracks.
         for (int i = 0; i < tracks.length; i++) {
         Format format = tracks[i].getFormat();
         if (tracks[i].isEnabled()) {
              supported = tracks[i].getSupportedFormats();
              // We've set the output content to the RAW_RTP.
              // So all the supported formats should work with RTP.
              // We'll just pick the first one.
              if (supported.length > 0) {
              if (supported[0] instanceof VideoFormat) {
                   // For video formats, we should double check the
                   // sizes since not all formats work in all sizes.
                   chosen = checkForVideoSizes(tracks[i].getFormat(),
                                       supported[0]);
              } else
                   chosen = supported[0];
              tracks[i].setFormat(chosen);
              System.err.println("Track " + i + " is set to transmit as:");
              System.err.println(" " + chosen);
              atLeastOneTrack = true;
              } else
              tracks[i].setEnabled(false);
         } else
              tracks[i].setEnabled(false);
         if (!atLeastOneTrack)
         return "Couldn't set any of the tracks to a valid RTP format";
         // Realize the processor. This will internally create a flow
         // graph and attempt to create an output datasource for JPEG/RTP
         // audio frames.
         result = waitForState(processor, Controller.Realized);
         if (result == false)
         return "Couldn't realize processor";
         // Set the JPEG quality to .5.
         setJPEGQuality(processor, 0.5f);
         // Get the output data source of the processor
         dataOutput = processor.getDataOutput();
         return null;
    static SessionAddress destAddr1, destAddr2;
    * For JPEG and H263, we know that they only work for particular
    * sizes. So we'll perform extra checking here to make sure they
    * are of the right sizes.
    Format checkForVideoSizes(Format original, Format supported) {
         int width, height;
         Dimension size = ((VideoFormat)original).getSize();
         Format jpegFmt = new Format(VideoFormat.JPEG_RTP);
         Format h263Fmt = new Format(VideoFormat.H263_RTP);
         if (supported.matches(jpegFmt)) {
         // For JPEG, make sure width and height are divisible by 8.
         width = (size.width % 8 == 0 ? size.width :
                        (int)(size.width / 8) * 8);
         height = (size.height % 8 == 0 ? size.height :
                        (int)(size.height / 8) * 8);
         } else if (supported.matches(h263Fmt)) {
         // For H.263, we only support some specific sizes.
         if (size.width < 128) {
              width = 128;
              height = 96;
         } else if (size.width < 176) {
              width = 176;
              height = 144;
         } else {
              width = 352;
              height = 288;
         } else {
         // We don't know this particular format. We'll just
         // leave it alone then.
         return supported;
         return (new VideoFormat(null,
                        new Dimension(width, height),
                        Format.NOT_SPECIFIED,
                        null,
                        Format.NOT_SPECIFIED)).intersects(supported);
    * Setting the encoding quality to the specified value on the JPEG encoder.
    * 0.5 is a good default.
    void setJPEGQuality(Player p, float val) {
         Control cs[] = p.getControls();
         QualityControl qc = null;
         VideoFormat jpegFmt = new VideoFormat(VideoFormat.JPEG);
         // Loop through the controls to find the Quality control for
         // the JPEG encoder.
         for (int i = 0; i < cs.length; i++) {
         if (cs[i] instanceof QualityControl &&
              cs[i] instanceof Owned) {
              Object owner = ((Owned)cs[i]).getOwner();
              // Check to see if the owner is a Codec.
              // Then check for the output format.
              if (owner instanceof Codec) {
              Format fmts[] = ((Codec)owner).getSupportedOutputFormats(null);
              for (int j = 0; j < fmts.length; j++) {
                   if (fmts[j].matches(jpegFmt)) {
                   qc = (QualityControl)cs[i];
                   qc.setQuality(val);
                   System.err.println("- Setting quality to " +
                             val + " on " + qc);
                   break;
              if (qc != null)
              break;
    * Convenience methods to handle processor's state changes.
    private Integer stateLock = new Integer(0);
    private boolean failed = false;
    Integer getStateLock() {
         return stateLock;
    void setFailed() {
         failed = true;
    private synchronized boolean waitForState(Processor p, int state) {
         p.addControllerListener(new StateListener());
         failed = false;
         // Call the required method on the processor
         if (state == Processor.Configured) {
         p.configure();
         } else if (state == Processor.Realized) {
         p.realize();
         // Wait until we get an event that confirms the
         // success of the method, or a failure event.
         // See StateListener inner class
         while (p.getState() < state && !failed) {
         synchronized (getStateLock()) {
              try {
              getStateLock().wait();
              } catch (InterruptedException ie) {
              return false;
         if (failed)
         return false;
         else
         return true;
    * Inner Classes
    class StateListener implements ControllerListener {
         public void controllerUpdate(ControllerEvent ce) {
         // If there was an error during configure or
         // realiz

    I do this all the time; I put my MBP on a 60-inch Sharp. If you have the video working, do the simple thing first: check to make sure your sound is on, on both your TV and your Mac. Then, if that doesn't work, go to System Preferences, and under Sound go to the Output tab and see if your TV is listed; if it is, change it to that setting.
    Hope It Works

  • I have updated my computer to iTunes 10.5 and my iPhone 3GS to OS 5 and it has wiped my music, photos and video etc.  I want to load my data from my wife's computer and it refuses because that computer does not have OS 5. Help!

    I wanted to upgrade my iPhone software to version 5, but it said I had to upgrade iTunes to version 10.5.  Once the updates were complete, I saw that all my music, photos and video etc. had been erased from both my iTunes AND my iPhone!!!!
    I tried connecting to my wife's computer and her iTunes in order to back up my music etc., and it will not allow it, stating that my iPhone needs iTunes version 10.5!!
    This is very inconvenient, and I am scared that if I update my wife's computer to version 10.5 we will lose that data also, and as her phone is an older 2G iPhone, version 10.5 may not work there!!
    How can I get her music etc. onto my iPhone 3GS (version 5), as ALL of my data is now lost?

    I had the same problem, but I just got a new video converter and made it convert videos to iPod/iPhone format and then put them into iTunes. I would suggest GOM Encoder, as it can convert them and then, once they're finished converting, put them automatically into your iTunes for you. From there you can just sync them to your iPod.

  • Using Premiere CC and Media Encoder CC at the same time

    Is this possible?  Doesn't seem to be.  Media Encoder bails on every job as soon as you try to render anything in Premiere.  I guess they don't play nice together, which is surprising.  I was hoping that Media Encoder would pause and wait for Premiere to finish a render.  Instead it dies on every render job, creating 0 size video files... while making a sheep bleating sound.
    Is this a GPU issue?  I have 2 nVidia cards, a 770GTX and 560GTX. (Windows 8.1, 16gb RAM, 3770k CPU)

    As soon as I initiate a render in Premiere, Media Encoder bleats, and all subsequent render jobs bleat and produce 0 size video output files... even if I quit the Premiere render before Media Encoder reaches some of those render jobs.
    I suspect this is a GPU sharing issue.... but can't be certain.
    What do you mean by new install?  Did I install it (AME or Premiere) recently? No.  I've had Creative Cloud on my machine for almost a year, and both Premiere and Media Encoder have been updated periodically.
    To my knowledge they have never worked together, but I only started using the queue feature within Premiere recently, so I haven't used AME until recently.

  • Premiere Pro CC and Media Encoder crashing on long file export.

    Please forgive the length of this inquiry. But it is complicated and has a lot of variables.
    I have an hour-long mixed-format timeline, with both Red 3K footage and Blackmagic ProRes 1080P footage, that I want to make a DVD from, and it takes forever and crashes halfway through. I have tried doing it every way possible, and the only way I was able to get even a partial export was using the "MPEG 2 DVD" export option in Premiere CC. It goes fast at first, telling me it will take about 2 hours, but even though it starts quickly at about 15 fps, it slows down to less than 1 fps and finally just stops and crashes my CUDA-enabled MacBook Pro laptop.
    I really don't know who to ask about this, since there are so many different variables. I never had a problem like this on my MBP with PP6 on a 4K timeline exporting to DVD, Blu-ray, H.264 or even ProRes. But I wasn't mixing formats either. On this project the first 20-30 minutes of footage goes through fine in a couple of hours, but something seems to be bogging down the system on longer projects. My workaround was to break the project up into multiple exports and combine them on the DVD timeline, but I can't do this on every project. I thought maybe it was a corrupted clip somewhere in the timeline, but when I broke the timeline up into several parts I had no problems.
    I did notice a dramatic variance in the time it took to export footage based on the sequence settings, for a one-minute timeline comprised of Red 3K footage and ProRes 1080P footage. Here are some sample export results:
    From a Red raw 3K timeline, 2880x1620 (with the 1080P ProRes footage upscaled 150%):
    to .m2v MPEG-2 720x480:        1:48 minutes ***
    to .mov ProRes 422 HQ 1080P:   22 minutes
    to .mov H.264 1080P:           22 minutes
    to .m2v MPEG-2 1080P:          22 minutes
    From a 1080P timeline, ProRes footage at 100% (Red footage downscaled to 67%):
    to .m2v MPEG-2 720x480:        13 minutes
    to .mov ProRes 422 HQ 1080P:   13 minutes
    to .mov H.264 1080P:           14 minutes
    to .m2v MPEG-2 1080P:          14 minutes
    Since I am going to DVD, I chose the fastest option, Red to .m2v MPEG-2 720x480, which only took 1:48 per minute. Why? I have no idea; maybe someone here knows the answer. But again, these numbers are for a ONE MINUTE clip. When I tried to convert the whole hour-long timeline to the 720x480 .m2v DVD file, the Red timeline froze my computer after several hours, even though I had set sleep mode to “Never” and turned the display off manually, and I had to convert it in multiple chunks. In 20-30 minute chunks it did go fast, about 1:48 per minute like the test indicated. But I would like to be able to convert the whole hour and not have to try to stitch the clips together in my DVD authoring program.
    I don’t want to have to avoid using multiple formats on the same timeline or waste time transcoding everything to the same format.
    Can anyone give me any suggestions as to why the computer is crashing or bogging down in long exports and what timeline settings I should be using?
    I have plenty of room on my scratch disk and 8 GB of RAM. According to Activity Monitor, I have about 800 MB of system memory during the export, but CPU usage is 98%, and under “Process Name” I see ImporterRedServer flashing up at the top of the list about every second during export.
    Why is the 720x480 MPEG-2 DVD setting so much faster than all the other export codecs?
    And are there any other settings I can use to get higher-speed exports in HD? 14 minutes per minute of footage seems a little slow, especially since every time it crashes I have to start over.

    Hi,
    It has been a long time since I last encountered the Media Encoder crash on a
    long file export.
    But I take care of a lot of details before the export:
    - clean up the Media Cache and Media Cache Files within PP CS6 and/or CC,
    - delete all render files along with the Preview Files,
    - render the sequences until rendering is complete,
    - keep PP CS6 and/or PP CC open during the export to AME and/or Encore CS6.
    I have thought about leaving Adobe, but this company remains the last one with
    serious authoring software (Encore CS6).
    I tested Avid Media Composer and Final Cut Pro X; both are very good, but there
    is no real authoring software to go with those editors.
    The only way is to "flood" Adobe with bug reports and wish forms,
    which are not very easy to use,
    and also to call Adobe support day after day until they escalate the bug.

  • Premiere Pro CS6 and Media Encoder CS6 Color shift on export

    Hello,
    I am working on an early 2008 Mac Pro with 2 x 3.2 GHz Quad-Core Intel Xeon processors, 32 GB of RAM and an NVIDIA GeForce 8800 GT 512 MB graphics card, and I am running OS X 10.8.2.
    I have the updated Version of Adobe Premiere Pro CS6 and Media Encoder CS6 through the awesome Subscription program that Adobe has started.
    My footage is shot with the Canon 1D Mark IV at 1920x1080, 24 fps.
    I am importing the camera files and then, after I cut them up, exporting them as QuickTime ProRes 422 HQ files.
    With the technical explanation out of the way, here is my problem. The color right from the camera files is to my liking, and without doing anything to change it I export the footage and get a dramatic color shift. The exported file looks desaturated and slightly green.
    I have tried using Media Encoder CS6 to export my footage and that has the same result.
    I started my career as a photographic retoucher and use Photoshop and Lightroom constantly, so I am pretty confident that I know color. I also calibrate my LaCie 526 and LaCie 324i monitors monthly, and I have a sensor that slightly shifts the profile depending on the time of day.
    I have attached three screen shots that show the color shift exactly. Now what I need is a solution. What I am not seeing is color settings for Premiere. I am totally open to any suggestions on what I might be doing incorrectly.
    Ben

    Benjamin Peterson wrote:
    I started my career as a photographic retoucher and use Photoshop and Lightroom constantly, so I am pretty confident that I know color. I also calibrate my LaCie 526 and LaCie 324i monitors monthly, and I have a sensor that slightly shifts the profile depending on the time of day.
    I've been doing this for a few years now and have come to some conclusions. Basically, the difference between still photography and video is like the difference between playing the trumpet and the sax. What you bring to the sax from playing trumpet is your ability to read music, what you know about composition, blending while playing with others, etc. But playing trumpet tells you nothing about how to physically play a sax. So it is with still photography and video.
    For starters, your carefully calibrated computer monitor is just that, a computer monitor. It doesn't display the correct working space for video. What you need for WYSIWYG in video is a monitor that can show you the Rec.709 working space. Rec.709 has a different gamut, a different gamma, probably a different white point (D65), different phosphor colors, and so on. Enough differences that it's difficult to make a computer monitor that can successfully display Rec.709. Yet there are a couple of computer monitors that can do this (a few Eizos, one HP). The vast majority of people doing serious color correction work in video use a production monitor for just this reason.
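    To put a number on the "different gamma" point, here is a minimal standalone sketch, assuming the standard Rec.709 and sRGB transfer-function formulas, that encodes the same linear light values both ways:

        def rec709_oetf(linear):
            # ITU-R BT.709 opto-electronic transfer function.
            if linear < 0.018:
                return 4.500 * linear
            return 1.099 * linear ** 0.45 - 0.099

        def srgb_encode(linear):
            # IEC 61966-2-1 sRGB encoding, what a typical computer monitor assumes.
            if linear <= 0.0031308:
                return 12.92 * linear
            return 1.055 * linear ** (1 / 2.4) - 0.055

        for lin in (0.05, 0.18, 0.50, 0.90):
            print(f"linear {lin:.2f}: Rec.709 {rec709_oetf(lin):.3f}  sRGB {srgb_encode(lin):.3f}")

    At 18% grey, Rec.709 encodes to roughly 0.41 while sRGB encodes to roughly 0.46, so the same pixel data looks different depending on which curve the display chain assumes.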
    But a production monitor isn't enough. You also have to make sure the signals you're sending the production monitor are correct. There's a sub-industry making converters that take the signal from an NLE video card (usually RGB based) and turn it into a signal a production monitor expects (usually YUV based).
    That said, it is certainly possible to get a very good match between your NLE suite and a DVD / BD as displayed on an HDTV. But it's not as simple or easy as a calibrated computer monitor for still photography use.
    This is probably part of your problem. But QuickTime is probably the root of it. QuickTime is, well, I guess the polite way of saying it is that QuickTime is problematic. It causes all kinds of problems. Most people have abandoned it. You should too.
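    One practical way to tell whether the shift is in the exported pixels or only in how QuickTime displays them is to pull a frame from the source and from the export and compare channel statistics. A rough sketch, assuming ffmpeg and the Pillow library are installed, with placeholder file names:

        import subprocess
        from PIL import Image, ImageStat

        SOURCE = "camera_original.mov"   # placeholder: your camera file
        EXPORT = "prores_export.mov"     # placeholder: your ProRes 422 HQ export

        def grab_frame(movie, out_png, seconds=5):
            # Extract a single frame at the given timestamp as a PNG.
            subprocess.run(
                ["ffmpeg", "-y", "-ss", str(seconds), "-i", movie,
                 "-frames:v", "1", out_png],
                check=True,
            )

        grab_frame(SOURCE, "src.png")
        grab_frame(EXPORT, "exp.png")

        src = ImageStat.Stat(Image.open("src.png").convert("RGB")).mean
        exp = ImageStat.Stat(Image.open("exp.png").convert("RGB")).mean

        for channel, a, b in zip("RGB", src, exp):
            print(f"{channel}: source {a:.1f}, export {b:.1f}, delta {b - a:+.1f}")

    If the per-channel deltas are close to zero, the encoded data is intact and the shift is a display/interpretation issue rather than an encoding one. Keep in mind ffmpeg's own YUV-to-RGB conversion can add a small bias, so this is a rough check, not a calibration tool.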

  • Premiere and Media Encoder will no longer export my edit to any file format.

    This is on a Windows 8.1 PC.
    The edit exported fine until I made one big mistake.
    I created a Nested Edit to make a quick alteration.
    Since then, the following has been the case, and I cannot go back to an earlier edit to fix it.
    Premiere and Media Encoder will no longer export my edit to any file format, using either the Export option or Media Encoder.  I get a dialog that says "Unknown Error Compiling Movie".
    There is nothing wrong with my footage.  This edit worked fine and I exported through Media Encoder many times until I made a nested edit within the Premiere project.
    I have tried all sorts of searches on the net for a fix to this but nothing seems to be current or indeed works.
    Any help would be appreciated as this is a show stopper.

    Thanks for the reply Mark, appreciated.
    I have found a workaround, but it still leaves a long-term issue for Adobe to fix.
    As to your first solution:
    1).  The nested edit is already in its own Timeline panel.  It will not export or encode via Media Encoder.  This is the problem.
    2).  I will try the KEM roll feature, but I have a workaround; see below.
    3).  No, I do not have a nested edit within a nested edit.
    I did a lot of research on the net on this issue, as it happened to a colleague of mine on his machine about two weeks ago.  It took me six hours to get a fix, and in his case there were six video clips that had read errors.  When I overwrote those clips with the originals from the backup master, it fixed the problem.  However, I was never able to use Media Encoder again to export his project without it failing; I could only use the Export feature.  Again, he had used a nested edit in his project.
    In my case I do not have any nested edits within a nested edit.  All I did was drag my main edit onto the New Sequence icon at the bottom of the Project bin.  This of course created a nested edit in a new Timeline panel, which is what I wanted.  I then slid the entire nested edit to the right in the timeline by two seconds, as all I wanted to do was add two seconds of black at the head of the edit.  That was it.  From the point of creating the nested edit, Media Encoder and the Export option failed every time with the aforementioned error.
    The final workaround was one I had found mentioned during my research.  I created a new, clean Premiere Pro CC 2014 project and then used the Import option under the File menu.  I imported only the original main edit, which brought in that edit along with all the required footage.  This would then export to disk using the Export feature.  However, Media Encoder will still not encode this new edit; it fails the same way every time with "Unknown Error Compiling Movie".
    As a test I loaded the old corrupt project back up, deleted the nested edit sequence completely, and re-saved it.  This had no effect and did not fix the problem.  This particular project file will no longer export directly or via Media Encoder at all, no matter what I do.
    This seems pretty serious to me.  I worked on this edit for a whole month without issue, encoding out roughs as I went, and it was only yesterday, after I made the nested edit, that the project file became useless because it will not encode out at all.  Yes, I have a workaround and was able to salvage my project and get it rendered in time for my deadline, but the fact that I have a partially working project that Media Encoder will not touch does not instill me with confidence for long-term use of Premiere Pro if this is going to happen again.
    Is there some place I can send this project file to Adobe so they can find out what is wrong with it?  I am not the only person having this issue; I have seen other posts about it going all the way back through CS6.
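    Since the colleague's project came down to clips with read errors, one way to find bad clips up front, before replacing anything from the backup master, is to run every source file through a null decode and watch for errors.  A rough sketch, assuming ffmpeg is on the PATH and using a hypothetical footage folder:

        import subprocess
        from pathlib import Path

        MEDIA_DIR = Path("D:/Projects/MyEdit/Footage")   # hypothetical location
        EXTENSIONS = {".mov", ".mp4", ".mxf", ".avi"}

        for clip in sorted(MEDIA_DIR.rglob("*")):
            if clip.suffix.lower() not in EXTENSIONS:
                continue
            # Decode the whole file to a null sink; output at the "error"
            # log level usually points at a damaged clip.
            result = subprocess.run(
                ["ffmpeg", "-v", "error", "-i", str(clip), "-f", "null", "-"],
                capture_output=True, text=True,
            )
            status = "OK" if not result.stderr.strip() else "ERRORS"
            print(f"{status}: {clip.name}")
            if status == "ERRORS":
                print(result.stderr.strip())

    Clips flagged with errors are the ones worth overwriting from backup before retrying the export or the Media Encoder queue.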
