Sync, timecode, ensemble, DA88, RME ADI-192DD

I'm facing a rather complicated setup to get a job done, and I think I need some confirmation that it will all work.
I am about to put together a film soundtrack. I have the sound on a DA-88 8-track tape in the form of 4 stereo stems, and the pictures are in the form of an uncompressed, 10-bit QT file.
The pictures and sound all came from a post production house so I am assuming that they have been put together professionally with timecode sync etc.
I use an Ensemble interface which has ADAT lightpipe but no Tascam-style TDIF. I have acquired an RME ADI-192 DD box to convert TDIF to ADAT. I don't have any wordclock cables, so I have set the Ensemble to clock from its ADAT input (from the RME) and set the RME to clock from the Tascam DA-88. All sync lamps seem happy, so I presume I have a nicely synchronised system. Can anyone confirm I have done things right and that this configuration should work fine?
Now timecode...
The timecode on the DA-88 tape should line up with the picture timecode. Ideally, I want to record the audio with timecode from the tape to make picture sync easier. I see that the DA-88 has timecode output on a phono-style plug. Is it just a case of connecting this to my Unitor's timecode input and setting Logic to read external timecode?
Will this configuration result in my having all 8 digital audio tracks with the correct timecode "stamp" referenced from the tape?
A good brain bender

Oh well, I'll answer my own question then.
The answer to the main question towards the end of that post is "Yes".
I now have my audio in perfect accurate sync with the video.
Merry Christmas.

Similar Messages

  • Logic 8 and Digi 003 with RME-ADI-2 as DA/AD converters.

    I am very new to this, so I'll do my best at explaining what I am trying to accomplish. I am running Logic 8 with a Digi 003 as my interface (connected by FireWire). I just got an RME ADI-2 converter (connected to the 003 via optical I/O). I have the outputs of the converter working fine to my monitors, but I cannot figure out what I need to do to use the RME as a source for inputs. In Logic's Audio Hardware preferences, it does not give me the option of using the RME (only the 003).
    Any help on this would be great. Thanks in advance.

    Usually you'd create an aggregate device in Audio MIDI Setup (in the Utilities folder): a new virtual device into which you can combine two or more devices, including the built-in audio I/O.
    But although I'm not certain (as I don't use an 003) I don't think you can create an aggregate device with an 003 (no harm in trying, though).
    As your RME is already working fine connected to the 003 via optical, presumably as the 003's outputs 9/10 (S/PDIF?), wouldn't the inputs also be available on 9/10? Doesn't the 003 support ADAT? In which case the RME's ins/outs should show up, but as part of the 003, if you cabled the RME to the 003 via the ADAT connections.
    Sorry to not have a definitive answer, but maybe try some of the above variations. Maybe an 003 user will chime in.

  • Connecting synths directly to Ensemble or RME FF800

    Is it possible to hook synths up directly to an Apogee Ensemble's inputs 5-8 and have enough gain, or is a mic pre necessary to achieve a healthy signal?
    Same question for the RME FF800's inputs 5-8.
    thank you
    12" powerbook   Mac OS X (10.4.8)   1 GB RAM

    Synths usually just output a standard line level, no mic preamps are necessary.
    Connect away....

  • How do I sync timecode between 2 Canon 7d Mark II Cameras?

    I have 2 Canon EOS 7D Mark II cameras.  I want to do a multi-camera shoot and have one camera film the action and have the 2nd camera filming the 1st camera and the set for a "how we did it" video.  How do I keep the 2 cameras in sync for sound?  I figure I need to use timecode, but have never done it...
    Follow-up question: I edit in Adobe Premiere. What do I need to do in Premiere to line up the 2 clips, and is any "correction" needed for "audio drift"?
    Thank you,
    SG

    First, I'm shooting stills, not video, so I don't know if this will work as well for you. But I sync my cameras shortly before any multi-camera shoot so that the stills will be in exact order taken when I go to edit them.
    What I do is attach the camera to one of my computers (desktop or laptop) with the mini-USB cable, then start EOS Utility (it might start automatically, depending on your setup). Now click on the little tab with the "wrench" icon, then click on the date/time section that appears. This opens a dialogue box that has a button to set the camera's time to match the computer's clock exactly.
    Do this with each camera and they are all exactly in sync.
    The cameras do drift out of sync rather quickly. I guess their internal clocks aren't all that precise from one camera to the next. For my purposes shooting stills, I would usually re-sync them after a few days, maybe a week at most. For your purposes shooting video that has to match exactly, you might need to do it shortly before each shoot.
    It probably also can be done through GPS or remotely if using a WFT module, but I don't use those currently.
    Alan Myers
    San Jose, Calif., USA
    "Walk softly and carry a big lens."
    GEAR: 5DII, 7D(x2), 50D(x3), some other cameras, various lenses & accessories
    FLICKR & PRINTROOM 

  • Logic & Apogee Ensemble or RME Fireface 800?

    Which of these two would be the better choice at the present time?
    Its very hard to get an unbiased opinion on this. What I'd like to know from users of both is:
    1) What buffer size they're using (with what machine) and how much perceptible latency there is?
    +Buffer size is often the smaller part of the overall latency; most of it generally comes from the DAC/ADC conversion, FireWire transport and everything else in the chain. What matters is the total perceptible latency, and whether it would be noticeable on e.g. a Mac Pro octo-core with either unit.+
    2) What the audio quality is like, is there a substantial difference between these two units? Specifically on the ADC is one better than the other? - I know that on paper the RME has a lower noise floor and greater dynamic range, but some people are saying that the Apogee still has the better ADC/DAC (confused).
    3) How stable are they? Any driver issues? How is their support?
    4) I've read on the forums that the Apogee can hum or hiss for some people even with nothing connected; some of those that don't instead get a hiss while audio is playing, followed by an audible "pop" of a noise gate when the audio stops.
    Can anyone say anything further on this? Is it loud, or really a barely noticeable thing that only someone with very trained ears would pick up? Does the RME also suffer from this?
    5) And finally, what's your opinion of the preamps on each?

    The FF400 does look good, but I'm more inclined to go with the FF800 because of the extra inputs and pre's.
    Also I'm still somewhat confused, as there seem to be so few reviews of the Apogee out there. I'd love to know the pros and cons of each unit when working on a Mac with Logic, and whether there's a significant difference in quality between the two (enough to justify the extra cost of the Apogee, or whether I'd conversely be paying more for the brand than the actual content if I went with it). Both units look superb; the RME is older, but that doesn't always seem to mean inferior. It's very hard to decide :/
    Message was edited by: Per-Anders

  • Multicam media with sync timecode?

    Hi friends.
    Anyone here have any timecode-synchronized multicamera media at your
    disposal that I could get for some testing?
    Thanks.

    Hey Wes...what are you looking for? I could get you some. Email me off list (in profile).

  • RME out-of-sync summary

    Hi there experts,
    does anyone have an idea how to get the out-of-sync summary from RME in CiscoWorks sent by email (as a PDF, for example)?
    I thought I could set up a job to do this, but I can't seem to find one...
    Thanks for any feedback
    Rob

    Hi Robert,
    Unfortunately, there is no such way to get the info in email or pdf format at the moment.
    Many Thanks,
    Gaganjeet

  • FCP Sound Syncing Question

    Hey Everyone,
    I have an audio syncing question for you. Any help would be greatly appreciated!
    I am cutting in FCP. I logged and captured my footage, and with timecode breaks etc. the tape was cut up, so I was left with eight files; let's call them Footage1-Footage8. I have completed my cut of the piece using the in-camera audio that was linked to the video on the DV tape I imported. Our sound guy in the field used a DAT recorder to record the production audio and apparently captured the timecode to the audio files, so that it matches the video (i.e. the DAT recorder and our cameras, DVX-100s, had synced timecode during the shoot). The DVD he gave me has 20 audio files as .wav; let's call them Audio1-Audio20. My question is: now that I have finished my cut, I would like to replace the current audio tracks with the production audio. How can I sync the audio with the video and replace it in what I've cut? Or perhaps I can't, in which case: how can I sync them up and THEN cut? But I'd really rather not have to do that, lol!
    Thanks for your advice!

    Sorry about the long winded post, though.
    I guess my real question, summarized, is: is there a way to sync separately recorded production audio (that has timecode) with the footage that I shot (which has the same timecode as the audio, but also in-camera audio accompanying the video track; I would imagine I can just "cut" the extra audio tracks, however)?
    I'm at my bay trying to figure this out, and I've got nothing, and the people I work for don't want to spring for that bwf2xml program you were talking about. What's the emote for pulling one's hair out?
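For what it's worth, the arithmetic behind timecode-based syncing is simple: if the clip and the .wav carry matching timecode, the audio only needs to be offset by the difference between the two start timecodes. A hypothetical sketch (the start values below are invented for illustration; 30 fps non-drop assumed):

```python
def tc_to_frames(tc, fps=30):
    """Convert 'HH:MM:SS:FF' (non-drop) to an absolute frame count."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return (hh * 3600 + mm * 60 + ss) * fps + ff

# Hypothetical start timecodes read from the video clip and the DAT .wav
video_start = tc_to_frames("01:00:10:00")
audio_start = tc_to_frames("01:00:08:12")

# Positive offset: slide the audio this many frames later in the sequence
offset = video_start - audio_start
print(offset)  # 48 frames at 30 fps
```

The same subtraction is what any auto-sync tool does under the hood; doing it once per audio file by hand is tedious but entirely workable for 20 files.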

  • RME - Failed to get error string

    Constantly getting the error "Failed to get error string from logger 6203" whenever trying to retrieve the Startup/Running out of sync report.
    Context: RME > Configuration Mgmt > Startup/Running out of sync report.
    Any one has ever encountered such a problem?

    Hi
    Has the default location of the config files changed from:
    Windows: C:/PROGRA~1/CSCOpx/files/archive/config
    Solaris: /var/adm/CSCOpx/files/archive/config
    to some other directory or path?
    If Solaris are the permissions ok on
    /var/adm/CSCOpx/files/archive/config
    They should be 750 on the directories in here, and the .cfg files in these directories should be 640
    Also, the owner of the files and directories should be user casuser with group casusers and not root
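If there are many device directories to fix, that permission pass can be scripted. A minimal sketch (assumes it is run by a user allowed to chmod the archive; the casuser:casusers ownership change still needs chown as root):

```python
import os

def fix_archive_perms(archive):
    """Set RME config-archive permissions: directories 750, .cfg files 640.

    Ownership (casuser:casusers) is not touched here, since chown
    requires root; that step is left to the administrator.
    """
    for root, dirs, files in os.walk(archive):
        for d in dirs:
            os.chmod(os.path.join(root, d), 0o750)
        for f in files:
            if f.endswith(".cfg"):
                os.chmod(os.path.join(root, f), 0o640)

# Solaris default location from the post:
# fix_archive_perms("/var/adm/CSCOpx/files/archive/config")
```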
    David

  • Turning off Audio Sync

    Is there an easy way to turn off an individual timeline clip's desire to sync back up with its audio once I have deleted its audio? All my clips have large red sync timecodes on them and the "move into sync, slip into sync" dialogue gets in the way of the regular right-click dialogue box when I am trying to add composite effects to them.
    This is very annoying. Just looking for a quick setting I can change to alleviate these clips' need to sync.

    thanks Tom! You still manage to teach me.
    I took a short advanced edit workshop summer class from you about 7 years ago at Stanford...

  • Multicam editing without timecode and audio?

    Hello!
    I am editing a multi-cam interview with the unfortunate issues of 1. having no audio recorded from the secondary camera, and 2. the camera operators did not sync timecode or set the right time/date (DSLRs) and appear to have started/stopped rolling at different times, breaking the interview up into short clips. So I've been trying to figure out: what is the best way to go about this? Is it possible to use the multi-cam edit without audio, and if so, how do I get around it being broken into small clips that don't quite match?
    Any help would be much appreciated!
    -caleb

    Hi caleb116,
    Welcome to the Support Communities!
    The multicam editing feature in FCPX will help you work around these obstacles.  The link below explains the multicam editing workflow:
    Final Cut Pro X Help: Multicam editing workflow
    http://help.apple.com/finalcutpro/mac/10.1/#ver10e087fd
    Having the date, time and time zone set on your recording devices before the shoot is helpful for FCPX to create multicam clips automatically.  Normally, you can sync automatically using the audio in your clips.   However, since you don’t have that option in this case, you can just sync from the start of each clip and then enter the multicam editor and manually line up the angles.
    Final Cut Pro X Help: Assign camera names and multicam angles
    http://help.apple.com/finalcutpro/mac/10.1/#ver26f5c183
    Final Cut Pro X Help: Create multicam clips in the Browser
    http://help.apple.com/finalcutpro/mac/10.1/#ver23c764f1
    Final Cut Pro X Help: Sync and adjust angles and clips in the Angle Editor
    http://help.apple.com/finalcutpro/mac/10.1/#ver23c76b1a
    In addition to the Final Cut Pro X Help manual, there are additional resources, including web based tutorials, to explore multicam editing.  Check them out here:
    Apple - Final Cut Pro X - Resources
    http://www.apple.com/final-cut-pro/resources/
    I hope this information helps ....
    Have a great day!
    - Judy

  • Audio Timecode changes when match framing

    I'm working in Premiere CC v7.2.2. While working with synced audio, I've noticed that when matching back from audio in my sequence, the timecode displayed in my source monitor is off by a matter of minutes. Each .wav file I'm using has about 10 tracks of audio. This is slowing our workflow considerably and could cause problems for us down the line. Has anyone else experienced this issue? Any help would be appreciated!

    kmbarry23_,
    Thank you for snapping photos to clarify this problem.
    Richard, by default, the "audio time units" (samples) option is unchecked in the Program Monitor, and the affected timecode to the right of the Program Monitor's wrench/spanner refers to the last frame of media in the sequence, not the timecode of the clip or audio file your playhead is parked on. The problem is that the dual-system audio overlay timecode, as kmbarry23_ has revealed, is always displayed in "audio time units" or samples. The only preference in Overlay settings is "source timecode," so there is nothing to change.
    The camera timecode overlay displays correctly, so you have to match frame on your clip, make a marker, then match frame on your audio, type in the camera timecode, make a marker, and line the two up in the sequence; not too shabby, but it defeats the efficiency and usefulness of overlays. Coupled with the Indeterminate Media Timebase preferences under "Media" always defaulting to 29.97 and the random glitch of drop frame defaulting in the source monitor, it could be confusing how Premiere was really reading the field recorder's timecode.
    Perhaps not all editors are experiencing this phenomenon depending on whatever field recorder was used, but I am still experiencing this problem in CC 2014 (last two week's big update) with jam-synced Ambient timecode from a Sound Devices 744t into a Sony F55 as well as a Red Epic.
    Yes, Pluraleyes is very popular, but not without its own set of glitches. I'd prefer to use Overlay settings in Premiere for syncing when the effort of jam-synced timecode was administered. I'll submit a bug report.
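Until the overlay behaviour changes, converting the sample-count readout to timecode by hand is at least mechanical: divide by the sample rate to get seconds, then format. A rough illustration (48 kHz and 24 fps non-drop assumed; real projects may be 23.976 or 29.97, which changes the frame term):

```python
def samples_to_tc(samples, sample_rate=48_000, fps=24):
    """Convert an 'audio time units' position to HH:MM:SS:FF (non-drop)."""
    seconds, rem = divmod(samples, sample_rate)
    ff = int(rem * fps / sample_rate)        # leftover samples -> frames
    hh, rest = divmod(int(seconds), 3600)
    mm, ss = divmod(rest, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# 1h 1m 1s plus half a second of samples:
print(samples_to_tc(48_000 * 3661 + 24_000))  # 01:01:01:12
```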

  • Subframe Audio Sync Question

    I'm syncing audio recorded from a separate source, and the shot is slated.
    I want to sync the waveform to picture as close to 1/100th of a frame as possible.
    I didn't think I would ever need to ask a question like this, but there's a debate brewing here, so here goes. The question is more about how FCP works and how it plays a single frame: when zoomed all the way in, you can see the dark area representing that single frame. The playhead shows the beginning of the frame, but doesn't it play through the audio waveform of the single frame while that frame is "paused" for 1/24th of a second?
    There are two theories we're seeing out there; one is to sync the waveform at the frame where you see the slate is closed, even though it happened in the space between the previous frame. The other is that you can reasonably estimate where the slate would have hit between frames and allow the waveform to land just before the frame where the slate is closed.
    For example, say you have a frame where the slate is just about to hit and it was moving fast, so fast there's motion blur, but there's a small enough space where you know the sound hasn't happened yet, maybe only a centimeter or less. Obviously the next frame shows the slate closed, and you can guess the slate actually touched between the time each frame was exposed, probably even closer to the previous frame.
    So, where exactly to place the waveform? At the beginning of the frame where it's closed, or at the end of the previous frame where it would have happened in real life?
    Here's a silly attempt at an illustration of the dark area representing a single frame on the timeline and the waveform underneath:
    {Single frame}{Single frame}
    .................||......................
    {Single frame}{Single frame}
    .....................||..................
    (The || represents the waveform generated by the slate.)
    If anyone could weigh in on this and settle the debate we would appreciate it! Thanks in advance.

    I've dealt with this issue in some detail when assisting and I'd say your thinking is sound (no pun intended) regarding the way fcp displays a single frame.
    We were moving to a Deva recorder using synced timecode and were getting consistently confusing results where the audio seemed to be in advance of the picture. When we tested by recording directly to the camera, we found that the behaviour we were seeing was normal, i.e. the audible clap often occurred on the frame before the sticks were visibly closed. Obviously on previous series, where we'd used the sound recordist's stereo mix track from the camera tape, we'd never had any reason to pay attention to the sync; we weren't doing any syncing, so we just assumed it was right. When I looked back at those old rushes tapes I found that this behaviour had always been the case.
    d-trick wrote:
    For example, say you have a frame where the slate is just about to hit and it was moving fast, so fast there's motion blur, but there's a small enough space where you know the sound hasn't happened yet, maybe only a centimeter or less. Obviously the next frame shows the slate closed, and you can guess the slate actually touched between the time each frame was exposed, probably even closer to the previous frame.
    So, where exactly to place the waveform? At the beginning of the frame where it's closed, or at the end of the previous frame where it would have happened in real life?
    Here's a silly attempt at an illustration of the dark area representing a single frame on the timeline and the waveform underneath:
    {Single frame}{Single frame}
    .................||......................
    {Single frame}{Single frame}
    .....................||..................
    So in this case I'd say the first is 'correct' for film or interlaced video at a high shutter speed. At low shutter speed I'd say it's a toss up, it's too hard to tell if the sticks shut in the time between frame 1 and frame 2 being exposed or just too early in frame 2 to produce any detectable motion blur. That said, I don't think anyone would be able to tell the difference between the two anyway.
    Of course in an app that doesn't support subframe editing the alternatives would actually be
    {Single frame}{Single frame}
    .................||......................
    {Single frame}{Single frame}
    ......................................||.
    So the second option is clearly delayed which makes it a no-brainer, and saves you the agony of choice.
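For anyone weighing how much precision is actually at stake here, the numbers are easy to work out: at 24 fps one frame spans 2,000 samples of 48 kHz audio, so a 1/100th-frame placement error is well under a millisecond. A quick back-of-the-envelope calculation:

```python
SAMPLE_RATE = 48_000   # Hz, typical production audio
FPS = 24               # film frame rate

samples_per_frame = SAMPLE_RATE // FPS    # 2000 samples per frame
frame_duration_ms = 1000 / FPS            # ~41.67 ms per frame

# A 1/100th-frame placement error corresponds to:
subframe_error_samples = samples_per_frame / 100   # 20 samples
subframe_error_ms = frame_duration_ms / 100        # ~0.42 ms

print(samples_per_frame, round(subframe_error_ms, 2))
```

Commonly cited thresholds for noticeable lip-sync error are an order of magnitude larger than that, which backs up the point that nobody would hear the difference between the two placements.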

  • Auxiliary timecode and drop/nondrop timecode

    I'm new to FCP and am trying to edit a 4-camera shoot that didn't have synced timecode. I'm trying to use auxiliary timecode to sync the footage. Here's my question: If the source timecode is nondrop frame, but the FCP sequence is drop frame, do I make the auxiliary timecode drop frame or nondrop frame?
    Also: I can't watch the multiclip in the timeline on Safe without rendering. If I watch it on Unlimited, it keeps dropping frames on playback. Can I edit anyway, or will I have a mess later?

    The 10.4.7 update won't hurt you any... just run Software update to get it.
    Is the LaCie drive really full?
    If you would, download a utility from AJA.com, here: http://www.aja.com/html/supportkona3swd.html The System Test is what you want, about 1/2 way down the page.
    Run that on the LaCie... the performance tests. Post back what its Sustained Read is. It needs to be pretty fast to run several streams of video at the same time; that's what you are trying to do when you play a multiclip.
    Be sure that multiclip playback is turned on in the RT pop up menu in the upper left area of the timeline window too.
    Jerry
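On the drop/nondrop half of the question, which the reply above doesn't address: drop-frame timecode drops frame *labels* 00 and 01 at the start of each minute (except every tenth minute), not actual frames, so the label stays close to wall-clock time at 29.97 fps. A sketch of the label-to-frame-count arithmetic, for illustration only (this is just the counting rule, not an FCP feature):

```python
def df_timecode_to_frames(hh, mm, ss, ff, fps=30):
    """Convert a 29.97 drop-frame timecode label to an absolute frame count.

    Drop-frame skips frame labels 00 and 01 at the start of every minute,
    except minutes divisible by 10: 2 frames * 54 minutes = 108 labels
    dropped per hour.
    """
    total_minutes = hh * 60 + mm
    dropped = 2 * (total_minutes - total_minutes // 10)
    return (hh * 3600 + mm * 60 + ss) * fps + ff - dropped

# One hour of drop-frame labels covers 108000 - 108 = 107892 real frames
print(df_timecode_to_frames(1, 0, 0, 0))  # 107892
```

The practical upshot for auxiliary timecode is consistency: as long as every angle's auxiliary timecode uses the same counting mode, they line up; mixing drop and nondrop labels introduces a drift of 108 frames per hour between angles.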

  • RME 4.3.1 & Nexus 7000 device package

    I have 27 devices that RME reports as "Out-of-Sync"; among those are some Nexus 7K devices, with the sysObjectId .1.3.6.1.4.1.9.12.3.1.3.612, which according to the "Supported Device Table for LMS 3.2" is supported in RME 4.3.1.
    Searching for device package with that sysObjectId I get no result in Common Services. However, when performing a "Device Update" with my CCO account, Ciscoworks tells me that "All the available package(s) at source location are already installed."
    What do I have to do to get that device package?

    Hi,
        To check / download the latest RME Device Package for Nexus 7k support, please do the following :
    1. Go to Common Services -> Software Centre -> Device Update and click on the hyperlink for 'CiscoWorks Common Services'
    * You should see 2 packages listed  - Device Data and MDF.
    * The latest versions of these are 1.10 and 1.48 respectively.
    * If you don't have the latest versions installed, please run an update to install these.
    2. Once you have the most up-to-date packages for Common Services installed, go to Common Services -> Software Centre -> Device Update and click on the number in the 'Device Type Count' column for Resource Manager Essentials.  Filter the list of devices by Package Name = Nexus.
    * Do you see sysObjectId .1.3.6.1.4.1.9.12.3.1.3.612 displayed ?
    * What version of the Nexus device package do you have installed ? (The latest version is 2.5)
    3. Even if you have the latest Nexus package installed, run a device update for RME so that you have loaded the latest versions of all packages (I see there were some new updates for shared packages, e.g. LibCommon, as recently as 21st January 2011).
        In terms of the devices being reported as 'Out-of-Sync', this means that RME has detected that the running configuration and the startup configuration on the device are different. This could be because RME doesn't have the most recent copy of either config, or it could be a real situation of the startup and running config being out of sync on the device.  If you go to the Out-of-Sync Summary page, select the Nexus 7k's and click on 'Sync on Device', RME will run a job to log in to the devices and execute a copy run start.
    * If your Nexus 7k's are still reported as 'Out-of-Sync' after the job completes then we'll need to investigate further.
    Regards
    Derek Clothier
