Timing Accuracy

Used Acrobat 9 to create a slide show with automatic transitions and background music.  The problem is synchronizing the music with the slides, since the same PDF file played on different computers has the music ending at different spots (photos) in the slideshow.  Since the music presumably plays at the correct speed, is there a way to make the xx-second auto-flip interval more accurate, so that the music and the slideshow end at the same point regardless of which computer the PDF file is played on?

Again, your response provoked me into more timing experiments.  This time I used the same PDF file as before but changed the transitions to NONE and again timed the playback on two machines.  The result was that the transitions ADD to the total length of the slides (unlike in a video editor), so the total time on both machines was substantially shorter (but still longer than the number of pages x auto-flip time).  HOWEVER, there was still about a 20-second DIFFERENCE in the total playback time to reach the same point in the PDF file.  You may be correct in saying that CPU/GPU differences are making the difference, but in that case the difference comes from displaying the pages, NOT from the transitions!  So again, there is a fairly large timing inaccuracy when using auto-flip.

Similar Messages

  • DirectX, Authorware and Timing Accuracy

    Hi,
    Does anybody know if Authorware’s default Response Times (ResponseTime) use DirectX?
    If not, can anyone point to ways Authorware can use DirectX to support response registration and graphics presentation with high timing accuracy (temporal resolution)? And also use DirectX to interact with the output (e.g., video and sound) and input interfaces (e.g., keyboard, mouse)?
    Cheers,
    Miles.

    I am indeed fairly certain Authorware uses the system clock for such timings. I was not aware you could use DirectX in such a fashion... Very happy to hear you recognize Authorware's flexibility and power! It's a shame it hasn't kept up with technology. However, if it's within your skillset, Authorware can be extended through use of custom DLLs created as 'Xtras' or 'U32' files. I believe the Authorware CD contains examples of how to create your own (using tools like Delphi). So perhaps the functionality you seek is still possible...
    Best of luck!
    Erik
    m1lesr1969 wrote:
    > Thanks Erik. I presume Authorware uses the system clock for its response times then?
    >
    > The reason I ask (in case you wondered) is that I use Authorware for experimental purposes (stimulus and response recording). There are bespoke packages designed to do this (all use DirectX for millisecond accuracy), but Authorware is a much more powerful package in terms of conditional branching, feedback, random conditions etc. It can also run off the web, and exe's can be used on machines without a licence - making it more flexible and cheaper too! Only downside is the timing accuracy, which is good, but not the best....
    Erik Lord
    http://www.capemedia.net
    Adobe Community Expert - eLearning
    http://www.adobe.com/communities/experts/
    http://www.awaretips.net - Authorware Tips!

  • Timing accuracy of individual samples within any sampler plug or patch.

    I'm referring to the editing done within any given sample product on the market (by the manufacturers).
    I have observed different amounts of "dead air" at the beginning of the individual sample files that comprise a given sampler instrument patch.
    I sometimes re-edit all the separate files myself for a given patch if it is too far "off". Of course, this is impossible with some libraries that don't let you access these files. Sometimes I get a patch that is perfect except for two or so individual files. When this happens I have to "go into" the rendered audio track and cut and move these into place on the timeline. Anyone else observed this?
    BTW, the improved timing from deleting the dead air IS VERY NOTICEABLE in the groove of a track!!!
    G4 450DP   Mac OS X (10.4.4)   digidesign, motu, apogee

    I too have done this, with pleasing results. Audiofile Engineering makes an app (SampleManager) which, among many other things, trims files' heads and tails by separate thresholds. If you want to do this to a large library, check it out and save yourself a few days.

  • Deterministic Timed loop

    I am right now using the PCI-6229 for some critical machine functions and data logging inside of a 50 ms timed loop in LV8.0. The clock source is the 1 kHz timebase from the OS.
    However, the timing accuracy is inadequate, as it varies whenever there is a mouse movement or some such activity (even though I make sure that there is no other concurrent program running on the WIN_XP system). Normally for most other projects this is not an issue, but in this particular machine I need to switch on/off a solenoid at 0.25-second intervals, and the resulting pressure curves are plotted. Even a small drag in timing flattens the curve.
    Is it possible to have a hardware timing source (from the PCI-6229 card) for the timed loop so that it is more deterministic, in relative terms? I know that WIN_XP is not real time, but for the time being I am looking at getting closer to it without a major hardware change.
    Thanks
    Raghunathan
    LV2012 to Automate Hydraulic Test rigs.

    You are correct that Windows is not deterministic. While there are some things you could do to make it more deterministic, the timing errors will still effectively be unbounded. Also note that although you don't see any other programs running, WinXP is always running tasks in the background that can steal CPU cycles from LabVIEW, however high LabVIEW requests its own relative priority to be.
    Your best solution will involve separating the HW timing in your DAQ board from the SW timing in your application (the timed loop). Yes, you can use the sample clock on your DAQ board as a timing source for your loop, but this is still going to be dependent on software timing and won't remove the jitter you are seeing.
    You will want your timing to be solely dependent on a continuous or finite DAQmx task that uses a sample clock. For instance, if you are using DIO lines to switch your solenoids on or off, you will want to use one of the 32 clocked DIO lines on your 6229 to do so. I believe your card supports at least a 1 MHz sample clock for digital stream generation, though you should double-check me there. You could then design your digital pattern so that the data is sent out at a clocked rate. For instance, set the sample clock period to 0.25 seconds (a 4 Hz update rate), then have your digital pattern be a finite sequence of F-T-F. The digital lines will be updated according to a HW clock on your DAQ board that is independent of Windows.
    Check the Example Finder for examples of doing clocked Digital Output.
    Jarrod S.
    National Instruments
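
    A rough C sketch of the clocked digital-output approach described above, for anyone working from the DAQmx C API rather than LabVIEW. The device name "Dev1", the use of ctr0 to generate the 4 Hz sample clock, and the single line port0/line0 are my assumptions, not details from the thread; treat it as a sketch to adapt, not a verified 6229 program.

        #include <stdio.h>
        #include <NIDAQmx.h>

        #define DAQmxErrChk(call) if (DAQmxFailed(error = (call))) goto Error;

        int main(void)
        {
            int32      error = 0;
            TaskHandle clkTask = 0, doTask = 0;
            char       errBuff[2048] = {'\0'};
            /* F-T-F pattern: solenoid off, on, off -- one sample every 0.25 s */
            uInt8      pattern[3] = { 0, 1, 0 };

            /* Counter 0 generates a continuous 4 Hz clock (0.25 s period). */
            DAQmxErrChk(DAQmxCreateTask("", &clkTask));
            DAQmxErrChk(DAQmxCreateCOPulseChanFreq(clkTask, "Dev1/ctr0", "",
                        DAQmx_Val_Hz, DAQmx_Val_Low, 0.0, 4.0, 0.5));
            DAQmxErrChk(DAQmxCfgImplicitTiming(clkTask, DAQmx_Val_ContSamps, 1000));

            /* Finite, hardware-clocked DO task driven by the counter output. */
            DAQmxErrChk(DAQmxCreateTask("", &doTask));
            DAQmxErrChk(DAQmxCreateDOChan(doTask, "Dev1/port0/line0", "",
                        DAQmx_Val_ChanForAllLines));
            DAQmxErrChk(DAQmxCfgSampClkTiming(doTask, "/Dev1/Ctr0InternalOutput",
                        4.0, DAQmx_Val_Rising, DAQmx_Val_FiniteSamps, 3));
            DAQmxErrChk(DAQmxWriteDigitalLines(doTask, 3, 0, 10.0,
                        DAQmx_Val_GroupByChannel, pattern, NULL, NULL));

            DAQmxErrChk(DAQmxStartTask(doTask));   /* arm the DO task first...  */
            DAQmxErrChk(DAQmxStartTask(clkTask));  /* ...then start the clock   */
            DAQmxErrChk(DAQmxWaitUntilTaskDone(doTask, 10.0));

        Error:
            if (DAQmxFailed(error)) {
                DAQmxGetExtendedErrorInfo(errBuff, 2048);
                printf("DAQmx error: %s\n", errBuff);
            }
            if (doTask)  { DAQmxStopTask(doTask);  DAQmxClearTask(doTask); }
            if (clkTask) { DAQmxStopTask(clkTask); DAQmxClearTask(clkTask); }
            return (int)error;
        }

    Once the pattern is in the board's buffer, the 0.25 s spacing comes from the counter's hardware clock, so mouse movement and other Windows activity no longer affect it.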

  • What happened to Emagic's good old AMT = Advanced Midi Timing in Logic 7, 8

    I wonder if anybody knows whether Apple dropped the AMT algorithm enabling Emagic's AMT 8 and Unitor 8 Mk 1 and 2 to be more accurate in MIDI timing than ordinary interfaces without AMT. The last button/check box related to AMT (Advanced MIDI Timing) I saw was in Logic 6. Steinberg and MOTU had their own timing accuracy systems then.
    Since AMT is not mentioned a single time in the whole Logic Studio manual, I suppose this very useful tool has been sacrificed by Apple's dingy future spirit.
    Not to be misunderstood: I know the interfaces can be installed, are supported, and run on current machines and systems as normal MIDI interfaces.

    Just having a random search around:
    http://developer.apple.com/audio/pdf/coreaudio.pdf
    "The MIDIPacketList are a time-stamped list of packets to or from one endpoint. All MIDI I/O functions use this structure. The MIDIPacket contains one or more simultaneous MIDI events."
    Also, back when Logic was first moving to OSX, emagic had an article about this as well, although that's long since gone.
    OSX's time stamping isn't exactly the same as AMT in detail, but is in essence the same thing. AMT works by time stamping events ahead of time, so the interface can fire them off as close to perfectly as the current MIDI stream allows. If you have two events at 1 1 1 1, but on different ports, an AMT interface can fire those events at precisely the same time, which can't happen with a regular (non-timestamped) MIDI stream, as the stream is serial in nature.
    Because OSX has time stamping built in to the MIDI server, theoretically all MIDI output operations under OSX should take advantage of this - but at the geeky level, I haven't been programming with this so I'm not completely up on all the details - it may be that programs have to be written specifically to take advantage of it.
    In any case, it's for these reasons that AMT was never supported in Logic under OSX - it was basically redundant.
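
    For the curious, this is roughly what the Core MIDI time stamping described above looks like from C. It is only an illustrative fragment under my own assumptions: client, port and endpoint setup (MIDIClientCreate, MIDIOutputPortCreate, MIDIGetDestination) is omitted, and the note data and the 100 ms offset are made-up values.

        #include <CoreMIDI/CoreMIDI.h>
        #include <mach/mach_time.h>

        /* Schedule the same note-on on two destinations with one shared
           timestamp, so the MIDI server/driver can fire them together. */
        static void send_timestamped(MIDIPortRef outPort,
                                     MIDIEndpointRef destA, MIDIEndpointRef destB)
        {
            Byte noteOn[3] = { 0x90, 60, 100 };   /* note on, middle C, velocity 100 */

            /* "now + 100 ms", expressed in host-clock ticks */
            mach_timebase_info_data_t tb;
            mach_timebase_info(&tb);
            MIDITimeStamp when = mach_absolute_time() +
                                 (100000000ULL * tb.denom) / tb.numer;

            Byte buffer[128];
            MIDIPacketList *pktlist = (MIDIPacketList *)buffer;
            MIDIPacket *pkt = MIDIPacketListInit(pktlist);
            pkt = MIDIPacketListAdd(pktlist, sizeof(buffer), pkt, when, 3, noteOn);

            MIDISend(outPort, destA, pktlist);    /* both endpoints get the same   */
            MIDISend(outPort, destB, pktlist);    /* timestamp, which is what AMT  */
                                                  /* achieved in hardware          */
        }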

  • 1MHz Software Timing

    Without LabVIEW Real-Time it is not possible to create a 1 MHz loop. You can either have a really fast (unthrottled) loop whose speed depends on your system, or one with a period that is some multiple of 1 ms. I understand that Windows is not deterministic and the loop will be extremely jittery.
    I am contemplating proposing an idea on the Idea Exchange for something like a Wait uS function. It seems like it should be possible, since Get Date Time in Seconds returns a value with sub-ms precision. I would like to have a 1 MHz loop, jittery as it would be.
    Is there something inherently wrong with that idea?
    =====================
    LabVIEW 2012

    I find this discussion interesting, since it comes up here and there all the time. And it is not limited to LV by any means.
    In C#, a product of Microsoft (MS), you have a Timer class in Windows Forms that you can use to time your application. Reading through the documentation, you will find two very important constraints (link):
    a) It is for use in single threads only
    b) Limited to an accuracy of 55ms
    Quote:
    "The Windows Forms Timer component is single-threaded, and is limited to an accuracy of 55 milliseconds. If you require a multithreaded timer with greater accuracy, use the Timer class in the System.Timers namespace."
    Following the suggested "workaround", you will find this:
    Quote (link):
    "The Timer component is a server-based timer, which allows you to specify a recurring interval at which the Elapsed event is raised in your application.[..]The server-based Timer is designed for use with worker threads in a multithreaded environment."
    But you will have to define an Interval for your Elapsed event:
    Quote (link):
    "The time, in milliseconds, between Elapsed events"
    So you see that MS does not provide any tool to achieve timing with better accuracy than 1 ms. Nevertheless, you CAN achieve higher accuracy, since the timing source is indeed running faster than 1 kHz. But keep in mind that timing is always a difficult issue.
    Most CPU architectures do not supply a clock which can be divided down to exactly 1 ms. Therefore, simply counting those clock ticks will not result in exact timings with a 1 ms resolution.
    There are projects to supply higher resolutions on Windows (e.g. read this) but you have to admit that it does not seem to be simple.
    Now coming back to LV:
    NI does supply access to a higher-resolution clock if you have LV RT installed. Even though the API works on Windows as well, you most probably will run into the question Wart already raised in his answer in this thread. It would only introduce confusion for inexperienced/unaware users. It is already often enough an issue that users do not believe that 1 ms is not deterministic on a Windows OS...
    For all interested in a discussion about timing accuracy, you might find this community entry very interesting.
    hope this helps,
    Norbert
    CEO: What exactly is stopping us from doing this?
    Expert: Geometry
    Marketing Manager: Just ignore it.
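
    To make the "you CAN do better than 1 ms, just not with the stock timers" point concrete, here is a small C sketch using the Windows performance counter. It busy-waits, so it burns an entire core, and the jitter is still unbounded on a desktop OS; it only demonstrates that the underlying timebase is much finer than 1 ms. A sketch, not a recommended pattern.

        #include <windows.h>
        #include <stdio.h>

        /* Busy-wait for roughly 'us' microseconds using the performance counter. */
        static void wait_us(double us)
        {
            LARGE_INTEGER freq, start, now;
            QueryPerformanceFrequency(&freq);          /* counts per second */
            QueryPerformanceCounter(&start);
            double target = us * 1e-6 * (double)freq.QuadPart;
            do {
                QueryPerformanceCounter(&now);
            } while ((double)(now.QuadPart - start.QuadPart) < target);
        }

        int main(void)
        {
            LARGE_INTEGER freq, t0, t1;
            QueryPerformanceFrequency(&freq);
            QueryPerformanceCounter(&t0);
            for (int i = 0; i < 1000; i++)
                wait_us(100.0);                        /* nominal 100 us per call */
            QueryPerformanceCounter(&t1);
            printf("1000 x 100 us took %.3f ms\n",
                   1000.0 * (double)(t1.QuadPart - t0.QuadPart) / (double)freq.QuadPart);
            return 0;
        }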

  • Acquiring High resolution data from usb mouse movement

    Is there a way of acquiring high-resolution data from the movement of a USB mouse?  I have an NI PCI-6221 DAQ card to use. I have a Pentium 4 PC with 1 GB of RAM. I need to get the position, velocity and acceleration of the mouse.
    Is there a way to do it with the above hardware?
    Thanks in advance

    I don't see how you could use a PCI-6221 to get high resolution mouse movement measurements. The PCI-6221 can acquire voltages at up to 250kS/s, but what voltage would you measure? It could also read in digital data at 1MHz, but you would have to be able to understand the USB communication protocol your mouse uses, and I doubt that your mouse vendor will give out that information. You might be able to take your mouse apart and hook up leads to the sensors around the trackball and do a sort of quadrature encoder application, but there's no guarantee we're dealing with TTL digital signals here (you might even be using an optical mouse!).
    Your best option - and I have no idea how good this option is - is to use the driver already installed for your usb mouse. What software application are you going to use to program your application that will measure the mouse movements?
    If you would consider using LabVIEW, you could set up an event structure to capture mouse movements across the front panel. Each event registered includes a timestamp from the CPU's millisecond clock as well as the coordinates of the mouse. If you stored up a buffer of previous mouse positions and times, you could infer a velocity, perhaps with the help of interpolation functions.
    All of this would have somewhere on the level of millisecond timing accuracy. Were you envisioning something else?
    Jarrod S.
    National Instruments
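
    If LabVIEW is not a hard requirement, a plain Win32 polling loop in C gives an idea of what millisecond-level mouse tracking looks like. This is not the event-structure approach described above, just an alternative sketch; positions are in screen pixels and the 1 ms Sleep is only nominal, so expect the same millisecond-class jitter.

        #include <windows.h>
        #include <stdio.h>
        #include <math.h>

        /* Poll the cursor about once per millisecond and estimate velocity
           from successive (time, x, y) samples. */
        int main(void)
        {
            POINT prev, cur;
            LARGE_INTEGER freq, tPrev, tCur;
            QueryPerformanceFrequency(&freq);
            GetCursorPos(&prev);
            QueryPerformanceCounter(&tPrev);

            for (int i = 0; i < 1000; i++) {
                Sleep(1);                              /* ~1 ms between samples */
                GetCursorPos(&cur);
                QueryPerformanceCounter(&tCur);
                double dt = (double)(tCur.QuadPart - tPrev.QuadPart) / (double)freq.QuadPart;
                double dx = (double)(cur.x - prev.x);
                double dy = (double)(cur.y - prev.y);
                double v  = sqrt(dx * dx + dy * dy) / dt;   /* pixels per second */
                printf("dt = %.4f s, v = %.1f px/s\n", dt, v);
                prev  = cur;
                tPrev = tCur;
            }
            return 0;
        }

    Acceleration would come from differencing successive velocity estimates, which is noisy; some smoothing or the interpolation suggested above would be needed in practice.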

  • Can LabVIEW threads sleep in increments less than a millisecond?

    I am aware of two LabVIEW sleep functions:
    1) All Functions | Time & Dialog | "Wait (ms)"
    2) All Functions | Time & Dialog | "Wait Until Next ms Multiple"
    In this day and age, when 3GHz processors sell for less than $200, it seems to me that a millisecond is an eternity. Is there any way to tell your LabVIEW threads to sleep for something less than a millisecond?
    In Java, the standard Thread.sleep() method is written in milliseconds [sorry, the bulletin board software won't let me link directly]:
    http://java.sun.com/j2se/1.4.2/docs/api/java/lang/Thread.html#sleep(long)
    but there is a second version of the method that allows for the possibility of nanoseconds:
    http://java.sun.com/j2se/1.4.2/docs/api/java/lang/Thread.html#sleep(long, int)
    So there does seem to be some consensus that millisecond sleep times are getting a little long in the tooth...

    Hi Tarheel!
    Maybe you should get some idea of the kind of timing accuracy that you can reach when using a loop.
    Use the attached vi, which repeatedly runs a For loop (10 iterations) reading the time, then calculates the average and standard deviation of the time difference between the loop iterations.
    On my PC (P4, 2.6 GHz, W2K), I get a standard deviation of about 8 ms, which appears to be independent of the sleep duration I asked for.
    Same thing with a timed loop.
    Under MacOS X (PowerBook, 1.5 GHz), the SD falls to 0.4 ms.
    I tried to disable most of the background processes running on my PC, but I could not get a better resolution.
    It seems that the issue is not in LV but in the way the OS manages its internal reference clock.
    Since you are a Java aficionado, maybe you could produce something equivalent?
    A proof that nanosecond resolution is available on a PC would be of great help to NI. Why bother with costly timers on DAQ cards?
    By the way, it took me about one minute to create the attached vi. I would like to have an idea of the time required to do the same thing in Java.
    Tempus fugit...
    CC
    Chilly Charly    (aka CC)
             E-List Master - Kudos glutton - Press the yellow button on the left...        
    Attachments:
    Timing precision.zip ‏11 KB
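
    CC asks for a Java equivalent; for comparison, here is roughly the same measurement in portable C (clock_gettime/nanosleep): request a 1 ms sleep repeatedly and report the mean and standard deviation of the actual period. On a desktop OS the standard deviation typically lands in the same millisecond range CC reports, nowhere near nanoseconds.

        #include <stdio.h>
        #include <time.h>
        #include <math.h>

        /* Sleep 1 ms repeatedly; report mean and SD of the measured period. */
        int main(void)
        {
            enum { N = 1000 };
            double dt[N];
            struct timespec req = { 0, 1000000L };    /* 1 ms requested sleep */
            struct timespec prev, now;

            clock_gettime(CLOCK_MONOTONIC, &prev);
            for (int i = 0; i < N; i++) {
                nanosleep(&req, NULL);
                clock_gettime(CLOCK_MONOTONIC, &now);
                dt[i] = (now.tv_sec - prev.tv_sec) * 1e3 +
                        (now.tv_nsec - prev.tv_nsec) * 1e-6;   /* in ms */
                prev = now;
            }

            double mean = 0.0, var = 0.0;
            for (int i = 0; i < N; i++) mean += dt[i];
            mean /= N;
            for (int i = 0; i < N; i++) var += (dt[i] - mean) * (dt[i] - mean);
            var /= N;
            printf("mean = %.3f ms, sd = %.3f ms\n", mean, sqrt(var));
            return 0;
        }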

  • Using Laptop to time sprints and running events

    I would like to use a computer to assist in timing and keeping track of the 15 soccer players that I coach. The timing accuracy need only be one tenth of a second. I'm not looking for anything fancy, just a simple method for the kids to time themselves as they practice each day.
    Can anyone help me with a reference or two?
    thanks

    I guess I should provide more info to make the question clearer. I would like to time sprints (never more than one lap). The key would be some type of hardware accessory that would turn on the stopwatch at the 'start' and some type of switch to stop the stopwatch at the 'finish'. That way the boy/girl could time themselves. That's my goal: to allow the kids to time themselves each day at practice when they do their 40-yard sprint. I was thinking of a switch under their foot that activates when they step/lift their foot. Another alternative would be a light beam that "switches" when the beam is broken.
    Any ideas?

  • iMovie 11 Freezing

    I am making a project for my son's all-star baseball team, but iMovie 11 keeps freezing and locking up the whole system. No beachball, just locking up.

    No, not like you would find on a Windows PC, for example Aegisub, which is a very well made open-source subtitle editor. What iMovie has are embedded or hard-coded subtitles burned right into the video frames after you Share the movie (exporting it, really) to the Desktop. There are some title choices that can mimic the subtitles you see in the movies, but you don't get the timing accuracy required to do real subtitling, and that's what a program like Aegisub provides: timing, two rows of text, and all kinds of options to not just add but refine the titles.
    The closest you can get to a subtitle in iMovie '11 is the Lower Third and Lower options under the Titles toolbar (the letter T tool).

  • Time Resolution in ELVIS DAQ

    Hi All,
    I need to find the time difference between two pulses that are generated by my circuit. For this, I need to know how reliable the time stamps are in data collected from the NI ELVIS DAQ board (i.e. if I sample data at 1 kHz, how sure can I be that the delta t between two data points is 1 ms)?
    Thanks,
    Jari

    Hi Jari,
    Are you using ELVIS I or ELVIS II? If you are using ELVIS I, the specs will depend on which DAQ card you are plugging into. With ELVIS II, the DAQ is built in, so the specifications are available in this manual: http://www.ni.com/pdf/manuals/372590b.pdf, which specifies that the timing accuracy is 50 ppm. If you are not familiar with ppm used to describe clock accuracy, I recommend looking through this article: http://www.best-microcontroller-projects.com/ppm.html. One caveat to keep in mind is that the sampling rate that you specify will be generated by dividing down the internal base clocks. Therefore, the sampling rate you specify should be the internal timebase divided by an integer (ELVIS II has 0.1, 20 and 80 MHz timebases). Otherwise, the DAQ will just round to the nearest sampling rate that can be derived by dividing the internal timebase by an integer value. See this document for more information: http://digital.ni.com/public.nsf/allkb/4BBE1409700F6CE686256E9200652F6B.
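
    As a rough worked example of what 50 ppm means in this context (the arithmetic is mine; only the 50 ppm figure comes from the manual): a nominal 1 kHz sample clock may actually run anywhere between about 999.95 Hz and 1000.05 Hz, so each 1 ms delta-t is off by at most about 50 ns, and the accumulated timestamp error after 10 s of acquisition is at most about 0.5 ms. For measuring the separation of two pulses, that clock error is usually small compared with the +/- one sample period (1 ms at 1 kHz) uncertainty in where each pulse edge falls, so sampling faster generally buys you more than a better clock would.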

  • How can I change the speed of a stepper motor?

    HSI bipolar Linear Actuator Size14, HSI Chopper Driver 40105, NI PCI-6052E, NI SC-2345, 2 SCC-DO01 modules, and SCC-FT01
    The motor works at one speed with a boolean array input, using the SCC-DO01.  But I need to be able to change the speed of the motor. Specifically, it needs to operate at several frequencies, following displacement-vs.-time functions such as a sine wave, triangle wave, and square wave. I tried to find information about using a digital pulse train but had no luck. Is it even possible to accomplish this with this hardware?

    In general E-Series DAQ-boards like the PCI-6052E are not very good for motion control tasks. The 6052E doesn't provide hardware timed digital I/Os so you can control the digital lines only by software. This is both slow and inaccurate as long as you are running the board under a non-deterministic OS like Windows.
    On a real-time OS the jitter would be better but still the frequency would be low and it would still be hard to program accurate acceleration and deceleration ramps.
    If you use the counter/timer instead of the digital I/Os you could reach higher velocities and a much better timing accuracy but it would be even harder to program acceleration/deceleration ramps which you will need if you want to reach higher velocities with your stepper motor. Additionally for motion control applications you may want features like limit switch monitoring and following error tracking which require again real-time capabilities.
    With this said I don't recommend the usage of a multifunction DAQ board as a motion control board. Please have a look at the NI motion control boards like the PCI-7332 which does all the motion I/O and trajectory control onboard.
    Best regards,
    Jochen Klier
    National Instruments Germany
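
    For completeness, a minimal C/DAQmx fragment of the counter/timer idea mentioned above: a continuous pulse train on ctr0 whose frequency sets the step rate. "Dev1", the 500 and 1000 steps/s values, and the assumption that the board accepts a frequency change while the task runs are mine, and the fragment does none of the ramping, limit-switch or following-error handling that a real motion controller provides. Error checking is omitted for brevity.

        #include <NIDAQmx.h>

        int main(void)
        {
            TaskHandle task = 0;

            /* Continuous step-pulse train on counter 0; frequency = step rate. */
            DAQmxCreateTask("", &task);
            DAQmxCreateCOPulseChanFreq(task, "Dev1/ctr0", "", DAQmx_Val_Hz,
                                       DAQmx_Val_Low, 0.0, 500.0, 0.5);  /* 500 steps/s */
            DAQmxCfgImplicitTiming(task, DAQmx_Val_ContSamps, 1000);
            DAQmxStartTask(task);

            /* ...later, change the speed (only if the device/driver supports
               updating the frequency while the task is running)... */
            DAQmxSetCOPulseFreq(task, "Dev1/ctr0", 1000.0);              /* 1000 steps/s */

            /* ...run for as long as needed, then clean up. */
            DAQmxStopTask(task);
            DAQmxClearTask(task);
            return 0;
        }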

  • M-Series read + write, threading or async, performance issues

    I have a slightly unusual issue: Using 2 USB-6251 and 2 USB-6015 within the same application.
    1 of the USB-6251s reads data on a 50 msec loop (driven by an async timer).
    Depending on the input, the code decides to output a 40 Hz, 300 usec waveform of varying amplitude - the output goes on until there is a user event that might make it stop.
    Now, here is the problem. I have implemented the output function using a separate high-priority thread and the delay(delay_in_sec) command.
    I find that while the data acquisition timer is NOT running, the frequency of the output is very close to 40 Hz. When the data acquisition IS running, the frequency of the output varies considerably - most likely reflecting USB delays.
    I was able to see the output 40 Hz, 300 usec waveform using a separate app with a single async timer w/o any data acq, and the timing was immaculate.
    My questions are:
    1. Could I use 2 different async timers in the same app? I tried to do this and the output was still wacky when I turned data acquisition on.
    2. Is there any way outside the delay() command that could improve the timing accuracy of the analog output thread?
    3. I measured the time to create a 300 usec pulse of a certain amplitude, write it to the USB-6251 and start the task, and it was ~13 msec. This is way too slow for 300 samples at 1 MHz - is it a USB bus limitation, or should I look at my design to see what is going on?
    4. Is there any NI hardware that would allow me to handle 4 analog outputs more gracefully? (FYI, 2 of my outputs are at 1 MHz and 2 of them at 50 Hz.)
    Thanks
    AP

    Thanks for taking the time to look at my message.
    I am using 4 DAQs. The first one, a USB-6251, is used to collect data (4 analog inputs @ 20 Hz) and also occasionally provides analog output @ 1 MHz.
    The second DAQ is a USB-6251 that occasionally provides analog output @ 1 MHz.
    The third and fourth DAQs are USB-6015s, occasionally providing analog output @ 50 Hz.
    All devices are connected via USB to a Dell dual-core notebook.
    This is how I create & configure the analog output channels - using 1 task per device.
                        DAQmxErrChk(DAQmxCreateAOVoltageChan(TaskAnalogX, channelstring, "Ramp 1", 0, 10, DAQmx_Val_Volts, ""));
                        DAQmxErrChk(DAQmxCfgSampClkTiming(TaskAnalogX, "", rate /* 1 MHz or 50 Hz, per device */, DAQmx_Val_Rising, DAQmx_Val_FiniteSamps, SECONDS * rate));
                        DAQmxErrChk(DAQmxWriteAnalogF64(TaskAnalogX, (int32)(SECONDS * rate), 0, DAQmx_Val_WaitInfinitely, DAQmx_Val_GroupByChannel, PHAOarr, &sampsPerChanWritten, NULL));
    The design/architecture is pretty complex, but at a very high level: whenever I need to deliver analog output, I configure the output waveforms, write them to the DAQs, and then use some timers/logic to deliver the waveform from the DAQ as needed.
    Now the challenge has been that in one specific case I need to deliver analog output of unknown duration (whereas everything else is deterministic). In that case I created a separate thread and write each 300 usec pulse individually to the analog output (using the USB-6251) with the following task:
                        DAQmxErrChk(DAQmxCreateTask("Tonic Output Task", &TaskTonic));  
                        DAQmxErrChk(DAQmxCreateAOVoltageChan(TaskTonic, channelstring , "PulseToPulse",0 , 10, DAQmx_Val_Volts, ""));    
                        DAQmxErrChk(DAQmxCfgSampClkTiming(TaskTonic, "",1.0E+06, DAQmx_Val_Rising, DAQmx_Val_FiniteSamps , 301));
    As described in my first message, this is very sensitive to timing, and it takes 13 msec to stop the task, write the array to the DAQ card and then start the task again.
    On the hardware timing recommendation: I tried to configure the DAQ using the DAQmx_Val_HWTimedSinglePoint sample mode, but it seems not to be supported by the hardware I am using. Is there a CVI example of using hardware timing that I can take a look at?
    Thanks
    AP
    Hi AP,
    My first impression is that you would want to use hardware timing rather than software timing. Try using a DAQmx Timing command and specify the rate there. Aside from that, could you be more specific about your application? How do you have your hardware set up and which devices are handling each part of the task? How are you generating the output?
    Regards,
    Joe S.
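
    One thing that often helps with the ~13 msec per-pulse overhead is committing the task once up front with DAQmxTaskControl, so that each pulse only needs a write, start, wait and stop rather than a full reconfiguration. Whether this recovers all 13 msec over USB I cannot say; the fragment below reuses the names from AP's snippet (TaskTonic, channelstring, sampsPerChanWritten) and assumes a 301-sample buffer called pulseData.

        /* Configure and commit the 1 MHz finite pulse task once, up front: */
        DAQmxErrChk(DAQmxCreateTask("Tonic Output Task", &TaskTonic));
        DAQmxErrChk(DAQmxCreateAOVoltageChan(TaskTonic, channelstring, "PulseToPulse",
                    0, 10, DAQmx_Val_Volts, ""));
        DAQmxErrChk(DAQmxCfgSampClkTiming(TaskTonic, "", 1.0E+06, DAQmx_Val_Rising,
                    DAQmx_Val_FiniteSamps, 301));
        DAQmxErrChk(DAQmxTaskControl(TaskTonic, DAQmx_Val_Task_Commit));

        /* Then, per pulse: write the new amplitude, start, wait, stop.       */
        /* pulseData is the 301-sample pulse buffer (an assumed name).        */
        DAQmxErrChk(DAQmxWriteAnalogF64(TaskTonic, 301, 0, 10.0,
                    DAQmx_Val_GroupByChannel, pulseData, &sampsPerChanWritten, NULL));
        DAQmxErrChk(DAQmxStartTask(TaskTonic));
        DAQmxErrChk(DAQmxWaitUntilTaskDone(TaskTonic, 1.0));
        DAQmxErrChk(DAQmxStopTask(TaskTonic));   /* drops back to the committed state */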

  • Tough, quirky, hard to figure sync problem with FCP / compressor /  DVDSP

    OK, where to begin?
    I have a dvdsp project. Everything started in FCP. The video, because of poor results in Compressor, has been compressed using BitVice. This results in an MPEG-2 that is slightly longer than the original, because of the way BitVice handles 24p material (interesting aside: once the track is in DVDSP, it's back to the original length).
    So, in Final Cut Pro, I took the audio tracks and put them at a speed of 99.9% to make them match. Then the audio went to Compressor to make an AC-3. The files that Compressor output match up with the video, as far as length goes, just fine.
    Here's where it gets a little weird. Out of two tracks, one appears to be just fine on the very first pass through Compressor... in sync, proper length, etc.
    In the other track, I got a delay in the surround speaker tracks that got longer as the track went on. I examined the FCP sequences that the individual sound files were exported from... same settings. I looked at the self-contained QuickTime files that were exported from FCP... all the exact same length. Yet, listening to the tracks separately, I could tell that indeed a couple of tracks drifted later (but still didn't add any length onto the end of the track). The only difference in the workflow for those tracks was that some were exported individually and some were batch exported (don't ask me how/why/if that made the difference). There was also a weird dropout of audio on some tracks, so I figured maybe just a fly in the ointment, try again.
    So then I very methodically started over with the sound and repeated the process. I couldn't repeat the whole batch-export-making-a-difference thing. The resulting AC-3 was in sync within itself, and matched up in length with the video, but the entire audio track drifted late, out of sync with the video.
    My solution has been to encode the audio at 100%, which is now in sync, but is actually shorter in length in DVDSP and now ends almost 2 seconds earlier on the timeline (yet doesn't lose sync).
    So, one track worked as expected and one threw me a bunch of curveballs, resulting in a fix that doesn't make sense but kind of works. The material in both tracks, as far as source, settings etc., is identical.
    Can anyone help me wrap my head around some of this so I can have confidence in a workflow moving forward?
    thanks,
    sorry for the lengthy post
    powermac g4 933    

    Audio/video timing accuracy is never better than the tool you are measuring it with. Therefore, you can't always trust time measurements based on SMPTE time code with your life, especially when 2:3 pulldown is involved. If you end up with a MPEG-2 multiplex that stays in sync on a set-top DVD player, then you are safe, regardless of what FCP, DVD SP or any other software tool think of it.
    Why is that? Because the timing in MPEG-2 is at least 3000 times as accurate as any SMPTE time code. It is true that MPEG-2 video may contain SMPTE time code information, but it is only optional, and even when present it is never used for anything time critical, such as A/V synchronization. It just wouldn't work.
    Assuming that I have correctly interpreted your post, that the length of the movie is about 33 minutes and that no further info is required to solve your puzzle, this is my 2 cents guess:
    In your paragraph beginning with "My solution" I suspect that the AC-3 file might have been slowed down by 0.01% even though it "ends almost 2 seconds earlier" according to the DVD SP timeline. Alternatively, the video time code might have been misinterpreted, DF versus NDF, suggesting it is longer or shorter than it really is. This cannot happen in a real DVD decoder, since it does not rely on the SMPTE time code at all.
    In the BitVice log you can read exactly how many frames there were in your 24p movie. Divide that by 23.976 to get a reliable duration, in seconds, for the encoded file. Note that if you get a remainder (meaning extra split seconds) it represents 1 to 23 extra frames. Roughly, every extra 41.7 ms means an extra frame.
    You don't want to go by the 29.97 frame rate, because then (due to the 2:3 pulldown) odd frames will last for 33.37 ms but even frames last for 50.05 ms. ;-)
    As you have noticed yourself, different tools like DVD SP, FCP, QT Player etc. may report different lengths for an MPEG-2 video file. But, given the frame count from the BitVice log, you can always calculate the exact duration, +/- 10 microseconds, because it is controlled by a 90 kHz clock derived from the 27 MHz crystal in the DVD player. This clock makes
    3003 ticks for every NTSC/29.97fps frame,
    3600 ticks for every PAL/25fps frame or
    3750 ticks during a 24fps frame.
    In comparison, due to its error prone and ambiguous nature, the SMPTE time code system is not even reliable enough to be frame accurate, unfortunately.
    Roger Andersson / Innobits AB, makers of BitVice MPEG-2 encoder for Mac.
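
    To make the arithmetic concrete with a made-up frame count: if the BitVice log reported, say, 47,952 frames of 24p material, the duration would be about 47952 / 23.976 = 2000 s, i.e. roughly 33 min 20 s, and that figure can be trusted regardless of what any SMPTE-based timeline display claims. The tick counts listed above are simply the 90 kHz clock divided by the frame rate: 90000 x 1001 / 30000 = 3003 for 29.97 fps, 90000 / 25 = 3600 for PAL and 90000 / 24 = 3750 for 24 fps.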

  • Pulselength USB 6501

    Hello,
    I want to use one or more USB-6501 boards to fire up some tens of IGBTs. Each of these switches turns on ~10 A for 1 ms-3 ms (each IGBT has a different turn-on length). Since the pulse pattern repeats after at least 40 ms, I want to fire them step by step to minimize the stress on my main power line.
    I managed to create a VI which does exactly what I want - with one exception: the digital waveform gets created from an array using "Build Waveform". But no matter what value I feed into "dt" to control the length of each digit/pulse, all I get is pulses of 1 ms for each logical "1".
    From my point of view, "dt" tells the channel how long it should stay "high" for each "1" it receives. Is this incorrect? I want to control the length of each pulse in steps of 0.1 ms - in other threads I read something about the need to use some kernel functions to get sub-millisecond resolution. If this is correct, shouldn't I be able to see a 2 ms pulse when setting "dt" to 0.002?
    Thank you for every hint, Sascha
    BTW: Labview 8.5 Evaluation / Labview 6.1 full
    BTW2: In the attached file you will only find 8 digital waveforms for 24 digital ports... I don't know how to save the full 24*20 binary array, therefore you will receive an error if you try to run this VI without correcting the number of physical ports to use.
    Attachments:
    IGBT.vi ‏49 KB
    6501.png ‏53 KB

    Hello Philosoph,
    the USB-6501 is a static DIO device, which means it does not have an onboard clock to switch the digital ports with a certain frequency.
    You can turn on and off the digital lines with software timing in LabVIEW. That is ok if the timing does not have to be very accurate - i.e. it does not matter if the pulse is 10 ms or 12 ms - and if the frequency is not too high.  A pulse length of 1 ms is borderline - most of the time, it will probably work, but when your PC does have to do other tasks, you might get a delay of up to  100 ms. In any case, you will have a jitter on the pulse length. For higher timing accuracy requirements, you would have to use a DIO device with correlated, which means clocked, outputs.
    Having said that, I would recommend a different approach for the signal generation. I attached a VI that writes state information to the digital outputs; you can modify the digital pattern in the array, and set the desired pulse length. As the first iteration of "Wait for next ms multiple" does not have a determined duration, I would leave the first line all zeros and start signal generation with the second line. If needed, you can wrap the for-loop in another loop to repeat the pattern.
    Regards,
    Johannes
    NI Germany
    Attachments:
    Digital Signal - static.vi ‏19 KB
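
    For anyone doing the same thing from the DAQmx C API instead of LabVIEW, this is roughly what the software-timed approach above looks like in C. "Dev1/port0/line0", the 2 ms on-time and the 40 ms period are placeholders, and, as noted above, several milliseconds of jitter (occasionally far more) should be expected because the waits come from Windows.

        #include <windows.h>
        #include <NIDAQmx.h>

        /* Software-timed pattern on a static DIO device: write a state, wait,
           write the next state. Error checking omitted for brevity. */
        int main(void)
        {
            TaskHandle task = 0;
            uInt8 on  = 1;
            uInt8 off = 0;

            DAQmxCreateTask("", &task);
            DAQmxCreateDOChan(task, "Dev1/port0/line0", "", DAQmx_Val_ChanForAllLines);
            DAQmxStartTask(task);

            for (int i = 0; i < 25; i++) {             /* 25 cycles, ~40 ms period */
                DAQmxWriteDigitalLines(task, 1, 1, 10.0, DAQmx_Val_GroupByChannel,
                                       &on,  NULL, NULL);
                Sleep(2);                              /* nominal 2 ms on-time     */
                DAQmxWriteDigitalLines(task, 1, 1, 10.0, DAQmx_Val_GroupByChannel,
                                       &off, NULL, NULL);
                Sleep(38);                             /* rest of the 40 ms cycle  */
            }

            DAQmxStopTask(task);
            DAQmxClearTask(task);
            return 0;
        }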
