cRIO 9474 output delay time

I'm going to use the cRIO 9474 digital output module to control a high-speed/high-current driver. The following data is stated in the manual: Output delay time (full load): 1 μs max. Full load is 1 A, but the current output for the driver is only 10 mA. Is there any chance that the output delay time will be longer than 1 μs with such a small load (10 mA)? Will there be any jitter? My switching frequency is going to be around 200-300 kHz.
Best regards,
Per

Hello Per,
There is a bit of confusion about the 1 μs that we spec in the manual. That delay time is just the time it takes the signal to travel from the cRIO backplane to the MOSFET in charge of turning the channel ON or OFF. That time won't change depending on the load, which means the gate will start turning ON or OFF at the same time no matter what the load is.
What changes is how long it takes the line or channel to reach the desired voltage value. That does depend on the load, but it shouldn't be too significant.
We don't spec that because it's a value that depends on your load.
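To put rough numbers on it (a quick back-of-the-envelope check, not an additional spec): at 300 kHz the period is 1/(300 kHz) ≈ 3.33 μs, and at 200 kHz it is 5 μs. A fixed 1 μs propagation delay shifts every edge by the same amount, so by itself it doesn't limit your switching frequency; what matters for your waveform is edge-to-edge variation (jitter) and the load-dependent settling time, and at 10 mA the channel is very lightly loaded.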
Hope this answers your question.
Ricardo Santa Olalla
CompactRIO Product Support Engineer
National Instruments.

Similar Messages

  • Delay on cRIO 9474 digital outputs

    I'm working with a CompactRIO 9002, a 9102 chassis, and an NI 9474 digital output module.
    I'm trying to generate a square wave, but when the frequency goes above 50 or 60 Hz the signal starts to get clipped and doesn't have time to swing from +V to 0.
    The specifications list an output delay time of 1 μs, so there should be no problem reaching frequencies of several kHz.
    I'm sure it isn't a hardware limitation, and I'm afraid I'm not programming the cRIO correctly.
    I'm attaching the FPGA VI and the CompactRIO VI; it's very simple code.
    Thanks in advance for any help.
    Attachments:
    fpga.vi ‏22 KB
    cRIO.vi ‏225 KB

    I'm attaching the files I meant to post again; the previous ones were from a different test.
    Attachments:
    fpga.vi ‏41 KB
    cRIO2.vi ‏196 KB

  • Digital Output With Timer (Simulation)

    Hello everyone, I just learned how to make LabVIEW programs a week ago. I'm trying to make a simulation of a digital output in LabVIEW (see my attachment). In this simulation I have a slider as an input (0-10 V), two numeric controls (upper limit and bottom limit), a waveform chart that plots those 3 values, and two Boolean LEDs (P0.0 and P0.1) as indicators. You can enter any number (between 0 and 10) in the numeric controls as limits for the slider input. If the input from the slider exceeds the upper or bottom limit, the corresponding Boolean LED turns on: P0.0 for the upper limit and P0.1 for the bottom limit. The problem is I don't know how to make a timer for those Boolean LEDs. For example:
    1) Make an input from the slider.
    2) If the input exceeds the upper limit, P0.0 turns on for 5 seconds, then turns off for 10 seconds.
    3) If during those 10 seconds you change the input back to normal (between the upper and bottom limits), then P0.0 stays off until the slider input exceeds the upper limit again.
    4) If during those 10 seconds you don't change the input (it stays above the upper limit), then P0.0 repeats the process from (2) until the slider input goes back to normal.
    (The same process applies for input that exceeds the bottom limit.)
    Can you help me make this timer? Thank you. (I'm sorry I made a double post.)
    Regards
    Juventom
    Attachments:
    Digital Output With Timer.vi ‏16 KB

    Hello Juventom,
    As I understand it, you want to continuously check the value of the slider and compare it to the upper and lower limit controls, whilst also changing the LED Booleans to true for 5 seconds and then false for 10 seconds if the slider value is outside the limits.
    To do this you would probably be best off using a parallel loop design, where you have 3 while loops in place of the one you have currently. Each of these while loops would be responsible for a part of your program (e.g. the top one would display your values on the graph, the second one would check the slider value against the upper limit and then turn on the LED, etc.).
    I've found this tutorial about multiple-loop programs, and I think you should look at the section entitled "Parallel Execution":
    http://zone.ni.com/devzone/cda/tut/p/id/3749
    This way you can use normal delay VIs but when they run they only pause that loop rather than the whole program.
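    If it helps to see the idea outside LabVIEW, here is a rough Java sketch of the same parallel-loop pattern (purely illustrative; all names and timings are made up). Two loops run in parallel, and the sleeps in the limit-checking loop never pause the display loop; a third loop for the bottom limit would have the same shape.
    import java.util.concurrent.atomic.AtomicInteger;

    public class ParallelLoops {
        public static void main(String[] args) {
            AtomicInteger sliderValue = new AtomicInteger(0); // shared "slider" input
            final int upperLimit = 8;

            // Loop 1: keeps displaying the current value, never blocked by the LED timing.
            Thread display = new Thread(() -> {
                while (true) {
                    System.out.println("value = " + sliderValue.get());
                    sleep(100); // refresh every 100 ms
                }
            });

            // Loop 2: checks the upper limit and drives the LED timing independently.
            Thread limitCheck = new Thread(() -> {
                while (true) {
                    if (sliderValue.get() > upperLimit) {
                        System.out.println("P0.0 ON");
                        sleep(5_000);  // LED on for 5 s
                        System.out.println("P0.0 OFF");
                        sleep(10_000); // LED off for 10 s
                    } else {
                        sleep(50); // idle poll
                    }
                }
            });

            display.start();
            limitCheck.start();
        }

        private static void sleep(long ms) {
            try { Thread.sleep(ms); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        }
    }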
    Please let me know how you get on with this, and ask me if you need further help.
    James W
    Controls Systems Engineer
    STFC

  • How to convert double to unsigned long integer for cRIO analog output?

    All,
    Having issues sending out my arbitrary waveform to my cRIO analog output. My values are doubles, and by the time I send them out they have been converted to U32 values and are all turning out as zeroes. This led me to assume two things: 1) that the cRIO analog output can only output integers, since the values need to be deployed in memory first, and 2) that I'm missing a step in the conversion process. My values range from 0 to 8, so I don't expect the simple conversion tool in LabVIEW to turn everything into zeroes.
    Any help?

    Since we are using the cRIO's FPGA interface, you really should be doing most of this inside the FPGA. Use DMA FIFOs to pass your data between your RT and the FPGA (and vice versa).
    On your FPGA, you can have a loop that just reads the analog inputs at whatever loop rate you want.  You just send the data to the RT using a DMA.
    Similarly, use a DMA to send your analog output values to the FPGA.  The FPGA can have another loop that reads the DMA and writes the value to the analog output.  This should be done in the FPGA since you can have the FPGA send out the values at a given (and deterministic) loop rate.
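    The DMA FIFO is essentially a producer/consumer queue between the two targets. As a rough textual analogue (illustrative Java, not actual cRIO code; the queue size and rates are made up), the RT side enqueues waveform samples and the "FPGA" side dequeues and writes them at its own fixed rate:
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class DmaFifoSketch {
        public static void main(String[] args) {
            // Stand-in for the RT-to-FPGA DMA FIFO (bounded, like the hardware FIFO).
            BlockingQueue<Double> fifo = new ArrayBlockingQueue<>(1024);

            // "RT" side: produces arbitrary-waveform samples.
            Thread rt = new Thread(() -> {
                for (int i = 0; ; i++) {
                    double sample = 4.0 + 4.0 * Math.sin(2 * Math.PI * i / 100.0); // 0..8 range
                    try {
                        fifo.put(sample); // blocks when the FIFO is full
                    } catch (InterruptedException e) {
                        return;
                    }
                }
            });

            // "FPGA" side: consumes one sample per iteration at its own loop rate.
            Thread fpga = new Thread(() -> {
                while (true) {
                    try {
                        double sample = fifo.take(); // blocks until data is available
                        // a real FPGA loop would write 'sample' to the analog output here
                        Thread.sleep(1); // stand-in for the deterministic loop period
                    } catch (InterruptedException e) {
                        return;
                    }
                }
            });

            rt.start();
            fpga.start();
        }
    }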
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines

  • Digitizer Card Acquisition Delay time

    Dear Sir/Madam,
    I've got a problem with the 5911 digitizer card acquisition delay time.
    How do I measure the time from the trigger signal to the first acquired point? Does LabVIEW support this function?
    Besides, what is the difference between the "Acquisition Start Time" and "relativeInitialX"?
    Please help. Thanks.
    With Regards
    David
    Attachments:
    Acquire.gif ‏40 KB

    Hi David,
    Both the Acquisition Start Time and relativeInitialX provide the same information: the time in seconds from the trigger to the first sample in the acquired waveform.
    The only difference is that relativeInitialX is an output from several read and fetch VIs (e.g., niScope Read Cluster), whereas the Acquisition Start Time is a property in a Property Node which can either be read from or written to. Thus you can set the Acquisition Start Time through the Property Node.
    Hope that helps, have a great day!
    Brian Spears
    Applications Engineer

  • Set frame delay time for animated gif using ImageIO

    I'm trying to change the delay time of each frame of an animated GIF by changing the metadata for each frame as follows, but it doesn't change anything.
    static private IIOMetadata setMetadata(IIOMetadata metadata, int delayMS) throws IOException {
        Node root = metadata.getAsTree("javax_imageio_gif_image_1.0");
        for (Node c = root.getFirstChild(); c != null; c = c.getNextSibling()) {
            String name = c.getNodeName();
            if (c instanceof IIOMetadataNode) {
                IIOMetadataNode metaNode = (IIOMetadataNode) c;
                if ("GraphicControlExtension".equals(name)) {
                    metaNode.setAttribute("delayTime", Integer.toString(delayMS));
                }
            }
        }
        return metadata;
    }
    Does anyone know how to set the delay time for an animated GIF using ImageIO?
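
    One likely culprit (an educated guess): getAsTree returns a copy of the metadata tree, so edits to it have no effect until the tree is written back with setFromTree. Note also that the GIF delayTime attribute is in hundredths of a second, not milliseconds. A minimal sketch of the corrected method:
    static private IIOMetadata setMetadata(IIOMetadata metadata, int delayMS) throws IOException {
        String format = "javax_imageio_gif_image_1.0";
        Node root = metadata.getAsTree(format); // getAsTree() returns a copy
        for (Node c = root.getFirstChild(); c != null; c = c.getNextSibling()) {
            if (c instanceof IIOMetadataNode && "GraphicControlExtension".equals(c.getNodeName())) {
                // delayTime is in hundredths of a second, so convert from milliseconds
                ((IIOMetadataNode) c).setAttribute("delayTime", Integer.toString(delayMS / 10));
            }
        }
        metadata.setFromTree(format, root); // write the modified tree back
        return metadata;
    }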


  • Cannot change delay time for animated GIFs

    I just got PSE 6.0 for Mac, and I've slowly been learning how to do what I'd like to with it. I've noticed that there have been some problems with the program, though, such as sudden lockups that make me force-quit. One of my more serious issues involves making adjustments to the delay time in some simple, animated GIF files that I have been working on, using the Save For Web command. For some reason, I can't change the delay time from its default of 0.2 seconds to anything else; the slider doesn't respond when I click on it, nor can I make any direct adjustments in the field itself. In the end, I'm stuck with just that 0.2-second delay. Is this a bug in the program, or am I just missing something?
    If this helps, I have created a set of animated GIFs with two layers/frames apiece, looping continuously. The second and topmost layer is assigned a Dissolve filter and usually left at 100% opacity when I make the final adjustments. The workspace background for each is transparent when I begin.
    Oh, also, I am using PSE on an Intel iMac with OS 10.5.6, with 2GB RAM and enough hard drive space to fit everything.

    Unfortunately, this is a known bug in PSE 6 for Mac:
    http://kb.adobe.com/selfservice/viewContent.do?externalId=333620&sliceId=1

  • Can I use a cRIO-9474 module as a low-side switch?

    A setup I want to test applies continuous 12V to the hot side of the device and controls it by switching ground.  The only nearly-suitable module I have available is a cRIO-9474.  Looking at the data sheet schematic, it seems as though I could use this module to switch ground by connecting Vsup to my system ground and DO to the device.  However, the specs list the min Vsup as 5V.  Does this approach sound feasible, or will I mess up the switching transistors by connecting Vsup to ground?
    Jeff

    Hello, Jeff!
    I think the best approach would be to use a cDAQ relay module such as the 9485 or 9481. These will allow you to directly control a relay, and can handle the voltages you're working with. Please let us know if you have further questions on these, or if there is a reason they won't work for your application.
    Have a great weekend!
    Will Hilzinger | Switch Product Support Engineer | National Instruments

  • Delay/Timer only in a conditional loop.

    Hello,
    I want to use a delay (or some kind of timer) within my program. I intend to use it within a conditional loop. Specifically, if a particular condition is true, then I want to wait for 50 ms and then check another condition, which, if true, should stop my program.
    I have used the Wait VI from the timing section, but this VI pauses my whole program. Specifically, when the above-mentioned condition is met, I want the delay VI to execute along with my data acquisition VI. At present, when the delay VI executes, everything in the program (data acquisition, peak detection, plotting, etc.) pauses. I guess that's the correct functionality of the Wait VI, but it doesn't suit my need.
    Does someone have an alternative for delaying a particular conditional loop without affecting the other parts of my program? Specifically, only the conditional loop should delay for a certain time while other features like data acquisition, peak detection, and plotting execute normally.
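
    As a minimal Java analogue of the elapsed-time pattern referenced in the reply below (illustrative only, since the original is LabVIEW; doOtherWork and the condition methods are hypothetical stand-ins): instead of a blocking wait, the loop keeps running and checks the elapsed time on each iteration, so acquisition and plotting continue.
    public class NonBlockingDelay {
        public static void main(String[] args) {
            boolean running = true;
            long armedAt = -1; // -1 means the 50 ms window is not running

            while (running) {
                doOtherWork(); // stands in for acquisition, peak detection, plotting

                if (conditionOne() && armedAt < 0) {
                    armedAt = System.nanoTime(); // first condition met: start the 50 ms window
                }
                if (armedAt >= 0 && System.nanoTime() - armedAt >= 50_000_000L) {
                    if (conditionTwo()) {
                        running = false; // second condition met after the delay: stop
                    }
                    armedAt = -1; // re-arm for the next occurrence
                }
            }
        }

        // Hypothetical stand-ins for the real program's logic.
        private static void doOtherWork() {}
        private static boolean conditionOne() { return true; }
        private static boolean conditionTwo() { return true; }
    }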

    Yes. This is exactly what I was looking for. But there is an issue. In practice, I will not be using a 10-second delay but a very small number like 50 ms. With this approach, when I set a value of 50 ms, my program stops after close to 90-93 ms. (After 50 ms there is an AND gate which, when true, stops the program; refer to the attachment.)
    There are multiple reasons I can see. The first is obvious: I am using Windows XP with a lot of other stuff running simultaneously. Also, the Elapsed Time VI is an "express VI". Do you think this could be one of the primary reasons for such a delay? Finally, I intend to use this program on an RT-based machine.
    Do you see it working exactly as expected on those machines, or do I have to modify my code? Please guide me.
    Also, how do I mark your post as the answer and other good things?
    Thanks 
    Attachments:
    Image.JPG ‏60 KB

  • How to set delay time in tooltiptext

    Hi,
    I'm developing a desktop application using Swing. In that app there are some text fields for which I have written a mouseEntered method, and in that method I set the tooltip text. It works properly, but I need the message to stay displayed until the mouse leaves the text field.
    Could anyone please help me set the delay time for the tooltip text?

    Read the API for javax.swing.ToolTipManager. Also, you don't need to use a MouseListener; just call setToolTipText or override getToolTipText.
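    For example (a minimal sketch; the 60-second dismiss delay is an arbitrary value):
    import javax.swing.JFrame;
    import javax.swing.JTextField;
    import javax.swing.SwingUtilities;
    import javax.swing.ToolTipManager;

    public class TooltipDemo {
        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                // No MouseListener needed: just set the tooltip text.
                JTextField field = new JTextField(20);
                field.setToolTipText("Enter a value");

                // The shared ToolTipManager controls the timing for all tooltips;
                // raising the dismiss delay keeps them visible longer.
                ToolTipManager.sharedInstance().setDismissDelay(60_000);

                JFrame frame = new JFrame("Tooltip demo");
                frame.add(field);
                frame.pack();
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.setVisible(true);
            });
        }
    }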
    db

  • Delay time for Cancel query dialog

    Hi Folks.
    I have a form where the property Interaction Mode is set to NONBLOCKING.
    When a slow query is initiated, the cancel query dialog is displayed after approx. 5 secs.
    Can anyone tell me if it's possible to alter the delay time before this dialog is shown e.g., to 30 secs?
    Regards
    Carsten Nielsen

    super

  • Delay time in standby

    Hi All,
    I have created a standby database, and I have given the command:
    ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DELAY 30;
    Will the archived logs from production now be applied after a half-hour delay? Am I right?
    Also, in which view can I see this delay time?
    regards,
    prem

    The delay time is usually specified in the init.ora. You can check which archive log file was generated on the primary and then query the standby to see whether that particular archive log has been applied to the standby or not.

  • Reducing the delay time.

    Hi SAP gurus,
    I am facing a problem with process chains.
    1. The job BI_PROCESS_TRIGGER is showing a delay time of 1445 secs.
       The chain is starting 15 minutes after the scheduled time;
       I checked in SM37, and it shows the delay.
    2. BI_PROCESS_LOADING is showing a delay of 1203 secs.
       The InfoPackages are starting about 10 minutes after the previous job.
    Can anybody help me solve this problem?
    Thanks in advance.

    Hi,
    If the chain starts and the job is in delay mode, check in SM50 that you have a BGD/BTC work process available to take the request from the process chain.
    You may not have a free work process to take the request. After waiting for 10 or 15 minutes, the system gets a free work process, which starts processing the process chain.

  • Change Timer Class Delay time

    Is it possible to update the Timer class delay time while it is running? I want to update it using the slider component; however, it doesn't seem to update. I even stop the timer on the THUMB_PRESS event of the slider and restart it after the value is changed. That doesn't seem to work either. Any ideas?

    I figured it out. Instead of setting a variable speed in the event handler function for the slider, I directly changed the timer delay variable to equal the slider value. I feel dumb now, but it works great.
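    In Swing terms the same fix looks like this (a rough Java analogue, since the original appears to be Flex; all names are illustrative). The slider's change event writes the new value straight into the running timer's delay, with no stop/restart needed:
    import javax.swing.JFrame;
    import javax.swing.JSlider;
    import javax.swing.SwingUtilities;
    import javax.swing.Timer;

    public class VariableRateTimer {
        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                Timer timer = new Timer(500, e -> System.out.println("tick"));
                timer.start();

                // The slider sets the delay (in ms) directly on the running timer.
                JSlider slider = new JSlider(50, 2000, 500);
                slider.addChangeListener(e -> timer.setDelay(slider.getValue()));

                JFrame frame = new JFrame("Timer delay demo");
                frame.add(slider);
                frame.pack();
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.setVisible(true);
            });
        }
    }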

  • Javascript Code Assist Delay Time

    How can I increase the delay time before code assist pops up when editing javascript in a .cfm file?
    Under Preferences:Coldfusion:Editor Profiles:Editor:Code Assist
    there is a setting "Delay before showing Code Assist" but changing this setting does NOT alter the javascript pop up delay.
    Under Preferences:HTML:Editors:JavaScript:Code Assist
    there is no similar setting.
    So maybe I can edit this setting in a config file or something? I'm using ColdFusion Builder 1.0.0.271911 on Snow Leopard 10.6.3.
    thanks.

    do I download Aptana plugin?
    OMG please NO! Aptana is a bloated pig, and is what made CFBuilder 2 so slow.
    I would suggest putting your JavaScript code in a separate file from your ColdFusion page, and using the <script> tag to load it in.  If you need to pass ColdFusion variables into JavaScript, put a small <script> block on the CFML page wrapped in <cfoutput> just to create and populate the JavaScript variables, then add a second <script> tag to load your external JavaScript file.
    By keeping your JavaScript separate from CFML code you achieve multiple benefits:
    1. The built-in JavaScript parser/syntax highlighter will actually work, as those CF Builder features are somewhat dependent on recognizing the file extension to know which parser to use.
    2. Maintenance of your application will be easier, because you don't have a mixture of CFML and JS code in the same file, and thus the files will be smaller and more targeted.
    3. You can run your JavaScript file through "linters" to validate syntax.
    4. You can minify your external JavaScript code files for better performance.
    5. If you use build tools such as Grunt or Gulp, you can automate 3 and 4 without potentially breaking any CFML code in the process.
    6. Probably additional benefits I'm not thinking of at the moment (someone will chime in, I'm sure).
    HTH,
    -Carl V.
