Setting voltage sensitivity on NI 6581

Hello,
I've got the following system:
http://sine.ni.com/nips/cds/view/p/lang/en/nid/206651
The specification says:
Selectable voltages of 1.8, 2.5, and 3.3 V or external reference voltage (1.8 to 5.5 V)
How do I select these voltages? The manual uses some kind of Read/Write Control with an unnamed reference. I cannot find this reference. Is there a possibility to set this in the FPGA VI?
Greetings
Matthias

hello,
I don't really know that device, but I have used some RIO devices before, so I can try to help you.
The manual you are talking about shows this image:
This is a "Read/Write Control" node which can be found in the "FPGA Interface" palette. This palette is available when you edit the diagram of the RT code (I said : "RT code"...not "FPGA code").
The reference you mentioned in your message is a FPGA VI reference : you can get this reference by using "Open FPGA VI Reference" function, in the same palette ("FPGA Interface", available from RT code diagram).
Those tools allow you to control the execution of your FPGA VI from the RT part of your code : you can set control values (controls placed on the FPGA VI front-panel), run this FPGA VI, retrieve data from DMA FIFO, etc...
So, from this little screenshot, we can assume that the developer put 2 commands on the FPGA VI front-panel. The respective labels for these 2 controls are "Power Supply (Connector A)" and "Internal Power Supply Voltage".
Then, from RT code diagram, he opened a FPGA VI reference, and then changed the value of these 2 controls by using "Read/Write" Control node (before or after running the FPGA VI with an "Invoke Method" node, also available in the palette).
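If it helps to see the same host-side pattern in text form, here is a minimal sketch using the nifpga Python package instead of the LabVIEW RT diagram. The bitfile path, resource name, and the values written are assumptions; only the two control labels come from the manual's screenshot.

    # Hedged sketch of the pattern described above: open a reference to the
    # FPGA VI, write its two front-panel controls, then run it. Bitfile
    # path, resource name, and the written values are hypothetical.
    from nifpga import Session

    with Session(bitfile="NI6581_PowerSetting.lvbitx",   # hypothetical bitfile
                 resource="RIO0",                        # hypothetical resource
                 no_run=True) as session:
        # Equivalent of the "Read/Write Control" node:
        session.registers["Power Supply (Connector A)"].write(1)     # e.g. select internal supply
        session.registers["Internal Power Supply Voltage"].write(2)  # e.g. ring index for 3.3 V
        # Equivalent of the "Invoke Method: Run" node:
        session.run()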
But there's no screenshot showing how these two parameters are handled inside the FPGA. I guess that in the FPGA VI, an "FPGA I/O Property" node gives access to the power supply settings.
In the manual, it is said:
"Use the NI Example Finder to find several NI FlexRIO device power setting examples". Beyond my explanations, which probably sound a little complicated, I think that will be the best way to find an answer to your question! ;-)
Julien

Similar Messages

  • How do I find and/or change the pre-set voltage selection switch in an HP TouchSmart IQ500 series

    how do I find and/or change the pre-set voltage selection switch in an HP TouchSmart IQ500 series (specifically an IQ524)
    This question was solved.
    View Solution.

    Doesn't it use an external AC adapter? If so, all you should need to use in another country is an adapter plug to match the wall socket.
    ... an HP employee expressing his own opinion.
    Please post rather than send me a Message. It's good for the community and I might not be able to get back quickly. - Thank you.

  • Setting case sensitivity for text searches

    I am trying to figure out how to set case sensitivity for text searches in Oracle. I am using 8.1.7, and the documentation keeps referring me to the BASIC_LEXER object. Does setting the mixed_case parameter to 'No' remove case sensitivity for the index, or is there something else I need to do?

    Do you recommend another field type that does not use case sensitivity?
    Thank you

  • HT4059 I'm using iBooks on an iPad mini. I am forever inadvertently advancing to the next or prior page without meaning to do so. Is there some way to require a double tap to advance the page, or to set the sensitivity lower?

    I'm using iBooks on an iPad mini. I am forever inadvertently advancing to the next or prior page without meaning to do so. Is there some way to require a double tap to advance the page, or to set the sensitivity lower so that the page doesn't advance every time I move imperceptibly?

  • Sample programmes in C# for setting voltage in a TDK lambda power supply

    Please provide some guidance for setting the voltage of a TDK-Lambda power supply through GPIB. Does any driver software need to be installed?

    If you use an NI GPIB controller, you will need to install NI-488.2, and it is recommended to also install and use NI-VISA. There will be coding examples installed with the drivers. You don't mention the model number, but you might find an existing instrument driver from NI or the vendor.
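    To make the pattern concrete, here is a minimal sketch using NI-VISA through the PyVISA Python package (the question asked for C#, but the VISA calls map one-to-one to the NI-VISA .NET API). The GPIB address and the set-voltage command string are assumptions; check the supply's programming manual for the real syntax.

        # Hedged sketch: program a power-supply voltage over GPIB via NI-VISA.
        # The GPIB address and the "PV" command string are hypothetical.
        import pyvisa

        rm = pyvisa.ResourceManager()
        supply = rm.open_resource("GPIB0::6::INSTR")  # hypothetical GPIB address
        print(supply.query("*IDN?"))                  # identify the instrument
        supply.write("PV 5.0")                        # hypothetical "program voltage" command
        supply.close()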

  • Setting voltage level Digital Out in NI 6221

    Hi all,
    I am working with an NI 6221. I am wondering if it is possible to change the voltage level of the digital output channels? Currently I am using "Digital Bool 1Line 1Point" as my DAQmx Write.
    Thanks,
    Saridar
    Solved!
    Go to Solution.

    Saridar wrote:
    Hi all,
    I am working with an NI 6221. I am wondering if it is possible to change the voltage level of the digital output channels? Currently I am using "Digital Bool 1Line 1Point" as my DAQmx Write.
    Thanks,
    Saridar
    This can be done with a TTL-to-CMOS level shifter.
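    For reference, a minimal sketch of the software side with the nidaqmx Python package is below; it is the text equivalent of the "Digital Bool 1Line 1Point" write, with hypothetical device and line names. Note it only toggles the line: the output high level on the 6221 is fixed in hardware, which is why the level shifter is needed.

        # Hedged sketch: write a single boolean to one digital line.
        # Device and line names are hypothetical.
        import nidaqmx

        with nidaqmx.Task() as task:
            task.do_channels.add_do_chan("Dev1/port0/line0")  # hypothetical line
            task.write(True)  # drive the line high (fixed TTL level on the 6221)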

  • MOVED: Needed Help Identifying and Properly Setting Voltages for OC Environment

    This topic has been moved to Overclockers & Modding Corner.
    https://forum-en.msi.com/index.php?topic=136075.0

    With the CPU voltage taken out of Auto, there is no point in bothering to enable any power saving features; they are essentially out the window. IOH and PCH probably don't need to be changed out of Auto. Spread Spectrum should be left disabled all the time. 87 °C is getting on the hot side, but nothing wrong with it in the 70s.
    Yup, you are right. There is too much variation in Intel CPUs to give cut-and-dried required voltages. DRAM voltage may need increasing to maintain stability with the IMC. QPI voltage often needs an increase. Once the CPU is stable, the RAM can be tweaked using the DRAM ratio. Trial and error as to speed and timings will most likely be required, along with the voltages, to make it stable.

  • Setting Sensitivity Area on WVC54GCA Camera

    Inside the web-based setup utility, you can custom-set the sensitivity area of the camera. I am using the camera from inside my window to look at an area about 4-6 feet (maybe up to 10-15 feet) away for motion. I am having mixed results. In your experience, is it better to have a custom 2-4 window sensitivity, or is it better to use the full-screen option? Also, I just want to confirm that moving the slider to the right increases the camera's sensitivity to motion.
    Thanks in advance

    Thank you very much for the reply.  I am not sure I completely understood the answer, so let me clarify what I think you said in my words.
    1. Using full screen sensitivity vs. 2-4 Window sensitivity should not make any difference in the camera's ability to detect motion.
    2. Your comment about full screen for vision clarity confuses me. Regardless of the sensitivity chosen, would I not see the entire screen?  My original thought was that a smaller area for motion detection would make it more likely to actually capture motion since the camera has less surface area to cover.
    3. Your comment about multiple cameras is true in my case. I have two cameras looking at different places (near each other, though). I am not sure I understand how 2-4 window sensitivity has any impact on one vs. two cameras, since you set them both separately. This might point to what may be a fundamental misunderstanding on my part, so I do hope you will explain what you meant!
    I appreciate your help 

  • Use Text Ring to Set Parameter; Need Reverse Text Ring to Read?

    Using the manufacturer's manual, I have managed to program a number of simple programs to interface with one of the pieces of lab equipment I am using. To set the sensitivity of the device, I used a text ring to allow the user to select the sensitivity visually while using the number to program the device. Essentially, the internal programming had a text ring in mind. To program a sensitivity of 1 nV I use SEN 0; for 3 nV, SEN 1. This continues until 3 V with SEN 15.
    Now, in order to read the values from the device properly, I need to be able to read the sensitivity. The SEN command returns the numeric value, but how do I convert the number into a string (much like a text ring in reverse) for each possible outcome?
    Alternatively, I could just read the numeric value, but I need to treat each sensitivity differently. For example, reading a 10 mV source at the 30 mV sensitivity, the output voltage is 0.333 V, indicating 1/3 of the full range. At each sensitivity I have to multiply the reading by the sensitivity to get the true value.
    What is the most compact way to program this? I could set up 16 if-equals comparisons and 16 separate branches of code, but that seems highly inefficient.
    Nukem

    I think some of these solutions will work to clean up the use of the sensitivities, but let me be more clear on the reading and writing portion.
    In the manual I have a table that tells me what command to send to the device to achieve a certain sensitivity level:
    SEN    Sensitivity
    0      100 nV
    1      300 nV
    2      1 µV
    3      3 µV
    etc.
    I want the user to be able to see the sensitivities on the right, while the program uses the values on the left. Hence, I used a text ring to achieve this when setting the sensitivity level: I select 3 µV on the front panel and the program runs with the value 3.
    Now, if I want to check the sensitivity level, I have to do this in reverse: have the program read a value of 3 and output the string telling me that the level is 3 µV.
    It looks like I should be able to use the ring cluster Jeff Bohrer suggested to do this both forwards and in reverse. I will try this at work tomorrow when I have access to LabVIEW again.
    Thanks everyone!
    Nukem. 
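    For anyone reading along, the "reverse text ring" boils down to a single lookup table used in both directions, which also replaces the 16 branches. A minimal sketch in Python (the first entries come from the table above; the remaining SEN codes would be filled in the same way):

        # Hedged sketch: one table replaces 16 if/else branches and
        # maps SEN codes to both a label and a full-scale value.
        SENSITIVITIES = [
            (100e-9, "100 nV"),  # SEN 0
            (300e-9, "300 nV"),  # SEN 1
            (1e-6, "1 uV"),      # SEN 2
            (3e-6, "3 uV"),      # SEN 3
            # ... continue up to SEN 15
        ]

        def label_for(sen_code):
            """Reverse text ring: SEN code -> display string."""
            return SENSITIVITIES[sen_code][1]

        def true_value(sen_code, reading_fraction):
            """Scale a fractional reading by the full-scale sensitivity."""
            return reading_fraction * SENSITIVITIES[sen_code][0]

        print(label_for(3))          # "3 uV"
        print(true_value(3, 0.333))  # ~1e-6, i.e. about 1 uV of input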

  • Sensitivity of iMac to power fluctuations, waveform distortions

    After months of working fine, my APC UPS BE750G apparently shuts down when there is a line voltage fluctuation or waveform distortion (based on observation during very brief power fluctuations during storms, such as those that cause lights to dim briefly), and shuts down everything connected to the UPS. In other words, the UPS fails to go into battery backup even though the battery is fully charged. APC is replacing the UPS under warranty. I have been discussing with APC how I might avoid, with the new unit, the possibility that local voltage irregularities might have damaged the UPS circuit. One suggestion they give is:
    Use [this UPS] voltage sensitivity setting with equipment [that] is less sensitive to fluctuations in voltage or waveform distortions -- setting status to low.
    Has anybody had the problem that an APC UPS that seems to be working fine (all green lights) fails with feeble weather-related line fluctuations?
    Where do I find the range of "sensitivity to fluctuations in voltage or waveform distortions" for the iMac 3.06 GHz Intel Duo?
    Has anybody adjusted the sensitivity on an APC UPS, and with what results?
    Thanks

    The Format Value function will not accept an array so put it inside a loop. Use a shift register to append all the values from the array to the string.
    Lynn

  • Problem with CPU Voltage

    Hello
    I built a new PC with:
    i5-760
    Noctua NH-D14
    MSI P55-GD65
    G.Skill 2x2GB CL7 ECO
    Antec TruePower New 750W TP-750
    TwinTech GF 9600GT
    I have a problem with the CPU voltage: even when I set it, for example, to 1.201 V in the BIOS, under load it overshoots to 1.256-1.268 V. The worst case: when I overclocked the CPU to 4 GHz and set the voltage in the BIOS to 1.30 V, it overshot to 1.45 V+! Not to mention that temps are sky-high from that. It doesn't matter if I put the BIOS Vcore at 1.1 V at stock, because under load it still rises to 1.16 V. I turned off EIST, C-states, Turbo Boost, phase LED, and switch phase, set Vdroop to low, and still something is wrong here. I updated the BIOS under DOS from a USB stick to the newest 1.9 version in order to update the CPU microcode without error, and it did not help. I want to know if this is even possible to fix, or whether I should RMA the motherboard or CPU.
    A minor problem is with the system fans: they power on for a second, turn off instantly, and then start working again when I see the list of devices in POST. After reboot, when my monitor is switched off, it takes something like 5 seconds to power the monitor on and another 5 to show the BIOS. Is that normal?

    I have similar issues; it's hard to tell which measurement is the most accurate, as there are two different measurements in Control Center that don't match up with the figures set in the BIOS. The only way to be sure would be to get an expensive, accurate multimeter. But if you're only interested in running a stable overclocked computer without knowing the exact figures, then get the computer running stable and lower the voltages until it starts to become unstable with Prime95.
    Or you could work the other way round: set the voltages so it boots but doesn't run Prime95 for long, then raise the Vcore and VTT one at a time until you're happy with the length of time Prime95 runs without errors. I'd be happy with 24 hours; I seem to remember errors creeping in at around the 9-hour mark on the setting below what I have it set to now.

  • How to set the number of elements dequeued?

    Hello All,
    I am relatively new to LabVIEW. I use LabVIEW 8.2 to detect joystick movement using a set voltage threshold. In total, I have 6 channels of analog input. The sampling rate is 2 kHz in continuous sampling mode (buffer size set to 10 kS). I show a visual stimulus on a computer monitor. Within 3 seconds of stimulus onset, the subject has to press the joystick. So I use a while loop that runs until the subject moves the joystick in any direction, as detected by a crossing of the voltage threshold. The while loop is intended to run for a max of 3 s if the joystick is not moved. My intention in using the VI (see below) was to continuously monitor the voltage signal by removing voltage data points from the queue with the LabVIEW function "Dequeue Element". When I dequeue from inside the loop, the dequeue function removes about 100 ms worth of voltage data points (= 198-201 points at a 2 kHz sampling rate) from each of the six channels. This creates a problem for me because a single dequeue operation takes 40 to 85 ms, during which the loop cannot be stopped. This results in miscalculation of the actual time when the joystick was moved.
    Is there a way to set how many data points the dequeue function removes? My goal is to remove about 1-2 ms worth of data points so that the loop can exit within 5 ms of the joystick movement.
    I have attached the screen shots of the subvi's I am using.
    Acquire Response.jpg - the while loop that has the subvi: Access Analog Data Queue
    Access Analog Data Queue.jpg - front panel of the subvi that has the 'dequeue element' function.
    Remove Element.jpg - a case of Access Analog Data Queue.vi showing the 'dequeue element' function
    Sorry for the long message. Any help would be greatly appreciated.
    Thanks
    Mani
    Attachments:
    Remove Element2.jpg ‏135 KB
    Acquire Response1.jpg ‏266 KB
    Remove Element1.jpg ‏135 KB

    Hello Lynn and tst,
    Thanks for your suggestions. I have attached a screenshot of a VI that has the enqueue function. As Lynn pointed out, it was the enqueueing size that was reflected in the size of the dequeueing. I tried various ways to control the enqueue element size in order to control the dequeue element size. Many of my tricks failed. I had set the data acquisition to be continuous at 2 kHz with a 10 kS buffer size. You may want to look at the attached image of the subVI "Analog Acquisition" while I explain my attempts. The first thing I noticed was the 'dt' value in the input node of the timed loop, which had been set by somebody to 100. I thought that was part of the reason why my dequeue size was always worth 100 ms of data points, so I changed it to 1 ms. This definitely made a difference in the chunk size of the dequeued element. Now dequeueing removes only data points worth 1-5 ms. I also noticed that my data reading timed loop (the while loop in the "Analog Acquisition" picture) takes about 20-25 ms instead of the set 1 ms.
    Why does the loop take so long? I have set the 'number of samples per channel' to -1 so that DAQmx Read.vi (see it in the Analog Acquisition subVI block diagram) can grab whatever data is currently available and put it into the queue. Can this be a reason why the loop takes more than 20 ms? I also tried setting the number of samples per channel to 2. I reasoned that with a sampling rate of 2 kHz, a 1 ms loop should be able to pull 2 data points and enqueue them. But it gave me the often-seen error "Error -200279 occurred at DAQmx Read (Analog 1D Wfm NChan NSamp).vi:1->Timed structure(s): DAQ Loop". I tried various combinations of loop time and number of samples per channel. I get Error -200279 very often. Note that I have several state transitions, and only in a couple of them, during every cycle of state transitions, do I remove elements and flush the data queue (one place where I remove elements is shown in the picture 'Acquire Response.jpg' in my original message).
    How do I set the data acquisition loop to enqueue elements for exactly 1 ms or n ms so that I can control my dequeue size?
    Thanks a lot. I am getting really tired of fighting this problem. Please help.
    Mani
    Attachments:
    Analog Acquisition.jpg ‏363 KB
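    To make the enqueue/dequeue relationship concrete, here is a minimal producer/consumer sketch in Python (names and sizes are illustrative; LabVIEW queues behave the same way in this respect): the consumer can only remove whole elements, so the chunk size must be set on the enqueue side.

        # Hedged sketch: the dequeue chunk size equals the enqueue chunk size.
        import queue

        SAMPLE_RATE = 2000   # Hz, as in the original post
        CHUNK = 2            # samples per element: 2 samples / 2 kHz = 1 ms of data

        q = queue.Queue()

        def daq_loop(acquire):
            """Producer: enqueue small, fixed-size chunks (~1 ms each)."""
            q.put(acquire(CHUNK))

        def response_loop(threshold):
            """Consumer: can react within ~1 chunk of a threshold crossing."""
            while True:
                chunk = q.get()
                if max(chunk) > threshold:
                    return chunk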

  • Need to create a 1-sided matrix in Virsa CC5.2 for (132) sensitive actions

    Hello
    I need to set up sensitive transactions in Virsa Compliance Calibrator 5.2, in addition to the out-of-the-box global rule set. Our company tracks 132 sensitive transactions in SAP, like PA40 for example. I need to be able to have CC 5.2 give me a list of all users that have access to any of the 132 transactions. So I would want to run a report in Virsa CC 5.2 and have it provide me with the names/IDs of the users that have access to the action PA40. Display/Add/Create etc. does not matter; I just need to know who can perform the action. I believe there should be a way to create a one-sided matrix with the 132 sensitive transactions on one side. I tried putting a few in a Function ID and creating a Risk with just that function, but it is not pulling up any matches, when I am sure there should be some. Can you please let me know how to create the one-sided matrix in CC 5.2?
    Please help if possible.
    Thank You

    Hello Vince,
    The logic that transactions within a function use to calculate a risk at the Critical Action level is "OR".
    Thus, for your scenario, what you can do is create a Function and add all the transactions to it. Then define a Critical Action risk with this single function only. That should do the work for you. Any time a role or user has one or more of these transactions, it will show up as a risk in your reports.
    Regards,
    Hersh

  • Different scaling and sensitivity units for acceleration in Measurement & Automation Explorer

    Hi,
    I am using Measurement & Automation Explorer version 5.1.0f0. I have previously configured my accelerometer using this module. Under settings, I have used g as my scaled unit. The sensitivity unit is set to mVolts/g. This is just fine.
    However, when I tried to change the scaled units to m/s^2, I DID NOT get the option to set the sensitivity units to mVolts/(m/s^2).
    Is it OK to continue using mVolts/g for the sensitivity units even when I have changed the scaled units to m/s^2? Since the two engineering units are different, will the accelerometer output correctly give m/s^2 readings, or do I have to make some adjustment in the VI post-acquisition, like dividing by 9.8?
    Thanks
    Attachments:
    MAX.jpg ‏86 KB

    You do need a scale (setting one up with slope = 9.8 and offset around 0 ought to convert mV/g to mV/(m/s^2) pretty effectively).
    Jeff
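    As a quick sanity check on the arithmetic, here is a minimal sketch (the sensitivity and reading values are hypothetical, and standard gravity is assumed):

        # Hedged sketch: the relationship between the two unit systems.
        # A reading scaled in g needs one multiplication to become m/s^2,
        # which is exactly what a custom scale with slope ~9.8 does.
        G = 9.80665                           # m/s^2 per g (standard gravity)

        sens_mv_per_g = 100.0                 # hypothetical value from the calibration sheet
        sens_mv_per_ms2 = sens_mv_per_g / G   # ~10.2 mV/(m/s^2), same sensor

        reading_g = 0.5                       # hypothetical acquired value in g
        reading_ms2 = reading_g * G           # ~4.903 m/s^2 after the scale
        print(sens_mv_per_ms2, reading_ms2)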

  • Apple mouse scrolling is too sensitive in Yosemite

    I use a wired USB Apple mouse with a scroll ball (A1152) on my iMac 27", and since upgrading to Yosemite, the mouse scrolling has been way too sensitive and doesn't allow any fine adjustment of scrolling speed/acceleration in System Preferences. In the Mouse preference pane, the 1st position on the Scrolling slider is way too slow: a full scroll of the ball down moves about 1/4 of a full webpage at native resolution with very slow acceleration. The 2nd slider position is not any faster than the 1st. The 3rd slider position accelerates way, way faster, and scrolls a distance of about 8 full pages using a moderate flick of the scroll ball. At this high sensitivity, I need to scroll fairly carefully to move only one page. In a Finder window with 500 items, it accelerates fast enough to move from the top of the list to the bottom with less than one full finger swipe of the ball at moderate speed. Any setting from 3-8 on the Scrolling slider is the same speed and acceleration and is way too fast to use in a well-controlled manner. The behavior is system-wide and is the same in any browser (Safari, Firefox, Chrome). This is nothing like how it behaved in OS X Lion through Mavericks over the past four years. Also, I can set the scrolling slider to the 3rd position, close System Preferences, and go back in to find that it has set itself to the 2nd position, but the scrolling behavior is that of the 3rd position until I move the slider again.
    I've trashed some .plist files from ~/Library/Preferences/ that I thought would reset things (com.apple.driver.AppleHIDMouse.plist and others) upon logout/login, but the behavior persists. I've tried a few pieces of software to override the mouse settings, and while they do affect the mouse tracking speed and acceleration, they don't work well for controlling the scroll speed. I've reset the PRAM and SMC. I've created a new user account. Nothing has allowed usable adjustment of the scroll speed/acceleration of the Apple mouse.
    Any suggestions on how I can regain in Yosemite the usable adjustment of scroll speed/acceleration that were in all previous OS versions would be appreciated.

    Try setting the scroll wheel scaling manually, by running the following command in the Terminal and then logging out and then back in to re-load the settings:
    defaults write -g com.apple.scrollwheel.scaling -float VALUE
    In this case, try values of 1, 0.5, 0.25, etc, decreasing in half each time to see how the scroll wheel sensitivity responds, and then use the one that works best. Alternatively, you can use the system preferences to set the sensitivity to the two values you have used so far, followed by running the following command between each to see what the value is for each:
    defaults read -g com.apple.scrollwheel.scaling
    Then you can determine some intermediate value to use, and use the first command above to set it to that value (again followed by logging out and then back in so the value will load).
