DC Bias on 4276A initiated by LabVIEW

I am working with the 4276A LCZ Meter, and I have downloaded and installed the driver for it. However, when I open the example VI, it seems to work except that it reports "ERROR 1300" in LabVIEW, and the instrument itself displays "ER 13".
I talked to my colleagues about this, and it seems LabVIEW makes some assumptions about the instrument. The 4276A manual (Section III, 3-14; page 56 in the PDF) says this error occurs if the internal DC bias voltage was set via the HP-IB but the instrument is not equipped with Option 1 (a DC bias supply), or if the comparator enable code (E1) was sent via the HP-IB but the instrument is not equipped with Option 2 (a comparator/handler interface).
Well, this 4276A is not equipped with either option, and the switch on the instrument that turns on the DC bias is off. However, as you can see from the picture showing my error, there is a DC bias input in the VI itself.
So I am wondering: how would I go about turning off the DC bias, or stopping LabVIEW from turning it on? I have looked through the subVIs that make up this example VI, and there is no code that sends the "E1" enable code string, so I don't believe the error is caused by Option 2.
I have attached the block diagram for the example VI. In addition, I added the subVI (ConfigTestSignal.VI) that contains the DC bias input.
Let me know what you guys think. THANK YOU!

Hello Erny123,
I've not read the manual, but I would guess that BI is the command that sets a bias value and EN marks the end of a command. "%.1f" is a format specifier that inserts a floating-point value with one digit after the decimal point (refer to that link for the syntax requirements of the Format Value function). Just above it you can see that a similar Format Value function inserts the test frequency into the command string, using FR rather than BI and a slightly different numeric format.
The example you're using hasn't been written using the standard LabVIEW style, so it may be a little hard to track what inputs and outputs go where.  Try clicking the Clean Up Diagram icon in the example's block diagram to re-order things in a more logical left-to-right fashion. 
What this VI is doing is composing a command string to send to the instrument. You'll need to refer to the manual for the exact format requirements, but I would guess that you can simply delete the second Format Value function and wire the output of the first straight through to the send-message VI.
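To make that concrete, here is a rough sketch in Python of the string the VI appears to be building and what it should build instead. This is only an illustration: the BI/FR/EN mnemonics are guesses from this thread, the numeric formats are placeholders, and send_message is a hypothetical stand-in for the driver's send VI - check everything against the 4276A manual.

# Hypothetical sketch of the command string ConfigTestSignal.VI composes.
def send_message(cmd: str) -> None:
    # Stand-in for the driver's "send message" VI; just show the string here.
    print("would send:", cmd)

test_frequency_khz = 100.0
dc_bias_v = 1.5

# What the example builds now: a frequency command followed by a bias
# command, which an unoptioned 4276A rejects with ER 13 / error 1300.
cmd = "FR%.2fEN" % test_frequency_khz + "BI%.1fEN" % dc_bias_v

# After deleting the second Format Value function, only the frequency
# command remains, so no DC bias is ever requested.
cmd = "FR%.2fEN" % test_frequency_khz
send_message(cmd)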
I wouldn't say that you need to start from scratch every time, but yes, you'll need to make some modifications to this example to get it to work for you.  I'd strongly recommend that you go through one or more of the LabVIEW tutorials available online, it'll save you a lot of time in the long run.
If you have an active support contract and your software serial number has been activated on your ni.com account, you should have access to the LabVIEW Core training online here:
NI Online Self-paced-training
https://lumen.ni.com/nicif/us/LMS_LOGIN/content.xhtml?du=http%3A%2F%2Fsine.ni.com%3A80%2Fmyni%2Fself...
If you don't have access to that, there are a number of other introductory LabVIEW tutorials available on ni.com. The links on the right-hand side of this page are a pretty good place to start for online tutorials:
Academic LabVIEW Training: How to Learn NI LabVIEW
http://www.ni.com/academic/labview_training/
Regards,
Tom L.

Similar Messages

  • I have successfully installed the Nidaq692.exe and LV5Fra.exe drivers for LabVIEW and a PCI-6110 card

    Unfortunately, I can't find the data acquisition functions to acquire data from the PCI-6110 in LabVIEW.
    How can I find the data acquisition VIs in LabVIEW?

    Is the Data Acquisition palette completely missing? If you initially installed LabVIEW without NI-DAQ, it may be that none of the VIs got installed. You may have to reinstall at least part of LabVIEW.

  • Real time labels on x-axis

    Hi all
    I thought I had cracked the business of putting real-time labels on the x-axis of LabVIEW charts, but today I found that the way I had done it was failing at any time between midnight and 01:00.
    In the past I have used a property node for my chart or graph (not sure which) and initialised it with the current PC time, with the x-axis format set to time & date. This seemed to work, but I noticed that, when daylight saving was active, LabVIEW would try to correct for it (I'm on GMT, London, so occasionally go to GMT+1). So I added a correction to subtract the "error". This works fine except between midnight and 01:00: at those times my correction produces a negative number, which LabVIEW cannot interpret as a time (I'm not surprised by this), so it plots "Neg" on the x-axis until the time passes 00:59:59. The PC time-zone setting also causes a problem that I hadn't appreciated until I started looking into this.
    In the LabVIEW examples for real-time graph axes, NI use an x-axis formatted as decimal with real-time selected from the radio buttons below. This works OK and is immune to DST and time-zone offsets, but it doesn't look like a proper time between midnight and 01:00.
    So the only way I can get a nice-looking display that doesn't misbehave between midnight and 01:00 is to set the PC system clock to GMT: Casablanca, Monrovia, a time zone with no offset or daylight-saving activity.
    Have I missed something, or is this way harder than it should be?
    Many thanks in advance for any help or advice.
    Regards
    Bill
    mailto:[email protected]
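    The "Neg" failure described above is the classic wraparound you get when a one-hour correction is applied to a broken-out time-of-day instead of to the underlying seconds count. A minimal Python sketch of the difference (the fixed one-hour offset stands in for LabVIEW's time-zone/DST adjustment; this is an illustration, not the LabVIEW fix itself):

    import datetime

    # A timestamp shortly after midnight, as absolute seconds since an epoch.
    t = datetime.datetime(2004, 4, 12, 0, 30).timestamp()

    # Wrong: correcting the displayed hour field wraps negative at 00:xx.
    hour = datetime.datetime.fromtimestamp(t).hour
    print(hour - 1)  # -1, i.e. the "Neg" label on the axis

    # Right: subtract the offset from the absolute seconds, then format;
    # midnight is no longer a special case.
    corrected = datetime.datetime.fromtimestamp(t - 3600)
    print(corrected.strftime("%H:%M:%S"))  # 23:30:00 (previous day)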

    Hi
    Sorry, I was posting from a newsgroup and didn't realise this was web-based too.
    In my example, timeongrapghs+dst correction.vi, where the chart x-axis is formatted as date and time, for some reason LabVIEW adds an offset according to the time zone and/or whether daylight saving is active. I had corrected this offset with a simple subtraction, but at midnight the hour value is zero, so, as I said in my initial post, LabVIEW does not recognise -1 as a time, and the x-axis labels end up in error (they actually say "Neg"). My example shows this: set your PC clock to e.g. 00:12 and your PC's time zone to GMT, London, then change the PC date to be in or out of daylight saving (it is active Apr-Oct) and you will see the problem come and go.
    So I looked at the LabVIEW example: they use a decimal-formatted x-axis with real-time selected from the radio buttons underneath. I tried this approach in my example (timeongrapghs decimal x axis correction.vi), and it does not seem to suffer from an offset introduced by the time zone or daylight saving, but at midnight the time displayed on the x-axis loses its leading zeros, so 00:14:45 reads as 14:45 - misleading.
    I found that setting the PC clock to GMT: Casablanca, Monrovia makes my example behave perfectly around midnight; that time zone has no offset or daylight-saving activity.
    timeongrapghs NEEDS GMT CASA.vi
    Finally, I looked at the example you linked to, and although it works (though I don't understand it), I want my display to fill from the far left to the far right and then, once the x-axis is full, scroll across from right to left. My examples do this, although I noticed that opening them in LV 7 (they were written in 6i) seems to automatically switch on the chart's x-axis autoscaling - you need to switch x-axis autoscaling off to see how I want my charts to appear. The example you linked doesn't do this, and turning off the autoscaling seems to stop the data from being displayed. Can you change the example you linked so that the chart behaves like my examples but without the time zone/DST problem? If so, can you make it so I can plot more than one data set on the y-axis (same time resolution of data)?
    Thanks for helping
    Bill
    Attachments:
    timeongrapghs NEEDS GMT CASA.vi ‏45 KB
    timeongrapghs+dst correction.vi ‏42 KB
    timeongrapghs decimal x axis correction.vi ‏41 KB

  • Triggering and Recording for specific duration

    Enclosed is the VI I am using to trigger an electrical circuit that generates a spark.
    My system is very simple: I generate a spark in a vessel (via a triggered electrical pulse to an HV generator) and measure the pressure and temperature for specified times of X and Y seconds after the spark.
    Part A:
    On a single Boolean input (TRUE), I would like the following:
    1. The trigger pulse (USB-6009 device), which is the input to my electrical circuit, should start and last for about 1.5 s. Two DAQmx channels are used because the current from one port is not sufficient to trigger the HV generator circuit.
    2. At the same time, the pressure inside the vessel should be recorded for X s only (written to file).
    3. Simultaneously, the temperature should be recorded for Y s only (written to file).
    In the present VI, I have to keep the "Auto fire and record" button pressed to write the data to file. I have set a preset time for the trigger pulse.
    After triggering and recording for the specified times (1.5, X, and Y s), the Boolean should return to FALSE, since a long-duration pulse to the electrical circuit can be damaging. (See the sketch at the end of this post.)
    *** REQUEST: SOMEONE PLEASE HELP ON THIS MATTER ASAP. ***
    Part B:
    Later I have to record the current and voltage measured at three locations by a Tektronix TDS 3034B scope (channels 1, 2, and 3) for a duration of 300 ns after the spark has occurred (i.e., starting P s after the trigger pulse and recording until P + 300 ns).
    This simple logic seems to be more difficult than I expected; can someone please help / suggest the right direction? (VI attached)
    Sorry for the language - I'm new to LabVIEW and not an electrical engineer, so please explain.
    THANKS
    Email: [email protected]
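    A minimal sketch of the finite-duration pattern Part A asks for, using the Python nidaqmx API for illustration (the device and channel names, rates, and X are assumptions; the USB-6009's digital lines are software-timed, so the 1.5 s pulse is timed in software):

    import time
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    PRESSURE_SECONDS = 5.0  # X, assumed
    RATE = 1000.0           # pressure sample rate in Hz, assumed

    with nidaqmx.Task() as ai, nidaqmx.Task() as do:
        # A finite acquisition stops by itself after X s - no button to hold.
        ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # hypothetical pressure channel
        ai.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.FINITE,
                                      samps_per_chan=int(RATE * PRESSURE_SECONDS))
        ai.start()

        # 1.5 s trigger pulse on two lines driven together for extra current.
        do.do_channels.add_do_chan("Dev1/port0/line0:1")
        do.write(3)   # both lines high (bit mask 0b11)
        time.sleep(1.5)
        do.write(0)   # guaranteed return to FALSE - no long, damaging pulse

        data = ai.read(number_of_samples_per_channel=int(RATE * PRESSURE_SECONDS),
                       timeout=PRESSURE_SECONDS + 5.0)
        # ...write data to file; a second finite task records temperature for Y s.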

    Hi xseadog,
    Sincere apologies for the inordinate delay in replying.
    I was tied up with actually setting up the experiment physically - getting all the bits and pieces into the right places.
    The pressure sensor/transducer is an Omega PX219 with a 0-5 V output. I record a pressure reading every 0.001 s, starting immediately after the spark trigger (the USB-6009 signal) is switched on through LabVIEW.
    The temperature sensor is a K-type thermocouple; I record the temperature every 1/12 s (0.0833 s) after the spark.
    I have made a provision to trigger the spark manually through a push button, but that mode requires me to start recording pressure and temperature before pushing the button to generate the spark. Hence I am keen on using the trigger initiated through LabVIEW.
    Regarding the relay controlled by the 6009: I understand the circuit is similar to that. The HV is generated by an EHT pulse generator, which is ready now; the only thing is that it requires a trigger of 5 V and more than 10 mA.
    When we generate the spark manually we do not need the USB-6009, and we get a spark every time the button is pushed; it was only with LabVIEW triggering that we diagnosed this problem, and hence combined the two channels of the 6009.
    As of now, the "Auto fire and record" loop, which is the key to firing and spark generation, runs every 1.011 s. Can you suggest how to reduce the loop time? Do the DAQ calls consume the time, or which step consumes the most? I would welcome your comments.
    Thanks for the continued help and replies.
    nnnsh

  • To record the data of a graph in a file

    Hello,
    I am new to LabVIEW, and I would like to know how I can record the data that I visualize on a graph into a binary file, for example together with its date and time.
    Thank you in advance for your answer.

    There are several options, depending on what you want to do with your data, listed from easiest to hardest.
    Read it into a spreadsheet (e.g. Excel, OpenOffice Calc) - Use the Write LabVIEW Measurement File Express VI and pick the tab separated text format.  I know it is not binary, but it is the most compatible form of data storage - anything can read it.
    Read it later into LabVIEW, DIAdem, or Excel - Use the Write LabVIEW Measurement File Express VI or the Waveform Storage VIs and pick the TDMS format (not TDM).  This stores the data in a proprietary binary format optimized for streaming to disk.  This is the highest performing option for large numbers of waveforms simultaneously.  This requires LV8.2 or above.  Use TDM with earlier versions.  A plug-in is required for Excel, but it is available on the NI website.
    Use the data with one of NI's waveform editors or soft front panels - Use the NI-HWS VIs.  This is the highest performing option for small numbers of waveforms.  Since it is based on HDF5, it can also be read by other programs (e.g. Matlab, Mathematica).  It also has compression and supports file sizes over 2GBytes in LV versions before 8.0.
    Use the data in a custom fashion - There are very few instances that cannot be handled by the above methods, but if you want to roll your own system, use the LabVIEW file primitives.  Save your t0, dt, and waveform in a sequential manner and it will be easy to get them from disk.  If the t0 is in timestamp format, you can decode it on the other end by knowing its format: the timestamp is a fixed-point 128-bit number with the binary point in the middle - 64 bits of whole seconds followed by 64 bits of fractional seconds - counting from midnight, Jan 1, 1904 GMT.
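    For the roll-your-own route, here is an illustrative Python sketch of decoding a flattened LabVIEW timestamp, assuming the usual big-endian layout of a signed 64-bit whole-seconds field followed by an unsigned 64-bit fractional-seconds field:

    import struct
    from datetime import datetime, timedelta, timezone

    LV_EPOCH = datetime(1904, 1, 1, tzinfo=timezone.utc)  # LabVIEW's time zero

    def decode_lv_timestamp(raw: bytes) -> datetime:
        # 16 bytes: whole seconds (i64), then fraction of a second (u64).
        secs, frac = struct.unpack(">qQ", raw)
        return LV_EPOCH + timedelta(seconds=secs + frac / 2**64)

    # Example: 2 000 000 000.5 seconds past the 1904 epoch.
    raw = struct.pack(">qQ", 2_000_000_000, 2**63)
    print(decode_lv_timestamp(raw))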
    If you need more information, let us know.

  • Things that go under radar of CPU usage meters

    For example, when I compile a new kernel the CPU runs at 100% usage, but the top process in conky is some unrelated process using just a few percent.
    The same goes for VMware machines running heavy processes inside the guest system.
    Why is this so, and is there a way to get those things shown in CPU usage tables?

    It's not due to the .NET API; the knowledge base article applies to all DAQmx-based APIs in general, so you should see the same behaviour in LabVIEW or in CVI or C++ (other APIs that use DAQmx). In fact, I believe the KB I linked to was originally a LabVIEW KB.
    I haven't tried this in the test panels yet, but I will let you know what I find out.
    From your reply above, you mentioned that "One thing is certain, you will never know if your application is getting too many resources unless your acquisitions failled!" - does that mean you are getting an exception of some sort and not a message? Or getting one too late? Or not getting one at all?
    Best Regards,
    NandanD
    Nandan Dharwadker
    Staff Software Engineer
    Measurement Studio Hardware Team

  • Trying to use LabVIEW to analyse analog data from a jump on a force plate and measure peak force (at two points: the initial landing and the second landing from the jump), and to mark the time of flight (time off the plate)

    Attached is a file with 3 trials of a drop vertical jump activity onto a force plate.  The subject stands on a platform off the force plate, jumps onto the force plate, and immediately jumps up as if going for a rebound.  I am able to run this data and obtain a waveform graph with no problems.  However, I need to be able to find the initial contact with the force plate, the peak of the drop, the initial time off the force plate (prior to the jump), the return from the jump, and finally the second peak (the landing from the jump).
    I want to calculate the time of flight (the time off the plate and in the air after the initial drop) to calculate jump height.
    I had someone write me a MathScript for it, and it works well; however, I need to do it without MathScript, as I do not understand MathScript (nor LabVIEW!!).
    Please help
    Attachments:
    Jose_Index and shift register6.vi ‏130 KB
    NI post.docx ‏365 KB

    OK, but I'm not understanding what you're asking us to do... Are you asking us to explain what the MathScript code is doing? (It's searching the array for the elements where the values are above or below a threshold.) Are you asking someone to convert the whole MathScript code to LabVIEW (we are not a code-writing service), or do you just want to be able to calculate the new values you're after in LabVIEW?
    In the future, please do not post proprietary file formats. Most people do not have Word, or Word 2007 for that matter. Please post text files or PDFs. Thanks.
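    For what it's worth, the threshold search plus the flight-time arithmetic is only a few lines in a text language. A hedged NumPy sketch (the threshold and the two-landing structure are assumptions based on the description above; jump height uses the standard h = g*t^2/8 formula):

    import numpy as np

    def jump_metrics(force, fs, threshold=20.0):
        # force: vertical force trace in N; fs: sample rate in Hz.
        # Below `threshold` the subject is assumed to be off the plate.
        airborne = force < threshold
        edges = np.diff(airborne.astype(int))
        takeoffs = np.flatnonzero(edges == 1) + 1   # plate -> air
        landings = np.flatnonzero(edges == -1) + 1  # air -> plate

        # The drop lands first, so the jump's flight phase is the airborne
        # span that starts after that first contact.
        takeoff = takeoffs[takeoffs > landings[0]][0]
        landing = landings[landings > takeoff][0]

        t_flight = (landing - takeoff) / fs
        height = 9.81 * t_flight**2 / 8.0   # h = g*t^2/8
        peak1 = force[:takeoff].max()       # drop-landing peak force
        peak2 = force[landing:].max()       # jump-landing peak force
        return t_flight, height, peak1, peak2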

  • LabVIEW 2012 used to hang at shutdown, now hangs at launch ("Initializing menus")

    Hello,
    I'm running Windows 8 64-bit and LabVIEW 2012 32-bit (12.0f3). I started using it a few days ago and was never able to shut LabVIEW down properly. I could close my projects, but when I then tried to exit LabVIEW from the home screen, the GUI simply stopped responding, and I'd have to kill it forcibly. Then I installed the latest VIPM and noticed that I can't shut that down properly either (I believe VIPM is implemented in LabVIEW?). None of my non-LabVIEW-based programs have this issue.
    Then, I tried to install OpenG from within VIPM, which launched LabVIEW itself. However, that hung at "Initializing menus".
    I'm not sure whether VIPM had anything to do with it, but now I'm unable to start LabVIEW at all. Even after restarting my machine (using the Restart option, not Windows 8's hybrid boot), LabVIEW hangs at "Initializing menus". Task Manager shows 0% CPU activity, and LabVIEW has only taken ~20 MB of memory.
    What could be the cause, and how can I fix it?
    Thanks

    OK, I:
    1. Uninstalled ALL National Instruments, JKI, and IVI software.
    2. Rebooted.
    3. Manually deleted residual NI, JKI, and IVI files/folders.
    4. Cleaned the Windows registry of NI, JKI, and IVI entries.
    5. Rebooted.
    6. Checked Task Manager to ensure that no NI service eluded deletion.
    7. Installed LabVIEW 2012 from the NI Developer Suite DVD (LabVIEW only - no other components).
    8. Rebooted.
    9. Launched LabVIEW.
    LabVIEW launches now, but I'm still unable to shut it down properly... One day it will probably refuse to launch, too.
    Some help, please?

  • When using the LabVIEW Simulation Module, how can I start a frequency sweep with the Simulation Chirp signal generation VI at an arbitrary time after the simulation initial time?

    I'm using the Simulation Loop on LV RT to interact with some hardware (i.e., all I/O happens in the Sim loop). I'm going to run a frequency sweep test on the hardware and want to use the Simulation Chirp function. However, there is no way (that I can see) to trigger the Chirp to start on some event (e.g., I click a Boolean on the front panel and enter a case statement). The Chirp is tied to the global time of the Simulation loop, and so are its input parameters. Is there a way to make it relative to the time at which an event happens?

    Craig,
    Your solution will 'cut' the beginning of the signal and only show the signal afterwards.
    To control when the chirp should start, the best option is to use Chirp Pattern.vi (in the Signal Generation palette) with a lookup table that controls when to feed the time. The shipping example (in LabVIEW 2013) shows how to code this with a lookup table.
    C:\Program Files (x86)\National Instruments\LabVIEW 2013\examples\Control and Simulation\Simulation\Signal Generation\SimEx Chirp.vi
    Then, to start from a toggled boolean, look at the example:
    C:\Program Files (x86)\National Instruments\LabVIEW 2013\examples\Control and Simulation\Simulation\Signal Generation\SimEx Step.vi
    And here is the example in 2013 (attached):
    Barp - Control and Simulation Group - LabVIEW R&D - National Instruments
    Attachments:
    SimEx Chirp with Delay.vi ‏241 KB
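    For readers without the attachment, the essence of the lookup-table trick is to hold the chirp's time input at zero until the trigger and then feed it the elapsed time since the trigger. A text-language sketch in NumPy/SciPy (all parameters are placeholders, not values from the shipping example):

    import numpy as np
    from scipy.signal import chirp

    fs = 1000.0
    t = np.arange(0.0, 10.0, 1.0 / fs)  # simulation time
    t0 = 2.0                            # time at which the Boolean is toggled

    # Freeze the chirp's local time at 0 until t0, then let it run; this is
    # what the lookup table does with the simulation's global time.
    local_t = np.clip(t - t0, 0.0, None)
    y = chirp(local_t, f0=1.0, t1=8.0, f1=100.0, method="linear")
    y[t < t0] = 0.0                     # silent before the trigger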

  • Problem initializing PHD 2000 pump in LabVIEW

    I am a LabVIEW rookie, and I am trying to write a small program to control a Harvard Apparatus PHD 2000 pump. (I did find the driver for the pump from NI, and it works fine, but it is too complicated for my purpose.)
    I ran into problems at the very beginning. I tried to use the "simple serial.VI" found in the LabVIEW examples to see if I could initialize the port and get any response from the pump. However, I cannot get the expected response from the pump even though I set the parameters correctly (baud rate 9600, stop bits 1, flow control none, etc.).
    The commands I tried are all from the pump manual, including some very basic ones such as VER (show the pump's software version). The port itself works fine when checked with HyperTerminal. Does anybody have an idea why this happens?

    When you configured the serial port, did you enable sending a termination character? When you use HyperTerminal and hit "Enter" at the end of a command, that "Enter" gets sent to the PHD pump. LabVIEW normally sends exactly the characters you provide: if you wire a single-line string, it won't include an end-of-line termination character (a carriage return in this case). You can configure VISA to append a termination character after each transmit, as shown here: http://forums.ni.com/t5/LabVIEW/PHD-2000-Pump-driver-change-infuse-rate/m-p/2637177#M788074
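    The same fix in text form, sketched with Python and PyVISA (the resource name is a placeholder, and the carriage-return terminator is taken from the advice above - confirm it against the pump manual):

    import pyvisa

    rm = pyvisa.ResourceManager()
    pump = rm.open_resource("ASRL1::INSTR")  # hypothetical COM1 resource
    pump.baud_rate = 9600
    pump.write_termination = "\r"  # append the "Enter" HyperTerminal was sending
    pump.read_termination = "\r"   # strip it from replies, too
    print(pump.query("VER"))       # should now return the firmware version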

  • Avoid the LabVIEW startup window showing up when running a sequence

    Hi,
    I'm running a TestStand sequence that calls LabVIEW VIs (not an EXE, not a DLL), and right when it starts running, the main LabVIEW startup window appears in front of TestStand - I mean the window that appears when you open LabVIEW. How can I avoid this? If I try to close it, the sequence stops with an error; if I minimize it, the popup messages I'm sending from LabVIEW start appearing in the upper-left corner.
    Can somebody help me?
    Thanks
    Eren

    Hello Eren,
    The reason the LabVIEW screen appears is that TestStand's LabVIEW code module adapter is configured to use the LabVIEW development environment to run the VIs called by your sequence.
    In TestStand 3.0 or 3.1, you can change the LabVIEW adapter configuration to use the LabVIEW run-time engine, which will not display the LabVIEW screen. In the TestStand sequence editor, click the Configure menu >> Adapters, then highlight the LabVIEW adapter. Click the Configure button, then select the option "LabVIEW Run-time Engine".
    David Mc.
    NI Applications Engineer

  • LabVIEW installation initiation problem

    Dear All
    I have had some sudden problems with LabVIEW (version 8.2.1) on Windows XP, and I have been trying to resolve the issue ever since. After several rounds of useless uninstalling and reinstalling, I was told the problem might be solved if I removed some LabVIEW-related registry entries, so I removed some registry entries that had some sort of connection to the National Instruments package from my computer - but I did it very carelessly.
    After that, the problem became even worse: now when I start the installation process, it aborts with the error message "Error: Windows SDK function returned an error (error code -12). The system cannot find the file specified."
    Do you have any suggestions for resolving this issue?

    Dear Muks
    Sorry, not really - I don't remember what I removed, and, more stupidly, I did not back up my registry; I just looked for National Instruments entries in the registry and deleted a couple of them. Do you know whether I can use registry-cleaning software to remove this problem, or, since the error relates to the Windows SDK, whether I can install something to resolve it? I installed the Windows SDK library files and so on, but it did not work.
    Regards
    Afshin

  • Is LabVIEW best for implementing an array of microphones?

    Hello techies,
    I am planning to implement an array of microphones for sound localization. Is LabVIEW the best tool for this, or does anybody know a better one?
    I would appreciate it if anybody could help me with this.
    Thanks in advance!

    Stream of consciousness alert.
    My sense is that LabVIEW is not the BEST.
    The best is probably an all-hardware solution.  Very difficult to build, harder to debug, harder still to modify.
    Introduce software, and the next best is probably coded in machine language - probably harder than all-hardware.
    Next best is to use a low-level language such as C with custom routines optimized for your particular problem.  Hard, but doable, and not user-friendly.
    Move to a higher-level language such as C++: better UI, slightly worse performance.
    Move to LabVIEW/Measurement Studio: slightly more overhead, great UI, relatively easy to modify and debug.  Probably the only one that works on a reasonable budget/timeline.
    99+% of applications would probably not need performance beyond what LV can deliver (at least initially).  Even if you did, you'd be crazy not to start with the most straightforward option.
    My personal bias is towards Measurement Studio.  I feel it gives me the ability to get my hands dirty when necessary while maintaining a very clean UI.  With LV I feel there is some overhead when interfacing with external code.  If I already knew LV, would I learn/buy Measurement Studio just for this application?  Almost certainly not.
    Don't let perfection be the enemy of the very, very good.  I say go with LV.
    End stream of consciousness.

  • Simulating harmonics with LabVIEW

    Hello everyone,
    I have a project this year based on LabVIEW, and I'm a bit stuck because I have a problem in my program. Let me explain the objective of the project:
    - Send an input signal (amplitude, frequency, offset, phase), then decompose it to obtain its harmonics.
    - Select the harmonics we want to observe at a device output.
    The goal of the project is to follow the evolution of a signal through its harmonics, then simulate a transformer in Multisim and pass the LabVIEW signal through that Multisim model. (By the way, if you also know a way to pass the LabVIEW signal into the transformer circuit I will create in Multisim, that would be great.)
    I hope someone can help me with my research.
    Thank you in advance.
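    In text form, the decompose-and-select step is just a forward FFT, a mask over the harmonic bins you want, and an inverse FFT. A minimal NumPy sketch (the signal and the selected harmonics are made up for illustration):

    import numpy as np

    fs = 10_000                      # sample rate in Hz
    t = np.arange(0, 1.0, 1.0 / fs)  # 1 s window -> 1 Hz bin spacing
    f0 = 50.0                        # fundamental frequency

    # Input signal: fundamental plus 3rd and 5th harmonics.
    x = (2.0 * np.sin(2 * np.pi * f0 * t)
         + 0.5 * np.sin(2 * np.pi * 3 * f0 * t)
         + 0.2 * np.sin(2 * np.pi * 5 * f0 * t))

    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)

    # Keep only the selected harmonics (here the 3rd and 5th) and rebuild
    # the time-domain signal to pass on to the output (or Multisim) stage.
    selected = np.isin(freqs, [3 * f0, 5 * f0])
    y = np.fft.irfft(X * selected, n=len(x))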

    Hello,
    You're on the English section of the forum. Please write in English or post in the French section.
    Thanks!
    By the way, you didn't actually ask a question in your post, so we don't know exactly what you're expecting.
    Regards,
    Jérémy C.
    National Instruments France

  • How can I update an HTML web page from LabVIEW?

    I intend to publish the "status" of an experiment online through a web page: if my VI is running, I would like a phrase or indicator on my web page to be set on; if my VI is not running, I would like the indicator set off. The page is hosted on a server, and I would like to know if it's possible to update the HTML file from LabVIEW through an FTP connection.
    Thanks in advance...
    MSc. Alexandre Michel Maul
    Parana Federal Univeristy - Brasil

    The System Exec function is on the Communication palette; it's for executing system commands. On my Win2K system, the help for FTP is:
    "Ftp
    Transfers files to and from a computer running an FTP server service (sometimes called a daemon). Ftp can be used interactively. Click ftp commands in the Related Topics list for a description of available ftp subcommands. This command is available only if the TCP/IP protocol has been installed. Ftp is a service, that, once started, creates a sub-environment in which you can use ftp commands, and from which you can return to the Windows 2000 command prompt by typing the quit subcommand. When the ftp sub-environment is running, it is indicated by the ftp command prompt.
    ftp [-v] [-n] [-i] [-d] [-g]
    [-s:filename] [-a] [-w:windowsize] [computer]
    Parameters
    -v
    Suppresses display of remote server responses.
    -n
    Suppresses autologin upon initial connection.
    -i
    Turns off interactive prompting during multiple file transfers.
    -d
    Enables debugging, displaying all ftp commands passed between the client and server.
    -g
    Disables file name globbing, which permits the use of wildcard characters (* and ?) in local file and path names. (See the glob command in the online Command Reference.)
    -s:filename
    Specifies a text file containing ftp commands; the commands automatically run after ftp starts. No spaces are allowed in this parameter. Use this switch instead of redirection (>).
    -a
    Use any local interface when binding data connection.
    -w:windowsize
    Overrides the default transfer buffer size of 4096.
    computer
    Specifies the computer name or IP address of the remote computer to connect to. The computer, if specified, must be the last parameter on the line."
    I use tftp all of the time to transfer files in a similar manner. Test the transfer from the Windows command line and copy it into a VI: pass the command line to System Exec and wait until it's done.
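    As an alternative to shelling out through System Exec, the same one-file status upload can be done directly from a script. Here is a sketch using Python's standard ftplib (the host, credentials, and file names are placeholders):

    from ftplib import FTP

    with FTP("ftp.example.com") as ftp:        # placeholder host
        ftp.login("user", "password")          # placeholder credentials
        with open("status.html", "rb") as f:
            ftp.storbinary("STOR status.html", f)  # overwrite the page on the server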
