Why is Self Calibration of 6711 inconsistent

Self Calibration of the 6711 using the E-Series Calibrate VI in LabVIEW RT is inconsistent (error -10842 is issued), despite warming up the board, ensuring that the board has an IRQ assigned, and removing the front-mount connector. Is calibration of the 6711 not supported in LabVIEW RT?

Have you tried using the 6711 as an RDA device and performing the calibration with the configured RDA device? This might get around the error. I hope this helps.
Regards,
Todd D.
NI Applications Engineer

Similar Messages

  • How many DSA Cards can be simultaneously Self Calibrated?

    I have two PXI-1045 chassis connected to a single computer via a dual-port MXI Express card.
    Each chassis holds 12 PXI-4462 four-channel DSA cards.
    I am using "DAQmx Self Calibrate.vi", which takes about 156 seconds per card.
    If I place 24 of these DAQmx Self Calibrate VIs on my block diagram, 4 finish after 156 seconds,
    then another 4 finish after another 156 seconds, and so on, until all are finished after a total of 936 seconds.
    I have found that 1, 2, 3, or 4 cards can be self calibrated simultaneously, but more than that requires additional time for each group of 4.
    So my question is: is it possible to self calibrate more than 4 at once?
    My computer is a quad core, but I don't think this is the limiting factor.
    Could some other resource, such as triggers on the PXI bus, be the limiting factor instead?

    Hey Kevin,
    The problem you are running into is likely caused by the fact that LabVIEW allocates a default of 4 threads per priority per execution system.  DAQmx VIs by default run at the same priority and in the same execution system as their caller (despite the existence of a "DAQ" execution system), so they are probably running at normal priority in the Standard execution system.
    While a DLL is being called by LabVIEW, the thread is reserved.  So 4 parallel calls to DAQmx Self Calibrate.vi (which calls into nilvaiu.dll inside the subVI) will reserve all 4 threads that LabVIEW has allocated to that priority and execution system.
    To raise the number of threads that LabVIEW allocates, add the line "ESys.Normal=24" (or whatever number you'd like) to LabVIEW.ini, then restart LabVIEW.  You might also have to run threadconfig.vi (instructions found here).
    You could technically just use threadconfig.vi to set the number of threads without manually editing the .ini file, but this only lets you allocate up to 8 threads per execution system per priority.  Note that the number shown in < > is two higher than the actual number of threads allocated, due to how the enum was defined in the VI.
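    For reference, a minimal sketch of the LabVIEW.ini edit described above, assuming 24 parallel calls at normal priority and assuming the token lives under the [LabVIEW] section of the .ini file next to the LabVIEW executable (restart LabVIEW after saving):

        [LabVIEW]
        ESys.Normal=24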
    Best Regards,
    John Passiak

  • Self-calibration of NI PCI-6036E

    I have a new NI PCI-6036E (low cost) data acquisition board. We set it up using one of the analog input channels in single-ended nonreferenced mode.
    I want to acquire a voltage value which is equal to a pressure reading of the connected pressure controller.
    After connecting everything together and setting up the parameters, I still have an offset between the voltage value from the pressure controller (7.85 V) and the card reading (8.05 V). How can I solve this problem?
    I tried to do a self-calibration of the data acquisition board using the online E Series diagnostic utility provided on your web page under www.ni.com/support/selftest, but it says "device detected but not supported". The E Series model I am using should be supported.
    Can I do a quick self-calibration in another way?
    It would be very nice if somebody could give me some advice on how to solve this problem.
    Thank you very much for your help!
    beam

    Beam,
    You can programmatically perform a self-calibration. In LabVIEW, you can call the E-Series Calibrate VI within the Data Acquisition>>Calibration and Configuration subpalette. If using a text-based ADE, you can call the Calibrate_E_Series NI-DAQ function.
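    For reference, a minimal C sketch of the Calibrate_E_Series call mentioned above (Traditional NI-DAQ; the device number is an assumption and depends on your MAX configuration):

        /* Self-calibrate an E Series board with Traditional NI-DAQ. */
        #include <stdio.h>
        #include "nidaq.h"
        #include "nidaqcns.h"

        int main(void)
        {
            i16 device = 1;  /* assumption: device number assigned in MAX */

            /* Store the new constants in the user EEPROM area; the
               reference-voltage argument is ignored for a self-calibration. */
            i16 status = Calibrate_E_Series(device, ND_SELF_CALIBRATE,
                                            ND_USER_EEPROM_AREA, 0.0);
            if (status != 0)
                printf("Calibrate_E_Series failed with status %d\n", status);
            return status;
        }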
    You are right in that the PCI-6036E device should be supported by the online diagnostic utility; that should work. Make sure you have NI-DAQ 6.9 or higher and Internet Explorer 5.0 or higher.
    Also, is the voltage you are reading floating or ground-referenced? If the source is floating, then you should use bias resistors when configured in NRSE or differential mode. Chapter 4 of the E Series User Manual (linked below) goes over how/when to connect bias resistors. I would take a look at this chapter and use it as a reference for connecting signals. I hope this helps.
    PCI E Series User Manual
    http://digital.ni.com/manuals.nsf/webAdvsearch/06F1C9FB0D0BA5C286256C010057461B?OpenDocument&vid=niwc&node=132100_US
    Regards,
    Todd D.
    NI Applications Engineer

  • Error during self calibration - PXI-4461

    We are running Calibration Executive 3.2.  We are using a PXI chassis and controller and trying to calibrate a PXI-4461 card.
    In running the procedure, we received the following error during Self Calibration:
    Error -200718 occurred at DAQmx Self Calibrate.vi at step Self Calibrate.
    Any guidance in resolving this issue would be appreciated.
    Richard

    Here is the info from the calibration report. I can email you a PDF of the report and also a screen capture of the error message, which states: "Measurement taken during calibration produced an invalid AI gain calibration constant. If performing an external calibration, ensure that the reference voltage passed to the calibration VI or function is correct. Repeat the calibration. If the error persists, contact National Instruments Technical Support."
    CALIBRATION PERFORMANCE TEST DATA
    DUT Information
    Type: PXI-4461
    Tracking Number: 33367
    Serial Number: 33367
    Notes
    Customer Information
    Name: Cal Lab
    Address:
    Purchase Order:
    Notes
    Environmental Conditions
    Temperature: 23.0 C
    Humidity: 13.0 %
    Operator Information
    Operator Name: administrator
    Calibration Date: Friday, March 23, 2007, 16:27:03
    Notes: Error or termination occurred. This calibration may not be valid. Error code: -200718. Error message: Error -200718 occurred at DAQmx Self Calibrate.vi at step Self Calibrate. Possible reason(s): Measurement taken during calibration produced an invalid AI gain calibration constant.
    Standards used during Calibration
    Type                                   Tracking Number   Calibration Due Date   Notes
    Fluke 5500A Multifunction Calibrator   32261             3/15/2008
    DMM                                    32260             11/21/2007
    33250A                                 29536             10/13/2007
    Calibration Results: Test Canceled
    Test Value: N/A
    As Found: Low Limit N/A, Reading N/A, High Limit N/A, Pass/Fail N/A
    As Left:  Low Limit N/A, Reading N/A, High Limit N/A, Pass/Fail N/A

  • SCXI 1125 self calibration

    Hi,
    Does the SCXI 1125 support self calibration? If so, can I do it using DAQmx Self Calibrate.vi?
    "A VI inside a Class is worth hundreds in the bush"
    This guy is a tiger, I tell you!!!

    Thank you Vishal.
    The SCXI 1125 user manual says that internal calibration is possible (for offset) and that we should use the SCXI Calibrate VI (not the DAQmx Self Calibrate VI). I am not able to find this VI in the DAQmx calibration palette. The only SCXI calibration VI present is for external calibration.
    "A VI inside a Class is worth hundreds in the bush"
    This guy is a tiger, I tell you!!!

  • PCI-4472 self calibration error under MAX

    Hi,
    Our PCI-4472 appeared to have offsets on the order of several hundred microvolts, so I tried a self calibration under MAX (version 4.1.0.3001) to try to reduce them. The self calibration gave me the error message:
    "Measurement taken during calibration produced an invalid AI gain calibration constant."
    Does anyone know where I can find troubleshooting information for this issue, or have any suggestions on how to fix it?
    Thanks,
    Ron Norton
    Faculty Research Assistant
    Gravitation Experiment Research Group
    Department of Physics
    University of Maryland
    College Park, MD

    I am posting this for an applications engineer; it is intended to post last, in a linear fashion.
    I have attached screen grabs of the self test and calibration panels.
    When running the "self-test" panels within MAX, all tests pass.
    I performed the tests after hitting "reset" in MAX.
    Current versions of software installed: LabVIEW Full Development System 8.2 and MAX 8.5.
    All upgrades have been applied.
    Ai1 is set up to measure a voltage differential of 1-5 VDC, based on a "differential" input
    across a precision resistor. I have set up a custom scale: 1 V = -40 °C, 5 V = +180 °C.
    I have tried sample rates of 3 samples at 3 Hz, 10 at 60 Hz, 100 at 600 Hz, etc.
    As I run the self test (and it's the same when I launch LabVIEW and work within a VI), I see
    the waveform output with what looks like a square wave. I have put a scope on the wire pair
    and do see steady voltage levels. The sensor has a current loop supplying external excitation.
    The sensor output passes through a linearization circuit operating at 60 Hz.
    In the 3rd image I captured Ai0, which is similarly set up as a "differential" 1-5 VDC input with scaled output.
    You can see the graph image captures the trailing signal just like on the Ai1 analog channel.
    I have verified with Omega that the sensor is wired correctly with the current loop supplied,
    and I have also twisted the wire pairs for less EMI/RFI crosstalk/noise.
    The sensor and its power supply are brand new, purchased directly from Omega.
    I am planning to send this to a local vendor for calibration, SE Labs in Santa Clara, but if it's
    a matter of the board's circuitry being damaged, then it needs repair. I started here about
    3 months ago and, after asking around, found the PCI-6250 DAQmx board was available.
    Problem is, I don't know its history, besides the 2-year calibration expiring in Nov 2006.
    The one thing I have not tried is moving the card to another PCI slot, but FYI, just this week I
    moved the card over to another workstation that exceeds minimum hardware requirements.
    The card was previously installed in a Pentium III 848 MHz workstation with 424 MB of RAM.
    The fact is, the graph output of both channels looks exactly the same on either PC.
    The previous user was an engineer, who may have exceeded voltage limits/parameters
    on the analog and/or digital inputs. Thank you again for your knowledgeable replies!
    Sincerely,
    Phil Johnson
    Hardware Technician
    http://www.digitalpersona.com/
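    As an aside, the custom scale described above (1 V = -40 °C, 5 V = +180 °C) is linear with slope (180 - (-40)) / (5 - 1) = 55 °C/V and intercept -40 - 55 × 1 = -95 °C. A minimal sketch of such a scale in the DAQmx C API (the scale name and units string are hypothetical):

        /* Create a linear custom scale mapping 1-5 V to -40 to +180 degC. */
        #include <stdio.h>
        #include <NIDAQmx.h>

        int main(void)
        {
            int32 err = DAQmxCreateLinScale("TempScale", 55.0, -95.0,
                                            DAQmx_Val_Volts, "degC");
            if (err < 0) {
                char buf[2048];
                DAQmxGetExtendedErrorInfo(buf, sizeof buf);
                printf("DAQmx error: %s\n", buf);
            }
            return 0;
        }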

  • We had some accuracy issues with an NI 5112 scope in one of the ATEs, so I decided to perform self calibration using a LabVIEW VI to call the "niScope_CalSelfCalibrate(handle, "", 0);" function.

    We had some accuracy issues with an NI 5112 scope in one of the ATEs, so I decided to perform self calibration using a LabVIEW VI to call the "niScope_CalSelfCalibrate(handle, "", 0);" function.
    But it made things worse. I tried using option 2 to restore, but it did not work.
    Could you please advise me on how to resolve this issue?

    Hi Ana10,
    Are you using this digitizer with NI VideoMaster? If not, you should probably post this in the Digitizer forums. That said, I would suggest using the self calibrate function in MAX for this device rather than the LabVIEW API, just so that you can rule out any errors in configuring the digitizer for self calibration in LabVIEW. Also, you should ensure that all inputs are disconnected before performing a self cal. If this still results in a calibration error, you could refer to the following document or arrange to return the digitizer to NI for external calibration.
    http://www.ni.com/pdf/manuals/370328e.pdf
    Hope this helps,
    Nick
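    For reference, a minimal C sketch of the self-calibration call described above, using the NI-SCOPE C API (the resource name "Dev1" is an assumption):

        #include <stdio.h>
        #include "niScope.h"

        int main(void)
        {
            ViSession vi = VI_NULL;

            /* Open a session; disconnect all inputs before self-calibrating. */
            ViStatus status = niScope_init("Dev1", VI_TRUE, VI_FALSE, &vi);
            if (status < VI_SUCCESS)
                return (int)status;

            /* Empty channel list, option 0 = normal self calibration. */
            status = niScope_CalSelfCalibrate(vi, "", 0);
            if (status < VI_SUCCESS)
                printf("Self calibration failed with status %ld\n", (long)status);

            niScope_close(vi);
            return (int)status;
        }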

  • Why my self-signed applet could not read local disk but could write?

    I used a self-signed certificate for the applet yesterday and it worked
    fine at first; that is, it could write/read files to/from the
    local disk, and it could connect to other MDS servers. But later, a
    problem appeared. It could write a file to the local disk (I tested it and created
    files under C:\ on Windows), but when it tried to read that file, it got an
    IO access exception. It could still connect to other servers. I am puzzled
    by this problem, and I wonder why I can write but cannot
    read. Do I need to deal with a policy file issue here?

    Hi,
    I am doing something similar to what you are doing: I wanted to write to a file on the local disk on which the applet is running, and for this I have signed the applet. Do I need to make it self-signed? What is the difference between a self-signed and a signed applet? The problem I am facing is that it still gives me a security exception even if I define a policy file for the applet. Can you help me with this?
    Thanks in advance; your help would be appreciated.
    rao_lavs

  • Why is screen calibration not correct in FCP X?

    I use a Mac Pro + 2x 30" Cinema Displays.
    When calibrating the displays with ColorMunki, everything is fine; ColorMunki makes a great calibration.
    Until I run FCP X!
    When using the ColorMunki calibration profile, FCP X is just fine except for the Viewer, which turns all the dark parts of my clips much too dark.
    When turning off the profiles, everything is OK.
    The same clip in Events and in the Timeline is not "hurt"; it's only the Viewer.
    If you want to see it, I've made this 10 sec. screencast to show you: http://dl.dropbox.com/u/2901482/calibration.mov
    Why on earth is this happening, and what can I do (apart from turning off the calibration, which is not really what I want)?

    Bonjour Kenneth,
    Thank you!
    I just changed the calibration to V2 instead of V4, and then it also works in Final Cut Pro X!
    Glad that I could be of help. In case you need to print photo-related material, if that is something you do sometimes, just be sure to set version 4 in ColorMunki Photo and then re-calibrate, and so on. I think it is a bit of an inconvenience, but it can be done.
    Time for me to go to bed.
    Bonne nuit!
    Brian

  • Why are my iTunes file listings inconsistent?

    As a new Apple user I have been ripping my CDs to my MacBook's hard drive, in order to transfer to my music player and create my own compilation CDs etc., but am finding the album and track listings inconsistent. When bluetoothing files to my mobile phone I found the file listings all over the place. For example, I ripped the 4-disc anniversary set of John Barry's film music and, when looking for the files, found discs 2 and 3 listed under John Barry and, after a long hunt, discs 1 and 4 under the 'Compilations' subheading. All the title information is the same and in the same order, and I can find no reason for the discrepancy. The listings in iTunes are fine; it's just the file transfer boxes. I have also experienced the same problem with other artists where their greatest hits albums have, for inexplicable reasons, also been arbitrarily placed in the 'Compilations' file despite clear titling, e.g. Janis Ian's Souvenirs placed under 'Compilations', NOT Janis Ian. Any ideas out there in etherland?

    Dear Chris CA,
    Many thanks for the swift reply and a Merry Christmas and Happy New Year to you and yours. The 'File transfer boxes', apologies for that, I let my computer illiteracy run away with me. I suppose they are the windows found when you click on 'bluetooth' at the top of your screen then 'Send File-Music-iTunes-iTunes Music-Artist-Album-Track' to make a selection for a music file to be bluetoothed to a mobile device.
    I take on board your point about compilations, but it still does not explain why a four disc set from a single album by the same artist would find discs 1 and 4 under the artist's name (John Barry) and discs 2 and 3 under Compilations. I will, however, try to follow your advice when the Christmas rush is over.
    I must take issue on the 'Compilations' comment though. I know there are those amongst us, I can be one myself, who are a tad anal-retentive, over protective or plain nuts when it comes to their music collection and filing systems (people have been known to file by date, artist and even, believe it or not, autobiographically), but I've found nobody who places single artist Definitive, Greatest Hits or 'B' Sides collections under compilations. They're always listed by artist because using any other system leads us to chaos and the collapse of the world as we know it. Imagine hunting through racks of albums of N.O.W. CDs (1 to infinity) to try and find that Olivia Newton-John Greatest Hits!!! Having to scroll through several thousand Amazon compilation pages to get to 'The Best Of Captain Beefheart'. My friend the system is flawed and I am saddened that Apple have let me down in this way when Windows, of which I am now no friend (that's a different story), have this problem licked!!!
    Kind regards,
    Barksandbytes.

  • Error -10007 after self calibration

    I am trying to self calibrate an AT-MIO-64E-3 using 'Calibrate_E_Series'
    under CVI 5.0. After the calibration, when using 'SCAN_Op' I regularly (though not
    always) receive the error message -10007:
    "Channel out-of-range for device type or input configuration; either the
    combination of channels is not allowed, or you must reverse the scan
    order so that channel 0 is last"
    Any hints would be appreciated.
    Thanks, Matthias
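    For reference, a minimal C sketch of the scan-order advice in the error message, placing channel 0 last in the scan list (Traditional NI-DAQ; the device number and gains are assumptions):

        #include "nidaq.h"

        #define NCHANS 4

        int main(void)
        {
            i16 device = 1;                    /* assumption: device 1 */
            i16 chans[NCHANS] = {3, 2, 1, 0};  /* reversed so channel 0 is last */
            i16 gains[NCHANS] = {1, 1, 1, 1};

            /* Configure the scan list before starting the acquisition. */
            return SCAN_Setup(device, NCHANS, chans, gains);
        }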


  • Why would the TNT4882 be responding inconsistently?

    I am using a microcontroller to talk to the TNT4882. During write cycles, the microcontroller correctly sends the address, csn, wrn, and data signals to the TNT, and all the signals are within the tolerances of the generic-mode AC timing characteristics from the TNT data sheet. However, the TNT does not consistently respond with the RDY1 signal, which would indicate the data had been latched. RDY1 only seems to work correctly when I am talking to specific registers; for example, I have always been able to load data into the FIFOA, FIFOB, INTR, HSSEL, ADR, and TIMER registers, but when I address any other register, the TNT fails to respond. On a read cycle, the TNT tends to work even less; the only registers I have been able to read from are the FIFOA and FIFOB registers. Once again, the microcontroller correctly asserts the address, csn, rdn, and data signals while the TNT fails to respond with a RDY1 pulse. What could be causing this inconsistency?

    As a first source of troubleshooting and reference, I would suggest looking through the following resources.
    TNT4882 Manual --> http://www.ni.com/pdf/manuals/320724.pdf
    General Register Level Programming Information --> http://ni.com/support/gpib/reg_prog.htm
    Basic shell program for interfacing to a TNT4882 chip --> ftp://ftp.ni.com/support/gpib/misc/ESP-488.ZIP
    The specific pages of interest for you will be 5-3, 5-4, and B-4, which all deal with the RDY1 line. If, after reading through the manual and other resources, you still have questions, please let me know.
    Ryan Tamblin
    Applications Engineer
    National Instruments

  • Why is BT so unreliable and inconsistent?

    I occasionally get ping spikes and get kicked out of games while playing League of Legends. The same problem happens for some of my friends at the same time. I get tired whenever I try to get help with this; it is the usual "check speed -> restart router" etc. This has been going on for months now, I am getting tired of it, and it is a waste of money.

    BT's service is one of the worst services I have ever experienced and paid for in the world. For all the money pumped into it, I get an equal amount of lag and ping issues whenever I am on the internet. On top of this, while carrying out important things over the internet, the connection randomly decides to drop, especially near midnight, and it happens frequently. I don't know if this is done purposely, but at least give us a **bleep** 30 minutes' or an hour's notification before it goes on to **bleep** up, as usual.

  • What is "Motion Calibration" and why is it always running?

    I recently upgraded from a 4s to a 6, so a lot of things are new to me. But why is "Motion Calibration" running 24/7 in Location Services? Won't that nuke my battery?

    I think I might be able to answer this now, although it may be a little late in coming. The reason this setting was introduced was probably not to determine how many steps you take, but to correlate it more specifically to your distance. It must have been introduced when Apple started work on the Watch and realized they couldn't put GPS into it to measure distance running.
    Instead, blurbs on Apple's site say that the watch learns what your stride is based on a few runs with your iPhone. Because there is no third-party app the Watch can tap into in order to measure this, this setting must have been placed there in order for a user to be able to have these measurements ready for when they do purchase and pair their Watch with their iPhone. The Watch would then be immediately able to access both walking and running distance that had been measured more accurately by the iPhone for months. If you deactivate it, you can still get a very accurate step count, but the distance is a guesstimate based on your height. With Location Services enabled, Apple can measure distance much better, and this aids the fitness experience in the Apple Watch.

  • PCI-5112 Simulation and Calibration

    Hello,
    I am working in a lab where they are using the PCI-5112:
    http://www.ni.com/pdf/manuals/373495b.pdf
    I had a few questions.
    First, I would like to simulate this device on my computer using NI-DAQmx so I can play around with a program which we have on another computer with the actual hardware. However it is not included in the list when I try! Is there a way to "import" the PCI-5112 so that I can simulate it? If not, is there a similar device which someone recommends?
    Also while reading the spec for PCI-5112 I saw this:
    Calibrated Vertical Ranges; ±25 mV to ±25 V in 10% steps
    Just to make sure, this means:
    ±25mV, 27.5mV, 30.25mV, 33.275mV, etc...
    Or does it mean:
    ±25mV, 27.5mV, 30mV, 32.5mV...etc...
    As well, if I select a voltage range outside of these values, is it completely uncalibrated? Or does the driver round my input to the nearest calibrated range?
    I assume the same answers to the above question also apply to the Calibrated Offset Range? 
    Finally, I saw that the internal source should be calibrated to an external source every year, and the self-calibration every 24 hours. I know for certain my lab doesn't do the self-calibration every 24 hours, and I assume they have never done it to an external source. With our experiments however we are just concerned with measuring the decay of an exponential signal. Therefore, would calibration be a moot point?
    Thanks in advance for your help and sorry for the multitude of questions!
    Cheers

    The 5112 is a very old device and its driver is written using Traditional DAQ, not DAQmx.  This is why you cannot simulate it using DAQmx like you can most high-speed digitizers.  You can instead simulate it by using the correct options in the Initialize With Options VI instead of the simple Initialize VI.  I believe they are the default option string (it has been almost a decade since I did this).  If you try this and have problems, let us know and I can dig up the exact string.
    The 5112 does indeed have 10% increments in the voltage range setting.  The exact values vary slightly from device to device.  On the 5112, you should set the range to your anticipated need and the driver will choose the next highest range which matches your request, assuring you of the best resolution (the 5112 is an 8-bit device, so this is important).
    Self calibration can be done on demand by any user.  You can see an example of how to do this in the example niScope EX Calibrate.vi.
    Let us know if you have further questions.  Good luck.
    This account is no longer active. Contact ShadesOfGray for current posts and information.
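    For reference, a minimal C sketch of initializing a simulated 5112 through the NI-SCOPE C API with an option string (the resource name and the exact DriverSetup tokens are assumptions, not confirmed in this thread):

        #include "niScope.h"

        int main(void)
        {
            ViSession vi = VI_NULL;

            /* Initialize With Options accepts an option string;
               the plain Initialize call does not. */
            ViStatus status = niScope_InitWithOptions(
                "Dev1", VI_TRUE, VI_FALSE,
                "Simulate=1,DriverSetup=Model:5112;BoardType:PCI", &vi);
            if (status < VI_SUCCESS)
                return (int)status;

            /* Configure and acquire here as if real hardware were present. */
            niScope_close(vi);
            return 0;
        }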
