How to increase scan rate of NI Switch SCXI 1130

Hi,
I have an NI PCI-4070 DMM used with an NI SCXI-1130 switch module, with 10 thermocouples connected to the 1130. I am scanning the channels and reading the values in my program using the niSwitch and niDMM VIs, with a software trigger: I configured Software Trigger in niDMM Configure Trigger and niDMM Configure Multipoint. I get correct values when I scan using chi->com0, where i goes from 0 to 9, but the scan rate is very slow.
In niSwitch Configure Scan Rate.vi I have set the scan delay to 0 seconds.
It still takes one second per channel when I run the program. Why is this? Is it because I used a software trigger for each channel scan? How can I improve the scan rate?

Sorry for the confusion; I started writing a post, got interrupted, and came back to it too late. Please disregard the last post; here is my final answer:
I would actually recommend using synchronous scanning rather than software triggers if you want to maximize the speed of your scan. With synchronous scanning, the DMM generates a digital pulse (Measurement Complete) each time it completes a measurement, allowing the switch to advance to the next entry in the scan list the instant the measurement is done. The DMM then takes the next measurement after a specified hardware-timed interval. This is much more efficient than sending software triggers back and forth to time the scanning. To set up your application using synchronous scanning, follow these steps:
1. Open the LabVIEW shipping example "niSwitch DMM Switch Synchronous Scanning.vi", found in the NI Example Finder under Hardware Input and Output » Modular Instruments » NI-SWITCH (Switches).
2. Physically connect the Measurement Complete output trigger from the DMM to the trigger input of the switch.  How you do this depends on what type of chassis you are using (PXI/SCXI combo chassis or separate chassis) and what switch terminal block you're using.  If you need assistance with this, please provide more details about your hardware setup and I'd be happy to help out.  The following resource may be helpful here: KnowledgeBase 3V07KP2W: Switch/DMM Hardware Configurations.
3. Select valid values for all other front panel controls and run the VI.
I hope this is helpful.  Please let me know if I have misunderstood your application, or if you would like me to go into more detail on any specific part of the solution provided above. 
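As a rough illustration of why the handshaking matters, here is a hypothetical timing budget in Python. All of the latency numbers are assumptions for illustration only, not measured values for this hardware:

```python
# Hypothetical per-channel timing budget (every number here is an assumption)
channels = 10
measure_time = 0.005           # DMM aperture + settling, seconds (assumed)
sw_trigger_roundtrip = 0.050   # software trigger through the OS/driver stack (assumed)
hw_trigger_latency = 1e-6      # Measurement Complete pulse to the switch (assumed)

# Software-triggered scanning pays the round trip on every channel;
# synchronous scanning replaces it with a hardware pulse.
software_scan = channels * (measure_time + sw_trigger_roundtrip)
synchronous_scan = channels * (measure_time + hw_trigger_latency)
print(round(software_scan, 3), round(synchronous_scan, 3))
```

Even with these generous guesses, the per-channel software round trip dominates the total scan time, which is why moving the handshake into hardware speeds things up so dramatically.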

Similar Messages

  • How to set scan rate with NI Switch scan voltages

    Hi
    I have an SCXI-1130 switch and an NI 4070 DMM, with 3 voltage channels connected on the SCXI.
    When I read just one channel at a time, I get a correct voltage reading; here I gave the scan input as ch0->com0.
    Later, I placed a For Loop in the block diagram and programmatically wired the scan-channel input
    for the switch, reading the voltage output from the DMM, but I do not get correct outputs.
    That is, for my 3 channels I set the For Loop iteration count to 3 and appended 'i' to form chi->com0, and the DMM measurement is not correct. But if I highlight execution in the block diagram (the light bulb, with a probe), I can see the correct output voltages coming out. The moment I turn execution highlighting off, the program gives incorrect output. So do I have to give a scan delay, or what time has to be set to get correct values? I am using a software trigger in the block diagram.

    Hi Hema,
    CJC is an acronym for Cold-Junction Compensation; this value adjusts for the change in voltage caused by the junction between the thermocouple wire and the copper wire.
    For example, a J-type thermocouple will have thermocouple wire consisting of iron and constantan metals.  When these iron and constantan metals meet the copper at the switch connection, a difference in voltage results.  This difference in voltage is the "cold-junction".  The difference in voltage resulting from the iron and constantan connection in the thermocouple is the "hot-junction".  When you measure temperature using a thermocouple, what you desire is the "hot-junction" change in voltage.  Unfortunately, the DMM is going to measure the sum of both the "cold" and "hot" junctions, and a CJC measurement is needed so we can adjust the measurement to remove the undesired offset.
    Once Cold-Junction Compensation is performed, converting from voltage to temperature is fairly simple.  Each thermocouple type has its own temperature to voltage conversion equation and associated coefficients.  Here's a great resource for the equations, coefficients, and specific voltage to temperature tables:
    NIST ITS-90 Thermocouple Database
    http://srdata.nist.gov/its90/main/
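As a rough sketch of the arithmetic Chad describes, here is a linear approximation in Python. The Seebeck coefficient of about 52 µV/°C for type J near room temperature is an assumption for illustration; a real conversion uses the NIST ITS-90 polynomials linked above:

```python
# Linear sketch of cold-junction compensation (illustrative, not NIST-accurate)
SEEBECK_J = 52e-6  # V/degC, approximate type-J sensitivity near room temp (assumption)

def emf_at(temp_c):
    # Linear approximation of type-J EMF relative to a 0 degC reference
    return SEEBECK_J * temp_c

def compensated_temperature(v_measured, t_cjc):
    # The DMM sees hot-junction EMF minus cold-junction EMF; add the
    # cold-junction contribution back, then invert to temperature.
    v_hot = v_measured + emf_at(t_cjc)
    return v_hot / SEEBECK_J

# Example: DMM reads 2.6 mV, CJC sensor reports 25 degC at the terminal block
print(round(compensated_temperature(2.6e-3, 25.0), 1))  # -> 75.0
```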
    Hope this helps!
    Chad Erickson
    Switch Product Support Engineer
    NI - USA

  • How to search/Scan Vlan of cisco switch ports

    Can anyone tell me how I can scan/search the VLANs of Cisco switch ports through a monitoring tool (Orion/SolarWinds)?
    Consider this scenario: I have no access to the switch, and I want to know the following:
    1. Which VLANs are created on the switch?
    2. Which switch port belongs to which VLAN ID?
    Thanks

    Hi,
    You can only do it with a hub in between. Also note that when sniffing with Wireshark on Windows, the OS removes the VLAN tag, so you may need to use a Linux machine.
    Regards,
    Aleksandra

  • How to increase SPAN sessions on 6509 switch?

    Hi, I am using a WS-C6509-E switch with a Supervisor Engine 720 10GE (VS-S720-10G) and IOS sup-bootdisk:s72033-advipservicesk9_wan-mz.122-33.SXH3.bin.
    Please let me know:
    1. What is the limit of SPAN (ingress/egress) sessions in this scenario?
    2. How can I increase the number of SPAN (ingress/egress) sessions?
    Jeet!!!

    Please see the attached document. Here is the link it was retrieved from:
    http://www.cisco.com/en/US/docs/switches/lan/catalyst6500/ios/12.2SX/configuration/guide/span.html#Local_SPAN,_RSPAN,_and_ERSPAN_Destinations
    Summary (PFC3):
    - Total sessions: 80
    - Local and source sessions: 2
    - Local SPAN egress-only sessions: 14
    - Destination sessions: RSPAN 64, ERSPAN 23
    Router(config)# monitor session 1 type local
    Router(config-mon-local)# source interface gigabitethernet 1/1 rx
    Router(config-mon-local)# destination interface gigabitethernet 1/2

  • How to increase scan resolution above 300 dpi on Officejet Pro 8620 e-AIO?

    Hi everyone, I wonder if anybody can help me with this, as I haven't found any info in the product support options ...
    I've just checked the technical specs for my relatively new printer (bought about 3 weeks ago), and on HP's product page, it says that the scan resolution of said model should be up to a max. of 1200x1200 dpi.
    However, in the scan window there's no selection above 300 dpi anywhere. I've tried all kinds of different scenarios, e.g. scanning a colour photo to JPEG/PDF or a colour doc to JPEG/PDF, but whatever options I select, the highest I can go is 300 dpi. File size is not an issue; this is about the quality of the scanned document, and at the moment I'm not happy with the results I'm getting at 300 dpi. I've played with additional options, such as Brightness/Contrast, but that's not giving me the desired results either. And besides, it's about principle: if it states a higher resolution, where is it?
    I'm attaching a screenshot, if that's of any help. Shouldn't I see more dpi options in the drop-down menu, i.e. 600 dpi, 1200 dpi?
    I'll be really grateful for any insight you might have on this issue! Many thanks in advance.
    Connie

    @teutonica 
    Thank you for using HP Support Forum. I have brought your issue to the appropriate team within HP. They will likely request information from you in order to look up your case details or product serial number. Please look for a private message from an identified HP contact. Additionally, keep in mind not to publicly post serial numbers and case details.
    If you are unfamiliar with the Forum's private messaging please click here to learn more.
    Thank you,
    Omar
    I Work for HP

  • How do I achieve a time stamp, in the first column of my spreadsheet in Excel, showing the time interval, increased by the scan rate?

    From the AI Start VI to the Format And Append VI to Concatenate Strings: the second input is a tab, then every other one is a tab. The third and fifth inputs are attached to the Get Date/Time VI, then End of Line, then Channel 0, tab, Channel 1, tab, and so on through Channel 7. The last input is End of Line. On the spreadsheet I receive the scan rate, then the date, then the time, all on the first row. The next row has the column labels, and data goes to the correct columns, but there is no time-stamp column. I would like it to be in column "A", even with the first row of data. Example:
    10:01:01 200 300 400
    10:01:03 200 300 400
    Thank you for your help, Joe Bob Crain

    I think the best way is to generate an array of time values and send it to your spreadsheet BEFORE the Channel 0 data.
    To generate the array you can use Ramp Pattern.vi, from 0 to 'Actual Scan Rate' (from AI Start) * the number of scans performed.
    That way your spreadsheet will have the first line the same as now, followed by columns for time (secs), channel 0, channel 1, ... channel 7.
    If you need the time column as a date-and-time string, use Get Date/Time in Seconds.vi when you begin the acquisition, add the seconds obtained to the array calculated above, use Format Date/Time String.vi to obtain the time stamp you need, then build an array of the time stamps and send it to your spreadsheet.
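Roberto's recipe can be sketched in Python roughly as follows. NumPy stands in for Ramp Pattern.vi and the spreadsheet write, and all values are illustrative:

```python
import io
import numpy as np

scan_rate = 2.0   # scans per second (example value)
num_scans = 4
t0 = 0.0          # seconds at acquisition start (e.g. from Get Date/Time in Seconds)

# Time of each scan: 0, 1/rate, 2/rate, ... (what Ramp Pattern.vi builds)
times = t0 + np.arange(num_scans) / scan_rate

data = np.arange(num_scans * 8).reshape(num_scans, 8) / 10.0  # stand-in for 8 channels
rows = np.column_stack((times, data))                          # time stamp lands in column A

buf = io.StringIO()  # stands in for the tab-delimited spreadsheet file
np.savetxt(buf, rows, delimiter="\t", fmt="%.3f")
```

Prepending the time array before the channel data is exactly what puts the stamp in column "A", aligned row-for-row with the scans.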
    Hope this helps
    Roberto
    Proud to use LW/CVI from 3.1 on.
    My contributions to the Developer Zone Community
    If I have helped you, why not give me a kudos?

  • How to calculate min/max scan rate of PCI-MIO-16XE-50

    My board: PCI-MIO-16XE-50, 20 MHz timebase.
    The min/max scan rate is 0.00596 to 20K scans/s and the channel rate is 1.53 to 20K channels/s.
    How are these figures calculated?

    Hi May
    Check out the KnowledgeBase article located at http://digital.ni.com/public.nsf/websearch/EA4E306828AF05B586256B8F00661A8B?OpenDocument . I think it answers your question.
    Serges Lemo
    Applications Engineer
    National Instruments

  • How to set the scan rate in the example "NI435x.vi" for the 435x DAQ device?

    I am using LabVIEW 6i and a 4351 Temperature/Voltage DAQ, and am working from the example VI "NI 435x thermocouple.vi".
    How do I set the scan rate in this VI? I added a counter just to see the time between acquisitions, and it is roughly 4 s for only 4 channels. The notch filter is set at 60, but is there a control for scan rate in this example? I'd like to acquire at around 1 Hz.

    Using this DAQ example, it may be easiest to simply place a wait function in the loop and software-time when the data is extracted from the board. You can also take a look at the NI-435x palette and the examples it ships with, along with the timing specs on page 3-3 of the 435x manual, and set the filters to meet your needs:
    http://www.ni.com/pdf/manuals/321566c.pdf
    Regards,
    Chris
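Chris's wait-in-the-loop idea looks roughly like this in Python, where `read_channels` is a hypothetical stand-in for the 435x read:

```python
import time

def read_channels():
    # Stand-in for the 435x read; one scan of 4 channels (hypothetical values)
    return [0.0, 0.0, 0.0, 0.0]

interval = 0.1             # pacing period in seconds; use 1.0 for ~1 Hz
next_tick = time.monotonic()
scans = []
for _ in range(3):
    scans.append(read_channels())
    next_tick += interval
    # Sleep until the next absolute deadline so loop jitter doesn't accumulate
    time.sleep(max(0.0, next_tick - time.monotonic()))
```

The hardware still converts at its own filtered rate; the wait only paces how often your loop pulls a reading, which is all the original question needs.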

  • How to increase my transmission rate

    Hey everyone,
    I have two AirPort Extremes and one new "n" AirPort Express. When I check AirPort Utility, I find a max transmit rate of 54 (all three are running at this now). Any advice on how to increase this? I can't figure out why it's just 54. I have a 17" MacBook Pro.
    Thanks

    Yes. Actually it was in WDS mode, which seemed to slow it way down (to 54). When I changed it to extend-network mode, it picked right up. Does anyone have an explanation?

  • How to increase rate at which data is written to file?

    I have a program that reads data at 200 kHz, and I would like it to write the data at the same rate. However, the fastest I seem to be able to write is about 5 kHz. I have been using the DAQmx Read and Format Into File functions to read and write the data.
    I have tried moving the write function into a separate loop, so that the data would be written to file after the data collection was complete, but this did not change the write rate. Any suggestions for increasing the rate at which data is being written to a file? Thanks for looking over my problem!
    Attachments:
    SampleWrite_Read.vi 58 KB

    Well, writing to a file is always slower, since it takes some time to access the hard drive. I noticed that your program writes to an ASCII file; that is also slower than writing to a binary file. Several examples that ship with LabVIEW do High-Speed Datalogging (I believe that is actually the name of the examples). Those examples come in pairs: one does the datalogging, and another helps you read the file back. I recommend taking a look at them.
    The previous suggestion by Les Hammer is a great idea: instead of acquiring 1 sample at a time, try acquiring 100 or 1000 samples and writing them to the file in one go.
    I hope that helps!
    GValdes
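The chunked binary-logging suggestion can be sketched like this; `read_chunk` is a hypothetical stand-in for a buffered DAQmx Read, and a BytesIO stands in for the log file:

```python
import io
import numpy as np

CHUNK = 1000  # samples per read/write instead of 1 at a time

def read_chunk():
    # Stand-in for a DAQmx Read returning CHUNK samples (hypothetical)
    return np.zeros(CHUNK)

log = io.BytesIO()  # stands in for open("log.bin", "wb")
for _ in range(5):
    # Raw binary write: no per-sample text formatting, one disk access per chunk
    log.write(read_chunk().astype(np.float64).tobytes())

# 5 chunks * 1000 samples * 8 bytes/float64 = 40000 bytes
```

Two things make this fast: the write cost is amortized over a whole chunk, and the samples go to disk as raw floats instead of being formatted into ASCII text.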

  • How to increase the battery life of your N series ...

    What I am about to post here is valid for any 3G phone or device regardless of model but it is particularly focused towards the N series devices and their power hogging features.
    Your battery life depends on many things: how often you take calls on the device, the condition of your battery, the features you use, and so on. Therefore it is impossible to say that by following the information in this post you will get x days of battery life, but it will get you more time out of the battery than you otherwise would have.
    So with that out of the way, if you're looking to increase your battery life, follow these tips and your battery should start looking a lot healthier.
    First of all, let's start with THE big one, the one that is going to save you the most juice: switching 3G off.
    Yep, you heard me right. Just by switching the 3G capability of your phone off you will add hours and hours to your battery life. How is this so? Allow me to explain...
    Due to the rather poor delivery of 3G in the UK by the network operators, it is rare for any 3G phone to maintain a constant 3G signal. Instead you will find that the phone constantly flips between 3G and GSM mode (Keep an eye on your signal one day). Even those of you on Vodafone who probably have the best 3G network coverage will find this is the case.
    Unfortunately, this constant flipping between the two modes sucks power from the battery like a vampire, as the phone alters its reception state for the different modes, and the constant flipping means it is doing this constantly! It can sometimes even make your phone unavailable for calls for very brief periods as it trips from GSM to 3G and vice versa.
    If you need to use 3G for video calls or whatever, then I'm afraid you're just going to have to live with this, but if you don't (and let's face it, few of us do) then you can switch 3G off and increase your battery life considerably.
    To do this, go into the "Settings" application (found in the menu somewhere; by default Nokia normally stick it in "Tools"), then to the "Phone" tab. In there you will see an option that says "Network mode", with a choice of "GSM" or "Dual Mode" (i.e. UMTS and GSM). Set it to GSM and your phone will restart. Once it restarts it will work in GSM with GPRS speeds only, but for most purposes this is fine.
    You have now just extended your battery capability considerably. You can further extend it by going to the "Connection" tab, going into "Packet data" and changing it to "When needed" so it is not constantly checking for a data connection.
    The second big change you can make is to turn your phone's wifi scanning capability off. The last time I looked, not all of Nokia's wifi-capable phones can have their wifi cards switched off entirely, but if yours can, turn it off except when you need to use it. Wifi is a power hog.
    The next big change you can make is to lower the screen brightness. The less bright your screen, the less power is used to light it up. Nokia by default leave the brightness at something like 50%; lowering it a bit more will conserve more juice. Before you do this, though, consider that lowering the brightness will make it harder to see the screen clearly in sunny conditions, although you will be fine in the dark, as you can't lower it that far.
    To lower the brightness, go to the Settings tool on your phone and into the display option (hidden in a subcategory called "Personalisation" on the N95). It won't hurt to set the power-saving timeout to 1 minute and the backlight timeout to 10 seconds while you're here (although these are the Nokia defaults, so they should already be set).
    Finally in regards to the screen, although they may look pretty, animated screensavers use more battery power than the standard blank screen with time and date so avoid them if you can.
    It also helps to keep Bluetooth switched off until you need it although the power savings are minimal in comparison to the other changes but every little milliamp counts!
    Using the above methods I generally get about 3 to 4 days with about 3 hours talktime on my N95 without using Bluetooth, GPS or anything like that (I might be able to get more but so far I have not paid attention to the battery state before I put it on charge). If I am on a long train journey I can get about 4 hours worth of full screen video and about 2 hours talktime over the period of about 24 hours before it needs a recharge. As I said at the start of the post your mileage will vary greatly depending on how you use your device.
    Hope this helps.
    Useful links: Phone firmware update | Nokia support site

    02-May-2007 01:14 PM
    bixby wrote:
    no keffa, it is a cop-out from nokia
    it's not unfair as it's a premium device with a premium price
    the n95 battery is atrocious
    don't change the post content as the title is 'How to increase the battery life of your N series device'
    you're talking about nokia phones specifically
    the networks are not to blame
    they do not make the handsets: Nokia do!!!!!!!!!!
    I'm going to choose my words carefully here...
    I would never deny that the battery on the N95 is not really up to the job of powering its power-hungry features. Putting the same battery that goes into the E65, which has comparatively nothing, into a phone with WiFi, GPS and a large 320x240 screen is a bit pants.
    However at no point was I criticising them for the band hopping problem. I labelled the post as how to increase the battery life of your N series device because this is a board for the N series devices. It was a simple choice of wording and not intended to be cutting in any way and I did make a remark that the details would be true of any 3G device at the top of the post.
    What I was trying to point out in my second post is that the constant band hopping the phone is forced to do, which drains its battery so much more quickly than a constant signal of one kind or another would, isn't really Nokia's fault.
    They build the phone to conform to a laid-out specification for 3G. If the network operators cannot be bothered to roll out their 3G infrastructure well enough for the phone to find and stay locked onto a usable 3G signal, what are Nokia to do other than offer you the capability to turn 3G off until you need it? (Although, note to Nokia: that **bleep** reboot the phone does when you do this is entirely unneeded, and you know it.)
    Blaming Nokia for this would be like blaming the manufacturer of your radio for failing to pick up a station whose transmitters are nowhere within range of your receiver.
    Finally, this band hopping is exhibited by all 3G phones, built by Samsung, Nokia, Sony Ericsson, etc., from the most budget 3G model to the priciest, and it is the reason that no 3G phone's battery lasts a respectable length of time: they are all having to band-hop between 3G and GSM.
    The proof is in the pudding. Turn 3G off for a few days and watch your battery improve. Then (although admittedly this will be harder to do... much harder) find an area where you get a fairly decent, constant 3G signal. Again, watch your battery improve. Try it with a different 3G phone, even a different manufacturer. The same will be true.
    So I stand by my comment: the network operators and their woeful 3G rollout are the villains costing you a fair chunk of your battery, and Nokia cannot be expected to mitigate this... but a better battery would be nice all the same.
    Useful links: Phone firmware update | Nokia support site

  • Keithley 2701 LV driver: Setting the scan rate?

    I downloaded the Keithley 2701 LV driver, and the example programs work great. There is just one question I have: I cannot find anywhere in the examples or in the LV drivers how to change the scan rate or scan interval. I can change everything else, such as the number of points, channels, voltage levels, etc., but I can't figure out how to change my scan rate to once every 0.5 or 1 second.
    I'm not sure, maybe this is a question for Keithley but since it is a LV driver I decided to post here first.
    I'm using LV8.2
    Jeff

    I am using the Keithley 2700 with the 7700 cards.  There is a VI named "ke27xx Configure Aperture Timing.vi" in "ke27xx.llb".
    The VI sets the integration time for the A/D converter.  I was measuring 500 uV and needed to improve my accuracy, so I increased my integration time.
    Brian
    LV 8.2
    Brian

  • SCXI-1125 scan rate controlling

    The SCXI-1125 scan rate is 333 kHz in multiplexed mode. Does anyone know how to slow the scan rate?
    regards.
    naushica

    According to the web page http://digital.ni.com/public.nsf/allkb/410A70C25A4D12B486256A1E0070BDAE,
    you lose resolution as you increase the scan rate. Is that right?
    If the scan rate is 100 kS/s and the input range is +-5 V with 14-bit resolution, does that mean I have a noise floor of 600 uV for that range?
    If I want to measure a signal in the +-1 V range, does that mean the noise floor is around 120 uV?
    The absolute accuracy table, on the other hand, states an accuracy for each full-scale range but says nothing about scanning speed.
    For example, the data sheet of the 18-bit ADC module PXI-6289 states that the +-5 V range has a full-scale accuracy of 510 uV. Converted to bits, that is less than 15 bits.
    So if I reduce the scanning speed to about 50 kS/s, will I get 16-bit accuracy?
    I have a SCXI-1125 + PXI-6289 installed in a PXI-1052 combo chassis.
    Please help on this matter.
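For what it's worth, the ideal-LSB arithmetic behind those noise-floor figures can be checked directly. Note that an absolute-accuracy spec such as 510 uV is a different quantity from the ideal code width, since it also bundles in noise, offset, and gain error:

```python
def lsb(span_volts, bits):
    # Ideal code width of an N-bit converter over the given input span
    return span_volts / 2 ** bits

print(lsb(10.0, 14))  # +-5 V at 14 bits -> ~6.1e-04 V (about 610 uV, the "600 uV" above)
print(lsb(2.0, 14))   # +-1 V at 14 bits -> ~1.2e-04 V (about 122 uV, the "120 uV" above)
print(lsb(10.0, 18))  # +-5 V at 18 bits -> ~3.8e-05 V ideal; the 510 uV absolute-accuracy
                      # spec is larger because it includes noise, offset and gain error
```

So comparing an absolute-accuracy number against the ideal LSB will always make the converter look like it has fewer "effective" bits; that is a property of the spec, not of the scan rate alone.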

  • Scan rate and samples for FFT affecting dt

    Hi,
    I currently acquire data from an accelerometer and set the scan rate of the clock to 5120 S/s. I then use Analog 1D Wfm NChan NSamp to acquire 5120 samples and run FFT analysis on them to create a frequency chart. This gives me a df of 1, i.e. a resolution of 1 Hz.
    My question is how I can sample the data with a df of 1 at a faster rate.
    For example, if I change the scan rate to 10240 and still acquire 5120 samples, I can sample twice a second, but I get a df of 2 (2 Hz resolution) when I do my FFT analysis. All that upping the sample-clock rate does is let me detect peaks at higher frequencies.
    By this I mean that if I scan at 5120 S/s, the FFT goes up to approx. 1100 Hz; if I double the scan rate, it doubles to approx. 2200 Hz, but the resolution is reduced.
    How can I improve the update speed and still keep a df of 1?
    Thanks
    Mike

    Hi,
    Unfortunately I don't think I am quite making clear what I mean. Apologies, so I'll try again...
    If I acquire the data at 5600 S/s and my number of samples is also 5600, then when I do the FFT peak spectral measurements my df is equal to 1. As I understand it, that means I have a resolution of 1 Hz on my FFT graph/data.
    This is all fine, but if I want to update the graph, or indeed save the data to disk, at say 5 Hz, how do I do this and still keep the FFT resolution at 1 Hz?
    If I change the S/s of the clock, all that does is increase the frequency range the FFT covers: at 5600 S/s my FFT goes up to 2550 Hz on its x-axis; at 10200 S/s it goes up to 5100 Hz.
    So if I keep the same number of samples in my acquired waveform, I can get it to run twice a second with a sampling rate of 10200 and 5100 samples, BUT my resolution goes down to 2 on the FFT chart.
    How can I update faster at the frequencies I want yet still keep the FFT resolution at 1? I almost want to set the sampling rate per tenth of a second, I suppose, to update at a faster rate, but this isn't possible... is it?
      Many thanks
      Mike
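One common way to update faster while keeping df = 1 Hz is overlapped processing: each FFT still covers a full second of data (N = fs samples), but a new FFT is started every 1/5 s on the most recent N samples. A NumPy sketch, with illustrative rates and a test tone:

```python
import numpy as np

fs = 5600          # sample rate, S/s
N = fs             # FFT length -> df = fs / N = 1 Hz
hop = fs // 5      # advance 1/5 s of new data per FFT -> ~5 updates per second

# 3 s of a pure 50 Hz tone as a stand-in for the accelerometer signal
signal = np.sin(2 * np.pi * 50 * np.arange(3 * fs) / fs)

for start in range(0, len(signal) - N + 1, hop):
    window = signal[start:start + N]                 # always a full second of data
    spectrum = np.abs(np.fft.rfft(window)) / N
    peak_hz = np.argmax(spectrum[1:]) + 1            # bin index equals Hz since df = 1
print(peak_hz)  # -> 50
```

The trade-off is that successive spectra share 4/5 of their samples, so updates are correlated, but the resolution stays 1 Hz because every FFT still spans one full second.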

  • How to increase the performance in server 2008 R2 for RDP users

    Hi,
    My application takes too much time to load. If anyone double-clicks the mail client, the exe appears in Task Manager, but the window only opens after 5 minutes. How can I increase the performance?
    My server configuration is as below:
    SC2600 Intel motherboard with 24 processor cores in total, 32 GB RAM, and 8 TB of disk. RAID 5 is configured with two LUNs: 167 GB for the C drive and 4.5 TB for the D drive.
    There are 28 thin clients connected to the server through NComputing L300 thin clients.
    The thin clients connect to the vSpace server installed on the server for RDP users.
    We have installed around 20 applications, including printer and scanner drivers, as below:
    Firefox, Windows Mail, Adobe Acrobat XI, Canon printer and scanner drivers, Epson printer and scanner drivers, eScan anti-virus, Office 2007, vSpace, PowerISO, WinRAR, Tally and eToken drivers, and some backup software.
    Below are the services and features enabled:
    AD, File Services, RDP, web server, Hyper-V, .NET Framework.
    Is there a way to increase the performance? It is very slow.

    Hi,
    What would you suggest the hardware configuration should be for the above-mentioned applications and services with that many users?
    How many cores and how much RAM are required?
