Need std and non-std baud rates on PXI-8420 (16 channel RS232)

I found a similar request in the knowledge base, and the resolution was that NI might build such a card on a case-by-case basis.
I need the aforementioned PXI card to support both standard and non-standard baud rates, easily configured in LabVIEW/VISA (i.e. not a simple crystal change where the software is set to 57.6 kbaud but the hardware actually runs at 62.5 kbaud).
The rates I need are as follows:
All standard rates (1200, 2400, 4800, 9600, ...)
10.4k (possible with standard ports, but I'm unsure about NI hardware and VISA)
62.5k
Thanks.

The golden rule in serial (UART) communication is that if two communicating partners are within ±2% of each other, they'll happily communicate error-free. Since you often know only one side of the equation, the window narrows to ±1%.
Next, take a look at how baud rates are calculated. On most NI-Serial hardware, there is a 7.3728 MHz oscillator that is divided by 16 (to create what I'll call the 'base' frequency) and then divided again by a divisor latch. A divisor latch value of 1 yields 460800, 2 = 230400, 3 = 153600, etc. This is where the 'standard' baud rates come from.
However, on some OSes we divide the clock source by an additional factor of 4 before it reaches the divisor latch. This means 1 = 115200, 2 = 57600, etc. While this does create a set of 'standard' baud rates, it also limits the number of possible rates. This is something that will be fixed in the future.
Hopefully the following table will now make sense. In systems that have a base of 460800, we support baud rates that are within ±1% of the following (baud rates marked with * are also supported in 115200-base systems):
110-9216*, 9404, 9600*, 9804, 10017, 10240, 10472*, 10716, 10971, 11239, 11520*, 11815, 12126, 12454, 12800*, 13165, 13552, 13963, 14400*, 14864, 15360, 15889, 16457*, 17066, 17723, 18432, 19200*, 20034, 20945, 21942, 23040*, 24252, 25600, 27105, 28800*, 30720, 32914, 35446, 38400*, 41890, 46080, 51200, 57600*, 65828, 76800, 92160, 115200*, 153600, 230400, 460800.
10.4k is supported in both systems at the baud rate 10472.
If you're running at 62.5k when set to 57.6k, then you're likely using an 8 MHz oscillator. In that case you have a base of 500000, and with a divisor latch value of 48 you'll be running at 10416.67 baud. This corresponds to a 'normal' baud rate setting of 9600. So if you set your modified card to 9600, you should get 10.416k.
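To make the arithmetic above concrete, here is a short Python sketch (purely illustrative; the clock values and the /16 prescaler come from the explanation above) that computes the rate a given divisor-latch value produces and checks the ±1% window:

```python
# UART baud arithmetic from the explanation above:
#   base = oscillator / 16, actual rate = base / divisor_latch.

def base_rate(oscillator_hz: int, prescale: int = 16) -> float:
    """Frequency fed into the divisor latch."""
    return oscillator_hz / prescale

def actual_baud(oscillator_hz: int, divisor: int) -> float:
    """Baud rate produced by a given divisor-latch value."""
    return base_rate(oscillator_hz) / divisor

def best_divisor(oscillator_hz: int, target: float) -> int:
    """Divisor-latch value whose rate is closest to the target."""
    return max(1, round(base_rate(oscillator_hz) / target))

def within_tolerance(actual: float, target: float, pct: float = 1.0) -> bool:
    """True if 'actual' is within pct% of 'target' (the golden rule)."""
    return abs(actual - target) / target * 100.0 <= pct

# Standard 7.3728 MHz oscillator -> base of 460800.
d = best_divisor(7_372_800, 10_400)   # divisor 44
print(d, actual_baud(7_372_800, d))   # ~10472.7 baud, within 1% of 10.4k

# Modified card with an 8 MHz oscillator -> base of 500000.
print(actual_baud(8_000_000, 48))     # 10416.67 (the 'standard' 9600 setting)
print(actual_baud(8_000_000, 8))      # 62500.0 exactly
```

Running this reproduces the numbers in the answer: divisor 44 on the 7.3728 MHz clock lands on 10472 baud, while on the 8 MHz clock divisor 48 gives 10416.67 and divisor 8 gives exactly 62500. The same check shows 62.5k is more than 5% away from anything the 7.3728 MHz clock can produce, which is why a crystal change is needed at all.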

Similar Messages

  • Non standard baud rates serial support?

    1. I have a microcontroller board sending data to an FT232R (USB-UART IC), which then appears as a COM port on the PC to communicate with LabVIEW serial VISA.
    2. I am able to communicate at standard baud rates such as 9600 or 38400 bps without any problem.
    3. I want to know whether I can also communicate at non-standard baud rates, such as 500 kbps, with LabVIEW?

    Dennis_Knutson wrote:
    Where did I say that the VISA baud rate is an enum?
    My bad - I wasn't paying attention!  No wonder I seldom attempt to correct you!
    So let's talk about serial baud rates.  LabVIEW does not have anything to do with it beyond implementing calls to the VISA API; VISA, not LabVIEW, handles serial communication.
    VISA does not limit the serial baud rate to anything other than "a positive non-zero integer" (actually, a 0 baud rate just guarantees a timeout error and is silly; negative baud rates are sillier still - think about it for a moment).
    Most hardware today detects the clock rate of the incoming TX signal and adapts its baud rate properly.
    Some legacy devices exist that were designed prior to the advent of clock recovery.  These are mostly obsolete and should be considered for replacement.
    Some modern hardware that could support clock recovery has firmware developed without support for the feature, either for "optimization" (it may run on an underpowered CPU) or because the developer has been copy-pasting the same #include for decades.  Those firmware engineers are also mostly obsolete and IMHO should be considered for replacement.
    All that being said, 500k baud is not inconceivable - but you had better watch out for noise in your cabling and inside the hardware too, including the COM port of the PC!
    Jeff

  • What is the minimum baud rate that PXI 8461 CAN support

    what is the minimum baud rate that pxi 8461 CAN support

    Hi,
    The PXI-8461 is the high-speed CAN interface. The minimum baud rate is 5 kbit/s. Refer to the following Knowledge Base article:
    CAN Physical Layer Standards: High-Speed vs. Low-Speed/Fault-Tolerant CAN
    Hope this helps.
    DiegoF.
    National Instruments.

  • RS232 baud rate

    Hello,
    I am using the javax.comm package to manage the serial port. My application runs well when I use a standard baud rate (9600, 19200...).
    I have a problem because I need to transfer data at a non-standard baud rate (in this case the default value is 9600).
    Is there a way to program a non-standard baud rate?
    Thank you

    Baud rate is a mess; it goes back to the OS. Most provide a set of constants such as
    B0
    B50
    B75
    B110
    B134
    B150
    B200
    B300
    B600
    B1200
    B1800
    B2400
    B4800
    B9600
    B19200
    B38400
    B57600
    B115200
    B230400
    which define the standard baud rates. There are sometimes ways around these; often you can just use the constant closest to what your hardware needs. See "man termios" to see what's behind javax.comm.
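Given the ±2% rule mentioned earlier on this page, a quick way to see whether one of those constants will do is to pick the closest one and compute its error. A small Python sketch (illustrative; the numeric values are the usual POSIX rates behind the B* constants, minus B0, which means hang-up — check your platform's termios header):

```python
# Numeric rates behind the usual POSIX B* baud constants (B0 omitted:
# it means "hang up", not a real rate).
STANDARD_RATES = [50, 75, 110, 134, 150, 200, 300, 600, 1200, 1800,
                  2400, 4800, 9600, 19200, 38400, 57600, 115200, 230400]

def closest_standard(target: int):
    """Closest standard rate to the target, and its percent error."""
    best = min(STANDARD_RATES, key=lambda r: abs(r - target))
    return best, abs(best - target) / target * 100.0

print(closest_standard(115000))  # (115200, ~0.17%): close enough to work
print(closest_standard(10400))   # (9600, ~7.7%): far too large an error
```

If the closest constant is within a percent or two, you can usually just use it; if not (as with 10.4k here), you genuinely need hardware or driver support for a custom divisor.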

  • Does labview 5.0 support serial communication at a baud rate of 115200?

    When I try to initialise my serial port at a 115200 baud rate I get error 32, device parameter error. I'm running LabVIEW 5.0. Can anyone help me with this?

    I use Serial Port Init.vi to set the baud rate to 115200 with no problem in LabVIEW 5.0 under Windows 95.
    I get error 38 when I try an unsupported baud rate.
    Do you get the error when you run Serial Port Init.vi directly, or only when you call it from your VI? On the diagram where you call Serial Port Init.vi, try placing a probe on the wire going to the baud rate input to see what value it's trying to set. You will have a problem reaching 115200 if the control on your VI is represented as I16 or U8, or if its data range maximum doesn't go up to 115200.
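The representation problem in that last sentence is easy to demonstrate: a signed 16-bit (I16) control tops out at 32767, so a value of 115200 cannot survive the conversion. A quick Python sketch (illustrative only, not LabVIEW code) shows what a signed 16-bit wraparound does to it:

```python
import struct

def to_i16(value: int) -> int:
    """Reinterpret the low 16 bits of value as a signed 16-bit integer."""
    return struct.unpack('<h', struct.pack('<H', value & 0xFFFF))[0]

print(to_i16(115200))   # -15872: not a usable baud rate
print(to_i16(9600))     # 9600: small rates fit fine
```

This is why the probe on the baud rate wire is worth the trouble: the value that reaches Serial Port Init.vi may not be the value shown on the front panel control.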

  • Count of Tenure and Non tenure Staff

    Hello all,
    I am facing a problem: I need to count tenure and non-tenure staff.
    In my internal table I have werks, btrtl, and staff name, along with a flag that is 'tenure' for tenure staff and 'nontenure' otherwise.
    I need to count the number of staff by tenure/non-tenure for each btrtl.
    Can you please tell me how I can find the count based on btrtl?

    Can anybody please advise on ways to get the required sum of [Measures].[Amt] and the count of the [Customer].[Customer].[Customer Key].ALLMEMBERS dimension?
    Also, does it make any difference if I handle this at the SSRS level rather than at the MDX query level?
    Hi Ankit,
    We can use the SUM function and the COUNT function to get the sum of [Measures].[Amt] and the count of the [Customer].[Customer].[Customer Key].ALLMEMBERS dimension. Here is a sample query for your reference.
    WITH MEMBER [measures].[SumValue] AS
    SUM([Customer].[Customer].ALLMEMBERS,[Measures].[Internet Sales Amount])
    MEMBER [measures].[CountValue] AS
    COUNT([Customer].[Customer].ALLMEMBERS)
    MEMBER [Measures].[DimensionName] AS [Customer].NAME
    SELECT {[Measures].[DimensionName],[measures].[SumValue],[measures].[CountValue]} ON 0
    FROM [Adventure Works]
    Besides, you asked whether it makes any difference to handle this at the SSRS level rather than at the MDX query level. I don't think it will make much difference. The total time to generate a Reporting Services report (RDL) can be divided into three elements: time to retrieve the data (TimeDataRetrieval), time to process the report (TimeProcessing), and time to render the report (TimeRendering). If you handle it at the MDX query level, TimeDataRetrieval might be a little longer. If you handle it at the report level, TimeProcessing or TimeRendering might be a little longer. You can test it on your report with the following query:
    SELECT Itempath, TimeStart,TimeDataRetrieval + TimeProcessing + TimeRendering as [total time],
    TimeDataRetrieval, TimeProcessing, TimeRendering, ByteCount, [RowCount],Source
    FROM ExecutionLog3
    WHERE itempath like '%reportname'
    Regards,
    Charlie Liao
    TechNet Community Support

  • CRIO-9014 serial baud rate

    Hello,
    I have a cRIO-9014 system and am trying to communicate with a device through the built in serial port. The device I am using requires a different baud rate than the default 9600 I find in settings under MAX. Is there a way anyone knows of to change the cRIO serial port's baud rate (under MAX the settings are grayed out).
    Thanks,
    Dan

    I was looking for this information, but my cRIO-9014 seems to be perfectly happy opening and using a baud rate of 230400, but it errors out if I try any of the (standard) settings above that. (This contradicts the answer given above that the cRIO-9014 maxes out at 115200 baud.)
    Is this dependent on the hardware revision of the cRIO-9014? My R&D unit is a newer 9014, but I'm developing a module that would be doing file transfers over RS232 for some 9014s that are a few years old, and I'm curious whether we can expect the 230400 rate to work on those like it does on my R&D 9014.
    I did not see any mention of supported baud rates in the 9014 Operating Instructions and Specifications from 2013; is there another public document that contains this information?
    And since I already did a thread resurrection: does anyone know of an NI or third-party USB-to-serial (RS485 ideally) dongle that can be accessed via NI-VISA on a cRIO-9014?
    QFang
    CLD LabVIEW 7.1 to 2013

  • Peer-Switch with vPC and non-vPC Vlan Port-Channels

    Hi,                 
    in a design guide I noticed that it is best practice to split vPC and non-vPC VLANs onto different inter-switch port-channels. However, when I try to use the peer-switch function, the port-channel interface of the non-vPC VLAN channel moves into blocking state. The spanning-tree pseudo-information option has no influence. Is peer-switch possible in my kind of topology?
    Greeting,
    Stephan

    I believe it is absolutely possible, specifically because peer-switch and spanning-tree pseudo-information are specific and local to Cisco Fabric Services running as part of the vPC technology. I personally have a lab with a vPC domain composed of two N5Ks. They are peer-switches with spanning-tree pseudo-information, and they run MST on non-vPC links independently from vPC.

  • "Baud Rate" AttrId not found for CAN frame API's ncGetAttr and ncSetAttr

    Hi,
    Well, everything is in the subject: the "Baud Rate" AttrId is not found for the CAN Frame API's ncGetAttr and ncSetAttr.
    Although it is listed in the LabVIEW documentation, it is not among the enum items obtained from right-click -> Create -> Constant on their AttrId connector.
    My actual need is to check a previously opened CANNet handle's compatibility against a hardware requirement, hence ncGetAttr.vi.
    Anyway, does somebody know the actual u32 to hardwire instead of the enum constant, or did I get something wrong?
    Thanks

    Hello,
    You are quite right about ncGetAttr.vi; however, ncSetAttr does have a Baud Rate item if you scroll further down.
    To get the Baud Rate, wire 0x80000007 to the AttrId pin. Or you can double-click ncSetAttr and go into its block diagram;
    in the case structure, select case 27, copy the numeric constant (which should be 0x80000007), and use it as the input to ncGetAttr.
    Let us know if this helps.
    Christian A
    National Instruments
    Applications Engineer

  • About "using namespace std" and tool.h++

    Hi,
    I am a new starter working on some old C++ code using tools.h++, which was previously compiled under the 4.2 compiler. After reading earlier topics in this forum, I know that in the new WorkShop 6 compiler I can use options like "-library=rwtools7,iostream" to enable both standard iostreams and classic iostreams. However, I have a problem with "using namespace std", which appears throughout the original code; even the following simple program won't compile: CC -library=rwtools7,iostream test1.cc
    File: test1.cc
    using namespace std;
    #include <rw/rwstring.h>
    #include <iostream>
    #include <string>
    int main()
    {
        RWCString s1("hello");
        string s2("world");
        cout << s1 << s2 << endl;
        return 0;
    }
    The errors are:
    Error: the name cout is ambiguous, cout and std::cout
    Error: The operation "ostream << std::basic_string<char, std::char_traits<char>, std::allocator<char>>" is illegal
    but if I comment out "using namespace std" and replace cout with std::cout, everything works fine.
    Thanks for the time.
    PS: when I use -library=rwtools7_std, I get a CC warning "illegal option -library=rwtools7_std ignored". Am I missing something on the system?

    The C++ Standard Library, which you access by using headers such as <iostream> and <string>, does not mix well with classic iostreams.
    One problem is that classic iostreams has identifiers that are the same as identifiers in standard iostreams, but in different namespaces. If you add
    using namespace std;
    to your code, you create conflicts among the names.
    In general, "using namespace std;" is a Bad Idea in real applications. The standard library has so many names that you are likely to run into conflicts with names declared elsewhere. You can add individual using-declarations for types that you need, or qualify them explicitly.
    #include <string>
    #include <list>
    using std::string;
    std::list<string> ListOfString;
    When mixing classic iostreams with the standard library, you cannot have any using-declaration or using-definition associated with names from either iostream library -- the names will conflict.
    If you want to use the standard library with Tools.h++, we recommend not using classic iostreams at all. Use the option
    -library=rwtools7_std
    to get a version of Tools.h++ that works with the standard library.
    Most simple iostream code works with both classic and standard iostreams, so this technique is worth a try. With luck, either you won't have to modify your source code at all, or the needed changes will be simple.
    You mention you are using the "new WorkShop 6" compiler. WorkShop 6 is over 5 years old and was long ago declared End of Life; no support is available for it. Maybe you mean WorkShop 6 update 2, which was released in 2001 and has just been declared End of Life. WS6u2 is the earliest version that supports the -library=rwtools7_std option.
    In any case, you should look into updating to a current compiler.
    http://www.sun.com/software/products/studio/index.xml
    The current release is Sun Studio 10, listed here. But Sun Studio 11 will be released in two weeks. Watch that space (and this forum) for an announcement.
    The C++ Migration Guide that comes with the compiler explains every issue you can run into when porting code from C++ 4.2 to C++ 5.x in standard mode. It also shows how to modify code when necessary to get the proper result.

  • Using Adobe Customization Tool XI to deploy Acrobat STD and PRO

    Hello,
    I'm using Adobe Customization Tool XI to build Adobe Acrobat STD and PRO packages.
    It seems that none of my changes stick during deployment.
    Example: I open AcroStan.msi with the tool. Make all necessary changes and generate transform file. I then exit without saving changes to AcroStan.msi file.
    After that I test the installation in C:\Test folder with the following command:
    msiexec /i acrostan.msi TRANSFORMS=test.mst
    An installation window pops up since I didn't use silent-install switches, but none of the information I provided is there; it defaults to the out-of-the-box experience, asking for a serial number as well.
    Am I doing something wrong?
    Luke

    I have created a .mst file named test and put it in C:\New Folder along with all the source files, including AcroStan.msi.
    Command line:
    c:\new folder>msiexec /i AcroStan.msi TRANSFORMS=test.mst /qb (I used /qb to watch the progress).
    It fails with the error attached below.
    We can close this subject though.
    I've started testing this on different machines and they all work fine.
    I have many VMs to test on, and the one I have picked to start with is causing the above error so I'm going to link this issue with the OS.
    Other VMs and desktops are working great.

  • HT1920 I have got a Activation Lock on my iPhone and I need to get it activated. I don't know what Apple ID I was using with my iPhone. It shows as U***@h******.uk which is too short to be my email address. I have reset all my Apple ID passwords and none

    I have got an Activation Lock on my iPhone and I need to get it activated. I don't know what Apple ID I was using with my iPhone. It shows as U***@h******.uk, which is too short to be my email address. I have reset all my Apple ID passwords and none are activating my iPhone. I have also been into the Apple Store, and they tried to find the Apple ID my iPhone was using, but this was not successful. The serial number of my iPhone is: C32JN641DTWF

    You are going to need to change the email address you use with your old ID. Once you have got access to your old account you will then log into both accounts at the same time on your Mac and transfer your data to a single account. We can do this later, but need you to get access to your old account first.
    My Apple ID

  • HT3014 my mac has the VGA port and I need a newer monitor for it. what do I use that will connect it. I had a new dell monitor and none of the things that dell said worked. so I am going to get a mac monitor for a second unit.

    my mac has the VGA port and I need a newer monitor for it. what do I use that will connect it. I had a new dell monitor and none of the things that dell said worked. so I am going to get a mac monitor for a second unit.
    my laptop is a 2008 model

    Hmmm... 2008 MBP has...
    Video (Monitor):     1 (DVI)
    Details:     Supports external display in dual display and mirroring modes. VGA output provided by included Apple DVI-to-VGA adapter, S-video output provided by optional adapter (sold separately).
    http://www.everymac.com/systems/apple/macbook_pro/specs/macbook-pro-core-2-duo-2 .5-15-early-2008-penryn-specs.html

  • Hello..i have IPhone 4S i did reset to factory settings by fault and till now after a 2 hours there is nothing just a waiting icon in the middle of the screen and none of home or turn off button are response so i do need your help plz

    Hello, I have an iPhone 4S. I reset it to factory settings by mistake, and now, after two hours of waiting (it normally takes one to two minutes at most), there is nothing but a blank screen with a spinning wait icon in the middle, and neither the home nor the power button responds, so I do need your help please. Thanks in advance.

    Try to reset it by holding the power and home buttons at the same time until you see the Apple logo, then release.  If it won't reset, you'll probably have to force it into recovery mode and restore it, as explained here: http://support.apple.com/kb/ht1808.

  • How to auto-detect a baud rate on UART or Serial port. and how to handle different baud rates for Transmission and Reception

    Hi,
    Until now I have used only a single baud rate for opening a port, writing some data, and reading it back from the other end.  This time the requirement is a little different: can the SerialPort class
    be used with different baud rates for Tx and Rx? I have not seen that in MSDN, but any other ideas are welcome too. I am experimenting!

    When I look in the Info.plist file for the AppleUSBCDCDriver kext, which is what the driver should be for a USB modem, there are three driver personalities listed: DeviceClass = 2, DeviceProtocol = 0, DeviceSubClass = 0; DeviceClass = 2, DeviceProtocol = 0, DeviceSubClass = 2; DeviceClass = 2, DeviceProtocol = 1, DeviceSubClass = 2. When I look at the info for the Phone with USB Prober, the class, protocol and subclass are all 0 and it is assigned to the AppleUSBComposite driver which appears to be a default when it doesn't know what else to use.
    Does that information come from the phone or is it assigned by the Mac? I'm wondering if the Mac doesn't recognize the phone and is assigning default values, or if the phone is responding with the wrong values causing it not to be assigned to the right driver. Are there other USB utilities out there which could possibly configure the phone's USB port to identify itself correctly?
