Serial baud rate vs. high speed CAN

The bottleneck in my current app is the serial writes at 115200 baud to a custom device. The program dramatically improves in speed as the baud rate is increased. If I remove the serial write and read, the program loop rate improves from 100 to 2000 Hz.
The device also has a CAN interface. I'm debating trying the NI CAN interface card to see if there is an improvement in speed, but I want an opinion first, since this will require considerable reprogramming. The CAN card appears to be just two serial ports, yet according to the specs it appears faster.
http://zone.ni.com/devzone/cda/tut/p/id/2732
It's difficult to compare serial and CAN speeds. CAN runs at up to 1 Mbit/s, while 115200 is a baud rate. After considerable research comparing the two, I am still having difficulty. I'm not sure these comparisons are even valid, since I'm more interested in sending one character and then receiving one back, rather than the continual streaming of data that these rates measure.

This is a custom servotube powered by an amplifier. The servo has a Renishaw encoder for 1 µm resolution. The application requires reading the position of the Renishaw encoder every iteration. This positional information is then logged with the data. The position is independent of the feedback loop, but it is important for the research. The feedback loop itself is driven by a strain gauge and a 9237 module, which measure the loading forces on the servotube; this is converted to an analog output value via PID and then sent through an NI analog output module to the differential input on the amplifier. The differential inputs on the amplifier generate a current (I) for the servotube based on the voltage they receive.
The bottleneck only has to do with the serial port. Reading the position can be done over either the serial or the CAN interface. Using the serial port, the command to request the position and the read to receive it back can occur at a rate of around 350 iterations per second at the highest baud rate, 115200. That 350 iterations/s is with nothing in the loop other than the VISA read and write of the serial port, with no delays. It's the best-case scenario; with the additional code in place, we are getting more like 100 iterations per second.
So the baud rate doesn't seem to be the limit? Is it really a per-transaction delay each time the communication is initiated? Is that why the CAN interface might be better? Programming the NI CAN card seems a little tricky, and I want to be sure it will offer an improvement before jumping into this task.
Maybe NI has a chip or module that could read the Renishaw encoder position directly on one of the NI A/D cards we already have on the system? That would remove the bottleneck.
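To sanity-check whether the wire speed or the per-transaction turnaround is the limit, a rough calculation helps: at 115200 baud with 8N1 framing each byte takes about 87 µs on the wire, so a query/response of roughly a dozen bytes accounts for only about 1 ms, yet 350 iterations/s implies about 2.9 ms per round trip. The sketch below only illustrates that timing argument, written in Python with PyVISA rather than LabVIEW; the resource name "ASRL1::INSTR", the "P?" position command, and the 12-byte transaction size are assumptions, not the actual device protocol.

    import time
    import pyvisa  # PyVISA stands in here for the LabVIEW VISA Read/Write calls

    BAUD = 115200
    BITS_PER_BYTE = 10                     # 8 data bits + start + stop (8N1)
    BYTES_PER_TRANSACTION = 12             # assumed size of query plus reply

    wire_time = BYTES_PER_TRANSACTION * BITS_PER_BYTE / BAUD
    print("time on the wire per transaction: %.2f ms" % (wire_time * 1e3))   # ~1.0 ms
    print("measured round trip at 350 it/s:  %.2f ms" % (1e3 / 350.0))       # ~2.9 ms

    # Optionally time the real query/response loop (resource name and command assumed)
    rm = pyvisa.ResourceManager()
    dev = rm.open_resource("ASRL1::INSTR", baud_rate=BAUD,
                           write_termination="\r", read_termination="\r")
    n = 1000
    t0 = time.perf_counter()
    for _ in range(n):
        dev.query("P?")                    # hypothetical "get position" command
    print("average round trip: %.2f ms" % (1e3 * (time.perf_counter() - t0) / n))
    dev.close()

If the measured round trip stays far above the time on the wire, the limit is the turnaround and driver latency per transaction rather than the baud rate, which is also what would determine whether a CAN remote-frame query is actually faster.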

Similar Messages

  • How do I set a low baud rate using a High Speed CAN board?

    I have a PCI-CAN/2 board, and I want to use one channel for a 500 kbit/s network and one channel for a 33.333 kbit/s network, but I keep getting a baud rate error. Is it possible to use such a low baud rate, or is there a lower limit?
    Thanks

    I would recommend posting CAN-related questions to this board.
    According to page E-2 of the user manual, the minimum baud rate for a PCI-CAN high-speed CAN interface is 40 kbit/s.

  • What is the minimum baud rate that PXI 8461 CAN support

    What is the minimum baud rate that the PXI-8461 can support?

    Hi,
    The PXI-8461 is the high-speed CAN interface. The minimum baud rate is 5 kbit/s. Refer to the following Knowledge Base article:
    CAN Physical Layer Standards: High-Speed vs. Low-Speed/Fault-Tolerant CAN
    Hope this helps.
    DiegoF.
    National Instruments.

  • cRIO-9014 serial baud rate

    Hello,
    I have a cRIO-9014 system and am trying to communicate with a device through the built-in serial port. The device I am using requires a different baud rate than the default 9600 I find in the settings under MAX. Does anyone know of a way to change the cRIO serial port's baud rate (under MAX the settings are grayed out)?
    Thanks,
    Dan

    I was looking for this information, but my cRIO-9014 seems perfectly happy opening and using a baud rate of 230400, though it errors out if I try any of the (standard) settings above that. (This contradicts the answer given above that the cRIO-9014 maxes out at 115200 baud.)
    Is this dependent on the hardware revision of the cRIO-9014? My R&D unit is a newer 9014, but I'm developing a module that would be doing file transfers over RS-232 for some 9014s that are a few years old, and I'm curious whether we can expect the 230400 rate to work on those like it does on my R&D 9014.
    I did not see any mention of supported baud rates in the 9014 Operating Instructions and Specifications from 2013, and I'm curious whether there is another public document that contains such information.
    And since I already did a thread resurrection: does anyone know of an NI or third-party USB-to-serial (RS-485 ideally) dongle that can be accessed via NI-VISA on a cRIO-9014?
    QFang
    CLD LabVIEW 7.1 to 2013
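    Since MAX grays the settings out, the baud rate generally has to be set programmatically on the VISA session each time the port is opened (in LabVIEW, via the 'Serial Baud Rate' VISA property or VISA Configure Serial Port). A minimal sketch of the same idea in Python with PyVISA, where the resource name "ASRL1::INSTR" and the device command are assumptions:

    import pyvisa

    rm = pyvisa.ResourceManager()
    port = rm.open_resource("ASRL1::INSTR")        # cRIO's built-in serial port (assumed name)
    port.baud_rate = 230400                        # same effect as the 'Serial Baud Rate' property
    port.data_bits = 8
    port.stop_bits = pyvisa.constants.StopBits.one
    port.parity = pyvisa.constants.Parity.none
    port.write_termination = "\r\n"
    port.read_termination = "\r\n"
    port.write("ID?")                              # hypothetical command for the attached device
    print(port.read())
    port.close()

    Whether 230400 is actually accepted will still depend on the UART in the particular controller revision, which is exactly the question QFang raises above.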

  • PCMCIA 2-port high speed CAN - LabVIEW programming

    Hi to all,
    I am new to CAN based applications.
    I have few basic questions
    1.) Say I am sending three messages. Message 1 is to be sent every 10 ms, message 2 every 20 ms, and message 3 every 50 ms.

    See post here:
    http://forums.ni.com/ni/board/message?board.id=30&message.id=1531
    Michael Chaney
    Systems Engineer - TestStand
    National Instruments
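    For the periodic-transmission part of question 1, the usual pattern is to hand each frame to the driver with its own transmit period rather than timing the loop yourself. A hedged sketch using the python-can library instead of the NI-CAN LabVIEW API; the interface name "nican", channel "CAN0", bit rate, and arbitration IDs are all assumptions:

    import can

    # Open the NI high-speed CAN port (interface name, channel, and bit rate are assumptions)
    bus = can.Bus(interface="nican", channel="CAN0", bitrate=500000)

    msg1 = can.Message(arbitration_id=0x101, data=[0] * 8, is_extended_id=False)
    msg2 = can.Message(arbitration_id=0x102, data=[0] * 8, is_extended_id=False)
    msg3 = can.Message(arbitration_id=0x103, data=[0] * 8, is_extended_id=False)

    # The driver re-sends each frame on its own period in the background
    task1 = bus.send_periodic(msg1, period=0.010)   # every 10 ms
    task2 = bus.send_periodic(msg2, period=0.020)   # every 20 ms
    task3 = bus.send_periodic(msg3, period=0.050)   # every 50 ms

    # ... run the application; update a task's data if a payload changes ...

    for task in (task1, task2, task3):
        task.stop()
    bus.shutdown()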

  • High speed and Low Speed CAN

    Hi
    This is my first time to use CAN.
    I have LabVIEW  2010 and  LabVIEW FPGA
    I have NI 3110 Dual-Core Industrial Controller with Windows OS, NI 9157, NI 9852 CAN Low Speed Module and NI 9853 CAN High Speed Module to be used to monitor the CAN messages from an ECU unit.
    The connection is done as the following:
    ·         The High speed CAN is connected to the NI-CAN 9853 (Port 0)
    ·         The low speed CAN is connected to the NI-CAN 9852 (Port 0)
    ·         The baud rate of the 9853 is 500kbps
    ·         The baud rate of the 9852 is 50kbps
    ·         Both modules are set to “Listen Only”
    ·         “Module Clock” in both modules is set to 20MHz
    ·         CAN networks are 29-bit and 11-bit.
    My concern is to read all the messages from both modules without losing any message. Below is a picture from the block diagram on the FPGA target. Is this the best way to read CAN messages from both modules? Also, could you please suggest the best way to read the messages in the host VI?
    Unfortunately, I do not have the modules right now to do any tests, so I cannot get the iteration times ("CAN Low Speed Loop" and "CAN High Speed Loop" shown in the picture below). Could you please tell me how to calculate the iteration time needed for each loop based on the above information?

    The way that you have proposed to use the CAN modules to monitor the bus is not the best approach. We have a prebuilt driver that will do most of this work for you. You will need to download and install NI-CAN 2.7.3, and then you will be able to use the Example Finder (accessed through the Help menu) to look for a suitable example.
    Jacob K || Applications Engineer || National Instruments
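    Independent of which driver you use, the way to avoid losing frames is to let a background reader buffer each port while your own loop drains the buffers. A rough sketch of that pattern with the python-can library (interface/channel names are assumptions, and listen-only configuration is driver-specific, so it is omitted here):

    import can

    # One bus object per port; interface/channel names are assumptions
    hs_bus = can.Bus(interface="nican", channel="CAN0", bitrate=500000)   # 9853, high speed
    ls_bus = can.Bus(interface="nican", channel="CAN1", bitrate=50000)    # 9852, low speed

    hs_buf = can.BufferedReader()
    ls_buf = can.BufferedReader()

    # Notifiers read frames on background threads so neither port is starved
    notifiers = [can.Notifier(hs_bus, [hs_buf]), can.Notifier(ls_bus, [ls_buf])]

    try:
        while True:
            for name, buf in (("HS", hs_buf), ("LS", ls_buf)):
                msg = buf.get_message(timeout=0.01)
                if msg is not None:
                    print(name, hex(msg.arbitration_id), msg.data.hex())
    finally:
        for n in notifiers:
            n.stop()
        hs_bus.shutdown()
        ls_bus.shutdown()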

  • Non-standard baud rate serial support?

    1. I have a microcontroller board sending data to an FT232R (USB-UART IC), which then appears as a COM port on the PC for communication with LabVIEW serial VISA.
    2. I am able to communicate at standard baud rates like 9600 or 38400 bps without any problem.
    3. I want to know whether I can also communicate at non-standard baud rates like 500 kbps with LabVIEW.

    Dennis_Knutson wrote:
    Where did I say that the VISA baud rate is an enum?
    My bad - I wasn't paying attention! No wonder I seldom attempt to correct you!
    So let's talk about serial baud rates. LabVIEW does not have anything to do with it other than implementing calls to the VISA API - VISA, not LabVIEW, handles serial communications.
    VISA does not limit the serial baud rate to anything other than "a positive, non-zero integer". (Actually a 0 baud rate just guarantees a timeout error and is silly; negative baud rates are sillier still - think about it for a moment.)
    Most hardware today detects the clock rate of the incoming TX and adapts its baud rate properly.
    Some legacy devices exist that were designed prior to the advent of clock recovery. These are mostly obsolete and should be considered for replacement.
    Some modern hardware that could support clock recovery has firmware developed without support for the feature, either for "optimization" (it may run on an underpowered CPU) or because the developer has been copy-pasting the same #include for decades. Those firmware engineers are also mostly obsolete and IMHO should be considered for replacement.
    All that being said, 500 kbaud is not inconceivable - but you had better watch out for noise in your cabling and inside the hardware too, including the COM port of the PC!
    Jeff
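    As a concrete illustration of Jeff's point that the baud rate is just an integer written to the VISA 'Serial Baud Rate' property, here is a minimal sketch in Python with PyVISA rather than LabVIEW; the resource name "ASRL3::INSTR" is an assumption, and whether 500000 is accepted depends on the FT232R driver:

    import pyvisa

    rm = pyvisa.ResourceManager()
    ft232 = rm.open_resource("ASRL3::INSTR")   # the FT232R virtual COM port (assumed name)
    ft232.baud_rate = 500000                   # non-standard rate; the driver may round or reject it
    print("driver reports baud rate:", ft232.baud_rate)
    ft232.write_raw(b"\x55")                   # 0x55 = 01010101b, handy for checking timing on a scope
    ft232.close()

    FTDI's USB-UART parts generate baud rates by dividing an internal reference, so a rate like 500 kbps is typically achievable exactly; the practical risk is the signal-integrity issue Jeff describes rather than the driver refusing the number.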

  • Slow Baud Rate serial port (VISA)

    The latest version of LabVIEW, 7.1, has the serial functions incorporated into VISA resources, and it is not possible to work with baud rates lower than 110 bps. I have a large set of applications that work at 5 bps. It's a serial protocol that sends a byte, e.g. AA (hex), over the serial line at 5 bps, and after that the rate is increased.
    I tried to use a DLL (CIN function) but I think it is impossible, because of how the resource is declared in the C code, so I'm thinking of using a dedicated COM DLL with the Call Library Function node.
    I'd like to know if somebody has a tip or has had a similar problem.
    Thanks in advance
    Fabrizio
    Test Engineer

    Matthias Müller writes:
    > Hello,
    > I'm using LabVIEW to control a spectrometer through the serial port. I
    > use VISA for the communication with the device. Unfortunately, the
    > device is always in 9600 baud mode after power-on. So I have to change
    > the baud rate each time by a command I send to the device. So I open a
    > VISA session in 9600 mode and communicate with the device to set it to
    > 57600 baud. After that, I have to reset my local serial port to the same
    > baud rate. I do this with a property node, where I change 'Serial Baud
    > Rate'.
    > Unfortunately, after I do this, the VIs that want to communicate after
    > the reset of the baud rate stop with an error:
    > -1073807298
    > VI_ERROR_IO
    > Could not perform read/write operation because of I/O error.
    >
    > (I try to write to the serial port after the change of the baud rate.)
    >
    > I would be very glad if someone could give me advice on why it doesn't
    > work, or how to make it work.
    > Thank you a lot.
    > Matthias
    Matthias,
    my first approach would be to close the first VISA session after the
    property node. Data dependency is achieved by feeding the error cluster
    into a new session opened with the new properties.
    IMHO you've discovered another bug in serial VISA.
    HTH,
    Johannes Nieß
    P.S.: What brand/model of spectrometer are you programming for?
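    A sketch of the sequence Matthias describes, with Johannes's point about making the rate change strictly ordered, written in Python with PyVISA instead of LabVIEW; the resource name, the "BAUD 57600" command, and the "*IDN?" query are assumptions about the spectrometer's protocol:

    import time
    import pyvisa

    rm = pyvisa.ResourceManager()
    spec = rm.open_resource("ASRL1::INSTR", baud_rate=9600,
                            write_termination="\r\n", read_termination="\r\n")

    spec.write("BAUD 57600")   # hypothetical command that switches the device to 57600 baud
    time.sleep(0.1)            # let any reply still at 9600 finish before touching the port
    spec.clear()               # discard anything left in the buffers

    spec.baud_rate = 57600     # now retune the local port (the 'Serial Baud Rate' property)
    print(spec.query("*IDN?")) # hypothetical query to confirm communication at the new rate
    spec.close()

    The essential points are the same in LabVIEW: make sure nothing is still in flight at the old rate, and enforce the ordering (error cluster or a new session, as Johannes suggests) so the property change really takes effect before the next write.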

  • TI 2407 DSP to PCI-CAN/2 (high speed card)

    I'm trying to interface the National Instruments PCI-CAN/2 (high-speed CAN) to the Texas Instruments TMS320LF2407 CAN module. How can I make the two CAN modules communicate (in C)? Also, the TMS320LF2407 DSP does not support the RTDX interface, so the DSP toolkit will not be useful in my app, since I'm not allowed to use memo I/O to monitor. What do I need to do in the Code Composer environment to communicate back and forth with the CAN on the DSP board? (Any example code or docs in C will be appreciated.)
    could anybody please point me to a starting point
    Thank You very much

    Hello,
    I am not familiar with your TI CAN module but I can help you with the configuration of the National Instruments PCI-CAN Card. A helpful reference is the PCI-CAN user manual located here:
    http://digital.ni.com/manuals.nsf/websearch/6BF77910C5528D4486256D63004EDE1F?OpenDocument&node=132100_US
    You first need to physically wire the two devices together as shown in chapter 4. Next, you should use some of the example programs to get you started programming in your desired language.
    The CAN standard will allow you to communicate between your TI module and your PCI-CAN card, but the format of the data is up to the TI module. You might need to send a "remote CAN frame" to get data back, or the TI module might send data continuously. The data that you receive will need to be interpreted somehow, and this is where you will need the help of TI.
    Hope this gives you a place to start. Even if you do not have the card at this point in time, you can download the driver and take a look at the API and the example programs by downloading it here:
    http://digital.ni.com/softlib.nsf/websearch/A84EE349DAAEF6A486256E7B00561281?opendocument&node=132070_US
    Hope this helps.
    Scott B.
    Applications Engineer
    National Instruments
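    To illustrate the "remote CAN frame" request/response idea Scott mentions, here is a hedged sketch using the python-can library rather than the NI-CAN C API; the interface name, channel, bit rate, and the arbitration ID owned by the DSP firmware are all assumptions:

    import can

    bus = can.Bus(interface="nican", channel="CAN0", bitrate=250000)  # names/rate assumed

    # A remote frame carries no data; it asks the node that owns this ID to transmit
    request = can.Message(arbitration_id=0x200, is_remote_frame=True,
                          dlc=8, is_extended_id=False)
    bus.send(request)

    reply = bus.recv(timeout=1.0)        # the DSP should answer with a data frame on the same ID
    if reply is not None:
        print(hex(reply.arbitration_id), reply.data.hex())
    else:
        print("no response - check wiring, termination, and the DSP's mailbox configuration")
    bus.shutdown()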

  • How exactly must the baud rate be configured?

    Hello!
    I am having trouble configuring the baud rate of the CAN module of the TMS320LF2406 DSP.
    I can program 122500 baud or 127500 baud, but not 125000. What is the baud-rate tolerance of the NI-CAN PCMCIA board?
    When I select 125 kbaud for NI-CAN, I get an error: stuff or form failure when receiving.
    Golubkov Andrej

    If you have to run it at 122500 or 127500 baud, you can manually type in the baud rate. I currently have my card running at 800000 baud, which is not a standard baud rate, so perhaps the 122500 or 127500 baud rate will work. Attached is the code I used to set the baud rate (altered for a 122500 baud rate).
    ' Configure the CAN Network Interface Object with a non-standard baud rate
    AttrIdList(0) = NC_ATTR_BAUD_RATE
    AttrValueList(0) = 122500                         ' non-standard baud rate typed in directly
    AttrIdList(1) = NC_ATTR_START_ON_OPEN             ' CAN Network Interface Object starts automatically
    AttrValueList(1) = NC_TRUE
    AttrIdList(2) = NC_ATTR_READ_Q_LEN                ' read queue length
    AttrValueList(2) = 0
    AttrIdList(3) = NC_ATTR_WRITE_Q_LEN               ' write queue length
    AttrValueList(3) = 0
    AttrIdList(4) = NC_ATTR_CAN_COMP_STD              ' CAN arbitration ID for the standard frame comparator
    AttrValueList(4) = NC_CAN_ARBID_NONE
    AttrIdList(5) = NC_ATTR_CAN_MASK_STD              ' standard frame mask
    AttrValueList(5) = NC_CAN_MASK_STD_DONTCARE
    AttrIdList(6) = NC_ATTR_CAN_COMP_XTD              ' extended frame comparator
    AttrValueList(6) = NC_CAN_ARBID_NONE
    AttrIdList(7) = NC_ATTR_CAN_MASK_XTD              ' extended frame mask
    AttrValueList(7) = NC_CAN_MASK_XTD_DONTCARE
    Status = ncConfig("CAN0", 8, AttrIdList(0), AttrValueList(0))
    txtStatus = CheckStat(Status, ("ncConfig " & checkCAN)) ' check for errors
    If (txtStatus <> "") Then GoTo error:
    Hopefully this works for you.
    Marty
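    As a rough way to reason about the tolerance question: 122500 and 127500 are each 2% away from 125000, and the clock-mismatch budget for CAN bit timing is typically quoted at well under 2% (the exact figure depends on the programmed bit segments), which fits the stuff/form errors seen at 125 kbaud. A small sketch of the arithmetic in Python; the ~1.5% figure in the comment is a ballpark value, not taken from the NI-CAN specifications:

    # Relative baud-rate mismatch between the DSP's achievable rates and the target
    target = 125000
    for achievable in (122500, 127500):
        err = abs(achievable - target) / target
        print(f"{achievable} baud vs {target} baud: {err:.2%} mismatch")

    # CAN nodes typically tolerate a total clock mismatch on the order of 1-1.5 %
    # (depending on bit-segment programming), so setting the NI card to 122500 baud,
    # as Marty does above, keeps both nodes on the same nominal rate instead.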

  • I have one application that requires both low- and high-speed acquisition. I want to change the sample rate while running. BUT... I have an E Series device

    I am writing control software for a process that is usually dull and
    requires only 10 Hz acquisition rate.  At particular times during
    the sequence, however, we are interested in looking at a couple of
    channels at 1000 Hz.  My approach so far is to configure my
    Buffered DAQ to run at the higher rate at all times.  When we are
    in the 'high-speed DAQ' mode, the program logs every point to
    disk.  In the 'low-speed' mode, I am picking off every nth (in
    this case, 10th) point to log to disk.  At all times, I update my
    GUI indicators on the front panel at a maximum of 4 times per second (I
    find that anything faster results in an uncomfortable display), so I
    fill up a FIFO with data in my acquisition / logging loop, and read the
    FIFO in the display loop.  The data in my GUI display can be up to
    250 milliseconds off, but I find this acceptable . As a side note, I
    need buffered Daq with hardware timing, as software timing results in
    lost data at 1000 Hz.
    This all works fine and dandy, but I am convinced that it is not the
    most elegant solution in the world.  Has anyone developed a
    buffered DAQ loop where the scan rate can be adjusted during
    operation?  I would like to change the rate of the E-Series card
    rather than relying on down-sampling as I am now doing. 
    The reason I have concern is that at the moment I am simulating my AI
    using MAX and when running the down-sampling routine, I consistently
    miss a particular event on the simulated data because the event in
    question on the simulated data always occurs at the same 'time', and I
    always miss it.  Granted, while it is unlikely that my measured
    signal and my acquisition are perfectly synchronized in the real world,
    this particular situation points out the weakness in my approach.
    More than anything, I am looking for ideas from the community to see
    how other people have solved similar problems, and to have you guys
    either tear apart my approach or tell me it is 'ok'.  What do you
    think?
    Wes Ramm, Cyth UK
    CLD, CPLI

    Adding to Alan's answer:
    One of the problems that comes with these tricks for variable-rate acquisition is being able to match up sample data with the time that it was sampled. 
    If you weren't using either of the E-series board's counters, there is a nifty solution to this!  You'll use one of the counters to generate the variable-rate sampling clock.  You can then use the second counter to perform a buffered period measurement on the output of the first counter.  This gives you a hw-timed measurement of every sampling interval.  You would need to keep a cumulative sum of these periods to generate a hw-accurate timestamp value for each sample.
    Note:  the very first buffered period measurement is the time from starting the second counter until the first active edge from the first counter.  For your app, you should ignore it.
    -Kevin P.
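    A small sketch of Kevin's timestamp bookkeeping in Python/NumPy, purely for illustration; the period array stands in for the buffered period measurements read back from the second counter, and the sample array is placeholder data:

    import numpy as np

    # Buffered period measurements from counter 2, in seconds (example values only)
    periods = np.array([0.0123, 0.001, 0.001, 0.100, 0.001, 0.001])

    periods = periods[1:]            # first measurement is start-to-first-edge; ignore it
    timestamps = np.cumsum(periods)  # hw-accurate time of each sample relative to the first

    samples = np.array([0.0, 0.1, 0.2, 0.3, 0.4])   # matching AI samples (placeholder data)
    for t, x in zip(timestamps, samples):
        print(f"t = {t:8.4f} s   value = {x:.3f}")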

  • Does LabVIEW 5.0 support serial communication at a baud rate of 115200?

    When I try to initialize my serial port at a 115200 baud rate I get error 32, device parameter error. I'm running LabVIEW 5.0. Can anyone help me with this?

    I use Serial Port Init.vi to set the baud rate to 115200 with no problem in LabVIEW 5.0 under Windows 95.
    I get error 38 when I try a non-supported baud rate.
    Do you get the error when you run Serial Port Init.vi directly, or just when you call it from your VI? On the diagram where you call Serial Port Init.vi, try placing a probe on the wire going to the baud rate input of Serial Port Init to see what value it's trying to set. You will have a problem getting to 115200 if the control on your VI is represented as I16 or U8, or if the data range maximum doesn't go to 115200.
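    The representation point is just integer range: 115200 does not fit in a signed 16-bit (I16) or unsigned 8-bit (U8) number, so the value that actually reaches Serial Port Init.vi is not what you typed. A quick illustration in Python, using ctypes to mimic the fixed-width types:

    import ctypes

    print(2**15 - 1)                     # 32767, the largest value an I16 can hold
    print(ctypes.c_int16(115200).value)  # 115200 wrapped into 16 bits -> -15872
    print(ctypes.c_uint8(115200).value)  # 115200 wrapped into 8 bits  -> 0

    A probe on the baud rate wire, as suggested above, will show whatever coerced value actually arrives at the input.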

  • Verizon DSL - High "Speed (down/up)" Low connection rate

    Hello -
    I am doing this for my in-laws, who live in a semi-rural area.  They have Verizon DSL and have been having very slow connection speeds for the past few months (they have the 1.5-3 Mbps plan and are getting between 200-700 kbps).  Sometimes, for a few seconds, it can get as high as 1.1 Mbps, but it never gets to 1.5.
    I have tried lots of things.  We got a new modem, reset the D-Link router to factory settings, plugged the modem directly into the desktop, rebooted everything, then rebooted again, called Verizon and talked with their overseas people, who said they would have someone look at it.
    Everything, no dice. So I'm not sure what to do. 
    Here's one thing I noticed, however.  When I go into the modem sign-in screen and look at it (it's the red sign-in, not the blue, on a Westell 6000 DSL modem), they get a very high "Speed (down/up)" level.  Something like 1740 down and 500-something up.
    But the actual downloads are nothing like that.
    Is that significant to anything? 
    My other issue is that I am not at their computer now (I am back in the city).  When I left their place, I did not know about these boards, and I can't remember any of the other information from the modem sign-in.  That one piece, however, seemed strange to me (a high "Speed (down/up)" level, but slow reality).
    As you think about it, could that lead to any clues or solutions I should look for the next time I go up there?
    Thanks for reading this.

    The closest proper sync rate that the modem would show for speed would be 1792/448Kbps, which would be the 1.5Mbps provisioning of the 1.5-3Mbps package. Verizon usually configures the speed lower if the line is not capable of holding a full 3Mbps or even an "optimized" flavor of it (which is ~2600kbps down, 640kbps up). Anyways, the next time you get the chance, I'm going to need the following information from their line:
    1: What do their modem Transceiver Statistics look like? If running a Westell modem, visit http://192.168.1.1/ , choose System Monitoring, Advanced Monitors and then click Transceiver Statistics. Post up what you see there. For ActionTec modems, check the Status pages of the ActionTec for DSL Stats. The address to ActionTecs are the same.
    2: Find out if the slow speeds are taking place all the time, or only during the evening hours
    3: Go to http://visualroute.visualware.com on their PC and choose the closest server to them. Let the Java applet load, and when it does it will show you a "Trace" box with your relative's IP address filled in. Press Trace and let it complete. When it completes, move the mouse over the second-last Circle (second from the right) and take down the name of it. If you see "ERX" in the name, please tell me.
    If you are prompted for a Username/Password while doing Step 1, try the following:
    admin/password
    admin/password1
    admin/admin
    admin/admin1
    ========
    The first to bring me 1Gbps Fiber for $30/m wins!

  • Aironet 1142N: erased flash and set baud rate to 115200, can't send any commands

    Hi,
    I erased the entire contents of flash and let the WAP boot to the AP prompt. I set the baud rate to 115200 and now all I get is gibberish. Once it reads from EEPROM, this shows up:
    Waiting for PHY auto negotiation to complete TIMEOUT !
    done
    Ethernet speed is 10 Mb - HALF duplex
    The system has encountered an error initializing
    the Ethernet port.
    The system is ignoring the error and continuing to boot.
    If you abort the system boot process, the following
    commands will re-initialize Ethernet, TFTP, and finish
    loading the operating system software:
        ether_init
        tftp_init
        boot
    The system is unable to boot automatically because there
    are no bootable files.
    C1140 Boot Loader (C1140-BOOT-M) Version 12.4(18a)JA3, RELEASE SOFTWARE (fc1)
    Technical Support: http://www.cisco.com/techsupport
    Compiled Wed 14-Oct-09 18:59 by prod_rel_team
    ap: üüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüü
    *** line too large ***
    Unknown cmd: üüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüü
    ap: üüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüü
    *** line too large ***
    Unknown cmd: üüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüüü
    It can read from the WAP, but it doesn't allow me to send commands.
    Any ideas?
    Thanks

    Hi Tyresse,
    Please use this:
    https://supportforums.cisco.com/message/1255262#1255262
    Or
    Using the MODE button
    You can use the MODE button on 1100 and 1200 series access points to reload the access point image file from an active Trivial File Transfer Protocol (TFTP) server on your network or on a PC connected to the access point Ethernet port.
    This process resets all configuration settings to factory defaults, including passwords, WEP keys, the access point IP address, and SSIDs.
    Follow these steps to reload the access point image file:
    Step 1 The PC you intend to use must be configured with a static IP address in the range of 10.0.0.2 to 10.0.0.30.
    Step 2 Make sure that the PC contains the access point image file (such as c1100-k9w7-tar.122-13.JA.tar for an 1100 series access point or c1200-k9w7-tar.122-13.JA.tar for a 1200 series access point) in the TFTP server folder and that the TFTP server is activated. For additional information, refer to the "Obtaining the Access Point Image File" and "Obtaining TFTP Server Software" sections.
    Step 3 Rename the access point image file in the TFTP server folder to c1100-k9w7-tar.default for an 1100 series access point or c1200-k9w7-tar.default for a 1200 series access point.
    Step 4 Connect the PC to the access point using a Category 5 (CAT5) Ethernet cable.
    Step 5 Disconnect power (the power jack for external power or the Ethernet cable for in-line power) from the access point.
    Step 6 Press and hold the MODE button while you reconnect power to the access point.
    Step 7 Hold the MODE button until the status LED turns red (approximately 20 to 30 seconds), and release the MODE button.
    Step 8 Wait until the access point reboots as indicated by all LEDs turning green followed by the Status LED blinking green.
    Step 9 After the access point reboots, you must reconfigure the access point by using the Web-browser interface or the CLI.
    Regards
    Don't forget to rate helpful posts.

  • How can I modify the High Speed Data Reader VI to show correct time information in the x-axis?

    I am just a beginner learning the LabVIEW programming currently.
    I have a PXI-6115 DAQ card and have to make a hardware-timed acquisition VI for maximum performance, so I use the High Speed Data Logger VI for data acquisition.
    However, when I read my data using the High Speed Data Reader VI, it doesn't show the correct time information in the graph.
    How can I modify the High Speed Data Reader VI to show correct time information in the x-axis?
    I hope you can explain easily because I am a beginner.

    Hey Chunrok,
    I've modified the High Speed Data Reader VI slightly so that it now uses the scan rate of the data (as determined from the file) to set the scaling for the data points. If you wanted the start time to be a specific time, you could use the start time obtained from your file to set the x-scale offset as well.
    I hope this helps!
    Sarah Miracle
    National Instruments
    Attachments:
    Example.vi (281 KB)
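    The underlying fix is plain waveform scaling: with a known scan rate, sample i occurs at t0 + i / scan_rate, which is what the modified reader uses for the graph's x-scale offset and multiplier. A small sketch of the same idea in Python; the scan rate, start time, and data below are placeholders rather than values from the logger:

    import numpy as np

    scan_rate = 1_000_000.0              # samples/s, as read from the logged file (assumed)
    t0 = 0.0                             # start time; use the file's start timestamp if desired
    data = np.random.rand(1000)          # placeholder for the logged samples

    dt = 1.0 / scan_rate                 # the graph's x-scale multiplier
    t = t0 + np.arange(data.size) * dt   # time axis; t0 is the x-scale offset
    print(t[:5])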
