I2C bitbanging

Hello,
I'd like to create a LabVIEW program that communicates with a stand-alone component using I2C.
I'm guessing that I will need to use bitbanging to get this accomplished.
Does anyone have any examples of this?
Is this even possible to do using LabVIEW? (You'd somehow have to quickly switch your data line from input to output)
Help on this subject would be greatly appreciated!

The myDAQ does not support per-cycle tri-stating, but does have hardware timing, unlike the USB 6009.
I would recommend getting a USB-8451.
Joey S.
Software Product Manager
National Instruments
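
For readers landing here from a search: the hardware recommendation above comes down to how SDA is handled. Below is a rough, hypothetical sketch (Python-style, not LabVIEW) of what a bit-banged I2C byte write involves at the pin level; the Gpio class and its methods are invented stand-ins for whatever DIO API your hardware exposes. The point the original question raises is real: on an open-drain bus the master never drives SDA high, it either pulls the line low or releases it (switches the pin back to input) and lets the pull-up resistor raise it, which is exactly the per-bit tri-stating that software-timed DIO devices struggle with.

    # Conceptual sketch of a bit-banged I2C master byte write (MSB first).
    # 'Gpio' is a stand-in for whatever DIO API your hardware exposes; on a real
    # bus, SDA/SCL are open-drain lines with external pull-ups, so a "1" is made
    # by releasing the line (turning the pin into an input), never by driving it high.

    class Gpio:                                   # minimal stand-in so the sketch runs
        def set_direction(self, line, d): pass
        def write(self, line, level): pass
        def read(self, line): return 0            # pretend the slave pulls SDA low (ACK)

    def sda_release(gpio): gpio.set_direction("SDA", "input")    # pull-up raises SDA
    def sda_low(gpio):
        gpio.set_direction("SDA", "output"); gpio.write("SDA", 0)
    def scl_release(gpio): gpio.set_direction("SCL", "input")    # SCL goes high via pull-up
    def scl_low(gpio):
        gpio.set_direction("SCL", "output"); gpio.write("SCL", 0)

    def write_byte(gpio, byte):
        """Clock out 8 bits, then sample the ACK bit driven by the slave."""
        for i in range(7, -1, -1):
            scl_low(gpio)
            sda_release(gpio) if (byte >> i) & 1 else sda_low(gpio)
            scl_release(gpio)                     # slave samples SDA while SCL is high
        scl_low(gpio)
        sda_release(gpio)                         # hand SDA over to the slave for the ACK
        scl_release(gpio)
        ack = gpio.read("SDA") == 0               # slave pulls SDA low to acknowledge
        scl_low(gpio)
        return ack

    print(write_byte(Gpio(), 0xA7))               # prints True with the stand-in above

In LabVIEW the same loop would have to be built from digital read/write calls plus per-line direction changes on every bit, which is why a dedicated I2C master such as the USB-8451 is usually the less painful route.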

Similar Messages

  • Unable to Load USB Driver in LV for Total Phase's Aardvark I2C adaptor

    I have tried to install Total Phase's Aardvark I2C USB driver for LabVIEW on two computers and two versions of LV (8.5 and 7.1) without success. The Windows driver seems to be working fine, as their Control Center GUI (a terminal-esque program) works without a hitch, but the LabVIEW DLL doesn't seem to work.
    Attached is the very simple vi to find connected Aardvark USB devices (aa_find_devices). It always returns the same error code, 8002, which resolves to:
    AA_UNABLE_TO_LOAD_DRIVER
    -2
    unable to load USB driver
    From their documentation: http://www.totalphase.com/docs/aardvark_datasheet/sect005/#s5.9
    It looks like the LabVIEW DLL, "aardvark.dll", may have a versioning issue. So my question is this: using LabVIEW, how does one debug DLL dependencies and resolve issues like this? Do I need the source code for the library?
     Cheers,
    Joe Gorse
    Attachments:
    find_devices.vi 11 KB

    Dear Joe,
    Let me start off by apologizing for the inconvenience you have gone through. I am an engineer at Total Phase. We are working on fixing this issue for our next release of the Aardvark LabVIEW Driver. This forum post was only brought to our attention today. In the future, you can also let Total Phase know about these kinds of issues by emailing [email protected]. We are very responsive through email or phone.
    The problem you are experiencing has to do with versioning issues between our DLL and USB driver. Specifically, DLLs before v5.xx must use the older version of our USB driver. By replacing the DLL with the one from our latest version of Control Center, you have effectively upgraded the DLL that the LabVIEW driver is using and made it compatible with v2.xx of the USB driver.
    The most likely reason that smercurio_fc did not see a difference between the DLL in Aardvark Control Center and the LabVIEW driver is that he is using an older version of Control Center and, correspondingly, an older version of the USB driver.
    The reason we have not packaged the LabVIEW driver with the latest DLL is that a few functions have changed their prototypes (for example aa_open_ext and aa_spi_write), and the VIs have not yet been updated accordingly. Although updating the DLL solved your USB driver issue, it introduced this new one: if you try to use those particular VIs, LabVIEW will error out.
    We are currently working on updating the LabVIEW drivers to work cohesively with the new DLL and USB driver, and this should be available soon. If you need immediate access to the functions with new prototypes, you will have to change them manually. The other recommended option is to simply revert to the DLL that is packaged with our LabVIEW driver and downgrade the USB driver back to v1.xx.
    I will make sure to post again once the update is released, and I once again apologize for the inconvenience.
    Best regards,
    Etai
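
    A quick way to separate "the DLL itself cannot load" from "the DLL loads but cannot reach the USB driver", outside of LabVIEW, is to load aardvark.dll directly and call the same export the VI wraps. Below is a minimal sketch in Python/ctypes; the aa_find_devices prototype (device count in, array of port numbers out, count or negative AA_* error code returned) is assumed from Total Phase's C API and should be checked against their aardvark.h header before relying on it.

        # Sketch: isolate "DLL will not load" from "DLL cannot reach the USB driver"
        # without LabVIEW. The aa_find_devices prototype is an assumption taken from
        # the vendor's C API; confirm it against aardvark.h.
        import ctypes

        try:
            dll = ctypes.WinDLL("aardvark.dll")          # step 1: does the DLL load at all?
        except OSError as exc:
            raise SystemExit(f"aardvark.dll failed to load: {exc}")

        dll.aa_find_devices.restype = ctypes.c_int
        dll.aa_find_devices.argtypes = [ctypes.c_int,
                                        ctypes.POINTER(ctypes.c_ushort)]

        ports = (ctypes.c_ushort * 16)()
        count = dll.aa_find_devices(16, ports)           # step 2: can it see the USB driver?
        print("aa_find_devices returned:", count)        # negative = AA_* error code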

  • NI 845x USB: Error -301713 occurred at Property Node (arg 2) in General I2C Read.vi

    Hello,
    I am currently working with the digital accelerometer LIS35DE from ST Microelectronics. I want to start with tests of this device. For that purpose I used an NI 845x USB device to connect to the accelerometer via I2C. Unfortunately, when I made the electrical connections, set up the communication parameters, and ran the program (which I found in the examples), the following error occurred:
    Error -301713 occurred at Property Node (arg 2) in General I2C Read.vi
    Possible reason(s):
    NI-845x:  An input parameter, or combination of parameters, is invalid. An example of this error may be setting an invalid baud rate or enabling I2C ACK polling while using 10-bit addressing.
    The code can be found in the attachments. I couldn't find any extended description of this problem. What could be the problem: incorrect device address, register address, configuration parameters?
    Any help is appreciated!
    Best regards,
    Michael
    Attachments:
    General I2C Read.vi 24 KB

    Hi MicMac89!
    First of all, could you please post which version of LabVIEW you are using?
    Could you please also tell me which version of the 845x hardware you are using (8451 or 8452)?
    I opened the example you attached. As you wrote, the error occurs at the second argument of the property node. (I assume this is the first property node where the error occurs.)
    This argument of the property node enables the onboard pull-up resistors, but not all NI-845x hardware supports onboard pull-up resistors. (This is why it is important to know which hardware version you are using.)
    Did you try the example with the pull-up resistors disabled?
    I suggest you go through the manual for this product (if you have not already); it makes clear where and when to use what kind of pull-up resistors.
    For example, if you are using an 8452 with Vref ≤ 1.8 V, you must enable the pull-up resistors so the FPGA can properly detect a low-to-high transition.
    Manual: http://www.ni.com/pdf/manuals/371746d.pdf
    HW specification: http://www.ni.com/pdf/manuals/290598a.pdf
    Please post if my suggestions helped. Of course if you have any questions, don't hesitate to post them.
    Best regards, 
    Balazs Nagy
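
    One practical footnote on the pull-up question: if your 845x turns out not to have (or not to enable) onboard pull-ups, you need external resistors, and the I2C-bus specification gives the bounds for choosing them: the low end is set by the 3 mA sink-current limit at VOL = 0.4 V, the high end by the allowed rise time against the bus capacitance. A small sketch of that calculation (spec constants from NXP's UM10204; supply voltage and bus capacitance are values you have to supply):

        # Rough pull-up bounds per the I2C-bus specification (NXP UM10204):
        # Rp_min keeps the sink current <= 3 mA at VOL = 0.4 V, Rp_max keeps the
        # rise time within spec for the estimated bus capacitance.
        def pullup_range(vdd, bus_capacitance_pf, fast_mode=False):
            vol, iol = 0.4, 3e-3                        # spec limits: V, A
            t_rise = 300e-9 if fast_mode else 1000e-9   # max rise time: 300 ns / 1 us
            cb = bus_capacitance_pf * 1e-12
            rp_min = (vdd - vol) / iol
            rp_max = t_rise / (0.8473 * cb)             # 0.8473 = ln(0.7/0.3)
            return rp_min, rp_max

        print(pullup_range(3.3, 100.0))   # 3.3 V bus, ~100 pF: roughly 970 ohm to 11.8 kohm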

  • Best practice for encoding and decoding DUT/UUT registers? Class? Cluster? Strings? Bitbanging?

    I am architecting a LabVIEW system to characterize silicon devices. I am trying to decide the best way to enqueue, encode, and decode device commands executed by my test system.
    For example, an ADC or DAC device might come in both I2C and SPI flavors (same part, different interface) and have a large register map, which can be represented as register names or the actual binary value of each register's address.
    I would like my data structure to
    *) be protocol agnostic
    *) have the flexibility to program using either the mnemonics or hard-coded addresses
    *) be agnostic to the hardware which executes the command
    *) let me enqueue multiple commands in a row.
    I am thinking a detailed class is my best bet, but are there examples or best practices already established?

    I agree on the detailed class, inherited from a general DUT class. Especially if you want to mix interfaces, you need to keep those as far away from your top VI as possible.
    As to the 4th point, I'd implement the command VIs as enqueue operations and have an in-class command queue (or possibly just an array of commands).
    /Y
    LabVIEW 8.2 - 2014
    "Only dead fish swim downstream" - "My life for Kudos!" - "Dumb people repeat old mistakes - smart ones create new ones."
    G# - Free award winning reference based OOP for LV
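
    To make that concrete, here is a minimal, hypothetical sketch (plain Python rather than LVOOP, but the structure maps onto classes and an in-class queue the same way) of a protocol-agnostic register command that accepts either a mnemonic or a raw address, a queue that collects several commands, and a bus-specific executor that is the only piece aware of I2C vs. SPI. All names are invented for illustration.

        # Hypothetical sketch: protocol-agnostic register commands plus a command queue.
        from dataclasses import dataclass
        from collections import deque

        REGISTER_MAP = {"GAIN": 0x01, "OFFSET": 0x02, "CTRL": 0x03}   # mnemonic -> address

        @dataclass
        class RegWrite:
            register: object                 # mnemonic (str) or raw address (int)
            value: int

            def address(self):
                if isinstance(self.register, str):
                    return REGISTER_MAP[self.register]
                return self.register

        class CommandQueue:
            def __init__(self):
                self._q = deque()

            def enqueue(self, cmd):
                self._q.append(cmd)

            def execute_all(self, bus):      # 'bus' hides I2C vs SPI vs simulation
                while self._q:
                    cmd = self._q.popleft()
                    bus.write_register(cmd.address(), cmd.value)

        class I2CBus:                        # one concrete executor; an SPIBus would mirror it
            def __init__(self, device_address):
                self.device_address = device_address

            def write_register(self, reg, value):
                print(f"I2C 0x{self.device_address:02X}: reg 0x{reg:02X} <- 0x{value:02X}")

        q = CommandQueue()
        q.enqueue(RegWrite("GAIN", 0x10))    # program by mnemonic
        q.enqueue(RegWrite(0x03, 0x01))      # or by hard-coded address
        q.execute_all(I2CBus(0x48))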

  • Cannot use i2c and custom fpga logic at the same time

    I am driving an OV7670 camera sensor with my myRIO. Configuring the camera's registers is done via I2C (or what they call the SCCB interface, but it's practically the same thing). The sensor has to be given an external clock input, which I provide through the FPGA (I run it with an 8 MHz signal). I can then do a parallel read (8 bits) from the chip into the FPGA on each clock in order to read off the pixel information, place it in a buffer, and let the real-time side read and manipulate it. The whole idea is to let the FPGA do the image acquisition while the processor does the image manipulation.
    The only problem is, when I have my FPGA configuration loaded, the I2C bus seems to stop working. I know that the I2C is actually handled through the FPGA as opposed to the processor. Is there any way to run those two functions simultaneously on the FPGA? If not, can I somehow implement the I2C protocol independently of the FPGA?
    Thanks in advance

    Hey quickbacon,
    I take it you're talking about using the built-in I2C API for the myRIO. The myRIO comes preconfigured with a default FPGA personality, and the myRIO API is built off of that default personality; the default personality needs to be in place in order for it to function. By customizing the FPGA on your own, you've overwritten the personality, which is why the myRIO I2C VIs are broken.
    If you want to use both the myRIO I2C VIs and customize the FPGA you can do so, but you need to modify the FPGA VI for the default personality instead of creating a new FPGA VI from scratch. To expose the FPGA VI that defines the default personality in a project, create a new project in LabVIEW and select the "myRIO Custom FPGA Project Template." Once you've created a project from that template, expand the FPGA target - the FPGA VI that's present under the target is the FPGA VI that defines the default personality. To add your own custom FPGA functionality, simply open that VI and add your functionality in parallel to the existing code. As long as you leave the code pertinent to the I2C VIs in place (it should be clearly labelled in the FPGA VI), the I2C VIs should still function. If you run into resource utilization problems, you can delete sections of code from the personality that deal with myRIO functions that you aren't using - just leave the code pertinent to I2C intact.
    Regards,
    Ryan K.

  • Time-out limit of the 8451-I2C interface for hold-master condition

    I am working with Windows XP, LabVIEW 8, and the USB-8451 I2C-to-USB interface box.
    We intend to incorporate the USB-8451 into our product evaluation kits for a new sensor product.
    These evaluation kits will be used by our customers when they start working with our sensor products.
    The I2C device is a digital sensor which generates a hold-master condition of varying length, depending on the sensor configuration.
    The hold-master is produced by the sensor after full transmission of the start bit, address, read bit, and ACK.
    After the hold-master is released, three bytes can be read by the master.
    If the hold-master is kept by the sensor for 25 ms or less, I have no problems. When it is kept significantly longer, the interface no longer seems to work properly. I don't get a LabVIEW error message, but the PC (Windows?) makes a 'ding-dong' sound very similar to disconnecting and connecting a USB device (the 8451???). My LabVIEW program does not show sensor data after this.
    I can imagine that the hold-master is somehow too long for one element in the system.
    Is there any time-out period for hold-master defined for the USB-8451?
    How can I change it to higher values?
    Thank you very much for your support!

    Dear Member,
    That's right. If somebody encounters this problem again, you should contact your local NI branch, which will install the new firmware version (the one that works with the latest driver version).
    Best regards,
    Nick_CH

  • DiB0070 I2C read failed errors on Sony playTV dual tuner.

    Hi Guys,
    As far as I can tell from Googling around, this error has been around since 2008 in the dvb-usb-dib0700 driver.
    I get the following in the log:
    [ 271.882232] DiB0070 I2C write failed
    [ 612.809450] DiB0070 I2C read failed
    [ 792.722569] DiB0070 I2C write failed
    [ 969.609201] perf samples too long (2515 > 2500), lowering kernel.perf_event_max_sample_rate to 50000
    [ 2011.890312] DiB0070 I2C write failed
    When there is enough of them in the log, the tuner stops working. It has crashed at the point as well, see:
    [39237.817501] DiB0070 I2C read failed
    [39648.715000] DiB0070 I2C write failed
    [40324.499602] divide error: 0000 [#1] SMP
    [40324.499637] Modules linked in: udf ctr ccm bnep rfcomm bluetooth parport_pc ppdev arc4 snd_hda_codec_hdmi snd_hda_codec_realtek rt2800usb rt2800lib crc_ccitt rt2x00usb rt2x00lib dvb_usb_dib0700 dib7000p dib0090 dib7000m dib0070 mac80211 snd_hda_intel dvb_usb snd_hda_codec dib8000 rc_imon_mce dvb_core cfg80211 snd_hwdep dib3000mc snd_pcm dibx000_common snd_seq_midi snd_rawmidi imon snd_seq_midi_event rc_core snd_seq snd_timer snd_seq_device hid_appleir joydev snd soundcore snd_page_alloc serio_raw asus_atk0110 lpc_ich mac_hid lp parport hid_generic usbhid hid pata_acpi radeon ttm drm_kms_helper firewire_ohci drm r8169 firewire_core crc_itu_t mii pata_marvell i2c_algo_bit
    [40324.500094] CPU: 1 PID: 2605 Comm: kdvb-ad-0-fe-0 Not tainted 3.13.0-35-generic #62~precise1-Ubuntu
    [40324.500144] Hardware name: System manufacturer P5Q-EM/P5Q-EM, BIOS 2203 07/08/2009
    [40324.500189] task: ffff880079efb000 ti: ffff88007995a000 task.ti: ffff88007995a000
    [40324.500231] RIP: 0010:[<ffffffffa0480b6a>] [<ffffffffa0480b6a>] dib7000p_set_dds+0x3a/0x140 [dib7000p]
    [40324.500292] RSP: 0018:ffff88007995b928 EFLAGS: 00010246
    [40324.500323] RAX: 0000000004000000 RBX: ffff880034fb6000 RCX: 0000000010624dd3
    [40324.500363] RDX: 0000000000000000 RSI: 0000000000000001 RDI: ffff880034fb79a0
    [40324.500403] RBP: ffff88007995b968 R08: ffff88007995a000 R09: ffffea0001e16c00
    [40324.500444] R10: ffffffff8156b26a R11: 0000000000000000 R12: 0000000000000000
    [40324.500484] R13: 0000000000000000 R14: ffff880034fb6000 R15: 0000000004000000
    [40324.500524] FS: 0000000000000000(0000) GS:ffff88007fc80000(0000) knlGS:0000000000000000
    [40324.500570] CS: 0010 DS: 0000 ES: 0000 CR0: 000000008005003b
    [40324.500603] CR2: 00007f1fe9d293d0 CR3: 0000000077c69000 CR4: 00000000000007e0
    [40324.500643] Stack:
    [40324.500656] 0000000000000040 0000000000000384 ffff88007995b968 ffff880034fb6000
    [40324.500704] ffff880034fb6000 ffff880034fb6000 0000000000000002 0000000022739480
    [40324.500750] ffff88007995b9b8 ffffffffa0480e9e ffff880078d3a8f0 0000000000000441
    [40324.500797] Call Trace:
    [40324.500818] [<ffffffffa0480e9e>] dib7000p_agc_startup+0x22e/0x480 [dib7000p]
    [40324.500862] [<ffffffffa0483382>] dib7000p_set_frontend+0x72/0x1f0 [dib7000p]
    [40324.500911] [<ffffffffa040e78e>] dvb_frontend_swzigzag_autotune+0x13e/0x350 [dvb_core]
    [40324.500961] [<ffffffff8156b26a>] ? usb_control_msg+0xea/0x110
    [40324.501002] [<ffffffffa040f8aa>] dvb_frontend_swzigzag+0x29a/0x3c0 [dvb_core]
    [40324.501048] [<ffffffff810135da>] ? __switch_to+0x12a/0x4d0
    [40324.501083] [<ffffffff810a5d2d>] ? set_next_entity+0xad/0xd0
    [40324.501118] [<ffffffff810789ff>] ? try_to_del_timer_sync+0x4f/0x70
    [40324.501155] [<ffffffff81078a72>] ? del_timer_sync+0x52/0x60
    [40324.501191] [<ffffffff8175d345>] ? schedule_timeout+0x135/0x250
    [40324.501228] [<ffffffff81078620>] ? call_timer_fn+0x160/0x160
    [40324.501268] [<ffffffffa0412a64>] dvb_frontend_thread+0x454/0x7c0 [dvb_core]
    [40324.501311] [<ffffffff810affd0>] ? __wake_up_sync+0x20/0x20
    [40324.501351] [<ffffffffa0412610>] ? dvb_frontend_ioctl+0x160/0x160 [dvb_core]
    [40324.501393] [<ffffffff8108fb59>] kthread+0xc9/0xe0
    [40324.501423] [<ffffffff8108fa90>] ? flush_kthread_worker+0xb0/0xb0
    [40324.501460] [<ffffffff8176ab7c>] ret_from_fork+0x7c/0xb0
    [40324.501493] [<ffffffff8108fa90>] ? flush_kthread_worker+0xb0/0xb0
    [40324.501528] Code: 7d f8 41 bf 00 00 00 04 48 89 5d d8 4c 89 65 e0 4c 89 6d e8 41 89 f5 4c 89 75 f0 49 89 fe e8 6e ff ff ff 31 d2 41 89 c4 44 89 f8 <41> f7 f4 8b 0d ad 67 00 00 41 89 c7 44 89 e8 c1 f8 1f 89 c3 44
    [40324.501731] RIP [<ffffffffa0480b6a>] dib7000p_set_dds+0x3a/0x140 [dib7000p]
    [40324.501776] RSP <ffff88007995b928>
    [40324.516014] ---[ end trace 6a9fad8fe9eb61e2 ]---
    Anyone know how I can fix this? I have tried upgrading to the latest kernel (3.13) and have compiled and installed the latest v4l drivers.
    I have also tried various module parameters for the relevant modules, i.e.:
    options dvb_usb disable_rc_polling=1
    options dvb_usb_dib0700 force_lna_activation=1
    options dvb_core dvb_powerdown_on_sleep=0
    Nada!
    Any help much appreciated.

    The first thing to do is use smartctl to test the drive. Start with the short test ( smartctl -t short /dev/sdX ), and if it passes run the long test ( ... -t long ... ).

  • Need help - I2C write/read with TAOS TCS3414 light sensor using USB-8451

    Hello, I'm new to LabVIEW and need help setting up a VI that will allow me to communicate with a digital light sensor (TAOS TCS3414) using a USB-8451. I need to use the sensor to measure light from a light source that I designed and built as part of a project I'm working on. I've tried looking at several LabVIEW I2C examples but find them very confusing. I've used an Arduino to interface with the sensor successfully, but I need to use LabVIEW and don't understand how to write the program. The actions are simple: I need to initialize the sensor with a simple command, then request data from 8 data registers, and then read that data. The data will then be used in further calculations. The portion I need help with is writing to and reading from the sensor. I've attached the datasheet for the sensor as a guide. I can also provide the Arduino code that I use to read data from the sensor if that would help.
    Please keep in mind that I am completely new to LabVIEW. I really do want to learn from this but need quick results, so the more help the better. I would greatly appreciate any help or explanation.
    Attachments:
    TCS3414_Datasheet_EN_v1.pdf 1806 KB

    Hi Aaron,
    Here you go, this is made with a USB-8452.
    When you run the code, tick the power and DAC enable boxes.
    Maybe you can help me with my problem: I want to use a fiber to sense light from an LED.
    Do you use any fiber hardware with the TCS3414?
    Regards,
    Attachments:
    TCS3414.vi 63 KB
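
    For anyone stuck at the same point, the whole exchange the original post describes is just two I2C transactions: write the control register to power the part up and enable the ADC, then read the eight data registers. Here is a rough sketch of that sequence using the Linux smbus2 package purely to show the transactions (the USB-8451 API performs the same write and read steps); the 0x39 address, the 0x80 command bit, and the register offsets are recalled from the TAOS datasheet and must be verified against the attached PDF.

        # Sketch of the TCS3414 init-then-read sequence. Register constants are
        # recalled from the TAOS datasheet; check them against the PDF attached above.
        from smbus2 import SMBus

        TCS3414_ADDR = 0x39      # 7-bit slave address (verify in the datasheet)
        CMD          = 0x80      # "select command register" bit
        CONTROL_REG  = 0x00
        POWER_ON_ADC = 0x03      # POWER (bit 0) + ADC_EN (bit 1)
        DATA_BASE    = 0x10      # first of the eight data registers (verify)

        with SMBus(1) as bus:
            # 1) initialize: power the part up and enable the ADC
            bus.write_byte_data(TCS3414_ADDR, CMD | CONTROL_REG, POWER_ON_ADC)
            # 2) read the eight data registers (low/high byte pairs, four channels)
            raw = [bus.read_byte_data(TCS3414_ADDR, CMD | (DATA_BASE + i))
                   for i in range(8)]
            channels = [raw[i] | (raw[i + 1] << 8) for i in range(0, 8, 2)]
            print("channel values:", channels)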

  • I2C Write/Read Error

    Error -301742 occurred at NI-845x I2C Write Read.vi
    "Possible reason(s):NI-845x:  The slave did not acknowledge an address+direction byte transmitted by the I2C master. Reasons include the incorrect address set in the I2C configuration or the incorrect use of the 7-bit address. When entering an address to access a 7-bit device, do not include the direction bit. The NI-845x Basic I2C API internally sets the direction bit to the correct value, depending on the function (write or read). If your datasheet specfies the 7-bit device address as a byte, discard the direction bit (bit 0) and right-shift the byte value by one to create the 7-bit address."
    We have connected pull-up resistors on the SDA and SCL lines.
    The error seems to be strange in our case,
    We have two setup
    1. Using a push button to power the device and wires through the pull up resistor.
    2. We are trying to communicate using an NI 2569 for the SDA and SCL lines and the power supply to the device.
    We do not observe the error in case 1; however, in case 2 we do. Attached are the waveform pictures for both cases.
    Note that the clock signal has a rise delay in case 2; is this because of the 2569 relay module?
    Any help will be highly appreciated.
    Regards,
    Naru 
    Attachments:
    Case 1.jpg 302 KB
    Case 2.jpg 302 KB

    I don't quite understand your setup, and I'm not sure what I'm looking at with the figures you posted. What does the 2569 have to do with the 8451? You also said
    1. Using a push button to power the device and wires through the pull up resistor.
    I have no idea what this means.
    It's hard to tell from the pictures, but for the Case 1 figure, it appears that you are generating a correct start condition. It also indicates that you are sending a read transaction. You didn't indicate what device you're talking to, so we can't say for sure if this is correct.
    In the second case it also appears that you are generating a correct start condition. The slow rise times that you are seeing on the clock are probably due to extra capacitance on the bus, or weak pull-ups, or both. This transaction appears to be a write transaction, but with no data.
    At this point, without knowing what device you're talking to, a better explanation of your setup, and the code itself, I'm not sure how much help we can be. 
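
    One detail worth spelling out from the error text quoted above: if a datasheet gives the slave address as a full byte with the R/W bit already in bit 0 (for example 0x90 for write and 0x91 for read), the value the NI-845x configuration expects is that byte shifted right by one. A tiny illustration:

        # Convert a datasheet "address byte" (R/W bit in bit 0) to the 7-bit
        # address expected by APIs that append the direction bit themselves.
        def to_7bit(address_byte):
            return (address_byte & 0xFE) >> 1    # drop the R/W bit, shift right

        print(hex(to_7bit(0x90)))   # 0x48, the value to enter in the I2C configuration
        print(hex(to_7bit(0x91)))   # also 0x48: read and write forms are the same device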

  • I2C interface (Sensor Data Acquisition) LabVIEW

    Hi all!
    Hope you are doing great!
    Well I have a question which is more about asking all you for an idea!
    The Situation:
    I have a circuit board which has an on-off valve, digital pressure sensors (manufacturer AMD), and humidity/temperature sensors (make: IST Hygrosens). On the board all the sensors communicate as I2C slave devices, and all the data from the sensors is read into an I2C-to-USB adapter chip which connects to the PC via a normal USB cable.
    In addition to this board, there is a relay circuit with a simple 1-pole relay which controls an on-off valve on the above circuit board. This valve is controlled completely separately, via a coaxial cable running from the relay directly to the valve. But the relay board has an I2C interface, and it also acts as a slave device. The relay board has the same I2C-to-USB adapter chip.
    Both the relay board and the sensor board connect via USB to the PC, which I suppose is the master device.
    The software written for this arrangement and for the sensor data acquisition is old, and a lot of problems are coming up. I have almost given up troubleshooting.
    I now want to port this automation system to LabVIEW. I searched the NI website, which lists a card called the USB-8451 that supports the I2C interface... I am a beginner in LabVIEW and can't really make sense of how I should go about implementing this system in LabVIEW.
    If you could please help me at least get started (like what hardware I would need, etc.), so I have a clear picture, it would be a great help!
    Looking forward to your inputs and Thank you so much in advance!
    Cheers!
    Pramit

    NI provides a LabVIEW API for the USB-8451. If you use the USB-8451, you would use the provided API to write a program that controls it, and you would do all of the I2C communication in your program. This would mean using functions/subVIs to connect to the USB-8451 and then perform I2C operations through it.
    If you use the USB adapter chip already on your device, then you would probably use NI-VISA as the driver and have to get or write your own API to talk to that specific device. The manufacturer may have a LabVIEW (or other) API available for talking to the device that you could get. If not, then you would have to understand the details of how to communicate with the device and then write an API using NI-VISA serial functions. This would mean making NI-VISA the assigned driver for the device and then using VISA Serial functions/subVIs to send the messages and receive the responses.
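
    For the NI-VISA route, the pattern is the same whichever bridge chip sits behind the serial port: open the VISA resource, write the command frame the device documents, read back the reply. A minimal sketch using the Python pyvisa binding instead of the LabVIEW VISA VIs (the calls map one-to-one); the resource name, baud rate, and command bytes are placeholders for whatever your adapter's protocol defines.

        # Sketch of the NI-VISA approach using pyvisa; the LabVIEW VISA VIs map 1:1.
        # 'ASRL3::INSTR', the baud rate, and the command bytes are placeholders.
        import pyvisa

        rm = pyvisa.ResourceManager()
        inst = rm.open_resource("ASRL3::INSTR")    # the adapter enumerates as a serial port
        inst.baud_rate = 19200
        inst.timeout = 1000                        # ms

        inst.write_raw(bytes([0x01, 0x02]))        # command frame from the adapter's manual
        print(inst.read_bytes(1))                  # reply length/format also set by the adapter
        inst.close()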

  • [ERROR] [DAAPI] iso=-1:[I2C] cannot open /dev/i2c-1 bus file

    I am running the TemperatureSensor project from lesson 2
    I can detect the BMP180
    pi@raspberrypi ~/javame81ea/bin $ sudo i2cdetect -y 1
         0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
    00:          -- -- -- -- -- -- -- -- -- -- -- -- --
    10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
    20: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
    30: -- -- -- -- -- -- -- -- -- -- -- UU -- -- -- --
    40: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
    50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
    60: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
    70: -- -- -- -- -- -- -- 77
    I can retrieve the I2CDeviceConfig info
    Controller Name:   null
    Controller Number: 1
    Address:           0x77
    Address Size:      7
    Clock Frequency:   3400000
    My API permissions are
    jdk.dio.DeviceMgmtPermission "*:*" "open"
    jdk.dio.i2cbus.I2CPermission "*:*"
    But when I attempt
    myDevice = DeviceManager.open(config);
    I get
    [ERROR] [DAAPI] iso=-1:[I2C] cannot open /dev/i2c-1 bus file
    Any suggestions?

    Hi Tom,
    I stepped back and removed the BMP180 entirely. The UU remained.
    Googling "raspberry pi UU" yielded
      https://github.com/raspberrypi/linux/issues/581
    which referenced
      http://www.raspberrypi.org/forums/viewtopic.php?f=44&t=31420&p=612513#p612513
    which referenced
      https://www.hifiberry.com/forums/topic/i2cdetect/
    which recommended blacklisting snd_soc_wm8804
    Googling "what is snd_soc_wm8804" yielded
      http://www.raspberrypi.org/forums/viewtopic.php?f=44&t=81750
    which explained the whole matter
    Anyhow, the UU is gone. Only 77 is detected now. The mystery continues, which I accept as an interesting challenge.
    Victor

  • Accessing I2C From Java

    Has anyone had any experience accessing I2C devices using Java? I am currently running a J2ME CVM on arm-linux, and I have the device file /dev/i2c-0. I have run a few supplied test programs which scan the bus and show me that the devices are operational. But now I need to be able to write and read using Java.
    Thanks

    Since you need to use a bunch of different ioctls with this device, it seems that a JNI adaptor would be in order.
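
    To show what such an adaptor ultimately has to wrap, this is the user-space sequence against /dev/i2c-0 on Linux, sketched in Python for brevity (a JNI layer would issue the same open/ioctl/read/write calls from C). I2C_SLAVE (0x0703) is the standard slave-select ioctl from linux/i2c-dev.h; the slave address below is a placeholder.

        # The Linux i2c-dev sequence a JNI adaptor would wrap: open the bus device,
        # select the slave with the I2C_SLAVE ioctl, then use plain read()/write().
        import os, fcntl

        I2C_SLAVE = 0x0703           # from linux/i2c-dev.h
        SLAVE_ADDR = 0x48            # example 7-bit address (placeholder)

        fd = os.open("/dev/i2c-0", os.O_RDWR)
        fcntl.ioctl(fd, I2C_SLAVE, SLAVE_ADDR)

        os.write(fd, bytes([0x00]))  # e.g. set the register pointer
        print(os.read(fd, 2).hex())  # read two bytes back from that register
        os.close(fd)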

  • I2C data send using USB-to-I2C converter

    Hi guys, me again.
    Just when things started to look good, another problem comes up.
    We thought we would buy the NI USB-8451 to implement I2C. We contacted NI as well, but then another student suggested we can implement I2C using the Devantech USB-to-I2C converter.
    Can it be implemented using VISA (all I2C operations like read, write, ACK, clock stretching)?
    Thanks in advance

    I never used that device, but ran across it while searching for alternate I2C and SPI devices. That uses a standard FTDI interface chip, so the device shows up as a serial device. So, yes, you'd be able to use VISA to communicate with it. Please check the documentation for the device in the future.
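
    As a concrete illustration of "it shows up as a serial device": driving such a converter reduces to writing its documented command frame to a COM port and reading the reply, whether you do that through VISA or a plain serial library; the converter's firmware handles bus-level details like ACKs and clock stretching, so from the host you only see command/response frames. A sketch with pyserial; the port name and the frame layout (command byte, address byte with R/W bit, register, byte count) follow the general shape of the Devantech USB-I2C protocol, but the actual byte values must come from Devantech's documentation.

        # Sketch: a USB-to-I2C converter that enumerates as a serial port.
        # Frame bytes below are placeholders; take the real values from the
        # Devantech USB-I2C documentation.
        import serial

        port = serial.Serial("COM4", 19200, timeout=1)   # port name/baud are examples

        CMD_READ  = 0x55        # placeholder command byte from the converter's protocol
        SLAVE_RD  = 0x91        # 8-bit address with the read bit set, per the manual
        REGISTER  = 0x00
        NUM_BYTES = 2

        port.write(bytes([CMD_READ, SLAVE_RD, REGISTER, NUM_BYTES]))
        print(port.read(NUM_BYTES).hex())
        port.close()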

  • Implementing I2C on Siemens TC65

    Hi,
    I need assistance with the I2C protocol, just some basic questions.
    Do you need a Connector object and a connection to start and control I2C, or is all control done by AT commands?
    If you need a connector and connection, what type of connection should I use?
    And if it is all done by AT commands, how do you control the input stream using only AT commands?
    Example code would be welcome; if not, any help is appreciated.

    AT commands require a terminal program (there are examples in the Siemens TC65 AT command set documentation); I don't have that in Java ...
    After issuing the command AT^SSPI, a stream connection is established. Send/receive until the # sign occurs ...
    The real question is how to make the stream readable in a Java program, probably through a Connector and Connection. But how do you configure it for I2C communication?

  • Best way to chain several I2C write / read cycles

    Hi,
    I'm developing a custom sensor that communicates over I2C with the NXT.
    I need to perform several read/write actions sequentially over I2C.
    I know that the NXT toolkit is limited to single-frame sequences and that it is possible to chain them with "pink NXT wires" or "any other wire".
    I've tried several times, but I only get the first frame working. The others aren't executed.
    Does anyone here have enough experience or advice on how to perform several I2C operations sequentially in the same VI?
    Many thanks in advance.

    What version of LabVIEW / NXT Module are you using?
    The most recent LVLM supports multi-frame stacked sequences.
    Take a look at the ultrasonic sensor block diagram, it has a configure step followed by actually reading the data.
    One problem can be trying to read/write the I2C channel too fast. You may need to insert some arbitrary wait times between sequential I2C calls.
    If you post your code, it might be easier to tell what's not working.
