Interfacing UART on Zedboard

Hi,
I want to use uart_tx and uart_rx in my design. While making the pin connections in the .xdc, I set
set_property LOC D11 [get_ports uart_tx]
set_property LOC C14 [get_ports uart_rx]
(D11 and C14 are the required pins, which I found in the ZedBoard Hardware Reference Guide).
But this gives me a critical warning: "cannot set LOC property of ports".
I read somewhere that D11 and C14 belong to the PS, so they cannot be constrained this way in the .xdc? If so, how do I use the UART, and where do I make the connections?
Thanks

There are two ways to use the UART on the processor. The easy way is to write some C code that runs on the PS, and the hard way is to build an AXI master and talk to the UART controller directly. The Zynq UART controller is described in some detail in UG585, Chapter 19.

Similar Messages

  • Dll spontaneous unloading

    I am using a third-party DLL in my project. The problem is that this DLL
    spontaneously unloads and loads during program execution. This causes
    problems with calling functions: sometimes the functions return errors.
    I use LabVIEW classes with virtual inheritance in this project. This allows
    me to decide the control interface (UART, SPI, GPIB ...) at run time. In
    other words, I have a parent class called INTERFACE and a few classes
    (UART, SPI, GPIB ...) inherited from INTERFACE. I use the external DLL in
    the SPI class functions, which are called dynamically at run time if I
    decide to use the SPI interface. So there are no explicit DLL function
    calls in my code. This, I think, causes the problem with the DLL
    spontaneously loading and unloading. Am I right? What do you think? If I am
    right, I think I have to tell LabVIEW not to unload this DLL until program
    termination. Could somebody give me a clue how I can do this? Thank you.

    Alex67 wrote:
    I've tried a VI which calls the dll function explicitly. The situation is the same. Below is a part of the dll log file with my comments. They told me that the PROCESS ATTACHED/DETACHED messages correspond to dll load/unload. Is it normal that LV loads the dll so many times? I found in one topic of this forum that LV loads a dll when a VI containing the dll call is loaded. But in my case I can see a lot of "ATTACHED" messages at run time. There are also a few "DETACHED" messages at run time. Do you think it is normal LV behavior to unload a dll at run time?
    [25.06.2008] [15:31:23] PROCESS ATTACHED m_hwnd :: (1) 0x00000000
    [25.06.2008] [15:31:24] PROCESS ATTACHED m_hwnd :: (2) 0x00000000
    [25.06.2008] [15:31:24] PROCESS ATTACHED m_hwnd :: (2) 0x00000000
    [25.06.2008] [15:31:25] PROCESS ATTACHED m_hwnd :: (2) 0x00000000
    LV project started
    [25.06.2008] [15:31:46] PROCESS ATTACHED m_hwnd :: (2) 0x00000000
    Test VI loaded.
    Starting Test VI...
    [25.06.2008] [15:32:17] PROCESS ATTACHED m_hwnd :: (2) 0x00000000
    [25.06.2008] [15:32:20] PROCESS ATTACHED m_hwnd :: (2) 0x00220378
    [25.06.2008] [15:32:20] PROCESS ATTACHED m_hwnd :: (2) 0x00220378
    [25.06.2008] [15:32:46] PROCESS DETACHED m_hwnd :: (3) 0x00220378
    [25.06.2008] [15:33:18] PROCESS ATTACHED m_hwnd :: (2) 0x00000000
    [25.06.2008] [15:33:30] PROCESS ATTACHED m_hwnd :: (2) 0x0030036E
    [25.06.2008] [15:33:30] PROCESS ATTACHED m_hwnd :: (2) 0x0030036E
    [25.06.2008] [15:33:30] PROCESS ATTACHED m_hwnd :: (2) 0x0030036E
    [25.06.2008] [15:33:30] PROCESS ATTACHED m_hwnd :: (2) 0x0030036E
    [25.06.2008] [15:34:23] PROCESS ATTACHED m_hwnd :: (2) 0x00000000
    [25.06.2008] [15:36:15] PROCESS DETACHED m_hwnd :: (3) 0x0090039A
    [25.06.2008] [15:37:00] PROCESS ATTACHED m_hwnd :: (2) 0x00000000
    [25.06.2008] [15:37:12] PROCESS ATTACHED m_hwnd :: (2) 0x009B03EE
    [25.06.2008] [15:37:12] PROCESS ATTACHED m_hwnd :: (2) 0x009B03EE
    [25.06.2008] [15:37:12] PROCESS ATTACHED m_hwnd :: (2) 0x009B03EE
    Test stopped.
    [25.06.2008] [15:37:48] PROCESS ATTACHED m_hwnd :: (2) 0x00000000
    [25.06.2008] [15:37:52] PROCESS DETACHED m_hwnd :: (3) 0x00000000
    [25.06.2008] [15:37:52] PROCESS DETACHED m_hwnd :: (3) 0x00000000
    [25.06.2008] [15:37:53] PROCESS DETACHED m_hwnd :: (3) 0x00000000
    Test VI unloaded.
    [25.06.2008] [15:38:39] PROCESS DETACHED m_hwnd :: (0) 0x00000000
    LV project unloaded.
    The problem is probably in the dynamic invocation of your VIs. A VI that loads a DLL has to take care to unload it when it is unloaded itself; otherwise the DLL stays lingering in memory. So that is what LabVIEW does: as soon as a VI that loaded a DLL goes out of memory, it unloads that DLL too.
    Each Call Library Node in a diagram loads the DLL explicitly when it is itself loaded into memory and unloads it when it goes out of memory. As soon as the DLL is loaded at least once, additional loads simply increment a reference counter for that DLL; on unloading, that counter is decremented, and when it reaches 0 Windows unloads the DLL from memory.
    If your DLL has trouble with that because it stores resources globally between calls, then you need to make sure that at least one VI referencing that DLL stays in memory for as long as the DLL is needed. You could do that by adding a VI that loads a function from that DLL to a place in your main application that is always in memory for the duration of your application. Or extend the INTERFACE class to hold on to an Initialize function for each possible interface derived from it, or whatever.
    Just have one VI that does not get unloaded whenever the SPI class decides that it is not currently needed anymore.
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions

  • Ni-imaq pci1422 serial port

    How do I program the PCI-1422 serial port through NI-IMAQ? The card's User Manual says the RS-232 Serial Interface (UART) is accessible through the NI-IMAQ driver (p. 3-3), but there is no reference to it in the NI-IMAQ User or Reference manuals.
    (NI-IMAQ rev. 2.5)
    The only info I could find was the file SerialControl.txt in the \bin directory of the distribution.

    Nathan,
    You can access the serial interface through imgSessionSerialWrite and imgSessionSerialRead in IMAQ.dll. These low-level functions are not documented because most serial communication should be taken care of by the camera file. They remain available for use when the serial commands you need are not implemented there.
    For LabVIEW, you can find these low-level VIs in \vi.lib\vision\driver\imaqll.llb
    Hope this helps.
    Ken
    Applications Engineering
    National Instruments

  • How would I interface an instrument with a serial UART output to LabVIEW?

    I am trying to gather some information for an upcoming project. I have an instrument that outputs a serial UART stream, and I would like to interface it with LabVIEW. This is the product I am planning to interface with: http://www.pressureprofile.com/products-digitacts.php I am just learning how to use LabVIEW, so this is all new to me; any help would be greatly appreciated.

    First, I would recommend taking a look at the Basic Serial Read and Write example that ships with LabVIEW. Second, RTFM (Read The Friendly Manual) for the device. A Google search for serial comms will provide a reasonable background for how serial communications can be configured.
    As to how to implement the specific serial protocol in LabVIEW: use a VISA call to the serial port to configure the various properties of the VISA ser:instr class, and if you learn the equipment's expected settings and what, and when, it responds, the "nuts and bolts" of wiring your device driver becomes reasonably simple.
    Jeff

  • ZedBoard Xilinx Zynq-7000 interface using LabVIEW

    Hello,
    I am doing my thesis on the ZedBoard, developing a DDR3 memory test and verification. For that I need to implement a dedicated LabVIEW graphical user interface based on NI Measurement Studio.
    The topic is: Study of an algorithmic test setup for DDR3 SDRAM with a Xilinx Zynq SoC.
    I have implemented my algorithm in the Xilinx SDK, but I need to make a GUI using LabVIEW that can execute these programs. Please let me know how I can do this.
    1. Is it possible to directly access the Zynq SoC using LabVIEW? If yes, how?
    2. Or, if I need to do the coding in the Xilinx SDK, how can I run this code from LabVIEW?
    Please give me a detailed reply, since I am new to LabVIEW and do not understand how to start. If you have any example design, please share it with me.
    Thanks & Regards,
    Nithin Ponnarasseri

    No, you can't develop directly in LabVIEW and deploy that program to the Zynq board. NI has its own Zynq-based hardware platforms (cRIO and myRIO), but the tools to target those boards are specific to the NI hardware implementation and won't work with other hardware. Developing an interface for another hardware platform is a lot of work and needs to be adapted for every single flavor of a new hardware platform, and NI does not support this for other hardware.
    So your option will be to develop an application with the Xilinx SDK for the Zynq ARM controller and provide some form of communication interface (serial port, TCP/IP, or similar) in that application, over which you can send commands from LabVIEW to your embedded application.
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions

  • Uart in simulink

    Hey,
    I hope I am at the right forum for my question; if not, you can redirect me. I have a problem: I want to send values from my ZedBoard to my computer so I can log and analyse them. I thought I could do that with the UART-to-USB port.
    I want to realise this in the Simulink environment. I use Windows 8.1 and System Generator. I already have a working design and would now like to send a value from that design through the serial connection back to MATLAB, but I'm not sure how to program it. For the other inputs and outputs I just use the Gateway In and Gateway Out blocks provided by Xilinx, and I can connect to JA1 and JB1 on the ZedBoard. How do I set up communication with the UART so it sends data through the serial bus to my computer (maybe I can also use Tera Term to read the values)? Are there any timing constraints I have to look out for?
    thanks in advance!

    Hi,
    I'm not familiar with the ZedBoard.
    Does the FPGA fabric have access to the UART->USB interface pins?
    In that case the "UART in a Black Box" approach proposed by vsrunga will be the right way.
    But if this interface is only accessible by the ARM cores, you have to do it another way.
    There shouldn't be any timing issues, since a UART connection is quite slow.
    However, if you use a UART core, it also requires a circuit that generates the correct baud rate. This is often included in the IP-core sources, but sometimes requires the correct clock frequency value to be set somewhere.
    Does your design create ASCII data? Otherwise Tera Term will receive something, but you will be unable to read it.
    You can also write some MATLAB code that receives the values via a serial object and then converts them to a useful printable or plottable format for displaying the data.
    Have a nice synthesis
      Eilert
     

  • Implementing the UART serial communication protocol in LabVIEW

    Good afternoon,
    I want to implement the UART serial communication protocol in LabVIEW.
    The project description is roughly this:
    - A LabVIEW interface to receive the data sent over the UART serial line;
    - An NI USB-6212 DAQ will receive the data on a digital or analog pin (for now I am using an analog pin);
    - A microcontroller that sends data over the UART serial line;
    - Windows 7 operating system.
    I want to implement the protocol myself for these reasons:
    - I used VISA serial communication and a generic USB-Serial cable, but I could not get reception rates as fast as I need; when I tried to transmit data faster, it arrived corrupted;
    - The examples I found of the protocol implemented use LabVIEW FPGA, and from what I read my acquisition board (NI USB-6212) does not support FPGA.
    Do you have any example or suggestion of how I should proceed?
    Thanks in advance.
    Regards,
    Fernando Esquírio Torres

    Good morning, Fernando,
    Here are some links to help with your implementation. If you have any questions, please contact us.
    What Is the Basic Architecture for Serial Communication?
    http://digital.ni.com/public.nsf/allkb/E0D95CB9249FB8CF86256C68007B1F81?OpenDocument
    Can I Do 9-bit Serial Communication Instead of 7 or 8 bits?
    http://digital.ni.com/public.nsf/allkb/3BDC7FF03541F772862564990057F919?OpenDocument
    9-Bit Serial Writing in LabVIEW
    http://digital.ni.com/public.nsf/allkb/E0D95CB9249FB8CF86256C68007B1F81?OpenDocument
    Serial Communication Starting Point
    http://zone.ni.com/devzone/cda/tut/p/id/4049
    Serial Communication - Basic Serial Write and Read
    http://zone.ni.com/devzone/cda/epd/p/id/2669
    Sending or Receiving Binary / Hexadecimal Data Using NI-VISA in LabVIEW
     http://digital.ni.com/public.nsf/allkb/33C1056D66078118862577450061E683?OpenDocument
    Sending and receiving serial commands using VISA
    http://zone.ni.com/devzone/cda/epd/p/id/2423
    Sincerely,
    Mauro Vera
    Applications Engineer
    National Instruments

  • PIC16F877A+UART for Display of Measurements LABVIEW

    I need to send four measurement parameters to the PC for display and storage. The software for this is LabVIEW. I am new to PIC and interfacing, but by putting in a lot of effort and asking questions of experts, I have accomplished the acquisition of all the data so far.
    I could also transmit the string "HELLO" through RS-232 with my UART-to-USB converter, and observed the result in HyperTerminal.
    I am quite happy to have reached this stage, but on the other hand my knowledge of LabVIEW is unfortunately very poor.
    I have heard that if I have control over the format of the data from the PIC to LabVIEW, it is not that difficult.
    Please find the attached pictures of the design for better understanding.
    Questions:
    1. Can you please explain what exactly is meant by the format of data? (I also don't know the format of the data I sent through USB.)
    2. After specifying the format, what would be the next stage?
    3. If possible, please provide me with some sample code for Hi-Tech C, and for LabVIEW as well.
    Attachments:
    PIC16F877A+UART for LABVIEW_page1_image1.png ‏587 KB
    PIC16F877A+UART for LABVIEW_page2_image1.jpg ‏72 KB
    PIC16F877A+UART for LABVIEW_page3_image1.jpg ‏71 KB

    void read_adc(void)                        //// This function generates the average reading value of the ADC
    {
        unsigned short i;
        unsigned long result_temp = 0;
        for (i = 2000; i > 0; i -= 1)          // loop 2000 times to get an average value
        {
            ADGO = 1;                          // ADGO is bit 2 of the ADCON0 register
            while (ADGO == 1);                 // ADC started; ADGO=0 when the conversion finishes
            result = ADRESH;
            result = result << 8;              // shift high byte left by 8 bits
            result = result | ADRESL;          // combine with the low byte: 10-bit result from ADC
            result_temp += result;
        }
        result = result_temp / 2000;           // average value
    }
    unsigned short read_temp(void)             //// This function returns the value generated by the ADC
    {
        unsigned short temp;
        temp = result;
        return temp;
    }
    ==========================================================================================================
    //Since only 8 bits at a time can be sent with USART and the ADC value is 10 bits.
    //How to recombine the Lo and Hi 8-bit values to get 10-bit value?
    do {
        for (i = 0; i <= 7; i++) {
            myarray[i] = read_temp();
        }
        for (i = 0; i <= 7; i++) {
            putch(Lo(myarray[i]));             // send the lower 8 bits of the ADC reading
            putch(Hi(myarray[i]));             // send the upper 8 bits of the ADC reading
        }
    } while (1);
    ===========================================================================
    Please let me know your feedback about this code?Is it sufficient for sending analog measurement values?
    do {
        USART_Write(255);                 // start byte
        an0 = ADC_Read(0) >> 2;
        an0 = an0 - 1;                    // 1 is subtracted from the ADC reading because at 5 V in,
        USART_Write(an0);                 // the value would be 255 and LabVIEW would mistake the data
                                          // for the start byte; data can range from 0 - 254
        an1 = ADC_Read(1) >> 2;
        an1 = an1 - 1;
        USART_Write(an1);
        an2 = ADC_Read(2) >> 2;
        an2 = an2 - 1;
        USART_Write(an2);
        an3 = ADC_Read(3) >> 2;
        an3 = an3 - 1;
        USART_Write(an3);
        an4 = ADC_Read(4) >> 2;
        an4 = an4 - 1;
        USART_Write(an4);
    } while (1);
    How about this?

  • Constructing an RS-232 interface using a USB interface's digital I/O

    Hi,
    I've got an NI USB interface to work with, which has plenty of digital I/O available (USB-6259).
    I need to communicate via RS-232 with a few instruments, which I've already done using USB-to-RS-232 devices.
    However, I would like to do everything on a single NI USB DAQ interface. Is it possible to use a few digital lines from the USB-6259 to create an RS-232 port? It seems this would be relatively straightforward, but I haven't heard of it being done.

    Not recommended. The voltage levels are different, there is no FIFO for the communication, no UART doing bit sampling and error checking, and no clock for the data; it is a lot harder than you would think.
    Matthew Fitzsimons
    Certified LabVIEW Architect
    LabVIEW 6.1 ... 2013, LVOOP, GOOP, TestStand, DAQ, and Vision

  • How to make in/out Port vis work with COM interface under different windows versions?

    Hello!
    I know that accessHW is necessary for this, so I downloaded the file from the NI homepage, but there are always some problems with it.
    My program should get the voltage impulse between the DTR and RTS pins of a COM interface. After I installed accessHW under Windows NT, the voltage between them changed automatically from 0 mV to about 24 mV, and the program worked well. But this does not work under Windows 98 and Windows 2000: the voltage is always 0, so the device cannot generate any voltage impulse.
    Could someone give me a software solution for this? Thanks!
    P.S.: The LabVIEW version I use is 6.1.
    Le

    You shouldn't depend on a voltage difference between the two pins unless you need something like 24 volts and not 24 millivolts. All of the signal lines are referenced to ground, and they usually swing between -3 to -12 and +3 to +12 volts. If you truly saw a 24 millivolt difference between two signal leads, all that means is that they are both at the same logic state. A difference of 24 millivolts is not much and has to do with the UART in your computer, not accessHW, I believe; though maybe what happened is that the difference is 24 mV when both are logic "1" and less when both are logic "0", or vice versa. If your device requires power from the serial port, then what you'll have to do is control the signal lines in your program, and I
    would recommend VISA for that instead of In Port/Out Port. Also check the voltage requirements of your device. If this is a device you designed, it should be designed for the lower voltages on newer PCs and laptops; in other words, I would depend on 3 volt signals instead of 12. There are numerous references to RS-232 or EIA-232 on the web; I think a review of the electrical specifications is needed.

  • How to use vivado hls::mat with AXI-Stream interfaces (not AXI4 video stream) ?

      Hello, everyone. I am trying to design an image processing IP core with Vivado HLS 2014.4. From XAPP1167 I know that the video functions provided by Vivado HLS are meant to be used with AXI4 video streams and VDMA. However, for some special reasons I want to write/read image data to/from the IP core through AXI-Stream interfaces and AXI-DMA.
      To verify the feasibility, a test IP core named detectTest was designed as follows. The function of this IP core is to read a 320x240 8-bit gray image (bits 7-0 of INPUT_STREAM_TDATA) from the AXIS port "INPUT_STREAM" and then output it unchanged. I built a Vivado project for the ZedBoard and tested the IP core with an AXI-DMA. Experimental results show that the IP core works normally, so it seems possible to use hls::Mat with AXIS.
    #include "hls_video.h"
    #include "hls_math.h"

    typedef ap_axiu<32, 1, 1, 1> AXI_VAL;
    typedef hls::Scalar<HLS_MAT_CN(HLS_8U), HLS_TNAME(HLS_8U)> GRAY_PIXEL;
    typedef hls::Mat<240, 320, HLS_8U> GRAY_IMAGE;

    #define HEIGHT 240
    #define WIDTH 320
    #define COMPRESS_SIZE 2

    template<typename T, int U, int TI, int TD>
    inline T pop_stream(ap_axiu<sizeof(T) * 8, U, TI, TD> const &e) {
    #pragma HLS INLINE off
        assert(sizeof(T) == sizeof(int));
        union {
            int ival;
            T oval;
        } converter;
        converter.ival = e.data;
        T ret = converter.oval;
        volatile ap_uint<sizeof(T)> strb = e.strb;
        volatile ap_uint<sizeof(T)> keep = e.keep;
        volatile ap_uint<U> user = e.user;
        volatile ap_uint<1> last = e.last;
        volatile ap_uint<TI> id = e.id;
        volatile ap_uint<TD> dest = e.dest;
        return ret;
    }

    template<typename T, int U, int TI, int TD>
    inline ap_axiu<sizeof(T) * 8, U, TI, TD> push_stream(T const &v, bool last = false) {
    #pragma HLS INLINE off
        ap_axiu<sizeof(T) * 8, U, TI, TD> e;
        assert(sizeof(T) == sizeof(int));
        union {
            int oval;
            T ival;
        } converter;
        converter.ival = v;
        e.data = converter.oval;
        e.strb = -1;      // set it to sizeof(T) ones
        e.keep = 15;      // e.strb;
        e.user = 0;
        e.last = last ? 1 : 0;
        e.id = 0;
        e.dest = 0;
        return e;
    }

    GRAY_IMAGE mframe(HEIGHT, WIDTH);

    void detectTest(AXI_VAL INPUT_STREAM[HEIGHT * WIDTH], AXI_VAL RESULT_STREAM[HEIGHT * WIDTH]) {
    #pragma HLS INTERFACE ap_fifo port=RESULT_STREAM
    #pragma HLS INTERFACE ap_fifo port=INPUT_STREAM
    #pragma HLS RESOURCE variable=RESULT_STREAM core=AXI4Stream metadata="-bus_bundle RESULT_STREAM"
    #pragma HLS RESOURCE variable=INPUT_STREAM core=AXI4Stream metadata="-bus_bundle INPUT_STREAM"
    #pragma HLS RESOURCE variable=return core=AXI4LiteS metadata="-bus_bundle CONTROL_STREAM"
        int i, j;
        for (i = 0; i < HEIGHT * WIDTH; i++) {
            unsigned int instream_value = pop_stream<unsigned int, 1, 1, 1>(INPUT_STREAM[i]);
            hls::Scalar<HLS_MAT_CN(HLS_8U), HLS_TNAME(HLS_8U)> pixel_in;
            *(pixel_in.val) = (unsigned char) instream_value;
            mframe << pixel_in;
            hls::Scalar<HLS_MAT_CN(HLS_8U), HLS_TNAME(HLS_8U)> pixel_out;
            mframe >> pixel_out;
            unsigned int outstream_value = (unsigned int) *(pixel_out.val);
            RESULT_STREAM[i] = push_stream<unsigned int, 1, 1, 1>(
                (unsigned int) outstream_value, i == HEIGHT * WIDTH - 1);
        }
        return;
    }
      Then I modified the function of detectTest as follows. The modified IP core resizes the input image and then restores its original size. However, it did not work in the AXI-DMA test. The waveform captured by ChipScope shows that the ready signal of INPUT_STREAM was deasserted after receiving several pixels.
    GRAY_IMAGE mframe(HEIGHT, WIDTH);
    GRAY_IMAGE mframe_resize(HEIGHT / COMPRESS_SIZE, WIDTH / COMPRESS_SIZE);

    void detectTest(AXI_VAL INPUT_STREAM[HEIGHT * WIDTH], AXI_VAL RESULT_STREAM[HEIGHT * WIDTH]) {
    #pragma HLS INTERFACE ap_fifo port=RESULT_STREAM
    #pragma HLS INTERFACE ap_fifo port=INPUT_STREAM
    #pragma HLS RESOURCE variable=RESULT_STREAM core=AXI4Stream metadata="-bus_bundle RESULT_STREAM"
    #pragma HLS RESOURCE variable=INPUT_STREAM core=AXI4Stream metadata="-bus_bundle INPUT_STREAM"
    #pragma HLS RESOURCE variable=return core=AXI4LiteS metadata="-bus_bundle CONTROL_STREAM"
        int i, j;
        for (i = 0; i < HEIGHT * WIDTH; i++) {   // receiving block
            unsigned int instream_value = pop_stream<unsigned int, 1, 1, 1>(INPUT_STREAM[i]);
            hls::Scalar<HLS_MAT_CN(HLS_8U), HLS_TNAME(HLS_8U)> pixel_in;
            *(pixel_in.val) = (unsigned char) instream_value;
            mframe << pixel_in;
        }
        hls::Resize(mframe, mframe_resize);
        hls::Resize(mframe_resize, mframe);
        for (i = 0; i < HEIGHT * WIDTH; i++) {   // transmitting block
            hls::Scalar<HLS_MAT_CN(HLS_8U), HLS_TNAME(HLS_8U)> pixel_out;
            mframe >> pixel_out;
            unsigned char outstream_value = *(pixel_out.val);
            RESULT_STREAM[i] = push_stream<unsigned int, 1, 1, 1>((unsigned int) outstream_value, i == HEIGHT * WIDTH - 1);
        }
        return;
    }
      I also tried to delete or modify the following two lines in the modified IP core, but the transmitting problem remained. It seems that the IP core cannot work normally if the receiving block and the transmitting block are in different "for" loops. But if I cannot solve this problem, the image processing functions cannot be added to the IP core either. The document XAPP1167 mentions that "the hls::Mat<> datatype used to model images is internally defined as a stream of pixels". Is that what causes the problem? And how can I solve it? Thanks a lot!
    hls::Resize(mframe, mframe_resize);
    hls::Resize(mframe_resize, mframe);
     

    Hello
    So the major concept you need to learn/remember is that hls::Mat<> is basically "only" an HLS stream -- hls::stream<> -- actually an array of N channels (and you have N=1).
    Next, streams are FIFOs; in software they are modeled as infinite queues, but in hardware they have a finite size.
    The default is a depth of 2 (IIRC).
    In your first code you do:
    for all pixels loop {
      .. something to read pixel_in
       mframe takes pixel_in
       pixel_out is read from mframe
       .. write out pixel_out
    } // end loop
    If you notice, mframe never holds more than one pixel element, since as soon as you write to it, you unload it. In other terms, mframe never contains a full frame of pixels (but a full frame flows through it!).
    In your second coding, mframe has to actually contain all the pixels, because you have two for loops and you don't start unloading pixels until the first loop completes.
    Needless to say, your FIFO has a depth of 2, so you never actually read more than 3 pixels in.
    That's why you see the ready signal of the input stream drop after a few pixels; that's the back pressure being applied by the VHLS block.
    Where to go from there?
    Well, first: stop doing FPGA tests and ChipScope if you have not run cosim first and seen it pass. If you had run cosim and it had failed - or got stuck - you would have debugged there, rather than waiting for a bitstream to implement.
    Check UG902 about cosim and self-checking testbenches. Maybe for video you can't have self-checking, so at least you need visual checks of generated pictures - you can adapt XAPP1167 for that.
    For your design, you could increase the depth of the stream - XAPP1167 explains that - but here it is impractical or sometimes impossible to buffer a full-size frame.
    If you check the XAPP carefully, the design operates in "dataflow" mode; check UG902 for what this means.
    In short, dataflow means the HW functions operate in parallel: here the second loop will start executing as soon as data has been generated in the first loop. The link between the loops is a stream/FIFO, so as soon as a datum is generated in the first loop, the second loop can process it; this is possible because the processing happens in sequential order.
    Well, I leave you to read more.
    I hope this helps....

  • C-Series Module NI 9871, 4-Port, RS485/422 Serial Interface

    I have a question regarding the C-Series Module NI 9871, 4-Port, RS485/422 Serial Interface.
    I need to find out how it recovers the data from the link. I need to compare it with 8b/10b encoding.
    Does anyone know which UARTs are used, or how the data is recovered?
    Thanks
    John Lee

    Hi John,
    I have had a look into this and I cannot find any specific information on the transceiver or UART. However, I have come across an example using the 9871 with processing on a cRIO FPGA.
    To find this example, open LabVIEW » go to Help » Find Examples » search for 9871 and open NI-987x Serial Loopback.lvproj.
    Hopefully this should give you a good start. Once your serial data is passed to the FPGA, you can then decode the 8b/10b within the FPGA code.
    Best Regards,
    Ben B.
    Applications Engineer
    National Instruments UK & Ireland
    "I've looked into the reset button, the science is impossible!"

  • Why Error opening JTAG UART ?

    I just got a ZC702 board and am trying to run the demo application projects on it. I followed the design instructions and am trying to run the design over JTAG. However, when I try "Run As -> Launch on Hardware" with the template project "Hello World", it always shows an error message in the console: "Error opening JTAG UART @ localhost:-1".
    I am sure I have configured the Port as "JTAG UART" with the right baud rate in "Run Configurations --> STDIO Connection"; can anybody suggest why it still does not work?
    Thanks very much.

    I am also having trouble with the JTAG UART in the MicroBlaze Debug Module (MDM).  I selected the JTAG UART option in the IP Integrator. Then in SDK I connected STDIO to the console using the JTAG UART but still get this error.
    Error opening JTAG UART @ localhost:-1
    I am using Vivado and SDK version 2013.4.  My hardware is the Avnet ZedBoard with a Zynq 7020.  The Zedboard does not have a comm port attached to the FPGA side (PL) of the part so I need to use the JTAG UART for STDIO.
    Any advice is greatly appreciated.
      Pedro

  • Arduino MyRIO communication via UART

    Hi,
    I am trying to set up data communication between a myRIO-1900 and an Arduino via UART.
    The UART interface on the Arduino Uno runs from a 16 MHz clock.
    The UART on the myRIO only lets me set the baud rate. However, if the frequency is different from the Arduino's, the connection will not be established.
    How do I adjust the frequency of the myRIO UART? What is the default frequency of the myRIO UART? Where can I find that parameter?

    SergioMa wrote:
    Hi,
    I am trying to realize the data communication between MyRIO1900 and Arduino via UART.
    UART interface on Arduino Uno is under 16MHz clock.
    The 16 MHz clock has nothing to do with the baud rate of your Arduino. You need to set the baud rate on the Arduino as well and make sure it matches the baud rate you used for the myRIO. I would recommend 115200.

  • Test Pattern Output on Zynq Zedboard HDMI

    Hello,
    I am working on a project where we will eventually display the output from an image sensor through the HDMI output on a Zedboard.
    As a first step, I would like to build a simple project to test driving the HDMI output from an AXI stream.  The system would look like this
    Test Pattern Generator --> AXI4-Stream to Video Out --> HDMI Output Interface
                                          ^
                                          |
                              Video Timing Controller
    My assumption is that if this system can be made to work, it would be a good starting point for building the final system.
    I have attached the block diagram for this system in PDF format as well as the tcl file for generating the block in Vivado 2015.1.
    The major issue is that the Video Output core will not lock (an issue that has been brought up a lot in the forums).  I have tried to apply the various "not locking" solutions to this design without any success.  Each time, I run a behavioral simulation for 3-4 frames to see if the lock output will go high. 
    A few notes:
    1) The AXI tready signal from the Video Output core would not go high to trigger the test pattern generator unless I reset the Video Output core using the "locked" output from the Clocking Wizard block.  A simple "not" gate is used to get the right polarity.
    2) The system is configured for 1080p video (1920x1080). Everything is running on the same 148.5MHz clock derived from the 100MHz system clock.
    3) The Video Out core is configured as a Master.  The gen_clken input on the Video Timing Generator is tied high.
    Any suggestions on what is preventing the Video Output core from locking would be greatly appreciated.

    Thanks for confirming that the setup looks good. In the process of generating some screenshots, I discovered that the GND1 constant connected to the FID input was outputting 1 in the simulation. This was telling the Video Out block that it was always the second field of an interlaced video stream, which was preventing it from locking.
    No matter what I did, I couldn't get a constant connected to FID to output a logic '0'. I ended up uninstalling Vivado and reinstalling 2015.1, and it now seems to work.
