Perturb & Observe algorithm using LabVIEW

Hi,
I have a problem with the MPPT (maximum power point tracking) algorithm in LabVIEW. If you know how to implement it, please help me.

Peace be upon you, and the mercy and blessings of God.
Please ask questions clearly and completely, to push this Arabic forum forward so that everyone benefits.
To answer part of the question, the following may help you:
Using NI CompactRIO to Design a Maximum Power Point Tracking Controller for Solar Energy Application...
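
The article above covers the CompactRIO hardware side; for the algorithm itself, here is a minimal sketch of the Perturb & Observe loop in Python. The fixed step size is an assumed tuning value, and the sign convention for the duty-cycle perturbation depends on your converter topology; in LabVIEW the previous voltage and power would sit in shift registers of the control while loop.

```python
def perturb_and_observe(v, i, prev_v, prev_p, duty, step=0.005):
    """One Perturb & Observe iteration; returns the new duty cycle and state.

    NOTE: assumes duty up -> panel voltage up; flip the signs for a
    topology where increasing duty pulls the panel voltage down.
    """
    p = v * i                            # instantaneous panel power
    if p > prev_p:
        # The last perturbation gained power: keep moving the same way.
        duty += step if v > prev_v else -step
    else:
        # Power fell: reverse the perturbation direction.
        duty += -step if v > prev_v else step
    duty = min(max(duty, 0.0), 1.0)      # clamp to a valid duty cycle
    return duty, v, p                    # v and p feed back as prev_v, prev_p
```

Each call consumes one fresh voltage/current sample pair and drives the PWM output; a larger step tracks faster but ripples more around the maximum power point.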

Similar Messages

  • Sending email using LabVIEW 8.6

    I am trying to send an email using LabVIEW 8.6, but I am getting error 1172 when I try to do so.
    Kindly let me know what modifications I should make to the port and server settings.
    I took the code from this forum;
    I am posting a renamed version of it
    (gmail-1 is its actual name on the forum).
    Kindly suggest what I should do.
    Thanks
    Attachments:
    gmail_labview.vi ‏16 KB

    Dear sir,
    I have done as you said and observed that the error occurs at the final node, "SMTP Client - Send Message".
    I did not install .NET or any other programs.
    A screenshot of the error along with the failing node is attached for your reference.
    thanks
    Attachments:
    gmail_labview_error.JPG ‏73 KB
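
    Error 1172 wraps a .NET exception, so the root cause is usually the server, port, or SSL settings handed to the .NET SmtpClient rather than LabVIEW itself. For reference, these are the settings Gmail generally expects, sketched here in Python with placeholder credentials; the same server/port/SSL values go into the corresponding inputs of the VI:

    ```python
    import smtplib

    SERVER, PORT = "smtp.gmail.com", 587     # port 465 needs SMTP_SSL instead

    msg = "Subject: test\r\n\r\nHello from the test script."
    with smtplib.SMTP(SERVER, PORT, timeout=10) as smtp:
        smtp.starttls()                      # Gmail rejects unencrypted logins
        smtp.login("user@gmail.com", "app-password")   # placeholder credentials
        smtp.sendmail("user@gmail.com", ["dest@example.com"], msg)
    ```

    Gmail may also require an application-specific password when two-factor authentication is enabled, which produces the same send failure.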

  • How to use LabVIEW to control a robotic arm with an EMG signal

    Hello,
    I am working on a simulation of an active exoskeleton (wearable robot) for the upper limb using LabVIEW for my senior project. I need to use the EMG signal as an input to move the elbow joint (flexing and extending). I downloaded the LabVIEW Biomedical Toolkit to get a ready-made simulated EMG signal, but I have little experience in LabVIEW.
    The design criterion I am planning to use is to establish a threshold for the EMG signal using a comparator (above 0, for example) and set a counter that increments (+1) every time the signal passes the threshold. Then, specify a degree value for the counter (for example, when the counter reaches 10000) and feed this value to a simple simulated structure for the joint (a simple angle of 2 lines) or a meter to represent the movement, e.g. every 10000 counts = 1 degree of movement. Zero crossing can also be used instead of the comparator, and the signal will be filtered, which is easy to do. However, my problem is converting this logic into LabVIEW. I don't know how to set up a counter for the signal and make every number of counts correspond to a specific degree of movement, and I also don't know how to build the simulated joint structure in LabVIEW, or even how to drive a simple meter indicator in LabVIEW.
    I have only one month to do this project, so any help or ideas you provide are highly appreciated.
    Thank you,

    CarlFHellwig 
    Thank you for providing this example; I just implemented it in the software to check the counter results.
    In fact, the design criterion I decided on lately is to establish a threshold for the EMG signal using a comparator and correlate the EMG signal with the angles of movement of a simple simulated structure ("motor") for a single joint (a simple angle of 2 lines), e.g. 30, 60, 90, 120, 150 degrees for flexing and extending. Zero crossing can also be used instead of the comparator, and the signal will be filtered, which is easy to do.
    In other words, the idea is to drive a motor through different angles based on the input EMG signals. I am now stuck on developing the algorithm for how the angles relate to the RMS value of the signal while flexing and extending, and how to convert this RMS into angular velocity and angular position to form the simulation.
    I would be grateful if someone could point me to a person who did a similar project to discuss some issues.
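
    To make both ideas concrete (the threshold counter from the first post and the RMS-to-angle mapping from the follow-up), here is a minimal Python sketch. The threshold, window length, and scale factors are invented tuning values; in LabVIEW the same steps map onto comparison primitives, Boolean crossing detection, and the point-by-point RMS VI:

    ```python
    import numpy as np

    def counts_to_angle(emg, threshold=0.5, counts_per_degree=10000):
        """Idea 1: +1 per upward threshold crossing, 10000 counts = 1 degree."""
        above = emg > threshold
        crossings = np.sum(~above[:-1] & above[1:])  # below -> above transitions
        return crossings / counts_per_degree

    def rms_to_angle(emg, window=200, full_scale_deg=150.0):
        """Idea 2: sliding-window RMS mapped linearly onto 0..150 degrees."""
        rms = np.sqrt(np.convolve(emg ** 2, np.ones(window) / window, mode="valid"))
        norm = rms / rms.max()                       # strongest contraction = 1.0
        return np.clip(norm * full_scale_deg, 0.0, full_scale_deg)
    ```

    The angle values from either function can drive a gauge indicator directly, or the endpoint of the "2 line" joint drawing on a picture control.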
     

  • How can I design a smart antenna using LabVIEW?

    This is my final-year project. I am a BEng student in Telecommunications. How can I design a smart antenna using a switched-beam algorithm in LabVIEW?

    You can hardly use LabVIEW to design an antenna, but you can definitely use LabVIEW to characterize, optimize, and test your antenna design.
    Hope you get the point here.
    I am not allergic to Kudos, in fact I love Kudos.
     Make your LabVIEW experience more CONVENIENT.

  • Dijkstra's Algorithm in LabVIEW

    So I've attempted to undergo the painful task of trying to implement Dijkstra's algorithm in LabVIEW. The problem is that I think I am going about this wrong, or at least not in the most efficient manner. I've attached my VIs so people can take a look.
    I'll also give a brief description of how I'm attempting to implement this. The input from the user (at this point) is three 1D arrays as well as a start and end vertex value. The three arrays are the weight values of the edges, V1, and V2. The arrays are filled out so that coordinating values are all stored at the same index. So if there is a path from vertex 0 to 1 with a weight of 2, then 2 is stored in the edge array, 0 in the V1 array, and 1 in the V2 array, all at the same index. These values get fed into functions and the shortest path is found, yada yada yada. I'd explain it more, but I think it'll be easier if you look at the VIs.
    The problem is that the way I have it set up, creating a finished product to sort through even 20 vertices would result in massive code. The VI I have is only about 25% complete, I'd say, but I don't want to go any further, as it is getting exponentially more complicated with each iteration and I feel there has to be a better way. So if anyone has any ideas I would greatly appreciate it!
    P.S. I realize that I can write the code in C and call it as a function in the LabVIEW program, but I'm kind of stubborn and want to make this work written solely in LabVIEW.
    Attachments:
    Dijkstras.vi ‏23 KB
    Short path.vi ‏21 KB
    Short path 2.vi ‏24 KB
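
    For reference, below is the standard priority-queue formulation of the algorithm in Python, taking the same three parallel arrays (weights, V1, V2) described above. Building an adjacency map once removes the per-vertex code growth, and the predecessor map recovers the path itself, which the follow-up post asks about. This is a sketch of the approach, not a drop-in replacement for the attached VIs:

    ```python
    import heapq

    def dijkstra(weights, v1, v2, start, end):
        """Shortest path over an undirected edge list given as parallel arrays."""
        # Build the adjacency map once: vertex -> list of (neighbor, weight).
        adj = {}
        for w, a, b in zip(weights, v1, v2):
            adj.setdefault(a, []).append((b, w))
            adj.setdefault(b, []).append((a, w))

        dist, prev = {start: 0}, {}
        heap = [(0, start)]
        while heap:
            d, u = heapq.heappop(heap)
            if u == end:
                break
            if d > dist.get(u, float("inf")):
                continue                      # stale queue entry, skip it
            for v, w in adj.get(u, []):
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(heap, (nd, v))

        if end not in dist:
            return float("inf"), []           # the vertices are not connected
        # Walk the predecessor map backwards to recover the actual path.
        path, node = [end], end
        while node != start:
            node = prev[node]
            path.append(node)
        return dist[end], path[::-1]
    ```

    In LabVIEW the adjacency map can be a single for loop filling an array of clusters, and the main loop becomes a while loop over a distance array and a predecessor array, so the diagram stays the same size regardless of vertex count.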

    Well, I'm glad that there are others interested in this project. First, a little update: the program currently gives the length of the shortest path to the desired vertex, but it doesn't yet output the path it took to get that value. That is what I am working on now, as well as verifying its scalability and trying to implement other features that will actually make this a useful program vs. mere curiosity.
    As far as my references go, I've gotten all my info from the internet. Wikipedia, of course, was my starting place. I found a well-explained article at http://eoinbailey.com/blog/dijkstras-algorithm-illustrated-explanation that helped me decide how I was going to try to implement the program, and I read pretty much all the other links that come up on Google when you type in "dijkstras algorithm". Again, I have no formal CS background (I'm studying engineering), so I'm working with a lot of trial and error here. My basic methodology was simply to read as much as I could, try to understand it, and then work on the program until I started getting somewhere. Maybe not the best strategy, but it works for me. Thank you also for the link; I wish I had read that before I started this project.
    I will post my latest version of the VI later today, so I can get some feedback on what I can improve and what works or doesn't. Thanks again, everyone, for all your help!

  • Face recognition using LabVIEW

    I am in my final-year project, doing face recognition using LabVIEW. I want to know how to compare images using LabVIEW. It would be great if someone helped, as this is proving to be a major obstacle in my way.

    Hi Sibom,
    There are various face recognition algorithms available. If your purpose is simple face recognition, then the simplest method would be to segment the image using a line and perform the recognition for every tilt of the image; thus, when the line traces 180 degrees, all possible positions will have been verified.
    If you like a more holistic approach you can have a look at the trace transform model and the active appearance model, which are used in the industry.
    These models can be studied at : 
     http://www.face-rec.org/algorithms/#Image
    Here are a few other algorithms for face recognition :
    *Automated Face Detection System utilizing USB WebCam 
    *Face Recognition using Vector Quantization Histogram Technique 
    Regards,
    Siva
    A Face Recognition Software using Vector Quantization Histogram Technique 
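
    As a baseline for "how to compare images", independent of the algorithms linked above, a normalized cross-correlation of pixel intensities is the simplest measure; the Vision Development Module's pattern-matching VIs do this natively, and the Python sketch below only illustrates the math:

    ```python
    import numpy as np

    def image_similarity(a, b):
        """Normalized cross-correlation of two equal-size grayscale images.

        Returns a score in [-1, 1]; values near 1 mean a close match.
        A real recognizer compares extracted features (histograms,
        eigenfaces, ...) rather than raw pixels, but the pattern is the same.
        """
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        return float(np.mean(a * b))
    ```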

  • Using LabVIEW cosimulation, how to control PWM duty cycle in Multisim

    I am new to using Multisim with LabVIEW cosimulation. I want to ask: is there a PWM component in Multisim whose duty cycle can be controlled from LabVIEW? I have an algorithm in LabVIEW that outputs duty cycle values from 0 to 1, representing the duty cycle fraction.
    How do I control the PWM duty cycle in Multisim using LabVIEW cosimulation?
    Many thanks,
    SPECTRE

    Hi Spectre,
    In Multisim, search for parts based on functionality; there are some PWM models in the database. Have a look at this knowledge base article if you don't know how to search for parts:
    http://digital.ni.com/public.nsf/allkb/7309A5CABC677296862577ED006EC99E
    Also, have a look at this knowledge base article:
    http://digital.ni.com/public.nsf/allkb/EF391C48CF71AE4F862571B900644F84
    This article shows how you can get Multisim and LabVIEW to co-simulate:
    http://www.ni.com/white-paper/13663/en
    I hope this helps
    Tien P.
    National Instruments
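
    For background on what a Multisim PWM model does internally: the output is high while the commanded duty level exceeds a ramp carrier, so the cosimulation only has to pass one number (0 to 1) per step. A small Python sketch of that relationship, with the carrier frequency as an assumed example value:

    ```python
    import numpy as np

    def pwm(duty, f_carrier=10e3, fs=1e6, t_end=1e-3):
        """PWM by comparing a duty level (0..1) against a sawtooth carrier."""
        t = np.arange(0.0, t_end, 1.0 / fs)
        ramp = (t * f_carrier) % 1.0        # 0..1 sawtooth at the carrier rate
        return (duty > ramp).astype(float)  # high for `duty` of each period

    gate = pwm(0.35)   # 35 % duty gating waveform at 10 kHz
    ```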

  • Lévy flight or firefly algorithm in LabVIEW or MATLAB

    Hello,
    Has anyone used the Lévy flight algorithm with LabVIEW? Is there any toolkit or library to implement this algorithm in LabVIEW, or at least MATLAB code that can be coupled with LabVIEW via MathScript?
    Thanks,
    Zied

    If you search the Web for "Levy Flight", you will find MATLAB code.
    Bob Schor
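
    I am not aware of a dedicated NI toolkit, but a Lévy step generator is small enough to build directly from Gaussian random numbers, for example with Mantegna's algorithm, so MathScript is not strictly required. A Python sketch (beta is the usual Lévy exponent, defaulted here to 1.5):

    ```python
    import math
    import numpy as np

    def levy_step(beta=1.5, size=1):
        """Levy-distributed step lengths via Mantegna's algorithm."""
        sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
                   (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))
                   ) ** (1 / beta)
        u = np.random.normal(0.0, sigma_u, size)   # numerator Gaussian
        v = np.random.normal(0.0, 1.0, size)       # denominator Gaussian
        return u / np.abs(v) ** (1 / beta)
    ```

    In a firefly or cuckoo-style search, each candidate is then typically moved by a step scaled with its distance from the current best; the two Gaussian draws are available in LabVIEW from the Gaussian White Noise VI.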

  • Dynamically build VIs using LabVIEW

    I'm currently working on an application and I would like to be able to generate VIs using LabVIEW. I remember seeing something like this being used before, so I would like to know how one would go about accomplishing such a feat. The point is that I want to be able to use LabVIEW to generate LabVIEW VIs and add some elements to those VIs (controls, functions, etc.).
    Thanks in advance,

    I cannot comment on when, but it will become available, someday.
    Vision Builder has been doing this for years.
    The LV State Diagram Editor (which is wonderful, by the way) can obviously edit diagrams and is apparently implemented as a VI!
    The SD Editor (which is great, by the way; I love this thing, no more PowerPoint to design) is an incredibly small add-on. This leads me to suspect much of this functionality (i.e. VIs to edit VIs) is already implemented in LV; we just do not have mechanisms to get at it (other than using the very restricted tools mentioned above).
    I strongly agree with Wiebe that this functionality will be rather challenging to master.
    It will also be difficult to support!
    Imagine the type of code that would result. Here is one mind-boggling possibility.
    Write three VIs:
    "A.VI" alternately calls "B" and "C".
    "B.VI" executes a "random VI", evaluates the results, and clones a modified version of itself as "C.VI" based on its evaluation of the random code results and any prior experiments it has performed.
    "C.VI" then duplicates the actions of B.
    This could lead to a type of artificially intelligent code that is able to adapt its own algorithm.
    What happens if the code, after learning and growing, decides it does not want to do math anymore and would rather just listen to the same album over and over again?
    How does a support engineer address that?
    The reason I present this extreme case is that this functionality, when it comes, will be limited at first (as we see in dynamic event registration).
    I seem to remember Rolf Kalbermatter (on Info-LabVIEW) saying something like "Self-modifying code is a bad idea and went out of style years ago".
    Just my thoughts,
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction

  • Zedboard Xilinx Zynq 7000 interface using LabVIEW

    Hello,
    I am doing my thesis on the Zedboard, developing a DDR3 memory test and verification. For that I need to implement a dedicated LabVIEW graphical user interface based on NI Measurement Studio.
    The topic is: Study of an algorithmic test setup for DDR3 SDRAM with a Xilinx Zynq SoC.
    I have done my algorithm in the Xilinx SDK, but I need to make a GUI using LabVIEW that helps execute these programs. Please let me know how I can do this.
    1. Is it possible to directly access the Zynq SoC using LabVIEW? If yes, how?
    2. Or, if I need to do the coding in the Xilinx SDK, how can I run this code using LabVIEW?
    Please give me a detailed reply, since I am new to LabVIEW and am not sure how to start. If you have any example design, please share it with me.
    Thanks & Regards,
    Nithin Ponnarasseri

    No, you can't develop directly in LabVIEW and deploy that program to the Zynq board. NI has its own Zynq-based hardware platforms (cRIO and myRIO), but the tools to target those boards are specific to the NI hardware implementation and won't work with other hardware. Developing an interface for another hardware platform is a lot of work and needs to be adapted for every single flavor of a new hardware platform, and NI does not support this for other hardware.
    So your option will be to develop an application with the Zynq SDK for the Zynq ARM controller and supply some form of communication interface (serial port, TCP/IP, or similar) in that application, over which you can send commands from LabVIEW to your embedded application.
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions
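
    To make that concrete, one workable pattern is a line-based TCP command server inside the embedded application, with LabVIEW's TCP Open Connection/Write/Read primitives as the client. Below is a minimal Python sketch of the server side; the port number and command names are invented placeholders for the DDR3 test routines, and the real Zynq application would implement the same loop in C against the Linux or lwIP socket API:

    ```python
    import socket

    # Placeholder commands standing in for the real DDR3 test routines.
    HANDLERS = {
        "RUN_TEST":  lambda: "PASS",
        "READ_STAT": lambda: "ERRORS=0",
    }

    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("0.0.0.0", 5025))            # assumed port; pick any free one
    srv.listen(1)
    conn, _ = srv.accept()
    f = conn.makefile("rw")
    for line in f:                         # one newline-terminated command each
        reply = HANDLERS.get(line.strip().upper(), lambda: "ERR unknown")()
        f.write(reply + "\n")
        f.flush()
    ```

    On the LabVIEW side: TCP Open Connection, TCP Write ("RUN_TEST" plus a line feed), then TCP Read until the terminating line feed, and parse the reply string.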

  • Interfacing a C++ Console Application Genetic Algorithm with LabVIEW

    Hi,
    I have recently modified the GENESIS genetic algorithm using C++ so that it runs on a Windows machine (before, it only ran under UNIX). In my experimental work the aim is to control everything using LabVIEW, including this genetic algorithm.
    My problem comes from the fact that the genetic algorithm is set up as a console application: a library file is created containing the input parameters of the VI, then this is joined with an evaluation function that depends on the library file, and an executable is created.
    I need to take some information collected from LabVIEW and give it to the genetic algorithm, but I don't know how to do this.
    Any Ideas?
    Thanks
    Alan Homer

    You can run most applications from a command line using location/application.exe (where location is where the executable is stored). All C++ console applications run via int main(int argc, char **argv), where argc is the number of arguments and argv is essentially a string table holding the parameters passed to the program. The output of a command-line application is stdout, which can be piped to a file or wired out from the standard-output terminal. To pass parameters from your LabVIEW code to the command line, you need to know the program's parameter list (sometimes empty).
    Paul
    Paul Falkenstein
    Coleman Technologies Inc.
    CLA, CPI, AIA-Vision
    Labview 4.0- 2013, RT, Vision, FPGA
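
    In LabVIEW terms this is exactly what System Exec.vi does: assemble the command line, launch the executable, and capture its standard output. The same pattern is sketched in Python below for clarity; the executable path and argument list are placeholders for however your GENESIS build is invoked:

    ```python
    import subprocess

    # Placeholder path and arguments -- e.g. a parameter file that the
    # LabVIEW code writes out before each run of the genetic algorithm.
    result = subprocess.run(
        [r"C:\ga\genesis.exe", "params.txt"],
        capture_output=True, text=True, timeout=300,
    )
    print(result.returncode)    # the exit code returned by int main()
    print(result.stdout)        # whatever the GA printed to stdout
    ```

    So from LabVIEW: write the GA's input files, wire the command string into System Exec.vi, and parse the "standard output" terminal. If the file-based handoff becomes a bottleneck, the alternative is rebuilding the GA as a DLL and calling it with the Call Library Function Node.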

  • Real-Time ADC/DAC for SMPS using LabVIEW and USB

    Hello all,
    I have asked the sales department this same question, so here's a two-pronged approach:
    I am researching a control algorithm for a switch-mode power supply, and so far the simulations of its performance look good.  Now the goal is to implement the circuit for experimental data.
    I've seen several NI USB DAQ boxes that appear to have the performance I'm looking for (for example, the USB-6211 would have the resolution and sampling rate that I need).
    The control algorithm uses the following math functions:  add/sub/mult/div/exponent, and derivative/integral.
    My question is this: is LabVIEW "strong" enough to take in four channels of 250 kS/s data, crunch the numbers in an equation, and spit out the answer to an analog output channel, all in real time?  I'm looking for an analog output rate of ~100 kHz.
    Thank you for any suggestions that you have!
    -Rick

    Hey,
    So if you were just trying to perform an input or output then the USB-6211 would certainly be able to handle it because the hardware clock would be handling the input/output, not software. However, what you're wanting to do, basically a feedback system, will require software timing (at least for a USB device) because you'll have to be able to actively specify what the output is. So, for that reason alone, and the fact that you want 100kHz output, this device and USB devices in general won't be an option no matter what software you use, LabVIEW or otherwise. On another note, what you're looking to do sounds more like live updating, not Real-Time, which is more about jitter. Bottom line, for these kinds of requirements, you're going to need to move to an FPGA board, something like the NI PCIe-7841R would work. It's more expensive, but for your requirements, FPGA is going to be the only option and it comes down to bus latency as well as software response time. With FPGA, as is shown in the first diagram of the following document, you're basically closing your software loop through hardware.
    FPGA Fundamentals
    http://www.ni.com/white-paper/6983/en
    --Ryan S.

  • I need some info about interfacing a PC or laptop to a spectrum analyzer using LabVIEW

    We need to control the spectrum analyzer using an interface that will be developed in LabVIEW.
    The spectrum analyzer will be connected to the PC using RS-232C, and the observed waveform will be displayed in the PC interface for the spectrum analyzer.
    Please send some information regarding this.

    Using a spectrum analyzer with LabVIEW is a pretty common application; my first program 15 years ago did this. What we need to know to help you, though, is what model spectrum analyzer you are planning on using. Most of the ones I'm familiar with use the GPIB interface rather than RS-232, but that isn't a major issue; the spectrum analyzer's command set is the important thing. There are a lot of LabVIEW drivers available, for a large number of analyzers, here on the National Instruments site. Although most use the GPIB interface (or Ethernet), they can be used as a starting point to develop an RS-232 driver if one isn't available, assuming that there is a driver for the model that you have, or a closely related one (manufacturers frequently use similar command sets within a model type).
    So, what type analyzer are you using, and what types of things are you planning on doing with it?
    Putnam
    Certified LabVIEW Developer
    Senior Test Engineer
    Currently using LV 6.1-LabVIEW 2012, RT8.5
    LabVIEW Champion
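
    Once the command set is known, the serial exchange itself is short. Here is a minimal sketch using the VISA API from Python (LabVIEW's VISA Configure Serial Port/Write/Read nodes are the direct equivalent); the resource name, baud rate, and terminations below are assumptions that must match your analyzer's settings, and "*IDN?" only works on SCPI-style instruments:

    ```python
    import pyvisa

    rm = pyvisa.ResourceManager()
    sa = rm.open_resource("ASRL1::INSTR")   # COM1; adjust to your serial port
    sa.baud_rate = 9600                      # must match the analyzer's setup
    sa.write_termination = "\r\n"
    sa.read_termination = "\r\n"

    print(sa.query("*IDN?"))    # identification query on SCPI instruments
    ```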

  • How to let the user define the colors for each plot in the graph (I use LabVIEW 7)?

    How can I let the user define the colors for each plot in the graph? (I use LabVIEW 7.)

    Hi,
    Take a look at this example; it uses property nodes to select the active plot and then changes the color of that plot.
    If you want to make the number of plots dynamic, you could use a for loop and an array of color boxes.
    I hope this helps.
    Regards,
    Juan Carlos
    N.I.
    Attachments:
    Changing_plot_color.vi ‏38 KB

  • I am trying to use LabVIEW and RP1210-compliant hardware to connect to a truck J1939 bus and receive messages.

    I am trying to use LabVIEW and RP1210-compliant hardware to connect to a truck J1939 bus and receive messages.
    Specifically, I am attempting to read data frames using RP1210_READMESSAGE. I am able to configure the hardware and send a message to the J1939 bus, so I think I have not configured something correctly on the receive side. I can use RP1210_SENDMESSAGE and see the message I have sent on the bus using CANalyzer. When I use RP1210_READMESSAGE, I get the timestamp from a message, and the return from the function reports the correct number of bytes (the number matches the number of bytes I sent out plus four bytes for the timestamp). What I am having trouble with is actually receiving the data itself. I have seen the same behavior from two different pieces of hardware (Vector CANcase XL and Nexiq USB Link), so I don't think the issue is vendor specific.
    Has anyone been able to make the RP1210_READMESSAGE function work correctly?
    Thanks for any help

    Thanks
    I have already tried that. The links are for the NI RP1210 wrapper. The problem I am having is using LabVIEW to interface with the RP1210 layer. In the ReadMessage call, char *fpchAPIMessage is the output, which is a pointer to a character array. In this variable I can receive the timestamp of the message but not the message itself. The return shows that the correct number of bytes is available (18 for an 8-byte message), but I can only get the 4-byte timestamp. I think I have to dereference this pointer to view the data, and I am not sure how to do this.
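
    For what the dereferencing looks like in practice: fpchAPIMessage is a caller-allocated byte buffer that the DLL fills, with the 4-byte timestamp first and the frame data directly after it. The sketch below uses Python ctypes to show the call pattern; the DLL name, prototype, and timestamp byte order are assumptions to verify against your vendor's RP1210 header:

    ```python
    import ctypes

    # Assumed prototype per the RP1210A spec -- verify against the vendor header:
    # short RP1210_ReadMessage(short nClientID, char *fpchAPIMessage,
    #                          short nBufferSize, short nBlockOnRead);
    dll = ctypes.windll.LoadLibrary("RP121032.dll")   # vendor DLL name varies
    client_id = 1          # placeholder: the handle from RP1210_ClientConnect

    buf = ctypes.create_string_buffer(512)   # writable array, not a fixed string
    nbytes = dll.RP1210_ReadMessage(client_id, buf, len(buf), 0)

    if nbytes > 0:
        raw = buf.raw[:nbytes]
        timestamp = int.from_bytes(raw[:4], "big")   # byte order per vendor docs
        data = raw[4:]                               # the J1939 frame itself
    ```

    The LabVIEW equivalent is to configure that parameter in the Call Library Function Node as an array of U8 passed by array data pointer, preallocated with Initialize Array, and then split off the first four bytes. If the parameter is configured as a C string instead, LabVIEW truncates the returned data at the first zero byte, which would explain seeing only the timestamp.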
