Transform a cluster into a graph

I want to transform a cluster into a graph like in the example "peak detection".
But I don't know what to do.
Can you give me a method to do that?

There is a useful example in LV on this matter. Look at XY graph.vi in Search Examples / Fundamentals Examples / Graph Examples.
Roberto
Proud to use LW/CVI from 3.1 on.
My contributions to the Developer Zone Community
If I have helped you, why not give me kudos?

Similar Messages

  • Multi graph cluster

    Hi all,
    I have to plot data coming from 64 different sites. 
    What I'd like to do is to create an array (8x8) of graphs and plot each incoming trace in a different graph.
    To do so, I've created a 2D array of clusters, where each cluster contains a graph.
    By means of a for loop I converted the 1D data array for each channel into a cluster (Bundle function).
    The indexed cluster produced by the for loop is finally sent to the array of clusters containing the data.
    I've attached a simple VI I made to test what I was doing. The idea is to plot 4 different ramps with programmable gains.
    However, the resulting plots flicker.
    Could anybody help me solve this problem, please?
    Kind regards
    Gian Nicola
    Attachments:
    GraphOrder.vi ‏16 KB

    You should have been answered by now,
    so I will answer.
    <you may wish to wrap your head in Duct tape>
    I do not believe your "flickering" is a bug.
    The fact that you can drop a cluster containing a graph into an array MAY be a bug. Let me attempt to explain why.....
    For a graph we pass an array of DBL to the UI thread transfer buffer.... Whoops! The property "Value" has very little to do with the property "Plot".
    What is that again? And how does that affect your example code?
    Elements of an array must have identical properties except for "Value" (and by extension "Text.Text"). "Plot" is a property of a graph. It is related to the value of the 1D array, but before it is displayed on the plot area the plot is subjected to either compression or expansion. (Whaddaya mean, Jeff?)
    The "Plot" sends the data array through a processing plant: data points that cannot all be displayed (say, 200,000,000 of them) are decimated with a turning-point algorithm, and sparse data (say, a 5-point array) is interpolated to fill in the missing pixels!
    Yes, a line (a plot with color and glyphs) is different from an array of DBL. They are not directly related.
    So what you see as flicker is LabVIEW refreshing the properties of the graphs in the array of clusters of graph indicators.
    Is that helping at all?
    Jeff
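    If it helps to see that compression/expansion idea in text form, here is a rough Python sketch - purely illustrative, not NI's actual algorithm, and the function name is made up: min/max decimation when there are far more points than pixels, linear interpolation when there are too few.

        import numpy as np

        def prepare_for_display(y, n_pixels):
            """Rough sketch: fit a 1D data array to a fixed number of display pixels."""
            y = np.asarray(y, dtype=float)
            if y.size > 2 * n_pixels:
                # Decimate: keep the min and max of each bucket so peaks survive
                # (a crude stand-in for a turning-point style algorithm).
                buckets = np.array_split(y, n_pixels)
                return np.concatenate([[b.min(), b.max()] for b in buckets])
            if y.size < n_pixels:
                # Expand: linearly interpolate to fill in the missing pixels.
                x_old = np.linspace(0.0, 1.0, y.size)
                x_new = np.linspace(0.0, 1.0, n_pixels)
                return np.interp(x_new, x_old, y)
            return y

        print(prepare_for_display(range(5), 10))            # small array -> interpolated up
        print(prepare_for_display(range(100000), 10).size)  # big array -> 20 turning points

    Either way, what ends up on screen is a derived object, not your raw array, which is why the indicators redraw (flicker) when their properties are refreshed.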

  • Passing multiple graphs to sub vi

    Hello,
    I'm rather new to LabVIEW and could use some advice. My current project uses a for loop to create a graph for the I-V measurements of each sample solar cell (each plot is on a separate graph). Multiplexing is done externally via LabVIEW control. I have a subVI to show a sample report to the user before they are prompted to save the report.
    What would be the best way to pass the data for 6 (or more) graphs to a subVI, and then create the graphs again in the subVI? I think one option would be to use a shift register to concatenate the arrays from each run. Unfortunately, the array size is determined by the user at run time (the number of test points on the graph) and I'm not sure if I can separate the data properly in the subVI.
    Another question: can I create an array of the separate graphs (graph 1 in array element 1, etc.), using the data cluster of each graph as an array element? I ask this because it seems that data separation within the subVI would be easier with this method.
    Thanks in advance for the help.
    Solved!
    Go to Solution.

    Without seeing the rest of your code, you can use either an array of references or just an array of waveforms. If your data for the graphs is not in the form of waveforms, you can simply create the waveform data type using the Build Waveform function. See the attached example for the references approach.
    pjr1121 wrote:
    Do you know of an example which shows placing clusters within an array?
    I don't understand what you're asking here. Are you referring to making an array of clusters? On the front panel, as a block diagram constant, or built programmatically on the block diagram? For the front panel, just place an array control and then drag a cluster control inside the array; then place the elements of the cluster inside the cluster element. For a block diagram constant it's the same approach. To build it programmatically, use Bundle or Bundle by Name and then Build Array or Replace Array Subset, depending on whether you're building a new array or replacing an array element.
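    LabVIEW being graphical, there is no text code to paste, but as a loose analogy (a minimal Python sketch with made-up names, not any LabVIEW API), "bundle, then build or replace" amounts to this:

        from dataclasses import dataclass
        from typing import List

        @dataclass
        class ChannelCluster:          # a "cluster": a fixed group of named elements
            gain: float
            trace: List[float]

        def build_array_of_clusters(gains, traces):
            """Analogue of Bundle (by Name) followed by Build Array: one cluster per channel."""
            return [ChannelCluster(gain=g, trace=t) for g, t in zip(gains, traces)]

        def replace_cluster(arr, index, new_cluster):
            """Analogue of Replace Array Subset: overwrite one existing element."""
            arr[index] = new_cluster
            return arr

        clusters = build_array_of_clusters([1.0, 2.0], [[0.0, 1.0], [0.0, 2.0]])
        clusters = replace_cluster(clusters, 1, ChannelCluster(3.0, [0.0, 3.0]))
        print(clusters)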
    Have you done the LabVIEW tutorials? To learn more about LabVIEW it is recommended that you go through the tutorial(s) and look over the material in the NI Developer Zone's Learning Center which provides links to other materials and other tutorials. You can also take the online courses for free.
    Attachments:
    multi graphs.vi ‏29 KB
    multi graphs - subVI.vi ‏30 KB

  • Animation of a 2D graph

    Dear all,
    I would like to present my results for various parameters in PowerPoint.
    Would you please let me know how to capture the 2D graph (and parameter controls), or how to create the file conveniently with LV?
    Should I save the image frames and then make the AVI file in an independent AVI application?
    Message edited by labmaster on 07-22-2008 09:29 AM

    Riatin wrote:
    So, I've dug around and can't seem to find the "Get Image" property...could you point me in the right direction?  I have 2 DDT's going into a Build XY Graph Express VI, then Cluster to XY Graph.
    Thanks,
    Ria 
    It is actually a method and not a property.
    Bonus Q:
    Has anyone come up with a nice clean rule that describes when NI makes something a method vs a property?
    I surely have not recognized the pattern yet.
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction

  • How to make a graph inside a while loop maintain previous values

    In the beginning I was trying to use the XY Graph Express VI to create a plot of points. However, the graph draws a linear retrace between the first point of the new line and the last point of the previous line. It then creates the new line as desired.
    I have tried using a for loop to bundle a cluster to the graph, but the graph resets the plot on each iteration (as expected) and I cannot find a way to make it maintain the previous data. I tried using shift registers but was unable to work out how, and I have also tried bundling the cluster into an array, but cannot figure out how to make the cluster go into a 1D array of clusters of 2 elements.
    One option would be finding a way to make it maintain previous data, but the preferred option is to make it create a new plot on each iteration so as to see the color change for each new plot.
    Solved!
    Go to Solution.
    Attachments:
    shift register attempt.JPG ‏64 KB
    original attempt.JPG ‏42 KB
    Output only current iteration data.JPG ‏86 KB

    I'm taking a stab at this because I'm not exactly sure what you want, but I think it is what I have shown here. You need to use a shift register on your outer while loop as I have shown. Your image where you tried using a shift register shows a misunderstanding of shift registers and how they work, though, so I would take a look at these tutorials.
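    In text form, what the shift register does is carry the accumulated plot data from one iteration to the next instead of starting over. A rough Python analogy (names and the fake acquisition are made up, just to show the pattern):

        def acquire(iteration, n_points):
            """Stand-in for the data acquired on one loop iteration."""
            return list(range(n_points)), [iteration * x for x in range(n_points)]

        def run_loop(n_iterations, n_points):
            accumulated = []                       # initialised once: the left shift-register terminal
            for i in range(n_iterations):
                x, y = acquire(i, n_points)
                accumulated = accumulated + [(x, y)]   # carry forward AND add the new plot
                # wiring 'accumulated' to the XY graph here shows every plot so far,
                # each as its own plot, so each keeps its own colour
            return accumulated

        print(len(run_loop(4, 10)))   # 4 plots retained, not just the last one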
    CLA, LabVIEW Versions 2010-2013
    Attachments:
    plot with SR.png ‏29 KB

  • Is it possible to scale values in the graph editor?

    I've animated a layer's position over about 150 frames with lots of keyframes, but I wanted to 'amplify' the existing motion a little.
    It will take ages to tweak the individual keyframes, so I wondered if there was a numeric way of selecting the keys and scaling their value by a percentage or similar.
    I'm able to do this in my 3D app and wondered if AFX had a similar function. The 'Edit Value Graph' doesn't appear to allow for a numeric adjustment.
    Thanks

    In the case of (non-separated) position, the bounding box doesn't offer the same transform controls in the value graph as for other properties.
    You do get the transform box for position keyframes when you separate dimensions.
    Note that even if you don't need to separate XYZ for position, you can do that briefly to use the transform box, and then join the dimensions back together.

  • How to enter constants into Labview that will affect the graph?

    How do you enter constants into LabVIEW on which x and y would depend? For example, say I am creating a graph of voltage vs. time and I want constants such as a trigger level (such as wanting my graph to start recording data when it reaches 20 volts and to stop recording after 20 volts). I would also want to have a time interval as a constant. I have no clue how to do this in LabVIEW, so I am looking for some kind of help. Thanks
    Solved!
    Go to Solution.

    You would just write the values to an array of x vs. y.
    The values would only be written if some boolean was true (such as voltage > 20).
    Then bundle the x and y into a cluster and send the cluster to a graph.
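    The logic in text form is simply "append the point only while the condition holds, then graph the collected x/y pairs". A small Python sketch of that idea (the names and the 20 V threshold are just the example values from the question):

        def record_above_trigger(times, voltages, trigger_level=20.0):
            """Keep only the (t, v) pairs acquired while the voltage exceeds the trigger."""
            recorded = []
            for t, v in zip(times, voltages):
                if v > trigger_level:          # the boolean condition
                    recorded.append((t, v))    # write to the x-vs-y array
            return recorded                    # this is what gets bundled and graphed

        print(record_above_trigger([0, 1, 2, 3], [5.0, 25.0, 30.0, 10.0]))
        # [(1, 25.0), (2, 30.0)]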
    Message Edited by Cory K on 12-11-2008 11:04 AM
    Cory K
    Attachments:
    condition.PNG ‏8 KB

  • Graph multiple inputs

    Hi,
    I am taking 4 inputs from a DAQ and I need to graph the 4 inputs on one graph. I tried to enter the different channels in the task, but it will not let me do it. I also need to average the 4 inputs and graph that on a separate graph. I am using version 7.1. Can anyone help, please?

    You can graph four channels on one graph by putting your data points into a cluster and then graphing them. Put the average on its own graph. An example is enclosed; it is in version 6.1, but you should be able to open it if you have any higher version.
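    In text form (a minimal Python sketch, not the attached VI), the idea is to keep the four channels together for the first graph and compute their point-by-point mean for the second:

        import numpy as np

        def channels_and_average(data_2d):
            """data_2d: 4 x N samples, one row per DAQ channel."""
            channels = np.asarray(data_2d, dtype=float)
            average = channels.mean(axis=0)    # point-by-point mean of the 4 inputs
            return channels, average           # channels -> first graph, average -> second graph

        chans, avg = channels_and_average([[1, 2, 3], [2, 3, 4], [3, 4, 5], [4, 5, 6]])
        print(avg)   # [2.5 3.5 4.5]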
    eewonder
    LV 6.1, 8.2, 8.5, 8.6 beta
    Fieldpoint
    DAQmx
    IMAQ Vision
    Attachments:
    graph4elements.vi ‏38 KB

  • Cluster into array of bit

    Hi
    Well, my title is my question. Is it possible to transform a cluster into an array of bits? I am on LabVIEW 5.1.
    Thanks in advance

    Sure. In LabVIEW 6.1 there's a Cluster to Array function in the Array or Cluster palette. If that's not available in 5.1, you can use the Unbundle function (in the Cluster palette) followed by the Build Array function (in the Array palette).
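    As a loose text analogy of Unbundle followed by Build Array (Python, purely illustrative; the cluster and field names are hypothetical), you pull the elements out of the cluster one by one and assemble them into an array in order:

        from dataclasses import dataclass, astuple

        @dataclass
        class StatusCluster:        # hypothetical cluster of Boolean flags
            bit0: bool
            bit1: bool
            bit2: bool

        def cluster_to_bit_array(cluster):
            """Analogue of Unbundle followed by Build Array: elements out, array in order."""
            return [bool(v) for v in astuple(cluster)]

        print(cluster_to_bit_array(StatusCluster(True, False, True)))   # [True, False, True]

    Note that Cluster to Array only works when every element of the cluster has the same type.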

  • Modifying version 6 VIs in version 8

    I am trying to modify a VI that was created using LabVIEW v6.0. I am modifying it in v8.0. The old program uses the analogue input VI called 'AI Acquire Waveforms'. It uses a device number to decide which instrument to use, along with the appropriate count, rate, and other settings. Instead, I want to use a VI from the Universal Library for LabVIEW with the version 8.0 program. The VI I specifically want to use is called 'AInScFg' (Analogue In Foreground). The VI is similar; it uses a 'Board number' instead of a device number.
    The main difference is that the 'AI Acquire Waveforms' VI outputs a 2D array, an actual period time, and an error out, whereas 'AInScFg' outputs a single array of ADData type, an actual rate, and an error out. The 2D array must be transformed in order to be graphed. So, to get to the point: the old version uses the transform VI and then indexes the transformed array twice with Index Array (constants set for indices 0 and 1). These indices are then fed into a Build Array and sent to the Y part of a node for a waveform graph (the node has X0, dX, and Y parameters).
    When I try to do this with 'AInScFg', the ADData indices are not arrays like the previously mentioned indices. How can I create the same output using AInScFg? Thanks,
    -Tekky

    What is the "Universal Library for LabVIEW"? It sounds like the old program indexed the 2D array and created a waveform data type. I don't understand why, since AI Acquire Waveforms can optionally return a waveform data type anyway, and if only a single waveform was required, AI Acquire Waveform (no 's') could have been used. What exactly is the ADData type? If it's a cluster, what are the elements?

  • Version 6 AI in version 8

    I am trying to modify a VI that was created using LabVIEW v6.0. I am modifying it in 8.0. The old program uses the analogue input VI called 'AI Acquire Waveforms'. It just uses a device number with the appropriate count, rate, and other settings. Instead, I want to use a VI from the Universal Library for LabVIEW with the version 8.0 program. The VI I specifically want to use is called AI Sc Fgd, I think. Same sort of idea: it uses a 'Board number' instead of a device number. The main difference is that the 'AI Acquire Waveforms' VI outputs a 2D array, an actual period time, and an error out, whereas AI Sc Fgd outputs a single array of ADData type, an actual rate, and an error out. The 2D array must be transformed in order to be graphed. So, to get to the point: the old version does the transform and uses two Index Array functions for indices 0 and 1. How can I make the same output for AI Sc Fgd? Thanks,
    -Tekky

    Hello Tekky,
    I can't find the 'AI Sc Fgd' VI you referenced. Can you tell us where it's located on the functions palette? The AI Acquire Waveform VI is a Traditional NI-DAQ VI that is available in LabVIEW 8.0. It is found in the functions palette under Measurement I/O > Data Acquisition > Analog Input. If you don't have this palette, you may need to reinstall Traditional NI-DAQ 7.4.1 and make sure support is enabled for LabVIEW 8.0. The AI Acquire Waveform VI can output either a 1D array of DBLs or waveform data, and you can select which format by right-clicking on the function and choosing Select Type. The AI Acquire Waveform VI is a top-level VI that contains the AI Waveform Scan subVI, which outputs a 2D array of DBLs. The Index Array function is used to extract the first column of data, since it's assumed that you're only acquiring from a single input channel. If the data from your function is already a 1D array, you should be able to pass it directly to a waveform graph or numeric array indicator.
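    As a plain-text illustration of that "extract the first column" step (Python/NumPy stand-in, not the Traditional NI-DAQ API; orientation may be rows rather than columns depending on the scan):

        import numpy as np

        def first_channel(scan_2d):
            """Extract column 0 of a 2D acquisition array, as Index Array with index 0 would."""
            return np.asarray(scan_2d, dtype=float)[:, 0]

        data = [[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]]   # samples x channels
        print(first_channel(data))   # [1. 2. 3.] -> a 1D array ready for a waveform graph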
    Also, all these functions are Traditional NI-DAQ VIs.  Have you considered using the NI-DAQmx driver and function calls?  These functions are much easier to use and actually operate more efficiently.  More information about the NI-DAQmx functions can be found here:
    Learn 10 Functions in NI-DAQmx and Solve 80% of Data Acquisition Applications
    Also you can check to see if your DAQ device is supported under the DAQmx driver here:
    NI-DAQmx Driver Support Versions
    I hope this helps,
    Travis G.
    Applications Engineering
    National Instruments
    www.ni.com/support

  • 0.5 MB program has 832 MB memory usage even before it is run

    I just uninstalled and re-installed LabVIEW 2011. I was hoping this would do something to lower the memory footprint. Alas, it did not make any difference. This is what I see:
    - When I launch LabVIEW, it uses 94 MB even before I have opened a single VI, i.e. only the front end uses this much memory.
    - When I open the VI (the size of the .vi file is 549 KB), memory usage goes up to 836.4 MB. This is before I have even run the program.
    - Finally, when I run the program, memory usage jumps to 1.17 GB.
    I find this memory usage grotesquely high. Can anyone please shed some light on what can be done? I have uploaded the VI. It's a rather simple VI.
    Thanks,
    Neil
    Attachments:
    IP QL to charge - from img.vi ‏446 KB

    nbf wrote:
    1) If I understand correctly, what you are saying is that when I open the VI, the data structures associated with the intensity charts and waveform graphs are initialized to default values (e.g. 0), so there is memory usage even before the program is run and anything displayed, right?
    Yes, a huge array still takes a lot of memory, even though it compresses well for storage inside the VI. Once the VI is ready to run, these arrays need to be allocated in memory.
    nbf wrote:
    2) Is there an alternative to using the value property node that doesn't duplicate the memory content? (e.g. something like pointer or reference in C++?)
    One possibility is data value references. You can open a DVR and fill it inside the subVI, then extract slices later as needed. If done properly, you can also keep the 2D data in a shift register (yes, please learn about them!).
    nbf wrote:
    4) Regarding the clearing of the indicators, could you please tell me how to do that, without removing the color ramp settings?
    The color ramps can be set programmatically using property nodes based on array min&max and you can assign colors for the ramp values at will.
    nbf wrote:
    5) You are correct that I am displaying 4000x2000 array data in an intensity graph with 400x200 pixels. But then I change the min/max of the charts/graphs to zoom into the region of interest; I can identify the region of interest only after displaying the full 4000x2000 array of data. I don't know if there is a way around this.
    Do you really need to show the 2D data in three different transforms? Maybe one intensity graph is sufficient. You can apply the other transforms to the extracted slices with <<1% of the computing effort. What is the range and resolution of your data? Do you really need DBL, or is a smaller representation sufficient (SGL, I16, etc.)? From looking at the subVI, the raw data is U16. The subVI also involves a coercion to DBL, requiring another data copy in memory. When running, make absolutely sure that the subVI front panel is closed, or else additional memory is used for the huge array indicator there. For better performance, you might want to flatten the subVI into the main diagram or at least inline it.
    nbf wrote:
    6) Yes, I use the continuous run mode instead of a toplevel while loop. Is there a disadvantage to doing that?
    As Samuel Goldwyn ("A hospital is no place to be sick") might have said: "Run continuously is no way to run a VI"
    Run continuously is a debugging tool and has no place in typical use. What it basically does is restart the VI automatically whenever it completes, and as fast as the CPU allows. It is mostly useful to quickly test subVIs that don't have a toplevel loop and play with inputs to verify results. Any toplevel VI needs a while loop surrounding the core code. Period!
    So, my suggested plan of action would be:
    Use a while loop around your toplevel code
    Decide on the minimal representation needed to faithfully show the data.
    Keep a single copy of the 2D array in a DVR
    You might optionally want to undersample the data for display according to the pixel resolution and use other means of zooming and panning; for zooming, you could extract a differently sampled small array subset and adjust x0 and dx of the axes accordingly (see the sketch below).
    Use a single 2D graph (if you need to see the various transforms, add a ring selector and transform the 2D array in place). Never show more than one intensity graph.
    See how far you get.
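    Not LabVIEW code, but a small Python sketch of the undersampling idea from that list: keep one full copy of the 2D array and only hand the display a subset matched to the pixel resolution, scaling the axis offset/multiplier to match (the function and limits here are made up):

        import numpy as np

        def display_subset(full_2d, max_rows=200, max_cols=400):
            """Undersample a large 2D array to roughly the intensity graph's pixel count."""
            data = np.asarray(full_2d)
            row_step = max(1, data.shape[0] // max_rows)
            col_step = max(1, data.shape[1] // max_cols)
            subset = data[::row_step, ::col_step]
            # the graph's axis offset/multiplier (x0, dx) must be scaled by the same steps
            return subset, row_step, col_step

        big = np.zeros((4000, 2000), dtype=np.uint16)
        small, row_step, col_step = display_subset(big)
        print(small.shape, row_step, col_step)   # (200, 400) 20 5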
    LabVIEW Champion. Do more with less code and in less time.

  • VF0070 gamma correction - for experts...

    I am using the VF0070 for image analysis - my aim is to take an RGB spectrum image and transform it into a reflective spectrometer graph.
    I'm encountering problems regarding how colors as seen on the computer screen differ from their REAL RGB.
    I'm taking the R, G, B pixels and converting to YUV in order to use the Y values.
    I'm measuring the dark current and subtracting it.
    My reflective spectrometer algorithm is correct, but my measured graphs still do not look the way they should according to the reference graphs....
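    For reference, assuming the conversion follows the common ITU-R BT.601 definition of Y (the camera or driver may use a different convention), the luma step in text form is:

        def rgb_to_luma(r, g, b):
            """BT.601 luma from R, G, B values (one common YUV convention)."""
            return 0.299 * r + 0.587 * g + 0.114 * b

        print(rgb_to_luma(128, 128, 128))   # 128.0 for a mid-grey pixel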
    my questions are:
    1. Which gamma setting corresponds to a gamma of 1, meaning NO correction? The values in the VF0070 range from 0 to 100.
    2. What is the non-linearity equation of this camera?
    3. How can I correct the measured color spectrum to match the reference of the same color?
    If I knew what the camera and the driver are doing to the real color, I would be able to correct the spectrum reading to be real.
    If someone can help with his knowledge, I would appreciate it very much.
    Thank you very much
    Ofer

    Ken, there's a whole part of this that isn't stated but is obvious now. You are playing back this video as a movie on the computer screen (and therefore projector), rather than as video...
    So from the start it is seriously degraded by being projected that way - in resolution, gamma, and "temporality" (lack of interlace support). And yes, it would then also be affected by differences in how Macs and PCs treat their display gamma.
    My comments were aimed at a situation where one can really address maximum quality for the situation. From the above, that won't be the case here, but you can still make improvements.
    You are still taking this seriously, so here's my updated advice:
    Don't change anything about the production or output - keep editing to a calibrated CRT.
    Do a thorough calibration of the projector as a computer monitor. The PC should have a system for this. Calibrate both color and gamma under the normal viewing conditions. The actual calibration could be on the projector and/or in the PC.
    Now take bars from FCP through all your channels to where it gets played back on the PC. Use the playback software's controls to adjust the bars playback to work as close as possible to what a video display would. You won't get very close but whatever you can do will make a huge difference.
    If your software doesn't have such controls, then either:
    - get playback software that does, or
    - (what I would do): decide that the projector's role as a video display is more important than its role as a computer display, and adjust the projector to the bars.
    If you really want to stun them, ignore all of this, get an actual CRT video projector and mount it in the room. Get a tape, disc, or true NTSC computer output device, and adjust to bars. The difference will blow you away too.

  • Asynchronous call, variant, and data retrieval

    Hello everyone,
    I have a small problem with my program.
    It is structured as follows:
    A VI (gauge measurement with interpolation) that lets you choose between two different configuration interfaces from a drop-down list.
    Here I am first trying to debug the strain-measurement configuration.
    When I choose strain measurement in my main VI, I open a general dialog VI (which lets you confirm or not the configuration of a set of parameters passed as a variant).
    This then opens dial_def.
    Then, depending on which button is pressed, it opens a general dialog VI (which I configured as a copy), which in turn opens the configuration VI.
    I retrieve the variant, convert it into a cluster, and place it in the interface visible to the user.
    The user enters their configuration.
    I then transfer it back into the cluster I had as an input, modifying only the part the user called up.
    I convert it back into a variant and return the information when the user presses the OK button.
    But the information is not passed on when I modify it.
    And I cannot see why at all.
    I have attached my project in 2013 and 2011 versions.
    Regards
    I have tried to be concise (but I'm not sure I've made myself clear).
    I have commented the VIs in more detail for the acquisition-parameters case, since they are all supposed to work identically.
    Attachments:
    Dossier mesure_def_decim2011.zip ‏2729 KB
    def.zip ‏2858 KB

    Hi,
    I don't know if this is the cause, but in the upper loop, when converting your variant to a cluster, you are not using your type definition (the one from dialog_general) but a plain cluster...
    Otherwise, the case where you would get nothing is when you change the value of "Mesure" without having done anything beforehand, because your variant in the lower loop will not have been initialized.
    Also, to be sure of what you are sending into your queue, you can copy-paste the variant data from your upper loop into the lower loop, in the "Mesure" event where you fill your queue. There you can already check whether the data you are supposed to send comes out correctly after conversion.
    Francis M | View my profile
    Certified LabVIEW Developer

  • Velleman K8000

    Hello,
    I have obtained a driver for the Velleman K8000. I would like to use this VI as a subVI to switch relays and set a value on an analog output (0-5 V).
    How can I command the "digital outpout 7 & 8" buttons to turn on? These values are in a cluster. I suppose they are Boolean values, but I cannot manage to program them.
    With these buttons I want to control a relay board.
    Same problem for controlling the output voltage "DAC outpout 1".
    Basically, how do I send a Boolean value into a cluster?
    I have attached the VI.
    Thanks in advance.
    Solved!
    Go to Solution.
    Attachments:
    K8000.vi ‏49 KB

    So your problem is to change a single element of your cluster. Starting from the input cluster, you need to change one piece of data.
    There are 3 methods:
    > modify the cluster element directly ("for example 01"): you will have to write the code for all 8 elements
    > convert the cluster to an array, replace one element of the array, and convert it back to a cluster
    > the same, but with an In Place Element structure
    You then pass the cluster to the driver. (A rough text sketch of the second method follows below.)
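    As a Python stand-in for the second method (not LabVIEW code; the cluster and field names are hypothetical, and Cluster To Array only works because all elements are Booleans):

        from dataclasses import dataclass, astuple

        @dataclass
        class K8000Outputs:              # hypothetical cluster of the 8 digital output buttons
            out1: bool = False
            out2: bool = False
            out3: bool = False
            out4: bool = False
            out5: bool = False
            out6: bool = False
            out7: bool = False
            out8: bool = False

        def set_output(cluster, index, state):
            """Cluster To Array, Replace Array Subset, Array To Cluster."""
            bits = list(astuple(cluster))    # cluster -> array (all elements share a type)
            bits[index] = state              # replace one element
            return K8000Outputs(*bits)       # array -> cluster, passed on to the driver

        print(set_output(K8000Outputs(), 6, True))   # turns on "digital outpout 7"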
    PS: this image is code, to be "dragged" into a VI (it is a VI snippet, in case you are not familiar with them)
    Cheers
    Luc Desruelle | View my profile | LabVIEW Code & blog
    Co-author of the LabVIEW book: Programmation et applications
    CLA: Certified LabVIEW Architect
    CLD: Certified LabVIEW Developer
