LabVIEW 8.0 Graph Scaling

Hi,
I generate an array of random numbers displayed on a waveform graph. If I do the same thing for 100 s, I of course get 0 to 100 s on the X scale.
I'd like to display the values of the array in ranges of 10 s: first the first ten seconds, then the next ten, and so on.
I made a sample code. It works, but with a strange effect: before displaying the right scales, it first displays the full array and then the split one, which makes a visual flash.
Has anybody had the same issue, and do you know a way to sort it out?
Attachments:
Test graph.vi ‏41 KB

Hi
Basically, it happens because you have "Autoscale X" activated. In addition, you do not have a defined order of execution: you do not know whether the graph is updated first or the scale is modified.
I edited your VI a little so that this is corrected.
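To illustrate the same idea outside LabVIEW, here is a minimal Python sketch of the windowing arithmetic: decide the 10 s window first, then hand the graph only that window's data and scale, in one defined order (the sample rate and data are hypothetical stand-ins):

import random

RATE = 100                                  # hypothetical samples per second
WIDTH_S = 10.0                              # window width in seconds
samples = [random.random() for _ in range(100 * RATE)]  # 100 s of data

def window(k):
    """Return (xmin, xmax, data) for the k-th 10-second window."""
    xmin, xmax = k * WIDTH_S, (k + 1) * WIDTH_S
    data = samples[int(xmin * RATE):int(xmax * RATE)]
    return xmin, xmax, data

# Set the scale FIRST, then update the plot -- the textual equivalent of
# wiring the scale property node before the graph terminal.
xmin, xmax, data = window(3)                # the fourth 10 s block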
Thomas
Using LV8.0
Don't be afraid to rate a good answer...
Attachments:
Test graph.vi ‏42 KB

Similar Messages

  • Labview array to graph

    I am reading data from Modbus and have an array of data that I need to graph. Depending on the iteration of a for loop, I want to direct the data to the right structure and plot it.
    What plot should I use? The X-axis is time in all cases. Should the array be unbundled first or masked instead?
    Solved!
    Go to Solution.
    Attachments:
    FPI_2.vi ‏27 KB

    Well, some of your charts seem to have multiple traces, so you need to rearrange the data a little bit. Here's one quick attempt, but please modify as needed.
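    For reference, here is the rearrangement sketched in Python with NumPy (the array shape is a hypothetical stand-in for the Modbus block): a multi-trace graph wants one array per trace, so the 2D array is split by column rather than unbundled element by element.

    import numpy as np

    # Hypothetical Modbus block: rows = samples, columns = channels.
    data = np.random.rand(50, 4)

    # One trace per column: slice the 2D array instead of unbundling it.
    traces = [data[:, ch] for ch in range(data.shape[1])]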
    LabVIEW Champion. Do more with less code and in less time.
    Attachments:
    ArrayMOD.vi ‏16 KB

  • Continuous ramp generation in labview using XY graph

    Hi. I want to generate a continuous ramp signal with an XY graph, not a chart, because my oscilloscope does not support a chart when using DAQ. Please tell me a solution.

    ankit26290 wrote:
    because my oscilloscope does not support a chart when using DAQ
    Are you referring to a 'software-based oscilloscope' here?
    Well, I think it should be compatible! Can you show what you've done so far?
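    In the meantime, here is a minimal sketch (Python; the slope, period, and step are hypothetical) of one way to produce a continuous ramp as XY point pairs that can be appended to an XY graph's data:

    import itertools

    def ramp_points(slope=1.0, period=1.0, dt=0.01):
        """Yield (t, y) pairs of a repeating ramp (sawtooth), one per dt."""
        for i in itertools.count():
            t = i * dt
            yield t, slope * (t % period)

    # Append each new (t, y) pair to the XY plot's data instead of a chart.
    first_five = [p for _, p in zip(range(5), ramp_points())]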
    I am not allergic to Kudos, in fact I love Kudos.
     Make your LabVIEW experience more CONVENIENT.

  • Labview.ini and graph rescaling

    Hi everyone,
    I'm using LabVIEW 7.1 and 8.0 on Mac OS X.
    It used to be (under LabVIEW 6) that to disable graph rescaling, a one-line labview.ini file would do the job. The one line would be
    EnableAutoScales=False
    See, for example,
    http://zone.ni.com/devzone/cda/tut/p/id/3815
    This would work if you put the labview.ini file in the same directory as the LabVIEW application.
    Now, with LabVIEW 7 and 8, it's not working anymore, at least not for me.
    Has the name of the file changed? (Does its capitalization matter?) Has it been superseded by some other mechanism?
    Exactly where should the file be? (Mine is in bootdrive:Applications:National Instruments:Labview 7.1)
    Has the syntax of labview.ini changed? (Is that single line sufficient?)
    I noticed that the valuable web page discussing INI files, labview.brianrenken.com/ini, is not working right now.
    Thanks in advance,
    Peter H.

    PGH wrote:
    It used to be (under LabVIEW 6) that to disable graph rescaling, a one-line labview.ini file would do the job. The one line would be
    EnableAutoScales=False
    Now, with LabVIEW 7 and 8, it's not working anymore, at least not for me.
    I think that the global .ini entry EnableAutoScales=False disappeared from use as of LV 6.1, when it was replaced with the ability to right-click (or do the Mac equivalent) on individual graphs and toggle Advanced>Auto Adjust Scales.
    =====================================================
    Fading out. " ... J. Arthur Rank on gong."

  • Labview Intensity Charting/Graphing

    How do you get an array with three columns of data to be 3 individual arrays and then graph them in an intensity graph/chart? I am pulling data from TCP, and it is output as a comma-separated string with a carriage return to denote the end of a set of variables, so the output in the data stream looks like:
    (X,Y,Intensity)
    6,80,9,
    74,14,0,
    51,69,1,
    Then I convert it to a spreadsheet-style 2D array. I talked to the LabVIEW reps, and they said I need to use something like Replace Array Subset to do this. I have included the VI and an image of what my current setup results in. Basically, I just want the intensity to be plotted at the X and Y coordinates specified in the set. Thanks!
    Attachments:
    TCPGraphing.vi ‏40 KB
    labview.png ‏108 KB

    If I understand correctly, you will want to initialize a 2D array of the correct size for your data, filled with zeros. As the data is read in, use Replace Array Subset to replace a zero with the new data. The X and Y coordinates will determine which element of the 2D array to replace with the associated Z value. Something like the attached, maybe?
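    The same pattern in Python terms (a minimal sketch; the grid size is a hypothetical stand-in for your coordinate range):

    import numpy as np

    W, H = 100, 100                    # hypothetical grid size
    img = np.zeros((H, W))             # zero-filled 2D array, as above

    # Each record is "x,y,intensity," -- drop the Z value at its coordinate,
    # the textual equivalent of Replace Array Subset inside the read loop.
    for line in ["6,80,9,", "74,14,0,", "51,69,1,"]:
        x, y, z = (int(v) for v in line.split(",")[:3])
        img[y, x] = z                  # row index = Y, column index = X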
    Randall Pursley
    Attachments:
    Replace.png ‏20 KB

  • Graph scaling question

    Using the graph palette tools on an XY graph having two Y scales, I would like to scale the first Y axis without changing the second one. I use LV 6.0.2. Any tips? Thanks

    If you simply right-click on the scale in question, you can format it. Just be sure to uncheck Autoscale axis in the pulldown menu.
    eric
    Eric P. Nichols
    P.O. Box 56235
    North Pole, AK 99705

  • Labview to excel:graph

    Is it possible to draw a graph in Excel for just some data of each row? If yes, how?

    I already know this toolkit, but getting a graph for some data of each row is a little complicated.
    For example, in Excel I want to get the graph of parameters 2, 4, and 5 of the 1st row in front of this row (and so on: the graph of parameters 2, 4, and 5 of the 2nd row in front of it...).
    Attachments:
    Sample Report (Excel).vi ‏30 KB

  • Graph Scaling

    Hello there.
    I was wondering if someone could point me in the right direction with this problem I am facing.
    Currently, my program reads a device's data and graphs it on the chart.
    It takes the date of the first data point, the last data point, and 2 or 4 weeks ago.
    I want to calculate the slope of the graph according to whichever days I want.
    Right now, my X axis goes from day 143 to 177;
    on the graph, I change it to 150-170, for instance.
    What can I do to get the value of the X axis from the chart?
    I hope I am not confusing you with my descriptions.
    Best regards,
    Krispiekream
    Solved!
    Go to Solution.
    Attachments:
    untitled.PNG ‏92 KB

    I think this is the right direction?
    I changed the X scale to 154 and 169.
    How do I get all the data of the array from 154 to 169?
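    One way to pull out that subset and get the slope, sketched in Python with NumPy (the day numbers and readings are hypothetical placeholders):

    import numpy as np

    x = np.arange(143, 178)                 # hypothetical day numbers
    y = np.random.rand(x.size)              # hypothetical readings

    # Keep only the samples whose X value falls inside the visible scale,
    # then fit a line through them to get the slope.
    mask = (x >= 154) & (x <= 169)
    slope, intercept = np.polyfit(x[mask], y[mask], 1)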
    Message Edited by krispiekream on 10-07-2008 05:38 PM
    Best regards,
    Krispiekream
    Attachments:
    untitled.PNG ‏4 KB

  • Plot discontinuous data in labview graph/chart

    Hi,
    I would like to plot discontinuous data in a graph in LabVIEW.
    The data is acquired and plotted over time, but during certain periods no data is acquired.
    During these times I don't want any line between the adjacent points; I simply want no line.
    Maybe the best way to show it is this example of a JavaScript chart:
    http://www.highcharts.com/stock/demo/data-grouping
    (zoom into the time around November 2005).
    During certain times there is no data. This is implemented in the data stream as "null" instead of a valid floating-point number, similar to NaN, and the JavaScript code knows that in this case there should be no interpolation.
    Is there any way to have similar behaviour with LabVIEW charts or graphs?

    Yes. Place a NaN value in the array where your data break is before sending it to the graph.
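    In Python terms, the same trick looks like this (a minimal sketch; None stands in for the stream's "null" sentinel):

    import numpy as np

    raw = [1.0, 2.0, None, 4.0, 5.0]        # None = "no data acquired"
    y = np.array([np.nan if v is None else v for v in raw])
    # A graph fed this array draws no segment across the NaN, leaving a gap.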

  • How does LabVIEW Render the data in 2D Intensity Graphs?

    Hello,
    I was hoping someone could explain to me how LabVIEW renders its 2D intensity data. For instance, in the attachment I have included an image that I get from LabVIEW's intensity graph. The array that went into the intensity graph is 90000x896, and the width/height of the image in pixels is 1056x636.
    Something I know from zooming in on these images, as well as viewing the data in other programs (e.g. Matlab), is that LabVIEW is not simply decimating the data; it is doing something more intelligent. Some of our 90000 lines have great signal to noise, but a lot of them do not. If LabVIEW were simply decimating, our image would be primarily black, but instead there are very obvious features we can track.
    The reason I am asking is that we are trying to do a "Live Acquisition" type program. I know that updating the intensity graph and forcing LabVIEW to choose how to render our data gives us a huge performance hit. We are already doing some processing of the data, and if we can be intelligent and help LabVIEW out so that it doesn't have to figure out how to render everything, while still getting the gorgeous images that LabVIEW generates, that would be great!
    Any help would be appreciated! Thanks in advance!
    Attachments:
    Weld.jpg ‏139 KB

    Hi Cole,
    Thank you for your understanding.  I do have a few tips and tricks you may find helpful, though as I mentioned in my previous post - optimization for images or image-like data types (e.g. 2D array of numbers) may best be discussed in the Machine Vision Forum.  That forum is monitored by our vision team (this one is not) who may have more input.
    Here are some things to try:
    Try adjusting the VI's priority (File»VI Properties»Category: Execution»Change Priority to "time critical priority")
    Make sure Synchronous Display is disabled on your indicators (right-click indicator»Advanced»Synchronous Display)
    Try some benchmarking to see where the most time is being taken in your code so you can focus on optimizing that (download evaluation of our Desktop Execution Trace Toolkit)
    Try putting an array constant into your graph and looping updates on your Front Panel as if you were viewing real-time data.  What is the performance of this?  Any better?
    The first few tips there come from some larger sets of general performance improvement tips that we have online, which are located at the following links:
    Tips and Tricks to Speed NI LabVIEW Performance
    How Can I Improve Performance in a Front Panel with Many Controls and Indicators?
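    If you do decide to reduce the data yourself before it reaches the graph, a block-maximum reduction keeps sparse high-SNR features visible, where naive every-Nth-row decimation would mostly drop them. A minimal sketch in Python with NumPy (the block size is a hypothetical choice):

    import numpy as np

    data = np.random.rand(90000, 896)       # stand-in for the full scan

    # Collapse each block of N rows to its per-column maximum, so a single
    # bright row inside a block still shows up in the reduced image.
    N = 90                                  # rows per block -> 1000 output rows
    trimmed = data[: (data.shape[0] // N) * N]
    reduced = trimmed.reshape(-1, N, data.shape[1]).max(axis=1)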
    Beyond that, I'd need to take a look at your code to see if there's anything we can make changes to.  Are you able to post your VI here?
    Regards,
    Chris E.
    Applications Engineer
    National Instruments
    http://www.ni.com/support

  • Graph attributes not updating on print VI

    Hi All,
    At my current client's lab they are having problems with printing test result graphs to HTML. They typically have two almost identical VIs with graphs. The first one is the operator's GUI: colored, lots of buttons, etc. This displays properly. When it comes time to print the test reports, they send both data and graph attributes, such as scales, to a simplified version of the graph VI. This second VI has a white background, all the operator's buttons have been removed, and a few new summary tables usually get added; then they tell this VI to print its front panel to HTML. Been there, done that.
    The finished HTML reports often show that the second, white VIs sometimes do not have the correct graph scaling. The data looks correct, but the scales are way off. I have not yet been asked to look at the code, but that is probably coming soon. I have been asked to ask my LabVIEW contacts for general ideas related to this technique.
    Unfortunately the code is on a secure system in a secure lab, so I cannot post an actual VI from the project. I seem to recall seeing similar problems on a project long ago, and the solution(s) involved keeping the printing VIs open, the order in which data and attributes were applied, etc.
    The code is very, very legacy and cannot be replaced within the current time and budget, so I may be asked to help find a quick modification. Any thoughts, sequences of steps, things to remember to check, etc. are appreciated.

    It's likely a problem with the report VI grabbing the front panel image before it has been updated properly by the property nodes. After speaking with some of my colleagues here, we have seen situations where, if you configure the subVI to open when called, programmatically minimize it, and close it when it's finished, the subVI will work. You could also consider enforcing dataflow dependencies between the scale updates and the report VI functions, possibly introducing a wait function in between to ensure that the panel can be updated completely before the image is taken. Of course, without access to the code, I can only offer suggestions and general information about what has worked in the past. If you get more information, let me know!
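    The ordering described above, reduced to a small Python sketch (the function names and the settle time are hypothetical placeholders for the LabVIEW steps):

    import time

    def print_report(update_scales, snapshot_panel, settle_s=0.25):
        """Apply the scale properties, let the panel redraw, then grab it."""
        update_scales()           # the property-node writes
        time.sleep(settle_s)      # wait so the panel can finish updating
        return snapshot_panel()   # front-panel image -> HTML report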
    Cheers,
    Matt Pollock
    National Instruments

  • Displaying the content of one JPanel in an other as a scaled image

    Hi,
    I'd like to display the content of one JPanel scaled in a second JPanel.
    The first JPanel holds a graph that can be edited in this JPanel. Unfortunately, the graph is usually too big to get a full overview of, so it is embedded in a JScrollPane. The second JPanel should be some kind of overview window that shows the whole graph scaled to fit the window, plus some kind of outline around the section currently displayed by the JScrollPane. How could I do this?
    Many thanks in advance.
    Best,
    Marc.

    If panel1 is the graph and panel2 is the overview, override the paintComponent method in panel2 with this:
    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);          // clear the overview first
        Graphics2D g2 = (Graphics2D) g.create();
        double sx = getWidth() / (double) panel1.getWidth();   // fit width
        double sy = getHeight() / (double) panel1.getHeight(); // fit height
        g2.scale(sx, sy);
        panel1.paint(g2);                 // paint the whole graph, scaled
        g2.dispose();
    }

  • NI-DAQmx task works in MAX or DAQ Assistant test panel but not in LabVIEW

    I am attempting to read a single AI channel from a PCI-6024E card via an SCB-68. I have created an NI-DAQmx Analog Input Voltage task in MAX for this channel, sampling in continuous acquisition mode at 100 kHz, 10000 samples at a time, with RSE terminal configuration. If I use the Test feature from MAX, the channel acquires data as expected.
    In LabVIEW, I call this task using a DAQmx Task Name Constant. If I right-click this constant and select "Edit Task", the DAQ Assistant opens, and I can use the Test feature from the DAQ Assistant to see that the data is still being acquired as expected.
    However, when I try to programmatically read this channel in LabVIEW using the VI "DAQmx Read (Analog Wfm 1Chan NSamp).vi", the VI returns a constant DC value of 500 mV, which I know is incorrect (I can monitor the signal across the two terminals in the SCB-68 with a DMM to know that the incoming signal varies as expected, and as I read using the test panels). This erroneous reading occurs even if I make a new VI, drop the task name constant on the diagram, right-click it, select "Generate Code - Example", and let LabVIEW create its own acquisition loop (which is very similar to what I already set up using the "DAQmx Read" VI).
    Any ideas why the Test Panels work correctly but the LabVIEW code does not?

    Hello bentnail,
    I'm not sure why the test panels are reading the value correctly but the LabVIEW code is not, but there are a couple of things we can try (see also the sketch after this list for a cross-check outside LabVIEW):
    1) What happens if you just use the DAQ Assistant and place it on the block diagram? Does it read out the correct values?
    2) Try running a shipping example that comes with LabVIEW. "Acq&Graph Voltage-Int Clk.vi" should work well.
    3) What kind of signal are you expecting to read (peak-to-peak voltage, frequency, etc.)?
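    As a further cross-check entirely outside LabVIEW, the nidaqmx Python package can read the same channel directly. A hedged sketch (the device and channel names are assumptions; check them in MAX):

    import nidaqmx
    from nidaqmx.constants import AcquisitionType, TerminalConfiguration

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan(
            "Dev1/ai0", terminal_config=TerminalConfiguration.RSE)
        task.timing.cfg_samp_clk_timing(
            100000, sample_mode=AcquisitionType.CONTINUOUS,
            samps_per_chan=10000)
        data = task.read(number_of_samples_per_channel=10000)
        print(min(data), max(data))     # should vary, not sit at 0.5 V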
    Thanks,
    E.Lee
    Eric
    DE For Life!

  • XY Graph cursor Error

    Hello
    This error appeared when I started using LabVIEW 8 with my old LabVIEW 6.1 VIs.
    I have encountered a minor error in my graph display. I noticed my Y-scale numbers go negative if I press and hold my graph cursor while I am moving it.
    I ran the same test using a new LV 8 XY graph, and the numbers do not go negative, but they flicker a lot, and the numbers still change somewhat.
    Have you noticed this before? Is there some new property I need to set to make my LabVIEW 6.1 graph stable? Is this a known bug, or am I just doing something wrong?
    Here is a VI for your testing. My LabVIEW 8 version is 8.0.0.4005.
    Just run it, and while moving the yellow cursor (press+hold+move), after 1-2 seconds of holding, something happens.
    Yours
                                         Kari
    Attachments:
    Cursor Test.vi ‏84 KB

    Hello Karik.
    I experience the exact same problem in the application I'm programming. The only solution I have come up with is having the cursor's Y-axis centered in the visible area of the graph. However, this is really bad because it will create a small "hole" in the cursor bar; it looks awful. I think this is a bug in LV 8, since in LV 7.1 I could tie the Y-axis down to something like -10000 and the problem would not appear.
    Hope NI fixes this problem really, really soon.
    //Andreas

  • 3D bar graph with time stamp on X-axis

    Hi all. I am trying to plot date data (the first column in the array) on the X-axis of a 3D bar graph, following the example in this link:
    http://forums.ni.com/t5/LabVIEW/3D-bar-graph-issues-using-2D-Y-Z-plane-and-timestamp/td-p/1923027
    But I still do not succeed.
    I'll be glad for any suggestions that could help.
    Thank you
    HiN
    Attachments:
    xy_bar_graph Version 0.vi ‏21 KB

    Hi Jubchay,
    Here is an example of displaying the given timestamp. Only the 3D version has the customized marker.
    The 3D bar graph is not suitable for displaying a lot of bars, which makes the markers overlap, so I only input a subset of 15.
    Attachments:
    xy_bar_graph Version 1.vi ‏23 KB
