Capturing Data Point

I have written a program that constantly monitors the pressure from a transducer.  Once the pressure has stabilized, the user can begin the record process, which records the data to an Excel spreadsheet at different rates depending on whether or not the event has been triggered.  The event is determined by the pressure at the point when the record process began plus a user-defined slope: when the initial pressure increases by this much, recording begins at a faster rate.  To capture the initial pressure I'm using a For Loop with an iteration count of 1, reading from the continuous pressure reading.  That value is then used in a Formula Node.  I'm getting an error saying that the left brace for the Formula Node is needed.  Is this because of the way I'm capturing that initial pressure value?  Thanks.....
LabVIEW 2012 - Windows 7
CLAD
Attachments:
PressureScanv1.vi ‏89 KB

Hi MeCoOp,
I don't know if I fully understand what you are trying to accomplish,
but it sounds like you want the first (initial pressure) value to be
used in all calculations, once it's acquired the first time. With your
current code that will never happen since you are continuously writing
new values to "Initial Pressure" when the "Begin Pressure Monitor" is
TRUE. If you use "Highlight Execution" (the small light bulb) while
your VI is running, you will get a good idea of your dataflow - you
should see that the "Initial Pressure" keeps getting new values from
outside the Case Structure.
To store the initial value and use it in subsequent iterations, you
could use a couple of Shift Registers on the edge of the While Loop
to store the value. The reason you need a couple of Shift Registers
and not just one is to keep track of whether the "Begin Pressure
Monitor" has been pressed for the first time and whether it's the
first iteration of the loop. Here's a simplified example:
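Since LabVIEW code can't be pasted as text, here is the same latch idea sketched in Python (the names and loop structure are illustrative only, not taken from your VI):

```python
def monitor(readings, begin_pressed):
    """Latch the first reading seen after 'begin' is pressed.

    readings: iterable of pressure samples (simulating the transducer)
    begin_pressed: iterable of booleans (simulating the button)
    Returns the (initial, current) pairs used downstream.
    """
    initial = None    # shift register 1: the latched value
    latched = False   # shift register 2: "have we latched yet?"
    out = []
    for value, begin in zip(readings, begin_pressed):
        if begin and not latched:
            initial = value   # capture exactly once
            latched = True
        if latched:
            out.append((initial, value))
    return out
```

Each loop iteration reads the two flags left by the previous iteration, which is exactly what the pair of Shift Registers does on the While Loop border.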
If this example doesn't relate to what you are trying to accomplish, please let me know why, thanks.
Have fun!
Message Edited by Philip C. on 08-03-2005 12:40 AM
- Philip Courtois, Thinkbot Solutions
Attachments:
InitialValue.PNG ‏6 KB
InitialValue.vi ‏22 KB

Similar Messages

  • Help with Capturing Business Graphics data point

    Hi,
    I created a BusinessGraphics UI element with SimpleSeries and assigned eventId for the categories and data points. I am able to get the series that is clicked through the event but I would like to know which point (value) is clicked as well.
    The steps I followed are
    1. Created BG UI element, category and SimpleSeries
    2. Assigned eventIDs
    3. Created an action class and mapped it to the UI element
    4. Code in wdDoModifyView is
         if (firstTime) {
           IWDBusinessGraphics chart = (IWDBusinessGraphics) view.getElement("bgCSB");
           chart.mappingOfOnAction().addSourceMapping("id", "pointID");
         }
    5. Implemented action class with one parameter (pointID) and able to get the value.
    Can someone help me get the data point values from the user's click?
    Appreciate your help.
    Thanks,
    Kalyan

    You have done everything right, except I don't think you can do this with simple series.
    Create something like this:
    in the context:
    series-> (this node can be with 1..1 cardinality and 1..1 selection)
       points->
           label (string)
           value (int)
           pointId (string)
    in the business graphics:
    create one series (not a simple one) and add to it one point of numeric type.
    in the properties of business graphics bind seriesSource to series context.
    Series: bind pointSource to series.points
    Series_points: bind eventId to series.points.pointId
                            bind label to series.points.label
                            bind valueSource to series.points
    Values (these are the numeric values): bind value to series.points.value
    in wdDoModifyView do the same thing as you have done already.
    Now, when you click on a point you will receive in your event in pointId variable the pointId context attribute value.
    Best regards,
    Anton

  • Capturing a set amount of data points.

    Hello,
    I am using LabVIEW v7.5 for a project I am currently working on and am recording signals from 6 physical channels.  I have a couple of questions about the Write To Spreadsheet File VI.  First, when I write to a spreadsheet, transpose it, then open it in Excel or another spreadsheet application, do the columns correspond to the channels the data came from?  Second, is there a way to tell the VI to record a set amount of data?  Since Excel can only plot a maximum of 32,000 data points, I would like to set the VI to record only that many data points so I don't have to delete the excess manually before plotting.  Lastly, something I have been curious about for a while: is there any way to append a header to the columns without adding it manually after opening the data in Excel?  I will appreciate any comments or feedback.
    thanks so much,
    bsteinma

    The answers to your questions are: Yes and Yes.
    To see how to acquire a set number of samples refer to the examples that ship with LV. (Hint: this is always a good place to start when trying something you haven't done before.)
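    In text form the capped-write idea looks like this (Python sketch; the 32,000-row cap, column names, and file path are assumptions for illustration, not part of the shipping examples):

```python
import csv

MAX_ROWS = 32000  # Excel's chart limit mentioned in the question

def write_capped(path, header, rows):
    """Write a tab-delimited file: one header row, then at most
    MAX_ROWS data rows. Each row is one sample; each column is one
    channel, matching the transposed write-to-spreadsheet layout."""
    with open(path, "w", newline="") as f:
        w = csv.writer(f, delimiter="\t")
        w.writerow(header)
        for i, row in enumerate(rows):
            if i >= MAX_ROWS:
                break   # stop once the plot limit is reached
            w.writerow(row)
```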
    Mike...
    Message Edited by mikeporter on 02-17-2009 03:22 AM
    Certified Professional Instructor
    Certified LabVIEW Architect
    LabVIEW Champion
    "... after all, He's not a tame lion..."
    Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps

  • Can no longer sort RAW with JPG files together by capture date.

    Suddenly I can no longer sort JPG and RAW files together by capture date. I have two Canon SLRs shooting RAW and one Canon point-and-shoot shooting JPG, and I want to collate the photos in LR4, which I was able to do up to now. I cannot find out what I did to make this happen, or how to correct it in LR preferences. Any solutions?
    JimB.

    View->Sort->Capture Time

  • Capture Date/Time - clobbered by LR 1.1

    Oh No!
    It appears that LR 1.1 has clobbered the Capture Date/Time after writing out the metadata.
    To try to be clearer, I've been using Elements 5.0 which organizes my photos (all jpg) by date. The date the photo was taken (which is also the date the file was created) are important to me. Those fields are respected by other Adobe products (Album, Elements 5.0 and Bridge CS2). Somehow LR 1.1 treats this date differently and with less importance than other Adobe and non-Adobe applications.
    I am not too concerned about the date that the metadata was changed, which LR shows. I want to see the date the picture was taken.
    The only way I managed to see that in LR was to customize the display using
    Jeffrey's Lightroom Metadata Viewer Preset Builder (for Lightroom 1.1). (Why I had to use a 3rd-party template creator to see the Capture Date/Time I don't know - it should be right there in LR - but leave that aside for the moment).
    Anyhow, I manipulate the metadata by adding keywords and then tell LR to write that to the files. In this process the creation date/time vanished on a number of photos. The only way I could get it back was to use a backup of the original files (losing keywords and other data in the process) or to use Exifier to set the date/time, remove the collection from LR and reimport.
    Something as important as the date and time the photo was taken (file creation) is a key bit of data that should be readily available and visible in the metadata display and should not be clobbered or lost when writing out metadata to the files. I see there are other complaints about this and some expressing horror that Adobe didn't fix it in LR 1.1.
    My point is simple - for my purposes I cannot fully utilize LR 1.1 if the Creation Date/Time cannot be readily seen, or is in danger of being lost; I would have to use other software.
    Geeze, and I was beginning to get comfortable with LR...
    Maybe the good folks at Adobe have some comment and might be able to rectify this?
    In the alternative, is there something I'm missing about the way Capture Date/Time is supposed to be treated by LR users?
    Advice and pointers would be appreciated.

    Christopher wrote:
    "I hear ya, but if someone could just say, this is how dates are supposed to work ... that would be helpful. In particular, what fields or fields does the Metadata Browser use to sort an image into a date bin? If I knew that, I might be able to determine why so many of my images are put into the Unknown Date heading."
    Here we go:
    1. No DateTime tag defined in exif (or DateTime tags corrupted)
    -in Metadata Browser (left panel) Date will be seen as Unknown Date,
    -Thumbnail shows no DateTime value,
    -in Metadata (right) panel, there will be no DateTime caption visible
    2. Only ModifyDate defined in exif
    -in Metadata Browser value of ModifyDate will be shown,
    -Thumbnail shows no DateTime value,
    -in Metadata panel, ModifyDate will be shown as Time
    3. Only CreateDate defined in exif
    -in Metadata Browser value of CreateDate will be shown,
    -Thumbnail shows no DateTime value,
    -in Metadata panel, CreateDate will be shown as DateTimeDigitized
    4. Only DateTimeOriginal defined in exif
    -in Metadata Browser value of DateTimeOriginal will be shown,
    -Thumbnail shows value of DateTimeOriginal,
    -in Metadata panel, DateTimeOriginal will be shown as Capture Time
    +in Metadata panel, DateTimeOriginal will be shown as Time
    5. All three DateTime values defined in exif
    -in Metadata Browser value of DateTimeOriginal will be shown,
    -Thumbnail shows value of DateTimeOriginal,
    -Metadata panel shows CreateDate as DateTimeDigitized,
    +Metadata panel shows DateTimeOriginal as Capture Time,
    +Metadata panel shows ModifyDate as Time
    That's it. Now... why are we confused in LR when speaking of DateTime?
    In Metadata Browser any of exif's DateTime values can be shown. That is, looking at this panel alone, we never know which Date is being used! Anyway, DateTimeOriginal has priority: if DateTimeOriginal exists, it will be used for Metadata Browser - otherwise any other existing exif DateTime tag will be used.
    But if only DateTimeOriginal is missing (so both CreateDate and ModifyDate exist), then CreateDate has priority over ModifyDate: CreateDate will be shown.
    Priority in short: DateTimeOriginal, CreateDate, ModifyDate.
    Thumbnails: only if DateTimeOriginal exists will the thumbnail contain a DateTime value - otherwise the thumbnail's DateTime will be blank.
    Sorting of thumbnails by Capture Time (hence by exif DateTimeOriginal) is another story and can be quite confusing at first sight... because the DateTime priorities seem to be the same as for Metadata Browser:
    If exif DateTimeOriginal exists (= Capture Time in LR), then this value will be used for sorting. But if this value is missing, LR will try to use exif CreateDate, and if that is missing too, ModifyDate will be used. OK, there's some logic... except, why do I keep repeating "if this tag is missing..."?
    Because when you modify/rewrite some image, some tool may damage the DateTime in exif - remember: a damaged DateTime tag is the same as a tag that doesn't exist!
    In short: if there's exif DateTime corruption involved, you can't be sure which of the three DateTime tags is being used for sorting. For some images DateTimeOriginal will be used; for those missing that tag, CreateDate will be used instead.
    Images missing exif DateTime tags (or whose tags are corrupted) will be sorted first (sort A-Z).
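    Bogdan's priority rule can be stated compactly (Python sketch; tag names follow the exif fields discussed above, and a missing or corrupt tag is represented as None):

```python
def capture_time(exif):
    """Return the DateTime value LR would use for sorting, per the
    priority DateTimeOriginal > CreateDate > ModifyDate.
    Returns None if all three are absent (the Unknown Date bin)."""
    for tag in ("DateTimeOriginal", "CreateDate", "ModifyDate"):
        value = exif.get(tag)
        if value:   # a corrupt/empty tag is treated the same as missing
            return value
    return None
```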
    Bogdan

  • How to save captured data to a log file to prevent lag?

    Here is my VI.
    I read in an string of this format (the channels number are now 32 only but will like to increase up to more than 10000 later)
    Time\sStamp\tChannel:00\tChannel:01\tChannel:02\tChannel:03\tChannel:04\tChannel:05\tChannel:06\tChannel:07\tChannel:10\tChannel:11\tChannel:12\tChannel:13\tChannel:14\tChannel:15\tChannel:16\tChannel:17\tChannel:20\tChannel:21\tChannel:22\tChannel:23\tChannel:24\tChannel:25\tChannel:26\tChannel:27\tChannel:30\tChannel:31\tChannel:32\tChannel:33\tChannel:34\tChannel:35\tChannel:36\tChannel:37\tIP\sAddress\t\n
    The problem I am now having is that the data comes in through UDP. The program starts off capturing all data normally, but after data have been captured and saved to the .log file for an hour or more, some of the data start to be missed.
    I guess this is because the .log file is becoming larger, and at a rate of 10 Hz (for now), the repeated file open and close can't keep up, so some data are missed?
    Could anyone advise on, or amend, the VI so that I have a better way to save the captured data - a buffer, perhaps? Hope someone could help =]
    Thanks.
    Attachments:
    myVI.zip ‏32 KB

    milkbottlec wrote:
    Just found some errors when the data are saved.
    At some point, the data just copies this whole header string and saves it in the middle of the file:
    ""Time\sStamp\tChannel:00\tChannel:01\tChannel:02\tChannel:03\tChannel:04\tChannel:05\tChannel:06\tChannel:07\tChannel:10\tChannel:11\tChannel:12\tChannel:13\tChannel:14\tChannel:15\tChannel:16\tChannel:17\tChannel:20\tChannel:21\tChannel:22\tChannel:23\tChannel:24\tChannel:25\tChannel:26\tChannel:27\tChannel:30\tChannel:31\tChannel:32\tChannel:33\tChannel:34\tChannel:35\tChannel:36\tChannel:37\tIP\sAddress\t\n""
    I don't know which part of the error handling is wrong.
    If possible, could you help me delete the whole auto-naming function (move it aside) and just add a prompt asking the user for a saving directory?
    Thanks!
    Here is the file, and I added the Excel file in.
    Hi milkbottlec,
          Regarding the headings in the middle of your file, I imagine the filename changed temporarily.  When the name changed BACK to an existing name, the headings were added (anticipating a new file).  I'm more curious about the "lost" data problem.  Do you still see holes in the data?  If so, were any bad filenames recorded?
    Sure! Why not present a file dialog instead of automatically generating filenames?  Looks like you've mastered the "True/False" case, so why not build on what you know?  Experiment with the File Dialog Express VI on the File\Advanced File Functions palette.  Personally, I'm a fan of auto-generated filenames, though - just not necessarily built from the data. 
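    One common cure for the slowdown is to decouple acquisition from disk writes: keep the file open, buffer incoming lines in a queue, and let a separate loop drain it (a producer/consumer pattern, which maps directly onto two parallel loops in LabVIEW). A rough Python sketch of that shape (queue handling and timeouts are arbitrary choices):

```python
import queue
import threading

def start_logger(path, q, stop):
    """Consumer: drains the queue into an already-open file, so the
    producer (the UDP receive loop) never waits on disk I/O."""
    def run():
        with open(path, "a") as f:            # open once, not once per sample
            while not stop.is_set() or not q.empty():
                try:
                    line = q.get(timeout=0.1)  # wait briefly for new data
                except queue.Empty:
                    continue
                f.write(line + "\n")
                f.flush()
    t = threading.Thread(target=run)
    t.start()
    return t
```

    The receive loop then just does `q.put(sample_line)` and moves on; the file is opened once for the whole run instead of being reopened at 10 Hz.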
    Cheers!
    "Inside every large program is a small program struggling to get out." (attributed to Tony Hoare)

  • Capture data and store it in a spreadsheet file

    Hello,
    I want to capture the waveform data from the TDS 744A scope for a specific period of time after a certain interval once the trigger occurs, store it in a spreadsheet file, and then plot the data points in the Excel spreadsheet automatically. The trigger should occur as soon as the scope sees some input, and after a specific period of time the data is captured and plotted in an Excel spreadsheet. Can you help me do this?

    I'm sorry that I don't have experience with your specific scope, but in general the GPIB commands mirror the operation of the manual controls. If you know how to set up the triggering on the scope manually, look for the GPIB commands that do the same things. If you are still having problems in a couple more days and no one is responding to this thread, you might want to repost the scope-specific portion of the question, being sure to mention "Tek 744" in the subject line. That will increase the likelihood of someone who knows that instrument spotting the problem.
    Mike...

  • Capturing data instantaneously after condition is met using 2 DAQ cards

    Hi all,
    I want to sample a pulse and a sine wave at a low frequency of 1 kHz, point-by-point sampling, and upon a condition being met - the leading edge of the pulse being greater than a certain value - I then want to take a certain number (here 270) of samples of the sine wave at 50 kHz, multiple samples. I am doing this within a Case Structure and using a separate DAQ card. 
    The problem is that when the condition is met it does not capture the data at the higher frequency straight away. It misses part of the sine wave before commencing capture of the 'window' of 270 samples of the sine wave at 50 kHz. I want to know if there is any way of speeding this up. I am aware that an FPGA could help, but I don't have access to one. 
    The other solution would be to advance the condition, i.e. advance the pulse train by a certain amount to compensate for the delay in activating the high-frequency capture. I would need to set up this advance of the pulse train so that, after the delay in activation, it captures the data exactly where it is required to. What would be the best way to do this? I could delay the pulse train by a certain number of samples, or delay the sine wave, or I have read that you can use the sample timer/counter within the DAQ card to change the 'trigger of the pulse train'. How would I implement this?
    Attached is my vi. I would be very appreciative if you could help. 
    Attachments:
    2 DAQ cards Low and High Freq Sampling.vi ‏29 KB

    I do not have DAQmx or any suitable DAQ devices so I have not tested this.
    This is a start on cleaning and speeding things up. Note that writing to the front panel indicators at 1 kHz will not work because the screen update rate is on the order of 50-100 Hz. Also, charts take a considerable amount of computation because they need to (1) store data in the internal buffer, (2) erase old data if the buffer is full, and (3) (slowest) recalculate all the pixels in the display for the updated data. The chart should be moved to the parallel loop. The condition True boolean will only be true for about 5 ms out of each 100.  Look at the 5 ms boolean.vi attached. It runs close to 1 ms per iteration. The boolean is true about 5% of the time but I never see it change. Remove that boolean from your VI.
    With continuous sampling on the pulse channel and reading 1 sample every millisecond (assuming you get that fast) when the high speed sampling occurs it takes 5.4 ms to acquire the 270 high speed samples. So, ten times per second the loop takes >= 6.4 ms for an iteration. The next sample it reads from the pulse channel is the one which was measured (acquired) at 1 ms after the previous sample. Thus, this data point is read ~5.4 ms after it actually occurred. The next time you detect a pulse, it will be 5.4 ms late. The second one will be 10.8 ms late. Eventually you will get a buffer overflow, but your data will be useless long before the error occurs.
    The Dual Sampler Simulator.vi shows a possible approach to the issue. It simulates sampling both channels at high speed.  I generate both a sine wave and a square wave and sample both at 50 kHz. It simulates reading 4000 samples at a time (equivalent to reading 12.5 times per second or every 80 ms). The square samples are then processed for the transition using the boolean Implies function. I do not recall whether the Conditional terminal was available in LV2012. I did not get an error when saving for previous version so I think it will be OK. The same thing can be done with a while loop with some extra logic.  This does not handle the case where the transition occurs at the boundaries of the 4000 sample segment. To handle those cases use a shift register on the outer loop to pass the needed samples to the next iteration. The 800 ms Wait makes it run slower than "real" time but allows you to see what is happening.
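    The transition test Lynn describes - previous sample below the threshold, current sample at or above it, which is what the Implies trick computes on the whole block - can be written out like this (Python sketch; the threshold value is arbitrary):

```python
def find_transitions(samples, threshold):
    """Indices i where the signal crosses upward through threshold,
    i.e. samples[i-1] < threshold <= samples[i].

    Note the first element of a block cannot be tested against its
    predecessor here; carry the last sample of the previous block
    forward (a shift register) to cover block boundaries."""
    return [i for i in range(1, len(samples))
            if samples[i - 1] < threshold <= samples[i]]
```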
    Lynn
    Attachments:
    2 DAQ cards Low and High Freq Sampling.2.vi ‏26 KB
    5 ms boolean.vi ‏10 KB
    Dual sampler simulator.vi ‏15 KB

  • Capturing data from a website

    Hello,
    I would like to capture data from websites, or get a snapshot
    of a given URL. I was wondering how this would be posible using
    Java.
    Any help or comments will be greatly appreciated.
    Chanie

    Thanks, that worked beautifully.
    I was wondering if you could help me with one more point.
    There are some webpages that before getting to them you must
    give a username and password. I was wondering how I could get
    data from such a webpage if I know the username and password
    using Java.
    Thanks in advance.
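    For the follow-up question: pages behind HTTP Basic authentication just expect an Authorization header carrying the base64-encoded credentials. Building such a request can be sketched like this (Python shown for brevity; the same header can be set on a URLConnection in Java; the URL and credentials are placeholders):

```python
import base64
import urllib.request

def authorized_request(url, username, password):
    """Build a Request carrying an HTTP Basic Authorization header."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Basic {token}")
    return req   # pass to urllib.request.urlopen(req) to fetch the page
```

    Note this only covers Basic auth; sites that use a login form instead need you to POST the form fields and carry the session cookie on later requests.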

  • Capture Date issues in Lightroom !

    Hello,
    I have a 30k picture database (mixed scanned and digital camera images) in Photoshop Elements v3.
    I really would like to shift to LR, I even tried several betas about it (it was awful how slow it was on XP :).
    Now I downloaded the Lightroom v1 in order to see how can I migrate my PSE3 database into LR.
    My issue is mainly about dates: if you tell PSE that a given image was captured (taken) at a given point in time, PSE stores the information in its internal database and allocates it to the image. The result is a good chronological order of the pictures coming from the scanner or DSLR.
    BUT: when imported to LR, the dates get confused: while the left section (I think this is the Metadata Browser) mostly displays the PSE dates correctly, in the LIBRARY section the 'Date Time' on each picture is the file creation date! On the right side the Metadata information also shows the file date instead of the capture date set by PSE.
    Do you have any idea why this is like that ?
    In this way, it's impossible to see on a given picture selection when it was really done.
    (Btw, same issue with DSLR pictures for which the date had to be modified later on due to an error of time setting in the DSLR!)
    thanks : Kadosa

    I'd have to hunt, just like you. I read them ALL, but don't always remember exactly where something is, sorry.
    Maybe a previous poster on this thread will point the way?
    Don
    Don Ricklin, MacBook 1.83Ghz Duo Core, Pentax *ist D
    http://donricklin.blogspot.com/

  • Flushing the captured data upon system failure

    I'm creating a JMF application and saving the captured data to a DataSink. During the recording the file output keeps growing as data is stuffed into it. However, when some failure occurs, like a power outage, the file still plays but its speed is slow and there is no audio, as if not all the data had been flushed yet. I need to know how to flush the data so that upon the power failure the file is complete up to that point. Where do I start?
    Thank you !

    Lupan wrote:
    Hi EJP, thanks for replying.
    Indeed it may be impossible to recover everything, but maybe some mechanism to make save points at regular intervals, so that in case of a power failure we recover at least up to the last one. Isn't it possible to do this seamlessly to the user? Thank you!
    Isn't it possible to refuel my car without my noticing?
    No, because you have to shut down the engine and park the car while you put gas in.
    It takes some time to finalize a file when you're done writing it. It takes some time to get the stream ready to record again after you've stopped it. You can automate the process so the user doesn't have to do anything, but I think you'd have to play around with a bunch of stuff before you could even approach doing it without losing data.
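    For the raw-bytes part of the problem (outside of JMF's own container finalization), a crude checkpoint scheme is to force the OS to commit buffered data at regular intervals, so a crash loses at most one interval's worth. In Python terms (the interval is an arbitrary choice):

```python
import os
import time

def record_with_checkpoints(path, chunks, interval_s=5.0):
    """Append incoming chunks, forcing data to disk every interval_s
    seconds so a power failure loses at most one interval of data."""
    last = time.monotonic()
    with open(path, "ab") as f:
        for chunk in chunks:
            f.write(chunk)
            if time.monotonic() - last >= interval_s:
                f.flush()
                os.fsync(f.fileno())   # commit to the physical disk
                last = time.monotonic()
        f.flush()
        os.fsync(f.fileno())           # final commit on clean shutdown
```

    This keeps the bytes safe, but as noted above, the media file's index/headers still need to be finalized before most players will handle it well.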

  • Using LabView version 6.1, Best way to capture data every 5 milliseconds

    Best way to capture data every 5 ms (milliseconds) in the .vi diagram when using the "Time between Points" and "Small Loop Delay" tools

    I have no idea what the "Time between Points, and Small Loop Delay tools" are. If this is some code you downloaded, you should provide a link to it. And if you want to acquire analog data every 5 milliseconds from a DAQ board, that is possible with just about every DAQ board and is not related to the version of LabVIEW. You simply have to set the sample rate of the DAQ board to 200 samples/sec. If it's digital data, then there will be a problem getting consistent 5 ms data.
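    The arithmetic behind that sample rate, as a trivial check:

```python
period_s = 0.005           # 5 ms between points
rate_hz = 1.0 / period_s   # required hardware sample rate: 200 samples/sec
```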

  • Noise removal in Motion capture data

    Hello to you.
    I am currently trying to remove noise in image sequences containing motion capture data. As part of this project I am clustering pixels into groups and want to remove pixel groups with fewer than 4 members, as these groups are regarded as noise.
    The problem is that when I have a nested list with e.g. 3 sub-lists/clusters (as indicated below) and I use a repeat loop to remove the sub-lists with fewer than 4 pixel coordinates, my code does not seem to do this correctly (there are three sub-lists, of which 2 should be removed, but only 1 of these 2 is deleted and I am unable to remove the last one).
    Does anyone have any suggestions on how I can modify the current code so that it does the job correctly?
    The code is attached below.
    on exitFrame me
      dataPointMembersInIndividualClusters = [#ClusterDatapointMembers: [point(477, 453), point(477, 459), point(477, 460), point(478, 454), point(478, 456)], #ClusterDatapointMembers: [], #ClusterDatapointMembers: [point(462, 405), point(462, 405), point(462, 405)]]
      MinimumNumberOfDataPointsInClusters = 4
      repeat with analyzeCluster = 1 to dataPointMembersInIndividualClusters.count
        if dataPointMembersInIndividualClusters[analyzeCluster].count < MinimumNumberOfDataPointsInClusters then
          dataPointMembersInIndividualClusters.deleteAt(analyzeCluster)
        end if
      end repeat
      put dataPointMembersInIndividualClusters
      halt
    end
    Suggestions will be happily appreciated
    Have a glittering day

    Jan Carlo wrote:
    > Hello to you.
    >
    > I am currently trying to remove noise in image sequences containing
    > motion capture data. As a part of this project I am clustering pixels
    > into groups and want to remove pixel groups with less than 4 members
    > as these groups are regarded as noise.
    >
    > The problem is that when I have a nested list with e.g 3 sub
    > lists/clusters (as indicated below) and I use a repeat loop to remove
    > the sub lists with less than 4 pixel coordinates, but my code does
    > not seem me to do this correctly (There are three sub lists where 2
    > of these should be removed, but only 1 of these 2 are deleted while I
    > am unable to remove the last one).
    >
    > Does anyone have any suggestions on how I can modify the current code
    > so that it does the job correctly?
    >
    The problem is that you're deleting things from the list you're looking at, so things get moved towards the start, thus skipping an iteration you wanted to examine. If you start at the end, you won't get that problem.
    on test
      clusters = [[point(477, 453), point(477, 459), point(477, 460), point(478, 454), point(478, 456)], [], [point(462, 405), point(462, 405), point(462, 405)]]
      nMin = 4
      nClusters = count(clusters)
      repeat with i = nClusters down to 1
        if clusters[i].count < nMin then
          clusters.deleteAt(i)
        end if
      end repeat
      put clusters
    end test
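    The same pitfall exists in any language, and the reverse-iteration fix translates directly; e.g. in Python:

```python
def prune_clusters(clusters, n_min=4):
    """Delete sub-lists with fewer than n_min points, walking backwards
    so deletions don't shift the indices still to be visited."""
    for i in range(len(clusters) - 1, -1, -1):
        if len(clusters[i]) < n_min:
            del clusters[i]
    return clusters
```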
    Andrew

  • Removing noise in optical motion capture data

    Hello to you.
    I am building my own optical motion capture system. In relation to this I have coded a modified K-means clustering algorithm in LINGO, which can be used for removing image noise in the captured image data. Noise is in this case regarded as sparse groups of pixels (i.e. they do not belong to a marker, as markers are represented by compact pixel groups).
    In order to evaluate the performance of my algorithm I need to compare it with some other noise-filtering solution. Does anyone know if there is any suitable LINGO code available? Possibly mean or median filters…
    I hope for positive replies :)
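    If no ready-made Lingo filter turns up, a 1-D median filter is small enough to port by hand; sketched here in Python for reference (the window size is an assumption and should be odd):

```python
def median_filter(values, window=3):
    """Replace each sample with the median of its neighborhood;
    the window is clamped at the signal boundaries."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo = max(0, i - half)
        hi = min(len(values), i + half + 1)
        neighborhood = sorted(values[lo:hi])
        out.append(neighborhood[len(neighborhood) // 2])
    return out
```

    A median filter is a reasonable comparison baseline for impulse-like noise such as isolated stray pixels, since it discards outliers instead of averaging them in.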


  • SSRS 2008 Column Chart with Calculated Series (moving average) "formula error - there are not enough data points for the period" error

    I have a simple column chart grouping on 1 value on the category axis.  For simplicity's sake, we are plotting $ amounts grouped by Month on the category axis.  I right-click on the data series and choose "Add calculated series...".  I choose moving average.  I want to average over at least 2 periods.
    When I run the report, I get the error "Formula error - there are not enough data points for the period".  The way the report is, I never have a guaranteed number of categories (there could be one or there could be 5).  When there are 2 or more, the chart renders fine; however, when there is only 1 value, instead of suppressing the moving-average line, I get that error and the chart shows nothing.
    I don't think this is entirely acceptable for our end users.  At a minimum, I would think the moving-average line would be suppressed instead of hiding the entire chart.  Does anyone know of any workarounds, or do I have to file another Microsoft Connect bug/design-change request?
    Thank you,
    Dan

    I was having the same error while trying to plot a moving average across 7 days. The workaround I found was rather simple.
    If you right-click your report in the Solution Explorer and select "View Code", it will give you the underlying XML of the report. Find the entry for the value of your calculated series and enter a formula to dynamically create your periods.
    <ChartFormulaParameter Name="Period">
                      <Value>=IIf(Count(Fields!Calls.Value) >= 7 ,7, (Count(Fields!Calls.Value)))</Value>
    </ChartFormulaParameter>
    What I'm doing here is getting the row count of records returned in the chart. If the returned rows are greater than or equal to 7 (the number of days I want to average), it sets the period to 7. If not, it sets the period to the number of returned rows. So far this has worked great. I'm probably going to add more code to handle no records returned, although in my case that shouldn't happen - but you never know.
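    The IIf above is doing, in chart-XML form, what this little function does: clamp the averaging period to the number of rows actually returned (a 7-day window is assumed here, matching the example):

```python
def moving_average(values, period=7):
    """Trailing moving average whose window never exceeds the data,
    mirroring the IIf(Count >= 7, 7, Count) clamp in the report XML."""
    window = min(period, len(values))
    if window == 0:
        return []          # no rows returned: nothing to average
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```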
    A side note:
    If you open the calculated series properties in the designer, you will notice the number of periods is set to "0". If you change this it will overwrite your custom formula in the XML.
