Differential of noisy response curve

I am measuring the temperature from an RTD to produce a heating (or cooling) curve. The response is roughly sigmoidal; however, it is noisy, so differentiating the data in real time does not give me a 'heating rate' of any sensible appearance.
Does anyone know how I could do this in real time rather than resorting to post-processing?
One option I tried was to watch the data for the correct type of change, then only use that. That is, when recording a heating process, the code watches for a rise in temperature. When this happens, the new temperature and the current time are stored in an array. This pair is then compared to the previously recorded values, and the gradient of the increase is calculated and displayed. Of course, when recording a cooling process, a decrease is stored instead. This still gives me a very fuzzy differential, but it appears to show a 'real' response (it has a maximum/minimum and finally settles to 0).
Is there a 'better' method for doing this in real time?
Many thanks
***Of all the things I've lost, I miss my mind the most!***

Juan,
Thank you for your response.
Unfortunately our server has been down and I've only just read this, but your suggestion is pretty much what I did in the end.
I altered the VI to allow me to control the sampling frequency from the front panel, and tried different values (chosen pretty much at random). I've found several that work very well (the mean value is taken each second, giving me a much smoother response).
Thank you again.
Ellen
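Translated out of LabVIEW into a short Python sketch (the data here is invented), the approach Ellen describes, averaging the raw readings over each second and then differencing successive averages, looks like this:

```python
import random

def smoothed_rate(blocks, dt):
    """Average each block of raw temperature readings, then
    difference successive averages to estimate the heating
    (or cooling) rate in degrees per second."""
    means = [sum(b) / len(b) for b in blocks]
    return [(b - a) / dt for a, b in zip(means, means[1:])]

# Invented data: a noisy ramp rising 0.5 degC/s from 20 degC,
# with 10 raw readings captured per second.
random.seed(0)
raw = [[20 + 0.5 * t + random.gauss(0, 0.2) for _ in range(10)]
       for t in range(5)]
rates = smoothed_rate(raw, 1.0)  # values hover around 0.5 degC/s
```

Averaging over a full second trades response time for smoothness; a shorter window reacts faster but stays noisier.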

Similar Messages

  • HDR problems, inconsistent results and camera response curves

    So, I'm doing some HDR panoramas, and every once in a while I'll get one set of merged files looking completely different from the rest of them, usually brighter and with higher contrast. When I look at the 32-bit HDR processing option, I notice that the camera response curve is totally different on the malfunctioning HDR merge shot (10 partitioning lines in the histogram instead of 8). I am shooting RAW with a Canon 60D and Tamron 17-50mm f/2.8. I am in manual exposure, manual white balance, highlight tone priority off, no flash; everything that can be set to manual is set to manual. Sometimes it seems that the malfunctioning set of shots is the one that is not pointing at a bright light source. If everything in the camera is set to manual, how can Photoshop Merge to HDR tell that one shot is less bright than another? Is my camera making a different camera response curve for these less-bright sets of photos? I tried saving a camera response curve and applying it to the other sets of photos, but that didn't work. I am using CS5.1 on a brand new computer, but this was happening with CS5 on my slightly older computer a few weeks ago as well.
    I'm out of ideas.
    Has anyone else experienced this?

    Thanks Marty,
    I have seen other threads that describe and show the results you are getting, but my inconsistency is just a stop or so brighter, and a completely different contrast curve. And it's usually only in shots that don't have a bright reflection or the sun shining into the camera.
    It's very important in panoramas that each shot has the exact same exposure and processing, so if one set of HDR exposures in a panorama is being interpreted differently by the software, it ruins the whole panorama.
    What I really need to know is: is it a camera setting that is making Merge to HDR interpret these sets of exposures differently than the others, or is the Merge to HDR software interpreting the brightness and contrast of the images on its own and making a change to the settings based on some sort of automatic level adjustment that I don't know about?

  • Camera Response Curve file format

    Hi,
    is there any documentation on the rcv file format used to store camera response curves and what its data represents?
    In my case the content is:
    HDRMergeResponseCurve_2.0
    Canon-Canon EOS 5D Mark II
    4 Y coefficients
    -0.029127
    0.428257
    -1.513274
    2.114144
    I need to use the response curve outside of Photoshop. How can I use these 4 coefficients to map pixel intensity to linear data?
    Thanks!
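    The .rcv format appears to be undocumented, but one plausible reading of the "4 Y coefficients" is as coefficients c1..c4 of a polynomial f(x) = c1*x + c2*x^2 + c3*x^3 + c4*x^4 mapping a normalized pixel value to linear data; notably, the four values above sum to 1.0, consistent with f(0) = 0 and f(1) = 1. A sketch of that guess in Python (an assumption, not a confirmed spec):

```python
def apply_response_curve(x, coeffs):
    """Map a normalized pixel value x in [0, 1] to (assumed) linear
    data using polynomial coefficients c1..cN:
        f(x) = c1*x + c2*x**2 + ... + cN*x**N
    This interpretation of the .rcv coefficients is a guess."""
    return sum(c * x ** (i + 1) for i, c in enumerate(coeffs))

coeffs = [-0.029127, 0.428257, -1.513274, 2.114144]
print(apply_response_curve(0.0, coeffs))  # 0.0
print(apply_response_curve(1.0, coeffs))  # ≈ 1.0 (coefficients sum to 1)
```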

    Harold,
    The short answer is good. Actually, it is very good.
    I found the differences in color accuracy and computer work time after creating a color profile to be quite noticeable. I have my camera's profile set to default in ACR. I was wondering about this being something of equal benefit.
    Thank you.
    Bill

  • The Zone System (Linear Tone Curve?)

    Hi, I've been learning the Zone System by Ansel Adams.  I would like to take a photo and import it into Photoshop so that each value in my scene matches the corresponding grayscale measurement in Photoshop (measured using Lab values).
    I used my camera and took 11 exposures of a grey card: -5 ('Black'), -4, -3, -2, -1, 0 (Mid Grey), +1, +2, +3, +4, +5 ('White'). 
    I then opened the files in Adobe Camera Raw (No adjustments)
    I then opened them with Photoshop
    I measured the Lightness values using Lab and plotted the points on a graph.
    I see that somewhere along the line an S Curve has been applied to the input values!  I would like an even distribution of the input tonal values. 
    How do I turn this S-curve off?  Or, how do I achieve a linear curve when importing photos to Photoshop?
    Essentially:
    I would like a linear response curve so that each tone (each 1 stop exposure from above) maps to the appropriate grayscale value in even increments (i.e., 10% steps).
    Zone 0:     0% (Black)
    Zone 1:     10%
    Zone 2:     20%
    Zone 3:     30%
    Zone 4:     40%
    Zone 5:     50%
    Zone 6:     60%
    Zone 7:     70%
    Zone 8:     80%
    Zone 9:     90%
    Zone 10:   100% (White)
    Can anyone shed any light on this issue?  I'm really stumped with this one!

    This looks like a typical response curve that you would also see with film.  I'm not sure you would want a linear curve, as your images would look flat.  If you really do want one, play around with the curves in ACR and adjust them so that they compensate for the curve that your camera sensor is creating.
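    As a rough illustration of that compensation idea (the measured lightness values below are invented for the example), you can invert the measured S-curve by interpolation so each stop lands back on an even 10% step:

```python
import numpy as np

stops = np.arange(-5, 6)            # the 11 exposures, -5 .. +5
target_L = np.linspace(0, 100, 11)  # desired even 10% lightness steps
# Invented Lab lightness measurements showing a typical S-curve:
measured_L = np.array([0, 3, 9, 20, 36, 50, 64, 80, 91, 97, 100])

def linearize(L):
    """Map a measured lightness back onto the even zone scale."""
    return np.interp(L, measured_L, target_L)
```

    Mid grey is unchanged (linearize(50) returns 50), while the compressed shadows and highlights are stretched back out.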

  • IR - response curves, visual representation

    Hi guys,
    I'd like to try out the IR utility in our video studio, before and after some acoustic dampening of the room.
    What I'd like to see, is some visual representation of the room response.
    Is this possible/any good with IR utility? Will the deconvolved return signal suffice as a frequency response curve for the room? What about phase response?
    I have never used the utility, but just printed the user guide (42p).
    Browsing it, I find no info in this regard.
    Cheers + thanks for any (constructive) feedback
    Eivind

    Hi John,
    thanks for the reply.
    I realized IRU was useless for this, but maybe Match EQ on the before-and-after recorded responses will tell us something.
    If I could dump the raw sine sweep to file (for example with WireTap Pro), MATLAB could also do the math. But MATLAB is not free, either.
    I guess I'll just go for Match EQ. I'm just curious, and want some indication, don't need a thorough analysis for this.
    Cheers!

  • Loudspeaker Frequency Response

    Hi all! I'm making a program to measure the frequency response of a loudspeaker. I'll be using a sweep to test the loudspeaker. 
    In particular, I would like the program to work like the one below, which is real time. How can I implement this in LabVIEW? This is the screenshot.
    Details about the plot.
    First graph:
    Shows a plot in real-time of a frequency sweep with a constant sine sweep amplitude of 1 V. When sweep is started, the graph shows a plot of FFT moving from left to right, with peak of FFT at maximum amplitude of 1 at corresponding frequency of the sweep.
    Second graph:
    Shows the plot of the Sound Pressure Level in dB versus frequency.
    Please refer to the picture and video link below.
    https://www.youtube.com/watch?v=sKC3ioWXG38, skip to 4:10
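    Outside LabVIEW, the core idea of the first graph (the FFT peak tracking the stimulus frequency as the sweep steps along) can be sketched in a few lines of Python; this is a synthetic illustration with no real audio I/O, not the real-time VI being asked about:

```python
import numpy as np

fs = 44100                         # sample rate in Hz
t = np.arange(int(fs * 0.1)) / fs  # 0.1 s per test tone

def fft_peak_freq(signal):
    """Return the frequency of the largest FFT magnitude bin."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    return freqs[np.argmax(spectrum)]

peaks = {}
for f in (100, 1000, 5000):           # a stepped stand-in for the sweep
    tone = np.sin(2 * np.pi * f * t)  # 1 V amplitude stimulus
    peaks[f] = fft_peak_freq(tone)    # peak sits at the tone frequency
```

    In a real measurement the second graph would come from the microphone channel, converted to SPL against a calibrated reference.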

    Assuming your idea is to sweep a frequency into an amplifier connected to the loudspeaker, and measure the frequency response with a microphone:
    You need to monitor the signal at the loudspeaker terminals to account for any non-linearities in the signal generator and amplifier.
    You need to know the frequency response of the microphone; this is difficult, which is why calibrated microphones are expensive.
    You need an anechoic chamber so that the results are not affected by any room resonances.
    Your sound level plot is in dB(A). My understanding is that A-weighting is designed so that perceived human loudness is constant across the audio frequency range. If you are concerned with raw loudspeaker performance, it may be worth dropping this additional frequency response curve and the complexity it brings.
    This will be a difficult project. Please let us know how you get on.

  • Curious - frequency response of MacBook sound?

    Hi - not a complaint - I'm just curious - I kinda like the way my MacBook deals with audio using the internal speakers - it seems to do a good job of getting the point across with limited resources.
    Does anyone know what the frequency response for the MacBook is or where I could find a freq response curve for it?

    Look here.

  • Microphone frequency response

    Hi
    Does anyone know where I might find data on mobile phone microphone frequency response (like frequency response curves)?
    Thanks. 


  • Accelerometers with SCB-68

    Hello,
    I am running a test with two accelerometers to measure vibration in a cantilever.
    I am using the SCB-68 connector block. The accels tie in to the connector block: one line carries the signal, the other is ground. I have one accel tied in to Channel 0 and the other tied in to Channel 1.
    When I run the test, I get the impulse spike from the accel on the hammer. The hammer has an accel in it and is used to impart a force onto the beam. The other accel is mounted on the beam and it provides the typical transient response curve. So far, so good...
    The problem, however, is that the curve (when there are no inputs) is not flat, but rises slightly, as if the units are not grounded properly.
    Has anyone seen this or can anyone offer a suggestion? I believe all the connections are set up properly.
    Thanks!

    I have used the SCB-68 with accelerometers in the past and did not notice this type of behavior. However, I also used the DAQ card in differential mode instead of single ended. You may want to try that to see if it solves your problem (assuming you are not already doing so). You can still tie one of your inputs to ground, if your sensor needs this.
    You have probably already done this, but check to make sure your connectors are clean and the input wires are clamped well. I have been burned by accidentally clamping the wire insulation, pinching through it, and getting a flaky contact.

  • How to change document gamma (or create a "false profile")

    Although there has been much recent criticism of the notion of "false profiles," generally associated with the person who espouses it, I would like to be able to convert a document which is in, e.g., Adobe RGB with a gamma of 2.2 to one of the same colorspace but a gamma of, say, 1.5.  Although I can do this via the color settings dialogue, creating a new setting with the name, e.g., "gamma 1.5," and although at the time of creation of the new setting the document does indeed become lighter, nonetheless the setting does not "stick."  That is, if I leave that document and later try to apply that same setting via edit>convert to profile, nothing happens.  I'd greatly appreciate someone giving me some pointers on how to establish a setting such that it appears and is functional in the edit>convert to profile list.  I'm working in CS2 in OS10.5.6.  Many thanks

    retiredpatman wrote:
    I would like to be able to convert a document which is in, e.g., Adobe RGB with a gamma of 2.2 to one of the same colorspace but a gamma of, say, 1.5. 
    The Red, Green and Blue tone response curves are kept separately in the .icm file.  You can see that if you open an .icm file in Apple's ColorSync Utility.app.  In the case of AdobeRGB1998.icm you'll find that each of those curves has an identical gamma of 2.199, but they're stored and displayed separately.
    You would need access to a profile generating application to change that, and you'd need to know what you're doing.
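    Numerically, a conversion of this kind amounts to decoding with the old exponent and re-encoding with the new one (this is a sketch of the underlying math only; building an actual ICC profile with a different tone response curve is the separate step described above):

```python
def regamma(v, old_gamma=2.2, new_gamma=1.5):
    """Re-encode a normalized channel value v in [0, 1] from a
    gamma-2.2 encoding to a gamma-1.5 encoding."""
    linear = v ** old_gamma           # decode with the old gamma
    return linear ** (1 / new_gamma)  # re-encode with the new gamma

print(round(regamma(0.5), 3))  # 0.362
```

    The endpoints 0 and 1 are unchanged; everything in between gets a new encoded value while representing the same linear light.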

  • Video keeps sliding around, is there a way to LOCK VIDEO???

    I'm working on a cue, and I'm making timing adjustments as I go...sometimes it's the start of the session, sometimes the tempo.
    The bloody video KEEPS MOVING AROUND. I've made absolutely sure that it's not somehow 'selected' when I make changes. The number is always random and different every time.
    It's frustrating to constantly go back to Video preferences to start the video at 01.00.00.00.
    What the heck is going on???

    Eriksimon wrote:
    iS! Why the F are you still on 9.0.0?
    a) trying to set a bad example?
    b) or just lazy?
    c) or something else altogether?
    Let's see...
    Regarding "b", I've never been accused of being lazy, that's for sure.
    Then there's "c" -- something else. Yeah, it could be something else. But before I commit to that answer, let's look at "a" -- trying to set a bad example:
    I'm very tempted to embrace that one just to be a grump cuz it's early in the day and I haven't had enough coffee yet. But lemme tell ya, if there's anyone trying to set a bad example, it wouldn't be me. It would be... Dare I say it? Dare I mention names? Names of fruit? A particular kind of fruit? One that has varieties that include Crofton, Elstar and Gilpin?
    Well, in truth, the answer is indeed "c", something else. Let's put it this way... if someone could come up with a way by which L9 channel strip faders reacted to external CC7 (and on-screen mouse movements) with the same response curve as in L8, I'd jump onboard L9.0.more in a second. Short of that, there's no point in updating past where I am now.

  • Sync is not working correctly from .jpg to .nef

    I am working on editing a large collection of photos, maybe 150 or 160. These are photographs of flat art taken using a flat-art photography stand (camera points directly at floor, art piece is on larger white background). I have two versions of the exact same image, one .jpeg and one .nef. I am editing them one at a time, fixing the white balance, any accidental tilt, and cropping them. Again, the two images are exactly the same: size, angle, etc. What I have been doing is editing the .jpg and then highlighting the .nef version and syncing the two. This has been working perfectly, but something has changed and I don't know what I did to cause it. When I sync the changes I have made to the .jpg, the crop does not line up correctly on the .nef image. Instead of the image being perfectly centered and cropped, it is off to the side, revealing the white background and cutting off the image. Why is this happening? How can I fix it?

    What dj wrote regarding crops is equally true for all and any edits you do to the NEF, including any you paste from the jpg. Those edits exist only in the catalog  and additionally in an XMP file if you opt for that. Thus the modifications to the NEF will be viewable only in your LR catalog or by somebody else (with the accompanying XMP) in Adobe software.
    Also, you should consider that color-critical edits cannot simply be pasted from a gamma corrected image rendered in a narrow  RGB space to linear and wide gamut image data and be the same.  If it's a jpg from the camera it is in either sRGB or Adobe RGB, narrower gamuts than LR's linear ProPhoto RGB and colors have already been changed in the camera by both the Rendering Intent and the Tonal Response Curve. Moreover, the camera's jpg has been built on a Nikon camera profile while both viewing within LR and exports from it are on the basis of an Adobe profile that may seek to emulate the maker's profile but will never be the same.
    You say you do WB adjustments to the jpg. How are +/- tweaks in a limited range applied on top of the camera's WB supposed to translate to WB for the NEF when even the UI (Temperature and Tint) is different?
    I have had many years experience photographing paintings for show catalogs and your method seems fundamentally flawed to me.

  • Exposure to the right results in different TRC than normal exposure

    Exposure to the right is advocated by most experts to improve tonality and dynamic range. On the Luminous Landscape a photographer noted that ETTR all the way to the right followed by negative exposure correction in ACR produces a different image than is produced by normal exposure, and that he preferred the latter image.
    Luminous Landscape Thread
    Most responders to this post postulated that, since ACR is operating on linear data, underexposure by 1 EV followed by a 1 EV boost in ACR would produce the same results.
    I had some exposures of a Stouffer step wedge. The first was exposed so that step 1 has a pixel value of 250 when converted with ACR at default settings into aRGB. This is exposed to the right as far as possible. A second exposure placed the same step at 221, and this step was brought back to 250 in ACR, which required an exposure compensation of +1.05 EV.
    If you compare the resultant images in Photoshop using the difference blending mode, the differences are too dark to make out on the screen, but they can be detected with the eyedropper. In this image, normal exposure to the right is on top, and the difference between normal exposure and underexposure followed by a boost of 1 EV in ACR is shown on the bottom.
    The different resulting tone response curves are better shown by Imatest plots of the two images. As is evident the TRCs are different, contrary to my expectation. Comments are invited.
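    A back-of-the-envelope check is consistent with that result: if the default rendering were a pure power law (taking aRGB's gamma as approximately 2.2) with exposure compensation acting as a simple linear gain, moving an encoded value of 221 up to 250 would need only about 0.39 EV, not the 1.05 EV observed, which points to a non-trivial tone curve in the pipeline:

```python
import math

# EV needed under a pure gamma-2.2 model with exposure as linear gain:
# encoded' = encoded * 2**(ev / 2.2)  =>  ev = 2.2 * log2(250 / 221)
ev = 2.2 * math.log2(250 / 221)
print(round(ev, 2))  # 0.39
```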

    The ETTR Myth
    ETTR is short for expose to the right. Some folks have promoted it as a replacement for traditional exposure metering. The premise is that you can validate camera metering by simply reading the histogram in the camera's preview window.
    Unfortunately, it is based on some basic misunderstandings about digital photographic technology. The first misunderstanding is the premise that each bit level in a digitally encoded image represents an exposure stop. The second misunderstanding is the premise that all digital cameras capture light in a perfectly linear fashion. The third misunderstanding is the premise that the histogram represents the raw image data captured by the camera. I will briefly address each of these.
    Any correlation between exposure stops and digital bit levels can only be accidental at best. The total exposure range in a scene or an image is correctly known as the dynamic range. The dynamic range of digital cameras is wider than most folks assume and usually equal to or better than film or paper. It can be defined in terms of tone density, decibels, or exposure stops. It is a function of the optics and sensor electronics in the camera. In the few cases where an accurate range is provided by the vendors, it varies from 8 to 12 f/stops.
    The image data is converted from analog measurements by the analog/digital (A/D) circuits early in the capture. This can wind up as an 8-bit, 12-bit, 14-bit, or even 16-bit digital value depending on the camera and its user settings. It is simply a number that has been digitized. Any correlation between bits and exposure levels is pure speculation, end of subject.
    Second, the digital capture of light is not strictly linear. It is true that the silicon sensor itself will capture light in a very linear fashion. But this ignores reciprocity at the toe and heel of the extremes, the quantum efficiency of the substrate, and most importantly it ignores the optical filters in front of the sensor. If the color filter array were linear it would be impossible to reconstruct colors. And these are not the only optical filters in your camera. Then, the A/D circuits have gain controls based on the current ISO setting. And some A/D circuits perform some pre-processing based on the illuminant color temperature (white balance) and limited noise reduction based on the ISO setting. The point is that there are many steps in the pipeline that can introduce non-linearity.
    Finally, the image in the preview window has been color rendered and re-sampled down to a small size. This is the data shown in the histogram. The camera can capture all colors in the spectrum, but the rendered image is limited to the gamut of an RGB color space. So, in addition to exposure clipping the histogram will include gamut clipping. This is also true for the blinking highlight and shadow tools. This might imply an exposure problem when none exists. There is no practical way to map all the data in a raw image into a histogram that you could use effectively in the preview window.
    If you capture an image of a gray scale chart that fits within the dynamic range of the camera, at the right exposure, you can create a linear graph of the raw data. But if you underexpose or overexpose this same image, the graph will not be linear and it is unlikely that software will be able to restore true linearity. End of subject.
    If you typically shoot JPG format, the histogram will accurately represent the image data. But clipping can still be from either gamut or exposure limits. If you typically shoot RAW format, the camera's histogram is only an approximation of what the final rendered image might look like. There is a significant amount of latitude provided by the RAW image editor. This is probably why you are shooting RAW in the first place.
    So, in closing, I am not saying that histograms are bad. They are part of a wonderful toolkit of digital image processing tools. I am saying ETTR is not a replacement for exposure metering. If you understand what the tone and color range of the scene is, you can evaluate the histogram much better. And if you master traditional photographic metering, you will capture it more accurately more often.
    I hope this clears up my previous statements on this subject. And I hope it explains why I think ETTR and linear capture are based more on technical theology than on technical fact.
    Cheers, Rags :-)

  • Msi GTX 760, Multi-Screen, HDMI Color accuracy way better than DVI

    Hi,
    I'm using two HP Pavilion 23xi screens.
    At first, I decided to connect one screen using the HDMI output of the video card, and the other one using DVI. There was an astonishing difference between the two.
    OK, maybe the screen is the problem. So I switched the connection between the two screens. Same result. HDMI looks much better. For example, if I open Paint on both screens, the HDMI connection shows the empty white canvas like it is supposed to, but the DVI screen shows the canvas as almost a light grey.
    Maybe it's the default behavior, but I'm sure something is wrong.
    Is there a setting somewhere that I'm missing to make both output ports behave the same way?
    Best Regards,
    Patrick.

    If there is a setting, it will be in the Nvidia Control Panel (probably under color settings for the output; you may need to adjust the color response curve).

  • HT3625 I don't have a "Use audio for:" menu in my system preferences

    When I go to system preferences and try to follow all of the directions in the article, there isn't any menu or anything that says "Use audio for:" when I'm on the input tab of the sound preferences.  How can I get my MBP use the audio input instead of detecting it as an audio output? 

    malcolm007,
    Here is a pin description of the 4-pin connector; there is impedance and shunting that needs to be taken into consideration, not to mention the response curve this is optimized for, i.e. headset mic response. There could be potential damage caused by incorrect matching and shunting (shorting certain pins to ground); you might consider calling Apple and seeing what their response is. You can even try experimenting at your own risk, but I myself can make no recommendation and would advise against it, as you could risk damaging your device. I found no data that would allow an informed decision on my part. The audio quality would most likely be subpar for a music type of application anyway, as it is most likely optimized for the headset mic as previously mentioned.
    USB interfaces are cheap, most sound good, and they will definitely provide a safe and flexible way to get audio in.
    IMHO
    http://www.speechrecsolutions.com/assets/iPad_and_microphones.pdf
