A Calibration/Gamma Question

I've been calibrating a fairly new 21-inch Samsung LCD with the Eye-One Display 2. Generally speaking, I am very happy with the results. The colors seem true and the monitor is not too bright. The problem is the gamma. The target chosen is 2.2; yet each time I calibrate, the result is 1.9. The color temperature target of 6500 K and the brightness target of 120 cd/m² are achieved.
So, what effect does this slight variance have, and does anyone have ideas why the target can't be achieved?
I adjust the brightness/contrast and RGB values using the software and the LCD's OSD controls. At each step I get the values almost dead-on in the center, so I can't figure out what's going on.
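For a rough sense of what that 0.3 difference means in practice, here is a minimal sketch (plain Python; the simple power-law display model is an assumption) comparing how a 50% gray signal renders under each gamma:

```python
# Approximate display model: output luminance ~= signal ** gamma, signal in 0..1.
target_gamma = 2.2
measured_gamma = 1.9

mid_gray = 0.5
out_target = mid_gray ** target_gamma      # relative luminance at the 2.2 target
out_measured = mid_gray ** measured_gamma  # relative luminance at the measured 1.9

print(f"gamma 2.2: {out_target:.3f}")      # about 0.218
print(f"gamma 1.9: {out_measured:.3f}")    # about 0.268
print(f"midtone lift: {(out_measured / out_target - 1) * 100:.0f}%")
```

In other words, landing at 1.9 mostly lightens midtones and lowers apparent contrast; black point, white point, and the 6500/120 targets are endpoint measurements, so they can still hit spec even when the gamma curve misses.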

Now I'm confused, so let me get this straight:
I said:
>As long as what you're seeing between LR, PS and web syncs up, that's as accurate as you're going to get.
you:
>There's no reason why LR/PS and web colors should sync up, unless you're using a color-managed browser such as Safari.
and then,
>Which is why I always see a difference, sometimes ever so slight, between CM and nonCM apps.
So,
> As accurate as you're going to get.
> Ever so slight.
To me these mean the same thing.
There are subtle differences between applications, but getting within 3% of expected is, essentially, accurate color.
Why sweat those last few percent? It's going to look different on everyone else's monitor, so why kill yourself?
I apologize if my language caused a minor fire, but I just found your statement to be slightly offensive. Telling a user that they're not supposed to have consistent color seems like an attempt to scare them. I don't know about you, but I make my living producing color. I work on multiple systems spread throughout several agencies and photographers. In each of them, I've walked in, set up an unfamiliar system, and had consistent and reasonably accurate color up and running within the first hour (Mac only). And the process is always the same. Unless the monitor you're working on is beyond repair, once you've calibrated, getting consistent color between apps is simply a workflow issue. Of course, the moment you bring an output device (other than the monitor) into the mix, everything goes to hell. Fortunately, the majority of my work these days is destined for the web. But if I had to tell my clients, "oh sorry, these files that you just spent a couple hundred thousand producing are going to look like inconsistent dog meat because some browsers aren't color managed," they would fire me. Seriously. This issue was resolved last century. I can bring up my work in any image editor, and ALL browsers, and not see any obvious differences. This is how it should be, and is.
In case you haven't tried this, calibrate to 2.2/6500 K. Now, open LR and PS. Take an image from LR and export as sRGB. Open it in PS. Notice any major differences between them? You shouldn't. Next, Save for Web from PS. This will take your sRGB image and strip its profile (if the embed option is unchecked). Save it. Drop the jpg into all your browsers. What do you see? That's right, consistent color! And yes, it is possible to produce out-of-gamut color which shifts after being squeezed into sRGB. Fortunately, this is rarely the case.
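The profile-stripping step can be reproduced outside Photoshop too. Here's a small sketch using the Pillow library (an assumption; the placeholder profile bytes and in-memory files stand in for a real Lightroom export):

```python
from io import BytesIO

from PIL import Image  # Pillow

# Simulate an exported JPEG that carries an embedded ICC profile.
src = Image.new("RGB", (8, 8), (128, 128, 128))
tagged = BytesIO()
src.save(tagged, "JPEG", icc_profile=b"placeholder-profile-bytes")
tagged.seek(0)

# Pillow only embeds a profile when you pass icc_profile= explicitly, so a
# plain re-save drops the tag, like Save for Web with the embed box unchecked.
img = Image.open(tagged)
stripped = BytesIO()
img.save(stripped, "JPEG", quality=90)
stripped.seek(0)

print("source has profile:", "icc_profile" in img.info)
print("re-save has profile:", "icc_profile" in Image.open(stripped).info)
```

Browsers that aren't color managed just send the numbers straight to the display, so keeping everything in sRGB end to end is what makes the untagged JPEG match across apps.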

Similar Messages

  • Final Cut Pro monitor gamma question... again!

    Very confused here: can anyone tell me for sure what gamma my Cinema Display needs to be set to for work with FCP7 / Snow Leopard? I can only find contradictory information and nothing clear and conclusive from Apple.
    My problem: following the latest advice, I've been using a Cinema Display calibrated to 2.2 (dual monitors on a Macbook Pro). I can export an edited file which matches FCP's canvas when viewed in QuickTime.
    However, if I send to Compressor from FCP, the preview in Compressor is darker and more contrasty than FCP's canvas.
    If I upload a video that looks OK in QuickTime to MobileMe it becomes dark and contrasty too.
    If I convert my exported file that looked OK in QuickTime to H.264 in Mpeg Streamclip I, once again, get a version that is darker and more contrasty than in FCP or QuickTime.
    I'm pulling my hair out over this, not least because it's becoming a fantastic waste of time. I would like to be concentrating on editing rather than trying to sort this out!
    I had assumed that using a suite of Apple applications on an Apple computer would mean seamless and easy integration...

    But the problem here is not simply viewing. I am also creating and then viewing and I think this is where the problem lies. A video uploaded to MobileMe is transformed in the process and it's during the transformation (whatever that might be) that the change in appearance occurs. I view a video in FCP's canvas but then it is exported before being viewed in QuickTime. Gamma levels may be set, colour profiles embedded and / or read, applications may make assumptions about viewing conditions or which of the two monitors I'm using is the principal one, who knows what... I think it's here that the problem lies...

  • Calibration interval question

    I have at work a spare PXI 4065 which is calibrated until April 2015.
    On the production line I have another PXI 4065 which is calibrated until February 2015.
I never use the spare PXI 4065 until February 2015. Question: if I put the spare PXI 4065 into the production line in February 2015, does that mean, according to the calibration certificate, that my spare DMM is calibrated for only 2 months? Then I have to resend it for calibration?
Please help! I looked at the calibration documents and the 2 labels with due dates for calibration and I'm confused...

    Hi,
    NI recommends calibrating your NI PXI-4065 every year (NI 4065 Calibration Procedure), but the calibration interval is really determined by the specifications you require. Regardless of how the device is being used, the specifications of the NI PXI-4065 are dependent upon the time interval since last calibration. This means that the 1-year specifications for your spare PXI-4065 are only guaranteed until April 2015, even if it is not used until February 2015. We typically list specifications for 24 hours since last calibration, 90 days since last calibration, and 1 year since last calibration. Outside of one year, we do not guarantee any specifications. For more information regarding guaranteed specifications, please see the NI 4065 specifications.
    Hopefully, you find this explanation helpful.
    Regards,
    Mike Watts
    Product Marketing - Modular Instruments

  • Calibration hardware question

I have suggested to my boss that we get some sort of calibration device for our monitors at work. My questions are: does one brand stand out? And can one puck be used on multiple monitors in the shop? Any suggestions are appreciated.

My Mitsubishi has its own gun controls. I didn't know about pre-calibration.
    OTOH, I also have a LaCie 19" with no gun adjustments and the results are quite close. Not perfect, however.
I had an interesting experience with both. When I was running the 9600, the monitor which came with the system was a Dell. There is a definite difference between my Mitsu and the Dell, which is why I borrowed the Eye One which came with the 9600 package. I ran both calibrators on both monitors, and in each case, no difference was detectable between ColorVision and Eye One, yet the monitors didn't match, particularly on separation in shadows, which would not necessarily be the cal HW/SW. Yet, when I took the prints home and checked them against my monitor, they matched.
I learned not to tweak the settings I saw on the Dell. But again, if I did tweak a customer's image, which we did continually, the prints matched the Dell.
    Unfortunately, before I could design some further testing routines, the long time owners (50 years) decided to retire and sold off the equipment. I wound up with the Dell computer but not the monitor.
    If I took the time to set each gun so that the differential values cancelled, I actually did not have to proceed with the rest of the tuning.

  • Camera Raw 5.5 will not install?  And Camera calibration button question.

I just built a new computer with Windows 7 Pro and installed all my software. So when it comes time to install the Camera Raw 5.5 update, it gets about halfway through and tells me to shut off Bridge. As far as I know the Bridge window and program are off and not running. I even unpinned it from the taskbar.
In CS4, when opening a Raw file and going to the Camera Calibration tab, there are only two options that come up, ACR 3.3 and 4.4. Where are all the other options? Do they come with the 5.5 update? Thanks, Keig

    Keiger wrote:
As far as I know the Bridge window and program are off and not running. I even unpinned it from the taskbar.
Odds are you've got Bridge set to auto-launch at startup... make sure it's not running when you try to install the update... and yes, the update should install the additional DNG profile options.

  • ACR Calibration Results Question

    I recently purchased a D300 and have been shooting in RAW format. I am working my way through Bruce and Jeff's book and decided to try to calibrate my camera.
    I shot a MacBeth (now X-Rite) color checker and followed the directions from the RWCR CS3 book.
    All went well with the calibration and the results I ended up with are:
    No Shadows correction
    R Hue -15 Sat +38
    G Hue -16 Sat +33
    B Hue +11 Sat +5
The results of applying these corrections are kind of mixed. Overall the color looks pretty good, but the greens, especially in grass, are pretty saturated and kind of fake-looking. Skin tones seem a little on the red side as well.
    Overall the image seems a bit too saturated. I guess I could adjust my corrections by eye and save those as a new preset, but I was wondering if my results were normal or perhaps I went astray somewhere.
    I intend to run some of the auto calibrators like the Fors script or Rags script to compare.
    Thanks,
    Les

    Les,
I can't really comment on your current results. But I am currently working on a new update that you might want to help me beta test. If so, please contact me offline.
    I am also always looking for test images from new camera models. I jumped on the D3 bandwagon and I am delighted. I could use a D300 image for my test suite.
    Cheers, Rags :-)

  • Calibration Lightroom Question

    Hi folks,
    I've calibrated my MacBook Pro monitor using a Spyder2 Pro which has produced nice colours on the OSX desktop.
    When I open Lightroom I find that all my images are supersaturated and that the colours just seem wrong. Blues look quite purple and greens are really intense with the saturation and vibrance sliders at zero. When I output to jpegs the colours are all extremely weak and the vibrant colours of the image don't seem to be there.
    I am a colour management novice and am wondering what I am doing wrong.... Any ideas?
    Kind Regards
    Mark

I am not an expert in this, but it seems like you have not actually applied the Spyder calibration to the monitor. LR is color managed and would use it, whereas I am not at all sure the OSX desktop is.
Seems more like whatever monitor profile you are currently using is corrupt.
As to the JPEGs, what profile are you exporting them with? sRGB is what it should be. The flatness could be from Adobe RGB or ProPhoto RGB if you are viewing them in a browser.
    Don
    Don Ricklin, MacBook 2Ghz Duo 2 Core running 10.5.1 & Win XP, Pentax *ist D See LR Links list at http://donricklin.blogspot.com for related sites.

  • Where is the Printer Preferences Dialog?

    I'm running PSE 6 for Mac on an iMac using Leopard.  Printer is Canon iP6600D.  After calibrating the monitor (using Apple's program under System Preferences), I opened a jpg on my desktop and made a print, and it matches the monitor fairly closely.  Then I decided to open the same jpg inside PSE and print from there.  Then it asked me under color management/color handling whether PSE or the printer manages the colors (or no color management).  Depending on the answer, it says I need to disable or enable color management in the Printer Preferences Dialog.  I have two questions:
    1)  Where can I find the Printer Preferences Dialog?
    2)  Is it better to let PSE or the Printer handle the color management?
I actually have a third question (sorry). When I make prints at our local Walgreens or WalMart, the prints seem way too dark and saturated (WalMart) or have a bluish/purplish cast (Walgreens). Maybe their printers are not calibrated like they're supposed to be??

    Thanks.  I used PSE to produce two jpgs of the same image, one with the 'standard' monitor calibration (gamma = 1.8), and the other using a calibration with gamma = 2.2.  Images are sRGB, and I used the 'no color management' setting in PSE.  I made prints on my Canon ip6600D, and they matched the monitor pretty closely for the respective profiles.  The print made with the gamma 2.2 calibration came out lighter than the gamma 1.8 calibration, which was expected.  So at this point it looks like my monitor and printer are working okay.
However, I just got back from Walmart, and these same two prints are way too dark, and they have the same exposure (one should be lighter). Does their machine (a Fuji Frontier, I believe) automatically optimize the images? If so, that is not good. What do I do now? Do I have to use a printer profile that matches their machine? My prints look good (according to my monitor and printer), but I would like to make prints from Walmart or Walgreens if I need to print out a large quantity (saves time, money, ink and paper cost). Also, I want to send a CD to friends so they can make prints, but at this rate, it is not going to work.

  • How differs soft proofing in View - Proof Colors and Save for Web - Preview?

Hi, I'm currently confused by one inconsistency. My working space is Adobe RGB and I use a calibrated monitor. After I finish my work on an image I go to View -> Proof Colors -> Internet Standard RGB. The image looks terrible, with an overall violet/purple hue. Then I open the Save for Web dialogue, check Convert to sRGB, and from the Preview options I again select Internet Standard RGB. Now the previewed image looks as expected. I get the same results if I manually convert the image to sRGB before soft proofing and saving for web. So... what's the difference between the preview in Proof Colors and in Save for Web? Thank you for your opinions.

Hi 21, thank you for your input. Everything you say makes perfect sense; it is exactly how it should work and how I expected it to work. My problem was that while testing this theory in practice, I came to different results. I expected that if I stuck to the theory (keeping in mind all the rules you perfectly described) I should get the same result in both the soft proof and the save-for-web preview. But... it was not the case. The save-for-web preview offered expected results, while the soft proof was completely outside any assumptions and colours were totally over-saturated with a violet/purple hue. Also, Edit -> Assign Profile -> sRGB gave a different result than Soft Proof -> Custom -> assign sRGB (preserve numbers), but the same as the save-for-web preview. What troubled me was why this is so.
    Today I've made tests on hardware calibrated monitor and... everything works exactly as you describe and as I expected.
Then I went back to another monitor which is software calibrated (both monitors are calibrated with an X-Rite i1 Display Pro). And again... I received the strange results described above. So I did the last thing I could think of and disabled colour calibration on that monitor. And suddenly... both soft proof and save-for-web preview gave the same result.
Probable conclusion: soft proof and save-for-web preview (together with Edit -> Assign Profile) are programmed to use different algorithms, which is evident on standard-gamut monitors with software calibration. The question can be closed.
    Gene and 21, thank you for your effort.

  • Colors differ in PhotoShop and Windows Viewer

    Dear members
I've got a problem: when I adjust a picture in Photoshop CS3, I save it.
Then when I open it in the Windows Viewer, the colors are different from Photoshop. Some are lighter, others darker.
When I send this to print, the colors of the Windows Viewer match the result, and the colors of Photoshop don't.
I've also got my screen calibrated.
    Does anyone know where the problem is?

    Dave,
    Really there was no offense taken. It's only that these problems are a bit more complicated than simply calibrating the monitor. Learning to understand these things takes time. For me it took many long mountain bike rides and hikes in the Santa Monica Mountains where I could clear my head enough for the light bulb of understanding to turn on.
I might have reacted a bit harshly too. Really, we're all here to learn and share information and hopefully not get too confused.
    Assie,
"'Trust what you see in Ps and don't worry about Viewer as it's just an approximation.'
"I think this is different on my system. When I see a picture in the viewer and have it printed professionally, the colors on the printed picture are the same as the viewer's, and not the same as those in PS."
Good, you're back in the thread. In order to help you we need to know a couple of things. What is your RGB working space? What monitor do you have, and how has it been calibrated: gamma, white point, luminance, calibration device, etc.?
    With the images you are having printed professionally, what is the data flow for the printing? What type of printer is being used and what is the lab's color management setup? Are you sending them tagged files and if so, what profile is being embedded?
Really the best way to wade through all this is one baby step at a time. First make sure that your screen is calibrated properly (this means a hardware colorimeter) and your ambient lighting is correct for the luminance of your screen. Prints need to be viewed under a standardized light source. Inkjets love the lighting from Solux (solux.net). If you are going to display prints under other types of light, then it's best to make custom profiles that factor in the light source as part of the profile.
Once you have your monitor properly calibrated, you can be reasonably sure that what you see in Ps is what is in your file. For any kind of output, you need to customize the data for the output device. Normally this is done with output profiles, and usually better results come from custom profiles. You have to know what's happening to your data at every step of the way to determine where things are going awry if you get prints that are not meeting your expectations. So tackle one thing at a time, and don't move on to the next until you're sure the last step has been done correctly, or it will be difficult to pinpoint what to change.

  • OT: Congratulations to Beth Marshall

    Yesterday, Beth Marshall (aka Zabeth69) was appointed an Adobe Community Professional (ACP) for Dreamweaver.
    ACP is the new name for what used to be known as Adobe Community Experts and, before that, Team Macromedia (and if your memory goes back a very long way, Team Allaire). Members of the scheme are volunteers, whom Adobe recognizes as experts in a particular product or technology, and who provide help to the community in a variety of ways, including help in the forums. It's a one-year (renewable) appointment, and competition to be selected is pretty tough. So, well done, Beth.

    Hi Everyone
    David wrote -
    There's no shame in being a beginner, as long as there is a willingness to learn.
You are correct in that there is no shame in being a beginner, and I was not suggesting to anyone that these people should not be helped; just that I wish a small but significant number of them would at least recognize that the more advanced techniques are often not a case of cut-and-paste answers, and quite often what they want is something where they need to learn the basics first. That said, there are also many who are willing to learn, and there is a certain satisfaction in helping these people that does make up for the frustration caused by the others.
Yes David, I also remember those days when browsers did not even have JavaScript. This probably made the learning process much simpler for those who started with the web early, as we did not need to learn everything in one go. Many who did start then have moved on to incorporating web standards, but some have not and still insist that tables, embedded styles, etc. are better. It is also unfortunate that there are still a large number of sites on the web that show techniques that are pre-2004, both in the tutorials and examples they offer, and that completely ignore such things as accessibility, legal requirements, security and unobtrusive coding.
I also think that this often leaves many beginners, some of whom post to the forum with (to quote David) "a widespread belief that web development is 'easy'".
    Osgood wrote -
I do agree that it quite possibly is the calibre of questions that are asked which has led to a rather 'blinkered' board, in my view.
Yes, this is part of what I was suggesting, but also that the forum does not really lend itself to helping with some of the 'more interesting' questions that really require a 'tutorial' and not a simple 'do this, do that' approach. After all, who wants to write a series of short answers when a complete explanation, with example and code, is necessary (and how many have the time)? Time for me often comes into question when the answer really combines Dreamweaver/Flash with ActionScript/JavaScript (jQuery).
    Martin wrote-
    Now, back to my quest for WWW domination.  I will rule with fairness.
    You disappoint me, I thought you had already achieved this.
    Other thoughts -
As for Dreamweaver itself, I think it supports the beginner in the best way possible, by enforcing standards-compliant code where possible, even if it does not have the more advanced features that have made me start looking for a better PHP IDE, having already been forced to move to Visual Studio for C#. Which also goes to prove it is not completely aimed at the professional web designer/developer.
    Just as a matter of interest, what do you all think are the skills necessary for someone to call themselves a professional web designer/developer?
    Paula

  • Question about monitor calibration

I'm hoping someone can tell me what role the monitor factory settings play in monitor calibration. My Spyder 3 Elite calibration device tells me (before I start calibration) that I should reset my monitor to its factory settings. On my old monitor I didn't know what the factory settings were and couldn't find a way to reset them. My new monitor just arrived today with a brightness setting of over 90 (on a scale from 0-100), and the contrast setting was around 80. The first thing I did when I turned the monitor on was to change that, because the display was so bright I could hardly read the forum on it. I can't imagine that those are the "factory settings" that I am supposed to use! Thanks!

    This is actually a VERY good question, because the initial settings will affect, to some extent, how you'll see all things that are not color-managed.
    Things get even more complicated if you're going to maintain two monitors and would like them to more or less match.
What I'd do is spend some time, before firing up the profiling device, trying to set the on-monitor settings so that you have a comfortable brightness level and get the response as close as possible to gamma 2.2. Then the video card curve calibration process won't have as much to change.
    There's a chart I like to use to see if the gamma is close to 2.2:
    First, make sure any remnants of a monitor profile from your old monitor are removed, and that you're back to defaults (e.g., sRGB IEC61966-2.1).
Make sure to view the above chart at 100% full size, and using your on-monitor controls try to get the gray bars in the left column to appear as one smooth gradient, the same brightness from side to side. Also, you should barely be able to see the dark gray on black squares in the top-right black bar on the white background.
Depending on your monitor gamut, you may not be able to get all the color out of the center column, but get it as close as possible. That gives the calibration/profiling process a good starting point, and you shouldn't be hugely disappointed in what you see from your non-color-managed applications.
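For the curious, charts like that typically work by pairing fine black-and-white stripes (which average to 50% light output) with solid gray patches. A sketch of the underlying math (the pure power-law gamma model is an assumption):

```python
# On a display with response ~ (v/255) ** gamma, the solid gray value that
# visually matches 50/50 black-and-white stripes is 255 * 0.5 ** (1 / gamma).
def matching_gray(gamma: float) -> int:
    return round(255 * 0.5 ** (1 / gamma))

print(matching_gray(2.2))  # about 186
print(matching_gray(1.8))  # about 174, the old Mac-standard gamma's darker patch
```

If the patch computed for 2.2 blends into the stripes when you squint, the monitor's native response is already near the target and the profiling step has little to correct.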
    -Noel

  • Mid 2010 15" i5 Battery Calibration Questions

    Hi, I have a mid 2010 15" MacBook Pro 2.4GHz i5.
    Question 1: I didn't calibrate my battery when I first got my MacBook Pro (it didn't say in the manual that I had to). I've had it for about a month and am doing a calibration today, is that okay? I hope I haven't damaged my battery? The calibration is only to help the battery meter provide an accurate reading of how much life it has remaining, right?
    Question 2: After reading Apple's calibration guide, I decided to set the MacBook Pro to never go to sleep (in Energy Saver System Preference) and leave it on overnight so it would run out of power and go to sleep, then I'd leave it in that state for at least 5 hours before charging it. When I woke up, the light on the front wasn't illuminated. It usually pulsates when in Sleep. Expectedly, it wouldn't wake when pressing buttons on the keyboard. So, what's happened? Is this Safe Sleep? I didn't see any "Your Mac is on reserve battery and will shut down" dialogues or anything similar, as I was asleep! I've left it in this state while I'm at work and will charge it this afternoon. Was my described method okay for calibration or should I have done something different?
    Question 3: Does it matter how quickly you drain your battery when doing a calibration? i.e is it okay to drain it quickly (by running HD video, Photo Booth with effects etc) or slowly (by leaving it idle or running light apps)?
    Thanks.
    Message was edited by: Fresh J

    Fresh J:
    A1. You're fine calibrating the battery now. You might have gotten more accurate readings during the first month if you'd done it sooner, but no harm has been done.
A2. Your machine has NOT shut down; it has done exactly what it was supposed to do. When the power became critically low, it first wrote the contents of RAM to the hard drive, then went to sleep. When the battery was completely drained some time later, the MBP went into hibernation and the sleep light stopped pulsing and turned off. In that state the machine was using no power at all, but the contents of your RAM were still saved. Once the AC adapter was connected, a press of the power button would cause those contents to be reloaded, and the machine would pick up again exactly where you left off. It is not necessary to wait for the battery to be fully charged before using the machine on AC power, but do leave the AC adapter connected for at least two hours after the battery is fully charged. Nothing that you say you've done was wrong, and nothing that you say has happened was wrong.
    A3. No, it does not matter.

  • PXI 2527 & PXI 4071 -Questions about EMF considerations for high accuracy measurements and EMF calibration schemes?

    Hi!
I need to perform an in-depth analysis of the overall system accuracy for a proposed system. I'm well underway using the extensive documentation in the start-menu National Instruments\NI-DMM\ and ..\NI-Switch\ Documentation folders...
    While typing the question, I think I partially answered myself while cross-referencing NI documents... However a couple of questions remain:
    If I connect a DMM to a 2 by X arranged switch/mux, each DMM probe will see twice the listed internal "Differential thermal EMF" at a typical value of 2.5uV and a max value of less than 12uV (per relay). So the total effect on the DMM uncertainty caused by the switch EMF would be 2*2.5uV = 5uV? Or should these be added as RSS: = sqrt(2.5^2+2.5^2) since you can not know if the two relays have the same emf?
    Is there anything that can be done to characterize or account for this EMF (software cal, etc?)?
    For example, assuming the following:
    * Instruments and standards are powered on for several hours to allow thermal stability inside of the rack and enclosures
    * temperature in room outside of rack is constant
Is there a reliable way of measuring/zeroing the effect of system EMF? Could this be done by applying a high-quality, low-EMF short at the point where the DUT would normally be located, followed by a series of long-aperture voltage average measurements at the lowest DMM range, where the end result (say (+)8.9....uV) could be taken as a system calibration constant accurate to the specs of the DMM?
What would the accuracy of the 4071 DMM be? Can I calculate it as follows: 8.9uV +- 700.16nV using the 90-day specs, and 8.9uV +- 700.16nV + 150nV due to "Additional noise error", assuming an integration time of 1 (aperture) for ease of reading the chart, and a multiplier of 15 for the 100mV range? (Is this equivalent to averaging a reading of 1 aperture 100 times?)
    So, given the above assumptions, would it be correct to say that I could characterize the system EMF to within  8.5uV+- [700.16nV (DMM cal data) + 0.025ppm*15 (RMS noise, assuming aperture time of 100*100ms = 10s)] = +-[700.16nV+37.5nV] = +- 737.66nV? Or should the ppm accuracy uncertainties be RSS as such: 8.5uV +- sqrt[700.16nV^2 + 37.5nV^2] = 8.5uV +-701.16nV??
As is evident from my above line of thought, I am not at all sure how to properly sum the uncertainties (I think you always do RSS for uncertainties from different sources?) and, more importantly, how to read and use the graph/table in the NI 4071 Specifications.pdf on page 3. What exactly does it entail to have an integration time larger than 1? Should I adjust the aperture time, or would it be more accurate to just leave the aperture at default (100ms for the current range) and average multiple readings, say 10, to get a 10x aperture equivalent?
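On the summing question: independent (uncorrelated) error sources are conventionally combined root-sum-square, while a linear sum is the worst case for offsets that could all land with the same sign. A quick numeric sketch using the figures quoted above:

```python
import math

# Two relay EMFs of 2.5 uV each (one per DMM probe path).
linear_sum = 2.5 + 2.5               # 5.0 uV worst case, if both offsets add
rss_sum = math.hypot(2.5, 2.5)       # about 3.54 uV, if the two are independent

# Combining the DMM cal uncertainty with the additional noise term (both in nV).
combined = math.hypot(700.16, 37.5)  # about 701.16 nV

print(f"linear: {linear_sum:.2f} uV, rss: {rss_sum:.2f} uV")
print(f"combined: {combined:.2f} nV")
```

Note how little the 37.5 nV noise term moves the RSS total; small independent contributions largely vanish under root-sum-square.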
    The below text includes what was going to be the post until I think I answered myself. I left it in as it is relevant to the problem above and includes what I hope to be correct statements. If you are tired of reading now, just stop, if you are bored, feel free to comment on the below section as well.
The problem I have is one of fully understanding part of this documentation. In particular, since a relay consists of (at least) 2 dissimilar metal junctions (as mentioned in the NI Switch help\Fundamentals\General Switching Considerations\Thermal EMF and Offset Voltage section) and because of the thermocouple effect (Seebeck voltage), it seems that there would be an offset voltage generated inside each of the relays at the point of the junction. It refers to the "Thermocouple Measurements" section (in the same help document) for further details, but this is where my confusion starts to creep up.
    In equation (1) it gives the expression for determining E_EMF which for my application is what I care about, I think (see below for details on my application).
    What confuses me is this: If my goal is to, as accurately as possible, determine the overall uncertainty in a system consisting of a DMM and a Switch module, do I use the "Differential thermal EMF" as found in the switch data-sheet, or do I need to try and estimate temperatures in the switch and use the equation?
    *MY answer to my own question:
By carefully re-reading the example in the thermocouple section of the switch help, I realized that they calculate 2 EMFs: one for the internal switch, calculated as 2.5uV (given in the spec sheet of the switch as the typical value), and one for the actual thermocouple. I say actual, because I think my initial confusion stems from the fact that the documentation talks about the relay/switch junctions as thermocouples in one section, and then talks about an external "probe" thermocouple in the next, and I got them confused.
    As such, if I can keep the temperatures at the junctions inside the switch low (through adequate ventilation and by powering down latching relays), I should be able to use 2.5 uV as the EMF contribution from the switch module, or, to be conservative, the <12 uV maximum (again from the 2527 data sheet).
    I guess now I have a hard time believing the 2.5 uV typical value listed. They say the junctions in the relays are typically an iron-nickel alloy against a copper alloy. Those combinations are not explicitly listed in the Seebeck-coefficient table in the NI documentation, but even a very small coefficient, like 0.3 uV/C, adds up to 7.5 uV at 25 degC. I'm thinking maybe the table values in the NI documentation refer to the Seebeck coefficients at 25 C?
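    For reference, the back-of-the-envelope arithmetic above can be written out as follows. The 0.3 uV/C coefficient and the 25 C temperature difference are the assumptions from the paragraph above, not data-sheet values, and note that the calculation implicitly treats the full 25 C as the gradient across the junction pair, which is why keeping the junctions isothermal matters:

```python
# Seebeck voltage for one junction pair: E = S * dT,
# where S is the Seebeck coefficient and dT the temperature
# difference between the two junctions.
seebeck_uv_per_c = 0.3  # uV/C, assumed small coefficient (illustrative)
delta_t_c = 25.0        # C, assumed temperature difference (illustrative)

e_emf_uv = seebeck_uv_per_c * delta_t_c
print(e_emf_uv)  # 7.5 uV, well above the 2.5 uV typical listed for the 2527
```

    If the junctions sit within a fraction of a degree of each other, the same coefficient yields well under a microvolt, which would make the 2.5 uV typical value plausible.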
    Project Engineer
    LabVIEW 2009
    Run LabVIEW on WinXP and Vista system.
    Used LabVIEW since May 2005
    Certifications: CLD and CPI certified
    Currently employed.

    A Seebeck EMF needs temperature gradients; in your relays you hopefully have low temperature gradients. However, in a switching contact you can have all kinds of diffusion and 'funny' effects, so keeping the junctions at the same temperature is the best you can do.
    Since you work with a multiplexer and with TCs, you need a good cold junction (at 0 °C for serious calibrations), and that is the right place to put your short when measuring the zero EMF. Another good test is to loop the 'hot junction' back to the cold junction and observe the residual EMF. Touching (or heating/cooling) the TC loop gives you another number for the uncertainty calculation: the inhomogeneity of the TC material itself.
    A good source for TC knowledge:
    Manual on the use of thermocouples in temperature measurement,
    ASTM PCN: 28-012093-40,
    ISBN 0-8031-1466-4 
    (Page 1): 'Regardless of how many facts are presented herein and regardless of the percentage retained, all will be for naught unless one simple important fact is kept firmly in mind. The thermocouple reports only what it "feels." This may or may not be the temperature of interest.'
    Message Edited by Henrik Volkers on 04-27-2009 09:36 AM
    Greetings from Germany
    Henrik
    LV since v3.1
    “ground” is a convenient fantasy
    '˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'

  • Macbook Battery Calibration Questions

    Greetings!
    I just bought a MacBook today (the mid-range white model), but I am a complete tyro in the Apple world. I have a few questions about the calibration process. The manual is lucid on everything except three issues:
    1. The instructions make no mention of whether I can turn on my MacBook and use it during the first charge cycle (that is, the charge cycle before the two-hour normal-use stage).
    2. The instructions do not say whether I have to wait until the battery is completely dead during the "die-down" stage (the supposed five-hour sleep stage) before I plug in the charger and fully charge the battery again. Is it all right to plug in the power adapter after the computer has slept for five hours, while it is still in sleep mode?
    3. If for some reason I botched my first calibration (perhaps by mishandling the process in (1) or (2) above), is it detrimental to immediately run another calibration (i.e., treat the second "power up" cycle as step one of a new calibration)?
    The reason I raise these issues is that I am calibrating my MacBook as I write this. I plugged the adapter into the computer, turned it on, and ran programs (iTunes, CD ripping, Wi-Fi) while the initial charge cycle was running. I have let the computer fully charge and have left it fully charged for a little over two hours. I am about to disconnect it and move to the "die-down" stage (that is, I will let it run down into sleep mode). I am concerned about whether I have slipped up at any point in the calibration process. Any help regarding my questions would be much appreciated. Thank you for your time and consideration.
    -- BibleJordan

    And yes, it is OK to plug in the adapter while the computer is in sleep mode. The battery will charge as normal.
    No, I think this is wrong. You need to let it sleep without the power adapter; this depletes most of the battery's remaining reserve charge. If you look at pages 23-24 of the manual, steps 4-6 are critical. Basically: drain the battery until the computer goes to sleep (step 4), let it sleep for five hours (step 5), then plug in the AC adapter and leave it connected until the battery is fully charged (step 6).
    So do not plug in the AC adapter until the computer has been either off for five hours or in sleep mode for five hours.
    Personally, I don't see how you could shut it off instead of letting it sleep for the five hours: once it goes to sleep from a low battery, you cannot wake it to turn it off unless you plug in the AC adapter, and that defeats the whole purpose...
    Regards,
    RacerX
    MacBook 2.0Ghz, 2GB RAM   Mac OS X (10.4.7)  
