In order to achieve accurate calibration, you need to understand a bit of color science and the history of quantifying how humans see the world, so brushing up on those topics (CIE, XYZ color spaces, ICC profiles, LUTs vs. matrices) will help immensely. Calibrating your display and environment for color-critical photo and video work is simply not an easy task. Fortunately it can now be done with free, open-source software (DisplayCAL) and affordable hardware (X-Rite i1Display Pro), but do expect an investment of time. Some of the following resources are helpful:
Disclaimer: Calibration is important, but it’s also very easy to get it wrong and make things worse.
The inevitable experience of every colorist involves sending graded material to a client for review and getting feedback based more on the inaccuracy of their display than on your color correction. Fortunately, as of the iPhone 11, iPhones are standardized and high enough quality that, for basic SDR content, you can give a client a great idea of finished color. Use the Frame.io app (never the browser!) and have them set brightness to about 60% in a normal viewing environment. Apple computers also have a good track record of looking decent out of the box (part of the reason I recommend them in the ‘computer’ section), but you’ll need to understand the concept of ‘color management’ to get predictable and trustworthy results. One good approach is to view your export on different devices, e.g. look at your Frame.io export on an iPhone screen to gauge whether your work looks the same as it did on your computer display.
Many people are familiar with the classic “SMPTE color bars”, a hallmark of the NTSC signal verification days.
Being acquainted with the basics of calibration with SMPTE bars doesn’t hurt and it’s pretty easy. This is a good place to start and what many old-school industry insiders will think of when they hear the word “calibration”. It’ll also help you understand the purpose of this colorful set of rectangles you’ve likely seen before.
This process of using knobs and dials to ‘calibrate’ or ‘tune’ a display is similar to how we’ll use the term ‘calibration’ here. We’ll later explore how the umbrella term ‘calibration’ has come to generally mean “get my image to look right” but this act of adjusting a display’s controls to the best possible starting point is only the beginning of a full calibration and profiling solution.
First, let’s talk about what we’re actually ‘calibrating’. If you’re a photographer, you’re used to working off your computer monitor to do your edits. If you’re a videographer, especially one dabbling in color, you’ve likely been told you need an external “reference monitor” to do any serious grading. For a long time I wondered why professional photographers could work off ‘cheap’ computer monitors while even basic Rec. 709 colorists supposedly couldn’t.
Photo applications reference an “.icc profile”, often held by the OS, which basically outlines what your given display can show; it’s like an evaluation or report card of your monitor’s capabilities. Your media software (e.g. Photoshop) then converts colors in your document to look correct on your display as you work on the image.
Proper video calibration assumes the video software is less intelligent and can’t do this realtime conversion for you, typically requiring an external, video-specific display and a color conversion between that display and your computer, usually in the form of a “LUT”. That said, many video NLEs (like Resolve) now properly support ICC profiles and can therefore provide a “calibrated” image on your computer screen. Any professional colorist will still have a good external display, but for the budding amateur, the computer GUI display is much better than it used to be, so don’t let the professionals bully you into dropping $40k on a display (until you need to master HDR content, that is). There are still many ways a computer display can show inaccurate color: Google “Quicktime gamma shift” or “Premiere render washed out” for plenty of evidence of that fact. The refresh rate of your display is variable, the software may be scaling your image (making sharpening difficult to evaluate), and the interaction of operating system, video card, and ICC metadata is a precarious one that’s subject to change without you realizing it.
Let’s explore this in more depth. DisplayCAL and ArgyllCMS are built around the idea of ICC profiling, which has been common to color management on computers for many years. It gets a bit confusing, but here’s the gist: profiling is just measuring what a display is capable of, and that’s the first step to accurate color. A hardware device called a probe is placed on the display to be measured. The software displays patches of color, the probe measures the results, and the “profile” is basically just a description of what that display can do and what the RGB values fed to it are going to produce. The profile only describes the characteristics of the display; it’s not actually doing anything. Color-managed software is required to make use of it. Again, the profile is just a “characterization” of your monitor.
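To make “characterization” concrete, here is a toy sketch of the simplest possible profiling pass: show known gray patches, record what the probe reports, and fit the display’s tone response. Everything here is illustrative — the probe is simulated as a pure 2.2-gamma power law rather than real hardware, and a real profiler measures far more than gamma.

```python
import math

def simulated_probe_reading(v, native_gamma=2.2):
    """Stand-in for a real probe: relative luminance for drive level v."""
    return v ** native_gamma

# Gray patches "sent to the display" during the profiling run
patch_levels = [0.1, 0.25, 0.5, 0.75, 0.9]
readings = [simulated_probe_reading(v) for v in patch_levels]

# For a pure power-law display, log(Y) / log(V) is the gamma at every patch;
# averaging across patches recovers the display's tone response.
gammas = [math.log(y) / math.log(v) for v, y in zip(patch_levels, readings)]
estimated_gamma = sum(gammas) / len(gammas)

print(f"Estimated display gamma: {estimated_gamma:.2f}")  # ≈ 2.20
```

A real profile records this kind of characterization (plus primaries, white point, and more) so that color-managed software can compensate later — the measurement itself changes nothing on screen.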
So what then is “transforming” the image of your photo to look color accurate when you open it in Photoshop? What’s morphing the “bad colors” to the “correct colors”? That’s the responsibility of Photoshop itself in this case. Color-managed software like Photoshop will read metadata in the photo you open and see that this photo of your grandmother is sRGB. It will also read the ICC profile associated with the display, properly selected under “Displays” in the Mac’s “System Preferences”.
This tells Photoshop that your particular monitor’s primaries are something close to, say, a P3 color space. Now Photoshop, knowing the source and destination color spaces, has the information necessary to display the photo correctly on your monitor and to “convert” to the specific color space you intend to output to. And, importantly, all of this happens beneath a 1D system LUT, so if that LUT isn’t ‘on’, your calibration is potentially inaccurate. This gamma-only LUT helps get your white point in check and fixes grayscale issues across the luminance range. Both of these pieces, the 1D LUT and the color-managed software using the ICC profile, need to work together.
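The conversion a color-managed app performs can be sketched in a few lines. This assumes the display really is Display P3 and uses the standard published D65 matrices (sRGB to XYZ, XYZ to Display P3); real software reads these from the document and the ICC profile instead of hard-coding them.

```python
# Standard IEC sRGB -> CIE XYZ matrix (D65 white point)
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
# Standard CIE XYZ -> Display P3 matrix (D65)
XYZ_TO_P3 = [
    [ 2.4935, -0.9314, -0.4027],
    [-0.8295,  1.7627,  0.0236],
    [ 0.0358, -0.0762,  0.9569],
]

def srgb_decode(v):
    """sRGB transfer function: encoded value -> linear light."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def srgb_encode(v):
    """Linear light -> encoded value (Display P3 uses the same curve)."""
    return v * 12.92 if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

def matvec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def srgb_to_display_p3(rgb):
    linear = [srgb_decode(c) for c in rgb]          # undo document encoding
    xyz = matvec(SRGB_TO_XYZ, linear)               # to device-independent XYZ
    p3_linear = matvec(XYZ_TO_P3, xyz)              # to the display's primaries
    return [srgb_encode(max(0.0, c)) for c in p3_linear]

# Pure sRGB red sits inside the wider P3 gamut, so its P3 coordinates
# are less "extreme" than (1, 0, 0):
print(srgb_to_display_p3([1.0, 0.0, 0.0]))  # ≈ [0.917, 0.200, 0.138]
```

This is only the gamut half of the story; the 1D system LUT mentioned above sits underneath it, handling grayscale and white point at the video-card level.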
Bear with an odd analogy: Bill has a sensitive tongue and everything tastes salty to him. Jean has a semi-deficient tongue and adds salt to everything. You, the self-advertised taste-managed baker, are responsible for adjusting salt levels to please both clients. A hundred years ago a test measured a sampling of the Earth’s population and how they responded to this phenomenon of “saltiness”. This gave you a concrete way to relate how people perceive something as “salty” to a quantifiable amount of salt from a molecular perspective. You now know that 150 milligrams of salt is the objective “perfect amount” to create a satisfyingly salty dinner roll for the “standard diner”. But, because you know your clients, you’ve characterized them: Bill will require less salt and Jean more to receive the same sensation. That’s why you are the taste-managed baker. You know the measurements that relate to the experience you want for the end user, and you make the adjustment dynamically, for a perfect fit, no matter who walks into your store. But then you hire an apprentice who is only there begrudgingly as part of an internship. You can’t trust him to know your customers and craft with your care. He’ll need a recipe to follow, and that recipe will be specific to each customer: a Bill recipe and a Jean recipe. This is the 3D LUT approach.
A 3D LUT is much “dumber” than software intelligently converting between color spaces by reading an ICC profile on the fly. The 3D LUT, as you read above, is just a simple (albeit large) table of input and output values, so historically the video world has used a simpler form of color management. Because of that, I think it’s actually easier to understand as well as to implement correctly. Like all LUTs, this one is specific: it can’t intelligently “read” the color space of the incoming signal, and it has no idea what display device it’s connected to. It’s a “pre-baked” transform between one gamut and another, valid only in that specific situation.
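Because a 3D LUT is just a table, its mechanics fit in a short sketch. Below is a minimal 2x2x2 “identity” LUT and a trilinear lookup — the same mechanics a real 17³ or 33³ calibration LUT uses, just with far fewer grid points (the indexing is hard-coded for the 2-point grid to keep it readable).

```python
N = 2  # grid points per axis (real LUTs use 17, 33, 65...)

# lut[r][g][b] -> output (R, G, B) triple; identity here for demonstration
lut = [[[(r / (N - 1), g / (N - 1), b / (N - 1))
         for b in range(N)] for g in range(N)] for r in range(N)]

def lerp(a, b, t):
    """Componentwise linear interpolation between two RGB triples."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def apply_3d_lut(lut, rgb):
    """Trilinear interpolation between the 8 surrounding grid points."""
    n = len(lut) - 1
    # On a 2-point grid the cell is always the whole cube, so the
    # fractional positions are just the inputs themselves.
    fr, fg, fb = (c * n for c in rgb)
    c000, c001 = lut[0][0][0], lut[0][0][1]
    c010, c011 = lut[0][1][0], lut[0][1][1]
    c100, c101 = lut[1][0][0], lut[1][0][1]
    c110, c111 = lut[1][1][0], lut[1][1][1]
    c00, c01 = lerp(c000, c100, fr), lerp(c001, c101, fr)  # collapse red axis
    c10, c11 = lerp(c010, c110, fr), lerp(c011, c111, fr)
    c0, c1 = lerp(c00, c10, fg), lerp(c01, c11, fg)        # collapse green axis
    return lerp(c0, c1, fb)                                 # collapse blue axis

print(apply_3d_lut(lut, (0.25, 0.5, 0.75)))  # identity LUT returns the input
```

Note there is no metadata anywhere in the table: the LUT has no idea what the input signal is or what display it feeds, which is exactly why it is only valid for the one source/display pairing it was built for.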
So LUTs and ICC profiles are both color management tools but the implementation is different in most cases. That said, Device Link profiles are more similar to the functionality of a 3D LUT, but we’ll not discuss that here.
If you made it through the list of resources and explanations at the top you should have an understanding of some of the foundational principles of color calibration. Let’s move on to the tools.
Light Illusion’s ColourSpace is as much of an industry standard for calibration as any tool. Steve Shaw, the creator, even offers a free version that’s great for checking a display and correcting gamut and white point with the display’s controls, but it has no 3D LUT creation ability. The full version of this software is cost-prohibitive for many students, so we’ll spend more time discussing a slightly less intuitive program called DisplayCAL. That said, if you have the money, buy Steve’s software; his support is second to none.
Graeme Gill is a brilliant man whose ArgyllCMS color engine underlies the popular DisplayCAL open-source calibration software. He has even written his own driver, which allows X-Rite hardware to be used on Linux and Android and means he doesn’t have to comply with X-Rite’s licensing conditions. It also enables a ‘hi-res’ mode which allows spectros to sample at higher spectral resolutions (we’ll get into that later). This same color engine powers the Android app that turns your phone (plus a probe) into a very accurate color meter. On the calibration side, you get a lot of functionality and incredible results for a price tag of $0.
The DisplayCAL software is rooted in this functionality of ICC profiles: it will profile your display much as you would when calibrating a computer screen, then generate a 3D LUT based on that profile and your intended destination space.
The most practical device or “probe” for these sorts of calibrations is typically the i1Display Pro made by X-Rite. These are not perfect units, but the device is very good for the price. It uses colored filters and a sensor to measure color and therefore requires an emissive source (e.g. a computer screen). The shorthand term for this sort of probe is a “colorimeter”. The quote below references pairing the i1D3 with another type of probe to create an offset and increase its accuracy; this is a good idea if you have the money.
“The unit-to-unit variation accuracy of the i1d3 is non-trivial. I have had some of them in our lab that are scarily accurate, most are not that good, and a few are not very good at all. The typical i1d3 would benefit from a i1Pro2 profile, though the details would depend on the display type.”
Spectrophotometers, like the i1Pro2 referenced above, are this second type of probe. They are more costly but actually measure the wavelengths of light directly, without filters. They work on passive sources like a printed calibration page, rather than only on emissive sources. They are slower in operation, struggle in low light, and you pay more for narrower-bandwidth sampling ‘resolution’. If you have the money, I suggest buying two i1Display Pros, or one plus an i1Pro2 to verify it.
Take a look at this comparison between the two: X-Rite Comparison
The best way to learn DisplayCal is to simply follow their Quick Start Guide.
Download the software here.
Many of the default settings can be left untouched. DisplayCal’s developer Florian has put a lot of thought into this.
Profile Type is possibly your most important selection:
Matrices are simple transforms that work great if your device’s response is linear. They’re also less prone to the visual distortion LUTs can sometimes introduce. Using more than the default 34 patches is probably not necessary, as it doesn’t make much difference when you’re not calibrating to the granular level a LUT provides. This sort of calibration can happen in under 10 minutes. But if you have non-linearities or irregularities within the volume of the gamut, a matrix can’t help you. The matrix shifts or transforms the entirety of the three-dimensional volume of the image, stretching, scaling, and rotating as necessary, but if everything in the middle doesn’t already line up relatively well, you need the granularity of a 3D LUT. See Steve Shaw’s excellent explanation below:
“We are also focused on the best possible display calibration, as is attained via a 3D LUT using LightSpace CMS. We do not consider a 1D grey scale/white point LUT combined with a 3×3 matrix to ever produce an acceptable level of calibration, due to a total lack of volumetric data.
…The problem is that when an ICC based calibration system is used to attempt to directly ‘hardware calibrate’ a display (no software ICC profile involvement) it does so just with a 1D LUT for grey scale/white point management, and a simple 3×3 Matrix for gamut (colour). A 3×3 matrix can only manage gamut as a single entity, using the 6 flat plane sides of the matrix, without any volumetric information at all.
“Further, the 1D LUT component may not even reside within the display, or may be ‘supplemented’ by 1D LUT data held within the video card’s VCGT (Video Card Gamma Table), or even within the ICC itself, where the ‘corrections’ are applied as image manipulations.”
“Any ‘volumetric’ gamut information (measurements that are within the gamut of the display), if there is any, will be held within the associated ‘display’ ICC profile for linked ‘software calibration’ via image manipulation. The ICC profile acts within any ICC-aware graphics program used within the PC, effectively correcting the gamut of the image before it is displayed on the partially calibrated display. This means that any program that is not ICC-aware will not use the ICC profile data for image correction, and so will display any image only partially correctly.”
“The term ‘image correction’ is key here, as that is how ICC profiles work. They hold ‘profile data’ that is used by the graphics program’s CMM (Colour Management Module) to ‘correct’ the image, not calibrate the screen. As described already, this means the display is not actually fully calibrated itself, and due to processing restrictions (available processing power) the CMM can never ‘correct the image’ on the fly to the level required for true accuracy, as detailed below.”
“Further, just because a given ICC may contain a large amount of patch colour data, there is no guarantee the CMM in use will actually use all the data, or will interpret it correctly. Often, too much data will cause image artefacts, due to the limited colour processing the CMM is able to perform.”
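The “1D LUT for grey scale/white point management” described in the quotes above is just a per-channel remapping curve. Here is a toy VCGT-style example under an assumed scenario: the display measures with a native gamma of 2.4 but the target is 2.2, so the curve pre-distorts each drive level so that the composite response lands on the target. Note everything it can and cannot do: it fixes grayscale tracking, but touches nothing about gamut volume.

```python
NATIVE_GAMMA, TARGET_GAMMA = 2.4, 2.2  # assumed measured vs. desired response

# 256-entry curve, applied identically to R, G and B (a white-point fix
# would use three slightly different curves instead)
lut_1d = [(i / 255) ** (TARGET_GAMMA / NATIVE_GAMMA) for i in range(256)]

def displayed_luminance(level):
    """Drive level 0-255 -> luminance after the LUT plus the display's
    native response: (v^(2.2/2.4))^2.4 == v^2.2, the target."""
    return lut_1d[level] ** NATIVE_GAMMA

# Composite response now follows the 2.2 target:
print(displayed_luminance(128), (128 / 255) ** TARGET_GAMMA)
```

Because the same kind of curve operates on each channel independently, it has no volumetric information at all — which is exactly the limitation the quotes are making about pairing a 1D LUT with a 3x3 matrix.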
Follow the wiki dedicated to calibrating in Resolve, here.
Don’t use the matrix option as you have no need for it. Just use XYZ LUT. The matrix is there in case you have software that can’t utilize the LUT (e.g. Finder) but will utilize the matrix.
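For intuition about what a matrix option actually computes, here is a sketch: take the display’s measured native RGB-to-XYZ matrix, take the target standard’s matrix (Rec. 709 here), and solve for the 3x3 that makes the display behave like the target. The “measured” display numbers are hypothetical, chosen only to stand in for slightly oversaturated primaries.

```python
def matmul(a, b):
    """3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def inv3(m):
    """3x3 matrix inverse via the adjugate."""
    (a, b, c), (d, e, f), (g, h, i) = m
    A, B, C = e*i - f*h, c*h - b*i, b*f - c*e
    D, E, F = f*g - d*i, a*i - c*g, c*d - a*f
    G, H, I = d*h - e*g, b*g - a*h, a*e - b*d
    det = a*A + d*B + g*C
    return [[A/det, B/det, C/det], [D/det, E/det, F/det], [G/det, H/det, I/det]]

# Standard Rec.709/sRGB RGB -> XYZ (D65)
M_TARGET = [[0.4124, 0.3576, 0.1805],
            [0.2126, 0.7152, 0.0722],
            [0.0193, 0.1192, 0.9505]]

# Hypothetical measured display: slightly oversaturated primaries
M_DISPLAY = [[0.4500, 0.3300, 0.1700],
             [0.2200, 0.7100, 0.0700],
             [0.0150, 0.1050, 0.9600]]

# Feed RGB' = CORRECTION @ RGB to the display and it reproduces the target:
# M_DISPLAY @ CORRECTION == M_TARGET
CORRECTION = matmul(inv3(M_DISPLAY), M_TARGET)
```

This single matrix moves the entire gamut volume at once — stretching, scaling, rotating — which is why it works only when the display’s interior response already lines up and why non-linearities demand the XYZ LUT instead.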
Some of the following are other things to keep in mind:
Video vs. Data Levels
Color modification in linear (Broken Computer Color Video)
Why two monitors won’t look the same and how to (rather manually) fix it. Create a custom white point.
Determine your display’s specs here.
The type of display you own can make your success in calibration vary widely, for example, see this note from Graeme.
Narrow spectral primaries in technologies like laser projection and OLED mean different people observe the same color differently. A new standard (such as the TC 1-36 2012 proposed observer) may improve the average match slightly, but can’t possibly address individual variation. Effective methods to address individual variation are to broaden the primaries of the displays, perhaps moving in the direction of spectral reproduction, or to have a practical means of measuring individuals’ CMFs. The latter means that only one person at a time can see the display perfectly calibrated, though.
As mentioned, the world of calibration is a complex one, and the results are often not as good as anticipated due to human error, display inefficiency, and the fallibility or inconsistency of our visual system. That said, hopefully this sets you up with information to be able to start down the path of calibration.