Is gamma still important to take into account?



Do different monitors (including mobile screens) still use significantly different gamma functions when displaying colour images? Is there a standardised way to represent colour that can then be translated according to the gamma of the monitor, or are they all sufficiently similar to preclude the need for this nowadays?


Posted 2015-08-05T16:33:33.160

Reputation: 3 748

This doesn't sound like Computer Graphics as our scope intended it to be, this site is specifically oriented towards 3d, rendering, simulation, and the process of viewing them. This sounds like it belongs on graphic design. – Robobenklein – 2015-08-05T17:15:14.783

@robobenklein this would most likely get rejected from graphic design, and it's borderline. Besides, gamma is about viewing your art – joojaa – 2015-08-05T17:41:02.177


I'd say this post is on topic. It seems to relate heavily to GPU Gems 3: Chapter 24. The Importance of Being Linear

– Alan Wolfe – 2015-08-05T18:30:29.643

Ah, actually I lied... This question would be welcome on GD.SE because a similar one is there. If I can get to it first; if I'm away it will most likely get modded to oblivion. – joojaa – 2015-08-05T18:32:39.813

@AlanWolfe i used to own that book, somebody at work stole it from my hand reference. – joojaa – 2015-08-05T18:37:02.960

5Whether something is on topic elsewhere is not relevant to this decision. We just need to know whether it's on topic here. – trichoplax – 2015-08-05T19:20:25.787

1I'm deliberately posting borderline questions to measure where the border is. Please raise as many uncertainties as possible on Meta. – trichoplax – 2015-08-05T19:21:00.567

3Being welcome elsewhere is never a proper reason alone to declare something off-topic here. – Christian Rau – 2015-08-05T19:43:17.330

2@robobenklein nowhere in this stack's scope states this stack is for 3D graphics specifically. – Qix – 2015-08-05T20:03:42.837


Such discussions are important to have - that's what this private beta is all about. So I've raised it on meta

– trichoplax – 2015-08-05T20:06:19.210



Yes. While most screens and OS operations use a gamma of around 2.2, your hardware and computation results still need to be corrected. There are also special media, such as broadcast TVs, that use a different gamma. Sensor equipment such as cameras is mostly linear, so its output needs to be adjusted accordingly. On top of this, the user can set their system to whatever gamma they like.

In computer graphics, the primary reason to account for gamma is that your computation is most likely done in linear space*; otherwise summing light contributions becomes unnecessarily complicated. In any case, gamma is really a simplification: you would be much better off doing profile-to-profile conversions if you can afford the computational time to do so.

Mathematical explanation

This may be simpler to understand. Your monitor has a display error of $g(x)$, which is no problem if your data conforms to $g(x)$. However, if you want to calculate:

$$ a+b+c+d $$

and $a$, $b$, $c$, $d$ have the error of $g(x)$, you would actually need to calculate:

$$ g(g^{-1}(a)+g^{-1}(b)+g^{-1}(c)+g^{-1}(d)) $$

You would need to do this for each sub-element, over and over again. Therefore, instead of running the transform each time, you transform everything once to linear and then once back at the end.
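As a sketch of the maths above in Python, here is why averaging gamma-encoded values goes wrong, assuming a simple power-law $g(x) = x^{1/2.2}$ (real displays use the slightly different sRGB curve, but the principle is the same):

```python
GAMMA = 2.2

def decode(v):
    # g^-1: gamma-encoded value -> linear light
    return v ** GAMMA

def encode(v):
    # g: linear light -> gamma-encoded value
    return v ** (1.0 / GAMMA)

def average_naive(values):
    # Wrong: averages the gamma-encoded values directly.
    return sum(values) / len(values)

def average_linear(values):
    # Right: decode to linear, average, then re-encode,
    # i.e. g(g^-1(a) + g^-1(b) + ...) scaled by the count.
    return encode(sum(decode(v) for v in values) / len(values))

# Averaging pure black (0.0) and pure white (1.0):
print(average_naive([0.0, 1.0]))   # 0.5 -- displays too dark
print(average_linear([0.0, 1.0]))  # ~0.73 -- the correct mid-grey
```

The naive average lands at 0.5, which the monitor's response then darkens well below half of the white's light output; the linear-space average compensates for that.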

And finally, an artistic reason

It may just look better with a different gamma. Many games have a gamma adjustment for this reason.

* The worst case is that you treat your computations as linear but do not compensate for the monitor's nonlinear output characteristics.



Reputation: 6 680

Do we have latex yet? – joojaa – 2015-08-05T17:53:28.320


No, but hopefully we will in the future. Feel free to add your answer as an example of its necessity, though.

– Christian Rau – 2015-08-05T18:18:01.940

If you want to show how your answer would look if we had MathJax, to help make the case, you can use

– trichoplax – 2015-08-05T19:36:50.790


The de facto standard color space for digital images these days is sRGB. sRGB is a good default assumption if working with a display whose exact color space is not known (i.e. most random displays someone might run your app on), or images whose color space encoding is not known (i.e. most random image files you might encounter).

The sRGB standard defines the CIE chromaticity of the pure red, green, and blue primaries and the white point—in other words, it defines what those primaries and white should perceptually look like relative to pure wavelengths.

sRGB also defines a gamma curve that's used for encoding the RGB values. The gamma curve is the part that graphics programmers are usually concerned with, as we have to convert colors back and forth between sRGB and linear to do lighting math physically-correctly. All modern GPUs have sRGB support built in: they can automatically apply the gamma transformations in hardware when sampling a texture, or writing a pixel value to a render target.
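For reference, the sRGB gamma curve is not a pure power function; it has a small linear segment near black. A sketch of the conversions in Python (the constants 12.92, 0.0031308, 0.04045, and 2.4 come from the sRGB specification):

```python
def srgb_to_linear(s):
    # sRGB-encoded [0,1] -> linear light [0,1]
    if s <= 0.04045:
        return s / 12.92              # linear segment near black
    return ((s + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # linear light [0,1] -> sRGB-encoded [0,1]
    if c <= 0.0031308:
        return c * 12.92
    return 1.055 * c ** (1.0 / 2.4) - 0.055

# Round trip should recover the original value:
print(srgb_to_linear(linear_to_srgb(0.5)))  # ~0.5
```

This is what the GPU's sRGB texture and render-target hardware does for you per channel, so you rarely need to write it yourself; it is mainly useful in offline tools or shaders without hardware sRGB support.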

As far as monitors are concerned, with a high-quality one it should be possible to calibrate its settings (or it may come pre-calibrated) so that its output matches sRGB as faithfully as possible. In case the monitor itself can't do it, a limited amount of color correction can also be done on the GPU during scan-out; there are some small hardware lookup tables that the RGB values are mapped through before being sent out over the wire.

You might also come across Rec. 709, which is the standard color space for HDTVs; it's very similar to sRGB, using the same primaries and white point, but a slightly different gamma curve. Some high-end monitors use the Adobe RGB color space, which is somewhat wider-gamut than sRGB; photographers tend to like those because they more faithfully represent what photos will look like when printed. The next generation of HDR TVs coming out (hopefully) in the next few years will use Rec. 2020, which has a huge gamut and requires 10 or 12 bits per component rather than 8.

So, to come back to your question of whether you need to worry about different monitors having different gamma: not much. For gaming and general PC graphics you can pretty much assume sRGB, and figure that if the user really cares about color accuracy, they'll have a good, calibrated monitor. If you're producing software for photographers or print media, or for next-gen HDR video standards, then you might have to start worrying about wide-gamut color spaces.

Nathan Reed


Reputation: 15 036

Wider gamut monitors will also start to be commonplace in near future. For example I do not calibrate to sRGB on my art worstation, but the profile to profile converter makes images still look the same as on my dev machine. – joojaa – 2015-08-06T06:41:08.183