Gamma Correction
Gamma correction is a non-linear process that adjusts the brightness and contrast of an image to account for the way human eyes and display devices perceive light. It ensures that the tonal values in a digital image are displayed accurately on a screen by compensating for the non-linear relationship between a pixel's numerical value and its actual brightness.
How gamma correction works:
The core of gamma correction is a power-law relationship that adjusts the mid-tones of an image without affecting the pure black and pure white points.
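In symbols, a minimal statement of that power law (using V for a normalized value in [0, 1]):

```latex
V_{\text{out}} = V_{\text{in}}^{\gamma},
\qquad 0 \le V_{\text{in}} \le 1,
\qquad
\gamma < 1 \text{ for encoding (compression)}, \quad
\gamma > 1 \text{ for display (expansion)}
```

Because 0^γ = 0 and 1^γ = 1, pure black and pure white pass through unchanged; only the values in between are shifted.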
Non-linear response: Displays do not output light linearly. A typical monitor with a gamma of 2.2 emits only about 22% of its maximum light for a pixel stored at 50% of full value (0.5^2.2 ≈ 0.22), so linear image data looks far too dark on an uncorrected display.
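A quick numeric check of that behavior (a minimal sketch, assuming an idealized display that applies a pure 2.2 power curve):

```python
# Idealized display: emitted light = (stored value) ** 2.2
display_gamma = 2.2

stored_value = 0.5                       # pixel stored at 50% of full scale
light_output = stored_value ** display_gamma

print(f"{light_output:.3f}")             # 0.218 -> only ~22% of maximum light
```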
Perceptual encoding: The human eye is more sensitive to changes in dark tones than in bright ones. Gamma correction applies a curve to the image data to match this non-linear sensitivity, which allows for more efficient use of the limited tonal range available in standard 8-bit image formats.
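The sketch below shows why that matters for 8-bit storage; it assumes a pure 2.2 power curve as a stand-in for the real sRGB transfer function. With gamma encoding, far more of the 256 code values land in the dark tones where the eye is most sensitive:

```python
# Count how many of the 256 8-bit code values land in the darkest 20% of
# linear light, with linear storage versus gamma-encoded storage.
display_gamma = 2.2  # decoding exponent; encoding uses its reciprocal (~0.45)

linear_codes = sum(1 for code in range(256) if code / 255 < 0.20)
gamma_codes = sum(1 for code in range(256) if (code / 255) ** display_gamma < 0.20)

print(linear_codes)  # 51  -> only ~20% of the codes cover the darkest tones
print(gamma_codes)   # 123 -> nearly half of the codes cover the same range
```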
Encoding and decoding:
Gamma encoding: When a digital image is created by a camera or software, it is "gamma-encoded" (compressed) by applying a power-law function with an exponent (gamma) of less than 1. For the sRGB color space, which targets an overall display gamma of approximately 2.2, the encoding exponent is the reciprocal, roughly 1/2.2 ≈ 0.45. This brightens the image's mid-tones.
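A minimal encoding sketch under the same simplification (the real sRGB curve adds a short linear segment near black, omitted here):

```python
def gamma_encode(linear: float, display_gamma: float = 2.2) -> float:
    """Compress a linear light value in [0, 1] with a power-law curve.

    The exponent 1/display_gamma is less than 1, which lifts the mid-tones.
    (The real sRGB curve also has a small linear toe near black.)
    """
    return linear ** (1.0 / display_gamma)

print(f"{gamma_encode(0.5):.2f}")   # 0.73 -> mid-gray is stored noticeably brighter
```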
Display gamma: A monitor automatically applies its own gamma curve, raising the input signal to a power greater than 1 (about 2.2 for a typical sRGB display). Because the encoding and display exponents multiply to a combined gamma of about 1.0 (a straight line), the display's "gamma expansion" cancels the image's "gamma compression," and the light reaching the viewer is proportional to the original scene values.
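And the display side, completing the round trip under the same idealized assumptions:

```python
def display_response(encoded: float, display_gamma: float = 2.2) -> float:
    """Monitor side: expand the encoded signal with an exponent greater than 1."""
    return encoded ** display_gamma

# Feeding the encoded mid-gray from the previous sketch (~0.73) back through
# the display recovers the original linear value: combined gamma is ~1.0.
print(f"{display_response(0.5 ** (1 / 2.2)):.2f}")   # 0.50
```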