Color temperature is most simply described as a way of characterizing the color of light, usually as either warm (yellowish) or cool (bluish), expressed in kelvins (K).
That’s a little too simple to be of more than introductory value.
A more technical definition assigns a numerical value, in kelvins, to the color emitted by a light source. The Kelvin color temperature scale imagines a black body object (such as a lamp filament) being heated. At some point the object gets hot enough to begin to glow, and as it gets hotter its glowing color shifts, moving from deep reds, such as a low-burning fire would give, through oranges and yellows, all the way up to white hot. Light sources that glow this way are called "incandescent radiators," and their advantage is a continuous spectrum: they radiate light energy at all wavelengths of the visible spectrum, and therefore render all the colors of a scene they light equally. Only light from sources functioning this way can meet the truest definition of color temperature.
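To make the continuous-spectrum idea concrete, here is a minimal Python sketch of Planck's law, the formula that governs black-body radiation. It compares the relative energy at the blue and red ends of the visible range at several temperatures; the temperatures chosen are illustrative, not drawn from this text.

```python
import math

# Physical constants (SI units)
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m: float, temp_k: float) -> float:
    """Spectral radiance of a black body (W / (sr * m^3)), per Planck's law."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * KB * temp_k)
    return a / math.expm1(b)

# Compare the energy at the blue end (450 nm) to the red end (650 nm)
for temp in (1500, 3200, 5600, 9000):   # ember / tungsten / daylight / shade
    red = planck_radiance(650e-9, temp)
    blue = planck_radiance(450e-9, temp)
    print(f"{temp:>5} K  blue/red ratio: {blue / red:.3f}")
```

At low temperatures the ratio sits far below 1 (the glow is red-heavy); as the temperature climbs, the blue end catches up and eventually dominates, which is exactly the red-to-white shift described above.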
Note: the light spectrum is wider than our ability to see it. Energy just below the visible part of the spectrum (longer wavelengths) is referred to as infrared, and energy just above it (shorter wavelengths) as ultraviolet. Each can adversely affect an image, and you may need to add filtration to remove them.
Light sources that are not incandescent radiators instead have what is referred to as a "correlated color temperature" (CCT). Their placement on the color temperature scale is based strictly on visual appearance. Lights with a correlated color temperature do not radiate equally at all wavelengths of their spectrum; as a result, they can render certain colors at disproportionate levels, both high and low. These sources are measured by their ability to accurately render all the colors of their spectrum, on a scale called the Color Rendering Index (CRI).
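Because CCT is a visual match rather than a physical temperature, it is derived from a source's measured chromaticity. As an illustration, here is a sketch using McCamy's published approximation, which maps CIE 1931 (x, y) chromaticity coordinates to an approximate CCT (the function name is ours; the coefficients are McCamy's):

```python
def mccamy_cct(x: float, y: float) -> float:
    """Approximate CCT in kelvins from CIE 1931 (x, y) chromaticity,
    using McCamy's cubic approximation (valid roughly 2000-12500 K)."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# CIE Standard Illuminant A (tungsten) has a known CCT of about 2856 K
print(f"Approx. CCT: {mccamy_cct(0.44757, 0.40745):.0f} K")  # ~2857 K
```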
Every LED diode leaves the factory with some fluctuation in color temperature; the quality of the LED determines how much. A typical nominal 3000 K diode will have a tolerance of 2800 K to 3200 K, a variation the human eye cannot readily detect.
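A quick, hypothetical sketch of what that bin tolerance means in practice (the function, the tolerance default, and the sample measurements are all illustrative):

```python
def within_bin(measured_cct: float, nominal_cct: float,
               tolerance: float = 200.0) -> bool:
    """True if a diode's measured CCT is within +/- tolerance of its bin."""
    return abs(measured_cct - nominal_cct) <= tolerance

# Hypothetical measurements from a batch of nominal 3000 K diodes
for cct in (2850.0, 3010.0, 3190.0, 3240.0):
    status = "in bin" if within_bin(cct, 3000.0) else "out of bin"
    print(f"{cct:.0f} K -> {status}")
```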
Color Rendering Index (CRI) Defined
A simple definition of Color Rendering Index (CRI): it measures the ability of a light source to accurately render all frequencies of its color spectrum when compared to a perfect reference light of a similar color temperature. It is rated on a scale from 0 to 100; the lower the CRI rating, the less accurately colors will be reproduced. Light sources that are incandescent radiators have a CRI of 100, since all colors in their spectrum are rendered equally. As stated earlier, light sources that are not incandescent radiators have correlated color temperatures.
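For a sense of where the number comes from: the CIE's general index Ra averages eight special indices, each of which penalizes the color shift (delta E) of a standard test sample between the test source and its reference illuminant, Ri = 100 - 4.6 * delta E. A minimal sketch with hypothetical shift values:

```python
def special_cri(delta_e: float) -> float:
    """Special color rendering index for one test sample (CIE 13.3):
    Ri = 100 - 4.6 * delta_e, where delta_e is the sample's color shift
    between the test source and the reference illuminant."""
    return 100.0 - 4.6 * delta_e

def general_cri(delta_es: list[float]) -> float:
    """General index Ra: the arithmetic mean of the special indices R1-R8."""
    assert len(delta_es) == 8, "Ra is defined over the eight CIE test samples"
    return sum(special_cri(de) for de in delta_es) / len(delta_es)

# Hypothetical color shifts for the eight test samples under some LED source
shifts = [1.2, 0.8, 2.5, 1.9, 1.4, 2.1, 1.0, 3.0]
print(f"Ra = {general_cri(shifts):.1f}")  # ~92.0 with these made-up shifts
```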
Light sources with correlated color temperatures and CRI ratings below 100 include HMIs, most photo-quality fluorescent lamps, and LEDs. At lower CRI ratings these sources may also carry too much green or magenta in their spectrum. For professional imaging, a CRI of 90 or above is considered acceptable.