Hello there,
I have noticed a problem that seems to plague computer monitors and maybe TV sets.
First, most computer monitors produce color by mixing three primaries, Red, Green, and Blue (which I will abbreviate as R, G, B as needed), within each pixel. The monitor achieves its various colors and shades by varying the amount of light allowed through the aperture for each primary. If a pixel needs to show red, the red aperture opens while the other two stay closed, and the viewer sees red at that pixel. For violet (really magenta, a red-plus-blue mix), it opens both the red and blue apertures and keeps the green closed. For white, it opens all three fully.
That is the basic operation; different display technologies achieve the aperture change in various ways.
Each pixel color R, G or B emits a certain amount of light when the aperture is open for that color. It's like a tiny flashlight that gets turned on, and there is one for each of the three colors. The more each one emits, the brighter the screen appears.
The problem is that there is the equivalent of three flashlights behind each pixel. When only one is turned on, the pixel emits about half the light it does with two on, and only about a third of the total light output it emits with all three on, which is what has to happen to create white.
To put it another way, when white is displayed the screen outputs THREE times the light intensity it does for any single color such as blue. This of course means that anything white in the picture is going to be brighter than anything made of one or two colors. It is especially noticeable when a colored image sits on a white background: the color gets drowned out by the white surrounding it. The same applies to many kinds of graphics, such as a program that shows a file listing, which is mostly dark text on a pure white background. The white background is almost blinding.
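The three-flashlights arithmetic can be sketched in a few lines. This is a deliberately simplified model that treats each subpixel as an independent light source and ignores gamma and panel-specific behavior:

```python
# Simplified model: each subpixel contributes light independently,
# so total emission is just the sum of the three channel levels.
def total_emission(r, g, b):
    return r + g + b

white = total_emission(255, 255, 255)  # all three subpixels fully open
blue = total_emission(0, 0, 255)       # only the blue subpixel open

print(white)         # 765
print(blue)          # 255
print(white / blue)  # 3.0 -- white emits three times the light of pure blue
```

Under this model, a full-white pixel really does put out three times the light of a saturated single-primary pixel, which is the ratio the rest of the post is built on.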
Ok, white is almost blinding, so just turn down the brightness, right?
I wish that worked. What happens is that all the colors decrease in brightness as well.
What this means is that if you set the monitor for viewing that file listing, so the white background doesn't blind you, then when you go to look at a picture of a natural scene or a human subject it appears far, far too dark.
This problem is not subtle either. There is a marked difference in brightness of a white screen compared to a single color screen like blue.
This whole idea comes into play when converting color photos to black and white, because each gray level has to be the average of the three colors:
C=(R+G+B)/3
(Note: standard conversions actually weight the three colors differently, roughly 0.3R + 0.59G + 0.11B, but since the weights sum to 1 this doesn't change the issue being discussed here, so the simple average is used for clarity.)
What that formula does for three channels at the same high intensity is make the brightness of the black-and-white picture equal to the maximum pixel value, which is 255 on most monitors. This happens because R, G, and B are all 255, so the average is 255 as well:
C=(255+255+255)/3=255
So the pixel at that location would be 255. There are still three color channels, however, so each one would be set to 255.
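The conversion can be sketched in code. The simple average matches the formula above; the weighted version shows the "correction factor" mentioned in the note (these are the BT.601 luma coefficients, included here just for comparison):

```python
# Simple average, as in C = (R + G + B) / 3.
def gray_average(r, g, b):
    return (r + g + b) / 3

# Weighted (luminance) version; the BT.601 coefficients sum to 1,
# so full white still maps to the maximum level.
def gray_weighted(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

print(gray_average(255, 255, 255))   # 255.0
print(gray_weighted(255, 255, 255))  # also 255, since the weights sum to 1
```

Either way, a white pixel converts to the maximum gray level, which is why the averaging step doesn't change the white-vs-single-color disparity described next.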
So what happens when a single color is displayed vs when white is displayed?
For a single color the maximum intensity setting is 255, so say blue would be 255.
But for white, we would have the total sum 255+255+255=765.
So we would be staring at a pixel with relative intensity 765 versus a pixel with intensity of only 255. That's the problem. The monitor and graphics card do not average the three channels the way we do when creating a black-and-white photo from a color photo. If they did, and if the three primaries really were of equal intensity, the right levels would not be 255, 255, 255 but 255/3, 255/3, 255/3, for a total intensity of 255 again. Then both white and blue would show up (ideally) at the same intensity, and the colors would appear normal compared to white.
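The normalization described here could be sketched like this. To be clear, this is a hypothetical scheme for illustration only; real monitors and graphics cards do not do this:

```python
# Hypothetical normalization: scale each RGB triple so its TOTAL
# emission never exceeds a single subpixel's maximum (255).
# Integer arithmetic keeps the results exact for this example.
def normalize_total(r, g, b, cap=255):
    total = r + g + b
    if total <= cap:
        return (r, g, b)
    return (r * cap // total, g * cap // total, b * cap // total)

print(normalize_total(255, 255, 255))  # (85, 85, 85) -- total is 255
print(normalize_total(0, 0, 255))      # (0, 0, 255) -- already within the cap
```

White gets cut to a third of each channel while pure blue is untouched, so both end up at the same total emission. The obvious cost is that white can never be brighter than a saturated primary, which is presumably why displays don't work this way.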
So the question is, does anyone have any idea what to do about this?
What it leads to is that you have to set one brightness for reading text on a white background and readjust it for viewing photographs.
It's hard to switch back and forth when, say, surfing the web, where you have to read text and view photos in the same session.
BTW, this has been a known issue for projectors in the past; Epson has apparently improved its projectors to reduce the problem. It seems to appear in monitors too, though.