Quote:
Originally Posted by Tomcomm
To be truly objective, you must include the gray screen 21FBP22A rare-earth. I have been playing with the color balance feature of my Photo Impact graphics program that reads the RGB values (0 to 255) of any spot on the JPG image. This is the hacker's approach to chromatic spectrum analysis but should give quantified data on relative phosphor differences of all your CRTs.
I have now performed the above Photo Impact “spectrum analysis” on some of my 21CT55 screenshots versus the ’84 Sony pro monitor and the ’97 Sony 27in TV. All pics were shot with the same Canon S40 and were not color enhanced. I used the DVE color bars for all screenshot tests and “normalized” each pic by placing the virtual sampling probe on each white bar and adjusting the brightness so that all three pics produced approximately the same maximum digital reading of 255 for the red, green and blue components. I then placed the sampling probe on the red bar and took digital readings of its red, green and blue components. Ideally, a perfectly pure red bar should produce a red component of 255 while its green and blue components read zero. Only the color accuracy of the Canon camera sensor is involved here, since the LCD computer monitor’s color accuracy is never an issue. Since relative CRT phosphor chroma response is all we are interested in, any variance in these readings indicates contamination of the ideal red phosphor’s light output, right?
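The probe-and-normalize procedure above can be sketched in a few lines of code. This is just a minimal illustration of the arithmetic, not Photo Impact itself: it assumes contamination is expressed as the stray green or blue component as a percentage of full scale (255), and the sampled values shown are hypothetical, not my actual screenshot readings.

```python
def contamination(r, g, b):
    """Percent of green and blue light mixed into a nominally pure red bar.

    A perfect red phosphor would read (255, 0, 0) after white-bar
    normalization; any nonzero green or blue component is reported as a
    percentage of the 8-bit full-scale value of 255.
    """
    return (g / 255 * 100, b / 255 * 100)

# Hypothetical probe readings taken from the red color bar:
g_pct, b_pct = contamination(250, 14, 21)
print(f"green contamination: {g_pct:.1f}%, blue contamination: {b_pct:.1f}%")
```

Any photo editor that reports per-pixel RGB values can supply the three numbers; the formula is the only part that matters.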
Preliminary estimates of red phosphor purity for an ’88 13in Sony pro monitor, a ’97 27in Sony “modern” TV and, of course, the 21CT55 with the 21FBP22A “rare-earth” CRT follow:
21in FBP22A…………Red contaminated with 5.6% green and 8.2% blue
13in Sony……………Red contaminated with 12.6% green and 6.3% blue
27in Sony……………Red contaminated with 29.2% green and 21.3% blue
No wonder the “modern” TV CRTs appear orangish red!