Videokarma.org

#12, Old 04-27-2014, 10:03 PM
old_tv_nut
See yourself on Color TV!

Join Date: Jul 2004
Location: Rancho Sahuarita
Posts: 7,712
Lots of reasons why the usual performance of early color was not the best:
1) tube circuits drift
2) interaction of circuits: horizontal phase, and therefore the burst gate, was affected by the H Hold setting, which changed the overall hue if ghosts were present
3) color was greatly affected by fine tuning
4) tube circuit designs were more susceptible to transmission variations, since designers were making do with as few tube sections as possible; compare that to late solid-state designs with analog ICs, where enough transistors could be thrown at the problems to get perfect sync without any adjustment
5) people were afraid to adjust the color, which was really needed on a program-by-program basis
6) 10% of men have seriously color-deficient vision, yet they would be the one in the family to determine how the controls should be set (I am not speculating; I have actually experienced this)
7) not enough red drive, especially in early sets, resulting in a cyan white point (9300 K + 27 MPCD), which emphasized flesh-tone variations in the broadcast
8) early sets (after the abandonment of the P1 green phosphor) that still used straight NTSC color decoding instead of modifying it for the new phosphors tended to make purplish blues and greenish yellows
9) green hair (mentioned to me many times by non-expert viewers), because of (8) and also because of the polarization sensitivity of the TK-41 optics
10) instability of color cameras
11) instability of the long chain of analog amplifiers between the camera and the receiver; proc amps were used to restore the color burst to FCC specs before emission, but the degraded color in the image was not corrected.
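To make point (8) concrete: an NTSC receiver recovers R'G'B' from Y/I/Q with a fixed decoding matrix, and when the picture-tube phosphors change, a further linear correction matrix is needed; "straight NTSC" decoding skips that step. Here is a minimal Python sketch. The Y/I/Q decoding coefficients are the standard, well-known NTSC ones, but the phosphor-correction matrix values are made-up placeholders, not measured data for any real phosphor set:

```python
# Standard NTSC Y/I/Q -> R'G'B' decoding (the well-known coefficients).
def yiq_to_rgb(y, i, q):
    r = y + 0.956 * i + 0.621 * q
    g = y - 0.272 * i - 0.647 * q
    b = y - 1.106 * i + 1.703 * q
    return r, g, b

# Hypothetical 3x3 phosphor-correction matrix.  Placeholder numbers only;
# a real matrix would be derived from the chromaticities of the original
# FCC primaries versus the new phosphor primaries.
CORRECTION = [
    [1.10, -0.05, -0.05],
    [-0.03, 1.06, -0.03],
    [-0.02, -0.04, 1.06],
]

def correct_for_phosphors(rgb, m=CORRECTION):
    """Apply a linear phosphor-correction matrix to decoded R'G'B'."""
    r, g, b = rgb
    return tuple(row[0] * r + row[1] * g + row[2] * b for row in m)

# A set decoding "straight" displays rgb_straight on the new phosphors,
# producing the hue shifts described above; a matched set applies the
# correction first.
rgb_straight = yiq_to_rgb(0.5, 0.2, 0.1)
rgb_matched = correct_for_phosphors(rgb_straight)
```

The point is only that the fix is a cheap linear operation; sets that omitted it reproduced hues slightly rotated from what the camera saw.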

This eventually became less of a problem, partly because of transistorized gear, but also because there were cooperative campaigns involving receiver manufacturers and broadcasters to get everyone in a given market to match. Work also had to be done with early cable providers, since their proc-amp settings tended to mismatch between sources.

The VIR signal was invented to be inserted at the source so it would suffer the same degradations as the video, allowing automatic correction before transmission. Or, if no adjustment was made at the transmitter, a GE set with VIR could make the adjustment automatically at home. One big problem: broadcasters started using VIR locally to correct just the degradation from their own transmitter. Now the VIR signal was correct, but the video coming from the network was not. TV manufacturers then had to abandon the feature because it often made things worse.
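The correction idea behind VIR can be sketched like this: treat chroma as a complex phasor, compare the received reference against its nominal amplitude and phase, and apply the inverse gain and phase rotation to all the program chroma, since whatever the chain did to the reference it also did to the picture. This is a conceptual sketch only; the nominal constants and function names are illustrative assumptions, not the actual VIR line parameters:

```python
import cmath

# Hypothetical nominal values for the chrominance reference -- illustrative
# placeholders, not the real VIR specification.
NOMINAL_AMP = 1.0        # reference chroma amplitude (arbitrary units)
NOMINAL_PHASE = 0.0      # reference chroma phase (radians)

def vir_correction(measured_amp, measured_phase):
    """Derive a complex correction factor from the received reference.

    Whatever gain and phase error the reference suffered in the chain,
    the program chroma suffered too, so multiplying by the inverse
    undoes both at once.
    """
    received = cmath.rect(measured_amp, measured_phase)
    nominal = cmath.rect(NOMINAL_AMP, NOMINAL_PHASE)
    return nominal / received

def correct_chroma(samples, correction):
    """Apply the same gain/phase correction to every chroma sample."""
    return [s * correction for s in samples]

# Example: the chain attenuated chroma to 0.8x and rotated it +10 degrees.
err_phase = cmath.pi * 10 / 180
corr = vir_correction(0.8, err_phase)
fixed = correct_chroma([cmath.rect(0.8, err_phase)], corr)
# fixed[0] is restored to amplitude 1.0 at phase 0.
```

This also shows why local re-insertion broke the scheme: once a broadcaster restored the VIR line to spec while leaving the network video degraded, the correction factor computed at the receiver was essentially 1.0 and fixed nothing, or actively mismatched the picture.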