Many sets drove the CRT cathode, and one way to work around the capacitance from the cathode to the heater would be to introduce some inductance in the heater supply.
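
The worry is that video applied at the cathode leaks into the grounded heater circuit through that stray capacitance, rolling off the high-frequency detail. Here's a minimal sketch of the reactances involved, assuming an illustrative 8 pF cathode-to-heater capacitance and a 100 uH choke in the heater lead (both made-up values, not measurements from any set):

[CODE]
import math

def x_l(freq_hz, inductance_h):
    """Inductive reactance in ohms."""
    return 2 * math.pi * freq_hz * inductance_h

def x_c(freq_hz, capacitance_f):
    """Capacitive reactance in ohms."""
    return 1 / (2 * math.pi * freq_hz * capacitance_f)

C_HK = 8e-12      # assumed cathode-to-heater capacitance, ~8 pF (illustrative)
L_CHOKE = 100e-6  # assumed RF choke in each heater lead, 100 uH (illustrative)

for f in (1e6, 3.58e6, 4.2e6):  # spot frequencies across the video band
    print(f"{f/1e6:5.2f} MHz: Xc(cathode-heater) = {x_c(f, C_HK):7.0f} ohm, "
          f"Xl(choke) = {x_l(f, L_CHOKE):6.0f} ohm")
[/CODE]

The choke raises the impedance of the heater path well above a direct short, so less of the video on the cathode is shunted away at the top of the band.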

I suppose, as mentioned in a post above, you could use the chassis (circuit board) from a computer VGA monitor along with its deflection yoke. The HV may be too high for a B&W tube from the '50s, though: roughly 20 kV from the monitor versus the 10 to 15 kV the tube expects. You might be able to kludge up a 6BK4 HV shunt regulator and a dropping resistor (one that can take a few kV across it!) to get the HV down to something the B&W tube would run on. Just be very careful!
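
As a sanity check on that dropping resistor, here's some back-of-envelope arithmetic. The supply voltage, target voltage, and currents are all assumptions you'd have to verify on the actual chassis, not a worked-out design:

[CODE]
# Rough sizing for the HV dropping resistor feeding a 6BK4 shunt
# regulator.  All values below are illustrative assumptions -- measure
# your own monitor chassis and tube before building anything.

V_SOURCE = 20e3    # assumed VGA-chassis ultor supply, volts
V_TARGET = 12e3    # assumed anode voltage for a 1950s B&W tube, volts
I_SHUNT  = 1.0e-3  # assumed idle current through the 6BK4, amps
I_BEAM   = 0.2e-3  # assumed average CRT beam current, amps

i_total = I_SHUNT + I_BEAM
r_drop = (V_SOURCE - V_TARGET) / i_total
p_drop = (V_SOURCE - V_TARGET) * i_total

print(f"Voltage across resistor: {(V_SOURCE - V_TARGET)/1e3:.0f} kV")
print(f"Dropping resistor:       {r_drop/1e6:.1f} Mohm")
print(f"Dissipation:             {p_drop:.1f} W (use a string of HV-rated resistors)")
[/CODE]

With those guesses it works out to several megohms dissipating around 10 W with 8 kV across it, which is why a single ordinary resistor won't do.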

You'd use one of the RGB video channel amplifiers to drive the B&W tube.

The electron spot size in a 1950's B&W CRT may be too big to make the 1080i picture look any better than plain old 525i from a converter box. You could still get a sharper picture by injecting the luma signal, at the video detector diode in the IF circuit, from a converter box that keeps luma and chroma separate, like the Channel Master CM7000. Most B&W sets made after NTSC color came out (1954) had a video IF response deliberately depressed around the 3.58 MHz chroma subcarrier, to keep color broadcasts from putting a dot pattern on the screen. You still need to pass the 4.5 MHz sound subcarrier, though. If you have TV alignment equipment, you might be able to tweak that dip out.
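
To put rough numbers on what that chroma dip costs you, here's the usual NTSC rule of thumb relating video bandwidth to horizontal resolution. The two bandwidth figures are assumptions for illustration, not measurements of any particular set:

[CODE]
# Rough horizontal resolution (TV lines per picture height) from video
# bandwidth, using the standard NTSC rule of thumb.

T_ACTIVE = 52.66e-6  # active NTSC line time, seconds
ASPECT = 4 / 3       # picture aspect ratio

def tvl_per_height(bw_hz):
    # two picture elements per cycle, scaled to picture height
    return 2 * bw_hz * T_ACTIVE / ASPECT

for label, bw in (("full NTSC luma (4.2 MHz)", 4.2e6),
                  ("B&W IF with chroma dip (~3 MHz, assumed)", 3.0e6)):
    print(f"{label}: ~{tvl_per_height(bw):.0f} TV lines/picture height")
[/CODE]

So aligning the dip out buys you maybe a third more horizontal detail, assuming the CRT spot is small enough to show it.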

As for the sound amplifier, many TVs of the era used a quadrature FM detector, which yielded enough audio amplitude to drive the audio output tube (via a volume control) directly. You'd need to build an audio driver stage to bring the line-level audio from the converter box up to what the output tube's grid needs.
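
To size that driver stage, here's a quick gain estimate. The consumer line level is a typical figure, and the grid drive is a guess at what a small output tube wants for full output; both are illustrative, not measured:

[CODE]
import math

# Gain needed to bring converter-box line audio up to the grid drive an
# audio output tube wants.  Levels below are assumptions, not
# measurements of any particular set.

V_LINE_RMS = 0.3  # assumed consumer line level (-10 dBV), volts RMS
V_GRID_RMS = 8.0  # assumed grid drive for full output, volts RMS

gain = V_GRID_RMS / V_LINE_RMS
print(f"Voltage gain needed: ~{gain:.0f}x ({20 * math.log10(gain):.0f} dB)")
[/CODE]

That works out to roughly 30 dB, so one high-mu triode stage (half of a 12AX7, stage gain around 50-60) should cover it with margin to spare.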