#14, 12-30-2011, 02:02 AM
Rinehart
VideoKarma Member
Join Date: Nov 2011
Posts: 129
I think James may have misinterpreted something that Battison wrote, although it is sloppily written: when Battison says that only about 350 scan lines are usable, he might be referring to the utilization ratio, which determines the effective vertical resolution. The NTSC standard specifies 525 scan lines, but this number includes the lines during the vertical blanking interval: a few at the bottom just before the flyback begins, some during the flyback itself, and a few at the top of the next field while waiting for the ringing caused by voltage overshoot to settle down. In this way you lose roughly 40 scan lines per frame, leaving about 486 lines in the visible raster.
However, the effective vertical resolution is considerably lower, since it depends on where the contours of a scanned object fall in relation to the scan lines.
Consider the following: suppose you have a column of alternating black and white boxes, each exactly as tall as one scan line. If each box sits directly under a scan line, you get a column of 486 alternating black and white boxes, one per scan line. But suppose instead that each box straddles two scan lines. The camera signal for every line will then be the average of 100% (black) and 0% (white), i.e. 50% grey, and since every scan line has the same value it becomes impossible to tell where one box ends and the next begins. You wouldn't get a column of boxes on the TV screen at all, just a uniform grey strip, and in that case the vertical resolution would be zero.
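Just to make that concrete, here's a quick Python sketch (my own illustration, nothing from Battison or Grob) of the two cases: scan lines aligned with the boxes versus straddling them. Black is 100%, white is 0%, and each scan line simply averages the pattern over its own height.

[code]
# Rough numerical sketch of the aligned vs. straddling cases above.
# Black = 100 %, white = 0 %, one box per scan-line pitch, and each
# scan line averages the pattern over its own height.

def pattern(y):
    """Alternating black/white boxes, each one scan-line pitch tall."""
    return 100.0 if int(y) % 2 == 0 else 0.0

def scan_line_level(top, samples=100):
    """Average signal level over one scan line starting at height 'top'."""
    return sum(pattern(top + (i + 0.5) / samples) for i in range(samples)) / samples

# Case 1: scan lines aligned with the boxes -> full contrast, 100/0/100/0...
aligned = [scan_line_level(line) for line in range(6)]

# Case 2: scan lines offset by half a pitch, straddling the boxes
# -> every line averages to 50 % grey and the boxes vanish.
straddling = [scan_line_level(line + 0.5) for line in range(6)]

print("aligned:   ", [round(v) for v in aligned])     # [100, 0, 100, 0, 100, 0]
print("straddling:", [round(v) for v in straddling])  # [50, 50, 50, 50, 50, 50]
[/code]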
Both these cases are completely artificial, but the principle is valid nonetheless. According to Bernard Grob in Basic Television, in a real-world situation the average vertical resolution is about 70% of the total number of scan lines available, which works out to 486 x 0.7 = about 340, close enough to Battison's figure of 350. So this is probably what he means.
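For anyone who wants the arithmetic in one place, that 0.7 utilization ratio is what the textbooks call the Kell factor. A trivial sketch:

[code]
# The arithmetic behind the ~350-line figure. The 0.7 utilization
# ratio is usually called the Kell factor.
TOTAL_LINES   = 525   # NTSC total lines per frame
BLANKED_LINES = 39    # lost to vertical blanking ("roughly 40" above)
KELL_FACTOR   = 0.7   # Grob's real-world utilization figure

visible   = TOTAL_LINES - BLANKED_LINES   # 486 visible lines
effective = visible * KELL_FACTOR         # ~340 effective lines

print(visible, round(effective))  # 486 340 -- close to Battison's ~350
[/code]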
One smaller point he makes in an earlier chapter is that the single greatest problem with the use of film on television was the poor quality of the film projectors most TV stations used at the time. Probably because of the high start-up costs of a television station--a good-quality image-orthicon (I/O) studio camera might cost around $5,000, which was twice my father's annual pay as a recently graduated engineer--most stations wanted to cut other costs, and one of the ways they did that was to buy cheap film projectors--not crappy or defective, but not really professional quality, either. Given this, a station was unlikely to be enthusiastic about purchasing a movie projector that ran at 30 fps, since it couldn't be used for anything except kinescopes, and Battison cites this as one of the reasons kinescope films were made at the standard 24 fps (35mm) or 16 fps (16mm).
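For what it's worth (this is my own aside, not something from Battison), the way 24 fps film is normally fitted onto a 30 fps / 60-field television system is 3:2 pulldown: alternate film frames are held for two and three fields. A quick sketch of the field pattern shows why the numbers work out:

[code]
# 3:2 pulldown: how 24 film frames per second fill 60 TV fields.
# Alternate film frames occupy 2 and 3 fields, so every 4 frames
# of film become exactly 10 fields (5 TV frames).
def pulldown(frames):
    fields = []
    for n in range(frames):
        fields.extend([n] * (2 if n % 2 == 0 else 3))
    return fields

print(pulldown(4))                        # [0, 0, 1, 1, 1, 2, 2, 3, 3, 3]
print(24 * 10 / 4, "fields per second")   # 60.0, i.e. 30 TV frames per second
[/code]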
Battison's bona fides on the front matter (title page attached) suggest a competent enough author, so perhaps it is just a lot of sloppy writing on Battison's part (and James caught several examples I didn't notice), combined with slovenly or hurriedly done copy editing. As always, thank you all for your contributions.
Attached Files
File Type: pdf Battison Title Page.pdf (199.1 KB)
__________________
One Ruthie At A Time