In the years after WWII, brilliant minds were eager to demonstrate and deploy their fascinating work.
Someone decided to patch a problem with
legislation rather than wait for a technical fix. That problem was black setup. Ringing from the
horizontal sync pulse was getting into the video, producing little vertical bars on
the sides of the picture. By raising the video level slightly away from sync,
the lines went away.
Anyone could see that this problem would eventually be solved properly, but everyone wanted the new television technology now. The
easiest path was to temporarily legislate a 7.5 IRE setup value and start rolling out those TV sets.
And those TV sets could not roll without a standard to establish manufacturing criteria.
By 1951 the problem was completely solved by better DC-level circuits and better horizontal blanking and timing
circuits. In fact, short-time distortions had been eliminated and
frequency response had been increased to 4.2 MHz in preparation for color technology, where phase distortion
was critical. The black setup problem was totally gone, as demonstrated by the placement of a color burst signal
in the horizontal blanking interval. Color problems were a thousand times more complex than a setup problem, and
both problems were solved early on.
The temporary legislative fix turned out to be a permanent artifact: a stubborn thorn in the broadcaster's side.
As a broadcasting engineer for over 35 years, I have sought to give the public the best picture that I could.
By going back to the way the forefathers envisioned television (without the setup), approximately a 7.5% increase
in contrast ratio would be seen easily and immediately. Whites would be the same, but blacks and grays would be
blacker.
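As a quick check of that 7.5% figure, here is a minimal arithmetic sketch; the variable names are mine, and the values (100 IRE peak white, 7.5 IRE setup) come from the text:

```python
# Quick arithmetic check of the claimed contrast gain from dropping setup.
# Assumptions: peak white at 100 IRE, black raised to 7.5 IRE by setup.
SETUP_IRE = 7.5
PEAK_WHITE_IRE = 100.0

range_with_setup = PEAK_WHITE_IRE - SETUP_IRE   # 92.5 IRE of usable luminance
range_without_setup = PEAK_WHITE_IRE            # full 100 IRE with zero setup

# The recovered range, expressed against full scale, is the ~7.5% cited.
gain_vs_full_scale = SETUP_IRE / PEAK_WHITE_IRE
print(f"Usable range: {range_with_setup} -> {range_without_setup} IRE")
print(f"Recovered range: {gain_vs_full_scale:.1%} of full scale")
```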
Many years ago, I ran some tests during Broadcast Experimental Time, a period granted to every broadcaster. I ran zero setup.
And you could easily see the difference: blacks were blacker, and the picture looked vivid, with more contrast.
But there were problems. Besides zero setup being illegal in the U.S., there was the problem of some
TV sets using the VIR signal for their luminance reference. Some viewers would have seen washed-out whites. I could easily
adjust the luminance reference to any value I wanted: for example, from 46.25 IRE (50 IRE − 7.5/2) to 53.75 IRE (50 IRE + 7.5/2).
Newer home sets with automatic circuits could not be predicted with any kind of consistency. The experiments were a
waste of time, and a moot point, since all of our programming (cameras, recorders, and networks) used 7.5 setup anyway.
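The reference shift described above can be sketched in a few lines, assuming the nominal 50 IRE reference and the 7.5 IRE setup given in the text:

```python
# Sketch of shifting a VIR luminance reference by half the setup value.
# Assumptions: nominal reference of 50 IRE, setup of 7.5 IRE.
NOMINAL_REF_IRE = 50.0
SETUP_IRE = 7.5

low_ref = NOMINAL_REF_IRE - SETUP_IRE / 2   # reference lowered by half the setup
high_ref = NOMINAL_REF_IRE + SETUP_IRE / 2  # reference raised by half the setup
print(f"Adjustable reference range: {low_ref} to {high_ref} IRE")
```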
The setup curse is a moot point for another reason: NTSC has only about a 90:1 contrast ratio under the best of
conditions. SD and HD are going to be in the thousands. No comparison! NTSC, no matter how brilliantly
conceived, is out, in its entirety, including the stupid setup.
A simple academic exercise yields the NTSC signal contrast ratio.
Contrast ratio is the difference between the whitest white and the blackest black.
As a signal, the whitest white is at 100 IRE; the blackest black is at the setup level, which is exactly 7.5 IRE.
Black is limited by low-light camera noise and by the overall system signal noise.
Signal noise is the "big" noise and is
typically about one IRE unit peak to peak. The luminance range is 100 minus 8 (rounding setup up),
which is 92. Dividing this range by a resolution noise of one yields a contrast range of 92 to one.
Since the contrast ratio here is of an electrical signal, a more common electrical measure of the same thing is
signal-to-noise ratio (S/N, unweighted). For typical transmission video, S/N (dB) = 20 log(92/1) ≈ 39 dB.
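The exercise above can be worked end to end in a few lines; this sketch uses the text's own numbers (100 IRE white, setup rounded up to 8 IRE, 1 IRE of noise):

```python
import math

# Back-of-the-envelope NTSC signal contrast ratio and unweighted S/N.
# Assumptions from the text: white at 100 IRE, black at the setup level
# (7.5 IRE, rounded up to 8 here as in the text), noise ~1 IRE peak to peak.
WHITE_IRE = 100
BLACK_IRE = 8    # setup level, rounded up from 7.5 IRE
NOISE_IRE = 1    # typical overall system noise, peak to peak

luminance_range = WHITE_IRE - BLACK_IRE        # 92 IRE of usable signal
contrast_ratio = luminance_range / NOISE_IRE   # 92:1
snr_db = 20 * math.log10(contrast_ratio)       # unweighted S/N in dB

print(f"Contrast ratio: {contrast_ratio:.0f}:1")
print(f"S/N (unweighted): {snr_db:.1f} dB")
```

Note that 20 log(92) comes out near 39.3 dB, which the text rounds to 39 dB.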