Uncle John knows pretty much everything—and if he doesn’t, he heads to his massive research library, or puts one of his many associates on the case. So go ahead: In the comments below, ask Uncle John anything. (And if we answer your question sometime, we’ll send you a free book!) This week’s question comes from reader Lynda P., who asks…
When a TV screen is filmed, why does it look like it has lines bouncing up and down and all around it?
It’s a common sight in home movies, or in movies made in the ‘70s or ‘80s: a turned-on TV (or computer monitor) sits in the background, flickering weirdly. Instead of showing the stable picture you’d see in real life, it flickers, a black bar rolls across the screen, or strange lines dance all over it. Modern-day HD or LCD TVs don’t do this (as evidenced by the thousands of YouTube videos shot by pointing a camera directly at a TV), so why did those old TVs?
There’s a discrepancy between the scanning frequency of the TV in the video and the frame rate of the camera used to shoot the scene. And there’s a difference between the way the monitor’s glowing phosphor dots are perceived by the camera and by your human eyes.
“Old-fashioned”—meaning non-HD—TVs operated with CRT technology, short for “cathode ray tube.” An electron beam scanned horizontal lines of pixels across the TV’s screen, and as the beam hit each pixel, it lit up. Those pixels were in turn made of phosphor dots, which glow when struck by the electron beam. Each dot glows for about 1/30th of a second; the rapid re-scanning of the electron beam refreshes the image over and over, giving the impression of a moving picture, and TV as we know it. However, the camera used to shoot the scene with the “dancing TV” in question ran at a different rate: more than 30 frames per second, possibly as high as 60. Because the two weren’t synced, the mismatch shows up on film as dancing lines or rolling black bars.
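For the curious, the mismatch above can be sketched with a little arithmetic. This is a simplified beat-frequency model (an assumption for illustration, not a formula from the article): the dark bar appears to roll across the filmed screen at roughly the difference between the TV’s refresh rate and the camera’s frame rate.

```python
# Simplified model (assumption): the rolling bar drifts at the
# difference between the TV's refresh rate and the camera's frame rate.

def rolling_bar_rate(tv_refresh_hz: float, camera_fps: float) -> float:
    """Apparent rate (cycles per second) at which the dark bar
    rolls across a CRT screen as seen by the camera."""
    return abs(tv_refresh_hz - camera_fps)

# A CRT redrawing the picture about 30 times per second, filmed by a
# movie camera running at 24 frames per second, is out of step by 6 Hz:
print(rolling_bar_rate(30.0, 24.0))  # 6.0 -- the bar rolls six times a second

# If the camera were synced to the TV, the bar would sit still:
print(rolling_bar_rate(30.0, 30.0))  # 0.0 -- stable picture
```

This is why professional broadcasts of the era could show a TV on camera without the effect: studio equipment was locked to the same scanning frequency, driving the difference to zero.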
The truth hertz.