The static on old CRT TVs with rabbit ears was the cosmic microwave background. No one born in the last 25 years has ever seen it.
Well, not really. The cosmic microwave background radiation was a tiny fraction of that noise. What everyone saw was mostly thermal noise generated by the amplifier circuit inside the TV.
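If you want to put rough numbers on that, here’s a back-of-the-envelope Python sketch using the Johnson-Nyquist noise power formula P = k·T·B. The 7 dB noise figure for an old consumer tuner is an assumed, illustrative value, not a measurement:

```python
# Back-of-the-envelope: how much of analog "snow" could the CMB account for?
# Noise power in a bandwidth B at temperature T is P = k_B * T * B
# (Johnson-Nyquist). The 7 dB receiver noise figure is an assumed,
# illustrative value for an old consumer tuner.

k_B = 1.380649e-23   # Boltzmann constant, J/K
B   = 6e6            # analog TV channel bandwidth, ~6 MHz (NTSC)

T_cmb = 2.725                          # CMB antenna temperature, K
nf_db = 7.0                            # assumed receiver noise figure, dB
T_rx  = 290.0 * (10**(nf_db / 10) - 1) # receiver noise temperature, K

p_cmb = k_B * T_cmb * B
p_rx  = k_B * T_rx * B

print(f"receiver noise temp: {T_rx:.0f} K")
print(f"CMB share of total noise power: {p_cmb / (p_cmb + p_rx):.2%}")
```

With these assumptions the CMB comes out to well under 1% of the total noise power, and that’s before counting the atmospheric and man-made noise the antenna also picks up.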
Do you think CRTs just magically disappeared after the turn of the millennium?
Don’t you still see this when using an OTA ATSC tuner on a newer LCD display? I thought this was a function of the signal generation and not the display technologies.
It actually was a pretty rapid switch where all the CRTs disappeared
Well to be fair at some point most/all CRTs showed a blue screen instead of static. So it’s possible someone born in 2000 never saw the snowy display.
As someone born in 2000, I’ve personally seen it and I think most people around me did. Maybe someone didn’t, though.
They lied to us. The real Y2K was the CRT rapture.
I think they’re more likely to have been scrapped than other old tech.
They’re bulky, and mine was too heavy to get out in the attic. I still have my ZX Spectrum and Amiga, but the CRT needed for lightgun games is long gone.
No, I just couldn’t remember exactly when. And as another commenter pointed out, what I should have said was analog TVs.
CRTs were in use well into the 2000s
Even before the 2000s they started showing a blue screen instead of static.
That wasn’t just a digital or flat panel thing.
But of course old sets were around for a long time.
My memory of the exact details here is fuzzy, but I think this depended on whether your TV picked up digital signals, analog, or both. I remember around that time we had a TV that would show static on some channels and a blue input screen on others.
I remember back in the Wii days, when I was young, we had a flat screen that would go to the digital no-input pattern. Once in a while, though, it would show that loud static no-signal screen, so I think mine had both.
I don’t really have a point here just wanted to share
It’s definitely an analog over-the-air TV thing.
The way digital works, you either get a “No signal” indicator (because the circuitry detects that the signal-to-noise ratio is too low) or squarish artifacts (because of the way the compression algorithms for digital video are designed).
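For a sense of why the breakup comes out square: ATSC video is MPEG-2, which transforms the picture in 8x8-pixel DCT blocks, so a corrupted coefficient smears across its whole block instead of showing up as grain. A toy illustration in Python (which coefficient gets hit, and how hard, are arbitrary choices here):

```python
# Why digital breakup looks blocky: MPEG-style codecs transform 8x8 pixel
# blocks (DCT); a corrupted coefficient smears over its whole block, so
# errors show up as squares, not grain. A toy single-block illustration.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
block = rng.integers(100, 156, size=(8, 8)).astype(float)  # a bland 8x8 patch

coeffs = dctn(block, norm='ortho')      # transform to the frequency domain
coeffs[3, 5] += 400.0                   # simulate one corrupted coefficient
damaged = idctn(coeffs, norm='ortho')   # decode back to pixels

# The error is spread across all 64 pixels of the block:
print(np.round(damaged - block, 1))
```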
Yeah, for instance the semi-ubiquitous “small TV with a VHS player built in” that was in a ton of minivans and kids’ rooms well into the early 2000s only supported analog cable/antenna signals, so it would give the black-and-white static when there was no signal.
I’m talking long before digital channels existed. (In the US anyway)
What are they hiding from us?!
Yeah I was still using a CRT as recently as 2012. I think OP means analogue TVs.
Yeah you’re right.
Technically, it’s not about the display technology, but instead about the signal/tuner. More specifically if it’s analog or digital. Some modern TVs still have analog or hybrid tuners for backwards compatibility and regions that still use analog, so they can display static. For instance, in Ukraine we finished the switch to digital TV only a couple of years ago. If your TV had no digital tuner (as was the case for many) you had to buy a DAC box. Retirees/pensioners got them for free, sponsored by the government.
Yeah, my youngest sibling has definitely seen CRTs. My niblings probably haven’t, though.
I thought they were teaching it in all the schools? /s
It is entirely possible for people born after 2000 to have grown up with CRTs.
It is, but those late model CRTs often had a lot of digital circuitry that displayed a solid color on channels with nothing on them. Unless there was a much older CRT around, they never would have seen it.
Most of the CRTs are going to be older
Tube TVs remained in common service well into the 2010s. The changeover from analog to fully digital TV transmission did not happen until 2009, with many delays in between, and the government ultimately had to give away digital-to-analog converter boxes because so many people still refused to let go of their old CRTs.
Millions of analog TVs are still languishing in basements and attics in perfect working order to this very day, still able to show you the cosmic background, if only anyone would dust them off and plug them in. Plenty more live on in retro gaming nerds’ setups. I have one, and it’ll show me static any time I ask. (I used it to make this gif, for instance.)
In fact, with no one transmitting analog television anymore (probably with some very small-scale hobbyist exceptions), the cosmic background radiation is all they can show you now if you’re not inputting video from some other device. Or unless you have one of those dopey models that detects a no-signal situation and shows a blue screen instead. Those are lame.
Amateur radio operators are indeed allowed to transmit analog NTSC television in the UHF band. It’s most commonly done on the 70cm (440MHz) band, and a normal everyday ’90s television is all you need to receive the signals. You’d tune to what would have been cable channels 57 through 61. The use cases for this have decreased in recent years; for example, you used to see hams using amateur television to send video signals from RC aircraft or model rockets, but now that’s done with compressed digital video over something like Wi-Fi and doesn’t require a license. But it’s still legal for hams to do.
I think my mom still uses the last CRT TV that I had. Gave it to her when I bought my first 720p HD TV, as the old CRT was better than her old TV. Later on I also gave her that HD TV but she still has the CRT too.
Last time I thought about static I wondered why colour TV didn’t show colour static.
Turns out the colour signal was on very specific frequencies, and if it wasn’t present, the set would assume a black and white signal and turn off the colour circuit.
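That circuit is called a “color killer,” by the way. Here’s a toy software version of the idea, assuming NTSC’s 3.58 MHz chroma subcarrier; the sample rate, tolerance, and threshold are arbitrary illustrative choices:

```python
# A toy NTSC "color killer": look for energy at the 3.58 MHz chroma
# subcarrier; if the burst isn't there, assume a black-and-white signal
# and leave the color circuit off. Constants are illustrative choices.
import numpy as np

FS = 14.318e6          # sample rate: ~4x the NTSC subcarrier, a common choice
F_SC = 3.579545e6      # NTSC color subcarrier frequency, Hz

def color_present(line, tol=50e3, threshold=0.05):
    """Return True if a meaningful share of the energy sits near F_SC."""
    spectrum = np.abs(np.fft.rfft(line))
    freqs = np.fft.rfftfreq(len(line), d=1/FS)
    near_sc = np.abs(freqs - F_SC) < tol
    return spectrum[near_sc].sum() / spectrum.sum() > threshold

t = np.arange(2048) / FS
rng = np.random.default_rng(1)
noise_only = rng.normal(size=t.size)                          # pure static
with_burst = noise_only + 5.0 * np.sin(2 * np.pi * F_SC * t)  # subcarrier

print(color_present(noise_only))   # False -> stay black and white
print(color_present(with_burst))   # True  -> enable chroma decoding
```

A real set does it in analog hardware by looking for the colorburst on each scanline’s back porch, but the principle is the same: no subcarrier energy, no chroma decoding.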
It really isn’t though. It is thermal noise.
Could it not be both?
I bought a plasma in 2009 that would show static if I turned it to cable channels without cable plugged in. Plasmas were susceptible to burn-in, and since I gamed a lot I could see health bars etc. start to burn in after a while. Whenever that happened I would turn it to the static screen; making each pixel flip rapidly from one end of the spectrum to the other like that would actually help remove the burn-in.
People born before 2000 think older technology just evaporated the minute the millennium ticked over.
2001 here, I literally grew up with CRT static. You have your years a bit off there.
2002 here, we still had such a TV. For quite a while actually, since we never upgraded and just started using phones and computers instead. It became my console monitor.
Yeah, OP is full of shit. My three sons, all born after 2000, have seen this. Hell, my flat screen will show snow if I turn it to antenna and there’s no signal to pick up. Also I have a console TV for our old gaming systems, so they’ve seen that as well.
They also know how a VCR works and what a payphone is. We are not that far removed from that technology. Hell, my middle son, 17, has a record collection and CDs. Also we have the cassette audiobook version of Stephen King’s Dolores Claiborne.
Modern TVs project fake static when there is no signal because of familiarity. OTA broadcasts are all digital: either you get a signal or you don’t.
Some TVs may project fake static.
Just because OTA broadcasts are digital doesn’t mean you are stuck with all or nothing. You can definitely have a poor signal and see or hear something other than what was intended. It doesn’t manifest as analog static, but depending on your decoding and error correction schemes, you can have cut audio, frozen frames, I-frame inconsistencies, and stuttering.
No, digital is all or nothing. What you are describing is some digital packets making it through, and the algorithm is designed to accept some packet loss and has error correction. It’s more complicated than I make it out, but that’s the gist of it.
It is nothing like analog that’s being drowned out by background radiation.
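The cliff is easy to see in the textbook math. Here’s a quick sketch using the standard QPSK bit error rate formula; ATSC actually uses 8-VSB modulation and real FEC is more involved, so treat the 1e-3 cutoff as an arbitrary stand-in, but the shape of the curve is the point:

```python
# The "digital cliff" in one sketch: bit error rate vs. SNR for a QPSK-style
# link. Analog degrades smoothly; digital error correction hides bit errors
# until, quite suddenly, it can't.
import math

def ber_qpsk(ebn0_db):
    """Theoretical QPSK bit error rate for a given Eb/N0 in dB."""
    ebn0 = 10 ** (ebn0_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebn0))

for snr in range(0, 14, 2):
    ber = ber_qpsk(snr)
    # Assume (illustratively) the FEC cleans up anything under ~1e-3:
    status = "perfect picture" if ber < 1e-3 else "breakup / no signal"
    print(f"Eb/N0 = {snr:2d} dB  BER = {ber:.1e}  -> {status}")
```

The output flips from a clean picture to breakup within a couple of dB; that’s the all-or-nothing behavior, versus analog just getting gradually snowier.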
they have to watch HBO shows to compensate
Surely you mean the much worse “Max”.
Logo still shows HBO for that intro though
Dude I was born after 2000 and this is firmly planted in my memories. Maybe people born after 2010 haven’t but 2000?
By the way, the picture illustrating the post isn’t actually displaying the real thing - the noise in it is too squarish and has no grey tones.
TV static in recent movies and shows that are set in the past almost always instantly pulls me out of the narrative, because no one seems able to get it right and some attempts are just stunningly bad. It’s usually very subtle, so much so that I’m not sure I could even describe what’s wrong. Makes me feel old to notice it.
I think the problem is that CRT displays didn’t have pixels, so the uniform noise that is static was not only uniformly spread in distribution and intensity (i.e. greyscale level) but also had “dots” of all sizes.
Another possible thing that’s off is the speed at which the noise changes: was it the 25fps refresh rate of a CRT screen, related to that rate but not necessarily at that rate, or did the noise itself have more persistent and less persistent parts?
The noise is basically the product of radio waves at all frequencies with various intensities (though all low), with only the ones that could pass the bandpass filter of the TV tuner coming through (boosted in intensity by automatic gain control) and being painted onto a phosphor screen (hence no pixels) as the beam drew the screen line by line, 25 times per second. So to get that effect right you probably have to simulate it mathematically, starting from random radio noise; it can’t be going through things with pixels (such as 3D textures) on the way to being shown, and it probably requires some kind of procedural shader.
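Something like this minimal numpy sketch of that recipe, for anyone who wants to play with it (every constant here is illustrative, not derived from a real tuner):

```python
# A rough procedural take on "real" static per the recipe above: start from
# broadband Gaussian noise, band-limit it the way a tuner's filter would,
# then paint it scanline by scanline. All constants are illustrative.
import numpy as np

rng = np.random.default_rng()

LINES, WIDTH = 480, 720     # display grid for one frame
SAMPLES_PER_LINE = 2880     # finer "RF" grid: 4 samples per display pixel

def static_frame():
    # Broadband noise, generated per scanline (no correlation between lines).
    noise = rng.normal(size=(LINES, SAMPLES_PER_LINE))

    # Crude band-limit: a short moving average stands in for the tuner's
    # bandpass; this is what gives the blobs a range of sizes.
    kernel = np.ones(5) / 5
    for i in range(LINES):
        noise[i] = np.convolve(noise[i], kernel, mode="same")

    # Crude AGC: stretch the frame to full range, like the TV boosting gain.
    noise -= noise.min()
    noise /= noise.max()

    # Resample each line down to display width; averaging adjacent samples
    # keeps the in-between grey tones that a per-pixel rand() would lose.
    frame = noise.reshape(LINES, WIDTH, SAMPLES_PER_LINE // WIDTH).mean(axis=2)
    return (frame * 255).astype(np.uint8)   # one greyscale field

frame = static_frame()   # render a fresh frame 25-30 times per second
```

The per-line band-limiting and the averaging down to the display grid are doing the work here; skip them and you get exactly the squarish, grey-less noise the picture in the post has.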
#CHSHSHSHSHSHSHSHSHSHSH
Hair stands up