sosomoxie
12 days ago
CRTs are peak steampunk technology. Analog, electric, kinda dangerous. Just totally mindblowing that we had these things in our living rooms shooting electron beams everywhere. I doubt it's environmentally friendly at all, but I'd love to see some new CRTs being made.
retrac
12 days ago
There's a synchronous and instantaneous nature you don't find in modern designs.
The image is not stored at any point. The receiver and the transmitter are part of the same electric circuit in a certain sense. It's a virtual circuit, but the entire thing - transmitter and receiving unit alike - is oscillating in unison, driven by a single clock.
The image is never entirely realized as a complete thing, either. While slow-phosphor tubes do display a static image, most CRT systems used extremely fast phosphors; they release the majority of their light within a millisecond of the beam hitting them. If you take a really fast exposure of a CRT display (say 1/100,000th of a second) you don't see the whole image in the photograph - only the few most recently drawn lines glow. The image as a whole never exists at the same time. It exists only in the persistence of vision.
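A rough back-of-the-envelope sketch of that (Python; the decay constant is an illustrative assumption, not a measured value):

    import math

    LINE_TIME = 63.5e-6   # one NTSC scanline every ~63.5 microseconds
    DECAY = 150e-6        # assumed 1/e decay time of a fast phosphor
    THRESHOLD = 0.05      # "still visibly glowing" cutoff for the sketch

    def brightness(lines_ago):
        """Relative brightness of a line drawn `lines_ago` lines earlier."""
        return math.exp(-(lines_ago * LINE_TIME) / DECAY)

    glowing = [n for n in range(525) if brightness(n) > THRESHOLD]
    print(len(glowing))   # ~8 of 525 lines -- the rest have already faded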
accounting2026
12 days ago
> The image is not stored at any point.
Just wanted to add one thing, not as a correction but because I learned it recently and find it fascinating. PAL televisions (the color TV standard in Europe) actually do store one full horizontal scanline at a time, before any of it is drawn on the screen. This is due to a clever encoding in that format: the TV actually needs to average two successive scanlines (phase-shifted relative to each other) to draw them. Supposedly this cancels out some forms of distortion. It is quite fascinating that this was even possible with analogue technology. The line is stored in a delay line for 64 microseconds. See e.g.: https://www.youtube.com/watch?v=bsk4WWtRx6M
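A toy model of why the averaging works (Python; chroma treated as a complex phasor, with made-up numbers). PAL flips the V component's phase on alternate lines, so a constant phase error cancels in the average:

    import cmath

    def received(chroma, phase_error, switched_line):
        # Transmitter flips V (the imaginary part) on alternate lines;
        # the transmission path then adds the same phase error to every line.
        sent = chroma.conjugate() if switched_line else chroma
        distorted = sent * cmath.exp(1j * phase_error)
        # The receiver un-flips the switched lines before use.
        return distorted.conjugate() if switched_line else distorted

    c = cmath.rect(1.0, 0.8)            # some hue at full saturation
    err = 0.3                           # constant phase error in radians
    line_a = received(c, err, False)
    line_b = received(c, err, True)     # the next, V-switched line
    avg = (line_a + line_b) / 2         # what the 64 us delay line enables
    print(cmath.phase(c), cmath.phase(avg))  # hue (phase) is preserved
    print(abs(avg))                          # cos(err): only slight desaturation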
leguminous
12 days ago
At some point, most NTSC TVs had delay lines, too. A comb filter was commonly used for separating the chroma from the luma, taking advantage of the chroma phase being flipped each line. Sophisticated comb filters would have multiple delay lines and logic to adaptively decide which to use. Some even delayed a whole field or frame, so you could say that in this case one or more frames were stored in the TV.
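A minimal sketch of the 1-line comb idea (Python/NumPy; it assumes the luma really is identical on the two lines, which is exactly the assumption adaptive comb filters test before choosing a delay):

    import numpy as np

    n = np.arange(512)
    luma = 0.5 + 0.2 * np.sin(2 * np.pi * n / 64)  # slowly varying picture
    subcarrier = np.cos(2 * np.pi * 0.25 * n)      # stand-in color subcarrier
    line_a = luma + 0.1 * subcarrier               # chroma rides on the luma
    line_b = luma - 0.1 * subcarrier               # next line: phase flipped

    recovered_luma = (line_a + line_b) / 2         # chroma cancels out
    recovered_chroma = (line_a - line_b) / 2       # luma cancels out
    print(np.allclose(recovered_luma, luma))       # True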
aidenn0
10 days ago
If a motion adaptive 3d comb filter (which requires comparing successive frames) was present on a TV, you can bet that it would be plastered all over the marketing material for the TV.
brewmarche
12 days ago
I only knew about SECAM, where it’s even part of the name (Système Électronique Couleur Avec Mémoire)
grishka
12 days ago
You can decode a PAL signal without any memory; the memory is only needed to correct for phase errors. In SECAM, though, it's a hard requirement, because the two color components, Db and Dr, are transmitted on alternating lines, and you need both on each line.
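Schematically something like this (Python; the values are made up), which is why the delay line is unavoidable:

    # Lines alternate between carrying Db and Dr; each drawn line pairs the
    # component it carries with the previous line's, held in the delay line.
    incoming = [("Db", 0.10), ("Dr", -0.30), ("Db", 0.12), ("Dr", -0.28)]

    delayed = None
    for component, value in incoming:
        if delayed is not None:
            db = value if component == "Db" else delayed[1]
            dr = value if component == "Dr" else delayed[1]
            print(f"draw line: Db={db:+.2f} Dr={dr:+.2f}")
        delayed = (component, value)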
accounting2026
11 days ago
Yes, that variant is called "PAL-S". But the system was designed to use the delay-line method, and it was employed from the inception (first broadcasts in 1967).
jacquesm
12 days ago
The physical components of those delay lines were massive crystals with silver electrodes grafted onto them. Very interesting components.
MBCook
12 days ago
All PAL TVs had a delay line in them? Crazy.
ninkendo
12 days ago
It doesn't begin at the transmitter either; in the earliest days even the camera was essentially part of the same circuit. Yes, the concept of filming a show and broadcasting the film over the air existed eventually, but before that (and even after that, for live programming) the camera would scan the subject image (actors, etc.) line by line and send it down a wire to the transmitter, which would send it straight to your TV and into the electron beam.
In fact, to show a feed of only text/logos/etc. in the earlier days, they would literally just point a camera at a physical object (like letters on paper) and broadcast from the camera directly. There wasn't really any other way to do it.
lebuffon
12 days ago
Our station had an art department that used a hot press to create text boards that were set on an easel with a camera pointed at it. By using a black background with white text you could merge the text camera with a camera in the studio and "superimpose" the text onto the video feed.
"And if you tell the kids that today, they won't believe it!"
Doxin
11 days ago
It's kind of amazing the sort of hoops people needed to jump through to make e.g. the BBC-1 ident: https://www.youtube.com/watch?v=xfpEZDeVo00
lebuffon
11 days ago
It seems like imagination was more common in those days. There was no "digital" anything to lean on.
alamortsubite
11 days ago
The live-action PBS idents from the early 90's were some of the best.
https://www.youtube.com/watch?v=5Ap_JRofNMs
https://www.youtube.com/watch?v=PJpiIyBkUZ4
This mini doc shows the process:
lifeisstillgood
12 days ago
>>> The image is not stored at any point.
The very first computers (the Manchester Baby) used CRTs as memory - the ones and zeros were bright spots on a "mesh", and the electric charge on the mesh was read and resent back to the CRT to keep the RAM fresh (a sort of self-refreshing RAM).
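Schematically the refresh worked something like this (Python; the retention number is made up, the rest is just the idea):

    RETENTION = 0.9   # made-up fraction of charge surviving one sweep

    wells = [1.0, 0.0, 1.0, 1.0, 0.0]   # charge spots = bits on the tube face

    def sweep(wells):
        """One refresh pass: charge leaks, is sensed, and is rewritten."""
        leaked = [c * RETENTION for c in wells]        # charge bleeds away
        bits = [1 if c > 0.5 else 0 for c in leaked]   # pickup plate threshold
        return [float(b) for b in bits]                # beam rewrites full charge

    for _ in range(1000):   # in hardware this sweep never stopped
        wells = sweep(wells)
    print(wells)            # still [1.0, 0.0, 1.0, 1.0, 0.0]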
adrian_b
11 days ago
Yes, but those were not the standard kind of CRTs used in TV sets and monitors.
The CRTs with memory for early computers were actually derived from the special CRTs used in video cameras. There, the image formed by the projected light was converted into a distribution of charge stored on an electrode, which was then sensed by scanning with an electron beam.
Using CRTs as memory was proposed by von Neumann, and in his proposal he used the appropriate name for that kind of CRT: "iconoscope".
hahahahhaah
12 days ago
Why didn't that catch on pre-transistor? Feels like you'd get higher density than with valves and relays.
adrian_b
11 days ago
DRAM memories made with special memory CRTs were used for a few years, until 1954. For instance, the first generation of commercial electronic computers made by IBM (the scientific IBM 701 and the business-oriented IBM 702) used such CRTs.
Then CRT memories became obsolete almost instantaneously, due to the development of magnetic core memories, which did not require periodic refreshing and which were significantly faster. The fact that they were also non-volatile was convenient at that early time, though not essential.
Today, due to security concerns, you would actually not want your main memory to be non-volatile, unless you also always encrypt it completely, which creates problems of secret key management.
So CRT memories became obsolete several years before the replacement of vacuum tubes in computers with transistors, which happened around 1959/1960.
Besides CRT memories and delay-line memories, another kind of early computer memory that quickly became obsolete was magnetic drum memory.
In the cheapest early computers (like the IBM 650), the main memory was not a RAM at all (i.e. neither a CRT nor magnetic cores) but a magnetic drum (i.e. with sequential, periodic access to data).
torginus
12 days ago
Yeah, it's super weird that while we struggle with latency in the digital world, storing anything for any amount of time is an almost impossible challenge in the analog world.
iberator
12 days ago
You should check out:
- Core memory
- Drum memory
- Bubble memory
- Mercury delay line memory
- Magnetic tape memory :P
And probably many more. Remember that computers don't even need to be digital!
abyesilyurt
12 days ago
> computers don't even need to be digital!
or electric.
accounting2026
11 days ago
This stores a whole scanline: https://www.youtube.com/watch?v=bsk4WWtRx6M. This or something similar was in almost every decent color TV except the oldest ones.
mrandish
11 days ago
It's worth deep diving into how analog composite broadcast television works, because you quickly realize just how insanely ambitious it was for 1930s engineers to have not only conceived it, but perfected and shipped it at consumer scale using only 1930s technologies.
Being old enough to have learned video engineering at the end of the analog days, it's kind of fun helping young engineers today wrap their brains around completely alien concepts, like "the image is never pixels," then "it's never digital" and "never quantized." Those who've been raised in a digital world learn to understand things from a fundamentally digital frame of reference. Even analog signals are often reasoned about as if their quantized form were their "true nature".
Interestingly, I suspect the converse would be equally true trying to explain digital television to a 1930s video engineer. They'd probably struggle similarly, always mentally remapping digital images to their "true" analog nature. The fundamental nature of their world was analog. Nothing was quantized. Even the idea that "quanta" might be at the root of physics was newfangled, suspect and, even if true, of no practical use in engineering systems.
accounting2026
11 days ago
Yes, agreed! And while it is not quantized as such, there is an element of semi-digital protocol to it. The concept of a "scanline" is quantized, and there are "protocols" for indicating when a line ends, when a picture ends, etc. that the receiver and sender need to agree on... plus "colorburst packets" for each line, delay lines, and all kinds of other clever techniques, so it is extremely complicated. Many things were necessary to overcome distortion and also to ensure backwards compatibility - first, how do you fit in the color so a monochrome TV can still show it? Later, how do you make it 16:9 so it can still show on a 4:3 TV (which it could!)?
mrandish
11 days ago
> And while it is not quantized as such there is an element of semi-digital protocol to it.
Yes, before posting I did debate that exact point in my head, with scanlines as the clearest example :-). However, I decided the point is still directionally valid because ultimately most timing-centric analog signal encoding has some aspect of being quantized, if only to thresholds. Technically it would be more correct to narrow my statement about "never quantized" to the analog waveform driving the electron gun as it sweeps horizontally across a line. It always amazes digital-centric engineers weaned on pixels when they realize the timing of the electron gun sweep in every viewer's analog TV was literally created by the crystal driving the sweep of the 'master' camera in the TV studio (and would drift in phase with that crystal as it warmed up!). It's the inevitable consequence of there being no practical way to store or buffer such a high-frequency signal for re-timing. Every component in the chain from the cameras to switchers to transmitters to TVs had to lock to the master clock. Live TV in those days was truly "live" to within 63.5 microseconds of photons hitting vacuum tubes in the camera (plus the time it took for the electrons to move from here to there). Today, "live" HDTV signals are so digitally buffered, re-timed and re-encoded at every step on their way to us, we're lucky if they're within 20 seconds of photons striking imagers.
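Those numbers fall straight out of the NTSC color standard (real constants; a quick Python check):

    f_sc = 315e6 / 88              # color subcarrier: ~3.579545 MHz
    line_rate = f_sc * 2 / 455     # 455/2 subcarrier cycles per line: ~15734 Hz
    frame_rate = line_rate / 525   # 525 lines per frame

    print(1e6 / line_rate)   # ~63.56 us per scanline
    print(frame_rate)        # ~29.97 frames per second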
My larger point though was that in the 1930s even that strict signal timing had to be encoded and decoded purely with discrete analog components. I have a 1950s Predicta television, and looking at the components on its boards one can't help wondering "how the hell did they come up with this crazy scheme?" It drives home just how bonkers the whole idea of analog composite television was for the time.
> first, how do you fit in the color so a monochrome TV can still show it?
To clarify for anyone who may not know, analog television was created in the 1930s as a black-and-white composite standard defined by the EIA in the RS-170 specification, then in 1953 color was added by a very clever hack which kept all broadcasts backward compatible with existing B&W TVs (defined in the RS-170A specification). Politicians mandated this because they feared nerfing all the B&W TVs owned by voters. But that hack came with some significant technical compromises which complicated and degraded color analog video for over 50 years.
accounting2026
11 days ago
Yes, I knew what you meant, and fully agree. It is fascinating that TV is even possible out of all these rather simple and bulky analog components. Even the first color TVs were built with vacuum tubes and no transistors.
As I recall there are all kinds of hacks in the design to keep them cheap. For instance, letting the flyback transformer that produces the high voltages operate at the same frequency as the horizontal scan rate (~15 kHz), so that mechanism essentially serves double duty. The same was even seen in microcomputers, where the crystal needed for TV timing was also used for the microprocessor - meaning that e.g. a "European" Commodore 64 with PAL was actually a few percent slower than an American C64 with NTSC. And other crazy things like that.
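You can re-derive that speed difference from the actual crystal frequencies (real values; the divider ratios are as I understand them from the C64 schematics):

    ntsc_crystal = 14.31818e6    # 4x the NTSC color subcarrier
    pal_crystal = 17.734475e6    # 4x the PAL color subcarrier

    ntsc_cpu = ntsc_crystal / 14   # ~1.0227 MHz
    pal_cpu = pal_crystal / 18     # ~0.9852 MHz
    print(f"PAL C64 clock is {(1 - pal_cpu / ntsc_cpu) * 100:.1f}% lower")  # ~3.7%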
mrandish
11 days ago
> "European" Commodore 64 with PAL was actually a few percent slower than an American C64 with NTSC. And other crazy things like that.
Indeed! Even in the Playstation 2 era, many games still ran at different speeds in Europe than the U.S. and Japan. There were so many legacy artifacts which haunted computers, games, DVDs and more for decades after analog broadcast was supplanted by digital. And it all arose from the fact the installed base and supporting broadcast infrastructure of analog television was simply too massive to replace. In a way it was one of the biggest accrued "technical debts" ever!
The only regrettable thing is during the long, painful transition from analog to digital, a generation of engineers got the idea that the original analog TV standard was somehow bad - which, IMHO, is really unfair. The reality is the original RS-170 standard was a brilliant solution which perfectly fulfilled, and even exceeded, all its intended use cases for decades. The problems only arose when that solution was kept alive far beyond its intended lifetime and then hacked to support new use cases like color encoding while maintaining backward compatibility.
Analog television was created solely for natural images captured by vacuum tube cameras. Even the concept of synthetic imagery like character-generator text and computer-graphics charts was still decades in the future. Then people who weren't yet born when TV was created began to shove poorly converted, hard-edged, low-res digital imagery into a standard created to gracefully degrade smooth analog waveforms, and it indeed sucked. I learned to program on an 8-bit computer with 4K of RAM connected to a Sears television through an RF modulator. Even 32 columns of text at 256x192 was a blurry mess with color fringes! On many early 8-bit computers, some colors would invert randomly based on which clock phase the computer started on! Red would be blue and vice versa, so we'd have to repeatedly hit reset until the colors looked correct. But none of that craziness was the fault of the original television engineers; we were abusing what they created in ways they couldn't have imagined.
account42
11 days ago
It's interesting how early digital video systems were influenced by the analog aspects. DVDs were very much still defined by NTSC/PAL even though the data is fully digital.
mrandish
11 days ago
Indeed and even today's HDTV specification has elements based on echoes reverberating all the way from decisions made in the 1930s when specifying B&W TV.
The composite and component sampling rates (14.32 MHz and 13.5 MHz) are both rooted in analog television: 14.32 MHz is 4x the NTSC color subcarrier, and 13.5 MHz was chosen to divide evenly into both the NTSC and PAL line rates. And those two frequencies directly dictated all the odd-seeming horizontal pixel resolutions we find in pre-HD digital video (352, 704, 360, 720 and 768) and even the original PC display resolutions (CGA, VGA, XGA, etc).
For example, the 720 horizontal pixels of DVD and digital satellite broadcasts were tied to the digital component video standard sampling the active picture area of an analog video scanline at 13.5 MHz to capture the 1440 clock transitions in that waveform. Similarly, 768 (another common horizontal resolution in pre-HD video) is tied to the composite video standard sampling at 14.32 MHz to capture 1536 clock transitions. The history of how these standards were derived is fascinating (https://tech.ebu.ch/docs/techreview/trev_304-rec601_wood.pdf)
VGA's horizontal resolution of 640 simply comes from adjusting analog video's rectangular pixels to be square (704 active samples x 0.909 ≈ 640). It's kind of fascinating that all these modern digital resolutions can be traced back to decisions made in the 1930s based on which affordable analog components were available, which competing commercial interests prevailed (RCA vs Philco) and the political sensitivities present at the time.
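The arithmetic is easy to check (Python; the 0.909 is the 10/11 NTSC pixel aspect ratio):

    line_rate = 15734.26          # NTSC scanlines per second

    print(13.5e6 / line_rate)     # ~858 samples/line at 13.5 MHz (720 active)
    print(14.31818e6 / line_rate) # ~910 samples/line at 4x subcarrier (768 active)
    print(13.5e6 * 53.333e-6)     # ~720: sampling just the active picture window
    print(704 * 10 / 11)          # 640.0: the square-pixel VGA width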
lebuffon
12 days ago
I was on a course at Sony in San Mateo in the 1980s and they had a 36" prototype television in the corner. We all asked for it to be turned on. We were told by the instructor that he was not allowed to turn it on because the 40,000V anode voltage generated too many X-rays at the front of the picture tube.
:-))))
itisit
12 days ago
And perhaps peak atompunk too when used as RAM. [0]
BizarroLand
12 days ago
Damn, what I wouldn't give to be able to look at my computer and see the bits bobbing in its onboard ram
itisit
12 days ago
Like the MegaProcessor? [0]
BizarroLand
12 days ago
Yes but for my 9950x, lol
ortusdux
12 days ago
One summer odd-job included an afternoon of throwing a few dozen CRTs off a 3rd floor balcony into a rolloff dumpster. I'da done it for free.
ihaveajob
12 days ago
People pay for that these days in smash rooms.
hahahahhaah
12 days ago
Rock and roll!
fecal_henge
12 days ago
Extra dangerous aspect: on really early CRTs they hadn't quite nailed the glass thicknesses. One failure mode was that the neck that held the electron gun would fail. This would propel the gun through the front of the screen, possibly toward the viewer.
ASalazarMX
12 days ago
I don't know, "Killed by electron gun breakdown" sounds like a rad way to go. You can replace "electron gun" with "particle accelerator" if you want.
cf100clunk
12 days ago
Likewise, a dropped CRT tube was a constant terror for TV manufacturing and repair folks, as it likely would implode and send zillions of razor-sharp fragments airborne.
thomassmith65
12 days ago
My high school science teacher used to share anecdotes from his days in electrical repair.
He said his coworkers would sometimes toss a television capacitor at each other as a prank.
Those capacitors retained enough charge to give the person unlucky enough to catch one a considerable jolt.
freedomben
12 days ago
Touching one of those caps was a hell of an experience. It was similar in many ways to a squirrel tap with a wrench in the auto shop (for those who didn't do auto shop, a squirrel tap with a wrench is when somebody flicks your nut sack from behind with a wrench. Properly executed it would leave you doubled over out of breath).
NL807
12 days ago
lol I did this with my mates. Get one of those 1 kV ceramics, give it some charge and bob's your uncle, you have one angry capacitor.
torginus
12 days ago
I remember smashing a broken monitor as a kid for fun, having heard about the implosion stuff, and sadly found the back of the glass was stuck to some kind of plastic film that didn't allow the pieces to fly about :(
ASalazarMX
12 days ago
I still can't get over how we used to put them right in our faces, yet I never knew of anyone having an accidental face reshaping.
account42
11 days ago
That doesn't match my experience of deliberately dropping an old CRT monitor off the roof. Implosions are unfortunately not as exciting as explosions.
jlokier
11 days ago
Some recent HN comments about CRT implosions people have experienced.
https://news.ycombinator.com/item?id=46355765
"I still have a piece of glass in back of the palm of my right hand. Threw a rock at an old CRT and it exploded, after a couple of hours I noticed a little blood coming out of that part of hand. Many, many years later was doing xray for a broken finger and doctor asked what is that object doing there? I shrugged, doc said, well it looks like it's doing just fine, so might as well stay there. How lucky I am to have both eyes."
https://news.ycombinator.com/item?id=46354919
"2. Throwing a big big stone to an abandoned next to the trashcan CRT TV while I had it placed normally because it didn’t break when I threw it facing up and the next thing I remember after opening my eyes which I closed from the bang was my friends who were further down the road looking at me as it I were a ghost since big big chunks for the CRT glass flew just right next to me.
CRTs were dangerous in many aspects!"
https://news.ycombinator.com/item?id=46356432
"I'll never forget the feeling of the whoosh when I was working as a furniture mover in the early 2000s and felt the implosion when a cardboard box collapsed and dumped a large CRT TV face-down on the driveway, blowing our hair back. When the boss asked what happened to the TV, I said it fell, and our lead man (who had set it on the box) later thanked me for putting it so diplomatically."
cf100clunk
11 days ago
The ''tube'' was indeed extremely fragile and thus extremely dangerous. I'm talking about only the unguarded ''tube'' itself. Repair and manufacturing technicians had to deal with that on a regular basis. Later, consumer protection laws and other standards came into effect that made TV and monitor superstructures more capable of guarding such a dangerous internal component. Your experience was clearly with those much safer, later types.
kleiba
12 days ago
What do you mean "had"? I just turned mine off a minute ago. I have yet to make the transition to flat-screen TVs, but in the meantime, at least no one's tracking my consumer habits.
account42
11 days ago
While not entirely unrelated thematically, being electric puts it distinctly outside of steampunk and even dieselpunk. I don't think anyone would call The Matrix steampunk, but CRTs are at the center of its aesthetic. Cassette futurism is the correct term, I believe, though it also overlaps with some sub-genres of cyberpunk.
kazinator
12 days ago
With CRTs, the environmental problem is the heavy metals: tons of lead in the glass screen, plus cadmium and whatnot. Supposedly there can be many pounds of lead in a large CRT.
accounting2026
11 days ago
Yes - and x-rays too! Partly from the main picture tube itself (though it was usually shielded), but historically the main problem was actually the vacuum rectifiers used to generate the high voltages required. Those vacuum tubes essentially became x-ray bulbs and had to be shielded. This problem appeared with the first color TVs in the late 60s: color required higher voltages for the same brightness, due to the introduction of a mask that absorbed a lot of the energy. As a famous example, certain GE TVs would emit a strong beam of x-rays, but it was aimed downwards, so it would mostly expose someone beneath the TV. Reportedly a few models could emit 50,000 mR/hr at 9 inches distance https://www.nytimes.com/1967/07/22/archives/owners-of-9000-c... which is actually quite a lot (enough for radiation sickness after a few hours). All were recalled, of course!
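Back-of-the-envelope on how bad that is (Python; very rough conversion - 1 R of x-rays is on the order of 0.01 Gy absorbed dose, and the onset threshold is approximate):

    dose_rate_R = 50_000 / 1000          # 50,000 mR/hr = 50 R/hr
    dose_rate_Gy = dose_rate_R * 0.01    # ~0.5 Gy/hr, roughly
    ars_onset_Gy = 1.0                   # approx. acute radiation syndrome onset
    print(ars_onset_Gy / dose_rate_Gy)   # ~2 hours at 9 inches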
cf100clunk
12 days ago
The shadow mask system for colour CRTs was a huge improvement that thwarted worries about ''beams everywhere'':
accounting2026
11 days ago
Actually, the voltages had to be raised because of the shadow mask, and this rise in voltage meant you were now in x-ray territory, which wasn't the case before. The infamous problems with TVs emitting x-rays, and the associated recalls, were with the early color TVs. And it wasn't so much the tube as the shunt regulators etc. in the power supply, which were themselves vacuum tubes. If you removed the protective cans around those you would be exposed to strong radiation. Most of that went away when TVs were transistorized, so the high-voltage circuits no longer involved vacuum tubes.
cf100clunk
11 days ago
Most of those old TVs were not Faraday-caged either, nor were they grounded to earth, so all that radiation and energy was one hardware failure away from seriously unfunny events. Their chassis always gave a tingle to the touch.
hahahahhaah
12 days ago
Try antialias with that bad boy
brcmthrowaway
12 days ago
The 1940-1990 era of technology can't be beat. Add hard drives and tape to the mix. What happened to electromechanical design? I doubt it's even taught anymore. Everything is solid state.
Xirdus
12 days ago
Solid state is the superior technology for almost everything. No moving parts means more reliable, quieter, and very likely more energy efficient since no mass has to move.
jasonfarnon
12 days ago
Do modern SSDs last as long as the old platter drives? For me, when an SSD fails it's frustrating because I can't open it up and do anything about it - it's a complete loss. So I tend to have a low opinion of their reliability (the same issue I have with old versus new electronic-everything cars). I don't know the actual lifetimes. Surely USB sticks are universally recognized as pretty crappy. I can leave them plugged in in the same location and they'll randomly die after a couple of years.
Xirdus
10 days ago
I feel like I'm the only person in the world who never had an issue with USB flash drives. Or HDDs for that matter. Or SSDs. I don't think I've ever had any storage die on me except optical disks.
Internet says both HDDs and SSDs have similar average lifespans, but with HDDs it's usually a mechanical failure so yes, you can often DIY it back to life if you have the right parts. With SSDs it's almost always the memory cells themselves wearing out. On the flip side, data recovery is usually much easier since SSD will usually keep working in read-only mode for a while, whereas a faulty HDD won't work at all.
djkoolaide
11 days ago
I've had two SSDs "die" over the years, both of them went read-only, but I was able to recover all data. SSD failure modes are weird.
grishka
12 days ago
That, and modern digital TV is just incredibly boring from a technical standpoint. Because everything is a computer these days, it's just some MPEG-2 video. The only impressive thing about it is that they managed to squeeze multiple channels' worth of video streams into the bandwidth of one analog channel.
joe_the_user
12 days ago
Also, I believe precursors to the CRT existed in the 19th century. What was unique with television was the creation of a full CRT system that allowed moving-picture consumption to become a mass phenomenon.
pinnochio
12 days ago
We're getting awfully close to recreating CRT qualities with modern display panels. A curved 4:3 1000Hz OLED panel behind glass, and an integrated RetroTink 4K with G-Sync Pulsar support would do it. Then add in a simulated degauss effect and electrical whine and buzzing sounds for fun.
soperj
12 days ago
still can't play duck hunt on it though.
gzalo
12 days ago
Yes you can, see https://neslcdmod.com/
It basically mods the ROM to allow for a bit more latency when checking for hit targets.
charcircuit
12 days ago
>1000 Hz
This sounds like a brute-force solution compared to just having the display controller read the image as it is being sent and emulate the phosphors.
pezezin
10 days ago
Why curved? We didn't like the CRT curvature back then and manufacturers struggled to make them as flat as possible, finally reaching "virtually flat" screens towards the end of the CRT era. I have one right here on my desk, a Sony Multiscan E200.
femto
12 days ago
This thread makes me realise that the old Telequipment D61 Cathode Ray Oscilloscope I have is worth hanging on to. It's basically a CRT with signal conditioning on its inputs, including a "Z mod" input, making it easy to do cool stuff with it.
gman83
11 days ago
This is a cool little project you might be interested in - https://github.com/mausimus/ShaderGlass
accidentallfact
11 days ago
'Steampunk' means no electricity. You need to come up with another term. Analogpunk, maybe?
WorldMaker
11 days ago
"Dieselpunk" is sometimes considered the next door neighbor term for WW1 through early 1950's retrofuturism with electricity and radios/very early televisions.
Sometimes people use "steampunk" as shorthand for both, because there are some overlaps in either direction, especially if you are going for a "just" pre-WWI retrofuture. Though I think the above poster was maybe trying to highlight the pre-WWI overlap with steampunk: more electricity, but not yet as many cars and "diesel".
accidentallfact
11 days ago
I don't know. It seems more like a coincidence that steam and electricity were developed at the same time, so worlds without one of them seem natural. Another possibility might be no semiconductors. No nuclear also feels plausible, but it's just not interesting. Anything else requires a massive stretch to explain why technology got stuck in such a state.
WorldMaker
11 days ago
Perhaps, if you are worried about realism from the perspective of modern technology. But a lot of the concept of retrofuturism is considering the possible futures from the perspectives of the past. You don't necessarily need realism for why you would consider an exercise like that.
Steampunk is "rootable" in the writings of Jules Verne and H. G. Wells and others. We have scifi visions through Victorian and Edwardian lenses. It wasn't necessary at the time to explain how you steam-power a submarine or a rocket ship; it was just extrapolating "if this goes on" from the quick advances in steam power and expecting them to eventually get there.
Similar with a lot of Dieselpunk. The 1930s through the 1950s are often referred to as the Golden Age of scifi. There's so much science fiction written in the real world with a zeal for possible futures that never happened. We don't necessarily need a "massive stretch" to explain why technology took a different path or "got stuck" at a particular point. We've plenty of ideas of the exuberance of that era just in the books that they wrote and published themselves.
(Not that we are lacking in literary means to create excuses for the "realism" of retrofuture, either, when we care to. For one obvious instance, the Fallout franchise's nuclear warfare is central to its dieselpunk setting and an obvious reason for technology to get "stuck". For a less obvious instance, I like "For All Mankind" and its "Apollopunk" setting, using the excuse of Russia beating the United States to first boots on the Moon and the butterfly impacts that could have had.)
accidentallfact
10 days ago
I mean that steampunk looks plausible, because it indeed seems to be purely a historical coincidence that electricity was developed at the same time. They are unrelated, one doesn't follow from the other in any way, so there is no obvious need to have both.
You pretty much need to have both chemistry and electricity, or neither.
Even Jules Verne understood the impossibility (or at least absurd impracticality) of a steam powered submarine, and made Nautilus electric.
It's unclear if internal combustion engines would be developed without electricity, and to what degree they would become practical.
I'm not sure about semiconductors, but the discovery does seem fairly random, and it seems plausible that electronics could just go on with vacuum tubes.
It seems perfectly plausible that nuclear wasn't noticed or practically developed, but, as I said, it just isn't an interesting setting.