The Peach meme: On CRTs, pixels and signal quality (again)

88 points | posted 8 days ago by zdw | 45 comments

fleabitdev

5 hours ago

Emulators also struggle to faithfully reproduce artwork for the Game Boy Color and the original Game Boy Advance. Those consoles used non-backlit LCD displays with a low contrast ratio, a small colour gamut, and some ghosting between frames. Many emulators just scale the console's RGB colour values to the full range of the user's monitor, which produces far too much saturation and contrast.
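
(If you're curious what correcting for this looks like: below is a minimal sketch, assuming a simple channel-crosstalk-plus-gamma model. Every constant is illustrative, not any emulator's actual values - real filters, like the colour-emulation options in higan or in RetroArch shaders, are carefully tuned per console.)

```python
# Minimal sketch of GBC/GBA-style colour correction, assuming a
# "raise gamma, bleed channels together" model. All constants are
# illustrative, not taken from any particular emulator.

def correct_colour(r5, g5, b5):
    """Map a 5-bit-per-channel colour towards how it read on the
    unlit LCD: dimmer, lower contrast, channels mixed together."""
    lcd_gamma, out_gamma = 4.0, 2.2

    # Linearise against the LCD's high effective gamma.
    lr, lg, lb = ((c / 31.0) ** lcd_gamma for c in (r5, g5, b5))

    # Bleed some of each channel into the others (desaturation).
    r = 0.84 * lr + 0.12 * lg + 0.04 * lb
    g = 0.06 * lr + 0.86 * lg + 0.08 * lb
    b = 0.04 * lr + 0.08 * lg + 0.88 * lb

    # Re-encode for a standard monitor.
    return tuple(round(255 * c ** (1 / out_gamma)) for c in (r, g, b))

print(correct_colour(31, 0, 0))  # full red comes out softened: (236, 71, 59)
```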

It's a shame, because I really like the original, muted colour palette. Some artists produced great results within the limitations of the LCD screen. Similar to the CRT spritework in this article, it feels like a lost art.

However, there's an interesting complication: quite a lot of Game Boy Color and Game Boy Advance developers seem to have created their game art on sRGB monitors, and then shipped those graphics without properly considering how they would appear on the LCD. Those games appear muddy and colourless on a real console - they actually look much better when emulated "incorrectly", because the two errors exactly cancel out!

BearOso

5 hours ago

Two of the best examples of this are the first two GBA Castlevania games, Circle of the Moon and Harmony of Dissonance. The former was developed without consideration for the actual screen and is extremely dark and hard to see on it. The latter did account for the GBA screen and is far too colorful and bright on anything else.

seanhunter

7 hours ago

It genuinely baffles me that people are nostalgic about CRTs. CRTs were universally god-awful. I paid top dollar for the best that money could buy, since I worked from home, and it was still terrible. Modern monitors are better in every possible way.

Aloha

6 hours ago

CRTs were better for a long time; LCDs did eventually catch up.

I held on to my 21" Trinitron long into the LCD era because it had better contrast, resolution, and sharpness. Eventually, affordable LCDs caught up.

brandonmenc

6 hours ago

People are nostalgic for the pixel art made specifically to look good on CRTs.

It’s like sometimes preferring 24 fps cinema or oil paints over photography.

It depends on what’s being displayed.

rafabulsing

6 hours ago

Yeah, I don't think anyone is nostalgic for doing spreadsheets/word processing/etc. on CRTs.

The utilitarian POV will always look for the best (least noisy/most accurate/most reproducible) medium possible. But when it comes to art, many other aspects factor in, and a little bit of noise can very well add something to a piece. Not always, but often enough.

Tarball10

5 hours ago

There was a long period (15-20 years) where LCDs were quite a downgrade from CRTs. Poor contrast ratio (gray blacks), backlight bleed, low resolution, poor color gamut (they're still putting 45% NTSC screens in budget laptops!), stuck at 60Hz refresh rates.

Those gaps have finally started closing in the last few years, now that 4K 240Hz 99% DCI-P3 OLED monitors are readily available and relatively affordable.

mrandish

2 hours ago

> It genuinely baffles me that people are nostalgic about CRTs.

I don't get nostalgic about any technologies - and certainly wouldn't get nostalgic about cathode ray tubes which were big, heavy and had innate limitations. However, I am serious about vintage game preservation and I care about seeing classic game art which was created on and for CRTs accurately presented as the original developers and artists saw it. These days that's as easy as playing games from the CRT era with a good CRT-emulation pixel shader.

What frustrates me is when I see classic 80s games on popular retro YouTube channels looking objectively far worse than they actually did in the 80s. That happens because some of that artwork was painstakingly hand-crafted to exploit the unique traits of both the analog video format and CRTs. When presented without a pixel shader, some of those titles simply look wrong - and in some cases, egregiously so. I know because I'm old enough to have been there, and to have worked with and learned from some now-legendary 80s game developers and artists.

The hard-edged, square pixel blocks many young people (who've never seen a CRT) think a retro game like Pac-Man or Joust should have are a strange historical anomaly. When I show them what the games actually looked like, either via a good pixel shader or on my arcade emulation cabinet's 25-inch analog RGB, quad-sync CRT (which was made for high-end arcade cabinets), they're often shocked. I hear things like "Wow, I thought retro was cool because it looked so janky, but it was actually softly beautiful."

To me, the importance of CRTs (and CRT shaders) isn't about injecting analog degradation to recreate some childhood nostalgia for the crappy RCA TV your parents had in the living room (with rolling hum bars from the unshielded RF modulator). It's about making games like Pac-Man and Joust look as good as they really did on the industrial CRT in their arcade cabinet (which could be better than the best TV many consumers ever owned). Or, alternatively, making console games look as good as they did to the original developers and artists, who usually used the highest-quality CRTs they could get, because they were after the best-possible image for the same reasons recording studios have always used reference-grade audio speakers.

So yeah, it's not honoring those historic games when retro YouTubers show them in a degraded form that looks far worse than they ever did back in the day - especially when it's now so easy to present them accurately by turning on a CRT shader that's already built into many retro emulators. As others in this thread point out, even the best pixel shaders aren't completely perfect, but as a retro-purist (and video engineer whose career spanned the analog and digital video eras) I concede today's best pixel shaders are 'accurate enough' - and certainly far better than hard-edged block pixels. It's weirdly tragic, because what some people think 'retro' games looked like isn't merely worse than a bad consumer TV or better than a good analog RGB CRT - it's just wrong, on some bizarre third axis of digital jank which never existed in the CRT era.

hbn

6 hours ago

Your answer is right in the article, it shouldn't be that baffling.

No one is campaigning to get rid of beautiful modern 4K OLED displays and return to CRTs for everything. But low-resolution content (retro games and computers) looks better on a CRT.

There's pretty good modern emulation of the look, but at the end of the day it's a different technology and they do look different.

Not to mention the feel of playing through a near-zero latency analogue signal is noticeable if you've played enough of certain games. There's a reason speedrunners of retro games all play on console with a CRT.

cubefox

6 hours ago

No, CRTs are still much better than sample-and-hold screens (OLED or LCD) with regard to motion clarity.

Short version: our eyes constantly track moving objects on screen ("smooth pursuit"), which leads to visible smearing on OLED and LCD screens, because they hold each frame for the entire frame time rather than flashing it for a short fraction of that time. Fast-paced, side-scrolling games like Sonic especially look much better on CRT screens. (But CRTs have much lower brightness, and hence contrast, than modern screens.)

Full explanation here: https://news.ycombinator.com/item?id=42604613
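
(A back-of-the-envelope version of the argument, with illustrative numbers - perceived smear during smooth pursuit is roughly scroll speed times the time each frame stays lit:)

```python
# Back-of-the-envelope: during smooth pursuit the eye keeps moving
# while the displayed frame holds still, so perceived smear is
# roughly (scroll speed) x (time the frame stays lit).

scroll_speed = 960            # px/s, a brisk side-scroller pan
frame_time = 1 / 60           # 60 Hz refresh

hold_times = {
    "sample-and-hold LCD/OLED": frame_time,   # lit the whole frame
    "CRT-like ~1.5 ms phosphor flash": 0.0015,
}

for display, hold in hold_times.items():
    print(f"{display}: ~{scroll_speed * hold:.1f} px smear")

# sample-and-hold: ~16 px; CRT-like flash: ~1.4 px. Strobing/BFI on
# modern panels shortens the hold, trading brightness for clarity.
```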

luciferin

10 hours ago

This is fun to see right now. I've been playing around with CRT shaders in RetroArch for the last few days. My main goal is to use the [CRT-Beam-simulator](https://github.com/blurbusters/crt-beam-simulator) at 120 Hz and get some sort of CRT slot or shadow mask at the same time. I've landed on some settings I enjoy for N64 games, and it really has improved the experience for me.

On the post's notes on the Sonic waterfall effect, the [Blargg NTSC Video Filter](https://github.com/CyberLabSystems/CyberLab-Custom-Blargg-NT...) is intended to recreate that signal artifact, but similar processing is included in a lot of the CRT shaders that are available. I found that RGB had a visual artifact when moving that made the waterfall flicker, but composite didn't, so I played on that setting. Running it with the beam simulator is probably causing some of that.
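
(The core of the waterfall trick, in toy form: composite video's limited horizontal bandwidth low-passes alternating dither columns into a translucent average, which a clean RGB path faithfully refuses to do. The 3-tap kernel below is a crude stand-in for a real NTSC bandwidth model like Blargg's:)

```python
# Toy version of why the waterfall trick needs a composite-style
# signal path: alternating dither columns get low-pass filtered into
# a translucent average.

scanline = [1.0, 0.0] * 8        # waterfall dither: on, off, on, off...
kernel = [0.25, 0.5, 0.25]       # crude horizontal low-pass

def lowpass(line, k):
    half = len(k) // 2
    out = []
    for i in range(len(line)):
        acc = 0.0
        for j, w in enumerate(k):
            idx = min(max(i + j - half, 0), len(line) - 1)  # clamp edges
            acc += w * line[idx]
        out.append(acc)
    return out

# Interior pixels settle at 0.5: the columns fuse into a uniform,
# half-transparent sheet instead of visible stripes.
print(lowpass(scanline, kernel))
```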

dmonitor

10 hours ago

Using an OLED display? I've found that's the only type of display that can even come close to reproducing the CRT look.

willis936

8 hours ago

The blacks are there, but the brightness is not. I just played some Smash 64 on a CRT last weekend, and I use an OLED for my desktop.

mapontosevenths

8 days ago

I have a RetroTink 4K. I mostly use it for VHS transfer these days, but its original purpose is upscaling retro game images and applying various masks and filters to make a game look like it's on a CRT when shown on a modern display.

It works beautifully, and you no longer need a clunky, heavy, dying CRT. I'm sure the purists will say it's not the same, but I've done side-by-side comparisons IRL and it's good enough for me, even when pixel-peeping. I prefer the emulated CRT look on a modern OLED to the real thing these days.

hbn

6 hours ago

I started playing Policenauts recently, and when I first booted the game I was straining my eyes trying to read the blocky pixelated text. I only recently started using RetroArch, but I did some digging and figured out how to enable a CRT filter and immediately it was 1000% easier on my eyes.

The anime art and FMV sequences looked way better too.

CharlesW

10 hours ago

Apologies for the off-topic question, but I'm so curious: How is this useful for VHS transfer?

kowbell

10 hours ago

Not OP, but I assume "VHS transfer" meant "transfer to a digital format", i.e. digitize. The RetroTink is a fancy "composite/component/VGA-to-HDMI" box, so you can do: VCR playing a VHS -> RetroTink -> HDMI capture card -> computer saving that to a file.

mapontosevenths

3 hours ago

Exactly this.

To properly capture VHS you need a TBC (time base corrector). Most have died over the years, and the ones that are left are either very expensive or dying as their caps fail. A TBC that was once considered low-end commonly sells for $1k+ today.

The RetroTink can do most of what a TBC does, and it's a modern device you can actually buy for a reasonable price. It can also upscale and deinterlace for you in the process, saving a ton of work later. The serious archivist would scoff, but it's good enough for home movies, and I would argue it introduces less noise than 30-year-old professional equipment with dying caps.

amlib

9 hours ago

I do something like this for my old game consoles, except that I pipe them through an old analog video capture card that supports 240p60 and use the video processing module in RetroArch to do the capture with minimal lag. After adding some fancy CRT shaders and other image adjustments carefully tailored for this, the image comes out looking great! I sometimes toggle the shaders off and wince at the "raw" digital capture. I actually bought this capture card back in 2008 for this purpose but detested using it until around 2018, when I started using it in conjunction with RetroArch and CRT shaders.

For that period it even shaped my perception that analog video, and especially N64 graphics, was always bad - but those shaders vindicated all of it. They really do make a big difference, and made me find a new appreciation for N64 graphics in particular.

There is some internet misconception that the inherently "blurry" output of an N64 is bad (and sure, some games are just ugly from an artistic standpoint), but it's actually the smoothest image any analog console will ever produce when hooked up to a proper CRT or CRT shader, and it's consistent across all games because of the "forced" high-quality AA in every title. Even the next generation of consoles seldom used AA.

trenchpilgrim

11 hours ago

An OLED with a great filter is good enough for most gamers other than archivists and hardcore collectors, yeah.

rendaw

8 days ago

The peach on the author's CRT looks pretty awful, as does the photo. I'm curious what sort of CRT produced the meme image. Maybe it can't be done by a real CRT, but the author's CRT doesn't look anything like the example from the CRT database they have below.

They also said the impression is different since it's so close up - what does it look like at the size you'd really see it in game?

CrossVR

8 hours ago

> I'm curious what sort of CRT produced the meme image.

The article mentions later that it's a PVM-20L2MD [1]. This is a professional CRT monitor for medical devices. It uses the same signals as a consumer TV, but comes with a higher quality tube that has a sharper picture.

[1] https://crtdatabase.com/crts/sony/sony-pvm-20l2md

dmonitor

10 hours ago

He seemed to test it on a bunch of computer monitors, and not a standard 480i consumer television set? Different shadow masks and phosphor patterns change how things look.

drougge

9 hours ago

The C= 1084S he uses is more a (very good) PAL TV than a computer monitor, even if it was sold as a monitor. So "576i" in your terminology. (It was also sometimes sold with a TV tuner - or at least the earlier 1084, with the same picture tube AFAIK, was.)

zzo38computer

6 hours ago

In my opinion: CRTs have problems, including geometric distortion; however, an LCD can look bad if the picture's resolution does not match the display's native resolution, a problem a CRT handles better. LCD is good if the source of the picture is made for the LCD at its native resolution.

gwbas1c

10 hours ago

When I played NES and SNES as a kid, the resolution was so low that I only saw pixels. (Edit: I saw whole pixels when using the RF switch.) To this day, when I go back and play those games on modern consoles I just can't use CRT emulation.

Maybe I just didn't play games that used tricks to get around the pixels?

---

That being said, I remember that "New Super Mario Brothers" on Wii appeared to use a CRT trick to try and make the background flash in a boss room. I always played my Wii in 480p, so it just looked like there were vertical lines in the boss room.

RiverCrochet

10 hours ago

I grew up playing Atari, NES, SNES, and PS1 games on old TVs most of the time, sometimes not the best quality. I also remember that in 80's arcades it was all but guaranteed that at least one or two machines would have CRT issues: misaligned colors, skew at the top/bottom, burn-in (common), screen too bright, etc. All part of the experience, and quite nostalgic for me.

The NES had a particular quirk with its NTSC output that I always thought was very characteristic of the NES. I found this article a few years ago and was fascinated that work was done to really figure it out - https://www.nesdev.org/wiki/NTSC_video - and it's awesome that at least some emulators (FCEUX) seem to use this info to generate an experience quite similar to what I remember the NES being when I grew up. But I don't think any NES game's graphics really depended on this for visual output. All NES games had jagged vertical lines, for example.

gwbas1c

7 hours ago

> The video timing in the NES is non-standard - it both generates 341 pixels, making 227 1/3 subcarrier cycles per scanline, and always generates 262 scanlines. This causes the TV to draw the fields on top of each other, resulting in a non-standard low-definition "progressive" or "double struck" video mode sometimes called 240p

Ahh: I always wondered why I never saw interlacing artifacts on the NES! (I'm going to assume the same thing for the SNES too.)
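
(The quoted numbers are easy to sanity-check, taking as given that the NES master clock is 21.477272 MHz, i.e. six times the NTSC colour subcarrier, and that the PPU dot clock is the master clock divided by 4:)

```python
# Sanity-checking the quoted nesdev timing numbers.

subcarrier = 315e6 / 88       # NTSC colour subcarrier, ~3.579545 MHz
master = 21.477272e6          # NES master clock (6x subcarrier)
pixel_clock = master / 4      # PPU dot clock, ~5.369318 MHz

# Each pixel spans 2/3 of a subcarrier cycle, so one 341-pixel
# scanline covers 227 1/3 cycles - matching the quote above.
print(341 * subcarrier / pixel_clock)   # ~227.333

# Standard NTSC interlace needs 262.5 lines per field; the half-line
# is what offsets odd and even fields vertically. The NES always
# emits a flat 262, so every field lands on the same scanlines: no
# interlace flicker, just "240p". (The SNES outputs the same 240p by
# default, though it also has an optional interlaced mode.)
```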

Lerc

8 hours ago

I wonder if anyone has employed the services of a hyperrealistic artist to depict the image they see on a CRT.

Given their ability to produce a painting that appears identical to a photo, could they depict how the image appears to them, eliminating any loss from mechanical capture?

LennyHenrysNuts

7 days ago

I still have three working CRTs: a monochrome monitor for the Atari ST, a Sony Multiscan VGA, and some random Philips thing I saved from the skip.

I still play Diablo I on the Sony to this day. Wonderful monitor. I will cry when it finally dies.

karmakaze

7 days ago

This reminds me of how my favorite way of watching movies at home was on a 1365x768 plasma TV at 24 fps. I really didn't like the 1080p, 120Hz, and 4K sets that came after it: great for sports and news, not so much for fiction.

tuna74

7 hours ago

Maybe you should turn off the motion smoothing and show the movies in their proper 24 fps?

Playing something like The Dark Knight Rises from an UHD Blu-ray on a good OLED looks incredible!

astrange

5 hours ago

You want either motion smoothing (of the 60->120fps kind) or black frame insertion on an OLED for good motion; otherwise the lack of decay between frames will make it look unnaturally juddery.

briandw

5 hours ago

Perfect example of the expert fallacy. Also how safetyism can cause harm.

gwbas1c

7 hours ago

> Sometimes, all we want to do is shout "CRTs were magic, bro, just trust me!"

I certainly feel that way when watching interlaced video. There's far too much bad deinterlacing out there. One of the biggest problems I encounter is that deinterlacers tend to halve the motion rate (i.e., 60i -> 30p instead of 60p, or 50i -> 25p instead of 50p).

(That being said, when I would watch interlaced live TV on a 30" or more TV, I'd see all kinds of interlacing artifacts.)
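
(Keeping the full field rate is exactly what a basic "bob" deinterlacer does: each field becomes its own output frame. A minimal sketch of the idea - real filters like yadif or QTGMC interpolate far more cleverly, but the frame-rate bookkeeping is the same:)

```python
# Minimal "bob" deinterlace: each field becomes its own output frame,
# keeping all 60 (or 50) motion samples per second instead of fusing
# field pairs into 30 (25) frames. Assumes top-field-first material.

def bob_deinterlace(frame):
    """frame: list of scanline rows; even rows = top field, odd rows
    = bottom field. Returns two full frames in temporal order."""
    def line_double(field):
        out = []
        for row in field:
            out.extend([row, row])   # naive: just repeat each line
        return out
    return line_double(frame[0::2]), line_double(frame[1::2])

toy = [f"line{y}" for y in range(6)]   # toy 6-line interlaced frame
first, second = bob_deinterlace(toy)
print(len(first), len(second))         # 6 6 - two full-height frames
```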

ssl-3

7 hours ago

With enough lines, enough brightness, and a high-enough refresh rate, it may become possible to have a display that can artificially emulate the features of a CRT - including phosphor persistence, blooming, focus issues, power supply sag, and everything else, with interlacing. AFAICT, we aren't there yet.

If/when this happens, we may be able to again view things as close as they were in broadcast, but with a modern display instead of an old CRT.

If we can find any content to play that way, anyhow. A lot of it is cheerfully being ruined.

Aside from the dozens of us who are interested in that, most of the rest of the folks seem convinced that television looked just as terrible as a blurry SLP VHS tape does after being played through a $12 composite-to-USB frame grabber, with a 3.5mm "aux cable" jammed between the RCA jacks of the VCR and the converter, and ultimately delivered by an awful 360p30 codec on YouTube before being scaled in the blurriest way possible - and they draw from this the conclusion that there are no details worth preserving.

Even though television was never actually like that. It had limits, but things could be quite a lot better than that awful mess I just described.

(For those here who don't know: Towards the end of the run, the quality of a good broadcast with a good receiver would often be in the same ballpark as the composite output of a DVD player is today (but with zero data compression artifacts instead of >0), including the presentation of 50 or 60 independent display updates per second.)

gwbas1c

6 hours ago

> With enough lines, enough brightness, and a high-enough refresh rate, it may become possible to have a display that can artificially emulate the features of a CRT - including phosphor persistence, blooming, focus issues, power supply sag, and everything else, with interlacing. AFAICT, we aren't there yet.

To truly do that, you need to display over 20 million frames a second.

Why?

True analog video didn't capture frames; instead, each pixel was transmitted / recorded the instant it was captured. This becomes clear when watching shows like Mr. Rogers on an LCD: when the camera pans, the walls look slanted. (This never happened when viewing on a CRT.) It's because the top part of the image was captured before the bottom part. I wouldn't even expect a 60i -> 60p deinterlacer to correct it.
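
(Rough arithmetic behind a number that size, treating every analog pixel time as its own "frame" - the exact total depends on how finely you slice the active line:)

```python
# Rough arithmetic for giving every pixel its own display instant.
# Figures are illustrative; totals depend on how finely you slice
# each scanline.

lines = 525
frame_rate = 30000 / 1001           # ~29.97 Hz NTSC
line_rate = lines * frame_rate      # ~15,734 lines/s

# Digitising composite at 4x the colour subcarrier (a common rate)
# gives 910 samples per full line:
samples_per_line = 910
print(line_rate * samples_per_line) # ~14.3 million pixel-instants/s

# Slice the line any finer than that and you're quickly past 20
# million updates per second - the scale of the problem above.
```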

That being said, I don't want to emulate a CRT:

- I want a deinterlacer that can figure out how to make the (cough) best image possible so deinterlacing artifacts aren't noticeable. (Unless I slow down the video / look at stills.)

- I want some kind of machine-learning algorithm that can handle the fact that the top of the picture was captured slightly before the bottom of the picture; then generate a 120p or a 240p video.

CRTs had a look that wasn't completely natural; it was pleasant, like old tube amplifiers and tube-based mixers, but it isn't something that I care to reproduce.

ssl-3

4 hours ago

I definitely understand you very well, and I agree.

Please allow me to restate my intent: With enough angular resolution (our eyes have limits), and enough brightness and refresh rate, we can maybe get close to what the perception of watching television once was.

And to clarify: I don't propose completely chasing the beam with OLED, but instead emulation of the CRT that includes the appearance of interlaced video (which itself can be completely full of fields of uncorrelated as-it-happens scans of the continuously-changing reality in front of the analog camera that captured it), and the scan lines that resulted, and the persistence and softness that allowed it to be perceived as well as it once was.

In this way, panning in an unmodified Mr Rogers video works with a [future] modern display, sports games and rocket launches are perceived largely as they were instead of as a series of frames, and so on. This process doesn't have to be perfect; it just needs to be close enough that it looks the ~same (largely no better, nor any worse) as it once did.

My completely hypothetical method may differ rather drastically in approach from what you wish to accomplish, and that difference is something that I think is perfectly OK.

These approaches aren't exclusive of each other. There can be more than one.

And it seems that both of our approaches rely on the rote preservation of existing (interlaced, analog, real-time!) video, because once that information is discarded in favor of something that seems good today, future improvements (whether in display technology, in deinterlacing/scaler technology, or both) for any particular video become largely impossible.

In order to reach either desired result, we really need the interlaced analog source (as close as possible), and not the dodgy transfers that are so common today.