Nobody can argue against the convenience that comes with playing retro video games with emulators, but there are some retro gamers who swear by playing video games on an old-school CRT TV.
So why are CRT TVs better for gaming? There’s a technical argument: CRT TVs offer less input lag, for example. But there’s also an aesthetic argument about how great classic games actually look on old-school TVs, despite their low resolution. You can judge for yourself after checking out these 10 comparisons.
Mario Kart 64 – Sharp Pixels vs. Sony KV-20FS100
A common thread throughout these comparisons is how CRT TVs smooth out sharp pixel edges and blend everything together nicely. Nowhere is that more obvious than in this Mario Kart 64 example. The sharp pixels of modern LCD and LED displays make the 2D kart sprites stick out even more than they did in 1996.
Siren In Final Fantasy 6 – Sharp Pixels vs. Sony KV-14AF1
On modern displays, the brilliant pixel art in Final Fantasy 6 looks blocky and jumbled, but a CRT displays the art as it was meant to be seen. Look at how the colors on Siren’s skin blend together nicely.
Dracula In Castlevania: Symphony of the Night – Sharp Pixels vs. Sony KV-13M51
Dracula’s features are lost in the sharp pixels, especially along the brow and eyes. Also notice how the red in Dracula’s eyes has an ominous glow on a CRT, whereas on modern technology the eyes are represented by two awkward red dots.
Mega Man 2 – Sharp Pixels vs. Sony KV-14AF1
This Mega Man 2 example highlights how water effects don’t translate well on modern technology. But on a CRT, the water actually looks like water.
Fatal Fury 3: Road to the Final Victory – Sharp Pixels vs. Sony KV-13M51
The most striking differences in this example can be found in the hair and skin color. While the sprites don’t look bad on a modern display, it’s understandable why many people would prefer the CRT.
Super Mario RPG – Sharp Pixels vs. Sony KV-27S42
While the sprites in Mario games aren’t very detailed to begin with, you can see how Peach’s dress looks much more like a dress on a CRT than a blob of pixels.
Chadarnook In Final Fantasy 6 – Sharp Pixels vs. Sony KV-14AF1
The fog/mist in this example looks too sharp and blocky on LCD and LED displays. On a CRT, it looks more hazy and cloudy.
Streets of Rage 2 – Sharp Pixels vs. Sanyo DS-13320
If you were playing Streets of Rage 2 on an LCD display, you might mistakenly believe that Blaze has some sort of skin condition. On a CRT, the shading on the skin just looks much better.
Richter in Castlevania: Symphony of the Night – Sharp Pixels vs. Sony KV-13M51
This example from Castlevania: Symphony of the Night is much like the earlier example with Dracula. The details on Richter’s face look better on a CRT.
Final Fantasy 7 – Sharp Pixels vs. Sony KV-14AF1
This portrait from Final Fantasy 7 looks like a blocky mess on modern displays. Fun fact: during the development of Final Fantasy 7, the developers used high-end Sony PVMs and BVMs.
8 thoughts on “10 Pictures That Show Why CRT TVs Are Better for Gaming”
Yes, most of these pictures show why, for sure, but some images seem to be captured from composite (which is bad) instead of S-Video or component/RGB, which make the graphics look significantly better on a CRT, especially for PS2, for example.
However, you failed to mention arguably the greatest advantage a CRT has over modern LCDs: motion quality. LCDs suffer from “sample and hold” blur during quick motion: each frame is held on screen for the full refresh interval, so as your eyes track movement, the held image gets smeared across your retinas, creating motion blur even at 60 fps. You can see this blur on your phone as you scroll through text at a moderate speed. CRTs do not have this problem; at 60 fps, motion and animation remain crystal clear. An LCD needs 200+ frames per second to rival the motion clarity a CRT achieves at just 60. Playing Sonic on an LCD is much harder on your eyes because of the fast scrolling; a CRT smooths the gameplay out dramatically.
Go to blurbusters.com for more details. CRT > LCD.
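The sample-and-hold claim above reduces to simple arithmetic: an eye tracking motion at v pixels per second smears the held image across roughly v × (hold time) pixels of retina. A minimal sketch (the 960 px/s scroll speed is an illustrative assumption, not a figure from the article):

```python
def persistence_blur_px(scroll_speed_px_per_s: float, hold_time_s: float) -> float:
    """Width of the smear (in pixels) seen when the eye tracks motion on a
    display that holds each image for hold_time_s seconds: the static frame
    slides across the moving retina for that long."""
    return scroll_speed_px_per_s * hold_time_s

speed = 960  # px/s, roughly a fast-scrolling Sonic stage (assumed value)

# Sample-and-hold LCD: each frame stays lit for the full 1/fps interval.
print(persistence_blur_px(speed, 1 / 60))    # 16.0 px of smear at 60 Hz
print(persistence_blur_px(speed, 1 / 240))   # 4.0 px at 240 Hz

# CRT: the phosphor flashes for only about a millisecond per refresh,
# so even at 60 fps the smear stays around a single pixel.
print(persistence_blur_px(speed, 0.001))
```

This is why raising the refresh rate (or strobing the backlight) is the only way an LCD approaches CRT motion clarity: the blur width shrinks only as the hold time does.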
144/120 Hz for retro gaming is useless, because old games run at only 50/60 fps. BFI (black frame insertion) at 120 Hz gives better motion for retro gaming, but it’s not perfect because of double images. A few monitors do BFI at 60 Hz, but that’s not perfect either, because of visible ghosting on slow-response LCD/LED monitors.
Those comparisons are worthless, because on one side you have photos of a CRT screen, but on the other side you do NOT have photos of LCD/LED screens; instead, you have nearest-neighbour upscaled images.
For a fair comparison, you should feed both screens the same signal and then take photos of both.
Even better, this idiot doesn’t understand that the sharp pixels far better capture the original portraits illustrated by Ayami Kojima. DRACULA’S EYES AREN’T SUPPOSED TO BE COMPLETELY RED; THE ORIGINAL ILLUSTRATION SHOWS HIM WITH RED PUPILS ONLY.
The LCD looks better in every single example except maybe one. The real reason CRT is better is input lag: there is no flat-panel screen that doesn’t delay your button presses compared to a CRT.
“The LCD looks better in every single example except maybe one. ”
You can’t be serious. Aside from the Mega Man 2 screenshot, CRT looks way better. LCD apologists are sad.
I think Mega Man 2 looks better on the CRT, so it’s a clean sweep.
Erm, this seems like a bad comparison to me, since the images on the right seem to be just upscaled pictures. A real LCD/LED has pixels (and often sub-pixels) separated by black borders. They don’t look too different from the “pixels” of a CRT (technically not pixels but a mask) if you look at the LCD/LED just as closely.
Don’t get me wrong, I actually kinda like CRTs, but this “comparison” is either biased or just poorly done.
My observation was that during the switch between those technologies, flat panels were just a tiny bit clearer but lagged. However, the huuuuge problem, in my opinion, was/is that at anything other than its native resolution, a flat panel looks atrocious: way worse than a CRT. I don’t get why there is no nearest-neighbor or linear filter available for non-native flat-panel resolutions (at least for integer ratios), because it would look much better to make exactly 4 pixels from one, e.g. 1920×1080 -> 960×540. Instead you get 4 completely smeared pixels, because some soft filter is used.
Configuring the graphics to display 1 pixel as one pixel, leaving the rest of the screen black at small resolutions, isn’t very easy either; on some setups it’s actually impossible. And it makes the used space on the screen tiny. So: a nearest-neighbor/linear filter option for integer fractions of the native resolution, please.
But yea, I actually just wanted to rant about the unfair comparison here. xD
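The integer nearest-neighbor scaling this commenter asks for is simple in principle: each source pixel becomes a factor×factor block, so edges stay perfectly sharp with no smearing. A minimal sketch in plain Python (a list of lists stands in for a framebuffer of pixel values; this is an illustration, not how any scaler firmware actually works):

```python
def integer_scale(image, factor):
    """Nearest-neighbour upscale by a whole-number factor: copy each pixel
    sideways `factor` times, then copy each resulting row down `factor`
    times. No blending, so pixel edges stay razor sharp."""
    out = []
    for row in image:
        scaled_row = [px for px in row for _ in range(factor)]
        out.extend([scaled_row[:] for _ in range(factor)])
    return out

# A 2x2 "image" scaled 2x becomes a 4x4 image of crisp blocks.
img = [[1, 2],
       [3, 4]]
for row in integer_scale(img, 2):
    print(row)
```

Because every output pixel is an exact copy of one input pixel, a 960×540 picture shown on a 1920×1080 panel this way would map one source pixel to exactly four physical pixels, which is precisely the behavior the commenter wants.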