[Editor's note: This story was originally published on July 11, 2013. We're resurfacing it this week as part of our tribute to the great feature work that writer Wes Fenlon has done with Tested, as he embarks on his new career in games journalism.]
Is there such a thing as the best television set ever made? If so, would that honor go to today's incredibly thin 80-inch OLED flatscreens? Or the 4K-resolution television sets just arriving on the market, which pack in 6,220,800 more pixels than 1080p screens? Defining "best" is difficult. A higher density of pixels allows 4K TVs to display a more detailed picture, but what happens when you plug in an old Super Nintendo, which outputs a mere 57,344 pixels? Poor Super Mario World has to be upscaled to more than eight million pixels, and the resulting image can look terrible--blocky, blurry, and a far cry from how it looked on a CRT TV back in 1994.
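The pixel counts above are easy to verify with a little arithmetic--a quick sketch, assuming the SNES's common 256x224 output mode:

```python
# Pixel-count arithmetic behind the figures above.

def pixels(width, height):
    """Total pixel count of a display mode."""
    return width * height

uhd = pixels(3840, 2160)   # 4K UHD
fhd = pixels(1920, 1080)   # 1080p
snes = pixels(256, 224)    # Super Nintendo's typical output mode

print(uhd - fhd)             # extra pixels in 4K vs. 1080p: 6,220,800
print(snes)                  # SNES pixel count: 57,344
print(round(uhd / snes, 1))  # upscale factor to fill a 4K panel: ~144.6
```

Every SNES pixel has to be stretched across roughly 145 panel pixels, which is why the scaler's quality matters so much.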
Adopting new television technology means saying goodbye to the advantages of older hardware. And yes, there are advantages. There's no such thing as a best TV for all eras of content. But here's a question that's actually possible to answer: What's the best display specifically for retro video games?
Now we've got some parameters to work with. The TV needs to handle low-resolution inputs at the proper framerate and aspect ratio, without lag, and with accurate colors. And, of course, it needs to display visible horizontal scanlines, a defining visual element of how retro games were seen and played.
Out of thousands and thousands of models, the single best TV for retro games is quite possibly the Sony BVM-20F1U, a 20-inch broadcast production monitor that cost about $10,000 when it was introduced in the late 1990s. It is, of course, a CRT, and it's a 15kHz display, meaning the highest resolution signal it accepts is 576 interlaced lines, or 576i. That limitation, however, makes it absolutely perfect for everything from the Nintendo Entertainment System to the first PlayStation.
In fact, it may be a little too perfect.
"The problem is that most people [in the 80s and 90s] had television sets at home which in no way resemble what a high-end CRT looks like if used today," writes Tobias Reich, who has been experimenting with video hardware for more than a decade. "For example, take this comparison shot (both taken from real CRTs): you get a high-end Sony BVM on the left and an arcade Nanao 15kHz chassis on the right. Quite a difference, right?"
On his website Hazard-City.de, Tobias Reich--who goes by the handle Fudoh online--has been compiling a 40,000 word (and counting) guide to deinterlacing, scaling, and processing game console video since 2008. He's also a regular poster at the shmups.system11.org forums, which he calls "the best hardware discussion board on the net." The shmups forums offer a window into the world of diehard collectors, who hunt for old CRTs and expensive scaling hardware, and then tweak, tweak, tweak in search of that perfect picture.
"For me it's about achieving the best possible look using real hardware," Fudoh writes over email. "Not everybody's willing to spend hundreds of dollars on equipment or hardware mods to help make old 8- and 16-bit [games] look like what most emulators [achieved 15 years ago]."
"For me it's about achieving the best possible look using real hardware...I'm not really a gamer anymore."
"I'm not really a gamer anymore," he continues. "I've been at that for 30 years now and my backlog is growing and growing. I haven't even finished (most of) the NES and PC Engine games yet which I bought back then."
Just as watch collectors buy mechanical timepieces for the masterful engineering in each and every gear, hardware diehards like Fudoh care about mastering and understanding the thousand little things that affect an image--in this case, an analog one. Telling the time--or actually playing a video game--is almost beside the point.
Finding the perfect TV means understanding scanlines, input lag, the differences between composite video and RGB, and even the geometry of cathode ray tubes. It means knowing how to mod cables and produce raw, clean video signals for finicky displays. Mostly it means having a crazy eye for detail that most people would never notice. But if you love retro games and geek out over hardware, get ready for a new addiction.
After a crash course in the intricacies of CRT tech and the image processors used to upscale old consoles for modern displays (which Fudoh believes are an even better option than the Sony BVM-20F1U), you too may be itching to drop a few hundred bucks on some of the best video gear ever made.
Scanline Crash Course
Scanlines are analog video catnip. The black lines cutting through an 8- or 16-bit game's image help soften the distinct pixels of low resolution graphics, but they've also defined a visual style that millions of people associate with 2D video games. They are equal parts functional and nostalgic and one of the most recognizable elements of older games.
But calling them scanlines is a bit of a misnomer.
Why? Let's start with how a CRT TV or monitor works. In a CRT, three electron guns, or emitters, fire a beam of electrons at the back of the glass screen. The guns are controlled by magnetic deflection, and the beam they fire moves horizontally from the left side of the screen to the right to light up a single line of phosphors, which create colors. Those are the real scanlines--each line that makes up the picture is literally created as the electron gun scans horizontally back and forth.
The black lines colloquially referred to as scanlines are actually lines where no image was drawn.
The black lines colloquially referred to as scanlines are actually lines where no image was drawn--the electron gun has skipped that line and moved down to the next to continue drawing the raster scan.
So why are those empty voids what we think of when we think of scanlines? Probably because they stand out so prominently. After all, the movies and television programs we watched on CRTs had no such black lines. Movies and broadcast television in the US were displayed at 480i, the NTSC standard, meaning there were 480 interlaced scanlines (even lines drawn in one pass, followed by odd lines in the next) combining into roughly 30 complete frames per second.
That's not how game consoles work. Instead, they output a mere 240 lines, leaving those iconic blank spaces in between.
"Older consoles manipulate the NTSC timing to force the lines drawn on screen to overlap, rather than alternate," writes Daniel Corban, who, like Fudoh, is self-taught in the intricacies of video hardware. "This is where the term 'double strike' would originate; the lines are literally being repeatedly drawn on the same physical area of the tube. This is also what creates scanlines. On a digital display, the signal is simply handled as a 240-line progressive signal, hence '240p.' "
Corban and Fudoh both own hardware expressly for the purpose of taking those 240p video signals and displaying them at the highest possible quality (another bit of terminology to note: 240p is also referred to as 15kHz, because the electron gun in a CRT scans across the screen horizontally a total of 15,750 times per second).
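That 15kHz label falls straight out of NTSC timing arithmetic (using the nominal 15,750 Hz figure; color NTSC is actually a hair lower, at about 15,734 Hz):

```python
# NTSC horizontal scan-rate arithmetic.
h_scan_rate = 15750   # horizontal lines drawn per second (~15kHz)
field_rate = 60       # fields per second (approximately)

lines_per_field = h_scan_rate / field_rate
print(lines_per_field)   # 262.5 -- about 240 are visible picture,
                         # the rest is blanking
```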
There's a ton more to know about scanlines, of course, and you can find a wealth of information on Fudoh's Scanlines Demystified page. But let's move on to another important area before we get to the holy grail of CRTs: understanding RGB signals and some of the key differences between CRTs and modern LCD displays.
The Purity of RGB
Step one towards hardware nirvana: Choose the right cables. Corban recommends using S-Video whenever possible on older consoles and avoiding RF and composite, which are the most commonly available connectors for SD consoles.
"Composite video quality varies between consoles," he writes. "Some are notorious for poor output, such as the later model Sega Genesis, and some are rather well done, such as the Super NES and PSOne. Even the best composite will be too soft and lose a lot of detail...However, the 'upgrade' from S-video to component/RGB is not significant enough for me to suggest spending any time or money if using a consumer SDTV. Unless you have a high-end monitor, such as a Sony PVM, you may not even notice the difference."
With a high-end monitor, however--or a modern display with an image processor, which we'll get to later--those last two signal types become essential.
"RGB and component are very similar," writes Fudoh. "It's hard to find somebody who can actually tell the difference. Component has the benefit of supporting ED and HD resolutions as well...S-Video is certainly closer to RGB than it is to composite and you probably don't have to bother on a 14-inch screen, but the larger the screen gets, the more important a perfect source signal gets. Composite is [a] very complicated signal, which requires tremendous efforts (comb filter, which separates the color from the luminance signal) to look anything but utter shit. I couldn't think of any reason to stick to composite. Even 70s Atari systems can be modded for S-video and getting RGB from a NES isn't this complicated either (just more expensive)."
RGB signals come in a variety of connector formats, and here European hardware enthusiasts have a major advantage. The SCART connector typically used on old consoles in Europe outputs RGB, making those consoles far easier to hook up to quality displays than their US counterparts. The original US Super Nintendo model can still output RGB and S-Video signals, however, using the appropriate Nintendo MultiAV connector cable.
The original US Super Nintendo model can still output RGB and S-Video signals using the right MultiAV connector cable.
The revised SNES-101 model unfortunately dropped RGB and S-Video support, but RGB is relatively easy to restore with a system mod, and a modded SNES-101 puts out even better image quality than the older Super Nintendo.
Modding the NES for RGB output is far more expensive and complicated. Website Retro RGB has a great page devoted to how exactly to get RGB video out of every console. Thankfully, the Sega Genesis, Saturn, Dreamcast, and PlayStation 1 put out RGB with no modding required.
The "purity" of a video signal can also be a big issue when hooking up consoles, and some displays will be more finicky than others about the signal they accept. This is why some consoles may need to be passed through another device, like the Sync Strike, to "clean" or filter the signal. Composite outputs a notoriously messy signal, which can result in colors bleeding into each other. Since RGB separates red, green, and blue, that's not a problem, but properly syncing the colors is essential.
With a retro game console outputting pure RGB video at 240p, there's one major technical hurdle left to clear: choosing a CRT for retro gaming, or playing on a modern fixed-pixel (LCD, OLED or plasma) display.
And when it comes to CRTs, the Sony BVM-20F1U may be the best monitor ever made.
The Sony BVM-20F1U
The distinction that the BVM-20F1U (or the BVM-20F1E, with U/E standing for US/Europe) is a monitor, rather than a TV, is actually significant. As Fudoh covered in an overview of the monitor he posted on the shmups forums, the BVM can be paired with various input boards for different connectors and an external controller board for adjusting monitor geometry and convergence.
Because the monitor was designed for use in broadcast production, it's far more tuneable than the average television. "Geometry, adjustment possibilities, color reproduction, that's where the Sony BVMs are top of the line and the best you can buy," writes Fudoh.
Geometry concerns come from the way the electron beams are deflected inside the TV or monitor. Distortion tends to pop up around the edges, but improperly configured geometry can affect other parts of the screen as well. Convergence refers to how closely aligned the three color electron guns are in the CRT; the better the convergence, the less color bleed you see.
Remember how your old CRT PC monitor let you adjust the pincushion and barrel distortion of the picture? Those kinds of adjustments require digging through some arcane service menus on regular TVs, if they're doable at all.
"Every CRT will have some geometry issues, especially on larger tubes," writes Daniel Corban. "Poor geometry can be mostly corrected, but convergence cannot without the use of physical magnets glued to the inside of the TV. Always test a TV before buying to see if there is red, blue, or green along any white lines and remember that it cannot be corrected easily."
Aside from its extremely high quality convergence and geometry and its advanced configuration options, there's another factor that sets the BVM-20F1U apart: resolution.
When LCD TVs hit the market, the TV industry successfully branded them as HD displays, capable of far higher resolution images than our lame old SD CRTs. We moved on up to 720p, and then 1080p. But CRTs were already capable of displaying HD images--they just didn't have the branding muscle (or the sexy thinness of LCDs) behind them.
CRTs don't have "pixels" in the same way that modern displays do. In a 1080p panel, there are 1920 horizontal pixels and 1080 vertical pixels; that number never changes. As a result, fixed-pixel displays need to do some heavy processing to handle inputs at a non-native resolution. CRTs don't have that problem--their maximum resolution is only constrained by the number of lines they're designed to support. This is also why CRTs had no input lag but modern displays can be delayed by a frame or two.
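A quick illustration of why non-native resolutions get messy on a fixed-pixel panel: a 256-pixel-wide SNES line doesn't divide evenly into 1920 panel pixels, so the simplest scaling approach (nearest-neighbor) has to render some source pixels wider than others.

```python
# Why non-native inputs are awkward for fixed-pixel displays:
# 1920 / 256 = 7.5, a non-integer scale factor.
src_width = 256
panel_width = 1920
scale = panel_width / src_width
print(scale)   # 7.5

# Nearest-neighbor mapping from panel pixels back to source pixels.
mapping = [int(x * src_width / panel_width) for x in range(panel_width)]
widths = {s: mapping.count(s) for s in set(mapping)}
print(sorted(set(widths.values())))   # [7, 8] -- uneven pixel widths
```

Those alternating 7- and 8-pixel-wide source pixels are exactly the blockiness and shimmer you see when a TV's scaler handles 240p naively; smarter scalers blur instead, trading blockiness for softness.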
The Sony BVM-20F1U looks so good because the monitor's horizontal resolution is about 900 lines, double the average NTSC TV.
The Sony BVM-20F1U looks so, so good because the monitor's horizontal resolution is about 900 lines, according to Sony, roughly double that of the average NTSC TV set. In practice, this means that if you displayed 900 vertical lines side-by-side on the BVM's 20-inch screen, you'd be able to discern each and every individual one. On the average TV, the lines would begin to blur together in the 300s or 400s. And those TVs would commonly be larger than 20 inches diagonally.
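To put those figures in physical terms, here's a back-of-the-envelope calculation. It simplifies TVL to "lines across the full screen width," as the paragraph above does, and assumes a 400-line consumer set for comparison:

```python
import math

diagonal = 20.0             # inches
aspect_w, aspect_h = 4, 3   # 4:3 CRT

# Width and height from the diagonal via the 3-4-5 triangle.
unit = diagonal / math.hypot(aspect_w, aspect_h)   # hypot(4, 3) == 5
width = aspect_w * unit    # 16.0 inches
height = aspect_h * unit   # 12.0 inches

lines_per_inch_bvm = 900 / width
lines_per_inch_avg = 400 / width
print(round(lines_per_inch_bvm, 2))   # ~56.25 lines per inch on the BVM
print(round(lines_per_inch_avg, 2))   # ~25.0 on a typical consumer set
```

Cram more than twice the detail into the same 16 inches of glass, and the difference in sharpness is obvious from a foot away.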
"On the Sony BVMs, the TVL (horizontal resolution of the RGB dots/aperture) is so high that you cannot see it from 1 ft away," writes Fudoh. "This makes the picture look very much emulated--basically like a LCD with scanline emulation."
This is why the BVM is almost too perfect, and where nostalgia can raise its (ugly?) head. CRTs typically use shadow masks or aperture grilles to separate their color phosphors, guiding each electron gun to ignite the correct phosphor and create the corresponding color. But shadow masks could heat up and expand during operation, producing blooming.
"Scanlines on a CRT come in many different flavors," writes Fudoh. "The biggest factor is the luminance blooming of the cathode ray beam. The higher the luminance, the more bloom you get and the more nondescript the scanlines get. A BVM hardly blooms. That's why the scanlines are so extremely prominent on those sets. A CRT scanline is always black and the actual visibility of the scanline depends on the luminance and blooming present on the rows above and below."
Factor together that blooming, potential color bleeding from convergence, and the lower horizontal resolution of the average TV, and you have a much softer picture. By comparison, the BVM is sharp as a razor.
The BVM-20F1U's original $10,000 asking price has come down into the much more affordable triple-digit range over the past decade, but finding one isn't necessarily easy. With a little luck, it's possible to snag one on Craigslist or from a local seller for a couple hundred bucks. On eBay, they tend to sell for about $500.
The Almost-But-Not-Quite-as-Good CRT Alternatives
Chasing after perfect picture quality means obsessing over tiny details. In the case of a monitor like the BVM, it means spending hundreds of dollars. But if you just want to get most of the way there, if you're happy with really good instead of incredible, things get a whole lot easier.
For example, instead of hunting down a professional broadcast monitor, you could buy a pretty kickass consumer CRT television.
"Based on my personal experience and research, I would suggest that the Sony KV-XXFV310 (where XX is the size in inches) is the overall best CRT for retro systems," writes Corban. "It has a few features not found on many other models. It has an internal subwoofer, which is surprisingly effective and balanced. The comb filter, used for all composite video systems, is '3D digital,' the most sophisticated type. Finally, and most importantly, this TV has a high voltage power regulator. What this means is that bright scenes, lines, or text will not cause distortion of the image. I have yet to find any other consumer set with this feature."
For something in between the BVM monitor and the KV television model, there's Sony's PVM line of monitors, which were also used in broadcast production. The PVM-20M4U, for example, boasts 800 lines of horizontal resolution.
And as soon as you broaden your focus from TVs ideally suited to 240p consoles, the choices become overwhelming. Corban talked up the advantages of the Sony KV-XXFV310 model in an old NeoGAF thread devoted to the best CRTs, but a number of other posters proclaimed that the Sony FD Trinitron WEGA KD-34XBR960 is the best HD CRT ever made. But moving into HD territory almost guarantees issues or imperfections with low resolution signals--it's better than an LCD for 240p content, but not the same as a monitor dedicated to that resolution.
Also, it weighs 198 pounds.
Ultimately, the size and weight of CRTs, more than their picture quality, paved the way for an LCD takeover. Now we all have giant flatscreens, but our upscaled SD consoles look awful. But that's an inevitable side effect of that increase in resolution, right?
Well, not exactly. Input lag and poor scalers inside most HDTVs make a mess of low resolution inputs, but that's not how it has to be. Even with his eye for detail, Fudoh actually prefers LCDs over CRTs when they're combined with the upscaling magic of an image processor, a device dedicated to converting a video signal from one type and resolution to another.
For years, they were expensive, extremely complex and finicky pieces of technology. But in 2011, a Japanese company called Micomsoft changed that with an image processor called the XRGB-Mini.
Scanlines, meet high-definition television.
The Framemeister XRGB-Mini
"If you like to play classic videogame systems from time to time and have switched to a Flat Panel TV lately you will have noticed that most games look plain ugly," writes Fudoh on Hazard-City.de. "Most of the time it's the horrible deinterlacing combined with the TV's video electronics, which is optimized for video material (movies, TV shows, but not graphics). Deinterlacing is necessary because your old videogames have a 15kHz video output. Since LCD and Plasma displays are progressive by nature, the incoming signal has to be deinterlaced (linedoubled) to 31kHz before it can be displayed. The TV can take over this job, but it won't look nice.
"To get the best picture out of your old systems you can buy an external deinterlacing device which takes the system's video signal, performs some kind of linedoubling and outputs a 31kHz signal, so the TV has just to do a little scaling before it can show the actual picture."
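Line doubling, at its core, just repeats each of the 240 source lines to produce a 480-line progressive frame--no deinterlacing required, because 240p never had alternating fields. This is a minimal sketch of the idea, not the XRGB's actual algorithm; each "line" here stands in for a row of pixel values:

```python
def linedouble(frame_240p, scanlines=False):
    """Double a 240p frame to 480 lines. With scanlines=True, the
    repeated line is blanked instead of copied, mimicking the black
    gaps of a CRT raster."""
    out = []
    for line in frame_240p:
        out.append(line)
        out.append([0] * len(line) if scanlines else line)
    return out

frame = [[255] * 256 for _ in range(240)]   # flat white 256x240 frame
plain = linedouble(frame)
striped = linedouble(frame, scanlines=True)
print(len(plain))        # 480 lines either way
print(striped[1][:4])    # [0, 0, 0, 0] -- the blanked scanline
```

The `scanlines` flag is the same trick processors like the XRGB line use to put CRT-style black lines back into the doubled image.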
Among the 40-odd image processors reviewed on Fudoh's site is the XRGB-Mini Framemeister. He calls it the king of 240p processing. Fudoh and other hardware enthusiasts even built a wiki for the XRGB-Mini to help new users.
Prior to the Mini, Micomsoft released a slightly more versatile but considerably more finicky processor called the XRGB-3. Its menus were naturally in Japanese until Fudoh and the members of the shmups forums helped Micomsoft release an English firmware patch. The Mini offers English support out of the box.
The Mini lacks the variety of inputs of some other image processors, but makes up for that shortcoming in quality and straightforward usability. Fudoh practically gushes in his review:
"The Framemeister truly shines with all 240p signals I've thrown at it so far. 240p signals are recognized on all analogue inputs. No deinterlacing is applied...Without any doubt, the Framemeister takes the top position for any 240p processing devices. By using different output resolutions (480p, 720p or 1080p) you can choose different sharpness levels. On 1080p the processing looks absolutely razorsharp. With the right scaling options it looks just as nice as on the XRGB-3 (in B0 mode). As long as the scanlines are rendered the way they are in 1080p, 720p is my favorite output resolution. It's not as razorsharp as 1080p (because of the TV's additional scaling), but it looks at least as good as the XRGB-3 in B1 mode. With 480p output the picture's still very nice, though a bit softer than the XRGB-3's 480p output - more like what classic Faroudja linedoublers would deliver."
And here's a huge advantage of the XRGB-Mini: speed. As Fudoh explains, HDTVs can introduce serious lag when deinterlacing lower resolution sources:
"Of course an external processor will always ADD delay to your TV's native delay. When you read a review of a modern TV set and read something about lag (usually 16 to 60ms)...what the review sites measure with this is the minimal lag a display can offer. Without additional processing, deinterlacing or scaling. Once you connect an analogue 15kHz source to the same display, the lag will dramatically increase. The display has to do some [analog/digital] conversion, deinterlacing and upscaling...
"Unfortunately many TVs simply don't offer different signal paths depending on the kind of signal. Progressive signals are routed through the same processing stages that are required for interlaced signals--that's what causes unnecessary delay.
"And that's where an external processor can save time. Treating 240p as a truly progressive signal allows the processor to skip the deinterlacing and jump to linedoubling or scaling right away."
The XRGB-Mini is especially fast for an external image processor. It adds between 1 and 10 milliseconds of processing time, which Fudoh calls minimal enough to support the fast response times needed for bullet hell SHMUPs and similar twitch-based games (by comparison, he writes that HDTVs can often take 30-50 ms to deinterlace and process an input).
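Those millisecond figures are easier to judge when converted into frames of a 60fps game--quick arithmetic, using the article's 10ms and 50ms numbers:

```python
frame_ms = 1000 / 60   # one frame at 60fps lasts ~16.7ms

def frames_of_lag(delay_ms):
    """Convert a processing delay in milliseconds to 60fps frames."""
    return delay_ms / frame_ms

print(round(frames_of_lag(10), 2))   # XRGB-Mini worst case: ~0.6 frames
print(round(frames_of_lag(50), 2))   # slow HDTV deinterlacing: 3 frames
```

Under one frame of added lag is effectively invisible even in a bullet hell shooter; three frames means your ship is always reacting to where the bullets were 50 milliseconds ago.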
The Mini's scanlines are optional and adjustable in thickness, which is an advantage over the bold scanlines of the Sony BVM-20F1U. With a little tinkering, a 20-year-old console can look like it was born to run on an LCD.
Remarkably, out of the dozens of image processors reviewed on Hazard-City.de, none of them were really designed specifically for 240p content. "They've all been built to work with 480i signals and some just happen to be good at working with 240p signals," Fudoh writes.
But he's optimistic that that's slowly changing. The Mini is far more accessible than its predecessor, and video hardware enthusiasts are making their own affordable VGA boxes for systems like the Dreamcast (with built-in scanlines to boot).
"The biggest problem today--universally true for cheap scalers from eBay, high-end home theater processors or the XRGB series from Japan--is that people need a lot of custom cabling and none of the devices are tested with all the major vintage consoles," Fudoh writes. "The whole concept of external video processing is still hard to approach, if you don't know how to solder your own cables."
In the future, the best image processor may even come from the community that has, for so many years, obsessively tested out every CRT monitor and deinterlacer it could get its hands on.
"Tech is getting more accessible," writes Fudoh. "Crafty-Mech (of arcadecontrols.com) is currently working on a simple RGB to VGA linedoubler and I'm confident that it will blow most other cheap processors out of the water. On a more advanced level FPGA processing is becoming more affordable as well and projects like the Universal PPU jump to mind (to replace the NES' composite video graphics chip with a RGB and VGA enabled one). Just combine a few approaches like these into one project and it would be easy to create a truly perfect processor at a price tag half of what you have to put on the table today to get near-perfect quality."