Latest Stories: TVs
    The Best Blu-ray Player Today

    After almost 20 hours of testing the best new Blu-ray players for 2014, the $90 LG BP540 came out on top now that our previous pick has been discontinued. The LG fits our criteria for a good player thanks to integrated Wi-Fi and the most popular streaming apps. More importantly, it has a better interface and video quality than the competition and offers the best combination of price and performance of those we looked at.

    Who am I to make that claim? I’ve been handling almost all the Blu-ray reviews for Secrets of Home Theater and High Fidelity since 2010 and have had nearly three dozen players come through my hands. I’ve subjected them to countless objective and subjective tests. I’ve even thrown them on a $15,000 HDMI Analyzer to verify their performance. Often, as is the case with the LG, the picture from a cheap player is 100 percent identical to an $8,000 player’s.

    If the LG BP540 sells out, the $90 Sony BDP-S3200 is our runner-up choice, and it’s almost as good. The menu system is more confusing than our top pick’s and the overall interface leaves a lot to be desired, but it offers a wide selection of streaming content and handles Blu-ray content very well. Be warned, though: The Sony shows some jaggies when playing DVD content with diagonal lines.

    With more expensive players, you’re usually paying for better CD playback quality or niche features. Along those lines, and if you also want the absolute best in audio and video quality, the $600 Oppo BDP-103D is the best high-end player you can buy. It has better DVD scaling than any other tested player, performs flawlessly even with foreign content and weird frame rates, and supports all audio formats as well. The integrated Darbee video processing is a favorite of most reviewers, including video purists, and Oppo has better service and support than other companies. For most people, though, the price difference isn’t justified.

    If you only want Blu-ray playback and don’t care about streaming whatsoever, the Samsung BD-H5100 is our step-down choice at $63. It does fine with Blu-ray content and the lack of Wi-Fi saves you some money, though it also means you’ll have to perform firmware updates manually or have hardwired Ethernet to do so. You’ll want to have updated firmware since it may affect your ability to play newer Blu-ray discs in the future.

    Our pick from 2013, the Sony BDP-S5100, would still be our recommended pick if it were still being manufactured, but alas, it is not. It was less expensive than the LG, had the same streaming options, and loaded discs faster. If you bought our pick from last year, or you happen to find it somewhere on closeout, there is no real need to upgrade.

    The Best Television You Can Buy Today

    If I were in the market for an awesome television, I’d get the Samsung F8500 series, in either 51-, 60-, or 64-inch sizes (about $1,800, $2,400, or $3,100, respectively). This is a fantastic-looking television, with punchy brights, deep darks, lifelike and accurate color, excellent detail, and great performance in rooms with lots of light. While pricey, it has one of the best pictures of any TV in recent years, according to all the major TV reviewers.

    The F8500 is likely the last great plasma TV (more on this later). We think that those looking for the “best” TV will love the F8500. Its combination of a bright image, dark black levels (and correspondingly high contrast ratio), lack of motion blur, and highly realistic color make for an addictively gorgeous image.

    If it doesn’t fit the bill, we have some other options that may suit you. However, it’s still early in the year for TV reviews, so we strongly recommend you wait if you can. We can recommend some “good” TVs, but we won’t know which one is the (truly) best runner-up until more models are reviewed.

    The Samsung F5300 is a good step-down pick if you want to save at least $1,000 (or more, depending on which size you buy). It’s not as bright and doesn’t have as good a contrast ratio as our pick, but it still has very good picture quality.

    If stepping down, we recommend the F5300 from Samsung, which costs much less, though it doesn’t have quite the same level of picture quality. It comes in 51-inch ($1,000 cheaper), 60-inch ($1,500 cheaper), and 64-inch ($1,800 cheaper) screen sizes. The F5300 isn’t as bright as the F8500, doesn’t have as good a contrast ratio, and doesn’t look as good in bright rooms, but it still has very good picture quality.

    If saving a lot of money is your goal, we recommend getting our pick for Best $500 TV, which is only 720p but has excellent picture quality for the price. And it is, you know, $500. Similar to the F5300, the F4500 (our $500 pick) isn’t as bright as the F8500, nor is its contrast ratio as high. And it’s got that lower resolution of 720p (the F8500 and F5300 are both 1080p sets). So the F8500 looks a lot better, for a lot more money.

    The Best 4K Monitor Doesn't Exist Yet

    Like 1080p before it, 4K is the new, ultra-high-resolution format that promises better detail and greater image clarity due to the huge number of pixels packed into your screen. “Buttery-smooth text rendering and wonderfully detailed photos,” promises MakeUseOf. Just consider the quality differences between Apple’s Retina Display MacBooks and its standard MacBooks: it's the same pixel-increasing principle.

    That said, we don’t think it’s the right time to buy one.

    While most 4K monitors are still very expensive, we’re starting to see a growing number priced under $1,000: Samsung’s $700 U28D590D, Dell’s $700 P2815Q, and Asus’ $650 PB287Q are already available. Intel and Samsung even recently announced a partnership in which they’ve pledged to try to push high-quality, 23-inch 4K monitors down to a super-low price of $399. We think it’s worth waiting for some of that to pan out rather than springing for an expensive early-adopter monitor right now (and though a 23- or 24-inch 4K display is smaller than most people will want, we can only hope that Intel and Samsung’s ambitions will push down prices on larger displays).

    Even expensive 4K monitors struggle with the same major weaknesses right now: outdated display connections, beefy hardware requirements, and lack of OS/application support. Cheap 4K monitors can have all those problems and more, sacrificing image quality in order to cut costs.
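
    To put rough numbers on the “outdated display connections” problem, here’s a minimal back-of-the-envelope sketch (our own arithmetic, not a figure from any manufacturer) comparing the raw pixel data rate of 4K video against the approximate usable video bandwidth of HDMI 1.4 and DisplayPort 1.2:

        # Raw pixel data only; real links also carry blanking intervals and
        # protocol overhead, so actual requirements are somewhat higher.
        def raw_gbps(width, height, fps, bits_per_pixel=24):
            """Uncompressed video data rate in gigabits per second."""
            return width * height * fps * bits_per_pixel / 1e9

        uhd_30 = raw_gbps(3840, 2160, 30)   # ~6.0 Gbit/s
        uhd_60 = raw_gbps(3840, 2160, 60)   # ~11.9 Gbit/s

        HDMI_1_4_VIDEO_GBPS = 8.16     # approximate usable video bandwidth
        DISPLAYPORT_1_2_GBPS = 17.28   # approximate usable video bandwidth

        print(f"4K@30: {uhd_30:.1f} Gbit/s vs. HDMI 1.4's ~{HDMI_1_4_VIDEO_GBPS} Gbit/s")
        print(f"4K@60: {uhd_60:.1f} Gbit/s vs. DisplayPort 1.2's ~{DISPLAYPORT_1_2_GBPS} Gbit/s")
        # 4K at 30 Hz squeaks under HDMI 1.4's ceiling; 4K at 60 Hz does not,
        # which is why early 4K monitors lean on DisplayPort 1.2 (sometimes
        # with multi-stream tricks) to reach 60 Hz.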

    TV Quest 2014: Shopping for a Bedroom Television

    A month ago, I decided that I needed a TV for my bedroom. And then two weeks later, I bought that TV, a 40-inch Vizio E400i-B2. As far as "TV Quests" go, it was a relatively short process from shopping research to purchase. Not because I was in a rush to buy a TV, or because I didn't care about the quality of the set. What I found is that finding the right TV for your needs is actually easier than ever. With guides like those on TheWirecutter and a plateauing of unnecessary optional features for 1080p sets, it's not difficult to match up your shopping criteria with what's available on the market. The hurdle to TV shopping is coming to terms with what you need, and having an understanding of how those requirements are addressed by the various TV manufacturers. More often than not, that also means adjusting your expectations and preconceptions of those needs when you see the TV in person.

    This is a walkthrough of how I came to choose the Vizio set, how using it has changed my perception of built-in apps, and why I bought my TV at a brick-and-mortar store instead of ordering one online.

    The Best $500 Projector Today

    If I were in the market for a $500 projector, I’d buy the Acer H5380BD. I base this on 40 hours of research and objective testing with over $20,000 of test gear, plus side-by-side comparisons with the main competition. The H5380BD offers the best overall picture quality in its class; aside from a drop in resolution (720p instead of 1080p), our pick here is surprisingly comparable to our $1,000 projector pick. The H5380BD is bright, has a decent contrast ratio, and its color accuracy is similar to that of other projectors in its price range.

    Fed decent video content (like Blu-ray), the H5380BD puts out an extremely watchable image. And its input lag is low—faster than most TVs and high-end projectors—making it a good choice for gaming.

    However, the margin between the top contenders was very slim. The H5380BD is very similar to the $389 Vivitek D557W and $604 D803W. All perform reasonably well, given their prices, and each has its own picture quality issues. So while we feel the H5380BD is the best for reasons we’ll explain in this guide, the difference between it and its two main competitors is small.

    The Acer H5380BD compares well to our favorite $1,000 projector for half the price. It’s bright, has good contrast ratio, and has great overall picture quality.

    If for whatever reason the Acer becomes temporarily unavailable, we recommend getting the aforementioned $389 Vivitek D557W. It doesn’t look quite as good, with slightly more washed-out colors and a lower contrast ratio, but its picture quality is fairly similar, it’s just as bright, and it’s a bit cheaper.

    If you’re susceptible to or simply hate the DLP artifact known as “rainbows,” check out the Epson 730HD. It’s brighter than our pick but has a much worse contrast ratio, so it doesn’t look as good overall. It’s LCD, though, so no rainbows.

    If you have a small room or want a “short throw” projector that only needs to be a few feet from the screen, consider the $589 Optoma GT760. It puts out a similar image to the Acer and Vivitek, but isn’t as suitable for a more traditional projector/screen placement.

    The Best Indoor HDTV Antenna (For Cities) Today

    According to our tests, the HD Frequency Cable Cutter is the best-performing indoor antenna you can buy if you live in the thick of a city. It outperformed 12 other models in midtown Manhattan as part of a test pool that included both amped and unamped antennas.

    In Manhattan, the unamped Cable Cutter pulled in the most stations with very little interference and offered a perfect-looking picture for many channels. The antenna also fared well in our follow-up tests in Chicago and the San Francisco Bay Area.

    However, it’s worth noting that the antenna didn’t perform well in our Brooklyn tests, where it finished near the bottom of the pack (although it performed decently in subsequent Chicago and Bay Area suburban tests). This antenna is also pretty big, costs $90 to $100, and doesn’t come with a stand.

    If you live more than 10 miles from a broadcast tower, have a ground-level unit, or care how your antenna looks in your living room, you should go with one of the amplified models that also did well in our tests: the Mohu Curve 50 or the budget-priced but high-performing Monoprice 7976 MDA Indoor/Outdoor Antenna With Low Noise Amplifier.

    The Trouble with Antenna Recommendations

    TV antennas are notoriously hard to recommend; a recent Consumer Reports roundup concluded that they couldn’t really rank antennas based on performance.

    You might be best off trying the cheapest antenna and then upgrading to our higher-priced recommendation if the cheapie isn’t up to snuff.

    That’s because there are a lot of variables to consider: how close you are to a broadcast tower, which direction your window is facing, how many tall buildings are between you and the transmitter, what the terrain is like in your immediate environment, which stations are most important to you, how much you’re willing to spend, and whether you care what the antenna looks like. When you throw in the unpredictable performance variations between locations, it’s nearly impossible to come up with a “one size fits all” pick.

    We knew all this going into our tests, but that’s exactly why we wanted to take on the challenge. So while we recommend the Cable Cutter and the two alternate picks for city-dwelling folks, be prepared to experience very different results in your own location. You might be best off trying the cheapest antenna you can find, seeing how it performs, and then upgrading to our higher-priced recommended models if the cheapie isn’t up to snuff.

    The Lasting Legacy of the DIVX Disc

    Do you remember the DIVX disc? DIVX, not to be confused with the video codec DivX, was a movie rental scheme that Circuit City and some law firm cooked up to try to disrupt the video rental market years before Netflix streaming existed.

    For $4 or $5, you could buy a movie on a DIVX disc at Circuit City, Good Guys, Future Shop, or another retailer, then take it home to watch it in a special DIVX player. The player would connect to the Internet over a dial-up phone line to authorize playback of that movie for a short period of time. You could watch the movie as many times as you wanted during that window, but once your time was up, you'd have to "rent" the movie again for a few more bucks.

    Photo credit: Flickr user weirdo513 via Creative Commons, from PAX East 2011.

    Sound familiar?

    DIVX ultimately failed, likely because of the upfront cost and quality issues with the actual films. To play the discs, you had to buy a player that cost $100-150 more than a DVD-only player, and you had to run a dedicated phone line to the box. Most DIVX versions of movies were lower quality than their DVD counterparts. The DIVX discs usually contained cropped pan-and-scan versions of the film, rather than the anamorphic widescreen that was becoming common on DVDs. The discs also lacked extra features--they didn't contain making-of documentaries, deleted scenes, or audio commentaries.

    People also had privacy and ecological concerns with the format. They feared that the DIVX player would spy on their behavior, uploading their disc viewing activities during its regular calls home to the DIVX mothership. There were also concerns about the wastefulness of a disc-based format designed for a single rental. After the rental period expired, the discs were essentially worthless, and people were concerned that if DIVX succeeded, our landfills would end up filled with one-use plastic discs holding copies of Speed 2: Cruise Control and Enemy of the State.

    Photo courtesy eBay user imodify.

    So if DIVX was such a bad idea, why am I talking about it today? It started with this Twitter post, from Dave Pell. Dave, who is responsible for the excellent Next Draft newsletter, is one of many people who have complained about the hidden catch of the 24-hour time limit. His complaint is that when he starts watching a film on a weeknight evening, if he doesn't finish it during that sitting, he won't have a chance to come back to it until the 24-hour rental period has expired. Unless he pays another $6, he'll never see the nail-biting conclusion to The Adjustment Bureau. I wanted to find the origin of the 24-hour rental window, as it exists on iTunes, Amazon's Instant Video, the Google Play store, Microsoft's Xbox Video, and pretty much every other on-demand video rental service I've seen*.

    In Search of Scanlines: The Best CRT Monitor for Retro Gaming

    [Editor's note: This story was originally published on July 11, 2013. We're resurfacing it this week as part of our tribute to the great feature work that writer Wes Fenlon has done with Tested, as he embarks on his new career in games journalism.]

    Is there such a thing as the best television set ever made? If so, would that honor go to today's incredibly thin 80-inch OLED flatscreens? Or the 4K-resolution television sets just arriving on the market, which pack in 6,220,800 more pixels than 1080p screens? Defining "best" is difficult. A higher density of pixels allows 4K TVs to display a more detailed picture, but what happens when you plug in an old Super Nintendo, which outputs a mere 57,344 pixels? Poor Super Mario World has to be upscaled to more than eight million pixels, and the resulting image can look terrible--blocky, blurry, and all but indistinguishable from how it looked on a CRT TV back in 1994.
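
    The arithmetic behind those pixel counts is easy to check; here's a quick sketch (my own numbers, using the SNES's standard 256x224 output and the usual 1080p/UHD resolutions):

        # Sanity-checking the pixel counts mentioned above.
        uhd     = 3840 * 2160   # 8,294,400 pixels ("more than eight million")
        full_hd = 1920 * 1080   # 2,073,600 pixels
        snes    = 256 * 224     # 57,344 pixels (the Super Nintendo's typical output)

        print(uhd - full_hd)    # 6,220,800 more pixels in 4K than in 1080p
        print(uhd // snes)      # each SNES pixel must cover ~144 pixels on a 4K set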

    Adopting new television technology means saying goodbye to the advantages of older hardware. And yes, there are advantages. There's no such thing as a best TV for all eras of content. But here's a question that's actually possible to answer: What's the best display specifically for retro video games?

    Photo credit: Flickr user artemiourbina via Creative Commons

    Now we've got some parameters to work with. The TV needs to handle low-resolution inputs at the proper framerate and aspect ratio, without lag, and with accurate colors. And, of course, it needs to be able to display visible horizontal scanlines, a defining visual element of the way retro games were seen and played.

    Out of thousands and thousands of models, the single best TV for retro games is quite possibly the Sony BVM-20F1U, a 20-inch broadcast production monitor that cost about $10,000 when it was introduced in the late 1990s. It is, of course, a CRT, and it's a 15 kHz display, meaning the highest resolution signal it accepts is 576 interlaced lines, or 576i. That limitation, however, makes it absolutely perfect for everything from the Nintendo Entertainment System to the first PlayStation.
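
    That "15 kHz" figure is what caps the resolution: the horizontal scan rate limits how many lines the monitor can draw per second, and therefore which signals it accepts. A rough sketch of the relationship, using standard NTSC and PAL timings:

        # Total lines per frame = horizontal scan rate / frame rate.
        ntsc_lines = 15734 / 29.97   # ~525 total lines -> 480 visible (480i)
        pal_lines  = 15625 / 25.0    # 625 total lines -> 576 visible (576i)
        print(round(ntsc_lines), round(pal_lines))
        # At roughly 15,000 lines per second, the monitor tops out at
        # standard-definition line counts, which is why 576i is its ceiling
        # and why 240p/480i console signals map onto it so naturally.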

    In fact, it may be a little too perfect.

    "The problem is that most people [in the 80s and 90s] had television sets at home which in no way resemble what a high-end CRT looks like if used today," writes Tobias Reich, who has been experimenting with video hardware for more than a decade. "For example, take this comparison shot (both taken from real CRTs): you get a high-end Sony BVM on the left and an arcade Nanao 15khz chassis on the right. Quite a difference, right?"

    Image courtesy Fudoh/Hazard-City.de

    On his website Hazard-City.de, Tobias Reich--who goes by the handle Fudoh online--has been compiling a 40,000 word (and counting) guide to deinterlacing, scaling, and processing game console video since 2008. He's also a regular poster at the shmups.system11.org forums, which he calls "the best hardware discussion board on the net." The shmups forums offer a window into the world of diehard collectors, who hunt for old CRTs and expensive scaling hardware, and then tweak, tweak, tweak in search of that perfect picture.

    48 FPS and Beyond: How High Frame Rate Films Affect Perception

    [Editor's note: This story was originally published on Dec 20th, 2012. We're resurfacing it this week as part of our tribute to the great feature work that writer Wes Fenlon has done with Tested, as he embarks on his new career in games journalism.]

    Bilbo Baggins waddles down the dimly lit hallway of his cozy hobbit hole, its cramped quarters instantly familiar even though they've lain dormant since 2003's Return of the King. Familiar, yet something feels off. The old hobbit moves too quickly, and as he opens a chest, peering fondly at the relics of his adventures collected within, I expect him to pull out a well-worn trinket and suddenly appear on the set of Antiques Roadshow. Or to reminisce himself into a flashback and be carried away into a PBS Revolutionary War reenactment, where Bilbo's outfit wouldn't look entirely out of place.

    In its opening moments, The Hobbit's 48 frames per second cinematography overwhelmingly reminds me of a public broadcast television program, filmed at a slightly-too-fast 30 frames per second.

    Since silent films gave way to talkies in the 1920s, 24 frames per second has been the standard frame rate in the film industry. 24 fps is not the minimum required for persistence of vision--our brains can spin 16 still images into a continuous motion picture with ease--but the speed struck an easy balance between affordability and quality. For nearly a century, cinema has trained us to recognize 24 frames per second as a reflection of reality. Or, at least, a readily acceptable unreality.
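
    In raw timing terms, the frame rate just sets how long each image stays on screen; a quick sketch of the frame durations in play here:

        # How long each frame is held on screen at the rates discussed above.
        for fps in (16, 24, 30, 48):
            print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
        # 16 fps -> 62.5 ms  (roughly the persistence-of-vision floor)
        # 24 fps -> 41.7 ms  (the long-standing cinema standard)
        # 30 fps -> 33.3 ms  (the video-tape/broadcast look)
        # 48 fps -> 20.8 ms  (The Hobbit's HFR presentation)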

    Doubling that speed to 48 frames per second removes the motion blur and strobing of fast-moving images, argued Peter Jackson in 2011. "3D shows you a window into reality; the higher frame rate takes the glass out of the window," said James Cameron. But to the average viewer, 48 fps looks like an exaggerated version of a television program shot at the common video tape speed of 30 fps.

    The stigma around higher frame rates leads to an important, and extremely complicated, question about how we perceive film: Why does 48 frames per second look so weird?

    "[Film] is the medium we're exposed most to in our everyday life and it has evolved very rapidly in the last 100 years to permeate all aspects of visual culture. And yet so little is actually known of the psychology of viewers and how we make sense of what's presented on the screen in front of us," says Tim J. Smith, a lecturer in the psychological sciences department at Birkbeck University in London. Smith specializes in film cognition, studying how our brains process images and how perception interacts with the world of film.

    In studying film cognition, Smith worked to link the language of film--moviemaking conventions and guidelines like the 180 degree rule--to their cognitive foundations. He came up with the attentional theory.

    "The basic idea is that in the first few decades that film was around, at the start of the 20th century, filmmakers went through a rapid phase of self-experimentation," he says. "There were so many things they could do with the camera and with editing that they would try out things and see how they worked on themselves and see how audiences liked it. What they were doing was seeing which techniques were acceptable to their own visual system, which things made it easier for them to see what was happening on the screen and to make sense of the narrative. The things they experimented with that didn't work didn't get picked up by other filmmakers, so they died out very rapidly. You had this very rapid standardization towards how to shoot a scene and actually edit it together."

    After about 90 years, that standard of the film language may be rewritten.

    Part of that standardization was the frame rate of 24 frames per second. Now, after about 90 years, that part of the film language may be rewritten. The audience will have to learn to read again--and judging by The Hobbit's 48 fps presentation, filmmakers will likewise have to relearn how to write. Smith helped shed some light on the psychology of high frame rate film and why our brains so vehemently reject it. Read on.

    Why I've Changed My Mind about Connecting My DVR to My Xbox One

    I almost didn’t change my mind about watching TV through the Xbox One. After my initial testing last year, I didn’t think using the console to control my TV viewing was a good idea. Using the Xbox as an extremely limited universal remote wasn’t appealing; I already own a very good universal remote. The Xbox One TV experience is fraught with problems--having to use a game controller instead of my trusty remote to change channels was a drag, the Xbox didn’t have access to the commands needed to navigate the lists of shows recorded on my DVR, and using the voice controls to find or change channels was usually more frustrating than convenient. The limited functionality that Microsoft adds to TV viewing was disappointing, so I wasn’t planning on reconnecting the TiVo to the Xbox.

    After talking to my wife, I decided to give it another try. She hated using the TiVo filtered through the Xbox, but she was into the voice controls. So when I hooked the Xbox back up, I made one important change. Instead of relying exclusively on the Xbox’s inadequate controls to work the TiVo, I left two separate control profiles set up on my Harmony remote to work with the Xbox. One profile keeps TiVo’s dedicated controls, and the other mimics the Xbox controller for navigating Blu-ray menus, Netflix, and other apps. Instead of using the Xbox as the only remote for the DVR, I found it worked much better as a supplemental remote.

    CES 2014 Impressions: Sony's Head-Tracking HMD Prototype

    Had to share my thoughts on this one. Sony's president surprised us by announcing a prototype head-tracking system at his CES keynote, and by actually having demo units of that system at the company's booth. If you've heard about this prototype, you've no doubt heard about it in the context of an Oculus VR competitor. That's technically an apt description, but a better one would be to call it a reaction to the Oculus Rift. Sony's head-tracking system is no more than a motion tracker attached to the back of its existing third-generation HMZ-T3 head-mounted display, connected to a smartphone over Bluetooth. And after trying the demo, it feels like something rushed through development for CES, a me-too response to the success and buzz of the Oculus Rift.

    As with many other hardware prototypes we've seen at CES, technical details were scarce. The Sony rep running the demo couldn't say what kind of motion sensor was being used, nor did he have any details about latency goals. The HMZ-T3, which retails by itself for $1,000, was made for watching movies and television shows, not for enveloping the wearer's field of view with video. Two 720p OLED displays give the illusion that you're sitting in front of a very large television, filling about 45 degrees of your field of vision as opposed to the Oculus's 110 degrees. That difference alone detracts significantly from the immersion, and the Sony rep couldn't say whether there was any plan to put the head-tracker on a different display in the future.
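
    Some back-of-the-envelope geometry (mine, not Sony's numbers) shows what that field-of-view gap means in practice: the width of the "virtual screen" you perceive grows with the tangent of half the viewing angle.

        import math

        def apparent_width(fov_deg, distance_m):
            """Width of a flat virtual screen that fills the given horizontal
            FOV when viewed from the given distance (simple flat-screen geometry)."""
            return 2 * distance_m * math.tan(math.radians(fov_deg) / 2)

        # At an assumed virtual viewing distance of 3 meters (for illustration):
        print(f"45 deg:  {apparent_width(45, 3.0):.1f} m wide")   # ~2.5 m -- a big TV
        print(f"110 deg: {apparent_width(110, 3.0):.1f} m wide")  # ~8.6 m -- wraps into your periphery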

    The tracking itself was serviceable, but definitely underwhelming. Sony's demo was a pre-recorded video sequence of a race on city streets, running off of a proprietary app on an Xperia phone. The video was shot with a 180-degree field of view, so I could look around to my left and right during the sequence, accompanied by stereo audio. If I moved my head slowly, the tracking was able to keep up and maintain an illusion of VR, but any sudden shakes or nods would reveal the prototype's latency limitations. I can't pinpoint exactly how long the delay was, but it was noticeable without much effort--my guess is between 50 and 100ms. The size and placement of the motion tracker at the back of the HMZ-T3 also suggests that it tracks only rotational movement, while Oculus has already made inroads into positional tracking for its latest prototype.

    As a proof-of-concept, Sony's head-tracking showing is more a foot in the door in an exciting VR space than an indication of any real (or good) product. It faces stiff competition not only from Oculus, but from other HMD makers who have head-tracking features as well (Avegant's Kickstarter HMD will be $500). But competition is good, and the best thing to come out of Sony's announcement and prototype is the validation of this new generation of VR. Momentum is growing.

    CES 2014 Impressions: Curved and Bendable Televisions

    There were two dominant trends in televisions at this year's CES. First was 4K--every TV manufacturer had an Ultra HD TV at the show, alongside some 5K televisions (21:9 aspect ratio) and 8K prototypes. We get the push for 4K, but whether or not it'll make sense for consumers to upgrade is still up in the air. The other big trend was curved televisions. Both Samsung and LG had fleets of curved TVs at their booths, using various panel technologies at a range of resolutions. What they didn't have were definite answers to the question of why a curved screen is inherently better than a flat one; Samsung's many product managers would only stick to the line that a curved screen provides "the ultimate immersive experience." Of the five product reps manning the display section of Samsung's booth, all but one stuck to that singular non-explanation.

    One rep was able to be a little more specific, but not by much. He talked about how a curved TV mimics the curvature of a cinema's screen, and how a curved display gives better off-angle viewing clarity than a flat screen. There's an illusion that the screen is bigger than it actually is, I was told. But when I asked how the actual curvature of a screen is determined in the design and manufacturing process, I just got shrugs.

    Samsung's reps couldn't explain exactly how curved their screens were, or whether larger models followed the same curvature as smaller ones. The one curvature specification the booth provided was a radius number--13.7 feet for a 65-inch model. A rep clumsily explained that you have to envision a full circle with a radius of about 13 feet, and that the television represents a slice of that circle. I asked if that meant the ideal sitting distance from that screen was 13 feet, so you would be in the "center" of that circle, but was told that it was actually half that distance, around 6 feet. For Samsung's bendable television prototype (it goes from flat to curved and vice versa, but no one could say why you would want to watch a flat image if curved was inherently better), the rep listed the radius spec as around 4000 millimeters. Huh.
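
    For what it's worth, those two curvature figures are at least consistent with each other--it's just a unit change:

        # Comparing the two radius figures quoted at the booth.
        MM_PER_FOOT = 304.8

        radius_65in_mm = 13.7 * MM_PER_FOOT      # ~4,176 mm for the 65-inch set
        radius_bendable_ft = 4000 / MM_PER_FOOT  # ~13.1 ft for the bendable prototype

        print(f"{radius_65in_mm:.0f} mm vs. {radius_bendable_ft:.1f} ft")
        # So "13.7 feet" and "around 4000 millimeters" describe roughly the
        # same curvature; the reps were just switching units on me.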

    At LG's booth, I found a rep who was slightly more helpful. His explanation was that the curvature of their screens matches the curvature of the eyeball, so the image encompasses more of your field of view when you sit at the right distance--which for a 77-inch model was pegged at 6 feet. That made a little more sense, and this rep was also able to confirm that these panels are cut from essentially the same stock as flat displays, just put behind curved glass in the later stages of the manufacturing process. The curving of the screen also doesn't bring the pixels closer together, so it's not about increasing fill factor.

    After staring at curved televisions for about 15 minutes--Samsung had a setup placing a curved screen next to a similar flat one--I wasn't able to discern the tangible benefits of the shape. One setup used software to dynamically enhance contrast in different parts of a scene to give an illusion of depth, but it just looked like a really saturated image. The advantages of curved screens lie in illusions of perception, but the real trick may be getting people to care (and pay) for a feature that may not matter much at all.

    Additional photos of the curved televisions I saw this week below.

    What You Should Know About PlayStation Now

    Sony PlayStation Now sounds like a schmaltzy documentary, but it's actually the implementation of Gaikai we've been anticipating since Sony bought the game streaming company. Beginning this summer (or in late January if you're a beta tester), PlayStation Now users will be able to stream PlayStation 3 games from vast server arrays to their PlayStation 3s, PlayStation 4s, PlayStation Vitas, or 2014 Bravia TVs.

    Support for all of those platforms won't happen at once; Sony's blog explains streaming will begin with PS3/PS4 consoles, come to the Vita next, and then Bravia TVs. After that, PlayStation Now will expand beyond the land of Sony hardware, which means tablets and smartphones. Android's almost certainly a given, but iOS and PC/Mac web browsers could be targets, too. Gaikai's original demo made Mass Effect 2 playable in a browser.

    Photo credit: Sony Electronics Flickr.

    Sony purchased Gaikai in July 2012. In 2013, Sony announced that the PS4's x86 architecture, a major departure from the PS3's PowerPC Cell processor, ruled out backwards compatibility. The solution was streaming old games from the cloud via Gaikai, but the technology wouldn't be ready for launch, and that was about all we heard about the service for the rest of the year.

    Theoretically, PlayStation Now could allow gamers to stream thousands of PlayStation games, from the PS1, PS2, and PS3 to modern hardware. But there are a lot of variables we don't know about. For a good gaming experience, PlayStation Now will need to be low latency, and that will be affected by how big the data centers are, where they're located, and the speed of the end user's broadband connection. With low bandwidth, games are going to be laggy and artifacted.

    Even with a blazing 50 or 100 megabit connection, the pressure will be on Sony to deliver a high bitrate stream at as low a latency as possible. Sony recommends at least a 5 megabit connection, but the specifics will likely evolve a bit after PlayStation Now goes into beta in late January.
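
    For a sense of scale, a sustained 5-megabit stream works out to a fairly modest amount of data (rough arithmetic, not Sony's figures):

        # Approximate data usage for a continuous game stream.
        def gb_per_hour(megabits_per_second):
            return megabits_per_second * 3600 / 8 / 1000  # Mbit/s -> GB per hour

        print(f"{gb_per_hour(5):.2f} GB/hour at the 5 Mbit/s minimum")          # 2.25 GB
        print(f"{gb_per_hour(12):.2f} GB/hour at an assumed 12 Mbit/s stream")  # 5.40 GB
        # Long play sessions add up quickly against a typical monthly data cap.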

    PlayStation Now leads off with PS3 games, and we don't know if that library will include just first-party titles or a wider selection of games. Likewise, PS1 and PS2 games haven't been announced for streaming, but they seem likely in the future. Sony's blog states games will be rentable individually, but a subscription option will also be available. PlayStation Now will likely tie into Sony's PlayStation Plus service in some way, but that all-you-can-eat subscription won't be a free giveaway.

    Early reports about Now from CES 2014 are positive. Polygon writes "Performance in games like The Last of Us and God of War: Ascension was impressive. Lag input was noticeable, seemingly more so on Vita when moving The Last of Us' Joel and waiting a beat for him to respond, but more than playable. Even the higher frame rate, faster paced action of Ascension was playable, though compression artifacts and more muted colors were present."

    A couple more tidbits: Now users will be able to play multiplayer games as normal, against or with players running disc or downloaded versions of a game. The Vita's rear touch panel will compensate for its missing triggers and clickable stick buttons. The DualShock 3 will sync with 2014 Bravia TVs over USB or Bluetooth.

    If you want to be among the first to get "exclusive information" about PlayStation Now--like how to sign up for the beta, perhaps--Sony's got an email form for you to fill out.

    In Brief: Vizio and Dell Bring Down the Price of 4K

    There are many barriers to 4K display adoption. For televisions, 4K only makes sense for consumers if four major criteria are met: content needs to be shot in 4K, edited at that resolution, broadcast or delivered in a mainstream format, and, finally, the televisions themselves have to be affordable. Content providers like Netflix are working on the first three of those criteria--e.g., with its House of Cards Season 2--and it'll be a year or two until 4K TVs come down dramatically in price. Or maybe not. This week, both Vizio and Polaroid announced 50-inch 4K televisions for $1,000, well below what LG and Samsung have been pricing their Ultra HD sets at. From reports, Vizio's 4K TV looks more promising in terms of image quality, though there are many unknowns, such as refresh rate and input options.

    On the PC side, desktop operating systems and web content can scale to whatever resolution a monitor supports, so 4K adoption there is more a factor of price. Dell's 28-inch Ultrasharp P2815Q monitor was just confirmed for $700, well below the sub-$1,000 promise that Dell made late last year. That's an incredibly attractive price for a 3840x2160 display, and it may get me to trade up from my current 30-inch 1600p monitor.

    The Best Small TV Today (32 Inches)

    The $298 Samsung UN32EH5300 is the best small TV. For not much more than a decent 720p display, you get full 1080p resolution and smart TV functionality built in. Judging by the non-smart version, it should have better image quality than competitors as well. The catch is that it’s a 2012 model; it only recently dropped in price and therefore could go out of stock soon. In that case, we have other picks as well.

    Our old pick, the Vizio E320i-A0, is a current model that costs $288. That’s a great deal for a 720p panel with Netflix and other streaming content, but the Samsung is better at the moment. We’re changing the pick because the UN32EH5300 cost well over $350 until just recently (which is too much for most people to pay for a TV of this size). It’s likely that the price drop is intended to clear out stock to make room for the 2014 models. Until those arrive, the Samsung is the better buy.

    We also recommend the Samsung UN32F5000 if you want a thinner TV to hang on the wall. The image quality should be very similar to the EH5300’s, and because it’s edge-lit, it’s almost 2 inches slimmer. It does, however, lack the smart TV features. There is just a general lack of thin 720p sets, so we are recommending a 1080p set instead.

    If you already have a streaming device to use, you should save even more and get the Samsung UN32EH4003. It is only 720p, but it has a nice image and is the cheapest 720p set out there from a major manufacturer today. The main caveat is that it won’t work with a sound bar.

    The Woman Who Recorded 35 Years of News on 140,000 Tapes

    It's weird to think of television being permanently lost. Today we can access modern television broadcasts and movies in so many formats, on so many devices, that video feels eternal. But for decades, television broadcasts, particularly news broadcasts, weren't recorded or preserved. Many of them are gone forever, unless they were saved by private citizens like Marion Stokes. FastCo has the story of Stokes, who began recording news broadcasts onto VHS tapes in 1977. Once she started, she never stopped.

    Stokes died in late 2012, but she left behind a staggering archive of 140,000 VHS tapes packed into four shipping containers. Her legacy is a vast archive of television news, potentially totaling somewhere in the vicinity of 800,000 hours. Before she began religiously archiving the news, Stokes was a librarian and co-produced a television show. Recording the news eventually became the cornerstone of her life--she would run as many as eight televisions and VCRs in her home at once, feeding in new tapes every six hours.
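
    Those figures hang together, too--roughly six hours per tape across 140,000 tapes lands right around the estimated size of the archive:

        # Rough sanity check on the size of the Stokes archive.
        tapes = 140_000
        hours_per_tape = 6                 # tapes were swapped about every six hours
        total_hours = tapes * hours_per_tape
        print(total_hours)                 # 840,000 -- close to the ~800,000-hour estimate
        print(total_hours / 24 / 365)      # ~96 years of continuous footage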

    Photo credit: Flickr user comedynose via Creative Commons

    It's the kind of obsession that could have easily gone to waste after Stokes passed away; those tapes could've been trashed or left to rot away in storage. Thankfully, that didn't happen. Roger Macdonald, who oversees the television branch of the Internet Archive, found out about the collection and reached out to Stokes' son. Now the Stokes estate is shipping the 140,000 VHS tapes to the Archive in Richmond, California, where it will take years to digitize them all. Hopefully, that will eventually lead to the entire collection being available online.

    Photo credit: FastCo

    John Lynch, the director of the Vanderbilt Television News Archive, explained why Stokes' collection is such an important slice of history. Fastco writes: "Early broadcast news isn’t easy to find, Lynch says, because while networks often did a good job of archiving the footage they used to make the show, they were less meticulous about saving the show itself--a pattern he attributes to 'a sense of modesty on their part.' More recent news reports are more likely to be available from stations themselves, but stations typically charge an access fee."

    When cable news became popular, Stokes recorded CNN, Fox, CSPAN, MSNBC and CNBC, catching as much of the 24-hour news cycle as she could. As the Internet Archive digitizes her collection, hopefully we'll be able to see more than the news about any given historical event--we'll be able to see how many different news organizations covered that event, and potentially trace the impact those broadcasts had on public perception and popular culture.

    Hands-On with Avegant's Virtual Retinal Display Prototype

    I was admittedly a bit skeptical when Grant Martin of startup Avegant reached out to me about a new head-mounted display the startup has been developing for living room entertainment. We've tested other wearable displays that want to replace your big-screen TV, like Sony's HMZ-T1P and Zeiss's Cinemizer, but we've never been impressed by the quality of these OLED glasses, at least not enough to recommend them as an alternative to a flatscreen panel for movie viewing. The Oculus Rift made strides in getting us comfortable with the idea of using HMDs, with its low-latency head-tracking and use of lenses for immersive gaming, but anyone who watched our Octoberkast 24-hour live stream knows that it still needs to overcome some technical hurdles to be fit for extended use. Display resolution and image fidelity aren't the problem with HMDs; eye strain is.

    So when Grant and Avegant software lead Yobie Benjamin set up their company's prototype "Virtual Retinal Display"--don't worry, that's not the final product's name--in our offices today to demo some video clips and a PlayStation game, they had to preface it with an in-depth explanation of why Avegant's glasses technology is completely different from what we've seen before. The first thing I had to understand, said Yobie, was how the human eye works. The pupil, at the front of the eye, is like the aperture on a camera--the hole that light travels through. That hole grows and shrinks using the iris muscle, restricting or letting in more light depending on the intensity of the light source. But what really matters is the retina--the back of the eye that actually receives light and converts it into signals for your brain. Getting the right amount of light to the retina is what Avegant cares about.

    The second lesson was another in human biology, continued Yobie: the vast majority of light that our retinas receive is reflected light. It's light that has bounced from the sun or from bulbs off many surfaces before it gets to our eyes--so human evolution has conditioned our eyes to be more comfortable with reflected light. Eye strain occurs when we're staring at directly emitted light, like that coming from a phone display or computer monitor. Or, as you can surmise, every head-mounted display on the market so far. That's the problem Avegant thinks it has solved with its technology.

    Instead of using an LCD or OLED display, Avegant's Virtual Retinal Display actually has no displays at all. What it has are two RGB LEDs (one for each eye) that each emit a controlled amount of light at a chip covered with millions of microscopically small mirrors, bouncing the right colors directly onto your retina. In effect, it's a low-power projector that's shining reflected light into your eyes. If micromirror-based image projection sounds familiar, that's because it's the same technology used by old DLP televisions and current DLP projectors. And indeed, the mirror matrices in these glasses are Texas Instruments DLP chips, adapted using proprietary technology for HMD use.

    The result is a stereoscopic image with almost no pixelation and an incredibly high fill factor--meaning no screen door effect. In fact, you can't even see the pixels because there is no actual screen. You just see one fused image projected through an optical lens in front of your eye.

    The Best Wireless HDMI Video Transmitter Today

    If I were buying a wireless HDTV transmitter, I’d buy the $200 IOGear Wireless HD Digital Kit (GW3DHDKIT). This thing transmits 1080p video to a small receiver attached to your TV from up to a claimed 100 feet away.

    I base this on multiple professional reviews, plus my experience reviewing multiple HD transmitters myself. (On top of that, I’ve been reviewing AV gear for publications like CNET and Sound+Vision for over a dozen years.)

    The IOGear transmitter base unit has two HDMI inputs and an HDMI output. This means you can have a TV, transmitter and sources (Blu-ray, cable/satellite box, etc) wired up in one room while wirelessly sending the same signal to another TV elsewhere in the house. In my testing, and in that of others, there is no appreciable decrease in picture quality, except at very long distances—and even then, it’s only noticeable on really large screens.

    Perhaps my favorite feature of the IOGear is that the receiver unit can be powered using only a USB port (which is super convenient since most modern TVs have USB inputs). This means the IOGear can draw power from the TV without extra wires connecting to a power outlet. This makes it easy to hide and an effective alternative to cutting holes in your walls to hide cables. Among all the models worth considering, the IOGear is the only one that has this ability. On the off chance that your TV doesn’t have USB, the IOGear comes with a power adapter too.

    The Difference Between MHL and SlimPort

    It's easy to take for granted how conveniently plug-and-play so many of our devices are these days. Apple's Thunderbolt connector, for example, incorporated the DisplayPort video standard, making it easy to plug DisplayPort cables into Thunderbolt-equipped MacBooks through the same port. And millions of Android phones, which almost universally charge off of micro USB ports, can output video over MHL, or Mobile High-Definition Link.

    But MHL isn't the only video standard that can output via a micro USB port; in 2012, Google introduced the Nexus 4 and the Nexus 7 with a competing standard called SlimPort. The two are essentially competing to be the most convenient. MHL has an install base of millions and outputs to the ubiquitous HDMI standard. But SlimPort offers something just a bit different: DisplayPort compatibility, which means it can easily output to a variety of video standards.

    The MHL consortium announced a 3.0 spec on Tuesday, upping Mobile High Link's resolution support from 1080p up to 4K at 30 frames per second. Future Android phones and other devices that support the new spec will be able to push video at 3840x2160 while drawing up to 10 watts of charging power; they'll also support 7.1 surround sound audio and the HDCP 2.2 content protection standard. Of course, the 3.0 spec is mostly futureproofing, since 4K TVs are just beginning to show up in stores and homes.