    Testing: Logitech G502 Proteus Core Gaming Mouse

    Wait, wasn't it just one year ago that Logitech released the G500s, the rebirth of its venerated G5 line of gaming mice? Hold on for just a second while I check my review. Yep, that was just last March. But here we are, with another new high-end gaming mouse, the G502. And this year, Logitech's given it a fancy moniker: the Proteus Core. I'm not sure if that's meant to evoke a certain StarCraft faction in gamers' minds, or simply a take on the SAT-friendly word 'protean', meaning versatile or adaptable. The latter's likely the case, given the G502's ability to be calibrated for different mousing surfaces (glass and mirrors excepted). Regardless, Logitech's new flagship is an aggressive product, an $80 mouse that not only succeeds last year's G500s, but revamps the design of Logitech's gaming mouse line. That curvy G5 design that I was so hot on last year has once again been retired (at least temporarily).

    I've been testing the G502 for about a week, in first-person shooters, real-time strategy games, and lots of desktop imaging work. I'm not a MOBA player, so my perspective may not reflect that of players of today's dominant PC genre. And as I've said before, a gaming mouse is an accessory that most people rarely change--they find the one that works for them and stick with it. If you like the Razer DeathAdder, Mad Catz R.A.T., or even Logitech's own previous G-series mice, there's really not a lot of reason to spend another $80 on a new gaming mouse unless your current one breaks. Gaming mouse technology has really reached a point where every new generation of product offers fewer new benefits; product engineers really feel like they're reaching when they push the boundaries of sensor DPI or add more configurable buttons. And the G502 has plenty of those new back-of-box features, for sure. Let's run through them and evaluate whether they truly add any benefit to your gaming experience.

    Arguably the most important component in a gaming mouse is its sensor, and the G502's optical (IR) sensor was apparently designed from the ground up to introduce two notable features. The first is DPI (dots per inch, or technically counts per inch) sensitivity that ranges from 200 to 12,000. You read that correctly: this mouse tracks past 10,000 DPI, which I believe is a first for a gaming mouse. (Consider that the G5, circa 2005, topped out at 2000 DPI.) At that maximum setting, the tiniest flick of the wrist will send the cursor all the way across a 1080p panel; it's meant for gamers who want to make extremely large movements quickly, or desktop users running multiple monitors spanning many thousands of pixels wide. Of course, high DPI doesn't denote accuracy, just sensitivity. A mouse set to 10,000 DPI isn't useful if it isn't accurate at that "resolution"--the trick is testing the mouse's accuracy at the sensitivities that you find most useful.
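    A quick back-of-envelope check of that "tiniest flick of the wrist" claim: DPI is counts per inch of physical travel, so (assuming a naive 1:1 count-to-pixel mapping with no OS acceleration, which real drivers complicate) you can compute how far the mouse must move to cross a screen.

```python
# Rough sketch: physical mouse travel needed to cross a screen at a
# given DPI (counts per inch), assuming 1:1 count-to-pixel mapping
# and no OS pointer acceleration.

def inches_to_cross(screen_width_px: int, dpi: int) -> float:
    """Physical travel (in inches) to move the cursor screen_width_px pixels."""
    return screen_width_px / dpi

# At the G502's 12,000 DPI maximum, crossing a 1920-pixel-wide panel
# takes just 0.16 inches (about 4 mm) of movement.
print(inches_to_cross(1920, 12000))  # 0.16

# At the original G5's 2000 DPI ceiling, the same traverse takes
# nearly a full inch.
print(inches_to_cross(1920, 2000))   # 0.96
```

    Under those assumptions, the 12,000 DPI setting really does mean a few millimeters of desk travel spans the whole screen, which is why it only makes sense for very large multi-monitor setups or very low in-game sensitivity.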

    FTL: Advanced Edition Adds More Space and an iPad Version

    One of my favorite games of 2012 was FTL. It's a space sim roguelike-like unlike anything else I've played. The game is deceptively simple and endlessly entertaining. In it, you control the crew of a starship on a desperate mission to stop the rebel alliance, or something like that. The upshot is that you control the crew, assigning them tasks like ship repair, arming the weapons, or cranking up the engines while traversing system after system. Because the game has roguelike elements, each time you play, the map changes. So while you'll likely encounter friendly merchants, hostile insect aliens and deadly plagues on one playthrough, your next trip across the galaxy will probably be very different.

    Yesterday, FTL: Advanced Edition was released as a free upgrade for Mac and PC owners, alongside a brand new iPad edition. I spent some time playing both versions over the last few days, and I can unequivocally recommend them both. The iPad version is a lovely translation of the PC game, with a touch interface that works well even on the smaller screen of my iPad Mini. The game performs wonderfully on all the devices I tested, and the touch interface complements the pause-and-issue-commands nature of the game perfectly.

    10 Things You Should Know about DirectX 12

    It’s about efficiency, not new features

    The new DirectX will add a few new rendering features, but those new features aren’t as important as efficiency. Direct3D 12 has a thinner abstraction layer between the operating system and the hardware. Game developers will have more control over how their code talks to graphics hardware. Overhead is reduced substantially. Time for threads to complete has been reduced by 2-5x in some cases.

    Only Direct3D 12 has been discussed

    Most gamers tend to think of Direct3D when DirectX is mentioned, and the focus of the recent announcement is indeed on D3D. There was no discussion of audio, game controller interfaces, Direct2D or other aspects of DirectX. Part of the reason for this early peek at DirectX 12 is that AMD’s Mantle, a direct-to-metal API with similar ambitions, was starting to get some traction. Microsoft no doubt worried that API fragmentation would return game development to the bad old days, where you wouldn’t be able to run any game on any graphics card.

    DirectX 12 is now a console API

    Direct3D 12 will run on the Xbox One. The execution environment has been described as “console-like,” which probably means the layers are thinner. This makes sense today, since most modern GPUs are highly parallel and highly programmable. What this means for future versions of Windows is unknown, since Direct3D is the rendering API for Windows 8 and beyond. I hope Microsoft doesn’t “freeze” the Windows API at D3D 11. We’d once again be in a situation where application developers might end up using different APIs than game developers.

    Hands-On: Virtuix Omni Treadmill with Oculus Rift

    We strap on a harness and step on board the Virtuix Omni motion tracker at this year's Game Developers Conference. The Kickstarter-backed treadmill system pairs with an Oculus Rift development kit to simulate walking and running in a first-person shooter. It took a little getting used to, but the experience was unlike anything we've tried before.

    Why Facebook Buying Oculus VR Is Probably a Good Thing

    Earlier today, Facebook announced that it was buying virtual reality startup Oculus for $2 billion, and as usual, the Internet erupted in panic. Despite actively disliking what Facebook has become and avoiding the service wherever possible, I actually think Facebook buying Oculus is probably a good thing for Oculus, the virtual reality community, VR enthusiasts, and even gamers.

    If you take Mark Zuckerberg's post regarding the Oculus acquisition at face value, it seems clear that Facebook's impetus for buying Oculus is to accelerate Oculus's potential as a communications medium, taking it beyond games and turning it into a technology that becomes part of the fabric of our lives, just as computers, the Internet, and smartphones have.

    Reading between the lines, I'm pretty sure Zuckerberg wants to build Neal Stephenson's Metaverse. I'm actually OK with that.

    In Brief: Facebook Buys Oculus VR for $2 Billion

    Facebook has just announced that it will acquire Oculus VR, the makers of the Oculus Rift, for $2 billion in cash and stock. According to Facebook's news release, the company "plans to extend Oculus’ existing advantage in gaming to new verticals, including communications, media and entertainment, education and other areas. Given these broad potential applications, virtual reality technology is a strong candidate to emerge as the next social and communications platform." Said the Oculus VR team in a blog post, "Facebook understands the potential for VR. Mark and his team share our vision for virtual reality’s potential to transform the way we learn, share, play, and communicate." In a Facebook post, Mark Zuckerberg reinforced that Oculus technology would still be a platform for gaming first, but wants it to be a new communication platform for life experiences: "Imagine enjoying a court side seat at a game, studying in a classroom of students and teachers all over the world or consulting with a doctor face-to-face -- just by putting on goggles in your home."

    Oculus Rift development will continue from Oculus VR's headquarters in Irvine. Facebook will be hosting a conference call to discuss the acquisition later today, which we'll be tracking. Strange times.

    In Brief: Nvidia Announces $3000 Titan Z Video Card

    Wasn't it just one month ago that Nvidia announced its GeForce GTX Titan Black video card? Yes, I'm sure it was, because I was in a meeting with them when they mentioned it while briefing me on Maxwell. Titan Black was supposed to be their new super high-end video card--a GTX Titan that ran at an unfettered 889MHz core clock (boosted to 980MHz), and 7GHz GDDR5 memory. Basically a souped-up GTX 780 Ti, priced at $1000. Well, that's only one third the price of the new Titan Z video card, announced today at Nvidia's GPU Technology Conference in San Jose. Titan Z is basically two Titan Black cards crammed into one--a total of 5,760 processing cores (two GK110 chips), and 12GB of dedicated frame buffer memory. There's no word about its availability, or other important details like power consumption, but it's clearly a boutique product that's more for prestige than anything. PC Gamer should get one (or two) for their Large Pixel Collider.

    Testing Maxwell: Nvidia's Six-Inch Hammer

    Nvidia recently unleashed its latest graphics architecture on the world, Maxwell. The first iteration is the GTX 750, a GPU that will be the core of a graphics card whose asking price will typically be under $150. Two variants of the GTX 750 will be shipping, the GTX 750 and the GTX 750 Ti. I’ll get into the differences shortly.

    Once upon a time, the first iteration of a GPU architecture would show up as a powerful, power-hungry beast of a graphics card. That changed a bit when Nvidia introduced its first Kepler card aimed at gamers, the GTX 680. The GTX 680 took the gaming world by surprise, delivering leading edge performance, but was miserly on power consumption and low on fan noise, particularly when compared to Nvidia’s earlier GTX 580s and AMD’s Radeon HD 7970s.

    Still, the GTX 680 was a high end card, even though it broke the mold a bit as to what a high end card should be. The GTX 750 Ti’s typical asking price is $150. The overclocked EVGA GTX 750 I’ll be looking at is $169, but it’s both overclocked and ships with a 2GB frame buffer. You can find GTX 750 Ti cards from several manufacturers running at reference clocks for $149. GTX 750 1GB cards can be had for as little as $119. I’ll take a look at performance of the EVGA GTX 750 Ti SC and an Nvidia GTX 750 Ti reference card, which also has a 2GB frame buffer, but runs at standard core clocks.

    But first, let’s look at Maxwell.

    Hands-On: Oculus Rift Development Kit 2 Virtual Reality Headset

    New Oculus VR hardware! We get our hands on the second development kit for the Oculus Rift at GDC 2014 and chat with Oculus VR's Nate Mitchell about the roadmap to the final consumer release (plus their thoughts on Sony's VR efforts). Here's how DK2 differs from past prototypes, our impressions of it with new tech demos, and why you should still hold off until the final product.

    In Brief: Sony Announces Project Morpheus VR Headset

    I think this is something most people saw coming, but Sony today announced that it has been working on a virtual reality headset for its PlayStation 4 console. And thankfully it's not the virtual reality headset prototype we saw at CES, which was little more than a motion tracker slapped on Sony's HMZ-T3 display. Unveiled at a GDC press conference, Project Morpheus (as in Dream of the Endless, not the character from The Matrix) is a head-tracking HMD that's much more Oculus Rift than personal movie theater, with specs that look very similar to Oculus VR's Crystal Cove prototype. Morpheus uses a 5-inch 1080p display (LCD instead of Oculus' AMOLED), tracks with a combination of accelerometer, gyro, and camera, and has optics that display games with a 90-degree FOV. Sony also trotted out the keywords that will be familiar to anyone following modern VR work: presence, low latency, and 3D audio. It's apparently something Sony has been exploring since 2010, when its labs attached Move controllers to an HMD. This being GDC, Sony of course announced plenty of software partners for its VR initiative, including some developers that have already shown work on the Oculus Rift. There's no launch timeframe for Project Morpheus, but I bet that this holiday season is going to be really interesting for fans of virtual reality. What do you guys think of Sony's announcement?

    Test Notes: PlayStation Dualshock 4 on a PC

    Right after the PlayStation 4 launched, Wes wrote up some basic instructions for connecting your PlayStation 4's DualShock 4 controller to a PC using either Bluetooth or a microUSB cable. I've finally connected my spare DS4 and set up the latest version of DS4fix (1.22 as of this writing).

    The instructions haven't changed much since Wes wrote that story, and the process was relatively straightforward, once I switched to the Microsoft drivers for my Bluetooth adapter, at least. Initially, I switched to the DS4 because I wanted access to its superior D-pad for Spelunky play, but after a few days, I'm ready to swap out my Xbox 360 controller permanently. While I'm sure the application that wraps normal gamepad calls to the DualShock probably adds a small amount of latency, I don't feel it at all. Don't tell anyone at Microsoft, but I'm even using it to play the PC version of Titanfall.

    Why the E.T. Video Game Was Made at All

    Last December, Xbox Entertainment Studios announced that it would be producing a series of documentaries about the rise of digital entertainment. The first installment will be about E.T. the game, and screenwriter Zak Penn (X-Men: The Last Stand) is directing it. Wait a second…the Atari E.T. game? One of the biggest disasters in gaming history? Yep, and it’s actually a fascinating way to launch this series.

    It would be much easier to make a documentary about a huge success in the gaming world. How about how Tetris launched an industry? But success is very easy to take for granted. You can learn a lot more from failure, and it’s often way more interesting to dissect a flop in a post-mortem kind of way. The failure of the E.T. game is especially perplexing in that Atari had just had its biggest year, and E.T. was at that point the highest grossing film of all time. How could something like this become one of the biggest losers in video game history?

    In 1981, Atari was riding high. The company had come a long way from the humble origins of Pong, and it was growing by leaps and bounds. Atari was founded under its original name, Syzygy, by Nolan Bushnell in 1972. Its first games, Pong and Tank, were hits, and the company was doing well, with sales in excess of $39 million in 1976.

    It wasn't long before Atari was bought out by Warner Communications, but for the first couple of years under the Warner umbrella, Atari wasn’t making money. Once Bushnell was replaced by Raymond Kassar in 1978, the company finally took off. In 1979, Atari earned a profit of $6 million, and the company scored nearly $70 million in profit a year later. Then in 1981, it hit a billion dollars in sales, with a profit of $300 million. As business reporter Connie Bruck wrote, “There had probably not been another company in the history of American business that grew as large, as fast, as Atari.”

    Yet many will tell you that when a company explodes this fast, it makes investors nervous, because it means the company can go downhill just as fast. One person who knew Atari wouldn’t be a phenomenon forever was the late Warner chairman Steve Ross. Ross was one of the few naysayers who proclaimed that this kind of success wasn’t going to last, and many didn’t believe him.

    In Brief: The Fictional Videogames in Her

    I'm endlessly fascinated by fake software in media. Whether it's a faux futuristic operating system in a science fiction film or a fake video game used as a minor plot point, these are designed elements that an artist crafted to bolster the constructed reality of that story, even if they appear on screen for only a few seconds. Spike Jonze's Her (which we still need to do a spoilercast about) builds its world thoroughly, from the Samantha OS that's the crux of the story to the imagined videogames of the near-future. Plenty has been said about the former, but this Creators Project post is the first I've read about the latter. Her actually has two fake videogames, and both have plenty to say about the current state of game design. I won't spoil what they are in case you haven't seen the movie, but will direct you to the websites of designers David O Reilly and Kevin Dart, who developed those game sequences for the film.

    The Game Frame Pixel Art LED Display

    It's Tested in the dark! We turn down the studio lights to show off the Game Frame, a Kickstarter project created by friend of Tested and regular podcast guest Jeremy Williams. This is an Arduino-based project to display animated pixel art (like from classic arcade games) using a full-color LED matrix. We chat with Jeremy to learn how it works and how you can even put entire movies on the Game Frame.

    Maker Profile: Jeremy Williams' LED Pixel Art Game Frame

    For those of you who've been following this project since last summer, today is an awesome day: Jeremy Williams just launched his Game Frame Kickstarter. Jeremy--a friend of Tested who you may have heard on our podcast--first showed us his Game Frame prototype when he brought it to our office last year for a video demo. It was just a fun personal project--Jeremy loves 8-bit pixel art and wanted to find a way to display it on his walls. And that's what's so great about this project; he wanted something that didn't exist, so he made it. And taught himself more about Arduino programming and LED electronics along the way. There was no plan to make more than just the one; Jeremy's in the making-shit-that's-cool business, not the empire business. It was the feedback from you guys in that first Show and Tell that inspired Jeremy to keep iterating on his prototype and design a Game Frame that would be cheaper to build and easier to use. And based on our conversations with Jeremy since last summer, it was a task much easier said than done.

    I asked Jeremy to share the story of the new Game Frame and explain some of the technical challenges he had to tackle to make this project a reality.

    Let's start at the beginning. In conceptualizing the Game Frame, what were your goals, and how did they inform the technologies and design you chose for it?

    It's called pixel art, so why not frame it? There are really two factors, at least for me: nostalgia and artistic merit. The earliest game sprites are powerful purely as memory triggers. Just seeing them is enough to make me smile about the "good ol' days," when arcades were rotting our youth, MTV played videos, and email was something you sent on a BBS. A roll of quarters at age 10, in 1984, was worth its weight in gold -- but I'd rather have the quarters. I played the hell out of Defender, Joust, Frogger, Burgertime, Tron. I can still hear the sound of an arcade. That's just part of growing up in that generation. So I thought it would be cool to blow up some of that early art just to remind me of my youth.

    But there's also the pure artistry of it all. I've noticed that whenever computers are involved, people's artistic radar gets scrambled. From video games, to the demo scene, to Pixar, computer assisted mediums just haven't gotten their due recognition. With pixel art it's especially frustrating because it's so accessible. The problem with more modern computer-assisted crafts is most people can't grok the creative process, but with pixel art you can. It's painting with squares, and the best examples of this are an expression of sheer cleverness and minimalism. Maybe people don't generally think of old game art as something that has inherent value outside of the game, but I do. I can watch the four frames of the Joust buzzard walk cycle any day, and always be inspired.

    As for Game Frame tech, it turns out that if you want to frame early arcade sprites you need at least 16x16 resolution. That will get you comfortably to about year 1984, plus lots of NES content. Since I already had experience making an Arduino/microSD project, that was the most comfortable path, and the limitations of the Arduino chip map nicely to that number of pixels.
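    To see why a 16x16 matrix is a comfortable fit for an Arduino-class chip, a little arithmetic helps. The figures below are my own assumptions (a standard ATmega328-based Arduino with 2 KB of SRAM, and 3 bytes of RGB color per LED), not specs Jeremy quoted for the actual Game Frame.

```python
# Rough sketch of the memory budget for a 16x16 full-color frame on
# an Arduino-class microcontroller. Assumptions: ATmega328 (Arduino
# Uno) with 2 KB SRAM, and 24-bit color (one byte each of R, G, B).

PIXELS = 16 * 16            # 256 LEDs in the matrix
BYTES_PER_PIXEL = 3         # 8-bit red, green, blue per LED
SRAM_BYTES = 2 * 1024       # ATmega328 SRAM (assumed hardware)

frame_bytes = PIXELS * BYTES_PER_PIXEL
print(frame_bytes)                    # 768 bytes for one frame buffer
print(frame_bytes / SRAM_BYTES)       # 0.375 -- over a third of SRAM
```

    Under these assumptions, a single 16x16 frame already consumes more than a third of the chip's RAM before any program state, which suggests why that resolution sits near the practical ceiling for this class of hardware, and why streaming frames from a microSD card makes sense.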