    Everything You Need to Know about HDR TVs

    If you follow consumer technology news, you may have noticed increasing mentions of high dynamic range, or HDR. It's a technical term used to express a large range of luminosity in an image. Typically, HDR describes the quality of a photographic image, but it also describes video images--and with recent advancements in display technologies, HDR televisions and monitors are becoming a confusing new option for shoppers. Let's explore what this all means for you if you're in the market for a new display.

    HDR for Still Images

    When you take a picture with a smartphone, DSLR, or anything in between, the image is captured with fixed values for the shutter speed, ISO, exposure value, etc. In short, these all affect how much light the camera captures, and therefore how bright or dark the picture is.
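    The relationship between those settings and brightness can be sketched numerically. This is a hedged illustration, assuming the standard ISO-100 exposure-value definition; the function and its names are my own, not any camera API:

```python
import math

def exposure_value(f_number: float, shutter_seconds: float) -> float:
    """Exposure value (EV) at ISO 100: EV = log2(N^2 / t).

    Each +1 EV halves the light reaching the sensor, so higher-EV
    settings produce a darker picture of the same scene.
    """
    return math.log2(f_number ** 2 / shutter_seconds)

# f/8 at 1/125 s -> log2(64 * 125) = log2(8000) ≈ 12.97 EV
print(round(exposure_value(8.0, 1 / 125), 2))  # 12.97
```

    A faster shutter or smaller aperture raises the EV, which is why a camera locked to one EV can only capture one slice of a scene's full brightness range.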

    If you're trying to take a picture of something with an extreme range of luminosity, you will find yourself with details being washed out in bright areas or lost in the shadows of dark areas. Imagine yourself indoors on a sunny day in a room with a large window, or looking at a person outside when the sun is behind them. The human eye is able to simultaneously see detail both inside and out, or the person's face despite the sun, but a camera taking a single image at fixed settings cannot.

    It's in these types of scenarios that you want an HDR image. All high-end smartphones today can create one. Hit the toggle to turn it on, and now your phone magically takes pictures with a better dynamic range. Well, not quite. You've probably noticed that taking a picture on your phone with HDR turned on takes a bit longer. That's because it's actually taking multiple pictures: one overexposed, one underexposed, and one at a normal exposure, and then the camera software stitches them together for one great-looking picture.
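    The merge step can be sketched in a few lines. This is a deliberately simplified, hypothetical version of the idea (grayscale values only, simple triangle weighting), not any phone vendor's actual algorithm:

```python
def hdr_merge(exposures, evs):
    """Merge bracketed exposures with a weighted average.

    `exposures` is a list of grayscale images (nested lists of values in
    [0, 1]); `evs` gives each image's exposure compensation in stops.
    Pixels near mid-gray are trusted most, while clipped highlights and
    crushed shadows get little or no weight.
    """
    def weight(v):
        return 1.0 - abs(v - 0.5) * 2.0  # peaks at mid-gray, 0 at clipping

    h, w = len(exposures[0]), len(exposures[0][0])
    merged = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            num = den = 0.0
            for img, ev in zip(exposures, evs):
                v = img[y][x]
                # Undo the exposure shift to recover relative scene radiance.
                radiance = v / (2.0 ** ev)
                num += weight(v) * radiance
                den += weight(v)
            # Fall back to the middle exposure if every sample is clipped.
            merged[y][x] = num / den if den else exposures[len(exposures) // 2][y][x]
    return merged
```

    For example, merging a -2 EV, 0 EV, and +2 EV bracket of the same scene: where the overexposed frame is blown out to pure white, its weight drops to zero and the detail comes from the darker frames instead.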

    Making an HDR image from pictures taken with a DSLR requires more steps. At the very least, you need multiple images taken at different exposures. Next, you'll need image processing software. The most recent versions of both Photoshop and Lightroom from Adobe can automatically create HDR images. If you don't want to go through the process of manually putting them together with layers and the whole nine yards, you can use the merge to HDR feature and the software will automatically create an HDR image from what you've provided. Some DSLRs can do "in-camera HDR," merging three bracketed images, but the results won't look as good as those created with computer software.

    No matter the device, or more specifically the display, you're viewing an HDR image on, you can see the benefits. This is thanks to a process called tone mapping: a technique that compresses the dynamic range of an image or video, preserving most of the detail while allowing the content to be viewed properly on a standard dynamic range display.
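    A minimal global tone-mapping operator, in the style of the well-known Reinhard operator, can be sketched like this. It's a simplification that operates on bare luminance values; real implementations handle color channels and local contrast separately:

```python
import math

def tone_map(radiance, key=0.18):
    """Compress an unbounded luminance range into [0, 1) for display.

    `radiance` is a nested list of linear scene luminances; `key` scales
    the scene's log-average luminance toward a mid-gray target. The
    curve v / (1 + v) keeps shadow detail while rolling off highlights.
    """
    vals = [v for row in radiance for v in row]
    # Log-average luminance (small epsilon guards against log(0)).
    log_avg = math.exp(sum(math.log(v + 1e-6) for v in vals) / len(vals))
    scale = key / log_avg
    return [[(scale * v) / (1.0 + scale * v) for v in row] for row in radiance]
```

    Feed it luminances spanning several orders of magnitude and every output lands between 0 and 1, in the same order, which is exactly what a standard dynamic range display needs.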

    Oculus VR 'Santa Cruz' Prototype Impressions

    We go hands-on with Oculus' new 'Santa Cruz' standalone VR headset prototype, and share our thoughts and impressions from the demo. We also chat with Oculus' Nate Mitchell about the future of virtual reality and rate our favorite games from Oculus Connect!

    Tested: PlayStation VR Review

    It's finally here! We review Sony's virtual reality headset, PlayStation VR, which has potential to bring VR to mainstream gamers. Jeremy and Norm discuss PS VR's display quality, ergonomic design, motion controllers, tracking performance, and launch games. Here's how PS VR's hardware and gaming experience compare to the Oculus Rift and HTC Vive.

    Hands-On with Looking Glass Volume, a True Volumetric 3D Display!

    We get up close with Volume, a true volumetric display that can be used for creating 3D content, viewing depth-enhanced videos, and playing holographic games. Its inventors stop by our office to explain how the display works and how they hope volumetric imaging can change how we interact with computer graphics and imagery.

    How Virtual Humans Learn Emotion and Social Intelligence

    At USC ICT's Virtual Humans lab, we learn how researchers build tools and algorithms that teach AI the complexities of social and emotional cues. We run through a few AI demos that demonstrate nuanced social interaction, which will be important for future systems like autonomous cars.

    Hands-On with Shaper Origin Handheld CNC Router!

    This is super cool: a handheld CNC router that uses computer vision to let you see exactly what you're cutting through the bit, and compensates for any shaky hand movement with automatic stabilization. We visit Shaper to learn about the Origin and test out its features!

    Digitizing Photorealistic Humans Inside USC's Light Stage

    We learn how actors are digitized and turned into photorealistic models inside USC ICT's Light Stage capture system. Paul Debevec and his team at the Graphics Lab are focused on inventing technologies that create the most realistic-looking virtual people, objects, and environments. We were blown away by the capabilities of the light stage!

    Tested Tours VR Projects at USC's Mixed Reality Lab

    At USC's Institute for Creative Technologies, computer scientists and engineers have been tinkering with virtual reality, augmented reality, and everything in between. We're given a tour of ICT's Mixed Reality Lab, where projects explore the intersections of VR and accessibility, avatars, and even aerial drones.

    USC Mixed Reality Lab's VR Redirected Walking Demo

    We recently visited the USC Institute for Creative Technologies' Mixed Reality Lab, where virtual reality researchers are experimenting with software that will let you walk around forever in VR. We test their redirected walking and lightfield model demos and learn how these technologies could work in future VR games.

    Oculus Medium Sculpting Demo with DC Comics

    Another surprise for us at Comic-Con was running into the team at Oculus, demoing their Touch controllers and the Oculus Medium sculpting tool at the DC Comics booth. We chat with Medium's project lead about how it's changed, and learn how artists are beta testing it in their creative workflows.

    Hands-On with Raw Data's New Multiplayer VR Demo

    We visit the offices of Survios, a VR game company making a sci-fi multiplayer shooter for the HTC Vive and Oculus Touch. The new demo of Raw Data includes teleportation for moving around the map, hero classes, and special powers. We chat with Survios' Chief Creative Officer about some of their VR design ideas.

    Hands-On with Manus VR Virtual Reality Gloves

    Seeing your hands and arms in virtual reality is going to be a big deal, but there's no perfect solution yet for accurate and robust hand presence. That's what Manus VR is trying to achieve with its VR gloves, which we test at this year's E3. We learn how the gloves work and how they integrate with the HTC Vive and Steam VR.

    Oculus Touch Hands-On and Interview at E3 2016

    We stop by the Oculus booth at E3 2016 to get hands-on time with Oculus Touch games, including Wilson's Heart and The Unspoken. Here's some of that gameplay, our impressions on those demos, and our hopes for hand presence in virtual reality. Plus, a chat with Palmer Luckey and Nate Mitchell about the Oculus Rift's launch, game exclusivity, and what's coming next.

    Tested Attends Autonomous Vehicle Track Day

    We're on location at Thunderhill Raceway Park, the location of the first Autonomous Vehicle Track Day. Hackers making their own self-driving cars brought their vehicles, sensors, and software to test on a race course, experimenting with autonomous driving at high speeds. We chat with the event organizer and several builders to learn about the future of self-racing cars.

    Meet the Carbon M1 Super Fast 3D Printer

    Watch this complex object get 3D printed in less than 15 minutes. Sean and Norm visit Carbon, the makers of the M1 3D printer, to get a demo of this new super fast 3D printing technology working in real-time. We chat with Carbon's VP of Product, Kirk Phelps, to learn how the CLIP 3D printing tech works, and why it's more than just about really fast prints.

    Hands-On with NASA's HoloLens Mars Demo

    NASA has been working with Microsoft's HoloLens technology to allow its Mars Curiosity rover engineers to visualize Mars and plan missions for the robot. We try a version of this OnSight application and chat with NASA's Dave Lavery about the potential of this kind of mobile virtual reality.

    I Can't Stop Thinking about Tesla's Autopilot

    In covering technology news and reviewing consumer devices, there have been a few times when we've used a new technology that gives us a glimpse of the future. The first time I pinched to zoom a photo on the first iPhone. Successfully printing a test cube on the original MakerBot. Putting a Phantom 2 quadcopter into the sky to shoot stabilized 1080p video. Turning my head around to look behind me while wearing the Oculus DK1. Each of those rare moments is a confluence of complex technologies coming together to pull off an incredible feat that "just works". They're the kind of things that stick with you for a while. Or even show up in your dreams. And this past week, I've been having dreams about Tesla's autopilot.

    I went for a test drive of the new Tesla Model X last Thursday, courtesy of Tested reader Christian (who also let us test drive his Model S a few years back). The Model X is the crossover version of the Model S, incorporating all of the updates that have come since the S launched over three years ago. It has AWD, a 250-mile max range, and even Tesla's ludicrous speed option, which slams your breath to the back of your throat while going from 0 to 60 in 3.3 seconds. We'll have a full video this week covering the other features of the car, but the thing I was really curious to try was Tesla autopilot, an optional feature on the newest Model S's and the X.

    We took Christian's Model X onto the freeway south of San Francisco, right at the end of rush hour. With Christian behind the wheel, he explained how autopilot is flipped on in two stages: first by enabling adaptive cruise control, and then flipping the same lever again to enable automatic steering. ACC is a pretty standard feature on new cars; it uses radar systems to adjust your car speed to maintain a set distance from cars in front of you (typically measured in car lengths, and dependent on speed). Autopilot takes that one step further with optical cameras that can see lane markers and keep the car centered in your lane while going at full speed.

    And it works. I took my turn in the driver's seat, taking my feet off the pedals and hands off the steering wheel while we sped down the freeway, the car automatically and comfortably maneuvering the gentle curves of the road at 75 miles per hour. No amount of rational reinforcement beforehand could've prepared my brain for the surreal feeling of sitting in the driver's seat and watching the pedals depress and the steering wheel turn on their own. My eyes were still glued to the road and I was still doing all the situational awareness calculations I would be doing if I were the one driving, but with that very same data about the other cars around me visualized right on the dashboard screen. The car had a sense of the distance and speed of other vehicles in the vicinity, showing me color-coded cars in front of me to illustrate their distance, even identifying motorcycles that passed by in between lanes. I could flip the turn signal and the Model X would make the lane change automatically, speeding up to an open spot in the next lane. After about 15 minutes or so driving with autopilot, I felt at ease with it, comfortable enough to have a conversation with Christian in the passenger's seat without tensing up at every micromovement made by the car.

    In Brief: Lytro Introduces Its Cinema Lightfield Camera

    Camera maker Lytro, which last year pivoted from making prosumer light field still cameras to digital cinema, has introduced its first production-ready studio camera. The Lytro Cinema applies light field sensor technology to video, capturing more than just color and light in each pixel, but light and environment data that allows directors to adjust focus, aperture, and even shutter speed after the shot has been taken. Lytro says that each frame taken (at up to 300FPS) has 755MP of RAW data, and the sensor has a dynamic range of 16 stops. Its sample footage--seen in the promo video below--shows how this data can be used in the post-production process to composite CG elements and make adjustments that previously would have been baked into the plate. Lytro isn't going to be selling its cameras to studios, but offering them in a rental model, with packages that start at $125K.

    Norman
    Tested: HTC Vive Review

    The consumer release of the HTC Vive is finally here! We've been testing the Vive Pre for a while and the final headset for about a week, playing VR games with tracked controllers in a roomscale setup. Jeremy and Norm discuss the setup process, ergonomics, comfort features, and launch content for Steam VR. Plus, we play through Valve's first-party VR game, The Lab!