Latest Stories
    PROJECTIONS, Episode 8: Dexmo's VR Haptic Exoskeleton

    In this episode of Projections, Jeremy and Norm discuss how hand presence has evolved for consumers over the past three years of virtual reality hardware. We get a demo of Dexmo, a wireless exoskeleton controller that tracks individual fingers and promises haptic feedback that lets you actually feel objects in VR. Jeremy chats with Dexta Robotics' CEO about the challenges of haptics technology and how the company is tackling them.

    PROJECTIONS, Episode 1: The Mage's Tale, rEvolve Prototype Hands-On

    Welcome to PROJECTIONS, a new show about the latest in virtual and augmented reality. This inaugural episode kicks off with a discussion of depth in VR games, an exclusive hands-on preview of The Mage's Tale, and a spotlight on the rEvolve accessory prototype for the HTC Vive. Let us know what you think, and what you'd like to see in future episodes!

    Tested: Oculus Touch VR Controller

    They're finally here! Norm and Jeremy test and review the Oculus Touch virtual reality controllers, which bring motion-tracked hand presence to the Oculus Rift VR headset. Here's how Touch compares with the Vive and PSVR controllers in tracking, features, and ergonomics. Plus, we discuss the launch lineup of games and Touch content.

    Everything You Need to Know about HDR TVs

    If you follow consumer technology news, you may have noticed increasing mentions of high dynamic range, or HDR. It's a technical term for a large range of luminosity in an image. Typically, HDR describes the quality of a photographic image. But it also applies to video--and with recent advancements in display technologies, HDR televisions and monitors are becoming a confusing new option for shoppers. Let's explore what this all means for you if you're in the market for a new display.

    HDR for Still Images

    When you take a picture with a smartphone, DSLR, or anything in between, the image is captured with fixed values for the shutter speed, ISO, exposure value, etc. In short, these all affect how much light the camera captures, and therefore how bright or dark the picture is.

    If you're trying to take a picture of something with an extreme range of luminosity, you'll find details washed out in bright areas or lost in the shadows of dark areas. Imagine yourself indoors on a sunny day in a room with a large window, or looking at a person outside with the sun behind them. The human eye can simultaneously see detail both inside and out, or the person's face despite the sun, but a camera taking a single image at fixed settings cannot.

    It's in these types of scenarios that you want an HDR image. All high-end smartphones today can create one. Hit the toggle to turn it on, and now your phone magically takes pictures with a better dynamic range. Well, not quite. You've probably noticed that taking a picture on your phone with HDR turned on takes a bit longer. That's because it's actually taking multiple pictures--one overexposed, one underexposed, and one at a normal exposure--and then the camera software stitches them together into one great-looking picture.

    Making an HDR image from pictures taken with a DSLR requires more steps. At the very least, you need multiple images taken at different exposures. Next, you'll need image processing software. The most recent versions of both Photoshop and Lightroom from Adobe can automatically create HDR images. If you don't want to go through the process of manually putting them together with layers and the whole nine yards, you can use the Merge to HDR feature and the software will automatically create an HDR image from what you've provided. Some DSLRs can do "in-camera HDR," merging three bracketed images, but the results won't look as good as those created with computer software.
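
    For the curious, here's roughly what that merge step can look like in code. This is a minimal sketch using OpenCV in Python rather than Photoshop or Lightroom; the file names, exposure times, and output paths are hypothetical stand-ins for your own bracketed shots.

    ```python
    import cv2
    import numpy as np

    # Hypothetical bracketed shots of the same scene: under-, normally, and overexposed.
    files = ["under.jpg", "normal.jpg", "over.jpg"]
    images = [cv2.imread(f) for f in files]
    exposure_times = np.array([1/250, 1/60, 1/15], dtype=np.float32)  # shutter speeds in seconds (assumed)

    # Option A: Mertens exposure fusion blends the brackets directly into a
    # display-ready image without building a true radiance map.
    fused = cv2.createMergeMertens().process(images)
    cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype(np.uint8))

    # Option B: Debevec merge recovers a floating-point HDR radiance map,
    # which still needs tone mapping before a standard display can show it.
    hdr = cv2.createMergeDebevec().process(images, times=exposure_times)
    cv2.imwrite("radiance.hdr", hdr)
    ```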

    No matter what device--or, more specifically, what display--you're viewing an HDR image on, you can see the benefits. This is thanks to a process called tone mapping: a technique that compresses the dynamic range of an image or video, preserving most of the detail while allowing the content to be viewed properly on a standard dynamic range display.
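
    Continuing the hypothetical OpenCV sketch above, the tone mapping step itself is short. The Reinhard operator and the gamma value here are arbitrary choices for illustration, not a claim about what any particular phone, camera, or display actually does.

    ```python
    import cv2
    import numpy as np

    # Load the floating-point radiance map saved earlier (hypothetical file name).
    hdr = cv2.imread("radiance.hdr", cv2.IMREAD_UNCHANGED)

    # Reinhard tone mapping compresses the wide luminance range into values a
    # standard dynamic range display can reproduce.
    ldr = cv2.createTonemapReinhard(gamma=2.2).process(hdr)
    cv2.imwrite("tonemapped.jpg", np.clip(ldr * 255, 0, 255).astype(np.uint8))
    ```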

    Oculus VR 'Santa Cruz' Prototype Impressions

    We go hands-on with Oculus' new 'Santa Cruz' standalone VR headset prototype, and share our thoughts and impressions from the demo. We also chat with Oculus' Nate Mitchell about the future of virtual reality and rate our favorite games from Oculus Connect!

    Tested: PlayStation VR Review

    It's finally here! We review Sony's virtual reality headset, PlayStation VR, which has potential to bring VR to mainstream gamers. Jeremy and Norm discuss PS VR's display quality, ergonomic design, motion controllers, tracking performance, and launch games. Here's how PS VR's hardware and gaming experience compare to the Oculus Rift and HTC Vive.

    Hands-On with Looking Glass Volume, a True Volumetric 3D Display!

    We get up close with Volume, a true volumetric display that can be used for creating 3D content, viewing depth-enhanced videos, and playing holographic games. Its inventors stop by our office to explain how the display works and how they hope volumetric imaging can change how we interact with computer graphics and imagery.

    How Virtual Humans Learn Emotion and Social Intelligence

    At USC ICT's Virtual Humans lab, we learn how researchers build tools and algorithms that teach AI the complexities of social and emotional cues. We run through a few AI demos that demonstrate nuanced social interaction, which will be important for future systems like autonomous cars.

    Hands-On with Shaper Origin Handheld CNC Router!

    This is super cool: a handheld CNC router that uses computer vision to let you see exactly what you're cutting through the bit, and compensates for any shaky hand movement with automatic stabilization. We visit Shaper to learn about the Origin and test out its features!

    Digitizing Photorealistic Humans Inside USC's Light Stage

    We learn how actors are digitized and turned into photorealistic models inside USC ICT's Light Stage capture system. Paul Debevec and his team at the Graphics Lab are focused on inventing technologies that create the most realistic-looking virtual people, objects, and environments. We were blown away by the capabilities of the light stage!

    Tested Tours VR Projects at USC's Mixed Reality Lab

    At USC's Institute for Creative Technologies, computer scientists and engineers have been tinkering with virtual reality, augmented reality, and everything in between. We're given a tour of ICT's Mixed Reality Lab, where projects explore the intersections of VR and accessibility, avatars, and even aerial drones.

    USC Mixed Reality Lab's VR Redirected Walking Demo

    We recently visited the USC Institute for Creative Technologies' Mixed Reality Lab, where virtual reality researchers are experimenting with software that will let you walk around forever in VR. We test their redirected walking and lightfield model demos and learn how these technologies could work in future VR games.

    Oculus Medium Sculpting Demo with DC Comics

    Another surprise for us at Comic-Con was running into the team at Oculus, demoing their Touch controllers and the Oculus Medium sculpting tool at the DC Comics booth. We chat with Medium's project lead about how it's changed, and learn how artists are beta testing it in their creative workflows.