    In Brief: The Cognitive Complications of Augmented Reality

    Really interesting piece from Lee Hutchinson at Ars Technica, discussing an IEEE Spectrum analysis of the cognitive questions raised by augmented reality technologies. Refreshingly, it's not about the feasibility of AR tech (e.g. the specs of HoloLens or Magic Leap), but rather the types and densities of heads-up display data our brains can process and integrate with our perception of the visual world. Our brains can be tricked into processing artificial visual triggers alongside the natural world, and AR research is attempting to discern what balance of information keeps distraction to a minimum.

    In Brief: The Origins of Color TV Broadcast Standards

    I really loved this recent feature from The Atlantic, telling the story of the bitter rivalry between CBS and RCA in the development of a color TV broadcast standard. At the center of this battle were the so-called "color girls"--models hired by the two companies to assist in the color calibration of the cameras used for their respective technologies. The legacy of those women extends well beyond the formation of the NTSC, to the intrinsic visual and racial biases built into those technologies. A thought-provoking history lesson in technology that's particularly relevant given the recent missteps in Google Photos' face-tagging feature.

    Tested Meets RoboSimian, NASA JPL's Ape-Like Robot

    NASA JPL's RoboSimian stood out at the DARPA Robotics Challenge as one of the few non-humanoid robot designs. The use of four versatile limbs allows it to adapt to the test scenario in ways that would be difficult for a bipedal robot. We chat with Katie Byl of the UC Santa Barbara Robotics Lab, whose team programmed RoboSimian, to learn about the advantages of a quadruped design and how RoboSimian may be utilized in complex environments, whether underground or even in space!

    How To Get Into Hobby RC: Telemetry Systems

    One of the fundamental challenges of flying RC aircraft is that you are separated from the machine you are controlling. You must assess the health and status of your vehicle from a distance using only limited visual and aural cues – rarely an easy thing to do. Sometimes the first symptom of a failing system is a trail of smoke that inevitably leads to the ground.

    RC telemetry systems provide the means to accurately gauge certain parameters of your model during flight. Think of it as a remote dashboard. Do you want to know how hot your motor is running? How about an alarm that can warn you when your model reaches an altitude of 400 feet? Telemetry devices can provide those things and more.

    What Telemetry Requires

    There are several different ways to receive telemetry data. Some telemetry systems are standalone units with a transmitter/sensor package in the model and a receiver on the ground. For FPV flyers, On-Screen-Display devices take the data from onboard sensors and overlay it on the real-time video feed. The result is something like a heads-up display found in many modern full-scale aircraft. An increasingly popular form of telemetry system is the type integrated into the model's radio system. The pilot's handheld transmitter sends flight commands to the aircraft while also receiving downlinked data. The same onboard receiver that interprets commands also transmits telemetry data. In this way, both the transmitter and receiver are actually transceivers.
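
    To make that concrete, here's a minimal ground-side sketch in Python. The field names, units, and alarm thresholds are made up for illustration--no manufacturer's actual telemetry protocol looks like this--but the job is the same: read the downlinked sensor values, compare them against your alarm limits, and warn the pilot.

        # Hypothetical ground-side telemetry check. Field names, units, and
        # thresholds are illustrative, not any vendor's actual protocol.

        ALARMS = {
            "altitude_ft":  lambda v: v > 400,    # warn above 400 feet
            "motor_temp_f": lambda v: v > 180,    # warn if the motor runs hot
            "pack_voltage": lambda v: v < 11.1,   # warn on a sagging 3S LiPo
        }

        def check_telemetry(frame):
            """Return the alarm messages triggered by one downlinked frame (a dict)."""
            warnings = []
            for field, tripped in ALARMS.items():
                value = frame.get(field)
                if value is not None and tripped(value):
                    warnings.append(f"ALARM: {field} = {value}")
            return warnings

        sample = {"altitude_ft": 412, "motor_temp_f": 150, "pack_voltage": 11.8}
        for msg in check_telemetry(sample):
            print(msg)  # a real system would vibrate the transmitter or speak the alert

    In practice that logic lives in the transmitter or OSD firmware; you just set the thresholds from a menu.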

    Telemetry data can be viewed on the transmitter's screen, but you'll want to use the tactile and aural feedback options when flying.

    The majority of radio manufacturers offer telemetry-capable systems in their lineups. The example that I've chosen to highlight in this guide comes from Futaba. As of this writing, there are three Futaba aircraft transmitters that are telemetry-capable (10J, 14SG, and 18MZ) as well as a handful of receivers. In these systems, the telemetry features are embedded in the S.Bus2 circuitry of the components. That nuance warrants a brief explanation of S.Bus2.

    HARV: Telepresence Camera System with Head-Tracking

    Low-latency telepresence camera systems with head-tracking allow users to look around environments in near-real-time while wearing headsets like an Oculus development kit. We put on Telefactor Robotics' HARV remote vision system and chatted with CEO Martha Jane Chatten about the use of motorized gimbal systems for immersive telepresence.

    Meet the Inflatable Soft Robots of Pneubotics

    The inflatable robot of Big Hero 6 was based on real soft robotics research, like the work being done at startup Pneubotics. We chat with Pneubotics CEO Kevin Albert to learn how robots can be designed and built with lightweight and flexible skins that have impressive dexterity and structural strength.

    In Brief: The Rise and Fall of Virtuality

    The Kernel has a good feature documenting the story of Virtuality, the commercial VR game company that dominated arcades in the early 90s. There are some interesting lessons here about the enduring appeal of virtual reality, what early adopters found compelling about Virtuality's experiences, and how unrealistic expectations led to its downfall. Two decades later, I think a lot of VR enthusiasts believe that the current wave of consumer VR hardware is destined to succeed. But the truth is that it's still a fragile technology that has a lot of hurdles to overcome to break into the mainstream the same way that smartphones have done.

    The Talking Room: Adam Savage Interviews Astro Teller

    Adam Savage welcomes Astro Teller to The Talking Room! Astro is Google's 'Captain of Moonshots', directing the Google X lab where self-driving cars, smart contact lenses, and other futuristic projects are conceived and made real. Adam sat down with Astro at the Tested Live Show this past October to chat about the benefits of thinking big and failing quickly.

    Testing Samsung Gear VR for Galaxy S6 Game Demos

    While the consumer Oculus Rift won't be out until next year, developers and early adopters can still playtest virtual reality games with the Samsung Gear VR Innovator Edition headset. We test the new headset made for the Galaxy S6 smartphone--with its high-density 577 PPI display--and demo some of the winners of the recent Mobile VR Jam contest!

    Hands-On: PlayStation Project Morpheus Games at E3 2015

    Our latest hands-on with Project Morpheus is all about the games. We chat with PlayStation's Richard Marks about the gameplay experiences being developed for Project Morpheus and how virtual reality in the living room can differentiate itself from VR on the desktop. Plus, lots of actual game demos!

    In Brief: How the Modern Laptop was Made

    Here's some morning reading for ya: Ars Technica UK's Sebastian Anthony chronicles the technological advances that allow computer makers to design and build today's laptops. It's an informative feature that tracks how process technology (as guided by Moore's law) and battery chemistry grew up together, bonded by new manufacturing technologies like CNC machines. Great stuff.

    Hands-On: Microsoft HoloLens Project X-Ray

    Norm gets his first demo of Microsoft's HoloLens augmented reality headset! At this year's E3, we went behind closed doors to playtest Project X-Ray, a "mixed reality" first-person shooter demo using HoloLens. Microsoft wouldn't let us film or take photos inside the room, so we describe and evaluate the experience after the demo.

    Hands-On: StarVR Virtual Reality Headset

    A new challenger appears! Starbreeze Studios surprised us by announcing the StarVR headset at this year's E3, along with a Walking Dead demo. We go hands-on with this virtual reality prototype that boasts a wide field-of-view and positionally tracked accessories. Even though this isn't a consumer-ready product, there's a lot to learn from its design decisions.

    Hands-On: Oculus Rift CV1 + Oculus Touch Controller at E3 2015

    This is it: The Oculus Rift specs are finalized, and we go hands-on with the engineering sample of the consumer virtual reality headset at this year's E3. We also demo the new Half Moon prototypes of the Oculus Touch controller in an amazing multi-player VR demo. Oculus' Nate Mitchell and Palmer Luckey answer our questions about the headset and controllers, and we share our impressions of the hardware and game demos. It's happening!

    Starbreeze's Project StarVR Headset Offers Ultra-Wide FOV

    Another bit of catch-up. Game developer Starbreeze Studios unveiled their own virtual reality headset, based on the work acquired from VR startup InfiniteEye. The headset--StarVR--boasts an ultra-wide 210-degree field of view with the use of two 5.5-inch 2560x1440 displays. It's essentially a 5K VR display, and it uses Fresnel lenses for optics and fiducial-marker-based positional tracking (e.g. QR codes). They've partnered with Skybound for a Walking Dead demo, which we're hoping to get to try at E3 tomorrow.

    Oculus "Step Into the Rift" Event Recap and Analysis

    Today, at Oculus' pre-E3 press event, we were present for the announcement of the Oculus Rift consumer headset and the reveal of the Oculus Touch virtual reality controller. Finally, a VR-specific input solution from Oculus! We discuss all the hardware and software news and share our thoughts, analysis, and expectations leading up to hands-on demos at E3.

    In Brief: Using Ambient Wi-Fi to Power Devices

    This is neat: researchers at the University of Washington have demonstrated how to use a router's Wi-Fi signal to potentially power devices up to seven meters away. As Technology Review reports, the team found that they could configure a router to broadcast an ambient signal (one that wouldn't affect data rates) to send a small amount of continuous power to a remote energy-harvesting sensor. That sensor--equipped with a low-leakage capacitor--would activate and power a device like a small camera once it's charged with enough voltage. In tests, the Wi-Fi signal carried enough energy to trigger a black-and-white VGA camera capture every 35 minutes, and even charge a Jawbone Up device 41% in under three hours.
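
    For a rough sense of scale, here's the back-of-the-envelope math behind that 35-minute capture interval. The energy cost of a single VGA capture is an assumption on my part, not a number from the paper:

        # Back-of-the-envelope math implied by the 35-minute capture interval.
        # The energy cost per VGA capture is an assumption for illustration only.

        energy_per_capture_j = 10e-3   # ~10 millijoules per black-and-white VGA frame (assumed)
        interval_s = 35 * 60           # one capture every 35 minutes (from the article)

        avg_power_w = energy_per_capture_j / interval_s  # P = E / t, ignoring conversion losses
        print(f"Implied average harvested power: {avg_power_w * 1e6:.1f} microwatts")
        # -> roughly 5 microwatts, which is why a low-leakage capacitor has to
        #    accumulate charge between shots instead of powering the camera directly.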

    Testing Hobby RC: Spektrum’s Active Vehicle Control System

    My previous discussions of artificial stabilization systems have focused on those used in airborne models. There are also systems available for RC cars and trucks. This time around, I'll share my experiments with one of those units: Spektrum's Active Vehicle Control (AVC) system.

    As with all of the other stabilization products I've reviewed, AVC relies on a set of sensors that detect unwanted movements in the model. It then sends commands to the controls (in this case: steering and throttle) to counteract those movements. The end result should be that your model feels less affected by outside elements and more in tune with your control inputs.
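
    Conceptually, that's a gain-scaled feedback loop. Here's a minimal Python sketch of the idea--the sensor values, gain, and sign conventions are my own illustration, not Spektrum's firmware:

        # Conceptual gain-based stabilization correction. The readings, gain, and
        # servo range are illustrative; Spektrum does not publish its control code.

        def stabilized_steering(pilot_steering, measured_yaw_rate, commanded_yaw_rate,
                                steering_gain=0.5):
            """Blend the pilot's steering with a correction that opposes un-commanded yaw.
            All signals are normalized to -1.0 .. 1.0."""
            unwanted_yaw = measured_yaw_rate - commanded_yaw_rate  # what the gyro sees vs. what was asked for
            correction = -steering_gain * unwanted_yaw             # steer against the unwanted rotation
            return max(-1.0, min(1.0, pilot_steering + correction))

        # The pilot is steering straight (0.0), but the truck starts to spin left
        # on loose dirt; the system counter-steers to the right.
        print(stabilized_steering(pilot_steering=0.0,
                                  measured_yaw_rate=-0.4,
                                  commanded_yaw_rate=0.0))  # -> 0.2

    The gain knob on the transmitter effectively scales that correction: higher gain fights disturbances harder, but can start to feel artificial.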

    The Equipment

    The components necessary for AVC are integrated into the transmitter and receiver of several different Spektrum pistol-grip radio systems. The simplest way to obtain AVC is to buy one of the AVC-equipped ready-to-run vehicles from Losi, Vaterra, or ECX. The other option is to install an AVC capable Spektrum radio system into just about any car or truck you choose. To test AVC, I chose the latter route and retrofitted my ECX Ruckus with a new radio system.

    The Ruckus was originally equipped with a 2-channel Spektrum DX2E radio – a simple, but reliable system. This radio was replaced by a 4-channel DX4C. Two of the channels operate throttle and steering control, while the remaining two channels are used to adjust the throttle and steering gains of AVC.

    The vehicles that include AVC still come equipped with the DX2E. It appears, however, that the radio has been upgraded and includes a different receiver to accommodate AVC. The primary difference I see is that the DX2E has a single gain adjustment that affects steering and throttle gains simultaneously. The DX4C (as well as the higher-end DX4R and DX4S) allows for individual gain control.

    By swapping the stock radio system with a Spektrum DX4C transmitter and SRS4210 receiver, I was able to add AVC to my well-used Ruckus monster truck.

    The DX4C is a computer transmitter and can store up to 20 different model profiles. So you can add AVC to multiple vehicles by using the same transmitter and equipping each model with an SRS4210 receiver.

    AVC places a high demand on a vehicle's steering servo because it is constantly making countless, imperceptible movements to keep the car on track. For that reason, Spektrum advises using a digital servo for steering chores. The Ruckus includes an analog servo, so I swapped it with a Spektrum S6100 unit. With steel gears, ball bearings, 5 times the torque of the stock servo (208 oz-in vs 41.7 oz-in), and nearly double the speed (.13-second transit time vs .23 second), the S6100 is overkill in this application. But it's nice to know that I won't have to worry about it.

    Nvidia's Plan to Improve VR Rendering Performance

    Today's announcement of the GeForce GTX 980 Ti graphics card is great news if you're eyeing high-end GPU hardware, but Nvidia also made some software announcements that may be even more exciting. Specifically, they revealed more about their Gameworks VR initiative. Gameworks is Nvidia's proprietary set of APIs and code libraries that game developers can tap into for graphics optimizations and effects. It's effectively middleware, like Havok physics, SpeedTree, and Nvidia's own PhysX. Developers and game engine makers can license and incorporate this software to make their games, and users who have compatible hardware are able to take advantage of those features. Optimizing for GPU-specific middleware has raised some controversy among hardware and developer communities, and we're still waiting to see how this will affect gamers in the long run.

    But politics aside, the features laid out in Gameworks VR warrant your attention if you're someone who plans on adopting the first consumer VR headsets coming out over the next year. One thing that's clear is that good VR will require more graphics processing power than is expected for desktop PC gaming, and the GPUs initially required for a "full" virtual reality experience are high-end parts. The price and performance of GPUs could become a bottleneck to widespread VR adoption and headset hardware iteration. Oculus and Valve have to spec out their headsets based on the constraints of available graphics hardware and rendering software.

    Gameworks VR features like VR SLI improve performance through the brute-force method of adding more rendering hardware. Direct Mode is a necessary feature for improving PC compatibility with VR headsets and making gaming more seamless. Front-Buffer Rendering and Asynchronous Time-Warp optimizations work to reduce latency, but don't make VR rendering any less taxing for the GPU. The feature that actually does that is something Nvidia calls Multi-Res Shading (MRS). MRS is a new way of telling the GPU how to render game scenes that takes into account how those pixels are actually shown and seen through VR goggles. Instead of rendering the pixels of a frame for display on a rectangular screen, MRS optimizes rendering to leave out pixels you'll never notice. It's one of those "why didn't someone think of this before" ideas that makes total sense--my demo left me very impressed with its implementation. Here's how it works.
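
    As a rough illustration of the concept (my own numbers, not Nvidia's actual API or parameters), think of splitting each eye's viewport into a 3x3 grid and shading the outer cells--which the lens optics compress anyway--at reduced resolution:

        # Rough illustration of the multi-resolution idea: a 3x3 viewport split where
        # only the center cell is shaded at native resolution. Grid fractions, scale
        # factors, and the render-target size are assumptions, not Nvidia's values.

        def multires_pixel_count(width, height, center_frac=0.6, edge_scale=0.5):
            """Compare full-resolution shading to a 3x3 multi-res split."""
            full = width * height
            center = (width * center_frac) * (height * center_frac)  # native-res center cell
            edges = (full - center) * edge_scale ** 2                # 8 outer cells, downscaled
            return full, center + edges

        full, reduced = multires_pixel_count(1512, 1680)  # per-eye render target (assumed)
        print(f"Full render:      {full / 1e6:.2f} Mpixels")
        print(f"Multi-res render: {reduced / 1e6:.2f} Mpixels "
              f"({100 * (1 - reduced / full):.0f}% fewer pixels shaded)")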