    Research Robots Versus the Volcano

    The last time NASA scientists sent a robot into the crater of a volcano was 1994.

    Its name was Dante II: an autonomous, eight-legged crawler packed with video cameras, lasers, and other sensors. It was designed by scientists from Carnegie Mellon University’s Robotics Institute to rappel and hobble down the inside of the active Alaskan volcano Mount Spurr – a proof of concept for the kinds of hostile environments that NASA robots might face in space.

    Photo credit: Phil Hontalas/NASA

    But a tumble towards the end of Dante’s mission and the subsequent helicopter rescue offered a stark reminder that “the possibility of catastrophic failure is very real in severe terrain,” the robot’s designers wrote. Even with today’s technology – we have self-driving cars now! – there hasn’t been another Dante since.

    “To get a robot to go over the varied and often difficult terrain is very challenging. Robotics has come a long way since Dante, but […] it’s just not quite at the level where they can handle volcanic terrain yet,” explained Carolyn Parcheta, a volcanologist and NASA postdoctoral fellow sponsored by Tennessee’s Oak Ridge Associated Universities. It’s part of the reason that the U.S. Geological Survey still believes that “experienced volcanologists are a better and more cost-effective alternative for monitoring dangerous volcanoes” than robots – at least, for now.

    In a volcanic environment, there are myriad materials of different sizes and shapes. You’ll find small round rocks where each step is like walking on the shifting sands of a beach. On the more extreme end of the spectrum is lava that’s sharp and jagged, making it near impossible to find space both flat and wide enough for a human foot. You’re always walking at an angle. In the middle, you have what Parcheta describes as “the slow, oozing, ropy looking stuff” that’s still difficult to walk on, but less so than the jagged stuff.

    Photo credit: Phil Hontalas/NASA

    “Volcanic terrain is much more complicated than just a set of stairs or an inclined slope, because it’s often all those different things combined,” Parcheta explains. “There’s no regular pattern to the landscape. It feels random. And to the robot it will be random. It needs to learn how to assess that before it can take its steps, and humans do this on the fly, naturally.” This is, as you might expect, difficult – and one of the big problems that Dante’s designers had. So, for years, the job has been left to humans instead.

    But there’s also another reason that volcano-crawling robots haven’t exactly been subject to pressing demand. According to Dr. Peter Cervelli, associate director for science and technology at the USGS Volcano Science Center, his agency has had “limited need for ground-based robotics” – in large part because the majority of volcanoes in the United States don’t presently pose a threat to human volcanologists.

    Tested Asks: How are Holograms Made?

    While in New York, Norm stops by Holographic Studios, one of the last remaining independent holography galleries and studios still operating. Its founder, Jason Sapan, has spent almost 40 years practicing the art of holographic imagery. We figure he's the best person to explain what exactly a hologram is, and how they're painstakingly made.

    NYCC: Triforce's Video Game Replica Props

    We've met and worked with independent replica prop makers who specialize in video game props, but here's a company working directly with game developers to bring digital characters, armor, and weapons to reality. At New York Comic Con, we stopped by Triforce's booth to check out their newest scale statues and full-size replicas, as well as learn about their production process.

    Bits to Atoms: 3D Printing Quicksilver's Stereobelt

    Remember a few months ago when I spent time obsessing over Quicksilver’s audio gear from X-men: Days of Future Past? I thought that exploration was enough to get it out of my system--until my friend Hadley told me that she would be cosplaying as Quicksilver for New York Comic Con. Without missing a beat, I proclaimed that I had to build her an accurate Stereobelt prop. And so my obsession began anew.

    Prototype - designing multiple parts - more work for better results

    To recap: the Stereobelt is a little-known predecessor to the Walkman, predating Sony's portable cassette player by seven years and cobbled together from existing tech. Only one picture and a patent document of it can be found in all of the interwebs, yet the savvy production designers on Days of Future Past based Quicksilver’s unit on the Stereobelt, thereby giving him plausible audio gear for 1973.

    Setting out to create my own Stereobelt, I ran into an immediate problem: a lack of good reference material. Other than the magazine cover of Quicksilver, which showed only one side of the belt, I was unable to find any good reference for the other side or back. At this point, the Blu-ray hadn’t been released, and unlike every other Marvel movie, there was no “making-of” book. So I started work on what I had reference for, figuring I might have to improvise the opposite side and revise it once I could get ahold of the movie. I didn’t have a lot of time to build the Stereobelt, so my original intention was to keep it simple and print it as one solid piece. The front and back caps would cause some print issues, though, since they were both tapered and would need supports to print as one piece. The caps would also print better with their slopes oriented upwards, so I decided to compromise: print the body and caps separately and assemble them using simple square pins and glue.

    Solid body with caps connected via pins.

    Unlike the Hellboy Millenbaugh Motivator, for which I took meticulous measurements using Photoshop, I totally eyeballed the size and proportions of the Stereobelt on paper. Once it looked right, I started building in 3D and quickly realized another issue - if I built this as one piece, painting and finishing would be difficult since it had a lot of trim pieces. I also liked the idea of being able to print this out in two colors, assemble with no painting and still have it look good, so I decided to break it up into more pieces.

    Photo Gallery: Behind the Scenes at Immortal Masks

    During our visit to Immortal Masks, we not only got a chance to learn about their entire sculpting and production process, but also to check out their full line of creature masks. Sculptor Andrew Freeman is always working on new mask designs, and their team of artists can create variations on a sculpt with unique paint applications. Lifelike Bebop and Rocksteady masks caught my eye, but my favorite has to be the Ogre mask. Which one do you think is the creepiest?

    How Lifelike FX Creature Masks are Made

    Halloween's coming up, and we're looking for the best ways to transform into a terrifying creature of the night. Monster masks have been a longstanding horror effects tradition, and today's masks are more lifelike than ever. We visit the workshop of Immortal Masks to learn how the artists there sculpt, mold, cast, and paint amazing silicone masks that look and move realistically.

    How Much Longer Will Directors Keep Shooting Film?

    Ten years ago, Richard Crudo, the current president of the ASC (American Society of Cinematographers), went through a deep depression, and he’s certain most cinematographers went through the same. The reason? Film was coming to an end, and trying to stop celluloid’s demise would be like trying to hold back a tidal wave.

    Before becoming the president of the ASC, Crudo worked as a cinematographer for many years (American Pie). “I was born, raised and worked in the film era, and I still think it represents the gold standard of visual imaging,” Crudo says. “However, one must be realistic. Film is essentially dead. And to try and keep it going on some rarefied level is certainly admirable, but it really has no application to the rest of us. Clearly we live in a digital world and it’s going to be a digital future. You shoot film, where do you get it processed these days? So many labs have closed. And a laboratory can’t be a boutique operation and be expected to operate with any level of efficiency or perfection.”

    For a long time now, the writing’s been on the wall for cameramen and film fans alike. Once esteemed cinematographers like Roger Deakins (The Shawshank Redemption) and Vilmos Zsigmond (Close Encounters, The Deer Hunter), as well as directors like Martin Scorsese (GoodFellas), made the switch to digital, you knew it was all over. Then again, you also knew that as long as directors like Steven Spielberg, Christopher Nolan, and JJ Abrams kept working, film would still be around, at least for a little while.

    In fact, this summer Nolan and Abrams helped keep Kodak in business. The CEO of Kodak, Jeff Clarke, said in a statement, “After extensive discussions with filmmakers, leading studios and others who recognize the unique artistic and archival qualities of film, we intend to continue production. Kodak thanks these industry leaders for their support and ingenuity in finding a way to extend the life of film.”

    Still, cinematographers today are long past the denial and anger phases of celluloid’s death, and they’re now in the acceptance phase. “The digital image that we see today is as bad as we’ll ever see, and it’s only getting better all the time,” Crudo says. “There’s a couple of developments around the corner that I think are going to cause it to exceed film. This is coming from a person who never would have dreamed he would be talking like this ten years ago. If I heard myself talking this way back then, I’d have chopped my own head off!”

    Roger Deakins, shooting Skyfall.

    “The scale has tipped 180 degrees from what it used to be,” Crudo continues. “Early on, you’d be very suspect about shooting digitally, and you wanted to shoot film because it was the established standard. Today I’d be very dodgy about shooting film vs shooting digital.” But it's not just the manufacture and use of film stock that needs to be maintained. There's another side to the equation.

    The New York Comic Con 2014 Cosplay Gallery (475+ Photos)

    This was my first time going to New York Comic Con, and what a year to attend. Attendance approached (and possibly even surpassed) that of San Diego, and I had a ton of fun exploring a new convention venue and figuring it out from a photography point-of-view. The massive multi-floor lobby of the Javits Center--lined from floor to ceiling with glass--made for great daytime photos with cool architecture and signage in the background. The show floor's bright red carpeting was a little less accommodating. But in the two days I was there, I managed to get a few good photos to share with you. Thanks to everyone who stopped for a photo--and if you find yourself in this gallery, email me at norman@tested.com with "NYCC Cosplay" in the subject line and I'll get you a full-res copy of your pic!

    NECA Toys' Alien Nostromo Spacesuit Figure

    One of the coolest things I saw at New York Comic Con was NECA Toys' new Nostromo Spacesuit figures from their Alien toy line. (It's the costume Adam wore this summer at Comic-Con!) We take the fragile sculpt and paint master for this figure out of the display case and scrutinize its details to examine what NECA's artists got right. This is a gorgeous piece!

    Making Tested's Blockhead Puppets!

    Jamie, Adam, Will, and Norm get transformed into adorable plush puppets! Stage prop fabricator and fan of the site Sean Harrington made these awesome blockhead puppets for us, utilizing some cool modern technology to streamline the design and build process. Sean walks us through how the puppets were 3D modeled and then prototyped with a laser cutter, allowing him to iterate on the design to change the look and puppeteering comfort of these flapping blockheads. We're smitten!

    Fuel-Free Flight: The State of Electric Airplanes

    When the all-electric E-Fan made its first flight earlier this year, it signaled a breakthrough in the progress of electric aircraft. Although its performance compares well to other contemporary electric designs, the E-Fan does not represent any major technological leaps. More significant is the company behind the E-Fan: Airbus, the European firm better known for producing large airliners. Airbus is a large, multinational company that is deeply entrenched in the business of burning fossil fuels. That such an establishment is willing to invest in the development and production of pure electric and hybrid aircraft is a strong signal that technology may be on the verge of allowing practical electric aircraft for the masses. Much smaller aviation firms and innovative individuals have been shouting that message for years--it’s just that (almost) nobody was listening.

    The Airbus E-Fan represents an important milestone in electric flight: a commitment to research and production by a major player in aviation. (Photo credit: Julian Herzog via Wikimedia Commons)

    Why Electric?

    In answering the question of why electric propulsion should even be considered for aircraft, you must look at environmental and engineering aspects. On the environmental front, the obvious benefit of electric power is the lack of CO2 emissions. In fact, very strict European emission standards were the catalyst for Airbus’ development of the E-Fan, a stepping stone to their planned hybrid-powered regional commuter aircraft.

    Even if you trace the energy path of an electric-powered aircraft back to a coal-fueled power station feeding the ground-based charger for the airplane’s batteries, the comparative emissions are a tremendous improvement over the exhaust of a kerosene-burning turbine engine. The same is true of hybrid electric systems that would use a small onboard turbine or internal combustion (IC) engine to recharge batteries in flight.

    Without the vibrations inherent in internal combustion engines, an electric aircraft can be built with a lighter and simpler airframe.

    Another environmental benefit of electric aircraft is their lack of noise. How often do you hear about neighborhoods being built on cheap land near a long-established airport, only to have the new residents complain about engine noise? It doesn’t make much sense, but these squeaky wheels are frequently successful in having the airport closed. According to Dr. Brien Seeley, a representative for the CAFE Foundation (the Comparative Aircraft Flight Efficiency Foundation, a volunteer organization renowned for its efforts in measuring and improving the efficiency of small aircraft), noise reduction is the primary motivator for many who are developing and improving electric powerplants for aircraft. He says, “The most significant distinguishing feature of electrically powered aircraft will be their prospects for unprecedentedly low noise and the new operational opportunities that will open when combined with extremely short takeoff and landing (ESTOL).” Perhaps quiet airplanes that are able to operate from short runways will be the key to reuniting general aviation (GA – i.e. your average privately-owned Cessna or Piper) and a noise-intolerant public.

    From an engineering perspective, electric propulsion seems to offer several benefits over IC engines. One of the primary advantages is the vibration-free operation of electric motors. A substantial amount of the structure in GA aircraft is dedicated to absorbing the forces caused by having one or more IC engines attached. Just look at the structure of a glider compared to that of an IC-powered airplane and you will see what I mean. Without the vibrations inherent in IC engines, an electric aircraft can be built with a lighter and simpler airframe. When it comes to airplanes, lighter is almost always better.

    Turning Tiny Satellites into Cheap, Deep Space Drones

    There are lots of tiny little satellites orbiting the Earth above your head right now. But that’s all they do: orbit, around and around. There is a plan, however, to give these cheap, so-called CubeSats the ability to strike out on their own. With the aid of some relatively simple propulsion technology, the goal is to push these tiny satellites beyond Earth’s gravitational pull and into the outer reaches of space.

    The idea is that, in the not so distant future, unmanned space exploration will be accessible to everyone, and not just the NASAs of the world – like tiny little drones in space.

    Image credit: University of Michigan

    Key to all this is little more than water. Using an electrolysis propulsion system, researchers from Cornell University have been working since 2009 on a system that splits water into hydrogen and oxygen gas that can then be ignited to create thrust. The plan is to launch two of these water-propelled CubeSats into space, and send them orbiting around the moon. Another CubeSat propulsion project, underway at the University of Michigan, raised money through a successful crowdfunding campaign.

    “It kind of levels the playing field for a lot of science inquiry. Not everybody is capable of running a billion dollar spacecraft mission for NASA,” explained Mason Peck, former chief technology officer for NASA, who is now working with fellow researcher Rodrigo A. Zeledon at Cornell on the electrolysis propulsion system. “This actually democratizes access to space.”

    Compared to, say, a communications or military satellite, CubeSats are practically microscopic – mere 10cm cubes, according to the specification first defined in 1999, that have a volume of just 1 liter and can weigh no more than 1.33 kilograms. But, surprisingly, it’s not size that’s held CubeSat propulsion efforts back.

    It's not the CubeSat's small size--10cm--that has held propulsion efforts back.

    “It’s primarily the fact that CubeSats are secondary payload,” Peck explained. “They’re hitching a ride on some other spacecraft, and that other spacecraft does not want the little CubeSat to destroy its expensive payload. So for that reason, the CubeSat specification that allows you to launch these as secondary payloads prohibits you from using material under pressure, or material that’s explosive, or material that’s volatile, in the sense that if it leaks out it would evaporate and poke the surfaces of the spacecraft.”

    But water, explains Peck, is not only non-volatile, it’s “pretty much the ultimate green propellant.” It sits in a tank, gets zapped by an electrolyzer, which separates the hydrogen and oxygen, and is then sent to a combustion chamber until enough pressure builds up to ignite the whole thing. Safe and simple! In theory.
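
    For a ballpark sense of what a system like this buys such a tiny spacecraft, the classical rocket equation is enough. The sketch below is purely illustrative: the specific impulse and water load are my own assumptions, not figures from the Cornell or Michigan teams; only the 1.33-kilogram mass limit comes from the CubeSat spec mentioned above.

```java
// Rough delta-v estimate for a water-electrolysis CubeSat thruster.
// All inputs below are illustrative assumptions, not published mission figures.
public class CubeSatDeltaV {
    public static void main(String[] args) {
        double g0 = 9.80665;     // standard gravity, m/s^2
        double isp = 300.0;      // assumed specific impulse for small-scale H2/O2 combustion, s
        double wetMass = 1.33;   // kg -- the maximum mass allowed by the CubeSat spec
        double waterMass = 0.20; // kg of water carried as propellant (assumption)

        double dryMass = wetMass - waterMass;

        // Tsiolkovsky rocket equation: dv = Isp * g0 * ln(m_wet / m_dry)
        double deltaV = isp * g0 * Math.log(wetMass / dryMass);

        System.out.printf("Estimated delta-v: about %.0f m/s%n", deltaV);
    }
}
```

    Even a few hundred meters per second of delta-v is a meaningful budget for a satellite that otherwise has no propulsion at all.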

    Appreciating the Art of Film Editing

    Years ago, I interviewed a number of film editors, which was a fascinating experience for me. You can learn a lot about the storytelling process from editors; they're in charge of one of the most important and under-appreciated aspects of filmmaking: choosing not only what shots to leave in, but what to leave out. The collaboration between director and editor on a movie is crucial, because having complete freedom with no outside guidance can ruin a film just as much as having no freedom at all.

    Over the history of cinema, film editing went from physically cutting celluloid on flatbed Moviolas to editing digitally on Avid machines, but the most important tools in an editor’s arsenal have always been the same: timing, instinct, patience, and personal chemistry.

    Photo credit: Flickr user ahhdrjones via Creative Commons

    Steven Kemper’s area of expertise in the editing room is in the action genre. He has cut a number of films for John Woo, including Face/Off and Mission: Impossible 2. Woo’s action sequences are tight and well constructed, yet surprisingly Kemper says Woo gives his editors “tons of leeway” in the cutting room. Woo storyboards his action sequences, “but very often he wings it on the set if he doesn’t get a shot, a shot isn’t working out the way he hoped or he ran out of time. None of the scenes look like the storyboards when you’re done, but you do get an idea of what he’s going for, there are focus points in the sequence that we make sure to hold on to. You end up doing much more than John originally intended. That’s what I really enjoyed about working with him, is he’s totally open to stuff.”

    Working on a John Woo film, the editor has many options open to him, considering Woo has multiple cameras rolling during an action scene – sometimes as many as 16 shooting all at once. Woo’s action sequences are famous for deftly blending together numerous camera angles and speeds, which breaks the monotony of typical action editing. “A lot of movies I see today, it seems gratuitous that they go to slow motion in certain spots,” says Kemper. “One of the things I worked particularly hard on, on all of Woo’s pictures, is to carefully meld the over-cranked, under-cranked, and normal speed material. If you catch it at the right action, it’s almost seamless. It’s almost like you haven’t realized for a beat that you’ve gone from slow motion right back to a 24-frame shot. I found it not only challenging, but a heck of a lot of fun.”
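
    For anyone unfamiliar with the cranking jargon, the underlying relationship is simple arithmetic: footage shot at a higher frame rate than the 24 fps playback standard plays back slower than life, and footage shot at a lower rate plays back faster. A quick sketch of that relationship, with the camera speeds chosen arbitrarily:

```java
// Playback speed relative to real time when footage shot at various camera
// frame rates is conformed to a 24 fps timeline (example rates only).
public class CrankMath {
    public static void main(String[] args) {
        double playbackFps = 24.0;                       // standard projection rate
        double[] cameraFps = {12.0, 24.0, 48.0, 96.0};   // under-cranked, normal, over-cranked

        for (double shotFps : cameraFps) {
            // 0.5x = half speed (slow motion); 2.0x = double speed (fast motion)
            double speed = playbackFps / shotFps;
            System.out.printf("Shot at %5.1f fps -> plays at %.2fx real time%n", shotFps, speed);
        }
    }
}
```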

    Photo credit: Flickr user andrew_saliga via Creative Commons.

    In talking with Kemper, I learned that patience is one of the most important skills for an editor. In cutting the last forty minutes of Mission: Impossible 2, Kemper spent ten weeks--seven days a week, from seven in the morning to eleven at night--editing that portion of the film. For those forty minutes of screen time, he sifted through 12 to 15 hours of footage, which he cut down to what you see in the movie. “Woo shoots so much great stuff, to not sift through every frame is a crime!” Kemper says.

    Building a Custom Arcade Cabinet, Part 6

    With the frame of the arcade cabinet constructed, Norm and Wes head back to the garage to begin the wiring of the buttons and other electronics. In this episode, we discuss the different types of custom arcade controls, the hardware to link them all together, and the tiny computer we're going to build to run the software. (This video series was brought to you by Premium memberships on Tested. Learn more about how you can support us by joining the Tested Premium community!)

    My 10 Virtual Reality Takeaways from Oculus Connect

    I've had a few days now to digest all the information that came out of this past weekend's Oculus Connect conference. It may have only been a two-day developer conference, but the keynotes alone had enough information to expand the imaginations (and lexicon) of virtual reality enthusiasts. There was of course the big Crescent Bay prototype announcement and demo, which Oculus unfortunately said that it has no plans to release or show anywhere else. It was also my first time being able to try the Samsung Gear VR and Oculus' current VR UI solution in Oculus Home and the Cinema application. My mind's been buzzing since I got back from LA, and I wanted to distill some of my personal takeaways from the experience.

    Presence is NOT the same as reality

    More than at any Oculus event or meeting I had attended before, the Oculus team emphasized the idea of presence--a significant milestone in virtual reality technology. It's the threshold past which your brain's subconscious computing starts to take over and makes you believe that you're in a separate space within a VR headset. Presence was emphasized because the team thinks they've achieved it for most people with the Crescent Bay prototype. The 10-minute demo I had with Crescent Bay was leaps and bounds better than the DK2 experience, but I'm going to hold off on giving them the sustained presence checkbox until I can get more time with it. More importantly, we now know Oculus' definition of presence, and the specific technical requirements they're targeting for a consumer release (sub-millimeter tracking accuracy, sub-20ms latency, 90+Hz refresh, at least 1Kx1K per eye resolution, highly calibrated and wide FOV eyebox).
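
    Those targets translate directly into a frame-time budget: at 90 Hz there are only about 11 milliseconds per frame, and tracking, rendering, and display scan-out together have to stay under the 20 ms motion-to-photon ceiling. The quick check below uses only the figures quoted above; the way the pipeline is split into stages is my own illustration, not Oculus' breakdown.

```java
// Rough motion-to-photon budget check against the targets quoted above:
// 90+ Hz refresh and sub-20 ms latency. The stage split is an illustrative guess.
public class VrLatencyBudget {
    public static void main(String[] args) {
        double refreshHz = 90.0;
        double frameTimeMs = 1000.0 / refreshHz;   // ~11.1 ms available per rendered frame
        System.out.printf("Frame budget at %.0f Hz: %.1f ms%n", refreshHz, frameTimeMs);

        // Hypothetical split of the motion-to-photon pipeline.
        double trackingMs = 2.0;   // sensor read and pose estimation
        double renderMs   = 9.0;   // must fit inside the per-frame budget
        double displayMs  = 6.0;   // scan-out and pixel switching

        double totalMs = trackingMs + renderMs + displayMs;
        System.out.printf("Illustrative pipeline total: %.1f ms (target: under 20 ms)%n", totalMs);
    }
}
```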

    The reason I'm a little hesitant to say that I achieved full presence in Crescent Bay is that I really have no appropriate point of comparison for that sensation. The feeling of presence in a virtual space should not be confused with the feeling of reality. I think a lot of people will expect that once they put on something like Crescent Bay, what they see inside the headset feels exactly like the real world. That's not the case at all. It still looks very much like rendered game graphics, with aliased edges and a surreal feeling of disembodiment. To me, presence is about the feeling of space inside the headset--a sense that the virtual objects and environments you're looking at have volume and a distance from your eyes that's not just two inches away on a screen. Stereoscopy and proper mapping of your head movements are a huge part of that. Presence in these VR demos never takes away the awareness of the virtual nature of that space, but you do feel more a part of it.

    Standing in VR opens up possibilities

    The biggest question for me coming out of Oculus Connect was whether the consumer version of the Rift would be a sit-down-only experience. I know that Palmer told everyone in interviews that the Rift is meant to be used sitting down, but I agree with commenters that it may just be them working out a legally and ergonomically acceptable solution for a stand-up design. At least that's fun to think about. Regardless, the Crescent Bay demo confirmed that standing up in VR is technically possible with what Oculus has made so far, and that walking around isn't necessary for a stand-up VR experience (i.e. we don't need VR treadmills). The square mat we were allowed to walk around on in the demo was sufficient to show how effective positional tracking could be in a stand-up experience. Even the ability to shift your full body and weight around was extremely meaningful--being able to physically crouch and duck in the virtual space felt liberating in a way that I think will have a profound impact on VR game design. Spinning around a full 360 degrees was less important, or at least emphasized less in these demos.

    Of course, this setup would require more hardware, including a way to mount the positional tracking camera above the standing user, and a cable management system to keep the headset cable out of the way.

    We're Putting on a Live Stage Show!

    Update: Tickets are now open for everyone to purchase! You can buy them here. See you next month!

    Hey everyone! Will and I have something exciting to announce. On Saturday, Oct 25th, as part of the Bay Area Science Festival, we're putting on our very first live event at San Francisco's historic Castro Theatre. It's called Tested: The Show (remember that?) and it'll be an afternoon of presentations, demos, and conversations with some of our favorite people about the culture of making and technology's role in it.

    This is the first time we've done anything like this, so we thought a bit about how we could best present the type of stuff we do on Tested--showcasing awesome maker projects, geeking out about technology, 3D printing, and more--on stage to a live audience. We don't want to give away too much yet, but you're going to see familiar faces like Game Frame creator Jeremy Williams and The Zoidberg Project's Frank Ippolito show off what they're working on today. And just wait until you see what Jamie has to show.

    For those of you who can make it, we'd love for you to spend the afternoon with us on October 25th. We're opening ticket sales tonight to Tested Premium Members first (check your email for instructions!) and will be putting tickets on sale to the general public this Wednesday evening.

    We know that not all of you will be able to make it to San Francisco, so we're going to be recording the entire show and putting it up on the site (and on YouTube) as soon as we can after the event. This is the first time we're doing a live event of this kind, but we hope it won't be the last--we'd love to travel your way in the future. We're super excited to put this show on for you, and can't wait to hear what you think.

    If you have any questions, please email us directly at tips@tested.com or post in the comments. I've also included a show FAQ below. Hope to see you in October!

    Hands-On with Samsung Gear VR at Oculus Connect

    At Oculus Connect, Norm gets to try out the upcoming Gear VR virtual reality headset, a collaboration between Samsung and Oculus. It uses a Galaxy Note 4 for its brains and screen, with VR software and optimizations designed by John Carmack. Norm shares his opinion of display performance on the Note 4's 60Hz 1440p screen, and whether the phone's technology is sufficient for a good mobile virtual reality experience.

    Bits to Atoms: World Maker Faire 2014 Recap

    Hey everyone, Sean here with a World Maker Faire New York recap! I’ve been to every NYC Maker Faire and it keeps getting bigger. I’ve had a booth the last two years but was too busy to get one together this time. The upside was I actually got to see the convention and all the new 3D printers and accessories that were either just announced or being shown in person for the first time. Since Will and Norm were unable to make it this year, I wanted to share with you some of the projects and cool stuff I saw.

    Charlesworth Dynamics crew Maker Faire 2013

    Within seconds of setting foot in the 3D Printing Village (one of World Maker Faire’s biggest draws), I ran into Anthony Campusano, a fellow maker whom I’ve met numerous times and the builder of an amazing Lament Configuration box from Hellraiser.

    Makers: Sean - Anthony - Andreas

    The fellow with him enthusiastically exclaimed, “I follow you on Twitter!” and it turned out to be Andreas Ekberg, who made the Tested Cruiser skateboard! I didn’t realize it until later, but Andreas is also responsible for the Classic LEGO Spaceman print that has been on my to-buy list. I had a great time hanging out with my fellow makers. Now let’s take a look at some of the good stuff I saw.

    How Google Should Improve Android Wear to Fend Off Apple Watch

    Apple has been rumored to be working on a watch for a few years, but unlike all that speculation about a TV, the Apple Watch turns out to be real. It's really no surprise--the competition is getting into wearables in a big way. Android Wear and the Apple Watch are aimed at perfecting a growth area for mobile devices. Almost everyone who wants a smartphone or tablet already has one, but a smartwatch is something new.

    Android Wear has the early lead, but now that Apple has unveiled its approach to wearable computing, the heat is on Google to clean up Wear's rough edges--here's what I think needs to happen.

    It Doesn't Always Have to be About Voice

    Both Android Wear and the Apple Watch are big on voice interaction, but more so on the Android side of things. Apple has Siri, which is a capable voice-activated assistant, but Google Now and voice search on Android Wear are plugged into a giant matrix of information. Google is working hard to make natural language work on all devices running Google Search, and voice is a good interface method for Android Wear. However, Google might want to offer some more options in the next version of Wear.

    Android Wear's over-reliance on voice input is especially clear when you want to open an app on the watch. With a voice command, you just tell the watch to "start [app name]." If you aren't in a situation where talking at your wrist is possible, you have to open the search interface, tap once, scroll down, tap on 'start,' then scroll down to find and launch the app. That's more than a little ridiculous. There are apps like Wear Mini Launcher that make app launching much easier, but how many people will know to install that? Google ought to realize you can't always shout commands at your wrist.

    Similarly, Google allows you to respond to messages received on your phone via Android Wear. Voice replies are, of course, vastly superior to any keyboard you could ever cram into a 1.5-inch screen. However, Android Wear doesn't even have good support for quick replies. There are a few default options built into the OS, but if yes, no, ok, and a few others don't get the point across, you'll have to get the phone out or use voice. This would be a simple fix--custom quick replies could be added through the Android Wear phone app.
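
    For what it's worth, the hooks for canned responses already exist on the developer side: an app can attach its own list of tappable reply choices to a notification's voice-reply action through the support library's RemoteInput. What's missing is a user-facing way to customize them system-wide. Here's a rough sketch of that existing mechanism; the strings, icons, and PendingIntent are placeholders, not code from any real app:

```java
import android.app.PendingIntent;
import android.content.Context;
import android.support.v4.app.NotificationCompat;
import android.support.v4.app.NotificationManagerCompat;
import android.support.v4.app.RemoteInput;

// Sketch: supplying canned reply choices on a wearable notification action.
// Icons, strings, and the PendingIntent are placeholders for a real app.
public class WearReplyExample {
    private static final String EXTRA_VOICE_REPLY = "extra_voice_reply";

    void postNotification(Context context, PendingIntent replyIntent) {
        // The RemoteInput carries both the voice prompt and the tappable canned choices.
        RemoteInput remoteInput = new RemoteInput.Builder(EXTRA_VOICE_REPLY)
                .setLabel("Reply")
                .setChoices(new CharSequence[] {"Yes", "No", "On my way", "Talk later"})
                .build();

        NotificationCompat.Action replyAction = new NotificationCompat.Action.Builder(
                android.R.drawable.ic_menu_send, "Reply", replyIntent)
                .addRemoteInput(remoteInput)
                .build();

        NotificationCompat.Builder builder = new NotificationCompat.Builder(context)
                .setSmallIcon(android.R.drawable.ic_dialog_email)
                .setContentTitle("New message")
                .setContentText("Are we still on for tonight?")
                .extend(new NotificationCompat.WearableExtender().addAction(replyAction));

        NotificationManagerCompat.from(context).notify(1, builder.build());
    }
}
```

    On the watch, those choices show up as tappable options alongside the voice prompt, so the wearer can answer without dictating anything.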

    Building a Custom Arcade Cabinet, Part 5

    We're getting close! In this fifth episode of our custom arcade cabinet build, Norm and John tackle some mistakes made in the original plywood cutting and then work together to assemble the cabinet frame. The challenge of finding a way to mount the heavy CRT monitor inside the chassis requires some problem solving and precise measurements, but this thing is finally starting to look like a real cocktail cabinet! (This video series was brought to you by Premium memberships on Tested. Learn more about how you can support us by joining the Tested Premium community!)