Shoot the Sky: The Technology Invented for Astro Cinematography

By Norman Chan

Jay Nemeth is a pioneer in modern aerospace cinematography, inventing new ways to shoot video of high-altitude missions like Red Bull's Stratos project.

Jay Nemeth founded FlightLine Films in 1984 to help Hollywood productions get aerial footage from helicopters and fixed-wing aircraft. His company created motion-controlled, stabilized camera systems to shoot everything from slow landscape establishing shots to high-speed car chases and even high-G fighter jet flights. Nemeth's most recent job, though, was directing all the camerawork for Red Bull's Stratos mission, culminating in the amazing video we saw of Felix Baumgartner's jump in October. For that job, Nemeth had to utilize camera technologies that were previously the domain of NASA and the Air Force, for operations like Space Shuttle launches. I spoke with Jay about his work at FlightLine Films and how his company evolved from providing aerospace video services for movies and television to pioneering privatized astro-cinematography.

When you started FlightLine Films, what technical challenges did you feel were not being met by traditional film crews?

There were several companies that provided aerial camera services, but no one was providing astro-cinematography services because there was really no need. All spaceflights were being done by the government and the military: NASA and the Air Force. In 2004, I took my kids to see the first private spaceflight by Scaled Composites in Mojave, CA. I saw the space industry pendulum swinging from government to the private sector and realized that these people would need film crews that could work in zero gravity, and would need cameras and equipment that would work in that hostile environment.

Aerospace cinematography sounds fancy, but it's essentially just shooting photos and video high up in the air. How does that environment differ from shooting on the ground?

Working in zero-G is not easy, especially when you are trying to hand-hold a camera. You tend to just drift free and rotate uncontrollably. We have methods for mounting cameras and providing mobility for the operator to capture shots in ways other than just a fixed, locked-down camera position. We have done zero-gravity shoots, including one for Nintendo with Apollo 11 astronaut Buzz Aldrin. We also have space-rated cameras for the exteriors of spacecraft, and housings that allow motion picture cameras and digital cinematography cameras to work in the cold vacuum of space. We also have the only privately owned ground-based optical trackers with integrated control rooms for filming rocket launches and vehicle re-entry.

Shooting these missions must be a really memorable experience and totally different from what camera operators are used to.

Some of the most memorable missions were for customers that we aren't allowed to talk about, but of the ones we are, the two that stand out are STS-135, the final Space Shuttle launch, and the Red Bull Stratos mission. To see Felix Baumgartner falling from the capsule in the stark contrast of the shortwave infrared video, and clearly see his arms and legs from 24 miles away, was pretty impressive.

We're big gearheads, so let's talk about camera equipment. What cameras do you use, and what do you look for in camera technology when developing new rigs?

We use a variety of sensors and optics that are determined by the mission. The most powerful telescope we currently use is equivalent to an 8000mm lens. The shortwave infrared is great at making a stable image even though it's looking through distorted air. We use high-speed engineering cameras that can run up to several thousand frames per second, as well as digital cinematography cameras that can provide IMAX-quality images. Unlike the military, which must wait through bidding processes and procurement procedures that can take two years, we are able to get the most technologically advanced cameras as soon as they are available, and sometimes sooner, as we receive prototypes from manufacturers for testing.

How do the different environments on the ground, in the air, and in space determine what you're able to shoot and what equipment you can use?

Until now, ground-based optical tracking systems costing millions of dollars have only been available to the government and military. We offer these services to the private and commercial space sector using state-of-the-art equipment, at a fraction of the cost that it would take to get these services from the military ranges, if you could even pull that off. We call these trackers the JLAIRs. A lot of our crew members also work at the ranges operating the government equipment, and it's a kick to hear them say that our systems and way of doing things are superior.

Air systems are pretty standardized, and we have built a great reputation as one of the leaders in the field of helicopter and fixed-wing filming.

Space systems are a whole new ball game and require extensive testing in chambers that simulate the space environment. Components in some COTS equipment have to be changed; some devices have to be designed and built from the ground up. Our goal is to provide customers with a fast and affordable solution to imaging that is of the highest quality possible.

Can you talk a little bit about the development and creation of your JLAIR system?

It was during the planning of the Red Bull Stratos mission that we realized the Air Force was not going to cooperate by providing us with the trackers we needed. And the units we could get from one of the test ranges were prohibitively expensive. I decided then that it was time to build our own. Dennis Fisher, who spent 30 years as head of optics at Vandenberg AFB, assisted with the design of the JLAIRs, named in honor of the MLAIR he designed for the Air Force. Our tracking pedestals spent most of their lives tracking Space Shuttle launches at the Cape. We upgraded the electronics and installed state-of-the-art sensors and optics.

What are the JLAIR's capabilities, and what type of video is it best suited to shoot?

The JLAIRs can easily track suborbital spacecraft from launch to apogee and back down. They can clearly image the International Space Station as it passes overhead, track an F22 passing by at 90 degrees per second, and as we saw on Red Bull Stratos, easily resolve a man at 128,000 feet.

The JLAIR was adapted from a system NASA used. What improvements did your company have to make?

The pedestals, which are the servo-controlled mounts that hold the telescopes, spent most of their time filming Space Shuttle launches before we acquired them. The difference is that our system can relay any sensor's image anywhere, immediately, in high definition. We can uplink the video over satellite IP, and anyone with a password can watch the JLAIR video images live on their iPhone, iPad, or other mobile device. Providing this live situational awareness to flight controllers, broadcasters, and recovery teams downrange is a unique capability.

Whenever we can, we choose to have a human make the decisions about how images are captured. When it is not possible or practical, we design an automated system.

What are the differences and challenges between shooting video in zero-G and high-G?

Zero-G shooting is not taxing on today's solid-state equipment, but high-G can still be a problem, especially if vibrations are present, such as in a rocket launch. In high-G, the crew is usually strapped into something like a fighter cockpit. In zero-G, you really need to plan out the shots and expect that your subjects will probably not be where you expect them to be, unless they are holding on to something. The other factor is that, from a cinematic standpoint, some angles just don't look that impressive despite the fact that you really are experiencing weightlessness. Lens choices, camera angles, and camera movement weigh heavily on whether the shot "feels" weightless.

Recently, Red Bull Stratos got a lot of attention not only for Felix Baumgartner's stunt, but also for the way it was filmed and broadcast live. How did you get involved with that?

In September of 2008, I was working on Red Bull Rampage, an extreme mountain bike competition, doing aerial photography out of a helicopter. After a day of filming, I was having dinner with the producer, Derek Westerlund, and he asked what other projects I had going. I was reluctant to tell him about the new direction for my company, providing film and imaging services to the private space industry, because it sounded kind of nutty to build up a business when there were really no customers yet. So I gave him my spiel about how I was working on cameras that operate in space, and how I would one day provide zero-G-qualified film crews to the production community, and so on. He was silent and just stared at me as I rambled on. I was sure he was thinking I had lost my mind. But instead he told me that Red Bull was working on a project that would take place at the edge of space, and that they were looking for someone who knew how to film these types of things. He said they couldn't find anyone. I said, "I know, that's why I decided to build a business around this." The next day we had a contract in the works, and I became the Red Bull Stratos Director of Photography.

Did the Stratos job offer any new challenges that you had to invent around?

The ascent of the balloon to altitude is a very slow and subtle event, quite different from the fast, furious, and explosive event of a rocket launch. This slow climb through temperatures of -70 degrees Celsius required some innovations: we needed to heat equipment that later had to be cooled at altitude.

You've described the Stratos Capsule as a flying video production studio. Can you elaborate?

The capsule payload, which was the official term for the camera systems, included nine high-definition cameras, nine HD recorders, three 4K digital cinematography cameras, three 20-megapixel digital still cameras, three video microwave transmitters, camera control units, video routers, multiplexers, downconverters, upconverters, audio embedders, custom controllers, telemetry radios and a computer, 48 space-rated circuit breakers and power regulators, and several miles of wiring, all crammed into a space the size of a beer keg. It was quite a technological achievement to fit most of the equipment found in a 45-foot mobile live-television production trailer into this small space.

What's your hope for video capture and production for the future of aerospace exploration?

We were so methodical about the design of the systems for Red Bull Stratos that, in hindsight, there isn't much I would change. Unlike past camera systems for space missions, which were basically technical in nature, we approached this in a very cinematic manner. My goal was to tell the story of the Stratos mission through images and take viewers on a trip to 128,000 feet with Felix so they could get a taste of what it was like. We plan on giving all future private and commercial spaceflights this same "wow" factor, taking viewers on an unforgettable trip while giving flight controllers the best situational awareness they've ever had. One of our current projects is developing a high-definition 3D camera that will transmit live video back from the surface of the Moon.