No More Peek-a-Boo: Inventing a Modern Periscope

By Erin Biba

In order to see what’s going on above the water, even the most high-tech modern periscope still has to poke its little head out above the surface. But scientists and engineers are on the hunt for a new periscope design.

The physical design and internal mechanics of the periscope have changed quite a bit over the years, but one thing still remains the same: in order to see what’s going on above the water, even the most high-tech modern periscope has to poke its little head out above the surface. And when you’re a military machine whose main goal is stealth, that isn’t exactly a smart move. That’s why, for at least a decade, some scientists and engineers have been trying to figure out how to build a virtual periscope--one that can see what’s happening all around without having to come up for air. And they’re starting to make some significant and exciting progress.

Photo credit: US Navy Naval Historical Center

An Extremely Brief History of Periscopes

According to the US Navy, the first periscope was designed in 1854 by a French chemist named Edme Hippolyte Marie-Davie. It was simply a long tube with mirrors set at 45-degree angles at each opening. There were several attempts to perfect the design through the following decades--among them a 65-foot, 130-ton tube set with eight prisms designed by American John Holland in 1900, which gave the viewer a very dim 360-degree view of the horizon and could actually be rotated.

Image credit: US Patent Office

The modern periscope, or, at least, the one we all remember from Looney Tunes, was a perfected version of Holland’s design. Patented in 1911 by Dr. Frederick O. Kollmorgen, the new version used two telescopes instead of a series of lenses (or prisms). Because it didn’t need prisms at the opening or a series of lenses throughout, the new periscope could be built at a variety of lengths and its opening above the surface could be much smaller. Kollmorgen started a company to develop and update his design and, in fact, the company he created (called Kollmorgen) still exists today.

Kollmorgen’s original design went through several upgrades over the years--adding night vision, star pattern recognition systems, optical magnification, and antennas for satellite communication--but the overall concept mostly remained the same. Then, in the 1960s, the US Navy created the Type 18 periscope, which added television cameras that allowed its images to be displayed anywhere on the submarine and also recorded.

In modern US submarines, beginning around 2004 on all Virginia-class attack subs, the periscopes were replaced by photonics masts. These are telescoping arms with visible and infrared digital cameras at the top. Since they don’t use mirrors or telescopes, there is no need for the control room to be located directly below the masts anymore. Because of this, the Navy has relocated these subs’ operations areas away from the hull and down one deck, where there is a lot more space.

Periscopes in the Future

Photonics masts are great for lots of reasons, mainly because they bring the submarine into the modern digital era. But they still need to rise above the water in order to capture images of the surface. They’re just not that stealthy. At least, not as stealthy as a periscope could be...

Yoav Schechner, a professor of electrical engineering at Technion, the Israel Institute of Technology, is working on building a better periscope. He has spent the last several years tackling the major problems that would face a virtual scope able to see all the activity happening above the water, while greatly reducing the need to breach the surface.

Photo: Technion-Israel Institute of Technology

The main issue standing in the way of an underwater periscope is the way that light behaves at the boundary between water and air. It’s a physics issue you are probably very familiar with if you have ever been at the bottom of a swimming pool looking up (or even simply put a teaspoon into a glass of water). When light rays cross from air into water, they bend--a phenomenon called refraction. This means objects outside of the water appear distorted. The more waves and movement in the water, the more the light rays bend in different directions, and the more distorted the image of objects outside becomes.
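The bending described above follows Snell’s law. As a rough illustration (the refractive indices here are standard textbook values, not figures from Schechner’s work), here is how much a ray entering still water gets deflected:

```python
import math

N_AIR, N_WATER = 1.000, 1.333  # approximate refractive indices

def refracted_angle(incidence_deg, n1=N_AIR, n2=N_WATER):
    """Snell's law: n1 * sin(theta1) = n2 * sin(theta2).
    Returns the underwater refraction angle (degrees) for a ray
    arriving from air at the given angle from vertical."""
    theta1 = math.radians(incidence_deg)
    sin_theta2 = n1 * math.sin(theta1) / n2
    return math.degrees(math.asin(sin_theta2))

# A ray hitting the surface at 45 degrees from vertical continues
# underwater at roughly 32 degrees, so an underwater viewer sees
# above-surface objects displaced from their true direction.
print(round(refracted_angle(45.0), 1))
```

On a flat surface this displacement is fixed and could simply be calibrated away; it is the constantly changing slope of a wavy surface that makes each ray bend differently from moment to moment.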

In order to have a clear image taken from underwater there has to be a way to account for the distortion created by the surface. And one way to do that is to somehow measure the waves and understand how, in each unique instance in which a picture is taken, that unique water pattern has affected the image. Seems impossible, but it is actually something that has been done in astronomy for years.

“In ground-based astronomy people are looking at galaxies through the turbulent atmosphere,” says Schechner of his inspiration to use stargazing as a solution. “They need to somehow measure the distortion created by the atmosphere and correct for that. They use a guide star, something in the sky which they already know the shape of, or they create it artificially by shining a laser into the upper atmosphere and then measuring how it gets distorted.”

Unfortunately, there aren’t a lot of objects in the middle of the ocean that can be used as a guide star...except one...the sun! So Schechner designed his camera system to use the sun (or the moon, if it’s night) as a fixed point of reference for removing the distortion from images. Here’s how it works: Half of the camera’s field of view observes a pinhole array--a letter-sized piece of metal with small holes in it every 1.8 centimeters. Below the pinhole array is a semi-transparent diffuser screen, so the dots of sunlight passing through the pinholes land on a surface that reveals which direction the rays were traveling when they crossed the water’s surface.

Image credit: Yoav Schechner

“If the water is not flat, then the rays get broken and distorted in different ways over each pinhole. The points projected on the screen will not be uniformly interspaced, since waves on the water create perturbations. We can see: This point should have been here but it moved to this place. It means something about the shape of the water. By taking pictures of the screen and measuring how the dots are moving it’s possible to measure the distortion,” says Schechner.
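The measurement Schechner describes boils down to comparing where each dot lands against where it would land under flat water. A minimal sketch of that idea (the grid layout and function names are illustrative, not from the actual Stella Maris software):

```python
import numpy as np

PITCH = 1.8  # pinhole spacing in centimeters, as in Schechner's array

def reference_grid(rows, cols, pitch=PITCH):
    """Expected dot positions on the diffuser for a flat water surface:
    a regular grid directly beneath the pinholes."""
    ys, xs = np.mgrid[0:rows, 0:cols]
    return np.stack([xs * pitch, ys * pitch], axis=-1).astype(float)

def dot_displacements(observed, rows, cols):
    """Per-pinhole displacement vectors: observed dot centers (found in
    the screen image, shape (rows, cols, 2)) minus their flat-water
    positions. Nonzero entries indicate a locally tilted surface."""
    return observed - reference_grid(rows, cols)

# Toy example: waves shift one dot by (0.3, -0.2) cm; every other
# pinhole still sees a flat surface.
obs = reference_grid(2, 2)
obs[1, 1] += np.array([0.3, -0.2])
disp = dot_displacements(obs, 2, 2)
print(disp[1, 1])
```

In the real system these displacement vectors, taken together, constrain the shape of the water surface over the whole field of view.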

The other half of the camera’s field of view is unobstructed. It simply points at the target (the sun or another object) and shoots a wavy picture. The view is split so that both images can be taken at exactly the same time, which is necessary to determine how the image is affected by the water’s distortion at that instant. Using the sun as a static reference is also extremely useful when the target is moving. Once the image is captured, a computer uses a specially-created algorithm to compare the images, calculate the distortion, and remove it.
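The article doesn’t detail Schechner’s algorithm, but the final correction step is conceptually an image unwarping: once the per-pixel displacement field has been estimated from the guide-star dots, each output pixel is resampled from its displaced source location. A hedged sketch of that step, assuming the displacement maps have already been interpolated to full resolution:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def unwarp(image, dx, dy):
    """Undo a known distortion: output[y, x] = image[y + dy[y, x], x + dx[y, x]].
    dx and dy are per-pixel displacement maps the same shape as `image`."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    coords = np.array([ys + dy, xs + dx])  # source coordinates for each output pixel
    return map_coordinates(image, coords, order=1, mode='nearest')

# Toy check: a uniform one-pixel horizontal displacement field pulls
# every pixel from one column to the right.
img = np.arange(25, dtype=float).reshape(5, 5)
shifted = unwarp(img, np.ones((5, 5)), np.zeros((5, 5)))
```

Real surface-wave distortion varies across the frame and over time, which is why the system must estimate a fresh displacement field for every simultaneous pair of exposures.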

So how do the images taken by Schechner’s Stella Maris (Stellar Marine Refractive Imaging Sensor) look? “It’s not perfect, but it’s getting better and better,” Schechner says. “The results we have so far show we can significantly reduce the distortion, but we don’t remove it completely. We’re trying to do something people have never done. To the best of our knowledge from the literature, nobody has ever demonstrated the virtual periscope. People have talked about it, but we’ve managed to show success in a very preliminary and crude system. These are very initial results.”

One of the next steps for Schechner and his team, students Marina Alerman and Yohay Swirski, is to start building a commercially-viable version of the system. In addition to improving the definition and clarity of the images, he is also planning to reduce the size of the scope (right now it’s a large box). “We still have more tricks up our sleeve. The system as it is now is very preliminary and big and bulky. It is good to test as research, you need to know the angle of the sun relative to the system, so we use a crude compass and a crude leveler. It would be nice if we could put in an electronic sensor that will give us a better orientation of the system. We also want to improve the resolution by putting the pinholes closer, but if you put them too close you can get confused about which dots are which.”

Unsurprisingly, despite the fact that the system is extremely preliminary, Schechner says he has already had interest from folks hoping to implement it. He says the final product won’t just be useful to submarines; it can also help divers orient themselves, or assist researchers studying animals that breach the water’s surface (for example, a whale or a dolphin that jumps) to see how they move above and below the water line without interruption. Whatever the final use turns out to be, it should only be a few years before periscopes are truly stealthy.