The Air Force wants its new spy sensors to work like a bee’s eye.
Here’s why: Most of the military’s optical sensors, which convert an image into a digital signal, are designed by drawing inspiration from human optics. This means our machines have so far been largely limited to seeing only what we can see. The problem is, the human eye isn’t great. Even if you’re blessed with 20/20 vision, you miss a lot, because our eyes can only detect a small portion of all the light waves actually bouncing off objects.
Our vision does, of course, let us distinguish colors, judge how close something is, and tell whether it’s moving toward us or away from us – but other animals can see much more.
Dump an average human in the wilderness somewhere and, relying on sight alone, he’d have practically no idea where in the world he was. But do the same to an arthropod, say a bee or a locust, and it would stand a much better chance of finding its way back to base. That’s because their optical systems can see one very important thing that we can’t: polarized light wave patterns.
The polarization pattern of light in the sky indicates where the light is coming from, and the insect works out its own position relative to that origin. In this way, the insect has a built-in, autonomous, and sophisticated way of navigating from point A to point B.
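As a rough illustration of the idea (a toy sketch only – not how any insect or military sensor is actually implemented): under a simplified single-scattering sky model, light at the zenith is polarized perpendicular to the sun’s azimuth, so a sensor that reads the polarization angle overhead can recover the sun’s bearing, up to a 180-degree ambiguity, and steer by it.

```python
def evector_angle_at_zenith(sun_azimuth_deg):
    # Simplified single-scattering sky model: light at the zenith is
    # polarized perpendicular to the sun's azimuth. The e-vector is an
    # axis, not a direction, so angles repeat every 180 degrees.
    return (sun_azimuth_deg + 90.0) % 180.0

def heading_from_polarization(measured_evector_deg):
    # Invert the model: recover the solar azimuth (modulo 180 degrees)
    # from the polarization angle a zenith-pointing sensor reads.
    return (measured_evector_deg - 90.0) % 180.0

# With the sun at azimuth 135 degrees, the sensor reads a 45-degree
# e-vector angle, and the compass recovers the 135-degree solar bearing.
reading = evector_angle_at_zenith(135.0)
print(reading)                              # 45.0
print(heading_from_polarization(reading))   # 135.0
```

A real polarization compass samples many points across the sky to resolve the 180-degree ambiguity and to cope with clouds, but the core inversion is this simple.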
What if we could develop sensors that work more like a bee’s eye than a human one? Navigation systems would be autonomous: they wouldn’t need to beam a signal to a second party and wait to be told where they were – they’d just know. In other words, they’d work fine by themselves – no losing the sat-nav signal when your car is flanked by downtown skyscrapers or passing through a tunnel.
Well, that’s exactly what the U.S. military wants to do. In its latest call for research proposals from small businesses, the U.S. Air Force is asking for someone to develop “biologically-inspired integrated vision systems.”
In creating the next generation of imaging sensors, the Air Force hopes to improve the navigational capabilities, target detection, and range of its military hardware. It’s looking to back a program to create a unified system where data is input, crunched, and used by the same computer to allow “autonomous behavior.” This would make the whole process of navigation faster and more efficient, and reduce the need for boots on the ground.
Insect vision also differs from ours in color differentiation: insects can detect a broader spectrum of light waves, which means they can see colors that we can’t. Where we see one shade of red, they might see several distinct shades. That makes them altogether better at discriminating where one object starts and another begins – something the Air Force is also keen to include in the project, expressing interest in “camouflage-breaking techniques.”
Turning to the animal kingdom as a muse for technological design is nothing new; all kinds of creatures have lent their tricks over the years. Echolocating bats have informed unmanned vehicle control, and the micro-hooks of seeds that stick to animal fur inspired the invention of Velcro. And then, of course, there are the robots modeled after pack mules and cheetahs that can already outrun the fastest human. If this Air Force project works out as planned, the machines will be able to see a whole lot better than people, too.