The key to making fleets of self-driving cars and grocery delivery by drones might be found in an unlikely source: autonomous space robots.
Marco Pavone, an assistant professor of aeronautics and astronautics, is developing technologies to help robots adapt to unknown and changing environments. Before coming to Stanford, Pavone worked in robotics at NASA’s Jet Propulsion Laboratory. He maintains relationships with NASA centers alongside collaborations with other departments at Stanford.
Pavone sees his work on space and Earth technologies as complementary. “In a sense, some robotics techniques that have been developed for autonomous cars can be very useful for spacecraft control,” Pavone said. Likewise, the algorithms he and his students devise to help robots make decisions and assessments on their own, within fractions of a second, could not only help in space exploration, they could also improve self-driving cars and drones right here on Earth.
One of Pavone’s projects focuses on helping robots navigate independently to bring space debris out of orbit, deliver tools to astronauts and grasp spinning, speeding objects out of the vacuum of space.
There is no margin for error when grabbing objects in space. “In space when you approach an object, if you’re not super careful in grasping it at the moment you contact it, the object will float away from you,” Pavone said. Bumping an object in space could make recovering it next to impossible.
Robots equipped with conventional grippers designed for use on Earth cannot reliably grab objects in space.
To solve grasping problems, Pavone teamed up with Mark Cutkosky, a professor of mechanical engineering, who has spent the last decade perfecting gecko-inspired adhesives. The gecko grippers allow for a gentle approach and a simple touch to “grasp” an object, allowing easy capture and release of spinning, unwieldy space debris.
But the delicate navigation required for grasping in space is no easy task. “You have to operate in close proximity to other objects: spacecraft or debris or any object you might have in space,” Pavone said. “That requires advanced decision-making capabilities.” Pavone and his collaborators designed algorithms that allow space robots to autonomously react to such variable conditions and efficiently grab space objects with their gecko-grippers. The resulting robot can move and grab in real time, updating its decisions at a rate of several thousand times a second.
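The article doesn’t describe Pavone’s algorithms in detail, but the idea of re-deciding thousands of times per second can be sketched with a generic high-rate replanning loop. Everything below is illustrative: the constant-velocity target prediction, the proportional gain and the specific rates are assumptions, not the lab’s actual method.

```python
import numpy as np

def replan_step(robot_pos, target_pos, target_vel, dt, gain=2.0):
    """One replanning tick: predict the target's short-term motion and steer
    toward it. A stand-in for real onboard trajectory optimization."""
    predicted = target_pos + target_vel * dt   # naive constant-velocity prediction
    command = gain * (predicted - robot_pos)   # proportional velocity command
    return robot_pos + command * dt

dt = 1e-3  # 1,000 replanning decisions per simulated second
robot = np.zeros(3)
target = np.array([1.0, 0.5, 0.0])   # drifting piece of debris (made-up state)
target_vel = np.array([0.0, 0.1, 0.0])

for _ in range(5000):  # five seconds of simulated pursuit
    target = target + target_vel * dt
    robot = replan_step(robot, target, target_vel, dt)

print(round(float(np.linalg.norm(robot - target)), 3))  # gap shrinks toward zero
```

Because each tick re-measures the target, the loop automatically absorbs disturbances between decisions, which is the property that matters when a bumped object can drift away for good.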
That type of decision-making technology is also useful for solving navigation problems with Earth-bound drones. “For these vehicles, navigating at high speed in proximity to buildings, people and other flying objects is hard to do,” said graduate student Benoit Landry. He pointed out that there is a delicate interplay between making decisions and environmental perception. “In this context, many aspects of decision making for autonomous spacecraft are directly relevant to drone control.”
Landry and Pavone are working on “perception-aware planning,” which allows drones not only to consider fast routes but also to “see” their surroundings and better estimate where they are. This work is currently being extended to handle interactions with humans, a key component in deploying autonomous systems such as drones and self-driving cars. Landry added that Pavone’s background at NASA is a good complement to the academic work.
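A toy way to see what “perception-aware” means: score candidate routes not just by length but also by how well the drone can localize along them. The cost function, the visibility metric and the weight below are all hypothetical, chosen only to illustrate the trade-off Landry describes.

```python
def route_cost(length_m, landmark_visibility, w_perception=20.0):
    """Lower is better. landmark_visibility in [0, 1] is a hypothetical score
    for how much of the route offers good visual features for localization."""
    return length_m + w_perception * (1.0 - landmark_visibility)

routes = {
    "shortest":     route_cost(100.0, 0.2),  # fast, but visually featureless
    "feature-rich": route_cost(110.0, 0.9),  # longer, but easier to localize on
}
best = min(routes, key=routes.get)
print(best, routes)
```

With the perception term weighted in, the planner prefers the slightly longer route where the drone can keep “seeing” where it is, rather than the shortest one.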
Free-roaming robot
Once a robot lands on a small solar system body like an asteroid, additional challenges arise. These environments have far weaker gravity than Earth. “If you were to drop an object from waist-height, it would take a couple of minutes to hit the ground,” Pavone said.
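Pavone’s waist-height example follows from basic kinematics: fall time is t = sqrt(2h/g), so a minutes-long fall implies surface gravity on the order of 10⁻⁴ m/s². The asteroid gravity value below is an illustrative figure consistent with his quote, not a measurement from the article.

```python
import math

def fall_time(height_m, gravity_ms2):
    """Time to fall a height under constant gravity: t = sqrt(2h / g)."""
    return math.sqrt(2 * height_m / gravity_ms2)

# Earth: dropping an object from roughly waist height (~1 m)
print(f"Earth: {fall_time(1.0, 9.81):.2f} s")              # ~0.45 s

# Illustrative small asteroid with surface gravity ~1.4e-4 m/s^2
print(f"Asteroid: {fall_time(1.0, 1.4e-4) / 60:.1f} min")  # ~2.0 min
```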
Technology built to aid robots in space, like the cubic Hedgehog designed for rugged terrain that wheeled robots can’t handle, could help drones and self-driving cars navigate on Earth.
To deal with low-gravity environments like asteroids, Ben Hockman, a graduate student in Pavone’s lab, works on a cubic robot called Hedgehog. The robot traverses uneven, rugged and low-gravity terrains by hopping instead of driving like traditional rovers. Eventually, Pavone and Hockman want Hedgehog to be able to navigate and complete tasks without being explicitly told how to do it by a human located millions of miles away.
The current Hedgehog robot is designed for reduced gravity environments, but it could be adapted for Earth, Hockman said. “It wouldn’t hop quite as far because we have more gravity, but it could be used to traverse more rugged terrain where wheeled robots can’t go.”
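Hockman’s point about shorter hops on Earth follows from ballistic scaling: for a given launch speed, hop range goes as v²/g. The launch parameters below are made up for illustration; they are not Hedgehog’s actual specifications, and the flat-ground formula ignores terrain, drag and the robot’s tumbling.

```python
import math

def hop_range(launch_speed, launch_angle_deg, gravity):
    """Idealized ballistic hop range on flat ground: R = v^2 * sin(2*theta) / g."""
    theta = math.radians(launch_angle_deg)
    return launch_speed**2 * math.sin(2 * theta) / gravity

v, angle = 0.5, 45.0  # hypothetical launch speed (m/s) and angle
print(f"Earth hop: {hop_range(v, angle, 9.81) * 100:.1f} cm")       # centimeters
print(f"Asteroid hop: {hop_range(v, angle, 1.4e-4):.0f} m")         # over a kilometer
```

The same push that barely moves the robot on Earth sends it soaring in milligravity, which is why hopping beats wheels on bodies like asteroids.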
Hockman views the research he’s doing with Pavone as core scientific exploration. “Science tries to answer the hard questions we don’t know the answers to, and exploration seeks to find whole new questions we don’t yet even know how to ask.”