The best robot to search for life could look like a snake

    Karlston


    Snaking into the ice on Enceladus might work better than drilling through it.

[Image: Trying out the robot on a glacier. Credit: NASA/JPL-Caltech]

     

Icy ocean worlds like Europa or Enceladus are some of the most promising locations for finding extraterrestrial life in the Solar System because they host liquid water. But to determine if there is something lurking in their alien oceans, we need to get past ice cover that can be dozens of kilometers thick. Any robots we send through the ice would have to do most of the job on their own, because a round-trip signal to these moons can take as much as 155 minutes.
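That 155-minute figure follows directly from the speed of light. As a rough check (the 1.4-billion-kilometer Earth–Saturn distance below is an assumed typical value; the actual distance varies with the planets' orbital positions):

```python
# Rough round-trip light-time estimate for a signal to Saturn's moons.
# The distance used is an illustrative assumption, not a mission figure.
SPEED_OF_LIGHT_KM_S = 299_792.458

def round_trip_minutes(distance_km: float) -> float:
    """Round-trip signal delay in minutes for a given one-way distance."""
    return 2 * distance_km / SPEED_OF_LIGHT_KM_S / 60

# Earth-Saturn distance varies; ~1.4 billion km is a typical value.
print(f"{round_trip_minutes(1.4e9):.0f} minutes")  # about 156 minutes
```

At Saturn's typical distance, even a single "stop!" command and its acknowledgment would take over two and a half hours, which is why real-time teleoperation is off the table.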

     

    Researchers working on NASA Jet Propulsion Laboratory's technology development project called Exobiology Extant Life Surveyor (EELS) might have a solution to both those problems. It involves using an AI-guided space snake robot. And they actually built one.

    Geysers on Enceladus

    The most popular idea to get through the ice sheet on Enceladus or Europa so far has been thermal drilling, a technique used for researching glaciers on Earth. It involves a hot drill that simply melts its way through the ice. “Lots of people work on different thermal drilling approaches, but they all have a challenge of sediment accumulation, which impacts the amount of energy needed to make significant progress through the ice sheet,” says Matthew Glinder, the hardware lead of the EELS project.

     

So, instead of drilling new holes in the ice, the EELS team focuses on using ones that are already there. The Cassini mission discovered geyser-like jets shooting water into space from vents in the ice cover near Enceladus’ south pole. “The concept was you’d have a lander to land near a vent and the robot would move on the surface and down into the vent, search the vent, and through the vent go further down into the ocean,” says Matthew Robinson, the EELS project manager.

     

The problem was that the best Cassini images of the area where that lander would need to touch down have a resolution of roughly 6 meters per pixel, meaning major obstacles to landing could go undetected. To make things worse, those close-up images were monocular, which meant we could not properly work out the topography. “Look at Mars. First we sent an orbiter. Then we sent a lander. Then we sent a small robot. And then we sent a big robot. This paradigm of exploration allowed us to get very detailed information about the terrain,” says Rohan Thakker, the EELS autonomy lead. “But it takes between seven to 11 years to get to Enceladus. If we followed the same paradigm, it would take a century,” he adds.

    All-terrain snakes

    To deal with unknown terrain, the EELS team built a robot that could go through almost anything—a versatile, bio-inspired, snake-like design about 4.4 meters long and 35 centimeters in diameter. It weighs about 100 kilograms (on Earth, at least). It’s made of 10 mostly identical segments. “Each of those segments share a combination of shape actuation and screw actuation that rotates the screws fitted on the exterior of the segments to propel the robot through its environment,” explains Glinder. By using those two types of actuators, the robot can move using what the team calls “skin propulsion,” which relies on the rotation of screws, or using one of various shape-based movements that rely on shape actuators. “Sidewinding is one of those gaits where you are just pressing the robot against the environment,” Glinder says.

     

[Image: The basic design also works on surfaces other than ice.]

The standard sensor suite is fitted on the head and includes a set of stereo cameras providing a 360-degree viewing angle. There are also inertial measurement units (IMUs) that use gyroscopes to estimate the robot’s position, and lidar sensors. But the robot also has a sense of touch. “We are going to have torque force sensors in each segment. This way we will have direct torque plus direct force sensing at each joint,” explains Robinson. All this is supposed to let the EELS robot safely climb up and down Enceladus' vents, hold itself in place during eruptions by pressing against the walls, and even navigate by touch alone if the cameras and lidar don’t work.
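One way to use that per-joint torque sensing is to decide when the robot is pressing hard enough against the vent walls to ride out an eruption. The sketch below is purely illustrative: the function name, torque threshold, and joint count are assumptions for the example, not values from the EELS flight software.

```python
# Hedged sketch: deciding when a segmented robot is braced against vent walls
# based on per-joint torque readings. Thresholds are illustrative assumptions.
from typing import List

CONTACT_TORQUE_NM = 5.0   # assumed torque indicating firm wall contact
MIN_BRACED_JOINTS = 6     # assumed number of contacting joints needed to hold

def is_braced(joint_torques_nm: List[float]) -> bool:
    """True if enough joints press against the walls to hold position."""
    in_contact = sum(1 for t in joint_torques_nm if abs(t) >= CONTACT_TORQUE_NM)
    return in_contact >= MIN_BRACED_JOINTS

# Six of eight joints report firm contact, so the robot counts as braced.
print(is_braced([6.1, 5.4, 7.0, 5.2, 6.8, 5.9, 0.3, 0.1]))  # True
```

The same torque stream doubles as a navigation signal: contact patterns along the body sketch out the vent geometry even when cameras and lidar are blind.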

     

    But perhaps the most challenging part of building the EELS robot was its brain.

    Space snake brains

    Joysticking EELS around on Enceladus was out of the question due to the huge communication lag, so the team went for nearly complete autonomy. Ground control will be limited to issuing general commands like “explore this area” or “look for life.” “Think of it like Tesla self-driving software, only you have a vehicle with 48 steering wheels, 48 sets of pedals, working in a space where there are no roads, no stop signs, and no speed limits,” Thakker explains. The AI driving the EELS robot was built around a hierarchical layered software architecture organized into two categories of modules called estimators and controllers.

     

    The lowest-level estimators take information from internal sensors like IMUs and torque force sensors in the segments and use it to determine the robot’s state—whether it is falling, slipping, or hitting something. One level higher are estimators that build a map of the environment and figure out the robot’s location based on feeds from cameras and lidar. The highest-level estimator reasons about risk and figures out when to move fast and when to play it safe. Controllers responsible for taking action range from the basic actuator control systems at the lowest level to task and motion planning at the highest level.
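The layering described above can be sketched as a simple pipeline: low-level estimators infer the robot's state, a higher-level estimator turns that state into a risk assessment, and a controller picks commands consistent with the risk. Every class, method, and threshold below is a hypothetical stand-in for illustration, not the real EELS software.

```python
# Illustrative sketch of a hierarchical estimator/controller stack.
# All names and numeric thresholds are assumptions made for this example.

class StateEstimator:
    """Lowest level: infer slipping/falling from IMU and joint torque data."""
    def estimate(self, imu: dict, torques: list) -> dict:
        return {
            "slipping": max(torques) > 8.0,      # assumed torque spike threshold
            "falling": imu["accel_z"] < 1.0,     # assumed free-fall cue
        }

class RiskEstimator:
    """Highest level: decide how cautiously to move given the robot state."""
    def assess(self, state: dict) -> str:
        if state["falling"]:
            return "stop"
        return "slow" if state["slipping"] else "nominal"

class MotionPlanner:
    """Controller: pick a speed command consistent with the risk level."""
    SPEEDS = {"stop": 0.0, "slow": 0.05, "nominal": 0.2}  # m/s, illustrative
    def plan(self, risk: str) -> float:
        return self.SPEEDS[risk]

state = StateEstimator().estimate({"accel_z": 9.8}, [2.0, 3.1, 9.5])
risk = RiskEstimator().assess(state)
print(MotionPlanner().plan(risk))  # 0.05 -- slow down after a torque spike
```

The point of the hierarchy is that each layer only consumes the layer below it: the planner never looks at raw IMU data, and the state estimator never has to reason about mission-level risk.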

     

“There are two sides to your mind. There is an intuitive side which is fast, biased, and very low powered, and there is a logical side that asks questions, evaluates answers, and tries to make sense of what is actually going on. We are trying to use the same framework where there are two such subsystems,” says Thakker. The intuitive side of EELS was built using machine learning, through which the robot trained itself how to move. The logical side is a physics-based model with hard-wired safety rules that prevent the robot from exceeding certain speeds or tackling slopes above a specific grade, and so on. All of this should prepare the EELS robot to perform well on icy alien worlds.
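One common way to combine a learned policy with hard-wired safety rules is a safety filter: the "logical side" clamps or vetoes whatever the "intuitive side" proposes. The limits and function names below are assumptions for illustration; the source does not describe the actual EELS rule set.

```python
# Minimal sketch of a hard-wired safety filter over a learned policy's output.
# Both limits are illustrative assumptions, not real EELS parameters.

MAX_SPEED_M_S = 0.25      # assumed hard speed limit
MAX_SLOPE_DEG = 35.0      # assumed steepest allowable slope

def safety_filter(proposed_speed: float, slope_deg: float) -> float:
    """Clamp the learned policy's speed command to the hard-wired limits."""
    if slope_deg > MAX_SLOPE_DEG:
        return 0.0  # refuse to proceed on slopes above the limit
    return min(proposed_speed, MAX_SPEED_M_S)

print(safety_filter(0.4, 20.0))  # 0.25 -- speed clamped to the hard limit
print(safety_filter(0.1, 40.0))  # 0.0  -- slope too steep, command vetoed
```

The appeal of this split is that the learned component can be fast and approximate while the rule-based layer stays small, auditable, and always in the loop.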

     

    If it ever gets there.

     

“Currently we are not part of any flight mission,” says Robinson. According to him, though, the EELS architecture can be used at many other destinations, including on Earth. “When we were testing EELS on Athabasca Glacier in Canada, we were using it to do actual science. We designed a science instrument that measured the salt content in the water flowing in the glacier. There are terrestrial and space applications for a snake robot. You can use it for search and rescue, looking into rubble piles and so on. But Enceladus remains a source of inspiration for us,” Robinson says.

     

Science Robotics, 2024. DOI: 10.1126/scirobotics.adh8332

     

