Researchers Developed a Digger Finger Robot that Senses the Shapes of Buried Objects

Researchers created a ‘Digger Finger’ robot that digs through granular material such as sand and gravel while sensing the shapes of buried objects. The technology could aid in the deactivation of buried bombs or the inspection of underground cables. Robots have gotten quite good at identifying objects over the years – as long as they’re out in the open.

Detecting buried items in granular material such as sand is a more difficult task. To do so, a robot’s fingers would need to be thin enough to penetrate the sand, mobile enough to wiggle free when sand grains jam, and sensitive enough to feel the detailed shape of the buried object.

To meet the challenge of identifying buried objects, MIT researchers have developed a sharp-tipped robot finger equipped with tactile sensing. The appropriately named Digger Finger was able to dig through granular media such as sand and rice in experiments, and it correctly sensed the shapes of submerged items it encountered. According to the researchers, the robot could one day perform subterranean tasks such as finding buried cables and disarming buried bombs.

The findings will be presented at the upcoming International Symposium on Experimental Robotics. Radhen Patel, a postdoctoral researcher at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), is the study’s lead author. His co-authors are Branden Romero, a CSAIL PhD student; Nancy Ouyang, a Harvard University PhD student; and Edward Adelson, the John and Dorothy Wilson Professor of Vision Science in CSAIL and the Department of Brain and Cognitive Sciences.

Trying to find objects buried in granular material (sand, gravel, and other loosely packed particles) isn’t a new quest. Previously, researchers used technologies such as ground-penetrating radar or ultrasonic vibrations to sense the subterranean from above. However, these techniques provide only a hazy view of submerged objects. They might have difficulty distinguishing between rock and bone, for example.

“The idea is to create a finger with a good sense of touch and the ability to distinguish between the various things it’s feeling,” Adelson explains. “That would be useful, for example, if you’re trying to find and disable buried bombs.” Making that idea a reality meant clearing a number of hurdles.

Slender robotic finger senses buried items

The team’s first challenge was a matter of form: The robotic finger had to be slender and sharp-tipped.

In previous work, the researchers had used a tactile sensor called GelSight. The sensor was made of a clear gel covered with a reflective membrane that deformed when objects were pressed against it. A camera and three different colors of LED lights were hidden behind the membrane. The lights shone through the gel and onto the membrane, while the camera captured the pattern of reflection on the membrane. Computer vision algorithms then extracted the 3D shape of the contact area where the soft finger touched the object. The device provided an excellent sense of artificial touch, but it was bulky.
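The shape-from-shading idea behind GelSight-style sensors can be sketched with a toy photometric-stereo example: three lights with known directions illuminate a surface, and the per-pixel intensities they produce are enough to solve for the surface normal at that pixel. Everything below (the light directions, the Lambertian shading model, the example pixel) is invented for illustration and is not taken from the MIT sensor:

```python
import numpy as np

# Toy photometric stereo: three assumed light directions (unit vectors),
# standing in for the sensor's three LED colors.
L = np.array([
    [0.50,  0.000, 0.866],
    [-0.25, 0.433, 0.866],
    [-0.25, -0.433, 0.866],
])

def recover_normal(intensities):
    """Solve L @ n = I for the surface normal n, then normalize.

    Assumes Lambertian shading, where each measured intensity is the
    dot product of the light direction with the surface normal.
    """
    n, *_ = np.linalg.lstsq(L, np.asarray(intensities), rcond=None)
    return n / np.linalg.norm(n)

# Simulate one membrane pixel whose true normal tilts toward +x.
true_n = np.array([0.3, 0.0, 0.954])
true_n /= np.linalg.norm(true_n)

I = L @ true_n               # simulated intensities under the three lights
est_n = recover_normal(I)    # recovered from intensities alone
print(np.allclose(est_n, true_n))  # prints True
```

Repeating this solve at every pixel yields a normal map, which can then be integrated into the 3D depth map of the contact area.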

The researchers slimmed down their GelSight sensor in two ways for the Digger Finger. First, they reshaped it into a slender cylinder with a beveled tip. They then removed two-thirds of the LED lights, replacing them with a combination of blue LEDs and colored fluorescent paint. “It saved a lot of complexity and space,” Ouyang says. “That’s how we got it into such a small package.” The finished product had a tactile sensing membrane about 2 square centimeters in size, similar to the tip of a finger.

After settling on the size, the researchers focused on motion, mounting the finger on a robot arm and digging through fine-grained sand and coarse-grained rice. Granular media tends to jam when large numbers of particles become stuck in place, which makes it difficult to penetrate. To overcome this, the team added a vibrating motor to the Digger Finger and put it through a battery of tests.

“We wanted to see how mechanical vibrations help with digging deeper and breaking through jams,” Patel explains. “We experimented with different operating voltages for the vibrating motor, which changed the amplitude and frequency of the vibrations.” Rapid vibrations were discovered to help “fluidize” the media, clearing jams and allowing for deeper burrowing – though this fluidizing effect was more difficult to achieve in sand than in rice.
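The voltage sweep Patel describes can be illustrated with a toy model. The voltage-to-vibration mappings and the “fluidization” formula below are entirely made up for this sketch; only the qualitative trends from the article are preserved (stronger vibration lowers resistance, and sand is harder to fluidize than rice):

```python
def vibration_params(voltage):
    """Invented linear mapping: amplitude and frequency scale with voltage."""
    freq_hz = 50.0 * voltage
    amp_mm = 0.02 * voltage
    return freq_hz, amp_mm

def penetration_resistance(freq_hz, amp_mm, media_jam_factor):
    """Invented model: faster/larger vibrations 'fluidize' the media and
    lower resistance; a higher jam factor (sand) fluidizes less readily."""
    fluidization = freq_hz * amp_mm
    return media_jam_factor / (1.0 + fluidization)

# Sweep operating voltages, as in the team's experiments.
for voltage in (1.0, 2.0, 3.0):
    f, a = vibration_params(voltage)
    sand = penetration_resistance(f, a, media_jam_factor=5.0)
    rice = penetration_resistance(f, a, media_jam_factor=2.0)
    print(f"{voltage:.0f} V: sand resistance={sand:.2f}, rice resistance={rice:.2f}")
```

At every voltage the toy sand resistance stays above the toy rice resistance, mirroring the finding that the fluidizing effect was harder to achieve in sand than in rice.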

They also experimented with various twisting motions in both rice and sand. Occasionally, grains of each type of media would become trapped between the Digger Finger’s tactile membrane and the buried object it was attempting to detect. When this happened with rice, the trapped grains were large enough to completely obscure the shape of the object, but the occlusion was usually cleared with some robotic wiggling. Trapped sand was more difficult to remove, but because the grains were so small, the Digger Finger could still sense the general contours of the target object.

Operators will have to adjust the Digger Finger’s motion pattern for different settings “depending on the type of media and the size and shape of the grains,” according to Patel. The team intends to continue experimenting with new motions to improve the Digger Finger’s ability to navigate various media.

According to Adelson, the Digger Finger is part of a program that is expanding the domains in which robotic touch can be used. Humans use their fingers in a variety of cluttered situations, such as fishing for a key in a pants pocket or feeling for a tumor during surgery. “As we get better at artificial touch, we want to be able to use it in situations where you’re surrounded by all kinds of distracting information,” Adelson says. “We want to be able to distinguish between the stuff that’s important and the stuff that’s not.”