{"id":17126,"date":"2017-06-24T01:05:09","date_gmt":"2017-06-24T01:05:09","guid":{"rendered":"http:\/\/www.kurzweilai.net\/?p=302026"},"modified":"2017-06-25T20:38:52","modified_gmt":"2017-06-25T20:38:52","slug":"tactile-sensor-lets-robots-gauge-objects-hardness-and-manipulate-small-tools","status":"publish","type":"post","link":"https:\/\/hoo.central12.com\/fugic\/2017\/06\/24\/tactile-sensor-lets-robots-gauge-objects-hardness-and-manipulate-small-tools\/","title":{"rendered":"Tactile sensor lets robots gauge objects&rsquo; hardness and manipulate small tools"},"content":{"rendered":"<div id=\"attachment_302592\" class=\"wp-caption aligncenter\" style=\"width: 368px;  border: 1px solid #dddddd; background-color: #f3f3f3; padding-top: 4px; margin: 10px; text-align:center; display: block; margin-right: auto; margin-left: auto;\"><img class=\" wp-image-302592\" title=\"Gelsight sensor on robot arm\" src=\"http:\/\/www.kurzweilai.net\/images\/Gelsight-sensor-on-robot-arm.png\" alt=\"\" width=\"358\" height=\"468\" \/><p style=' padding: 0 4px 5px; margin: 0;'  class=\"wp-caption-text\">A GelSight sensor attached to a robot\u2019s gripper enables the robot to determine precisely where it has grasped a small screwdriver, removing it from and inserting it back into a slot, even when the gripper screens the screwdriver from the robot\u2019s camera. (credit: Robot Locomotion Group at MIT)<\/p><\/div>\n<p>Researchers at <a href=\"https:\/\/www.csail.mit.edu\/\" >MIT\u2019s Computer Science and Artificial Intelligence Laboratory (CSAIL)<\/a> have added sensors to grippers on robot arms to give robots greater sensitivity and dexterity. The sensor can judge the hardness of surfaces it touches, enabling a robot to manipulate smaller objects than was previously possible.<\/p>\n<p>The &#8220;GelSight&#8221; sensor consists of a block of transparent soft rubber &#8212; the \u201cgel\u201d of its name &#8212; with one face coated with metallic paint. 
It is mounted on one side of a robotic gripper. When the paint-coated face is pressed against an object, the face conforms to the object\u2019s shape and the metallic paint makes the object\u2019s surface reflective. Mounted on the sensor opposite the paint-coated face of the rubber block are three colored lights at different angles and a single camera.<\/p>\n<p>Humans gauge hardness by the degree to which the contact area between the object and our fingers changes as we press on it. Softer objects tend to flatten more, increasing the contact area. The MIT researchers used the same approach.<\/p>\n<p>A GelSight sensor, pressed against each object manually, recorded how the contact pattern changed over time, essentially producing a short movie for each object. A neural network was then used to look for correlations between changes in contact patterns and hardness measurements. The resulting system takes frames of video as inputs and produces hardness scores with very high accuracy.<\/p>\n<p>The researchers also designed control algorithms that use a computer vision system to guide the robot\u2019s gripper toward a tool and then turn location estimation over to a GelSight sensor once the robot has the tool in hand.<\/p>\n<p>\u201cI think that the GelSight technology, as well as other high-bandwidth tactile sensors, will make a big impact in robotics,\u201d says Sergey Levine, an assistant professor of electrical engineering and computer science at the University of California at Berkeley. \u201cFor humans, our sense of touch is one of the key enabling factors for our amazing manual dexterity. Current robots lack this type of dexterity and are limited in their ability to react to surface features when manipulating objects. 
If you imagine fumbling for a light switch in the dark, extracting an object from your pocket, or any of the other numerous things that you can do without even thinking \u2014 these all rely on touch sensing.\u201d<\/p>\n<p>The researchers presented their work in two papers at the International Conference on Robotics and Automation.<\/p>\n<p><iframe frameborder=\"0\" height=\"315\" src=\"https:\/\/www.youtube.com\/embed\/QDvXHNtYbBA?rel=0\" width=\"560\"><\/iframe><br \/>\n<em>Wenzhen Yuan | Measuring hardness of fruits with GelSight sensor<\/em><\/p>\n<hr \/>\n<h4>Abstract of\u00a0<em>Tracking Objects with Point Clouds from Vision and Touch<\/em><\/h4>\n<p>We present an object-tracking framework that fuses point cloud information from an RGB-D camera with tactile information from a GelSight contact sensor. GelSight can be treated as a source of dense local geometric information, which we incorporate directly into a conventional point-cloud-based articulated object tracker based on signed-distance functions. Our implementation runs at 12 Hz using an online depth reconstruction algorithm for GelSight and a modified second-order update for the tracking algorithm. We present data from hardware experiments demonstrating that the addition of contact-based geometric information significantly improves the pose accuracy during contact, and provides robustness to occlusions of small objects by the robot\u2019s end effector.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Researchers at MIT&rsquo;s Computer Science and Artificial Intelligence Laboratory (CSAIL) have added sensors to grippers on robot arms to give robots greater sensitivity and dexterity. The sensor can judge the hardness of surfaces it touches, enabling a robot to manipulate smaller objects than was previously possible. 
The &ldquo;GelSight&rdquo; sensor consists of a block of transparent [&#8230;]<\/p>\n","protected":false},"author":13,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[46,43],"tags":[],"class_list":["post-17126","post","type-post","status-publish","format-standard","hentry","category-airobotics","category-news"],"_links":{"self":[{"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/posts\/17126"}],"collection":[{"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/users\/13"}],"replies":[{"embeddable":true,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/comments?post=17126"}],"version-history":[{"count":2,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/posts\/17126\/revisions"}],"predecessor-version":[{"id":17145,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/posts\/17126\/revisions\/17145"}],"wp:attachment":[{"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/media?parent=17126"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/categories?post=17126"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/tags?post=17126"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}