{"id":25196,"date":"2018-06-29T19:16:30","date_gmt":"2018-06-29T19:16:30","guid":{"rendered":"http:\/\/www.kurzweilai.net\/?p=318119"},"modified":"2018-07-02T01:54:04","modified_gmt":"2018-07-02T01:54:04","slug":"how-robots-aided-by-deep-learning-could-help-autism-therapists","status":"publish","type":"post","link":"https:\/\/hoo.central12.com\/fugic\/2018\/06\/29\/how-robots-aided-by-deep-learning-could-help-autism-therapists\/","title":{"rendered":"How robots aided by deep learning could help autism therapists"},"content":{"rendered":"<p><iframe frameborder=\"0\" height=\"376\" src=\"https:\/\/www.youtube.com\/embed\/k_mSXvydAb8\" width=\"560\"><\/iframe><br \/>\n<em><a href=\"https:\/\/www.youtube.com\/channel\/UCEPhh0AIOV9GjPkE1JNjH8w\">MIT Media Lab<\/a>\u00a0 (no sound) | Intro: Personalized Machine Learning for Robot Perception of Affect and Engagement in Autism Therapy. This is an example of a therapy session augmented with SoftBank Robotics\u2019 humanoid robot NAO and deep-learning software. The 35 children with autism who participated in this study ranged in age from 3 to 13. They reacted in various ways to the robots during their 35-minute sessions &#8212; from looking bored and sleepy in some cases to jumping around the room with excitement, clapping their hands, and laughing or touching the robot.<\/em><\/p>\n<p>Robots armed with personalized \u201c<a href=\"https:\/\/en.wikipedia.org\/wiki\/Deep_learning\" >deep learning<\/a>\u201d software could help therapists interpret behavior and personalize therapy of autistic children, while making the therapy more engaging and natural. That\u2019s the conclusion of a study by an international team of researchers at MIT Media Lab,\u00a0Chubu University,\u00a0Imperial College London, and University of Augsburg.*<\/p>\n<p>Children with autism-spectrum conditions often have trouble recognizing the emotional states of people around them &#8212; distinguishing a happy face from a fearful face, for instance. 
So some therapists use a kid-friendly robot to demonstrate those emotions and to engage the children in imitating the emotions and responding to them in appropriate ways.<\/p>\n<p><strong>Personalized autism therapy<\/strong><\/p>\n<p>But the MIT research team realized that deep learning would help the therapy robots perceive the children\u2019s behavior more naturally, they report in a <em>Science Robotics<\/em> paper.<\/p>\n<p>Personalization is especially important in autism therapy, according to the paper&#8217;s senior author, Rosalind Picard, PhD, a professor at MIT who leads research in affective computing: \u201cIf you have met one person with autism, you have met one person with autism,\u201d she said, citing a famous adage.<\/p>\n<hr \/>\n<blockquote><p><strong>\u201cComputers will have emotional intelligence by 2029\u201d&#8230; by which time, machines will \u201cbe funny, get the joke, and understand human emotion.\u201d &#8212; Ray Kurzweil<\/strong><\/p><\/blockquote>\n<hr \/>\n<p>\u201cThe challenge of using AI [artificial intelligence] that works in autism is particularly vexing, because the usual AI methods require a lot of data that are similar for each category that is learned,\u201d says Picard, in explaining the need for deep learning. \u201cIn autism, where heterogeneity reigns, the normal AI approaches fail.\u201d<\/p>\n<p><strong>How personalized robot-assisted therapy for autism would work<\/strong><\/p>\n<p>Robot-assisted therapy** for autism often works something like this: A human therapist shows a child photos or flash cards of different faces meant to represent different emotions, to teach the child how to recognize expressions of fear, sadness, or joy. The therapist then programs the robot to show these same emotions to the child, and observes the child as she or he engages with the robot. 
The child\u2019s behavior provides valuable feedback that the robot and therapist need to go forward with the lesson.<\/p>\n<p>\u201cTherapists say that engaging the child for even a few seconds can be a big challenge for them. [But] robots attract the attention of the child,\u201d says lead author Ognjen Rudovic, PhD, a postdoctoral fellow at the <em>MIT Media Lab<\/em>. \u201cAlso, humans change their expressions in many different ways, but the robots always do it in the same way, and this is less frustrating for the child because the child learns in a very structured way how the expressions will be shown.\u201d<\/p>\n<p><iframe frameborder=\"0\" height=\"316\" src=\"https:\/\/www.youtube.com\/embed\/EonsuxKyYNE\" width=\"560\"><\/iframe><br \/>\n<em>SoftBank Robotics | The researchers used <a href=\"https:\/\/www.softbankrobotics.com\/emea\/en\/robots\/nao\">NAO<\/a> humanoid robots in this study. Almost two feet tall and resembling an armored superhero or a droid, NAO conveys different emotions by changing the color of its eyes, the motion of its limbs, and the tone of its voice.<\/em><\/p>\n<p>However, this type of therapy would work best if the robot could also smoothly interpret the child\u2019s own behavior &#8212; such as whether the child is excited or paying attention &#8212; during the therapy, according to the researchers. To test this assertion, researchers at the MIT Media Lab and Chubu University developed a personalized deep learning network that helps robots estimate the engagement and interest of each child during these interactions, they report.**<\/p>\n<p>The researchers built a personalized framework that could learn from data collected on each individual child. 
They captured video of each child\u2019s facial expressions, head and body movements, poses, and gestures, along with audio recordings and data on heart rate, body temperature, and skin-sweat response from a monitor on the child\u2019s wrist.<\/p>\n<p>Most of the children in the study reacted to the robot \u201cnot just as a toy but related to NAO respectfully, as if it were a real person,\u201d said Rudovic, especially during storytelling, where the therapists asked how NAO would feel if the children took the robot for an ice cream treat.<\/p>\n<p>In the study, the researchers found that the robots\u2019 perception of the children\u2019s responses agreed with assessments by human experts with a high correlation score of 60 percent, the scientists report.*** (It can be challenging for human observers to reach high levels of agreement about a child\u2019s engagement and behavior. Their correlation scores are usually between 50 and 55 percent, according to the researchers.)<\/p>\n<p><em>Ref.: <\/em><a href=\"http:\/\/robotics.sciencemag.org\/content\/3\/19\/eaao6760.full\">Science Robotics<\/a><em> (open-access). Source: <\/em><em><a href=\"http:\/\/news.mit.edu\/2018\/personalized-deep-learning-equips-robots-autism-therapy-0627\">MIT<\/a><\/em><\/p>\n<p><em>* The study was funded by grants from the Japanese Ministry of Education, Culture, Sports, Science and Technology; Chubu University; and the European Union\u2019s HORIZON 2020 grant (EngageME).<\/em><\/p>\n<p><em>** A deep-learning system uses multiple hierarchical layers of data processing to perform its tasks, with each successive layer amounting to a slightly more abstract representation of the original raw data. 
Deep learning has been used in automatic speech and object-recognition programs, making it well-suited for a problem such as making sense of the multiple features of the face, body, and voice that go into understanding a more abstract concept such as a child\u2019s engagement.<\/em><\/p>\n<div id=\"attachment_318158\" class=\"wp-caption aligncenter\" style=\"width: 537px;  border: 1px solid #dddddd; background-color: #f3f3f3; padding-top: 4px; margin: 10px; text-align:center; display: block; margin-right: auto; margin-left: auto;\"><a href=\"http:\/\/www.kurzweilai.net\/how-robots-aided-by-deep-learning-could-help-autism-therapists\/key-stages-sensing-perception-and-interaction-during-robot-assisted-autism-therapy\" rel=\"attachment wp-att-318158\"><img class=\" wp-image-318158\" title=\"key stages (sensing, perception, and interaction) during robot-assisted autism therapy\" src=\"http:\/\/www.kurzweilai.net\/images\/key-stages-sensing-perception-and-interaction-during-robot-assisted-autism-therapy.png\" alt=\"\" width=\"527\" height=\"545\" \/><\/a><p style=' padding: 0 4px 5px; margin: 0;'  class=\"wp-caption-text\">Overview of the key stages (sensing, perception, and interaction) during robot-assisted autism therapy.<br \/>Data from three modalities (audio, visual, and autonomic physiology) were recorded using unobtrusive audiovisual sensors and sensors worn on the child\u2019s wrist, providing the child\u2019s heart-rate, skin-conductance (EDA), body temperature, and accelerometer data. The focus of this work is the robot perception, for which we designed the personalized deep learning framework that can automatically estimate levels of the child\u2019s affective states and engagement. These can then be used to optimize the child-robot interaction and monitor the therapy progress (see Interpretability and utility). The images were obtained by using Softbank Robotics software for the NAO robot. 
(credit: Ognjen Rudovic et al.\/Science Robotics)<\/p><\/div>\n<p><em>\u201cIn the case of facial expressions, for instance, what parts of the face are the most important for estimation of engagement?\u201d Rudovic says. \u201cDeep learning allows the robot to directly extract the most important information from that data without the need for humans to manually craft those features.\u201d<\/em><\/p>\n<p><em>The robots\u2019 personalized deep learning networks were built from layers of these video, audio, and physiological data, along with information about each child\u2019s autism diagnosis and abilities, culture, and gender. The researchers then compared their estimates of the children\u2019s behavior with estimates from five human experts, who coded the children\u2019s video and audio recordings on a continuous scale to determine how pleased or upset, how interested, and how engaged the child seemed during the session.<\/em><\/p>\n<p><em>*** Trained on these personalized data coded by the humans, and tested on data not used in training or tuning the models, the networks significantly improved the robot\u2019s automatic estimation of the child\u2019s behavior for most of the children in the study, beyond what would be estimated if the network combined all the children\u2019s data in a \u201cone-size-fits-all\u201d approach, the researchers found. Rudovic and colleagues were also able to probe how the deep learning network made its estimations, which uncovered some interesting cultural differences among the children. \u201cFor instance, children from Japan showed more body movements during episodes of high engagement, while in Serbia large body movements were associated with disengagement episodes,\u201d Rudovic notes.<\/em><\/p>\n
This is an example of a therapy session augmented with SoftBank Robotics&rsquo; humanoid robot NAO and deep-learning software. The 35 children with autism who participated in this study ranged in age from 3 to 13. [&#8230;]<\/p>\n","protected":false},"author":454,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[46,49,56,43],"tags":[],"class_list":["post-25196","post","type-post","status-publish","format-standard","hentry","category-airobotics","category-cognitive-scienceneuroscience","category-human-enhancement","category-news"],"_links":{"self":[{"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/posts\/25196"}],"collection":[{"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/users\/454"}],"replies":[{"embeddable":true,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/comments?post=25196"}],"version-history":[{"count":1,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/posts\/25196\/revisions"}],"predecessor-version":[{"id":25197,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/posts\/25196\/revisions\/25197"}],"wp:attachment":[{"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/media?parent=25196"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/categories?post=25196"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/tags?post=25196"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}