{"id":856,"date":"2015-08-19T02:58:11","date_gmt":"2015-08-19T02:58:11","guid":{"rendered":"http:\/\/www.kurzweilai.net\/?p=259814"},"modified":"2015-08-19T02:58:11","modified_gmt":"2015-08-19T02:58:11","slug":"a-brain-computer-interface-for-controlling-an-exoskeleton","status":"publish","type":"post","link":"https:\/\/hoo.central12.com\/fugic\/2015\/08\/19\/a-brain-computer-interface-for-controlling-an-exoskeleton\/","title":{"rendered":"A brain-computer interface for controlling an exoskeleton"},"content":{"rendered":"<div id=\"attachment_259873\" class=\"wp-caption aligncenter\" style=\"width: 407px;  border: 1px solid #dddddd; background-color: #f3f3f3; padding-top: 4px; margin: 10px; text-align:center; display: block; margin-right: auto; margin-left: auto;\"><img class=\" wp-image-259873\" title=\"exoskeleton control system\" src=\"http:\/\/www.kurzweilai.net\/images\/exoskeleton-control-system.jpg\" alt=\"\" width=\"397\" height=\"318\" \/><p style=' padding: 0 4px 5px; margin: 0;'  class=\"wp-caption-text\">A volunteer calibrating the exoskeleton brain-computer interface (credit: Korea University\/TU Berlin)<\/p><\/div>\n<p>Scientists at <a href=\"http:\/\/www.korea.ac.kr\/\" >Korea University<\/a> and <a href=\"http:\/\/www.tu-berlin.de\/\" >TU Berlin<\/a> have developed a brain-computer interface (BCI) that controls a lower limb exoskeleton for gait assistance by decoding specific signals from the user&#8217;s brain.<\/p>\n<div id=\"attachment_259872\" class=\"wp-caption aligncenter\" style=\"width: 407px;  border: 1px solid #dddddd; background-color: #f3f3f3; padding-top: 4px; margin: 10px; text-align:center; display: block; margin-right: auto; margin-left: auto;\"><img class=\" wp-image-259872\" title=\"visual stimulus generation\" src=\"http:\/\/www.kurzweilai.net\/images\/visual-stimulus-generation.jpg\" alt=\"\" width=\"397\" height=\"325\" \/><p style=' padding: 0 4px 5px; margin: 0;'  class=\"wp-caption-text\">LEDs flickering at five different 
frequencies code for five different commands (credit: Korea University\/TU Berlin)<\/p><\/div>\n<p>Using an <a href=\"https:\/\/en.wikipedia.org\/wiki\/Electroencephalography\" >electroencephalogram (EEG)<\/a> cap, the system allows users to move forward, turn left and right, sit, and stand, simply by staring at one of five flickering light emitting diodes (LEDs).<\/p>\n<p>Each of the five LEDs flickers at a different frequency, corresponding to one of five movement commands. When the user focuses their attention on a specific LED, the flickering light generates a visual evoked potential in the EEG signal, which is then identified by a computer and used to control the exoskeleton to move in the appropriate manner (forward, left, right, stand, sit).<\/p>\n<p style=\"text-align: center;\"><iframe frameborder=\"0\" height=\"360\" src=\"https:\/\/www.youtube.com\/embed\/jeLghZ8GASA?rel=0\" width=\"640\"><\/iframe><br \/>\n<em>Korea University\/TU Berlin | A brain-computer interface for controlling an exoskeleton<\/em><\/p>\n<p>The results are published in an <a href=\"http:\/\/iopscience.iop.org\/1741-2552\/12\/5\/056009\/article\" >open-access paper<\/a> today (August 18) in the\u00a0<em>Journal of Neural Engineering<\/em>.<\/p>\n<p>&#8220;A key problem in designing such a system is that exoskeletons create lots of electrical &#8216;noise,&#8217;&#8221; explains <a href=\"https:\/\/www.ml.tu-berlin.de\/menue\/members\/klaus-robert_mueller\/\" >Klaus-Robert M\u00fcller<\/a>, an author of the paper. &#8220;The EEG signal [from the brain] gets buried under all this noise, but our system is able to separate out the EEG signal and the frequency of the flickering LED within this signal.&#8221;<\/p>\n<p>&#8220;People with <a href=\"https:\/\/en.wikipedia.org\/wiki\/Amyotrophic_lateral_sclerosis\" >amyotrophic lateral sclerosis (ALS)<\/a> (motor neuron disease) or spinal cord injuries face difficulties communicating or using their limbs,&#8221; he said. 
This system could let them walk again, he believes. He suggests that the control system could be added on to existing BCI devices, such as <a href=\"http:\/\/www.kurzweilai.net\/openbci-opens-up-low-cost-brain-wave-controlled-experimentation-to-everyone\" >OpenBCI<\/a> devices.<\/p>\n<p>In experiments with 11 volunteers, it took only a few minutes for them to learn to operate the system. Because the system relies on flickering LEDs, the volunteers were carefully screened for epilepsy before taking part in the research. The researchers are now working to reduce the &#8220;visual fatigue&#8221; associated with longer-term use.<\/p>\n<hr \/>\n<p><strong>Abstract of\u00a0<em>A lower limb exoskeleton control system based on steady state visual evoked potentials<\/em><\/strong><\/p>\n<p><em>Objective.<\/em>\u00a0We have developed an asynchronous brain\u2013machine interface (BMI)-based lower limb exoskeleton control system based on steady-state visual evoked potentials (SSVEPs).<\/p>\n<p><em>Approach.<\/em>\u00a0By decoding electroencephalography signals in real-time, users are able to walk forward, turn right, turn left, sit, and stand while wearing the exoskeleton. SSVEP stimulation is implemented with a visual stimulation unit, consisting of five light emitting diodes fixed to the exoskeleton. A canonical correlation analysis (CCA) method for the extraction of frequency information associated with the SSVEP was used in combination with\u00a0<em>k<\/em>-nearest neighbors.<\/p>\n<p><em>Main results.<\/em>\u00a0Overall, 11 healthy subjects participated in the experiment to evaluate performance. To achieve the best classification, CCA was first calibrated in an offline experiment. 
In the subsequent online experiment, our results exhibit accuracies of 91.3 \u00b1 5.73%, a response time of 3.28 \u00b1 1.82 s, an information transfer rate of 32.9 \u00b1 9.13 bits\/min, and a completion time of 1100 \u00b1 154.92 s for the experimental parcour studied.\u00a0<em><\/em><\/p>\n<p><em>Significance.<\/em>\u00a0The ability to achieve such high quality BMI control indicates that an SSVEP-based lower limb exoskeleton for gait assistance is becoming feasible.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Scientists at Korea University and TU Berlin have developed a brain-computer interface (BCI) for a lower limb exoskeleton used for gait assistance by decoding specific signals from the user&rsquo;s brain. Using an electroencephalogram (EEG) cap, the system allows users to move forward, turn left and right, sit, and stand, simply by staring at one of [&#8230;]<\/p>\n","protected":false},"author":13,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[49,48,43],"tags":[],"class_list":["post-856","post","type-post","status-publish","format-standard","hentry","category-cognitive-scienceneuroscience","category-electronics","category-news"],"_links":{"self":[{"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/posts\/856"}],"collection":[{"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/users\/13"}],"replies":[{"embeddable":true,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/comments?post=856"}],"version-history":[{"count":1,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/posts\/856\/revisions"}],"predecessor-version":[{"id":857,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/posts\/856\/revisions\/857"}],"wp:att
achment":[{"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/media?parent=856"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/categories?post=856"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/hoo.central12.com\/fugic\/wp-json\/wp\/v2\/tags?post=856"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}
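The abstract above describes decoding SSVEPs with canonical correlation analysis (CCA): the EEG segment is compared against sine/cosine reference signals at each LED's flicker frequency, and the best-matching frequency selects the command. The Python sketch below illustrates that general idea on simulated data; the sampling rate, channel count, and the five stimulation frequencies are illustrative assumptions, not the study's actual parameters, and this is not the authors' implementation (their pipeline additionally uses k-nearest neighbors and offline calibration).

```python
import numpy as np

def canonical_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y."""
    Qx, _ = np.linalg.qr(X - X.mean(axis=0))
    Qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    # Singular values of Qx^T Qy are the canonical correlations.
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def ssvep_reference(freq, fs, n_samples, n_harmonics=2):
    """Sine/cosine reference signals at freq and its harmonics."""
    t = np.arange(n_samples) / fs
    cols = []
    for h in range(1, n_harmonics + 1):
        cols.append(np.sin(2 * np.pi * h * freq * t))
        cols.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(cols)

def classify_ssvep(eeg, fs, candidate_freqs):
    """eeg: (n_samples, n_channels). Returns index of best-matching frequency."""
    scores = [canonical_corr(eeg, ssvep_reference(f, fs, eeg.shape[0]))
              for f in candidate_freqs]
    return int(np.argmax(scores))

# Demo on simulated data (all parameters below are made up for illustration).
fs = 256                                   # assumed sampling rate, Hz
freqs = [9.0, 11.0, 13.0, 15.0, 17.0]      # hypothetical LED flicker frequencies
commands = ["forward", "left", "right", "stand", "sit"]
rng = np.random.default_rng(0)
t = np.arange(2 * fs) / fs
# Simulated 4-channel EEG dominated by the 13 Hz stimulus plus noise.
eeg = (np.sin(2 * np.pi * 13.0 * t)[:, None] * rng.uniform(0.5, 1.0, 4)
       + 0.8 * rng.standard_normal((len(t), 4)))
idx = classify_ssvep(eeg, fs, freqs)
print(commands[idx])  # prints "right" for this simulated recording
```

In practice the 2 s window trades off against responsiveness (the paper reports a mean response time of about 3.3 s), and a rejection threshold on the top correlation is needed so that the asynchronous system issues no command when the user is not attending to any LED.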