Paper Title

3D Printed Brain-Controlled Robot-Arm Prosthetic via Embedded Deep Learning from sEMG Sensors

Authors

David Lonsdale, Li Zhang, Richard Jiang

Abstract

In this paper, we present our work on developing a robot-arm prosthetic via deep learning. Our work proposes to use transfer learning techniques, applied to the Google Inception model, to retrain its final layer for surface electromyography (sEMG) classification. Data were collected using the Thalmic Labs Myo Armband and used to generate graph images comprising 8 subplots per image, each containing the sEMG data captured from 40 data points per sensor, corresponding to the array of 8 sEMG sensors in the armband. The captured data were then classified into four categories (Fist, Thumbs Up, Open Hand, Rest) using a deep learning model, Inception-v3, retrained via transfer learning to accurately predict each gesture on real-time input of new data. This trained model was then deployed to an ARM-processor-based embedded system to enable the brain-controlled robot-arm prosthetic manufactured with our 3D printer. To test the functionality of the method, a robotic arm was produced using a 3D printer, together with off-the-shelf hardware to control it. The SSH communication protocol is employed to execute Python files hosted on an embedded Raspberry Pi with an ARM processor, triggering the predicted gesture's movement on the robot arm.
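
To make the abstract's image-generation step concrete, below is a minimal Python sketch of rendering one 8-subplot graph image from a window of Myo sEMG data. The `(8, 40)` array shape follows the abstract (8 sensors, 40 data points per sensor); the `window_to_image` helper, figure size, and output path are illustrative assumptions, not details from the paper.

```python
# Sketch: render one 8-subplot graph image from a window of Myo sEMG data.
# `window` is assumed to be an (8, 40) array: 8 armband sensors, 40 data
# points per sensor, as described in the abstract.
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless rendering, suitable for an embedded device
import matplotlib.pyplot as plt

def window_to_image(window: np.ndarray, out_path: str) -> None:
    """Save an 8-subplot figure (one subplot per sEMG sensor) as a PNG."""
    fig, axes = plt.subplots(8, 1, figsize=(4, 8), sharex=True)
    for sensor_idx, ax in enumerate(axes):
        ax.plot(window[sensor_idx])  # 40 samples for this sensor
        ax.set_ylim(-128, 127)       # Myo raw EMG values are signed 8-bit
        ax.axis("off")               # the classifier needs pixels, not ticks
    fig.savefig(out_path, dpi=75)
    plt.close(fig)

# Example with random stand-in data (a real pipeline would stream from the Myo):
window_to_image(np.random.randint(-128, 128, size=(8, 40)), "sample_fist.png")
```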
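For the classification step, the abstract describes retraining the final layer of Inception-v3 via transfer learning. The sketch below is a minimal Keras-style equivalent, not the authors' exact training script; the `gesture_images/` directory layout (one subdirectory per gesture class) and the hyperparameters are assumptions.

```python
# Sketch: retrain only a new final layer on top of a frozen Inception-v3 for
# the four gesture classes (Fist, Thumbs Up, Open Hand, Rest). Assumes graph
# images are sorted into per-class subdirectories under gesture_images/.
import tensorflow as tf

NUM_CLASSES = 4

base = tf.keras.applications.InceptionV3(
    weights="imagenet", include_top=False, pooling="avg",
    input_shape=(299, 299, 3))
base.trainable = False  # freeze the convolutional layers; train the head only

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # Inception expects [-1, 1]
    base,
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])

train_ds = tf.keras.utils.image_dataset_from_directory(
    "gesture_images/", image_size=(299, 299), label_mode="categorical")
model.fit(train_ds, epochs=5)
```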
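Finally, the abstract states that SSH is used to execute Python files on the Raspberry Pi to trigger arm movement. A minimal sketch using the paramiko library is shown below; the hostname, credentials, and per-gesture script paths are placeholders, not values from the paper.

```python
# Sketch: after the model predicts a gesture, run the matching Python script
# on the Raspberry Pi over SSH to drive the robot arm's movement.
import paramiko

# Placeholder script paths, one per gesture class (not from the paper).
GESTURE_SCRIPTS = {
    "fist": "/home/pi/arm/fist.py",
    "thumbs_up": "/home/pi/arm/thumbs_up.py",
    "open_hand": "/home/pi/arm/open_hand.py",
    "rest": "/home/pi/arm/rest.py",
}

def trigger_gesture(gesture: str, host: str = "raspberrypi.local") -> None:
    """Execute the per-gesture movement script on the Pi via SSH."""
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username="pi", password="raspberry")  # placeholder creds
    try:
        _, stdout, _ = client.exec_command(f"python3 {GESTURE_SCRIPTS[gesture]}")
        stdout.channel.recv_exit_status()  # block until the movement completes
    finally:
        client.close()

trigger_gesture("fist")
```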
