Intuitive movement-based prosthesis control enables arm amputees to reach naturally in virtual reality – preprint

Effie Segas, Sébastien Mick, Vincent Leconte, Rémi Klotz, Daniel Cattaert, Aymar de Rugy
eLife, 2022-10-17.
DOI: 10.1101/2022.10.15.22281053


Impressive progress is being made in the design and control of bionic limbs. Yet controlling the numerous joints of a prosthetic arm needed to place the hand at the correct position and orientation to grasp objects remains challenging. Here, we designed an intuitive, movement-based prosthesis control that leverages natural arm coordination to predict the distal joints missing in arm amputees from proximal stump motion and knowledge of the movement goal. This control was validated on 29 participants, including 7 above-elbow amputees, who picked and placed bottles across a wide range of locations in virtual reality, with median success rates over 99% and movement times identical to those of natural movements. Remarkably, this was achieved without any prior training, indicating that the control is intuitive and instantly usable. It could be applied to phantom limb pain management in virtual reality, or to augment the reaching capabilities of invasive neural interfaces, which usually focus more on hand and grasp control.
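The core idea — predicting missing distal joints from proximal stump motion and the movement goal — can be illustrated with a minimal sketch. The example below is purely hypothetical: it fits a linear map from synthetic "natural coordination" data (the paper does not specify this model, and the joint counts and dimensions here are illustrative assumptions, not the authors' actual implementation).

```python
import numpy as np

# Hypothetical sketch of movement-based control: learn a mapping from
# (proximal joint angles, 3D movement goal) to distal joint angles,
# standing in for natural arm coordination. All names and sizes are
# illustrative assumptions, not the method from the paper.
rng = np.random.default_rng(0)

n = 500
proximal = rng.uniform(-1.0, 1.0, (n, 2))   # e.g. two shoulder angles
goal = rng.uniform(-1.0, 1.0, (n, 3))       # 3D target position
X = np.hstack([proximal, goal])

# Synthetic "natural movement" dataset: distal angles (e.g. elbow,
# forearm) generated by an unknown coordination plus small noise.
true_W = rng.normal(size=(5, 2))
distal = X @ true_W + 0.01 * rng.normal(size=(n, 2))

# Least-squares fit of the coordination model.
W, *_ = np.linalg.lstsq(X, distal, rcond=None)

def predict_distal(proximal_angles, goal_pos):
    """Predict missing distal joint angles from stump motion and goal."""
    x = np.concatenate([proximal_angles, goal_pos])
    return x @ W
```

In this toy setting the fitted map recovers the underlying coordination almost exactly; in practice the paper's point is that such predictions, driven by residual stump motion and the known goal, can be accurate enough for natural reaching without user training.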
