Mapping human senses for rehabilitation by a wrist robotic device
Date of Issue: 2017
School of Mechanical and Aerospace Engineering
Proprioception, the sense of limb position and body movement, plays a vital role in daily activities: an impaired proprioceptive system degrades position sense and increases the risk of reinjury. To assess and quantify proprioception precisely and reliably, robotic technology has been introduced. A widely used method of proprioception assessment is to actively match or reproduce a previously experienced target position. Evidence shows that matching accuracy depends on whether the target is presented kinaesthetically or visually, and that during kinaesthetic target presentation, variance from the target position is lower when the target is presented actively rather than passively. To map human joint position sense quantitatively and to compare the accuracy of active and passive movements in the presence of a visual target, eight subjects undertook joint position matching tasks on a programmed wrist robot, in which they were asked to match a target position after the target had been presented, with visualisation, both actively and passively. The matching error, error bias and variability for both conditions were calculated and compared. Matching accuracy was obtained separately in flexion, extension, abduction and adduction, demonstrating this wrist robot to be a highly suitable platform for assessing wrist proprioceptive function. Overall, the results showed no significant difference in matching accuracy between active and passive target presentation when visualisation was integrated. Since movements coded by active afferent information have been shown to yield higher accuracy than passive movements, the results of this study indicate that presenting targets with visualisation affects matching accuracy.
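The matching error, error bias and variability mentioned above can be sketched as follows; this is a minimal illustration, not the thesis's actual analysis code, and the function name and use of mean absolute error, mean signed error and sample standard deviation are assumptions about the conventional definitions of these metrics.

```python
import statistics

def matching_metrics(targets_deg, responses_deg):
    """Summarise joint position matching performance for one condition.

    targets_deg   : presented target wrist angles (degrees)
    responses_deg : the subject's reproduced angles (degrees)

    Returns (matching error, bias, variability), where:
      matching error = mean absolute signed error (overall accuracy)
      bias           = mean signed error (constant over/undershoot)
      variability    = sample standard deviation of the signed errors
    """
    # Signed error per trial: positive means the subject overshot the target.
    errors = [r - t for t, r in zip(targets_deg, responses_deg)]
    matching_error = statistics.mean(abs(e) for e in errors)
    bias = statistics.mean(errors)
    variability = statistics.stdev(errors)
    return matching_error, bias, variability
```

Computed per movement direction (flexion, extension, abduction, adduction) and per presentation condition (active vs. passive), these three numbers are what a paired comparison between conditions would operate on.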
Final Year Project (FYP)
Nanyang Technological University