Sci Rep. 2023 Aug 15;13(1):13273. doi: 10.1038/s41598-023-40446-5.

Using principles of motor control to analyze performance of human machine interfaces

Shriniwas Patwardhan et al.

Abstract

There have been significant advances in biosignal extraction techniques to drive external biomechatronic devices or to serve as inputs to sophisticated human machine interfaces. The control signals are typically derived from biological signals such as myoelectric measurements made either from the surface of the skin or subcutaneously, and other biosignal sensing modalities are emerging. With improvements in sensing modalities and control algorithms, it is becoming possible to robustly control the target position of an end-effector. It remains largely unknown, however, to what extent these improvements can lead to naturalistic, human-like movement. In this paper, we sought to answer this question. We utilized a sensing paradigm called sonomyography, based on continuous ultrasound imaging of forearm muscles. Unlike myoelectric control strategies, which measure electrical activation and use the extracted signals to determine the velocity of an end-effector, sonomyography measures muscle deformation directly with ultrasound and uses the extracted signals to proportionally control the position of an end-effector. Previously, we showed that users were able to accurately and precisely perform a virtual target acquisition task using sonomyography. In this work, we investigate the time course of the control trajectories derived from sonomyography. We show that the time course of the sonomyography-derived trajectories that users take to reach virtual targets reflects the kinematic characteristics typical of biological limbs. Specifically, during a target acquisition task, the velocity profiles followed the minimum jerk trajectory characteristic of point-to-point arm reaching movements, with similar times to target. In addition, the trajectories based on ultrasound imaging showed a systematic delay and scaling of peak movement velocity as the movement distance increased. We believe this is the first evaluation of similarities between the control policies underlying coordinated movements of jointed limbs and those based on position control signals extracted at the individual muscle level. These results have strong implications for the future development of control paradigms for assistive technologies.
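For reference, the minimum jerk trajectory against which the measured profiles are compared is typically written in the standard point-to-point form, where x_0 is the start position, x_f is the target position, T is the movement duration, and \tau = t/T (notation introduced here for illustration):

x(t) = x_0 + (x_f - x_0)\,(10\tau^3 - 15\tau^4 + 6\tau^5)
\dot{x}(t) = \frac{x_f - x_0}{T}\,(30\tau^2 - 60\tau^3 + 30\tau^4)

The corresponding velocity profile is bell-shaped and symmetric, peaking at 1.875\,(x_f - x_0)/T at the movement midpoint; this is the behavior the measured velocity traces are compared against below.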


Conflict of interest statement

The authors declare no competing interests.

Figures

Figure 1
(a) Setup for Experiment 1. Subjects were seated in front of a digitizing tablet that captured their movement trajectories as they moved a manipulandum. An LCD monitor placed above the tablet showed the target location and the subject's manipulandum position. Subjects were asked to acquire the target positions displayed on the screen. (b) Setup for Experiment 2. Subjects were instrumented with an ultrasound transducer that recorded cross-sectional ultrasound images of their forearm, yielding a proportional signal corresponding to their extent of flexion. This sonomyography-derived signal drove the position of the subject-controlled cursor on the screen in front of them. Subjects were asked to acquire the target positions displayed on the screen. (c) Diagram of the target acquisition task in Experiment 1. The dark circle on the left shows the starting position; all possible target locations are shown as faint circles. (d) Diagram of the target acquisition task in Experiment 2. All possible targets are shown as faint red lines, and the 'X' indicates the user-controlled cursor. For a right-handed person, flexing the wrist moved the cursor left and extending the wrist moved the cursor right, with a fully extended wrist corresponding to the cursor at the right-most edge of the screen and a fully flexed wrist corresponding to the cursor at the left-most edge; the mapping was mirrored for a left-handed person.
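As an illustration of the proportional position mapping described in (d), the sketch below (Python; the function name and clamping choices are hypothetical, not taken from the paper) maps a normalized flexion signal to a horizontal cursor coordinate:

def flexion_to_cursor_x(flexion, screen_width, right_handed=True):
    """Map a normalized flexion signal (0.0 = fully extended wrist,
    1.0 = fully flexed wrist) to a horizontal cursor position in pixels.

    For a right-handed user, full extension places the cursor at the
    right-most edge and full flexion at the left-most edge; the mapping
    is mirrored for a left-handed user.
    """
    flexion = min(max(flexion, 0.0), 1.0)  # clamp to the valid signal range
    if right_handed:
        return (1.0 - flexion) * (screen_width - 1)
    return flexion * (screen_width - 1)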
Figure 2
Position traces versus time. The red line shows the mean position trace across all subjects, and the shaded yellow region shows one standard deviation. The horizontal axis represents time (seconds) from movement onset at time zero, and the vertical axis represents the distance to the target as a percentage of the workspace. The black line represents the minimum jerk trajectory based on the time to reach the target.
Figure 3
Velocity traces versus time. The red line shows the mean velocity trace across all subjects, and the shaded yellow region shows one standard deviation. The horizontal axis represents time (seconds) from movement onset at time zero, and the vertical axis represents the movement velocity as a percentage of the screen covered per second. The black line represents the minimum jerk trajectory based on the time to reach the target.
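A minimal sketch of how such a minimum jerk reference trace can be generated (Python with NumPy; the function name and sampling are illustrative, not the authors' code):

import numpy as np

def minimum_jerk(distance, duration, n_samples=200):
    """Return time, position, and velocity arrays for a minimum jerk
    point-to-point movement of the given distance and duration."""
    t = np.linspace(0.0, duration, n_samples)
    tau = t / duration
    pos = distance * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)
    vel = (distance / duration) * (30 * tau**2 - 60 * tau**3 + 30 * tau**4)
    return t, pos, vel

Plotting pos and vel against t reproduces the smooth sigmoidal position trace and bell-shaped velocity profile shown as the black reference lines in Figures 2 and 3, with a peak velocity of 1.875 * distance / duration at the movement midpoint.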
Figure 4
Peak velocity versus movement distance. Each violin shows the distribution of peak velocities for all trials plotted against movement distance for those trials. As the movement amplitude increased, the subjects proportionally increased the peak movement velocity.
Figure 5
Time to target in seconds. The time taken by subjects to acquire a specific target increased as the movement distance to that target increased.
Figure 6
Normalized time to target versus normalized peak velocity. Each trial's peak velocity and time to target are normalized with respect to the average peak velocity and average time to target for the smallest movement distance. Each color represents a different movement amplitude. Each ellipse is centered at the normalized mean of the distribution, and its major and minor axes represent the standard deviations along those axes. As the movement amplitude increases (from blue to magenta), the time to target (vertical location of each ellipse) changes much less than the peak velocity (horizontal location of each ellipse).
Figure 7
Normalized mean velocity profiles across all subjects versus time. The data is grouped by movement amplitude (each color represents a different movement distance). The peak velocity achieved by the subjects increases as the target moves further away from the start location.
Figure 8
Root mean squared position error between the users’ cursor position and the ideal minimum jerk trajectory. The error increased as the movement distance to that target increased.
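A minimal sketch of the error metric described above (assuming the recorded cursor trace and the ideal minimum jerk trajectory have already been resampled onto the same time points; this is not the authors' published code):

import numpy as np

def rms_position_error(cursor_pos, ideal_pos):
    """Root mean squared error between a recorded cursor trace and the
    ideal minimum jerk trajectory, sampled at the same time points."""
    cursor_pos = np.asarray(cursor_pos, dtype=float)
    ideal_pos = np.asarray(ideal_pos, dtype=float)
    return float(np.sqrt(np.mean((cursor_pos - ideal_pos) ** 2)))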
Figure 9
Percentage of trials completed versus time, grouped by movement distance. The percentage of trials completed within a given time decreased as the movement distance increased. Each color shows a different movement distance, and the black line shows the average across all movement distances. The figure also shows that almost all trials were completed within 2 s.
Figure 10
Path efficiency for both control modalities versus movement distance. Path efficiency is the ratio of actual path length to the ideal path length.
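Following the definition given in the caption (actual path length divided by ideal path length), a minimal sketch for a one-dimensional cursor trace such as the sonomyography condition (hypothetical helper; how the two-dimensional manipulandum traces are handled is not shown here):

import numpy as np

def path_efficiency(positions):
    """Ratio of the total distance traveled by the cursor to the ideal
    straight-line distance from start to end of the movement."""
    positions = np.asarray(positions, dtype=float)
    actual = np.sum(np.abs(np.diff(positions)))
    ideal = abs(positions[-1] - positions[0])
    return actual / ideal if ideal > 0 else float("nan")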
