. 2023 Feb 1;13(1):1813.
doi: 10.1038/s41598-023-29091-0.

Accuracy and feasibility of a novel fine hand motor skill assessment using computer vision object tracking


Bokkyu Kim et al. Sci Rep.

Abstract

We developed a computer vision-based three-dimensional (3D) motion capture system that employs two action cameras to examine fine hand motor skill by tracking an object manipulated by the hand. This study aimed to examine the accuracy and feasibility of this approach for detecting changes in fine hand motor skill. We conducted three distinct experiments to assess the system's accuracy and feasibility, using two high-resolution, high-frame-rate action cameras. We evaluated the accuracy of the system in calculating the 3D locations of an object moving in various directions, and we examined its feasibility for identifying improvement in fine hand motor skill after practice in eleven non-disabled young adults. We used color-based object detection and tracking to estimate the object's 3D location, and then computed the object's kinematics, representing the endpoint of goal-directed arm reaching movements. Compared with ground-truth measurements, the findings demonstrated that our system can adequately estimate the 3D location of a moving object. We also showed that the system can measure the endpoint kinematics of goal-directed arm reaching movements to detect changes in fine hand motor skill after practice. Future research is needed to confirm the system's reliability and validity for assessing fine hand motor skills in patient populations.


Conflict of interest statement

The authors declare no competing interests.

Figures

Figure 1
Experiment 1 setup. (A) A photograph of the experiment's workspace. The table was draped with black cloth, and a tape measure was placed along the middle of the table, marked at 0.1 m intervals from 0.3 to 1 m. The stereo camera system was placed on the table, with the target object 0.3 m in front of the left camera. The object was then manually moved in the z (anterior–posterior) direction in 0.1 m increments to a distance of 1 m from the camera. Video was recorded at a frame rate of 24 frames per second (fps). (B) The experiment workspace viewed from above. The two cameras were set apart at a different baseline distance for each trial. In the data analysis, the camera image sensor offset (1 cm) was added to the estimated distance between the camera and the object.
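The paper does not reproduce its triangulation formula, but the baseline-dependent depth estimate described in this setup can be sketched with standard pinhole stereo geometry. The focal length, pixel coordinates, and the way the 1 cm sensor offset is applied below are illustrative assumptions, not the authors' calibration:

```python
def depth_from_disparity(x_left, x_right, focal_px, baseline_m, sensor_offset_m=0.01):
    """Estimate anterior-posterior (z) distance from the horizontal pixel
    disparity between matched centroids in the left and right images.
    Classic pinhole triangulation: z = f * B / d. The 1 cm offset models
    the distance from the lens to the image sensor, added as in the paper's
    analysis (exact handling assumed here)."""
    disparity = x_left - x_right           # pixels; larger for nearer objects
    z = focal_px * baseline_m / disparity  # depth in front of the lens
    return z + sensor_offset_m

# Synthetic check: with f = 1000 px and a 0.15 m baseline, a 500 px
# disparity corresponds to 0.30 m in front of the lens (0.31 m after offset).
z = depth_from_disparity(800.0, 300.0, focal_px=1000.0, baseline_m=0.15)
```

With a fixed focal length, a wider baseline yields a larger disparity at the same depth, which is why the experiment sweeps several baseline settings.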
Figure 2
Experiment 1 object detection process. (A) Raw image in RGB color space. (B) Image converted to HSV color space. (C) Cropped HSV image. (D) Binary mask of the target object. (E) Final object detection result with bounding box (red square) and centroid (yellow cross). The centroid's Cartesian coordinates are displayed in yellow text.
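The mask-and-centroid steps in panels (D)–(E) can be sketched as follows. This is a minimal NumPy-only version that assumes the frame has already been converted to HSV; the color bounds are placeholders, since the paper's actual thresholds and tooling are not given here:

```python
import numpy as np

def detect_centroid(hsv_img, lo, hi):
    """Threshold an HSV image (H x W x 3) into a binary mask and return the
    mask plus the centroid (cx, cy) in pixel coordinates. lo/hi are
    per-channel (hue, saturation, value) bounds -- illustrative values."""
    mask = np.all((hsv_img >= lo) & (hsv_img <= hi), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return mask, None                   # object not found in this frame
    return mask, (xs.mean(), ys.mean())     # centroid of the detected blob

# Toy frame: a 4x4 patch of "green" pixels inside a 10x10 dark image.
img = np.zeros((10, 10, 3), dtype=np.uint8)
img[3:7, 2:6] = (60, 200, 200)              # OpenCV-style green hue ~60
mask, centroid = detect_centroid(img, lo=(50, 100, 100), hi=(70, 255, 255))
```

Thresholding in HSV rather than RGB makes the detection robust to the lighting changes a black-cloth workspace and LED illumination cannot fully eliminate.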
Figure 3
Experiment 2 setup and object detection process. (A) Experimental setup. The action cameras on their mount were placed in front of a laptop monitor. A simulated pendulum was played on the laptop at a 300 Hz refresh rate. The pendulum simulation comprises the pendulum anchor (red circle), rod (blue line), and bob (green circle). The pendulum started at an angle of π/2 and swung for 10 s. (B) Object detection process. The green and red circles were detected separately from the cropped HSV image.
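Given the separately detected anchor and bob centroids, the pendulum angle compared in the Experiment 2 results can be recovered with basic trigonometry. This `pendulum_angle` helper is hypothetical, assuming image y grows downward and the angle is measured from vertical; the paper does not state its exact formula:

```python
import math

def pendulum_angle(anchor, bob):
    """Angle of the rod from the vertical (radians), computed from the
    detected anchor (red) and bob (green) centroids in image coordinates.
    With y growing downward, dy is positive when the bob hangs below the
    anchor; at pi/2 the rod is horizontal."""
    dx = bob[0] - anchor[0]
    dy = bob[1] - anchor[1]
    return math.atan2(dx, dy)

# Bob level with the anchor and to its right: the starting pi/2 position.
theta = pendulum_angle(anchor=(100.0, 50.0), bob=(150.0, 50.0))
```

Computing the angle from two centroids per frame lets the tracked trace be compared directly against the simulation's ground-truth angle.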
Figure 4
Experiment 3 setup. (A) Chopstick object pick-up test workstation, constructed from aluminum extrusion profiles. Two cameras on a dual twin mount were installed on the upper back side of the workstation, angled at 30° from the table. The test template was positioned in the center of the table, and the workstation's top was illuminated by an LED light. A Bluetooth speaker on the back side of the table provided auditory cues. (B) The chopstick object pick-up test template. (1) Chopstick holding position. Participants positioned the tips of their chopsticks at this spot at the beginning of each trial; when they heard the whistle, they began moving the chopsticks to pick up the object. (2) Target object home location. The target plastic block was placed at this home position at the start of each trial. (3) Target object: a blue plastic block measuring 1 cm on each edge, created with a 3D printer. (4) Target location. Participants moved the plastic block from the home location to this location within 5 s on each trial. The target area measured 2 × 2 cm. A trial was regarded as successful if the participant placed the plastic block in this area without dropping it within five seconds.
Figure 5
Experiment 3 object detection process. (A) Raw image in RGB color space. (B) Image converted to HSV color space. (C) Cropped HSV image. (D) Binary mask of the detected target object. (E) Final object detection result with bounding box (red square) and centroid (yellow cross). The centroid's Cartesian coordinates, in pixels, are displayed in yellow text.
Figure 6
Experiment 1 results. (A) An example result from the 15 cm baseline setting. The left y-axis shows the distance between the left camera and the object; the right y-axis shows the tangential velocity profile. Blue lines indicate periods when the green ball was stationary; black lines indicate periods when the green ball was moving. Red dots mark the movement onset and offset of the green ball, identified from the velocity profile. (B) Average distance estimation error at different distances and baselines. The boxplot shows the distribution of distance estimation error at each object distance from the camera; the scatter plot shows the distance estimation error at different baseline settings.
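The tangential velocity profile and the velocity-based onset/offset marking described in panel (A) can be sketched as follows. The speed threshold and the toy trajectory are illustrative; the paper's actual onset criterion may differ:

```python
import numpy as np

def tangential_velocity(xyz, fps):
    """Frame-to-frame tangential speed (m/s) of the tracked centroid from
    its N x 3 array of 3D positions."""
    return np.linalg.norm(np.diff(xyz, axis=0), axis=1) * fps

def onset_offset(speed, threshold):
    """First and last samples above a speed threshold -- a common rule for
    marking movement onset and offset from a velocity profile."""
    above = np.nonzero(speed > threshold)[0]
    if above.size == 0:
        return None, None
    return int(above[0]), int(above[-1])

# Toy trajectory at 24 fps: still, then moving along z, then still again.
z = np.concatenate([np.zeros(5), np.linspace(0, 0.5, 11)[1:], np.full(5, 0.5)])
xyz = np.column_stack([np.zeros_like(z), np.zeros_like(z), z])
speed = tangential_velocity(xyz, fps=24)
onset, offset = onset_offset(speed, threshold=0.1)
```

Differencing the 3D positions and scaling by the frame rate gives speed in m/s; thresholding that profile separates the stationary (blue) from the moving (black) segments in the figure.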
Figure 7
Experiment 2 results. (A–C) Data recorded at a frame rate of 60 fps. (D–F) Data recorded at a frame rate of 120 fps. (A,D) Pendulum angle comparison. (B,E) Comparison of position in the x (mediolateral) direction. (C,F) Comparison of position in the y (superior–inferior) direction.
Figure 8
Exemplar tangential velocity profiles from one participant at different time points. (A) Tangential velocity profiles of the object movements at the baseline test (7 successful trials out of 10 are presented). Each thin black line shows the velocity profile of one trial; the thick blue line shows the mean velocity profile at that time point; gray shading indicates the standard error. (B) Tangential velocity profiles at the fifth block of practice (5 trials). (C) Tangential velocity profiles at the immediate retention test (10 successful trials out of 10 are presented). (D) Tangential velocity profiles at the delayed retention test (9 successful trials out of 10 are presented). Refer to Supplementary Fig. 7 for the success rates of the chopstick motor skill trials.
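The mean profile and standard-error band plotted in each panel can be computed as below. This sketch assumes the per-trial profiles have already been time-aligned to equal length; the paper's normalization procedure is not given here:

```python
import numpy as np

def mean_profile(trials):
    """Mean and standard error of the mean (SEM) across time-aligned
    velocity profiles, given as a trials x samples array. SEM uses the
    sample standard deviation (ddof=1) divided by sqrt(n trials)."""
    trials = np.asarray(trials, dtype=float)
    mean = trials.mean(axis=0)
    sem = trials.std(axis=0, ddof=1) / np.sqrt(trials.shape[0])
    return mean, sem

# Two toy trials with peaks of different heights at the same sample.
m, s = mean_profile([[0.0, 1.0, 0.0], [0.0, 3.0, 0.0]])
```

A narrowing SEM band across practice blocks is one way the figure conveys that the movement profiles became more consistent after practice.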
