Participants were recruited from the UCI stroke survivor database, regional hospitals, and stroke support groups. Inclusion criteria were: age 18–85 years, a stroke (single or multiple, ischemic or hemorrhagic) confirmed by radiological imaging, occurring more than 6 months prior to enrollment, and residual unilateral upper extremity weakness (Table 1). Participants were excluded if they scored fewer than 3 blocks on the Box and Block Test (BBT) or had less than a 20% difference in BBT score between their affected and unaffected arms. Additional exclusion criteria included severe muscle tone of the affected arm (score > 3 on the Modified Ashworth Spasticity Scale), severe aphasia (score of 3 on question 9 of the NIH Stroke Scale), evidence of major depression (DSM-5 criteria or Geriatric Depression Scale score > 10), and concurrent participation in another study related to stroke recovery. A total of 46 chronic post-stroke adults were enrolled in this study. All visits took place at UCI. All participants provided informed consent (UCI IRB #476), and procedures were conducted according to the Declaration of Helsinki. The trial was registered on ClinicalTrials.gov (NCT04818073).
Fig. 1
Left: Experimental protocol of the parent randomized controlled trial. Data from the baseline sessions and the first week of training (green box) were used in this study to evaluate the feasibility of each training mode, as judged by success and motivation. The baseline sessions included clinical assessments of hand function (Box and Block Test, Fugl-Meyer) and robotic assessments of sensorimotor function (finger capacity, finger and thumb proprioception). Each robotic training session included 10 games of RehabHero (5 in two-string mode, 5 in three-string mode) and 18 games of FingerPong (8 in classic mode, 10 in target mode). Physical assistance was tuned during the first session, and unassisted gameplay was assessed in the third session. Right: The FINGER robot can support flexion and extension of the index and middle fingers, and adduction/abduction and flexion/extension of the thumb, shown here attached to a participant’s left hand. The black plastic occlusion screen slides over to cover the participant’s view of the hand during training and assessments. FINGER can be used to train either hand in mirrored right and left robotic configurations
Experimental protocol
In this paper, we report on the game-play success and motivation data collected from the first week of an ongoing clinical trial of robotic hand therapy for chronic stroke survivors (Fig. 1). Full clinical outcomes will be published in a subsequent paper. All participants trained using the FINGER robotic exoskeleton, which can assist and measure flexion and extension of the index and middle fingers, and adduction/abduction and flexion/extension of the thumb [36, 37]. “Baseline” clinical and robotic evaluations of hand sensorimotor function were performed twice, one week apart, prior to starting robotic training. At the second Baseline visit, participants were randomly assigned to one of three training groups: Standard Training, Proprioceptive Training, or Virtual Training (detailed below), and were briefly trained on the FINGER games to familiarize them with task instructions. The following week, participants trained in three 2-hour sessions with FINGER. For all groups, each training session consisted of 10 games of RehabHero (5 in two-string mode, 5 in three-string mode, described next) and 18 games of FingerPong (8 in classic mode, 10 in target mode, detailed below), resulting in 1060 cued flexion/extension movements per session.
Rehabilitation games
Participants in the Standard and Virtual Training groups played the rehabilitation games in the vision-based game mode, while participants in the Proprioceptive Training group played in the proprioceptive game mode (Fig. 2), detailed below. For all groups, the workspace for each game was bounded at 90% of the participant’s maximal range of motion in the device, with a minimum workspace of 30 degrees. In all groups, vision of the hand was occluded by a black plastic divider during training.
Vision-based games
In RehabHero, a version of the popular musical video game Guitar Hero (Fig. 2A), participants were cued to hit notes on one of three “strings” by flexing their index and/or middle finger to stop inside the target string as a scrolling note passed through it [36, 38]. Participants needed to flex their index finger to hit notes incoming on the top string, their middle finger to hit notes on the bottom string, and both fingers to hit notes on the middle string. In the protocol, the first five RehabHero songs used only the top and bottom strings (two-string mode), while the second five used all three strings (three-string mode), increasing the challenge. In FingerPong, participants flexed or extended their finger to move a paddle up or down to hit a ball to a computer opponent in a modified version of the classic Pong game (Fig. 2C). In classic mode, participants could hit the ball with any part of their paddle; in target mode, participants were cued to hit the ball with a specific part of the paddle to rebound the ball to highlighted targets, increasing the challenge. Participants played half of the FingerPong games with their index finger controlling the paddle, and half with their middle finger.
Fig. 2
Overview of visual and proprioceptive games. In all training modes, vision of the hand is blocked by the black plastic occlusion screen. A: Visually guided RehabHero, in which the note (green ball) moves across the screen to the “fretboard” (3 vertical keys representing guitar strings) to indicate which finger (top key = index finger, middle key = both fingers, bottom key = middle finger) to flex to try to hit the note when it reaches the fretboard. Participants’ finger positions are represented by the black dots. To train finger extension, participants were cued to extend their fingers past the green vertical line after each note. B: Proprioceptive RehabHero, in which vision of the fretboard and incoming notes is removed. The incoming note’s vertical position is displayed proprioceptively by the position of the thumb, which is moved upward into radial abduction to indicate top key notes, to a neutral middle position to indicate middle key notes, and downward into palmar abduction to indicate bottom key notes. The timing of the note arriving at the fretboard is displayed by a red line moving horizontally across the gaming monitor. C: Visually guided FingerPong, in which the player’s paddle and the ball are always visible. The player controls the paddle by flexing/extending their finger to move the paddle downward/upward to hit the ball when it arrives on the player’s side of the gaming monitor. D: Proprioceptive FingerPong, in which the vertical position of the ball is removed from the screen and displayed proprioceptively by the robot moving the “ball finger” in flexion/extension to convey the ball’s downward/upward position (left inset). The ball’s horizontal movement is displayed visually by a vertical yellow bar moving laterally across the screen. The participant must match their “paddle finger” to their “ball finger” when the yellow line reaches the player’s side of the monitor to hit the ball (right inset)
Proprioceptive gaming
The Proprioceptive Training mode for both RehabHero and FingerPong was implemented by removing some of the visual information displayed on the screen and replacing it with physical cues, a robotic gaming strategy we call “propriopixels” [32]. In RehabHero, the incoming note was replaced by a vertical bar moving across the screen to indicate the time at which the note would arrive at the string, while the note’s vertical position was occluded. Which string the incoming note was arriving on was instead displayed proprioceptively by movement of the thumb (Fig. 2B). The robot moved the thumb to a top position (corresponding to radial abduction) for top string notes, a bottom position (palmar abduction) for bottom string notes, and halfway between to indicate middle string notes. Because the hand was covered by an opaque screen during gameplay, participants had to sense their thumb position proprioceptively and use this information to decide which finger(s) to flex to hit the incoming note.
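This string-to-thumb mapping can be sketched as a simple lookup. The snippet below is our illustration only; the angle values and names are hypothetical, since the paper does not report the thumb workspace limits.

```python
# Hypothetical thumb workspace limits, in degrees (illustrative only).
RADIAL_ABD = 20.0   # "top" position (radial abduction)
PALMAR_ABD = -20.0  # "bottom" position (palmar abduction)

def string_to_thumb_target(string):
    """Map the cued string to a thumb target position:
    top -> radial abduction, bottom -> palmar abduction,
    middle -> halfway between the two limits."""
    targets = {
        "top": RADIAL_ABD,
        "middle": 0.5 * (RADIAL_ABD + PALMAR_ABD),
        "bottom": PALMAR_ABD,
    }
    return targets[string]
```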
For FingerPong, we replaced the on-screen display of the ball’s vertical position with robot-guided flexion/extension movement of one finger, conveying the ball’s bottom/top position on the screen (Fig. 2D). Again, the ball’s horizontal position (hit timing) was displayed by a vertical bar moving across the screen. Participants were tasked with relaxing their “ball finger”, proprioceptively sensing its position, and then moving the “paddle finger” (the finger controlling the paddle) to match its position. In target mode, which we designed to be more challenging, participants had to move the paddle finger into positions relative to the ball finger (slightly above, centered, slightly below) to hit the ball to targets on the other side of the screen.
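For the ball finger, the natural analogue is a continuous linear map from the ball’s normalized screen height onto the finger’s flexion/extension workspace. Again, this is a sketch under assumed conventions (the workspace bounds and the lower-ball-means-more-flexion convention are ours for illustration):

```python
def ball_to_finger_angle(ball_y, ext_limit=0.0, flex_limit=30.0):
    """Map the ball's normalized height (0 = bottom, 1 = top of screen)
    onto the ball finger's angular workspace in degrees.
    Lower ball -> more flexion; higher ball -> more extension."""
    ball_y = min(max(ball_y, 0.0), 1.0)  # clamp to the visible screen
    return flex_limit - ball_y * (flex_limit - ext_limit)
```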
Robot assistance strategy
In the previous FINGER clinical trial [28], we found that an 80% success rate yielded higher self-reported motivation than a 50% success rate. Similarly, our study of a home-based sensor system for unassisted movement training indicated that a game success rate of 80–90% was predictive of the best perseverance, with higher and lower success rates showing lower perseverance [30]. Thus, we chose to adjust the robot and game parameters to target 80% game success; these parameters depended on the game and assistance mode, as described below.
To stabilize game success at 80%, the assistive algorithm (Eq. 1) increased the gain by one increment x for each missed movement (e.g., a missed note) and decreased it by x/4 for each successful movement (e.g., a hit ball), as in our previous study [39]. With this 4:1 ratio, the gain converges to a stable value when the participant reaches 80% success, since the increase from one miss is then balanced by the decreases from four hits.
$$Gain\left(n+1\right)=\begin{cases}Gain\left(n\right)+x & \text{if missed}\\ Gain\left(n\right)-\frac{x}{4} & \text{if hit}\end{cases}$$
(1)
where n is the current trial number (e.g., note number in RehabHero), and x is an empirically determined increment used to adjust the parameter gains.
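As a concrete illustration, the update rule in Eq. 1 can be written in a few lines of Python (variable names are ours, not from the trial software). The expected per-trial change in gain is zero exactly at an 80% success rate, since 0.2·(+x) balances 0.8·(−x/4):

```python
def update_gain(gain, hit, x=1.0):
    """One step of the tuning rule (Eq. 1): increase the gain by x
    on a missed movement, decrease it by x/4 on a successful one."""
    return gain - x / 4.0 if hit else gain + x

def expected_change(p, x=1.0):
    """Expected per-trial gain change at success rate p:
    p * (-x/4) + (1 - p) * (+x), which crosses zero at p = 0.8."""
    return p * (-x / 4.0) + (1.0 - p) * x
```

Below 80% success the gain drifts upward (more assistance, or easier game parameters); above 80% it drifts downward, which is what makes the equilibrium stable.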
Physical assistance was implemented by the FINGER robot in the same way as in the original study. Briefly, the FINGER robot applied assistive forces using a compliant position controller with a tunable gain to guide the fingers along a smooth trajectory to intercept the note at the cued time, or to move the paddle to intercept the ball. Assistance was only provided if the participant initiated movement themselves, as determined via force sensors mounted at the connection points between the fingers and the FINGER exoskeleton (threshold = 2 N). Gains were increased for each missed note/finger extension in RehabHero or missed hit in FingerPong, and decremented for successful movements. In RehabHero, assistance gains were finger- and direction-specific; in FingerPong, assistance was bidirectional and averaged across fingers. This physical assistance strategy was used for the Standard Training and Proprioceptive Training groups.
In this work, we extended this algorithm to adjust game parameters rather than physical assistance gains to create the “Virtual Training” mode. In this group, we provided virtual assistance during visually guided gaming by amplifying the virtual representation of participants’ movements on the screen and by adjusting the required timing accuracy. In RehabHero, movement amplification was achieved by setting an inflection point halfway between the participant’s maximum flexion and extension and applying a tunable linear gain, such that flexion and extension movements made on either side of the inflection point were amplified in their on-screen representation. Timing constraints were adjusted independently from movement gains; missed notes resulted in both movement and timing adjustments, while instances in which the participant hit the note outside of the correct time window resulted in adjustments to the timing constraint alone. In FingerPong, virtual assistance was achieved by adjusting the paddle size and the speed of the ball, such that missed hits increased the paddle size and reduced the ball speed, while successful hits decreased the paddle size and increased the ball speed. Other than the graphical scaling of finger movement to game movement, the visual interfaces of the games in the physical and virtual assistance modes were identical.
Across all gaming modes, we further adjusted our algorithm to allow for increased virtual challenge if the participant exceeded an 80% success rate without any assistance. This was done because success rates above 90%, at which training becomes “too easy”, have been shown to reduce motivation for training [31]. Virtual challenge was implemented by increasing the required timing accuracy in RehabHero, and by increasing the speed of the ball and decreasing the paddle size in FingerPong. The gains for each game were tuned in the first session of each week to prevent slacking and then held constant for the remaining training sessions that week.
Performance metrics
Training performance
Performance during training was quantified by participants’ average success rate in each gaming mode across sessions 2 and 3, which were performed with fixed assistance gains. Unassisted gameplay was evaluated at the beginning of session 3, in two-string mode for RehabHero and classic mode for FingerPong, to compare with assisted gameplay. We additionally quantified the magnitude of assistance needed in each gaming mode, as well as the number of movements attempted across the three training sessions.
Robotic assessments of motor and proprioceptive ability
We assessed participants’ finger strength and proprioceptive ability using a battery of robotic assessments performed at the baseline visits. We collected these metrics to understand whether the adaptive algorithm was able to titrate success as a function of motor and proprioceptive impairment, which may impact the Virtual and Proprioceptive Training modes, respectively. All proprioceptive assessments were performed with vision of the hand occluded by a black plastic occlusion screen (Fig. 1).
To quantify the motor capacity of participants’ fingers, which we expected to impact the feasibility of training for the Virtual Training group, we measured participants’ maximum voluntary contraction (MVC) in flexion and extension for both the index and middle fingers when moved independently and in tandem. We then quantified motor ability using a previously developed summary metric, “hand capacity”, which combines patients’ finger strength and their ability to move the fingers independently [40]. Higher scores coincide with greater strength and individuation.
The primary proprioceptive assessment was Crisscross, which was developed for our previous clinical trial with FINGER and was shown to be predictive of responsiveness to Standard Robotic Training [28, 33, 41]. In Crisscross, participants are asked to press a button at the perceived instant of finger crossing as the robot moves their fingers back and forth in an alternating, crossing pattern. In this study, Crisscross had 20 total crossings, occurring at speeds ranging from 8 to 18 deg/s per crossing in a pseudorandom order. Performance was quantified by the average absolute error between fingers at the instant of the button press.
The next two assessments, ThumbSense and Move and Match, were designed to match the proprioceptive elements of Proprioceptive Training, assessing participants’ proprioceptive ability independently of a gamified setting. ThumbSense assessed participants’ ability to discern between the thumb positions used in the proprioceptive RehabHero mode (detailed above). The robot moved the thumb to the top, middle, and bottom string positions 20 times in a pseudorandom order at the same speed as used in training. The robot paused for 6–10 s in each position, matched to the duration allowed for position discernment during gameplay. Participants were instructed to state their thumb position each time it came to rest and were scored based on their accuracy. The second assessment, Move and Match, tasked participants with moving one finger (index/middle) to track robot-facilitated movements of the other finger (middle/index). Physical performance in this assessment mimicked the tracking and matching of the ball finger in the proprioceptive FingerPong game (detailed above), as well as commonly used joint position reproduction assessments [34, 42–44]. Performance was quantified as the average absolute tracking error between fingers. Note that Move and Match performance depended not only on proprioceptive ability, but also on finger movement ability. For a more direct measure of finger proprioceptive ability alone, we used Crisscross.
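The Move and Match score, an average absolute tracking error, is straightforward to compute; the sketch below assumes synchronously sampled angle traces for the two fingers (our illustration, not the trial code).

```python
def tracking_error(tracking_finger, reference_finger):
    """Average absolute angular error (in degrees) between the actively
    tracking finger and the robot-moved reference finger, given two
    equal-length, synchronously sampled angle traces."""
    if len(tracking_finger) != len(reference_finger):
        raise ValueError("traces must have the same number of samples")
    pairs = zip(tracking_finger, reference_finger)
    return sum(abs(a - b) for a, b in pairs) / len(tracking_finger)
```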
Clinical assessments
In addition to the robotic assessments, we performed standard clinical tests of hand function at the baseline visits, including the Box and Block Test, which scores the number of blocks transferred over a divider within one minute by the affected arm [45], and the Fugl-Meyer Upper Extremity Assessment (FMA-UE) [46].
Motivation assessment
Finally, to evaluate participants’ motivation in each training mode (Standard, Virtual, Proprioceptive), we administered an abbreviated version of the Intrinsic Motivation Inventory (IMI) [47], identical to the version we used in the previous study of FINGER [28]. The survey included 14 questions evaluating participants’ perceived Value/Usefulness of the task, Pressure/Tension while performing the task, Effort/Importance of the task, Interest/Enjoyment of the task, and their perceived Competency (Supplemental Fig. 1). Participants completed the IMI at the end of the first training session, after completing 10 RehabHero games and 18 FingerPong games.
Data analysis
We used Kolmogorov-Smirnov testing to determine the normality of the data. For normally distributed data, we used two-sample and paired t-tests for comparisons; for non-normal data, we used Wilcoxon rank-sum and signed-rank tests.
To determine the efficacy of the assistance algorithm in titrating success within each group, we quantified participants’ average success and the level of assistance needed in each training mode across sessions 2 and 3, and compared participants’ average assisted performance to unassisted gameplay using comparison testing. Within each of the new training groups, we performed a correlational analysis to determine whether the assistance strategy was able to mediate success in individuals with motor deficits (finger capacity) in the Virtual Training group, and with proprioceptive deficits (Move and Match, ThumbSense) in the Proprioceptive Training group. Across all training groups, we further investigated how our primary metrics of sensorimotor hand function (Crisscross, finger capacity, BBT), which have previously been identified as predictive of training success [28, 40], related to game success in assisted and unassisted modes via correlation analyses.
The feasibility of each new training mode was compared to that of the Standard Training group. Specifically, we performed comparison testing between each new training group (Virtual, Proprioceptive) and the Standard Training group to identify any differences in the game success achieved, the relative assistance needed in each mode, and motivation. We further performed an exploratory correlation analysis to identify how gameplay success and sensorimotor hand function related to motivation for training within and across all game modes.