To understand the context of use and to establish the requirements following a human-centered approach, we first spent several days in the Department of Neurology, University Hospital Bern, Switzerland, shadowing clinicians and therapists. We closely followed them during their daily work, taking field notes and asking questions, allowing us to obtain an in-depth understanding of their working environment, procedures, involved people, and interactions [88]. We also performed literature research on previous device developments and studies reporting device requirements, e.g., [89,90,91,92].
To narrow down the list of requirements, we carried out a 35-question online survey with 33 neurorehabilitation experts (four physicians, one nurse, two speech therapists, one neuropsychologist, six occupational therapists, and 19 physiotherapists, with two thirds of the participants having five or more years of experience) from the University Hospital Bern and Reha Rheinfelden, Switzerland. The detailed results of this survey were reported in [84].
The resulting initial principal system requirements were:
Collective movement of the index to little finger with a large functional range of motion (RoM) that allows grasp training as well as the training of active finger extension (i.e., full extension required).
Independent thumb movement, including thumb opposition allowing different grasps such as medium wrap (i.e., cylindrical grasp with contribution of palm), precision disk (i.e., fingertip prehension) and lateral pinch [93, 94].
Whole-arm movements with a sufficient RoM to perform reach and grasp movements.
Fast setup (below 5 min), even for patients who cannot fully extend their fingers due to spasticity or hypertonicity.
Suitable for a wide variety of hand sizes while avoiding many or complicated adjustments.
Rehabilitation games to train reaching and grasping movements with high-fidelity haptic rendering to provide meaningful sensory information.
The games’ difficulty must be adjustable to individual patients’ abilities.
Patient’s movements can be assisted by manually adjustable robotic forces/torques.
Graphical therapist interface to control the robot and games, e.g., start, stop, and perform adjustments.
Once we started the prototyping phase, we continued seeking feedback on intermediate prototypes from therapists at the University Hospital Bern, Switzerland, in co-creation sessions. In the early stages of the development, we facilitated the conversations by using various 3D-printed prototypes as well as an Omega.3 device (Force Dimension, Switzerland) for the exercises and haptic rendering development.
Haptic device development
Driven by the established requirements, we designed a novel six-DoF rehabilitation haptic device that consists of a custom-made hand module with three DoF, which allows grasping training, mounted on a delta robotic base (Lambda.3, Force Dimension, Switzerland) with three translational DoF to train reaching movements (Fig. 1). In the following, we describe the development of our modular device, which allows: (i) collective finger flexion/extension (Fig. 2), (ii) thumb movements (Fig. 3), and (iii) hand translations.
Fig. 1The Lambda.3+ haptic device consists of a delta robot (Lambda.3) and a custom hand module with hand-size-specific exchangeable handles. The user is ready to grasp a virtual object with extended fingers and abducted thumb
Fig. 2Collective finger flexion and extension movements
Fig. 3Thumb circumduction movements, enabled by the remote center of motion mechanism
Hand module: collective finger flexion/extension
The design and performance evaluation of the Palmar RehabilitatIon DEvice (PRIDE) for collective finger (i.e., index finger to little finger) flexion/extension movements is described in detail in [55]. We provide here a summary for completeness.
To meet our objective of a short setup, we aimed to minimize the number of adjustable parts. This was also highly encouraged by the therapists in the co-creation sessions. It also became apparent that the hand module should be compact and cylindrically shaped during setup to enable patients with limited finger mobility to slide their hands onto it with flexed fingers. Therefore, a RoM ranging from fully extended to flexed fingers was required for the setup. In consultation with the therapists, we found a finger flexion of 165\(^\circ\) (180\(^\circ\) for an earlier version of PRIDE), measured as the orientation of the distal segment of the finger with respect to the metacarpal bones, to be adequate. To achieve this, we first categorized the hand sizes given in anthropometric databases [95,96,97,98] into four distinct sizes, which we called S, SM, ML, and L (from small to large). S corresponds to the 5th percentile of women's and L to the 95th percentile of men's hand sizes. The intermediate sizes were equally spaced between the smallest and largest sizes. We then engineered a mechanism that uses a combination of linkages, pulleys, and cables, as well as exchangeable handles (i.e., the part that supports the palm of the hand), to position differently sized hands. The mechanical design parameters were synthesized using an optimization approach.
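The size categorization above can be sketched as follows; the percentile hand lengths used here are placeholder values for illustration, not the actual anthropometric data from [95,96,97,98]:

```python
def hand_size_categories(p5_women_mm: float, p95_men_mm: float) -> dict:
    """Return four equally spaced hand-length categories S, SM, ML, L.

    S is the 5th percentile of women's hand sizes, L the 95th percentile
    of men's; SM and ML are equally spaced in between (as described above).
    """
    step = (p95_men_mm - p5_women_mm) / 3.0
    return {
        "S": p5_women_mm,
        "SM": p5_women_mm + step,
        "ML": p5_women_mm + 2 * step,
        "L": p95_men_mm,
    }

# Placeholder hand lengths in mm (illustrative, not the authors' data)
sizes = hand_size_categories(165.0, 211.0)
```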
The resulting design with minimal mechanical backlash and high transparency—characteristics that are advantageous for haptic rendering—is depicted in the CAD renderings in Fig. 4. The four size-specific handles can easily be unlocked and locked for swapping with a quick-release lever on the back of the hand module (see Fig. 4). The hand is attached to the handle using two straps: one at the wrist and one over the metacarpal bones. The fingertips are kept in contact with the fingertip support through a custom-made padded fixation on the dorsal side that comfortably presses the fingers against it (Figs. 4 and 5). The fingertip fixation features a ratchet mechanism and can also be released by a lever. Note that the fingertip support, as well as the handle, is tilted in the coronal plane by 25\(^\circ\), as this results in a natural cylindrical grasp [99]. The hand module is mostly 3D printed from polylactic acid (PLA) and reinforced with metal parts as needed. The mechanism is actuated through a capstan drive by an electric motor with an integrated encoder (RE30 with MR Encoder, 1000 PPR, Maxon Motor AG, Switzerland).
Fig. 4CAD model of the device. Top left: The two main parts are the robotic delta base for translational movements and a custom-made hand module for grasping. Top right: Hand module with exchangeable handle that enables collective finger flexion/extension, thumb circumduction, and thumb flexion/extension. Bottom left: Finger flexion/extension mechanism and inclined handle. Bottom center: Thumb circumduction mechanism. Bottom right: Thumb flexion/extension mechanism
Hand module: thumb circumduction and flexion/extension
In the co-creation sessions with the therapists, we found that the majority of thumb movements required for the most relevant grasps seem to be achievable by a combination of thumb circumduction and flexion/extension. Despite the complexity of the thumb kinematics [100,101,102], the thumb tip can be considered to move on the surface of a cone that originates in the carpometacarpal (CMC) joint during circumduction [103]. We also assumed that only a relatively small flexion/extension RoM is required for the majority of grasps and thus further simplified the thumb movements by approximating the thumb tip path during flexion/extension as a circular arc segment. The mechanism was initially designed for a RoM of 0°–50°, which we later limited to 30°–50° in software.
Based on this knowledge and these assumptions, we developed a two-DoF mechanism to extend the aforementioned PRIDE hand module. We presented the development in detail in [85]; here, we again only provide a summary for completeness. The movements of the thumb tip support are achieved through a serial kinematic chain. The first revolute joint is realized as a parallelogram-based remote center of motion (RCM) mechanism to avoid collisions with the user's hand (Fig. 4, bottom center). The axis of rotation of this RCM lies in the coronal plane and goes through the CMC joint with a negative inclination of 20° with respect to the longitudinal axis of the hand. The second revolute joint is located in proximity to the thumb's metacarpophalangeal (MCP) joint and enables thumb flexion/extension. To eliminate the need for further adjustments, the exchangeable handles of PRIDE were redesigned such that the vertical (i.e., in the radial direction) position of the thumb tip aligns across all hand sizes. To do so, we assumed that the hand breadth scales proportionally to the hand length and added a corresponding offset to the vertical hand position. The design parameters of the thumb mechanism were again synthesized using an optimization approach.
The final thumb mechanism is depicted in Fig. 3, showcasing the circumduction RoM. The mechanism also mostly consists of PLA parts with a few metal parts (e.g., stainless steel axles). The RCM circumduction is actuated through a capstan drive by an electric motor with an integrated encoder (DCX22S with Enc EASY 16, 1000 PPR, Maxon Motor AG, Switzerland). To avoid a high continuous motor load, the RCM mechanism is fitted with a spring that compensates the weight of the moving parts. A geared RC servo (D625MW, Hitec RCD, USA) with an encoder (AMT10E, 5120 PPR, CUI Devices, USA) was used for flexion/extension. The thumb tip is fixated to the thumb tip support with a hook and loop strap (Fig. 5).
Fig. 5Setup sequence: (1) The handle must be unlocked by pulling the lever on the back (here left side) of the hand module. (2) The handle is being removed by vertically sliding it off the hand module. (3) Once the new handle is installed, the patient’s hand can be slid onto the device without the need for finger extension. (4) The wrist, thumb and metacarpals straps are tightened with hook and loop attachments. (5) The fingertip support is tightened by sliding it towards the fingers and held in place through a ratchet mechanism
Delta robot (Lambda.3): hand translation
To enable reaching movements, we installed the hand module on a vertically oriented Lambda.3 device (Force Dimension, Switzerland) with three translational DoF (Fig. 1). The Lambda.3 is a haptic device with parallel delta kinematics and highly backdrivable capstan drive actuation for all three axes. The entire assembly was mounted on a stand with lockable wheels for easy transportation.
The hand module is mounted at an angle of 15° in the sagittal hand plane to account for the slight wrist extension that naturally occurs during grasping. The mass of the hand module (1700 g) is spring-compensated in the vertical direction by the Lambda.3 device to reduce continuous motor load during operation. We manufactured the hand module in a right-sided and a left-sided version, whereas the same Lambda.3 robot was used for all participants.
The resulting rehabilitation robot (see Fig. 1) has a total of six DoF and falls into the group of end-effector-style devices [16]. The workspace and force/torque specifications of the complete rehabilitation robot are summarized in Table 1.
Table 1 Device specifications
Haptic rendering and control
Control hardware
While the Lambda.3 robotic base comes with integrated control electronics, we used an additional control box that was custom-made by Force Dimension to drive the three motors of the hand module. Both controllers were connected via USB to a host PC, and both were equipped with an independent hardware watchdog timer. Triggering a watchdog timer immediately shuts off all motors and short-circuits the motor drivers, resulting in electromagnetic viscosity that prevents the device from collapsing abruptly. The control box of the hand module was also equipped with an emergency stop button (see Fig. 1, button with red marking on top of the control box) that, when pressed by the responsible therapist, has the same effect as the watchdog.
Physics engine-based whole-hand haptic rendering
We developed a haptic rendering method based on the open-source physics engine Bullet to render whole-hand interactions between the patient's hand (virtually represented by a hand avatar; Fig. 6) and virtual objects. Here, we provide a simplified overview and refer the reader to [86] for more details. Our approach essentially consists of a bilateral proportional-derivative (PD) control law [104] that couples a main device (i.e., our haptic device, denoted in the following with index \(m\)) and a simulated hand avatar (denoted with index \(s\)).
The generalized device coordinates \(\varvec{q}_m = [x_m, y_m, z_m, \theta_{f,m}, \theta_{tc,m}, \theta_{tf,m}]^{\textrm{T}}\) and the forces/torques \(\varvec{f}_m = [f_{x,m}, f_{y,m}, f_{z,m}, f_{f,m}, \tau_{tc,m}, f_{tf,m}]^{\textrm{T}}\) (where the subscripts f, tc, and tf denote finger flexion, thumb circumduction, and thumb flexion, respectively) are indicated in Fig. 4 and Table 1. The simulated hand avatar was modeled as a multi-body object using capsule colliders (Fig. 6), whereas its DoFs correspond to the DoFs of our device. Its generalized coordinates \(\varvec{q}_s\) and its forces \(\varvec{f}_s\) are, thus, defined analogously.
The virtual forces applied in the physics engine to the simulated hand avatar (\(\varvec{f}_s\)) are given in Eq. (1), and the haptic rendering forces sent to the main haptic device (\(\varvec{f}_m\)) are given in Eq. (2). The diagonal matrices \(\textbf{K}_{p,s}\), \(\textbf{K}_{p,m}\) and \(\textbf{K}_{d,s}\), \(\textbf{K}_{d,m}\) represent proportional and damping gains, respectively. The matrix \(\tilde{\textbf{M}}\) denotes the mass matrix of the simulated hand avatar (diagonal elements only, see [86]), and \(\varvec{h}\) is the lumped term for velocity- and gravity-dependent forces. The diagonal matrix \(\varvec{\Lambda }\) in Eq. (2) is an adaptive damping gain that stabilizes the patient's hand, fingers, or thumb upon impact with a virtual object while only slightly degrading the sensations of hand-object impacts or hampering movements in free space. The term \(\varvec{f}_e\) in Eq. (2) is a vector of additional exercise-specific forces (see Section "Virtual training tasks") and \(\varvec{f}_a\) represents adjustable assistive forces (see Section "Therapist interface and device operation").
$$\begin{aligned} \varvec{f}_s = \tilde{\textbf{M}} \left( \textbf{K}_{p,s}(\varvec{q}_m - \varvec{q}_s) + \textbf{K}_{d,s}(\dot{\varvec{q}}_m - \dot{\varvec{q}}_s) \right) + \varvec{h} \end{aligned}$$
(1)
$$\begin{aligned} \varvec{f}_m = \textbf{K}_{p,m}(\varvec{q}_s - \varvec{q}_m) - \varvec{\Lambda } \textbf{K}_{d,m}\dot{\varvec{q}}_m + \varvec{f}_e + \varvec{f}_a \end{aligned}$$
(2)
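As an illustration, the bilateral PD coupling of Eqs. (1) and (2) can be sketched as below. The gain values and the diagonal avatar mass matrix are placeholders, not the tuned parameters of the actual controller, and the adaptive damping matrix is passed in explicitly:

```python
import numpy as np

# Illustrative gains (placeholders, not the tuned values of the device)
K_ps, K_ds = np.diag([100.0] * 6), np.diag([5.0] * 6)  # avatar-side PD gains
K_pm, K_dm = np.diag([200.0] * 6), np.diag([2.0] * 6)  # device-side PD gains
M_s = np.diag([0.5] * 6)                               # diagonal avatar mass matrix

def avatar_force(q_m, q_s, qd_m, qd_s, h):
    """Eq. (1): virtual force applied to the simulated hand avatar."""
    return M_s @ (K_ps @ (q_m - q_s) + K_ds @ (qd_m - qd_s)) + h

def device_force(q_m, q_s, qd_m, Lam, f_e, f_a):
    """Eq. (2): haptic rendering force sent to the haptic device."""
    return K_pm @ (q_s - q_m) - Lam @ K_dm @ qd_m + f_e + f_a
```

When device and avatar states coincide and no exercise or assistive forces act, both forces vanish, which is what makes movements in free space feel transparent.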
Software implementation
In Fig. 6, the different software modules, including the therapeutic games described in Section "Virtual training tasks" and the therapist interface in Section "Therapist interface and device operation", are depicted. The control software—responsible for low-level control, haptic rendering, and safety checks—was written in C++ and communicates through a hardware abstraction layer (HAL) with the hardware described above. In addition to the hardware safety features, the control software also continuously monitors the device's velocity and the motor encoders, and disables all forces in case of an operational anomaly. The mechanical gravity compensation of the Lambda.3 and hand module is complemented by fine-tuned gravity and friction compensation in software. A scaling factor of six between the translational movements of the haptic device and the hand avatar movements was introduced to adjust the virtual translational workspace in the games to the available workspace of the device. The finger and thumb movements were mapped 1:1 between the patients' hands and the simulated hand avatar to provide congruent proprioceptive-visual information during grasping.
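This device-to-avatar mapping can be written as a one-liner; the function name is hypothetical, only the factor of six and the 1:1 angle mapping come from the text:

```python
TRANSLATION_SCALE = 6.0  # device-to-avatar workspace scaling (from the text)

def map_device_to_avatar(q_m):
    """Scale the three translational DoFs by 6; map finger/thumb angles 1:1."""
    return [TRANSLATION_SCALE * v for v in q_m[:3]] + list(q_m[3:])
```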
The control software update loop runs at a rate of 1 kHz. All software modules run simultaneously on the host computer (AMD Ryzen 7 PRO 5850U, 32 GB, with Ubuntu 22.04 and a low-latency kernel). Except for the Bullet physics engine, where shared memory is used, the communication between the different software modules is performed through socket networking using the User Datagram Protocol (UDP).
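The UDP exchange between modules can be sketched as follows; the packet layout (six little-endian doubles) and the socket handling are illustrative assumptions, not the system's actual protocol:

```python
import socket
import struct

def open_state_socket(port: int = 0):
    """Bind a UDP socket for state packets (port 0 picks an ephemeral port)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.bind(("127.0.0.1", port))
    s.settimeout(0.1)  # bounded wait so a lost packet cannot stall a loop
    return s

def pack_state(q):
    """Serialize six generalized coordinates as little-endian doubles."""
    return struct.pack("<6d", *q)

def unpack_state(payload):
    """Deserialize a state packet back into a list of six floats."""
    return list(struct.unpack("<6d", payload))
```

UDP fits this setting because a stale state packet is worthless in a 1 kHz loop; dropping it and using the next one is preferable to TCP-style retransmission delays.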
Fig. 6Overview of the software architecture. Except for the Bullet physics engine server which communicates through shared memory with the control software, the different software modules communicate using UDP. The main control software computes the haptic forces at a rate of 1 kHz and interacts with the robotic devices through a hardware abstraction layer (HAL) using a USB connection
Virtual training tasks
Overview
We developed two gamified training tasks to complement our device, targeting the reaching, grasping, and releasing of objects. We aimed to design tasks that benefit from meaningful haptic feedback, i.e., easily interpretable sensory information that contributes to controlling the movements and facilitates task completion, for simultaneous motor and sensory training. While the first game (cocktail bar game) was inspired by therapy tasks and endorsed by therapists, we organized brainstorming sessions with two therapists for the development of the second game (slingshot game) [105]. Both games were refined in an iterative process through informal feedback from repeated intermediate testing by therapists. The graphical elements and game logic were developed in Unity3D (Unity Technologies, USA) in C#, while haptics-related elements were implemented as part of the control software (see Section "Haptic rendering and control").
Although our robotic device is capable of rendering forces in all DoFs, we decided to block the vertical axis (z-axis) with a PD controller for three reasons. First, during informal testing with therapists in the co-creation sessions, we found that, likely due to the 2D screen, performing straight and targeted 3D grasping movements seemed challenging—a finding also supported by literature [106]. In particular, vertical and horizontal (i.e., forward/backward) movements could be indistinguishable depending on the perspective on the 2D screen. Thus, we agreed with the therapists that the games were challenging enough with a blocked z-axis, and that we did not want to add another dimension of difficulty and source of ambiguity. Second, this allowed us to adjust the height of the device during training. Finally, we prevented any adjustment or calibration that might have been required with dynamic arm-weight support [107].
Cocktail bar game
The first game was designed to promote proximal and distal movements with a special focus on fine distal movements and force dosage. In the cocktail bar game (Fig. 7), the patients are presented with four liquid dispensers, each containing a differently colored liquid with a distinct viscosity. The user's hand is represented by a hand avatar, which reflects the user's real hand movements. Below each dispenser, a glass appears that must be filled with the liquid. For this, the user has to move the hand avatar in front of the dispenser, open the hand, bring the thumb into opposition, approach the dispenser, close the hand until the fingers “touch” it, and then squeeze to pour liquid. If the hand is not properly opened before approaching the dispenser, the dorsal side of the fingers collides with it, and the user feels the fingers being pushed into a flexed position, analogous to a real-life scenario. Thus, opening the hand before grasping is necessary, meeting the therapists' request for the integration of active finger extension in the training of functional movements.
Importantly, the liquid and dispenser physical characteristics (i.e., viscosity and stiffness) are reflected in the haptic rendering of the interaction between the patient's hand and the dispenser. Whenever the dispenser is squeezed too hard, the liquid spills and sputters. Whenever liquid is spilled beside any of the glasses, the user is penalized, i.e., the life bar located on the top left of the screen decreases (Fig. 7). Hence, the challenge is to skillfully squeeze the liquid dispenser based on the visuo-haptic feedback. As soon as a glass is full, a green check mark appears, the glass disappears, and the counter of successful glasses on the right of the screen increments. In a first step, each of the four glasses needs to be filled once. Afterward, the glasses appear randomly upon completion of the preceding one. The difficulty can be varied by scaling the required forces to pour liquid from the dispensers (i.e., facilitating movements for patients with low residual grasping force) and by adjusting the sensitivity of the dispensers (i.e., how quickly they start to sputter and spill liquid).
For the haptic rendering of the hand-dispenser interaction, a virtual wall (i.e., a virtual spring-damper system with stiffness and damping gains \(k_{w}\) and \(b_{w}\), respectively) acting on the finger flexion/extension movement is superimposed on the whole-hand interaction forces from the simulated collisions in the physics engine (i.e., it contributes to \(\varvec{f}_e\) in Eq. 2, Section "Haptic rendering and control"). The gains \(k_{w}\) and \(b_{w}\) can be modified based on the desired impedance characteristics of the corresponding liquid dispenser. The virtual wall is activated during the grasping of a dispenser shortly before the fingertips hit the cylindrical collider representing the liquid dispenser (at 90\(^\circ\) finger flexion). Since the fingertips are occluded—i.e., behind the dispenser—in this position, the virtual wall is perceived as the interaction between the patient's hand and the dispenser. Although this approach does not perfectly model a deformable liquid dispenser, it works very well in practice, requires only minimal additional computational cost, and simplifies the development compared to actually utilizing deformable meshes or modifying the properties of the liquid dispenser colliders in an online manner. The exercise-specific finger flexion/extension force \(f_{e,f}\) is calculated as a virtual wall using Eq. (3), with \(\Delta \theta_{f}\) denoting the penetration into the virtual wall. The other components of the vector \(\varvec{f}_e\) remain zero.
$$\begin{aligned} f_{e,f} = {\left\{ \begin{array}{ll} k_{w} \Delta \theta_{f} + b_{w} \dot{\theta }_{f}, &{}\quad \Delta \theta_{f} > 0\\ 0, &{}\quad \text{otherwise} \end{array}\right. } \end{aligned}$$
(3)
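A minimal sketch of such a virtual wall is shown below; the gain values are illustrative placeholders, while the 90° activation angle follows the text:

```python
import math

THETA_WALL = math.radians(90.0)  # finger flexion at which the wall activates

def wall_force(theta_f, theta_f_dot, k_w=2.0, b_w=0.05):
    """Spring-damper force of the virtual wall; zero outside the wall."""
    delta = theta_f - THETA_WALL  # penetration depth into the wall
    if delta > 0:
        return k_w * delta + b_w * theta_f_dot
    return 0.0
```

The one-sided condition is what makes the wall feel like a compliant surface: the force ramps up with penetration depth but never pulls the fingers inward once they retract.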
Slingshot game
In this game, the therapeutic objective is to train fine proximal movements while still requiring distal movements during grasping. The goal is to hit ghosts that appear on the screen using a simulated slingshot (Fig. 8). For this, the user first reaches out to grasp a projectile (i.e., the red ball) at the screen's center. Grasping requires opening the hand, opposing the thumb, and closing the fingers around the projectile. Once the user thinks that the projectile is correctly grasped, the slingshot can be tensioned by pulling back and released by opening the hand. To aim, the arm can be moved left and right, with the projectile's expected trajectory shown on the screen.
The game's difficulty increases as the patient plays. In the first phase, three stationary ghosts are presented. Once the three initial ghosts are shot, three new ghosts appear that move continuously from left to right and back. Finally, ghosts randomly appear and move towards the user. In this phase, whenever the user misses a ghost and the ghost passes the slingshot, the life bar, depicted on the top left of the screen (Fig. 8), decreases. The difficulty of the game can be further adjusted by scaling the user's input forces, i.e., the necessary forces for grasping and pulling back the slingshot can be reduced to facilitate the task.
To simulate a slingshot, a virtual spring-damper system is temporarily attached to the projectile with coordinates \(x_p\) and \(y_p\) (Fig. 8). The dynamics of the projectile are governed by the force \(\varvec{f}_p\) from Eq. (4) and the hand-projectile interactions. Thereby, the diagonal matrices \(\textbf{K}_{sl}\) and \(\textbf{B}_{sl}\) denote the stiffness and damping. To facilitate grasping, the stiffness of the slingshot in the frontal direction (away from the user) is increased by the scalar factor \(\gamma \in [1, \hat{\gamma }]\), which is computed according to Eq. (5). The peak value was chosen as \(\hat{\gamma } = 3\) in our experiments. This allows the user to push the hand against the slingshot to get a good grasp before pulling back. By either opening the hand or reducing the grasping pressure, the slingshot forces on the projectile will overcome the simulated friction between the hand avatar and the projectile and catapult it in the direction where the user aimed. The virtual spring-damper system is detached from the projectile when a release is detected, i.e., when the projectile surpasses a velocity threshold.
$$\begin{aligned} \varvec{f}_p = \gamma \left( \textbf{K}_{sl} \begin{bmatrix} x_p\\ y_p \end{bmatrix} + \textbf{B}_{sl} \begin{bmatrix} \dot{x}_p\\ \dot{y}_p \end{bmatrix} \right) \end{aligned}$$
(4)
$$\begin{aligned} \gamma = {\left\{ \begin{array}{ll} \hat{\gamma }\left( 1 - \frac{2(\hat{\gamma } - 1)}{\pi \hat{\gamma }} \big | \text{atan2}(y_p, -x_p)\big | \right), &{}\quad x_p < 0\\ 1, &{}\quad \text{otherwise} \end{array}\right. } \end{aligned}$$
(5)
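The stiffness scaling and slingshot force can be sketched as below. The gain values are placeholders, and the interpolation inside gamma() is one plausible form consistent with the stated range \(\gamma \in [1, \hat{\gamma }]\): maximal scaling straight ahead, decaying continuously to 1 at the sides:

```python
import math

GAMMA_HAT = 3.0  # peak stiffness scaling, as chosen in the experiments

def gamma(x_p, y_p, gamma_hat=GAMMA_HAT):
    """Stiffness scaling: gamma_hat straight ahead, decaying to 1 sideways.

    Assumed interpolation, chosen so that gamma stays within [1, gamma_hat]
    and is continuous at the x_p = 0 boundary.
    """
    if x_p < 0:  # projectile in front of the slingshot (away from the user)
        ang = abs(math.atan2(y_p, -x_p))  # angular deviation from straight ahead
        return gamma_hat * (1.0 - 2.0 * (gamma_hat - 1.0) / (math.pi * gamma_hat) * ang)
    return 1.0

def slingshot_force(x_p, y_p, xd_p, yd_p, k=50.0, b=1.0):
    """Eq. (4)-style spring-damper force on the projectile (scalar gains assumed)."""
    g = gamma(x_p, y_p)
    return (g * (k * x_p + b * xd_p), g * (k * y_p + b * yd_p))
```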
Fig. 7Cocktail bar game: The patient has to grasp liquid dispensers with different haptic characteristics and skillfully squeeze them to pour liquid into the glasses
Fig. 8Slingshot game: The projectile (close-up in the bottom left corner) must be grasped and pulled back and released to shoot with the slingshot. The goal is to hit the ghosts
Therapist interface and device operation
The graphical user interface (GUI) is the main interaction channel between the therapist and the system. In the design, we aimed for a logical flow, where therapists start at the top left and successively work their way down to the bottom, and where elements with related functionalities are grouped together. Our goal was to provide the minimum set of functionalities necessary to operate the device without increasing the overall system complexity. The resulting GUI (see Fig. 9) was developed in Python using Qt (The Qt Company, Finland) in German, the local language at the study hospital.
The GUI is divided into four distinct sections: (i) In the Robot section, the robotic device can be started, stopped and prepared for swapping handles or setting up the patient. The current status of the robot (e.g., disconnected, ready, busy, error messages, etc.) is displayed. (ii) The Game section lets the therapist select, start, and quit a game. (iii) In the Settings section at the bottom left of the GUI, adjustments to the height of the robot, the assisting forces, and game difficulty can be set in three separate tabs. (iv) In the Plotting area on the right side, a real-time plot of the estimated forces (using the motor current) that the patient applies is shown. The plotting was not an explicit requirement from therapists, but since they were curious about the magnitude of the applied forces during the co-creation sessions, we decided to add it. The desired DoFs for which the forces should be plotted can be selected using tick boxes.
The sequence to initiate a training game is as follows. After starting the device with the start button in the Robot section, the robot moves close to the patient into a predefined position. The therapist can then install the correctly sized handle for the specific patient, prior to the patient setup, by pressing a button in the Robot section. Pressing this button moves the hand module into an adequate configuration that makes the handle easily accessible (i.e., thumb support at full circumduction and extension, finger support at moderate flexion). During the handle swapping, the GUI provides visual instructions to the therapist in a pop-up window. Once the correct handle size is mounted, the therapist presses the last button in the Robot section, and the hand module automatically moves to a fully closed configuration for easy installation of the patient's hand in the device. The therapist is then assisted in setting up the patient by instructions displayed in a pop-up window. The complete setup sequence of changing the handle and installing a patient is depicted in Fig. 5. Once the patient is installed, the therapist can select the game in a drop-down menu and start the therapy.
To adjust the training to the individual patient's needs, the Settings section offers several adjustments. This section consists of the tabs Arm, Hand, and Training. In the first two tabs, the height of the robot can be adjusted, and for each of the remaining five DoFs, an assistive force (see Section "Haptic rendering and control") can be adjusted in either direction. The height adjustment is performed with a minimum-jerk trajectory to be perceived as smooth by the patient [108]. The assistive force magnitude can be changed using plus/minus buttons and is displayed on a linear gauge. The Training tab lets therapists adjust the game difficulty as described in Section "Virtual training tasks". Importantly, the use of the settings is not required, and therapists are free to use them according to their preferences and patients' needs. All settings can be adjusted online while the patient is playing the game.
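The smooth height adjustment can be sketched with the classic minimum-jerk polynomial [108]; the function below is an illustrative implementation, not the system's actual code:

```python
def min_jerk(z0, z1, t, T):
    """Minimum-jerk position at time t for a move from z0 to z1 over duration T.

    Uses the standard 10s^3 - 15s^4 + 6s^5 profile, which has zero velocity
    and acceleration at both endpoints.
    """
    s = min(max(t / T, 0.0), 1.0)  # normalized, clamped time
    return z0 + (z1 - z0) * (10 * s**3 - 15 * s**4 + 6 * s**5)
```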
Fig. 9Graphical User Interface (GUI) for the therapists. In the section Robot, the robot can be started, prepared, and stopped. In Games, a game can be selected and started or stopped. The section Settings allows to make game-specific difficulty adjustments as well as adjustments of the device height and assistive forces. In the Plotting area, the estimated patient-device interaction forces are plotted
Usability study
Participants
We performed our usability study with seven therapists and seven stroke patients. The study wa