Scan-Driven Fully-Automated Pipeline for a Personalized, 3D Printed Low-Cost Prosthetic Hand

Fitting a prosthetic hand that is comfortable, functional, easy to use, has an acceptable appearance and overall improves the amputee's quality of life is a complex, tedious and costly process. The very high price tag, driven largely by the time a trained specialist spends manually fitting the device, makes these devices inaccessible to large portions of the population. We present a concept and preliminary results for a fully automated fitting and manufacturing pipeline for a personalized low-cost prosthetic hand. The hand is personalized in almost every aspect, from appearance to user interface, control and feedback. The pipeline only requires a 3D printer, RealSense cameras, a few basic mechanical components, and basic tools for the model assembly. The scan-driven user data and the user preferences initiate a fully automated pipeline which culminates in a customized, easy-to-assemble PCB design and ready-to-print STL files, including the optimized orientation, support and layout, such that the final parts are only one click away. We believe that the proposed pipeline and design can greatly improve the accessibility of prosthetic hands and could potentially be expanded to other medical applications.


I. INTRODUCTION
The loss of one's hand can lead to a drastic reduction in the quality of life by decreasing the level of independence and the capability of performing activities of daily living (ADLs) [1]. A prosthetic hand can cost from $3,000 for a body-powered device, to $100,000 for a neuro-prosthetic arm such as the i-Limb and the DEKA arm [2]. The very high price tag makes these devices inaccessible to large portions of the population. In many cases, the price is heavily influenced by the time spent on manually fitting the device by a trained specialist. Even when the financial barriers are surpassed, rejection rates of prosthetic devices are considerably high and are usually related to the following causes: lack of social acceptance, weight, sensitivity of the electrical system, lack of a stable grip and adaptive grasping force, lack of sensory feedback, and age of first fitting, which is greatly affected by financial resources due to the child's constant growth [1], [3]. Not using a prosthetic device could lead to degeneration of joints and muscles, inflammations and other complications [4]. According to Katsavelis et al. [5], there is a crucial need to develop a functional, easy-to-fit-and-maintain, customizable, yet low-cost prosthesis.

The first and second authors contributed equally to this work.
In recent years, alongside the growing availability of 3D printers and improvements in Computer Aided Design (CAD) programs, the use of 3D printing in medical applications is expected to increase substantially in the foreseeable future [5], [6]. Using 3D printers to produce prosthetic devices has many advantages. First, a design can be constructed out of fewer parts, or even a single part, reducing costly assembly procedures. Second, 3D printers can create highly complex geometries, thus providing large design freedom. That said, it is important to note the limitations of 3D printing: it is difficult to predict the mechanical properties of the printed product, and only a limited range of materials is suitable for printing compared to traditional manufacturing methods.

Yair Herbst 1, Student Member, IEEE, Shunit Polinsky 1, Anath Fischer 1, Yoav Medan 2, Ronit Schneor 1, Joshua Kahn 1, and Alon Wolf 1, Senior Member, IEEE

Figure 1. Our proposed low-cost and user-specific hand-fitting pipeline. The pipeline starts with user data which initiates a fully-automated process that ends with a personalized prosthetic hand fitted to the user.
One of the best-known research and development groups for 3D printed prosthetic hands is the e-NABLE community [7]. The designs created by the community are open-source and the fabrication is done locally using consumer 3D printers. By utilizing 3D printers, the designs offered by the community help overcome some of the abovementioned rejection reasons, e.g., customizing appearance to increase social acceptance and reducing weight by optimizing the 3D structure. While in theory designs can be personalized and customized [8], any substantial change requires significant effort from a professional. A better fitting process is required to increase the impact of these designs. Moreover, in the case of prosthetic hands, the design of the device includes not only the mechanical and functional design but also the user control methods for producing an intuitive and user-friendly device. Most of the advanced prosthetic hands available rely on electromyogram (EMG) signals from the user's stump [9]. When the number of controlled Degrees of Freedom (DOF) is high, additional methods have been explored, such as foot control [10]. As of today, digital fitting is used in several medical domains, such as the design and fabrication of maxillofacial implants [11] and custom-made hip prostheses using CT scans [12]. Similarly, for lower limb amputation, software for the design and testing of a personalized socket and prosthesis in a fully virtual environment was developed by Colombo et al. [13]. These examples are mainly focused on the 3D structure of the printed part rather than the customization of the entire system as required in an upper limb prosthesis.
In this paper, we present a novel, digital design process in the spirit of Industry 4.0 to create a personalized and low-cost prosthetic hand. Our proposed automated fitting pipeline is entirely digital to minimize the design time, the high cost and the dependency on trained professionals throughout the pipeline, while potentially achieving a low-cost, tailor-made design that can be accessible from anywhere on the globe. The pipeline is divided into three main steps. The first is obtaining user data using a combination of regular and depth cameras, together with user preferences. Next, a CAD model is generated automatically according to the user's scan-driven data. The model is based on a functional skeleton and a skin that customizes appearance. This approach allows us to increase social acceptance while preserving a reasonable level of hand functionality. In addition, an interface is chosen according to the user's preferences. The available options range from simple body-powered hands to more advanced EMG and other interfaces, sensory feedback and more. This step ends with files entirely ready for 3D printing, a circuit board suited to the chosen user interface and a bill of materials based on standard off-the-shelf parts. The final step is assembly. This step is the only manual one in the pipeline but is shortened significantly by optimizing the previous steps. The entire pipeline is illustrated in Fig. 1. The proposed pipeline and final outcome will potentially help decrease cost, fitting time, rejection rates and more. The entire research and final design are uploaded and shared online for anyone in the world to use and modify.

II. SYSTEM ARCHITECTURE
The pipeline described in Fig. 1 was implemented as a system composed of the user data capture components, the parametric prosthetic hand design and a customized user interface. The different components are described in the following sections. To achieve the main purpose of a fully automated pipeline, the hand's mechanical design is automatically modified based on the scanner outputs. Since we aim to create a low-cost design pipeline, the design contains only off-the-shelf components, and the mechanical design was developed for manufacturing by a standard 3D printer using standard printing materials such as ABS or PLA. The assembly of the hand, including the electronics, is simple to accomplish with basic knowledge and tools. To allow for better personalization, a variety of user interfaces are available for both control and feedback, and a mobile app allows simple integration with 3rd party technology.

A. Scan-Driven User Measurements
One of the key aspects of our system is the personalization of the hand. Two aspects of the hand's geometric structure are customized: the socket and the kinematic model. Nowadays, the fitting of the socket is usually done manually by professionals, resulting in long and costly procedures [13]. In our design, we developed a low-cost 3D scanner and implemented the resulting surface in our CAD model. While commercial 3D scanners are increasingly being used in industry alongside the rise of 3D printers, these products are still expensive and unsuitable for our system. We offer a low-cost scanner design based on Intel RealSense (Intel, Santa Clara, CA, USA) depth cameras, shown in Fig. 2A. The scanner is based on three RealSense cameras mounted on a rigid, stable structure that can be rotated manually. By applying a simple sphere-based calibration, the scanner operator rotates the cameras around the stump and captures the 3D data at several different angles, and the developed software automatically applies alignment, registration, segmentation and denoising of the final combined point cloud. Currently, we are developing an additional scanner, shown in Fig. 2B. This scanner is based on a single RealSense camera which rotates around the stump in a known trajectory and captures data at specific locations. Once the point cloud is collected by either scanner, a piece-wise parametric surface is fitted to the data. The current surfaces we use are Bezier surfaces, which are widely used in different applications and are suitable for our purpose for two main reasons: (i) the mathematical calculation is simple, meaning that the model can be generated in real-time; and (ii) the surface lies within the convex hull of its control points, so the fitted surface is well bounded relative to the scan data. Such a surface can be easily integrated into a CAD model to generate the customized socket.
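The Bezier-surface step above can be sketched as follows. This is a minimal illustration only, assuming a single bicubic patch evaluated with Bernstein polynomials; it is not the actual piece-wise fitting code used in the pipeline:

```python
import numpy as np
from math import comb

def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1 - t)**(n - i)

def bezier_surface(ctrl, u, v):
    """Evaluate a Bezier patch at (u, v) in [0,1]^2.
    ctrl: (m+1, n+1, 3) grid of control points."""
    m, n = ctrl.shape[0] - 1, ctrl.shape[1] - 1
    point = np.zeros(3)
    for i in range(m + 1):
        for j in range(n + 1):
            point += bernstein(m, i, u) * bernstein(n, j, v) * ctrl[i, j]
    return point

# Flat 4x4 control grid at z = 0: by the convex-hull property,
# every evaluated point must also lie at z = 0.
ctrl = np.zeros((4, 4, 3))
for i in range(4):
    for j in range(4):
        ctrl[i, j] = (i, j, 0.0)

p = bezier_surface(ctrl, 0.5, 0.5)
```

Because the calculation is a fixed double sum over the control grid, it is cheap enough for the real-time model generation mentioned above.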
The process described above can potentially reduce the need for trained professionals, thus significantly reducing cost. Although the development of a full socket is still in progress, we designed a customized half-open socket based on the measurements extracted from the 3D model of the scanned residual limb, together with a detachable sensor band to integrate user control based on surface electrodes, e.g., EMG control. In addition to the socket, and unlike most existing designs, the terminal part of the prosthesis, the hand itself, is also personalized specifically for the user. This is done using a regular camera mounted under a transparent plate. Printed markers are placed on the perimeter of the plate; the software locates these markers, fixes any distortions, and scales the image to size. The setup is shown in Fig. 2C. Using this system, our kinematic model is fitted to the user's unaffected hand. These data, in turn, are passed to the CAD model to create a perfectly fitted hand design, as discussed in the following section.
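The marker-based scaling step can be illustrated with a minimal sketch. The function name and the marker layout below are hypothetical, and the real software additionally corrects lens and perspective distortion before scaling:

```python
import numpy as np

def mm_per_pixel(marker_px, spacing_mm):
    """Estimate image scale from markers printed at a known, equal spacing.
    marker_px: (N, 2) pixel coordinates of consecutive collinear markers.
    spacing_mm: real-world distance between consecutive markers."""
    marker_px = np.asarray(marker_px, dtype=float)
    gaps = np.linalg.norm(np.diff(marker_px, axis=0), axis=1)  # pixel gaps
    return spacing_mm / gaps.mean()

# Hypothetical example: markers every 20 mm, imaged ~50 px apart.
markers = [(100, 100), (150, 100), (200, 100), (250, 100)]
scale = mm_per_pixel(markers, 20.0)   # mm per pixel
hand_length_mm = 400 * scale          # convert a 400 px measurement to mm
```

With the scale known, any pixel distance measured on the user's unaffected hand can be converted to millimeters and fed into the parametric CAD model.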

B. Prosthetic Hand Design
The hand design presented here is a robotic hand that can perform a variety of user-defined hand configurations and grips by actively controlling all the fingers independently, with different levels of grip force, using micro DC motors. In general, the user's input is converted into the desired action by actuating specific motors and measuring the current drawn by each motor. Through the implemented current control, an adjustable grip is achieved based on the resistance that each motor undergoes. In parallel, a feedback signal that reflects the grip status is transmitted to the user. The desired action is completed through two closed feedback loops, as shown in Fig. 3: the first is the control loop between the controller and the actuators, using the drawn current as the control signal; the second is between the user and the prosthetic hand, using the feedback signal as the control signal.

1) Mechanical Design
Our mechanical design is a bioinspired, under-actuated hand and can be divided into two parts: a skeleton (the kinematic structure) and a skin (the hand appearance). By splitting our design, we can optimize both functionality and appearance. The entire CAD model is controlled via equations linked to the scan-driven user data mentioned in previous sections, through a graphical user interface (GUI) in MATLAB (MathWorks, Natick, MA, USA), as shown in Fig. 4A. The design was created in SolidWorks (Dassault Systèmes, Paris, France).
The skeleton shown in Fig. 4B determines our kinematic model and is based on commonly used hand models, e.g., [14]. The model has 20 DOF: three in the thumb, four in each of the other fingers (flexion-extension and abduction-adduction) and one in the wrist (pronation-supination). However, the hand is actuated using only six motors driving a tendon system, hence the design is highly under-actuated. This means that the finger closing profile is under-defined and is determined by the force equilibrium, thus allowing the hand to conform to various object shapes while maintaining a relatively small number of active DOF [15]. This type of actuation is well suited to the current control method we have chosen, which is described in depth later in the text. To maximize grip success, the joints are pretensioned so that each finger closes gradually: the proximal phalanx closes first, followed by the middle phalanx and lastly the distal phalanx.
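The pretension-driven closing order can be sketched with a toy model: each joint starts to close once the tendon tension exceeds its pretension, and the thresholds are ordered so the proximal phalanx moves first. The threshold values below are illustrative placeholders, not measured pretension levels:

```python
def closing_sequence(tension_profile, pretension=(0.2, 0.5, 0.8)):
    """Order in which the phalanges of one finger close for a rising
    tendon tension. pretension: thresholds for the (proximal, middle,
    distal) joints; illustrative values only."""
    names = ["proximal", "middle", "distal"]
    closed = [False, False, False]
    order = []
    for t in tension_profile:
        for i, thr in enumerate(pretension):
            if not closed[i] and t >= thr:
                closed[i] = True
                order.append(names[i])
    return order

# Ramp the tendon tension from 0 to 1 and observe the closing order.
seq = closing_sequence([k / 10 for k in range(11)])
```

Because the distal joints stay open until contact stalls the proximal ones and tension rises, the finger wraps around objects of different shapes with a single actuator.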
As mentioned, social acceptance is one of the major rejection reasons. While younger users are in many cases interested in a robotic look, others prefer a human-like hand. The separation of the prosthetic into a skeleton and a skin allows us to satisfy both ends of the spectrum and change only the cosmetics of the hand without the need for a kinematic redesign. So far, we have been working with two different skins, a human-like skin and an "Ironman" skin, both shown in Fig. 1. We aim to simplify the process of incorporating a new skin into the design, thereby allowing any user to create or find a skin suitable for him/her. Furthermore, using an RGB-D scanner such as the one offered, we can potentially mirror the scan of the user's unaffected hand and use it to create a truly personalized skin. Once the kinematic model and skin are fitted to the user, the output of our mechanical design is two CAD assembly files. The first is a functional assembly used to display all the parts connected, check for interferences, and do a general evaluation. The second is a print-ready assembly in which all the parts are laid out on a surface according to the printer bed size, in their ideal orientation for printing. The orientations are determined mainly to maximize strength but also to minimize support volume. This assembly saves valuable time when preparing the files for print.

2) Electrical Design
The electronic design was developed as a combination of modules to achieve maximum personalization of the device while using only accessible components and a simple-to-solder board. The main functions of the modules are to (i) independently activate each motor according to the task defined by the user control module, (ii) apply an adaptive grip, and (iii) monitor the user's input status for grip force adjustment and user safety. Once the user's intent is captured by one of the possible user control systems, a hand motion is initiated. From this point, data on the grip status is gathered at the microcontroller and an adaptive grip is applied without the need for additional user input. The current control method which produces the desired adaptive grip is based on the current drawn by the motors when resistance is applied, i.e., when the prosthetic touches the object. The method uses three parameters: the absolute current value, its slope and the action time. Combined, these three parameters reduce the errors caused by the naturally high starting current and the noise caused by mechanical friction and leakage current. The implemented adaptive control mechanism allows automatic grip adaptation to objects of various sizes and shapes, with different torque levels. This method of operation is similar to haptic perception in a healthy hand, where mechanoreceptors measure pressure and skin deformations as a result of grasping an object.
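A minimal sketch of this three-parameter contact detection follows. The thresholds are illustrative placeholders (the real firmware values and filtering are not specified in the text): the action-time window masks the natural starting current, and contact is declared only when the current is both high and rising:

```python
def contact_detected(samples, dt, i_abs=0.6, slope_min=0.05, t_start=0.15):
    """Decide whether a motor has met resistance (object contact) from
    its current trace, using the three cues from the text: absolute
    current level, current slope, and elapsed action time.
    samples: current readings [A]; dt: sample period [s]."""
    for k in range(1, len(samples)):
        t = k * dt
        if t < t_start:            # ignore the natural high starting current
            continue
        slope = (samples[k] - samples[k - 1]) / dt
        if samples[k] >= i_abs and slope >= slope_min:
            return True            # high and rising current => contact
    return False

# Free motion: start spike, then low flat current -> no contact.
free = [1.2, 0.9, 0.4, 0.3, 0.3, 0.3, 0.3, 0.3]
# Grasp: current climbs again after the start transient -> contact.
grasp = [1.2, 0.9, 0.4, 0.35, 0.5, 0.7, 0.9, 1.1]
```

Once contact is detected, the firmware can hold or limit the motor current to set the grip force instead of driving the finger further.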

C. User Interfaces
One of the concerns raised by prospective users, also noted by Cordella et al. [1], is the reliability of the device. Since our hands engage with our environment most of the day, it is important that the electronics of the prosthetic hand and the methods used to control it have high reliability over time. In addition, there is a need to control the prosthetic's motion while drawing as little visual attention as possible. Other suggested features include controlling the grasping force, controlling each finger individually and adding wrist motion before and during grasping an object [1]. Although all these requirements are important, a clear trade-off can be noticed: increasing the number of user-controlled functionalities and DOF may also increase training time and cognitive burden, and reduce the reliability and the classification accuracy of the user's intention. Based on the above factors and more, we have developed several user control and feedback interfaces.

1) User Control
The functionalities that can be controlled by the user are whether to move a specific finger and in which direction (i.e., opening or closing); the force level applied during grasping; and whether to rotate the wrist and to what angle. The core concept is to reduce the number of actions the user needs to perform and remember in order to accomplish the desired hand movement. Hence, the prosthetic hand's functionalities are organized as a tree-like structure. All hand parameters, including movement definitions, control parameters, etc., can be modified through a mobile application we developed.
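The tree-like command structure can be sketched as follows. The tree contents and command names below are hypothetical examples, not the shipped configuration; the point is that each user action descends one level, so a small action vocabulary reaches many functions:

```python
# Hypothetical command tree: nested dicts, leaves are command names.
command_tree = {
    "grip": {
        "power": {"close": "close_all", "open": "open_all"},
        "pinch": {"close": "close_thumb_index", "open": "open_thumb_index"},
    },
    "wrist": {"rotate_cw": "wrist_cw", "rotate_ccw": "wrist_ccw"},
}

def resolve(tree, actions):
    """Walk the tree with a sequence of user actions; return the leaf
    command, or None if the path is invalid or incomplete."""
    node = tree
    for a in actions:
        if not isinstance(node, dict) or a not in node:
            return None
        node = node[a]
    return node if not isinstance(node, dict) else None

cmd = resolve(command_tree, ["grip", "pinch", "close"])
```

A mobile app can then edit the tree itself (renaming branches, pruning unused grips) without any firmware change, which matches the personalization goal described above.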
One of the user control methods we developed is based on the activation signal of the stump's muscles. The acquired signal can be an EMG signal for our EMG interface or a force myography signal for our Force Sensing Resistors (FSR) interface. For the EMG interface, the MyoWare EMG sensors (Advancer Technologies, Raleigh, NC, USA) shown in Fig. 3 were chosen to capture and recognize the user's muscle patterns. A calibration stage was developed for a modular and robust EMG classification. During this stage, the user performs repetitive muscle movements, which are classified into a circumscribing sphere in 3D space, as shown in Fig. 3. The sphere is then associated with one of the pre-defined hand movements. The main advantage of this method is that each user can define his own control inputs according to his muscle capabilities and skills. Moreover, the calibration stage can be repeated at any time to overcome the known drawbacks of EMG sensors, such as sensitivity to the adhesion quality to the skin, sweat, muscle fatigue and other environmental factors [19] that might have a major effect on the classification accuracy of the user's intent.
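The sphere-based calibration and classification can be sketched as follows, assuming each muscle-activation sample is reduced to a 3D feature vector. The fitting rule used here (center at the mean, radius reaching the farthest calibration sample) is a simplification for illustration, not necessarily the paper's exact method:

```python
import numpy as np

def fit_sphere(samples):
    """Circumscribing sphere of calibration samples: center at the
    mean, radius reaching the farthest sample (simplified model)."""
    pts = np.asarray(samples, dtype=float)
    center = pts.mean(axis=0)
    radius = np.linalg.norm(pts - center, axis=1).max()
    return center, radius

def classify(sample, spheres):
    """Return the label of the sphere containing the sample, preferring
    the nearest center; None if the sample lies outside all spheres."""
    sample = np.asarray(sample, dtype=float)
    best, best_d = None, np.inf
    for label, (center, radius) in spheres.items():
        d = np.linalg.norm(sample - center)
        if d <= radius and d < best_d:
            best, best_d = label, d
    return best

# Hypothetical calibration: two muscle patterns in a 3D feature space.
spheres = {
    "close_hand": fit_sphere([(1, 0, 0), (1.2, 0.1, 0), (0.9, -0.1, 0.1)]),
    "open_hand":  fit_sphere([(0, 1, 0), (0.1, 1.1, 0), (-0.1, 0.9, 0.1)]),
}
gesture = classify((1.05, 0.0, 0.05), spheres)
```

Returning None for samples outside every sphere gives a natural "no action" rejection region, which helps against the sensor drift mentioned above; re-running calibration simply refits the spheres.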
However, controlling the prosthetic hand using the residual limb's muscles is not suitable for all amputation cases, such as people who suffer from phantom pain when using the residual limb [16] or who have permanently lost muscular activity due to lack of physical activity, and who therefore do not have a sufficient and reliable EMG signal or muscle activation [9], [17]. Therefore, additional user controls were developed to bridge this gap. We have developed a very simple foot control which utilizes the change in ankle diameter during dorsiflexion and plantar flexion. Using this method, we can robustly detect when the user is tapping with his foot on the floor or in the air and thus control the opening and closing motions of the prosthetic. In addition, we are developing an IMU-based method for full foot-gesture recognition, a sole with embedded FSRs, and voice control. As part of our mobile app features, we developed a communication protocol and software packages for simple integration with 3rd parties who develop other user-control methods and technology.
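The foot-tap detection can be sketched as a debounced threshold crossing on the ankle-band signal: a tap is a rising crossing, and crossings inside a short refractory window are ignored as bounce. The threshold and window values are illustrative, not the ones used in the device:

```python
def count_taps(signal, threshold=0.5, refractory=4):
    """Count foot taps from a normalized ankle-diameter signal.
    A tap is a rising crossing of `threshold`; further crossings within
    `refractory` samples are ignored (debounce)."""
    taps, last = 0, -refractory
    for k in range(1, len(signal)):
        rising = signal[k - 1] < threshold <= signal[k]
        if rising and k - last >= refractory:
            taps += 1
            last = k
    return taps

# Two clear taps with a noisy bounce in between (samples 4-5).
sig = [0.1, 0.2, 0.8, 0.9, 0.4, 0.6, 0.2, 0.1, 0.7, 0.8, 0.2]
n = count_taps(sig)
```

Counting single versus double taps within a time window could then map directly onto the open/close commands described above.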

2) Sensory Feedback
While much research is done on the subject, sensory feedback remains a challenge even in high-end prosthetic hands [18]. The issue of sensory feedback in low-cost designs is rarely discussed even though it is highly significant for proper function and device adoption [1]. The addition of sensory feedback has the potential to increase device functionality (mainly grasping capability). Also, current prosthetic users use their device as a tool, but sensory feedback might increase body ownership [19].

Table I. Validation of each system module:
- 3D scanner (3-camera setup): different amputation types were scanned, CAD models were generated, and their socket and prosthesis were manually adjusted accordingly.
- Electrical design (current control): a correlation between the grip force and the current drawn by the motors was validated, with emphasis on differentiation between compliant and rigid objects.
- Mechanical design (parametric design): the model was validated for sizes ranging from a 10-year-old child's hand (approx. 7 cm palm diameter) to an adult hand.
The sensory feedback module is based on a low-cost mechatronic interface. The hand is equipped with five FSRs (Interlink Electronics, CA, USA) placed on the tip of the fingers, these sensors are used together with the measured current drawn from the motors to estimate the grip force applied by the hand on the grasped object. The system provides the user with feedback using the feedback band shown in Fig. 5 in a modality-matched approach, i.e., measured force is conveyed by applying force using the two servo motors which tighten a band that applies force on the residual limb.
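The modality-matched mapping from estimated grip force to band tightening can be sketched as a clamped linear map. The force and angle limits below are illustrative, not calibrated values from the device:

```python
def servo_angle(force_n, f_max=20.0, angle_max=90.0):
    """Modality-matched feedback sketch: map an estimated grip force [N]
    to a band-tightening servo angle [deg], clamped to the servo range.
    f_max and angle_max are illustrative limits, not calibrated values."""
    force_n = max(0.0, min(force_n, f_max))  # clamp to the valid force range
    return angle_max * force_n / f_max

a = servo_angle(10.0)   # mid-range force -> mid-range tightening
```

A linear map keeps the felt pressure on the residual limb proportional to the measured fingertip force, which is the essence of the modality-matched approach described above.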

III. PRELIMINARY RESULTS
Due to the complexity of the entire pipeline, each system module was tested in a different manner. Some modules still lack systematic testing but were qualitatively reviewed by users; see Table I for a short description of each system module. In general, the device was tested in common ADLs including holding a glass, a bottle and a can; pouring liquid and particles from one container to another; gripping a ball, notebook, phone and keys; performing gestures; and writing. Of all the above ADLs, only writing was not simple to accomplish, and the task could be achieved only when using a large marker. Although formal feedback has not yet been collected in the framework of this paper, initial feedback from prospective users and from healthy subjects was gathered. The main benefits raised by the prospective users after examining the hand are the low overall weight (~300 g, varying with size) and the vast customization options in comparison to high-end commercial hands. In addition, although there are more hand configurations than needed, users appreciated several gesture configurations that are meant to be communicative rather than functional. Such gestures cannot be found in most commercial prosthetic hands and were inspired by the natural use of the human hand in communication and not only in functional tasks.
In addition to the validation, the proposed design was compared to existing devices, as shown in Table II. The information was taken from [20]. The Ottobock and OpenBionics designs were chosen as they are at the lower price range of bionic hands. In technical terms, the proposed design is comparable with the commercial devices except for the maximum grasped weight. It is important to note that the published specifications for commercial devices are limited, so values may have been evaluated using different methods for different devices. The major differences between the proposed design and the commercial ones are related to the device's personalization in all aspects and the reduced dependency on trained professionals.

IV. CONCLUSION AND FUTURE WORK
Most existing prosthetic hands have a very high price tag; this means that they do not offer a feasible solution to large portions of the population, and especially to children, due to their constant growth. The price is determined in many cases by the manual fitting procedures involved and by the use of high-end hardware even when it is not necessarily mandatory. Rejection rates remain high even when financial resources do not present an issue. In this paper, we present a fully automatic fitting pipeline concept for a personalized, low-cost 3D printed prosthetic hand. The fitting concept is based on the extraction of the user's geometric data, i.e., a 3D scan of the stump and the kinematic structure of the non-affected hand; the 3D design is adjusted accordingly. In addition, the appearance of the hand can be easily modified according to user preferences. The hardware was chosen to be as minimalistic as possible, providing only essential features and thus resulting in a simple and robust design. By designing modular software and hardware, we allow the user to choose the most suitable interface for him/her. Currently, we offer an EMG interface and a foot-based interface, while providing a wide platform for 3rd party integration via our mobile application and Bluetooth communication. Lastly, the output of the automated pipeline is an easy-to-manufacture design with minimal need for manual adjustments. The above-mentioned advantages could enable intensive use by people who do not have the means for other prosthetic hands, or who rejected them and were looking for a better solution. We believe this concept has the potential to provide a much-needed leap in the field of upper limb prosthetics and could be expanded to other medical applications.

APPENDIX
The design and related developments discussed in this article were uploaded as an open-source project available for anyone to download and use: https://brml.technion.ac.il/show_project/43