Advances in Human-Robot Handshaking
Vignesh Prasad, Ruth Stock-Homburg, Jan Peters
2020-08-26

Abstract: The use of social, anthropomorphic robots to support humans in various industries has been on the rise. During Human-Robot Interaction (HRI), physically interactive non-verbal behaviour is key to more natural interactions. Handshaking is one such natural interaction, used commonly in many social contexts. It is one of the first non-verbal interactions to take place and should, therefore, be part of the repertoire of a social robot. In this paper, we explore the existing state of Human-Robot Handshaking and discuss possible ways forward for such physically interactive behaviours.

Handshaking is a commonly and naturally used physical interaction and an important social behaviour between two people [18] in many social contexts [13, 22, 51]. It is one of the most common greetings and is usually the first non-verbal interaction taking place in a social context. Handshaking is, therefore, an important social cue for several reasons. Firstly, it plays an important role in shaping impressions [9, 13, 51]. Moreover, it helps set the tone of any interaction, since the sense of touch can convey distinct emotions [23]. Robot handshaking also improves the perception of robots by making humans more willing to help them [4], leading to better cooperation and coexistence. Having human-like body movements likewise plays an important role in the acceptance of HRI. Thus, a good handshake can not only widen the expressive abilities of a social robot but also provide a strong first impression for further interactions. We propose the framework shown in Fig. 1 for our study.
A group of researchers from Okayama Prefectural University, Japan, modelled human handshaking interactions using motion capture to track participants' joints. Firstly, a transfer function was developed to let the responder mimic the requester's reaching motion [31]. This was further developed into a minimum-jerk trajectory model, which accurately captures the velocity profiles and generates smooth motions [25, 30, 44, 45]. Shaking was modelled as a spring-damper system [28, 61], whose oscillatory motion profile fits that of shaking. A similar spring-damper model is proposed by Dai et al. [16] to model elbow stiffness by measuring muscle contractions in the arm using EMG signals. A group of researchers from the University of Lorraine modelled the mutual synchronization (MS) between participants during shaking, as well as the forces exerted on the palms [36, 38, 53]. Tagne et al. [53] investigate the joint motions with IMUs placed at each joint. Melnyk and Hénaff [38] additionally analyse trends across different gender pairings. Both of these works analyse the influence of the social setting, such as greeting, congratulating or sympathising. Tagne et al. [53] observe a shorter duration for greeting compared to sympathy and congratulation, which were similar. The grip strength shows contradictory results: Tagne et al. [53] observe the lowest grip strength for sympathy, followed by greeting and then congratulation, whereas Melnyk and Hénaff [38] observe slightly, though not significantly, higher grip strength for consolation. Regarding gender, male pairs shook for a shorter duration than mixed pairs, and female pairs shook the longest. No conclusive correlations were found between gender and grip strength, contrary to [13, 42]. Knoop et al. [35] studied the contact area, pressure and grasping forces exerted while handshaking. A positive correlation was found between contact pressure and grasping force.
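The spring-damper view of the shaking phase can be made concrete with a short simulation. The following is a minimal sketch, not code from any of the surveyed works, and the mass, stiffness and damping values are purely illustrative:

```python
def simulate_spring_damper(k=50.0, d=5.0, m=1.0, x0=0.05, dt=0.001, T=2.0):
    """Simulate a 1-D spring-damper joint released from an initial
    displacement x0 (metres): m*x'' + d*x' + k*x = 0.
    Underdamped parameters give the decaying oscillation that such
    models use to describe the shaking phase."""
    x, v = x0, 0.0
    traj = []
    for _ in range(int(T / dt)):
        a = (-d * v - k * x) / m      # acceleration from spring and damper forces
        v += a * dt                   # semi-implicit Euler step
        x += v * dt
        traj.append(x)
    return traj

traj = simulate_spring_damper()
```

With these (illustrative) values the damping ratio is about 0.35, so the joint oscillates through its rest position a few times before settling, matching the oscillatory profile described above.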
This pressure-force relationship became non-linear as the grasping force increased. Jindai et al. [25, 30, 31] and Ota et al. [44, 45] propose two models for reaching: one uses a transfer function based on the human hand's trajectory with a lag element, and the other is a minimum-jerk trajectory model, which fits the velocity profiles and provides smooth trajectories by definition. More recent works model reaching using machine learning. Campbell et al. [12] use imitation learning to learn a joint distribution over the actions of both the human and the robot during handshaking. During training, the robot executes open-loop trajectories to which the human adjusts, since their pneumatic robot cannot be taught kinesthetically. At test time, the posterior distribution is inferred from the human's initial motions, from which the robot's trajectory is sampled. Their framework also estimates the speed of the interaction in order to match the speed of the human. Christen et al. [14] use deep reinforcement learning to learn physical interactions from human-human interactions, with an imitation reward that helps in learning the intricacies of the interaction. Falahi et al. [19] use one-shot imitation learning to kinesthetically teach reaching and shaking behaviours based on gender and familiarity, detected using facial recognition; however, the results cannot be generalized due to the extremely small sample size. Vinayavekhin et al. [57] model hand reaching with an LSTM trained on skeleton data. They predict the human hand's final pose and devise a simple controller for the robot arm to reach the predicted location. In terms of smoothness, timeliness and efficiency, their method performs better than following the intermediate hand locations, but worse than using the true final pose, due to inaccuracies in the prediction. Avelino et al. [5] model grasping with different degrees of hand closure. Since position control is used, the perceived force depends on the hand sizes of the participants.
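The minimum-jerk model used for reaching has a well-known closed form: for start point x0, end point xf and duration T, the position follows a fifth-order polynomial in normalized time, with zero velocity and acceleration at both endpoints. A minimal sketch (the reach distance and duration below are illustrative):

```python
def min_jerk(x0, xf, T, n=201):
    """Minimum-jerk point-to-point trajectory:
    x(t) = x0 + (xf - x0) * (10 s^3 - 15 s^4 + 6 s^5), with s = t/T.
    Velocity and acceleration vanish at both endpoints, producing the
    smooth bell-shaped velocity profile observed in human reaching."""
    xs = []
    for i in range(n):
        s = i / (n - 1)                      # normalized time in [0, 1]
        xs.append(x0 + (xf - x0) * (10 * s**3 - 15 * s**4 + 6 * s**5))
    return xs

path = min_jerk(0.0, 0.4, 1.0)  # reach 0.4 m in 1 s
```

The peak speed of this profile is 1.875 * (xf - x0) / T at the midpoint, which is the bell-shaped velocity signature the reaching models above fit to human data.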
Avelino et al. address this in their follow-up work [6], where participants adjust the robot's fingers until a preferable grasp is reached. This provides a reference for the force sensors and the joint positions, from which grasping behaviours are developed. However, they do not incorporate any force feedback, which is discussed below. Ouchi and Hashimoto [46] propose a remote handshaking system, where a handshake is performed during a phone call using a custom silicone-rubber-based robotic soft hand. The force exerted on the robot hand, measured using a pneumatic force sensor, is relayed to the robot hand at the other end, which mimics it. They found that participants barely felt any transmission delay and perceived the partner's existence better during the call. Pedemonte et al. [48] design a robot hand for handshaking controlled by the force exerted on it. It is sensor-less, with a deformable palm controlling the fingers based on the degree of deformation using variable admittance control. Arns et al. [2] improve this design with lower gear ratios, impedance control and more powerful actuators, obtaining stronger grasping forces and near-instantaneous response times (< 0.05 s). The force exerted by the robot hand depends on the force exerted by the human, leading to a partial synchronisation. Arns et al. test how it feels in comparison with a human hand on a 5-point scale (1: very different, 5: identical). It was perceived well in terms of compliance (3.9/5), force feedback (4/5) and overall haptics (3.7/5). Vigni et al. [56] follow a more closed-loop approach: they measure the force exerted by the human hand with force-sensitive resistors and control the force exerted by the robot hand, approximated from its degree of hand closure. They compare three different relationships between the forces exerted by the human and the robot, namely linear, constant and combined (constant + linear), with the latter two used with high (strong) and low (weak) constant values.
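The three controller families compared by Vigni et al. can be sketched in a few lines. The baseline, gain and saturation values below are illustrative placeholders, not the values used in the paper:

```python
def grip_controller(f_human, mode="combined", f_const=3.0, gain=0.5, f_max=15.0):
    """Commanded robot grip force (N) as a function of the sensed human
    grip force. 'constant' ignores the human, 'linear' follows the human,
    and 'combined' adds a constant baseline to a linear follow term."""
    if mode == "constant":
        f_cmd = f_const
    elif mode == "linear":
        f_cmd = gain * f_human
    else:  # combined: constant + linear
        f_cmd = f_const + gain * f_human
    return max(0.0, min(f_cmd, f_max))  # saturate to the hand's force range
```

A "strong" or "weak" variant then simply corresponds to a higher or lower f_const in the constant and combined modes.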
The combined controllers were perceived better than the constant ones. Participants were seen to adjust their force based on the robot's, showing that humans tend to follow the force exerted on their hand. The stronger variants of the constant and combined controllers were perceived as more confident/extroverted. In this section, we describe works that model the shaking phase. They mainly do so by aiming to achieve synchronous motions with the interaction partner while reducing interaction forces. The works can be broadly divided into three categories: Central Pattern Generator models, harmonic oscillator models and miscellaneous models. Central Pattern Generators (CPGs) [24] are biologically inspired neuronal circuits that generate rhythmic outputs from non-rhythmic inputs. Kasuga and Hashimoto [34] model the shoulder and elbow motions of a robot with a CPG, using the torque exerted on the joints as input; however, it does not adapt to the human. For better synchronization, some works adapt the CPG to learn the frequency of the shaking motions, either by incorporating a learning framework into the CPG [3, 32, 37] or by parametrizing the CPG and learning the parameters on the fly [47, 50]. Harmonic oscillator models either mimic harmonic systems such as spring-damper systems [17, 39, 61] or follow simple sinusoidal motions [8, 59, 60, 62]. Chua et al. [15] propose a hybrid model that uses both: a spring-damper model to update impedance parameters and a simple sinusoidal trajectory to generate reference trajectories. Similarly, other works use impedance control to model the stiffness [8, 17], and for better synchronization, some estimate the impedance parameters online using an EKF [39], an HMM [59] or least squares [60]. Karniel et al. [33] and Nisky et al. [41] design an experimental framework and metric for testing the human-likeness of shaking motions on a 1D haptic stylus. Avraham et al.
[7] test three models with this framework. The first is a tit-for-tat model that passively records the joint motions and replays them. The second is a biological model simulating muscle-generated motions to achieve low interaction forces. The third is a simple linear regression model. The tit-for-tat and linear regression models fare much better than the biologically inspired model. Pedemonte et al. [49] introduce a remote handshaking mechanism using their previously developed hand, mounted on a passively controlled vertical rail. The shaking motion, along with the force exerted on the hand, is relayed to the partner's hand and rail mechanism. This allows realistic haptic interaction to take place remotely, with the participants able to adequately perceive each other's motions and forces. Before discussing the various social responses, we would like to address the differences in the metrics and criteria that the various studies use for evaluation. One common metric is the bipolar scale (7-point or 5-point), where one end conveys a negative perception of the parameter and the other a positive perception. Another popular method is the Bradley-Terry model [11], a probabilistic model specifically used for analysing paired comparisons among a set of different methods. However, these are general metrics used for statistical analysis. To this end, Karniel et al. [33] devise a custom metric, called the Model Human Likeness Grade (MHLG), for comparing the human-likeness of different human-robot handshaking methods in a Turing-test-like setting. It is based on the probability, as perceived by a participant, of a method being human-like. Additionally, the use of many different types of robotic interfaces makes it difficult to generalize comparisons of results across different works.
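To illustrate how Bradley-Terry strengths are obtained from paired-comparison counts, the following is a minimal sketch using the standard minorization-maximization fitting scheme, run on made-up win counts for three hypothetical handshake methods:

```python
def fit_bradley_terry(wins, n_iter=200):
    """Fit Bradley-Terry strengths by iterative MM updates.

    wins[i][j] = number of comparisons in which method i was preferred
    over method j. Returns strengths p with sum(p) == 1, where the model
    assumes P(i preferred over j) = p[i] / (p[i] + p[j])."""
    n = len(wins)
    p = [1.0 / n] * n
    for _ in range(n_iter):
        new_p = []
        for i in range(n):
            w_i = sum(wins[i])                          # total wins of method i
            denom = sum((wins[i][j] + wins[j][i]) / (p[i] + p[j])
                        for j in range(n) if j != i)
            new_p.append(w_i / denom if denom > 0 else p[i])
        total = sum(new_p)
        p = [q / total for q in new_p]                  # renormalize each sweep
    return p

# Hypothetical counts: method 0 beats 1 in 60/100 trials, beats 2 in 75/100, etc.
p = fit_bradley_terry([[0, 60, 75], [40, 0, 67], [25, 33, 0]])
```

The fitted strengths recover the implied ordering of the three methods, which is exactly how paired preference judgements between handshake models can be ranked.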
Some works use a simple gripper-like interface [10, 19, 32, 50], some use a rod-like end-effector [7, 21, 33, 41, 47, 58, 59], and some use a human-hand-like interface that is either actively controlled [1, 4, 6, 14, 39, 54, 56], passively controlled [2, 8, 46, 48, 49] or not controlled at all [12, 17, 25, 26, 27, 28, 29, 30, 31, 34, 35, 37, 40, 43, 44, 45, 52, 55, 57, 61]. Such variety in evaluation criteria, metrics and especially robotic interfaces makes it difficult to converge on a common benchmark with common parameters for evaluating different human-robot handshaking methods. Therefore, we categorize the different works evaluating human-robot handshaking based on the factors they evaluate or the goal of their experiments. These can be roughly divided into the categories shown below. Ammi et al. [1] and Tsalamlal et al. [54] explored combinations of visual and haptic behaviours. Among the visual expressions, "happy" was rated higher than "neutral", with "sad" rated lowest. Significantly higher arousal and dominance were seen for strong handshakes, and higher valence for soft ones. Higher arousal and dominance were also seen with strong handshakes in a visuo-haptic setting as compared to a visual-only one. Another framework studying the effect of visuo-haptic stimuli was proposed by Vanello et al. [55]. They develop a sensor glove to track the participants' hand motions and contact pressure, with a screen on which visuals (faces of humans and robots) are shown, and analyse participants' feedback using fMRI activity. Nothing can be concluded from their results, as only three participants took part in the study. Jindai et al. [27] found that a delay of 0.1 seconds between the voice and the handshake motion of the robot was acceptable. They further saw [30] that participants preferred the gaze to shift steadily from the hand while reaching to the face after contact. Ota et al.
[44, 45] found the response to a handshake to be preferable with a delay of 0.2 s to 0.4 s. Nakanishi et al. [40] mount a robotic soft hand on a video screen showing a remote presenter in a telepresence scenario. Interactions were better perceived when the presenter's hand was not visible on the screen. They further saw that, when participants controlled a second robot hand placed with the presenter, feelings of closeness and of physically shaking hands were rated higher when both the presenter's hand and the robot hand were out of the frame. They argue that the hand's visibility cancels the feeling of synchronization, which some subjects attributed to seeing two hands for the same interaction. Avelino et al. [4] examine how handshaking affects the willingness to help a robot that has to perform a navigation task. Participants who shook hands with the robot found it warmer and more likeable and were more willing to help it. However, they argue that a human-like robot handshake can lead participants not to anticipate the robot getting stuck in a simple navigation task, due to a mismatch between the expected skill and the actual behaviour. Bevan and Fraser [10] study the effect of handshaking on negotiations between participants, one of whom interacts remotely via a Nao robot. Handshaking improved mutual cooperation; however, haptic feedback for the telepresent negotiator had no significant impact. It did not affect perceived trustworthiness either, which they argue is possibly due to the childlike nature of the Nao robot. Garg et al. [20] classify people's personalities as introverted or extroverted using statistics of accelerations, Euler angles and polar orientations recorded while shaking hands with a robot hand. The features are ranked by mutual information, followed by a k-nearest-neighbours classification, achieving 75% accuracy. Orefice et al. [42] similarly look at distinctions in personality as well as gender.
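A feature-ranking-plus-classification pipeline of the kind used by Garg et al. can be sketched as follows. The toy data, the bin count of the mutual-information estimator and the choice of k are all illustrative, not taken from the paper:

```python
import math
from collections import Counter

def mutual_information(xs, ys, bins=3):
    """Plug-in estimate of I(X;Y) in nats for a continuous feature xs and
    discrete labels ys, using equal-width binning of xs (a crude estimator)."""
    lo, hi = min(xs), max(xs)
    width = (hi - lo) / bins or 1.0
    bx = [min(int((x - lo) / width), bins - 1) for x in xs]
    n = len(xs)
    pxy, px, py = Counter(zip(bx, ys)), Counter(bx), Counter(ys)
    return sum((c / n) * math.log((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

def knn_predict(train_X, train_y, x, k=3):
    """Plain k-nearest-neighbour majority vote with Euclidean distance."""
    order = sorted(range(len(train_X)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(train_X[i], x)))
    return Counter(train_y[i] for i in order[:k]).most_common(1)[0][0]

# Toy data: feature 0 separates the two classes, feature 1 is uninformative.
X = [[0.10, 5.00], [0.20, 5.20], [0.15, 4.90],
     [1.10, 5.10], [1.20, 4.80], [1.05, 5.05]]
y = [0, 0, 0, 1, 1, 1]
mi = [mutual_information([row[f] for row in X], y) for f in (0, 1)]
```

Ranking features by mutual information before classifying, as above, discards uninformative measurement channels before the nearest-neighbour vote.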
Orefice et al. found that male-male pairs applied more pressure than male-female pairs, and that female-female pairs shook with a longer duration and lower frequency. Regarding personality, they found that introverts shook at higher speeds while extroverts exerted more pressure. Using these features, they predict the human's gender and personality during a human-robot handshake. They further performed a longitudinal study [43] of how pressure variations while shaking hands with a Pepper robot reflect the participants' immediate mood. Consistent behaviour was seen whether shaking hands with a human subject or with Pepper, which was unexpected, as interacting with Pepper might not feel as human-like. The only significant difference among positive moods was between the "Calm" and "Cheerful" moods, with less pressure observed in a "Calm" mood. Among negative moods, a "Bored" mood had lower pressure than "Excited" or "Tense" moods, both of which involve more arousal. In general, lower pressures were found for moods with lower arousal. Giannopoulos et al. [21] and Wang et al. [58] compare the human-likeness of their previous handshaking models (a basic one [60] and an interactive one [59]) with a human operating the robot. Both studies perform their experiment with participants wearing noise-cancelling headphones playing music and ambient conversations in a cocktail-bar scenario. Giannopoulos et al. [21] blindfold the participants, whereas Wang et al. [58] have the participants wear a VR headset with a human model rendered in place of the robot. The human-operated handshake was rated highest (6.8/10 in both studies), followed by the interactive handshake (5.9/10 in [21], 5.3/10 in [58]). The basic handshake was rated lowest (3.3/10 in [21], 3.0/10 in [58]). The interactive handshake was close to the human-operated one, but both were far from the maximum human-likeness (10/10), possibly due to the rod-like end-effector. Stock-Homburg et al.
[52] test whether an android robot's hand, made with soft silicone skin and a heated palm, can pass as a human hand. Participants were blindfolded and shook hands with a human and the robot twice each, in random order. A majority of them (11/15) correctly guessed the first hand they interacted with, which some attributed to the mechanical feel of the robot hand. By the fourth handshake, all guessed correctly. Participants only experienced a static interaction; testing active handshake behaviours could yield better insights. Overall, we discussed various works looking into human-robot handshaking. Due to differences in hardware and metrics, it is difficult to come up with a common benchmark to evaluate these studies. However, some qualitative conclusions can be drawn. In general, an element of synchronization is present. This can be measured well in the shaking stage, where low interaction forces can be an indication of synchronization. In reality, a leader-follower situation arises, which could reflect various personal attributes of the people shaking hands. From the perspective of a social robot, contextual cues would help the handshake have a better impact. This requires further research in fields such as emotion recognition, intent estimation and personality recognition. For a more human-like perception, each aspect of the movement at the different stages needs to be human-like, since we are still far from having robotic interfaces that not only look but also feel human-like, as most still have a mechanical feel. Smooth integration of the different phases of handshaking is equally important, since delays in switching between stages may be poorly perceived. One thing to keep in mind is that physical interactions vary across cultures, age groups and geographic locations.
Depending on the context, different interactions may also be more prevalent, such as hugging or patting for higher intimacy, or bumping fists and giving high fives in a friendly scenario. Moreover, due to the Covid-19 pandemic, increasing restrictions and precautions limiting physical contact have led to alternative interactions, like shaking/tapping feet, touching elbows/forearms, remote high fives and so on. However, the importance of handshakes in business and formal settings is a good motivation for continuing to develop human-like handshaking behaviours. Additionally, learning different physically interactive behaviours would help improve the perception of a social robot, which is a good direction for future work.

References (titles as extracted):
- Haptic human-robot affective interaction in a handshaking social protocol
- Design, control and experimental validation of a haptic robotic hand performing human-robot handshake with human-like agility
- Physical human-robot interaction in the handshaking case: learning of rhythmicity using oscillators neurons
- The power of a hand-shake in human-robot interactions
- Human-aware natural handshaking using tactile sensors for vizzy, a social robot
- Towards natural handshakes for social robots: human-aware hand grasps using tactile sensors. Paladyn
- Toward perceiving robots as humans: Three handshake models face the turing-like handshake test
- Haptic interface for handshake emulation
- The influence of handshakes on first impression accuracy
- Shaking hands and cooperation in tele-present human-robot negotiation
- Rank analysis of incomplete block designs: I. The method of paired comparisons
- Learning interactive behaviors for musculoskeletal robots using bayesian interaction primitives
- Handshaking, gender, personality, and first impressions
- Guided deep reinforcement learning of control policies for dexterous human-robot interaction
- Human-robot motion synchronization using reactive and predictive controllers
- Research of human-robot handshakes under variable stiffness conditions
- Variable viscoelasticity handshake manipulator for physical human-robot interaction using artificial muscle and mr brake
- Handwork as ceremony: The case of the handshake
- Adaptive handshaking between humans and robots, using imitation: Based on gender-detection and person recognition
- Classifying human-robot interaction using handshake data
- Comparison of people's responses to real and virtual handshakes within a virtual environment
- The handshake as interaction
- Bio-inspired plastic controller for a robot arm to shake hand with human
- Physical analysis of handshaking between humans: mutual synchronisation and social context
- On the role of stiffness and synchronization in human-robot handshaking. The International
- Remote handshaking: touch enhances video-mediated social telepresence
- Three alternatives to measure the humanlikeness of a handshake model in a turing-like test
- Let's handshake and i'll know who you are: Gender and personality discrimination in human-human and human-robot handshaking interaction
- Pressure variation study in human-human and human-robot handshakes: Impact of the mood
- A handshake response motion model during active approach to a human
- Handshake response motion model with approaching of human based on an analysis of human handshake motions
- Handshake telephone system to communicate with voice and force
- A kinematic controller for human-robot handshaking using internal motion adaptation
- Design, control, and experimental validation of a handshaking reactive robotic interface
- A haptic bilateral system for the remote human-human handshake
- Synchronization based control using online design of dynamics and its application to human-robot interaction
- Exploring the handshake in employment interviews
- Evaluation of the handshake turing test for anthropomorphic robots
- Measurement and analysis of physical parameters of the handshake between two persons according to simple social contexts
- Affective handshake with a humanoid robot: How do participants perceive and combine its facial and haptic expressions? 19th International Symposium in Robot and Human Interactive Communication
- The role of closed-loop hand control in handshaking interactions
- Human-like hand reaching by motion prediction using long short-term memory
- Handshake: Realistic human-robot interaction in haptic enhanced virtual reality
- An hmm approach to realistic haptic human-robot interaction
- Modelling of human haptic skill: A framework and preliminary results
- Development of a shake-motion leading model for human-robot handshaking
- Human-robot handshaking: A hybrid deliberate/reactive model

Acknowledgements: The authors thank the Interdisciplinary Research Forum (Forum Interdisziplinäre Forschung) at the Technical University of Darmstadt and the Association of Supporters of Market-Oriented Management, Marketing, and Human Resource Management (Förderverein für Marktorientierte Unternehmensführung, Marketing und Personalmanagement e.V.) for funding this work.