Of particular interest to the neuroscience and robotics communities is how two humans can physically collaborate to perform motor tasks such as holding a tool or moving it between locations. When two humans physically interact, sensory consequences and motor outcomes are not entirely predictable, as they also depend on the other agent's actions. The sensory mechanisms involved in physical interactions are not well understood. The present study was designed (1) to quantify human–human physical interactions in which one agent ("follower") must infer the intended or imagined, but not executed, direction of motion of another agent ("leader") and (2) to reveal the underlying strategies used by the dyad. This study also aimed to verify the extent to which visual feedback (VF) is necessary for communicating intended movement direction. We found that the leader's control of the relationship between force and motion was a critical factor in conveying his or her intended movement direction to the follower, regardless of VF of the grasped handle or the arms. Interestingly, the dyad's ability to communicate and infer movement direction improved to significant accuracy (>83%) after a relatively short amount of practice. These results indicate that the relationship between force and motion, interpreted as arm impedance modulation, may represent an important means for communicating intended movement direction between biological agents, as indicated by the modulation of this relationship according to intended direction. Ongoing work is investigating the application of the present findings to optimize communication of high-level movement goals during physical interactions between biological and non-biological agents.
Human physical interactions can be intrapersonal, e.g., manipulating an object bimanually, or interpersonal, e.g., transporting an object with another person. In both cases, one or two agents must coordinate their limbs to attain the task goal. We investigated the physical coordination of two hands during an object-balancing task performed either bimanually by one agent or jointly by two agents. The task consisted of a series of static (holding) and dynamic (moving) phases, initiated by auditory cues. We found that dyads' task performance was not affected by different pairings of dominant and non-dominant hands. However, the spatial configuration of the two agents (side-by-side vs. face-to-face) appeared to play an important role: dyads performed better side-by-side than face-to-face. Furthermore, we found that only individuals with worse solo performance benefited from interpersonal coordination through physical coupling, whereas better-performing individuals did not. The present work extends ongoing investigations of human–human physical interactions by providing new insights into factors that influence dyadic performance. Our findings could potentially impact several areas, including robot-assisted therapies, sensorimotor learning, and human performance augmentation.