Yizhar, Or; Buchs, Galit; Heimler, Benedetta; Friedman, Doron; Amedi, Amir. Proprioception Has Limited Influence on Tactile Reference Frame Selection. bioRxiv, 2020-11-03. DOI: 10.1101/2020.11.02.364752

Perceiving the spatial location and physical dimensions of objects that we touch is crucial for goal-directed actions. To achieve this, our brain transforms skin-based coordinates into a reference frame by integrating visual and proprioceptive cues, a process known as tactile remapping. In the current study, we examine the role of proprioception in the remapping process when information from the more dominant visual modality is withheld. We developed a new visual-to-touch sensory substitution device and asked participants to perform a spatial localization task in three different arm postures that included posture switches between blocks of trials. We observed that in the absence of visual information, novel proprioceptive inputs can be overridden after switching postures. This behavior demonstrates effective top-down modulation of proprioception and points to the unequal contribution of different sensory modalities to tactile remapping.

How does body posture influence the way we interpret and perceive the external environment? Our constant physical interaction with the world requires a continuous update of the body's location in external space, its relation to other objects, and its relation to itself (e.g., the relative positions of body parts in motion). These varied representations of the body form a conscious perception of the external world and play an essential role in action planning. Think, for instance, of driving a car with a steering wheel in your hand: sensory information about the wheel gives rise to a coherent perception of its function and leads to a set of possible actions that one can perform with it.
First, we access the wheel's physical dimensions through tactile stimulations received on our palms that form an anatomical reference frame. To steer the car, we transfer this information into a different reference frame that integrates the anatomical reference frame with external information fitting the wheel's functional use (e.g., the sidewalk is to the right, the opposite lane is to the left), in a process known as tactile remapping 1,2. The transformation results in the adoption of an allocentric reference frame that is independent of the body, relating objects' dimensions to external anchors, or of an egocentric reference frame, relating objects' positions to one's own body 3,4. Tactile stimulations can be remapped into many allocentric or egocentric reference frames, with the ultimate selection depending on the actions that precede or follow the sensation 5,6, the gravitational dimensions of the external environment 7-10, and the general position of the body 2,4,11. Which cognitive mechanisms drive tactile remapping? One influential view considers tactile remapping as part of a wider process of acquiring sensorimotor contingencies 2,8,12,13. According to this theory, perception emerges through experiencing multiple co-patterns of incoming sensory signals coupled with outgoing motor actions towards the stimulus. In the context of tactile remapping, multiple reference frames are learned from exposure to tactile stimulations that are integrated with visual and proprioceptive cues to execute diverse actions. Thereafter, many reference frames are accessible with different probability weights that are constantly updated with ongoing sensorimotor experiences, and are then retrieved implicitly during the tactile remapping process 2,3,13.
Supporting studies show that a change to body posture, gaze, or an object's position in external space triggers a gradual adaptation period marked by inconsistent reference frame selections and increased decision times as participants integrate new sensory information 4,10,14,15. Over time, participants' reference frame selection becomes more robust as new contingencies are established 2,10,16,17. However, the description of tactile remapping as a byproduct of sensorimotor contingencies overlooks the differential contribution of vision and proprioception to the process. In particular, vision is our predominant sensory modality for spatial processing and has a particularly strong influence on reference frame selection. Vision impairs the localization of tactile stimuli when the arms are crossed 17,18, determines the assignment of vertical and horizontal axes to our body 14,19, and biases actions towards touched objects 20,21. Conversely, the effects of proprioception (i.e., posture) per se on tactile remapping are less studied and harder to isolate, even when visual inputs are withheld. Such studies typically include complex spatial and cognitive tasks that strongly influence perception but are unrelated to proprioception proper, such as the need to relocate the object after changing postures 2,5,13, or a manual delivery of tactile stimuli that biases participants' responses 4,14,15. In the current study, we tested the effects of changing body postures on reference frame selection in blindfolded participants. To disentangle the contribution of proprioception from other factors, we built a visual-to-tactile Sensory Substitution Device (SSD) 22,23 that transforms 2D grayscale images into tactile stimuli delivered on the inner arm. The fixed device moves together with the arm and thus nullifies the need to actively relocate the stimulations.
With this unique setup, we had participants perform a simple spatial localization/orientation task while also switching their arm's posture between trial blocks. We aimed to further investigate the role of posture in tactile remapping by asking how switching postures affects previously acquired reference frames. According to a strictly sensorimotor prediction, after switching postures new proprioceptive cues will gradually integrate with a stored body representation, producing an adaptation and learning period characterized by less consistent responses and longer decision times. Results in this direction would suggest that, despite vision's dominant influence, incoming proprioceptive signals strongly influence the remapping process. An alternative hypothesis is that proprioception is a unique sensory modality, of which we are less consciously aware 24, and that its effect on remapping is weaker than previously assumed. In this case, participants would rapidly adapt to new postures as top-down representations override incoming bottom-up proprioceptive cues, diverging from a pure sensorimotor contingency description. In this study, we investigated the properties of reference frame selection when relying solely on proprioceptive cues. To this aim, we used a visual-to-tactile SSD that transfers 2D images to blindfolded participants' arms, which were placed in three different postures (Fig. 1a). In each experimental trial, participants were presented with a series of vibrotactile stimulations corresponding to the pixels of an image, and their task was to report the stimulus's spatial location ("up/down") or orientation ("upward/downward"). We interpreted their responses on the y-axis based on the anatomy of the inner arm, referred to here as coordinate selection. Distal responses were defined as a perception of the line's upper location/orientation located away from the trunk and towards the wrist.
Proximal responses were defined as the perception of the line's upper location/orientation located towards the trunk and the elbow (Fig. 1b). This was done to obtain a homogenous categorization of participants' responses, ultimately facilitating the identification of their selected reference frame (arm-based, trunk-based, external, etc.). Group-level analyses were performed with a Wilcoxon signed-rank two-tailed test and corrected for multiple comparisons (Bonferroni, α = 0.05). Subject-level statistics passed the criteria for normal approximation and were analyzed with a normalized two-tailed t-test corrected for multiple comparisons (False Discovery Rate, α = 0.05). In our second experiment, we replicated parts one and two of experiment 1 while recording participants' decision times. Decision times decreased along a learning curve; from the fit, we can predict some saturation of the learning process after approximately 60 trials. In contrast to the overall learning curve trend, there is a single significant increase in decision times in the first trial following the first posture switch (n = 20, p = 0.001). To understand the reference frame choices during the tactile remapping process before and after switching, we analyzed participants' responses from both experiments in each part. We asked whether a participant held one preference over the other compared to chance (normal approximation to the binomial, X ~ B(n = 16-24, p = q = 0.5), FDR corrected). All 72 participants showed a clear and significant preference (n = 72, α = 0.05) in part 1 (Fig. 3a). In the Flexion posture all 30 participants took a distal reference frame, while in the Extension posture 27 out of 30 participants adopted the proximal one. In the Neutral posture, six participants showed a consistent proximal preference, and the other six a distal one. In part two, responses from 39 out of 40 participants passed the FDR correction (n = 40, α = 0.05) with a significant consistent preference (Fig. 3a).
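The subject-level preference test described above (a normal approximation to the binomial against chance, FDR corrected) can be sketched as follows. This is a minimal illustration with hypothetical per-participant response counts, not the authors' analysis code:

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

def preference_pvalue(n_distal: int, n_trials: int) -> float:
    """Two-tailed p-value for a distal-vs-proximal preference against
    chance (p = 0.5), via the normal approximation to the binomial."""
    z = (n_distal - 0.5 * n_trials) / np.sqrt(n_trials * 0.25)
    return 2 * stats.norm.sf(abs(z))

# Hypothetical counts of distal responses out of 16-24 trials per participant
counts = [(15, 16), (20, 24), (3, 20), (11, 20)]
pvals = [preference_pvalue(k, n) for k, n in counts]

# Benjamini-Hochberg FDR correction at alpha = 0.05
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
for (k, n), p, sig in zip(counts, p_adj, reject):
    print(f"{k}/{n} distal -> adjusted p = {p:.4f}, consistent: {sig}")
```

A participant counts as holding a consistent preference only if the FDR-adjusted p-value survives the 0.05 threshold; under these hypothetical counts the first three participants would, while the fourth (11/20 distal) would not.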
Of the participants who switched from Extension to Flexion, 14 adopted a distal reference frame (switched coordinates) and 6 adopted a proximal one (maintained the same coordinates). Of those who switched from Flexion to Extension, 13 adopted a distal reference frame (maintained the same coordinates) and 6 a proximal one (switched coordinates). In sum, out of the 39 participants, 17 changed their reference frame after the switch, which indicates a remapping into external or trunk-centered reference frames, while 22 maintained their initial reference frame, which indicates a remapping anchored to the anatomy of the inner arm. In part three, all 20 participants' responses passed the FDR correction, as they all reverted to the preferences selected in part one (Fig. 3a). 10 participants adopted a distal reference frame in the Flexion condition. In the Extension condition, 8 participants adopted the proximal reference frame and the other two the distal one. These individual preferences show that the insignificant group result for the Extension condition in part three (Fig. 2d) does not stem from participants' inability to select and hold a reference frame. To further investigate the process of changing or maintaining a reference frame, we conducted a post hoc analysis comparing the decision times of participants who changed their reference frame after switching (external or trunk-centered, n = 7) and those who maintained their previous reference frame (arm-centered, n = 13). We observed that participants who changed their reference frames had shorter overall decision times (Fig. 3b). To examine the differences between the groups, we conducted an ANCOVA on the regression lines' coefficients. The difference in slopes was not significant between the two groups (df = 1, F = 2.01, p = 0.16), suggesting a similar learning curve for both groups.
However, the intercept for participants who maintained their reference frame was significantly higher than that for those who changed reference frames (n = 72, p = 0.0001). The current study investigated the role of proprioception in tactile remapping by measuring the effects of arm posture on reference frame selections, and the extent to which they are affected by switching postures. We found that participants' initial selection of reference frames was highly dependent on their posture and was not anchored to a specific anatomical location on the inner arm, such as the wrist or the elbow. The remapping destination was another body part, such as the trunk or face, reflecting an egocentric reference frame, or an anchor in the external environment, reflecting an allocentric reference frame. After switching postures, about half of the participants maintained, as expected, a reference frame centered on the trunk or on the external environment by reversing their perception of the vertical axis. In contrast to these predictions, the other half adopted a new reference frame based on the anatomy of the inner arm, which is invariant to changes in the arm's posture. Most importantly, we observed little cost for either adopting a new reference frame or maintaining one. Participants exhibited strong consistency in their responses after changing postures, while their decision times were unaffected and continued to follow a general learning curve trend (save for the first trial after the switch). Although participants' initial reference frame was trunk/face-centered or external, many of our participants switched to a reference frame centered on the anatomy of the inner arm. Moreover, our post hoc analysis hints that participants who chose the inner arm as a reference may exhibit shorter decision times, even in trials that occurred before the switch. The latter result suggests that the differences in decision times between the groups could not be specifically linked to the posture switch, but may be associated with a more general behavior.
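The ANCOVA comparing the two groups' decision-time regressions (similar slopes, different intercepts) can be sketched as an OLS model with a trial-by-group interaction. The data below are synthetic and the variable names illustrative; they only mimic the reported pattern, not the study's data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
trials = np.tile(np.arange(1, 61), 2)
group = np.repeat(["changed", "maintained"], 60)  # reference-frame groups

# Synthetic decision times (s): shared learning slope, higher intercept
# for the group that maintained its reference frame
intercept = np.where(group == "changed", 1.5, 2.5)
rt = intercept - 0.01 * trials + rng.normal(0, 0.1, size=120)
df = pd.DataFrame({"rt": rt, "trial": trials, "group": group})

# Full model: the trial:group term tests for a difference in slopes
full = smf.ols("rt ~ trial * group", data=df).fit()
slope_diff_p = full.pvalues["trial:group[T.maintained]"]

# With a non-significant interaction, drop it and compare intercepts
reduced = smf.ols("rt ~ trial + group", data=df).fit()
intercept_gap = reduced.params["group[T.maintained]"]
print(f"slope-difference p = {slope_diff_p:.3f}, "
      f"intercept gap = {intercept_gap:.2f} s")
```

Reporting the interaction test first and the intercept contrast second mirrors the two-step comparison in the text: equal learning rates, but an overall offset between the groups.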
The small and uneven numbers of subjects per group and the lack of differences in learning rate between them call for a more direct investigation in future studies. Considering the weighting scheme model of sensorimotor contingencies 2 in the context of our findings, the ability to select multiple reference frames with little cognitive cost follows an extreme instance where all options are weighted equally. It is possible that while the initial choice of reference frames is implicit, switching posture in the absence of vision forces participants to make an explicit choice. This is further corroborated by the results from the second posture switch, where all participants maintained their previous reference frame, affirming an explicit choice. Taken together, our results show that top-down modulation can easily nullify low-order proprioceptive cues when choosing between reference frames, and that previously stored representations can be abstracted from current sensory inputs and spatial tasks. We suggest that the lack of visual inputs is the primary reason for this behavior. Vision is essential in forming body representations and has been widely reported as dominant over competing inputs from other modalities 2,19,25,26. For example, crossing effects in temporal order judgments are substantially decreased when participants are blindfolded, but the same manipulation has little effect on the weighting of different reference frames in congenitally blind adults 27-29. Moreover, vision and touch share a combined multisensory object representation, which is formed by inputs from both modalities 30. Visual cues thus act both as facilitators of body representations and as a disturbance to maintaining a stored, tactually derived body representation. As our participants were blindfolded, vision could not override the changes in proprioceptive signals, revealing the contribution of proprioception to the coordinate transformation process.
Proprioception is a unique sensory modality, and though much is known about its physiology, it remains somewhat esoteric. While vision is an exteroceptive sense identified with a conscious sensation, proprioception is an interoceptive sense that, for the most part, is not consciously perceived 24. It is a perception of the self that results from motor actions taken and initiated by the self, and can thus be predicted. As such, the sensory consequences of arm movement could be anticipated and might interfere less with higher body representations. In conclusion, the present study demonstrates that top-down modulations can nullify new proprioceptive information during the process of tactile remapping, ultimately confirming that the weight of proprioceptive information during spatial tasks is considerably weaker than that of visual information. We conducted two separate experiments of image localization, focusing on the perceived elevation (up-down) coordinate of the presented stimuli. Blindfolded participants were fitted with the device on their dominant hand and received a short introduction about the device and the experimental process, followed by two introductory pre-test stimuli. Importantly, no information was given on the way the algorithm conveys information on the y-axis. Participants had to report the orientation or spatial position of the stimuli. For horizontal line stimuli (Fig. 1b), the question was "Is the stimulus located on the upper or lower part of the image?", and for diagonal line stimuli (Fig. 1b), "Does the stimulus have a downward or upward slope?". The experimenter did not provide any feedback on participants' responses. After each posture switch, participants performed another block of trials with the same task. To reduce implicit biases, participants were told that switching arm postures was necessary to eliminate fatigue (Fig. 1a). In experiment 2, participants were asked to respond with keyboard strokes as soon as possible, using their non-stimulated hand.
To minimize bias towards the keys' physical locations, participants were randomly assigned to different keystroke combinations. In part one, 20 participants were randomly assigned to the Extension (n = 10) or Flexion (n = 10) posture. Participants were asked to switch their postures twice (Fig. 1a), at the end of part one and at the end of part two. Self-initiated responses resulted in shorter block durations (7:47, CI [7:13, 8:23] minutes) in comparison to the first experiment.

References
1. Touch and the body
2. Tactile remapping: From coordinate transformation to integration in sensorimotor processing
3. Mental rotation of tactile stimuli
4. Perceived spatial organization of cutaneous patterns on surfaces of the human body in various positions
5. Reference frames for coding touch location depend on the task
6. Integration of anatomical and external response mappings explains crossing effects in tactile localization: A probabilistic modeling approach
7. Systems of spatial reference in human memory
8. From maps to form to space: Touch and the body schema
9. How our body influences our perception of the world
10. Cognition overrides orientation dependence in tactile viewpoint selection
11. Sensorimotor alignment effects in the learning environment and in novel environments
12. Implicit and explicit body representations
13. Flexibly weighted integration of tactile reference frames (Neuropsychologia)
14. Importance of head axes in perception of cutaneous patterns drawn on vertical body surfaces
15. Taking someone else's spatial perspective: Natural stance or effortful decentring?
16. Towards explaining spatial touch perception: Weighted integration of multiple location codes
17. Dynamic tuning of tactile localization to body posture
18. Response demands and blindfolding in the crossed-hands deficit: An exploration of reference frame conflict
19. The effects of immediate vision on implicit hand maps
20. Where you look can influence haptic object recognition (Attention, Perception, & Psychophysics)
21. Touch used to guide action is partially coded in a visual reference frame
22. The "EyeCane", a new electronic travel aid for the blind: Technology, behavior & swift learning
23. Navigation using sensory substitution in real and virtual mazes
24. The proprioceptive senses: Their roles in signaling body shape, body position and movement, and muscle force
25. Alleviating the 'crossed-hands' deficit by seeing uncrossed rubber hands
26. Multisensory brain mechanisms of bodily self-consciousness
27. Early vision impairs tactile perception in the blind
28. How visual experience impacts the internal and external spatial mapping of sensorimotor functions
29. Disentangling the external reference frames relevant to tactile localization
30. Vision and touch: Multiple or multisensory