title: Relationships Between Body Postures and Collaborative Learning States in an Augmented Reality Study
authors: Radu, Iulian; Tu, Ethan; Schneider, Bertrand
date: 2020-06-10
journal: Artificial Intelligence in Education
DOI: 10.1007/978-3-030-52240-7_47

In this paper we explore how Kinect body posture sensors can be used to detect group collaboration and learning, in the context of dyads using an augmented reality system. We leverage data collected during a study (N = 60 dyads) where participant pairs learned about electromagnetism. Using unsupervised machine learning methods on Kinect body posture sensor data, we contribute a set of dyad states associated with collaboration quality, attitudes toward physics, and learning gains.

Body postures and gestures are nonverbal communication channels that have been shown to reveal valuable information about learners' internal states, such as their attitudes towards a learning activity [1], misconceptions [2], and comfort with collaborators [3, 4]. Additionally, when students collaborate with other students or teachers, the amount of synchronization between their gestures and postures has been linked to collaborative learning dimensions such as affect [5], learning gains [1], and quality of collaboration [6, 7]. In studies involving teachers and students, body synchrony has been linked to increased learning gains [5, 10]. However, in some situations body synchrony is negatively correlated with learning. Abney et al. [11] observed dyad movement using computer vision algorithms and found that synchrony was negatively correlated with learning. Another study [12], which examined Kinect dyad movements, found that body synchronization had no overall effect on any collaborative or learning measures, but that learning gains were correlated with cycles of "cognition and action", where dyads alternated between reflecting on the activity and interacting with the system. These conflicting results indicate that further research is needed to understand the links between posture and collaborative learning. The traditional method for such research is qualitative coding of video data, which requires a large time investment in manual coding. Over the last decade, researchers have been investigating how automated methods can be used to detect body postures and their links to student attitudes and learning [8]. In this paper we expand this research by contributing new methods for analyzing body posture data from Kinect sensors, and new understanding of the relationships between posture synchronization and collaborative learning.

The goal of this paper is to determine whether static postures of paired participants can be used as indicators of group learning, attitudes and collaboration. We perform this investigation in the context of an augmented reality (AR) experience. Decreasing costs and advanced body tracking technology make AR popular for educational use [15], and it is valuable to understand user behaviors in this context. We use data from a previous study where 60 dyads interacted with a homemade speaker system, a common activity in learning physics. Dynamic representations of the speaker's electromagnetic concepts are visualized through the AR headset (Fig. 1). We collected several dependent measures of collaboration, attitudes and learning gains. For this analysis, all variables were measured at the group level.
Collaboration was measured using a validated rating scheme described by Meier, Spada and Rummel [20], which rates collaborative processes on subdimensions such as coordination (i.e., whether participants divided tasks and managed time), information processing (i.e., whether participants shared information and reached consensus), etc. Attitudes towards the user experience were measured using the survey instrument in [21], which measures perception of aesthetics, endurability, focus, novelty, involvement and usability. Learning was calculated as relative learning gains (RLG), which measure the amount of knowledge gained between pre- and post-tests of electromagnetism knowledge. Relative learning gains were calculated on the overall test score, as well as on specific subdimensions such as the ability to answer transfer questions.

These dependent measures were correlated with dyad participant postures, calculated from data collected by a Microsoft Kinect sensor and by the Microsoft HoloLens headsets worn by participants. Through these sensors we collected joint coordinates and gaze data from both participants, and calculated dyad posture metrics such as closeness between participants (which may signal how comfortable participants feel with each other), similarity between spine angles (which may indicate that participants mirror each other's posture), orientation towards peers (which may indicate focus on discussion), and forward lean (possibly indicating engagement with the task).

Participants were recruited from the study pool of a laboratory at a university in the northeastern United States. Participation required subjects to not know each other, have no significant prior physics knowledge, be born on or after 1976, speak English fluently, have at least a bachelor's degree, and wear no bifocal glasses. All participants first individually completed a pre-test, then a 30-min paired activity of answering worksheet questions while interacting with the apparatus, followed by an individual post-test. Only data from the paired activity was used for analysis. After data cleaning, the resulting dataset contains 50 dyad sessions: 25 sessions with the AR visualizations and 25 sessions without.

Prior to calculating the posture metrics, the Kinect data was preprocessed to remove noise and to disambiguate between the seated participants and the researcher. We then explored k-means posture clustering using the "elbow method", testing combinations of clustering variables and numbers of clusters k = 2, 3, 4, 5. The optimal configuration involved k = 4 clusters and the variables of spine synchrony, mean distance between participants, and discussion orientation (Fig. 2, left). Figure 2 (right) lists the significant correlations found between the time spent in each cluster and the measures, and Fig. 3 shows the video frames at the data points that most closely represent each cluster.

Cluster 0, which we labeled "Turn Takers", is characterized by low spine similarity and is positively correlated with coordination and overall collaboration. Figure 3 (top left) shows one participant leaning forward and interacting with the setup while the other watches. This configuration suggests that low spine synchrony may be indicative of a collaboration style where participants take turns interacting with the setup. This is supported by research in [12], where cycles of leaning forward and backward, interpreted as cycles of reflection and action, were found across successful dyads.
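To make the posture features and clustering step concrete, the sketch below illustrates one way such dyad metrics could be computed from Kinect skeleton frames and clustered with k-means under the elbow method. It is a minimal illustration, not the authors' implementation: the joint names (spine_base, spine_shoulder, shoulder_left, shoulder_right), the frame format, and the exact feature formulas are assumptions made for clarity.

```python
# Illustrative sketch (assumed, not the authors' pipeline): dyad posture metrics
# from Kinect joint coordinates, followed by elbow-method k-means clustering.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

def spine_angle(frame):
    """Forward/backward lean: angle (radians) of the spine relative to vertical."""
    spine = frame["spine_shoulder"] - frame["spine_base"]   # assumed joint names
    vertical = np.array([0.0, 1.0, 0.0])
    cos = np.dot(spine, vertical) / np.linalg.norm(spine)
    return np.arccos(np.clip(cos, -1.0, 1.0))

def dyad_features(frame_a, frame_b):
    """Per-frame dyad features: distance, spine-angle similarity, orientation."""
    # Closeness: distance between the two participants' spine-base joints
    distance = np.linalg.norm(frame_a["spine_base"] - frame_b["spine_base"])
    # Spine synchrony: negative absolute difference of lean angles (proxy)
    spine_similarity = -abs(spine_angle(frame_a) - spine_angle(frame_b))
    # Discussion orientation: alignment of the two shoulder lines (proxy for
    # facing the shared task vs. facing each other)
    shoulders_a = frame_a["shoulder_right"] - frame_a["shoulder_left"]
    shoulders_b = frame_b["shoulder_right"] - frame_b["shoulder_left"]
    orientation = np.dot(shoulders_a, shoulders_b) / (
        np.linalg.norm(shoulders_a) * np.linalg.norm(shoulders_b))
    return [distance, spine_similarity, orientation]

def cluster_postures(frames_a, frames_b, candidate_k=(2, 3, 4, 5)):
    """Standardize per-frame dyad features and inspect inertias for the elbow."""
    X = StandardScaler().fit_transform(
        np.array([dyad_features(a, b) for a, b in zip(frames_a, frames_b)]))
    inertias = {k: KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).inertia_
                for k in candidate_k}
    # Plotting the inertias reveals the elbow; the study settled on k = 4
    best = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
    return best.labels_, inertias
```

The returned per-frame cluster labels can then be aggregated per dyad (e.g., as the fraction of frames spent in each cluster) before being related to the group-level measures.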
Cluster 1 "Open to Collaboration", is characterized by low distance between participants and participants facing parallel to each other, and is correlated with overall positive attitudes and learning. Figure 3 (top right) shows both participants are sitting close to each other and are engaged in the task in front of them, and left participant in a thinking pose. This configuration appears to show participants highly focused on the task and which would explain a positive correlation with overall attitude and learning. Cluster 2 "Closed to Collaboration", is characterized by high distance between participants and with participants facing each other, and is negatively correlated with overall positive attitudes toward the experience. This clustering configuration seems to be indicative of a more negative experience where participants spend some time facing each other yet remain more distant. The figure above shows a dominant interaction where one participant dominates the activity while the other is sitting back. Cluster 3 "Synchronized Lean", is characterized by high average distance and high spine synchronization, and is negatively correlated with overall coordination. In contrast to Cluster 0, this may indicate the dyad does not spend much time taking turns and that both participants were leaning forward and backward at the same time. In this paper we used unsupervised machine learning methods on body posture sensor data. We detected different posture clusters associated with collaboration and learning, finding these metrics were correlated to dyad posture variables such as spine similarity, distance between peers, and synchronized orientation of participants. We found that when participant spines were not synchronized, the dyad pair tended to show higher levels of coordination. This may indicate that dyads who are good at coordinating tend to take turns, as participants move individually before sharing what they gained from their individual explorations. This result aligns with results from [12] , where iterating between active and passive states was significantly correlated with learning gains (interpreted as cycling through moments of reflection and action). Alternatively, this may indicate participants are individually active at the same time, leading to high levels of individual movement. Additionally, dyads who were physically closer to each other throughout the activity had better overall attitudes toward the collaborative task. Also, participants who spent more time focused on the activity rather than each other had more positive attitudes. One interpretation is that when people are engaged in the activity, they will be highly focused on the task and enjoying each others' interactions; conversely, participants who are bored will turn to each other to talk more. Dyads also communicated better when leaning forward. People who were leaning forward are likely to be more engaged in the activity, and people who are leaning backward are likely to be more disengaged; this is likely to be reflected in their communication. The methodology and findings presented in this paper have larger implications for the learning sciences community, as they can serve to indicate markers of successful and unsuccessful collaborations, possibly applicable to other contexts where dyad pairs are learning through interaction with physical objects, and useful to designing systems that monitor student learning through body posture observations. 
We acknowledge the potential for statistical errors introduced by performing a large number of correlations, a consequence of the exploratory nature of our research.

References
1. Automatic detection of nonverbal behavior predicts learning in dyadic interactions
2. Hooks and shifts: a dialectical study of mediated discovery
3. Disequilibrium in the mind, disharmony in the body
4. Presentation skills estimation based on video and Kinect data analysis
5. Group rapport: posture sharing as a nonverbal indicator
6. Automatically detected nonverbal behavior predicts creativity in collaborating dyads
7. Interpersonal synchrony: a survey of evaluation methods across disciplines
8. Synchrony and cooperation
9. Nonverbal synchrony in psychotherapy: coordinated body movement reflects relationship quality and outcome
10. Moving memories: behavioral synchrony and memory for self and others
11. Movement dynamics reflect a functional role for weak coupling and role structure in dyadic problem solving
12. Unraveling students' interaction around a tangible interface using multimodal learning analytics
13. Joint attention: its origins and role in development
14. Leveraging mobile eye-trackers to capture joint visual attention in co-located collaborative learning groups
15. Augmented reality in education: a meta-review and cross-media analysis
16. Experimenting with electromagnetism using augmented reality: impact on flow student experience and educational effectiveness
17. Creating interactive physics education books with augmented reality
18. Investigating augmented reality support for novice users in circuit prototyping
19. Looking inside the wires: understanding museum visitor learning with an augmented circuit exhibit
20. A rating scheme for assessing the quality of computer-supported collaboration processes
21. Developing and evaluating a reliable measure of user engagement