title: Understanding Collaborative Question Posing During Computational Modeling in Science
authors: Snyder, Caitlin; Hutchins, Nicole M.; Biswas, Gautam; Emara, Mona; Yett, Bernard; Mishra, Shitanshu
date: 2020-06-10
journal: Artificial Intelligence in Education
doi: 10.1007/978-3-030-52240-7_54

Curricular standards in STEM [9] and computer science have emphasized the role of asking questions to support inquiry learning in K-12 education. In this paper, we examine the role of questioning during collaborative computational modeling of scientific processes through discourse analysis, to understand how students grapple with the synergistic application of STEM and computational thinking (CT) to build, test, and evaluate their models. To our knowledge, limited research has targeted a systematic understanding of question posing during computational modeling in science. We aim to develop a better understanding of question posing in support of inquiry and problem solving during model building.

Curricular standards in STEM [9] and computer science (https://k12cs.org/) emphasize the practice of posing questions to aid learning and domain exploration. Developing question posing skills may support critical thinking, which, in turn, impacts student learning and knowledge building [11]. In this paper, we present a systematic analysis of students' collaborative question posing as they work together to build computational models of scientific processes [7]. Our focus is on synergistic processes, i.e., the integration of STEM and CT concepts and practices that students have to develop for successful model building [4, 11, 12]. Our analyses show that transformation questions have an overall greater impact on a group's success in building computational models.

Collaborative dialogue provides a rich opportunity to understand collaborative processes that are contextualized within learning goals. Learner-to-learner questions in collaborative dialogue provide insight into group collaboration characteristics in relation to the constructs of the learning domain. In this work, we categorize naturally occurring learner-to-learner questions into two types: confirmation and transformation [10]. Confirmation questions usually require only a shallow understanding of topics; they are used to clarify information, seek reassurance about an idea, or ask about the location of a system-specific object. Transformation questions require a deeper understanding of domain concepts; such questions challenge actions or statements, pose ideas about domain constructs, or guide the modeling process.

We contextualize students' collaborative question posing in terms of synergistic learning processes. Computational model building of scientific phenomena has been shown to be an effective framework for integrated knowledge construction of STEM and CT concepts and practices [1, 7]. In previous work studying synergistic learning, we identified three essential applications of synergistic processes during computational modeling: initialization, modeling conditional behavior, and debugging [6, 12]. Each process requires students to integrate their conceptual understanding of physics with a computational representation to support model building.
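To make these coding categories concrete for the analyses that follow, here is one minimal way coded transcript data could be represented. This is our illustrative sketch only; the class and field names are assumptions, not artifacts of the original study.

```python
# Hypothetical representation of a coded utterance; the study itself does
# not prescribe any particular data format.
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class QuestionType(Enum):
    CONFIRMATION = "confirmation"      # clarify, seek reassurance, locate objects
    TRANSFORMATION = "transformation"  # challenge, propose ideas, guide modeling

class SynergisticProcess(Enum):
    INITIALIZATION = "initialization"
    CONDITIONAL_BEHAVIOR = "conditional_behavior"
    DEBUGGING = "debugging"

@dataclass
class Utterance:
    group: str                                    # e.g., "Group 5"
    text: str
    question: Optional[QuestionType] = None       # None for non-questions
    process: Optional[SynergisticProcess] = None  # None if not tied to a process
```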
Twenty-six high school sophomores worked in the C2STEM environment [7] in groups, one class day a week for two months, completing a 45-min training unit followed by three kinematics units and one mechanics computational modeling unit. Students were divided into 8 triads and 1 dyad; technical issues resulted in the removal of 2 triads from our analyses. Model scores were computed for all modeling tasks using a pre-defined rubric that evaluated proficiency in CT and physics separately. We focused our analysis on the 2D motion with constant velocity challenge task, in which students were instructed to model a boat crossing a river while stopping at two islands along the way. To be successful in this module, students had to calculate the boat's heading angle and resultant velocity while accounting for the river's current.

Our analysis of students' question posing during computational model building is guided by two research questions: (RQ1) What questioning characteristics linked to CT and physics can we derive from students' discourse, and how do these impact their learning? (RQ2) What characteristics of collaboration in the context of synergistic learning can we derive from students' questions?

Learner-to-learner questions were coded by two coders according to three frameworks: (1) question posing (QP): transformation or confirmation; (2) question answering (QA): self-answered, other-answered, or not answered; and (3) synergistic learning processes (SLP): initialization, modeling conditional behaviors, and debugging. Inter-rater reliability, measured with Cohen's kappa, showed good agreement (κ = 0.73) for all three forms of coding.

Table 1 shows each group's final computational model score and the proportions of total questions to total utterances (Q-to-U), transformation questions to total questions (Trans), and confirmation questions to total questions (Conf). The two highest performing groups, 2 and 5, had a majority of transformation questions, while all other groups had a majority of confirmation questions. This coincides with the findings of [11] that transformation questions require advanced domain knowledge; the majority of confirmation questions in the other groups may indicate a shallow understanding of the domain or CT constructs needed for the modeling tasks.

We further investigated how these questions relate to synergistic learning processes by analyzing the proportion of each question type (confirmation and transformation) posed during each synergistic process (initialization, debugging, and conditional behavior changes) relative to the total number of questions. In previous work [11], we found that the majority of questions during initialization were confirmation questions. In this performance-based analysis, that remains true for all groups except the highest performers, Groups 5 and 2, who asked as many or more transformation questions as confirmation questions: 50% of Group 5's initialization questions and 53% of Group 2's were transformation questions, while the remaining groups had less than 44% transformation questions during initialization. During debugging the groups varied, but it is worth noting that the high performing groups (Groups 5 and 2) and one middle performing group (Group 8) had a majority of transformation questions: 61%, 57%, and 60%, respectively.
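To illustrate the measures just described, the following is a minimal sketch of how the inter-rater agreement and the Table 1 style proportions could be computed from coded transcripts. The data, labels, and function names are toy assumptions, not the study's data or code.

```python
from sklearn.metrics import cohen_kappa_score

# Inter-rater agreement on the question-posing codes: two coders' labels
# over the same utterances (toy labels; the paper reports kappa = 0.73).
coder_a = ["conf", "trans", "trans", "conf", "none", "trans"]
coder_b = ["conf", "trans", "conf", "conf", "none", "trans"]
print(f"kappa = {cohen_kappa_score(coder_a, coder_b):.2f}")

# Table 1 style proportions. Each utterance is a (group, q_type, process)
# tuple, with q_type None for non-questions (toy data).
utterances = [
    ("G5", "trans", "debugging"),
    ("G5", "conf", "initialization"),
    ("G5", None, None),
    ("G5", "trans", "debugging"),
]

def group_proportions(utts, group):
    rows = [(q, p) for g, q, p in utts if g == group]
    questions = [q for q, _ in rows if q is not None]
    trans = sum(q == "trans" for q in questions)
    return {
        "Q-to-U": len(questions) / len(rows),          # questions per utterance
        "Trans": trans / len(questions),               # transformation share
        "Conf": (len(questions) - trans) / len(questions),
    }

def process_trans_share(utts, group, process):
    # Share of transformation questions among a group's questions posed
    # during one synergistic process (e.g., debugging).
    qs = [q for g, q, p in utts if g == group and p == process and q is not None]
    return sum(q == "trans" for q in qs) / len(qs) if qs else float("nan")

print(group_proportions(utterances, "G5"))
print(process_trans_share(utterances, "G5", "debugging"))
```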
We hypothesize that transformation questions during debugging result in higher final model scores because debugging requires advanced domain knowledge: students must analyze the model's behavior against expected behaviors to locate and correct errors. While working on conditional behavior changes, the groups as a whole had a majority of transformation questions (58%). Individual groups varied, but it is worth noting that the three worst performing groups either had a majority of confirmation questions (75% of Group 1's and 67% of Group 6's questions during this process were confirmation questions) or asked no questions at all while modeling conditional behavior changes (Group 7).

Table 2 shows the correlation between different characteristics of students' question posing and their final model scores. Note that we report only the Spearman coefficient, ρ, for transformation questions during each synergistic process, since the proportion calculation makes the coefficient for confirmation questions the negative of the transformation coefficient. Q-to-U does not show a strong correlation, which supports the idea that shallower learning-analytic representations miss important information. The question posing approach (i.e., transformation vs. confirmation questions) shows a high correlation with the final model score, and transformation questions during debugging are highly correlated with the final score (a sketch of this computation follows the concluding paragraph below).

In regards to RQ2, we calculated the proportion of each question type (transformation vs. confirmation) by how it was answered (not answered, self-answered, or answered by others), relative to all questions of that type. Most transformation questions were answered by other students (62%), with 8% self-answered and 31% not answered. We hypothesize that transformation questions were more likely to be answered by other group members (e.g., to build consensus or to counter-challenge), or to go unanswered because of the question's complexity or potential domain misunderstandings among other group members. Most confirmation questions were not answered (50%), with 25% self-answered and 25% answered by others. We hypothesize that confirmation questions more often went unanswered because some confirmation questions are essentially think-aloud statements to which students may not expect a response (e.g., asking about the location of a block that is found shortly after asking).

Our results support previous conclusions [11] and extend them by showing the correlation between transformation questions and the success of a group's final model, as well as by analyzing who responded to the questions posed. The grounding and alignment of AI-enhanced technologies (e.g., designing and using learning analytics dashboards that include feedback on collaborative processes) with learning theories can support the optimization of curriculum design and instruction [5]. This systematic approach demonstrates how an evaluation of key collaborative processes (question posing and group response) can be linked to learning objectives (synergistic learning) to provide comprehensive feedback on possible pedagogical actions. We recognize the limitations of our analysis, including the small sample size, and aim to conduct this systematic analysis with a larger sample and across multiple domains in the future.
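As referenced above, here is a minimal sketch of the Table 2 rank-correlation analysis, assuming the per-group measures have already been aggregated. All numbers are toy placeholders rather than the study's data.

```python
from scipy.stats import spearmanr

# One final model score and one transformation-question proportion per group
# (toy placeholders; seven groups remained after the exclusions above).
model_scores = [14, 20, 11, 18, 9, 13, 16]
trans_share = [0.35, 0.55, 0.30, 0.53, 0.25, 0.40, 0.45]

rho, p_value = spearmanr(trans_share, model_scores)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")

# Because Conf = 1 - Trans for each group, the coefficient for confirmation
# questions is exactly -rho, which is why Table 2 reports only the
# transformation coefficients.
```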
References
1. Learner modeling for adaptive scaffolding in a computational thinking-based science learning environment
2. The Community of Inquiry Theoretical Framework. In: Handbook of Distance Education
3. Self and social regulation of learning during collaborative activities in the classroom: the interplay of individual and group cognition
4. Examining Synergistic Learning of Physics and Computational Thinking Through Collaborative Problem Solving in Computational Modeling
5. Analytics for learning design: a layered framework and tools
6. A design-based approach to a classroom-centered OELE
7. C2STEM: a system for synergistic learning of physics and computational thinking
8. Collaborative group work and individual development of metacognition in the early years
9. NGSS Lead States: Next Generation Science Standards: For States, by States
10. Questions of chemistry
11. Exploring synergistic learning processes through collaborative learner-to-learner questioning
12. Analyzing students' synergistic learning processes in physics and CT by collaborative discourse analysis

Acknowledgements. This research is supported by NSF grant #1640199.