Getting real: the authenticity of remote labs and simulations for science learning

Megan Sauter(a), David H. Uttal(b), David N. Rapp(c), Michael Downing(a) and Kemi Jona(d)*

(a) Department of Psychology, Northwestern University, Evanston, Illinois, USA; (b) Department of Psychology & Education, Northwestern University, Evanston, Illinois, USA; (c) Department of Psychology & Learning Sciences, Northwestern University, Evanston, Illinois, USA; (d) Department of Learning Sciences & Computer Science, Northwestern University, Evanston, Illinois, USA

(Received 12 November 2012; final version received 22 December 2012)

Teachers use remote labs and simulations to augment or even replace hands-on science learning. We compared undergraduate students' experiences with a remote lab and a simulation to investigate beliefs about and learning from the interactions. Although learning occurred in both groups, students were more deeply engaged while performing the remote lab. Remote lab users felt and behaved as though they completed a real scientific experiment. We also examined whether realistic visualizations improved the psychological and learning experiences for each lab. Students who watched live video of the device collecting their data in the remote lab felt most engaged with the task, suggesting that it is the combination of the realistic lab and realistic video that was of the greatest benefit.

Keywords: cognition; Web-based learning; remote lab; simulation; video; science learning

Introduction

Hands-on activities have long played a central role in science education (National Research Council, 2005; Stohr-Hunt, 1996). However, financial and practical constraints can limit access to these activities. Recent technological advances have led to increases in the use of tools that augment (or in some cases replace) hands-on science learning via interaction with a computer (Honey & Hilton, 2011; Scanlon, Colwell, Cooper, & Di Paolo, 2004). Nevertheless, the introduction of computer-based tools into the science laboratory repertoire has elicited significant debate (Ma & Nickerson, 2006). As a means of evaluating the utility of such tools, we focus on unpacking the psychological and learning implications of simulations and remote labs in support of science learning goals. Using psychological presence as our guiding construct, we examine how these two technologies affect student experience with, and learning consequences from, computer-based laboratories.

*Corresponding author. Email: kjona@northwestern.edu

Distance Education, 2013, Vol. 34, No. 1, 37–47. http://dx.doi.org/10.1080/01587919.2013.770431
© 2013 Open and Distance Learning Association of Australia, Inc.

Remote labs

Remote labs are computer-mediated laboratory experiences that allow students to access real experimental devices online (such as oscilloscopes, mass spectrometers, or Geiger counters). Remote labs provide access to scientific experiences that would otherwise be inaccessible, such as when schools lack specific facilities or equipment, with the goal of achieving similar experiences and learning outcomes (Lindsay & Good, 2005).
In contrast, many science education programs rely on simulations, which differ in important ways from remote labs. Simulations do not provide access to real experimental devices but instead simulate data using computational models. A critical issue is whether the use of actual equipment is important in students' learning experiences, or whether simulations will suffice. Does the use of real equipment, as in remote labs, enable students to feel more like they are doing real science, despite accessing the equipment remotely?

A meta-analysis of research studies comparing the efficacy of remote and hands-on labs shows little or no systematic differences in learning outcomes between the two types of experiences (Ma & Nickerson, 2006; Triona & Klahr, 2003). Students experience remote labs as being as effective as hands-on labs (Corter et al., 2004). However, students think that simulations are less effective than remote labs because simulations do not feel as realistic (Scanlon, Colwell, Cooper, & Di Paolo, 2004). The realistic nature of remote labs affords students the opportunity to more directly apply theories learned in the classroom to real-world phenomena. In contrast, using simulations can lead students to overlook the link between theory and application (Lindsay & Good, 2005).

However, other research shows that students do not always think of remote labs as realistic experiences (Corter et al., 2007). The degree to which students derive realism from the experience might be a function of the user interfaces and visualizations used in the lab experiences. For instance, Nedic, Machotka, and Nafalski (2003) noted that many remote labs resemble simulations without giving the user "a feeling of real presence in the laboratory" (p. 1). Using an interface that included controls that resembled actual equipment, as well as photographs and webcams, increased students' preference for working in a remote environment (Nedic & Machotka, 2006).

Presence

We hypothesize that computer-based labs are more effective when their interfaces and visualizations lend a sense of presence to the experience. Presence is the "subjective experience of being in one place or environment, even when one is physically situated in another" (Witmer & Singer, 1998, p. 225) such that "the virtuality of experience is unnoticed" (Lee, 2004, p. 32). Factors that increase the sense of presence include design features, content, and user characteristics (Lombard & Ditton, 1997). We believe one cause of presence, realism, is particularly important for making computer-based labs seem more like authentic science labs. Realism may be especially important because beliefs about the validity and authenticity of the technology may play a bigger role in lab effectiveness than the technology itself (Lindsay & Good, 2005; Ma & Nickerson, 2006; Nedic et al., 2003). Across a range of applications, realism can be improved through design features such as photorealism (Daniel & Meitner, 2001), motion and sound (Heft & Nasar, 2000; Hetherington, Daniel, & Brown, 1993), and display size (Lombard, 1995).

Specifically, in the present research we investigated two factors that may influence students' learning and their sense of presence: (1) the kind of lab that was presented (remote lab or simulation) and (2) the inclusion of visualizations (photographs and videos). Undergraduate students were randomly assigned to complete a physics lesson that was presented either as a remote lab or as a simulation.
Half of the students in each condition saw a video of a scientific device in action; the other half saw only a static photo of the device (see Table 1). We predicted that (1) remote lab users would rate their experiences as being more realistic than would simulation users and (2) seeing a video would better support presence and learning outcomes because it encodes the motion of the device. Our design also allowed us to test the interaction of these two factors.

Method

Participants in the experiment consisted of 123 undergraduate students at Northwestern University (United States). Most participants (N = 83) were first-year students, with the remainder including sophomores (N = 19), juniors (N = 10), and seniors (N = 11). On average, participants had taken one physics class. Of the participants, 13 had no previous physics experience, 56 had taken high school introductory physics, 33 had taken advanced-placement physics in high school, and 21 had taken physics at the college level. Participants were tested individually at a desk in a quiet office at Northwestern University in Evanston, Illinois, using the Firefox browser on an iMac.

The Radioactivity iLab

Our experiment utilized the Radioactivity iLab from http://www.iLabCentral.org, a website that provides access to remote online lab devices around the world (Jona & Vondracek, 2013). The Radioactivity iLab lesson includes a multistep, interactive online application that allows participants to perform an experiment pertaining to radioactivity. In the remote lab condition, participants remotely controlled a Geiger counter to measure radiation from a sample of radioactive strontium-90, with the actual equipment housed at the University of Queensland (Australia). In the simulation condition, participants used an identical Web interface but received simulated data based on computational models of the radioactive decay of strontium-90. The simulated data included randomized error to emulate the sampling error in real Geiger counter data (a simplified sketch of such a model appears below). The learning goal of the lab was for students to observe and infer the inverse square law, which states that the intensity of the radiation from a point source decreases as a function of the square of the distance from that source.

Visualizations

Participants were assigned to see one of two different visualizations: half in each lab condition (remote lab or simulation) could access a photo, and half could access a video. All participants saw the same small image of the Geiger counter on the left side of the screen as they worked through the lab (see Figure 1, upper); however, participants in the video condition could click on this image to see a video of the Geiger counter. Participants assigned to the video in the remote lab condition saw a live webcam feed of the Geiger counter performing their experiment, while participants in the simulation condition saw a recording of a webcam feed performing a similar experiment. Participants assigned to the photo in both lab conditions could click on the image to see a large photo, presented in the same frame size as the video.

Table 1. Experimental conditions.

Lab type      Photo visualization    Video visualization
Remote lab    Remote lab + photo     Remote lab + live webcam
Simulation    Simulation + photo     Simulation + prerecorded video

[Figure 1. Accessing the visualizations in the Radioactivity iLab. When students clicked on the photo in the upper panel, they saw either the video or the photo (lower panel) in a new window.]
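The paper describes, but does not publish, the computational model behind the simulation condition. As a rough illustration only, the following sketch generates simulated Geiger counts by applying the inverse square law (mean counts proportional to 1/d²) and emulating sampling error with Poisson counting noise; the noise model, the rate constant, and all names in the snippet are our assumptions, not the iLab's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def simulated_counts(distance_mm, duration_s, trials, rate_constant=50_000.0):
    """Return simulated Geiger counts for one experimental setting.

    The mean count follows the inverse square law (mean = k * t / d**2),
    and per-trial randomized error is drawn from a Poisson distribution,
    the standard counting-statistics noise model. The rate constant and
    all parameter names here are hypothetical.
    """
    mean = rate_constant * duration_s / distance_mm ** 2
    return rng.poisson(lam=mean, size=trials)

# A design like the average participant's: seven distances, measured for
# about 5 s each, with five trials per distance. Doubling the distance
# should roughly quarter the counts.
for d in (20, 40, 60, 80, 100, 120, 140):
    print(d, simulated_counts(d, duration_s=5, trials=5))
```

Running the sketch shows both regularities students were meant to notice: the systematic inverse-square falloff across distances and the trial-to-trial variability within each distance.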
Procedure

Participants progressed through a structured inquiry process using an online lab journal that contained instructions, readings, and metacognitive prompts (see Figure 1, upper right). The experimenter introduced each phase of the task to the participants but did not assist them further. The inquiry process included multiple phases that guided the participants through the scientific process of conducting an experiment. First, participants accessed short articles about radiation to research their topic. Next, participants wrote a research question to guide their investigation. Participants could design their experiment by choosing the distances at which radiation would be measured, how long each measurement would last, and the number of trials (or repetitions) the experiment would run at those settings. Finally, participants received their data and analyzed it using provided online graphing tools.

Learning task and assessment

Participants were informed they were testing educational software that was designed to promote physics learning. They completed four tasks: a pretest, the lab, a posttest, and an interview. The pretest, lab, and posttest were presented via computer, and the interview was conducted by the experimenter in the same location at Northwestern University. The pretest and posttest questions were designed to determine whether performing the labs improved content knowledge of radiation and the inverse square law. The interview was designed to probe participants' thoughts about science labs and their specific experience with the online lab.

Our assessments fell into the following three categories: the psychological presence of the lab experience, thoughts about science labs, and learning outcomes. All assessments took place during the interview except the learning outcomes measures, which took place during the pretest and posttest. We created reliable coding schemes (kappa > .70 using two coders) for each question to analyze the responses.

Results

Presence

We examined whether lab type (i.e., remote or simulated data) and visual features (e.g., webcam or photo) influenced students' perceptions of the realism of the lab. Presence was indicated by the participants' feelings and attitudes toward the lab and whether they would apply actions normally taken in a hands-on experiment to a computer-based lab. For each of the assessment questions, we coded a "yes" response as 3 points, a "maybe/some doubt" response as 2 points, and a "no" response as 1 point.

The remote lab group rated their experience as more like a real lab. On average, remote lab users (M = 2.54) more strongly believed that they had done a real experiment than simulation users did (M = 2.27), F(1, 117) = 5.14, p < .05. There was also an interaction of lab type by visualization, F(1, 117) = 5.14, p < .05. Simulation users who saw the video reported feeling more like they completed an experiment (M = 2.50) than did simulation users who just saw a photo (M = 2.03), F(1, 58) = 6.22, p < .05, with no analogous difference for remote users (photo: M = 2.62, video: M = 2.50), F(1, 59) = 0.45, ns.

We asked participants to discuss actions normally taken in a hands-on experiment to see if these actions were also applicable to computer-based labs. We asked participants to discuss the variability they observed (or failed to observe) in their data, and most participants noticed such variability (M = 2.78). Neither condition nor visualization type was significant, nor was the interaction, all Fs(1, 117) ≤ 2.41, ns.
However, remote lab users (M = 2.79) tended to expect variability in their data more often than simulation users did (M = 2.54), F(1, 116) = 4.09, p = .053. We asked participants whether they wanted to run the lab again: remote lab users (M = 1.90) responded more positively than simulation users (M = 1.56), F(1, 114) = 4.24, p < .05. Although there was no main effect of visualization type, F(1, 115) = 2.49, p = .12, remote users who saw a video (M = 2.13) were more likely to want to rerun the lab as compared to users who saw a photo (M = 1.64), F(1, 57) = 5.23, p < .05. Simulation users showed no such pattern, F(1, 57) = 0.04, ns. Participants' reasons for wanting to rerun the lab included a desire to confirm or replicate their original data and to try different settings or methods.

After completing the task, we told participants about the other lab type and asked them to compare the simulation and the remote lab. The majority of participants overall preferred the remote lab over the simulation, particularly if they had completed the remote lab, χ²(1, N = 116) = 13.51, p < .01. Few remote lab users preferred the simulation. This preference did not vary as a function of viewing a picture or video within either the remote lab, χ²(1, N = 56) = 0.012, ns, or the simulation, χ²(1, N = 54) = 0.313, ns. This means that the lab type exerted a greater influence on participants' lab preferences than the visual features did.

Thoughts about science labs

We asked participants to compare three lab types (hands-on lab, remote lab, and simulation) to probe their general thoughts and beliefs about science labs. A qualitative analysis of their responses revealed a core theme: participants often discussed the methodologies in terms of the quality of the data that each method produced. We coded their responses and performed Fisher's exact tests to examine whether the frequency of their statements differed across lab and visualization type.

One advantage of remote labs that arose from the data is the idea that computers can regulate the settings and measurements of the device, which may decrease human error compared to hands-on labs. Remote lab users who saw a video considered how their methodology might decrease human error. More remote lab users who saw a video (N = 12) than a photo (N = 3) noted that hands-on data would be prone to human error (Fisher's exact = .02, two-tailed). Remote lab users who viewed the video were also more likely to consider how the remote interface might decrease human error: 10 video viewers mentioned this, in contrast to only two photo viewers (Fisher's exact = .025, two-tailed).

Some participants were wary of simulations, and simulation users who saw a video were especially likely to discuss problems with the methodology and the benefits of other methods. Of the participants, 36% (N = 43) indicated that because simulations did not use new data, they considered the method unscientific or unimportant. More simulation users who saw a video were wary of the simulation methodology (N = 14) than were users who saw a photo (N = 5), Fisher's exact = .025, two-tailed, with no similar effect emerging for remote lab users. Slightly more simulation users than remote lab users mentioned the importance of hands-on methodology in science labs, with 18 simulation users making this statement, compared to nine remote users (Fisher's exact = .08, two-tailed).
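For readers who want to see how comparisons like these are computed, here is a minimal sketch of a Fisher's exact test using scipy. The mentioned counts (12 in the video group vs. 3 in the photo group) come from the results above, but the per-cell group sizes are our assumption (123 participants spread over four cells gives roughly 30 per cell), so the printed p value is illustrative rather than a reproduction of the published value.

```python
from scipy.stats import fisher_exact

# Remote lab users who noted that hands-on data would be prone to human
# error: 12 of the video group vs. 3 of the photo group (from the paper).
# The group totals of 31 and 30 are assumptions for illustration.
video_mentioned, video_total = 12, 31
photo_mentioned, photo_total = 3, 30

table = [
    [video_mentioned, video_total - video_mentioned],  # video: [yes, no]
    [photo_mentioned, photo_total - photo_mentioned],  # photo: [yes, no]
]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, two-tailed p = {p_value:.4f}")
```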
Learning outcomes

We tested whether participants knew more content information after using the labs. Participants were asked specific test questions both before and after the lab activity. Each question was scored using a rubric checked by two coders (kappa > .70).

First, we examined test questions that could be answered using the readings, including (1) What is radiation? (2) What are some different types of radiation? Explain why these types of radiation differ. (3) What is radioactive decay, and how does it work? Participants gave more thorough answers to the question "What is radiation?" after than before the lab, F(1, 119) = 88.62, p < .001. Participants were also better at naming different types of radiation and explaining how they differ after than before the lab, F(1, 112) = 76.43, p < .001. Participants were also better at explaining radioactive decay after than before the lab, F(1, 112) = 61.21, p < .001. However, simulation users performed better on this question than remote lab users, F(1, 111) = 4.70, p < .05. Because participants in the simulation condition in our study felt less like they did an actual experiment, they may have concentrated more on theory than on application (Lindsay & Good, 2005).

We also examined content questions that could be answered by doing the lab, including (1) How can you measure radioactivity? (2) Does the intensity of radiation change over time or distance? If so, explain the relationship(s). Participants were better at explaining how radioactivity was measured after than before the lab, F(1, 118) = 158.77, p < .001. Participants were better at explaining the relation between distance and intensity of radiation after than before the lab, F(1, 118) = 105.06, p < .001. There was a significant interaction between lab and visualization, F(1, 118) = 7.93, p < .01. Within the remote group, participants who saw a video were better at explaining the relation between distance and intensity of radiation than those who only saw a photo, F(1, 58) = 8.38, p < .01, with no comparable difference for simulation users, F(1, 58) = 2.29, ns. By viewing the video, participants saw the relation between distance and radiation in action: the particle counts decreased as the Geiger sensor moved away from the source.

We also examined whether participants engaged with the lab differently depending on lab or visualization type. Participants' experimental designs did not differ according to condition. In the average experiment, participants chose about seven distances (M = 6.74), measured for roughly 5 s at each distance (M = 4.98 s), and ran about five trials (M = 5.41). However, participants in the remote lab condition (especially those who saw a video) wrote better research questions to guide their experiment. We scored their questions on a scale of 0 to 3, with 3 being the most thorough response and 0 being a response unrelated to the experiment. Participants who used the remote lab wrote higher-quality questions (M = 2.53) than did participants who used the simulation (M = 2.05), F(1, 121) = 15.99, p < .01. Additionally, remote lab users who saw a video (M = 2.75) wrote higher-quality questions than did users who saw a photo (M = 2.27), F(1, 60) = 12.04, p < .01, but simulation users did not show this difference (M = 2.09 for photo versus M = 2.00 for video), F(1, 59) = 0.258, ns.
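The main effects and interactions reported throughout the Results come from 2 × 2 between-subjects analyses (lab type × visualization). As a minimal sketch of that style of analysis with statsmodels, using randomly generated stand-in scores since the raw data are not published (every variable name here is our own), one can fit the factorial model and read the F tests off the ANOVA table:

```python
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(seed=1)

# Hypothetical stand-in data: roughly 30 participants per cell of the
# 2 x 2 design, each scored on the 1-3 presence scale used in the paper.
rows = [
    {"lab": lab, "viz": viz, "score": int(rng.integers(1, 4))}
    for lab in ("remote", "simulation")
    for viz in ("photo", "video")
    for _ in range(30)
]
df = pd.DataFrame(rows)

# Two-way between-subjects ANOVA: main effects of lab type and
# visualization plus their interaction, mirroring the F(1, df) tests
# reported in the Results.
model = ols("score ~ C(lab) * C(viz)", data=df).fit()
print(anova_lm(model, typ=2))
```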
Discussion

Overall, remote labs and videos best re-created students' experiences of doing science labs: (1) remote lab users were more likely to feel and behave as though they conducted a real experiment and (2) remote users who watched the video felt most engaged with the task. These findings have important implications for science educators and learning technology developers, as well as researchers who can further expand our knowledge of realistic and engaging distance learning.

Both remote lab and simulation modalities of the Radioactivity iLab were effective at teaching the target science content; however, only the remote lab re-created students' experiences of doing science labs. By grounding the experiment in the real world through the use of real devices, participants felt and behaved as though they conducted a scientific experiment. An important difference between the remote lab and simulation conditions involved beliefs about the data source: remote lab users were able to gather real data from a real device, whereas simulation users were using computationally derived data, which did not feel as realistic or scientifically authentic. The authenticity of the data was important because it encouraged student engagement with the experimental task. Consider that remote lab users, as compared to simulation users, wrote higher-quality research questions and were more likely to want to run the lab again. By creating an authentic online experience grounded in reality, remote labs helped students to engage in scientific inquiry when the necessary empirical equipment was not locally available.

The lab interface and visualizations were important in creating a more realistic experience. Some research has shown that students do not always believe remote labs are realistic (Corter et al., 2007), which may be due in part to the authenticity of the visual features of the labs. The Radioactivity iLab's interface was intended to mimic a student's actual research journal and workspace. Students were required to write a research question to drive their research, design their own experiment, and analyze their own results. These activities relied on the steps involved in a hands-on lab, with the connection to a real, albeit distal, device fostering authenticity. Additionally, participants who watched live video of the device collecting data based on their own experimental design felt most engaged with the task, suggesting that it is the combination of the realistic lab and a realistic video that was of the greatest benefit.

The remote lab and the video on the screen helped support the learning experience. The remote lab created an authentic context and the visual features augmented the experience. The video allowed participants to actually see their experiment being run, which in turn enhanced student engagement with the activity. However, remote lab users and simulation users experienced this engagement differently. Remote lab users seemed more invested in the actual experiment: they crafted better research questions, considered how their experiment limited human error while also evaluating other possible sources of variability in their data, and wanted to run their experiment multiple times. Simulation users seemed invested in the idea of the experiment, but felt constrained by the methodology. The video helped them feel like they did an experiment, but the data from the simulation did not seem authentic enough.
They considered ways that their experiment would be better if the methodology were different, such as running a hands-on lab. Even though the remote lab and the simulation looked the same on the screen, the remote lab's connection to a real device was integral to fostering an engaging and realistic lab experience.

These findings are particularly important given current calls for a greater focus on teaching scientific practices (National Research Council, 2012), for more authentic lab experiences (Chinn & Malhotra, 2002; Singer, Hilton, & Schweingruber, 2005; Sunal, Wright, & Sundberg, 2008), and for assessments of how those experiences can influence students' academic trajectories. In a large-scale survey, Lopatto (2007) found that undergraduate research experiences influenced student motivation in subsequent science courses. Participating in real research leads students to gain confidence and feel like a scientist (Hunter, Laursen, & Seymour, 2006). Because remote online labs feel authentic and are easy for students to access, their implementation could have a positive effect on student experiences and outcomes for science learning. This is especially important for students in schools with limited resources, where a lack of equipment makes science learning less engaging for students and leads to poorer preparation for later science learning. It is also critical for the rapidly growing population of students who are taking science and engineering courses online, where no physical access to laboratory equipment is possible.

Remote labs are an important addition to simulations and computational models in the growing toolbox of learning technologies for science education. The results of this study point to important differences in the affordances of remote labs and simulations for science learning, and to critical elements of the user interface that enhance student engagement across both types of tools. Developers of science curricula used in face-to-face, online, or blended delivery modalities should be attuned to the affordances of each type of learning technology and integrate hands-on work with simulations and remote labs in ways that optimize student learning. Learning technology developers should also adopt interface design principles and software tools that can further improve the realism of their user interfaces in order to benefit from the advantages pointed to by this study.

Finally, the present research may shed light not only on whether various online education environments are effective, but also on how and why. This is particularly germane to science and engineering fields, where learning about, and interacting with, scientific phenomena and the instruments used to study them are not only crucially important but also especially challenging to do at a distance from a physical laboratory. In the future, we plan to continue the approach of linking research on presence with research on learning from remote labs and from simulations. One particularly interesting question is whether it is possible to design simulations that capture the benefits of remote labs in terms of creating a sense of presence. The critical question will not be which learning environment is better, but how to maximize the potential of each to create the most effective learning environments possible. This line of research will be essential in informing the design and development of massive open online courses (MOOCs) and other new forms of online and blended learning.
Acknowledgements

This work is supported in part by the National Science Foundation under grants OCI-0753324 and DUE-0938075. However, any opinions, findings, conclusions, and/or recommendations are those of the investigators and do not necessarily reflect the views of the Foundation. We gratefully acknowledge the University of Queensland, Australia, for providing access to the remote radioactivity lab equipment.

Notes on contributors

Megan Sauter received her PhD in cognitive psychology from Northwestern University in 2011. She now works as a user experience researcher at AnswerLab in San Francisco, California. Megan is passionate about using psychological principles to improve people's experiences with technology.

David Uttal is professor of psychology and education at Northwestern University. His research focuses on cognitive development, particularly the development of spatial and symbolic thinking. He is also active in efforts to enhance students' participation in STEM (science, technology, engineering, and mathematics) through improving their spatial thinking.

David N. Rapp is associate professor at Northwestern University in the School of Education and Social Policy, and in the Department of Psychology. His research focuses on the cognitive mechanisms underlying successful and unsuccessful learning. He is associate editor at the Journal of Educational Psychology and Discourse Processes.

Michael Downing is a research specialist in developmental neuroscience at the University of Chicago, Illinois. His current research examines the physiological components of early cortical development. Previously, he conducted research at Northwestern University examining the psychology of virtual learning.

Kemi Jona is research professor of learning sciences and computer science at Northwestern University, where he leads R&D projects in cyberlearning tools for STEM education. He is also the director of the Office of STEM Education Partnerships at Northwestern University.

References

Chinn, C. A., & Malhotra, B. A. (2002). Epistemologically authentic inquiry in schools: A theoretical framework for evaluating inquiry tasks. Science Education, 86, 175–218. doi:10.1002/sce.10001

Corter, J. E., Nickerson, J. V., Esche, S. K., & Chassapis, C. (2004). Remote versus hands-on labs: A comparative study. In Proceedings of the 34th ASEE/IEEE Frontiers in Education Conference (pp. F1G.17–F1G.21). Savannah, GA: ASEE/IEEE.

Corter, J. E., Nickerson, J. V., Esche, S. K., Chassapis, C., Im, S., & Ma, J. (2007). Constructing reality: A study of remote, hands-on, and simulated laboratories. ACM Transactions on Computer-Human Interaction, 14, 7-1–7-27. doi:10.1145/1275511.1275513

Daniel, T. C., & Meitner, M. M. (2001). Representational validity of landscape visualizations: The effects of graphical realism on perceived scenic beauty of forest vistas. Journal of Environmental Psychology, 21, 61–72. doi:10.1006/jevp.2000.0182

Heft, H., & Nasar, J. L. (2000). Evaluating environmental scenes using dynamic versus static displays. Environment and Behavior, 32, 301–322. doi:10.1177/0013916500323001

Hetherington, J., Daniel, T. C., & Brown, T. C. (1993). Is motion more important than it sounds? The medium of presentation in environment perception research. Journal of Environmental Psychology, 13, 283–291. doi:10.1016/S0272-4944(05)80251-8

Honey, M. A., & Hilton, M. (2011). Learning science through computer games and simulations. Washington, DC: National Academies Press.

Hunter, A., Laursen, S. L., & Seymour, E. (2006). Becoming a scientist: The role of undergraduate research in students' cognitive, personal, and professional development. Science Education, 91, 36–74. doi:10.1002/sce.20173
Jona, K., & Vondracek, M. (2013). A remote radioactivity experiment. The Physics Teacher, 51, 25–27. doi:10.1119/1.4772033

Lee, K. M. (2004). Presence, explicated. Communication Theory, 14, 27–50. doi:10.1093/ct/14.1.27

Lindsay, E. D., & Good, M. C. (2005). Effects of laboratory access modes upon learning outcomes. IEEE Transactions on Education, 48, 619–631. doi:10.1109/TE.2005.852591

Lombard, M. (1995). Direct responses to people on the screen: Television and personal space. Communication Research, 22, 288–324. doi:10.1177/009365095022003002

Lombard, M., & Ditton, T. (1997). At the heart of it all: The concept of presence. Journal of Computer-Mediated Communication, 3(2). Retrieved from http://jcmc.indiana.edu/vol3/issue2/lombard.html

Lopatto, D. (2007). Undergraduate research experiences support science career decisions and active learning. CBE Life Sciences Education, 6, 297–306. doi:10.1187/cbe.07-06-0039

Ma, J., & Nickerson, J. V. (2006). Hands-on, simulated, and remote laboratories: A comparative literature review. ACM Computing Surveys, 38(3), 1–24. doi:10.1145/1132960.1132961

National Research Council. (2005). How students learn: Science in the classroom. Washington, DC: National Academies Press.

National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press.

Nedic, Z., Machotka, J., & Nafalski, A. (2003). Remote laboratories versus virtual and real laboratories. In Proceedings of the 33rd ASEE/IEEE Frontiers in Education Conference (Vol. 1, pp. T3E-1–T3E-6). San Diego, CA: ASEE/IEEE. Retrieved from http://fie-conference.org/fie2003/index.htm

Nedic, Z., & Machotka, J. (2006). The remote laboratory NetLab for teaching engineering courses. Global Journal of Engineering Education, 10, 202–212. Retrieved from http://www.wiete.com.au/journals/GJEE/Publish/index.html

Scanlon, E., Colwell, C., Cooper, M., & Di Paolo, T. (2004). Remote experiments, re-versioning and re-thinking science learning. Computers & Education, 43, 153–163. doi:10.1016/j.compedu.2003.12.010

Singer, S. R., Hilton, M. L., & Schweingruber, H. A. (Eds.). (2005). America's lab report: Investigations in high school science. Washington, DC: National Academies Press.

Stohr-Hunt, P. (1996). An analysis of frequency of hands-on experience and science achievement. Journal of Research in Science Teaching, 33, 101–109. doi:10.1002/(SICI)1098-2736(199601)33:1<101::AID-TEA6>3.0.CO;2-Z

Sunal, D. W., Wright, E., & Sundberg, C. (2008). The impact of the laboratory and technology on K-12 science learning and teaching. Greenwich, CT: Information Age Publishing.

Triona, L. M., & Klahr, D. (2003). Point and click or grab and heft: Comparing the influence of physical and virtual instruction materials on elementary school students' ability to design experiments. Cognition and Instruction, 21, 149–173. doi:10.1207/S1532690XCI2102_02

Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence questionnaire. Presence: Teleoperators and Virtual Environments, 7, 225–240. doi:10.1162/105474698565686