Issues in Science and Technology Librarianship | Winter 2009
DOI: 10.5062/F4N29TWP
Derin Sherman
Professor of Physics
dsherman@cornellcollege.edu
Cornell College
Mount Vernon, Iowa
Required science courses can hold limited interest for some students. In this article, a physics professor and a science librarian describe methods used to engage non-majors in learning about science in a non-threatening way. By evaluating the science on selected web sites, classifying the sites into six categories (valid, speculation, controversial, uninformed, misrepresentation, and invalid), and then comparing findings with others in the class, students simultaneously immerse themselves in the practices of the scientific community and gain critical thinking skills.
It has long been a goal of Information Literacy that students should be able to properly evaluate web sites. The most commonly used criteria are accuracy, authority, objectivity, currency, and coverage. While acknowledging the importance of evaluating web sites, the course Science through Film and Fiction at Cornell College attempts to take evaluation capabilities to another level: evaluating the science found on web sites. Students discover that the hallmark of a scientifically reliable site is that it represents consensus within the scientific community. Thus, scientific validity is ascertained not by evaluating a single web site, but by comparing information from several web sites. One success of this venture was engaging non-majors who may think they have very little interest in science. Cornell College is a small liberal arts college in Mount Vernon, Iowa. Classes are taught on the Block Plan with students taking one course at a time. Courses meet for two hours each morning and afternoon for one month. In this article, a science librarian and a physics professor will share the objectives, description, strategies and outcomes of this instruction.
As described in the 2008-2009 Cornell College Academic Catalog, the course Science through Film and Fiction explores scientific topics found in selected novels and feature films. Through investigating specific concepts and using them as case studies, the students learn about the historical development of science, the process of scientific discovery, and the role of science and technology in society.
The course description proposal justifies using fiction, rather than reputable science documentaries, as the basis for exploring science: "Fiction is, ironically, more 'real' than documentaries because fiction depicts science as an integrated part of the real world, while documentaries focus on only a small piece of the real world...Examples [of science topics] include the relationship between academia and industry in research, genetically modified food, cloning, alien contact, space exploration, surveillance and privacy. This course will provide them [students] with some of the skills needed to think about these types of technologies, ask the right kinds of questions about these technologies, and utilize the tools required to research the technology in more depth."
As a warm-up to the science evaluation assignment, the students watched the movie Twelve Angry Men in the morning class. While watching the movie, they were asked to pay particular attention to how the jury evaluates conflicting pieces of evidence, how it constructs theories explaining the evidence, and how it eventually reaches consensus. Following the movie, students were asked to consider the events in the movie as a simple model of how consensus is achieved in the scientific community.
In the afternoon class, the physics professor and/or the librarian introduced the students to the categories, and the criteria for evaluating each category (see Strategies section), that they would use in their evening assignment (see Assignment section). The class practiced on web sites provided by the professor and librarian. During this session, the first class (18 students) divided into small groups, with each group evaluating a different site; the second class (24 students) evaluated the sites as a whole, with each student contributing to the evaluation of the same site.
The librarian created a course web page to be used during class as well as during homework. On the course page, students were given several methods for evaluating the science, based on asking questions such as: What do the experts say? Who is doing the research? Students were steered toward reputable journals that non-science majors might understand, such as Physics Today, Science News, American Scientist, and Nature. Links to appropriate databases (e.g., Academic Search Premier via EBSCOhost, Science Full Text Select via WilsonWeb, and JSTOR) for finding citations, abstracts, and often full text were readily accessible from the course page. An online verification site, EurekAlert (http://EurekAlert.org) from the American Association for the Advancement of Science, was recommended as a source for exploring press releases from universities, researchers, and scientists on current science topics. Students were encouraged to look for web sites of professional societies related to the topic in question. They were also directed to look at what other web sites link to the site under investigation. Several practice sites, gathered through recommendations of librarian colleagues on the ACRL Science and Technology Section Discussion List, were also included on the course page. Some of these sites were used during the class practice session. A chart of criteria (see Strategies section below), relevant questions, and the professor's assignment rounded out the course page.
The professor explains it this way:
Students are often asked to research scientific topics beyond their range of expertise. This problem of scientific evaluation has become increasingly relevant as more students turn to the Internet to find sources of information.
A primary goal of this course is to learn how to evaluate a scientific paper or web site without adequate training in the relevant scientific field. We began by establishing six categories for evaluating scientific work: valid, speculation, controversial, uninformed, misrepresentation, and invalid.
It is worth noting that these categories are not mutually exclusive, and that a paper's status may change with time as scientific knowledge evolves. A paper that was speculative two decades ago may now be regarded as valid, or it might be controversial, or even just plain wrong. A paper that might have appeared to be valid at the time of publication could have been shown to be fraudulent. This is an important distinction to make: the judgment being cast is not on the factual validity of the paper, but on the relationship between the evidence presented in the paper and the state of scientific knowledge as it exists at the time of judgment. For example, an amateur scientist may produce a factually correct paper, but it may take some time for the scientific community to evaluate it. Thus, the paper might initially be classified as uninformed, although it might later be classified as valid, or even revolutionary.
Here are some web sites for students to evaluate. You may wish to try your hand at evaluating them yourself.
Some of these sites prove quite challenging, but there is a fairly general method for evaluating their validity: compare the statements made on these sites with those made on known, reputable sites. If there is a fair degree of scientific consensus, then the facts stated on each of these sites should be reflected on other sites.
Starting with the LaserStars site, a quick glance at the bottom of the page shows that it was created by a physics professor at the University of Ottawa. This is therefore not the work of an amateur or a crackpot. However, scientists have been known to make mistakes, so we cannot treat the title of "Physics Professor" as a guarantee of scientific validity. Instead, students need to evaluate the scientific statement being made on the site. Fortunately, two key phrases appear at the top of the page that prove useful in evaluating it: "laser action" and "Wolf-Rayet stars". The site's thesis is that laser action is responsible for the spectra of Wolf-Rayet stars. The class can now go to Google and search on these terms to see whether this thesis is accepted by the astrophysics community as a whole. The term "Wolf-Rayet" is likely to be more specific than "laser action", so we began our search with just the first term. Over 200,000 sites refer to Wolf-Rayet stars. One of the first sites that pops up is http://imagine.gsfc.nasa.gov/docs/ask_astro/answers/980603a.html, a known reputable site, since NASA is an organization with an established interest in space and no particular scientific bias when it comes to basic astrophysics. Fortunately, the NASA site is intended for lay people and is written in plain English. And, oddly enough, there is no mention of laser action.
Returning to Google, we tried another search, this time with both terms: "Wolf-Rayet" and "laser action". Only 40 sites appeared, and many of these directly reference the author of the LaserStars web site. Presumably, if most scientists accepted the statement that Wolf-Rayet stars exhibit laser action, these two terms would appear together far more often. As it stands, only about 0.02% of the web sites that mention Wolf-Rayet stars also refer to "laser action." It does not appear that the LaserStars web site accurately depicts the scientific community's perception of Wolf-Rayet stars.
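The arithmetic behind that figure is straightforward. The short Python sketch below, which is ours and not part of the course materials, simply restates the ratio using the approximate hit counts from this example; the function and variable names are introduced only for illustration.

# A minimal sketch of the co-occurrence check described above.
# Hit counts are the approximate figures from the LaserStars example.

def co_occurrence_ratio(hits_for_both_terms, hits_for_broad_term):
    """Fraction of pages mentioning the broad term that also mention the narrower claim."""
    return hits_for_both_terms / hits_for_broad_term

wolf_rayet_hits = 200_000   # pages mentioning "Wolf-Rayet"
both_terms_hits = 40        # pages mentioning both "Wolf-Rayet" and "laser action"

ratio = co_occurrence_ratio(both_terms_hits, wolf_rayet_hits)
print(f"{ratio:.2%} of Wolf-Rayet pages also mention laser action")
# Prints: 0.02% of Wolf-Rayet pages also mention laser action

A low ratio like this does not prove the claim wrong; it simply signals that the claim is not widely echoed across the sites that discuss the broader topic.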
So, how are we to evaluate the LaserStars web site? Some of the comments on the site date back to 1973, so the theory may have been speculative when it was first proposed. Today, this site is controversial at best, but more likely misrepresentation. The site appears to be presenting only one side of a story, and it is hard to believe that the author is unaware of the opposing points of view.
Does this mean that the evidence presented on the LaserStars web site is factually incorrect? Not at all! The conclusion is only that this web site does not appear to reflect the general understanding of the scientific community. The only way to assess factual validity is through detailed study and experimentation. Such a study may take many years, however, and it is beyond the abilities of our students, especially given that their research papers may be due in only a few weeks' time. The goal for our students is to assess how well a specific web site reflects current scientific knowledge, and the LaserStars web site does a poor job in this regard.
After the in-class introduction and practice session, each student was to evaluate the six sites remaining on the list after http://laserstars.org before the next morning class. Each one was to write comments about each site, assign a category, and explain his/her reasons for choosing that category.
In class the next morning, the students divided themselves into groups of four to five to debate their assessments of the web sites and come to consensus within each group. They were encouraged to do their best to persuade others in the group to see things from their point of view. The classroom had computers, so in case of disputes the students could log on and re-examine the evidence. While the groups were meeting, the professor and the librarian circulated, occasionally asking questions but remaining outside of the decision process. When the groups finished their discussions, the class as a whole, led by the professor, discussed each web site and the categories each group had assigned, and then came to a class-wide consensus in much the same way each group had reached its own. They also discussed whether each site would be appropriate to use as a source for a class paper.
During the discussion, search strategies were sometimes compared. For example, one student mentioned that he had only eight hits when he searched on one subject, while others had 86,000 hits. Examining the first student's method revealed that he had entered a string of unrelated words in quotation marks, thus requiring the search engine to find those words together, in that particular order, as an exact phrase. Allowing the students to discover "mistakes" and new strategies themselves brought deeper learning of the concepts.
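To make the distinction concrete, here is a minimal Python sketch of the difference between an unquoted keyword search and a quoted phrase search. The sample sentence and the helper functions are hypothetical, and real search engines are far more sophisticated, but the contrast is the same: an unquoted query only requires every word to appear somewhere on a page, while a quoted query must match the words together, in order.

# Hypothetical page text used only to illustrate the two matching rules.
page = "Wolf-Rayet stars are massive stars that show strong, broad emission lines."

def matches_keywords(page_text, words):
    """Unquoted search: every word must appear somewhere, in any order."""
    text = page_text.lower()
    return all(word.lower() in text for word in words)

def matches_phrase(page_text, phrase):
    """Quoted search: the words must appear together as an exact phrase."""
    return phrase.lower() in page_text.lower()

print(matches_keywords(page, ["massive", "emission", "stars"]))  # True -- all words present
print(matches_phrase(page, "massive emission stars"))            # False -- never in that order

A page that would match the relaxed keyword query fails the phrase query, which is why quoting a string of unrelated words can shrink a result set from tens of thousands of hits to a handful.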
Students gained significant experience in evaluating science. While performing this exercise, they experienced many roles of the scientific community including critical evaluation, peer review, and coming to consensus. They participated in scientific thinking painlessly and discovered how some of their own biases influenced their decisions. They experienced how discussion and further investigation may broaden one's understanding.
"Doing this makes me think I've probably used really bad sites in my past research."
"I found last night's assignment very frustrating. Some of the sites did not fit easily into one category."
"Our group agreed on all of them. I guess we have a very intelligent group."
The approach to having students evaluate the science they encounter on web sites has been evolving for several years. During the current academic year, Science through Film and Fiction has been taught twice. The first time, it was the first course taken by first-year students. The second time, it was the fifth course (the first of the second semester) and was populated primarily by first-year students, but it included several second- through fourth-year students.
Although it is impossible to conclude anything definitively based on only two courses, it was evident that the students for whom this was their first college course had a more difficult time with it, mainly because many had not yet developed critical thinking skills beyond the high school level. Some came to class unprepared. Even with these drawbacks, the discussion was productive and learning took place. There was a noticeable difference in the students' participation and decision making by the second semester. The discussion in class was quite lively; the students were thoughtful, as well as much more thorough in their analyses and in coming to consensus.
Students were sometimes confused about whether they were evaluating the science itself, or the web site. This led to some good discussions about whether, when, and how one could separate the science and the web site. Students who were the most "concrete" thinkers seemed to have the most frustration with the assignment. They expected one and only one correct answer. Those students who had developed beyond that point and were more conceptual thinkers seemed to enjoy the assignment. In both cases, the exercise was useful in stretching the students' ability to think in new ways.
The strategies, exercises, and assignments presented in this article have served several goals very well. Students learned to approach investigation and analysis using scientific strategies. The assignment helped prepare them for their next assignment in the class: to write a paper on a scientific topic using reputable sources. It also helped lay a foundation of critical analysis that will be useful in future college courses as well as in daily life. It was gratifying to watch non-majors lose their fear of approaching science as they practiced these skills. Recognizing that the assignment discussed in this paper is a work in progress, the authors feel that it has developed to a point where it is successful and may be useful to others, either as it stands or as a stimulus for adaptations.