JRTE, 40(4), 407–426

Effectiveness of Personal Interaction in a Learner-Centered Paradigm Distance Education Class Based on Student Satisfaction

Shu-Hui Hsieh Chang and Roger A. Smith
Iowa State University

Abstract

This study examined relationships between students’ perceptions of course-related interaction and their course satisfaction within the learner-centered paradigm in distance education. A Students’ Perceived Interaction Survey (SPIS) instrument was developed to examine nine separate hypotheses about the nature of course-related interaction. A volunteer sample of 855 students from the 949 students enrolled in Computer Science 103—Computer Literacy and Applications at Iowa State University in the fall of 2005 was used. The study employed multiple linear regression and concluded that student-instructor personal interaction, student-student personal interaction, and student-content interaction, along with students’ perceptions of WebCT features and gender, were predictors of course satisfaction. In this study 94% of the participants indicated they were satisfied with the course. No significant relationships were found between student satisfaction and student–teaching assistant (TA) personal interaction, students’ prior partial online distance education experience, students’ prior entirely online distance education experience, or academic year. (Keywords: interaction, learner-centered, student satisfaction, distance education.)

INTRODUCTION

Distance education has become widely used around the world today and is available in a number of forms that reduce the time and space constraints present in traditional classrooms (Verduin & Clark, 1991). Distance education is especially advantageous because it makes learning accessible to students all day, every day, giving them immense control over their own learning schedules. Within this new educational paradigm, virtual classrooms provide students with an environment that allows them to access information conveniently (Ko & Rossen, 2001).

According to Perez’s (2001) research, many students reported that the main disadvantage of distance education was a lack of personal interaction between the instructor and the students. Opportunities for students to meet with their instructor in a face-to-face environment were nonexistent, preventing students from asking questions, engaging in discussions, and exchanging non-verbal cues with the instructor (Perez, 2001). In Rost’s (2000) research regarding distance education, online instructors utilized forms of technology that lacked personal interaction, decreasing the quality of education. Although many studies have considered different variables related to student performance and satisfaction, few studies have examined the relationship between interactivity, the effectiveness of technology used in distance education, and the course satisfaction levels of distance education learners. Concerns about the quality of distance learning can be addressed better if researchers understand how students perceive interaction in virtual classrooms and how technology contributes to their learning.

PROBLEM OF THE STUDY AND HYPOTHESES

This study determined whether there was a relationship between students’ perceptions of how effective course-related interaction was and their level of course satisfaction.
The Students’ Perceived Interaction Survey (SPIS) instrument was developed by the researchers to measure nine variables within the learner-centered paradigm in distance education. These variables included: student-instructor personal interaction, student-teaching assistant (TA) personal interaction, student-student personal interaction, student-content interaction, gender, academic classification, students’ prior experiences with distance education in a partially online class setting, students’ prior experiences with distance education in an entirely online class setting, and students’ perceptions of the effectiveness of particular WebCT features in helping them learn. Based on these variables, nine hypotheses were developed.

PURPOSE OF THE STUDY

This study served three purposes:

1. To identify the relationships between student-instructor personal interaction and course satisfaction, student-TA personal interaction and course satisfaction, student-content interaction and course satisfaction, and student-student personal interaction and course satisfaction.
2. To identify the relationship between students’ perceptions about the effectiveness of WebCT features for their learning and course satisfaction.
3. To identify the relationships between course satisfaction and specific student demographics such as gender, academic classification, and prior distance education course experiences.

LITERATURE REVIEW

In an educational setting, interaction through communication and collaboration is the most central mechanism educators use to encourage students to become active learners. As the distance education system evolves, interactive processes, especially those that imitate the interactive processes in traditional face-to-face classrooms, have been attracting special attention. The insufficient amount of interactive learning opportunities within the online course environment is considered one of the major downsides of distance education (Perez, 2001).

In response to this lack of interaction, educators should examine thoroughly the current status of the distance education field and study the factors that define and influence the current designs and contents of distance education. In a world that constantly develops new technologies, understanding these factors is important to anticipate and modify the newest educational methods to correspond with the newest technologies.

The Internet has become an invaluable asset to distance education because it allows students to learn through various technologies, such as two-way video and computer-mediated communication. This enables students to play an active role in the learning process and provides flexibility and convenience for learners (Willems, 2005). Increased interaction, made possible by utilizing the newer two-way communication technologies, has an immense impact upon distance education.

Inadequate faculty training, lack of knowledge of online course design, and doubt about real-time classroom concepts working in the online environment point to a need for theoretical and empirical research on course design principles for online instructors (McCombs & Vakili, 2005). Furthermore, Barrett, Bower, and Donovan (2007) indicated it is critical for online instructors to shift from the traditional teacher-centered to the learner-centered teaching style.
New Education Model: The Shift to a Learner-Centered Paradigm

Olson and Wisher (2002), in examining 47 studies on Web-based courses in higher education, found many cases where faculty members were not trained adequately in online instructional design. The American Psychological Association addressed this concern and developed 12 learner-centered principles in 1990, then revised the list into 14 learner-centered principles in 1995 (Alexander & Murphy, 1998). McCombs and Whisler (1997) defined the learner-centered paradigm based on these principles:

The perspective that couples a focus on individual learners (their heredity, experiences, perspectives, backgrounds, talents, interests, capacities, and needs) with a focus on learning (the best available knowledge about learning and how it occurs and about teaching practices that are most effective in promoting the highest levels of motivation, learning, and achievement for all learners) (p. 9).

McCombs and Vakili (2005) indicated that online educators should implement these 14 learner-centered psychological principles into curriculum design. These principles included: 1) nature of the learning process, 2) goals of the learning process, 3) construction of knowledge, 4) strategic thinking, 5) thinking about thinking, 6) context of learning, 7) motivational and emotional influences on learning, 8) intrinsic motivation to learn, 9) effects of motivation on effort, 10) developmental influences on learning, 11) social influences on learning, 12) individual differences in learning, 13) learning and diversity, and 14) standards and assessment. This learner-centered dynamic curriculum focuses on the needs of individual learners and provides opportunities to gain expertise as goals and projects progress. In addition, the curriculum provides students the opportunity to learn anytime, anywhere; encourages learning autonomy; assesses students’ backgrounds to understand individual needs; promotes interaction and collaboration with other students; allows students to share their personal needs and interests with others; monitors learning progress and provides feedback; and changes according to students’ needs. McCombs and Vakili concluded that teachers should include learners in decisions about learning processes and respect students’ individual backgrounds and abilities while simultaneously focusing on promoting motivation, overall achievement, and learning.

White (2005) stated that online educators should focus on developing learners and understanding their perspectives on distance education. Miller (2007) reported that students in learner-centered online classrooms produced higher-quality course projects and mastered concepts better than those in non-learner-centered online classrooms. The learner-centered model has become a key component of online distance education, breaking from the traditional teaching model.

Chou (2001) conducted a research study at the University of Hawaii on an upper-level undergraduate course based on learner-centered instructional design and employed constructivist and small-group cooperative learning activities in the curriculum. The study utilized WebCT and other computer-mediated communication systems such as Palace and Active World. Chou identified two elements that affect patterns of interaction, the first being the design of learner-centered online activities.
These activities, which include student-moderated discussion, small-group cooperative learning projects, and constructivist-based instructional activities, were found to enhance interpersonal relationships and increase opportunities for students to share information and build knowledge while collaborating with others. They also allowed students to express their viewpoints and take responsibility for their learning, reducing confusion in the online environment. The second element Chou identified was the technological attributes that enhance social presence and effective communication. Student perceptions of the technological attributes of the course management system affect how frequently they engage in online interaction. In order to promote student learning and interaction, instructors should help students become familiar with the technology at the beginning of the semester. The faster students learn the technological features needed to complete coursework, the faster they can concentrate on learning the course material.

In Chou’s (2001) study, out of a variety of different course management systems, students rated the WebCT chat feature as the most straightforward and reliable. These research results showed that incorporating learner-centered instructional design and constructivist and cooperative activities into distance education enhanced student learning. Well-planned synchronous activities executed through a well-designed and trustworthy course management system can indeed promote student interaction and active learning.

These studies indicate the online course management system is one of the most important elements affecting a student’s learning and satisfaction. Many researchers reported that WebCT helped online educators develop active and effective online courses (Cheng-Chang, 2003; Freeman & Field, 2004; Hutchins, 2001; Kendall, 2001; LeRouge, Blanton, & Kittner, 2002; Robertson & Klotz, 2001; Sabine, 1998; Spiliotopoulos & Carey, 2005). WebCT offers several active tools that can facilitate meaningful interaction between instructors and students, students and teaching assistants, students and students, and students and the course design and content, including discussion forums and chat features (Dabbagh & Schmitt, 1998; McGreal, 1998; Morss, 1999). Maurino (2006) indicated that threaded discussions served as a vehicle for the development of students’ in-depth learning and critical thinking skills. Online discussion activities contributed to students’ course participation, satisfaction, and learning outcomes, and facilitated interaction (Goodell & Yusko, 2005).

Interaction: A Critical Factor in Online Distance Education

Moore (1989) categorized three types of interaction in distance education: student-content, student-instructor, and student-student interactions. Zhao, Lei, Yan, Lai, and Tan (2005) and others agree that personal interaction is the fundamental element that facilitates learning in distance education.

Miller, King, and Doerfert (1996) emphasized that students desire personal contact with their instructors and peers, along with a high-quality level of technology, in the distance education environment. New techniques must be constructed that make time for students to interact, because personal interaction between teachers and students, students and students, and students and course content directly relates to student course satisfaction.
Stavredes (2002) emphasized the importance of interaction by affirming that student achievement and positive attitudes increased as the level of interaction increased.

Gao (2001) investigated the effects of different levels of interaction on the achievement and attitudes of college students in a Web-based learning environment. The results of the study showed that active learning on the part of students directly contributes to their learning outcomes. Gao declared that feedback from instructors helps reinforce the learning material and provides further motivation for students to become even more active in the learning process.

LaPointe and Gunawardena (2004) conducted a research study to understand the relationship between peer interaction and learning outcomes in computer-mediated conferencing. The online courses LaPointe and Gunawardena studied were very diverse; the courses ranged from teaching basic skills to teaching theories and covered many levels of education. Courses for associate, bachelor’s, master’s, and doctoral degrees were all incorporated into the research, all of which were designed using asynchronous online discussions. The final research results indicated peer interaction had a strong direct effect on learning outcomes.

Moreover, human interaction with technology is the primary way students learn in the online environment; therefore, it is crucial for online educators to develop a learning environment that promotes student-instructor, student-content, and student-student personal interactions (Garrison & Cleveland-Innes, 2005). These online courses can bring people all over the world together to discuss course content at the same time, producing an incredible interactive online learning experience. To reach this goal, having a qualified educator who has the ability and knowledge to design effective materials that allow learners to partake in an enriched interactive learning experience is essential (Porter, 1997).

Student Satisfaction

Course satisfaction is a critical component in improving learning achievement in the traditional classroom and the distance education environment. Many researchers have examined the factors that influence student satisfaction in distance education (Freddolino & Sutherland, 2000; Fredericksen, Pickett, Shea, Pelz, & Swan, 2000; Nilles, 2002). Researchers believe student satisfaction, which reflects a student’s attitude toward learning, should be studied and improved by all educators so that students can excel in a distance education setting (Biner, Dean, & Mellinger, 1994).

Moore (2002) stated that social interaction prompted by the instructor and prompt instructor feedback were both linked to students’ satisfaction with the course. The most significant contributor to perceived learning in these online courses was the interaction between the instructor and the students. Students reported that the higher the level of interaction with the instructor or their classmates, the higher the level of learning they achieved in the course.

With the advancement of the Internet, educators have an unmatched opportunity to design and conduct effective distance learning courses filled with helpful features that promote communication and interaction. However, dangers accompany these promises made by ever-improving technology. Educators must understand that utilizing these advanced technologies will not automatically make their distance learning courses more dynamic and interactive.
In fact, more hard work is required of the instructor to effectively adapt these technologies to develop clear, interactive online courses.

As education has advanced, the role of interaction has changed considerably along with the development of pedagogical approaches and methodologies. Even though the degree of interaction varies between traditional and distance settings, research about the implications of interaction for student learning has identified that interaction positively affects students’ abilities to learn. Conversely, lack of interaction makes learning boring and difficult. Therefore, further research focusing on the specific implications of interaction for student learning should increase understanding of how to integrate interaction most effectively in distance education settings to maximize students’ abilities to learn. Because WebCT is one of the most prominent resources utilized in distance education, it is important to examine the effectiveness of WebCT features in incorporating interaction into distance education, the impact of interaction on student learning, and students’ attitudes about learning within the learner-centered paradigm. Furthermore, studies focusing on innovative uses of technology that promote interaction in distance learning would be especially beneficial to teachers. These types of specialized studies expand teachers’ knowledge about the different types of interaction that can occur within the online setting. Because interaction has been defined as a crucial component of the learning process, educators must familiarize themselves with interaction’s impact on the quality of learning, experiment with various approaches to interaction, conduct research exploring the effectiveness of these different types of interaction, and eventually implement their findings in distance education courses so students can reap the benefits of this knowledge.

METHODOLOGY

The methodology developed for this study included the research design, the development of the instrument and the pilot test, the participants’ characteristics, the sampling procedure, and the data collection and analysis techniques.

Research Design

A survey called the Students’ Perceived Interaction Survey (SPIS) was developed for this study. The survey was administered to the participants through WebCT from November 29 to December 7, 2005.

Participant Characteristics

Computer Science 103—Computer Literacy and Applications, at one large Midwestern university, is a one-semester online computer literacy and applications course. In the fall of 2005, 949 students enrolled in the class, and 25 teaching assistants were employed to help grade student homework. These Computer Science 103 students volunteered to participate in this study while taking the course. Freshmen, sophomores, juniors, and seniors with various majors in various colleges participated, along with students of different ethnicities and genders.

Development of the Instrument and Pilot Test

The survey was developed in four phases. In phase one, the original version of the survey was prepared and initial exploratory data were collected. Phase two consisted of a survey review by an expert committee of professors. Phase three involved a pilot test in which 20 Computer Science 103 teaching assistants took the survey, along with 46 Computer Science 103 students. The survey was revised at each phase and finalized in the fourth phase.
Validity and Reliability of the Instrument

To examine the validity and reliability of the Students’ Perceived Interaction Survey (SPIS) instrument for distance education, factor analysis and Cronbach’s alpha tests were conducted. Factor analysis was one of the primary statistical methods used in this research. Using the principal component method, individual factors were extracted from each of the scales. Kaiser’s rule and scree plots were used to determine the number of factors. To justify the factor analysis results, the Kaiser-Meyer-Olkin Measure of Sampling Adequacy (KMO) was examined. To assess internal consistency, the Cronbach’s alpha statistic, based on standardized item scores for a set of unidimensional items, was calculated.

After running the factor analyses for parts 2–6, most of the values of the Kaiser-Meyer-Olkin Measure of Sampling Adequacy (KMO) were greater than 0.8. These results indicated that the factors were well defined and that, if another sample were obtained and the analysis repeated, the resulting factors would very likely be the same (Tabachnick & Fidell, 2001). Most of the Cronbach’s alpha (based on standardized items) scores for the factors were greater than 0.7. A Cronbach’s alpha score greater than 0.7 indicates strong internal consistency of a construct (Cronbach, 1951). These scores indicate how consistently individuals respond to the items within a scale. Table 1 shows the factor analysis and Cronbach’s alpha scores for the six factors found in the SPIS.

Data Collection and Data Analysis

The survey results were analyzed using SPSS 14.0 for Windows. The Univariate General Linear Model procedure and the Linear Regression procedure in SPSS were used to perform a multiple regression analysis to determine the relationship between the independent variables and course satisfaction. Descriptive statistics were calculated for each of the demographic variables: age, gender, race, college classification, and prior distance education experiences.

RESEARCH MODEL AND FINDINGS

To examine the relationship between course satisfaction and the other independent variables, a multiple linear regression model was developed by the researcher; regression analysis was the most appropriate statistical method for these data. The model used a set of continuous and categorical variables to predict course satisfaction. For the categorical independent variables, dummy variables were created.
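As an illustration of this dummy coding, the short sketch below builds gender and academic-year indicators with pandas. The data frame and its column names are hypothetical (the published analysis was carried out in SPSS, not Python), and treating female and senior as the reference categories is an inference from the parameter estimates reported later in Table 4.

```python
# Hypothetical sketch of the dummy coding described above; the SPIS data were
# analyzed in SPSS, so the DataFrame and column names here are illustrative only.
import pandas as pd

survey = pd.DataFrame({
    "gender": ["Male", "Female", "Female", "Male"],
    "year":   ["Freshman", "Sophomore", "Junior", "Senior"],
})

# One indicator column per category, then drop the apparent reference levels
# (female and senior are the categories coded 0 on every dummy in Table 4).
dummies = pd.get_dummies(survey, columns=["gender", "year"]).drop(
    columns=["gender_Female", "year_Senior"]
)

# gender_Male corresponds to Z1; year_Freshman, year_Sophomore, and
# year_Junior correspond to Z21, Z22, and Z23 in the model below.
print(dummies.astype(int))
```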
The model developed was as follows:

Y = β0 + β1X1 + β2X2 + β3X3 + β4X4 + β5X5 + β6X6 + β7X7 + β8Z1 + β9Z21 + β10Z22 + β11Z23 + ε

where
Y = Course satisfaction
X1 = WebCT effectiveness
X2 = Prior partial online experience
X3 = Prior entirely online experience
X4 = Student-TA interaction
X5 = Student-instructor interaction
X6 = Student-student interaction
X7 = Student-content interaction
Z1 = Gender (Male)
Z21 = Year (Freshman)
Z22 = Year (Sophomore)
Z23 = Year (Junior)

Table 1: Factor Analysis and Reliability for the Final SPIS Survey

Part | Variable | # of Items | Questions | KMO | % of Variance | # of Factors | Reliability (Cronbach’s Alpha Based on Standardized Items)
1 | WebCT Features | N/A | Only needs correlation with Part 6 | | | |
2 | Student–TA | 3 | 10–12 | 0.656 | 70.384 | 1 | 0.787
3 | Student–Instructor | 7 | 15–18, 21–23 | 0.836 | 42.093 | 1 | 0.765
4 | Student–Student | 5 | 25–29 | 0.736 | 44.651 | 1 | 0.685
5 | Student–Content | 5 | 42–45, 47 | 0.804 | 54.588 | 1 | 0.786
6 | Course Satisfaction | 6 | 38–41, 46, 48 | 0.821 | 54.901 | 1 | 0.833

Examination of Overall Model

The F test (shown in Table 2) was used to examine the overall multiple regression model. The null hypothesis was H0: βi = 0 for all predictors, and the F statistic was 179.447. The p-value was < 0.001, meaning the model was significant. The R-square value of 0.702 indicated that all the independent variables together predicted 70.2% of the variability of course satisfaction, which was fairly high.

Table 2: Test of Between-Subjects Effects

Source | Type III Sum of Squares | df | Mean Square | F | One-Tailed Significance*
Corrected Model | 285.660 | 11 | 25.969 | 179.447 | < 0.001
Intercept | 1.400 | 1 | 1.400 | 9.671 | 0.001
Gender | 1.289 | 1 | 1.289 | 8.909 | 0.002
Academic Year | 0.497 | 3 | 0.166 | 1.145 | 0.165
WebCT Effectiveness | 2.045 | 1 | 2.045 | 14.133 | < 0.001
Partial Online Experience | 0.001 | 1 | 0.001 | 0.006 | 0.471
Entirely Online Experience | 0.112 | 1 | 0.112 | 0.774 | 0.190
Student–TA | 0.107 | 1 | 0.107 | 0.737 | 0.196
Student–Instructor | 1.103 | 1 | 1.103 | 7.621 | 0.003
Student–Student | 1.958 | 1 | 1.958 | 13.527 | < 0.001
Student–Content | 85.787 | 1 | 85.787 | 592.788 | < 0.001
R² = .702
* One-tailed p-values were obtained by halving the two-tailed p-values from the SPSS output.

The assumptions of this model—independence, normality, and equality of variances—were satisfied. Because students completed the surveys at times that were personally convenient as opposed to a classroom setting, independence can be assumed. The histogram of standardized residuals showed the residuals closely followed a normal distribution. The results of Levene’s Test of Equality of Error Variances (Table 3) indicated the F value was 1.427 and the p-value was 0.191. Therefore, the null hypothesis was not rejected and the model met the equality-of-variance assumption; the error variance of the dependent variable was equal across groups.

VIF (variance inflation factor) was used to assess multicollinearity, which exists when the independent variables correlate with each other. A VIF value above 10 indicates serious multicollinearity, which inflates the standard errors of the regression coefficients. As a result, t tests would not be accurate for testing deviation of the regression coefficients from zero. According to Table 4, the VIF statistics for this model were between 1.084 and 3.372. These statistics did not indicate any multicollinearity problems. Because all the assumptions for multiple regression were satisfied, this model was used to test the research question.
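The fragment below is a minimal sketch, in Python with statsmodels rather than SPSS, of the checks described in this section: the overall F test and R-square, one-tailed t tests obtained by halving the two-tailed p-values, and a VIF for each predictor. The variable names and the randomly generated stand-in data are assumptions made only so the example runs; they do not reproduce the SPIS data set or its results.

```python
# Illustrative sketch only: the published analysis used SPSS, and the data
# below are random stand-ins generated so the example is runnable.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
n = 850

scales = ["webct_effectiveness", "partial_online_exp", "entirely_online_exp",
          "student_ta", "student_instructor", "student_student", "student_content"]
df = pd.DataFrame({name: rng.normal(5, 1, n) for name in scales})
df["gender_male"] = rng.integers(0, 2, n)
year = rng.choice(["freshman", "sophomore", "junior", "senior"], n)
for level in ["freshman", "sophomore", "junior"]:            # senior = reference
    df[f"year_{level}"] = (year == level).astype(int)
df["course_satisfaction"] = 0.76 * df["student_content"] + rng.normal(0, 0.5, n)

predictors = scales + ["gender_male", "year_freshman", "year_sophomore", "year_junior"]
X = sm.add_constant(df[predictors])
model = sm.OLS(df["course_satisfaction"], X).fit()

# Overall model test and variance explained (the analogues of Table 2 and R-square).
print(f"F = {model.fvalue:.3f}, p = {model.f_pvalue:.4f}, R^2 = {model.rsquared:.3f}")

# One-tailed p-values for directional hypotheses: halve the two-tailed values
# (appropriate when the estimate lies in the hypothesized direction).
one_tailed_p = model.pvalues / 2

# Variance inflation factors (column 0 of X is the intercept, so start at 1);
# values above roughly 10 would signal serious multicollinearity.
vif = pd.Series([variance_inflation_factor(X.values, i)
                 for i in range(1, X.shape[1])], index=predictors)
print(pd.DataFrame({"one_tailed_p": one_tailed_p[predictors], "VIF": vif}))
```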
Testing the Null Hypotheses, Findings, and Discussion

Nine hypotheses were tested using the multiple regression model at an alpha level of 0.05 (one-tailed). The multiple regression results took into account the relationships of all variables in the model simultaneously, and thus provided a more accurate measure of how any one independent variable was related to the dependent variable. The regression model estimated the partial slopes between each of the predictor variables and the dependent variable. These estimates differed from the bivariate correlations between these variables, which did not partial out the relationships among the other variables in the model.

The research results demonstrated that student-instructor interaction, student-student interaction, and student-content interaction, along with gender and student perceptions of WebCT features, were predictors of course satisfaction. In this study 94% of the participants indicated they were satisfied with the course. Moore (1989) found that there were three critical types of interaction in distance education: student-instructor, student-student, and student-content, which this study supports. Interaction is considered the key to success in traditional classrooms, as well as in the distance education environment (Fulford & Zhang, 1993). The results of this study strongly support this perspective.

Testing the First Null Hypothesis:

According to the results shown in Table 4 for student-instructor interaction, the p-value for the t test was 0.003, which was less than 0.05. Therefore, the null hypothesis was rejected. The results showed that there was a positive and significant relationship between students’ scores on the student-instructor interaction items in the SPIS instrument for distance education and students’ scores on the course satisfaction items in the SPIS instrument.

Table 3: Levene’s Test of Equality of Error Variances
Dependent Variable: Course Satisfaction

F | df1 | df2 | Significance
1.427 | 7 | 842 | 0.191

Note: This tested the null hypothesis that the error variance of the dependent variable was equal across groups. Design: Intercept + Gender + Academic Year + WebCT Effectiveness + Partial Online Experience + Entirely Online Experience + Student-TA Interaction + Student-Instructor Interaction + Student-Student Interaction + Student-Content Interaction.

Student-Instructor Interaction is a Predictor of Course Satisfaction

Moore and Kearsley (1996) indicated that the instructor is responsible for facilitating student-instructor, student-student, and student-content interactions in the distance education classroom environment. In addition, interaction between the instructor and students greatly impacts students’ perceptions of distance education (Hiltz, 1995). Computer Science 103 presented several opportunities for student-instructor interaction, which contributed to students’ levels of satisfaction with the course. These opportunities included: 1) face-to-face orientation sessions in the first week of the semester, 2) effective communication via WebCT e-mail, 3) synchronous chat sessions to develop interactive communication, 4) access to a frequently updated grade book, 5) constructive feedback about students’ performance, and 6) opportunities to reflect on learning and identify ways to improve performance. In this study 90.4% of the participants stated that they enjoyed the class very much.
The prompt feedback and constructive comments from the instructor increased students’ enjoyment levels and influenced their course satisfaction.

Table 4: Parameter Estimates

Parameter | B | Std. Error | t | One-Tailed Significance* | VIF
Intercept | -0.0567 | 0.153 | -3.579 | 0.002 |
Gender (Male = 0) | 0.0810 | 0.027 | 2.985 | 0.002 | 1.084
Gender (Female = 1) | 0 | . | . | . | .
Year (Freshman = 1) | 0.094 | 0.052 | 1.818 | 0.035 | 3.261
Year (Sophomore = 2) | 0.080 | 0.049 | 1.642 | 0.051 | 3.372
Year (Junior = 3) | 0.075 | 0.053 | 1.417 | 0.079 | 2.609
Year (Senior) | 0 | . | . | . | .
WebCT Effectiveness | 0.117 | 0.031 | 3.759 | < 0.001 | 2.069
Prior Partial Online Experience | -0.001 | 0.010 | -0.075 | 0.471 | 1.288
Prior Entirely Online Experience | 0.015 | 0.017 | 0.880 | 0.190 | 1.141
Student–TA | 0.017 | 0.020 | 0.858 | 0.196 | 1.491
Student–Instructor | 0.105 | 0.038 | 2.761 | 0.003 | 2.205
Student–Student | 0.079 | 0.021 | 3.678 | < 0.001 | 1.466
Student–Content | 0.756 | 0.031 | 24.347 | < 0.001 | 2.107
* One-tailed p-values were obtained by halving the two-tailed p-values from the SPSS output.

Testing the Second Null Hypothesis:

The mean of the student-TA interaction variable was 5.171. According to the results shown in Table 4, the regression coefficient of the student-TA interaction variable was estimated to be 0.017. The corresponding p-value for the t test was 0.196, which was greater than 0.05. Therefore, the null hypothesis was not rejected, suggesting that there was no positive and significant relationship between students’ scores on the student-TA interaction section in the SPIS instrument and their scores on the course satisfaction section in the SPIS instrument. However, several circumstances could explain these results. Computer Science 103 was a large class that consisted of 949 students divided into 25 sections of about 40 students each. A total of 25 section TAs were assigned to grade students’ homework and answer questions about course material. In general, students appreciated the work of the TAs, but students’ opinions about the quality of their own TA varied significantly, potentially affecting students’ perceptions of student-TA interaction. Therefore, compared to other factors such as student-instructor interaction, student-student interaction, student-content interaction, WebCT features, and gender, student-TA interaction was not significant in predicting course satisfaction.

Testing the Third Null Hypothesis:

According to the results for student-student interaction shown in Table 4, the p-value was less than 0.001 for the third hypothesis. Therefore, the null hypothesis was rejected. The results showed that there was a positive and significant relationship between students’ scores on the student-student interaction section in the SPIS instrument for distance education and their scores on the course satisfaction section in the SPIS instrument.

Student-Student Interaction is a Predictor of Course Satisfaction

Students in an online classroom environment often feel isolated because of a lack of interaction with other students. It is crucial for online instructors to develop a curriculum that actively promotes student-student interaction. Several student-student interactions occurred as part of this study that contributed to increasing students’ levels of course satisfaction, namely: 1) constructivist-based hands-on projects and simulation tests, 2) discussion board case study projects, 3) a student homepage design project, and 4) chat sessions.
Students responded positively to these activities; discussion board postings from Computer Science 103 totaled more than 51,000 over the course of the semester. Over 97% of survey participants indicated they appreciated the opportunity to work with partners on the case study projects, and 83.6% indicated they posted at least 60 comments about the work of other groups. Students also appreciated the chat sessions—many participants (90%) in this study indicated that they liked the opportunity to get to know their fellow students in the Computer Science 103 online community.

Testing the Fourth Null Hypothesis:

The results for student-content interaction, shown in Table 4, indicated the p-value for the t test for hypothesis four was less than 0.001. Therefore, the null hypothesis was rejected. The results showed that there was a positive and significant relationship between students’ scores on the student-content interaction section in the SPIS instrument and their scores on the course satisfaction section in the SPIS instrument.

Student-Content Interaction is a Predictor of Course Satisfaction

Several types of student-content interaction contributed to students’ satisfaction with the course. In this study, 96.8% of the participants reported that the Computer Science 103 WebCT course materials were well organized, and 94.2% indicated that they were satisfied with the quality of the streaming lectures. Well-organized course material and streaming lectures can assist student learning, facilitate student-content interaction, and increase learning retention. According to Choi and Johnson (2005), video-based instruction methods provided higher retention rates than traditional text-based instruction. Choi and Johnson’s assertions are supported by the results of this study. Furthermore, the instructor posted simulation projects and many other content-rich course materials in each weekly module for students to learn. Because of the instructor’s extra efforts, 97.1% of the participants indicated that they were satisfied with the content of the course, and 93.2% of the participants responded that they were satisfied with the amount of learning they achieved in the class.

Testing the Fifth Null Hypothesis:

In the results for gender shown in Table 2, the p-value for the t test was 0.002, which was less than 0.05. Therefore, the null hypothesis was rejected, suggesting that the mean score of females was less than the mean score of males on the course satisfaction items in the SPIS instrument for distance education. The mean for males was 5.263, while the mean for females was 5.164. Males were more satisfied than females with the course, although the practical difference is small.

Gender as a Predictor of Course Satisfaction

The results of this study demonstrated that both male and female participants were very satisfied with the course. However, males were slightly more satisfied with the course than females. This online course provided flexibility, social presence, and a cooperative learning community, along with high-quality student-instructor, student-student, and student-content interactions. These components were satisfactory for both male and female students.
However, Pascarella and Terenzini (2005) indicated that men performed better than women in the areas of mathematics and science, and Kearsley (2000) and many others stated that males held more positive attitudes toward computers and technology than females (Furger, 1998; Shashaani, 1994; Spender, 1995; Ullman, 1997). Furthermore, Keinath (1991) indicated that females often felt they did not have enough time to complete everything they wanted, not only in coursework but also in all aspects of life. Because the coursework for Computer Science 103 was demanding, females might have felt they had less time to accomplish the required assignments in the class and were therefore less satisfied than males with the course.

Testing the Sixth Null Hypothesis:

According to the results shown in Table 2, the p-value for the F test related to classification in college was 0.165, which was greater than 0.05. Therefore, the null hypothesis was not rejected. There was no positive relationship between students’ academic classifications and students’ scores on the course satisfaction section in the SPIS instrument for distance education.

Zhang (2005) also found that there was no significant relationship between age and how receptive distance education learners were. However, Lim (2001) found that there was a negative relationship between academic status and course satisfaction. The results of this research are consistent with Zhang’s findings, indicating no significant relationship between academic classification and course satisfaction.

Testing the Seventh Null Hypothesis:

Table 4 shows that the p-value for the t test related to students’ prior experience with partially online distance education was 0.471, which was greater than 0.05. Therefore, the null hypothesis was not rejected. There was no positive relationship between students’ prior experiences with distance education in partially online class settings and their scores on the course satisfaction section in the SPIS instrument. Discussion regarding this hypothesis is closely tied to the next hypothesis and is included in the next section.

Testing the Eighth Null Hypothesis:

According to the results shown in Table 4, the p-value for the t test related to experience with a totally online class was 0.190, which was greater than 0.05. Therefore, the null hypothesis was not rejected. There was no positive relationship between students’ prior distance education experience in an entirely online class and their scores on the course satisfaction section in the SPIS instrument.

Several factors could have contributed to these results. First, the course was well organized, helping students easily find the information they needed. Second, successful orientation sessions may have helped students understand what they needed to do to succeed and made online learning easy and enjoyable. Third, the technologies adopted by the instructor promoted active learning. Fourth, the course instructor maintained a high level of communication with students, helping them stay on task and be more satisfied with the course. All of these factors could help explain why prior distance education experience did not impact students’ course satisfaction.

Testing the Ninth Null Hypothesis:

The mean of the WebCT features variable was 5.055. According to the results shown in Table 4, the p-value was less than 0.001.
Therefore, the null hypothesis was rejected, suggesting that there was a positive and significant relationship between students’ scores on the effectiveness of WebCT features section in the SPIS instrument for distance education and students’ scores on the course satisfaction section in the SPIS.

The instructor adopted several WebCT features that promoted active student learning and increased interaction between students and the instructor, other students, and the course content. The use of these features also built an online learning community. Overall, 97.5% of participants in this study stated that the WebCT features used in this class were easy to learn. The results of this study are consistent with Lai (2004) and others who concluded that effective WebCT tools enhanced the student learning experience (Hutchins, 2001; LeRouge et al., 2002).

CONCLUSION AND RECOMMENDATIONS FOR FURTHER STUDY

As distance education has become a more and more popular educational practice, it is crucial to examine online course quality. For students to learn successfully, teachers must present clear goals and objectives so students do not get frustrated (Porter, 1997). Instructors in the online environment must focus on learners’ needs and plan and execute their lessons clearly and effectively to help students learn the maximum amount of information (Barker & Patrick, 1989; Knowlton, 2000).

There are many ways to promote learner achievement in online class environments, but learner satisfaction is one especially important component of successful distance education courses (Ritchie & Newby, 1989). Some researchers believe student satisfaction should be examined before learning outcomes, because students’ negative opinions can hinder their learning (Biner, Dean, & Mellinger, 1994). Student satisfaction should be taken into account by instructors because attitudes are often indicative of success. Barrett et al. (2007) reported that online instructors need to shift their teaching styles from teacher-centered to learner-centered paradigms in order to facilitate better online learning environments and promote student satisfaction. Based on these research findings, several recommendations have been made regarding how to create a learner-centered online classroom that incorporates effective WebCT features, increases student-instructor interaction, increases student-student interaction, and increases student-content interaction. The results of this research can help educators create a rich distance education environment that encourages students to enjoy what they’re learning and perform well.

These research results showed that student-instructor, student-student, and student-content interactions, as well as gender and WebCT features, are predictors of course satisfaction. The following are suggestions for future research:

1. Investigate whether increased interaction will increase student learning outcomes measured by grades or academic achievement.
2. Replicate this study on a national level for undergraduate students who are taking a similar course using various course management systems.
3. Replicate this study in other courses in other subject areas.
4. Conduct a qualitative research study to investigate students’ perceptions of the relationships between interaction and their course satisfaction.
5. Conduct an experimental study with a control group to measure whether increasing interaction will increase course satisfaction.
One group would require little to no interaction, while another group would be given a sufficient amount of interaction.
6. Conduct the same study on course management platforms other than WebCT.
7. Determine whether the research results concerning gender and preference remain consistent in other subject matter. This course was a computer science course; perhaps a broader subject area would change the results.
8. Determine whether other factors not addressed in this study, such as students’ learning styles and instructors’ teaching styles, affect interaction. Further study is needed in these areas.

Contributors

Shu-Hui Hsieh Chang is the director of distance education and a Blackboard/WebCT senior certified trainer for the Computer Science Department at Iowa State University. She has developed several online courses and has taught Computer Science 103 entirely online with about 1,000 students each semester for the past four years. Her main research interests include computer technologies in education, curriculum and instructional design, and evaluation and assessment in both traditional and distance education settings. Dr. Chang is a member of the university’s Distance Education Council. (Address: 116 Biscayne Street, Port Lavaca, TX 77979; 361.552.4702; shchang@iastate.edu)

Roger A. Smith is a professor in the Department of Educational Leadership and Policy Studies at Iowa State University. His research interests include distance education; student learning styles; issues related to retention and recruitment, especially of community college transfer students; and training needs in industry. Dr. Smith is a member of the university’s Distance Education Council. (Address: N232B Lagomarcino Hall, Iowa State University, Ames, IA 50011; 515.294.7001; rasmith@iastate.edu)

References

Alexander, P. A., & Murphy, P. K. (1998). The research base for APA’s learner-centered psychological principles. In N. Lambert & B. McCombs (Eds.), How students learn: Reforming schools through learner-centered education (pp. 25–60). Washington, DC: APA.

Barker, B., & Patrick, K. (1989). Microcomputer-based teleteaching: A description and case study. Computers in the Schools, 6, 155–164.

Barrett, K., Bower, B. L., & Donovan, N. C. (2007). Teaching styles of community college instructors. The American Journal of Distance Education, 21(1), 37–49.

Biner, P. M., Dean, R. S., & Mellinger, A. E. (1994). Factors underlying distance learner satisfaction with televised college-level courses. The American Journal of Distance Education, 8(1), 60–71.

Cheng-Chang, P. (2003). System use of WebCT in the light of the technology acceptance model: A student perspective. Unpublished doctoral dissertation, University of Central Florida. Dissertation Abstracts International, AAT 3094813.

Choi, H. J., & Johnson, S. D. (2005). The effect of context-based video instruction on learning and motivation in online courses. The American Journal of Distance Education, 19(4), 215–227.

Chou, C. C. (2001). Model of learner-centered computer-mediated interaction for collaborative distance learning. Paper presented at the 24th National Convention of the Association for Educational Communications and Technology. ERIC Document No. ED 470075.

Cronbach, L. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16, 297–334.

Dabbagh, N. H., & Schmitt, J. (1998). Redesigning instruction through web-based course authoring tools. Educational Media International, 35(2), 106–110.
Freddolino, P. P., & Sutherland, C. A. (2000). Assessing the comparability of classroom environments in graduate social work education delivered via interactive instructional television. Journal of Social Work Education, 36(1), 115–129.

Fredericksen, E., Pickett, A., Shea, P., Pelz, W., & Swan, K. (2000). Student satisfaction and perceived learning with online courses: Principles and examples from the SUNY learning network. Journal of Asynchronous Learning Networks, 4(2). Retrieved March 9, 2006, from http://www.aln.org/publications/jaln/v4n2/index.asp

Freeman, A. S., & Field, W. D. (2004). Student perceptions of web-based supplemental instruction. The Journal of Technology Studies, 30(4), 25–31.

Fulford, C. P., & Zhang, S. (1993). Perceptions of interaction: The critical predictor in distance education. The American Journal of Distance Education, 7(3), 8–21.

Furger, R. (1998). Does Jane compute? Preserving our daughters’ place in the cyber revolution. New York: Warner Books.

Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in online learning: Interaction is not enough. The American Journal of Distance Education, 19(3), 133–148.

Gao, T. (2001). The effects of different levels of interaction on the achievement and attitudes of college students in a web-based learning environment. Unpublished doctoral dissertation, Purdue University. Dissertation Abstracts International, AAT 3075664.

Goodell, J., & Yusko, B. (2005). Overcoming barriers to student participation in online discussions. Contemporary Issues in Technology and Teacher Education, 5(1), 77–92.

Hiltz, S. R. (1995). The virtual classroom: Learning without limits via computer networks. Norwood, NJ: Ablex Publishing.

Hutchins, M. H. (2001). Enhancing the business communication course through WebCT. Business Communication Quarterly, 64(3), 87–94.

Kearsley, G. (2000). Online education: Learning and teaching in cyberspace. Belmont, CA: Wadsworth/Thomson Learning.

Keinath, L. L. (1991). The role of human interaction in distance education. Unpublished doctoral dissertation, Wayne State University. Dissertation Abstracts International, AAT 9215108.

Kendall, M. (2001). Teaching online to campus-based students: The experience of using WebCT for the community information module at Manchester Metropolitan University. Education for Information, 19, 325–346.

Knowlton, D. S. (2000). A theoretical framework for the online classroom: A defense and delineation of a student-centered pedagogy. New Directions for Teaching and Learning, 84, 5–13.

Ko, S., & Rossen, S. (2001). Teaching online: A practical guide. Boston: Houghton Mifflin Company.

Lai, H. (2004). Evaluation of WWW on-line courseware usability and tools. Unpublished doctoral dissertation, University of Idaho. Dissertation Abstracts International, AAT 3123848.

LaPointe, K. D., & Gunawardena, C. (2004). Developing, testing and refining of a model to understand the relationship between peer interaction and learning outcomes in computer-mediated conferencing. Distance Education, 25(1), 83–106.

LeRouge, C., Blanton, E., & Kittner, M. (2002). The other semi-virtual team: Using collaborative technologies to facilitate student team projects. In Proceedings of the International Academy for Information Management (IAIM) 17th Annual Conference: International Conference on Informatics Education Research (ICIER), Barcelona, Spain. (ERIC Document Reproduction Service No. ED 481742). Retrieved September 4, 2007, from http://www.eric.ed.gov/ERICWebPortal/custom/portlets/recordDetails/detailmini.jsp?_nfpb=true&_&ERICExtSearch_SearchValue_0=ED481742&ERICExtSearch_SearchType_0=eric_accno&accno=ED481742
Lim, C. K. (2001). Computer self-efficacy, academic self-concept, and other predictors of satisfaction and future participation of adult distance learners. American Journal of Distance Education, 15(2), 41–51.

Maurino, P. S. (2006). Looking for critical thinking in online threaded discussions. Journal of Educational Technology Systems, 35(3), 241–260.

McCombs, B. L., & Vakili, D. (2005). A learner-centered framework for e-learning. Teachers College Record, 107(8), 1582–1600.

McCombs, B. L., & Whisler, J. S. (1997). The learner-centered classroom and school: Strategies for increasing student motivation and achievement. San Francisco: Jossey-Bass.

McGreal, R. (1998). Integrated distributed learning environments (IDLEs) on the internet: A survey. Educational Technology Review, 9, 25–31.

Miller, C. (2007). Enhancing web-based instruction using a person-centered model of instruction. The Quarterly Review of Distance Education, 8(1), 25–34.

Miller, W., King, J., & Doerfert, D. (1996). Evaluating interaction in the distance education setting (Abstract). NACTA Journal, 40(3), 22.

Moore, M. G. (1989). Three types of interaction. The American Journal of Distance Education, 3(2), 1–6.

Moore, G. M. (2002). What does research say about the learners using computer-mediated communication in distance learning. The American Journal of Distance Education, 16(2), 61–64.

Moore, M. G., & Kearsley, G. (1996). Distance education: A systems view. Belmont, CA: Wadsworth Publishing Company.

Morss, D. A. (1999). A study of student perspectives on web-based learning: WebCT in the classroom. Internet Research: Electronic Networking Applications and Policy, 9(5), 393–408.

Nilles, Y. J. (2002). Student achievement and satisfaction with asynchronous and synchronous learning in two horticulture courses. Unpublished master’s thesis, Iowa State University, Ames, Iowa. ISU 2002 N55. Retrieved September 4, 2007, from http://www.lib.iastate.edu:81/ipac20/ipac.jsp?session=1B889P48446X7.22221&menu=search&aspect=basic_search&npp=10&ipp=20&spp=20&profile=parks&ri=&index=.GW&term=Student+achievement+and+satisfaction+with+asynchronous+&x=0&y=0&aspect=basic_search

Olson, M. T., & Wisher, A. R. (2002). The effectiveness of web-based instruction: An initial inquiry. International Review of Research in Open and Distance Learning, 3(2), 1–17. Retrieved August 2007, from http://www.irrodl.org/index.php/irrodl/issue/view/14

Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students: A third decade of research. San Francisco: Jossey-Bass.

Perez Cereijo, M. V. (2001). Factors influencing how students value asynchronous Web-based courses. Unpublished doctoral dissertation, University of North Texas. Dissertation Abstracts International, AAT 9989796.

Porter, L. (1997). Creating the virtual classroom: Distance learning with the Internet. New York: J. Wiley & Sons.

Ritchie, H., & Newby, T. J. (1989). Classroom lecture/discussion via live televised instruction: A comparison of effects on student performance, attitude and interaction. The American Journal of Distance Education, 3(3), 36–45.
Robertson, T., & Klotz, J. (2001). Confronting design problems in developing online courses in higher education. Paper presented at the annual meeting of the Mid-South Educational Research Association, Little Rock, AR, November 14–16, 2001. (ERIC Document Reproduction Service No. ED 459674).

Rost, B. (2000). Interaction analyzed in traditional and satellite-delivered extension education presentations. Journal of Extension, 38(1). Retrieved September 4, 2007, from http://www.joe.org/joe/2000february/rb3.html

Sabine, S. (1998). To integrate your language web tools – CALL WebCT. Paper presented at the Natural Language Processing and Industrial Application (NLP & IA/TAL & AI) conference – Special Accent on Language Learning, Moncton, New Brunswick, Canada. (ERIC Document Reproduction Service No. ED 422899). Retrieved September 4, 2007, from http://www.eric.ed.gov/ERICWebPortal/custom/portlets/recordDetails/detailmini.jsp?_nfpb=true&_&ERICExtSearch_SearchValue_0=ED422899&ERICExtSearch_SearchType_0=eric_accno&accno=ED422899

Shashaani, L. (1994). Gender differences in computer experience and its influence on computer attitudes. Journal of Educational Computing Research, 11(4), 347–367.

Spender, D. (1995). Nattering on the net: Women, power and cyberspace. North Melbourne: Spinifex Press Pty Ltd.

Spiliotopoulos, V., & Carey, S. (2005). Investigating the role of identity in writing using electronic bulletin boards. The Canadian Modern Language Review, 62(1), 87–109.

Stavredes, M. T. (2002). A case study of student experiences using alternative systems of delivery: Face-to-face versus interactive television video and computer-mediated communication. Unpublished doctoral dissertation, University of Minnesota. Dissertation Abstracts International, AAT 3037491.

Tabachnick, G. B., & Fidell, S. L. (2001). Using multivariate statistics. Boston: Allyn & Bacon.

Ullman, E. (1997). Close to the machine: Technophilia and its discontents. San Francisco: City Lights Books.

Verduin, J. R., Jr., & Clark, T. A. (1991). Distance education: The foundations of effective practice. San Francisco: Jossey-Bass.

White, C. (2005). Contribution of distance education to the development of individual learners. Distance Education, 26(2), 165–181.

Willems, J. (2005). Flexible learning: Implications of "when-ever", "where-ever" and "what-ever". Distance Education, 26(3), 429–435.

Zhang, Y. (2005). Distance learning receptivity: Are they ready yet? The Quarterly Review of Distance Education, 6(1), 45–53.

Zhao, Y., Lei, J., Yan, B., Lai, C., & Tan, H. S. (2005). What makes a difference? A practical analysis of research on the effectiveness of distance education. Teachers College Record, 107(8), 1836–1884.