Student perceptions of satisfaction and anxiety in an online doctoral program

Doris U. Bolliger (Department of Professional Studies, University of Wyoming, Laramie, Wyoming, USA)* and Colleen Halupa (Health Education Program, A.T. Still University, Kirksville, Missouri, USA)

(Received 29 March 2011; final version received 11 July 2011)

Distance Education, Vol. 33, No. 1, May 2012, 81–98. http://dx.doi.org/10.1080/01587919.2012.667961

*Corresponding author. Email: dorisbolliger@gmail.com

Eighty-four students in an online health education doctoral program taking the first course in the program over one year (four quarters) were surveyed in regards to their computer, Internet, and online course anxiety, and overall course satisfaction. An 18-item anxiety tool with domains in computer, Internet, and online learning was administered in the first and last weeks of an educational research course to assess for changes in student anxiety. A 24-item satisfaction tool with domains regarding the instructor, technology, setup, interaction, outcomes, and overall satisfaction was administered at the end of the course. Results show a significant negative correlation between anxiety and student satisfaction. Student anxiety levels were in the relatively moderate range; changes in anxiety levels over time were not significant. Participants who felt anxious when using computers or the Internet, or when taking online courses experienced anxiety with other domains.

Keywords: student satisfaction; anxiety; online doctoral students

Online education has experienced an explosive growth rate over the past few years. In fall 2008, 4.6 million students in the USA were enrolled in at least one online course, a 17% increase from the number reported the previous year. The increase was much higher than the overall student body growth (1.2%) in higher education (Allen & Seaman, 2010b). In fall 2009, the number of online students increased by almost 1 million, to 5.6 million (a 21% increase). The growth rate for enrollment in online courses is not expected to drop off in the foreseeable future (Allen & Seaman, 2010a).

When discussing online education, several benefits come to mind. The key aspects are flexibility and convenience (Bolliger, 2003; Rekkedal & Qvist-Eriksen, 2004). Online education allows individuals who cannot participate in residential courses access to higher education (Bower & Kamata, 2000). It opens up potential professions and college programs throughout the country to nontraditional students, as well as traditional students who want to live at home to minimize educational costs. Many nontraditional students prefer online learning over campus-based courses (Halsne & Gatta, 2002).

There are also several drawbacks to the online learning environment, such as a possible lack of face-to-face social interaction, academic and technical skills, motivation, time, limited access to resources, and technical difficulties (Muilenburg & Berge, 2005). Other potential issues are low student performance and satisfaction (Navarro, 2000; Simonson, Smaldino, Albright, & Zvacek, 2009).

The issue of course satisfaction of doctoral students in online courses or programs in general is not well documented. Particularly, not much attention has been paid to the satisfaction of online doctoral students who are working health-care professionals.
The American Association of Colleges of Nursing (2011) reported that the demand for doctoral nursing programs (both research- and practice-based) “increased significantly” in 2010 (para. 1). Projects of the Carnegie Foundation with a focus on doctoral degrees (e.g., the Carnegie Initiative on the Doctorate, the Carnegie Project on the Education Doctorate, or the Preparation for the Professions Program) sparked discussions across the country about the preparation and quality of doctoral students at US universities (Carnegie Foundation for the Advancement of Teaching, n.d.). Subsequent publications such as that by Walker, Golde, Jones, Bueschel, and Hutchings (2008) highlight the importance of this issue.

Although younger students may have grown up using various types of computer technologies, most students in online doctoral programs are nontraditional students; many are between 45 and 60 years of age. According to the Atlantic International University (n.d.), a university that offers online undergraduate and graduate programs, the average age of its bachelor to doctoral degree students is 42 years. These students did not grow up using these types of technologies, and many obtained their undergraduate degrees 25 or more years previously, before online programs were available. In contrast, the median age of individuals awarded doctoral degrees was 33.0 years in 2004–2005 and 32.7 years in 2005–2006 (Snyder & Dillow, 2010). Because this segment of the population typically consists of older individuals who are returning to higher education to obtain a doctoral degree after spending a considerable amount of time in the field, returning graduate students may be less comfortable with the online delivery format than their traditional counterparts.

Theoretical framework

Student satisfaction

Student satisfaction is defined as the learner’s perception of the value of educational experiences in an educational setting (Astin, 1993). Student satisfaction is an important issue and should be considered in the evaluation of course and program effectiveness. It is one of the five pillars of quality in online education, together with learning effectiveness, access, faculty satisfaction, and institutional cost-effectiveness (Sloan Consortium, 2002); however, it is a complex construct as it comprises many factors (Wickersham & McGee, 2008). Student satisfaction is an important concept because it may ultimately lead to higher levels of motivation, engagement, learning, performance, and success (Sahin & Shelley, 2008; Wickersham & McGee, 2008).

Factors associated with student satisfaction in distance learning are flexibility, computer expertise, and usefulness (Sahin & Shelley, 2008). In the online environment several factors, such as instructor behavior, reliable technology, and interactivity, influence student satisfaction (Bolliger & Martindale, 2004; Dennen, Darabi, & Smith, 2007). Student perceptions of task value and self-efficacy, social ability, instructional design issues, and the quality of the delivery system and multimedia instruction are also important elements (Liaw, 2008; Lin, Lin, & Laffey, 2008).

Herbert (2006) investigated the quality of online instruction for undergraduate and graduate students at a medium-sized Midwestern university. He found the most important variable in student satisfaction was responsiveness of the faculty to student needs.
In a sample of 276 students at a large university in northern Taiwan, Shee and Wang (2008) found that the learner interface is the most important aspect of student satisfaction. Furthermore, Liaw (2008) found that self-efficacy is an important dimension of student satisfaction.

Computer-related problems (Frankola, 2001) and failure to understand online media (Herbert, 2006) are two of the primary reasons for adult learners dropping online courses. Anxiety associated with technology can have a negative influence on student performance and satisfaction (Sun, Tsai, Finger, Chen, & Yeh, 2008).

Technological anxiety

In an information age, most learners in postsecondary education institutions need to be computer literate to participate fully and maximize their learning. Basic technology skills include being familiar with operating systems; using word processing, spreadsheets, and databases; working with communication and presentation software programs; and navigating the Internet (Kay, 2008).

Computer anxiety

Anxiety is defined as a conscious fearful emotional state. Computer-related anxiety has been defined as someone being “uneasy, apprehensive, or fearful” about using computers (Igbaria & Parasuraman, 1989, p. 375). Beckers and Schmidt (2001) suggested “computer anxiety is a multidimensional construct” (p. 46) and identified several constructs. These include positive and negative beliefs about computers, insecurity, nervousness, apprehension, fear, intimidation, and hesitation (Beckers & Schmidt, 2001; Heinssen, Glass, & Knight, 1987; Saadé & Kira, 2007).

Negative emotions associated with computer use can affect the overall learning experience. “Frustration, confusion, anger, anxiety, and similar emotional states can affect not only the interaction itself, but also productivity, learning, social relationships, and overall well-being” (Saadé & Kira, 2007, p. 1193). Research has shown learners with high levels of computer anxiety are at a disadvantage compared to individuals with lower anxiety levels (Saadé & Kira, 2007).

Computer anxiety has been reported to be quite high in college students (Rosen & Weil, 1995; Saadé & Kira, 2007). Researchers pointed out that even though the number of individuals who have self-reported computer phobia has decreased, many adult learners are still fearful of computers. Rosen and Weil (1995) reported three anxiety factors for undergraduate college students in the USA: interactive computer learning anxiety, consumer technology anxiety, and passive computer learning anxiety. They noted that almost 32% of students reported anxiety about a variety of computer applications, including taking online courses and computer-scored tests. Conrad and Munro (2008) noted negative user attitudes can be caused by computer anxiety.

Igbaria and Parasuraman (1989) developed a conceptual model of computer anxiety and attitudes toward computers that included demographics, personality traits, and cognitive styles. They pointed out that computer anxiety influences individuals’ attitudes toward computers. Kay (2008), who investigated the relationship between several types of emotions and the acquisition of computer knowledge, found an inverse relationship between anger and anxiety and gains in computer knowledge.

Internet anxiety

Online learners need to have the skills to be successful in the online environment. They need to have not only computer skills but also proper skills to navigate the Internet and utilize appropriate resources.
Students have different levels of computer literacy, including Internet navigation skills (Montelpare & Williams, 2000), and some may “find the Internet confusing and intimidating” (Simonson et al., 2009, p. 235). Researchers reported that online learners encountered confusing instructor feedback and instructions on course Web sites (Hara & Kling, 2001), which caused anxiety.

Experts have found there is a positive correlation between anxiety and negative attitudes toward technology (Conrad & Munro, 2008; Rosen, Sears, & Weil, 1987). Halupa (2004) examined attitudes toward computers, the Internet, and online learning, and reported that all respondents had a slightly positive attitude toward these three factors, and that age had a significant effect on attitudes toward the Internet: older members demonstrated a less positive attitude toward the Internet. However, age was not a significant factor in attitudes toward computers or online learning.

Online course anxiety

Because the number of students who take online courses in higher education has increased dramatically over the past few years, computer-related anxiety remains an important issue to educators (Saadé & Kira, 2007). Most online learners have to utilize computers, the Internet, and other software programs such as course management systems. Hara and Kling (2001) reported students experienced several types of distress in an online course: anxiety, frustration, and confusion. Many instructors expect students to interact frequently with online content and with one another using information and communication technologies. For these students, the fear of computer technologies may be compounded. Frankola (2001) and Diaz (2002) found 20–50% of undergraduate students were not retained in traditional classes; Diaz estimated online student attrition was another 10–20%. Anxiety and dissatisfaction with online learning contribute to this problem.

Methods

Purpose

There is a gap in the literature concerning student anxiety and student satisfaction in online doctoral programs. The purpose of this study was to determine doctoral students’ technological anxiety and their satisfaction in the online environment. The research questions were:

(1) How satisfied are doctoral students with a required writing-intensive research-design course that is delivered online?
(2) How much technological anxiety do participants exhibit in the online environment at the beginning of the course?
(3) Will participants’ anxiety change during the semester, and if so, how?
(4) Is there a correlation between technological anxiety and student satisfaction?
(5) Do participants who feel anxiety in one domain tend to feel anxiety in other domains? Do participants who feel satisfied in one domain tend to feel satisfied in other domains?

Setting and sample

The study was conducted at a small, accredited university in the USA. The university offers online graduate degree programs and professional certificates in a variety of health-care fields and has five colleges at two locations in the USA. The institution’s enrollment in spring 2009 exceeded 3400 students; most were female (60%) and approximately 17% were classified as under-represented minorities. Most students have various health-care backgrounds, including medicine and osteopathy, allied health, mental health, and nursing, while some have a secondary education background.
One hundred and three students were enrolled in one graduate-level online course over the course of four quarters during one academic year. Of these, 84 students (81.6%) completed the course.

The four-credit-hour course is taught every quarter and is graded on a letter-grade basis. It is a required introductory research-design course for admitted doctoral students in the health education field. In this writing-intensive course, students need to develop a solid draft of their applied research dissertation proposal that is shared with their dissertation committee members. Course requirements include the selection of committee members, completion of weekly assignments (e.g., topic selection, annotated bibliography, research topic description, literature review), and timely participation in at least one discussion thread. Students work individually on assignments, share their submissions with peers, and receive formative feedback from peers and the instructor at several stages. The instructor was a seasoned professional who had taught online graduate-level courses for over six years and had extensive experience in supervising doctoral-level research; she had also previously served as program chair.

Instruments

The researchers developed two instruments to collect data from participants in order to investigate student satisfaction and anxiety with their online courses. The student satisfaction questionnaire was based on Bolliger and Martindale (2004) and has 24 five-point Likert-scale questions ranging from 1 (strongly disagree) to 5 (strongly agree) that address the following student satisfaction elements: (a) instructor, (b) technology, (c) course setup, (d) interaction, (e) outcomes, and (f) overall satisfaction. These items were derived directly from the literature addressing elements integral to student satisfaction in online environments (Bolliger & Martindale, 2004; Herbert, 2006; Liaw, 2008; Lin et al., 2008; Sahin & Shelley, 2008; Shee & Wang, 2008). The instrument also included three open-ended and seven demographic questions.

The instrument was piloted prior to the data collection phase and had an internal reliability coefficient of .92. Cronbach’s alpha coefficients for the subscales were acceptable (ranging from .69 to .86) with the exception of the technology subscale (α = .53). All students in two courses (a total of 67) completed the questionnaire. After the pilot testing, some items were slightly modified based on the feedback of respondents in order to improve their clarity.

The researchers developed a course anxiety scale based on constructs defined in the literature (Beckers & Schmidt, 2001; Heinssen et al., 1987; Kay, 2008; Saadé & Kira, 2007). Three elements were included: (a) computers, (b) the Internet, and (c) online courses. Constructs included insecurity/confidence, anxiety, relaxation/nervousness, excitement/apprehension, enjoyment/aversion/fear, intimidation, confusion, and empowerment. The instrument included 18 five-point Likert-scale items ranging from 1 (strongly disagree) to 5 (strongly agree).

Cronbach’s alpha coefficient was used to determine the instruments’ internal reliability after the data collection phase had concluded. The student satisfaction questionnaire’s reliability was high (α = .91), and the reliability of all subscales was acceptable: (a) instructor (α = .82), (b) technology (α = .76), (c) course setup (α = .60), (d) interaction (α = .60), (e) outcomes (α = .72), and (f) overall satisfaction (α = .85).
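The coefficients reported here are Cronbach's alpha values. For readers unfamiliar with the computation, the sketch below shows how such a coefficient can be derived from an item-response matrix; the data, variable names, and helper function are illustrative assumptions, not the study's materials.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix.

    alpha = k / (k - 1) * (1 - sum of item variances / variance of total scores)
    """
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # sample variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical 5-point Likert responses (rows = respondents, columns = items).
responses = np.array([
    [4, 5, 4, 4],
    [5, 5, 5, 4],
    [3, 4, 3, 3],
    [4, 4, 5, 5],
    [2, 3, 2, 3],
])
print(round(cronbach_alpha(responses), 2))
```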
The internal reliability coefficient for the anxiety scale was high (α = .93). All three subscales also had high internal consistency coefficients: (a) computers (α = .89), (b) Internet (α = .86), and (c) online courses (α = .90).

Data collection

All students enrolled in the course were invited to participate in the study by their instructor via email. The instructor assigned identifying codes to all students and provided them with links to the instruments, which were housed in an online utility tool located at a different institution and managed by one of the researchers. To keep student responses confidential, the course instructor did not have access to the data. All participants were informed about the data collection methods before data collection commenced through details provided in the informed consent form; they were assured that all responses were confidential.

The anxiety scale was made available during the second and last weeks of each quarter. The student satisfaction questionnaire was available during the last two weeks of each quarter. Students were reminded via email to complete the instruments. The response rate for both anxiety scales was 48.8%. Of the respondents, 41 students completed the anxiety scale twice, and 46 students (54.8%) completed the anxiety and satisfaction instruments at the end of the course.

Data analysis

All personal identifiers were removed before the data analysis phase began. Frequencies and descriptive statistics were generated before six negative items on the satisfaction instrument and seven positive items on the anxiety scale were recoded. Descriptive statistics were generated to evaluate learners’ satisfaction and anxiety. Paired-samples t-tests were conducted to evaluate whether anxiety levels of participants changed over time. Correlation coefficients were computed among participants’ total anxiety and satisfaction scores and the instruments’ subscales.

Qualitative data were analyzed using open coding. The data were closely examined in order to develop categories for participants’ responses. Responses to the three open-ended questions were sorted into discrete parts after comparing similarities and differences (Bogdan & Biklen, 1998; Erlandson, Harris, Skipper, & Allen, 1993; Strauss & Corbin, 1998).
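A brief illustration of the recoding step just described: on a 5-point Likert scale, a reverse-worded item is recoded as 6 minus the raw response before scale scores and descriptive statistics are computed. The item indices and data below are hypothetical, not the study's.

```python
import numpy as np

# Hypothetical raw responses on a 1-5 Likert scale (rows = respondents, columns = items).
raw = np.array([
    [5, 2, 4, 1],
    [4, 1, 5, 2],
    [5, 2, 5, 1],
])

reverse_items = [1, 3]  # zero-based columns holding reverse-worded items (illustrative)

recoded = raw.copy()
recoded[:, reverse_items] = 6 - recoded[:, reverse_items]  # maps 1<->5, 2<->4, 3 stays 3

totals = recoded.sum(axis=1)              # total scale score per respondent
print(totals.mean(), totals.std(ddof=1))  # descriptive statistics on the recoded scale
```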
Results

Most participants were female (63.0%) and almost 61% were Caucasian. Their ages ranged from 24 to 59 years (M = 42.33) (Figure 1). Almost all respondents had earned a master’s degree (95.7%); only 4.3% had been awarded a doctoral degree. Most of the participants (78.3%) had previous health-care experience.

Figure 1. Age composition of sample (N = 42).

The number of online courses taken prior to the research-design course ranged from 0 to 50. Of the participants, 32.6% were new to online learning; the research-design course was their first online course (Figure 2), and 10.9% of individuals were still relatively new to the online environment as they had taken only one prior online course. The majority (71.7%) indicated they would not have been able to take the course if it had not been offered in the online setting.

Figure 2. Number of online courses completed by respondents (N = 46).

When individuals were prompted to share the most important reason they enrolled in the course, they indicated the availability of the course (and program) in the online format gave them much needed flexibility in their already busy lives. Many respondents mentioned they had busy work schedules, long commutes, and travel requirements. Several mentioned prior family commitments. Some mentioned the convenience and affordability of the programs. Another factor was that the course was a program requirement and that their overall goal was to obtain their degrees.

Participants were also focused on their final goal: completing the dissertation successfully. They mentioned practical reasons for enrolling in the course, such as learning about applied research foundations, understanding dissertation requirements, and obtaining support throughout the proposal-writing process. Others expressed reasons for enrolling that were related directly to their professions. They wanted to become better professionals by being able to help others and participate in professional development to obtain new skills. A few individuals needed a new challenge, whereas some participants thought obtaining a doctoral degree would assist them to advance their careers or serve as a good marketing tool. Additional comments pertained to the programs at the university. Individuals liked the idea of an applied dissertation, found the length of the program appealing, or had good experiences with program coordinators. They also mentioned that the university and programs had good reputations.

Research question 1: How satisfied are doctoral students with a required writing-intensive research-design course that is delivered online?

Of the 24 items, 20 items had a mean score at or above 4.00. Only four items had a mean score below 4.00 (Table 1). Over 90% of respondents strongly agreed or agreed with the following items: item 6 (97.8%); item 20 (95.7%); items 4 and 24 (95.6%); items 3, 7, 10, and 21 (93.5%); item 2 (93.4%); and items 22 and 23 (91.3%). Only six items yielded less than 80% agreement; these were items 5 and 12 (78.2%); item 15 (76.1%); and items 8, 9, and 14 (73.9%).

Table 1. Mean scores and standard deviations for satisfaction scale items.
1. Class assignments were clearly communicated to me. M = 4.11, SD = .99
2. Feedback and evaluation of papers, tests, and other assignments was given in a timely manner. M = 4.61, SD = .77
3. The instructor makes me feel that I am part of the class and belong. M = 4.59, SD = .78
4. I am dissatisfied with the accessibility and availability of the instructor. (recode) M = 4.57, SD = .66
5. I am satisfied with the use of “threaded” online discussions and/or forums. M = 3.91, SD = 1.07
6. I am satisfied with the use of email. M = 4.33, SD = .52
7. I am satisfied with how I am able to navigate within WebCT (the course management system). M = 4.17, SD = .71
8. I am dissatisfied with download times of resources in WebCT. (recode) M = 4.02, SD = .80
9. I am satisfied with the frequency I have to attend class (e.g., log in to the course). M = 3.83, SD = .90
10. I am satisfied with the flexibility this course affords me. M = 4.50, SD = .62
11. I am dissatisfied with the level of self-directedness I am given. (recode) M = 4.15, SD = .79
12. I am satisfied with how much I enjoy working on projects by myself. M = 4.11, SD = .92
13. I am satisfied with the quality of interaction between all involved parties. M = 4.02, SD = .98
14. I am dissatisfied with the process of collaboration activities during the course. (recode) M = 3.89, SD = .95
15. I am satisfied with how much I could relate to the other students. M = 3.91, SD = .94
16. I am satisfied with how comfortable with participating I became. M = 4.20, SD = .65
17. I am satisfied with the level of effort this course required. M = 4.24, SD = .74
18. I am dissatisfied with my performance in this course. (recode) M = 4.02, SD = 1.02
19. I will be satisfied with my final grade in the course. M = 4.07, SD = .90
20. I am satisfied with how I am able to apply what I have learned in this course. M = 4.46, SD = .59
21. I am satisfied enough with this course to recommend it to others. M = 4.43, SD = .62
22. Compared to other course settings, I am less satisfied with this learning experience. (recode) M = 4.26, SD = .68
23. My level of satisfaction in this course would encourage me to enroll in another course in this setting. M = 4.46, SD = .66
24. Overall, I am satisfied with this course. M = 4.59, SD = .58
Note: N = 46.

All subscales had a mean score above 4.00. The instructor subscale yielded the highest mean score (M = 4.47) of all six subscales, closely followed by the overall subscale (M = 4.43). The interaction subscale had the lowest mean score (Table 2). These results show that participants were overwhelmingly satisfied with the online research-design course.

Table 2. Mean scores and standard deviations for satisfaction subscales.
Instructor (items 1–4): M = 4.47, SD = .65
Technology (items 5–8): M = 4.11, SD = .61
Setup (items 9–12): M = 4.15, SD = .55
Interaction (items 13–16): M = 4.01, SD = .60
Outcomes (items 17–20): M = 4.20, SD = .61
Overall (items 21–24): M = 4.43, SD = .53
Note: N = 46.

Most satisfying course aspects

The most often quoted satisfying elements pertained to student learning: learning about the proposal-writing or research process and the amount of learning that took place in nine weeks. Respondents were satisfied with their progress and the amount of work they produced. Not only did they feel more confident about conducting research, but they were also excited about starting the dissertation process. One person, who had completed an online course for the first time, wrote: “I am most satisfied with being able to get through the first online course. I have been able to not only keep up but have actually excelled.”

One emerging theme pertained to the instructor. Students commented about the instructor’s timely feedback and responses to questions, helpfulness, supportiveness, and openness. Respondents indicated the instructor motivated and encouraged them throughout the course. Three individuals commented about their anxieties: “This is a new experience, and I was somewhat apprehensive. [The instructor] was able to take my anxiety levels down.” And “[the] feedback was great, decreased my anxiety.” Another student wrote: “It eased ALL of my fears of beginning, researching, and completing the dissertation.”

Interaction with the instructor and peers was important to respondents. Discussions with other students in similar situations or with similar questions or problems were valued by learners. The collaborative effort by peers and the instructor was mentioned by several individuals; one person wrote: “I think that the participation from the entire class helps to form a support system. The discussion portion is my favorite!!” Others enjoyed the peer review process and appreciated the fact that they were able to voice their opinions.
Several learners were satisfied with the course content because it was “applicable,” “engaging,” and “challenging.” They described the layout and pace as appropriate. Six respondents indicated that the flexibility of the online environment was most satisfying as it allowed them to participate “when possible,” left them “in charge of the time,” and enabled them “to interact with individuals who are located in geographically diverse areas.”

Elements that could increase learner satisfaction

Of the respondents, 22 did not have any suggestions as to how their satisfaction with the course could be improved. However, several shared elements pertaining to instructional and supportive resources, interaction, and technology. Learners would have appreciated more detailed instructions about assignments, grades, and participation. Others needed additional feedback from the instructor and more time to complete assignments. Seven students either were not satisfied with their performance or felt they did not complete all requirements successfully. Some of them indicated they were not able to devote enough time to the course or to feel at ease during the quarter. Some participants wanted more interaction and would have liked to get to know their peers better. One person suggested a quarterly optional meeting for course participants. In contrast, two learners thought the course included too many discussions. Resources that would have been helpful to learners were tutorials for statistics and statistical software programs, and examples of dissertations completed at the university. Two students experienced difficulties with the course management system when submitting assignments or accessing graded assignments.

Research question 2: How much technological anxiety do participants exhibit in the online environment at the beginning of the course?

Possible scores on the anxiety scale ranged from 18 to 90 based on 18 items ranging from 1 (strongly disagree) to 5 (strongly agree). On each of the three subscales, possible scores ranged from 6 to 30. At the beginning of the quarter, participants’ scores on the anxiety scale ranged from 18 to 58 (M = 33.46; SD = 8.70). Subscale scores ranged from 6 to 23 with mean scores ranging from 10.24 to 12.95. Interestingly, the highest level of anxiety was not related to computers or the Internet but to online courses. See Table 3 for mean scores and standard deviations for all scale items. Results indicate participants’ anxiety levels were in the lower range and therefore relatively moderate.

Table 3. Mean scores and standard deviations for anxiety scale items.
1. I am insecure about my computer skills. M = 1.73, SD = .92
2. I am anxious when I work on computers. M = 1.59, SD = .74
3. I am quite relaxed when I work with computers. (recode) M = 1.78, SD = .79
4. I am apprehensive about working on computers. M = 1.61, SD = .70
5. I avoid working on computers. M = 1.29, SD = .46
6. I am less intimidated by computers than most other people I know. (recode) M = 2.24, SD = .97
7. I feel confident about navigating the Internet. (recode) M = 1.44, SD = .63
8. I get anxious when I am required to use Internet resources. M = 1.76, SD = .83
9. I get nervous about getting lost in cyberspace. M = 1.59, SD = .77
10. I get excited about using the Internet. (recode) M = 2.07, SD = .69
11. I enjoy browsing the Internet. (recode) M = 1.66, SD = .62
12. I get confused when working with the Internet. M = 1.76, SD = .80
13. I am confident about working in the online environment. (recode) M = 1.80, SD = .68
14. I get anxious when I think about logging into my online course. M = 2.07, SD = 1.06
15. I get nervous when I am required to participate in online discussions. M = 2.17, SD = 1.05
16. I am apprehensive about enrolling in online courses. M = 1.93, SD = .88
17. I am scared that someone will misinterpret my text-based messages in the online environment. M = 2.73, SD = .98
18. I feel empowered in my online course. (recode) M = 2.24, SD = .92
Note: N = 41.

Research question 3: Will participants’ anxiety change during the semester, and if so, how?

Participants’ scores on the first anxiety scale ranged from 18 to 58 (M = 33.46; SD = 8.70) and from 18 to 54 (M = 34.12; SD = 9.34) on the second scale. On the subscales, scores ranged from 6 to 23 with mean scores ranging from 10.24 to 12.95 (Table 4). The overall mean score for the anxiety scale and the mean scores for the computer and Internet subscales increased slightly over time. In contrast, the mean score for the online course subscale decreased slightly. However, results of several paired-samples t-tests showed that changes in participants’ anxiety levels over time were not statistically significant.

Table 4. Mean scores and standard deviations on subscales.
Computer (items 1–6): Anxiety 1 M = 10.24, SD = 3.40; Anxiety 2 M = 10.73, SD = 4.06
Internet (items 7–12): Anxiety 1 M = 10.27, SD = 3.09; Anxiety 2 M = 10.93, SD = 3.40
Online course (items 13–18): Anxiety 1 M = 12.95, SD = 4.04; Anxiety 2 M = 12.46, SD = 4.38
Note: N = 41.
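The comparison reported for research question 3 is a paired-samples t-test on the two administrations of the anxiety scale. A minimal sketch of that kind of test is shown below; the scores are hypothetical and merely mimic the 18-90 scoring range; they are not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical total anxiety scores (possible range 18-90) for the same eight
# participants at the beginning and end of the quarter.
anxiety_first = np.array([28, 41, 33, 25, 37, 30, 45, 22])
anxiety_last  = np.array([29, 39, 35, 24, 38, 31, 44, 23])

t_stat, p_value = stats.ttest_rel(anxiety_first, anxiety_last)  # pairs each participant's scores
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # a large p would indicate no significant change
```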
Research question 4: Is there a correlation between technological anxiety and student satisfaction?

Correlation coefficients were computed among anxiety and satisfaction scores. The results show there is a significant negative correlation between anxiety and satisfaction, r(44) = -.50, p < .001. Students with lower technological anxiety scores experienced higher levels of satisfaction in the online environment than learners with higher anxiety scores (Figure 3).

Figure 3. Scatterplot displaying a negative correlation between technological anxiety and student satisfaction (N = 46).

Research question 5: Do participants who feel anxiety in one domain tend to feel anxiety in other domains? Do participants who feel satisfied in one domain tend to feel satisfied in other domains?

Correlation coefficient analyses were conducted among the three anxiety subscales and the six satisfaction subscales. Using the Bonferroni approach to control for Type I error across the three correlations, a p-value of less than .017 (.05/3 = .017) was required for significance on the anxiety scales. The results of the correlational analyses show all three correlations were statistically significant (Table 5). Participants who felt anxious when using computers or the Internet, or when taking online courses experienced anxiety with other domains.

Table 5. Correlations among the three anxiety subscales.
Computer-Internet: .69**
Computer-Online courses: .36*
Internet-Online courses: .51**
Note: N = 41. *p < .05; **p < .001.

For the satisfaction subscales, a p-value of less than .003 (.05/15 = .003) was required for significance in order to control for Type I error across the 15 correlations. Table 6 shows 12 out of the 15 correlations were statistically significant. In general, participants who were satisfied with technology and setup of, and overall experience with, their online courses were satisfied with other elements of the online course. Three correlations among subscales were not statistically significant: instructor-interaction, instructor-outcomes, and interaction-outcomes.

Table 6. Correlations among the six satisfaction subscales.
Instructor-Technology: .62**
Instructor-Setup: .48*
Instructor-Interaction: .28
Instructor-Outcomes: .40
Instructor-Overall: .45*
Technology-Setup: .62**
Technology-Interaction: .53**
Technology-Outcomes: .57**
Technology-Overall: .58**
Setup-Interaction: .54**
Setup-Outcomes: .56**
Setup-Overall: .77**
Interaction-Outcomes: .39
Interaction-Overall: .66**
Outcomes-Overall: .62**
Note: N = 46. *p < .01; **p < .001.
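The subscale analyses for research question 5 pair Pearson correlations with a Bonferroni-adjusted significance threshold (.05 divided by the number of correlations tested). The sketch below illustrates that procedure with hypothetical anxiety-subscale scores; the data and labels are assumptions for illustration, not the study's data.

```python
from itertools import combinations

import numpy as np
from scipy import stats

# Hypothetical subscale scores for the same respondents (possible range 6-30 per subscale).
subscales = {
    "computer": np.array([10, 14, 9, 18, 12, 8, 16, 11]),
    "internet": np.array([11, 13, 10, 17, 12, 9, 15, 10]),
    "online courses": np.array([13, 16, 11, 20, 14, 10, 19, 12]),
}

pairs = list(combinations(subscales, 2))
alpha = 0.05 / len(pairs)  # Bonferroni-adjusted threshold: .05 / 3 = .017

for a, b in pairs:
    r, p = stats.pearsonr(subscales[a], subscales[b])  # correlation for each subscale pair
    verdict = "significant" if p < alpha else "not significant"
    print(f"{a}-{b}: r = {r:.2f}, p = {p:.3f} ({verdict})")
```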
Discussion

Almost 62% of the students in this research study were over the age of 40 years; the mean age was 42 years. This reflects the average age of students reported by other online universities offering graduate programs, such as the Atlantic International University (n.d.). Over 32% of the students in the study had never taken an online class before. For an additional 10.9% of this population, this was only the second online course they had taken. This could be attributed to the age of this population; it is unlikely that many of them had ever received any formal computer training initially in high school or at the undergraduate level on learning management systems.

Satisfaction

Overall, the doctoral students were exceptionally satisfied with their online course. Many students noted they would not be able to obtain a doctoral degree were it not for the flexibility of the online learning component. In addition, 93.5% strongly agreed or agreed with the statement “I am satisfied with the flexibility this course affords me.” This supports the premise of other researchers that flexibility is a key component of student satisfaction (Bolliger, 2003; Rekkedal & Qvist-Eriksen, 2004). The responses of these students also support Sahin and Shelley’s (2008) assertion that two factors associated with student satisfaction in distance learning are flexibility and usefulness, as well as Lin et al.’s (2008) assertion that task value impacts satisfaction. Students found the course useful, and 95.7% strongly agreed or agreed with the statement “I am satisfied with how I am able to apply what I have learned in this course.” In addition, Halsne and Gatta (2002) noted nontraditional students prefer online learning; these doctoral students were nontraditional students and many stated they selected this program because they were employed full time and had families.

The instructor subscale yielded the highest satisfaction score out of the six subscales. Shee and Wang (2008) and Herbert (2006) noted that frequent, quality interaction with instructors increases student satisfaction. However, student satisfaction in this population was also composed of more than one factor, as noted by Sahin and Shelley (2008) and Wickersham and McGee (2008).

Anxiety

Several students noted they were anxious about taking this online course; this supports Saadé and Kira’s (2007) supposition that computer anxiety is important to educators, as well as Rosen and Weil’s (1995) findings that students suffer from computer-related learning anxiety. Since the majority of the population of this study was primarily over 40 years of age, this can also be related to Halupa’s (2004) findings that older adults tend to have a less positive attitude toward the Internet.

Students reported slightly higher online course anxiety at the beginning of the course compared to the end of the course, but this difference was not statistically significant.
Surprisingly, student anxiety was relatively low to moderate; this contrasts with Saadé and Kira’s (2007) findings that computer anxiety is quite high in college students. Because 95.7% of these students already had a master’s degree and 4.3% already had a doctoral degree, this could be attributed to the fact that the students in this study were already seasoned or highly motivated learners who strived to obtain a doctoral degree. However, students displayed multidimensional aspects of anxiety concerning performance insecurity, hesitation, and nervousness (Beckers & Schmidt, 2001; Heinssen et al., 1987; Saadé & Kira, 2007). This sample did not report significant technical problems as noted by Frankola (2001), or a failure to understand the online media as cited by Herbert (2006).

Students in this study who felt anxiety in any of the three domains tested (computers, Internet, and online learning) also experienced anxiety with other domains. This reflects Saadé and Kira’s (2007) statement that negative emotions can impact the overall learning experience.

Anxiety and satisfaction

A significant negative correlation between anxiety and satisfaction was found. Students with less anxiety were more satisfied than those with higher anxiety. Saadé and Kira (2007) noted learners with higher levels of computer anxiety are at a disadvantage. Feeling disadvantaged can definitely impact satisfaction with the course and course delivery.

Also of note, the highest anxiety reported by participants was related to taking online courses. Because the students in this sample had almost completed the course when the satisfaction scale was administered, it is probable that some of the initial anxiety had been alleviated at that point.

Frankola (2001) and Diaz (2002) found the attrition rate of students is 20–50% in undergraduate traditional courses and 30–60% in online courses. Interestingly, the attrition rate for the four quarters over a calendar year in the initial class in the doctoral program was only 18.4%. This could be related to the moderate computer, Internet, and online course anxiety and the high student satisfaction reported with the course.

Conclusion

Some limitations of this study need to be noted here. First, the study took place at one institution. Second, the population included students in the field of health sciences only. Other researchers might wish to include learners in other subject areas and/or multiple sites. Third, a high percentage of participants were Caucasian females even though the student population at this institution includes a high number of minority and under-represented students. At first, these results might be interpreted as culturally biased; however, 51% of total doctoral degrees granted in 2007–2008 were awarded to females. The percentage of women who graduated with doctoral degrees in the health professions and sciences was even higher: almost 73%. In the academic year 2005–2006 over 80% of individuals who were awarded doctoral degrees in all fields were Caucasian (Snyder & Dillow, 2010). Last, all data were self-reported. Therefore, readers should interpret the results with caution because they are somewhat limited in generalizability.

This research study was performed to attempt to help bridge the information gap in the literature regarding doctoral student satisfaction in online learning.
Graduate students in online programs are usually older, nontraditional students who may experience anxiety, particularly with online course delivery systems. However, participants reported low or moderate levels of anxiety and were satisfied with this writing-intensive course. Because there was a statistically significant relationship between anxiety and satisfaction, online instructors should consider the integration of online student orientations, student-centered approaches, and planned interventions in order to alleviate student anxiety, which could result in higher student satisfaction.

Notes on contributors

Doris U. Bolliger is an assistant professor in the Department of Professional Studies at the University of Wyoming where she teaches primarily online graduate-level courses in instructional design and technology. Her research interests include satisfaction, communication, interaction, community, and interventions in the online environment.

Colleen Halupa is the assistant dean of online learning at LeTourneau University and an assistant professor in the health education doctoral program at A.T. Still University. She teaches graduate courses in education, educational technology, and research design as well as undergraduate courses in health administration. Her research interests include student and faculty satisfaction, delivery of online doctoral programs, online curriculum quality, and delivery of innovative hybrid curriculum.

References

Allen, I. E., & Seaman, J. (2010a). Class differences: Online education in the United States, 2010. Babson Park, MA: Babson Survey Research Group. Retrieved from http://www.usdla.org/assets/pdf_files/2010%20Sloan-C%20Report%20on%20Online%20Education%20Enrollment.pdf
Allen, I. E., & Seaman, J. (2010b). Learning on demand: Online education in the United States, 2009. Babson Park, MA: Babson Survey Research Group. Retrieved from http://sloanconsortium.org/sites/default/files/pages/learningondemand-7.pdf
American Association of Colleges of Nursing. (2011, March 16). Despite economic challenges facing schools of nursing, new AACN data confirm sizable growth in doctoral nursing programs. Retrieved from http://www.aacn.nche.edu/news/articles/2011/enroll-surge
Astin, A. W. (1993). What matters in college? Four critical years revisited. San Francisco, CA: Jossey-Bass.
Atlantic International University. (n.d.). Admissions. Retrieved from http://www.aiu.edu/admisions.htm
Beckers, J. J., & Schmidt, H. G. (2001). The structure of computer anxiety: A six factor model. Computers in Human Behavior, 17, 35–49. doi: 10.1016/S0747-5632(00)00036-4
Bogdan, R. C., & Biklen, S. K. (1998). Qualitative research in education: An introduction to theory and methods. Needham Heights, MA: Allyn & Bacon.
Bolliger, D. U. (2003). The design and field test of a Web-based training program for future school administrators in a northwest Florida school district. Journal of Interactive Online Learning, 1(3), 1–12. Retrieved from http://www.ncolr.org/jiol/
Bolliger, D. U., & Martindale, T. (2004). Key factors for determining student satisfaction in online courses. International Journal on E-Learning, 3, 61–67.
Bower, B. L., & Kamata, A. (2000). Factors influencing student satisfaction with online courses. Academic Exchange Quarterly, 4(3), 52–56.
Carnegie Foundation for the Advancement of Teaching. (n.d.). Professional & graduate education. Retrieved from http://www.carnegiefoundation.org/previous-work/professional-graduate-education
Conrad, A. M., & Munro, D. (2008). Relationships between computer self-efficacy, technology, attitudes and anxiety: Development of the Computer Technology Use Scale (CTUS). Journal of Educational Computing Research, 39, 51–73. doi: 10.2190/EC.39.1.d
Dennen, V. P., Darabi, A. A., & Smith, L. J. (2007). Instructor–learner interaction in online courses: The relative perceived importance of particular instructor actions on performance and satisfaction. Distance Education, 28, 65–79. doi: 10.1080/01587910701305319
Diaz, D. P. (2002). Online drop rates revisited. The Technology Source Archives, University of North Carolina, Chapel Hill, NC. Retrieved from http://technologysource.org/article/online_drop_rates_revisited
Erlandson, D. A., Harris, E. L., Skipper, B. L., & Allen, S. D. (1993). Doing naturalistic inquiry: A guide to methods. Newbury Park, CA: Sage.
Frankola, K. (2001). Why online learners drop out. Workforce, 80(10), 53–59. Retrieved from http://www.workforce.com/section/magazine
Halsne, A. M., & Gatta, L. A. (2002). Online vs. traditionally-delivered instruction: A descriptive study of learner characteristics in a community college setting. Online Journal of Distance Learning Administration, 5(1). Retrieved from http://www.westga.edu/~distance/ojdla/
Halupa, C. (2004). Medical providers’ and internet-based education. Academic Exchange Quarterly, 8(3), 116–120.
Hara, N., & Kling, R. (2001). Student distress in Web-based distance education. EDUCAUSE Quarterly, 24(3), 68–69.
Heinssen, R. K., Glass, C. R., & Knight, L. A. (1987). Assessing computer anxiety: Development and validation of the computer anxiety rating scale. Computers in Human Behavior, 3, 49–59. doi: 10.1016/0747-5632(87)90010-0
Herbert, M. (2006). Staying the course: A study in online student satisfaction and retention. Online Journal of Distance Learning Administration, 9(4). Retrieved from http://www.westga.edu/~distance/ojdla/
Igbaria, M., & Parasuraman, S. (1989). A path analytic study of individual characteristics, computer anxiety and attitudes toward microcomputers. Journal of Management, 15, 373–388.
Kay, R. H. (2008). Exploring the relationship between emotions and the acquisition of computer knowledge. Computers & Education, 50, 1269–1283. doi: 10.1016/j.compedu.2006.12.002
Liaw, S. (2008). Investigating students’ perceived satisfaction, behavioral intention, and effectiveness of e-learning: A case study of the Blackboard system. Computers & Education, 51, 864–873.
Lin, Y., Lin, G., & Laffey, J. M. (2008). Building a social and motivational framework for understanding satisfaction in online learning. Journal of Educational Computing Research, 38, 1–27. doi: 10.2190/EC.38.1.a
Montelpare, W. J., & Williams, A. (2000). Web-based learning: Challenges in using the Internet in the undergraduate curriculum. Education and Information Technologies, 5(2), 85–101. doi: 10.1023/A:1009647400624
Muilenburg, L. Y., & Berge, Z. L. (2005). Student barriers to online learning: A factor analytic study. Distance Education, 26, 29–48. doi: 10.1080/01587910500081269
Navarro, P. (2000). The promise–and potential pitfalls–of cyberlearning. In R. A. Cole (Ed.), Issues in Web-based pedagogy: A critical primer (pp. 281–297). Westport, CT: Greenwood Press.
Rekkedal, T., & Qvist-Eriksen, S. (2004). Student support services in e-learning: An evaluation study of students’ needs and satisfaction. European Journal of Open, Distance and E-Learning. Retrieved from http://www.eurodl.org/
Rosen, L. D., Sears, D. C., & Weil, M. M. (1987). Computerphobia. Behavior Research Methods, Instruments, & Computers, 19, 167–179.
Rosen, L. D., & Weil, M. M. (1995). Computer anxiety: A cross-cultural comparison of university students in ten countries. Computers in Human Behavior, 11, 45–64. doi: 10.1016/0747-5632(94)00021-9
Saadé, R. G., & Kira, D. (2007). Mediating the impact of technology usage on perceived ease of use by anxiety. Computers & Education, 49, 1189–1204. doi: 10.1016/j.compedu.2006.01.009
Sahin, I., & Shelley, M. (2008). Considering students’ perceptions: The distance education student satisfaction model. Educational Technology and Society, 11(3), 216–223. Retrieved from http://www.ifets.info
Shee, D. Y., & Wang, Y. (2008). Multi-criteria evaluation of the web-based e-learning system: A methodology based on learner satisfaction and its applications. Computers & Education, 50, 894–905. doi: 10.1016/j.compedu.2006.09.005
Simonson, M., Smaldino, S., Albright, M., & Zvacek, S. (2009). Teaching and learning at a distance: Foundations of distance education (4th ed.). Boston, MA: Allyn & Bacon.
Sloan Consortium. (2002). Quick guide: Pillar reference manual. Needham, MA: Author. Retrieved from http://sloanconsortium.org/publications/books/dprm_sm.pdf
Snyder, T. D., & Dillow, S. A. (2010, April). Digest of education statistics, 2009 (NCES Publication No. 2010-013). Washington, DC: National Center for Education Statistics. Retrieved from http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2010013
Strauss, A., & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory (2nd ed.). Thousand Oaks, CA: Sage.
Sun, P., Tsai, R. J., Finger, G., Chen, Y., & Yeh, D. (2008). What drives a successful e-Learning? An empirical investigation of critical factors influencing learner satisfaction. Computers & Education, 50, 1183–1202. doi: 10.1016/j.compedu.2006.11.007
Walker, G., Golde, C. M., Jones, L., Bueschel, A. C., & Hutchings, P. (2008). The formation of scholars: Rethinking doctoral education for the twenty-first century. San Francisco, CA: Jossey-Bass.
Wickersham, L. E., & McGee, P. (2008). Perceptions of satisfaction and deeper learning in an online course. Quarterly Review of Distance Education, 9, 73–83.