RESEARCH ARTICLES

Comparison of Two Lecture Delivery Platforms in a Hybrid Distance Education Program

L. Douglas Ried, PhD,a and Katherine Byers, PhDb

aCollege of Pharmacy, University of Florida*
bTexas Tech University Health Science Center

Corresponding Author: L. Douglas Ried, PhD, Southwestern Oklahoma State University, 100 Campus Drive, Weatherford, OK 73096. Tel: 580-774-3105. Fax: 580-774-7020. E-mail: doug.ried@swosu.edu
*Affiliation at time of study. Current affiliation: Southwestern Oklahoma State University.

Submitted June 13, 2008; accepted October 18, 2008; published August 28, 2009.

Objective. To compare students' preferences for and academic performance using 2 different distance education course content delivery platforms.

Methods. A randomized, crossover research design was used to compare traditional video with a 4-panel platform among learners on multiple campuses within 1 college of pharmacy. The outcomes were students' preferences for delivery platform and examination scores. Rasch analysis was used to assess unidimensionality and the difficulty of examination items. Hierarchical logistic and multiple regression models were used to assess students' preferences and academic performance.

Results. The logistic model predicting preference for the 4-panel or traditional platform was not significant, but African-American and Hispanic students were more likely to prefer the 4-panel platform than Caucasian and Asian students. The delivery platform did not affect students' academic performance. Students who did well on the semester's previous 2 examinations scored higher on the questions related to schizophrenia. Students with higher Pharmacy College Admission Test (PCAT) scores performed better on the bipolar questions, and students who preferred the traditional video platform scored lower than those with no preference.

Conclusion. The additional faculty time, effort, and cost invested in presenting the class material in a 4-panel platform, and the students' extra time and effort spent viewing it, did not produce a comparable benefit in student preference or performance.

Keywords: distance education, academic performance, distance learning, educational technology, learning style, learning preferences, assessment

INTRODUCTION

The pharmacist shortage in the United States is attributed to an inability to train new pharmacists at the same rate as the growth in demand for pharmacists.1 Schools and colleges of pharmacy are pressured to graduate more students to lessen the gap between the supply of and demand for pharmacists. The high cost of starting new pharmacy schools and the difficulty of finding qualified new faculty members have spurred exploration into options for training pharmacists. Some existing schools and colleges of pharmacy have increased their on-campus class size, while others have addressed space or resource constraints by employing distance education programs.

Video-based instruction is effective in distance education because students are able to view lectures at their own pace, instruction can be reviewed multiple times, class material is more accessible, and study time is spent more efficiently.2-4 The lack of interaction with the instructor, however, is a disadvantage.4 Nevertheless, distance-learning strategies have been adopted because they are effective and cost-efficient.

The evidence on the best way to deliver content is equivocal. Some studies have shown that academic achievement improves when instructors respond to students' different learning styles,5 while others have shown no difference.6 Although students may have more than one learning style, they usually have a preferred style.7 Thus, while students in the same class receive the same instruction, the teaching strategy used may be effective for some students and ineffective for others.
Multiple platforms are available to deliver content at a distance; however, not all platforms are equivalent with regard to students' learning.8,9 Platforms can be more or less aligned with students' preferences.10-12 Making the decision even more complex, not all delivery platforms cost the same to develop and deliver. Real costs are associated with faculty and programmer effort and time, and with software and hardware requirements; however, more expensive platforms may be warranted if they also are more cost effective.

The College of Pharmacy at the University of Florida (UF) doubled enrollment by implementing a hybrid distance education program in August 2002.3 The traditional video platform has predominated since the inception of the program. It has been characterized as "a talking head with slides" and targets visual and auditory senses. Lectures are digitally recorded at the Gainesville campus and video streamed via the Internet, enabling students to watch the lectures online within 2 hours. The content is the same for all 4 campuses, as are the teaching methodologies for all of the courses in the curriculum. Depending on the course, the majority of students on the Gainesville campus also view the lectures via the video platform rather than attending the "live"/traditional classes.

As an alternative, the college experimented with a 4-panel format (Classroom 24/7, formerly DigiScript, Saddleback, NJ), an interactive multimedia platform designed to address students' different learning preferences and determine the most suitable method of delivering course content to meet the needs of faculty members and students. It consists of 4 panels appearing in 1 large window on the computer screen. The video and audio content are synchronized and appear in 1 panel. Transcripts of the lecture narrative, PowerPoint slides, and outlines appear in the other 3 panels. The 4-panel platform allows students to pinpoint and select specific content within a presentation by clicking on embedded outline links. Other organizational features include bookmarks, a search function, and handout tabs. Finally, self-assessment quizzes with feedback embedded within the lectures offer students additional opportunities to interact with the course content and to judge whether they need to review a specific part of the lecture. These active-learning strategies are designed to enhance student learning.13 They are effective in live lectures,14 and some speculate they may be effective in Web-based courses.8,15 Evidence on the effectiveness of adding self-assessment questions and feedback to a content delivery platform is mixed.16-19

In fall 2005, both the traditional video and 4-panel platforms were employed in a second-year pharmacotherapy course. This pilot study compared students' preferences and academic performance using the 2 platforms. Our hypotheses were that (1) students would prefer the 4-panel format, and (2) academic performance with the 4-panel format would be higher.

METHODS

Study Description

This pilot study used a randomized crossover research design (Figure 1).
It was conducted during the psychiatry segment of the second-year pharmacotherapy course. To ensure students were exposed to both platforms before being randomized into a study group, the first 2 topics covered in the course were delivered to all students using the same format. First, 2 lectures on the treatment of depression were presented to all of the students using the 4-panel platform. Next, lectures on the treatment of anxiety were presented via the traditional video platform. The students were exposed to the 4-panel platform prior to the study to lower the learning curve associated with new technology. The anxiety lectures were delivered in the traditional format after the 4-panel format and before randomization to serve as a quasi-washout period.

Following these common lectures, students in each of the classes at the College's 4 campuses were randomly assigned to 1 of 2 groups. Students in one group received 2 lectures on the treatment of bipolar affective disorder via the traditional video platform, while students in the other group received the same lectures via the 4-panel platform. Next, students on the campuses that received the traditional video platform for the bipolar lectures received 4-panel presentations for the treatment of schizophrenia, while students who had previously received the 4-panel bipolar presentations received the schizophrenia lectures via the traditional video platform. Students were told beforehand that the 4-panel platform was going to be tested on more than 1 occasion to see which delivery method they preferred, and that at least 2 disease topics would be in the 4-panel format, but they were not told which topics or when. The lecturer and course content were exactly the same, and both groups of students viewed exactly the same video and audio portions of the lectures in both platforms.

Figure 1. Design of a research study comparing two lecture delivery platforms for a hybrid distance education program.

Outcome Variables

Examination scores on relevant content. Students' learning was assessed using 13 items embedded in the third examination of the semester to test the material presented postrandomization. The examination questions pertaining to the psychiatry lectures were identified by the course coordinator. Five items covered the depression and anxiety lectures but were not included in the students' examination score outcome measure because all students received this material using the same delivery platform, to reduce confounding. Students' total examination scores were obtained by summing the value of the logits for individual students on the 8 remaining items.20 Individual students' logit scores for the 4-panel items were compared to the logit scores for the traditional video platform items. The examination also consisted of an additional 37 items covering thromboembolic disorders, Alzheimer's disease, attention deficit hyperactivity disorder, obesity, and insomnia.
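To make this comparison concrete, the minimal Python sketch below groups hypothetical topic-level logit scores by the platform each student viewed for that topic and applies an independent t test, the comparison reported in the Results. The data frame, its column names, and all values are invented for illustration; they are not the study's data or analysis code.

```python
import pandas as pd
from scipy import stats

# Hypothetical data: one row per student, with the platform viewed for each
# topic and the student's summed logit score on that topic's items.
scores = pd.DataFrame({
    "student_id":       [1, 2, 3, 4, 5, 6],
    "bipolar_platform": ["4panel", "video", "4panel", "video", "4panel", "video"],
    "bipolar_logit":    [1.8, 0.9, 1.4, 1.1, 1.6, 1.0],
    "schizo_platform":  ["video", "4panel", "video", "4panel", "video", "4panel"],
    "schizo_logit":     [4.9, 4.2, 4.6, 4.3, 4.1, 4.8],
})

def compare_platforms(df, platform_col, score_col):
    """Independent t test of topic logit scores: 4-panel vs traditional video."""
    four_panel = df.loc[df[platform_col] == "4panel", score_col]
    video = df.loc[df[platform_col] == "video", score_col]
    return stats.ttest_ind(four_panel, video)

print(compare_platforms(scores, "bipolar_platform", "bipolar_logit"))
print(compare_platforms(scores, "schizo_platform", "schizo_logit"))
```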
Preference for lecture video presentation. Students' preferences for the lecture platforms were assessed via 2 questions embedded within an Internet-based survey administered at the end of the semester. Students were contacted by e-mail and offered extra credit points to complete the Internet-based survey instrument. Students were asked, "Which of the following statements most accurately describes your preferences regarding the video presentations?" Students expressed their preferences by choosing 1 of 3 options: I preferred the 4-panel platform (+1); I have no preference between the 4-panel platform and the traditional video platform (0); or I preferred the traditional video platform (-1). Students then were asked to report how much they preferred the chosen video platform using a Likert scale. The categories were: (1) just a little more, (2) somewhat more, (3) a lot more, or (4) it was the best. The value assigned to the preference category was multiplied by the degree of preference, so scores ranged from -4 to +4. Finally, students were asked to state whether they strongly agreed, agreed, were neutral, disagreed, or strongly disagreed with the statement, "I think that I learned the material better when I used the 4-panel platform compared to the regular video streaming." Students also were asked to identify the most helpful features of the 4-panel platform.

Predictor Variables

Lecture delivery platform. The primary predictor in this study was the course content delivery platform. DigiScript randomized the delivery order and the assignment of the campuses to the platforms. The investigators and students were blinded to which campuses received each platform and to the order in which the platforms were delivered. The randomization was not revealed until all of the data were collected. The primary differences between the 2 platforms were the enhanced features embedded within the 4-panel platform. Multiple-choice questions were periodically embedded as part of the 4-panel format. Students were required to answer the questions correctly before moving forward with the rest of the content, and they received feedback regarding their response immediately upon submitting an answer.

Covariates. Covariates associated with academic success were added to the model.3,21,22 Students' demographic characteristics included age, gender, and race (Caucasian, African-American, Hispanic, Asian, and other). Indicators of preadmission academic achievement included earning a bachelor's degree prior to admission to the pharmacy college (1 = yes, 0 = no), science/math grade point average (GPA), and percentile scores on the PCAT. Two measures of students' current academic performance included in the models were the sum of their scores on the first 2 examinations of the semester in the same course and whether they were "out of sequence" with the rest of their academic class (1 = yes, 0 = no). Out-of-sequence students were those with academic difficulty or who were behind in the curriculum for personal reasons.

Statistical Analyses

The bivariate relationships between students' characteristics, preadmission academic performance, examination scores, and platform preference were examined using Pearson's chi-square tests for categorical variables and independent t tests for continuous variables.

Hierarchical logistic regression models were fit to predict students' preference for the 4-panel or traditional lecture delivery platform. If the chi-square statistic for the change in the -2 log likelihood value was significant, the prediction of the students' probability of preferring the 4-panel delivery platform over the traditional delivery platform was considered improved.
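The blockwise (hierarchical) logistic regression and the chi-square test on the change in -2 log likelihood described above can be sketched as follows. This is an illustrative outline only, using simulated data and assumed variable names (prefers_4panel, age, pcat, minority); it is not the study's model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats

# Simulated data standing in for the survey and covariate file.
rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "prefers_4panel": rng.integers(0, 2, n),   # 1 = prefers 4-panel, 0 = traditional
    "age":            rng.normal(26, 5, n),
    "pcat":           rng.normal(86, 12, n),
    "minority":       rng.integers(0, 2, n),   # African-American/Hispanic/other indicator
})

# Block 1: demographic and preadmission covariates only.
X1 = sm.add_constant(df[["age", "pcat"]])
m1 = sm.Logit(df["prefers_4panel"], X1).fit(disp=False)

# Block 2: add the predictor of interest (here, the race indicator).
X2 = sm.add_constant(df[["age", "pcat", "minority"]])
m2 = sm.Logit(df["prefers_4panel"], X2).fit(disp=False)

# The change in -2 log likelihood follows a chi-square distribution with
# degrees of freedom equal to the number of added parameters.
lr_stat = 2 * (m2.llf - m1.llf)
df_diff = X2.shape[1] - X1.shape[1]
p_value = stats.chi2.sf(lr_stat, df_diff)
print(f"LR chi-square = {lr_stat:.2f}, df = {df_diff}, p = {p_value:.3f}")
print("Odds ratios:", np.exp(m2.params).round(2).to_dict())
```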
A 1-parameter item response (Rasch) model was used, which is based on a probabilistic model that orders items and subjects simultaneously using maximum likelihood estimation.20 The result arranges items along a difficulty continuum and subjects along an ability continuum.23 Rasch analysis was performed using WINSTEPS, Version 3.31 (WINSTEPS, Chicago, IL). WINSTEPS provides detailed statistics for each item and for the overall instrument. We used the default rating scale model with groups equal to 1, which assumes all of the items share the same underlying rating scale structure. We used it as originally conceptualized for use with dichotomous items23 scored as correct or incorrect. Item difficulties and person measures are mapped in logits, or log-odds units, along the same linear continuum.

Rasch models also provide statistical results regarding items' unidimensionality and hierarchical ordering. Unidimensionality refers to whether the instrument measures a single construct.24 Mean square standardized residuals (MnSq) and standardized Z scores (ZSTD) identify how well each item fits the hypothesized unidimensional construct. For low-stakes multiple-choice questionnaires, a reasonable MnSq value is ≤ 1.3, or 30% variance.24 Items > 1.3 "diverge unacceptably from the expected ability/difficulty pattern."24(p26) In addition to the MnSq statistic, ZSTD estimates the improbability of participant responses. Items with Z scores of 2 or higher are considered too unpredictable.25 Items exceeding both the MnSq and ZSTD criteria (MnSq ≥ 1.3 and ZSTD ≥ 2.0) were considered "misfitting."24

Item hierarchy orders items sequentially from least to most difficult. A student's ability is measured in logits, derived by transforming ordinal data onto an interval scale. A student's position on a unidimensional scale is evaluated by ordering item difficulty and person ability on the same linear continuum. Students' raw scores can be misleading if items are not on a logit scale and in hierarchical order. For example, if 2 students receive identical raw scores on a test, they do not necessarily have the same level of ability, since one student may have correctly answered more of the most challenging items.24

Two measures of academic performance were examined using linear regression models: (1) students' scores on the 2 previous examinations, and (2) the sum of students' logit scores on the test items representing learning of the postrandomization materials. In each case, the demographic and prepharmacy academic variables were added to the model first. Next, the predictor variable (ie, lecture delivery platform) was added to the equation. A significant change in the coefficient of determination (R2) indicates improvement in the prediction of students' academic performance.

The study was conducted according to the principles of the Declaration of Helsinki. The University of Florida Institutional Review Board approved the study protocol. The a priori level of statistical significance was set at ≤ 0.05. SPSS was used to conduct the statistical analyses (SPSS for Windows, version 15.0, Chicago, IL).
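For readers without access to WINSTEPS, the following minimal sketch illustrates the core of the 1-parameter (Rasch) model used here: the logit relationship between person ability and item difficulty, a bounded maximum likelihood estimate of a person measure, and an information-weighted (infit) mean-square that can be checked against the MnSq ≥ 1.3 rule of thumb cited above. The item difficulties and response strings are invented for illustration, and this routine is not a substitute for the full WINSTEPS estimation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def p_correct(theta, b):
    """Rasch probability of a correct response: logit(P) = theta - b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def person_measure(responses, difficulties):
    """MLE of a person's ability (in logits) given fixed item difficulties (in logits)."""
    responses = np.asarray(responses, dtype=float)
    difficulties = np.asarray(difficulties, dtype=float)

    def neg_log_lik(theta):
        p = p_correct(theta, difficulties)
        return -np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))

    # Perfect (all-correct or all-incorrect) response strings have no finite MLE,
    # so the search is bounded to a plausible logit range.
    return minimize_scalar(neg_log_lik, bounds=(-6, 6), method="bounded").x

def infit_mnsq(resp_matrix, thetas, difficulties):
    """Infit mean-square per item: squared residuals weighted by item information."""
    p = p_correct(np.asarray(thetas)[:, None], np.asarray(difficulties)[None, :])
    resid_sq = (np.asarray(resp_matrix) - p) ** 2
    info = p * (1 - p)
    return resid_sq.sum(axis=0) / info.sum(axis=0)

difficulties = np.array([-1.5, -0.5, 0.0, 0.8, 1.6])   # 5 items, easiest to hardest
responses = np.array([[1, 1, 1, 0, 0],
                      [1, 1, 0, 1, 0],
                      [1, 0, 1, 1, 1]])
thetas = [person_measure(r, difficulties) for r in responses]
print("Person measures (logits):", np.round(thetas, 2))
print("Items with infit MnSq >= 1.3:",
      np.where(infit_mnsq(responses, thetas, difficulties) >= 1.3)[0])
```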
RESULTS

Description of the Sample

Three hundred twenty-eight students were enrolled in the class, and 65% were female. Students' average age was 25.8 years (range, 19 to 53 years); their average prepharmacy science and math GPA was 3.4 ± 0.3; and their average maximum composite PCAT score was 86.9 ± 12.3. The majority (69.2%) of the students were white, 19.2% were Asian, and 11.6% were African-American, Hispanic, or other. The largest proportion (45.7%) attended the Gainesville campus; other students attended the Jacksonville (16.2%), Orlando (16.8%), or St. Petersburg (21.3%) campuses. One hundred twenty-four (37.7%) of the students had earned at least a bachelor's degree before admission.

Next, we compared students' demographic characteristics and prerandomization academic performance according to platform presentation order. On average, the students who viewed the 4-panel platform first were older (Table 1). This difference in age was an unavoidable result of the campus-level randomization: students at the Gainesville campus, who were assigned to watch the traditional video first, were younger on average and represented nearly 40% of the College's student body. Students who viewed the 4-panel platform first were also more likely to have earned a bachelor's degree and had a lower average composite PCAT score before admission. However, the groups' average prepharmacy science and math GPA, proportion of out-of-sequence students, race, gender, and average scores on the first 2 examinations of the semester in the class were similar. Demographic data were incomplete for 3 students, so they were excluded.

Table 1. Demographic Characteristics and Academic Performance According to the Platform Delivered First After Randomization (N = 328)

Characteristic                                     Traditional Platform (n = 203)   4-Panel Platform (n = 125)   P
Male (n = 116), No. (%)                            69 (34.0)                        47 (37.6)                    0.59
Female (n = 212), No. (%)                          134 (66.0)                       78 (62.4)
Age, years, mean (SD)                              24.5 (4.3)                       27.8 (6.4)                   <.001
Prepharmacy SMGPA, mean (SD)                       3.4 (0.3)                        3.4 (0.4)                    0.67
Average PCAT composite score, mean (SD)            88.0 (11.1)                      85.18 (14.0)                 0.04
Bachelor's degree or higher (n = 124), No. (%)     66 (32.5)                        58 (46.4)                    0.02
Race, No. (%)
  White (n = 227)                                  141 (69.5)                       86 (68.8)                    0.61
  Asian (n = 63)                                   41 (20.2)                        22 (17.6)
  African-American, Hispanic, Other (n = 38)       21 (10.3)                        17 (13.6)
Out of sequence, No. (%)                           19 (9.4)                         20 (16.0)                    0.10
Score on examination 1, mean (SD)                  80.7 (11.4)                      78.7 (10.3)                  0.13
Score on examination 2, mean (SD)                  78.5 (11.0)                      76.6 (9.8)                   0.12

Abbreviations: SMGPA = science and math grade point average; PCAT = Pharmacy College Admission Test.
P values are from chi-square tests with Yates continuity correction (1 degree of freedom), Pearson chi-square tests, and t tests.

Predictors of Academic Performance on Examination 1 and Examination 2

Female students scored between 4 and 5 points higher than male students. African-American and Hispanic students, and those who indicated their race was "other," scored nearly 7 points lower than Caucasian students, and increasing age was associated with lower scores on examination 1 and examination 2 (Table 2). Examination scores were higher among students with higher science/math GPA and PCAT scores. Students earning a bachelor's degree prior to admission also scored nearly 4 points higher on the 2 examinations. When the variable representing the lecture presentation platform was included in the model, the overall model remained significant (p < .001). However, neither the unstandardized regression coefficient representing the platform variable nor the model's R2 change was significant.

Table 2. Predictors of the Sum of Examination 1 and Examination 2 Scores

                              Step 1               Step 2
Variable                      Beta      P          Beta      P
Female Gender                 -.12      0.02       -.12      0.02
Black                         -.11      0.03       -.11      0.03
Asian                         -.03      0.55       -.03      0.55
Age                           -.25      <.001      -.25      <.001
SMGPA                          .24      <.001       .24      <.001
PCAT                           .23      <.001       .23      <.001
Yes, Bachelor's Degree         .11      0.05        .11      0.05
Out of Sequence                .11      0.04        .11      0.04
Lecture Platform               NA       NA         -.01      0.81
Adjusted R2                   0.19                 0.19
Model F-Ratio                 9.27                 8.65
Degrees of freedom            8, 316               9, 315
Model p value                 <.001                <.001
F-Ratio of Change                                  0.06
Degrees of freedom (change)                        1, 315
p value of change                                  0.81

Abbreviations: P = alpha error (p value); Beta = standardized OLS regression coefficient; SMGPA = science and math grade point average; PCAT = composite score on the Pharmacy College Admission Test.

Unidimensionality and Hierarchy of Examination Items

The items on the third examination showed acceptable unidimensionality, except for the anxiety 2 and schizophrenia 1 items. The standardized Z score for the anxiety 2 item was too high. The anxiety 2 and schizophrenia 1 items were the hardest and easiest questions to answer, respectively. Item presentation order did not significantly influence the 95% confidence interval; the only item outside the 95% CI of the error of the items in the presented order was bipolar 2. When the test items were arranged from easiest at the bottom to most difficult at the top, fewer than 20 students scored below the mean item difficulty. The mean of the students' ability level was approximately 1½ logits above the mean item difficulty, indicating that more than 90% of the students were able to correctly answer test items of average difficulty.

Platform Preference

Students' platform preference was investigated as a plausible explanation for scoring well or poorly on the postrandomization examination items. Nearly 84% (n = 273) of the 325 eligible students responded to the end-of-semester survey. Nearly 75% of these students (n = 201) opined that the 4-panel platform was easy to use.
Just over 51% of the respondents (n = 140) preferred the traditional video platform, 100 (36.6%) preferred the 4-panel presentation platform, and 33 (12.1%) indicated no preference. When asked which feature was most helpful, 58.1% chose the ability to print the lecture narrative, while 25% chose the self-assessment and feedback feature. The remaining 4-panel features accounted for a small proportion of the remaining responses, including the lecture slides (5.9%), key word search feature (3.3%), embedded outline (2.6%), notes (1.8%), and bookmarks (0.7%).

The largest proportion (40.1%) of the students were neutral regarding whether they learned better with one platform than the other; however, 28.3% opined that they learned better with the 4-panel platform and 31.6% stated that they learned better with the traditional video platform. Contrary to conventional wisdom, which holds that students learn better using their preferred platform, students who thought they learned more using the 4-panel platform were more likely to prefer the traditional video platform (51.1%). Those who thought they learned more using the traditional video platform were more likely to prefer the 4-panel platform (62.0%), and those who were neutral about their learning were more likely to be neutral about their platform preference (60.6%; p < 0.001).

The logistic model predicting students' platform preference was not significant. Race was the single significant variable: African-Americans, Hispanics, and students of other races were nearly 3 times more likely to prefer the 4-panel platform than Caucasian students.
The findings were the same for the ordinary least squares regression and are available from the authors upon request.

Comparison of Students' Academic Performance According to Delivery Platform

The average logit scores on the examination items were 4.5 ± 3.3 for the schizophrenia questions and 1.3 ± 1.5 for the bipolar questions. The differences in the logit and raw scores between the 2 presentation formats were small (schizophrenia logit score = 4.6 ± 3.2 vs 4.4 ± 3.4, independent t test, t(323) = 0.53, p = 0.59; bipolar logit score = 1.4 ± 1.5 vs 1.2 ± 1.5, t(323) = 1.06, p = 0.29).

Consistent with the bivariate findings, the delivery platform did not have a significant impact on students' academic performance after controlling for the covariates. Students who received the course content on the 4-panel platform scored < 1 point higher than those who received the content using the traditional video platform for both topics combined. Gender, race, age, prepharmacy science/math GPA, PCAT score, prior bachelor's degree, and whether the student attended one of the distance campuses or the Gainesville campus did not influence academic performance.

Students who did well on the semester's previous 2 examinations scored higher on the questions related to schizophrenia (Table 3). The logit scores on the schizophrenia questions were affected most among students who were "out of sequence," who averaged 1.29 logits lower (95% CI = 0.09, -2.66) than students who were on track within the college's curricular sequence. On the bipolar questions, students with higher PCAT scores performed better (Table 3). Students who preferred the traditional video platform scored lower than students who expressed no platform preference. Students who preferred the 4-panel platform also scored lower than those with no preference, and students with higher previous examination scores scored higher. The same pattern of results was seen for academic performance and platform preference for the schizophrenia course content, but it was not significant.

DISCUSSION

The specific hypotheses of this pilot study were that (1) students would prefer the 4-panel format because of the multiple learning styles involved, and (2) the academic performance of students assigned to the 4-panel format would be better than that of students using the traditional format. Neither hypothesis was supported. This finding was initially surprising because the 4-panel platform was designed to appeal to more learning styles. Upon reflection, we identified several possible explanations. In a study of medical students, approximately 36% preferred receiving course-related information using only 1 learning style when asked to choose among visual (learning from graphs, charts, and flow diagrams), auditory (learning from speech), printed word (learning from reading and writing), or kinesthetic (touch, hearing, smell, taste, and sight) modes; however, nearly 64% preferred using 2 or more learning styles.26 The traditional video platform already combined visual and auditory learning to engage students. The 4-panel platform added kinesthetic learning in the form of questions and feedback, and the printed word in the form of transcriptions of the lecture. Although the 4-panel platform was used to present 2 lectures (1 topic) to all students before the campuses were randomized, students were more familiar with the traditional video platform.
The traditional video format had been used for the entire previous year and during the current semester, with the exception of the pilot lectures. Some students may have had difficulty adapting to less familiar methods of instruction.27 Perhaps it should have been more surprising that nearly 40% of the students preferred the 4-panel platform. Next, central to the principles of "instructional preference" is the notion of how individuals prefer to learn, whether through lectures, individual study, or small-group study. The 4-panel platform included neither small-group study nor live interaction with the instructor. For those who preferred more personal interactions, the 4-panel platform was basically no different from the traditional video platform.

Lastly, the most frequently cited benefit of the 4-panel platform was the word-for-word transcription of the lecture rather than the questions and feedback feature. Conversely, the most frequently voiced objections were that (1) it took longer to complete the lectures because students had to answer the questions and read the feedback, and (2) they could not "speed up" the lectures as they had been able to do with the traditional platform. Although this explanation is largely speculative, the extra time it took students to complete the questions and feedback in the 4-panel format may have been a cost that some felt did not benefit them enough to make it worthwhile. Given the relatively small benefit gained in their examination scores, students' preference may be justified. Future work should look at students' motivations for learning and the concordance of platform features with those motivations. For example, both of the most frequent objections to the 4-panel platform concerned features that increased the amount of time it took students to complete course content.

The second hypothesis predicted that students' academic performance would be better on the material presented using the 4-panel platform. Students' academic performance was not improved by the 4-panel platform. Theoretically, teachers can improve learning by adapting learning tasks and teaching methods to students' preferences.28 The 4-panel platform added a downloadable transcript of the lecture that students could read, in addition to the self-assessment and feedback opportunities. In both cases, the students scored marginally higher with the 4-panel platform, but our findings did not corroborate studies demonstrating that self-assessment questions and feedback improved students' test scores.13 We did not have a direct measure of learning styles; therefore, we assumed that these features added more learning styles to the 4-panel platform. However, while more learning styles may have been incorporated, they may not have aligned with students' preferred learning styles any better than the traditional platform did.

The hypothesis that students' actual performance in the course was related to their preferred content delivery platform was partially supported and, surprisingly, in the direction opposite to our expectations. Students may have a preference for a specific platform, but it may have little influence on performance.29,30 Conventional wisdom suggests that adapting instruction to students' preferred learning styles should improve their academic performance.
However, the literature is mixed, and learning preferences and learning styles are not consistently associated with academic performance.17-19,31 In one study, a significant improvement in performance did not persist after a delay of several months, although learning platform differences were seen immediately after the training.15 One plausible explanation for our finding of no difference is that testing occurred 2 weeks after the lectures were presented. While 2 weeks seems to be a short time, the factors associated with short- and long-term knowledge retention are not well understood.32,33 Contrary to our hypothesis, we found that objective measures of academic performance (ie, examination scores) were diametrically opposed to students' own perceptions of how much they learned.

Students' demographic characteristics were hypothesized to be associated with platform preference and academic performance. In some studies, students' preferences for learning strategies and outcomes were associated with personal and contextual factors, such as academic discipline, prior education, age, and gender,21,22 but not in other studies.15,26,34,35 Demographic characteristics played a small role in our comparison of the 2 platforms, with the exception of race. Both preference variables examined in this study showed that African-American, Hispanic, and other non-Asian minority students were significantly more likely to prefer the 4-panel platform than Caucasian students.

While demographic characteristics (eg, age and gender) have been shown to be predictors of success in pharmacy school, our findings only partially support other studies showing that science/math GPA and past academic performance (eg, examination scores and maintaining adequate progress in the curriculum) may be among the other predictors.3,21,22 We based this observation on the fact that students in the 2 platform groups performed equally well on the first 2 examinations of the semester (< 2% difference on each examination). While many of these variables predicted students' total scores on examination 1 and examination 2, it was performance on these 2 examinations that predicted students' scores on the items testing the postrandomization course material. This finding indicates that these demographic and prepharmacy academic achievement factors did not directly impact postrandomization academic performance between the 2 platforms; rather, their effect was mediated by students' performance in the current semester's course (Figure 1). Moreover, after both the platform and preference variables were added to the model, the coefficients representing the impact of the demographic and preference variables remained nearly identical. It seems, then, that the impact of demographic characteristics and the delivery platform on students' academic performance is independent of their lecture platform preference. In fact, students' perceptions about which platform helped them learn better were opposite to their preference, and those with no preference scored better.

Limitations

These findings should be interpreted and applied only within the context of the study's limitations, which should also be considered in the design of any future work on educational programs. Individual students on the same campus viewed the lectures in the same format. We have no evidence that they were aware the platform differed among campuses for the postrandomization schizophrenia and bipolar lectures.
We think students were successfully desensitized to the experiment by providing washout lectures using the traditional video format before randomizing campuses to platform and by providing them with the same video of the same lecturer in both platforms. Therefore, the primary differences between the 2 platforms were the quizzes, feedback, and printed lecture transcript.

Randomization was not based on individual students but rather on campuses. Therefore, biased findings and misinterpretations may have resulted from the ecological fallacy.36 That is unlikely, however, since 2 campuses randomized into the same intervention group differed in their content delivery platform preferences.

The finding of a nonsignificant difference in academic performance also may be because the intervention was not strong enough or because the study was underpowered. Both platforms had the same lecturer on the screen and the same audio narrative. The additional transcription, embedded questions, and feedback may not have been sufficient to affect students' preference or performance. Another platform with different features may have been a better venue for demonstrating learning differences. In other cases, the findings may have been underpowered because of the small sample size of the comparison groups. For example, the alpha error for students out of sequence (n = 39) was p = 0.07 for step 2 in Table 3, but the magnitude of the regression coefficient revealed that it had the second largest impact on students' performance on the schizophrenia questions. A similar situation occurred with the comparisons of the traditional and 4-panel platforms with those with no preference (n = 33) for the bipolar questions. Even so, this study was a legitimate evaluation of 2 different commercially available platforms, which was our original goal. If the obtained difference in academic performance were replicated over a typical 100-point examination, it would have made < 2% difference in the average student's total score. These findings were especially useful for decision-making purposes: the 4-panel platform was significantly more costly to deliver, and the College continued to use the traditional video format.

We were unable to include more questions in the test of the outcome and unable to psychometrically test the questions embedded in the course's third examination in advance. The variability in the outcome was also limited because 1 of the postrandomization items was answered correctly by almost everyone. The schizophrenia and bipolar questions made up only 16% of the third examination because the rest of the topics presented during the semester (eg, sleep products, stimulants) were tested in the same examination. In addition, 10% of the items tested the depression content of the first 4-panel lecture and the anxiety content within the washout video. Therefore, nearly three-fourths of the examination questions were unrelated to the trial. While not optimal, the conditions in this study showed that a legitimate randomized trial is possible in academic and scholarly research.

While this study has limitations, it has important strengths in comparison with most pharmacy educational interventions. First, it used a crossover design in which students served as their own controls, significantly increasing the power of the study.
Campuses were randomized, and students and researchers were unaware of the assignment until after the course concluded. The 2 platforms and test items also were delivered within the everyday educational process of the college, rather than presenting the students with a special test of the content, as is the case with most studies of educational interventions. Finally, all of the students were initially introduced to the learning platform at the same time, before the measured questions. The original format was also used for the washout lectures so that it might wash out the aftereffects of any platform-specific learning difficulties from the first exposure to the new platform. The washout strategy appeared to work because the learning effect sizes were approximately equal regardless of the order in which the 4-panel platform was presented.

CONCLUSIONS

Pharmacy students' academic performance was not significantly impacted by whether lectures were delivered using a traditional or 4-panel learning platform. The extra faculty time, effort, and cost put into presenting the class material in a 4-panel platform, and the students' extra time and effort required to review the material, did not produce a comparable benefit in student preference and performance. Factors other than the platform technology were more important in influencing students' preference and academic performance. Students' perceptions were in opposition to the conventional wisdom; namely, students who preferred the traditional video format thought they learned better with the 4-panel platform, and those who preferred the 4-panel platform thought they learned better with the traditional video format.

ACKNOWLEDGEMENTS

This paper was presented at the 109th Annual Meeting of the American Association of Colleges of Pharmacy (AACP), July 2008, Chicago, IL. The authors would like to acknowledge Kristin Weitzel, PharmD. At the time of this research, Dr. Weitzel was a faculty member at the University of Florida College of Pharmacy and course coordinator for the pharmacotherapy course in which these platforms were tested. We would also like to thank Markus Maxim for his assistance with collating and cleaning the data and for conducting some of the preliminary data analyses. Mr. Maxim was an exchange student from the Fachbereich Pharmazie, University of Düsseldorf, Düsseldorf, Germany.

REFERENCES

1. Department of Health and Human Services, Health Resources and Services Administration, Bureau of Health Professions. The Pharmacist Workforce: A Study of the Supply and Demand for Pharmacists. Washington, DC; 2000. ftp://ftp.hrsa.gov/bhpr/nationalcenter/pharmacy/pharmstudy.pdf. Accessed July 7, 2009.
2. Lee HW. Enhancing education through technology. J Am Acad Bus. 2005;7(1):327-30.
3. Ried LD, McKenzie M. A preliminary report on the academic performance of pharmacy students in a distance education program. Am J Pharm Educ. 2004;68(3):Article 65.
4. Janda MS, Botticelli TA, Mattheos N, Nebel A, Wagner D, Nattestad A, Attström RA. Computer-mediated instructional video: a randomized controlled trial comparing a sequential and a segmented instructional video in surgical hand wash. Eur J Dent Educ. 2005;9(2):53-9.
5. Lovelace MK. Meta-analysis of experimental research based on the Dunn and Dunn model. J Educ Res. 2005;98(3):176-83.
6. Cook DA, Gelula MH, Dupras DM, Schwartz A. Instructional methods and cognitive and learning styles in web-based learning: report of two randomised trials. Med Educ. 2007;41(9):897-905.
7. Lujan HL, DiCarlo SE. First-year medical students prefer multiple learning styles. Adv Physiol Educ. 2006;30(1):13-6.
8. Cook DA, Dupras DM. A practical guide to developing effective Web-based learning. J Gen Intern Med. 2004;19(6):698-707.
9. DeMuth JE, Bruskiewitz RH. A comparison of the acceptability and effectiveness of two methods of distance education: CD-ROM and audio teleconferencing. Am J Pharm Educ. 2006;70(1):Article 11.
10. Clark R. Confounding in educational computing research. J Educ Comput Res. 1985;1(2):28-42.
11. Keane D, Norman G, Vickers J. The inadequacy of recent research on computerized assisted instruction. Acad Med. 1991;66(8):444-8.
12. Cook DA. The research we still are not doing: an agenda for the study of computer-based learning. Acad Med. 2005;80(6):541-8.
13. Cook DA, Thompson WG, Thomas KG, Thomas MR, Pankratz VS. Impact of self-assessment questions and learning styles in web-based learning: a randomised, controlled, crossover trial. Acad Med. 2006;81(3):231-8.
14. Brown G, Manogue M. AMEE Medical Education Guide No. 22: refreshing lecturing: a guide for lecturers. Med Teach. 2001;23(3):231-44.
15. Kennedy GE. Promoting cognition in multimedia interactive research. J Interact Learn Res. 2004;15(1):43-61.
16. Swagerty D Jr, Studenski S, Laird R, Rigler S. A case-oriented web-based curriculum in geriatrics for third-year medical students. J Am Geriatr Soc. 2000;48(11):1507-12.
17. Maleck M, Fischer MR, Kammer B, Zeiler C, Mangel E, Schenk F, Pfeifer KJ. Do computers teach better? A media comparison study for case-based teaching in radiology. Radiographics. 2001;21(4):1025-31.
18. Hudson JN. Computer-aided learning in the real world of medical education: does the quality of interaction with the computer affect student learning? Med Educ. 2004;38(8):887-95.
19. Devitt P, Palmer E. Computer-aided learning: an overvalued educational resource? Med Educ. 1999;33(2):136-9.
20. Hambleton RK, Jones RW. Comparison of classical test theory and item response theory. Educ Meas Issues Pract. 1993;12(2):38-47.
21. Chisholm MA, Cobb HH, Kotzan JA. Significant factors for predicting academic success of first-year pharmacy students. Am J Pharm Educ. 1995;59(4):364-70.
22. Jacoby KE, Plaxco WB, Kjerulff K, Weinert AB. Use of demographic and background variables as predictors of success in pharmacy school. Am J Pharm Educ. 1978;42(1):4-7.
23. Rasch G. Probabilistic Models for Some Intelligence and Attainment Tests. Chicago: MESA Press; 1980.
24. Bond TG, Fox CM. Applying the Rasch Model: Fundamental Measurement in the Human Sciences. Mahwah, NJ: Lawrence Erlbaum Associates; 2001.
25. Linacre JM. What do infit and outfit, mean-square and standardized mean? Rasch Measurement Transactions. 2002;16(2):878.
26. Alkhasawneh IM, Mrayyan MT, Docherty C, Alashram S, Yousef HY. Problem-based learning (PBL): assessing students' learning preferences using VARK. Nurse Educ Today. 2007;28(5):572-9.
27. Novak S, Shah S, Wilson JP, Lawson KA, Salzman RD. Pharmacy students' learning styles before and after a problem-based learning experience. Am J Pharm Educ. 2006;70(4):Article 74.
28. Henke H. Learning Theory: Applying Kolb's Learning Style Inventory with Computer-based Training. 9-10. http://www.chartula.com/learningtheory.pdf. Accessed March 18, 2009.
29. Rassool GH, Rawaf S. The influence of learning style preferences of undergraduate nursing students on educational outcomes in substance use education. Nurse Educ Pract. 2008;8(5):306-14.
30. Schwartz LR, Fernandez R, Kouyoumjian SR, Jones KA, Compton S. A randomized comparison trial of case-based learning versus human patient simulation in medical student education. Acad Emerg Med. 2007;14(2):130-7.
31. Cook DA. Learning and cognitive styles in Web-based learning: theory, evidence and application. Acad Med. 2005;80(3):266-78.
32. Bell DS, Fonarow GC, Hays RD, Mangione CM. Self-study from Web-based and printed guideline materials: a randomized, controlled trial among resident physicians. Ann Intern Med. 2000;132(12):938-46.
33. Nadir JP, Adla T, Janda A, Feberova J, Kasal P, Hladikova M. Long-term retention of knowledge after a distance course in medical informatics at Charles University in Prague. Teach Learn Med. 2004;16(3):255-9.
34. Slater JA, Lujan HL, DiCarlo SE. Does gender influence learning style preferences of first-year medical students? Adv Physiol Educ. 2007;31(4):336-42.
35. Barakzai MD, Fraser D. The effect of demographic variables on achievement in and satisfaction with online coursework. J Nurs Educ. 2005;44(8):373-80.
36. Piantadosi S, Byar DP, Green SB. The ecological fallacy. Am J Epidemiol. 1988;127(5):893-904.