Distance Education, Vol. 31, No. 2, August 2010, 175–192. ISSN 0158-7919 (print)/ISSN 1475-0198 (online). © 2010 Open and Distance Learning Association of Australia, Inc. DOI: 10.1080/01587919.2010.503343 http://www.informaworld.com

Developing a mobile learning solution for health and social care practice

J.D. Taylora*, C.A. Dearnleyb, J.C. Laxtonc, C.A. Coatesd, T. Treasure-Jonese, R. Campbellf, and I. Hallg

aOffice of the Pro Vice Chancellor Assessment Learning and Teaching, Leeds Metropolitan University, Leeds, UK; bSchool of Health Studies, University of Bradford, Bradford, UK; cFaculty of Medicine and Health, University of Leeds, Leeds, UK; dFaculty of Health Placement Unit, Leeds Metropolitan University, Leeds, UK; eSchool of Medicine, University of Leeds, Leeds, UK; fEcommnet Ltd, Newcastle, UK; gMyKnowledgeMap, York, UK

*Corresponding author. Email: J.D.Taylor@leedsmet.ac.uk

(Received 4 January 2010; final version received 26 June 2010)

In this article we share our experiences of a large-scale, five-year innovative programme to introduce mobile learning into health and social care (H&SC) practice placement learning and assessment that bridges the divide between the university classroom and the practice setting in which these students learn. The outputs are from the Assessment & Learning in Practice Settings (ALPS) Centre for Excellence in Teaching & Learning (CETL), which is working towards a framework of interprofessional assessment of Common Competences in the H&SC professions. The mobile assessment process and tools that have been developed and implemented and the outcomes of the first-stage evaluation of the mobile assessment tools are discussed from the student perspective.

Keywords: mobile learning; mobile assessment; common competencies; shared services; practice placement learning; health and social care students; work-based learning

Introduction

Practice-based education is a core element of all health and social care (H&SC) professional programmes and is an essential component for registration to practice. Professional practice is based on a 'network of knowledge' acquired as a result of experiences in practice (Moon, 1999, p. 53). Current government policy promotes the implementation of interprofessional education (IPE) (Craddock, O'Halloran, Borthwick, & McPherson, 2006), and consequently current education and training for H&SC students has to encompass elements of interprofessional learning. Fundamental to the care of service users within modern H&SC practice is for all professionals to have a high level of professional competence in communication, teamwork, and ethical practice. Assessment of these common competences is increasingly a key focus in many IPE programmes (Simmons, Wagner, AJeCeries, & Reeves, 2010).

The ALPS (Assessment & Learning in Practice Settings) programme involves five UK universities (the universities of Bradford, Huddersfield, Leeds, Leeds Met, and York St John) and its National Health Service (NHS) partners, comprising the Educational Commissioning, Yorkshire and Humberside Strategic Health Authority, clinical networks, professional and statutory regulatory bodies, and 16 H&SC professional groups.
The ALPS programme is using mobile devices to deliver learning resources and assessments to enrich, enhance, and extend practice learning. A key aim of the ALPS programme is to assist students to feel confident, as well as competent, at the start of their professional careers. The ALPS approach to improving the confidence and competence of students is to build on the Boud (2000) theory of sustainable assessment. Students are encouraged to take feedback from a variety of sources, reflect on that feedback, and deduce further action to improve performance. This student activity is predominantly reflection 'on' and 'in' action (Schön, 1995) and enhances the richness and quality of the students' reflections, thus helping them in developing lifelong learning skills. ALPS aimed to achieve this by using interprofessional learning opportunities, common interprofessional assessment tools (to assess the Common Competences of Communication, Teamwork, and Ethical Practice), and a mobile learning (m-learning) solution using mobile devices: phone-sized hand-held computers, the HTC Vario I and Vario II.

The potential of mobile devices for learning and assessment is of increasing interest to the H&SC professions (Sandars & Pellow, 2006) and the wider higher education community (Anderson & Blackwood, 2004). The personal nature of m-learning and the interactivity of this mode have been found to encourage learner engagement (Savill-Smith, Attewell, & Stead, 2006). Key features of m-learning are that it provides 'just enough, just for me, just in time' learning (Nycz & Dragon, 2005), typically situated (in the workplace or field) and contextualised through interaction with tutor, mentor, or peers. Authentic learning environments in higher education typically involve these characteristics (Herrington & Herrington, 2006). Studies in health settings have been undertaken mainly in the USA and Australia and have focused on nurses (Miller et al., 2005), paramedics (Norman, 2005), and doctors (Scheck McAlearney, Schweikhart, & Medow, 2004). These studies investigated the use of mobile devices to provide access to information and assessed the usability of the devices for health care professionals. Walton, Child, and Blenkinsopp (2005) explored the perceptions of health care students regarding the use of mobile devices in the form of personal digital assistants (PDAs) in the community. They have been found to be an effective resource for students, especially for reference materials (Miller et al., 2005). Very few studies have assessed the effectiveness of PDAs for practice assessment, and those that have been reported have mainly involved self-assessment activities with medics (Bent, Bolsin, Creati, Patrick, & Colson, 2002; Engum, 2003) and nurses (Kneebone, Nestle, Ratasothy, Kidd, & Darzi, 2003; Koeniger-Donohue, 2008).

A number of early m-learning pilots involving the use of mobile devices were conducted by ALPS to assess the readiness of the partner institutions to adopt mobile technologies for H&SC practice placement learning and assessment. These pilot studies demonstrated that the benefits of m-learning for students were improved lecturer and peer support, better access to information and resources, and the ability to record and reflect on their clinical experiences in real time (Dearnley et al., 2009; Haigh, Dearnley, & Meddings, 2007; Parks & Dransfield, 2006; Taylor, Coates, Eastburn, & Ellis, 2006, 2007).
Where mobile devices were used for assessment, students valued the increased student-centredness of the process (Sandars & Dearnley, 2009). ALPS subsequently commissioned the development and implementation of a tailor-made, innovative m-learning architecture designed to support the assessment of the ALPS Common Competences, to include mobile delivery of common interprofessional assessment tools and to provide any time, anyplace access to the assessments, learning materials, and tutor support. This article presents the findings from a first-stage evaluation of the ALPS mobile assessment processes using the ALPS mobile architecture, from the student perspective.

Research question

The research question investigated in this study was: how do students perceive the impact of ALPS mobile assessment processes upon their learning and assessments in practice settings?

Materials and methods

The evaluation of the mobile assessment processes from the student perspective is an ongoing two-stage evaluation. Stage one is now complete and is reported on here. This first-stage evaluation involved a qualitative investigation conducted using a mixed-methods approach. Cohort-specific focus groups were undertaken with eight of the ALPS professional groups: dietetics; midwives; dental hygiene and therapy; speech and language therapy; occupational therapy; audiology; social work; and child-branch nursing. In addition to the focus groups, students were invited to complete online diaries/blogs. It was anticipated that they could do this on their mobile devices and then upload them as e-portfolio entries or email them directly to the researchers. Few students engaged in this activity, despite financial rewards (book tokens) being offered. This is interesting in itself and perhaps reflects the students' heavy workloads. However, a few students did send diary notes to us and these were informative and contributed to the thematic analysis. A final schedule of questions for the focus groups was developed using an iterative process of peer review by a group of researchers from the five partner institutions. The ALPS research officer attended all focus groups to ensure consistency across partner sites, and the recordings of the discussions were transcribed verbatim. Each partner site undertook independent thematic analysis of focus group sessions prior to an analysis workshop at which outcomes were shared and discussed, and overall key outcomes agreed.

Ethics

The investigation of students' perceptions of the introduction of these devices for assessment purposes was considered to be a curriculum and teaching development project that did not require ethical review by an NHS Research Ethics Committee or approval from the NHS Research and Development Office. Ethical approval was sought from partner university ethics committees. This process was made easier by an agreement early in the collaboration that the ethics committee of each university would recognise the decisions of partner institutions. Thus, collaborative research was initially scrutinised by the ethics committee of one university, and the outcomes relayed to the committees of partners. Informed consent was obtained from those taking part; students had the right to withdraw from the study, and their anonymity was respected at all times. Each student was issued with an ID-type card to carry, certifying that their device had been issued as a learning and assessment tool by their university.
Patient confidentiality was secured by requiring students to sign a 'Contract of Use' drawn up jointly by the five ALPS partner universities with legal support. The contract also had a number of governance purposes, including responsible ownership of the device, where to go for help, and fair usage of the device, particularly advice on appropriate use of the camera function in line with NHS policy and guidelines. In reality, many H&SC workers own mobile devices with such capabilities and in some cases this issue had not been addressed. Indeed, the work of the ALPS CETL (Centre for Excellence in Teaching & Learning) acted as a catalyst in these instances for Trust mobile device policies to be developed.

Participants

ALPS is a large-scale implementation with over 900 users across the partnership, who have been supplied with HTC Vario smartphones and unlimited free data connectivity. A convenience sample of 79 students from eight of the ALPS professional groups was recruited from across the programme to take part in the first-stage evaluation of students' experiences of the ALPS mobile learning and assessment processes.

Instruments

The ALPS common assessment tools

A centrepiece of the work of ALPS has been a Common Competency Mapping exercise involving all of the 16 professional groups. This exercise led to the production of three Common Competency Maps on the topics of teamwork, communication, and ethical practice. The interactive Communication Map is illustrated in Figure 1.

Figure 1. The ALPS Communication Map. The map is divided into clusters, dimension statements, elements, and performance criteria.

The Common Competency Maps define the skills and standards that students must reach to be assessed as competent in the respective areas (Holt et al., 2010). The maps were used to generate an assessment toolkit that was used to assess the Common Competences. So far five assessment tools have been developed and used in the practice assessment scenarios:

● Gaining consent
● Providing information to a colleague
● Knowing when to consult or refer
● Demonstrating respect for service user during an interaction
● Working interprofessionally.

Each tool enables students to collect 360-degree feedback from a range of participants in the assessment scenario, including the practice educator (from their own or a different profession), peers (from their own or a different profession), service users (and carers), and self. The ALPS common assessment tools were developed using agreed best practice from the different professions involved in the ALPS programme. For example, social work students already gain feedback from service users at some time during their practice placement experience. This concept of service user feedback is also being considered in the nursing profession (Speers, 2008) and is rapidly gaining acceptance across other professions, supported to some extent by the work of ALPS.

The self-assessment section of each tool consists of a series of open-response questions, whereas the remaining sections have a mixture of open-response and multiple-choice question types. There is also a section in each tool for students to reflect on the feedback they have been given and to develop an action plan that can be signed off by a practice educator, thus ensuring that feed forward is a key outcome of the process. The notions of feed up, feedback, and feed forward embodied in the ALPS assessment tools are recognised as important strategies for effective feedback to occur. They enable the student to gain a better understanding of the performance requirements to achieve the learning goal (Hattie & Timperley, 2007).
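To make the shape of these tools concrete, the sketch below models one possible representation of an ALPS-style assessment: a set of feedback sections (self, practice educator, peer, service user) holding open-response or multiple-choice questions, together with a reflection and action plan that a practice educator signs off. The class names, fields, and example questions are illustrative assumptions for this sketch only; they are not the actual ALPS data schema.

```python
# A minimal, illustrative data model for an ALPS-style mobile assessment tool.
# Section names, fields, and the sign-off flag are assumptions for the sketch.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Question:
    prompt: str
    kind: str                                         # "open" or "multiple_choice"
    options: List[str] = field(default_factory=list)  # used only for multiple choice
    text_response: Optional[str] = None
    audio_clip: Optional[str] = None                  # path to a voice recording, if used


@dataclass
class FeedbackSection:
    source: str                    # "self", "practice_educator", "peer", "service_user"
    questions: List[Question]


@dataclass
class ActionPlan:
    reflection_on_feedback: str = ""
    planned_actions: List[str] = field(default_factory=list)
    signed_off_by_practice_educator: bool = False     # records the feed-forward step


@dataclass
class AssessmentTool:
    title: str                     # e.g. "Gaining consent"
    competency_map: str            # "Communication", "Teamwork" or "Ethical Practice"
    sections: List[FeedbackSection]
    action_plan: ActionPlan = field(default_factory=ActionPlan)


# Example: the 360-degree structure described above. The self-assessment section
# holds only open-response questions; the other sources mix question types.
gaining_consent = AssessmentTool(
    title="Gaining consent",
    competency_map="Ethical Practice",
    sections=[
        FeedbackSection("self", [
            Question("How did you check the service user understood?", "open"),
        ]),
        FeedbackSection("practice_educator", [
            Question("Consent was obtained appropriately", "multiple_choice",
                     options=["Strongly agree", "Agree", "Disagree"]),
            Question("What could the student improve?", "open"),
        ]),
    ],
)
```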
The ALPS common assessment tools have been used in paper-based, electronic, and mobile formats. In the mobile format used in this study, each question is displayed on a different page to take account of the screen size of the mobile devices. The mobile format takes advantage of the affordances of the devices and allows evidence of competence to be captured as text, supported by the predictive text and spell-check functions, and in an audio format using the voice recorder. Figure 2 shows a question page from the 'Gaining consent' ALPS Mobile Assessment Tool with an example of an open-ended question and the ability to record the response in both text and audio formats.

Figure 2. Question page from the 'Gaining consent' ALPS Mobile Assessment Tool illustrating an example of an open-ended question and the ability to record the response in text or audio formats.

The ALPS assessment cycle and mobile assessment architecture

Work-based learning in H&SC is built around a tripartite relationship between the learner, the workplace, and the university. The systems architecture was developed by ALPS to reflect this dynamic relationship and support the ALPS pedagogic processes. The ALPS mobile assessment cycle is depicted in Figure 3.

Figure 3. The ALPS assessment cycle.

The ALPS mobile architecture (Figure 4) links together software to create, package, and securely push out the assessments and learning materials to the students' devices.

Figure 4. The ALPS assessment suite architecture.

The ALPS mobile assessment client on the devices enables students to view, complete, and save their completed assessments onto the device and then upload them to their e-portfolio (Figure 5). Here they can review their completed assessments and any additional feedback posted by their university tutor. The e-portfolio also has the facility for students to keep a blog of their experiences. Students can also upload photographic images, captured using the camera function of the mobile device when permitted within the constraints of health care settings, to supplement their blogs.

Figure 5. The ALPS e-portfolio student view page.

The e-portfolio has functionality to alert the tutor, when logged in, to any new assessments that have been uploaded to it by the students (Figure 6). The tutor can match the student's performance on the assessment with the relevant skills defined in the ALPS common competency frameworks (Figure 7).

Figure 6. ALPS e-portfolio tutor view page.

Figure 7. ALPS e-portfolio competency framework page.

The system synchronises every few hours; if synchronisation fails because no connection is available, it will keep retrying until a connection is made. This, combined with advice to the students to leave their mobile devices turned on and charged up, ensures that assessments are delivered promptly to the students' devices even when they have been out of network coverage for a few hours, or students have had their device switched off. In practice, this means that students are still able to complete assessments whether they are in a ward, in a lead-lined room taking an X-ray, in a city centre, or out in a rural community location.

Students have to log into the device using a username and password. The system encrypts the data that is stored on the device and storage card at login, during transmission, and following upload of an assessment to the e-portfolio. There is provision for central device management so that a device can be disabled and any data wiped if it is lost or stolen. This addresses the concerns raised by most health care settings about data security and patient confidentiality.
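The following sketch illustrates the kind of store-and-forward behaviour described above: a completed assessment is saved to an on-device outbox and upload to the e-portfolio is retried on a timer until a connection is available, with items removed from the queue only after a confirmed upload. All function and constant names (save_completed_assessment, upload_to_eportfolio, has_connection, SYNC_INTERVAL_SECONDS) and the placeholder encrypt routine are assumptions made for illustration; the actual ALPS client and its synchronisation service are not reproduced here.

```python
# A minimal sketch (not the ALPS implementation) of store-and-forward delivery:
# completed assessments are queued on the device and upload is retried
# periodically until a network connection is available.
import json
import time
from pathlib import Path

OUTBOX = Path("outbox")                 # queue of completed assessments on the device
SYNC_INTERVAL_SECONDS = 4 * 60 * 60     # "synchronises every few hours"


def encrypt(data: bytes) -> bytes:
    """Placeholder: a real client would use the platform's secure storage/encryption."""
    return data


def has_connection() -> bool:
    """Placeholder connectivity check (e.g. reachability of the e-portfolio server)."""
    return False


def upload_to_eportfolio(payload: bytes) -> bool:
    """Placeholder for the secure upload; returns True only on confirmed receipt."""
    return False


def save_completed_assessment(assessment: dict) -> Path:
    """Called when the student saves a finished assessment on the device."""
    OUTBOX.mkdir(exist_ok=True)
    path = OUTBOX / f"{assessment['tool']}_{int(time.time())}.bin"
    path.write_bytes(encrypt(json.dumps(assessment).encode("utf-8")))
    return path


def sync_once() -> None:
    """Try to push every queued assessment; anything that fails stays queued."""
    if not has_connection():
        return
    for item in sorted(OUTBOX.glob("*.bin")):
        if upload_to_eportfolio(item.read_bytes()):
            item.unlink()               # remove only after a confirmed upload


def sync_forever() -> None:
    """Persist until a connection is made, so assessments arrive promptly."""
    while True:
        sync_once()
        time.sleep(SYNC_INTERVAL_SECONDS)
```

The design point is that work completed while disconnected (in a lead-lined room or on a rural placement, for example) is never discarded; it simply waits in the queue until the next successful synchronisation.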
Training and support

All students received standard training on how to use the device and the assessment software in a face-to-face, classroom-based session. As the purpose of ALPS is ultimately to embed the programme at an institutional level, the session was delivered locally by partner site representatives assisted by core team members. It was recognised in the design that the training programme had to be a genuine combination of information technology (IT) and pedagogy to prevent the focus becoming the technology rather than the learning experience (ALPS CETL, 2010).

All stakeholders from across the partnership had access, via the ALPS website, to a suite of online training tools, including videos and written documentation, which could be shared with practice assessors. A selection of briefing documents for practice assessors was also developed and distributed via university link tutors and practice learning facilitators, and all assessor training and update programmes included information about the ALPS processes. All practice managers were informed about the work of the CETL, and posters were prepared and sent to appropriate practice settings for display in clinical areas used by students during their placement. A shared ALPS helpdesk based in Learner Support Services at the University of Bradford provided support and advice to all users across the whole of the ALPS programme.

Results

Generally speaking, the results can be divided into two sets: those concerned with the hardware of the mobile device and those concerned with the assessments that were delivered via the devices. Key themes within the hardware category were 'becoming familiar with the devices as learning tools,' 'device functionality,' and 'training.' Key themes within the assessment category were 'the challenge of gaining service user and/or carer feedback' and 'assessment for lifelong learning.' A smaller, additional category emerged which was related to culture, client group, and context. These categories and themes will now be discussed.

Mobile devices

Becoming familiar with the devices as learning tools

Many students reported in the focus groups that there was a considerable commitment of time required to become familiar with the devices and, if they could not see an added value compared with that of their existing practice, they would not persevere. There were two notable situations, however, where students found the devices particularly valuable to their learning.
One was where they were working in community settings and had no computer access; for example, social work students working in sheltered accommodation, or occupational therapy students on work placement away from home and university and living in rented accommodation. In these cases the devices provided valuable links to social or academic networks. The other was students with dyslexia. For these students the inherent benefit of pocket spell-checking facilities and audio recording was sufficient to motivate them to learn how to use the technology (Dearnley & Walker, 2009). As one student said:

I'm dyslexic, so I record everything on it then I can type it up later on. Or if I'm having a thought and I'm stood in the kitchen and I haven't got a piece of paper to hand I'll get my Dictaphone and record it to a Dictaphone and then transcribe it later on, but it has taken me a while to get used to my own voice because I don't like it.

Clearly the role of mobile devices in learning goes beyond the functions of a mobile phone, but not all students realised this. For example, one student reported that the lack of phone access on this device was a barrier to engagement:

I think when you get your own mobile phone though you get really excited about it and you want to find out about it … at the end of the day, when you have a device that you are going to use all the time and you might rely on it every single hour of the day to make phone calls or to receive texts from people, you're going to want to learn how to use it. But this it's like – well what's it going to offer me if I can't phone?

Because this student did not recognise the learning opportunities of a mobile device, taking time to become familiar with its functions was not a priority in her busy schedule. As more students and staff start to engage with mobile devices that offer more than phone and text functions, it is likely that the opportunities they offer for learning will become more recognisable.

Device functionality

Issues relating to the functionality of the devices were reported by most groups. These included the small size of the screen (audiology) and the poor quality of the camera (midwives). Some students reported that the password protection made it difficult to engage practice educators; that is, they couldn't leave the device with them to complete the assessment documents at a later date. However, from an educational perspective, this is a positive aspect of mobile assessment: students have to maintain control over the process and be present when the assessor gives them feedback, and are therefore potentially engaged in the feedback dialogue. This supports the findings of the ALPS m-learning pilot studies.

Although students reported that the device functionality was 'good for reflecting' and 'good for quick notes,' some of the other functionality issues had a less favourable impact on student learning and assessment. A common problem was that the Internet access slowed down after essential security software was added to the device. As one student said:

The programmes are good – like Word – it's the connection to the Internet that's slow.

Another student reported: 'My device wouldn't sync.' She obtained help from the helpdesk but the device then froze when she tried to open documents, so she gave up. This was a familiar tale and relates to the earlier theme of students becoming familiar with the devices and learning how to use them.
If they encountered technical difficulties, their enthusiasm for the devices soon wavered. Although they would approach the helpdesk initially, they would generally give up after encountering a couple of problems. Finally, a social work student made the following comment about functionality, which may relate to how she had configured her device:

A big (for me) shortcoming of the calendar function – it deletes appointments etc a week after the date has passed. I often look back through my diary to check what I've done, what date I did something, etc. I think this may be enough to send me back to a paper diary.

The range of functionality issues reported here demonstrates some of the challenges inherent in a large-scale mobile learning and assessment initiative. These go beyond issues of software application and student and staff engagement, to issues related to particular devices and particular student needs. They involve technical support for student engagement and staff training.

Training

There was a need to ensure that students, their lecturers, and their practice assessors fully understood the potential uses of a mobile device for learning in practice settings. Student training was extensive, as discussed above. However, it became clear during the evaluation that some students had not used their devices to full capacity because they had not learned how to really use them. An example of this was the audio function. Dyslexic students in the MEDS project1 found this invaluable, as they were able to capture their thoughts quickly without having to worry about writing them down in an acceptable way. They reported being able to respond to the assessment questions and reflect in practice, which they would not otherwise have been able to do (Dearnley, Walker, & Fairhall, 2010). Arguably this should be true for all students, yet few seemed to harness this learning opportunity, and the reason they reported for not doing so was a lack of knowledge or understanding of how to do it (despite it being an integral part of the training package). As several students said that they didn't like hearing their own voices, it is likely that using the mobile audio function to full capacity for enhancing reflection will take time to gain acceptance. This also demonstrates the need for those facilitating mobile learning to be mindful of the wider capabilities of the devices for supporting learning.

Overall, then, the key findings related to student use of mobile devices for practice-based learning and assessment were that, for students to fully engage with mobile devices and take the time to learn how to use them, the devices must be seen to offer significant benefits over alternative tools; that cognisance should be taken of all the inherent learning benefits within the device, such as the audio functions and spell-check, rather than focusing on specific tasks; and that a robust technical support system must be in place.

The mobile assessment processes

The challenge of gaining service user and/or carer feedback

In many of the ALPS professions, gaining feedback on performance from service users (i.e., patients/clients) and/or their carers was a new process. For others, such as social work students, this was accepted practice; our findings reflect this variance.
Students who were new to gaining service user feedback tended to focus their discussion more on the process and expressed concerns about the reliability and validity of the feedback obtained, for example: 'It won't be honest' or 'They'll be worried that the service they receive might be effected if they, if they're not complimentary about you.' Others, however, related their concerns to the use of mobile devices to record the feedback, as one social work student explained:

I think to get proper service user feedback it has to be anonymous really … or at least have, if it's not completely anonymous, at least have somebody else other than yourself getting it.

She went on to say:

I couldn't give them that (the device) and expect them to use it, it's easier to give them a paper form, a paper copy and then they sent it back to me, when, when one of their carers could help them fill it in [i.e., their usual practice].

Other professions, however, liked the assessment processes and found no problems in getting service users to complete them, as these midwives said:

I thought they were good, I thought they were easy use, for your women to fill in.

Yeah I think probably the most useful was feedback from the women that you look after.

Clearly there were professional differences here, based on culture, history, and client group. Although the realities of service user and/or carer assessment of practice per se continue to be debated by the professions, it is clear that an informed judgement will always be required in relation to the suitability of the circumstances, and whether or not it is appropriate for feedback to be obtained on a mobile device will be part of that equation.

Assessment for lifelong learning

The key element within the ALPS mobile assessment processes is the capacity for students to obtain formative feedback on specific areas of practice regularly throughout their placement. This is in addition to the more general feedback, and in some cases grades, which they obtain at the end of their placement. Students recognised this strength, and how it supported their learning. As one said:

But this does go towards our course because we've got lifelong learning to think about.

Mobile assessment seems to support lifelong learning because it allows access any time, anyplace and allows regular engagement with learning and assessment processes in short bursts. Students reported preferring the devices to carrying a lot of pieces of paper and found this encouraged them in their work; they also liked getting feedback on their practice assessment from their university-based lecturers:

You see it's nice getting the comments from [Tutor] because I don't get to see [Tutor] a lot in clinic … and I can see if we'd been doing that from the start that would have been good because he'd have been able to see how I'd progressed.

Being reflective is fundamental to lifelong learning. Helping students to develop the skills of reflection is not easy. It is not unusual for students to pay lip service to reflection, while concentrating on 'assignments' that will be 'summatively assessed' and will get them the grades. 'Becoming reflective' is like learning to drive: you have to learn the rules before it becomes buried as tacit knowledge and results in spontaneous actions (Dearnley & Matthew, 2007). There was some indication that the mobile assessments, in generating regular reflection on/in practice through structured self-assessment, were indeed beginning to embed the 'rules' of reflection.
As one student said:

It takes me a long time to get my head round when I could do these things, but once I've got into the habit of doing them then I do them all the time.

Generally, students in this study liked the reflective nature of the formative assessments. Again, however, there were some differences of opinion across professions based on previous experiences and expectations.

Culture, client group, and context

Interestingly, there appeared to be clear differences in acceptability and usage across the professions, which could be accounted for by differences in the culture, client group, and context in which they operated. For example, audiologists, who regularly use technology in their practice, reported widespread use, as opposed to some social work students who felt that the technology created a barrier between them and their client group. So we found that students fell into two extreme categories: those who used the mobile device a lot and those who didn't use it at all.

Discussion

A number of practical, logistic, and educational issues influenced our decision to 'go mobile' and issue ALPS mobile devices to students in order to access and complete the ALPS assessment tools. These issues also helped to define the design and specification of the ALPS mobile architecture. There was no guarantee that students had access to PCs, laptops, or even a login to a wireless network in the placement setting, which would facilitate use of the web version of the assessment tools. NHS Trusts often employ firewalls that make online access difficult. Also, the portability of the mobile device lends itself to delivery of the ALPS assessments to students who are on the move, which is especially useful in community settings. It was considered high risk (at the time) to rely on using students' own mobile phones because, while most students had mobile phones, the capabilities of the phones varied greatly (Sandars & Pellow, 2006). Another consideration in the design of the project was that some ALPS students worked in locations with limited or no connectivity; for example, social work students working in certain remote rural areas or radiography students working in lead-lined rooms. This meant the system needed to work offline, or at least in a sometimes disconnected environment, which is a feature of the ALPS architecture. Healthcare Trusts were concerned about data security, so there needed to be a fit with their IT and security policies, which the system provides for by ensuring data is encrypted and that access to the device is via a secure login. Tutors needed to feel confident that any feedback gathered by students was genuine and trustworthy, so the ability for practice educators to sign off the assessments on the device allayed their concerns.

Karadeniz (2009) showed that, although there was no significant difference between the achievement levels of students who took paper-, web-, and mobile-based assessments, students had positive perceptions of web- and mobile-based tests, compared with paper, due to their ease of use and comprehensive, instant feedback. The aim of this study was to gain insights into how the mobile delivery of the ALPS assessment and learning processes can help H&SC students to gain maximum benefit from the interprofessional learning opportunities on offer to them while on practice placements.
Our findings show that there is still much to learn about the impact of mobile learning and assessment on the learning experience. For example, there appears to be a clear difference across the professions due to the culture, client group, and context in which they are used. It is likely, therefore, that in professional education the early users will be those professions that already use technology as part of their work.

The ALPS mobile assessment processes were designed to encourage reflection both 'in' and 'on' action (Schön, 1995); this is largely through the self-assessment processes that are central to use of the ALPS tools. Reflection is well recognised for its importance in the learning process (Dearnley & Matthew, 2007), as is the role of self-assessment in effective learning, future professional development, and lifelong learning (Boud, 1995; Taras, 2001). The ALPS mobile device enabled reflection anywhere, any time by allowing students to make quick written notes or to use the audio facility to capture thoughts, or even to take a photograph that can be revisited to evoke memories and more thorough consideration. Self-assessment is an extension of reflective practice, and this too can be undertaken in a more informal, but in-the-moment, way using the ALPS mobile assessment process. Students recognised these benefits and reported using the devices for reflecting on the bus journey home or in their lunch breaks.

Students liked the idea of having something where they could look up information on the move, even when working in isolated settings. For this ideal to be realised, it became clear from our work that a number of things were important to students. They didn't like our devices, which increasingly became outdated – they liked their own. They didn't want to carry two devices, and they wanted the device they did carry to be a phone as well as having Internet access. We underestimated how much training and support our students needed. Oblinger (2003, 2004) considers that the key traits of today's learners are that they are digitally literate, 'always on,' mobile, experimental, and community-oriented. We had therefore assumed they were mostly 'digital natives' (Prensky, 2001), but this assumption was not borne out in our experience.

The ALPS assessment scenarios can be used at different stages of an individual student's career and also by different professions at different levels of their academic development. For example, the dental hygiene and therapy students used the 'Gaining consent' tool for peer feedback in their second-year paediatric placement, in order to enhance their feedback and reflective skills. Norcini (2003) noted that generalisations about peer assessment are difficult to derive and that this form of assessment can be good or bad depending on how it is carried out. In audiology, the same tool was used with level-three students in general clinics while in practice. The dentistry tutors asked their students to complete an assessment about how they gained consent from their patients when completing a tooth extraction. The students did the procedure and completed the assessment, gathering evidence from their practice assessor, peers, and in some cases even the patient, and uploaded this to their e-portfolio. Their tutors back in the university then reviewed the assessments and gave feedback that could be seen immediately by the students, who could take on board the feedback for future practice.
Students also used the dialogue with their tutor as an opportunity to discuss what other learning materials (e.g., video clips of technical procedures) could be usefully delivered via the ALPS m-learning platform.

In conclusion, work-based learning is a method of learning that uses the work environment as a place for study; it is a growing approach used in both the commercial sector and within secondary, further, and higher education generally. The UK government push for a wider choice of educational opportunities, especially in the curriculum for 14–19 year olds, has resulted in specialist schools and academies offering a range of options including work-based learning (Department for Children, Schools and Families, 2007). There is an increasing trend for more degree programmes, outside of H&SC, to include a work placement, as students are able to gain real-world experience in authentic settings and develop skills and competences vital to their future employment opportunities. Currently, much of the research evidence on the impact of shared learning concentrates on the taught elements of programmes rather than the work-based experience. The problem is that the opportunities available for shared interprofessional learning in the practice environment are neglected, as practical and organisational difficulties often stand in the way. Mobile technologies offer a vehicle to overcome some of these difficulties. The approach and processes adopted by ALPS have, therefore, the potential to be used more widely across the higher education sector to bridge the divide between the classroom and work-based learning.

Acknowledgements

The authors are grateful for the support of the ALPS Evaluation Group. This work was funded by the Higher Education Funding Council for England (HEFCE).

Notes

1. Mobile Enabled Disabled Students (MEDS) was a separate study funded by ALPS to inform the development of the mobile assessment process in relation to accessibility.

Notes on contributors

Jill Taylor is a national teaching fellow with a long track record in both medical and educational research. As a co-director of the University Technology Enhanced Learning (TEL) team, Jill has a strategic role leading the adoption and embedding of TEL across Leeds Metropolitan University.

Chris Dearnley has a keen interest and a comprehensive research portfolio in learning and assessment for health care practitioners. Her main area of research is pedagogy for health and social care practice, including learning technologies that promote inclusivity and support independent learning for all students.

Julie Laxton is the CETL ALPS teaching fellow for the Faculty of Medicine & Health, University of Leeds. A dietitian by background, Julie has experience in the field of education, learning, and teaching, particularly from a practice (NHS) perspective.

Catherine Coates is director of the Faculty of Health Practice Learning Unit at Leeds Metropolitan University and is an ALPS CETL teaching fellow. Her main focus of activity is the promotion of best practice in work-based learning and the development of personal and professional skills in her students.

Tamsin Treasure-Jones is the ALPS mobile technologies manager. She has worked in higher education as a project manager for 10 years, on projects in online learning, business support, and knowledge transfer.

Robert Campbell is managing director of Ecommnet Ltd and has over 25 years' experience in the IT industry.
He was the original co-author and architect of the ALPS mobile services platform and specialises in security and encryption, reliability and dependability design, and mobile applications development.

Ian Hall is technical services director at MyKnowledgeMap, managing the team responsible for the development of MyKnowledgeMap's technical projects and services. Since joining MyKnowledgeMap in 2004, Ian has worked in a variety of roles, including systems development, systems analysis, and project management.

References

ALPS CETL. (2010). Implementation, 10.3: Methods of training. In Assessment & Learning in Practice Settings (ALPS) – Implementing a large scale mobile learning programme (IT Process Report) (pp. 42–43). Leeds: School of Medicine, University of Leeds. Retrieved from http://www.alps-cetl.ac.uk/publications.html

Anderson, P., & Blackwood, A. (2004). Mobile and PDA technologies and their future use in education. Bristol: JISC. Retrieved from http://www.jisc.ac.uk/uploaded_documents/ACF11B0.pdf

Bent, P.D., Bolsin, S.N., Creati, B.J., Patrick, A.J., & Colson, M.E. (2002). Professional monitoring and critical incident reporting using personal digital assistants. Medical Journal of Australia, 177(9), 496–499. Retrieved from http://www.mja.com.au/public/issues/177_09_041102/ben10246_fm.html

Boud, D. (1995). Assessment and learning: Contradictory or complementary? In P. Knight (Ed.), Assessment for learning in higher education (pp. 35–48). London: Kogan Page.

Boud, D. (2000). Sustainable assessment: Rethinking assessment for the learning society. Studies in Continuing Education, 22(2), 151–167. doi:10.1080/713695728

Craddock, D., O'Halloran, C., Borthwick, A., & McPherson, K. (2006). Interprofessional education in health and social care: Fashion or informed practice? Learning in Health and Social Care, 5(4), 220–242. doi:10.1111/j.1473-6861.2006.00135

Dearnley, C.A., & Matthew, R.G.S. (2007). Factors that contribute to undergraduate student success. Teaching in Higher Education, 12(3), 377–391. doi:10.1080/13562510701278740

Dearnley, C.A., Taylor, J., Hennessay, S., Parks, M., Coates, C., Haigh, J., et al. (2009). Using mobile technologies for assessment and learning in practice settings: Outcomes of five case studies. International Journal on E-Learning, 8(2), 193–208. Retrieved from http://www.editlib.org/p/25319

Dearnley, C.A., & Walker, S.A. (2009). Mobile enabled research. In G. Vavoula (Ed.), Researching mobile learning (pp. 259–276). Oxford: Peter Lang.

Dearnley, C.A., Walker, S.A., & Fairhall, J.R. (2010). Accessible mobile learning: Exploring the concept of mobile learning for all. In A. Bromage (Ed.), Interprofessional e-learning and collaborative work: Practices and technologies (pp. 352–366). Hershey, PA: IGI-Global.

Department for Children, Schools and Families. (2007). Building on the best: Final report and implementation plan of the review of the 14–19 work related learning. Retrieved from http://www.dcsf.gov.uk/14-19/documents/14-19workrelatedlearning_web.pdf

Engum, S.A. (2003). Do you know your student's basic skills exposure? American Journal of Surgery, 186(2), 175–181. doi:10.1016/S0002-9610(03)00182-X
Haigh, J., Dearnley, C.A., & Meddings, F.S. (2007). The impact of an enhanced assessment tool on students' experience of being assessed in clinical practice: A focus group study. Practice and Evidence of the Scholarship of Teaching and Learning in Higher Education, 2(1). Retrieved from http://www.pestlhe.org.uk/index.php/pestlhe/index

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112. doi:10.3102/003465430298487

Herrington, A., & Herrington, J. (2006). What is an authentic learning environment? In A.J. Herrington & J. Herrington (Eds.), Authentic learning environments in higher education (pp. 1–13). Hershey, PA: ISP.

Holt, J., Coates, C., Cotterill, D., Eastburn, S., Laxton, J., Mistry, H., et al. (2010). Identifying common competences in health and social care: An example of multi-institutional and interprofessional working. Nurse Education Today, 30(3), 264–270. doi:10.1016/j.nedt.2009.09.006

Karadeniz, S. (2009). The impacts of paper, web and mobile based assessment on students' achievement and perceptions. Scientific Research and Essay, 4(10), 984–991. Retrieved from http://www.academicjournals.org/sre/index.htm

Kneebone, R., Nestle, D., Ratasothy, J., Kidd, J., & Darzi, A. (2003). The use of handheld computers in scenario-based procedural assessments. Medical Teacher, 25(6), 632–642. doi:10.1080/01421590310001605660

Koeniger-Donohue, R. (2008). Handheld computers in nursing education: A PDA pilot project. Journal of Nursing Education, 47(2), 74–77. doi:10.2202/1548-923X.1866

Miller, J., Shaw-Kokot, J.R., Arnold, M.S., Boggin, T., Crowell, K.E., Allergi, F., et al. (2005). A study of personal digital assistants to enhance undergraduate clinical nursing education. Journal of Nursing Education, 44(1), 19–26. Retrieved from http://www.journalofnursingeducation.com/

Moon, J. (1999). Reflection in learning and professional development. London: Routledge.

Norcini, J.J. (2003). Peer assessment of competence. Medical Education, 37(6), 539–543. doi:10.1046/j.1365-2923.2003.01536

Norman, A. (2005). Handheld public services. Retrieved from http://www.handheldlearning.co.uk/content/view/14/2/

Nycz, M., Smok, B., & Dragon, B. (2005). Knowledge management in open systems of education. Retrieved from Akademia Ekonomiczna we Wroclawiu: http://www.e-edukacja.net/_referaty/10_e-edukacja.pdf

Oblinger, D.G. (2003). Boomers & gen-Xers, millennials: Understanding the 'new students.' EDUCAUSE Review, 38(4), 37–47. Retrieved from http://www.educause.edu/er

Oblinger, D.G. (2004). The next generation of educational engagement. Journal of Interactive Media in Education, 2004(8). Retrieved from http://www-jime.open.ac.uk/

Parks, M., & Dransfield, M. (2006). Mo-Blogging – Supporting student learning while in health care practice settings. Paper presented at mLearn Conference: Across Generations and Cultures, Banff, Alberta, Canada.

Prensky, M. (2001). Digital natives, digital immigrants. On the Horizon, 9(5), 1–6.

Sandars, J.E., & Dearnley, C.A. (2009). Twelve tips for the use of mobile technologies for work based assessment. Medical Teacher, 31(1), 18–21. doi:10.1080/01421590802227966

Sandars, J.E., & Pellow, A.J.H. (2006). Handheld computers for work based assessment – Lessons from the recent literature. Work Based Learning in Primary Care, 4(2), 109–115. Retrieved from http://www.radcliffe-oxford.com/journals/J16_Work_Based_Learning_in_Primary_Care/

Savill-Smith, C., Attewell, J., & Stead, G. (2006). Mobile learning in practice. London: Learning and Skills Network.

Scheck McAlearney, A., Schweikhart, S.B., & Medow, M.A. (2004). Doctors' experience with handheld computers in clinical practice: Qualitative study. British Medical Journal, 328, 1162. doi:10.1136/bmj.328.7449.1162
Schön, D.A. (1995). The reflective practitioner: How professionals think in action. Aldershot: Arena.

Simmons, B., Wagner, S., AJeCeries, A., & Reeves, S. (2010). Assessment in interprofessional education: Why it is needed! Paper presented at the All Together Better Health 5 International Interprofessional Conference, Sydney, Australia.

Speers, J. (2008). Service user involvement in the assessment of a practice competency in mental health nursing: The stakeholders' views and recommendations. Nurse Education in Practice, 8(2), 112–119. doi:10.1016/j.nepr.2007.04.002

Taras, M. (2001). The use of tutor feedback and student self-assessment in summative assessment tasks: Towards transparency for students and tutors. Assessment and Evaluation in Higher Education, 26(6), 605–614. doi:10.1080/02602930120093922

Taylor, J.D., Coates, C., Eastburn, S., & Ellis, I. (2006). Evaluating the impact of mobile technologies on the student learning experience in health practice placements. Retrieved from http://www.docstoc.com/docs/20125845/Evaluating-the-impact-of-mobile-technologies-on-the-student

Taylor, J.D., Coates, C., Eastburn, S., & Ellis, I. (2007). Interactive learning using mobile devices to enhance healthcare practice. Paper presented at the European Federation for Open and Distance Learning (EFODL) International Conference, Belfast.

Walton, G., Child, S., & Blenkinsopp, E. (2005). Using mobile technologies to give health students access to learning resources in the UK community setting. Health Information and Libraries Journal, 22(2), 51–65. doi:10.1111/j.1470-3327.2005.00615.x