key: cord-0050354-a02z1ojf authors: Aull, Laura title: Student-centered assessment and online writing feedback: Technology in a time of crisis date: 2020-09-18 journal: nan DOI: 10.1016/j.asw.2020.100483 sha: 4cfd31c66dcd052dc7eaea79cee91e2bc4c5795f doc_id: 50354 cord_uid: a02z1ojf nan A brief look at the evolving nature of writing assessment tools and tech helps offer context for these reviews and underscores the earlier point about ongoing learning. Assessing Writing itself provides evidence and context. When the journal began over 25 years ago, predominant themes in writing assessment research included assessor-centered considerations such as rater judgments and the relationship between textual features and writing quality. It also included relatively new assessments that helped shift to student-centered considerations, such as writing portfolios that expanded notions of validity to include impact on students (Huot, 1994) . In subsequent decades, writing assessment research expanded to foreground social and cognitive dimensions of written language and related design issues, from task and rubric design to scoring processes and their accompanying interpretation and use arguments as they impact student learning (Behizadeh & Engelhard, 2011; Kane, 2013) . More recent research has worked to draw needed, systematic attention to other factors that impact students. These include various domains related to the construct of writing assessed (Aull, 2020; MacArthur & Graham, 2008; White, Elliot, & Peckham, 2015) , the constitutive force of assignment design (2020, Aull, 2017) , the overlap between design, interpretation, and consequence of writing assessment (2016b, Kane, 2016a; Poe, 2014) , and how opportunity and identity are configured (and disfigured) by writing assessments (Inoue & Poe, 2012) . Within such considerations, even single concepts evolve as we learn. 
Poe and Elliot's (2019) study of the changing perceptions of the meaning and importance of fairness in assessment shows varied stances and methodological challenges that remain with us today. Most recently, Slomp and Elliot (2020) call for integrated theory, action, and appraisal in support of principled ways of thinking about assessment components, action mechanisms, and consequences in terms of curricular fairness and justice. Their call highlights writing assessment as a dialogic process of student-instructor and student-student interaction in which students infer ideas about writing and about themselves as writers, and in which students face particular choices and consequences. Formative feedback tools are an important part of this dialogic process, and that brings us to the focus of the two reviews in this forum. Early feedback tools such as ETS's Criterion® and MyReviewers were designed largely according to assessor-centered goals such as providing scores and retrieving data. New tools work within the framework of a digital ecology, including Web-based peer review platforms embedded within, and informed by, fluid writing classrooms. These next-generation systems can help support more context-specific, student-centered considerations. Increased focus on formative instructor feedback (the use of feedback to support process and development rather than to explain grades) and on the giver's gain (the clear value of peer review for the reviewer) helps highlight the collaborative nature of a writing process in which students' choices and experiences are central. The particular focus of these reviews on online tools is valuable for the many partially or fully virtual writing courses today. Within online writing instruction, formative feedback may be even more crucial, as students navigate their writing development more often on their own.
Formative feedback is essential for student writing development, particularly as students draft and revise (Anderson, Anson, Gonyea, & Paine, 2015; Ferris, 2014). It also requires support, so that students feel capable and confident as they respond to and receive feedback, particularly when they feel they themselves are still developing as writers (Hart-Davidson & Meeks, 2020). As I hope is always true in this forum, the two reviews draw attention to the epistemic nature of assessment choices: the fact that any assessment tool constitutes a particular view of writing, and of a student's writing, for assessors and students. The reviews accordingly draw attention to the possibilities and logistics of the tools as well as their limitations and challenges. These details help us remain aware, intentional, and critical about the constructs of writing we emphasize and leave out, and to what end, because ethical, valid assessments are characterized by transparency about goals and expectations and by consistency across design, interpretation, and consequences (Kane, 2016a, 2016b; Poe, 2014). Fernando reviews Moodle quizzes in light of their possible use in instructors' formative assessment of academic writing. In particular, Fernando outlines the value of using Moodle quizzes as a way to work toward several goals of formative assessment: facilitating learning, promoting sustained engagement, and delivering ongoing feedback in order to support students. Strengths of this tool described by Fernando include its scaffolded process steps and its ability to support multimodal submissions as well as multiple modes of feedback. These options support multiple modes of engagement, in keeping with accessible, inclusive practices of online writing instruction (CCCC, C. f. E. P. f. O. W. I., 2013).
A useful aspect of Fernando's review is its practical detail regarding usability and options, including clarifying aspects of the Moodle quiz that might feel counterintuitive within a 'quiz' structure but are applicable to providing feedback to student writers. Fernando also describes possibilities for assessing assessments and inviting students to reflect on their self-efficacy through the Moodle Quiz Gradebook (and question behavior settings therein), learning analytics features that can be used regardless of whether students' responses are graded. At the same time, Fernando identifies limitations. These include some non-intuitive features and, in particular, the fact that Moodle quizzes fall short of accommodating peer review, an integral part of formative writing assessment in most classrooms today. Fernando thus underscores that Moodle quizzes can provide one set of tools for supporting students' academic needs and aspirations, among other tools that instructors use. As a valuable complement to Fernando's review, Laflen reviews Eli Review, a tool specifically designed to support peer feedback activities. Laflen draws valuable attention to the fact that we need more knowledge of the student-centered aspects of writing assessment: she writes, "we do not yet fully understand how these different feedback options impact the nature of the feedback provided or student…perceptions of it." But, she rightly notes, every feedback technology is built on assumptions about writing that function in a given course and assessment task. Among the possibilities underscored by Laflen is that Eli Review facilitates the frequent, small writing tasks that often characterize and support online writing instruction (CCCC, C. f. E. P. f. O. W. I., 2013). Laflen specifically addresses the many benefits of peer review supported by Eli Review.
These reinforce the notion of the "giver's gain," including evidence of the particular gain for writers who may not score highly in their own summative assessments. As Laflen shows, Eli Review supports important possibilities such as archived feedback and analytics of both qualitative and quantitative data, which can be used in assessment-as-research and in assessments of assessments. Her review underscores that students' own choices and reactions are central in their use of Eli Review, in which they can make plans, rate peer feedback, and receive instructor feedback on their feedback. Laflen's review also identifies limitations of Eli Review, including cost and inflexibility, which leads her to call for more research on the effects of specific platforms and approaches on student writers and their instructors. Fernando's and Laflen's reviews focus on online tools for supporting writing communities in their interactive process: communities of instructors and peers who are forming, revising, and discussing their written ideas. With their help, may we make choices that meet this moment not (only) in terms of what institutions or instructors want to know vis-à-vis their expectations for writing, but in terms of student writers' goals, lives, and learning. In this new time, in other words, may these choices make us newly able to support curricular fairness and justice. I close with thanks to these two scholars and to you, readers, for your dedication to examining writing assessment tools and technology. And I extend my best wishes to you and yours in this uncertain time.
References

The contributions of writing to learning and development: Results from a large-scale multi-institutional study.
Corpus analysis of argumentative versus explanatory discourse in writing task genres.
How students write: A linguistic analysis.
Historical view of the influences of measurement and writing theories on the practice of writing assessment in the United States.
Position statement of principles and example effective practices for online writing instruction (OWI).
Show me your true colours: Scaffolding formative academic literacy assessment through an online learning platform. Assessing Writing.
Responding to student writing: Teachers' philosophies and practices.
Feedback analytics for peer learning: Indicators of writing improvement in digital environments (forthcoming).
Editorial: An introduction to assessing writing.
Race and writing assessment.
Validating the interpretations and uses of test scores.
Explicating validity. Assessment in Education: Principles, Policy and Practice.
Validation strategies: Delineating and validating proposed interpretations and uses of test scores. Handbook of test development.
Responding to student writing online: Tracking student interactions with instructor feedback in a learning management system.
Writing research from a cognitive perspective.
The consequences of writing assessment.
Evidence of fairness: Twenty-five years of research in assessing writing.
What's your theory of action? Making good trouble with literacy assessment (forthcoming). Journal of Adolescent & Adult Literacy.