Reflecting on reflective writing analytics: Assessment challenges and iterative evaluation of a prototype tool
- Publication Type: Conference Proceeding
- Citation: ACM International Conference Proceeding Series, 25-29 April 2016, pp. 213-222
- Issue Date: 2016-04-25
Closed Access
Filename | Description | Size
---|---|---
Reflecting_on_Reflective_Writing_Analyti.pdf | Accepted Manuscript version | 2.13 MB
This item is closed access and not available.
© 2016 ACM. When used effectively, reflective writing tasks can deepen learners' understanding of key concepts, help them critically appraise their developing professional identity, and build qualities for lifelong learning. As such, reflective writing is attracting substantial interest from universities concerned with experiential learning, reflective practice, and developing a holistic conception of the learner. However, reflective writing is for many students a novel genre to compose in, and tutors may be inexperienced in its assessment. While these conditions set a challenging context for automated solutions, natural language processing may also help address the challenge of providing real-time, formative feedback on draft writing. This paper reports progress in designing a writing analytics application, detailing the methodology by which informally expressed rubrics are modelled as formal rhetorical patterns, a capability delivered by a novel web application. The tool has been iteratively evaluated on an independently human-annotated corpus, showing improvements from the first to the second version. We conclude by discussing the reasons why classifying reflective writing has proven complex, and reflect on the design processes enabling work across disciplinary boundaries to develop the prototype to its current state.