Feedback on Student Programming Exercises: Teaching Assistants vs Automated Assessment Tool

Nynne Grauslund Kristiansen, Sebastian Mateos Nicolajsen, Claus Brabrand

Research output: Conference Article in Proceeding or Book/Report chapter › Article in proceedings › Research › Peer-reviewed

Abstract

Existing research does not quantify and compare the differences between automated and manual assessment in the context of feedback on programming assignments. This makes it hard to reason about the effects of adopting automated assessment at the expense of manual assessment. Based on a controlled experiment involving N=117 undergraduate first-semester CS1 students, we compare the effects of having access to feedback from: i) only automated assessment, ii) only manual assessment (in the form of teaching assistants), and iii) both automated as well as manual assessment. The three conditions are compared in terms of (objective) task effectiveness and from a (subjective) student perspective.
The experiment demonstrates that having access to both forms of assessment (automated and manual) is superior both in terms of task effectiveness and from a student perspective. We also find that the two forms of assessment are complementary: automated assessment appears to be better in terms of task effectiveness, whereas manual assessment appears to be better from a student perspective. Further, we find that automated assessment appears to work better for men than for women, who are significantly more inclined towards manual assessment. We then perform a cost/benefit analysis which leads to the identification of four equilibria that appropriately balance costs and benefits. Finally, this gives rise to four recommendations on when to use which kind or combination of feedback (manual and/or automated), depending on the number of students and the amount of per-student resources available. These observations provide educators with evidence-based justification for budget requests and considerations on when to (not) use automated assessment.
Original language: English
Title of host publication: ACM Proceedings of the 22nd Koli Calling International Conference on Computing Education Research (Koli 2023)
Place of publication: Koli, Finland
Publication date: 2023
Publication status: Published - 2023

Keywords

  • Automated assessment
  • Manual assessment
  • Feedback effectiveness
  • Programming assignments
  • Student perspective
