Feedback on Student Programming Assignments: Teaching Assistants vs Automated Assessment Tool

Research output: Article in proceedings · Research · peer-review

Abstract

Existing research does not quantify and compare the differences between automated and manual assessment in the context of feedback on programming assignments. This makes it hard to reason about the effects of adopting automated assessment at the expense of manual assessment. Based on a controlled experiment involving N=117 undergraduate first-semester CS1 students, we compare the effects of having access to feedback from: i) only automated assessment, ii) only manual assessment (in the form of teaching assistants), and iii) both automated as well as manual assessment. The three conditions are compared in terms of (objective) task effectiveness and from a (subjective) student perspective.
The experiment demonstrates that having access to both forms of assessment (automated and manual) is superior from both a task-effectiveness and a student perspective. We also find that the two forms of assessment are complementary: automated assessment appears to be better in terms of task effectiveness, whereas manual assessment appears to be better from a student perspective. Further, we found that automated assessment appears to work better for men than for women, who are significantly more inclined towards manual assessment. We then perform a cost/benefit analysis, which leads to the identification of four equilibria that appropriately balance costs and benefits. Finally, this gives rise to four recommendations for when to use which kind or combination of feedback (manual and/or automated), depending on the number of students and the amount of per-student resources available. These observations provide educators with evidence-based justification for budget requests and with considerations on when (not) to use automated assessment.
Original language: English
Title of host publication: Proceedings of the 23rd Koli Calling International Conference on Computing Education Research, Koli Calling 2023, Koli, Finland, November 13-18, 2023
Editors: Andreas Mühling, Ilkka Jormanainen
Number of pages: 10
Place of publication: New York, NY, USA
Publisher: Association for Computing Machinery
Publication date: 2023
Pages: 2:1-2:10
Article number: 2
ISBN (Electronic): 9798400716539
DOIs
Publication status: Published - 2023
Event: Koli Calling International Conference on Computing Education Research - Koli, Finland
Duration: 13 Nov 2023 - 18 Nov 2023
Conference number: 23

Conference

Conference: Koli Calling International Conference on Computing Education Research
Number: 23
Country/Territory: Finland
City: Koli
Period: 13/11/2023 - 18/11/2023

Keywords

  • Automated assessment
  • Feedback
  • Student experiments
  • Teaching Assistants
