Item Feedback AB Testing
In addition to collecting learners' general sentiment about a course, we built a system to collect detailed feedback on content quality and technical issues at the item level. This gives partners and Coursera's operational teams rich information for pinpointing and fixing problem areas in courses, keeping content quality and learner satisfaction high.
To maximize the interaction rate, we tested eight experimental buckets with various combinations of icons, interactions, and text links. The winning variant, shown below, had a 34% higher response rate than the other buckets. This lightweight mechanism has sustained a high response rate, with 8.21% of all learners participating.
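As a minimal sketch of how a bucket comparison like this can be evaluated, the snippet below runs a two-proportion z-test on response counts and reports the relative lift. The counts are hypothetical (chosen only to illustrate a ~34% lift), not Coursera's actual experiment data.

```python
from math import sqrt, erfc

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided z-test comparing response rates of two experiment buckets."""
    p_a, p_b = success_a / n_a, success_b / n_b
    # Pooled rate under the null hypothesis that both buckets respond equally
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided tail probability
    lift = (p_a - p_b) / p_b          # relative lift of bucket A over B
    return z, p_value, lift

# Hypothetical counts: winning variant vs. a baseline bucket
z, p, lift = two_proportion_ztest(1100, 10000, 820, 10000)
print(f"lift = {lift:.0%}, p = {p:.3g}")  # → lift = 34%, p well below 0.001
```

In practice, a multi-bucket test like ours would also correct for multiple comparisons before declaring a winner; the pairwise test above is the basic building block.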
Such actionable data gives course staff a strong incentive to improve content quality. For example, one course team raised its course rating by 0.7 points over six weeks by addressing the content problems learners reported.