The "Best-So-Far" Infrastructure (Previously called TEACHER-ASSIST)
New Press! Read this piece from Jill Barshay at The Hechinger Report on this project.
TeacherASSIST gives teachers the ability to create feedback for their students for any problem they have selected. The feedback can be provided in the form of hints or explanations, and teachers can choose between text or video format. Any teacher already using ASSISTments can turn on this feature to show their students hints and explanations.
The TeacherASSIST Vision
TeacherASSIST turns ASSISTments into a kind of Wikipedia for teachers. ASSISTments will crowdsource hints and explanations for all the commonly used questions in middle school mathematics, in particular those offered as Open Educational Resources (OER). By keeping ASSISTments free, we will be able to impact as many teachers and students as possible!
How it all started
Dr. Heffernan was inspired to create TeacherASSIST by Chris LeSiege, a teacher in Maine. Excited by ASSISTments, Mr. LeSiege went ahead and made a comment for every problem in an entire textbook!
Dr. Heffernan created TeacherASSIST to support awesome teachers like Mr. LeSiege. Now, the platform allows teachers to share their excellent tips with other teachers around the country.
Others have done this too and written an NCTM article about it.
Blog about TeacherASSIST
Read Andrew Burnett's Blog about TeacherASSIST as one of 4 steps to making homework a learning opportunity.
Or read this NCTM magazine article by a teacher describing how she made videos for her students and delivered them via ASSISTments.
ASSISTments Star Teachers
We like to highlight amazing teachers! If you want to share your problem hints and explanations with fellow teachers, you can apply to be a Star-Teacher. Any hint created by a Star-Teacher will be delivered to all students using the content. If you have started building feedback and would like to apply to be a Star-Teacher, email us at email@example.com.
Testing Content for Effectiveness
At ASSISTments, we want to distribute only the best content. In order to see which explanations and hints are working, we will run tests on teacher-sourced content. This will help us not only find content that works, but also personalize content for different types of learners.
The following paper showed that TeacherASSIST is effective. This paper won the prestigious "Best Student Paper Award", making it among the top 5-10% of papers submitted.
Patikorn, T. & Heffernan, N. T. (2020, August 12) Effectiveness of Crowd-Sourcing On-Demand Tutoring from Teachers in Online Learning Platforms. Proceedings of the Seventh ACM Conference on Learning @ Scale (L@S). Pages 115–124. https://doi.org/10.1145/3386527.3405912. Best Student Paper Awardee.
This paper also reports that 149 other teachers used TeacherASSIST to write content for their students. 29 teachers have done this for over 50 problems, and more than 8 teachers have written feedback for over 200 problems. My PhD student Thanaporn "March" Patikorn is the lead author on this work and is shown below.
A follow-up paper has shown that we can start to distinguish more effective content from less effective content.
Prihar, E., Patikorn, T., Botelho, A., Sales, A., & Heffernan, N. (2021). Towards Personalizing Students' Education with Crowdsourced Tutoring. Learning@Scale 2021. Camera Ready Copy. Pages 37–45. https://doi.org/10.1145/3430895.3460130
There is now a lot more data in what we call 2.0 data! (Oct 8th, 2021)
On October 8th, Neil's PhD student pulled some numbers and found that students were randomized 1,829,147 times across 6,044 problems. In this sense, each problem represents a separate independent experiment. Each problem has about 3 different "student supports" (the newer name for hints or explanations; in the first paper at LAK we called these "TeacherASSISTs"). In total, across the 6,044 problems, we have 15,964 "student supports". There were 72,294 students, so on average each experiment included 299 students (median of 135). We have yet to analyze this data set.
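To make the scale of the 2.0 data set concrete, here is a small sketch that derives the per-problem and per-student averages from the totals reported above (the totals come from the text; the variable names and the derived ratios are ours, computed from simple division):

```python
# Totals reported for the "2.0 data" pull (Oct 8th, 2021).
randomizations = 1_829_147  # times a student was randomized into an experiment
problems = 6_044            # each problem is a separate independent experiment
supports = 15_964           # total "student supports" (hints/explanations)
students = 72_294           # distinct students in the data set

# Derived ratios (raw averages from the totals above).
supports_per_problem = supports / problems            # ~2.6, i.e. "about 3"
randomizations_per_problem = randomizations / problems  # ~303 per experiment
randomizations_per_student = randomizations / students  # ~25 per student

print(f"supports per problem: {supports_per_problem:.1f}")
print(f"randomizations per experiment: {randomizations_per_problem:.0f}")
print(f"randomizations per student: {randomizations_per_student:.1f}")
```

These raw ratios show why each problem can be treated as its own experiment: with roughly 300 randomizations per problem, each one carries enough data to compare its student supports independently.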