ASSISTments

From Open Pattern Repository for Online Learning Systems
ASSISTments
Creator: ASSISTments team led by Neil Heffernan
Creation date: 2003

ASSISTments was initially developed by Neil Heffernan and his team at Worcester Polytechnic Institute in 2003 to help students practice math for the Massachusetts Comprehensive Assessment System (MCAS), to help teachers identify students who needed more support, and to identify topics that needed further discussion and practice [1]. It is an online learning system in which teachers create math problems and exercises with corresponding solutions, feedback, and hints, assign them to students, and receive assessments of student performance. ASSISTments was developed by a multi-disciplinary team that included content experts, ITS experts, and system developers, among others. It was designed and built following architecture and interface development best practices, with math teachers providing expert advice and content.

Screenshot of ASSISTments showing a student's view when answering a problem. Students can attempt to solve the problem and request hints.

Initially, math teachers were asked to provide the content (i.e., problems, answers, hints, feedback) for ASSISTments. Later, questions from math textbooks were added as well, and teachers who used the system were allowed to create their own questions or adapt their own versions of existing ones. Over time, ASSISTments underwent many changes to support feature requests and improvements. It also allowed researchers to run studies and collect student data using customized content, feedback, and other features of the interface (e.g., Broderick et al. 2011[2], Li et al. 2013[3], Whorton 2013[4]).

ASSISTments has been collecting large amounts of data since 2003 from close to 50,000 students who use the system each year across 48 states of the United States (ASSISTments, Heffernan & Heffernan 2014[1], Mendicino et al. 2009[5]). Data are represented using multiple features that describe the student learning experience, such as the number of hint requests in a problem, the number of answer attempts in a problem, the correctness of students' answers, the timestamps of students' actions, and predictions of students' affect while answering a problem (i.e., concentration, frustration, confusion, boredom) (Heffernan & Heffernan 2014[1], Ocumpaugh et al. 2014[6]). Predictions of students' affective states were generated by a machine-learning model built from expert-labeled observations of student affect while using ASSISTments. The model used features such as the number of previous incorrect answers, the time taken to solve problems, and the number of hint requests to make its predictions (Baker et al. 2012[7], Ocumpaugh et al. 2014[6]).
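
To make the data representation above concrete, the following is a minimal sketch in Python (using scikit-learn) of how interaction-log features like those listed could feed a supervised affect detector. The field names, training rows, and classifier choice are illustrative assumptions, not the actual ASSISTments features or the detectors described by Baker et al. 2012[7].

  # Minimal, hypothetical sketch: interaction-log features feeding an affect classifier.
  # Not the actual ASSISTments detectors; feature names and data are illustrative.
  from dataclasses import dataclass
  from sklearn.ensemble import RandomForestClassifier

  @dataclass
  class ProblemLog:
      hint_requests: int          # hints requested on this problem
      attempts: int               # answer attempts made on this problem
      correct: bool               # whether the final answer was correct
      seconds_on_problem: float   # time spent before answering
      prior_incorrect: int        # incorrect answers on recent problems

  def to_features(log: ProblemLog) -> list[float]:
      return [log.hint_requests, log.attempts, float(log.correct),
              log.seconds_on_problem, log.prior_incorrect]

  # Hypothetical expert-labeled rows: one per observed problem, with an affect
  # label coded by a human observer.
  train_logs = [
      ProblemLog(0, 1, True, 35.0, 0),
      ProblemLog(3, 4, False, 180.0, 2),
      ProblemLog(1, 2, True, 60.0, 1),
      ProblemLog(4, 5, False, 240.0, 3),
  ]
  train_labels = ["concentration", "frustration", "concentration", "boredom"]

  clf = RandomForestClassifier(n_estimators=100, random_state=0)
  clf.fit([to_features(log) for log in train_logs], train_labels)

  # Predict the affective state for a new, unlabeled problem attempt.
  new_log = ProblemLog(2, 3, False, 150.0, 1)
  print(clf.predict([to_features(new_log)])[0])

In practice, detectors of this kind are trained on much larger sets of field observations and validated across different student populations (Ocumpaugh et al. 2014[6]).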

References

  1. Heffernan, N.T., and Heffernan, C.L. (2014). The ASSISTments Ecosystem: Building a Platform that Brings Scientists and Teachers Together for Minimally Invasive Research on Human Learning and Teaching. International Journal of Artificial Intelligence in Education, 24(4):470-497.
  2. Broderick, Z., O'Connor, C., Mulcahy, C., Heffernan, N. & Heffernan, C. (2011). Increasing Parent Engagement in Student Learning Using an Intelligent Tutoring System. Journal of Interactive Learning Research, 22(4):523-550.
  3. Li, S., Xiong, X., and Beck, J. (2013). Modeling student retention in an environment with delayed testing. Educational Data Mining 2013. International Educational Data Mining Society, 328-329.
  4. Whorton, S. (2013). Can a computer adaptive assessment system determine, better than traditional methods, whether students know mathematics skills? Master’s thesis, Computer Science Department, Worcester Polytechnic Institute.
  5. Mendicino, M., Razzaq, L. and Heffernan, N. (2009). A Comparison of Traditional Homework to Computer-Supported Homework. Journal of Research on Technology in Education 41(3): 331-359.
  6. Ocumpaugh, J., Baker, R., Gowda, S., Heffernan, N., & Heffernan, C. (2014). Population validity for Educational Data Mining models: A case study in affect detection. British Journal of Educational Technology, 45(3):487-501.
  7. Baker, R., Gowda, S., Wixon, M., Kalka, J., Wagner, A., Salvi, A., Aleven, V., Kusbit, G., Ocumpaugh, J. and Rossi, L. (2012). Towards Sensor-free Affect Detection in Cognitive Tutor Algebra. In Proceedings of the 5th International Conference on Educational Data Mining, International Educational Data Mining Society (pp. 126-133).