Expanding a National Network for Automated Analysis of Constructed Response Assessments to Reveal Student Thinking in STEM

Title: Expanding a National Network for Automated Analysis of Constructed Response Assessments to Reveal Student Thinking in STEM
Publication Type: Conference Paper
Year of Publication: 2016
Authors: Urban-Lurain, M, Bierema, AM-K, Haudek, KC, Hoskinson, A-M, Kaplan, J, Knight, JK, Lemons, PP, Lira, CT, McCourt, J, Merrill, JE, Moscarella, R, Nehm, RH, Smith, MK, Steele, M, Sydlik, MAnne
Conference Name: AAAS Envisioning the Future of Undergraduate STEM Education
Date Published: 04/2016
Publisher: American Association for the Advancement of Science
Conference Location: Washington, DC
Keywords: AACR, automated analysis, lexical analysis
Abstract:

Need: Faculty who wish to respond to, and build upon, students' existing understandings of key STEM concepts must first know what and how students think about those concepts. While multiple-choice assessments are easy to administer, they cannot measure students' abilities to organize individual bits of knowledge into a coherent and functional explanatory structure. Writing is an authentic task that can reveal student thinking, but it is time-consuming to evaluate and therefore difficult to implement in the large classes typical of many introductory STEM courses. The Automated Analysis of Constructed Response (AACR, pronounced "acer") project combines educational research-based methods with computerized linguistic analysis to quickly evaluate student writing, generating useful and timely feedback for faculty to inform their instruction. www.msu.edu/~aacr

Goals: We are a large, multi-institutional collaboration (TUES 3 and WIDER funding) with several connected goals: 1) create a national web portal for access to AACR conceptual assessments and analysis; 2) use the resulting reports to focus community collaborations between STEM education researchers and instructors; 3) transport AACR innovations through ongoing faculty professional development; 4) expand the range of STEM disciplines in which we pursue this research from biology into chemistry, chemical engineering, physics/astronomy, and statistics; 5) engage in ongoing project evaluation for continuous quality improvement; and 6) lay the foundation for sustainability.

Approach: We use a variety of computerized lexical analysis and machine learning tools to create statistical models that predict expert ratings of student writing, with machine-to-expert inter-rater reliability as good as expert-to-expert IRR (> 0.8). These models generate reports for faculty that detail both the scientific and the alternative conceptions in their students' responses. Local Faculty Learning Communities (FLCs) meet to discuss the reports and create instructional interventions to improve student outcomes. We are developing a web portal that will allow any faculty member to obtain questions and upload their students' responses for analysis.

Outcomes: Our research has led to new insights into students' struggles with key concepts, such as the Central Dogma of Biology, with FLC faculty collectively creating new instructional materials to address these challenges. Research on the FLC members shows that faculty are moving from asking "How many students got the right answer?" to reflecting on student thinking and modifying instruction to address common learning challenges. We continue to explore a variety of lexical analysis and classification techniques to speed up and improve the development of questions and analytic resources. We are working on the web portal to completely automate report generation and make these analyses widely available to participating FLCs.

Broader Impacts: We created a set of FLCs across multiple institutions that engage STEM faculty teaching foundational courses to administer AACR questions, reflect on the results, and implement revised instruction. In the current year we have added additional faculty to each FLC. We are expanding question development beyond biology into chemistry, statistics, thermodynamics, and physics/astronomy. We have presented at conferences, published several papers, and created two websites to disseminate our results.
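The Approach section above benchmarks model ratings against expert-to-expert inter-rater reliability (> 0.8). As a minimal illustrative sketch (not the AACR implementation), assuming Cohen's kappa as the agreement statistic and using hypothetical rating data, model-to-expert IRR can be computed as:

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(r1) == len(r2)
    n = len(r1)
    # Observed agreement: fraction of responses rated identically.
    po = sum(a == b for a, b in zip(r1, r2)) / n
    # Expected chance agreement from each rater's label frequencies.
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[label] * c2[label] for label in set(r1) | set(r2)) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical ratings of 12 constructed responses by a human expert
# and by a statistical model; they disagree on one response.
expert = ["sci", "alt", "sci", "sci", "alt", "alt",
          "sci", "alt", "sci", "sci", "alt", "sci"]
model  = ["sci", "alt", "sci", "sci", "alt", "sci",
          "sci", "alt", "sci", "sci", "alt", "sci"]

kappa = cohen_kappa(expert, model)  # ~0.82, above the 0.8 threshold
```

With this toy data, one disagreement in twelve ratings yields kappa of about 0.82, illustrating the reported threshold; the actual AACR models are trained on real student responses scored with discipline-specific rubrics.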
URL: http://www.enfusestem.org/projects/collaborative-research-expanding-a-national-network-for-automated-analysis-of-constructed-response-assessments-to-reveal-student-thinking-in-stem-5/
Refereed Designation: Refereed


This material is based upon work supported by the National Science Foundation (DUE grants: 1438739, 1323162, 1347740, 0736952 and 1022653). Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the NSF.