Automated analyses of written responses reveal student thinking in STEM

Poster session for Spring Conference on Teaching, Learning, and Student Success.

Formative assessments can provide crucial data to help instructors evaluate pedagogical effectiveness and address students’ learning needs. The shift to online instruction and learning in the past year emphasized the need for innovative ways to administer assessments that support student learning and success. Faculty often use multiple-choice (MC) assessments because of their ease of use and because of time and other resource constraints. While grading these assessments can be quick, the closed-ended nature of the questions often does not align with real scientific practices and can limit the instructor’s ability to evaluate the heterogeneity of student thinking. Students often have mixed understanding that includes scientific and non-scientific ideas. Open-ended or Constructed Response (CR) assessment questions, which allow students to construct scientific explanations in their own words, have the potential to reveal student thinking in a way MC questions do not. The results of such assessments can help instructors make decisions about effective pedagogical content and approaches. We present a case study of how results from administering a CR question via a free-to-use constructed response classifier (CRC) assessment tool led to changes in classroom instruction. The question was used in an introductory biology course and focuses on genetic information flow. Results from the CRC assessment tool revealed unexpected information about student thinking, including naïve ideas. For example, a significant fraction of students initially demonstrated mixed understanding of the process of DNA replication. We will highlight how these results influenced changes in pedagogy and content and, as a result, improved student understanding.


Description of the Poster 

Automated analyses of written responses reveal student thinking in STEM 

Jenifer N. Saldanha, Juli D. Uhl, Mark Urban-Lurain, Kevin Haudek 

Automated Analysis of Constructed Response (AACR) research group 

CREATE for STEM Institute, Michigan State University 

Email: jenifers@msu.edu 

Website: beyondmultiplechoice.org  


Key highlights: 

  • Constructed Response (CR) questions allow students to explain scientific concepts in their own words and reveal student thinking better than multiple choice questions. 
  • The Constructed Response Classifier (CRC) tool (free to use at beyondmultiplechoice.org) can be used to assess student learning gains.

In an introductory biology classroom: 

  • Analyses by the CRC tool revealed gaps in student understanding and non-normative ideas. 
  • The instructor incorporated short-term pedagogical changes and recorded some positive outcomes on a summative assessment. 
  • Additional pedagogical changes incorporated the next semester led to even more positive outcomes related to student learning (this semester included the pivot to online instruction). 

The results from this case study highlight the effectiveness of using data from the CRC tool to address student thinking and develop targeted instructional efforts to guide students towards a better understanding of complex biological concepts.   

Constructed Response Questions as Formative Assessments 

  • Formative assessments allow instructors to explore nuances of student thinking and evaluate student performance.  
  • Student understanding often includes scientific and non-scientific ideas [1,2].  
  • Constructed Response (CR) questions allow students to explain scientific concepts in their own words and reveal student thinking better than multiple choice questions [3,4]. 

Constructed Response Classifier (CRC) tool 

  • A formative assessment tool that automatically predicts ratings of student explanations. 
  • The Constructed Response Classifier (CRC) tool generates a report that includes: 
      • categorization of student ideas from writing related to conceptual understanding. 
      • web diagrams depicting the frequency and co-occurrence rates of the most used ideas and relevant terms. 

CRC Questions in the Introductory Biology Classroom: A Case Study 

Students were taught about DNA replication and the central dogma of biology. 

The question was administered as online homework, with completion credit provided. The collected responses were analyzed by the CRC tool. 

CRC question: 

The following DNA sequence occurs near the middle of the coding region of a gene. 
 DNA   5′  A A T G A A T G G* G A G C C T G A A G G A  3′     

There is a G to A base change at the position marked with an asterisk. Consequently, a codon normally encoding an amino acid becomes a stop codon.  
How will this alteration influence DNA replication? 

  • Part 1 of the CRC question was used to detect student confusion between the central dogma processes. 
  • Related to the Vision & Change core concept 3 “Information Flow, Exchange, and Storage” [5], adapted from the Genetics Concept Assessment [6,7]. 
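The expected reasoning behind this question can be sketched in code (a minimal illustration, not part of the poster; the codon table below is abbreviated to only the codons appearing in this sequence): the G-to-A change turns the tryptophan codon UGG into the stop codon UGA, which halts translation early, while DNA replication simply copies either sequence base by base.

```python
# Minimal codon table covering only the codons in this example sequence.
CODON_TABLE = {
    "AAU": "Asn", "GAA": "Glu", "UGG": "Trp", "GAG": "Glu",
    "CCU": "Pro", "GGA": "Gly", "UGA": "STOP",
}

def transcribe(coding_strand: str) -> str:
    """mRNA carries the coding-strand sequence with T replaced by U."""
    return coding_strand.replace("T", "U")

def translate(mrna: str) -> list:
    """Read codons 5'->3' in frame; halt at a stop codon."""
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        aa = CODON_TABLE[mrna[i:i + 3]]
        peptide.append(aa)
        if aa == "STOP":
            break
    return peptide

normal = "AATGAATGGGAGCCTGAAGGA"  # coding strand from the question
mutant = "AATGAATGAGAGCCTGAAGGA"  # G -> A at the asterisked position

print(translate(transcribe(normal)))  # full peptide: the Trp codon is intact
print(translate(transcribe(mutant)))  # translation halts at the new UGA stop
# DNA replication, by contrast, copies either sequence unchanged,
# which is why "it will stop the DNA replication" is an incorrect answer.
```

Running this shows the mutant peptide truncated after two amino acids, making concrete the distinction between translation (affected) and replication (unaffected) that the question probes.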

Insight on Instructional Efficacy from CRC Tool 

Table 1: The report score summary revealed that only a small fraction of students provided correct responses post instruction (N = 48 students). 

  Student responses        Spring 2019 
  Incorrect                45% 
  Incomplete/Irrelevant    32% 
  Correct                  23% 

Sample incorrect responses:  

Though both incorrect, the first response below demonstrates understanding of a type of mutation and the second one uses the context of gene expression. 

  1. “This is a nonsense mutation and will end the DNA replication process prematurely leaving a shorter DNA strand” (spellchecked) 
  2. “It will stop the DNA replication… This mutation will cause a gene to not be expressed” 

The CRC report provided: 

  • Response score summaries 
  • Web diagrams of important terms 
  • Term usage and association maps 

The instructor identified scientific and non-scientific ideas in student thinking. 

This led to: 

Short-term pedagogical changes, same semester 

  • During the end-of-semester material review, the instructor incorporated: 
      • small group discussions about the central dogma. 
      • discussions about the differences between DNA replication and the processes of transcription and translation. 
      • worksheets with questions on transcribing and translating sequences. 

Figure 1: 

The figure depicts an improvement in student performance observed on the final summative assessment. 

Percentage of students who scored more than 95% on a related question: 

  Unit exam = 71% 
  Final summative exam = 79% 

Pedagogical Changes Incorporated in the Subsequent Semester 

CR questions asked students to: 

  • Explain the central dogma. 
  • List similarities and differences between the processes involved. 

The instructor facilitated small group discussions for students to explain their responses. 

 

Worksheets and homework: 

  • Transcribe and translate DNA sequences, including ones with deletions/additions. 
  • Students were encouraged to create their own sequences for practice. 
  • DNA replication was revisited via clicker questions and discussions while students were learning about transcription and translation. 

Table 2: 68% of students in the new cohort provided correct responses to the CRC question post instruction (N = 47 students). 

  Student responses        Spring 2020 
  Incorrect                19% 
  Incomplete/Irrelevant    13% 
  Correct                  68% 

Conclusions 

The results from this case study highlight the effectiveness of using data from the CRC tool to address student thinking and develop targeted instructional efforts to guide students towards a better understanding of complex biological concepts.   

Future Directions 

  • Use the analytic rubric feature in the CRC tool to obtain further insight into normative and non-normative student thinking. 
  • Use the clicker-based case study available at CourseSource about the processes in the central dogma [8]. 
  • Incorporate additional CRC tool questions in each course unit. 

Questions currently available in a variety of disciplines: 

Biology, Biochemistry, Chemistry, Physiology, and Statistics 

Visit our website, beyondmultiplechoice.org, and sign up for a free account. 

References: 

  1. Ha, M., Nehm, R. H., Urban-Lurain, M., & Merrill, J. E. (2011). CBE—Life Sciences Education, 10(4), 379-393. 
  2. Sripathi, K. N., Moscarella, R. A., et al. (2019). CBE—Life Sciences Education, 18(3), ar37. 
  3. Hubbard, J. K., Potts, M. A., & Couch, B. A. (2017). CBE—Life Sciences Education, 16(2), ar26. 
  4. Birenbaum, M., & Tatsuoka, K. K. (1987). Applied Psychological Measurement, 11(4), 385-395. 
  5. American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC. 
  6. Smith, M. K., Wood, W. B., & Knight, J. K. (2008). CBE—Life Sciences Education, 7(4), 422-430. 
  7. Prevost, L. B., Smith, M. K., & Knight, J. K. (2016). CBE—Life Sciences Education, 15(4), ar65. 
  8. Pelletreau, K. N., Andrews, T., Armstrong, N., et al. (2016). CourseSource. 

Acknowledgments 

This material is based upon work supported by the National Science Foundation (DUE grant 1323162). Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the supporting agencies.