2061 Connections
An electronic newsletter for the science education community

January 2010

Learning from Wrong Answers

For many years, educators have been aware of the alternative ideas students hold about the natural world and the impact those ideas can have on their understanding of scientific phenomena. A great deal of systematic research has been done to document those alternative ideas, and Project 2061 is currently verifying much of that research independently by administering hundreds of test items on 16 core topics in science to thousands of students nationwide. In addition to describing the current state of students' science understanding, researchers at Project 2061 are also uncovering new misconceptions that have not previously been identified in the literature. The results from Project 2061's large-scale field tests of science assessment items can now provide researchers and educators with an up-to-date picture of the ideas that students use to explain the world around them.

The Project 2061 test items include misconceptions as answer choices so that researchers can analyze students' responses to determine the prevalence of particular misconceptions and to investigate where in a progression of understanding certain misconceptions peak and where they fade. Using Rasch analysis to estimate the probability that students along the ability continuum will hold a particular misconception, researchers can get a clearer picture of how students' understanding develops over time.
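The core of the Rasch approach described above can be illustrated with the standard dichotomous Rasch model, in which the probability of a given response depends only on the difference between a student's ability and an item's difficulty on a shared logit scale. The sketch below is a minimal illustration of that formula, not Project 2061's actual analysis code; the function name and parameters are for illustration only.

```python
import math

def rasch_probability(ability, difficulty):
    """Dichotomous Rasch model: probability that a student at a given
    ability level gives a particular response to an item of a given
    difficulty. Both parameters are expressed on the same logit scale,
    so a student whose ability equals the item's difficulty has a
    probability of exactly 0.5.
    """
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A student well above an item's difficulty is very likely to succeed;
# a student well below it is very unlikely to.
p_matched = rasch_probability(0.0, 0.0)   # ability equals difficulty
p_strong = rasch_probability(2.0, 0.0)    # ability two logits above
```

Extending this model to each answer choice, rather than only to the correct answer, is what lets researchers trace how the appeal of a specific misconception rises and falls along the ability continuum.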

At a recent meeting for National Science Foundation-funded grantees, Project 2061’s deputy director George DeBoer and research associate Cari Herrmann-Abell presented a poster and paper describing Project 2061’s process for developing assessment items and the insights they are gaining from their large-scale testing.

Insight into Student Understanding
When Project 2061's items are field tested with large national samples of students, their answers, whether correct or incorrect, can reveal a great deal about where they are in their understanding of the targeted ideas. For their presentation, DeBoer and Herrmann-Abell drew on results from field tests in middle school chemistry conducted in 2007 and 2008 with more than 7,000 students.

“We are field testing a very wide range of students,” said DeBoer, “so we can gauge the prevalence and strength of particular misconceptions with students of different ability levels. Rasch modeling allows us to make these estimates answer choice by answer choice, item by item, and across clusters of items.” 

Using several example items designed to test for knowledge of atoms, molecules, and states of matter, as well as basic ideas about chemical reactions, the research team's presentation discusses how Rasch analysis has helped them detect predictable sequences or hierarchies of misconceptions—that is, misconceptions that appear and then fade in a predictable order as students make progress in their understanding of particular ideas.
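One simple way to picture how such a hierarchy could be detected is to bin students by an ability estimate, compute each distractor's selection rate within each bin, and then order the distractors by the ability level at which they peak. The sketch below is an illustrative simplification under that assumption, not the team's actual Rasch-based procedure; the function and data layout are hypothetical.

```python
from collections import defaultdict

def misconception_peaks(responses, n_bins=5):
    """Illustrative sketch: given (ability, answer_choice) pairs, bin
    students by ability, compute each answer choice's selection rate
    per bin, and return the bin index where each choice peaks.
    Distractors that peak in lower-ability bins would sit earlier in a
    hypothesized hierarchy of misconceptions.
    """
    abilities = sorted(a for a, _ in responses)
    lo, hi = abilities[0], abilities[-1]
    width = (hi - lo) / n_bins or 1.0
    counts = defaultdict(lambda: [0] * n_bins)  # choice -> count per bin
    totals = [0] * n_bins                       # students per bin
    for ability, choice in responses:
        b = min(int((ability - lo) / width), n_bins - 1)
        counts[choice][b] += 1
        totals[b] += 1
    peaks = {}
    for choice, per_bin in counts.items():
        rates = [c / t if t else 0.0 for c, t in zip(per_bin, totals)]
        peaks[choice] = max(range(n_bins), key=rates.__getitem__)
    return peaks
```

In practice, a Rasch-based analysis fits smooth probability curves for each answer choice rather than raw bin rates, but the ordering idea is the same: a misconception that peaks among lower-ability students and fades among higher-ability ones marks an earlier stage of understanding.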

“We’re also studying how these sequences of misconceptions play out for other topics and think our data will be a significant contribution to the study of how and when students learn key concepts in science,” DeBoer said.

The Project 2061 research team will be presenting more about their assessment work and findings at the upcoming National Association for Research in Science Teaching (NARST) Annual Conference in Philadelphia, March 20-24, 2010. Look for more information in the next issue of Project 2061 Connections.

Read the paper [PDF, 101KB].  View the poster [PDF, 1447KB].

Find out more about Project 2061's assessment research and development efforts.

# # #

For more information, please contact:

Director: Dr. Jo Ellen Roseman, (202) 326-6752
Deputy Director: Dr. George DeBoer, (202) 326-6624
Research Associate: Dr. Cari Herrmann-Abell, (202) 326-6648
