2061 Connections
An electronic newsletter for the science education community

January/February 2007

Student Feedback Takes Center Stage

Posters highlight assessment of force and motion concepts

What do researchers learn when they compare students’ test answers to the explanations they give for their answers? Often, as Project 2061 is finding out, they learn that test items have a long way to go before they can be considered accurate measures of what the students know. In some cases, students choose the correct answer but don’t really know the targeted science idea. In others, students are too confused by ambiguous language or unclear diagrams to demonstrate their science knowledge.

As part of its development of an online collection of model assessment items and resources, AAAS Project 2061 is making the most of data collected from student interviews and pilot tests. The student feedback is proving essential as researchers seek to design test items that gauge as precisely as possible students' understanding of the key ideas in AAAS's Benchmarks for Science Literacy and the National Research Council's National Science Education Standards (read an overview of the assessment project).

In January, research associate Dr. Thomas Regan traveled to the American Association of Physics Teachers (AAPT) Winter Meeting in Seattle to share what the research team has learned so far about middle grades items targeting key force and motion concepts.

Learning from the Students

Regan presented two posters that highlight how the student data is informing Project 2061's design of multiple-choice assessment items. Regan spoke with teachers, professors of education, graduate students, state administrators, and education outreach officers from national laboratories, all of whom could use the assessment resources in their work.

"There's a real need for assessment items that are aligned to the specific ideas that standards expect students to know," says Regan. "The educators I met with appreciated the detailed information we are gathering about each item when we ask students in pilot tests to comment on every answer choice, not just the one they chose. By finding out if an incorrect answer choice is a plausible distractor, we learn a lot more about student thinking and the features of items that need improvement."

The posters present examples that show how student feedback from pilot testing and interviews is being used to improve test items in terms of vocabulary, context, incorrect answer choices (distractors), comprehensibility, grade-level suitability, and effective representations.

The test items featured on the posters are still in development, and researchers continue to modify them in a number of areas (not all of which are indicated on the posters).

As the research team develops assessment items on a number of key topics for the online collection, they are supplementing the items with a wealth of related resources: (1) clarifications of each key science idea that pinpoint what students are expected to know, (2) common student misconceptions identified by research and useful for designing distractors, and (3) assessment maps that show how key ideas build toward student understanding.

For Regan, the usefulness of the forthcoming collection for science educators lies in how all of these resources work together. “Project 2061's goal in this project is to put together a model system of the components needed for effective standards-based assessment,” says Regan. “Teachers and researchers alike will be able to use the items and other resources to study and improve assessment in the context of classrooms or statewide tests, or as part of curriculum development projects.”

# # #

For more information about Project 2061's assessment research, please contact:

Principal Investigator: Dr. George DeBoer, (202) 326-6624
