2061 Connections
An electronic newsletter for the science education community

July 2004

Building a Collection of Test Items Aligned to Standards

Sometime in elementary school, most students will be expected to know that there is a relationship between the features of plants and animals and the ability of those plants and animals to survive in particular environments. This idea is found in national content standards for science as well as in most state standards.

Now consider the following assessment task and the likelihood that students' answers will shed light on what they know—or do not know—about the concept above:

Draw a place where plants and animals live. Be sure to show at least two kinds of plants and animals that live in the place you draw.

Although the task provides an opportunity for students to describe through their drawings what they know about animal and plant features and about the characteristics of the environment where those plants and animals live, it does not ask them to link the two, which is at the heart of this particular concept. This lack of precision in testing is fairly common today, in both classroom assessments and high-stakes state tests. As a result, critical decisions are being made based on data from assessments that are poorly aligned with the content standards for which students, teachers, and schools are being held accountable. The federal No Child Left Behind Act of 2001 now requires that statewide assessments be based on each state's content standards, so the need for assessment items that are closely aligned to those standards has become more urgent than ever.

As one of the first organizations to focus on content standards and their role in curriculum, instruction, and assessment, AAAS's Project 2061 has been studying the alignment and effectiveness of science and mathematics test items drawn from a wide variety of sources. Working with teams of experienced educators and assessment specialists, Project 2061 has developed a set of criteria and a procedure for analyzing and profiling assessment items, both for their alignment with content standards and for other characteristics that affect how useful they are in revealing what students know about specific ideas. The procedure considers whether the ideas in the content standard are necessary to complete the assessment task successfully or whether the task can be completed in some other way, and whether those ideas are sufficient by themselves or whether other ideas and skills are also needed. The procedure also involves analyzing the task for

  • comprehensibility;

  • susceptibility to test-wise solution strategies;

  • bias related to gender, class, race, and ethnicity; and

  • appropriateness of the task context.

Project 2061's criteria and procedures are being used to study assessment items of all types—from selected-response items such as multiple-choice questions to more involved performance tasks—and to analyze items used for both diagnostic and evaluative purposes.

The May 2004 issue of 2061 Connections described how Project 2061 is launching a new effort funded by the National Science Foundation to help science and mathematics teachers, curriculum and test developers, and education researchers meet the challenges of standards-based assessment. Here we offer a more detailed look at the scope of that work and its possible applications. By the end of this five-year effort, we expect to produce:

1. An online collection of more than 300 science and mathematics assessment items. Focusing on assessments needed for science and mathematics in grades 6 through 10, the collection will allow users to search for items that are well aligned to learning goals in Project 2061's Benchmarks for Science Literacy, the National Research Council's National Science Education Standards, the National Council of Teachers of Mathematics' Principles and Standards for School Mathematics, and the content standards of nearly every state. Each item in the collection will also be reviewed for its suitability for use with a wide range of students, including English language learners.

To build the collection, we plan to screen hundreds of existing middle and early high school science and mathematics assessment items from as many sources as possible, including released items from the Third International Mathematics and Science Study, the National Assessment of Educational Progress, and a variety of state tests. Following the initial screening and sorting, items will undergo a more rigorous analysis to describe their precise alignment to the ideas being targeted by the standards. Items will be field tested with students throughout the development process.

Each entry in the collection will include the assessment item itself along with a profile of the item describing its match to specific science or mathematics standards and the knowledge needed to answer the item correctly. The profile will also point out whether the item can be used to determine if students hold common misconceptions about particular ideas; whether the item is likely to be approached differently by diverse learners (taking into account the item's use of visuals, linguistic demands, and so on); and whether the item measures declarative knowledge (concepts), procedural knowledge (skills), or contextual knowledge (applications). Users will be able to search for and retrieve items based on the features described in the profiles.
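To make the idea of a searchable item profile more concrete, here is a minimal, hypothetical sketch in Python. The record fields and the find_items helper are illustrative assumptions based on the description above; they are not the project's actual data model or search interface.

```python
from dataclasses import dataclass
from enum import Enum
from typing import List

class KnowledgeType(Enum):
    # The three kinds of knowledge a profile might record, per the description above.
    DECLARATIVE = "concepts"
    PROCEDURAL = "skills"
    CONTEXTUAL = "applications"

@dataclass
class ItemProfile:
    """Hypothetical record for one assessment item and its profile."""
    item_text: str                   # the assessment task itself
    targeted_standards: List[str]    # identifiers of the benchmarks/standards the item targets
    knowledge_needed: List[str]      # ideas a student needs to answer the item correctly
    probes_misconceptions: bool      # can the item reveal common misconceptions?
    diverse_learner_notes: str       # notes on visuals, linguistic demands, etc.
    knowledge_type: KnowledgeType    # declarative, procedural, or contextual

def find_items(collection: List[ItemProfile],
               standard: str,
               knowledge_type: KnowledgeType) -> List[ItemProfile]:
    """Return items in the collection aligned to a given standard and knowledge type."""
    return [item for item in collection
            if standard in item.targeted_standards
            and item.knowledge_type == knowledge_type]
```

A user query such as "declarative-knowledge items aligned to a particular benchmark" would then amount to a simple filter over these profile fields, which is the kind of retrieval the planned search features describe.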

2. Assessment maps that provide a conceptual framework for selecting items. For each of the 16 science and mathematics topic areas covered by the collection, we are creating an assessment map to display connections among ideas related to the relevant content standards. The maps are adapted from those in Project 2061's Atlas of Science Literacy (co-published with the National Science Teachers Association) and will give test developers a convenient visual boundary around the ideas they might want to test. At the same time, the maps will allow them to choose assessment items that can yield diagnostic information about student learning, especially with respect to misconceptions and prerequisite knowledge that pertain to specific ideas on the maps. The maps will serve as the main interface for the collection. Users will be able to click on specific ideas on the maps to access the items and other resources in the collection.

3. Clarifications of content standards. The collection will also include clarification statements for each of the content standards for which assessment items are provided. Although national content standards (and some state standards) often include explanations of the kinds of instructional activities that can be used to advance student learning with regard to the ideas and skills being targeted, they rarely provide much guidance on how students should be assessed. What is more, the exact meaning of content standards is not always immediately evident, and yet teachers, curriculum and test developers, and researchers need a clear sense of what students are expected to know and what constitutes evidence of that knowledge. The clarifications in the collection will suggest ways in which students might demonstrate or apply their knowledge, describe assessment task contexts that are appropriate and engaging to students at a particular age, and specify a range of cognitive skills that students might reasonably be expected to use to demonstrate what they know and can do.

Your Input Needed
Over time, the collection will become a permanent part of the Project 2061 Web site and will be updated and expanded regularly with new maps, clarifications, and high-quality, well-aligned assessment items. We would appreciate receiving test items that you believe are aligned to particular learning goals in Benchmarks or National Science Education Standards. Please forward such test items to us for consideration. And if you are interested in field testing items in your classrooms as the work proceeds, we would welcome your participation. As our new effort gets underway and prototypes become available online, please let us know what you think and how you might make use of these assessment resources.

For more information, please contact:
Principal Investigator: Dr. George DeBoer, (202) 326-6624

