Reprinted here with the permission of the National Science Teachers Association. No further republication or redistribution is permitted without the written permission of the editor.

Science Teacher, January 1997 - Volume 64

Lessons from Project 2061

Practical ways to implement benchmarks and standards

Jo Ellen Roseman
Project 2061

Since its inception in 1985, Project 2061 of the American Association for the Advancement of Science has pushed to reform K-12 education around clear statements of what students should know and be able to do. The Project’s 1989 report, Science for All Americans, provides a concise but broad definition of adult science literacy, encompassing ideas from the natural and social sciences, mathematics, and technology. Benchmarks for Science Literacy, published in 1993, provides further help to reformers by translating the ideas in Science for All Americans into a coherent set of specific learning goals for grades K-2, 3-5, 6-8, and 9-12. Meanwhile, a standards-based reform movement has taken hold throughout K-12 education, with each discipline developing its own set of learning goals or content standards.

Creating standards and building consensus for them among disciplinary experts and educators has been a complex undertaking, but the much more difficult task of implementation still lies ahead. Project 2061 has been working closely with teachers, curriculum specialists, teacher educators, and materials developers to help them understand and use Benchmarks for Science Literacy. Our experiences with educators in hundreds of workshops make it clear that using learning goals such as benchmarks or standards will require thoughtful effort and close attention to the meaning of those goals. A new kind of professional development may be needed as well, one that emphasizes the role of classroom teachers as decision makers and prepares them to be effective implementers of standards-based reform.

This paper will look at some of the lessons about standards-based reform that Project 2061 has learned. It makes a case for the careful study of any set of specific learning goals used to reform curriculum and instruction, describes some practical study techniques, and outlines a procedure for using benchmarks to analyze a curriculum material. These suggestions offer some practical directions for implementing a coherent set of specific learning goals--whether benchmarks or standards--as the basis for reform.

What Does a Benchmark Mean?

Project 2061 put its benchmarks into clear language that reflects the level of sophistication actually expected from students. Even so, people frequently interpret benchmarks in terms of their own current understanding of science and the current curriculum of the state, school district, or school. Often, they read extra meanings into benchmarks. For example, consider this K-2 goal from Benchmarks Section 4B, The Earth:

Water left in an open container disappears, but water in a closed container does not disappear.
(American Association for the Advancement of Science [AAAS], 1993)

When participants at Project 2061 workshops encounter this benchmark, some of them see it as an expectation for K-2 students to understand the mechanism of evaporation, including molecules, invisible vapor, and the term “evaporation” itself. Teachers who currently teach evaporation in grades K-2 are even more inclined to interpret the benchmark this way. In fact, the benchmark says nothing about “evaporation”; it merely describes an observable phenomenon, which is all some benchmarks do. Later benchmarks build on this observation, and add the idea of air as a gas, to develop the notion of evaporation.

On the other hand, some of our workshop participants underestimate what a benchmark requires. Consider this grade 6-8 benchmark from Benchmarks Section 5E, Flow of Matter and Energy:

Food provides molecules that serve as fuel and building material for all organisms. First, plants use the energy from light to make sugar molecules from carbon dioxide and water. This food can be used immediately by the plant or stored for later use. Organisms that eat plants break down the plant structures to obtain the materials and energy they need to survive. Then they are consumed by other organisms. (AAAS, 1993)

Participants might interpret this benchmark to mean that students should understand food chains merely in terms of “what eats what,” not the successive breakdown and reassembly of invisible molecular units. Participants less familiar with the molecular nature of matter and how it relates to living systems may neglect the word “molecules” in the benchmark.

Is such meticulous attention to wording within benchmarks—or, for that matter, any specific learning goals—really that important? Yes, if the benchmark or learning goal is to inform decisions about the curriculum. And it’s becoming clear to us through our workshops that the more educators think about and work with benchmarks, the more likely they are to see such learning goals as a strong foundation upon which to make important choices about what they teach and how they teach it.

Studying Learning Goals

Project 2061 has developed a procedure for studying benchmarks that helps educators probe their meaning and realize how important it is to do so. The procedure used by Project 2061 workshop participants can easily be modified for the study of National Science Education Standards or other specific learning goals. What is critical is to search for the full meaning of a particular learning goal through careful consideration and study of the following:

Adult literacy goals. For each Benchmarks section, there is a corresponding Science for All Americans section describing adult science literacy goals for that topic; it can help participants understand where the benchmark is aiming.

K-12 context. A review of benchmarks for other grade levels from the same Benchmarks section helps participants understand the level of sophistication intended by the benchmark.

Instructional strategy. The introductory essays in the Benchmarks section for the benchmark being studied help participants understand difficulties students may have with the benchmark topic and offer some suggestions for helping students achieve the benchmark.

Research base. Summaries of research on the topic from Benchmarks Chapter 15, The Research Base, suggest likely limitations in student understanding of the benchmark and, therefore, imply an appropriate grade level for the benchmark. They also point participants to the original research articles.

Growth-of-understanding maps. A relevant conceptual strand map, developed for Benchmarks on Disk, depicts a K-12 sequence of benchmarks for a particular Science for All Americans idea. The maps help participants see how other benchmarks relate to and contribute meaning to the benchmark being studied.

Consider again the K-2 benchmark on the water cycle. By consulting the relevant section of Science for All Americans, educators see this early benchmark as a step toward understanding climatic patterns. By examining other K-12 benchmarks from the same section, and discovering that Benchmarks delays until grades 3-5 the idea of liquid water turning into a gas (vapor), educators are less likely to read too much into the K-2 benchmark. And from the essays and research, they learn that evaporation is difficult to understand even for upper elementary students and should be reserved for middle-school instruction.

Comments of workshop participants before and after in-depth study of benchmarks [see box, below] convince us that reform will depend on thorough understanding of learning goals—benchmarks, the “fundamental concepts” in National Science Education Standards, or any statement of comparable specificity.

“Water left in an open container disappears, but water in a closed container does not disappear.”

Before using the procedure described in this article to study the full meaning of a benchmark, Project 2061 workshop participants interpreted this K-2 benchmark to mean that students should:

“...learn about evaporation.”
“...understand states of matter and changes of state.”
“...have a quantitative sense of volume.”
“...should know about properties of water—for example, that water is a colorless liquid.”

After following the study procedure, participants had changed their thinking about the meaning of the benchmark:

“This is just about what happens when the container has a lid versus when it doesn’t.”
“This is strictly about observation—no explanation is needed.”
“The map shows that K-2 students aren’t held accountable for the term ‘evaporation’.”
“We can expect that by the end of grade 2 students will be familiar with this phenomenon.”
“Research says it’s OK to stop with the phenomenon.”

Putting Learning Goals to Work

To make study of benchmarks more relevant to the immediate needs of educators, Project 2061 has developed activities that engage participants in using benchmarks for a variety of practical purposes: to analyze curriculum frameworks, to analyze or design instruction, to select and adapt curriculum materials, and so on. Each use requires careful reading of benchmarks. The five-step procedure described here helps educators analyze how well a particular curriculum material addresses Benchmarks and then consider how the material could better serve particular benchmarks. Again, the technique could just as easily be applied to studying how well a particular curriculum material serves National Science Education Standards or some other set of learning goals.

  • Step 1: Identify benchmarks that appear to be covered by the curriculum material. After reviewing the curriculum material and relevant sections of Benchmarks, educators develop a list of specific benchmarks the material might address. Next, looking more closely at the curriculum material, they reduce their list of benchmarks to only those on which the material actually focuses.
  • Step 2: Study the benchmarks to clarify their meaning. Selecting one benchmark from their list, participants then examine relevant passages in Science for All Americans and Benchmarks according to the procedure described earlier in this article.
  • Step 3: Analyze the material to determine the extent to which it addresses the actual content of the benchmarks. Educators evaluate how well activities in the material address the substance of the benchmarks on the list, whether the activities are appropriate for the intended grade level, and whether the whole benchmark is addressed. They may find, for example, that a set of lessons on the water cycle, advertised as K-2, focuses on evaporation and hence addresses benchmarks for grades 3-5.
  • Step 4: Estimate the likelihood that students will learn the ideas in the benchmarks from the prescribed activities. Next, they consider whether the material’s activities provide students with concrete experiences, opportunities to reflect on them, and opportunities to explore concepts in varied contexts.
  • Step 5: Improve the match by adapting and supplementing the activities in the material. Based on the results of this procedure, educators then suggest ways to revise activities. For example, they might add questions to prompt student reflection, or identify additional contexts in which students can explore concepts.

Educators who have examined materials in this way at Project 2061 workshops find themselves less likely to accept at face value claims that a given curriculum material helps students learn a particular set of goals. They agree that such analyses are well worth the time and effort expended.

Reaching More Educators: Resources for Science Literacy

The number of intensive workshops that Project 2061 can offer does not meet the growing demand for help in using Benchmarks, let alone the new science education standards, mathematics standards, and so on. To reach more educators, Project 2061 has developed a new CD-ROM tool, Resources for Science Literacy: Professional Development, scheduled for release in 1997. This tool is designed for teacher preparation programs, in-service staff-development programs, and individual study. It includes the Project 2061 Workshop Guide and a set of sample workshops, along with information on university course syllabi and science trade books to improve teachers’ understanding of the science, mathematics, and technology that constitute science literacy; comparisons of Benchmarks to national content standards in science, mathematics, and social studies; and descriptions of research articles and videos that illustrate difficulties students have learning science concepts.

Project 2061 is also working with teachers, teacher educators, and materials developers to refine and field-test its procedure for analyzing curriculum materials for their match to specific learning goals. The procedure will be published, along with sample analyses, on Resources for Science Literacy: Curriculum Materials, a companion to Resources for Science Literacy: Professional Development.

More Work Ahead

The challenges described here raise important issues, not just for Project 2061 or for science education, but for all reform efforts that are based on the adoption of specific learning goals or content standards. Our experience suggests that goals-based reform is far more radical than it may appear at first glance. For such reform to take hold, educators will need a great deal of help and support in understanding and implementing benchmarks, standards, or other such learning goals.


References

American Association for the Advancement of Science. (1990). Science for all Americans. New York: Oxford University Press.

American Association for the Advancement of Science. (1993). Benchmarks for science literacy. New York: Oxford University Press.

National Research Council. (1996). National science education standards. Washington, DC: National Academy Press.

Roseman, J. E. (1997). Lessons from Project 2061: Practical ways to implement benchmarks and standards. Science Teacher, 64.