The Project 2061 Curriculum-Analysis Procedure

(Reprinted here from the TIMSS Resource Kit)


Deciding which curriculum materials to use is one of the most important professional judgments that educators make. Textbook adoption committees make recommendations that influence instruction for years to come, and the daily decisions teachers make about which teaching units or chapters to use, and how to use them, largely determine what and how students will be expected to learn.

Such important decisions require a valid and reliable method for evaluating the quality of curriculum materials. Even an in-depth review of the topics covered by a textbook or a teaching unit may not be sufficient to determine whether the material will actually help students learn that content. What is needed is a manageable process for examining curriculum materials that gets below the surface by focusing intensely on the appropriateness of content and the utility of instructional design.

With funding from the National Science Foundation and in collaboration with hundreds of K-12 teachers, curriculum specialists, teacher educators, scientists, and materials developers, Project 2061 of the American Association for the Advancement of Science (AAAS) has been developing a process for analyzing curriculum materials. Field tests suggest that Project 2061's curriculum-analysis procedure will not only serve the materials adoption needs of the schools but also help teachers revise existing materials to increase their effectiveness, guide developers in the creation of new materials, and contribute to the professional development of those who use it.

Specific Learning Goals Are Key

Until recently, there was nothing against which to judge appropriateness of content and utility of instructional design. Now, as a result of the standards-based reform movement in education, these judgments can be made with a high degree of confidence. In science and mathematics, for example, the appearance of Science for All Americans (AAAS, 1989), Curriculum and Evaluation Standards for School Mathematics (National Council of Teachers of Mathematics, 1989), Benchmarks for Science Literacy (AAAS, 1993), and National Science Education Standards (National Research Council, 1996) has made it possible to make more thoughtful decisions about curriculum materials than ever before.

Although the Project 2061 curriculum-analysis procedure was developed using the learning goals in Benchmarks and the mathematics and science standards, subsequent work has indicated that some state education frameworks also can be used. Indeed, the process would seem to apply to any K-12 school subject for which specific learning goals have been agreed upon. These goals must be explicit statements of what knowledge and skills students are expected to learn, and they must be precise. Vague statements such as "students should understand energy" are not adequate. Instead, consider this benchmark dealing with energy-related concepts that students should know by the end of the eighth grade:

Most of what goes on in the universe, from exploding stars and biological growth to the operation of machines and the motion of people, involves some form of energy being transformed into another. Energy in the form of heat is almost always one of the products of an energy transformation.

Similar explicit statements can be found in the fundamental concepts of the National Research Council's National Science Education Standards (NSES).

At its simplest level, the Project 2061 curriculum-analysis procedure involves the following five steps:

  • Identify specific learning goals to serve as the intellectual basis for the analysis. This is done before beginning to examine any curriculum materials. The source for appropriate goals can be national standards or benchmark documents such as those mentioned above, state or local standards and curriculum frameworks, or sources like them. To be useful, the goals must be precise in describing the knowledge or skills they intend students to have. If the set of goals is large, a representative sample of them should be selected for purposes of analysis.
  • Make a preliminary inspection of the curriculum materials to see whether they are likely to address the targeted learning goals. If there appears to be little or no correspondence, the materials can be rejected without further analysis. If the outlook is more positive, go on to a content analysis.
  • Analyze the curriculum materials for alignment between content and the selected learning goals. The purpose here is to determine, citing evidence from the materials, whether the content in the material matches specific learning goals, not just whether the topic headings are similar. At the topic level, alignment is never difficult, since most topics (heredity, weather, magnetism, and so forth) lack specificity, making them easy to match. If the results of this analysis are positive, then reviewers can take the next step.
  • Analyze the curriculum materials for alignment between instruction and the selected learning goals. This involves estimating the degree to which the materials (including their accompanying teacher's guides) reflect what is known generally about student learning and effective teaching and, more important, the degree to which they support student learning of the specific knowledge and skills for which a content match has been found. Again, evidence from the materials must be shown.
  • Summarize the relationship between the curriculum materials being evaluated and the selected learning goals. The summary can take the form of a profile of the selected goals in terms of the content and instruction criteria, or a profile of the criteria in terms of the selected goals. In either case, a statement of strengths and weaknesses should be included. With this information in hand, reviewers can make more knowledgeable adoption decisions and suggest ways for improving the examined materials.

In addition to its careful focus on matching content and instruction to very specific learning goals, the Project 2061 procedure has other features that set it apart. For example, its emphasis on collecting explicit evidence (citing page numbers and other references) of a material's alignment with learning goals adds rigor and reliability to decisions about curriculum materials. Similarly, the Project 2061 procedure calls for a team approach to the analytical task, thus providing opportunities for reviewers to defend their own judgments about materials and to question those of other reviewers. These and other characteristics help make participation in the analytical process itself a powerful professional development experience.

The Project 2061 Curriculum-Analysis Procedure in Detail

To provide a better sense of how the procedure works, the following describes in more detail each step in the procedure, using learning goals from Project 2061's Benchmarks for Science Literacy as an illustrative frame of reference. The description pays particular attention to the various criteria used to evaluate the instructional effectiveness of materials.

Identify specific learning goals to serve as the intellectual basis for the analysis.

After reviewers have agreed upon a set of learning goals as a framework for the analysis (in this case, the benchmarks in Benchmarks for Science Literacy), the task is then to choose specific benchmarks that will serve as the focus of further study.

When evaluating standalone curriculum units that cover a relatively short period of time, it might be possible and worthwhile to analyze all of the benchmarks that appear to be targeted by the material. However, in the evaluation of year-long courses or multiyear programs, this becomes impractical. Therefore, a crucial step in the analysis procedure is the sampling of a few benchmarks that will lead to valid and reliable generalizations about the material.

The sample of benchmarks should be representative of the whole set of goals specified in Benchmarks for Science Literacy and should reflect the reviewers' needs. For example, if the review committee's task is to select a high-school biology course that is aligned with Benchmarks, it might identify a sample of benchmarks from life science sections in Benchmarks (e.g., cells, heredity, and evolution) and from other sections (e.g., nature of scientific inquiry, models, and communication skills). When examining middle-school science materials, one would probably want to broaden the range of benchmarks examined to include some from physical and earth science topics (e.g., energy, forces, and processes that shape the earth).

Make a preliminary inspection of the curriculum materials to see whether they are likely to address the targeted learning goals.

Once benchmarks have been selected, the next step is to make a first pass at the materials to identify those whose content appears to correspond reasonably well to Benchmarks. Materials that do not meet these initial criteria are not analyzed further.

Reviewers then examine materials on the shortened list more carefully to locate and record places where each selected benchmark seems to be targeted (e.g., particular readings, experiments, discussion questions). If several sightings are found for some or all of the sample benchmarks in the material, then these sightings will be looked at more carefully in subsequent steps of the analysis. If, on the other hand, sightings cannot be found for a significant number of the sample benchmarks, then the material is dropped from the list.
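The sighting-based screen just described can be sketched in code. The sketch below is purely illustrative and not part of the published procedure: the function name, the benchmark codes, and the cutoff (sightings for at least half of the sampled benchmarks) are all assumptions for the example; the procedure itself leaves the threshold to the review committee's judgment.

```python
# Hypothetical sketch of the preliminary screen: keep only materials that
# record at least one "sighting" (a reading, experiment, or discussion
# question) for a sufficient share of the sampled benchmarks.

def passes_preliminary_screen(sightings, sampled_benchmarks, min_fraction=0.5):
    """sightings maps a benchmark id to the list of places it was sighted.
    min_fraction is an assumed cutoff, not prescribed by the procedure."""
    covered = sum(1 for b in sampled_benchmarks if sightings.get(b))
    return covered / len(sampled_benchmarks) >= min_fraction

# Example: a material with sightings for 3 of 4 sampled benchmarks.
sample = ["4E/1", "4E/2", "1B/1", "12D/3"]  # benchmark codes (illustrative)
material_sightings = {
    "4E/1": ["p. 12 reading", "p. 20 experiment"],
    "4E/2": ["p. 35 discussion question"],
    "1B/1": ["p. 48 activity"],
    # no sightings recorded for 12D/3
}
print(passes_preliminary_screen(material_sightings, sample))  # prints True
```

A material with sightings for too few of the sampled benchmarks would return False and be dropped from the list, exactly as the paragraph above describes.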

Analyze the curriculum materials for alignment between content and the selected learning goals.

This analysis is a more rigorous examination of the link between the subject material and the selected learning goals and involves giving close attention to both ends of the match: the precise meaning of the benchmark on one end and the precise intention of the material on the other.

With respect to each of the sampled benchmarks, the material is examined using such questions as:

  • Does the content called for in the material address the substance of a specific benchmark or only the benchmark's general "topic"?
  • Does the content reflect the level of sophistication of the specific benchmark, or are the activities more appropriate for targeting benchmarks at an earlier or later grade level?
  • Does the content address all parts of a specific benchmark or only some? (While it is not necessary that any particular unit would address all of the ideas in a benchmark or standard, the K-12 curriculum as a whole should do so. The purpose of this question is to provide an account of precisely what ideas are treated.)

In addition, an attempt is made to estimate the degree of overlap between the material's content and the set of benchmarks of interest. Thus, this step in the analysis is designed to answer questions regarding the material's inclusion of content that is not required for reaching science literacy and the extent to which the material distinguishes between essential and non-essential content. (While distinguishing content essential for literacy from non-essential content in material might seem to be a luxury, it assists teachers in determining the range of students for which the material can be used. Identifying the non-essential material makes it easier for the teacher to direct better students to enrichment activities and allows students themselves to avoid overload from ideas that go beyond what is vital.)
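The overlap estimate can likewise be sketched as a simple set comparison. The idea labels below are invented for illustration and are not drawn from Benchmarks; the sketch only shows the kind of bookkeeping involved in separating benchmark-relevant content from content not required for science literacy.

```python
# Hypothetical sketch of the overlap estimate: treat the material's content
# and the benchmark set as collections of specific ideas and compare them.

benchmark_ideas = {"energy transformation", "heat as byproduct",
                   "conservation of energy"}
material_ideas = {"energy transformation", "heat as byproduct",
                  "history of steam engines", "trivia about inventors"}

addressed = benchmark_ideas & material_ideas    # benchmark content covered
extraneous = material_ideas - benchmark_ideas   # not needed for literacy

coverage = len(addressed) / len(benchmark_ideas)
print(f"coverage: {coverage:.2f}")  # prints "coverage: 0.67"
print(sorted(extraneous))
```

Flagging the `extraneous` set is what lets a teacher route stronger students to enrichment activities while sparing others from overload, as noted above.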

Analyze the curriculum materials for alignment between instruction and the selected learning goals.

The purpose here is to estimate how well material addresses targeted benchmarks from the perspective of what is known about student learning and effective teaching. The criteria for making the judgments in the instructional analysis are derived from research on learning and teaching and on the craft knowledge of experienced educators. In the context of science literacy, these are summarized in Chapter 13, "Effective Learning and Teaching," of Science for All Americans; in Chapter 15, "The Research Base," of Benchmarks for Science Literacy; and in Chapter 3, "Science Teaching Standards," of National Science Education Standards.

From these sources, seven criteria clusters (shown below) have been identified to serve as a basis for the instructional analysis (for the specific questions within each cluster, see Appendix on page 137). The proposition here is that (1) the analysis would tie the instruction to each one of the sample benchmarks rather than look at instructional strategies globally and (2) in the ideal, all questions within each cluster would be well addressed in any material; they are not alternatives.

  • Cluster I. Providing a Sense of Purpose: Part of planning a coherent curriculum involves deciding on its purposes and on which learning experiences will likely contribute to those purposes. But while coherence from the curriculum designers' point of view is important, it may not give students an adequate sense of what they are doing and why. This cluster includes criteria to determine whether the material attempts to make its purposes explicit and meaningful to students, either by itself or by instructions to the teacher.
  • Cluster II. Taking Account of Student Ideas: Fostering better understanding in students requires taking time to attend to the ideas they already have, both ideas that are incorrect and ideas that can serve as a foundation for subsequent learning. Such attention requires that teachers be informed about the prerequisite ideas and skills needed for understanding a benchmark and about their students' initial ideas, in particular the ideas that may interfere with learning the scientific information. Moreover, teachers can help address students' ideas if they know what is likely to work. This cluster examines whether the material contains specific suggestions for identifying and relating to student ideas.
  • Cluster III. Engaging Students with Phenomena: Much of the point of science is explaining phenomena in terms of a small number of principles or ideas. For students to appreciate this explanatory power, they need to have a sense of the range of phenomena that science can explain. "Students need to get acquainted with the things around them (including devices, organisms, materials, shapes, and numbers) and to observe them, collect them, handle them, describe them, become puzzled by them, ask questions about them, argue about them, and then try to find answers to their questions." (Science for All Americans, p. 201) Furthermore, students should see that the need to explain comes up in a variety of contexts.
  • Cluster IV. Developing and Using Scientific Ideas: Science for All Americans includes in its definition of science literacy a number of important yet quite abstract ideas (e.g., atomic structure, natural selection, modifiability of science, interacting systems, common laws of motion for earth and heavens). Such ideas cannot be inferred directly from phenomena, and the ideas themselves were developed over many hundreds of years as a result of considerable discussion and debate about the cogency of theory and its relationship to collected evidence. Science literacy requires that students see the link between phenomena and ideas and see the ideas themselves as useful. This cluster includes criteria to determine whether the material attempts to provide links between phenomena and ideas and to demonstrate the usefulness of the ideas in varied contexts.
  • Cluster V. Promoting Student Reflection: No matter how clearly materials may present ideas, students (like all people) will assign their own meanings to them. Constructing meaning well is aided by having students (1) make their ideas and reasoning explicit, (2) hold them up to scrutiny, and (3) recast them as needed. This cluster includes criteria for determining whether the material suggests how to help students express, think about, and reshape their ideas to make better sense of the world.
  • Cluster VI. Assessing Progress: There are several important reasons for monitoring student progress toward specific learning goals, and materials can support this by supplying a variety of assessment alternatives. Having such a collection can ease the creative burden on teachers and increase the time available to analyze student responses and make adjustments in instruction based on those responses. This cluster includes criteria for evaluating whether the material includes a variety of goal-relevant assessments.
  • Cluster VII. Enhancing the Learning Environment: Many other important considerations are involved in the selection of curriculum materials: for example, the help they provide to teachers in encouraging student curiosity and creating a classroom community where all can succeed, or the material's scientific accuracy or attractiveness. The criteria listed in this cluster provide reviewers with the opportunity to comment on these and other important features.

Summarize the relationship between the curriculum materials being evaluated and the selected learning goals.

In the preliminary inspection, a few benchmarks were selected as representative of the set of goals that the material appears to target. Having analyzed whether the content in the material matches these specific benchmarks and how well the instructional strategies in the material support students learning these benchmarks, the final step in the process is to provide a profile of the material based on this analysis.

The analysis makes it possible to produce two sets of profiles. The first illustrates how well the material treats each benchmark (for which a content match was found) across all criteria examined in the instructional analysis. Based on these profiles, conclusions can be made about what the material under consideration can be expected to accomplish in terms of benchmarks. For example, the profiles may indicate that the material treats one of the examined benchmarks well and the rest only moderately or poorly.

The second set of profiles illustrates how well the material meets each criterion in the instructional analysis tool across all benchmarks examined. These profiles point to major strengths and weaknesses in the instructional design of the material. For example, the profiles may indicate that the material consistently includes appropriate experiences with phenomena relevant to the benchmarks but only occasionally provides students with opportunities to reflect on these experiences. Depending on the time available and their interests, a review committee could decide to produce either one or both sets of profiles. Profiles of different materials provide the basis for selection decisions.
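A minimal sketch shows how both profile sets fall out of the same ratings table. The benchmarks, criterion clusters, and 0-3 scores below are hypothetical; the procedure does not prescribe a numeric scale, only that judgments be recorded per benchmark and per criterion.

```python
# Hypothetical ratings: each (benchmark, criterion cluster) pair is scored
# 0-3 by the reviewers. The numbers here are invented for illustration.
ratings = {
    "energy transformation": {"purpose": 3, "student ideas": 2,
                              "phenomena": 3, "reflection": 1},
    "heredity":              {"purpose": 2, "student ideas": 1,
                              "phenomena": 2, "reflection": 1},
}

# Profile 1: how well the material treats each benchmark across all criteria.
by_benchmark = {b: sum(c.values()) / len(c) for b, c in ratings.items()}

# Profile 2: how well the material meets each criterion across all benchmarks.
clusters = next(iter(ratings.values())).keys()
by_criterion = {k: sum(r[k] for r in ratings.values()) / len(ratings)
                for k in clusters}

print(by_benchmark)   # one score per benchmark
print(by_criterion)   # low "reflection" across benchmarks flags a weakness
```

In this invented example, phenomena are treated consistently well while reflection scores low across both benchmarks, which is precisely the pattern of strength and weakness the second profile set is meant to expose.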

Support for Users

Project 2061 is in the process of developing "Resources for Science Literacy: Curriculum Evaluation," a CD-ROM that will offer full instruction in using the procedure. The CD-ROM and its print companion volume will contain (1) detailed instructions for evaluating curriculum materials in light of Benchmarks, national standards, or other learning goals of comparable specificity; (2) case-study reports illustrating the application of the analysis procedure to a variety of curriculum materials; (3) a utility for relating findings in the case-study reports to state and district learning goals; and (4) a discussion of issues and implications of using the procedure.

Project 2061 also offers introductory workshops and longer training institutes to groups of educators. Typically three to six days long, the training institutes can be adapted to suit a variety of needs and time constraints. The project has offered customized workshops for K-12 science and mathematics teachers, teacher educators, school and university administrators, developers of curriculum materials, and policy makers. Depending on the interests of participants, the workshops can focus on understanding learning goals, selecting materials, revising existing materials, or evaluating curriculum frameworks, among other possibilities.

For information on Project 2061's workshops and institutes (or any aspects of Project 2061's work) contact Mary Koppal, Project 2061, American Association for the Advancement of Science (see the Guide to Using the Methods of Analysis section of this Guidebook for contact information).

Putting the Project 2061 Curriculum-Analysis Procedure to Work

Many of the educators involved in developing and field testing the Project 2061 procedure have begun to use it to decide on materials for their classrooms, school districts, or states; to identify shortcomings in materials they are using; and to suggest ways to improve them. Here are a few examples of how educators have adapted the procedure to their local needs and time constraints:

San Antonio.

Faced with the task of selecting a new high-school physical science textbook from five possible choices, a San Antonio school district committee requested training in the Project 2061 curriculum-analysis procedure. Already familiar with Benchmarks for Science Literacy and Science for All Americans, these 12 educators spent two days studying Project 2061's analytical criteria, as well as some additional criteria decided locally. Committee members then evaluated one material apiece, with at least two committee members independently evaluating each material. When finished with their independent evaluations, those educators reviewing the same material met to compare their results and to come to an agreement about the value of the material. Then, about three weeks after the initial training, the whole group reconvened to share their results and settle on the material. Because the evaluation procedure requires reviewers to cite evidence for judgments made, the reviewers were prepared to justify their recommendations, pointing to specific instructional strategies for particular learning goals in physical science.

After much discussion, the reviewers reached agreement on one material for the district. Throughout the process, the reviewers were very reflective and motivated by the work at hand. In fact, because the evaluation procedure had revealed some weaknesses even in the material they agreed to select, they decided to write a supplemental unit on one topic.


Philadelphia.

The Philadelphia school district was already committed to teaching toward specific learning goals derived from Benchmarks and the National Science Education Standards when it set out to identify materials that are aligned with those goals. Its list of possibilities included some materials developed through National Science Foundation funding and some materials that had been favorably evaluated by Project 2061 pilot and field-test participants. The district held training institutes to introduce teachers to the evaluation procedure and to develop the evaluation skills they would use to select materials from the list of possibilities. More than 200 teachers participated in the institutes, giving the district a cadre of leaders who could assist in the school-based selection of curriculum materials.

After employing the procedure to select materials for use in their classrooms, teachers planned to make a more thorough evaluation of the materials when they eventually put them to use. Findings from the procedure also will be used to improve materials currently being implemented in district classrooms. Such remedies might include developing questions to focus students' reflection on benchmark ideas, adding activities to address student learning difficulties, or demonstrating how benchmark ideas are useful for making sense of the students' world outside the classroom.

Through its work with the Project 2061 procedure, Philadelphia has developed a group of educators who are becoming more knowledgeable about specific learning goals in Benchmarks and the National Science Education Standards and about the analysis criteria used to judge materials in light of these goals. As new, better aligned materials become available, the district will have a cadre of informed consumers who can recognize them and put them to work. Most important, district classrooms will reflect teaching and learning that engage all students in achieving science literacy goals essential for the 21st century.


Kentucky.

In the fall of 1996, Project 2061 began to work with the director of the Kentucky Middle Grades Mathematics Teacher Network to adapt the project's curriculum-analysis procedure to mathematics. The Kentucky Network, which already reaches some 2,000 teachers, aims to align the state's mathematics content and teaching practices in fifth through eighth grades with the recommendations of the National Council of Teachers of Mathematics' Curriculum and Evaluation Standards for School Mathematics. In particular, the network helps to train teachers in reviewing and analyzing curriculum materials so that they can (1) discriminate between materials that only claim to align with the mathematics standards and those that actually do and (2) recognize the standards in the newer, integrated mathematics curricula.

As the criterion for alignment, Project 2061 has used Kentucky's Mathematics Core Content for Assessment (which elaborates the national mathematics standards into more specific goal statements) to analyze five middle-school curriculum projects funded by the National Science Foundation (NSF). While developing the analysis procedure and applying it to the materials, Project 2061 received continual feedback from Kentucky teachers and from a national advisory committee that included the developers of the NSF-funded curricula.

During a 1997 two-week summer workshop, 32 Kentucky teachers used the analysis procedure and case-study reports to examine middle-grade mathematics materials and develop workshops for teachers throughout the state. In doing so they (1) gained a better understanding of integrated, problem-based mathematics curricula; (2) developed the skills necessary to evaluate mathematics curricula in light of specific learning goals; and (3) developed skills necessary to effectively share what they have learned throughout their regions. The workshop participants worked during the 1997-1998 school year with teachers, schools, and districts in their regions to assist in analyzing and evaluating mathematics curriculum materials.

1998. Guide Book To Examine School Curricula: The Project 2061 Curriculum-Analysis Procedure. Reprinted from the TIMSS Resource Kit