To Analyze Curriculum Materials, Option B:
Science Place: Air, Sun, and Water

Estimated Time: 5 hours.

List of Materials

Notes:

  1. The potential of curriculum activities can be estimated by examining whether they address specific important learning goals (content match) and whether they are based on effective principles of teaching and learning (pedagogical match). However, the actual curriculum will depend on teachers’ interpretations and use of the materials, and the achieved curriculum will depend on individual students’ skills, interests, and prior knowledge. Therefore, the suitability and effectiveness of curriculum activities and materials are ultimately empirical questions: until activities are tried out with students, there will be no evidence of what is actually learned. Data from classroom trials are needed to go beyond speculation to an empirical match, in which assessment indicates reliably that a considerable proportion of students can understand the main goals targeted in the curriculum activities. Emphasize this point in your discussion if time is available.
  2. Group work is particularly valuable in this workshop, as discussion brings varying points of view to bear on issues.
  3. Participants learn best by first practicing the analysis procedure with a short, simple resource. If you plan to adapt this sample workshop script to discuss the analysis of other instructional materials, please note that no more than three lessons should be attempted in a half-day. When participants have more experience with this method of analysis, have them apply it to longer, more complex materials, such as textbook sections or curriculum units.
  4. Reviewing the content and pedagogical analyses for two benchmarks related to Air, Sun, and Water will help you prepare for this workshop (see Science Place: Air, Sun, and Water: Notes on Curriculum Analysis in Chapter 5: Selected Readings). The analyses are not intended for distribution to participants, although you might use selected portions to illustrate specific points during the workshop.
Sample Presentation
Presenter: The purpose of this activity is to demonstrate the use of Science for All Americans (SFAA) and Benchmarks for Science Literacy (Benchmarks) in analyzing how effectively a curriculum material addresses specific learning goals. (The purpose is not for you to become skilled in using the analysis procedure; that would take more time than is available in this workshop.) We will analyze three lessons of a grade 1 curriculum in the Science Place series by Scholastic, Inc. Although Science Place was not developed with Benchmarks in mind, it serves as a useful case study to illustrate the use of an analysis procedure developed by Project 2061. This analysis highlights issues schools and districts will encounter as they attempt to select and adapt existing curriculum materials in ways that help students progress toward science literacy.

Hand out copies of Science for All Americans, Benchmarks for Science Literacy, and Science Place.

TRANSPARENCY: Steps in Resource Analysis—Air, Sun, and Water.

Presenter: Before we begin analyzing how effectively a curriculum resource addresses the learning goals recommended by Project 2061, let me take a moment to review each step of the procedure we will use.

First, you will identify benchmarks that appear to be central to the resource. This involves becoming familiar with the resource, becoming familiar with SFAA and Benchmarks, identifying candidate benchmarks, and narrowing the list to a few that appear to be central to the lessons.

Next, you will use SFAA and Benchmarks to study the meaning of those benchmarks—in other words, what they are actually expecting students to know or be able to do.

Then, you will analyze the content match between the benchmarks and Science Place activities. If you find a content match, you will consider the likelihood that the Science Place activities will help students achieve the benchmarks—the pedagogical match.

You could think of this analysis procedure as a multistep filtering process in which the filter’s pores get finer with each step.

Throughout the analysis, it is important that you examine the resource as it is. You should not consider how good teachers might use or supplement Science Place.

Finally, we will apply what is learned from the analysis to improving Science Place.

Step 1: Identifying benchmarks that appear to be central to Science Place. (1-2 hours)

Have participants read through and summarize the curriculum activities. Allow 30 minutes if participants are not familiar with the curriculum material; 10 minutes if they are. (As you plan the workshop, you may refer to Science Place: Air, Sun, and Water: Summary of Lessons 10, 11, 12 in Chapter 5: Selected Readings for an overview of activities. This summary is not intended for workshop participants.)

TRANSPARENCY: Benchmarks for Science Literacy (Table of Contents).

Ask participants to refer to the front-cover, fold-out table of contents in their copies of Benchmarks for Science Literacy, or distribute the HANDOUT: Benchmarks Table of Contents.

Explain, depending on how much background you have already presented, that, to use SFAA and Benchmarks, participants will need to be able to find their way around in them. This experience will help them do so.

Have participants examine the Benchmarks table of contents and identify sections in which they think benchmarks related to the Science Place activities are likely to be found. Ask participants to consider why they expect to find related benchmarks in those sections. The purpose of this activity is to familiarize participants with the contents of Benchmarks. Participants then share their choices with the larger group, explaining why they would look in the particular Benchmarks sections they chose.

Note: To give you a sense of the range of benchmarks that can be generated, see Science Place: Air, Sun, and Water: Candidate Benchmarks (Chapter 5: Selected Readings). This list is not intended for workshop participants.

Organize participants into small groups. Using the Benchmarks sections participants identified as possible sources of related benchmarks, assign a different section to each small group. Have participants identify "candidate" benchmarks—benchmarks that might be addressed in the curriculum resource.

Presenter: You have had an opportunity to become familiar with this curriculum resource and to identify sections of Benchmarks that might contain benchmarks related to it. Review the material again to identify specific benchmarks within the assigned section that you think might be addressed and note reasons for your selections. We will call this list our candidate benchmarks. When you have made your selections, record the benchmarks—chapter, section, grade level, page—on chart paper.  

Ask groups to report on one or two benchmarks they chose, giving reasons for their selection.

Ask participants to read through the activities more carefully, noting particular pages and activities where each benchmark is addressed. This process will reveal which benchmarks are explicitly dealt with in the resource and which are not.

Presenter: We have identified a candidate list of benchmarks, those we think might be addressed in activities described in the curriculum material. Now we will shorten this list by searching the curriculum more carefully to locate activities that are related to each benchmark. Record page numbers and notes about the activities on your charts.

Ask participants to explain, for a benchmark that they eliminated from their candidate list, why they did so. Then, ask them to explain, for a benchmark that they kept in their candidate list, why they kept it.

During this activity, participants often start discussing how literally benchmarks should be taken. It becomes apparent that different participants interpret the benchmarks in different ways. This is a good lead-in to the next activity, in which workshop participants study a benchmark to clarify its intent.

Step 2: Studying the benchmarks to clarify their meaning. (90 minutes)

Presenter: A key principle of Project 2061’s approach to science education is that curriculum and instruction should explicitly address learning goals—the benchmarks listed in Benchmarks for Science Literacy.

But how can an educator know when curriculum and instruction actually address specific benchmarks? To know that, we need a clear understanding of what the benchmark is expecting students to know or be able to do. In this part of today’s workshop we will demonstrate a procedure that can be used to help educators fully understand the intent of a benchmark. This procedure can be used for any benchmark (or standard). We will use it to understand the intent of two benchmarks that are dealt with in the resource, taken from the lists you have generated.

TRANSPARENCY: Benchmarks Related to Air, Sun, and Water.

Presenter: According to these benchmarks, what are students expected to know?

Ask participants to describe their understanding of the benchmarks to a partner. Have several pairs report to the whole group. Record some of their comments on a blank transparency.

TRANSPARENCY: Exploring Project 2061 Tools - 4B.

Presenter: We are now going to see how Science for All Americans and Benchmarks for Science Literacy can be used to give us additional insights about these benchmarks. We will study the benchmarks in relation to the five readings shown on the transparency. 

Briefly review for participants the purposes of the five readings shown on the transparency.

TRANSPARENCY: Strand Map: Water Cycle.

Distribute the HANDOUT: Strand Map: Water Cycle to each participant.

Presenter: Before we begin the readings, I’ll take a moment to introduce the strand map. Strands are networks of benchmarks through which students might progress on their way to the adult literacy goals defined in Science for All Americans. The strands show the development of concepts from rudimentary benchmarks at the elementary level through middle school learning to the sophisticated level of understanding expected of high school graduates. Strand maps show how related benchmarks build on and reinforce one another. There are 30 strand maps on Benchmarks on Disk.  

TRANSPARENCY: Exploring Project 2061 Tools - 4B.

Presenter: Your task is to study the reading you’ve been assigned to see how it affects your understanding of the benchmarks and then share what you have learned with your group.  

After participants have completed their studies, ask individuals to explain to other members of their group or to the total group what their particular reading contributed to their understanding of the benchmarks. Record these comments on a blank transparency. Use this transparency and the one you created earlier in the session to compare understanding before and after the study.

Step 3: Analyzing how specifically Air, Sun, and Water addresses the actual content of the benchmarks. (45-60 minutes)

Presenter: In the next two activities, we will look systematically for evidence that there is a match between these benchmarks and the activities in Science Place. We will look for evidence in two steps. First we will examine how well the activities in Science Place match the content of the benchmarks. If we find a content match, we will then examine the pedagogy described in Science Place to assess whether it is likely to contribute to students’ learning the benchmarks.  

TRANSPARENCY: Content Match Questions.

Distribute the HANDOUT: Content Match Questions to each participant.

Presenter: For a content match, we analyze all student materials—reading materials, laboratory activities, discussion questions, and so forth—for their fit to the content of each benchmark. The questions on this transparency will be helpful in this process.

Ask participants to read the first question.

TRANSPARENCY: Topic or Substance?

Presenter: This transparency shows two examples to clarify the difference between addressing the topic of the benchmark and addressing the substance of the benchmark.

Presenter: Consider this benchmark:

Some events in nature have a repeating pattern. The weather changes some from day to day, but things such as temperature and rain (or snow) tend to be high, low, or medium in the same months every year.

Presenter: The topic of the benchmark seems to be "weather." Activities that involve students in thinking about what the weather is like in other parts of the world or in gathering temperature data during a month seem to contribute to the learning goal. But if we read the benchmark carefully, we see that the benchmark is really about repeating patterns of weather. To address the substance of the benchmark, an activity would need to involve students in taking temperature or precipitation measurements over a year, comparing them with measurements students took in previous years, and looking for an overall pattern in the same months over several years.

Presenter: Consider the second benchmark:

Clear communication is an essential part of doing science. It enables scientists to inform others about their work, expose their ideas to criticism by other scientists, and stay informed about scientific discoveries around the world.

Presenter: What is the topic of this benchmark? (Probable responses: "communication" or "communication in science.")

Presenter: Activities that involve students in communication—working in groups and sharing information—would seem to contribute toward this learning goal. But is communication the substance of the benchmark?   (Take responses.)

Presenter: We can see that the benchmark is really about the essential role of communication in science; students need to understand that communication is not an end in itself but a crucial means to sharing information. To contribute to learning this benchmark, activities would have to be structured so that students reflect on the importance of communication, perhaps after having direct experience with needing to communicate about investigations.  

TRANSPARENCY: Content Match Questions.

Point to the second question: "Does the activity reflect the level of sophistication of the benchmark, or does the activity target a benchmark at an earlier or later grade level?" Remind participants that examining strand maps is helpful in thinking about the level of sophistication of the benchmark.

Then, point to the next question: "Is only a part of the benchmark or is the whole benchmark addressed?" Comment that there is nothing wrong with an activity addressing only a part of the benchmark, but it is important that we know exactly what the activity addresses.

Presenter: With these questions, our analysis becomes even more rigorous. Our filtering device is much finer, so even fewer benchmarks will pass through the filter. If you were analyzing a curriculum material on your own, you would use the Content Match Questions on the handout to review the activities for each benchmark in your short list and assess how well they match the content of the benchmarks. This analysis would probably further reduce your list of benchmarks. Because of time constraints, today we will use the Content Match Questions to review activities for only the two benchmarks that we studied in Step 2.

TRANSPARENCY: Benchmarks Related to Air, Sun, and Water.

Give participants 30 minutes to analyze the activities for content match and report their findings. For each benchmark examined, ask participants what evidence indicates that the activities match or do not match the content of the benchmark.

As the discussion proceeds, try to have participants briefly examine and discuss how well the material’s activities address the content of some other benchmarks in their short list.

Step 4: Estimating the likelihood that students will learn the benchmarks from the presented activities. (45-60 minutes)

Presenter: How effective will the teaching recommended in this resource be? The potential of curriculum activities can be estimated by examining whether they address specific important learning goals (content match) and whether they are based on effective principles of teaching and learning (pedagogical match). However, the actual curriculum will depend on teachers’ interpretations and use of the materials, and the achieved curriculum on individual students’ skills, interests, and prior knowledge. Therefore, the suitability and effectiveness of curriculum activities and materials are ultimately empirical questions: until activities are tried out with students, there will be no evidence of what is actually learned. Data from classroom trials are needed to go beyond speculation to an empirical match, in which assessment indicates reliably that a considerable proportion of students can understand the main goals targeted in the curriculum. Nevertheless, we have found that estimating the likelihood of student learning improves our own thinking about resource selections and gives us clues about how they can be improved.

Distribute the HANDOUT and show the TRANSPARENCY: Pedagogical Match Questions: Curriculum Analysis.

Tell participants that this list is based on Chapter 13: Effective Learning and Teaching in Science for All Americans. Participants will use the list to evaluate how effective the teaching procedures recommended in this resource are likely to be. Explain that some criteria may not be applicable to a single, short resource. For example, it may be impossible to determine from a single short resource whether activities are sequenced in a way that reflects progressive levels of difficulty and complexity. However, in a longer, more complex resource such as a textbook, this determination might well be made.

Tell the participants that it is essential that they assume the teaching will be exactly as it is described in the resource. For example, if no questions requiring reflection are asked, the analyzer must assume there will be none. This level of objectivity in analysis is especially hard for good teachers, who will usually see ways to improve almost any resource. Tell participants they will have a chance later to suggest improvements.

Give groups 30-60 minutes to analyze and discuss the pedagogical match to benchmarks (one benchmark per group).

Step 5: Improving the match by adapting and supplementing existing materials. (45-60 minutes)

Ask participants to prepare, in small groups, summary reports on their findings from the content and pedagogical analyses. Ask them to include their ideas on how they could use the results of this analysis procedure to adapt and improve Science Place to address the two benchmarks.

In the whole group, take some ideas from the summary reports. Have participants discuss how they could use content and pedagogical match questions to guide them in improving instruction; for example, by adding questions that foster reflection or by adding practice activities in different contexts. Ask for specific examples. Allow sufficient discussion for participants to get the flavor of using the analysis to improve instruction.

Presenter: Even though you may not be able to control what curriculum materials are available to you, you can, by using this analysis procedure, identify changes that you can make to address specific learning goals more effectively. Sometimes changes can be as simple as asking students different questions, or omitting some sections and spending more time on other sections that do address a benchmark. Sometimes you can make curriculum connections that you would not have thought of before; for example, using themes, integrating mathematics, or thinking about design or the nature of science.