Estimated Time: 5 hours.
Notes:
Hand out copies of Science for All Americans, Benchmarks for Science Literacy, and Science Place.
TRANSPARENCY: Steps in Resource Analysis—Air, Sun, and Water.
Presenter: Before we begin analyzing how effectively a curriculum resource addresses the learning goals recommended by Project 2061, let me take a moment to review each step of the procedure we will use.
First, you will identify benchmarks that appear to be central to the resource. This involves becoming familiar with the resource, becoming familiar with SFAA and Benchmarks, identifying candidate benchmarks, and narrowing the list to a few that appear to be central to the lessons.
Next, you will use SFAA and Benchmarks to study the meaning of those benchmarks—in other words, what they are actually expecting students to know or be able to do.
Then, you will analyze the content match between the benchmarks and Science Place activities. If you find a content match, you will consider the likelihood that the Science Place activities will help students achieve the benchmarks—the pedagogical match.
You could think of this analysis procedure as a multistep filtering process in which the filter’s pores get finer with each step.
Throughout the analysis, it is important that you examine the resource as it is. You should not consider how good teachers might use or supplement Science Place.
Finally, we will apply what is learned from the analysis to improving Science Place.
Step 1: Identifying benchmarks that appear to be central to Science Place. (1-2 hours)
Ask participants to refer to the front-cover, fold-out table of contents in their copies of Benchmarks for Science Literacy, or distribute the HANDOUT: Benchmarks Table of Contents.
Explain, depending upon how much background you have already presented, that to use SFAA and Benchmarks, participants will need to be able to find their way around in them. This experience will help them do so.
Have participants examine the Benchmarks table of contents and identify sections in which they think benchmarks related to the Science Place activities are likely to be found. Ask participants to consider why they expect to find related benchmarks in those sections. The purpose of this activity is to familiarize participants with the contents of Benchmarks. Participants then share their choices with the larger group, explaining why they would look in the particular Benchmarks sections they chose.
Presenter: You have had an opportunity to become familiar with this curriculum resource and to identify sections of Benchmarks that might contain benchmarks related to it. Review the material again to identify specific benchmarks within the assigned section that you think might be addressed and note reasons for your selections. We will call this list our candidate benchmarks. When you have made your selections, record the benchmarks—chapter, section, grade level, page—on chart paper.
Ask groups to report on one or two benchmarks they chose, giving reasons for their selection.
Presenter: We have identified a candidate list of benchmarks, those we think might be addressed in activities described in the curriculum material. Now we will shorten this list by searching the curriculum more carefully to locate activities that are related to each benchmark. Record page numbers and notes about the activities on your charts.
Ask participants to explain, for a benchmark that they eliminated from their candidate list, why they did so. Then, ask them to explain, for a benchmark that they kept in their candidate list, why they kept it.
During this activity, participants often start discussing how literally benchmarks should be taken. It becomes apparent that different participants interpret the benchmarks in different ways. This is a good lead-in to the next activity, in which workshop participants study a benchmark to clarify its intent.
Presenter: A key principle of Project 2061’s approach to science education is that curriculum and instruction should explicitly address learning goals—the benchmarks listed in Benchmarks for Science Literacy.
But how can an educator know when curriculum and instruction actually address specific benchmarks? To know that, we need a clear understanding of what the benchmark is expecting students to know or be able to do. In this part of today’s workshop we will demonstrate a procedure that can be used to help educators fully understand the intent of a benchmark. This procedure can be used for any benchmark (or standard). We will use it to understand the intent of benchmarks that are dealt with in the resource. These are two benchmarks from the lists you have generated.
TRANSPARENCY: Benchmarks Related to Air, Sun, and Water.
Presenter: According to these benchmarks, what are students expected to know?
Ask the participants to describe to their partners their understanding of the benchmarks. Have several pairs report to the whole group. Record some of their comments on a blank transparency.
TRANSPARENCY: Exploring Project 2061 Tools - 4B.
Presenter: We are now going to see how Science for All Americans and Benchmarks for Science Literacy can be used to give us additional insights about these benchmarks. We will study the benchmarks in relation to the five readings shown on the transparency.
Briefly review for participants the purposes of the five readings shown on the transparency.
Distribute the HANDOUT: Strand Map: Water Cycle to each participant.
Presenter: Before we begin the readings, I’ll take a moment to introduce the strand map. Strands are networks of benchmarks through which students might progress on their way to the adult literacy goals defined in Science for All Americans. The strands show the development of concepts from rudimentary benchmarks at the elementary level through middle school learning to the sophisticated level of understanding expected of high school graduates. Strand maps show how related benchmarks build on and reinforce one another. There are 30 strand maps on Benchmarks on Disk.
TRANSPARENCY: Exploring Project 2061 Tools - 4B.
After participants have completed their studies, ask individuals to explain to other members of their group or to the total group what their particular reading contributed to their understanding of the benchmarks. Record these comments on a blank transparency. Use this transparency and the one you created earlier in the session to compare understanding before and after the study.
Step 3: Analyzing how well Air, Sun, and Water addresses the actual content of the benchmarks. (45-60 minutes)
TRANSPARENCY: Content Match Questions.
Distribute the HANDOUT: Content Match Questions to each participant.
Presenter: For a content match, we analyze all student materials—reading materials, laboratory activities, discussion questions, and so forth—for their fit to the content of each benchmark. The questions on this transparency will be helpful in this process.
Ask participants to read the first question.
TRANSPARENCY: Topic or Substance?
Presenter: This transparency shows two examples to clarify the difference between addressing the topic of the benchmark and addressing the substance of the benchmark.
Presenter: Consider this benchmark:
Presenter: Consider the second benchmark:
Presenter: Activities that involve students in communication—working in groups and sharing information—would seem to contribute toward this learning goal. But is communication the substance of the benchmark? (Take responses.)
Presenter: We can see that the benchmark is really about the essential role of communication in science; students need to understand that communication is not an end in itself but a crucial means to sharing information. To contribute to learning this benchmark, activities would have to be structured so that students reflect on the importance of communication, perhaps after having direct experience with needing to communicate about investigations.
TRANSPARENCY: Content Match Questions.
Point to the second question: "Does the activity reflect the level of sophistication of the benchmark, or does the activity target a benchmark at an earlier or later grade level?" Remind participants that examining strand maps is helpful in thinking about the level of sophistication of the benchmark.
Then, point to the next question: "Is only a part of the benchmark or is the whole benchmark addressed?" Comment that there is nothing wrong with an activity addressing only a part of the benchmark, but it is important that we know exactly what the activity addresses.
Presenter: With these questions, our analysis becomes even more rigorous. Our filtering device is much finer so that even fewer benchmarks will pass through the filter. If you were analyzing a curriculum material on your own, you would use the Content Match Questions on the handout to review the activities for each benchmark in your short list and assess how well they match the content of the benchmarks. This analysis would probably reduce your list of benchmarks further. Because of time constraints, today we will use the Content Match Questions to review activities for only the two benchmarks that we studied in Step 2.
TRANSPARENCY: Benchmarks Related to Air, Sun, and Water.
Give participants 30 minutes to analyze the activities for content match and report their findings. For each benchmark examined, ask participants what evidence indicates that the activities match or do not match the content of the benchmark.
As the discussion proceeds, try to have participants briefly examine and discuss how well the material’s activities address the content of some other benchmarks in their short list.
Step 4: Estimating the likelihood that students will learn the benchmarks from the presented activities. (45-60 minutes)
Distribute the HANDOUT and show the TRANSPARENCY: Pedagogical Match Questions: Curriculum Analysis.
Tell participants that this list is based on Chapter 13: Effective Learning and Teaching in Science for All Americans. Participants will use the list to evaluate how effective the teaching procedures recommended in this resource will be. Explain that some criteria may not be applicable to all single resources. For example, it may be impossible to determine from a single short resource whether activities are sequenced in a way that reflects progressive levels of difficulty and complexity. However, in a longer, more complex resource such as a textbook, this determination might well be made.
Tell the participants that it is essential that they assume the teaching will be exactly as it is described in the resource. For example, if no questions requiring reflection are asked, the analyzer must assume there will be none. This level of objectivity in analysis is especially hard for good teachers, who will usually see ways to improve almost any resource. Tell participants they will have a chance later to suggest improvements.
Give groups 30-60 minutes to analyze and discuss the pedagogical match to benchmarks (one benchmark per group).
Step 5: Improving the match by adapting and supplementing existing materials. (45-60 minutes)
Ask participants to prepare, in small groups, summary reports on their findings from the content and pedagogical analyses. Ask them to include their ideas on how they could use the results of this analysis procedure to adapt and improve Science Place to address the two benchmarks.
In the whole group, take some ideas from the summary reports. Have participants discuss how they could use content and pedagogical match questions to guide them in improving instruction; for example, by adding questions that foster reflection or by adding practice activities in different contexts. Ask for specific examples. Allow sufficient discussion for participants to get the flavor of using the analysis to improve instruction.
Presenter: Even though you may not be able to control what curriculum materials are available to you, you can, by using this analysis procedure, identify changes that you can make to address specific learning goals more effectively. Sometimes changes can be as simple as asking students different questions or omitting some sections and spending more time on other sections that do address a benchmark. Sometimes you can make curriculum connections that you would not have thought of before; for example, using themes, integrating mathematics, thinking about design or the nature of science.