Project 2061
Algebra Textbooks
A Standards-Based Evaluation


Evaluation Methods

Report Preparation: The following Project 2061 staff members were responsible for supervising the development of the procedure, training analysts, compiling data, and preparing the reports.
    Gerald Kulm, Program Director
    Kathleen Morris, Senior Program Associate
    Laura Grier, Project Coordinator

Freelance Editor Paul Elliott completed final editing of the reports.

The following staff members prepared the reports for publication on the Web.
    Francis Molina, Technology Director
    Thelxi Proimaki, Technology Specialist
    Susan Shuttleworth, Senior Editor
    Brian Sweeney, Technology Specialist
    Ann-Marie Martin, Intern

Reviewers: The analysts who reviewed and rated the texts were experienced mathematics teachers and university mathematics and mathematics education faculty trained in the use of the Project 2061 procedure. The reviewers were evenly divided between experienced, practicing classroom teachers and mathematics educators who were knowledgeable about research on mathematics learning and teaching. All of the reviewers had strong mathematics content knowledge. The reviewers were organized into six teams of two, each team including at least one experienced classroom teacher. The reviewers were:

John Beem, Ph.D., University of Missouri - Columbia
Mark Deegan, Fairfax County (VA) Public Schools
Florence Fasanelli, Ph.D., College-University Resource Institute, Inc., Washington, DC
Mary Ellen Foley, Ph.D., Louisiana State University - Shreveport
Henry Frandsen, Ph.D., University of Tennessee
Linda Hackett, Ph.D., American University, Washington, DC
Stephen Hays, Montgomery County (MD) Public Schools
Cheryl Jenkins, Atlanta (GA) Public Schools
Bill Kunnecke, Kentucky Department of Education
Susan Mast, Kyrene School District, Tempe, AZ
Linda Montgomery, Berea (KY) Public Schools
Curtis Pyke, Ph.D., George Washington University, Washington, DC
Sue Reehm, Ph.D., Eastern Kentucky University
Diane Surati, Waterbury-Duxbury (VT) Public Schools
James Telese, University of Texas - Brownsville

Training: More than half of the reviewers had previous experience using the Project 2061 analysis procedure to review and rate middle grades mathematics materials. Reviewers who had no previous reviewing experience attended an intensive three-day training session immediately before the analysis. In the session, they learned the analysis procedure and applied it to a middle grades mathematics textbook that had been reviewed earlier as part of the reliability study. Their ratings on each criterion were discussed and compared with the ratings that experienced reviewers had assigned. After the training, each of these reviewers was teamed with an experienced reviewer.
Design: Each team evaluated the textbooks against a set of algebra standards from the Principles and Standards for School Mathematics (NCTM, 1998/2000). Each set encompassed ideas from one of three areas of algebra: functions, operations, or variables. Each team used the same idea set throughout, reviewing a total of twelve textbooks or textbook series over a period of twelve days. About half of the materials consisted of a single textbook intended for use in an Algebra I course; the rest consisted of sets of textbooks intended to cover four to six years of integrated secondary mathematics. Each textbook or textbook series was reviewed independently by two teams.
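The resulting review design can be summarized in a short illustrative sketch (Python). It assumes the six teams were split two per idea set, so that each textbook received two independent reviews under each idea set; the team labels and textbook names below are placeholders, not titles or assignments from the study.

# Illustrative sketch only: the pairing of teams, idea sets, and textbooks is
# reconstructed from the description above, assuming two teams per idea set.
from itertools import product

IDEA_SETS = ["Functions", "Operations", "Variables"]
TEAMS_PER_IDEA_SET = 2                                  # assumption: 6 teams / 3 idea sets
TEXTBOOKS = [f"Textbook {i}" for i in range(1, 13)]     # twelve titles or series (placeholders)

def review_assignments():
    """Yield (idea set, team label, textbook) triples: every team applies its
    single idea set to all twelve textbooks, so each textbook receives two
    independent reviews under each idea set."""
    for idea_set, textbook in product(IDEA_SETS, TEXTBOOKS):
        for team_no in range(1, TEAMS_PER_IDEA_SET + 1):
            yield idea_set, f"{idea_set} team {team_no}", textbook

if __name__ == "__main__":
    assignments = list(review_assignments())
    print(len(assignments))   # 3 idea sets x 2 teams x 12 textbooks = 72 reviews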
Analysis: Preliminary sightings for each standards idea set were identified by Project 2061 staff and independent content reviewers prior to the analysis. After clarifying each idea set, the reviewers checked the sightings in the textbook, making any necessary additions, deletions, or revisions. Reviewers then aligned sightings with the relevant instructional criteria and used indicators and rating guides to rate how well the textbook addressed each criterion. After completing the analysis for an idea set, the two teams met to reconcile their ratings and, if their ratings differed significantly, to receive clarifications from Project 2061 staff. Each team submitted (1) the content sightings for each idea set; (2) the sightings and justifications for each instructional criterion; (3) the rating of each instructional indicator [Met, Not Met, Unsure]; and (4) the rating and justification for each instructional criterion [High, Medium, Low, None], including supporting notes.
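The deliverables and the reconciliation step can also be illustrated with a brief sketch. The record below is hypothetical: the field names, dictionary keys, and the one-level disagreement threshold are assumptions used only to show how the four submitted items and the ordered rating scales fit together, not part of the published procedure.

# Illustrative sketch only: a minimal record of what each team submitted and a
# check that flags criteria the two teams rated very differently. Field names
# and the "more than one level apart" threshold are assumptions.
from dataclasses import dataclass, field

INDICATOR_RATINGS = ("Met", "Not Met", "Unsure")        # scale for item (3)
CRITERION_RATINGS = ("None", "Low", "Medium", "High")   # scale for item (4), ordered low to high

@dataclass
class TeamSubmission:
    content_sightings: list = field(default_factory=list)    # (1) sightings for the idea set
    criterion_sightings: dict = field(default_factory=dict)  # (2) criterion -> sightings and justification
    indicator_ratings: dict = field(default_factory=dict)    # (3) indicator -> Met / Not Met / Unsure
    criterion_ratings: dict = field(default_factory=dict)    # (4) criterion -> High / Medium / Low / None

def criteria_to_reconcile(team_a: TeamSubmission, team_b: TeamSubmission, gap: int = 1):
    """Return the criteria whose two ratings differ by more than `gap` levels,
    i.e., the cases that would prompt discussion and staff clarification."""
    flagged = []
    for criterion in team_a.criterion_ratings.keys() & team_b.criterion_ratings.keys():
        a = CRITERION_RATINGS.index(team_a.criterion_ratings[criterion])
        b = CRITERION_RATINGS.index(team_b.criterion_ratings[criterion])
        if abs(a - b) > gap:
            flagged.append(criterion)
    return flagged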
Reports: Project 2061 staff combined these data with notes from debriefings of the teams and prepared a detailed report on each textbook.

Copyright © 2002 American Association for the Advancement of Science