Amid growing debate on the state's pupil testing program, a study prepared for the University of Maryland, College Park, praised the exams yesterday as a model for the nation, but said they need some updating and fine-tuning.
The study by the California-based research company SRI International says the Maryland School Performance Assessment Program is an "exemplary state assessment" that does a good job of testing reading and writing.
But the firm's study, presented to the state school board yesterday, also concurred with some conclusions of last year's Abell Foundation report on the MSPAP exams. The new study found "a number of inaccuracies in historical and political fact, judgment and emphasis" and more testing of "process" than "content knowledge" in such subjects as science.
"The panels [of researchers] wanted to emphasize they see MSPAP as unique," said Edys S. Quellmalz, associate director of assessment at SRI's Center for Technology in Learning. "It sets an example, and it's something many states would like to do but cannot."
State education officials - who described the report as one of the most comprehensive analyses of MSPAP in its 10-year history - said they would study the identified problems and make changes.
"I think this is invaluable in helping us to move forward," said state schools Superintendent Nancy S. Grasmick.
The centerpiece of Maryland's education reform efforts of the 1990s, the MSPAP exams are given each May to all of the state's third-, fifth- and eighth-graders.
Unlike traditional, standardized, multiple-choice exams, the MSPAP tests aim to measure more than basic reading and math skills. For five mornings, pupils are called upon to apply their knowledge, often by working in groups and writing long essays.
The tests are not designed to judge the abilities of individual pupils, but to grade the effectiveness of schools' instruction in six subjects - math, reading, writing, language, social studies and science. The goal is for 70 percent of pupils to score "satisfactory" on the exams, but the state remains far below that target.
While some parents and teachers have questioned the MSPAP tests since they were first given in 1991, debate has intensified in the past year after a report from the Abell Foundation concluded that the tests are full of mistakes and are an invalid measure of children's skills and knowledge.
State educators disputed the Abell report's conclusions and criticized the quality of its research.
In response to that report, Grasmick asked the Maryland Assessment Research Center for Education Success - a state-funded arm of the University of Maryland - to commission an independent review of the MSPAP exams.
The Maryland assessment center hired SRI, paying almost $300,000 for a study that included university researchers and other national testing experts, said Robert Lissitz, chairman of the measurement department and director of the center. Lissitz said state education officials did not participate in choosing SRI or any researchers.
But one state board member immediately attacked the study, saying that the group of researchers was not racially diverse and failed to examine the question of cultural bias in the MSPAP exams.
"Many parents believe the test is, in effect, culturally biased," said board member Reginald Dunn. "This is a major flaw in whatever is delivered."
In yesterday's presentation, Quellmalz said state officials need to explain better what they believe MSPAP is accomplishing.
State education officials say the MSPAP exams test higher-order thinking skills, but the study "did not find a document that specified these skills" or why they were chosen to be tested.
But Quellmalz said researchers like MSPAP's structure of testing pupils on complicated tasks and having them write lengthy responses - rather than answer multiple-choice questions.
"The panels all felt this was extremely important, never, never, never to be abandoned as part of the MSPAP," she said.
The study suggested test preparation and test-taking should be standardized - questioning whether pupils in classes with novice teachers might perform worse than classes with teachers experienced in giving MSPAP. "The idea is to make it as consistent as possible so it's fair to everyone," Quellmalz said.
In response to the discovery of factual inaccuracies, state officials will have questions checked for content by university professors, said Gary Heath, chief of the State Department of Education's arts and science branch.
The Maryland assessment center will review yesterday's report and offer recommendations to the education department by Dec. 15, including what additional research ought to be done, Lissitz said.