According to experts, the test was shorter though not easier, which might have meant that students taking it were less tired. The tests have been given in grades three through eight in reading and math for the past five years. "There was a psychological advantage," said Ronald A. Peiffer, Maryland deputy state school superintendent. "They weren't as tired this year. ... It doesn't mean it wasn't as difficult."
Outside experts said the changes were justified but did have an impact, making comparisons between this year's scores and previous results imperfect. Maryland saw significant gains on the MSA this year, particularly among black and low-income students and those learning English.
Baltimore schools posted some of the largest increases, gains that were celebrated this week by the governor and other elected officials.
City schools chief Andres Alonso said the news that the test had been changed did not diminish the accomplishments of Baltimore students. "Since the test had the same degree of difficulty, and the fundamental question is how our students are performing in relationship to the curriculum, the questions, to me, are irrelevant," he said.
"It has been my experience that when black and Latino children demonstrate great gains, people look to explanations other than good, hard work to explain that performance. It's what I experienced in New York. We know why that happens."
Baltimore students' scores improved not only in relation to past years but also in comparison with those of students around the state.
But Kate Walsh, who was recently appointed to the state school board, said members were not made aware of the change when they were briefed Tuesday.
"I am disappointed [that] these changes, however justified, weren't shared with the board," Walsh said.
In fifth- and seventh-grade reading, scores increased more than in any other grade or subject. The percentage of students passing the fifth-grade reading test went up 10 percentage points, from 76.7 percent to 86.7 percent. Seventh-grade reading scores rose by 11 percentage points.
Such large increases had not been seen before during the five years that the test has been administered, and they were a red flag to a standing panel of education testing experts, known as psychometricians, hired by the Maryland State Department of Education to advise it on the validity of test scores.
"We had a lively discussion on these results within the panel," said Huynh Huynh, a professor of statistics and education at the University of South Carolina and a member of the panel.
Long before the results were released to the public, the panel asked Harcourt Assessment Inc., the company hired by the state to oversee testing, to do further analysis.
In the end, Huynh said, the panel concluded that the test was equivalent to the one given in 2003, the year the test was first used, and to subsequent tests. But the panel also concluded that the changes in the test had contributed to the large increases in the fifth- and seventh-grade scores. The panel could not estimate how much effect the changes had on scores, he said.
In putting together the original MSA in 2002, Maryland, like a number of states, decided to take an off-the-shelf standardized test that can be purchased from companies such as Harcourt and combine it with questions created in-house.
The locally produced questions made up the majority of the test and were more closely aligned with the state curriculum, the material that Maryland officials say ought to be taught in classrooms. The standardized questions, on the other hand, were highly reliable because they were given to tens of millions of students across the nation.
Although students had to answer about 40 questions on the standardized portion of the test, Maryland officials did not count most of them. Instead, they elected to count only the questions that focused on material they cared about, those reflecting the state curriculum.
But Maryland students were not informed that some questions did not count and might have gotten bogged down on questions that covered unfamiliar material. In addition, teachers who looked at the tests when they were given each year saw material they had not focused on in class and might have been confused about what to teach the next year, according to Leslie Wilson, who heads the state's assessment office.