In late 1988, Maryland and Minnesota economic development officials became the envy of their counterparts in the other states. They were given an opportunity, for a small investment, to prove to the world they were doing a good job and, more importantly, to find out what they were doing wrong.
Minnesota took the ball and ran with it. Officials there used a new survey method to monitor performance and applied it to a variety of programs, repeating detailed customer surveys every year. They made changes in their programs to reflect the results, used the data to plead for more money from state budget officials, and even turned the assessment process inward, examining offices whose customers are other state programs.
Maryland officials haven't gotten much further than the scrimmage line.
That's partly because of a lack of resources, both money and staff, state officials say. But the benefit that may be lost is the ability to figure out which state-funded programs are working and which can stand some improvement. And the stakes for taxpayers here and nationwide are rising.
The average state spent about $4.7 million on economic development in 1982, according to the Washington-based Urban Institute, the non-profit policy research and education organization that initiated the monitoring and evaluation program in the two states. That figure rose to $20 million in 1988.
The money is spent to improve the health of a state's economy: attracting businesses, helping local companies expand, creating jobs. "Yet, effective performance monitoring systems have not been developed and used by most economic development agencies," the institute writes in the report that was born of the Maryland and Minnesota work.
That's why it approached the two states in 1988 to be guinea pigs in developing a prototype monitoring program.
For a $40,000 contribution, matched and exceeded by federal and private funds, Maryland for the first time would have a means of figuring out which economic development programs are worthwhile and which need help, and exactly how their customers have benefited.
Lee Munnich, until January a deputy commissioner for Minnesota's Trade and Economic Development Department, said, "The idea is not just to do a one-time survey of your customers, but to do it on a regular basis so you can see: Did we improve . . . and are we being effective in what we're doing?"
Some of his state's programs have conducted surveys up to three times since late 1988, when the Urban Institute arrived, said Dan Quillin, the department's supervisor of information services, and a fourth survey is imminent for a few programs. "Our expansion has not only been external but internal, in terms of how we service ourselves," he explained, referring to surveys conducted by the department's fiscal services and policy analysis units.
The results have been put to good use. The department's trade office, for instance, found that its clients were too heavily concentrated in the Twin Cities area, so a new task force has created a "Greater Minnesota" strategy to reach out to the rest of the state, Mr. Quillin said.
Even positive findings have been useful, Mr. Quillin said. The state budget office "is saying, 'We want to know about outcomes,' " he said. That's a refrain that was voiced many times in Annapolis this year as lawmakers sought justifications for the money they allocated to various programs.
Mr. Quillin said his department has been able to use the Urban Institute surveys to back up performance claims.
"So the programs had a leg up in preparing their budgets," he said.
Mike Lofton, deputy secretary of Maryland's Department of Economic and Employment Development (DEED), said his department has not used the results from its surveys to help pry open the legislative purse.
That's partly because DEED wants results from a few years of surveys to compare current efforts to past performance, he said. But it's also because not that many surveys have been done.
One problem is the disparity in staff and resources. Minnesota officials believe they have an advantage because all the computer tabulation work was done in-house, whereas DEED contracted with a University of Maryland research center.
Some delays have resulted from staff turnover in College Park, Mr. Schult and an Urban Institute official said, and from the fact that Robert G. Schult, whose Program Analysis and Audit office has coordinated the research for DEED, originally ran the program in Baltimore as a one-man operation.
The surveys have not been expanded beyond the original five programs: international trade, national marketing (or business development), tourism and promotion, and two financing programs for small and minority businesses.
Minnesota has expanded its evaluations to 12 or 13 programs. A total of eight surveys have been completed since 1988 in Maryland, compared to Minnesota's 20.
Maryland officials were unable to name many changes that have been made in response to the surveys. "These numbers were so good that the sense of urgency wasn't immediate to go back," said Mr. Schult. He said that "our plans for this summer are to develop time lines to go back and resurvey."
Both he and Mr. Lofton acknowledged the importance of customer and program monitoring, but said the state's budget problems have made an intensive evaluation program difficult right now.
"Most of their work has been in actually delivering their services," Mr. Lofton said of his managers. "But by no means are we abandoning the service. It's essential. We have to do it."