Maryland universities to use data to predict student success — or failure

In the future, Maryland colleges could predict which students are likely to fail.

Officials at the University System of Maryland have begun to analyze student data — grades, financial aid information, demographics, even how often they swipe their ID cards at the library or the dining hall — to find undergraduates who are at risk of dropping out.

Law enforcement agencies, political campaigns, retailers and other universities all mine data to help focus their efforts. University system officials say the practice, called predictive analysis, will boost graduation rates by enabling educators to intervene with struggling students before failure becomes inevitable.

But privacy advocates and some students are concerned that the practice could easily be misused. They're urging system officials to tread carefully — and publicly.

David Rocah, an attorney for the American Civil Liberties Union of Maryland, warned of "a larger trend in our society of more and more institutions acquiring and keeping more data on us that we have no control over."

"Students deserve to know what data is being collected and how it will be used and should be able to opt in or opt out as they wish," he said. "It's easy to imagine how data could be misused to deny students opportunity based on predicted success — and I would be very concerned if that happened."

University system officials have high hopes for predictive analytics. They believe it could help educators identify the choke points that lead students to drop out, such as a single difficult class or a combination of pressures that land all at once.

If the data shows a student is unlikely to succeed on his or her current path, advisers could suggest a new major, tutoring help or another remedy.

Colleges that have used predictive analytics say it has helped narrow achievement gaps for minority and low-income students, raise graduation rates and shorten the amount of time it takes students to graduate.

Universities already collect loads of data on students in areas as varied as class schedules, race, financial aid and ID use.

None of the data the Maryland system is now analyzing — course schedules, enrollment status, grades — is new. But Ben Passmore, the assistant vice chancellor for administration and finance, said discovering what could be learned from that data was a revelation.

"We were building houses on top of fields of gold," Passmore said. "There was so much that could be done."

Some schools in the system already are using predictive analysis. Officials at University of Maryland University College, for example, have found that students who enroll in a course at the last minute tend not to perform well.

They have changed the school's policy to prevent students from enrolling in a course in the four days before the first class, according to Pete Young, the senior vice president for analytics planning and technology. And a student may now drop the class without penalty up to four days after it begins.
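The enrollment window Young describes can be expressed as two simple date checks. This is only an illustrative sketch of the rule as reported — the function names, dates, and any resemblance to UMUC's actual registration system are invented:

```python
# Illustrative sketch of the reported enrollment-window rule: no new
# enrollment in the four days before the first class, and penalty-free
# drops up to four days after it begins. All names and dates are invented.
from datetime import date, timedelta

ENROLL_CUTOFF_DAYS = 4
FREE_DROP_DAYS = 4

def can_enroll(today, first_class):
    """Enrollment closes four days before the first class meeting."""
    return today <= first_class - timedelta(days=ENROLL_CUTOFF_DAYS)

def can_drop_without_penalty(today, first_class):
    """Students may drop penalty-free up to four days after class begins."""
    return today <= first_class + timedelta(days=FREE_DROP_DAYS)

first_class = date(2017, 9, 1)
print(can_enroll(date(2017, 8, 27), first_class))              # True
print(can_drop_without_penalty(date(2017, 9, 6), first_class))  # False
```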

Analytics "becomes the prerequisite to taking action," Young said. "It's an enormously important raw ingredient."

Elsewhere, Passmore said, officials have learned that the earlier transfer students meet with advisers, the better chance they have of success.

Officials say they are aware predictive analysis can be misused — for example, to encourage or force struggling students to drop out, in order to improve official graduation rates and national college rankings.

Leaders at Mount St. Mary's University in Emmitsburg came under fire this year for administering a survey to incoming freshmen in an attempt to predict which of them were unlikely to succeed. Administrators planned to encourage those students to drop out.

State system officials say that's not their goal. They say they're trying to boost graduation rates by helping students who are struggling.

"Our philosophy is one about success, not weeding people out in order to improve the numbers," said Donald Z. Spicer, the university system's associate vice chancellor for information technology.

System officials are still crunching the data to see where it leads. They haven't yet determined how best to use it to help students — or how to protect them from harm.

"We hope down the road to have a much better sense of the types of interventions that get the biggest bang for the time spent," said Nancy O'Neill, an official in the university system's William E. Kirwan Center for Academic Innovation.

The 11 traditional universities in the system are in different stages of developing predictive analytics. All are using the Student Success Matrix, a data tool developed by the nonprofit Predictive Analytics Reporting Framework.

Officials hope they can use it to determine, for example, whether a C in an introductory engineering course indicates a student is unlikely to graduate in the major. Instead of waiting until the student is on academic probation, an adviser could talk with the student about how to improve grades, or whether changing majors would help.

Such an approach is called "intrusive advising."
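The kind of estimate an adviser might act on is typically a regression-style risk score fit to past cohorts. The sketch below shows the general idea with a hand-rolled logistic model; the features, coefficients, and threshold are entirely hypothetical — the system's actual models are not public:

```python
# Illustrative sketch only: a logistic risk score of the kind described,
# with invented features and coefficients. Not the Maryland system's model.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients an institution might fit to historical records:
WEIGHTS = {"intercept": -4.0, "intro_gpa": 1.2,
           "on_time_credit_ratio": 1.5, "aid_gap_k": -0.3}

def graduation_probability(intro_gpa, on_time_credit_ratio, aid_gap_k):
    """Estimated probability of graduating in the major (illustrative)."""
    z = (WEIGHTS["intercept"]
         + WEIGHTS["intro_gpa"] * intro_gpa
         + WEIGHTS["on_time_credit_ratio"] * on_time_credit_ratio
         + WEIGHTS["aid_gap_k"] * aid_gap_k)
    return sigmoid(z)

# A student with a C (2.0 grade points) in the intro course, on pace with
# credits, and no financial aid gap:
p = graduation_probability(2.0, 1.0, 0.0)
if p < 0.5:
    print(f"flag for advising (estimated graduation probability {p:.0%})")
```

The point of such a score is triage: an adviser sees the flag and starts a conversation long before academic probation, rather than acting on the number mechanically.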

Katherine Swanson, the student body president at the University of Maryland, College Park, said predictive analytics sounds promising, but she urged officials to be cautious about the interventions they choose.

If a student got a C in a class and was told it meant he had only a 25 percent chance of graduating, Swanson said, it could discourage him from continuing in a field he loved.

She said advisers should be careful about how much they share about a student's calculated chances.

"There are quite a few people that start out struggling and rise to the top," Swanson said. "And I worry that it would discourage people from trying their hardest."

Adrian Boafo, an officer with the University System of Maryland Student Council, said the group has been discussing data and its implications all year, and that students have a range of opinions.

Boafo, who graduated from the University of Baltimore last month, said predictive analytics can be beneficial, as long as administrators are careful.

"I think there is a need to gather data, because at the end of the day it does help us," he said, "if that data is being used correctly."

Marc Rotenberg, executive director of the privacy group Electronic Privacy Information Center, said officials should disclose how they use predictive analytics.

"If the outcome of this effort is to help students succeed where they might otherwise fail, that's a step in the right direction," Rotenberg said. But "there's a real risk that this strategy could lead to real stigmatization, creating obstacles where they might not otherwise exist."

Patrick Ball, the director of research for the Human Rights Data Analysis Group, said universities' use of analytics to predict student success is less worrisome than police departments' use of it to predict whether someone will commit a crime.

He said the level of statistical analysis needed is minimal, and university statisticians could have been doing it 40 years ago.

"It seems like if you're concerned about student retention, one of the first things you should do is look for all the things that correlate with students not getting through," Ball said. "This is a fairly obvious move. Calling it 'analytics' is kind of highfalutin."

M.J. Bishop, director of the Kirwan Center, said protecting students' privacy is a "primary concern." She said officials will likely soon disclose to students how their data is being collected and used.

"Every time that we've been thinking about analytics, we are keeping in mind the ways we can increase protections and make sure students' privacy is protected," she said.

"I do think younger folks are more tolerant of being monitored in some of these ways. It makes life easier. … People are becoming less alarmed by that kind of thing as they begin to see the benefits of it."

The University of Maryland, Baltimore County, and the University of Maryland University College are the furthest along in their efforts. UMUC spun off its analytics office last year and invested $10 million to create a private company called HelioCampus. Universities from around the country can buy the program through an annual subscription.

Darren Catalano, the former vice president for analytics at UMUC, is now CEO of HelioCampus.

Outside the state system, the Johns Hopkins University has been developing a program called HopReach since 2013. The software flags students who are missing assignments or skipping class; the university is hiring "case managers" to refer flagged students to tutors, financial aid representatives or health care providers, depending on the problem.

John Fritz, assistant vice president of instructional technology at the University of Maryland, Baltimore County, put his findings in front of the students.

He found that D- and F-students used Blackboard, an online assignment, discussion and grading program, 40 percent less than A-, B- and C-students.

So he built a tool with Blackboard called Check My Activity that allows students to compare their Blackboard activity with an anonymous summary of their classmates who earn the same, higher or lower grade on any assignment.

Students who use Check My Activity are 1.5 to 2.8 times more likely to earn at least a C.
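The comparison Check My Activity makes — one student's activity count set against anonymized summaries of classmates by grade band — can be sketched in a few lines. The real tool is built on Blackboard's data; everything below, including the records and function names, is invented for illustration:

```python
# Illustrative sketch of a Check-My-Activity-style comparison.
# Data and names are invented; the real tool runs on Blackboard data.
from statistics import mean

# Hypothetical per-student (grade, activity_count) records for one assignment
records = [("A", 120), ("A", 95), ("B", 80), ("B", 70),
           ("C", 60), ("D", 35), ("F", 20)]

def band_averages(records):
    """Average activity count per grade band (the anonymized summary)."""
    bands = {}
    for grade, count in records:
        bands.setdefault(grade, []).append(count)
    return {grade: mean(counts) for grade, counts in bands.items()}

def compare(my_count, records):
    """Show how one student's activity compares with each grade band."""
    for grade, avg in sorted(band_averages(records).items()):
        rel = "at or above" if my_count >= avg else "below"
        print(f"{grade}-average activity: {avg:.0f} (you are {rel})")

compare(50, records)
```

Because only per-band averages are shown, no classmate's individual activity is exposed — which is what lets the feedback be shared with students directly.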

"The fact is that we know very little about intervention science," Fritz said. "I've been trying to use feedback as a form of intervention."

University system officials are looking at predictive analytics practices at university systems in Tennessee and Georgia as models.

Timothy Renick, an administrator at Georgia State University, found about 800 different data points that help predict whether a student will drop out, and he set up an alert system so an adviser can sit down with the student within 48 hours.

"What we're trying to do with these interventions is correct problems at their earliest sign and get students back on track so they can not only graduate but graduate in a timely fashion," Renick said.

He said the program has helped raise graduation rates by six percentage points and eliminate achievement gaps for low-income, first-generation and minority students.

In the Tennessee system, vice chancellor for academic affairs Tristan Denley developed a program that uses data to recommend classes to students, much as Netflix recommends movies.

Tennessee officials learned that half the freshmen who started off with an undecided major eventually dropped out, so they now require students to pick a broad field of study, such as business.

Since the program began, achievement gaps for low income and minority students have narrowed by more than 50 percent, the graduation rate has risen and students who take classes highly recommended for them are 50 percent more likely to get at least a B.
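The Netflix comparison maps onto a standard recommendation technique: rank candidate courses by how well students similar to you performed in them. The sketch below uses simple nearest-neighbor collaborative filtering; it is a generic illustration, not Denley's actual algorithm, and all the grade histories are invented:

```python
# Generic collaborative-filtering sketch of Netflix-style course
# recommendation. Not the Tennessee system's algorithm; data is invented.
from math import sqrt

# Hypothetical grade histories: student -> {course: grade points}
histories = {
    "s1": {"MATH101": 3.0, "ENG101": 3.5, "BIO101": 2.0},
    "s2": {"MATH101": 3.2, "ENG101": 3.6, "HIST101": 3.8},
    "s3": {"MATH101": 1.5, "ENG101": 2.0, "BIO101": 3.5},
}

def cosine(a, b):
    """Cosine similarity over the courses two students share."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[c] * b[c] for c in shared)
    na = sqrt(sum(a[c] ** 2 for c in shared))
    nb = sqrt(sum(b[c] ** 2 for c in shared))
    return dot / (na * nb)

def recommend(me, histories, min_grade=3.0):
    """Suggest courses in which students similar to `me` earned a B or better."""
    scores = {}
    for hist in histories.values():
        sim = cosine(me, hist)
        for course, grade in hist.items():
            if course not in me and grade >= min_grade:
                scores[course] = scores.get(course, 0.0) + sim * grade
    return sorted(scores, key=scores.get, reverse=True)

new_student = {"MATH101": 3.1, "ENG101": 3.4}
print(recommend(new_student, histories))  # ['HIST101', 'BIO101']
```

A production system would weigh hundreds of signals rather than grades alone, but the ranking-by-similar-students structure is the same.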

Denley and Renick say predictive analytics is critical for helping the growing number of students who are low-income, minority or first in their families to go to college.

"These systems are one of the only things proven successful in leveling the playing field," Renick said. "We have a moral and economic imperative here to try to address these issues."

cwells@baltsun.com

Copyright © 2017, The Baltimore Sun, a Baltimore Sun Media Group publication