A visiting political scientist says AI tools punish the poor

Event Date: 2/13/2019

Virginia Eubanks is a political scientist with a knack for telling stories. She told three of them to a riveted Fowler Hall audience Wednesday afternoon, revealing how algorithms designed to reduce poverty have been shown to actually increase it.

An associate professor of political science at the University at Albany, SUNY, Eubanks periodically read excerpts from her book, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, which reports in detail the three stories she gathered during seven years of research and more than 100 interviews nationwide. For 20 years, Eubanks has worked in community technology and economic justice movements. She was a founding member of the Our Data Bodies Project and a fellow at New America.

THE ‘DIGITAL POORHOUSE’

Eubanks’ stories all speak to a “digital poorhouse,” her term for the modern, digitized version of the poorhouses of the 19th century. “I use the metaphor of the digital poorhouse to illustrate what I think is the deep social programming of these new tools that we’re seeing in social service,” she said. “At their heart is this decision we made in the 1820s that public service programs should focus most of their attention and energy on being moral thermometers — on deciding who is deserving and who is undeserving — rather than acting as a universal floor under us all.”

Beyond that commonality, each of the three stories, based on interviews with digital system developers, administrators, caseworkers and applicants, depicts a different version of programmed digital dysfunction that in various ways builds barriers between those in need and the services that could help them.

A TAX-SAVING STATE INITIATIVE

Eubanks’ first story, and the one closest to campus, was that of Sophie Stipes of Kokomo, Indiana, a severely disabled girl who died at age 13. When Sophie became ill at age 6, her parents applied for Medicaid on her behalf. The request was denied. A letter addressed to Sophie from the state said that she “had failed to cooperate in establishing eligibility for the program,” Eubanks said.

The denial happened just as Indiana was transitioning to automated eligibility-application processing across the state’s welfare system. The state had contracted with a consortium of high-tech companies that, in an effort to contain costs and streamline services, replaced the hands-on work of local caseworkers with online applications and private, regional call centers.

The new automation meant that caseworkers would no longer work with sets of families they would come to know over time, but instead respond to cases from an automated workflow-management system — much like tickets being submitted for IT support.

“The caseworkers no longer had local context for the folks applying for public assistance,” Eubanks said. “In the past, they may have been able to say, ‘It looks like you’re not going to be eligible for food stamps but, because you live in Kokomo, I know there’s a food pantry near you that’s open on Tuesday nights.’ Caseworkers felt like it really separated them from the people who they were hoping to serve.”

THE ACCOUNTABILITY VOID

Eubanks says applicants she interviewed, including Sophie Stipes’ parents, felt as if no one was responsible or accountable when mistakes were made in applications that could run as long as 50 pages. Common, often minor or hidden mistakes would trigger an automatic application denial, which applicants would then have to call to try to rectify. To their frustration, Eubanks said, each call would be a conversation with someone new who would need to learn the problem from the beginning. Applicants would then get different or conflicting advice, or the same bad advice repeatedly. “No one was accountable,” Eubanks said. “The responsibility for finding errors — no matter where the error happened — fell solely on the shoulders of applicants.”

The result, Eubanks said, was 1 million application denials in the first three years of the Indiana experiment, a 54 percent increase over the three years preceding it. The reason for nearly all the denials was the same one given to Sophie Stipes: “failure to cooperate in establishing eligibility,” the catch-all phrase that meant “someone, somewhere in the application process had made a mistake,” Eubanks said.

Ironically, the labor-intensive process required to find and solve mistakes that triggered the denials — and sometimes legal challenges of the denials — would eat up the cost savings derived from the automation, Eubanks said.

Eubanks’ two other stories, from California and Pennsylvania, were variations on the same digital-poorhouse theme, with digitized processes aimed at efficiency ultimately keeping people in need from obtaining assistance.

In Los Angeles County, Eubanks said, a man she interviewed who was seeking housing had been arrested and incarcerated. After his release from jail, he was disqualified from housing assistance because the county’s automated system categorized his recent incarceration as housing.

In Allegheny County, Pennsylvania, Eubanks said, the digital decision-making tools designed to eliminate racial or cultural bias actually hid it. Sharing the story of one poor Allegheny County family, Eubanks illustrated how poverty can be misconstrued as neglect and reported by well-meaning people or people with cultural or racial biases. The parents lived in fear of the Allegheny Family Screening Tool, an automated system that kept track of such reports and whose algorithms could unpredictably deem them unfit parents and recommend that their daughter be taken away from them.

AN ENIGMATIC CHALLENGE

Eubanks concedes that the fix for such pervasive, troubling dysfunction is complex. It starts, she says, with honesty about the prevalence of poverty, and a shift in attitudes about public assistance and the people who seek it. Those changes would then need to be reflected in the automated systems used to manage public assistance eligibility.

“We can decide as a political community that there is a line below which no one is allowed to go for any reason,” Eubanks said. “We can say today: No one in the United States goes hungry. You can’t make a choice that means you deserve to starve … no family in the U.S. is split up because parents can’t afford a child’s medication. In other places around the world, people really quickly identify these things as human rights violations.

“There’s something really profound and troubling about how much we are increasingly thinking about these problems not as human rights issues but as issues of systems engineering.”

 

 
