
What happens when an algorithm cuts your health care

Illustrations by William Joel; Photography by Amelia Holowaty Krales

For most of her life, Tammy Dobbs, who has cerebral palsy, relied on her family in Missouri for care. But in 2008, she moved to Arkansas, where she signed up for a state program that provided for a caretaker to give her the help she needed.

There, under a Medicaid waiver program, assessors interviewed beneficiaries and decided how frequently the caretaker should visit. Dobbs’ needs were extensive. Her condition left her in a wheelchair and stiffened her hands. The most basic tasks of life — getting out of bed, going to the bathroom, bathing — required assistance, not to mention the trips to yard sales she treasured. The nurse assessing her situation allotted Dobbs 56 hours of home care visits per week, the maximum allowed under the program.

For years, she managed well. An aide arrived daily at 8AM, helped Dobbs out of bed, into the bathroom, and then made breakfast. The aide would return at lunch, then again in the evening for dinner and any household tasks that needed to be done, before helping Dobbs into bed. The final moments were especially important: wherever Dobbs was placed to sleep, she’d stay until the aide returned 11 hours later.

Dobbs received regular reassessments of her needs, but they didn’t worry her. She wouldn’t be recovering, after all, so it didn’t seem likely that changes would be made to her care.

When an assessor arrived in 2016 and went over her situation, it was a familiar process: how much help did she need to use the bathroom? What about eating? How was her emotional state? The woman typed notes into a computer and, when it was over, gave Dobbs a shocking verdict: her hours would be cut, to just 32 per week.  

Tammy Dobbs.

Dobbs says she went “ballistic” on the woman. She pleaded, explaining how that simply wasn’t enough, but neither of them, Dobbs says, seemed to quite understand what was happening. Dobbs’ situation hadn’t improved, but an invisible change had occurred. When the assessor entered Dobbs’ information into the computer, it ran through an algorithm that the state had recently approved, determining how many hours of help she would receive.

Other people around the state were also struggling to understand the often drastic changes. As people in the program talked to each other, hundreds of them complained that their most important lifeline had been cut, and they were unable to understand why.

Algorithmic tools like the one Arkansas instituted in 2016 are everywhere from health care to law enforcement, altering lives in ways the people affected can usually only glimpse, if they know they’re being used at all. Even when the details of the algorithms are accessible, which isn’t always the case, they’re often beyond the understanding of even the people using them, raising questions about what transparency means in an automated age, and concerns about people’s ability to contest decisions made by machines.

Dobbs wrote that institutionalization would be a “nightmare”

Planning for the cut in care, Dobbs calculated what she could do without, choosing between trips to church and keeping the house clean. She had always dabbled in poetry, and later wrote a simple, seven-stanza piece called “Hour Dilemma,” directed toward the state. She wrote that institutionalization would be a “nightmare,” and asked the state “to return to the human based assessment.”

The change left Dobbs in a situation she never thought she would be in, as the program she’d relied on for years fell out from under her. “I thought they would take care of me,” she says.

The algorithm that upended Dobbs’ life fits comfortably, when printed, on about 20 pages. Although it’s difficult to decipher without expert help, the algorithm translates about 60 descriptions, symptoms, and ailments — fever, weight loss, ventilator use — into categories, each one corresponding to a number of hours of home care.
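As a rough illustration of that structure, here is a minimal sketch in Python. The item names, scores, thresholds, and hour values are invented for the example, not taken from the InterRAI instrument, which uses roughly 60 items and its own rules.

```python
# Hypothetical sketch of the general shape described above: score a few
# assessment items, sort the person into a category, then look up the weekly
# hours attached to that category. All names and numbers here are invented.

CATEGORY_HOURS = {"low": 10, "medium": 22, "high": 32, "maximum": 56}

def categorize(items: dict) -> str:
    """Sort an assessment into a care category (illustrative rules only)."""
    if items.get("ventilator_use", 0) > 0:
        return "maximum"
    score = items.get("mobility", 0) + items.get("eating", 0) + items.get("bathing", 0)
    if score >= 10:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

def weekly_hours(items: dict) -> int:
    return CATEGORY_HOURS[categorize(items)]

print(weekly_hours({"mobility": 4, "eating": 3, "bathing": 3}))  # -> 32 ("high")
```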

Like many industries, health care has turned to automation for efficiency. The algorithm used in Arkansas is one of a family of tools, called “instruments,” that attempt to provide a snapshot of a person’s health in order to inform decisions about care everywhere from nursing homes to hospitals and prisons.

The instrument used in Arkansas was designed by InterRAI, a nonprofit coalition of health researchers from around the world. Brant Fries, a University of Michigan professor in the school’s Department of Health Management and Policy who is now the president of InterRAI, started developing algorithms in the 1980s, originally for use in nursing homes. The instruments are licensed to software vendors for a “small royalty,” he says, and the users are asked to send data back to InterRAI. The group’s tools are used in health settings in nearly half of US states, as well as in several countries.

The US is inadequately prepared to care for a population that’s living longer

In home care, the problem of allocating help is particularly acute. The United States is inadequately prepared to care for a population that’s living longer, and the situation has caused problems for both the people who need care and the aides themselves, some of whom say they’re led into working unpaid hours. As needs increase, states have been prompted to look for new ways to contain costs and distribute what resources they have.

States have taken diverging routes to solve the problem, according to Vincent Mor, a Brown professor who studies health policy and is an InterRAI member. California, he says, has a sprawling, multilayered home care system, while some smaller states rely on personal assessments alone. Before using the algorithmic system, assessors in Arkansas had wide leeway to assign whatever hours they thought were necessary. In many states, “you meet eligibility requirements, a case manager or nurse or social worker will make an individualized plan for you,” Mor says.

Arkansas has said the previous, human-based system was ripe for favoritism and arbitrary decisions. “We knew there would be changes for some individuals because, again, this assessment is much more objective,” a spokesperson told the Arkansas Times after the system was implemented. Aid recipients have pointed to a lack of evidence showing such bias in the state. Arkansas officials also say a substantial percentage of people had their hours raised, while recipients argue the state has also been unable to produce data on the scope of the changes in either direction. The Arkansas Department of Human Services, which administers the program, declined to answer any questions for this story, citing a lawsuit unfolding in state court.

When similar health care systems have been automated, they have not always performed flawlessly, and their errors can be difficult to correct. The scholar Danielle Keats Citron cites the example of Colorado, where coders placed more than 900 incorrect rules into its public benefits system in the mid-2000s, resulting in problems like pregnant women being denied Medicaid. Similar issues in California, Citron writes in a paper, led to “overpayments, underpayments, and improper terminations of public benefits,” as foster children were incorrectly denied Medicaid. Citron writes about the need for “technological due process” — the importance of both understanding what’s happening in automated systems and being given meaningful ways to challenge them.

Critics point out that, when designing these programs, incentives are not always aligned with easy interfaces and intelligible processes. Virginia Eubanks, the author of Automating Inequality, says many programs in the United States are “premised on the idea that their first job is diversion,” increasing barriers to services and at times making the process so difficult to navigate “that it just means that people who really need these services aren’t able to get them.”

One of the most bizarre cases happened in Idaho, where the state made an attempt, like Arkansas, to institute an algorithm for allocating home care and community integration funds, but built it in-house. The state’s home care program calculated what it would cost to care for severely disabled people, then allotted funds to pay for help. But around 2011, when a new formula was instituted, those funds dropped precipitously for many people, by as much as 42 percent. When people whose benefits were cut tried to find out how the amounts had been calculated, the state declined to disclose the formula it was using, saying that its math qualified as a trade secret.

“It really, truly went wrong at every step of the process.”

In 2012, the local ACLU branch brought suit on behalf of the program’s beneficiaries, arguing that Idaho’s actions had deprived them of their rights to due process. In court, it was revealed that, when the state was building its tool, it relied on deeply flawed data, and threw away most of it immediately. Still, the state went ahead with the data that was left over. “It really, truly went wrong at every step of the process of developing this kind of formula,” ACLU of Idaho legal director Richard Eppink says.

Most importantly, when Idaho’s system went haywire, it was impossible for the average person to understand or challenge. A court wrote that “the participants receive no explanation for the denial, have no written standards to refer to for guidance, and often have no family member, guardian, or paid assistance to help them.” The appeals process was difficult to navigate, and Eppink says it was “really meaningless” anyway, as the people who received appeals couldn’t understand the formula, either. They would look at the system and say, “It’s beyond my authority and my expertise to question the quality of this result.”

Idaho has since agreed to improve the tool and create a system that Eppink says will be more “transparent, understandable, and fair.” He says there might be an ideal formula out there that, when the right variables are entered, has gears that turn without friction, allocating assistance in the perfect way. But if the system is so complex that it’s impossible to make intelligible for the people it’s affecting, it’s not doing its job, Eppink argues. “You have to be able to understand what a machine did.”

Cash, Arkansas.

“That’s an argument,” Fries says. “I find that to be really strange.” He’s sympathetic to the people who had their hours cut in Arkansas. Whenever one of his systems is implemented, he says, he recommends that people under old programs be grandfathered in, or at least have their care adjusted gradually; the people in these programs are “not going to live that long, probably,” he says. He also suggests giving humans some room to adjust the results, and he acknowledges that moving rapidly from an “irrational” to a “rational” system, without properly explaining why, is painful. Arkansas officials, he says, didn’t listen to his advice. “What they did was, in my mind, really stupid,” he says. People who were used to a certain level of care were thrust into a new system, “and they screamed.”

“What they did was, in my mind, really stupid.”

Fries says he knows the assessment process — having a person come in, conduct an interview, and feed numbers into a machine that spits out a determination — is not necessarily comfortable. But, he says, the system provides a way to allocate care that’s backed by studies. “You could argue everybody ought to get a lot more care out there,” he says, but an algorithm allows state officials to do what they can with the resources they have.

As for the transparency of the system, he agrees that the algorithm is impossible for most to easily understand, but says that it’s not a problem. “It’s not simple,” he says. “My washing machine isn’t simple.” But if you can capture complexity in more detail, Fries argues, this will ultimately serve the public better, and at some point, “you’re going to have to trust me that a bunch of smart people determined this is the smart way to do it.”

Shortly after Arkansas started using the algorithm in 2016, Kevin De Liban, an attorney for Legal Aid of Arkansas, started to receive complaints. Someone said they were hospitalized because their care was cut. A slew of others wrote in about radical readjustments.  

De Liban first learned about the change from a program beneficiary named Bradley Ledgerwood. The Ledgerwood family lives in the tiny city of Cash, in the northeast of the state. Bradley, the son, has cerebral palsy, but stays active, following basketball and Republican politics, and serving on the city council.

When Bradley was younger, his grandmother took care of him during the day, but as he got older and bigger, she couldn’t lift him, and the situation became untenable. Bradley’s parents debated what to do and eventually decided that his mother, Ann, would stay home to care for him. The decision meant a severe financial hit: Ann would have to quit her job doing appraisals for the county. But the Arkansas program gave them a path to recover some of those losses. The state would reimburse Ann a small hourly rate to compensate her for taking care of Bradley, with the number of reimbursable hours determined by an assessment of his care needs.

Legal Aid attorney Kevin De Liban.

When the state moved over to its new system, the Ledgerwood family’s hours were also substantially cut. Bradley had dealt with the Arkansas Department of Human Services in a previous dispute over home care hours, and he reached out to De Liban, who agreed to look into it.

With Bradley and an elderly woman named Ethel Jacobs as the plaintiffs, Legal Aid filed a federal lawsuit in 2016, arguing that the state had instituted a new policy without properly notifying the people affected about the change. There was also no way to effectively challenge the system, as they couldn’t understand what information factored into the changes, De Liban argued. No one seemed able to answer basic questions about the process. “The nurses said, ‘It’s not me; it’s the computer,’” De Liban says.

“It’s not me; it’s the computer.”

At the time, they knew it was some sort of new, computer-based system, but there was no mention of an algorithm; the math behind the change only came out after the lawsuit was filed. “It didn’t make any sense to me in the beginning,” De Liban says. When they dug into the system, they discovered more about how it works. Out of the lengthy list of items that assessors asked about, only about 60 factored into the home care algorithm. The algorithm scores the answers to those questions, and then sorts people into categories through a flowchart-like system. It turned out that a small number of variables could matter enormously: for some people, scoring a three instead of a four on any of a handful of items meant a cut of dozens of care hours a month. (Fries didn’t say this was wrong, but said, when dealing with these systems, “there are always people at the margin who are going to be problematic.”)
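To see why a single point can matter so much, here is a hedged sketch of one flowchart-style branch, again with invented item names, thresholds, and hour values: crossing a single threshold drops a person into a lower tier, and the gap between tiers can amount to dozens of hours a month.

```python
# Illustrative only: invented numbers showing the cliff effect of a
# flowchart-like classifier. A one-point change on a single item pushes a
# person across a branch into a category with far fewer hours.

TIER_HOURS_PER_WEEK = {"A": 46, "B": 32}  # hypothetical tiers

def tier(toileting_score: int) -> str:
    # Branch on a single item: a score of 4 or more lands in the higher tier.
    return "A" if toileting_score >= 4 else "B"

for score in (4, 3):
    print(f"toileting score {score}: {TIER_HOURS_PER_WEEK[tier(score)]} hours/week")
# Dropping from a 4 to a 3 costs 14 hours a week here, roughly 60 hours a month.
```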

The Ledgerwood family.

De Liban started keeping a list of what he thought of as “algorithmic absurdities.” One variable in the assessment was foot problems. When an assessor visited a certain person, they wrote that the person didn’t have any problems — because they were an amputee. Over time, De Liban says, they discovered wildly different scores when the same people were assessed, despite being in the same condition. (Fries says studies suggest this rarely happens.) De Liban also says negative changes, like a person contracting pneumonia, could counterintuitively lead them to receive fewer help hours because the flowchart-like algorithm would place them in a different category. (Fries denied this, saying the algorithm accounts for it.)

But from the state’s perspective, the most embarrassing moment in the dispute happened during questioning in court. Fries was called in to answer questions about the algorithm and patiently explained to De Liban how the system works. After some back-and-forth, De Liban offered a suggestion: “Would you be able to take somebody’s assessment report and then sort them into a category?” (He said later he wanted to understand what changes triggered the reduction from one year to the next.)

Somehow, the wrong calculation was being used

Fries said he could, although it would take a little time. He looked over the numbers for Ethel Jacobs. After a break, a lawyer for the state came back and sheepishly admitted to the court: there was a mistake. Somehow, the wrong calculation was being used. They said they would restore Jacobs’ hours.

“Of course we’re gratified that DHS has reported the error and certainly happy that it’s been found, but that almost proves the point of the case,” De Liban said in court. “There’s this immensely complex system around which no standards have been published, so that no one in their agency caught it until we initiated federal litigation and spent hundreds of hours and thousands of dollars to get here today. That’s the problem.”

It came out in the court case that the problem was with a third-party software vendor implementing the system, which mistakenly used a version of the algorithm that didn’t account for diabetes issues. There was also a separate problem with cerebral palsy, which wasn’t properly coded in the algorithm, and that caused incorrect calculations for hundreds of people, mostly lowering their hours.

“As far as we knew, we were doing it the right way,” Douglas Zimmer, the president of the vendor, a company called the Center for Information Management, says about using the algorithm that did not include diabetes issues. New York also uses this version of the algorithm. He says the cerebral palsy coding problem was “an error on our part.”

“If states are using something so complex that they don’t understand it, how do we know that it’s working right?” De Liban says. “What if there’s errors?”

About 19 percent of beneficiaries were affected by the diabetes omission

Fries later wrote in a report to the state that about 19 percent of all beneficiaries were negatively impacted by the diabetes omission. He told me that the swapped algorithms amounted to a “very, very marginal call,” and that, overall, it wasn’t unreasonable for the state to continue using the system that allotted fewer hours, as New York has decided to do. In the report and with me, he said the diabetes change was not an “error,” although the report says the more widely used algorithm was a “slightly better” match for Arkansas. One item listed as a “pro” in the report: moving back to the original algorithm was “responsive to trial result,” as it would raise the plaintiffs’ hours close to their previous levels. It’s not clear whether the state has since started counting diabetes issues; as of December, an official said he believed it had not. The Department of Human Services declined to comment.

But in internal emails seen by The Verge, Arkansas officials discussed the cerebral palsy coding error and the best course of action. On an email chain, the officials suggested that, since some of the people who had their hours reduced didn’t appeal the decision, they effectively waived their legal right to fight it. (“How is somebody supposed to appeal and determine there’s a problem with the software when DHS itself didn’t determine that?” De Liban says.) But after some discussion, one finally said, “We have now been effectively notified that there are individuals who did not receive the services that they actually needed, and compensating them for that shortcoming feels like the right thing to do.” It would also “place DHS on the right side of the story.”

“Compensating them for that shortcoming feels like the right thing to do.”

The judge in the federal court case ultimately ruled that the state had insufficiently implemented the program. The state also subsequently made changes to help people understand the system, including lists that showed exactly what items on their assessments changed from year to year. But De Liban says there was a larger issue: people weren’t given enough help in general. While the algorithm sets the proportions for care — one care level, for example, might be two or three times higher than another — it’s up to the state to decide how many hours to put into the equation.
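Put another way, with a minimal sketch and invented numbers: the algorithm fixes only the ratios between care levels, while the totals depend on a base figure the state chooses.

```python
# Hypothetical sketch of the split described above: the instrument supplies
# relative weights per care category; the state supplies the base hours those
# weights scale. Changing the base changes everyone's hours without touching
# the algorithm itself. All numbers here are invented.

CATEGORY_WEIGHTS = {"low": 1.0, "medium": 2.0, "high": 3.0}

def allotted_hours(category: str, base_hours: float) -> float:
    return CATEGORY_WEIGHTS[category] * base_hours

for base in (16, 10):  # two possible state-chosen base figures
    print(base, {c: allotted_hours(c, base) for c in CATEGORY_WEIGHTS})
```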

“How much is given is as much a political as a service administration issue,” Mor says.

Fries says there’s no best practice for alerting people about how an algorithm works. “It’s probably something we should do,” he said when I asked whether his group should find a way to communicate the system. “Yeah, I also should probably dust under my bed.” Afterward, he clarified that he thought it was the job of the people implementing the system.

Kevin De Liban after a visit to Tammy Dobbs.

De Liban says the process for people appealing their cuts has been effectively worthless for most. Out of 196 people who appealed a decision at one point before the ruling, only nine won, and most of those were Legal Aid clients fighting on procedural grounds. While it’s hard to know, De Liban says it’s very possible some had errors they weren’t aware of.

Eubanks, the author of Automating Inequality, writes about the “digital poorhouse,” showing the ways automation can give a new sheen to long-standing mistreatment of the vulnerable. She told me there is a “natural trust” that computer-based systems will produce unbiased, neutral results. “I’m sure it is in some cases, but I can say with a fair amount of confidence it is not as descriptive or predictive as the advocates of these systems claim,” she says.

Eubanks proposes a test for evaluating algorithms directed toward the poor, including asking whether the tool increases their agency and whether it would be acceptable to use with wealthier people. It doesn’t seem obvious that the Arkansas system would pass that test. In one sign that officials have been disappointed, they’ve said they will soon migrate to a new system and software provider, likely calculating hours in a different way, although it’s not clear exactly what that will mean for people in the program.

Dobbs has done well up until now. Her house sits off a winding road on a lakeside hill, dotted in winter with barren trees. When the sun sets in the afternoon, light pours in through the windows and catches the plant collection Dobbs manages with help from an aide. A scruffy, sweatered dog named Spike hopped around excitedly when I visited recently, as a fluffy cat jockeyed for attention. “Sometimes I like them better than humans,” Dobbs says. On the wall was a collection of Duck Dynasty memorabilia and a framed photo of her with Kenny Rogers from when she worked at the Missouri building then known as the Kenny Rogers United Cerebral Palsy Center.

Dobbs with Kenny Rogers.
Outside Dobbs’ home.

For the time being, she’s stuck in limbo. She’ll soon come up for another reassessment, and while it’s almost certain, based on what is known about the system, that she’ll be given a cut, it’s hard to say how severe it will be. She’s been through the process more than once now. Her hours were briefly restored after a judge ruled in the plaintiffs’ favor in the federal lawsuit, only for them to be cut again after the state changed its notification system to comply with the ruling and reimplemented the algorithm. As she went through an appeal, the Department of Human Services, De Liban says, quietly reinstated her hours again. This, he says, was right around the time the cerebral palsy issue was discovered. He says this may have been the reason it was dropped: to save face. But as many people grappling with the changes might understand, it’s hard to know for sure.