Apple's ResearchKit sounds promising, in theory: the software platform is designed to let medical researchers create iPhone apps for their studies that will help them recruit participants — without said participants ever setting foot in the lab. Since recruiting patients is one of the most difficult steps in any study, this could be a significant boon to medical research.
The research apps are easy to download, and the Apple audience is very large; both are advantages for recruitment. But those things can also be ethical liabilities: both Apple and ResearchKit researchers will have to make extra efforts to ensure that participants are eligible for studies, that they are knowledgeable about the risks, and that their data is secure.
I probably shouldn't ask about ethics and informed consent with the Apple ResearchKit thing should I?— Erin Ryan (@erinleeryan) March 9, 2015
Minors, Keep Out
The ResearchKit announcement was a big deal yesterday, so it's understandable that the five ResearchKit apps Apple announced are currently featured in a banner at the top of the App Store. But it's not only adults who own iPhones; over 60 percent of American teens have one, too. And that's the hitch: minors aren't supposed to take part in medical studies without parental consent. Unfortunately, I was able to lie about my age, and then change my answer to meet the study eligibility criteria for at least one of these apps.
I was able to go back and change my age
When you open up "Asthma" — one of the five ResearchKit apps released yesterday — you're asked a number of questions, including about your age. If you say you're not over 18, you're ineligible. But answering "yes" to the age question, and going through a few other questions (Are you pregnant? No. Do you live in the US? Yes.) lets you know that you're eligible for the study.
Of course, on the internet no one knows you're a dog — or a minor. It's easy to lie about your age online, and Mount Sinai Hospital, the research group behind the app, appears to have dropped the ball in preventing it. In person, it's easy to demand an ID, like a driver's license, to verify age. When I told the app that I was under 18, I was kicked out of the study. But I could still go back to the previous page and change my answer — and when I did, I was suddenly eligible. The same is true of the Parkinson's disease app "mPower" and the breast cancer app "Share the Journey," according to Sage Bionetworks' Chief Commons Officer, John Wilbanks, who worked on the apps.
"It's tough, you know — it's certainly possible to have a system where you only get one shot with the eligibility criteria in our studies, but we chose not to do that," Wilbanks says. That has a lot to do with the nature of the population Sage Bionetworks is serving, he says. "With Parkinson's disease, or post-chemo cognition, people tap in the wrong place, and so we don't want that to disqualify anyone." Sure — but if a user changes her answer about her age, it might be wise to automatically require an ID photo for age verification.
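The "one shot" gate Wilbanks describes is simple to picture in code. The sketch below is a hypothetical illustration — none of these names come from the actual apps — in which a disqualifying answer to the age question locks the screener, so going back and changing the answer no longer works:

```python
class EligibilityScreener:
    """Hypothetical one-shot age gate: a disqualifying answer is final."""

    def __init__(self):
        self.locked_out = False

    def answer_age_question(self, over_18: bool) -> bool:
        """Return True if the user may proceed with the screener."""
        if self.locked_out:
            # The user already said they were under 18; revising
            # the answer no longer makes them eligible.
            return False
        if not over_18:
            self.locked_out = True
            return False
        return True
```

The trade-off Wilbanks raises still applies, of course: a stray tap from someone with a tremor would lock them out too, which is exactly why Sage chose not to build it this way.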
"We are less worried about fraud and trolling and impersonation."
Wilbanks doesn't think that impersonation will be a significant problem. "We are less worried about fraud and trolling and impersonation because there's a fair amount of work involved in being in this study, and doing it just for fraudulent purposes is kind of a strange behavior," he says. "People might do it for shits and giggles for a week or two, but I can't see someone doing voice testing and gait testing for 52 weeks when there's no sort of public pay off."
The disease-specific issues raised by Wilbanks make sense, and it is possible that people who wish to impersonate others will get bored with the charade. But there are children with asthma, and it's easy to imagine that minors would enroll in these studies without getting consent from their parents — an issue that could open these studies up to an ethical challenge, says Steven Joffe, a medical ethicist at the University of Pennsylvania. "There are special research protections for minors, including a requirement for parental permission in most cases."
Protecting your privacy
Privacy is also an issue. Apple has clearly thought about this, as evidenced by the words "Apple will not see your data" splashed on the gigantic screen behind Jeff Williams, Apple's Vice President of Operations, yesterday. But making sure Apple doesn't get its hands on the data is just part of the issue.
"Apple will not see your data"
"Everything depends on the quality of the anonymization," says Nir Eyal, a medical ethicist at Harvard University. "If the data can be re-identified, then subjects are exposed to some invasions of privacy."
To ensure that the data they receive is anonymous, Stanford and Oxford University are working with Sage Bionetworks. Sage is the entity that receives the data gathered from users' iPhones. When someone signs up for a study, Sage checks to see if any identifiable information is contained in the data that their phone sends out. If there's anything that could tie a user back to where they live or who they are, it's removed from that person's data stream, Wilbanks says. Sage also makes sure that it doesn't happen again. Once the data is anonymized, it's sent to Stanford so researchers can perform an analysis. Unfortunately, "we can't promise perfect anonymity," Wilbanks says. "The biggest risk in these studies is to your privacy; we're going to de-identify it, but because we're going to make it available for lots of research, there exists a chance that someone could re-identify you."
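The scrubbing step Wilbanks describes amounts to filtering identifying fields out of each record before it moves downstream. Here's a minimal sketch of that idea, with an invented field list — the real pipeline's rules aren't public:

```python
# Fields assumed to be identifying for this illustration only;
# Sage's actual blocklist is not public.
IDENTIFYING_FIELDS = {"name", "email", "phone", "home_address", "gps_location"}

def deidentify(record: dict) -> dict:
    """Drop any field that could tie a record back to a person."""
    return {key: value for key, value in record.items()
            if key not in IDENTIFYING_FIELDS}
```

As the quote notes, stripping obvious identifiers isn't a guarantee: the fields that remain can sometimes be combined to re-identify someone.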
"there exists a chance that someone could re-identify you."
In the medical research world, data isn't shared with other clinics or universities unless the participants give their consent at the beginning of the study, says Michael McConnell, a cardiologist at Stanford University and the principal investigator behind the ResearchKit-made "MyHeart Counts" app. From what we can tell, the apps that have been released so far appear to make that clear. Of course, it's possible that a research group might one day decide to sell the information — for instance, to a pharmaceutical company. "There seems to be nothing preventing researchers from bringing in commercial companies or selling data, with user approval, to commercial bodies," Eyal says. "That's not a problem in its own right, but especially given that the system is based on millions of altruistic volunteers, it would only be appropriate if funding bodies and [review boards] made sure that studies that use this system address major sources of human disease burden — and that products that come out of it will be accessible widely around the world."
Nothing stopping Pfizer from using ResearchKit, too
Of course, ResearchKit is open-source, so pharmaceutical companies might actually prefer to get in on the action directly. So far, there doesn't appear to be anything stopping Merck or Pfizer from building an app using ResearchKit and submitting it to the App Store (The Verge contacted Apple to inquire about this yesterday and today, but we have yet to hear back). In that case, these apps would essentially provide a portal for pharmaceutical companies wishing to gather personal health information from iPhone users — and profit from it.
Giving Informed Consent
Study eligibility is just a tiny part of the enrollment process. After that, iPhone users encounter another thorny ethical area: informed consent. Informed consent is crucial for any medical study; it's the step that lets a participant know what they're in for and what risks are involved in taking part. Normally, researchers go through informed consent with participants in person, one-on-one. "That interaction is pretty crappy," says Sage Bionetworks' Wilbanks.
With the apps, the process works a little differently. Participants are shown images, and sometimes videos, that help them understand how the study will run. Then, they're given a quiz to test their knowledge. In "Share the Journey" and "mPower," users have to get all the questions right, Wilbanks says, to move on to the part where they can voluntarily agree to take part in the study. Those who fail have to go through the consent training all over again.
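That all-or-nothing quiz gate boils down to a single check. A hypothetical sketch (the function name and question format are invented for illustration):

```python
def passed_consent_quiz(answers: list, answer_key: list) -> bool:
    """Users advance to the consent step only if every quiz answer
    is correct; otherwise they repeat the consent training."""
    return len(answers) == len(answer_key) and all(
        given == correct for given, correct in zip(answers, answer_key)
    )
```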
If you get a question wrong, you go through the informed consent training again
Being able to take someone through informed consent on a mobile device makes designing large studies easier, says Corey Bridges, CEO of LifeMap Solutions, a mobile health company that partnered with Mount Sinai Hospital to build the Asthma app. "What we get with ResearchKit is the unshackling of science from those brick-and-mortar constraints, so we can do something much more scalable."
Despite some of the improvements to the informed consent process, one problem remains: many of these apps haven't figured out a way to let users ask questions during the process, which is what the one-on-one interaction allows. "We haven't yet cracked how to ask questions yet," Wilbanks says. "That's something that we are going to try to figure out next — for instance, can we come up with some sort of peer-to-peer system where people can ask questions."
"We haven't yet cracked how to ask questions yet."
There's also the possibility that someone might forget that they're taking part in the study because the app is simply gathering data from HealthKit, the iPhone health app that runs in the background. "It would be too easy for users to turn the function on and then forget about it," Joffe says. To fix this, researchers might consider having people renew informed consent on a yearly basis. As a bonus, Joffe says, "periodic re-consent would encourage partners to provide users with updated information."
In the medical world, Western, educated people from industrialized, rich, democratic countries are referred to as WEIRD. They reason and make decisions differently than other populations, and they even differ in things like visual perception. Yet, many scientific studies sample from these populations because they take place on college campuses. This is what researchers call a "population bias" — it's a problem that arises when the population you're studying doesn't represent the entire population affected by the disease.
If ResearchKit becomes the standard for digital medicine, there may be similar problems. For one thing, people who have iPhones tend to be wealthier — and diseases vary by income. For instance, people with lower socioeconomic status are more likely to develop heart disease compared with their wealthier counterparts.
And that's to say nothing of another population bias: engagement. The people most likely to jump on the ResearchKit bandwagon will be "people who are serious about doing something about their disease — a particular kind of patient," Eyal says. If these people are more willing to work to improve their health, we might end up with large data sets that track only the most-engaged patients.
If ResearchKit becomes the standard for digital medicine, we're going to run into some problems
Of course, it’s possible that the study populations will end up being more diverse than traditional studies. Researchers might be able to reach people who live in remote areas, for instance, instead of sampling solely from urban areas with nearby colleges.
"It’s kind of a shot across the bow of the traditional way of doing clinical research, which is to pretend that surveying 2,000 white dudes in Norway about fish oil is representative of the world," Wilbanks says. A medical study with 2,000 people isn't unusual. But when it comes to politics or TV ratings, we expect larger sample sizes. Medicine should be no different, he says. "There’s coming a day when you’re not going to have an excuse to have a tiny cohort just because you chose not to use digital technologies to engage people."
The Learning Curve is Real
The researchers involved in these apps are all trying to do something good — and they're doing it by the book, as much as possible in this new territory. These aren't the first apps used in studies, of course. All four of the apps mentioned in this article have an informed consent procedure, and all four gained institutional review board (IRB) approval prior to their release. IRBs are committees charged with approving, monitoring, and reviewing research that involves humans; they determine if a study is ethical and safe. They are, in other words, a kind of patient protection.
Studies involving mobile apps aren't new, but they're still experimental
I find myself wondering if IRB approval is a requirement for all ResearchKit apps that Apple will feature. That's not all I'm wondering about, either. Will medical researchers who use ResearchKit be required to anonymize the data they gather from our phones? Finally, will Apple work with research teams to make sure that study participants truly are over 18? As of publication, Apple hasn't responded to our requests for comment on these issues.
Studies involving mobile apps aren't new, but they're still very experimental. And with Apple's ResearchKit, running medical studies through mobile apps is likely to go mainstream. Last night, Wilbanks tweeted that the Parkinson's disease app mPower had recruited over 7,000 study participants in just a few hours — a number that handily surpasses the previous largest Parkinson's study, which Wilbanks says had just 1,700 participants. "We were pretty shocked as those numbers went up yesterday — in a good way," he told The Verge. That means this study could represent a gigantic leap forward for people suffering from Parkinson's disease.
"ironic if the only purpose for which information cannot be collected is medical research."
Advertising agencies and corporations get to profit off our data all the time, so in a way, it makes sense to finally let medical researchers have a seat at the data-driven table. "It would be ironic if the only purpose for which information cannot be collected is medical research for the betterment of the community," Harvard's Eyal says.
But if we're going to do this, we need to do it right. ResearchKit won't be available for developers until next month, so it's possible that Apple will make changes before then, or that it will release information about the strict rules it plans to enforce. Right now, though, we're in some pretty murky ethical waters — and displaying the words "Apple will not see your data" is a half-inflated life preserver, at best.