Massachusetts may soon join more than a dozen US states in offering a smartphone app that can warn users of possible exposure to people infected with COVID-19. But it’s far from obvious that the highly touted technology will actually help check the spread of the disease.
In the states where such apps are available, relatively few people bother to use them. And flaws in the design of the apps make them prone to false positive results, according to Ryan Calo, codirector of the Tech Policy Lab at the University of Washington.
“There is just next to no evidence that they work,” said Calo. “People should just wear masks.”
Still, some of Massachusetts’ brightest minds are working to develop contact tracing apps. PathCheck, an app developed at the Massachusetts Institute of Technology, is already being used in Hawaii, Minnesota, Guam, Puerto Rico, and the European nation of Cyprus. And earlier in November, Worcester Polytechnic Institute announced a joint effort with Virginia Tech to build a new COVID tracing app that could be ready for a public test by summer.
A WPI engineer said the app is being designed to overcome one of the biggest problems with the current generation of tracing apps. In an effort to guarantee privacy, most of the apps don’t record the location where an infected person was, or how long he spent there. WPI said it’s developed a way to track location and time, while still concealing the identity of the infected person.
The Massachusetts Department of Public Health is currently soliciting proposals for a tracing program based on technology built into Apple and Android phones. The system uses Bluetooth signals to record which other phones an infected person’s phone has been near, and then notifies the owners of those phones that they may have been exposed.
Crucially, the system depends on cooperation from the public: The state wants to encourage residents to download and run the COVID tracking software on their phones. Most important, people who test positive would have to report their diagnosis in the app, so it can warn those with whom they have had contact.
Apple and Google began working on the underlying technology in April. Since then, universities and government health departments worldwide have put together custom apps based on the technology and have begun distributing these apps to the public. For example, Ireland’s public health service began offering its version of the app in early July. And about 15 US states now have similar apps.
But Massachusetts was in no hurry to join the trend. In a press conference last week, Governor Charlie Baker said he wanted to make sure that such an app would not violate privacy by needlessly revealing a person’s location and movements.
“Part of the reason you haven’t seen digital contact tracing adopted in a big way ... is the concern about basically targeting and tagging people based on the presence of their phone,” Baker said. “The privacy issues associated with this ... are ones that we don’t believe have been adequately addressed by any of the platforms that we’ve talked to.”
The Apple-Google technology is designed to protect privacy, because the alerts would not include the identity of the infected person; app users would know only that at some point they were near somebody infected with COVID.
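The anonymity scheme can be sketched in miniature. This is a simplified illustration of the general idea, not the actual Apple-Google cryptography: each phone broadcasts short-lived random identifiers derived from a daily key, and only the keys of users who test positive are ever published.

```python
import os
import hashlib

def daily_key() -> bytes:
    """A fresh random key, generated on-device each day."""
    return os.urandom(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    """The identifier broadcast over Bluetooth during one short interval.
    It looks random to anyone who doesn't hold the daily key."""
    return hashlib.sha256(day_key + interval.to_bytes(4, "big")).digest()[:16]

# Phones silently log the identifiers they overhear nearby.
infected_key = daily_key()
heard = {rolling_id(infected_key, 42)}  # overheard at some point during the day

def exposed(published_keys, heard_ids, intervals=range(144)):
    """After an infected user uploads their daily keys, each phone
    re-derives the rolling identifiers and checks for a local match --
    no names and no locations ever leave the device."""
    return any(rolling_id(k, i) in heard_ids
               for k in published_keys for i in intervals)

print(exposed([infected_key], heard))  # True: this phone was near the infected user
```

The match happens entirely on the listener’s phone, which is why the alert can say only that an exposure occurred, not who caused it.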
But even with privacy protections, relatively few Americans are using the tracing apps. For example, Alabama, one of the first states to offer such an app, has seen only about 150,000 downloads since it launched in August, out of a total population of about 5 million. And while over 226,000 Alabamians have been infected so far, only about 380 of those have entered their diagnostic data into the app.
Sue Feldman, associate professor at the University of Alabama School of Medicine, conceded that low usage of her state’s app is limiting its effectiveness.
“Only through wide adoption and use can we give ourselves and others the best chance for the app to have a wide impact,” Feldman said.
Even if the apps begin to catch on, they may produce many false positives. Imagine a group of cars stopped at a traffic light. One of the drivers is infected and his phone sends a Bluetooth warning message to all nearby phones. The infected man is sealed inside his car and poses no risk. But the tracing app has no way of knowing this. So the other drivers may sign up for unnecessary COVID tests.
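The traffic-light scenario comes down to what the radio can and cannot sense. A hypothetical sketch, with illustrative threshold values that are assumptions rather than any app’s real parameters: Bluetooth-based apps infer exposure from signal strength and duration alone, so a driver sealed inside the next car still triggers an alert.

```python
RSSI_THRESHOLD = -65   # dBm; "close enough" cutoff (illustrative, not a real app's value)
MIN_SECONDS = 60       # "long enough" cutoff (illustrative)

def flags_exposure(rssi_dbm: float, seconds: float) -> bool:
    """Exposure is inferred purely from proximity and duration; the radio
    has no way to see windows, walls, or masks between the two phones."""
    return rssi_dbm >= RSSI_THRESHOLD and seconds >= MIN_SECONDS

# Two cars idling side by side at a long red light:
print(flags_exposure(rssi_dbm=-60, seconds=90))  # True -- a likely false positive
```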
Patrick Schaumont, professor of electrical and computer engineering at WPI, sees an even bigger problem: Human contact tracers employed by local governments have to ask specific and sensitive questions about an infected person’s activities, information that could end up revealing their identity.
“You need to build a record of location and time points,” said Schaumont. “Where has this person been? Who was there in the room with him? What meeting was it? At what time of the week?”
In the name of protecting privacy, the Apple-Google system does not collect such information. As a result, said Schaumont, “it’s not compatible with manual contact tracing” programs that so many states have also launched.
Schaumont is joining with researchers at Virginia Tech to create a tracing app that would track the user’s location and movements. That data would be stored on the phone in encrypted form and never shared with any government agency unless the user tests positive for COVID-19.
An infected person’s location data would be made available to public health officials, who would use it to find other people who had been in the same places. Those people would be notified of a possible exposure but would not be told the identity of the infected person. Thus a person who went to the public library at 3 p.m. last Saturday could check to see if an infected person had gone to the same place at the same time but wouldn’t know who the infected person was.
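The library example suggests how place-and-time matching could work without names. This is a hypothetical sketch of that idea, not WPI’s actual design: visits are reduced to hashes of a place and a coarse time slot, so a user can check for overlap with an infected person’s published visits without learning who that person is.

```python
import hashlib
from datetime import datetime

PUBLIC_SALT = b"region-wide-salt"  # assumed shared parameter, for illustration only

def visit_token(place: str, when: datetime) -> str:
    """Reduce a visit to an opaque token tied to place and hour.
    Matching tokens reveal a shared place and time, nothing else."""
    slot = when.strftime("%Y-%m-%d %H")  # coarsen the timestamp to the hour
    return hashlib.sha256(PUBLIC_SALT + place.encode() + slot.encode()).hexdigest()

# Health officials publish tokens derived from an infected person's log:
published = {visit_token("public library", datetime(2020, 11, 21, 15))}

# Another user checks their own visits, entirely on their own phone:
my_visits = [("public library", datetime(2020, 11, 21, 15)),
             ("grocery store", datetime(2020, 11, 21, 17))]
overlaps = [place for place, t in my_visits if visit_token(place, t) in published]
print(overlaps)  # ['public library'] -- an exposure, with no identity revealed
```

The user learns they overlapped with an infected person at the library at 3 p.m.; the infected person’s name never appears anywhere in the exchange.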
Schaumont said such a system would be much more effective than the current Bluetooth-based approach, while still providing adequate privacy protection.