Google’s Totally Creepy, Totally Legal Health-Data Harvesting


The summer after college, I moved back home to take care of my widower grandfather. Part of my job was to manage his medications; at 80, he was becoming a fall risk and often complained that his prescriptions made him light-headed. But getting someone on the phone was exhausting, and privacy law prevented pharmaceutical call-line employees from answering some of my questions about side effects.

So I’d ask Google. I’d sit at my laptop and type incomprehensible words such as methocarbamol or meloxicam into the search bar alongside my concerns. Does it cause dizzy spells? Can you take it without eating? Can you mix it with other medicines? What about caffeine or alcohol? I was 24, overwhelmed, and using a search engine as a stopgap medical advisory board.

In the six years since, Google has gone from a basic digital reference book to a multibillion-dollar player in the health-care industry, with the potential to combine medical and search data in myriad alarming new ways. Earlier this month, it announced its $2.1 billion acquisition of the wearables company Fitbit, and suddenly the company that had logged all our late-night searches about prescriptions and symptoms would potentially also have access to our heart rates and step counts. Immediately, users voiced concern about Google combining fitness data with the sizable cache of information it keeps on its users.

Google assured detractors that it would follow all relevant privacy laws, but the regulatory-compliance discussion only distracted from the strange future coming into view. As Google pushes further into health care, it is amassing a trove of data about our shopping habits, the prescriptions we use, and where we live, and few regulations govern how it uses these data.

The Fitbit acquisition seems quaint compared with news of Google’s latest endeavor. The Wall Street Journal reported on Monday that Google had secretly harvested “tens of millions” of medical records—patient names, lab results, diagnoses, hospitalization records, and prescriptions—from more than 2,600 hospitals as part of a machine-learning project code-named Nightingale. Citing internal documents, the Journal reported that Google, in partnership with Ascension, a health-care provider operating in 20 states, was planning to build a search tool for medical professionals that would employ machine-learning algorithms to process data and make suggestions about prescriptions, diagnoses, and even which doctors to assign to, or remove from, a patient’s team.

Neither affected patients nor Ascension doctors were made aware of the project, the Journal reported. And again, all parties asserted that HIPAA, the package of privacy regulations protecting patient data, allows for its existence. In response to my requests for comment, both Google and Ascension referred to their respective recent blog posts on the topic. “All of Google’s work with Ascension adheres to industry-wide regulations (including HIPAA) regarding patient data, and come with strict guidance on data privacy, security and usage,” Google’s post reads.

The Department of Health and Human Services is probing the legality of the deal. Under Google’s interpretation, the company is merely a “business associate” helping Ascension better render its services—and thus warrants a different level of scrutiny than an actual health-care provider. But if HHS determines that Google and its handling of private information make it something more akin to a health-care provider itself (because of its access to sensitive information from multiple sources who aren’t prompted for consent), it may find Google and Ascension in violation of the law and refer the matter to the Department of Justice for potential criminal prosecution.

But whether or not the deal goes through, its very existence points to a larger limitation of health-privacy laws, which were drafted long before tech giants started pouring billions into revolutionizing health care.

“It’s widely agreed that HIPAA is out of date, and there are efforts ongoing right now to update it for the 21st century,” says Kirsten Ostherr, a co-founder and the director of the Medical Futures Lab at Rice University. HIPAA was signed into law in 1996—years before Google knew if you were pregnant or could algorithmically estimate your risk of suicide. “Most of the kind of data [Google’s] trafficking in is not considered to be personally identifiable information in the way that it was conceived back in the ’90s, when [much of] the tech world didn’t even exist.”

These days, digital behavior is already used to determine all kinds of real-world outcomes. Google and Facebook can infer your emotional state and predict your chance of depression based on your behavior. Children’s YouTube videos were used in scientific research about the potential of artificial intelligence to diagnose autism. Insurance companies use social-media posts to determine premiums. For years, lending institutions have done the same to evaluate creditworthiness. It’s unsettling. It’s legal.

Google says it doesn’t combine its user data with Ascension patient data. But the fact remains that the data it already has on all its users are tremendously revealing. Your IP address contains information about where you live, which in turn is associated with social determinants of health such as income, employment status, and race. Search terms such as nearest food pantry or nearest HIV test can offer further clues about income level, sexual orientation, and so on.

“HIPAA’s an exceptionally low bar,” says Travis Good, a medical doctor and privacy specialist. “None of that [search] data, whether you’re searching for STI clinics or Plan B or a dermatologist, none of that’s covered under HIPAA.”

A recent report from the Financial Times, done in collaboration with Carnegie Mellon University, notes that Google, like Amazon and Microsoft, collects data entered into popular health and diagnosis sites. Google’s ad service, DoubleClick, receives prescription names from Drugs.com, for example, while WebMD’s symptom checker shares information with Facebook. The data are not anonymized, and the legal experts interviewed argued that the collection may violate European Union privacy law.

Your very online existence—the sites you access, where you access them from, the ads you click on—gives Google the kind of holistic, robust, up-to-date view of your health that was largely unimaginable a decade ago. “The hype, or hope, is that as you gather more and more info, and when you’re able to combine [different data sets], you’re able to come up with super-tailored care pathways and eventually treatments,” Good says. “So it’s not just, you’re 35 and have pancreatic cancer. It’s, you’re 35, have pancreatic cancer, here’s your medical history, your family history, and genetic markers for oncology, and here’s the care pathway just for you.”

Creating tailored medical treatments for countless patients at scale requires an enormous amount of data that need to be standardized, tested for accuracy and bias, stored securely, processed rapidly, and made comprehensible enough that doctors can understand and confer with one another on a patient’s best care. This is Google’s specialty. It doesn’t need your consent; it already has your information.
