The summer after college, I moved back home to take care of my widower grandfather. Part of my job was to manage his medications; at 80, he was becoming a fall risk and often complained his prescriptions made him lightheaded. But getting someone on the phone was exhausting, and privacy law prevented pharmaceutical call-line employees from answering some of my questions about side effects.
So, I’d ask Google. I’d sit at my laptop and type incomprehensible words like “methocarbamol” or “meloxicam” into the search bar alongside my concerns. Does it cause dizzy spells? Can you take it without eating? Can you mix it with other medicines? What about caffeine or alcohol? I was 24, overwhelmed, and using a search engine as a stopgap medical advisory board.
In the six years since, Google has gone from a basic digital reference book to a multibillion-dollar player in the healthcare industry, with the potential to combine medical and search data in myriad alarming new ways. Earlier this month, it announced its $2.1 billion acquisition of the wearables company Fitbit, and suddenly, the company that had logged all our late-night searches about prescriptions and symptoms would potentially also have access to our heart rates and step counts. Immediately, users voiced concerns about Google combining fitness data with the sizable cache of information it already keeps on its users.
Google assured detractors that it would follow all relevant privacy laws, but the regulatory compliance discussion only distracted from the strange future coming into view. As Google pushes further into healthcare, it is amassing a trove of data about our shopping habits, the prescriptions we use, and where we live, and few regulations govern how it uses this data.
The Fitbit acquisition seems quaint compared to news of Google’s latest endeavor. The Wall Street Journal reported Monday that Google secretly harvested “tens of millions” of medical records—patient names, lab results, diagnoses, hospitalization records, and prescriptions—from more than 2,600 hospitals as part of a machine-learning project code-named Nightingale. Citing internal documents, the Journal reported that Google, in partnership with Ascension, a healthcare provider operating in more than 20 states, was planning to build a search tool for medical professionals that would employ machine-learning algorithms to process data and make suggestions about prescriptions, diagnoses, and even which doctors to assign to, or remove from, a patient’s team.
Neither affected patients nor Ascension doctors were made aware of the project, the Journal reported. And again, all parties assert that HIPAA, the federal law protecting patient data, allows for its existence. In response to requests for comment from The Atlantic, both Google and Ascension referenced their respective recent blog posts on the topic. “All of Google’s work with Ascension adheres to industry-wide regulations (including HIPAA) regarding patient data, and come with strict guidance on data privacy, security and usage,” Google’s post reads.
The Department of Health and Human Services is probing the legality of the deal. Under Google’s interpretation, the company is merely a “business associate” helping Ascension better render its services—and thus warrants a different level of scrutiny than an actual healthcare provider. But if HHS determines that Google’s handling of private information makes it something more akin to a healthcare provider itself (because of its access to sensitive information from multiple sources who aren’t prompted for consent), it may find Google and Ascension in violation of the law and refer the matter to the Department of Justice for potential criminal prosecution.
But whether or not the deal goes through, its very existence points to a larger limitation of health privacy laws, which were drafted long before tech giants started pouring billions into revolutionizing healthcare.
“It’s widely agreed that HIPAA is out of date, and there are efforts ongoing right now to update it for the 21st century,” said Kirsten Ostherr, co-founder and director of the Medical Futures Lab at Rice University. HIPAA was signed into law in 1996—years before Google knew if you were pregnant or could algorithmically estimate your risk of suicide. “Most of the kind of data [Google’s] trafficking in is not considered to be personally identifiable information, in the way that it was conceived back in the ’90s, when [much of] the tech world didn’t even exist.”
These days, digital behavior is already used to determine all kinds of real-world outcomes. Google and Facebook can infer your emotional state and predict your chance of depression based on your behavior. Children’s YouTube videos were used in scientific research about using AI to diagnose autism. Insurance companies use social media posts to determine premiums. For years, lending institutions have done the same to evaluate creditworthiness. It’s unsettling. It’s legal.
Google says it doesn’t combine its user data with Ascension patient data. But the fact remains that the data it already has on all of its users is tremendously revealing. Your IP address contains information about where you live, which in turn is associated with social determinants of health such as income, employment status, and race. Search terms like “nearest food pantry” or “nearest HIV test” can offer further clues about income level, sexual orientation and so on.
“HIPAA’s an exceptionally low bar,” said Travis Good, a medical doctor and privacy specialist. “None of that [search] data, whether you’re searching for STI clinics or Plan B or a dermatologist, none of that’s covered under HIPAA.”
A recent report from the Financial Times, done in collaboration with Carnegie Mellon, notes that Google, like Amazon and Microsoft, collects data entered into popular health and diagnosis sites. Google’s ad service, DoubleClick, receives prescription names from Drugs.com, for example, while WebMD’s symptom checker shares information with Facebook. The data is not anonymized, and the legal experts interviewed argued that the collection may violate EU privacy law.
Your very online existence—the sites you access, where you access them from, the ads you click on—gives Google the kind of holistic, robust, up-to-date view of your health that was largely unimaginable a decade ago. “The hype, or hope, is that as you gather more and more info and when you're able to combine [different data sets], you're able to come up with super tailored care pathways and eventually treatments,” Good said. “So it's not just you're 35 and have pancreatic cancer. It’s, you're 35, have pancreatic cancer, here's your medical history, your family history, and genetic markers for oncology and here’s the care pathway just for you.”
Creating tailored medical treatments for countless patients at scale requires an enormous amount of data that must be standardized, tested for accuracy and bias, stored securely, processed rapidly, and made comprehensible enough that doctors can understand it and confer with one another on a patient’s best care. This is Google’s specialty. It doesn’t need your consent; it already has your information.