Google’s Project Nightingale Collects Health Data on Millions of Americans


Teaming up with Ascension, the second-largest health system in the United States, Google’s Project Nightingale aims to collect health data on millions of Americans without telling patients or doctors (via WSJ).

Project Nightingale

The data collected under Project Nightingale includes:

  • Lab results
  • Doctor diagnoses
  • Hospitalization records
  • Birth dates
  • Patient names

The Health Insurance Portability and Accountability Act of 1996 (HIPAA) says that hospitals can share data with business partners without telling patients as long as the information is used “only to help the covered entity carry out its health-care functions.”

Google has no health care functions of its own. Instead, it wants this health data to design software that uses machine learning to suggest changes to individual patients’ care. At least 150 Google employees currently have access to the data, which already covers tens of millions of patients.

Further Reading:

[If Your YouTube Account Isn’t ‘Commercially Viable’ Google Will Delete It]

[macOS Mail Stores Encrypted Emails in Plain Text]

One thought on “Google’s Project Nightingale Collects Health Data on Millions of Americans”

  • Andrew:

    This is why we need to change our current legislative and regulatory paradigm.

    It is less about so-called ‘big tech’ than it is specifically about the aptly named ‘surveillance capitalism’ industry, whose internal engine is AI and whose fuel is our data. Its output has metastasised beyond a simple product, the delivery of the consumer to the point of sale, into a larger cancer of unchecked expansion into as yet untapped speculation. Like the real estate speculators of old, who made land grabs on the educated bet that such property would have tremendous value, and therefore yield substantial profits, in a future seller’s market, surveillance capitalists are exploiting the power of AI to redefine and boost the potential energy of consumer data, repackaged and targeted at specific and vulnerable markets eager for a competitive advantage. They are betting that the demand for targeted and weaponised consumer data will be so great that vulnerable industries, like healthcare, transportation, and traditional service sectors, to name but a few, themselves under the stress of rapid and profound change in structure, payment paradigms, workforce, and business models, will pay dearly for such data, and with it cripple their competition still plodding at yesteryear’s pace.

    These surveillance capitalists (SCs) are not wrong, even if unprincipled. These moves are the natural consequence of unfettered and unregulated market dynamics. SCs are using AI to gauge AI’s own potential to exploit vulnerabilities, namely markets and practices that lie just beyond the limits of current regulation and law. As a result, they are moving at near light speed relative to an enervated and obsolete 20th-century model, in which industries emerge first and legislation and regulation follow, all in good time. Under this AI paradigm, these new industries, like AI-cogitated and channeled patient data, and their emerging markets will be up, running, matured, and morphing into even newer constructs by the time our current legislators realise that real harm has been done to millions and begin legislating for where the industry was, even as it moves beyond. This will continue to be skating to where the puck was, at its most pathetic, and with real consequence to human life and well-being.

    Rather, liberal democratic legislatures themselves need to be AI-powered, enabling lawmakers to identify and prioritise threats to public safety, consumer privacy, and individual liberties before AI itself can be bent by SCs to create those threats, and to begin, before such industries even form, closing off data-exploitation opportunities with new legislation, thereby restoring to the people the choice of how and under what circumstances our data are used, as it should be in a democracy.

    Philip K. Dick was prophetic. We face an imperative to employ a real pre-cog to prevent injury and harm before they can be committed, indeed to codify them as crimes and contain them before they can even be conceived and implemented. And that pre-cog is AI.
