The Guardian reported that Google’s AI-powered DeepMind Health is revolutionizing the way medical records are stored and tracked, while also providing medical professionals with access to personal patient data in real time – through blockchain technology.
DeepMind is using “Verifiable Data Audit” to help clinicians predict, diagnose, and prevent disease while serving as a data processor for hospitals, meaning that its role will be to provide secure data services while leaving hospitals in full control.
Medical data received by DeepMind’s system will be logged on a digital ledger that shares many (but not all) properties with a blockchain. Verifiable Data Audit will add entries to the ledger as patient procedures are administered, along with the reasons why. For example, data from a patient’s blood test can be checked against the UK National Health Service (NHS) national algorithm to detect possible acute kidney injury.
Similar to a blockchain, the ledger is append-only: data can be added but never erased, creating an immutable record that third parties can use to verify the validity of entries and confirm that patient data has not been tampered with.
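As an illustration of the append-only idea (a sketch, not DeepMind’s actual implementation), a tamper-evident ledger can be modeled as a hash chain: each entry’s hash covers the previous hash, so any retroactive change breaks every later link. The class and field names here are invented for the example.

```python
import hashlib

class AppendOnlyLedger:
    """Toy append-only ledger: each entry's hash covers the previous
    hash, so editing any past entry invalidates all later hashes."""

    def __init__(self):
        self.entries = []          # list of (record, hash) pairs
        self.last_hash = b"genesis"

    def append(self, record: str) -> str:
        # Hash the new record together with the previous hash.
        h = hashlib.sha256(self.last_hash + record.encode()).hexdigest()
        self.entries.append((record, h))
        self.last_hash = h.encode()
        return h

    def verify(self) -> bool:
        # Recompute the whole chain; any mismatch means tampering.
        prev = b"genesis"
        for record, h in self.entries:
            if hashlib.sha256(prev + record.encode()).hexdigest() != h:
                return False
            prev = h.encode()
        return True
```

A third party holding only the hashes can rerun `verify` to confirm that no entry was silently altered or removed, which is the property the article attributes to the ledger.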
How It Works
According to DeepMind, it would be wasteful to use a public blockchain to facilitate the processes involved with Verifiable Data Audit, due to the energy expended in mining.
The company states:
“To prevent abuse, most blockchains require participants to repeatedly carry out complex calculations, with huge associated costs…This isn’t necessary when it comes to the health service, because we already have trusted institutions like hospitals or national bodies who can be relied on to verify the integrity of ledgers, avoiding some of the wastefulness of blockchain.”
DeepMind argues that health data, unlike cryptocurrency, doesn’t need to be decentralized but rather “federated” amongst a small group of healthcare providers and data processors. The company notes that replacing the “chain” component of blockchain with a Merkle tree achieves a similar effect: every time an entry is added to the ledger, a value known as a “cryptographic hash” is generated. The hash covers the latest entry as well as all previous values in the ledger, making it effectively impossible to alter the record without detection. Modifying any past entry would change not only that entry’s hash but also the root hash of the whole Merkle tree.
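The Merkle-tree property described above can be sketched in a few lines; the helper names are hypothetical and the leaf-padding choice (duplicating the last node on odd levels) is just one common convention. The point is that a single root hash commits to every entry, so changing any leaf changes the root.

```python
import hashlib

def sha(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Compute the Merkle root of a list of byte-string entries.
    Pairs of hashes are repeatedly hashed together until one
    root remains; that root commits to every leaf below it."""
    level = [sha(leaf) for leaf in leaves]
    if not level:
        return sha(b"")
    while len(level) > 1:
        if len(level) % 2:             # duplicate last node on odd levels
            level.append(level[-1])
        level = [sha(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]
```

Auditors who store only the root hash can later detect any modification: recomputing the root over the claimed entries yields a different value if even one entry was altered.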
What It Will Do
Although the system is not yet complete, DeepMind believes its proposal can serve as the basis of a new model for data storage and access within the NHS, with implications reaching beyond the healthcare industry. According to the company, Verifiable Data Audit will massively improve the way medical records can be audited by authorized staff at partnered hospitals in real time.
The system will support continuous verification, enabling partners to easily query the ledger for specific types of data and their uses, and even to run automated queries that trigger alarms if anything unusual takes place. Eventually, partners of the project will be able to allow other parties, including individual patients or patient groups, to verify DeepMind’s data processing.
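The article does not specify what such an automated query would look like; as a purely illustrative sketch, one could scan ledger entries for accesses whose stated purpose falls outside an approved list. The entry format and purpose names here are invented.

```python
# Hypothetical automated audit check: flag ledger entries whose
# declared purpose is not on an approved list. Entry structure
# and purpose labels are assumptions for illustration only.
ALLOWED_PURPOSES = {"direct care", "kidney-injury alert"}

def audit_alarm(ledger_entries):
    """Return the entries that should raise an alarm."""
    return [e for e in ledger_entries
            if e["purpose"] not in ALLOWED_PURPOSES]

entries = [
    {"record": "blood test accessed", "purpose": "direct care"},
    {"record": "bulk export of records", "purpose": "marketing"},
]
flagged = audit_alarm(entries)   # only the bulk export is flagged
```

In a real deployment such checks would run continuously against the ledger, which is what makes the tamper-evident log useful for oversight rather than just record-keeping.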
Addressing Privacy Concerns
DeepMind’s option of allowing other trusted sources to verify patient data is key to achieving transparency and trust within the healthcare industry. The company has been working in partnership with the Royal Free Hospital in London to develop Streams, a health-monitoring app for patients that is transforming the way nurses and doctors prevent avoidable patient deterioration (and death) in NHS hospitals.
DeepMind has faced criticism from patient groups who claim that the data-sharing agreements between the NHS and DeepMind are overly broad. Documents obtained by New Scientist in April 2016 reveal that the agreement grants DeepMind access to a wide range of healthcare data on 1.6 million patients, including information about people who are HIV-positive as well as details of drug overdoses and abortions. The agreement also includes access to patient data from the previous five years.
The agreement clearly states that Google cannot use the data in any other part of its business and is obligated to delete its copy in September 2017, when the agreement with the NHS expires. The data itself is stored in the UK by a third party that is contracted by Google and not located in DeepMind’s offices.
In defense of the broad data collection, Google claims that a patient’s full health history (five years or more) is necessary to identify medical conditions, such as kidney conditions, that have no separate dataset of their own. According to DeepMind, casting a wide net supports its goal of helping doctors make predictions based on data that is too broad in scope for an individual to assess. Although DeepMind does not plan to automate clinical decisions, such as what treatments to provide patients, its data collection will allow systems to monitor for outbreaks of infectious disease and may even be used to predict when a person might develop bipolar disorder.
Part of DeepMind’s success (or failure) depends on the security of its systems and data, which must prevent abuse or other malicious use of patient information. DeepMind co-founder Mustafa Suleyman states:
“It’s really difficult for people to know where data has moved, when, and under which authorised policy. Introducing a light of transparency under this process I think will be very useful to data controllers, so they can verify where their processes have used or moved accessed data.
“That’s going to add technical proof to the governance transparency that’s already in place. The point is to turn that regulation into a technical proof.”
Suleyman believes that the audit system could eventually be expanded so that patients have direct oversight of how and where their data is being used and stored. For this to occur, the present security and access concerns must first be resolved. Future versions of the system could give patients unprecedented levels of privacy and transparency, protecting their information while helping to save lives through preventive identification of health issues.