Google's AI subsidiary was given 'legally inappropriate' access to United Kingdom medical data

GOOGLE HAS been accused of having access to sensitive NHS data on an "inappropriate legal basis", as moral panic over the recent WannaCry ransomware attack on the NHS mounts and spills over into entirely separate issues.

Although the work was paused - later to be restarted in November 2016 under a new agreement and a commitment from DeepMind to be more open with the public - the initial agreement is under investigation by the UK's data watchdog, the Information Commissioner's Office (ICO).

In a statement, the Royal Free maintained that the Streams app is now in use at the hospital, having been devised in collaboration with its clinicians.

Now, it appears our National Data Guardian agrees.

Dame Fiona Caldicott says in the letter that she informed the Royal Free and DeepMind in December that she "did not believe that when the patient data was shared with Google DeepMind, implied consent for direct care was an appropriate legal basis".

The legal basis for the transfer of 1.6 million patient records from the UK's National Health Service to Google DeepMind has been described as "inappropriate" by a leading data protection figure in the NHS.

"It is my view and that of my panel that the objective for the transfer of 1.6 million identifiable patient records to Google DeepMind was for the testing of the Streams application, and not for the provision of direct care to patients," she wrote.

"My considered opinion therefore remains that it would not have been within the reasonable expectations of patients that their records would have been shared for this purpose". In the specific case of the app's testing phase, the data guardian concluded in the leaked letter that development of the technology could not be counted as "direct care" that would allow for patient data to be shared with the company to develop the app. It could also issue an enforcement notice requiring DeepMind to delete or stop using the data.

Nicola Perrin, head of Understanding Patient Data, said such technologies offer the potential to provide better patient care, "but there must be appropriate governance so that everyone can have confidence that patient data is being used responsibly".

The stated purpose of the transfer was to check that the app was presenting patient information accurately and safely before being deployed in a live patient setting.

Speaking to Sky News in the wake of the NHS cyberattack, the clinical lead at Google DeepMind, Dr Dominic King, said patient data was safe with the firm.

"Nurses and doctors have told us that Streams is already speeding up urgent care at the Royal Free and saving hours every day".

In mid-2016, the Royal Free told the NHS data adviser that the Google app "is not now in use", adding that "only small scale testing of the pre-production prototype version of Streams has taken place to date". A spokesperson for the trust defended that testing: "Safety testing is essential across the NHS, and no hospital would turn a new service live without testing it first".

The spokesperson added that the trust "take seriously the conclusions of the NDG, and are pleased that they have asked the Department of Health to look closely at the regulatory framework and guidance provided to organisations taking forward this type of innovation, which is essential to the future of the NHS", conceding: "We should also have done more to engage with patients and the public at that time".

Dr Julia Powles, a researcher at Cornell Tech in New York and an expert in technology law, said there were "fundamental errors" at the start of the data-sharing project and warned that these errors could put other data-sharing deals in "real peril".
