Outsourced AI and deep learning in the healthcare industry – Is data privacy at risk?

Posted by Zenobia Hegde, August 23, 2017

As emerging technologies, artificial intelligence (AI) and deep learning have proven capable of delivering powerful business insights. This is especially true for the healthcare industry, says Jonathan Martin, EMEA operations director at Anomali, where free and open-source AI and machine learning frameworks such as Theano, Torch, CNTK, and TensorFlow can effectively support image-based diagnoses of conditions such as cancer and heart disease.

The integration of AI and deep learning into medical practices is, therefore, an inevitable and critical next step for the healthcare industry, although such an endeavour is not without its challenges.

One of the most pressing issues preventing organisations from taking full advantage of these technologies is a lack of technically trained staff. Many cybersecurity professionals could likely fill the demand for technical talent, but with an already limited supply of professionals in the cybersecurity industry itself, it is unlikely that supply will meet demand anytime soon.

To further complicate matters for the healthcare industry, implementing these technologies requires access to Personally Identifiable Information (PII), which is some of the most targeted data in cyber-attacks due to its sensitive and therefore lucrative nature.

The National Health Service (NHS) elected to circumvent issues of staffing and data privacy by partnering with DeepMind, a company acquired by Google (now part of Alphabet). This gave DeepMind access to 1.6 million medical records, which included information on blood tests, medical diagnostics, historical patient records, and even more sensitive data such as HIV diagnoses and prior drug use. Whether or not this was an appropriate risk has been the source of some controversy in the industry.

As we saw from the WannaCry attack on the NHS, a cyber-attack can have devastating effects on the industry. However, this should not stop organisations from sharing information and performing advanced analysis on it. AI and other technologies are essential to the progression of healthcare, and hiring technical talent is integral to fully harnessing the power these technologies hold in a secure way that eliminates the need to outsource. Organisations should also apply best-effort practices consistently in order to minimise risk.

One such best practice is redacting all Personally Identifiable Information. Any organisation outsourcing data should instead use pseudonyms, where the mapping between the unique identifier and the PII is held only by the trusted entity. Semi-sensitive information that would have little value to the machine learning model should also be removed. A patient's geographical location is a perfect example.

This data may be a powerful indicator of an illness, but the raw data could be used to reverse-engineer the PII of a given patient. Discarding such information is an effective trade-off between empowering the AI’s prediction power and protecting patient confidentiality.
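The approach described above can be sketched in a few lines. This is an illustrative toy (the class, field names, and in-memory mapping are assumptions, not any organisation's actual system): PII fields are replaced with opaque tokens whose mapping stays with the trusted entity, and semi-sensitive fields such as location are discarded outright before the data leaves the organisation.

```python
import secrets

class Pseudonymiser:
    """Replaces PII with opaque tokens; the token-to-PII map stays
    with the trusted entity and is never shared with the outsourcer."""

    def __init__(self):
        self._map = {}  # token -> original PII, held only internally

    def pseudonymise(self, record, pii_fields, drop_fields=()):
        out = dict(record)
        for field in pii_fields:
            token = secrets.token_hex(8)   # random, non-reversible token
            self._map[token] = out[field]
            out[field] = token
        for field in drop_fields:          # semi-sensitive data (e.g. location)
            out.pop(field, None)           # is discarded rather than tokenised
        return out

    def reidentify(self, token):
        return self._map[token]            # only the trusted entity can do this

# Example: share a blood-test result without the patient's name or postcode
p = Pseudonymiser()
record = {"name": "Jane Doe", "postcode": "SW1A 1AA", "hba1c": 41}
safe = p.pseudonymise(record, pii_fields=["name"], drop_fields=["postcode"])
```

The outsourced party sees only the token and the clinically useful value; re-identification requires the map that never left the trusted entity.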

These best-effort strategies can help mitigate most concerns; however, they are not a foolproof method of ensuring confidentiality. At the moment, it isn't possible to guarantee that AI can't reconstruct your PII. In one study by CMU, researchers found social security numbers were surprisingly predictable, and that an algorithm could often reconstruct a social security number from information such as birth date and place of birth.

In the future, organisations may look towards more advanced technology to secure the outsourcing of private data. Recent developments in federated learning could increase flexibility and allow groups to keep data on-premise. A related technology, homomorphic encryption, is also in development. With homomorphic encryption, computations occur on encrypted data without ever having to decrypt it, which significantly reduces the security concern.
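The additive property at the heart of homomorphic encryption can be demonstrated with a toy Paillier cryptosystem. This is a sketch for illustration only: the primes are tiny and the randomness is fixed, so it is in no way secure (a real deployment would use a vetted library with 2048-bit keys). It shows how a third party can sum encrypted values, such as patient counts, without ever seeing the plaintexts.

```python
from math import gcd

# Toy Paillier cryptosystem (tiny primes -- illustration only, NOT secure).
# Paillier is additively homomorphic: multiplying two ciphertexts yields a
# ciphertext of the *sum* of the plaintexts.

p, q = 17, 19
n = p * q                                      # public modulus
n2 = n * n
g = n + 1                                      # standard choice of generator
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)            # modular inverse (Python 3.8+)

def encrypt(m, r=7):                           # r must be random, coprime to n
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# An untrusted party multiplies ciphertexts to aggregate encrypted counts;
# only the key holder can decrypt the total.
c1, c2 = encrypt(12, r=7), encrypt(30, r=11)
total = decrypt((c1 * c2) % n2)                # 12 + 30 = 42
```

The aggregator never learns the individual values 12 and 30, only the holder of the private key (lam, mu) can recover the sum.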

At the moment, we are still years away from technology solving the problem of data privacy directly. However, the promise of the benefits from AI is too great for the healthcare industry to wait. In the near future, industries must strike a balance to protect citizens and prevent unnecessary vulnerabilities.

The author of this blog is Jonathan Martin, EMEA operations director at Anomali.

Comment on this article below or via Twitter: @IoTNow OR @jcIoTnow
