Data protection versus health protection

On 25 May 2018 the EU General Data Protection Regulation (GDPR) came into force. The GDPR is designed to protect EU citizens from organisations using their data irresponsibly. An organisation that fails to follow the new rules can be fined up to €20 million.

There has always been a need to balance data sharing and data protection. It is recognised that data sharing is vital for patient safety, quality and integrated care. Much medical research, including epidemiological research, has relied on the use of patient data without patients' explicit consent. Indeed, Britain has led the world in epidemiological research. From John Snow and Florence Nightingale (yes, the founder of modern nursing was also a pioneering medical statistician) in the nineteenth century to Austin Bradford Hill, Richard Doll and Richard Peto in the twentieth century, UK researchers have used patient data to discover links that have led to major advances in medicine and health care.

Independent researchers also play a role in evaluating health services and holding providers to account. Just as industry relies on statistical analysis to manage the quality of its production, medical and health services need close monitoring to ensure that they are up to scratch.

Two recent incidents suggest that there has been a lack of such independent monitoring. In one, cervical screening tests from Ireland were being examined in a laboratory in America. A review discovered, some years later, that the quality of smear reading in the American lab was not what would have been expected from an Irish lab. As a consequence, several women (currently estimated at 208) developed cervical cancer that would probably have been prevented if their cytology test had been read properly. This situation only came to light after one such patient won a substantial out-of-court settlement from the laboratory concerned and refused to accept a gagging clause. If screening data were routinely linked to data on cervical cancer and made available to researchers, this problem could probably have been spotted earlier, before so many women had developed cancer.

The second example is the failure of the breast cancer screening programme in parts of England to comply with national policy. As a result, some 450,000 women aged 68-70 were not invited for breast screening, and it is estimated that between 135 and 270 women have already died prematurely. Approximately 50,000 fewer invitations were being sent out each year than should have been. Out of some 2.5 million invitations sent out each year, a shortfall of 50,000 (about 2%) might be hard to spot. But women aged 68-70 receive only around a tenth of those invitations, so within that age group roughly 20% fewer invitations were being sent than there should have been. And if screening data were routinely linked to breast cancer data and examined each year, the problem might have been spotted several years ago.
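
To make the arithmetic concrete, here is a minimal sketch. The figure of 250,000 annual invitations to women aged 68-70 is an illustrative assumption consistent with the 20% shortfall quoted above, not an official statistic:

```python
# Illustrative arithmetic only: the 250,000 invitations for the 68-70 age group
# is an assumption implied by the 20% shortfall quoted above, not an official figure.
total_invitations_per_year = 2_500_000   # all breast screening invitations sent each year
shortfall_per_year = 50_000              # invitations that should have been sent but were not
invitations_68_70_per_year = 250_000     # assumed annual invitations to women aged 68-70

overall_shortfall = shortfall_per_year / total_invitations_per_year    # 0.02, i.e. 2%
age_group_shortfall = shortfall_per_year / invitations_68_70_per_year  # 0.20, i.e. 20%

print(f"Shortfall across all invitations: {overall_shortfall:.0%}")    # 2% - easy to overlook
print(f"Shortfall among women aged 68-70: {age_group_shortfall:.0%}")  # 20% - obvious if monitored
```

A shortfall that is invisible in the aggregate becomes glaring once the data are broken down by age group, which is exactly the kind of routine check that linked, accessible data would allow.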

Despite the acknowledgement that the balance is off and that some people have become overly concerned about protecting confidentiality, there are no fines for failing to share data, and it is not a criminal offence to flatly refuse all requests for access to identifiable data. Most organisations are now terrified of breaking the law by inappropriately sharing potentially identifiable data. However, few are concerned about the health consequences of not sharing such data.

I don’t know of any example where anyone has really suffered from a breach of data confidentiality in medical research, but a lack of balance between data sharing and data protection can cost lives.
