Public Trust and Data Sharing Practices: When not all data subjects are made equal
On 3 October, the campaign organisations the3million and Open Rights Group lost their High Court challenge (though they are planning an appeal) to the Immigration Exemption clause in the Data Protection Act, which came into force last year.
The controversial immigration exemption is a section of the UK’s newly introduced Data Protection Act 2018, a national law which adapts the EU’s GDPR and replaces the Data Protection Act 1998. The exemption is being challenged on the grounds that it breaches fundamental rights, as it denies certain data subjects the right to transparency and access to their personal records. According to the Open Rights Group,
The exemption has never existed in UK law before its introduction last year. It allows data controllers, including public bodies such as the Home Office or a school or hospital and private bodies such as employers or private landlords, to restrict access to personal data if releasing the information would “prejudice effective immigration control.”
The legal challenge has revealed that the UK government has invoked this exemption in response to 60% of its immigration-related data requests since the beginning of 2019. It was further confirmed that the individuals affected are not informed when the immigration exemption is applied, which further obstructs their ability to appeal. It is also worth noting that the Home Office has a high error rate, so appeals – which applicants usually win, but not before going through a lengthy, costly and highly traumatic process – are commonplace.
The developments described above are particularly interesting given that the Data Protection Act 2018 is the cornerstone of the UK’s strategy for AI in the NHS, a strategy meant to provide the necessary assurances to the British public that the state deserves its trust in the handling of its most sensitive personal data.
Public Trust
The issue of public trust appears to be one of the biggest concerns in policy circles. As the House of Lords report on AI in the UK (2018) stressed, “Maintaining public trust over the safe and secure use of their data is paramount to the successful widespread deployment of AI and there is no better exemplar of this than personal health data” (93).
In the same report, Dame Fiona Caldicott, the National Data Guardian, warned the select committee:
What we have not done is take the public with us in these discussions, and we really need their views. What is the value? Are they happy for their data to be used when it is anonymised for the purposes we have described? We need to have the public with us on it, otherwise they will be upset that they do not know what is happening to their data and be unwilling to share it with the people to whom they turn for care. That is the last thing we want to happen in our health service (91).
Ill-conceived and costly scandals such as care.data (Carter et al 2015, Sterckx et al 2016) and the DeepMind “fiasco” (Powles and Hodson 2017), as the Lords report characterises it, demonstrate that things can indeed go wrong.
As such, public trust is an issue that the government and institutions such as the NHS stress they are taking very seriously. Concerns over privacy or safety are assuaged by a commitment to ethical values and principles inscribed in documents such as the Code of conduct for data-driven health and care technology. The code has been drawn up with the help of industry, academics and patient groups, and aims to “encourage” (its recommendations are not legally binding) technology companies to meet a gold-standard set of principles that protect patient data to the highest standards in order to capitalise – ethically and responsibly – on the opportunities of AI.
The 10 key principles that the code outlines are underpinned by the Data Protection Act 2018, and as the introduction asserts:
People need to know that their data is being used for their own good and that their privacy and rights are safeguarded. They need to understand how and when data about them is shared, so that they can feel reassured that their data is being used for public good, fairly and equitably.
What, however, are we to make of such proclamations in light of the Immigration Exemption outlined above, which indicates that not all data subjects are equal and that perhaps not all ‘people’ are worthy of the same consideration?
The exemption adds to a long and growing list of scandals and controversial cases that have coloured the experience of UK immigrants and minority communities with a deep distrust of the British state and its handling of citizens’ personal data. From the long and still unfolding Windrush scandal, to revelations of a secret database run by counter-terror police across the UK – accessed by all police forces and the Home Office – containing the personal information of individuals referred to the government’s anti-radicalisation programme Prevent without their knowledge, to reports that the Department for Education is sharing children’s data with the Home Office and immigration enforcement services, to the ongoing struggle of healthcare practitioners and advocacy groups to stop the sharing of patient data with the Home Office, these controversial policies demonstrate that the hostile environment – the systematic and coordinated policy strategy by the UK government to make the lives of Others anywhere from difficult to impossible – is not only alive and kicking. It is also enabled by the ambitious data sharing strategy of the UK government and its piecemeal and intimidating implementation.
In a recent report, the National Audit Office recognised the role that the government’s data strategy played in events such as the Windrush scandal, adding that “despite years of effort and many well-documented failures, government has lacked clear and sustained strategic leadership on data”. This report was followed by an open letter published by civil society groups, including the Open Data Institute (ODI), which warned of serious problems with the government’s collection, use and sharing of data. The letter urged the government to adopt a “transformative” data strategy focused on “earning the public’s trust”. As it states:
Debate and discussion about the appropriate extent of using citizens’ data within government needs to be had in public, with the public. Great public benefit can come from more joined-up use of data in government and between government and other sectors. But this will only be possible, sustainable, secure and ethical with appropriate safeguards, transparency, mitigation of risks and public support.
Distrust
Distrust of state intervention and of the sharing of personal information among minority, migrant and vulnerable groups is not a new phenomenon (Hoopman et al 2007, Lane and Tribe 2010, Jutlla and Raghavan 2017). Yet this does not make it any less pressing. This distrust cannot be remedied by rather condescending statements about “people need[ing] to know” and “need[ing] to understand” that what is done is for their own good so that they can feel reassured, as the quote from the code above puts it, or by wishful proclamations of ‘not leaving anyone behind’, as the NHS promises.
Such responses patronise these communities and the public at large, as they wilfully ignore the fact that these people and communities distrust precisely because they know the real and painful impact these data sharing practices can have on their lives, rather than in spite of it. Unless the voices of these communities are sought out and listened to, and these groups become not just the recipients of policy but central to its formation, public promises and proclamations of an ethical approach that will guarantee public trust risk, at the very least, ringing hollow.
—— UPDATE ——
On 19 March 2020, with the Covid-19 pandemic taking hold in the UK, the Home Office published the independent Windrush Lessons Learned Review, led by Wendy Williams. The report was a scathing indictment of the Home Office’s handling of the Windrush generation, with Williams writing:
While I am unable to make a definitive finding of institutional racism within the department, I have serious concerns that these failings demonstrate an institutional ignorance and thoughtlessness towards the issue of race and the history of the Windrush generation within the department, which are consistent with some elements of the definition of institutional racism. (7)
Unfortunately, the inquiry was published just as Covid-19 was spreading across the UK (the official lockdown was announced on 23 March), and hence it received little attention. However, with digital tracking being touted as a key strategy to combat the pandemic, on the one hand, and with reports emerging that the virus appears to disproportionately affect BAME communities, on the other, the synergies between public (dis)trust and data sharing practices, especially where minority, migrant and vulnerable groups are concerned, warrant close, careful and critical consideration.[1]
[1] https://www.independent.co.uk/news/uk/home-news/coronavirus-undocumented-migrants-deaths-cases-nhs-matt-hancock-a9470581.html