Data companies are offering to mine troves of personal and public information to help local officials in the UK identify people who are struggling in the aftermath of the coronavirus crisis.
Groups such as Experian and CACI, which assemble detailed pictures of an individual’s life by aggregating vast amounts of online and real-life information, and data analysis companies such as Xantura say they can help cash-strapped local governments target their resources towards those most in need.
But privacy campaigners worry that the practice of scoring and mapping individuals with tools that are opaque and potentially intrusive will continue after the pandemic passes.
“Nothing comes for free. Local authorities are lost and underprepared and [these services] entrench the importance of data sets in government,” said Silkie Carlo, director of campaign group Big Brother Watch. “I think the crisis that we are in will catalyse surveillance and data gathering beyond belief at the central government level and the local government level,” she added.
Xantura has teamed up with CIPFA, the accountancy body for the public sector, to deploy a £15,000 tool that uses local authority data to predict a future need for financial support and social care.
The aim is to move beyond assigning risk based solely on an individual’s health, and also to include those who might be at greater risk of domestic violence, marital breakdown and financial difficulties, said Xantura’s chief executive Wajid Shafiq.
As part of the project, local authorities are transferring troves of their data to Xantura, including social care, benefits and revenue data.
Xantura’s software runs the data against a set of risk factors and demographic data, as well as the NHS’s “shielded list” of individuals believed to be most at risk from Covid-19 complications, scoring households according to their risk profile. Mr Shafiq said the company was mindful of privacy concerns: only social care teams with the appropriate authority could drill down to identify specific individuals deemed “at risk” in order to provide services to them, he said.
Coronavirus has been a “significant accelerant” to linking data sets in order to overcome silos, he added.
Pye Nyunt, head of the insight and innovation team at the London borough of Barking & Dagenham, said Xantura’s system had helped it “to identify over 1,000 households who have significant debt . . . of over £1,000, are over 65 and live alone”. He added that in the first week of using the tool, more than 50 households had been referred to cross-council services for further support.
Xantura and CIPFA said that the system could be extended to include coronavirus track-and-trace data to highlight hotspots that required additional safety measures.
But Lina Dencik, of Cardiff University’s Data Justice Lab, which last year published a report examining local authorities’ use of data analytics, urged caution.
“With coronavirus there seems to be a trend towards accepting data-driven solutions. It seems to override the sort of scepticism that we have had in place around some of these technologies before,” she said. “You are capturing only very specific things that are quantifiable. A major issue is that our lives are so much more complex than that.”
Britain’s data watchdog is investigating the data economy and in 2019 said that while data-driven technologies created “enormous opportunities” they also presented “some of the biggest risks” related to the use of personal data.
Besides privacy concerns, there are worries about whether a dependence on technology will permanently change how citizens are viewed. “There are new ways of working that are being established,” said Ms Carlo. “It will entrench the lens through which to see residents and organise them and once you’re in the pattern of doing that, then that tends to stick.”
In recent years local authorities have increasingly deployed machine-learning tools — at times combined with demographic data acquired from private companies. An investigation by The Guardian in 2019 found that one in three councils used computer algorithms to guide decision making, from distributing benefits to identifying children at risk of parental abuse.
Such tools did not replace existing functions, explained Steve Liddicott, of the British Association of Social Workers, who works as a senior manager in social care and has previously teamed up with Xantura. “The systems are doing something that’s very difficult for individuals to do. There is not an equivalent process that is normally done by a person.”
In April, CACI began offering access to its databases to public sector organisations for a three-month period, including postcode level demographic data.
Its health and wellbeing data set, for instance, classifies Britain’s population into four groups according to overall health status and 25 further segments, which include designations such as “perky pensioners”, “struggling smokers” and “perilous futures”. The latter refers to “young” neighbourhoods with “many children”, primarily living in social rented housing.
Experian, meanwhile, has rolled out a demographic segmentation tool, dubbed “Experian Safeguard”, which it has offered for free to local councils, NHS trusts, fire and police services as well as charities. Such tools are primarily used by private companies to target consumers to market products.
Experian’s flagship Mosaic postcode demographics tool — which arranges the UK population into groups according to factors such as lifestyle and debt levels — has been deployed at a number of local authorities, including Leeds city and Stockton-on-Tees borough councils, according to data gathered by Tussell, a data provider which tracks UK government contracts and expenditure.
The Safeguard data are geographically aggregated and “contain no identifying information”, Experian said. In one example, it was used to direct extra resources to food banks in areas with high numbers of vulnerable households.
“There is a real question about the extent to which consumer data should be allowed to be mixed with what we might think of as a kind of citizen data, or data that’s held by councils and local authorities,” said Ms Dencik, who fears that dependence on such data presents a “reductionist view” of the population.
This, she said, could exacerbate bias within the system, particularly when it is unclear how models created to designate individuals as risky weigh different factors such as income or debt.
“It has been a serious concern for us for some time. It’s profiling and stereotyping on an enormous scale,” said Ms Carlo of Big Brother Watch. “For most of us, it happens outside our control.”
Mr Shafiq acknowledged the concerns about privacy and potential bias. “We don’t use protected characteristics in the modelling,” he said, referring to attributes such as religion, ethnicity or sexual orientation.
“It’s not up to us to decide what to do, it is up to the councils and I have to say the public sector is very diligent about, and sensitive to, the ethical use of data,” he added.