Inquiry into Surveillance and Data Collection
Foundation for Information Policy Research (FIPR) Response to the House of Lords Constitution Committee
The Foundation for Information Policy Research is an independent body that studies the interaction between information technology and society. Its goals are to identify technical developments with significant social impact, to commission and undertake research into public policy alternatives, and to promote public understanding and dialogue between technologists and policy-makers in the UK and Europe. We wrote the 2006 report for the Information Commissioner, ‘Children’s Databases – Safety and Privacy’, which concluded that the proposed sharing of information on children was unsafe and in several respects unlawful. We have also been involved for many years in medical privacy, surveillance, forensics, and the economics of security. We would like to make the following points.

1. The hard question is this: is there anything that the Government may not do to ‘catch Osama’ or ‘save Maddie’?

2. There are actually some answers: the Government may not torture people, or go for more than five years without an election.

3. Unfortunately, the ban on torture is not backed by a prohibition on the use of illegally obtained evidence. Such a law would also help deter unlawful surveillance. But the Government’s response to illegally obtained evidence has been to legalise its use retrospectively (e.g. wrongly retained DNA samples).

4. Also, the ‘modernisation’ of elections has seen a huge rise in corruption – not because of ‘online’ elections but because postal ballots were made easy, and post (whether paper or electronic) makes surveillance by vote-buyers easier.

5. So is technology changing anything? Well, the huge reductions in the costs of data acquisition, storage and processing would on their own cause more personal data to be collected and used. Are we headed for the ‘age of perfect memory’, in which forgetting is difficult or impossible?

6. There are universal issues and UK issues. The interaction between technology and privacy is of the first kind.
Human intuitions are not a perfect guide to maintaining privacy, and technology continually magnifies the risks. Internet postings can’t be unposted; the social cues that constrain face-to-face contact are absent; and search engines let us find information that would previously have remained buried in local files. As more embarrassing material about more people comes online, we need more tolerance, better regulation, or both.

7. Commercial data use also raises universal issues. There are strong incentives for firms to collect more data so they can price discriminate, target communications
and assess markets. As the costs of collecting this data fall, the incentives for commercial surveillance become ever stronger.

8. Governments have been much slower to innovate, and the quality of policy debate is poor; few senior politicians are IT-literate, and traditional NGOs don’t understand IT policy. So change has often been driven by scaremongering. From about 1995 to 2001, empire-builders’ favourite mantra was child protection; since 9/11, terrorism and hate speech have been added.

9. But the underlying issues are not new. The Home Office slogan ‘If you’ve nothing to hide, you’ve nothing to fear’ does not justify blanket Internet surveillance any more than it justifies warrantless wiretapping or room bugs. Most people have things they wish to keep private at some time in their lives. So long as there are prejudices, this will continue – online or offline.

10. A serious case of inept use of surveillance data was Operation Ore, in which over 4,000 men were raided on suspicion of child pornography, and it turned out that half of them were simply victims of credit card fraud. Thanks to the scaremongering, prosecutions continued even after problems started to emerge; many families were damaged, and over 30 men killed themselves, some of them innocent. Innocent men were also driven to plead guilty after evidence critical to their defence was withheld. This disaster occurred in great measure because the state sidestepped a number of the controls we have evolved since the 13th century. The lesson is that technology should not lead us to abandon constitutional controls, but to reassert and strengthen them.

11. A further surveillance issue is equality of arms, in both criminal and civil cases. The police have little difficulty getting CCTV files or ANPR data to prove your guilt; you have much greater difficulty getting them to prove your innocence. A bank can get CCTV images to prove that you made a disputed card transaction; you cannot get images to prove you didn’t.

12. So how can we go about refreshing constitutional ideas for the information age?

13. At the level of philosophy, human rights are most commonly founded on the principle of human dignity. Pervasive surveillance will undermine personal dignity, and ultimately support for human rights.

14. There are other theories. A communitarian view is that many public goods depend on social capital – the networks of mutual obligation, reciprocity and trust that exist in society. Diminished social capital increases crime, damages child development, and particularly harms the poor, who have less human or financial capital as a backstop. Social capital is generally built by local action and diminished by central action: involving parents in running a school is vastly preferable to using a government computer as their surrogate.

15. A third view is that privacy is an internalised version of territoriality and serves to order society. This comes from the substantial research literature on the economics of privacy [1], in which central problems are why privacy remains more of a luxury good than a fundamental right, and why people do not complain more about privacy erosion. We tend to the view that they are starting to, as awareness spreads from the policy and technical elite to the masses.
[1] See for example http://www.heinz.cmu.edu/~acquisti/economics-privacy.htm
16. Both commercial and government surveillance impose significant costs on citizens. The former leads to social costs associated with ex-directory numbers, call screening and so on; the latter erodes trust in public-sector professionals as well as imposing direct compliance costs.

17. Yet in the UK, all this is ignored. The NHS is trying to centralise all medical records; other governments merely encourage hospitals and GPs to exchange data when needed. DfES plans to share data on children between the NHS, police, school and social work systems. We have a huge ID database project. These ventures appear to be driven less by any clear vision of how to improve services than by a desire to appear ‘modern’ (and in the case of ID, ‘tough’). The current Whitehall status game seems to be ‘my database is bigger than your database’.

18. The FIPR report on children’s databases found that the proposal to share most public-sector data on children was contrary to European human-rights law and data-protection law. The Data Protection Act does not implement European law properly in this respect. For the analysis and argument we refer the Committee to our report, especially chapter seven [2]. In summary, sensitive data can only be shared with consent, or under a law specific enough for its effects to be predictable by data subjects. Many of the laws relied on are so broad that their effect is not predictable. The consent provisions are also defective. For example, the Gillick and Axon cases established that when children aged 12–16 are asked for consent, their parents should normally be involved; the DfES has rewritten this into ‘Frazer competence’ (not even spelling Lord Fraser’s name correctly) and a doctrine that children should consent on their own – typically in schools, where they are expected to obey adults.

19. In addition to the proposed systems, some existing initiatives – such as the recent Ofsted study of 10-year-olds – appear to be clearly unlawful.

20. These problems are not limited to children’s databases. For example, the Wilkinson case has shown that people who refuse consent to data sharing may be denied NHS treatment. It can be strongly argued that such consent is coerced, and the new NHS databases are therefore unlawful; this is another point of conflict between UK Government policy and European law [3].

21. Our existing constitutional rights are being violated, but there is no enforcement that works. There is little public action, as the Information Commissioner’s Office was designed to be weak: he is highly resource-constrained; he does not support a rights-based approach; and he will only enforce UK statute law, not European law. There is little private action, as UK rules on costs mean that individuals or NGOs who sue risk bankruptcy. This is in sad contrast to the vigour of the USA.

22. In conclusion, our government has rushed to embrace surveillance without really working out what the technology is good for. In the process it has found ways to sidestep or ignore constitutional restrictions, when these restrictions actually need to be refreshed – and tolerance worldwide may increase slowly in the medium term, for example with a broader digital statute of limitations.

23. In the short term, Britain’s basic provisions are fairly sound: the problem is enforcement. In the absence of any realistic prospect of public enforcement, we urge the Committee to consider options for private action. A change to US-style rules on costs might just be the innovation that reinvigorates the British constitution. Then NGOs like FIPR would be better able to take action against the most egregious violations of privacy and other rights.

[2] ‘Children’s Databases – Safety and Privacy’, Foundation for Information Policy Research, Nov 2006; at www.ico.gov.uk and www.fipr.org
[3] See Professor Douwe Korff’s oral testimony to the Health Committee Inquiry into the Electronic Patient Record

Professor Ross Anderson
Dr Ian Brown
Dr Richard Clayton
Professor Douwe Korff
Professor Martyn Thomas

June 8th 2007