Leading cyberlaw academic calls for review of CCTV & AFR implementation
August 2018 by Marc Jacob
As the use of Automated Facial Recognition
(AFR) continues to stir up public debate, a leading cyberlaw academic has
called for an independent review of how the UK implements CCTV analytics
technology - and believes it’s the only way forward if such technology is to
be used legitimately to tackle crime and terrorism without alienating the
public. Andrew Charlesworth, Professor of Law, Innovation & Society at the
University of Bristol, points out that the law is lagging well behind
developments in surveillance technologies such as AFR, which is already
deployed in some areas of the UK and is currently the subject of two court
cases.
In a white paper commissioned by Cloudview, Charlesworth highlights the
recently published Home Office Biometrics Strategy (2018) as an example of
the Government’s reluctance to grasp the regulatory nettle and provide a
detailed strategy. The document has been criticised by regulators and
campaigning groups alike and, he says, has created a policy void in which
campaigning groups are driving the debate in a primarily negative direction.
Charlesworth believes technology is both the problem and the solution.
Citing research which highlights that market forces, social norms and
technology architecture can all be used alongside the law as part of an
effective regulatory strategy, he says that ’architecture’ offers a
crucial and effective way to regulate CCTV analytics. In other words,
technology companies could hard-wire the desired regulatory outcome into the
CCTV systems themselves.
"The issue with the use of facial recognition technology is the systems
which underlie it, such as police databases," Charlesworth explains. "We
need to design the actual technology so that it controls the flow of data
and how it is stored and deleted. This should be reliable, transparent and
in full compliance with data protection legislation, because images are just
another form of personal data. In my opinion, this is crucial if analytic
technologies are to be accepted by the public as legitimate security tools
which will help to keep them safe without breaching their human rights."
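Charlesworth's idea of building compliance into the technology itself can be pictured, in miniature, as a storage layer that enforces a retention limit on every access rather than relying on operators to remember to delete footage. The Python sketch below is purely illustrative: the `ImageStore` class and its 31-day retention period are assumptions for the example, not anything specified in the white paper or in data protection legislation.

```python
from datetime import datetime, timedelta, timezone

# Assumed retention limit, for illustration only.
RETENTION_PERIOD = timedelta(days=31)


class ImageStore:
    """Minimal in-memory store that enforces the retention rule on every read.

    Hypothetical sketch of 'hard-wiring' a regulatory outcome into the
    system: expired images are deleted automatically, so compliance does
    not depend on an operator running a manual clean-up.
    """

    def __init__(self):
        self._images = {}  # image_id -> (captured_at, data)

    def add(self, image_id, data, captured_at=None):
        captured_at = captured_at or datetime.now(timezone.utc)
        self._images[image_id] = (captured_at, data)

    def purge_expired(self, now=None):
        """Delete every image older than the retention period."""
        now = now or datetime.now(timezone.utc)
        expired = [i for i, (t, _) in self._images.items()
                   if now - t > RETENTION_PERIOD]
        for i in expired:
            del self._images[i]
        return expired

    def get(self, image_id):
        # Retention is enforced before any read, not as a separate job.
        self.purge_expired()
        return self._images.get(image_id)
```

In this sketch the deletion rule cannot be bypassed by normal use of the store, which is the sense in which the desired regulatory outcome is built into the architecture rather than bolted on as policy.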
"Right now we are seeing case after case of biometric technology being used
as the proverbial sledgehammer to crack a nut. The Government is too timid
to say that biometrics is a positive thing if used in the right way, and so
the debate is being brought to the public’s attention by campaign groups who
for obvious reasons aren’t looking to present the full picture," adds James
Wickes, CEO and co-founder of cloud-based visual surveillance specialists
Cloudview.
"The public are rightly reluctant to hand over their digital data, but the
solution is not to ban the technology but to ensure that it’s used properly.
This means limiting use to where it’s genuinely needed, and then having
effective processes such as privacy impact assessments which are designed
into the technology and properly tested, so that our democratic freedoms and
human rights aren’t abused."
The white paper from Charlesworth makes a series of recommendations for a
more constructive approach. These include:
* Ensuring that existing general legal regulation is effectively
overseen and enforced, with personal image data treated in the same way as
other personal data
* Taking a holistic view of CCTV and CCTV analytics use to ensure that
proposed regulation is flexible enough to encourage innovation whilst being
capable of practical implementation and, where necessary, enforcement
* Considering positive uses for the technology to find a balance
between benefits and risks, whilst encouraging users to self-regulate
through appropriate technology architectures.