On the face of it, this sounds quite worrying:
An “intelligent” CCTV camera designed to predict when a person may be about to commit a crime is being tested in high streets and shopping centres. The £7,000 device, nicknamed “the Bug”, consists of a ring of eight cameras scanning in all directions. It uses software to detect whether anybody is walking or loitering in a way that marks them out from the crowd. A ninth camera then zooms in to follow them if it thinks they are behaving suspiciously.
The report continues,
The Bug, which has been tested for 18 months in Luton, sounds like a step towards the world portrayed in Minority Report, the 2002 film starring Tom Cruise, in which a police “department of pre-crime” arrests offenders on the basis of what they are about to do.

“The camera picks up on unusual movement, zooms in on someone and gathers evidence from a face and clothing, acting as a 24-hour operator without someone having to be there,” said Jason Butler, head of CCTV at Luton borough council. “We have kids with Asbos telling us they hate the thing because it follows them wherever they go.”
However, I think the Minority Report comparison may be a bit misleading; on the face of it, this sounds pretty much like what the Royal Academy of Engineering was recommending in its recent report, Dilemmas of Privacy and Surveillance (pdf):
The main aims of camera surveillance are to deter potential crimes, to detect and stop crimes when they occur and to identify and capture the perpetrators of crimes that have already occurred. In order for such aims to be satisfied, it is supposed necessary that ordinary law-abiding citizens will have to endure surveillance. If it were possible for surveillance systems to be developed in such a way that limited this collateral intrusion on privacy, the use of surveillance technology may be more acceptable.

One way of attaining this end would be to devise systems that only stepped into action when a suspected crime was taking place. Instead of having operatives scanning hours of mundane footage, feed from the cameras could be examined by an automated system, which alerted the operative when suspicious activities were detected. This would mean that ordinary activities would be effectively ignored, and certainly not scrutinised by an operative. (Section 7.2.5, Anonymous surveillance?)
That is, rather than automatically recording everyone’s every move, the system just kicks in — at least in theory — when it sees something it’s been programmed to regard as suspicious. This may well be, as Stuart Thompson, managing director of Viseum, who make the things, concedes in The Times, something completely innocent —
“It may mistake someone window-shopping for someone loitering, but on every occasion that a crime has been committed the system has always caught evidence,”
but in some ways this seems considerably less intrusive than filming everyone going down the street, which is what happens at the moment. You stop to do some window-shopping and the camera records the event in case you’re intending to put a brick through the window; well, that’s no worse than the present arrangement, under which you enter the camera’s field of vision and it records everything you do anyway, window-shopping or just walking down the street. Indeed, in some ways it’s less sinister in its implications, since it makes it far less easy to compile a complete account of everyone’s movements, which is what you could do with the existing camera footage.
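The trigger-based design the RAE describes can be sketched, very loosely, as follows. Everything here is invented for illustration — the scoring function, the threshold and the buffer length are all hypothetical stand-ins, not anything Viseum has published — but it shows the basic idea: mundane footage lives briefly in a rolling buffer and is then discarded, and only when something scores as “suspicious” is the operator alerted and the surrounding context handed over.

```python
from collections import deque

ALERT_THRESHOLD = 0.8   # hypothetical tuning parameter
BUFFER_SECONDS = 30     # rolling buffer; dropped unless an alert fires

def anomaly_score(frame):
    """Hypothetical stand-in for the behaviour-analysis step.

    A real system would run motion/trajectory analysis here; for this
    sketch we simply read a pre-computed score attached to the frame.
    """
    return frame["score"]

def monitor(frames, fps=1):
    """Yield only the footage surrounding suspicious activity.

    Ordinary frames pass through the rolling buffer and fall off the
    end unrecorded, so routine activity is never retained for long.
    """
    buffer = deque(maxlen=BUFFER_SECONDS * fps)
    for frame in frames:
        buffer.append(frame)
        if anomaly_score(frame) >= ALERT_THRESHOLD:
            # Alert the operator and hand over the buffered context.
            yield list(buffer)
            buffer.clear()

# Toy run: nine mundane frames, then one "suspicious" one.
frames = [{"id": i, "score": 0.1} for i in range(9)] + [{"id": 9, "score": 0.9}]
alerts = list(monitor(frames))
```

The point of the buffer is that when an alert does fire, the operator sees not just the triggering moment but the half-minute or so leading up to it — while the nine mundane frames would, on their own, never have reached a human at all.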
It also means there’s less scope for bored operatives — must be a pretty soul-destroying job most of the time, so there’s an obvious temptation to seek your entertainment where you can — to use their cameras inappropriately, and, as the RAE report notes, to allow their biases to intrude:
research shows that stereotypes seem to affect the way that CCTV operators monitor footage, meaning that surveillance systems have a more negative effect on those who tend to receive poorer treatment in other areas of life. Automated surveillance systems could instead be programmed on the basis of fact rather than prejudice. The algorithms used for identifying suspicious behaviour could be open-source and open to public review to avoid prejudices creeping into the system. Anonymous surveillance could therefore offer a much fairer and therefore more effective means of watching over public spaces.
The research to which they allude must, I think, include a study, The Unforgiving Eye: CCTV surveillance in public space by Dr Clive Norris and Gary Armstrong of the Centre for Criminology and Criminal Justice at Hull University (the link usually given for this is no more, but I’ve resurrected it from the Web Archive).
In this study,
Researchers “shadowed” camera operators in 3 major areas covered by a total of 148 cameras. They took details of “888 targeted surveillances” which resulted in just 12 arrests.
- 40% of people were targeted for “no obvious reason”, mainly “on the basis of belonging to a particular subcultural group”. “Black people were between one-and-a-half and two-and-a-half times more likely to be surveilled than one would expect from their presence in the population”.
- 30% of targeted surveillances on black people were protracted, lasting 9 minutes or more, compared with just 10% on white people.
- People were selected primarily on the basis of “the operators’ negative attitudes towards male youth in general and black male youth in particular. …if a youth was categorised as a “scrote” they were subject to prolonged and intensive surveillance.”
- Those deemed to be “out of time and out of place” with the commercial image of city centre streets were subjected to prolonged surveillance. “Thus drunks, beggars, the homeless, street traders were all subject to intense surveillance”.
- “Finally, anyone who directly challenged, by gesture or deed, the right of the cameras to monitor them was especially subject to targeting.”
(The term ‘scrote’, as I recall, originated in Hill Street Blues; somehow I can just hear the bored operative daydreaming about playing a more exciting role in a more interesting cop show than the one in which he’s actually involved). All this resulted, remember, in only 12 arrests, 7 related to fighting and 3 to theft, and apparently only necessitated calling the police 45 times. One suspects that the 7 arrests relating to fighting involved less than 7 separate fights, of course.
According to the study,
The low level of deployment was accounted for by 2 factors: that CCTV operators could not themselves intervene nor could they demand intervention by the police. This was compounded by the fact that suspicion rarely had a concrete, objective basis which made it difficult to justify to a third party such as a police officer why intervention was warranted.
I’m rather surprised, though, that the ‘Bug’ is as effective as the manufacturers claim. Hang on, let me rephrase that; I would be rather surprised, were it as effective, since I’m not at all surprised when people trying to sell this sort of kit, be it the manufacturers or the government, make unfounded claims about it. The RAE seem to think that
Algorithmic processing of images by computers for this purpose has so far been less than successful but research should be focussed on how to improve it. … Research into this technology should be encouraged and intensified, as should study into the way people behave in public spaces, in order to characterise and distinguish between suspicious and acceptable behaviour more accurately. Although doing this may be very difficult, exploring the possibility would be valuable. (p 42)
which sounds like a rather less advanced technology than that described by the manufacturers (not a surprise, I suppose), who say on their website that
The key message for this technology is when the potential wrong-doers are thinking of doing wrong they are never quite sure whether an operator or the intelligence is watching them, but either way they will be worried that if they do it they will still get caught.
That, I think, is what the chap from Luton actually meant when he talked about
kids with Asbos telling us they hate the thing because it follows them wherever they go.
Actually, it probably doesn’t do anything of the sort, but they’re worried it might.
So, possibly not as scary as at first it might sound — doubtless the kids in Luton with ASBOs will work that out soon enough, too, but that’s another story — and, in fact, potentially actually an improvement on the present state of affairs, overly-well-supplied with cameras as we are. It still doesn’t quite explain, though, why, according to The Times report,
According to the information commissioner, there are now 4.2m cameras in Britain. New research by J P Freeman, a security consultancy based in Connecticut, shows that Germany, the country in Europe with the next highest number, has just 1.6m. The whole of western Europe excluding the UK has 6.5m,
and yet all these cameras don’t seem to make the place any safer or less crime-ridden than other comparable countries.
My explanation for the phenomenon is that successive governments here have shown themselves to be absolute push-overs when confronted with salesmen offering hi-tech solutions to whatever’s upsetting people.
‘Something must be done,’ complains the public — who, in point of fact, generally aren’t so much worried about what’s happening in their neck of the woods as they are that it might get like what they’ve read in the papers it’s like elsewhere (and the papers never sensationalise, of course).
‘Right, then, we’ll do something,’ replies the government. In fact,
I will do such things,–
What they are, yet I know not: but they shall be
The terrors of the earth.
and is then, of course, easy meat for someone peddling a system — be it ID cards or CCTV cameras — that’ll do exactly that. Honest.