Extending the state’s ability to track you via the millions of CCTV cameras that watch our streets every day would generally be perceived as an erosion of privacy.

In fact, the opposite could be true, says one of the men building the computer systems to automate CCTV monitoring in cities around the world.

Cyrille Bataller is managing director of Accenture's Emerging Technology group, which recently tested a video analytics system on CCTV around the city-state of Singapore as part of the government's Safe City programme.

The system automatically monitored video feeds, creating anonymised data about the movements of people and traffic across the city, monitoring the state of the streets, and triggering alerts for public disorder, flood risks and other incidents.

But while some may have concerns about authorities mining data about citizens' daily lives, Bataller argues that removing humans from routine monitoring of CCTV in this way reduces the potential for unwarranted surveillance.

“In a video analytics solution a computer doesn’t have any bias,” he said, saying the argument could be extended to using facial recognition technologies to pick out select individuals from a CCTV feed.

“If there’s an operator watching a camera to look for individuals of interest, he’s got a watchlist but he can’t help but recognise a politician, a movie star, his sister-in-law. Even if they’re not part of his list he will see them and that’s where the privacy is challenged,” he said.

"[With an automated system] even if the face is a well-known public figure, if it doesn't match any of the faces in the watchlist or in the persons-of-interest list it would be ignored. You increase data privacy by removing bias through automation."
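The matching logic Bataller describes can be pictured in a short, hypothetical sketch. This is not Accenture's implementation; it assumes faces have already been reduced to fixed-length feature vectors ("embeddings") by some detection model, and shows only the watchlist comparison step, where any face that fails to match is discarded outright:

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_face(embedding, watchlist, threshold=0.8):
    """Return the watchlist ID of the best match, or None.

    A None result means the face is ignored and never stored,
    however recognisable it would be to a human operator.
    """
    best_id, best_score = None, threshold
    for person_id, ref in watchlist.items():
        score = cosine_similarity(embedding, ref)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id

# Toy data: two watchlist entries and two detected faces.
watchlist = {"suspect-1": [1.0, 0.0, 0.2], "suspect-2": [0.0, 1.0, 0.1]}
print(check_face([0.9, 0.1, 0.2], watchlist))  # close to suspect-1
print(check_face([0.5, 0.5, 0.5], watchlist))  # matches nobody -> None
```

The point of the design, on this argument, is that the non-match branch has no side effects: a face below the threshold leaves no record, whereas a human operator cannot "unsee" a familiar face.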

How is facial recognition tech being used?

Across the world 25 law enforcement agencies are trialling facial recognition software from NEC called NeoFace. In the UK Leicestershire Police are using it to run facial recognition against certain individuals captured on video at crime scenes.

NeoFace enhances images of faces and can generate a front-facing photo from a series of images of a person caught at different angles to improve the likelihood of a match, keeping a log of the changes it made. These images can then be matched against a police force's database of digital mugshots to identify possible suspects.

In controlled conditions facial recognition systems can identify an individual more effectively than a human. Accenture’s Bataller cites a study by the US National Institute of Standards and Technology in 2007 that found facial recognition software could determine whether two photographs were of the same person more accurately than a person.

In the real world that accuracy falls, and to get a system able to effectively profile faces in a moving crowd currently requires a relatively predictable environment, as well as specialist camera equipment.

“When you’re in an indoor environment where lighting is more controlled, for instance a shopping centre or a metro station, it’s easier, even if there’s a large moving crowd,” said Bataller.

“You do need reasonably frontal cameras. Cameras that are on the ceiling are typically not very useful for that because you get a very skewed angle on the face. So you do need a specific set-up, but when you have a specific set-up you have very good accuracy.”

Facial-recognition systems are already in limited use at European airports today, checking the identity of passport holders at automated gates at Heathrow Airport in the UK and Schiphol Airport in the Netherlands.

However automated surveillance using existing CCTV cameras, which often look down on street level, is generally limited to what Bataller calls anonymous monitoring, such as counting vehicles or pedestrians to learn more about the flow of people or traffic through an area.

This data can help those monitoring the systems understand “is there a traffic jam?” or “has there been a sudden slowdown in traffic that might indicate an incident?”.

“You can even recognise age, gender, emotion anonymously. So you don’t know who the people are but you can recognise certain characteristics of those people,” said Bataller.

“With more and more sophisticated and accurate and diverse video analytics algorithms at your disposal you can turn video footage into a much richer set of metadata, which is both structured and unstructured, with certain images from video clips of interest.”

Creating a metadata store describing the points of interest from the video allows more raw footage to be disposed of, he said.

“You can use video analytics to enhance privacy around CCTV. You automate the observation and throw away the raw footage and work off the anonymous metadata statistics.”
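The pipeline Bataller outlines can be sketched in a few lines. This is a hypothetical illustration, not a real analytics system: the `detector` stands in for an actual detection model, and fake byte strings stand in for video frames. The essential idea is that each frame is reduced to anonymous counts and the raw pixels are never retained:

```python
from dataclasses import dataclass

@dataclass
class FrameStats:
    # Anonymous per-frame metadata: no identities, only counts.
    timestamp: float
    people: int
    vehicles: int

def summarise(frames, detector):
    """Reduce raw frames to anonymous statistics, discarding the footage."""
    stats = []
    for timestamp, frame in frames:
        people, vehicles = detector(frame)
        stats.append(FrameStats(timestamp, people, vehicles))
        del frame  # raw pixels are not kept once counted
    return stats

def average_people(stats):
    # Aggregate statistic of the kind used to spot crowding or incidents.
    return sum(s.people for s in stats) / len(stats)

# Toy detector: derives fake counts from frame size, purely for illustration.
fake_detector = lambda frame: (len(frame) % 5, len(frame) % 3)
frames = [(0.0, b"frame-a"), (1.0, b"frame-bb"), (2.0, b"frame-ccc")]
stats = summarise(frames, fake_detector)
print(average_people(stats))
```

A sudden drop in the vehicle count between timestamps, for example, is the kind of signal that could trigger the traffic-incident alerts described above, without any individual ever being identified.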

Why automated monitoring is a privacy threat, not a boon

Contrary to Bataller’s argument, reducing the role people play in monitoring any human activity does not in itself result in improved privacy, said Matthew Rice, advocacy officer at Privacy International.

“This kind of idea is held in a lot of places, including one we’re currently focusing on where the intelligence agencies say the collection of communications isn’t an interference until somebody opens and reads it,” said Rice.

“We consider that to be misguided, an incredibly dated interpretation. Any kind of system that threatens privacy is an interference.”

He references the bulk data collection programmes run by spy agencies that capture data relating to millions of individuals not directly of interest to the security services, such as the UK programme Tempora. These programmes show how much revealing information about an individual can be extracted and stored without anyone looking at the raw data.

“Human contact or human input into these systems is a secondary consideration after a while because of the power of the analysis going on,” said Rice.

Alongside state surveillance, there is also money to be made by commercial organisations from using these video tracking technologies as broadly as possible.

Research suggests that data generated by facial tracking technology could be worth $6.5 billion by 2018, as retailers and advertisers gather information about individual and group buying patterns and gain better understanding of our behaviour by watching our emotional reactions.

Advertisers hoover up as much data about us as they can from what we do online; given the opportunity, will they want to track us just as closely in the real world?

“There certainly seems to be a deterministic perspective that more data is better. That we can’t go wrong with having a couple of thousand more data points because we’ll only get better,” said Rice.

Accenture posits that legislation will protect people from personal intrusion, at least from commercial bodies.

"Robust privacy laws, like those in Europe and those adopted over the past three years in Asia and Latin America, create a sufficient burden (requiring to provide notice and obtain consent for use of sensitive personal data) to ensure that anonymized data is the only practical answer for most video analytics use cases," it said in a document, Video Analytics and Privacy: A Way Forward.

But Rice thinks it is misguided to unquestioningly accept our privacy will be protected from these new technologies by laws that in some cases were drawn up a decade or more ago. As an example of how outdated some of the safeguards stipulated by privacy laws can be, he pointed to recent examples of anonymity being stripped away from data by combining datasets.

"We have to return to these robust laws that we say we have and ask whether they remain robust in the face of new technology."
