New AI software can turn regular security cameras into COVID-19 policy enforcement points

Now being trialed in Georgia smart city Peachtree Corners, the new tech can pick up on people standing too close together and detect whether someone is wearing a mask.


Image: Cawamo

Peachtree Corners, GA, a city northeast of Atlanta known for its pioneering use of smart city technology, is adding a new tool to its lineup: Artificial intelligence (AI)-powered software that gives security cameras the ability to tell if people are violating COVID-19 regulations. 

The software, created by UK-based CCTV tech company Cawamo, can be used with any security camera, meaning there's no need to buy new hardware in order to use it. Instead, the AI monitors live feeds and does its processing in the cloud, or via an optional on-site hardware device that relays data to Cawamo's cloud platform for a second round of analysis. Results are presented in a web portal or mobile app. As part of its COVID-19 initiative, Cawamo is offering municipalities free COVID-19 monitoring with the purchase of on-site Cawamo equipment. 
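The on-site-then-cloud split Cawamo describes is a common two-stage pattern: a cheap first pass on the edge device filters the feed, and only candidate frames go to the cloud for heavier analysis. The sketch below is purely illustrative, assuming hypothetical scoring functions and thresholds; it is not Cawamo's API.

```python
# Illustrative two-stage edge-then-cloud filtering pattern.
# Function names, score functions, and thresholds are hypothetical.

def edge_filter(frames, cheap_score, threshold=0.5):
    """First pass on the on-site device: forward only frames whose quick
    confidence score clears the threshold, reducing data sent to the cloud."""
    return [frame for frame in frames if cheap_score(frame) >= threshold]

def cloud_analyze(frames, full_score, threshold=0.8):
    """Second, heavier pass in the cloud produces the final alerts."""
    return [frame for frame in frames if full_score(frame) >= threshold]

# Example with dummy frames carrying a precomputed score:
frames = [{"id": 1, "score": 0.2}, {"id": 2, "score": 0.9}]
score = lambda frame: frame["score"]
forwarded = edge_filter(frames, score)   # only frame 2 survives the edge pass
alerts = cloud_analyze(forwarded, score) # frame 2 also clears the cloud pass
```

The design benefit of this pattern, whatever Cawamo's exact implementation, is that raw video need not leave the site continuously; only frames the edge device flags are relayed.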

SEE: COVID-19 workplace policy (TechRepublic Premium)

Cawamo's software is capable of performing a number of different analyses on camera feeds, like fire and smoke detection, parking compliance, people counting, and even what it calls "suspected robbery detection." 

As for COVID-19 compliance, Cawamo's software aims to do two things: Detect crowds to determine if people are violating social distancing requirements, and pick out individuals who aren't wearing required face masks. Cawamo said its software performs all of its COVID-19-related tasks without facial recognition, and that no personally identifiable information is used in the process. 
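The distancing check, at least in principle, doesn't require knowing who anyone is: given only the anonymous positions of detected people, the software can flag any pair standing closer than a threshold. The following is a minimal sketch of that idea, assuming an upstream person detector has already produced ground-plane (x, y) positions in metres; it is an illustration, not Cawamo's actual implementation.

```python
import math

def find_distancing_violations(positions, min_distance_m=2.0):
    """Return index pairs of detected people standing closer than
    min_distance_m. Positions are anonymous (x, y) coordinates in metres,
    assumed to come from an upstream person detector; no identities needed."""
    violations = []
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            if math.dist(positions[i], positions[j]) < min_distance_m:
                violations.append((i, j))
    return violations

# Three people: the first two are 1 m apart, the third is far away.
find_distancing_violations([(0, 0), (1, 0), (5, 5)])  # → [(0, 1)]
```

Because the check operates on coordinates rather than faces, it is consistent with the company's claim of working without facial recognition, though the privacy properties of any real deployment depend on what else the pipeline stores.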

Peachtree Corners is rolling out Cawamo-powered COVID-19 policy enforcement first to its cameras at city hall, a deployment that Peachtree Corners CTO and assistant city manager Brandon Branham said shows how technology can solve problems without violating privacy.

"Our highest priority, and the aim of a smart city concept overall, is the safety and security of our residents, and we firmly believe this technology will help us maintain important public safety protocols as we work to slow the spread of the virus. Allowing us to instantly convert existing cameras into smart cameras also shows promise for wider deployment across the city, in addition to within workplaces in the near future," Branham said. 

If privacy concerns quickly come to mind, that's because CCTV has been a sticky issue for years, and AI-powered video analysis is sure to only muddy the waters. CBS News reported on privacy concerns surrounding security cameras a decade ago, citing one Chicago paramedic who said cameras in the city in 2010 could zoom up to 32x optically and 184x digitally, giving bird's-eye cameras the ability to clearly read license plates. 

SEE: Big data's role in COVID-19 (free PDF) (TechRepublic)

Nick Benton, the paramedic quoted in the article, said the city is very strict about use of the cameras, but according to the ACLU those policies don't necessarily mean people aren't snooping unethically. AI-powered analytics and cloud transmission of video feeds only add to these concerns.

"We comply with the ethical use of AI in video analytics. We do not use facial recognition and strive to comply with regulations like HIPPA, NDAA, TAA, DGPR, PCI, UL and FERPA laws, and others alike," Cawamo said on its site, but adds in its privacy policy that "while we strive to use commercially acceptable means to protect your personal data, we cannot guarantee its absolute security."
