AI technology developments are taking off – some would say at a worrying speed. Amid all the progress that’s been made, there have been a lot of misconceptions and misinformation. Society is understandably conflicted about advancements to technologies such as face recognition (FR), which have the potential to increase security and efficiency in tangible ways, but which are also subject to abuse and misuse.
Is facial recognition rightfully controversial or just misunderstood? With the FR industry poised to grow by more than 26% by 2025, it’s important to understand what facial recognition technology is and why its use is highly debated.
Face Recognition 101
A subset of Video Content Analytics technology, face recognition (FR) refers to an AI-backed application that identifies or verifies individuals by comparing biometric features extracted from digital sources, such as video frames. Video surveillance operators can upload a watchlist of images to facial recognition software to define which people the system should detect and identify; alerts can then be triggered when matches are found in the surveillance footage.
For a retailer, this could mean assembling a suspect watchlist from frames of surveillance footage capturing past shoplifting incidents and then responding to alerts when the Video Content Analytics system detects a biometric match for a suspected shoplifter. A qualified security or police officer could then investigate to determine whether the alert correctly identified the shoplifter and, if so, apprehend the perpetrator before the store suffers further losses.
For law enforcement, this can mean triggering face-match alerts for known criminals or suspects on watchlists. Police can use the technology to monitor behavior, identify and arrest suspects, and preventively deploy officers when there is reason to believe an incident may occur.
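The watchlist workflow described above boils down to comparing a biometric signature from live footage against stored signatures and alerting above a similarity threshold. Below is a minimal, hypothetical sketch of that matching step, assuming face embeddings (numeric feature vectors) have already been extracted by an upstream detection model; the names, vectors, and threshold are illustrative, not from any specific FR product:

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def match_against_watchlist(probe, watchlist, threshold=0.8):
    """Compare a probe embedding (from a video frame) against a watchlist.

    Returns (name, score) pairs for entries whose similarity meets the
    threshold, sorted with the strongest match first. In a real system,
    a human operator would review these alerts before any action is taken.
    """
    alerts = []
    for name, embedding in watchlist.items():
        score = cosine_similarity(probe, embedding)
        if score >= threshold:
            alerts.append((name, score))
    return sorted(alerts, key=lambda pair: pair[1], reverse=True)
```

For example, a probe vector identical to one watchlist entry and orthogonal to another would trigger a single alert for the matching entry. The threshold is the key operational dial: raising it reduces false alerts at the cost of missed matches, which is why human review of each alert remains essential.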
Why has Face Recognition Become So Controversial?
Video analytics have been capable of facial recognition for some time, but the topic has become particularly heated recently. According to the BBC, this is because image and video quality was poorer in the past: CCTV images were often too blurry for FR software to identify a match, so face matching against a database wasn't wholly reliable.
With improvements to camera technology, the widespread installation of high-definition and ultra-high-definition cameras and the growing sophistication of Video Content Analytics, face recognition accuracy has skyrocketed. As a result, and due to the expansive use case applications for law enforcement and other organizations, these solutions are now gaining more attention. FR is also raising privacy and civil rights concerns: the idea of being remotely identified and tracked by law enforcement – or targeted by businesses monetizing face recognition data – understandably makes citizens uncomfortable.
The Future of Face Recognition
What’s next for face recognition technology? Like many other digital solutions evolving today – from social media to mobile phones – video analytics and face recognition give law enforcement and businesses another way to drive safety, security and operational efficiency. Today, most FR use cases center on finding missing persons and identifying known criminal suspects quickly and effectively. As the capability evolves, it enables additional applications like triggering loyalty program benefits and personalizing the shopping experience.
Face recognition introduces reality-based, meaningful use cases, with face matching based on an image within a video or an external image provided to the system (as part of a watchlist or as an individual image). Critically, however, there is no image enhancement, no connection to any personal data, and no linkage to other images of the person outside the video. Moving ahead, we can expect governments to proactively develop policies to regulate the use of these technologies and to define the right to opt out of being tracked digitally.