Smile, you’re on camera. And it may know who you are, without you spilling a word.
For the coverage of Prince Harry and Meghan Markle’s royal wedding, Sky News in the U.K. knew wedding watchers wanted the names of the famous and somewhat-famous guests walking into St. George’s Chapel, in Windsor.
They turned to facial recognition technology. As guests sauntered in, the software automatically pegged who they were and popped their names up on the screen.
Some academics and activists have misgivings about the software. Over 70 civil and human rights organizations in the U.S. have penned an open letter to Amazon about the company’s facial recognition service. They want the technology giant to cease the sale of its Rekognition software to police departments and government agencies.
In part, the letter read: “Face recognition technology represents an unprecedented threat to privacy and civil liberties that even the best traditional forms of regulation will almost certainly fail to mitigate.”
Meanwhile, in Canada, researchers and advocates are also fighting the rise of these systems.
Ann Cavoukian is the former privacy commissioner of Ontario. She now leads the Privacy by Design Centre of Excellence at Ryerson University. Cavoukian worries about the misuse of the technology.
It’s already happened. A few years ago, a face recognition app called FindFace was used by trolls to out and stalk Russian porn actresses.
“We see with facial recognition, a lot of the time you don’t even know it’s taking place,” said Cavoukian. “And the problem is once your facial image is captured, it can be used for a variety of different purposes—identity theft, etc. But also it can place you at a particular place at a particular time and it can enable tracking of your activities. That’s nobody’s business.”
How facial recognition technology works
The software detects your face on an image and then assesses your features, things like the width of your nose or the length of your jawline. It then creates a code called a faceprint, much like a fingerprint. That faceprint is matched to a set of identified photos and faceprints to name the person in the image.
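The matching step can be sketched in a few lines of code. This is a toy illustration of the idea described above, not any vendor's actual system: the "faceprint" here is just a short list of made-up measurements, and the names, numbers, and threshold are all hypothetical.

```python
import math

# A "faceprint" here is a short list of numeric measurements
# (e.g. nose width, jawline length) -- a toy stand-in for the
# high-dimensional feature vectors real systems compute.

def euclidean_distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(faceprint, gallery, threshold=0.6):
    """Match a faceprint against a gallery of named faceprints.

    Returns the closest-matching name, or None if nothing in the
    gallery is close enough. The threshold is illustrative only.
    """
    best_name, best_dist = None, float("inf")
    for name, known in gallery.items():
        dist = euclidean_distance(faceprint, known)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# Hypothetical gallery of identified faceprints.
gallery = {
    "Guest A": [0.41, 0.77, 0.33],
    "Guest B": [0.10, 0.52, 0.88],
}
print(identify([0.40, 0.75, 0.35], gallery))  # close to Guest A
print(identify([0.95, 0.05, 0.05], gallery))  # no close match -> None
```

Real systems work the same way in spirit: a new face is reduced to a vector, and the nearest vector in a database of known faces names the person.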
Avishek Bose is a researcher and graduate student in engineering at the University of Toronto. He’s working on software that fools detection technology by slightly distorting online images.
“What it does is that it adds an Instagram-like filter on top of your face image, such that if humans were to look at it they wouldn’t be able to tell the difference between the original image and the perturbed image,” said Bose. “However, the changes are small enough, yet strong enough, that if a computer vision model, such as a face detection system, were to look at it, it would completely fail.”
Bose says in many cases, face detection systems are automatically used to track individuals or crowds and control behaviour.
“People have no agency, no power to fight back,” said Bose. “And we felt that it was really important to actually empower individuals and people to actually take back control of their own data.”
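The core trick Bose describes, adding a perturbation too small for humans to notice but large enough to break a machine-learning model, can be sketched with a toy linear "detector." This is a simplified illustration of the general adversarial-example idea, not Bose's actual method; the weights, pixel values, and step size are all hypothetical.

```python
# Toy sketch of an adversarial perturbation against a linear
# "face detector": the detector scores an image by a weighted sum,
# and a tiny signed nudge per pixel flips its decision while
# barely changing the image.

def detect(image, weights):
    # True means "face detected" in this toy model.
    score = sum(p * w for p, w in zip(image, weights))
    return score > 0

def perturb(image, weights, epsilon=0.05):
    # Nudge each pixel a small step against the detector's weights
    # (the sign-of-gradient idea used by adversarial attacks).
    return [p - epsilon * (1 if w > 0 else -1)
            for p, w in zip(image, weights)]

weights = [0.8, -0.3, 0.5, 0.6]    # hypothetical detector weights
image   = [0.1, 0.2, 0.1, 0.05]    # hypothetical pixel values

print(detect(image, weights))                    # True: face found
print(detect(perturb(image, weights), weights))  # False: detector fooled
```

Each pixel moves by at most 0.05, imperceptible on a real photo, yet the detector's decision flips. Attacks on real deep networks use the same principle with the model's actual gradients.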
Companies must put privacy first in product design
For years, Cavoukian has advocated for an idea called Privacy by Design. It means companies must design with privacy at the core, rather than as an afterthought.
The European Union’s new data protection rules, the General Data Protection Regulation (GDPR), which recently came into force, incorporate the concept.
“Proactively identify the risks and embed privacy-protective measures to address and prevent those risks upfront,” said Cavoukian. “Bake it into the code; bake it into the data architecture, into your policies, into your operations.”