Concerns about masks and facial-recognition software


As the coronavirus pandemic prompted many to wear coverings on their faces, tech companies had to tackle the challenge of updating facial-recognition software.

With facemasks now a part of our lives, there are some concerns when it comes to facial recognition.

According to The Intercept, the U.S. Department of Homeland Security has raised internal concerns that face masks may interfere with facial-recognition technology.

“As face masks have come into play, the details here in the periocular region become that much more important,” said Eric Hess, senior director of product management for RealNetworks' SAFR.

Hess says SAFR started developing facial-recognition software for face masks in late February or early March. He says that in just a few months there have been incredible advances.

“I think what is important is the testing and evaluation of the technology to make sure it really achieves the levels that it needs," he said.

The first task of facial recognition is to detect a face in the field of view of a camera, and wearing a mask can inhibit that, making detection more difficult.

"Anytime we start to cover a portion of the face, whether it be a human or a computer, we start to lose important details that allow us to recognize the person,” says Hess.
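To make that idea concrete, here is a toy sketch (not SAFR's actual algorithm) of how recognition systems often compare faces: each face is reduced to a feature vector, and two vectors are scored by cosine similarity. Zeroing out the features that correspond to the covered lower face stands in for what a mask does, and the match score drops accordingly. All names and numbers here are illustrative assumptions.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
# Toy "face embedding": pretend the first half encodes periocular features
# and the second half encodes lower-face features.
enrolled = rng.normal(size=128)
probe = enrolled + rng.normal(scale=0.1, size=128)  # same person, slight noise

masked_probe = probe.copy()
masked_probe[64:] = 0.0  # a mask hides the lower-face features

full = cosine_similarity(enrolled, probe)
masked = cosine_similarity(enrolled, masked_probe)
print(f"full-face similarity:   {full:.3f}")
print(f"masked-face similarity: {masked:.3f}")  # lower: fewer matching details
```

This is why the quote above stresses the periocular region: once the lower-face features are gone, the eye region is most of what is left to match on.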

Carl Vondrick, an assistant professor of computer science at Columbia University, says face masks pose a challenge for the machines because the models have not seen masks before, which makes it very hard to estimate similarities. That is a challenge the whole industry is facing.

Hess says available training data in which people are actually wearing masks has been limited, so SAFR has reached out to its own staff and to customers for images.
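One common workaround for scarce masked-face data, which the article does not attribute to any specific vendor, is to augment existing face photos with synthetic masks. A minimal sketch of that idea, assuming grayscale face crops as NumPy arrays; real pipelines warp textured mask templates onto detected facial landmarks rather than painting a flat patch:

```python
import numpy as np

def add_synthetic_mask(face, coverage=0.45, value=200):
    """Overlay a flat 'mask' patch over the lower portion of a face crop.

    A crude stand-in for mask augmentation: `coverage` is the fraction of
    the crop's height covered from the bottom, `value` is the patch shade.
    """
    augmented = face.copy()
    h = face.shape[0]
    augmented[int(h * (1 - coverage)):, :] = value
    return augmented

# Hypothetical 112x112 face crop filled with random pixels for the demo.
face = np.random.default_rng(1).integers(0, 256, size=(112, 112), dtype=np.uint8)
masked = add_synthetic_mask(face)
print(masked.shape)  # (112, 112)
```

Training on pairs of original and augmented crops lets a model see "masked" examples without collecting new photographs.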

Vondrick's biggest concern is accuracy. "It is possible that the system will pick up on biases and only work for a certain population,” says Vondrick.

That is because when the algorithms and software were first created, the artificial-intelligence systems were trained on mostly pictures of white men, which means the systems are significantly better at identifying white males than people of any other gender or race.

However, Hess says the algorithms can only recognize what they’ve been trained on, meaning it’s up to the software company to do its job and obtain and develop data that represents all races and skin colors.
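The bias concern Vondrick raises is usually checked by breaking accuracy down by demographic group rather than reporting one overall number. A minimal sketch of that bookkeeping; the group labels and results below are made up for illustration, not real benchmark data:

```python
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, correct) pairs, correct being 0 or 1.

    Returns per-group accuracy so disparities between populations show up
    instead of being averaged away in a single overall score.
    """
    totals, hits = defaultdict(int), defaultdict(int)
    for group, correct in records:
        totals[group] += 1
        hits[group] += correct
    return {g: hits[g] / totals[g] for g in totals}

# Hypothetical evaluation results, not measurements of any real system.
records = ([("group_a", 1)] * 95 + [("group_a", 0)] * 5
           + [("group_b", 1)] * 80 + [("group_b", 0)] * 20)
print(accuracy_by_group(records))  # {'group_a': 0.95, 'group_b': 0.8}
```

A gap like the one in this toy output is exactly what testing bodies look for when they evaluate whether a system "only works for a certain population."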