steven36 posted a topic in Security & Privacy News

Face masks are mandatory in at least two provinces in China, including in the city of Wuhan. In an effort to contain the coronavirus strain that has caused nearly 500 deaths, the government is insisting that millions of residents wear protective face coverings when they go out in public.

As millions don masks across the country, the Chinese are discovering an unexpected consequence of covering their faces. It turns out that face masks trip up facial recognition-based functions, a technology necessary for many routine transactions in China. Suddenly, certain mobile phones, condominium doors, and bank accounts won’t unlock with a glance.

Complaints are plentiful on Weibo, the popular Chinese blogging platform, reports Abacus, the Hong Kong-based technology news outlet. “[I’ve] been wearing a mask every day recently and I just want to throw away this phone with face unlock,” laments one user. “Fingerprint payment is still better,” writes another. “All I want is to pay and quickly run.”

Most complaints are about unlocking mobile devices. Apple confirmed to Quartz that an unobstructed view of a user’s eyes, nose, and mouth is needed for Face ID to work properly. Similarly, Huawei says that its effort to develop a feature that recognizes partially covered faces has fallen short. “There are too few feature points for the eyes and the head, so it’s impossible to ensure security,” explains Huawei vice president Bruce Lee in a Jan. 21 post on Weibo. “We gave up on facial unlock for mask or scarf wearing [users].”

Subverting surveillance cameras

Biometrics, including facial recognition, are essential to daily life in China on a scale beyond other nations. The technology is used for everything from ordering fast-food meals to scheduling medical appointments to boarding planes at more than 200 airports across the country. Facial recognition is even used in restrooms to prevent an occupant from taking too much toilet paper.
And beyond quotidian transactions, the technology is a linchpin in the Chinese government’s scheme to police its 1.4 billion citizens. Last December, the government passed a new law that forces anyone registering a new mobile phone SIM card to undergo a face scan, in the stated interest of protecting “the legitimate rights and interests of citizens in cyberspace,” as China’s Ministry of Industry and Information Technology puts it. The technology is also used in some schools, where a camera records student attendance and can offer predictions about behavior and level of engagement.

Hong Kong’s government, incidentally, has been trying to institute a “mask ban” for protestors participating in anti-government rallies. The anonymity afforded by surgical masks, gas masks, and respirators has emboldened both police and demonstrators to act aggressively, without fear of being caught on camera.

Facial recognition technology that can “see through” disguises already exists, but it’s far from perfect. Researchers at the University of Cambridge and India’s National Institute of Technology, for instance, demonstrated one method that could identify a person wearing a mask with around 55% accuracy. In 2018, Panasonic introduced commercially available software that can ID people wearing surgical masks if the camera captures images at a certain angle.

Given its widespread adoption across China, it is ironic that facial recognition technology in general has been found to be less reliable when processing non-white faces, observes Jessica Helfand, author of the new book Face: A Visual Odyssey. “The fact that surveillance is increasingly flawed with regard to facial recognition and Asian faces is a paradox made even more bizarre by the face mask thing,” Helfand says. A recent landmark study by the US National Institute of Standards and Technology revealed racial bias in algorithms sold by Intel, Microsoft, Toshiba, Tencent, and DiDi Chuxing.
It showed that African Americans, Native Americans, and Asians were 10 to 100 times more likely to be misidentified than Caucasian subjects.

Source