The "surveillance exclusion" mask was designed by Jip van Leeuwenstein while he was a student of Utrecht School of the Arts in the Netherlands.
"Because of its transparency you will not lose your identity and facial expressions," von Leeuwenstein writes, "so it's still possible to interact with the people around you."
Jing-cai Liu created the wearable face projector, in which a "small beamer projects a different appearance on your face, giving you a completely new appearance."
Images of Liu's face projector went viral last month after misleading tweets claimed it was being used by protesters in Hong Kong, a claim that was later debunked.
Isao Echizen, a professor at the National Institute of Informatics in Tokyo, designed the "privacy visor" as a safeguard against security cameras that could log someone's face without their permission.
The device is fitted with "a near-infrared LED that appends noise to photographed images without affecting human visibility."
When switched on, the wearer's face no longer registers as a human face to detection software, as indicated by the absence of green detection boxes in the image above.
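Those green boxes are simply the bounding boxes a face detector draws around each face it finds. As a rough sketch of what the visor defeats (this is not Echizen's actual test setup, and the file name is a placeholder), OpenCV's bundled Haar cascade illustrates the idea: a face flooded with near-infrared noise produces no boxes at all.

```python
import cv2

# Load OpenCV's stock frontal-face Haar cascade (ships with the library).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("frame.jpg")  # hypothetical camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# One green box per detected face; a visor wearer's face yields none.
for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
print(f"{len(faces)} face(s) detected")
cv2.imwrite("annotated.jpg", frame)
```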
A makeup technique known as CV Dazzle, pioneered by the artist Adam Harvey, uses fashion to combat facial recognition. It was recently featured at a workshop at the Coreana Museum of Art in Seoul, pictured here.
The technique takes its name from a World War I tactic: naval vessels were painted with black-and-white stripes, making it harder for distant ships to judge their size and heading.
"By giving an overload of information software gets confused, rendering you invisible," Weekers wrote of the scarf.
Belgian computer scientists Simen Thys, Wiebe Van Ranst, and Toon Goedemé designed "adversarial patches," printed images that fool person detectors, as part of a study funded by KU Leuven.
"We believe that, if we combine this technique with a sophisticated clothing simulation, we can design a T-shirt print that can make a person virtually invisible for automatic surveillance cameras," the researchers wrote.
"'Facial Weaponization Suite' protests against biometric facial recognition — and the inequalities these technologies propagate — by making 'collective masks' in workshops that are modeled from the aggregated facial data of participants, resulting in amorphous masks that cannot be detected as human faces by biometric facial recognition technologies," creator Zach Blas writes.
Blas writes that the masks pictured above depict the "tripartite conception of blackness: the inability of biometric technologies to detect dark skin as racist, the favoring of black in militant aesthetics, and black as that which informatically obfuscates."