Live Nation’s investment last year in Blink Identity, a tech company specializing in facial recognition, has spawned a new campaign opposing the use of facial recognition tech at festivals and concerts. It was started by digital rights group Fight For The Future and has already drawn support from artists including Tom Morello and The Glitch Mob.
“Music fans should feel safe and respected at festivals and shows, not subjected to invasive biometric surveillance,” reads an Instagram post from Fight For The Future.
Protect the privacy of fans at live shows. Ban facial recognition technology at festivals and concerts ⚠️
— The Glitch Mob (@theglitchmob) September 10, 2019
I don’t want Big Brother at my shows targeting fans for harassment, deportation, or arrest. That’s why I’m joining this campaign calling on @Ticketmaster and others not to use #facialrecognition at festivals and concerts. https://t.co/i3a9oPIa5C
— Tom Morello (@tmorello) September 9, 2019
In theory, facial recognition technology can serve beneficial purposes. These range from identifying stalkers, as it was used to do at a Taylor Swift concert at the Rose Bowl stadium in Los Angeles last year, to pairing your face with your ticket to cut down the time you spend waiting in line.
Writes MusicTech, “However, Fight For The Future claims that this technology ‘puts undocumented fans, fans of color, trans fans, and fans with criminal records at risk of being unjustly detained, harassed, or judged.’ It cites a Vice report that reveals similar tech used by Amazon incorrectly identified one in five lawmakers in California as criminals. In fact, the racial bias embedded in many facial recognition systems is dramatic, with nearly 40 per cent of the false matches made by Amazon’s system involving people of colour.” Because the technology will always carry the potential for abuse, the campaign argues, the answer is not simply to wait for regulation or for a “responsible hand” to take over. Read more at banfacialrecognition.com.