LIFESTYLE
Human rights activists hit back at Zoom over reports it’s considering emotion AI tracking
Imagine if every time you signed into a video call, a computer scanned your face and logged every smile, frown and frustrated scowl.
Or if your boss had a play-by-play of your emotions and could discipline you every time an AI algorithm said you looked ‘bored’ or ‘distracted’ in a mandatory virtual meeting.
While it sounds like the plot of a sci-fi thriller, where privacy is a thing of the past and your every move is under constant watch, some warn this ‘dystopian’ future could be closer than we think.
After alarming reports emerged that communications giant Zoom was exploring ‘emotion AI’ for its video chatting platform, advocates raised the red flag.
They claimed that tracking and analyzing our feelings is a ‘violation of human rights’, urging the brand to ‘halt plans’ on the controversial feature.
And while this isn’t the first time that people have considered the idea of using AI to monitor people’s feelings, experts warned that emotional tracking on platforms like Zoom would be ‘intrusive and downright creepy’.
Led by digital activist group Fight for the Future, critics accused the technology of being ‘inherently biased’, saying that it pulls from ‘misleading’ practices that judge people on their appearances.
In a message to Zoom, Fight for the Future said: “We get that you’re trying to improve your platform, but mining us for emotional data points doesn’t make the world a better place.
“This is your chance to be one of the good ones…make the right call and cancel this crummy surveillance feature.”
Fight for the Future has been joined by more than 25 other human rights organisations in its campaign to steer the brand away from emotion AI.
Together, the advocacy groups – which include the likes of Access Now and the Muslim Justice League – have sent an open letter to Zoom founder and CEO Eric Yuan.
The document claimed that the polarising AI technology was built on ‘pseudoscience’, saying that even ‘experts admit emotion analysis doesn’t work’.
It also highlighted that the feature could be wrongly repurposed by ‘snooping government authorities and malicious hackers’ and may put workers and students at risk of being punished if they don’t ‘express the right emotion’.
The open letter said: “Zoom claims to care about the happiness and security of its users, but this invasive technology says otherwise.”
It added: “As an industry leader, you can make it clear that this tech has no place in video communications…millions are counting on you to steward our virtual future.”