Dear Zoom,
Hi there. It's us: the millions of people who learned your name during the pandemic and have stuck with you through thick and thin to make you the most successful video platform on the web. We rely on you for work calls, town halls, and word games with our families.
We like you, we really do. But we're getting worried about you.
Protocol reported that you're planning a feature that claims to track and analyze our emotions. We get that you're trying to improve your platform, but mining us for emotional data points doesn't make the world a better place. And selling this tech to employers or businesses so that they can monitor and manipulate us for profit is really not cool.
It's manipulative.
You describe this emotional surveillance tool as a way for businesses to hone their sales pitch by tracking the headspace of the person on the other side of the screen. Even that is a major breach of user trust. But we see the writing on the wall: ultimately, this software will be sold to schools and employers who will use it to track and discipline us. You say you care about our happiness, so where does this dystopian vision fit in?
It's discriminatory.
Emotion AI, like facial recognition in general, is inherently biased. It's rooted in historical practices like physiognomy that have been thoroughly debunked (not to mention, are racist). These tools assume that all people use the same facial expressions, voice patterns, and body language, but that's not true. Adding this feature will discriminate against certain ethnicities and people with disabilities, hardcoding stereotypes into millions of devices.
It's pseudoscience.
Let's be honest: this emotional measuring stuff is a marketing gimmick, and experts admit that it doesn't even work. The way we move our faces is often disconnected from the emotions underneath, and research has found that even humans often can't reliably read emotion from faces alone. Why lend credence to pseudoscience and stake your reputation on a fundamentally broken feature?
Zoom, you can do better.
We've already lost trust in a bunch of other companies because of shady tracking systems and other extractivist practices. Zoom, this is a chance to be one of the good ones. You've made the right call before, like in 2020, when you changed your mind about blocking free users from your encrypted service. You've even canceled face-tracking features before because they didn't meet your privacy standards. This can be just like those times: we're just asking you to put the privacy and happiness of your users first.
You're the industry leader, and millions of people are counting on you to steward our virtual future. Make the right call and cancel this crummy surveillance feature, and publicly commit to not implementing emotion AI in the future.
With tons of emotion ❤️,
Your Users