Why do so many companies that rely on monetizing their users’ data seem so enthusiastic about AI? Ask Meredith Whittaker, president of Signal (and I did), and she will answer simply: “Because AI is a surveillance technology.”
Onstage at TechCrunch Disrupt 2023, Whittaker explained her view that AI is largely inseparable from the big data and targeting industry perpetuated by the likes of Google and Meta, as well as by less consumer-focused but equally prominent enterprise and defense companies. (Her remarks have been lightly edited for clarity.)
“It requires the surveillance business model. It’s an exacerbation of what we’ve seen since the late ’90s and the development of surveillance advertising. AI is a way, I think, to entrench and expand that business model,” she said. “The Venn diagram is a circle.”
“And the use of AI is also surveillance, right?” she continued. “You walk past a facial recognition camera instrumented with pseudoscientific emotion recognition, and it produces data about you, rightly or wrongly: that you’re happy, that you’re sad, that you have a bad character, that you’re a liar. These are ultimately surveillance systems being marketed to those who have power over us as a public, our employers, our governments, our border control, to make decisions and predictions that shape our access to resources and opportunities.”
Ironically, she noted, the data underlying these systems is frequently organized and annotated (a necessary step in assembling AI datasets) by the very workers those systems are aimed at.
“There’s no way to make these systems without human labor at the level of informing the ground truth of the data: reinforcement learning with human feedback, which again is just a way of tech-washing precarious human labor. It’s thousands and thousands of workers paid very little, though en masse it’s very expensive, and there’s no other way to create these systems, full stop,” she explained. “In some sense what we’re seeing is a kind of Wizard of Oz phenomenon. When you pull back the curtain, there’s not that much that’s intelligent.”
Not all AI and machine learning systems are equally exploitative, however. Asked whether Signal uses any AI in its app or development work, she said it relies on “a small on-device model that we didn’t develop, we use it off the shelf, as part of the face blur feature in our media editing toolset. It’s actually not that good… but it helps detect faces in crowd photos and blur them, so that when you share them on social media you’re not revealing people’s intimate biometric data to, say, Clearview.”
“But here’s the thing: like… yeah, that’s a great use of AI, and doesn’t that just dispel all the negativity I’ve been putting out on stage,” she added. “Sure, if that were the only market for facial recognition… but let’s be clear. The economic incentives that drive the very expensive process of developing and deploying facial recognition technology would never let that be its only use.”