Microsoft CEO Satya Nadella has responded to the controversy over sexually explicit AI-generated fake images of Taylor Swift. In an interview with NBC Nightly News, Nadella called the spread of nonconsensual fake nude images “alarming and frightening,” telling interviewer Lester Holt: “I think we need to address this issue immediately.”
In a transcript distributed by NBC ahead of the Jan. 30 broadcast, Holt asked Nadella to respond to the internet “exploding with fakes,” referring to the sexually explicit fake images of Swift. Nadella’s answer pried open several cans of worms about technology policy while saying surprisingly little about them, which is not surprising when no solid solution is in sight. Here is his response:
“I would like to say two things. One, it goes back to what I think is our responsibility, all the guardrails that we need to put in place around technology so that safer content is produced. And there’s a lot to do and a lot going on there. But it’s a global, societal thing, and it’s about convergence to certain norms. And I think what we can do is, we can govern a lot more than we give ourselves credit for, especially if the law and law enforcement and technology platforms can work together.”
Microsoft may have a connection to the fake Swift images. A 404 Media report indicates that they originated in a Telegram-based community devoted to creating nonconsensual porn, which recommends using the Microsoft Designer image generator. Designer theoretically refuses to create images of celebrities, but AI generators are easily tricked, and 404 found that its rules could be broken with a few small tweaks to prompts. This doesn’t prove that Designer was used for the Swift images, but it’s the kind of technical shortcoming that Microsoft can address.
Overall, though, AI tools have dramatically simplified the process of creating fake nudes of real people, causing turmoil for women who are far less powerful and famous than Swift. And controlling that output isn’t as simple as forcing big companies to tighten their guardrails. Even if “Big Tech” platforms like Microsoft’s are locked down, people can retrain open tools like Stable Diffusion to produce NSFW images despite attempts to make that harder. Far fewer users may have access to those generators, but the Swift incident shows how far the work of a small community can spread.
Then there are social networks that can limit the reach of nonconsensual images, plus, apparently, vigilante justice from Swifties against those who spread them. (Does that count as “convergence to certain norms”?) For now, though, Nadella’s only clear plan is to get Microsoft’s own AI house in order.