When career advice guru Eve Peña was offered a five-figure brand deal to promote a company she’d never heard of to her TikTok followers, she was intrigued. Then she found out what it claims to offer: a service that creates a lifelike AI clone of the customer to attend virtual job interviews and generate answers based on the customer’s résumé.
“The first thing I said was, ‘This is really unethical,’ and I think people would tell them that too,” Peña said in a phone interview. “And [the representative] said, ‘Well, I think it’s unethical too … but I’m going to create some talking points that can deflect the hateful comments.’”
Peña is one of three creators who told NBC News they were approached with the offer and were immediately concerned it might be a scam. Less than two weeks later, the company StartupHelper went dark online, with its website’s content completely removed and its TikTok page set to private.
It was a strange episode at the intersection of two dynamics: the nebulous world of brand partnerships, in which creators are approached by relatively unknown companies seeking sponsorship, and the rise of generative AI technologies that have enormous potential but little oversight.
“The amount they were offering me was so outrageous that I thought, ‘This has to be some kind of scam,’” Peña said of the brand deal, which promised $48,000 over six months plus a 10% commission for each client who signed up for the platform through her. The company planned to charge customers a $500 down payment, along with 10% of their first year’s salary if they used the service to get a job.
Peña said most brands don’t pitch numbers over email, instead asking about a creator’s rates first in a far less aggressive approach. So she asked for more clarity on a Zoom call with a company representative.
“They said, ‘Oh, this is a net-30 payment,’ so I would have had to create videos for them for 30 days before the payment was even confirmed,” she said. “And I thought, ‘Well, apparently you guys want free marketing, but you’re not going to pay anyone.’”
The explosion in generative AI capabilities has opened the door to a variety of applications, from drafting emails to creating full-fledged deepfakes, sparking a rush to find ways to leverage the technology.
In the world of human resources, that has raised questions about how far employers should go in using AI to vet applications, and whether applicants risk crossing ethical lines with AI-generated résumés and cover letters.
But the AI-generated clone, which StartupHelper describes as a “digital shadow warrior who attends job interviews on your behalf,” crossed a line for some career content creators.
Career content has become a popular niche on TikTok and other platforms, with influencers regularly amassing hundreds of thousands of followers. Popular TikTok creators are often approached by brands looking to promote their services through sponsored deals. But the market is opaque and sometimes fraught, and such deals can fall apart if negotiations stall or a product doesn’t align with a creator’s values.
Creators who spoke to NBC News said they were approached by StartupHelper and had concerns about both the company’s services and the terms of the proposed brand deal. Before the content on StartupHelper’s website was removed, it advertised digital cloning alongside other services such as automated job applications and LinkedIn profile optimization.
Peña, whose videos focus on teaching people how to navigate the corporate world, quickly criticized the company on TikTok as unethical. In her video, she issued a “warning to career TikTok” and shared details of what she learned about the service during the Zoom call with a representative. Peña said StartupHelper’s customers are asked to send multiple recordings of themselves speaking so the company’s AI developers and engineers can study their mannerisms and program clones to attend virtual interviews on their behalf.
“If you see an influencer promoting this, saying, ‘What if you never had to go to a job interview again? What if a clone could go for you?’ judge them,” she said in the video. “Just know that they sold their souls for it. And don’t be fooled by it.”
After Peña’s video started gaining traction, commenters began naming the company on her profile. She said she blocked StartupHelper’s account to avoid giving it free promotion. Bots then started flooding her comments with claims that StartupHelper had helped them secure jobs. After she used a filter to block mentions of the company’s name, StartupHelper’s own TikTok page posted now-deleted videos disparaging her.
StartupHelper did not respond to requests for comment. A startuphelper.com email address was disabled by Tuesday. The company’s website is now just a landing page with the message: “We are committed to delivering better, more ethical products that champion the rights of the working class in a rapidly changing, AI-driven world. We appreciate your feedback.”
The story is the latest manifestation of the ethical and security concerns surrounding largely unregulated AI technology, and of how creators struggle to balance their personal morals with the need to monetize their content.
Since OpenAI introduced ChatGPT late last year, there has been an explosion of startups looking to leverage generative AI (artificial intelligence systems that can create humanlike content such as text, photos, and videos) for all kinds of business and consumer services.
Some companies are already pushing technology that gives people a way to create AI versions of themselves. Aphid, a fintech company that creates AI workers to handle multiple online tasks simultaneously, envisions a future in which people create digital clones that can work on their behalf.
Although there is much discussion about how to create rules for the development of AI, such companies remain largely unregulated.
Peña and two other creators who spoke to NBC News said that when they first heard about StartupHelper’s proposal, they were eager to learn more. But the initial email sent to them, a copy of which was reviewed by NBC News, was vague, made no mention of the AI-powered cloning service, and described the company as a recruiting agency “changing the way people find jobs and earn more in the corporate field.”
“We envision a collaborative partnership where you can be our brand mascot and bring our company to life,” a representative wrote in an outreach email.
Farah Shargi, a creator who offers career advice on TikTok, said she initially asked for more information but ended up ghosting the company because of the terms of the contract. When she later read the content plan, she said, she found a number of ethical issues that reaffirmed her decision not to accept.
The 10-page content plan the company sent out includes scripts for creators to use, along with specific responses to critical comments such as, “What if they found out you did this to get the job?” a concern that highlights the lack of federal regulation around the use of AI technology.
In response, the document urges creators to say: “I think it’s crazy that it’s only considered illegal when individuals, not companies, take shortcuts to circumvent traditional systems, and not the other way around. … We’re stepping into a post-labor economy, and your only priority is to get the highest-paying job without any hassle.”
The document also encourages creators to follow scripts that openly promote “cheat[ing] your way into that job” and urge followers to “block all the people who give you career advice,” adding that “they don’t care about you, they’re just concerned with turning people into sheeple who bend over backwards for a company that doesn’t care about you.”
The document did not address questions about data privacy or whether the company planned to use customer likenesses for other purposes. Shargi said she believes AI tools can assist job seekers in more ethical ways, but that such cloning services immediately raise alarms.
“I’ve spent three years on social platforms, and I don’t want to risk my reputation or the reputation of a brand just for money. It’s kind of like, it just doesn’t work for me,” she said. “What they appear to be doing is just a blatant scam, and to me it just doesn’t pass the smell test.”
The creators also said the amount offered by StartupHelper seemed too good to be true.
TikTok creator Gabrielle Judge, who is credited with coining the term “lazy girl job,” said that upon closer inspection of the request, the actual pay worked out to be negligible. In addition to posting a StartupHelper link in her profile with a short blurb, the company asked for one post per day for six months.
“At first you think, ‘Oh my God, that’s a lot of money,’ but when you actually look at the work that had to be done, it’s not worth it,” she said.
Peña also said she was told that the company’s CEO is based in Dubai. However, no business license is registered under the company’s name in the United Arab Emirates.