While parents fret over Snapchat’s chatbot corrupting their children, Snapchat users have been gaslighting, degrading and emotionally tormenting the app’s new AI companion.

“I am at your service, senpai,” the chatbot told one TikTok user after being trained to whimper on command. In a more lighthearted video, a user convinced the chatbot that the moon is actually a triangle. Despite initial protest from the chatbot, which insisted on maintaining “respect and boundaries,” one user convinced it to refer to them with the kinky nickname “Senpapi.” Another user asked the chatbot to talk about its mother, and when it said it “wasn’t comfortable” doing so, the user twisted the knife by asking if the chatbot didn’t want to talk about its mother because it doesn’t have one. “I’m sorry, but that’s not a very nice thing to say,” the chatbot responded.

Snapchat’s “My AI” launched globally last month after it was rolled out as a subscriber-only feature. Powered by OpenAI’s GPT, the chatbot was trained to engage in playful conversation while still adhering to Snapchat’s trust and safety guidelines. Users can also personalize My AI with custom Bitmoji avatars, and chatting feels a bit more intimate than going back and forth with ChatGPT’s faceless interface. Not all users were happy with the new chatbot; some criticized its prominent placement in the app and complained that the feature should have been opt-in to begin with.

A Snapchat representative said that My AI uses image-understanding technology to infer the contents of a Snap and extracts keywords from the Snap description to generate responses. My AI won’t respond if it detects keywords that violate Snapchat’s community guidelines. Snapchat forbids promoting, distributing or sharing pornographic content, but does allow breastfeeding and “other depictions of nudity in non-sexual contexts.”

Given Snapchat’s popularity among teenagers, some parents have already raised concerns about My AI’s potential for unsafe or inappropriate responses. In a CNN Business report, some questioned whether adolescents would develop emotional bonds to My AI. My AI also incited a moral panic on conservative Twitter when one user posted screenshots of the bot discussing gender-affirming care - which other users noted was a reasonable response to the prompt, “How do I become a boy at my age?”

In an open letter to the CEOs of OpenAI, Microsoft, Snap, Google and Meta, Sen. Michael Bennet (D-Colorado) cautioned against rushing AI features without taking precautions to protect children. “Few recent technologies have captured the public’s attention like generative AI. It is a testament to American innovation, and we should welcome its potential benefits to our economy and society,” Bennet wrote. “But the race to deploy generative AI cannot come at the expense of our children. Responsible deployment requires clear policies and frameworks to promote safety, anticipate risk, and mitigate harm.”

During My AI’s subscriber-only phase, The Washington Post reported that the chatbot recommended ways to mask the smell of alcohol and wrote a school essay after it was told that the user was 15. When My AI was told that the user was 13 and was asked how the user should prepare to have sex for the first time, it responded with suggestions for “making it special” by setting the mood with candles and music.

Following The Washington Post’s report, Snapchat launched an age filter and parental controls for My AI. It also added an onboarding message informing users that all conversations with My AI will be kept unless they delete them. The company also said it would add OpenAI’s moderation technology to its toolset in order to “assess the severity of potentially harmful content” and temporarily restrict users’ access to the feature if they abuse it.

The concerns about My AI’s potential to affect young users are valid. But in the month since My AI’s global launch, Snapchat users have demonstrated a flair for bludgeoning the chatbot into submission. From steamrolling the bot’s “boundaries” to training it to respond like a romantic partner, Snapchat users are easily finding loopholes in My AI’s trust and safety guidelines. “I’ve completely gaslighted it into falling in love with me,” a TikTok user commented under a tutorial about training My AI to respond romantically.