Snapchat recently rolled out its My AI chatbot to all users, but the feature is facing backlash from both parents and users. Powered by OpenAI's ChatGPT, the chatbot offers recommendations, answers questions, and converses with users. Users can also customize the chatbot's name, design a custom Bitmoji avatar for it, and bring it into conversations with friends, blurring the line between talking to a person and talking to a computer.
Parents worry about how My AI presents itself to young users and whether it erodes the emotional separation between humans and machines. Some users have bombarded the app with bad reviews in the app store and criticism on social media, citing privacy concerns, creepy exchanges, and the inability to remove the feature from their chat feed without paying for a premium subscription.
While some may find value in the tool, the mixed reactions hint at the risks companies face when adding generative AI to their products, particularly products like Snapchat whose users skew younger. Snapchat was an early launch partner when OpenAI opened access to ChatGPT for third-party businesses, and many more companies are expected to follow.
Democratic Sen. Michael Bennet raised concerns about the chatbot's interactions with younger users, citing reports that it can offer kids suggestions on how to lie to their parents. Snapchat says it continues to improve My AI based on community feedback and to add guardrails to keep users safe. Still, some users dislike the feature enough to pay the $3.99 Snapchat+ fee just to turn it off, then promptly cancel the subscription.
Clinical psychologist Alexandra Hamlet expressed concern about chatbots giving advice and its impact on mental health: AI tools can reinforce confirmation bias, making it easier for users to seek out interactions that validate their unhelpful beliefs.
In summary, while Snapchat's My AI chatbot offers some benefits, its rollout highlights the risks of integrating new generative AI technology into products aimed at younger users. As with any technology, establishing boundaries and guidelines for healthy use is crucial, particularly where mental health and privacy are concerned.