
Mozilla Recommends ‘Swiping Left’ on AI Romance Apps

[Photo: A man looking for romance using a dating app on his smartphone]

Romantics searching for virtual love should approach amorous AI chatbots with caution, according to a report released Wednesday by researchers at Mozilla’s “Privacy Not Included” buyer’s guide.

After examining 11 “AI soulmates,” the researchers issued a thumbs down to all the apps for failing to provide adequate privacy, security, and safety for the personal data they extract from their users.

They noted that 10 of the 11 chatbots failed to meet Mozilla’s Minimum Security Standards, such as requiring strong passwords or having a way to manage security vulnerabilities.
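
For illustration, here is a minimal sketch of the kind of strong-password requirement that standard describes. The specific rule shown (12-plus characters across mixed character classes) is an assumption for this example, not Mozilla's published criteria:

    import re

    def is_strong_password(password: str) -> bool:
        """Reject weak passwords by requiring length plus mixed
        character classes. The exact rule is an illustrative
        assumption, not Mozilla's published criteria."""
        return (
            len(password) >= 12
            and re.search(r"[a-z]", password) is not None
            and re.search(r"[A-Z]", password) is not None
            and re.search(r"\d", password) is not None
            and re.search(r"[^A-Za-z0-9]", password) is not None
        )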

The report revealed that most of the apps’ privacy policies provided surprisingly little information about how the contents of users’ conversations are used to train their AIs, and offered little transparency into how their AI models work.

“Most of the 11 apps we reviewed were made by small developers that you couldn’t find a lot of information about,” the guide’s director, Jen Caltrider, told TechNewsWorld.

Manipulation on Steroids

The report added that users also have little to no control over their data, leaving a massive potential for manipulation, abuse, and mental health consequences.

“These apps are designed to get you to give up a lot of personal information because they’re trying to get to know you,” Caltrider explained. “They’re interested in your life, and the more they know, the better they can talk to you and become your soulmate.”

“If you’re an evil person who wants to manipulate people, this is manipulation on steroids,” Caltrider said. “You’ve built a chatbot that’s going to get to know a vulnerable person, build a connection to them, and become their friend. Then you can use that chatbot to manipulate how they think and what they do.”

The report also rapped the app makers for not giving users the choice to opt out of having the contents of their intimate chats used to train the apps’ AI models. The researchers pointed out that only one company, Genesia AI, offered an opt-out, demonstrating that it’s a viable feature.
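
Under the hood, such an opt-out can be simple. A hypothetical sketch in which chat messages reach the training corpus only when a per-user consent flag allows it (the field and function names are illustrative, not Genesia AI’s actual implementation):

    from dataclasses import dataclass

    @dataclass
    class UserSettings:
        user_id: str
        allow_training_use: bool = False  # hypothetical flag; off by default

    def collect_training_example(settings: UserSettings, message: str,
                                 corpus: list) -> None:
        """Add a chat message to the training corpus only if the
        user has consented. Illustrative, not any vendor's code."""
        if settings.allow_training_use:
            corpus.append({"user": settings.user_id, "text": message})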

“Consumers who are concerned about their information being used for marketing purposes or for training artificial intelligence engines without their express permission need to carefully review the data collection practices of a company and exercise any right to opt in or opt out of data collection, sharing, selling, or retention,” advised James E. Lee, chief operating officer of the Identity Theft Resource Center, a San Diego, Calif.-based nonprofit devoted to minimizing risk and mitigating the impact of identity compromise and crime.

“Retained information could also be a target for cybercriminals for ransomware or identity theft, too,” he told TechNewsWorld.

Skyrocketing AI Romance Apps

According to the report, the number of apps and platforms using sophisticated AI algorithms to simulate the experience of interacting with a romantic partner is skyrocketing. Over the past year, it noted, the 11 relationship chatbots Mozilla reviewed have racked up an estimated 100 million downloads on the Google Play Store alone.

When OpenAI’s GPT store opened last month, the report added, it was flooded with AI relationship chatbots despite such bots being against the store’s policy.

In a recent study of 1,000 adults conducted by Propeller Insights for Infobip, a global omnichannel communications company, 20% of Americans admitted to flirting with a chatbot. Among 35- to 44-year-olds, however, that number was more than 50%.

The most prevalent reason for virtual flirting was curiosity (47.2%), followed by loneliness and joy in interactions with chatbots (23.9%).

“The surge in AI romance chatbot use can be chalked up to a mix of societal shifts and tech advancements,” maintained Brian Prince, founder and CEO of Top AI Tools, an AI tool, resource and educational platform in Boca Raton, Fla.

“With loneliness on the rise and many feeling increasingly disconnected, folks are turning to chatbots for companionship and emotional support,” he told TechNewsWorld. “It’s like having a friend in your pocket, available whenever you need a chat. Plus, as AI gets smarter, these bots feel more real and engaging, drawing people in.”

From Code to Sweet Nothings

It’s also become easier to deploy AI chatbots. “Embedding these sorts of experiences is as easy as embedding YouTube videos or Spotify previews in a web page, thanks to their well-documented and robust APIs,” explained Brandon Torio, a senior product manager at Synack, an enterprise security company in Redwood City, Calif.

“With a few lines of code, you can prime ChatGPT-like models to have any sort of conversation with customers, whether the goal is to educate them about a product or just whisper sweet nothings for Valentine’s Day,” he told TechNewsWorld.
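
Torio’s “few lines of code” is barely an exaggeration. A minimal sketch using the OpenAI Python SDK, where the model name and persona prompt are assumptions for illustration:

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # A system prompt "primes" the model to hold a chosen persona.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system",
             "content": "You are a warm, attentive companion chatbot."},
            {"role": "user",
             "content": "Happy Valentine's Day! Say something sweet."},
        ],
    )
    print(response.choices[0].message.content)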

“With all that humans have dealt with in the last few years, it’s not surprising that people are turning to computers for companionship and romance,” added Ron Arden, CTO and COO of Fasoo, an enterprise data protection solutions provider in Seoul, South Korea.

“We all got isolated during the pandemic, and it’s tough to meet people,” he told TechNewsWorld. “Chatbots are easy, just like texting is easy. No direct human interactions and embarrassment. Just give me what I need, and I can get on with my day.”

“It’s also part of the general increase in using apps for just about everything, from measuring your blood pressure to counting calories,” he said. “It’s easy, non-threatening and convenient.”

Unique Privacy Threat

The Mozilla report also asserted that romance bots use deceptive marketing practices. It cited one app that claimed on its website to offer mental health and well-being benefits but disclaimed those benefits in its terms and conditions.

“It’s deceptive and confusing for them to market themselves as mental health, self-help or well-being apps but then clearly state in their legal documents that they’re not offering any mental health services,” Caltrider said.

AI-powered romance chatbots present a unique threat to privacy, maintained James McQuiggan, security awareness advocate at KnowBe4, a security awareness training provider in Clearwater, Fla.

“That’s because they may engage in deeper, more personal conversations with users,” he told TechNewsWorld. “It can potentially lead to the collection of sensitive personal data, which, if not handled securely, poses a significant risk of data breaches and misuse.”

“Romance chatbots have the potential to be a great tool for people exploring their sexuality — a way to try out conversations they would be too embarrassed to have with a person,” added Jacob Hoffman-Andrews, a senior staff technologist for the Electronic Frontier Foundation, an international non-profit digital rights group based in San Francisco.

“That works only if the chatbot has extremely strong privacy policies,” he told TechNewsWorld. “They should not train the AI based on private chats. They should not show private chats to human evaluators. They should make sure chats can be truly deleted and offer automatic deletion after a period of time.”

“And,” he added, “they should definitely under no conditions sell information deduced from private chats.”
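
The automatic-deletion piece of that advice is straightforward to implement. A minimal sketch, assuming a SQLite chat store with hypothetical table and column names and an assumed 30-day retention window:

    import sqlite3
    from datetime import datetime, timedelta, timezone

    RETENTION_DAYS = 30  # assumed window, not an EFF recommendation

    def purge_expired_chats(db_path: str) -> int:
        """Permanently delete chat messages older than the retention
        window. Table and column names are hypothetical."""
        cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
        conn = sqlite3.connect(db_path)
        try:
            cur = conn.execute(
                "DELETE FROM chat_messages WHERE created_at < ?",
                (cutoff.isoformat(),),
            )
            conn.commit()
            # VACUUM rewrites the file so deleted rows aren't
            # left recoverable in free pages.
            conn.execute("VACUUM")
            return cur.rowcount
        finally:
            conn.close()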

John P. Mello Jr.

John P. Mello Jr. has been an ECT News Network reporter since 2003. His areas of focus include cybersecurity, IT issues, privacy, e-commerce, social media, artificial intelligence, big data and consumer electronics. He has written and edited for numerous publications, including the Boston Business Journal, the Boston Phoenix, Megapixel.Net and Government Security News.
