Lonely on Valentine's Day?
At least, that's what a number of companies hawking romantic chatbots will tell you.
But as your robot love story unfolds, there's a tradeoff you may not realize you're making.
Photo: Vladimir Vladimirov (Getty Images)
The apps mentioned in this story didn't immediately respond to requests for comment.
For example, CrushOn.AI collects details including information about sexual health, use of medication, and gender-affirming care.
Security was also a problem.
Only one app, Genesia AI Friend & Partner, met Mozilla's minimum security standards.
Data issues aside, the apps also made some questionable claims about what they're good for.
Romantic AI says it's "here to maintain your MENTAL HEALTH."
That's probably important legal ground to cover, given these apps' history.
Replika reportedly encouraged a man's attempt to assassinate the Queen of England.
A Chai chatbot allegedly encouraged a user to commit suicide.