On Valentine’s Day 2024, Mozilla came out with a piece critical of AI chatbots titled “Creepy.exe: Mozilla Urges Public to Swipe Left on Romantic AI Chatbots Due to Major Privacy Red Flags.”
But before they found red flags, back in 2019, Mozilla promoted a workshop built around a creepy, rainbow-washed chatbot ecosystem in which people who identified as "queer" were required to bare their most intimate sexual thoughts.
From the post:
your… interactions will be recorded… you will occasionally be prompted with random survey questions
What kinds of questions did they randomly ask the people who would "queer the AI"? Creepy stuff like:
Have you ever sexted with a stranger?
Have you ever sexted with a machine?
Do you remember the first time you were aroused by language?
Do you think an artificial intelligence could help fulfill some of these… needs?
The workshop providers guided participants toward establishing an intimate, sexual connection with the chatbots they would create:
How might we build trust with an AI?
How might we give it its own sense of desire?
Even the consenting participants in the workshop complained about the AI’s creep factor:
it feels like the A.I. is gas-lighting you. Seems like a noncommittal sexting bot. It should at least be clear about what it’s trying to do.
The startup that Mozilla fostered for this panel ended up crashing and burning, but its creepier, worse brethren live on inside Firefox 130, displayed as first-class choices in Mozilla's chatbot options. I just thought it would be fun to take a trip down memory lane and see how many creepy red flags AI companies could wave within Mozilla's view without ever raising its concern.
To be uncomfortably honest, VR only covers sight and the AI is like the part of porn that everybody skips. Since you’re not making a real connection with someone it feels meaningless. AI tries to give you exactly what you want, but that’s not an authentic feeling relationship with give and take. Also I feel like there’s still a tactile aspect that will be missing for a couple decades at minimum, and that should keep at least my monkey brain from becoming lost in the illusion…