To understand the internet, you have to assume that everyone has a constant, low-level amount of horniness going on. I do not make this observation in a crass or judgmental way. My point is that if you’re wondering what technologies will take hold in the future, the first place you want to check is the adult entertainment industry.

This is why a recent quote from an AI founder caught my eye. In an interview with Wired, Avi Schiffmann pitched his new startup Friend, which sells a Bluetooth-connected pendant that listens to your conversations—not only with Friend but with everyone you encounter. Most of the company’s marketing materials show the device building an emotional connection with the user. “No, I think you were vulnerable. There’s a difference,” one said. (I’ve been chatting with Schiffmann for the last year and will review a demo unit soon for Every.)

It’s not specifically designed to be a sexual companion, but he knows some people will use it that way. “For sure there will be some people that try and fuck the USB-C port of this,” he told Wired. “I think I'm shameless enough to understand what I'm building.”

It’s startling to read Schiffmann’s assumption—but it’s almost certainly accurate. Pornographic chatbots are some of the most popular and profitable AI applications in use today. Mainstream chatbots such as ChatGPT and Claude won’t engage in sexual chatter, so a cottage industry has sprouted up to serve the good people of the internet. Janitor AI attracted a million users in the first week after it launched. Lest you think that this is simply a problem of lonely men, 80 percent of Janitor’s users identify as non-male, and most of its most popular chatbots are male characters. Even the subreddit for the company has nearly 60,000 users! Others, like Muah.AI, report hitting $1 million in annualized revenue within months of launching.
I was able to track down eight different services nearing similar levels of scale, with names like Chai, Sakura, and SpicyChat. This is a Cambrian explosion of growth that reminds me of the early days of the Apple App Store. Each app offers not just smutty text, but the ability to design a character’s physical appearance and persona, and to generate images of them doing things you ask them to. All told, my back-of-the-napkin math suggests there are about 5-10 million monthly active users of these services today—something far larger than I think anyone has realized.

An even larger market is adult emotional entertainment. These companies aren’t porn-y, but offer some variation on emotional fulfillment, allowing people to role-play companionship. Character.AI has 3.5 million daily active users, each of whom spends two hours a day chatting on average. It facilitates conversations with fictional characters like Iron Man but also lets users role-play with a “school bully.” To give a sense of scale, the school bully character has had 125 million conversations with users. I don’t know of any other consumer company that has had this level of growth and usage outside of early social media. Meta itself offers the ability to design chatbots with an AI studio where you can find psychics, popular anime characters, and even your new “gay bestie.” Each of these chatbots has public stats showing it has had hundreds of thousands of conversations with users.

The most common reaction to these statistics is a general sense of revulsion. It feels kinda weird that people are building emotional—and even sexual—relationships with algorithms. Plus, there’s an even more worrying underbelly: many of the seedier parts of the internet have forum discussions on how to use these bots to role-play with “under-18” characters, which feels like it should be illegal.

Instead, today, I am going to ask you to put that feeling aside.
I am going to ask you to focus on the users—the multiple millions of them, many of whom are highly active, seemingly replacing parts of their lives or other products, and willing to pay. Remember: where goes porn, so goes the world.

Lonely, I’m so lonely

Roughly 60 percent of Americans report feeling lonely on a regular basis. This number has been rising steadily since the 1970s, with all sorts of follow-on effects like decreasing trust in community, faith in neighbors, and happiness. A similar dynamic plays out in sexual relations. We are at the lowest rate of sex in American history: in 2021, 26 percent of Americans didn’t have sex at all.

It is important to note that all these effects began 20 years before the internet and 35 years before the smartphone. It is easy to blame Big Tech, video games, or the technology of your choosing, but the loneliness epidemic cannot be laid solely at the feet of Silicon Valley or Hollywood. It is a problem with dozens, if not hundreds, of causes, ranging from the collapse of religion to rising housing costs.

Surprisingly, chatbots appear to be a net-positive intervention for people’s feelings of loneliness. A Harvard Business School working paper published on July 26 found, in a meta-analysis, that chatbots were effective at resolving these feelings. And the most expensive of these services top out at around $40 a month, far cheaper than meeting friends at a bar or social club. However, the longest study measured those feelings over only a week. My intuition says that using these chatbots to solve loneliness is the equivalent of using a GLP-1 drug to solve weight issues: sure, you get the outcome, but without the underlying lifestyle changes you are ultimately reliant on a product with unknown long-term effects.

During my research for this piece, I spent hours reading Reddit forums and chatting with people in various Discord servers, trying to get a sense of how users felt about these services.
There is only one word: intense. I’ve never encountered these depths of attachment to a technology before. Users reported chatting with AI companions for 10 hours a day. Whenever a startup altered the algorithm, users publicly mourned because they felt the company had “killed” their friends. Even subtle changes in personality or memory are noticeable when you have ten hours of conversation a day. On Character.AI, additional filters meant to reduce NSFW activity triggered a mass rebellion, with users demanding the unfiltered versions back. A common pattern among users of chatbots meant exclusively for emotional companionship was trying to get around content filters and have virtual sex with them anyway. Comments saying, “I prefer talking to [my AI-created character] over talking to people I’m attracted to,” were incredibly common. One comment on the Character.AI subreddit summarized it nicely: “I don't care how flawed their policies get, I will still prefer [Character.AI] over interacting with real people.”

It is tempting to blame these AI services, but remember: loneliness and sexlessness existed before these services did. AI chatbots may exacerbate the problem, but they have not caused it. People forming bonds with artificial entities is old news. Dating simulators have been among the most popular genres of video games in Japan since the '80s.

What strikes me as new with generative AI services is that these products actually talk back. They are malleable, adaptable objects that can respond to how a user behaves to a far greater degree than was previously possible. The level of immersion is beyond anything previous services could offer, to the point where in June of last year—when the models were far worse than they are now—32 percent of people couldn’t tell when they were talking with a chatbot. And with ChatGPT’s new voice mode, the audio is essentially indistinguishable from most phone calls.
And in contrast to previous products that tried to provide sexual gratification or alleviate loneliness, generative AI blurs the lines between creation and consumption. Using the product requires conversation and requests, some suspension of disbelief, and real effort from the user. While most previous NSFW products are passive affairs that just involve scrolling or selecting, chatbots as companions are more work and, seemingly, far more gratifying.

What this means for the future

From my research around chatbots, I had three primary takeaways:

Chatbots provide a new, incredibly addictive form of entertainment: The creation-and-consumption dynamic is a new lens through which I am still parsing generative AI. Take, for example, music-generation apps like Suno, where consumers can prompt their way into creating the perfect song. It simultaneously makes music easier to create and harder to consume—because you first have to generate the work. Most users will just stream existing Suno songs, but there is a more hands-on form of consumption available. Similarly, these chatbots have the potential to be more emotionally gratifying than human companionship because they are guaranteed to respond. Again, this isn’t sci-fi. This is happening for millions of people around the planet today. Imagine a chatbot product like this not just for sex or companionship, but for every aspect of how we spend our time.

We are still grappling with what it means to purchase emotions: Injecting the constraints of capitalism into emotions, into companionship, feels like it diminishes the majesty of the human experience. Companionship as a product moves us deeper into the objectification that social media and dating apps already encourage. Worse, the only capital these chatbots require is fiscal, not emotional. They exist solely to please and will completely conform themselves to your whims.
This is in contrast to a real relationship, which pushes and stretches you with its demands. Social media platforms like TikTok already give us a steady dopamine drip from entertainment. How much more addictive will services be that deliver the dopamine drip of love or infatuation?

The attention economy still reigns supreme: Despite all this, there is no putting the genie back in the bottle. With the release of Meta’s new open-source models, anyone can build an at least passable chatbot. An open-source LLM removes the majority of the capital constraints and forces all chatbots to compete on the basis of time spent by their users. Perhaps the theory I am most proud of creating can explain what happens next. Double-bind theory argues that attention aggregators have no choice but to allow content that pushes the bounds of social acceptability; otherwise, engagement moves to another platform. However, it's a double bind: the more permissive your community content guidelines, the more reluctant advertisers will be to advertise on your platform (as Elon Musk is learning with X). I expect a similar dynamic to play out with chatbots—there will be more subscription-based models because the content is so hard to control and will so frequently push the boundaries of social acceptability.

It is still too early to say whether these products are net good or net bad. One of the principles of Every is that we try to be supportive of startups, and as such, I am unwilling to cast wholesale judgment on the category. There are real uses for AI chatbots. Not only can they make lonely people feel like they have someone to talk to, or bond intimately with, but they might even help those with disabilities if deployed thoughtfully. An ideal progression is an AI bot that teaches you the emotional skills of talking with someone, skills you then deploy in real life. We aren’t there yet, but it is a possibility for the future.
Whatever the root cause, loneliness is a real problem, and chatbots may offer a temporary salve. Still, the fact that so much of my research pointed to signs of user addiction and people preferring chatbots to humans is cause for concern. A world where humans depend on AI for emotional connection is probably not a better one.