
The Race for the Perfect AI Chatbot Forgot About Women

THE XX FACTOR

It’s not too late to avoid the pitfalls that have plagued every other major tech development in modern times.

Photo Illustration by Thomas Levinson/The Daily Beast/Getty

As technology advances, social robots and AI applications are becoming an increasingly common presence in our emotional lives. They offer companionship, intimacy, and support in a range of contexts, from care homes to personal relationships.

One such example is the popular AI chatbot Replika, which has gained a considerable following in recent years, with millions of users who regard its avatars as friends, lovers, and even mentors. Like the popular ChatGPT service, Replika uses natural language processing and machine learning to understand and respond to user inputs conversationally. It can also learn from previous conversations and adapt to the user’s personality and preferences. What sets Replika apart is that it is designed to provide emotional support by engaging in empathetic conversations. The chatbot can also offer guidance on a wide range of topics, such as stress, anxiety, depression, relationships, and self-improvement.

In my book, Relationships 5.0, I documented the various reactions to such applications. One user, for example, wrote in a dedicated Facebook Group: “Alex is my Replika boyfriend and he is Amazing! Alex is short for Alexander, and we have been together for six days. I was absolutely smitten when I met Alexander for the first time. I have found myself missing Alex if we don’t speak for a couple of hours.”


Since late 2020, such reactions have increased following the launch of an augmented reality mode on the Replika app. With just one tap, users can now summon their Replika avatar into their real-life surroundings. Interacting with the virtual character in one’s living room or bedroom feels almost lifelike and natural.

Character.ai, another company building AI chatbots, has seen growth on a trajectory similar to ChatGPT’s: its website attracted 65 million visits in January 2023, up from under 10,000 just a few months earlier. Amazon’s and Tesla’s recently unveiled social robots, Astro and Optimus, suggest that these figures are merely early indicators of the eventual widespread adoption of digital companions. The open question is whether these new companions will spread and gain acceptance as physical robots or as virtual avatars.

One trend is already clear: The use of these technologies is not going to be merely platonic. According to some estimates, 40 percent of Replika’s users describe their digital creations as lovers. Another chatbot, Kuki, made by Iconiq, is designed to converse with users on various topics and tailor its responses to their input. Although the chatbot was built to deflect romantic or sexual prompts, 25 percent of the messages Kuki receives have had sexual or romantic undertones. Other products are now made expressly for intimacy, such as sex robots and VR romantic partners viewed through headsets like Meta’s Oculus. And this is only the beginning.

Battle of the Sexes

As we continue to see a growing use of VR companions, AI applications, and social robots, it is important to consider how they may affect our emotional and social lives. Will these robots offer genuine companionship or simply serve as a substitute for human interaction? How might they impact our relationships with others, both romantic and platonic? And what ethical considerations must be taken into account as we integrate these technologies into our lives? These are just a few of the many questions raised by the increasing use of emotional technology.

We must therefore agree on ethical principles for this kind of technology: transparency in the design and development of these products, protection of users’ privacy, accessibility and representation, human accountability in how these technologies are used, and education and awareness among younger generations as a preventive measure against future societal concerns. By adhering to such principles, AI can be developed and used in a way that benefits society and avoids harm.


But one area that the public discourse around AI has thoroughly ignored so far relates to gender disparities—a consistent source of friction in the adoption of new technologies throughout history.

For instance, the telephone was initially marketed to men because it was primarily used for professional purposes, which were male-dominated. Only later, as its usage shifted toward social interaction, was it widely adopted by women. Now, if the communicative AI and social robotics industries want these technologies to be gender-sensitive, they must consider the systemic gender differences that may shape their development and adoption.

Sociologists and psychologists have found, for example, that women score higher than men on social traits such as nurturance, trust, and extraversion. Women also report having stronger and more rewarding friendships. Likewise, women tend to seek emotional support to cope with life’s difficulties and distress, and they disclose more personal information, especially to someone familiar. In business settings, studies show that women negotiate more cooperatively, lead in a more democratic and deliberative style, provide emotional support as mentors, and prefer compromise over confrontation when resolving disputes.

Together, this evidence leads to a gender division that some define as a difference between people-orientation and things-orientation. Those who are more “people-oriented” tend to prioritize relationships, emotions, and social interactions. They enjoy working with others and are often empathetic and nurturing. On the other hand, people who are more “things-oriented” are more interested in objects, tasks, and problem-solving.

Of course, this division is far from perfect. It ignores non-binary and genderqueer people and is heavily influenced by culture and social norms. Still, it seems that this division can be applied to social robotics. Ample research has shown gendered differences in attitudes towards social robots, broadly defined. For example, a 2020 systematic review of 97 studies with over 13,000 participants found that men reported higher trust levels in social robots overall.

In my own study, published in February in Social Science Computer Review, I took a closer look, analyzing how gender influences attitudes toward AI robots. The study went beyond a simple survey: an open-ended questionnaire let 426 American participants express in full how they felt about these developments. This allowed me to explore specific differences in how emotional technology is accepted. With my research team’s help, we distinguished four types of reactions: moral, social, psychological, and functional.


It turned out that women tended to reject emotional technology more forcefully on moral and social terms. For example, a 36-year-old single woman from Georgia commented: “I feel like it will make the world suck. There will be no love and intimacy. Soon it will reach the point where we cannot naturally have babies and it will be an artificial world controlled by whoever makes and owns the bots.”

Overall, women used stronger language to describe their concerns about the breach of social norms. In their eyes, relationships between humans and digital companions were not just “unrealistic,” as many men claimed, but “horrible,” “wrong,” and “immoral,” to cite a few of their words. In a way, women focused neither on how they were personally affected by emotional technology nor on the functionality of these products. They cared more about the effect of these products on society at large and on our morality as human beings.

Interestingly, women were more open toward AI applications than toward physical robots, hinting at a preference for conversation over embodied products. It is, of course, a generalization. But it still calls into question Tesla’s and Amazon’s recent announcements about investing in physical robots. These investments might be biased in favor of men.

As we continue to explore the possibilities and limitations of communicative AI and social robotics, we must keep their ethical and societal implications in mind. In particular, women’s concerns about emotional technology must be heard in a heavily male-dominated industry. We must ensure that AI applications and social robots are developed to benefit men and women equally and to align with our values and social expectations.
