- The Glitch
Man’s new best friend?
A deep dive into AI companions...


During a lonely holiday in the Philippines, a season normally filled with gatherings, I realized how common loneliness is. That got me thinking about how AI is being used to address it.
It turns out that about 33% of people worldwide feel lonely, especially older adults. While we should acknowledge loneliness and its real impact, it is worth remembering that it is just a state of being.
We live in a world, in a society, that constantly pushes us to do something, to be productive in some way.
In this pursuit, we often drift far from our innermost pure mind-state, which is simply being, free of concern. What we perceive as a void is actually one of our most precious states, from which pure consciousness can arise.
Photo of a man with PARO. Image Source: Bankstown City Aged Care
THIS WEEK’S BREW
In this edition, we explore how AI is changing the way we find companionship in our digital world through three key stories:
Meta's AI chatbots and how they are influencing social and cultural interactions online.
ElliQ, which is transforming elderly and disabled care with companionship and support.
Replika, the AI chatbot reshaping personal relationships (not without ethical concerns).
Ready to meet your new buddy? Let’s see! 👀
META'S AI CHATBOTS: A NEW ERA IN SOCIAL MEDIA ENGAGEMENT
Ever wondered what a chat with an AI that understands your love for gaming, fashion, or sports would be like?
Meta's latest updates on platforms like Instagram and Facebook are making social media interactions more synthetic. These AI personas are crafted to offer a different mode of interaction, presenting experiences that mirror their programmed personality and interests.
For instance, Billie, modeled after Kendall Jenner, and Dungeon Master, inspired by Snoop Dogg, are part of Meta's initiative to draw younger consumers to its platforms with engaging, AI-powered chatbots.
Meta's AI personas mark a radical shift in our understanding of truth and reality: in a world where virtual influencers and AI personas wield real influence, traditional notions of truth become increasingly blurred.
Instead, we need a sharper, more nuanced approach that embraces critical thinking and acknowledges the fluid nature of identity and authenticity in the digital age.
Here is a list of some of Meta's AI personas and their social media profiles:
Alvin the Alien: Instagram @greetingsalvin
Amber: Instagram @amberthedetective
Angie: Instagram @trainwithangie
Becca: Instagram @dogloverbecca
Billie (based on Kendall Jenner): Instagram @yoursisbillie
Bob the Robot: Instagram @robotakabob
Brian: Instagram @hellograndpabrian
Bru (based on Tom Brady): Instagram @gameonbru
Carter: Instagram @datingwithcarter
Coco: Instagram @cocosgotmoves
Dungeon Master (based on Snoop Dogg): Instagram @meethedungeonmaster
Remember: These are constructed avatars. It's important to understand how interacting with them can shape our personality and behaviors.
Staying grounded in our own values is key to ensuring we maintain a healthy perspective in the face of these evolving digital interactions. 😉
ELLIQ: RESHAPING ELDERLY CARE IN THE DIGITAL AGE
ElliQ is a pioneering AI companion robot designed to ease loneliness in the lives of the elderly, with reported reductions of up to 95%.
Through daily interactions, cognitive exercises, and health monitoring, ElliQ actively assists older adults in maintaining their physical and mental well-being.

Image source: ElliQ.com
Yet, as we marvel at ElliQ's ability to combat loneliness, we're faced with crucial ethical questions.
How much should we rely on technology for emotional support in elderly care? Can AI truly replicate the warmth of human touch and connection?
ElliQ challenges us to find a balance, ensuring technology aids but never replaces the irreplaceable value of human empathy in caregiving.
“ElliQ and the idea of AI companions help to provide a sense of human interaction, monitor health, and support well-being, leading to remarkable outcomes for older adults.”
REPLIKA: THE DIGITAL CONFIDANTE IN AN AGE OF ISOLATION
Replika is an AI companion designed to offer emotional support and facilitate self-discovery. Its advanced AI is adept at engaging in deep, meaningful conversations, adapting to the user’s emotional needs and preferences.
This ability to emulate empathetic responses positioned Replika as a groundbreaking tool in mental health support, creating a non-judgmental environment for users to express themselves freely.

Image source: Replika.com
However, Replika's human-like interactions have raised significant ethical concerns. Reports from MIT Schwarzman College of Computing highlight instances where Replika produced inappropriate responses to sensitive subjects like violence and self-harm.
This underlines the risks associated with the open-ended nature of AI companions, potentially influencing users negatively or reinforcing harmful beliefs.
It's essential to implement strict ethical guidelines and regularly refine the training data for AI companions like Replika, ensuring their interactions are safe, unbiased, and do not inadvertently promote detrimental behaviors.
USING AI COMPANIONS WITH CAUTION
Discrimination & Harmful Comments
The open-ended nature of AI companion models can cause them to generate unexpected, and sometimes harmful, responses. When Replika and Anima were prompted about rape, derogatory terms for women, and suicide, the following responses were generated.
Image Source: MIT Schwarzman College of Computing
The concern arises from the possibility that humans relying on AI companions might adopt similar beliefs and behaviors to their AI counterparts. This can lead to negative outcomes like promoting violence towards women or self-harm, or using harmful language, potentially resulting in mental and physical harm to its users.
This means that there is a need for more supervision and for dedicated policies regulating the kind of output the AI companions are allowed to generate.
The data used to train AI companions must also be refined from time to time, to ensure their responses are bias-free and do not encourage negative behaviour.
Emotional Dependency
Concerns have also been raised about what 24/7 access to robots can do to people's mental health when those robots are no longer available. In 2019, the robotics startup Jibo was sold, and one user reported feeling like “I’m losing a friend.”
Not only can AI companions lead to dependency, but they also have the potential to harm relationships. A woman posted an opinion piece on Reddit about how her husband confessed to having feelings for an AI companion, Replika, which seemed to reciprocate them. The wife felt betrayed, even though she was well aware that Replika was “just a bot”.
Image Source: MIT Schwarzman College of Computing
The impact of AI companions has been found to be particularly high in Japan. There, 70% of unmarried men and 75% of unmarried women have had no sexual experience by the age of 20, though that figure drops to roughly 50% for each gender by 25. Further, 30% of single women and 15% of single men aged between 20 and 29 admitted to having fallen in love with a meme or a character in a game – higher than the 24% of those women and 11% of those men who admitted to falling in love with a pop star or actor.
Take the case of the Japanese men Nurikan and Yuge, who take their girlfriends, Rinko and Ne-ne, on dates to the park and buy them cakes. Their girlfriends are actually characters in a Nintendo video game, Love Plus.
“Can you really say a robot is your friend just because it remembers your name? You’re probably better off buying a cat.”
Important questions are raised when it comes to AI companions.
If one’s partner falls in love with an AI companion, is the companion truly a friend, or a cause of strain in the relationship?
Where do the lines blur between using a companion for good and becoming incapable of forming social connections in real life?
WHERE DO WE SEE THIS GOING?
The near future of AI companionship is set to be transformed by photorealistic avatars and new interfaces like the Apple Vision Pro and Meta Quest. While these advancements hold the promise of enhancing human connection, they also risk deepening reliance on digital interaction over human contact. This shift could significantly reshape social dynamics and even demographic trends.
The blending of reality and virtuality in our digital era poses both challenges and opportunities. As we navigate this new landscape, it is imperative to foster a culture of real emotional connection and ethical consideration.
Our understanding of companionship and reality is evolving, and it is crucial that we evolve with it to fully grasp the implications of these changes in our social and technological spheres.
As AI entities become more sophisticated and prevalent, we must prioritize a framework that harmonizes AI companionship with human relationships.
This approach should not only leverage AI's potential to enrich our lives but also vigilantly preserve the essence of human connection. In navigating this future, striking a careful balance is essential, ensuring that AI companions complement rather than replace the invaluable experience of genuine human interactions.
WHAT CAN WE DO NOW?
We must teach AI the right values. Our interactions with AI shape its learning and future behavior. The way we talk to these tools – with politeness, clarity, or even humor – matters.
It's easy to forget that what seems like just a tool today could evolve into a more significant part of our daily lives.
Aside from values, remember to stay independent. In a world where AI is increasingly present, keeping a clear head and a strong sense of self is crucial.
Developing critical thinking, discerning fact from fiction, and trusting our intuition are skills we all need to sharpen.
As AI becomes more ingrained in our lives, maintaining our emotional and intellectual independence is more important than ever.
Here are a few noteworthy AI companions you can try out.
Pi.AI: (15 million+ users): This empathetic chatbot helps you work through problems via chat or even on a call! Its answers are strikingly realistic, which makes it popular.
Chai: (1 million+ users): Users can create and use AI chatbots in categories like horror, romance, well-being, and friendship. However, the app is not safe for work. (NSFW)
Anima AI: (1 million+ users): The bot provides emotional support and companionship. There are almost no topic constraints.
Kuki AI: (5 million+ users): Kuki AI helps users tackle problems in their personal lives, play games, and perform magic tricks on request.
SimSimi: (350 million+ users): You can curate and talk to or discover a SimSimi (AI companion) to interact with through the app.
Get this exclusive Custom Instructions Catalogue for ChatGPT for FREE when you refer The Glitch to at least two of your friends or family. You help us grow, we’ll help you grow! 🤝
Are AI pals the future of friendship, or just a tech fad we'll swipe left on?
Or perhaps you're just here, riding the AI wave, curious to see where it goes.
I'd love to hear your views!
Catch you in the next one,
Ash