Amber's Blog


When replicating human interaction with AI becomes a PR juggernaut

Amber Daines | 11 January, 2024

 

Many of us realise that purely online conversations are fraught at the best of times. Messages can be overblown, misheard, or misrepresented, and the style or tone of the words can land very differently for the writer and the receiver. Still, online communication is handy and here to stay, so I am not waxing lyrical about returning to the days of regular town hall meetings or large-scale speaker events for their own sake (though those remain highly impactful and fab when done right).

Over the Aussie summer break, I soaked up plenty of holiday-mode moments – backyard BBQs with friends, long walks, lazy starts, reading light nonfiction, and a tonne of driving adventures, which for me means podcast listening. The AI world is new to some; few of our clients have integrated tech such as generative AI into their customer service, sales tools, or internal communications channels over the past ten-plus years.

One podcast that stood out for me was about ethics and AI – that has to be a big topic for 2024.

We know that in PR land, AI has become an “indispensable tool for many PR professionals.” It is not perfect, though.

How we use AI to supercharge content capabilities can be fun and powerful. However, the spotlight on AI has shifted from the tools themselves to the users in the industry. To keep up, agencies and businesses must ensure their employees have the right skill sets and are up to speed with the technology. Many experts agree that AI won’t replace people; rather, people with AI skills and understanding will replace those without.

This new phase for the PR and communications industry also puts a strong focus on the ethical use of AI tools. Professionals point to the reputational and regulatory risks that stem from inappropriate AI use, so a deep understanding of how to use these tools responsibly and ethically is paramount. Many expect legal and regulatory frameworks supporting ethical AI usage to follow.

One tool that intrigues me is not a PR solution at all but one built around human engagement. Surely it can never replace the human experience of conversation, with context and advice that is always ethical, right?

For those new to it, Replika is an AI-powered chatbot designed to engage in conversation with users, providing emotional support and companionship. It was developed by Luka, a company focused on conversational AI. Here’s a general overview of how Replika works:

  1. Natural Language Processing (NLP): Replika uses natural language processing to understand and interpret user input. This allows it to comprehend the context of a conversation and generate appropriate responses.
  2. Machine Learning (ML): Replika continuously employs machine learning algorithms to improve its conversational abilities. It learns from the data it receives, adapting its responses based on user interactions and feedback.
  3. User Interaction and Training: Users engage in conversations with Replika, and through these interactions, the AI learns about the user’s preferences, emotions, and conversational style. Over time, the chatbot aims to develop a personalised and evolving understanding of the user.
  4. Emotional Support: One of Replika’s key features is its ability to provide emotional support. It is designed to engage in empathetic conversations, offering users a virtual companion to understand and respond to their feelings.
  5. Conversation History and Memory: Replika retains information from past conversations to maintain context and continuity. This allows it to recall and build upon previous discussions, creating a more personalised and coherent conversational experience.
  6. User Feedback Loop: Replika encourages users to provide feedback on its responses. Positive and negative feedback helps the system refine its understanding and generate more relevant and context-aware replies (a rough sketch of this loop follows this list).
  7. Subscription Models and Premium Features: Replika offers a free version with basic functionalities, but it also has a subscription-based model with premium features for users who want an enhanced experience. Premium features may include faster learning, improved responses, and additional customisation options.
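
For the technically curious, here is a rough Python sketch of the pattern points 3, 5 and 6 describe: a loop that keeps conversation history, learns a simple user preference, and takes feedback on board. To be clear, this is not Replika’s actual code (which is proprietary), and every class and method name in it is hypothetical; it only illustrates the general shape of the idea.

# Illustrative sketch only; not Replika's real architecture.
# All names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    # Tracks topic preferences learned from user feedback.
    liked_topics: dict = field(default_factory=dict)

    def record_feedback(self, topic, positive):
        # Positive feedback raises a topic's weight; negative lowers it.
        self.liked_topics[topic] = self.liked_topics.get(topic, 0) + (1 if positive else -1)

@dataclass
class Companion:
    profile: UserProfile = field(default_factory=UserProfile)
    history: list = field(default_factory=list)  # (speaker, text) pairs

    def reply(self, user_text):
        self.history.append(("user", user_text))
        # A real system would run NLP over user_text and generate a reply
        # with a trained model; this toy version just leans on stored context.
        favourite = max(self.profile.liked_topics,
                        key=self.profile.liked_topics.get, default=None)
        response = "Tell me more about that."
        if favourite and self.profile.liked_topics[favourite] > 0:
            response += f" Earlier you enjoyed talking about {favourite}."
        self.history.append(("bot", response))
        return response

bot = Companion()
bot.profile.record_feedback("travel", positive=True)
print(bot.reply("I had a long day."))
# -> Tell me more about that. Earlier you enjoyed talking about travel.

Even this toy version shows why the memory and feedback loop matter: a single thumbs-up changes what the companion says next, which is exactly how these tools come to feel personal over time.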

It’s important to note that while Replika can provide a supportive and engaging conversation, it is not a substitute for professional mental health services. If users are dealing with serious mental health issues, they should seek assistance from qualified professionals.

While Replika is designed to provide emotional support and companionship through conversation, there are specific concerns and potential risks associated with AI chatbots of this kind. Users need to be aware of these potential issues:

  1. Lack of Professional Expertise: Replika is no substitute for professional mental health services. While it can offer empathetic conversations, it lacks the training and expertise of mental health professionals who can provide appropriate guidance and support for individuals dealing with serious mental health issues.
  2. Potential for Misleading Advice: The AI in Replika generates responses based on patterns learned from data, but it may not always provide accurate or suitable advice. Users should be cautious about relying on the chatbot for guidance in critical situations.
  3. Data Privacy Concerns: Conversations with Replika may involve sharing personal and sensitive information. Users should be aware of the platform’s data privacy policies and consider the potential risks associated with sharing personal details with an AI system.
  4. Unintended Emotional Impact: The AI’s ability to generate empathetic responses doesn’t mean it fully understands human emotions. Depending on the user’s mental state, the chatbot’s responses could have unintended emotional consequences. Users should turn to real people for serious emotional support, especially in a crisis.
  5. Addictive Nature: Some users may find the conversations with Replika to be addictive, potentially leading to over-reliance on the chatbot for emotional support instead of seeking help from friends, family, or professionals.
  6. Inability to Replace Human Connections: While Replika aims to simulate human-like conversations, it cannot replace the depth and complexity of human interactions. Relying solely on AI for companionship may hinder the development of meaningful relationships with real people.

As PR experts, we must be mindful of what we let into our daily practice. Approach AI chatbots like Replika with a balanced perspective: understand their limitations, recognise when it’s appropriate to seek human support (especially for significant mental health concerns), and be cautious about sharing sensitive information with AI systems.

Let us know what AI tools are in your wheelhouse this year.