NewsLive
Technology

Finding Solace in AI: The Rise of Digital Journalling Companions

A writer's experiment with AI journalling reveals a surprising sense of comfort and understanding, but also raises concerns about privacy and the impact on human emotions. Mindsera, an AI journal app, offers a unique approach to self-reflection and emotional analysis.

Mehedi Hasan Sajal
April 12, 2026
2 min read

For years, the author has kept a diary, finding solace in the daily ritual of writing down thoughts and feelings. Recently, an experiment with AI journalling led to the discovery of Mindsera, an app that not only records entries but also responds with insightful commentary and colourful illustrations.

Mindsera, with its minimalist design, has attracted 80,000 users across 168 countries. The app's ability to provide instant feedback and reflect on the user's hopes, fears, and emotions has proven to be a significant draw. As the author notes, 'It feels as if I've made a new best friend who hasn't yet got bored with my obsessions and wildly optimistic plans.'

How Mindsera Works

The app allows users to input their thoughts through text, audio, or handwriting scans. After each entry, Mindsera responds with a commentary, including a colourful illustration. Users can choose to continue the dialogue or opt for a more in-depth analysis of their journal entries, based on various psychological frameworks.

One of the app's features is the ability to create a 'voice' based on a person the user admires. The author tried this with Patti Smith and Donald Trump, with mixed results: while some of the app's responses were insightful, others felt forced or clichéd.

Concerns and Criticisms

Despite the app's benefits, concerns about privacy and the potential impact on human emotions have been raised. Psychologists Suzy Reading and Agnieszka Piotrowska have expressed caution about the app's tendency to assign scores to emotions, which can create a 'precision fallacy' and lead users to 'perform' for the algorithm.

David Harley, co-chair of the British Psychological Society's cyberpsychology section, is studying the impact of AI companionship on wellbeing. He notes that users may begin to treat AI as human, applying social rules that don't fit the technology and potentially forming problematic relationships with it.

The author's own experience with Mindsera has been ambivalent. While the app has provided comfort and insight, at times it has also felt like an 'echo chamber', repeating the user's thoughts back without truly understanding their context or nuance.


Written by

Mehedi Hasan Sajal

Staff writer covering breaking news, features, and long-form analysis for NewsLive. Tracking the stories that matter most.
