The Soft War: Trust, AI, and Bursting Bubbles

Look at your feeds carefully.
Can you see it?
Defocus from the personalities, the dopamine likes, the copy-paste vulnerability, the manufactured authenticity, and you might begin to detect a pattern.

A quiet war - not between individuals, but between ways of thinking.
Ways of writing.
Ways of being.
We talk about AI like it’s a tool. But tools don’t rewrite tone. They don’t distort reality, or make us second-guess our own.

And yet here we are - living through a linguistic soft war, where emotional performance, simulation, and sincerity all wear the same coat.

Surface Theatre

At first glance, the tribes look familiar.

There are those railing against AI - warning that it cannot replace true authenticity, human connection, and raw emotion.

Meanwhile, long before AI, plenty of influencers became wealthy manufacturing those exact values. An F-bomb here, a carefully framed backdrop there - just enough edge, just enough social proof. Everyone say “cheeeese.”

“But at least their manipulation was human, not machine-made,” the other side replies.

Then there are those using AI secretly - polishing their banal lives into beautiful, heart-warming stories. If words were a camera filter, their faces would be smoothed until they looked like a fruit pastille. Templated tragedy, premeditated humility, the hero, the victim - emotional rhythm without the emotional risk.

And finally, there are the ones who accept the symbiosis of AI. Not because they love the machine, but because they see the tide and are adapting before the wave hits.

That’s what it looks like.
But the real shift is happening underneath.

The Disappearance of Trust

AI now mimics emotional tone so well that even when something is real, we don’t trust it.
The story might be true.
The experience lived.
The insight hard-won.

But if it feels too real, too perfect, too well-structured - if the storytelling hits every emotional beat perfectly - we wonder: is this fake?

We’ve become emotionally cautious. Sceptical of sincerity.
We’ve seen too many hot takes, too many arcs of redemption, too many everyday events wrapped up in moments of insight. We’re jaded.

So now, we’re not just suspicious of AI-written content.
We’re suspicious of everyone.

And that’s the real loss - not connection with machines, but connection with each other.
In 2020, we were physically separated as a society, with catastrophic results.

Now, as we return to normal, the ubiquity of AI risks something worse - permanent emotional separation.

But This Time Is Different

Is this new? Well, at a surface level, language has always bent to its environment. We shape it. It shapes us. And most tellingly, it has often been driven by our relationship to technology - a two-way symbiosis that evolves with every breakthrough.

When we lost facial cues in written conversation, we invented emoticons. When we needed irony or softness, we typed “lol” - even when we weren’t laughing. And somewhere along the way, those digital quirks bled into real speech.

We adapted. It was weird at first. Then it wasn’t.
But this isn’t just a new slang or a tonal shift.
It’s about trust - and the erosion of our emotional instincts.

Symbiosis Isn’t New - But Something Feels Different

We’ve always merged with machines - and usually, we celebrate it.

Pacemakers, cochlear implants, AI-assisted prosthetics.
No one questions the humanity of someone whose heartbeat or hearing is enhanced by tech.
Because in those cases, the machine serves the body - it doesn’t speak for the self. It doesn’t fundamentally change who you are.

But when the machine starts writing in your voice?
When it starts shaping how you express pain, joy, leadership, humility?
That’s where the discomfort kicks in.

Because now, it’s not just helping you live.
It’s telling people who you are. It’s shaping an outward reality you project to the world.
And when everybody knows a machine might be doing that - whether it is or not - it sows seeds of doubt, a layer of cynicism, an invisible barrier, a bubble to protect ourselves from the risk of manipulation.

We’ve explored this in fiction for decades.

In Ghost in the Shell, the protagonist’s body is fully synthetic - but the real question isn’t whether she’s human. It’s whether her soul - her “ghost” - still belongs to her.
The film never says technology strips us of identity. It asks whether, in a world of artificial memory, synthetic bodies, and machine-written expression, we can still tell what’s truly ours.

In Neuromancer, that question is long past.
Humans jack into networks like it’s second nature. No one stops to ask whether a thought was theirs or machine-assisted - the symbiosis is assumed.
And with it, a strange clarity emerges: intent, instinct, and identity are the only things that still matter.

That’s where we’re heading.
Soon, it won’t matter whether AI was used.
It’ll only matter how it was used - and whether the result carries value, originality, or a human signal.

Boosting Your Signal

In the future, only one thing will matter.
It won't be about how something looks or even how “real” it feels.

It will be less about you, and more about what you can do for others.
Can you create work that lifts others, or that carries true, not performative, value?
Can you reach through the screen and connect with someone so they voluntarily lower their protective bubble?

That’s the only kind of signal worth sending.

One that lets us reconnect.

Because while the rah-rah brigade will never go away, and dopamine will always be a thing, the positive or negative effects of AI really sit in the hands of the user.
So worry less about the wider debate and think more about how you intend to use it.

If you’d like guidance on how to use AI to enhance - not replace - your work, and develop a more nuanced, effective, and future-proofed perspective on generative technology, get in touch.

We would love to reconnect.

DISCOVER MORE
Dive deeper into creativity, innovation, and the ever-evolving landscape of storytelling and technology. Explore thought-provoking ideas, practical insights, and inspiring examples from our latest articles.
Signal, not simulation. You++ awaits.