Spirit’s Gone Away. (We Fell for the Aesthetic. Again.)

Me and my gorgeous parents, Ghibli-fied.


A picture’s worth a thousand clicks.

We’ve all seen them by now. The dreamy portraits. Big-eyed avatars, soft pastels. The internet’s been overrun with Studio Ghibli-style AI images, thanks to ChatGPT’s new photo generation feature. The internet saw a cute new toy. And did what it always does: tried to monetise it.

Within days, multiple people tried launching apps to monetise the trend. One user launched a waiting list for an app called Gib, “Instagram, but every photo is Studio Ghibli.” It gained some mild traction, then went viral when the creator posted a dramatic cease-and-desist letter from Studio Ghibli, along with a martyr-like proclamation about the sanctity of AI expression.

"AI creators deserve protection, not punishment.” 😬 Sir??????

People took it seriously, and it caused quite a stir.

Turns out the letter was fake. AI-generated.

It wasn’t about creativity. Or even controversy. It was about attention.

Traditionally, virality was a side effect of something meaningful. In the AI era, virality isn’t a byproduct of something cool any more. It’s the product.

Now, the goal is virality itself. Not what it's saying, just that it spreads.

The content isn't the product. The spectacle is.



We didn’t want to steal Ghibli, we wanted to FEEL SOMETHING.

People didn’t flock to the Ghibli filter because they’re culture thieves. They did it because it sparked something between novelty and nostalgia, a chance to see themselves as a character in a world that feels hand-drawn and hopeful. It’s fun. It’s beautiful. Beauty feels accessible.

In an online world that rarely offers intimacy or magic, the Ghibli aesthetic delivered a kind of emotional escapism. Not just a stylistic trend, but a playful invitation to reimagine ourselves inside stories that once moved us. It bridged the digital and the physical, and the fantasy felt personal.

So yes, this is about human curiosity. Joy. A desire to feel enchanted, the thrill of novelty, the ache to see ourselves as beautiful.

What looks like magic is often marketing. And slowly, we stop noticing the difference. We start accepting imitations as truth. We trade soul for spectacle, and call it progress.

People love the way these AI images of us look. They’re beautiful. But that love becomes conflicted when we realise the aesthetic we’re enjoying was trained on the backs of artists who spent decades perfecting it, without consent, without credit.

Those artists still exist, watching their signature styles go viral without credit, without control, without a cheque.

The Ghibli filter moment reveals a quiet tension between human desires and platform logic.

When platforms simulate emotion purely to drive engagement, we don’t get connection, we get choreography. And eventually, we stop asking if what we’re feeling is real. We just click, react, consume, repeat.

That dissonance isn’t accidental, it’s engineered.




Propaganda, but make it pastel. 💅

ChatGPT’s new image feature didn’t just arrive out of the blue.

The CEO of OpenAI, Sam Altman, recently said, “Believe it or not, we put a lot of thought into the initial examples we show when we introduce new technology.”

He’s right. The feature was released with curated examples of Ghibli-esque portraits. Familiar, harmless, shareable.

What didn’t it launch with? Deepfakes of politicians. Propaganda. Uncomfortable images. The creative casualties. The slow erasure of designers, illustrators, writers, replaced not because they weren’t good, but because AI is faster and cheaper. It didn’t show the portfolios scraped to train it.

It could have. But those examples would have sparked panic, so it didn’t. It’s easier to sell a dream than to explain a theft.

So instead, we got colourful, soft nostalgia. Emotionally loaded, ethically foggy.

This is not a happy accident. It’s product strategy. Controlled creativity. Choice architecture.

When the default for AI art is soft, dreamy mimicry, we quietly reshape what people believe creativity looks like.

Kiki’s Deepfake Service.

The Studio Ghibli aesthetic is a legacy built by hand, frame by frame, powered by decades of storytelling, spiritual depth, and reverence for life.

The model was trained on real artists’ work, without consent, and now users are being guided, invited, to monetise what was once sacred.

Hayao Miyazaki himself once said, “I would never wish to incorporate this technology into my work at all… I strongly feel that this is an insult to life itself.”

So what does it mean when a man who made films about the spirit of nature, care, effort, love, has his style turned into an app filter?

It means we’re not building on his legacy. We’re flattening it for scale.

Is this really democratising artwork, if the tools that claim to empower are built on uncredited labour, scraped portfolios, and mimicry?

I’m not saying TECH IS EVIL, DON’T USE IT. The tech is incredible, it will change industries. But let’s not pretend it’s all about empowerment without asking who paid the price for it. We shouldn’t confuse accessibility with equity.

This is what we’re actually losing.

The problem isn’t that people want Ghibli-style images. The problem is what happens when our real human desires for beauty, comfort and meaning get outsourced to machines trained to imitate emotion.

We’ve made it easier to replicate magic than to make it. We’ve confused likeness with depth. Expression with extraction.

And as these systems scale, that confusion grows.

We start to believe that creativity is just a vibe. That culture is just content. That art is just another backdrop for our avatars.

This isn’t just about one app, one stunt, one anime style. This is the blueprint. Launch with theft. Monetise aesthetic. Cry censorship when challenged. We’re watching in real time as culture gets turned into product, and resistance gets reframed as oppression.

If we don’t slow down, we risk building a culture where everything looks familiar but means nothing. Where imagination is infinite, but spirit is gone.

The future is still up for grabs.

I use ChatGPT, and I love it. I believe this technology is transformative. It helps me work faster, connect ideas, think better, and imagine more boldly.

But reverence matters.

Boundaries matter.

Cultural memory matters.

So where does this leave us? What kind of culture are we building when we value vibes over vision, aesthetic over authorship, convenience over care?

AI’s aesthetic output is not inherently bad, but I think there should be more public transparency on the way it’s trained, packaged, and deployed, or we risk quietly distorting our cultural values. We’re not just replicating beauty, we’re commodifying wonder. And if we’re not careful, we’ll forget the difference between magic and mimicry.

Ghibli’s work was powerful because it reminded us that imagination, when treated with care, can awaken something sacred. That creativity isn’t just about output. It’s about soul.

So maybe the real question isn’t ‘what can AI make?’

Maybe it’s, ‘what do we want it to do for us?’

Because clearly, we want something. We want beauty. We want comfort. We want to see ourselves in a world that feels enchanted again.

There’s nothing wrong with that.

But let’s not take shortcuts to wonder. Let’s build tools that honour the feelings we’re chasing. Let’s expand imagination, not flatten it. Let’s use AI to amplify human meaning, not override it.

We can use these tools, but let’s also demand transparency. Honour the source. Credit the craft. Choose intention over imitation.

We’re still early enough to choose that version.

It’s not about being anti-AI.

It’s about being pro-human.

Madeline

Ghostwriter and storytelling strategist turning smart ideas into unforgettable, personality-driven content.

https://mad-lines.com