The Therapy Vending Machine

I dreamt last night that I was sobbing into a vending machine. It was the kind with those spiralled coils that slowly eject your selection like it’s doing you a favour. In the dream, I kept pressing B6—over and over. I wasn’t hungry. I wasn’t even sure what I wanted. But I was sure the machine owed me something. I had, after all, paid. The machine’s little screen finally changed from the payment screen to one with four brightly coloured boxes and a big flashy title at the top. It was a BuzzFeed quiz: “Which Type of Anxiety Do You Have Based on Your Attachment Style?”

When I woke up, I checked my inbox, and before I had even gotten out of bed I caught myself scrolling YouTube Shorts—on Safari, no less. Because, of course, I had nobly deleted Instagram, Facebook, and every other app with reels, only to end up watching the exact same dopamine trash in a slightly less optimised browser.

We’ve turned the internet into an emotional vending machine. Feed it your worst days and most desperate 3 a.m. thoughts, and it’ll return a tailored playlist, a “For You” page of performative breakdowns, or a carousel of trauma memes with laugh tracks. The algorithm has learned to predict our pain with a precision most therapists would envy. But here’s the problem: it’s not a therapist. It’s a very sophisticated mirror with a sales quota.

There’s something seductive about being seen so thoroughly, so effortlessly. You cry once to a Taylor Swift song and now Spotify serenades your heartbreak before you’ve even admitted it to yourself. You linger too long on a TikTok video about burnout and now you’re drowning in “relatable” content that knows your schedule better than your calendar app. It’s no surprise that, in 2021, The Wall Street Journal revealed that Facebook’s own internal research found Instagram made body image issues worse for one in three teen girls—and that the company still chose not to act meaningfully on the finding. Why would it? Sad girls scroll longer.

We’ve outsourced our self-awareness to a machine optimised for engagement. Not insight. Not health. Not change. Just clicks.

Ironically, the more emotionally fluent these platforms grow, the more emotionally illiterate we become. Real therapy is awkward. It’s uncomfortable. It involves silence and hard questions and having to admit that maybe—just maybe—you were the problem. Algorithms, on the other hand, never hold you accountable. They never ask you to reconsider your choices. They just offer a little dopamine drip and say, “You’re right. Everyone else is terrible.”

Let’s be clear: the algorithm is not empathetic. It’s predictive. It’s not listening—it’s modelling. If you don’t believe me, look no further than the 2023 Replika AI scandal, where users formed “relationships” with chatbot companions that seemed emotionally intelligent—until the company had to lobotomise the bots because people were, predictably, falling in love with their own projection. Because that’s what we want. Not companionship. Just something to echo back what we already feel, but in Helvetica Neue.

We confuse recognition with resolution. That’s the great irony. We feel seen, but not better. We confuse virality with vulnerability and binge catharsis with healing, and we call it self-care. Meanwhile, mental health resources in real life are underfunded, overstretched, and often inaccessible—especially for marginalised communities. But hey, at least there’s a sad lo-fi beat under a Reddit confession to keep you company at 2 a.m.

At some point, we have to stop feeding our souls to vending machines. We have to stop expecting a playlist to fix what’s broken, or a curated feed to validate what’s real. Algorithms are built to sell your sadness back to you—prepackaged, market-tested, and rebranded as content. That’s not healing. That’s hustle culture for your grief.

(Also, I do think that B6 is the white Monster in the Crosland level 6 vending machine.)
