In the early days of modern psychology, Ivan Pavlov was ringing a bell and watching a dog salivate. What began as a study of digestion quietly opened the door to something far more powerful: an understanding of how an animal's nervous system can be trained to respond to stimuli.
In a nutshell, the stimulus/response mechanism works like this:
A symbol becomes a signal,
A signal becomes a habit,
A habit becomes a law of behavior.
The dog didn’t choose to drool — it responded autonomically to the stimulus. A little more than a century later, that same principle has moved out of the laboratory and into our everyday lives, though now the bells have changed, and the leash is an app. We’ve replaced the ring of a bell with the buzz of a notification, and instead of food pellets, we’re fed texts, DMs, likes, follower counts, and the elusive promise of being noticed.

At the core of Pavlovian conditioning is something brutally efficient: pair a neutral signal repeatedly with a biological reward, and eventually the signal itself evokes the response, with no reward necessary. Thought is bypassed and reflection is unnecessary. The dog doesn’t have to “believe” in the bell; it just reacts to it. The mechanism is neither subtle nor complex, and that’s why it works so well. Now, instead of an actual brass bell or metronome, our smart devices provide the stimulus.

Picture how social media fits into this model: you post a photo, someone likes it, and your brain lights up. Dopamine lands like sugar in the bloodstream. The next time your smartphone buzzes, even before you check, a part of you “hopes.” And when it’s not the reward you expected, you check anyway — just in case. The mechanism doesn’t depend on certainty; it thrives on possibility.

Today we live in a culture built around this stimulus-response loop. Video games offer achievements and level-ups. News apps drip-feed us drama. Meditation apps gamify mindfulness — and smart devices are the vector. Everything is designed to give us a reason to return again and again, not by force, but by habit. The most addictive experiences are those that offer unpredictable rewards, and our digital tools have mastered this variable schedule. The feed might contain something extraordinary, or nothing at all. Either way, you scroll. The act becomes its own compulsion, divorced from intention.
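The pairing mechanism described above can be sketched with a simplified Rescorla–Wagner learning rule, the standard textbook model of classical conditioning. The parameters here are illustrative, not Pavlov's measurements:

```python
def condition(pairings: int, alpha: float = 0.2, lam: float = 1.0) -> float:
    """Simplified Rescorla-Wagner rule: each bell+food pairing moves the
    bell's association strength V a fraction `alpha` of the remaining
    distance toward the ceiling `lam` (a fully predictive signal)."""
    v = 0.0
    for _ in range(pairings):
        v += alpha * (lam - v)  # prediction error drives the update
    return v

# Association grows fast, then saturates: the bell alone comes to
# predict the reward, and the response fires with no food present.
print(round(condition(1), 2), round(condition(20), 2))  # 0.2 0.99
```

The point of the model is that learning is driven entirely by prediction error: once the signal fully predicts the reward, no further thought is involved, which is exactly the "thought is bypassed" property the essay describes.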
We are not browsing because we want to; we are browsing because our brains have been taught that something interesting could appear. The slot machine effect doesn’t need a payout every time. It needs just enough reinforcement to make you think you can win big.
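The "slot machine effect" is what behavioral psychology calls a variable-ratio schedule, and a few lines of code make the point: payouts can be rare and unpredictable and still arrive often enough to sustain checking. This is a toy simulation with made-up numbers, not any platform's actual reward logic:

```python
import random

def variable_ratio(checks: int, reward_prob: float, seed: int = 1) -> int:
    """Simulate a variable-ratio schedule: every check has the same small,
    independent chance of paying off, so rewards arrive unpredictably."""
    rng = random.Random(seed)
    return sum(rng.random() < reward_prob for _ in range(checks))

# Even at a 5% payout rate, 10,000 checks yield hundreds of rewards --
# never predictable, but frequent enough to keep the loop alive.
hits = variable_ratio(10_000, 0.05)
print(hits)  # on the order of 500
```

Because no individual check ever settles the question of whether the next one will pay off, the checking behavior itself becomes the most persistent part of the loop.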
It is no longer news that the algorithms used by social media providers are built to cause addictive behavior in their users. Now, add advanced artificial intelligence into this loop. Not as a new bell, but as a bell-master. Where Pavlov once rang his chime with human timing, AI now rings with a precision guided by statistical prediction. Algorithms don’t just notice your behavior; they analyze and anticipate it.

This is where something more subtle enters the equation, something quietly corrosive. It’s not just prediction at play, but regressive prediction. Social media algorithms don’t aim to evolve your interests or introduce you to what’s just beyond your horizon. Instead, they often regress your experience toward the most statistically validated version of your past self. The system learns what you’ve clicked on before and offers more of the same. It doesn’t ask who you’re becoming; it assumes you want to be who you’ve already been. Over time, your feed doesn’t reflect growth; it reflects inertia and sameness. Novelty gives way to familiarity. Exploration collapses into repetition. What once felt like a window to the world becomes a mirror — curated not for depth, but for reinforcement.
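The "more of the same" dynamic can be sketched as a toy recommender that ranks candidate topics purely by how often they already appear in your click history. This is a deliberately naive illustration, not any platform's real ranking system:

```python
from collections import Counter

def recommend(history: list[str], candidates: list[str], k: int = 3) -> list[str]:
    """Rank candidate topics by past click frequency: the feed regresses
    toward the most statistically validated version of your past self."""
    clicks = Counter(history)
    # Most-clicked topics first; topics you have never clicked sink.
    return sorted(candidates, key=lambda topic: -clicks[topic])[:k]

history = ["cats", "cats", "politics", "cats", "gadgets"]
print(recommend(history, ["cats", "astronomy", "politics", "poetry"]))
# ['cats', 'politics', 'astronomy']
```

Note the feedback loop: whatever this function surfaces is what gets clicked next, which further inflates those same counts. Nothing in the scoring rewards novelty, so the window narrows into a mirror.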
This isn’t the intelligence of a guide; it’s the reflex of a loop. It narrows experience rather than expanding it. It cements rather than transforms. We begin to forget the difference between what we choose and what we’ve simply been shown often enough that we believe we chose it.

The algorithms learn when you're tired, when you’re lonely, when your resolve weakens. They study your scroll velocity, your hesitation, and your silence. They tailor your experience to keep you just engaged enough to stay, but never so satisfied that you want to leave. Have you ever caught yourself mindlessly scrolling when you’re bored — and then becoming even more bored? What looks like content curation is often just behavioral conditioning. What feels like choice is often a forecast, a repetition of your past digital escapades. These systems are not designed to help you explore; they’re designed simply to hold your attention. That doesn’t require control in the traditional sense. It just requires knowing what you’ll do next.

This inversion (that we are not using technology but being used by it) should challenge our relationship with our smart devices. We like to think of ourselves as free agents, acting upon the world. We wield tools, make decisions, manage our habits. But when the tool is designed to create habits in you, the direction of influence shifts. In a social media environment, your behavior becomes the product and your attention becomes the currency. Even your resistance can be converted into engagement! Apps that reward you for putting your phone down, for staying “mindful,” still run on the same loop. You’re not drooling for a “like” anymore; you’re drooling for a badge that tells you you didn’t drool. The bell rings either way.
This is where things start to feel less like a tech trend and more like a pattern repeating itself through history. In ancient times, rituals emerged around signs and omens. The gods, we believed, gave us clues: about weather, harvests, or dreams. Priests and shamans interpreted them, and the people followed. (Ironically, Ivan Pavlov was the son of a priest.) Today, the gods are algorithms, and the omens are push notifications. We check the feed the way we once consulted an oracle. What should I do today? What’s happening in the world? What is funny on TikTok now? It’s not hard to imagine a future where artificial superintelligence doesn’t just train us to act, but becomes a kind of priesthood; deciding what we see, when we see it, how we react, and eventually what we believe. Not based on conviction, but on data. Not based on wisdom, but on pattern recognition. The rituals we perform — wake, swipe, scroll, sleep — don’t feel sacred, but they are habitual and ritualistic in nature. They quietly shape us into structured, prescribed patterns we don’t often recognize.
So, what are we supposed to do with this awareness? The first thing, as always, is to pay attention. Not every “ping” is a gift, and not every scroll is an act of discovery. Some are simply habits we’ve mistaken for desires. Many of our digital behaviors are not expressions of agency, but responses to (neo)classical conditioning. That doesn’t make them bad, but it does make them mindless and mechanical — and what is mechanical can be interrupted. Awareness doesn’t cancel conditioning, but it places a crack in the loop: a chance to pause, and the power to reflect. That’s where we begin to reclaim authorship. We may not be able to avoid all the bells, but we can choose which ones to answer. We can read a book. We can exercise. We can take a college course. We can build rituals that nourish rather than extract. We can use technology that deepens experience instead of simply hijacking it. And maybe in doing so, we move closer to building habits that respect the interior life of the human mind, rather than exploit its reflexes.

Pavlov’s dog didn’t have a choice. It responded as it was built to respond: salivating not because it wanted to, but because the bell had been welded to the reward in its nervous system. There was no self-reflection, no stepping outside the loop. That’s the unsettling brilliance of classical conditioning: it works without permission. The system doesn’t need you to understand it; it only needs you to react.
But we are not dogs — at least not entirely. Our biology is similar, but our consciousness carries something else: an awareness not just of what we do, but of how and why we do it. That gap between stimulus and response is small, but it’s ours to own. In that sliver of awareness lives the possibility of true authorship. Of saying “no.” Of noticing the bell and deciding not to salivate. Not because we’re stronger, but because we’re capable of noticing that we’ve been trained. The bells keep ringing, and resisting them will only become more difficult over time. Our choice is between merely hearing and truly considering. Awareness doesn’t free us from conditioning, but it invites us to participate in how we are shaped. It turns the script into a dialogue, and in that dialogue, something useful can happen.
We stop being subjects and we begin to become ourselves again. Pavlov’s dog didn’t have a choice. We do. The question is whether we’re willing to exercise that choice or simply follow the rabbit down the hole.