It wasn’t an accident. It was engineered
Social media didn’t become addictive by chance. It was intentionally designed that way.
In the early 2000s, Stanford University’s Persuasive Technology Lab, founded by behavioural scientist BJ Fogg, pioneered the field of digital behaviour design. Fogg’s work explored how apps and platforms could influence human behaviour through triggers, motivation, and ease of action, a formula now known as the Fogg Behaviour Model.
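The model is often summarised as B = MAP: a behaviour (B) happens when motivation (M), ability (A), and a prompt (P) converge at the same moment. Here is a minimal sketch of that logic; the 0-to-1 scales and the threshold are invented for illustration, not Fogg's actual numbers:

    # Toy illustration of the Fogg Behaviour Model (B = MAP): a behaviour
    # fires only when motivation, ability, and a prompt converge above an
    # activation threshold. Scales and threshold are invented for this example.

    def behaviour_occurs(motivation: float, ability: float,
                         prompt_present: bool, threshold: float = 0.5) -> bool:
        """Return True if the modelled behaviour is triggered."""
        if not prompt_present:   # no prompt, no behaviour
            return False
        # The easier the action, the less motivation it needs (and vice versa).
        return motivation * ability >= threshold

    # A red notification badge is a prompt, and tapping it is trivially easy,
    # so even mild curiosity clears the bar.
    print(behaviour_occurs(motivation=0.6, ability=0.9, prompt_present=True))  # True

The design lesson is visible in the example: you rarely need to raise a user's motivation if you make the action effortless and the prompt unavoidable.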
Many of Silicon Valley’s top designers and engineers studied under Fogg. Facebook’s early product teams, as well as Instagram founders Kevin Systrom and Mike Krieger, were among them. Tristan Harris, the former Google design ethicist who later founded the Center for Humane Technology, also took Fogg’s courses.
These classes taught students how to manipulate human psychology for engagement. And when those students joined the tech industry, they applied those lessons at massive scale.
“It literally changes your relationship with society, with each other... God only knows what it’s doing to our children’s brains,” said Sean Parker, Facebook’s first president, referring to how the platform exploits psychological vulnerabilities.
This wasn’t some passive outcome. According to insiders and executives, Facebook studied dopamine reward systems, neurological feedback loops, and micro-interactions, all to keep users hooked.
Chamath Palihapitiya, former Facebook VP of User Growth, said: “We have created tools that are ripping apart the social fabric of how society works... The short-term, dopamine-driven feedback loops that we have created are destroying civil discourse.”
In short, Facebook didn’t accidentally become addictive. It was built to be so. And it worked. Those likes, hearts, scrolls, and red notification dots? They weren’t aesthetic decisions. They were the result of deliberate behavioural engineering.
What Stanford taught in theory, Facebook perfected in practice. Even Harvard confirmed it.
Harvard Health summarises Trevor Haynes’s work, stating that social media taps into the same neural pathways “used by slot machines and cocaine”, and that swiping behaviour produces neurological rewards akin to those of alcohol, nicotine, or other drugs.
Filters, facades, and the performance of perfection
Social media isn’t real life. It’s the highlight reel. The best vacations. The perfect photos. The most flattering angles. The curated self.
This constant performance culture breeds comparison, insecurity, and disconnection. Facebook’s internal research, leaked in 2021, revealed that Instagram worsens body image issues for one in three teenage girls. The US Surgeon General’s 2023 advisory found that adolescents who spend more than three hours a day on social media face double the risk of symptoms of depression and anxiety.
We scroll through the best parts of other people’s lives and wonder what’s wrong with ours. We compare our behind-the-scenes to their polished trailers. This is no longer connection. It’s distortion.
The algorithm doesn’t serve you. It feeds on you
The news feed didn’t just show what your friends were doing; it learned what kept you watching. And then it showed you more of that. More of what you agree with. More of what makes you feel angry. More of what keeps you scrolling.
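Stripped of the machine learning, the core ranking logic is simple to state. Here is a hedged sketch of an engagement-ranked feed; the field names and weights are hypothetical stand-ins for any real platform’s far more complex models:

    # Toy engagement-ranked feed: posts are ordered not by accuracy or
    # importance but by how likely they are to keep a given user on the page.
    # All field names and weights are hypothetical, for illustration only.

    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        predicted_watch_time: float   # seconds a model expects you to linger
        predicted_outrage: float      # 0-1 estimate of emotional arousal
        matches_user_views: float     # 0-1 similarity to past engagement

    def engagement_score(post: Post) -> float:
        # Nothing here asks "is this true?" -- only "will this hold attention?"
        return (post.predicted_watch_time
                + 30 * post.predicted_outrage
                + 20 * post.matches_user_views)

    def rank_feed(posts: list[Post]) -> list[Post]:
        return sorted(posts, key=engagement_score, reverse=True)

Note what the scoring function never asks: whether a post is true, fair, or good for you. That omission, not malice, is what the next statistic measures.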
MIT (2018) found that false news spreads six times faster than true news on Twitter, because it’s more sensational, more emotional, and more engaging. Algorithms don’t prioritise truth. They prioritise time-on-site by showing you more of the same. This is how echo chambers form. This is how polarisation deepens.
Not through conspiracy, but through design. Your brain is being rewired, one like, one outrage-inducing post, one confirmed bias at a time. Neuroscience research suggests that repeated exposure to the same ideas and images strengthens the corresponding neural pathways, making users more entrenched in their views, less open to dialogue, and more prone to tribal thinking. Social media didn’t just break our attention; it fractured the shared reality that democracy depends on.
These companies have built a system that shapes our thoughts, emotions, and even elections. The algorithms that once sold sneakers now sell ideologies, amplified by AI that learns faster than we can regulate it. The line between influence and manipulation has vanished, and our democracy hangs in that blur.
In my next article, we’ll dive deeper into how this has swayed elections, how AI is amplifying it at an exponential rate, and how we as a society can take back control of the narrative before it controls us. Stay tuned for part two.
Michael Helfman is an entrepreneur, writer, and filmmaker exploring the human condition in an age defined by technology and transformation
