Are we curating our feeds, or are our feeds curating us?

ARE YOU IN GROUP 7?

If you’re anything like me, you can easily lose a whole hour scrolling through TikTok or Instagram before you even realise it. But somewhere between the memes and micro-trends, you have to ask: are we choosing what we see, or are algorithms choosing for us? Why does it feel like everyone is thinking about the exact same things I’m thinking about, or holds the exact same opinions I have on a particular matter?

Take, for example, the recent Group 7 trend. In mid-October, singer Sophia James posted seven almost identical TikToks, each labelled as belonging to a different “group.” She called it a “little science experiment.” Her goal? To see which video TikTok’s algorithm would push hardest. The seventh video blew up. Viral. Billions of loops. Suddenly, countless users declared themselves part of Group 7, making memes, duetting, and joking about being “elite.” What’s remarkable is that even Sophia doesn’t fully understand why that final video was so massively successful.
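One plausible (and purely illustrative) explanation is a “rich-get-richer” feedback loop: once a video gets a few extra views, the system shows it to more people, which earns it more views, and so on. The sketch below is a toy simulation of that dynamic, not TikTok’s actual recommender; the `simulate_feed` helper, the impression counts, and the views-plus-one weighting are all invented for illustration.

```python
import random

def simulate_feed(n_videos=7, n_impressions=50_000, seed=7):
    """Toy rich-get-richer loop (a Polya-urn-style sketch, NOT TikTok's
    real algorithm): each new impression lands on a video with
    probability proportional to its current view count + 1, so early
    random luck compounds over time."""
    rng = random.Random(seed)
    views = [0] * n_videos
    for _ in range(n_impressions):
        # Weight each of the identical videos by its popularity so far.
        winner = rng.choices(range(n_videos), weights=[v + 1 for v in views])[0]
        views[winner] += 1
    return views

views = simulate_feed()
print(sorted(views, reverse=True))
```

Even though all seven “videos” start out identical, the compounding loop typically leaves one with a disproportionate share of the impressions, which is roughly the shape of the Group 7 outcome: no hidden quality difference required, just amplification of early noise.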

The whole experiment reveals a core truth: the algorithm isn’t just recommending what you like, it’s creating a shared narrative. This is what Edward S. Herman and Noam Chomsky called ‘manufacturing consent’. Their core idea was that media outlets, often unintentionally, shape public opinion in ways that support the interests of political and corporate elites. In their book “Manufacturing Consent: The Political Economy of the Mass Media,” the authors present a propaganda model comprising five filters: ownership, advertising, sourcing, flak, and anti-communist ideology. The point was to show that there’s no big conspiracy behind it; the way these platforms are built naturally ends up shaping what we all pay attention to, without us even noticing.

Chomsky and Herman adapted the idea from American journalist Walter Lippmann, who used the phrase ‘the manufacture of consent’ to describe how public opinion is managed. Lippmann argued that managing the public’s perception was necessary for democracy to function, since public opinion was often irrational.

In the framework of manufacturing consent, the algorithm becomes a gatekeeper not just of content, but of identity: “If you saw this, you’re in Group 7.” And millions of people bought into it. The moment you landed on the Group 7 video, you weren’t just watching content anymore; you were part of something. People started joking about belonging to an elite group, making their own Group 7 videos, even claiming a kind of digital status from it.

That’s the wild part: the algorithm wasn’t only deciding which videos took off; it was also influencing how people perceived themselves within the trend. It wasn’t just a meme; it became a label, a club, a tiny online identity, all created by a simple push from the For You Page. This is precisely where the idea of manufacturing consent comes in. The theory argues that media systems don’t just inform the public; they help shape what the public comes to accept as normal and relevant. It’s not about propaganda or someone forcing an opinion on you. It’s about the structures behind the scenes.

At the end of the day, moments like Group 7 remind us that social media isn’t just a place where trends emerge; it’s a place where they’re created. And most of the time, they’re made by systems we don’t see and barely understand. We scroll, we laugh, we share, and we move on, but the algorithm is constantly shaping what feels important, what feels popular, and even who we think we are online. Whether we call it manufacturing consent, algorithmic curation, or just “the feed,” the effect is the same: we’re all participating in a shared reality that’s being quietly constructed behind the scenes.

So my question this week is, do you still believe you built your FYP?
