The Internet Used to Be for Everyone. Now It’s for You.
- Lisa Dubow
- Jan 5
- 3 min read
Updated: Mar 24

Many of us remember the promise of an open, exploratory web that allowed us to connect with people, ideas, and resources in unprecedented ways. It was meant to be a place where discovery felt expansive and unscripted. Today, algorithmic echo chambers have narrowed the internet to mostly showing you what it already knows you like, and soon they won’t just shape your screen, but the world around you.
From Open to Optimized
Google’s original mission was to “organize the world’s information and make it universally accessible and useful.” A 2010 marketing video captured that spirit well: anyone, anywhere, could uncover global news with a few keystrokes. The internet as library. The internet as equalizer.
That framing is largely gone. Today, Google, Meta, and Amazon all lead with the same pitch: content tailored just for you. In an August 2025 campaign, Google's vision of search was a college student getting personalized dorm decor tips. Meta and Amazon have taken similar marketing approaches, showcasing customized recipes and hyper-personalized shopping experiences. The irony is hard to miss. Every major tech company is competing to stand out, and every one of them is offering the same thing. The commercial internet has shifted from open exploration to algorithmic curation, increasingly narrowing what any one person sees.
You’re Paying With Your Attention
Most people have a vague sense that algorithms shape their online experiences. Fewer understand the mechanism. These systems are built to hold your attention, and they do it by learning from every click, scroll, and pause you make. The more you engage, the more precisely the algorithm can predict what will keep you engaged.
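The learning loop described above can be sketched in a few lines. This is a deliberately crude caricature, not any company's actual system: it just estimates a click-through rate per topic from past behavior and ranks a feed by that estimate. The class and method names are invented for illustration.

```python
from collections import defaultdict

class EngagementRanker:
    """Toy feed ranker: learns from clicks and ranks items by
    estimated engagement probability. Illustrative only."""

    def __init__(self):
        self.clicks = defaultdict(int)  # clicks per topic
        self.views = defaultdict(int)   # impressions per topic

    def record(self, topic, clicked):
        """Log one impression and whether the user engaged with it."""
        self.views[topic] += 1
        if clicked:
            self.clicks[topic] += 1

    def score(self, topic):
        # Estimated click-through rate with a small prior,
        # so unseen topics start near 50%.
        return (self.clicks[topic] + 1) / (self.views[topic] + 2)

    def rank_feed(self, topics):
        # Highest predicted engagement first.
        return sorted(topics, key=self.score, reverse=True)
```

After a user clicks ten cat videos and scrolls past ten news stories, the ranker pushes cats to the top and buries news; every interaction sharpens the prediction, which is the whole point of the design.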
The terms and conditions governing all of this exist, technically. But they’re buried in pages almost no one reads. The same companies that spend enormous resources designing sleek, intuitive interfaces apply none of that clarity to the communication of their data policies. Infinite scroll, push notifications, and reward loops aren’t accidents of product design. They’re deliberate features engineered to make disengagement feel difficult.
Policymakers have started to push back. The Algorithmic Accountability Act of 2025 would require companies to submit algorithmic impact assessments to the FTC and make portions of them public. It’s a real step, but progress is slow, lobbying is heavy, and the technology keeps moving faster than the regulation.
Now the Algorithm Is Leaving the Screen
For two decades, algorithmic influence has been something that happened on a device. You picked it up, you opened an app, you were exposed to it, and eventually you put it down. But now that boundary is dissolving. Apple's latest AirPods can summarize or translate conversations happening around you. Meta's Ray-Ban glasses identify objects and answer questions about whatever the wearer is looking at. These devices don't wait for you to search for something. They read the room, continuously.
The questions this raises aren’t hypothetical. What happens when your glasses save a stranger’s photo because you looked at their outfit for five seconds? What does consent mean when data is being collected from everyone in the vicinity, not just the person wearing the device? Surveillance has historically been something institutions do to individuals. AI wearables make it something we do to each other.
And the underlying data practices are already in place. Many social media sites collect behavioral data by default to train their AI systems. Most users are opted in without realizing it. As these capabilities move off screens and onto our bodies, the same logic that built filter bubbles online will start shaping physical space. An AI wearable that learns your preferences could quietly route you toward the same neighborhoods, restaurants, and social circles, building invisible walls around your physical world. Algorithmic tunnel vision, extended into real life.
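The "invisible walls" dynamic is a classic rich-get-richer feedback loop, and a tiny simulation makes it concrete. In this hedged sketch (the function name and setup are invented for illustration), a recommender suggests places in proportion to how often it has already steered you there; over time, visits pile up on a shrinking subset of options.

```python
import random

def simulate_narrowing(options, steps, seed=0):
    """Toy feedback loop: recommend each option in proportion to its
    past visit count, so early favorites attract ever more visits.
    Purely illustrative of filter-bubble dynamics."""
    rng = random.Random(seed)
    visits = {option: 1 for option in options}  # uniform starting weight
    for _ in range(steps):
        # The feedback loop: sampling probability tracks past visits.
        choice = rng.choices(list(visits), weights=list(visits.values()))[0]
        visits[choice] += 1
    return visits
```

Run it with ten restaurants and a few hundred steps and the visit counts are far from uniform: a handful of early picks dominate, not because they are better, but because the loop kept recommending them.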
The Rules Haven’t Caught Up
Most existing privacy law was designed for a screen-based world, where people controlled what they shared through settings and consent checkboxes. That model assumed you were the one deciding what to disclose. AI wearables don’t work that way. They capture data from everyone nearby, whether or not those people made any choices at all.
The internet’s shift from open platform to personalization engine happened faster than anyone anticipated. The move from screens to bodies looks like it will follow the same pattern. The time to ask what oversight should look like is before the technology is everywhere, not after.

