What Happened to the Fair Internet? A Developer’s Take on PR, Hype, and Algorithms

The Gamified Web: Why Quality Doesn’t Win Online Anymore

I’ve been building software for over two decades. I’ve seen the internet evolve from a frontier of genuine curiosity into something else entirely. In the beginning, there was a simple, unspoken promise: if you built something good, useful, and honest, people would find it. Merit was the currency. Today, that promise feels broken, and I can’t be the only one who sees it.

This isn’t about one company’s struggle. It’s about a systemic shift that rewards the loudest voice, not the best product. It’s about the slow decay of discovery and the rise of a gamified web where everyone is forced to play by the same exhausting rules.

The Wikipedia Moat: How Gatekeepers Reinforce the Status Quo

Let’s start with a perfect example: Wikipedia. It was founded on the noble idea of documenting all human knowledge. But its rules, designed to prevent spam, have created a powerful gatekeeper that favors the established.

The rule is “notability,” which requires “significant coverage in reliable secondary sources.” In practice, this means you need articles in the New York Times or TechCrunch. Who gets this coverage? Big companies with PR departments. A small team or a solo developer who creates a genuinely useful app used by thousands? They are often deemed “not notable.”

This creates a frustrating chicken-and-egg problem:

  • You need media coverage to get a Wikipedia page.
  • But having a Wikipedia page is often seen as a sign of legitimacy that helps you get media coverage.

It’s a structural bias that protects the powerful. It’s even worse when you hear stories of corrupt editors asking for “donations” to protect an article from deletion. The system, intended to be fair, has become a reflection of the very power imbalances it was meant to correct.

The Great Internet Gold Rush (And the Ghost Towns Left Behind)

What we’ve witnessed over the last 15 years is a series of digital “gold rushes” for attention. The cycle is always the same.

Phase 1: A New Frontier. A new, authentic platform appears. Think of early Wikipedia, Digg, Reddit, or later, Hacker News and Product Hunt. They are exciting places, filled with real creators sharing real work. The signal-to-noise ratio is high.

Phase 2: The Prospectors Arrive. People realize this new platform is a source of valuable, organic attention. Early adopters thrive by sharing quality content.

Phase 3: The Industrial Miners. The secret is out. PR teams, growth hackers, and marketers swarm in. They don’t come to participate; they come to extract. They develop playbooks to game the system: upvote rings, coordinated launch-day campaigns, and stealth marketers posing as “indie hackers.” Suddenly, the front page isn’t about what’s best; it’s about who has the biggest budget and the most cunning strategy.

Phase 4: The Depletion. The original community gets drowned out. The signal is lost in the noise. Trust erodes. The platform becomes another billboard, and the authentic users move on, looking for the next frontier. We saw it with Digg, and we see it happening now on platforms that were once havens for genuine makers.

The Algorithmic Treadmill: “Like, Comment, and Feed the Machine”

This gamification isn’t confined to niche communities; it’s everywhere. Look at YouTube. Every creator, no matter how brilliant, is forced to end their video with the same mantra: “Like, comment, and subscribe!”

What started as a clever hack to get an algorithmic edge is now just background noise. It’s a tax on everyone’s time—the creator’s and the viewer’s. Creators are no longer just making content for people; they are performing acts of algorithmic appeasement. They are feeding a machine whose rules are opaque and ever-changing.

This is the same reality for software developers. Instead of just writing good code, we have to master App Store Optimization (ASO), SEO, and a dozen other acronyms. We’re forced to spend more time optimizing our marketing funnel than our product. Honest work gets buried unless you play the game.

The Real Cost: A Crisis of Trust and Discovery

So, where does this leave us? 🤷

For creators and developers, it’s demoralizing. Honest work starts to feel like a waste of time, and that feeling breeds burnout and cynicism.

For users, it leads to attention fatigue. We are constantly on guard, trying to figure out if a review is real, if a product is genuinely good, or if it’s just propped up by venture capital and slick marketing. It becomes harder and harder to discover the quiet, reliable tools built by people who genuinely care.

My One Big Hope: An AI That Can Actually See 🤖

Despite all this frustration, I hold onto a single, perhaps utopian, hope: an AI that can cut through the noise. Not the current AI, which is trained on the biased, gamified internet we already have. I’m dreaming of a new kind of AI, an arbiter of truth and quality.

Imagine a tool that could:

  • Analyze source code directly. It could look at an open-source project, compile it, and determine its quality, security, and efficiency—not just its number of GitHub stars.
  • Provide historical context. When a tech giant announces a “revolutionary” new feature, this AI could instantly add a footnote: “This is not new. An indie developer has had a better open-source version of this for five years. Here is the link.”
  • Evaluate real value, not just hype. It could test an app and report back: “This app has a beautiful landing page, but its core function is buggy and it sells your data. This other, simpler-looking app is faster, more stable, and respects your privacy.”
  • Become a universal scam detector. That suspicious “investment opportunity” your grandma received? The AI could instantly flag it as a known scam pattern, protecting the vulnerable from the predators who thrive in the internet’s dark corners.
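To make that last bullet a little more concrete: even without a grand “arbiter of truth,” the scam-detection idea can be approximated today with crude heuristics. Here is a toy sketch in Python, assuming nothing more than a hand-written list of keyword patterns and weights (all of them illustrative, not a real detector; a production system would need trained models, sender reputation, and much more):

```python
import re

# Illustrative scam phrases with rough weights.
# These patterns and weights are made up for the sketch.
SCAM_PATTERNS = [
    (r"guaranteed\s+returns?", 3),
    (r"act\s+now|limited\s+time", 2),
    (r"wire\s+transfer|gift\s+cards?", 3),
    (r"double\s+your\s+(money|investment)", 3),
    (r"no\s+risk", 2),
]

def scam_score(message: str) -> int:
    """Sum the weights of every scam pattern found in the message."""
    text = message.lower()
    return sum(w for pattern, w in SCAM_PATTERNS if re.search(pattern, text))

def looks_like_scam(message: str, threshold: int = 4) -> bool:
    """Flag a message once its cumulative score crosses the threshold."""
    return scam_score(message) >= threshold

msg = "Act now! Guaranteed returns, no risk. Just send a wire transfer."
print(scam_score(msg))       # -> 10 (act now + guaranteed returns + no risk + wire transfer)
print(looks_like_scam(msg))  # -> True
```

A keyword scorer like this is trivially gameable, which is exactly the point of the essay: the dream AI would have to judge substance, not surface patterns. But it shows the shape of the problem.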

A North Star for a Fairer Web

Is this a fantasy? Maybe. Building an AI with that level of real-world judgment is a monumental task. But it’s a goal worth striving for. It’s a North Star.

The tools we built to connect us and surface the best ideas—search engines, social platforms, recommendation algorithms—have been hijacked. They now reward loudness over substance and polish over utility.

I still believe we can build something better. A system—or an AI—that restores balance. One that champions the real builders, finds the forgotten gems, and finally helps the internet deliver on its original, beautiful promise: that quality, care, and a good idea are what truly matter.

Thanks for reading. 🖖