What Big Tech doesn't want you to know about internet freedom
The Internet Was Never Yours
They sold us a fairy tale. A digital commons where anyone could speak, create, connect. The open web. Information wants to be free. Remember that one?
What they didn't mention—what the venture capitalists whispered in boardrooms while we were busy building our "personal brands"—is that freedom was always the product. And products get sold.
The internet you think you know died years ago. What replaced it is a surveillance extraction machine wearing democracy's face. Every click, every scroll, every half-second hesitation before you close a tab—all of it harvested, analyzed, weaponized against you. Not by shadowy government hackers in distant capitals. By the platforms you check before your eyes focus each morning.
Freedom House's 2025 research confirms what critics have long suspected: we're witnessing "an uncertain future for the global internet," where three forces—government AI projects, satellite connectivity expansion, and mandatory age verification—are converging to reshape who gets to participate online. The window for genuine dissent is closing. Not with a bang, but with terms of service updates you didn't read.
The Consent Theater
Let's talk about "choice." That sacred American value tech executives invoke when regulators come knocking.
You chose to accept those cookies. You chose to share your location. You chose to let that app access your microphone, your camera, your contacts, your menstrual cycle data. Right?
Here's what they don't say in those cheerful privacy dashboards: **choice without power isn't consent. It's theater.**
- "Opt-out" systems designed by behavioral psychologists to exhaust you into submission
- 50-page terms of service written in deliberate obfuscation
- Dark patterns that make sharing data frictionless and protecting it labyrinthine
- The fundamental lie that individual "digital hygiene" can counter institutional data hunger
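The asymmetry those bullet points describe can be made concrete. A minimal sketch, where the flows and step counts are hypothetical illustrations of the pattern, not measurements of any real consent dialog:

```python
# Hypothetical consent flows, sketched as data. The asymmetry between
# sharing and protecting is a design choice, not an accident.
# (Step counts below are illustrative, not taken from any real site.)

accept_flow = ["click 'Accept all'"]

reject_flow = [
    "click 'Manage options'",
    "scroll through purpose list",
    "toggle off 'personalised ads'",
    "toggle off 'ad measurement'",
    "toggle off 'audience research'",
    "open 'legitimate interest' tab",
    "object to each purpose individually",
    "click 'Confirm choices'",
]

def friction_ratio(easy: list, hard: list) -> float:
    """How many times more interactions the protective path costs."""
    return len(hard) / len(easy)

print(f"Accepting: {len(accept_flow)} step(s)")
print(f"Rejecting: {len(reject_flow)} step(s)")
print(f"Friction ratio: {friction_ratio(accept_flow, reject_flow):.0f}x")
```

Every extra step sheds users; multiply that attrition across billions of dialogs and "exhaust you into submission" stops being a metaphor and becomes a conversion-rate strategy.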
As researchers at Cognoscenti documented in March 2025, "the AI race has only accelerated industry's lust for our data—because captains of industry believe they need more and more data about us to train their AI models." This isn't incidental collection. It's a structural imperative. The business model demands total information awareness, and they're building it whether we "consent" or not.
The Massachusetts privacy legislation they advocate for? Blocked repeatedly by industry lobbyists who spent $28 million in 2023 alone fighting similar measures nationwide. They don't want your informed choice. They want your resignation.
The Censorship Partnership
Here's where conventional wisdom gets it exactly backward. We obsess over government censorship—China's firewall, Russia's digital iron curtain—while ignoring the more sophisticated system emerging in Western democracies.
Freedom House's 2021 analysis revealed that "a growing number of governments are asserting their authority over tech firms, often forcing the businesses to comply with online censorship and surveillance." But here's the twist: these partnerships are increasingly voluntary, even enthusiastic.
The platforms don't resist state power. They negotiate with it.
Consider what happens when governments demand content removal:
- Corporate transparency reports that obscure more than they reveal
- Secret negotiations between tech giants and security agencies
- Algorithmic suppression that never appears in any policy document
- Shadowbanning and demonetization that platforms deny while practicing
The result? An "unprecedented assault on free expression" that lacks the crude visibility of state censorship. No knock at the door. Just... disappearance. Your post doesn't violate community standards—it simply stops appearing. Your account isn't suspended—it just can't grow. Your video doesn't breach guidelines—it just never surfaces in search.
This is freedom's most dangerous erosion: not prohibition but invisibility. Not silencing but irrelevance.
The False Dichotomy They Need You to Believe
"Privacy or security." "Innovation or regulation." "Free speech or safety."
These frames serve power. They constrain imagination. They make us choose between corporate surveillance and government surveillance, as if those were the only options.
The misinformation here is structural. Pundits and politicians repeat these binaries until they feel inevitable. They're not.
Specific falsehoods that persist despite evidence:
- "Strong encryption helps criminals." There is no credible evidence that widespread encryption meaningfully impedes law enforcement; meanwhile, weakened encryption exposes billions to surveillance and crime.
- "Section 230 protects Big Tech from all responsibility." In reality, the statute provides limited liability shields that courts have consistently narrowed, and reform proposals largely serve established players who can afford compliance costs.
- "The market will provide privacy-protecting alternatives." The evidence contradicts this: network effects and data moats create insurmountable barriers to entry, as documented in multiple antitrust cases.
The real agenda? Maintaining a system where extraction continues uninterrupted. Where public investment in digital infrastructure—fiber to every home, municipal broadband, publicly governed platforms—remains unthinkable while billions flow to stock buybacks and executive compensation.
What Surveillance Costs Communities
Let's talk about who pays. Not in abstract "privacy" terms. In material, measurable harm.
Predictive policing algorithms trained on racist data. Tenant screening systems that perpetuate housing segregation. Employment algorithms that filter out disabled applicants. Credit scoring that encodes generational wealth extraction.
These aren't glitches. They're features of a system that monetizes prediction. The more accurately platforms can anticipate your behavior—your likelihood to click, to buy, to default, to dissent—the more valuable their data becomes. And prediction requires history. Your history. Your community's history. The accumulated disadvantages that become self-fulfilling prophecies when encoded in code.
The climate crisis dimension rarely discussed: data centers consuming 2% of global electricity, projected to reach 8% by 2030. Water resources diverted to cool servers while frontline communities face scarcity. Rare earth mining for devices designed for obsolescence. The "cloud" is very, very material. And its environmental costs are externalized onto the same communities already bearing pollution burdens.
This is what "market-based solutions" deliver: efficiency for extraction, inefficiency for justice.
The Organized Alternative
Here's what they genuinely fear. Not individual opt-outs. Not privacy-conscious consumers making "better choices."
Collective action.
The 2023 Writers Guild strike that forced AI concessions. The ongoing organizing at Amazon warehouses that exposes algorithmic management. The European movements that achieved GDPR—not perfect, but a beachhead. The global coalition that stalled the TPP's digital provisions.
These victories share a pattern. They treat digital rights as labor rights, as human rights, as collective goods requiring collective governance. They reject the frame of consumer choice for the framework of worker and community power.
The platforms spend billions on anti-union consultants not because they're afraid you'll delete your account. They're afraid you'll organize your workplace, your neighborhood, your democracy to demand public alternatives.
Municipal broadband in Chattanooga delivers faster, cheaper, more private internet than Comcast offers anywhere. The APP Act proposed in Congress would create genuine data minimization requirements. The EU's Digital Services Act, however imperfect, demonstrates that regulation can force transparency.
These aren't utopian dreams. They're existing models. Underreported, underfunded, under constant industry attack—but real.
The Question They Can't Answer
I'll close with a question that should keep you awake.
If these systems truly served human flourishing—if they genuinely expanded freedom, connection, creativity—why do they require such elaborate architecture to keep us participating? Why the infinite scroll? Why the variable reward schedules? Why the social proof manipulation? Why the constant collection, the perpetual analysis, the ever-expanding surveillance?
Systems that deliver genuine value don't need to engineer addiction. Products that respect users don't require dark patterns. Platforms that facilitate real community don't monetize outrage and division.
The internet could be different. Publicly governed, worker-owned, privacy-respecting, accessible to all. We've built such things before—libraries, public utilities, cooperative enterprises. The obstacles aren't technical. They're political. They're the concentrated power of entities that have captured regulatory processes, colonized public imagination, and convinced us that their extractive model is the only possible one.
It isn't. It never was. And the faster we stop performing consent for systems designed to harvest it, the faster we can build what comes next.
Sources
[Freedom on the Net 2025: An Uncertain Future for the Global Internet](https://freedomhouse.
[Freedom on the Net 2021: The Global Drive to Control Big Tech](https://freedomhouse.
[Big tech is hungry for consumer data. Mass. needs privacy legislation now | Cognoscenti](https://www.wbur.
[The State of Privacy Legislation in the US | Electronic Frontier Foundation](https://www.eff.
[Data Center Energy Consumption | International Energy Agency](https://www.iea.
[Chattanooga's Municipal Broadband Success | Institute for Local Self-Reliance](https://ilsr.