Saying the Quiet Part Out Loud: Carole Cadwalladr’s Warnings Demand Our Attention
When platforms profit from chaos, democracy pays the price. Cadwalladr shows how data, algorithms, and billionaires reshape our future.
📣 Signal Dispatch — Signals from the field
This isn’t empty alarmism; the evidence is right before us—the data trails, the hidden algorithms, and the platforms profiting from them all.
Rather than tanks in the streets, we’re witnessing a subtler infiltration—where data is quietly collected, online narratives are skillfully shaped and monetized, and our personal information becomes a lucrative asset. Carole Cadwalladr was one of the first to say it aloud on a TED stage: that the democracy we rely on is being dismantled from within, using tools we interact with every single day.
In 2019, she called Facebook a “crime scene” for its role in the UK’s Brexit referendum—revealing how illegal campaign spending and covert targeting on social media had subverted the vote. By the time she returned to TED in 2024, her tone was more urgent. She labeled it a digital coup, with “servers and code” replacing tanks and troops. A handful of platform elites (“broligarchs,” as she puts it) shift entire geopolitical structures in the relentless pursuit of profit and data power.
“No one hands you a manual for this moment. No one teaches you what to do when the world tilts toward something unrecognizable—when democratic norms start to unravel, when institutions feel brittle, when the truth itself becomes a battleground.”
—What Are We Supposed To Do?
Facebook’s Role in the Brexit Referendum
Illegal campaign spending and microtargeting: The UK Electoral Commission found that the official Vote Leave campaign broke election laws by funneling £675,315 to a Canadian data firm (AggregateIQ) for micro-targeted Facebook ads under a joint plan with another group (BeLeave), thereby exceeding legal spending limits by almost £500,000. This illicit funding of targeted social media content evaded oversight and, as Carole Cadwalladr noted, turned Facebook into a “crime scene” for the democratic process.
Data misuse and voter manipulation: Whistleblower evidence revealed that Cambridge Analytica and its affiliates illicitly harvested tens of millions of Facebook profiles to sway voters. Chris Wylie testified to UK MPs that Cambridge Analytica’s “cheating” – including obtaining data on up to 50 million Facebook users – breached UK campaign laws and “may have helped to sway the final Brexit outcome.” He noted that a Canadian company tied to Cambridge Analytica provided Vote Leave’s data analytics, blurring legal lines between campaign spending and foreign data operations. In Cadwalladr’s words, Britain was the “canary in the coal mine” where antiquated election laws were exploited by opaque digital tactics.
“Digital gangsters” and electoral law failures: A 2019 UK parliamentary inquiry concluded that companies like Facebook “should not be allowed to behave like ‘digital gangsters’” operating beyond the law. The committee warned that British electoral rules were “not fit for purpose” in the era of online campaigning, and that the Cambridge Analytica scandal “was facilitated by Facebook’s policies”. It called for stricter regulation to curb the “worst excesses of surveillance capitalism” subverting democracy.
A Canary Turned Cassandra
Cadwalladr’s 2019 “canary” metaphor illustrated how Britain was the early warning sign—where election laws written in a paper age couldn’t detect or deter digital manipulation. Five years on, her 2024 talk shifts the conversation from possibility to certainty. The infiltration of our daily lives by these big-tech “broligarchies” and the data surveillance economy, she argues, is no longer a cautionary tale; it’s a lived reality.
Her callouts are specific:
Platforms decide what we see (and what we don’t).
Billionaires run critical communications infrastructure (think Elon Musk, controlling not just a social platform but satellites).
Narrative manipulation trumps overt censorship—algorithmic invisibility can suffocate dissent more quietly than state-imposed blackouts.
As Cadwalladr suggests, these tech giants haven’t blatantly ‘hijacked’ our democracies so much as we’ve allowed them to embed themselves in our public sphere—a reality we often take for granted because we rely so heavily on their services.
“Autocracy rarely seizes power in a single dramatic moment. It advances slowly and deliberately, thriving on our exhaustion, misinformation, and disengagement.”
—What Are We Supposed To Do?
Data Collection and Surveillance Capitalism
Pervasive harvesting of personal data: Tech platforms sustain their profits through relentless data harvesting. For example, Meta (Facebook) relies on a “surveillance advertising” model, tracking people across millions of websites and apps. Meta’s tracking pixel is embedded on 30% of the world’s top websites, monitoring users’ browsing behavior and even collecting sensitive information (including health and financial data) without direct user action. A 2022 investigation found that one-third of top U.S. hospitals were inadvertently sending patient data to Facebook via this pixel. By Meta’s own admission, even people who don’t use Facebook are swept up in its data dragnet. (A simplified sketch of how such a pixel reports a visit follows this list.)
Data brokers and personal profiles: Our personal information has become currency in an expansive data economy. A 2024 study by Consumer Reports found that, on average, Facebook had data sourced from 2,230 different companies for each user analyzed. In one extreme case, nearly 48,000 companies had shared a single individual’s data with Facebook. Retailers, apps, and shadowy data brokers routinely feed Facebook details about people’s online behavior and offline purchases, which Facebook aggregates into detailed profiles for micro-targeted advertising. This illustrates what Shoshana Zuboff termed “surveillance capitalism” – a system in which every click and personal detail is monetized.
Profit over privacy: The sheer value of personal data is evident in platform finances. Facebook’s parent company Meta derives ~97% of its revenue from advertising, fueled by personal data. When Apple introduced an opt-out for app tracking in 2021, Meta’s ad business took a multibillion-dollar hit – a loss “demonstrating just how valuable your personal data is to its business”. In short, surveillance isn’t a side-effect of their product; it is the product.
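To make the mechanism concrete, here is a minimal sketch of how a third-party tracking pixel can phone home. It is illustrative only, not Meta’s actual Pixel code: the endpoint, parameter names, and pixel ID below are hypothetical placeholders. The essential move is a tiny image request that carries the current page, the referrer, and whatever cookies the platform has already set in the visitor’s browser.

```typescript
// Illustrative sketch only -- not Meta's actual Pixel code.
// A publisher embeds a script like this; each page view fires a 1x1 image
// request back to the platform, carrying browsing context in the query string.
// The endpoint and parameter names are hypothetical placeholders.
function reportPageView(pixelId: string): void {
  const params = new URLSearchParams({
    id: pixelId,                  // which advertiser's pixel fired
    ev: "PageView",               // the event being reported
    dl: document.location.href,   // the page the visitor is currently on
    rl: document.referrer,        // the page they arrived from
    ts: Date.now().toString(),    // timestamp of the visit
  });

  // Requesting a 1x1 image is enough: the browser attaches any cookies the
  // platform has previously set, letting this visit be linked to an existing
  // profile even if the visitor never opens the platform directly.
  const beacon = new Image(1, 1);
  beacon.src = `https://tracking.example.com/tr?${params.toString()}`;
}

reportPageView("PLACEHOLDER_PIXEL_ID");
```

Richer events work the same way: the embedding site simply adds more fields to the request, which is how details like appointment bookings or purchases can end up flowing to the platform alongside the basics above.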
From Potential Threat to Normalized Crisis
For years, Cadwalladr faced lawsuits, public smears, and personal harassment—showing how a “digital authoritarianism” no longer needs physical violence to clamp down on critics. Legal intimidation, social-media trolling, and data-driven attacks often suffice. She’s taken the brunt of it and survived, thanks to support from people around the world.
But the point remains: This machinery is not going away by itself. The digital infrastructure that shapes our discourse—and therefore our democracy—has already been built, and it’s profitable. Early hopes for a natural course correction or more ethical tech oversight haven’t materialized, leaving a pressing need for public pressure, legislative intervention, and cultural resistance.
“They do not fear hashtags. They do not fear viral outrage. They fear sustained, strategic resistance.”
—How To Lose Everything Without Even Trying
Algorithmic Manipulation & Information Warfare
Extremism fueled by algorithms: Internal research at Facebook showed that the platform’s own recommendation tools were driving people toward toxic content. An internal Facebook report in 2016 found “64% of all extremist group joins are due to our recommendation tools”, meaning the algorithm itself – via features like “Groups You Should Join” – was responsible for nearly two-thirds of the extremist communities that users joined. In practice, Facebook’s algorithms were “sucking users down the rabbit hole” of conspiracy theories and hate. This confirms Cadwalladr’s warning that algorithmic curation, not just user choice, can radicalize opinions at scale.
Election interference on a global scale: Facebook has been a conduit for information warfare by hostile actors. During the 2016 US election, Russia’s Internet Research Agency covertly posted roughly 80,000 pieces of divisive content on Facebook that reached an estimated 126 million Americans over two years – nearly half the U.S. voting population. In the same election, Russia-linked operatives bought thousands of Facebook ads (seen by at least 11 million people) to inflame social tensions. U.S. intelligence later concluded these social media attacks aimed to sway the election’s outcome. Such examples underscore that Facebook is not just a bystander but a battlefield where foreign agents wage “information warfare” on democracy.
Disinformation in Brazil’s election: The playbook of digital manipulation has been deployed in democracies worldwide. In Brazil’s 2018 presidential race, an “industry of lies” on social media helped propel Jair Bolsonaro’s candidacy. An investigation revealed that businessmen illicitly funded a campaign to bombard voters with hundreds of millions of fake-news messages about Bolsonaro’s opponent on WhatsApp (a Facebook-owned platform). At least 156 corporate sponsors were implicated in paying for this mass disinformation blitz, skirting Brazil’s election finance laws. This illegal WhatsApp operation – essentially a parallel propaganda machine – exploited encrypted group messaging to spread rumors at scale, a tactic eerily similar to the micro-targeted Facebook ads of Brexit.
Computational propaganda worldwide: Studies by the Oxford Internet Institute show that organized social-media manipulation is now “industrial scale” and present in 81 countries and counting. In 93% of surveyed countries, political disinformation campaigns are regularly using Facebook or Twitter to spread falsehoods. Authoritarian regimes and partisan actors alike employ bots, troll farms, and targeted ads to warp online discourse. This global proliferation of “computational propaganda” highlights that what happened with Brexit or Cambridge Analytica is not an isolated aberration but the new normal in information warfare.
Where Does That Leave Us?
Torch & Tinder Press, in operation only since late February 2025, stands because we see it all unfolding and feel compelled to respond—locally, personally, culturally. There is no neat blueprint for what to do next. But we do know certain truths:
We must name the problem clearly. If it’s a “digital coup,” call it that. If a handful of powerful billionaires are capturing entire communication channels, let’s say so.
We must own our narratives. Narrative sovereignty matters, as Cadwalladr suggests, and we can’t afford for vital truths to be drowned out by an algorithm.
We must refuse to slip into quiet acceptance. “They want us to feel powerless,” she warns. That is by design.
We must practice digital and economic defiance.
We must organize locally. Every community that can provide more for itself increases local resilience and is that much harder to manipulate.
We must practice cultural resistance.
“Today, democracy dies in boardrooms. In algorithm shifts. In financial manipulations. In policies buried so deep in legal jargon that people don’t notice what has changed until it’s already law.”
—How To Lose Everything Without Even Trying
This Isn’t a New Fight—Even If the Emperor Wears a New Coat
Cadwalladr’s efforts connect to a much broader pattern of democracy under siege. She’s been sounding the alarm for several years, yet the situation has only deepened. That said, for most of us, this confrontation with “surveillance capitalism” and “digital coups” can feel startlingly new—especially if we believed that the rule of law or existing tech oversight would save us.
Torch & Tinder Press is new, too. But the reason for our urgency is clear. As Carole Cadwalladr says, “We have to have each other’s backs now. We are the cavalry.” For us, that means forging the resources, networks, and cultural tools to face the crisis. There is no waiting for someone else to “fix it.”
Public Sentiment & Trust Erosion
Eroding public trust in social platforms: The public has grown deeply wary of the impact of big tech on society. Nearly two-thirds of Americans (64%) say that social media have a mostly negative effect on the country’s direction, while only 10% see a positive effect. The top reasons cited are the spread of misinformation, rampant hate and harassment, and the feeling that one can’t know what to trust online. This skepticism is not limited to the US. In the UK and many other countries, trust in content from social platforms is at rock-bottom compared to trust in traditional media. (For instance, Britain’s Ofcom found only 22% of people trust news on social media, vs. ~71% trusting news on TV.) The sense of “truth becoming a battleground” that Cadwalladr described is reflected in these numbers – users feel inundated by false or toxic content and doubt what they see online.
Concerns about misinformation: According to the Reuters Institute’s global survey, 56% of people across 46 countries “worry about distinguishing real news from fake” on the internet. This majority has grown in recent years, indicating rising awareness of the misinformation problem. Importantly, younger audiences increasingly get news from social feeds (or apps like TikTok) and pay less attention to traditional news sources – a shift that leaves them more vulnerable to algorithmically spread falsehoods. Yet even as people rely on these platforms, they express cynicism; in one survey, less than one-third of respondents agreed that having news selected for them by an algorithm was a good idea. The public’s ambivalence – using social media for information but distrusting it – underscores the “normalized crisis” Cadwalladr highlights.
Declining trust in tech companies: The tech industry’s standing has plummeted from its earlier “do no evil” sheen. The Edelman Trust Barometer – an annual global survey – recorded a 10-point drop in trust in the technology sector worldwide between 2019 and 2021, with trust in tech reaching all-time lows in 17 out of 27 countries surveyed. In the United States, tech went from the most trusted industry in 2020 to one of the least trusted by 2021. Scandals from Facebook (Cambridge Analytica, privacy breaches) to Google and Twitter have fed a perception that Big Tech companies put profits over users’ well-being. The 2023 Edelman data showed further erosion, tied to fears of disinformation and societal division. In short, public trust in these platforms’ governance is in tatters. This crisis of trust reinforces Cadwalladr’s argument that we can no longer take tech’s benignity on faith – citizens are demanding accountability.
Concrete Steps, Not Performative Outrage
It’s easy to fall into the “outrage cycle,” fueled by constant shock headlines and online drama. But if we only react with moral indignation, we rarely enact lasting change.
“We have also allowed ourselves to be duped into an ineffective but predictable reactionary role that begins and ends at performative outrage—lacking both compassion and benefit—and with rhetoric that would shame even Chicken Little and The Boy Who Cried Wolf.”
—How To Lose Everything Without Even Trying
Instead, we must build structures that outlast ephemeral fury:
Support investigative reporters through small donations or by amplifying their work, so lawsuits or funding cuts don’t silence them.
Adopt digital self-defense with privacy tools and encrypted platforms.
Foster local resilience through “decentralized communication networks, community-owned media, and mutual aid,” so we’re less reliant on profit-driven infrastructure.
Engage in policy or legal action (where possible) to demand accountability and transparency from tech giants.
Our free field guide, What Are We Supposed To Do?, can help you find your role in the cultural resistance ecosystem.
The fight isn’t about any single victory—rather, it’s about refusing to cede ground in the quiet, everyday spaces where democracy either flourishes or fades away.
Legal Threats to Journalists & Civic Discourse
SLAPP lawsuits silencing critics: After reporting on these issues, Carole Cadwalladr endured a multi-year libel suit from Brexit financier Arron Banks – a case widely condemned as a Strategic Lawsuit Against Public Participation (SLAPP) aimed at intimidating her. Banks sued Cadwalladr personally over comments in her 2019 TED Talk and a tweet, dragging her through court for over three years. In 2022 the High Court dismissed Banks’s claims, ruling that Cadwalladr’s journalism was in the public interest. Press freedom groups hailed the verdict as a “landmark” win, but not before Cadwalladr incurred heavy personal costs. (She faces an order to pay a portion of Banks’s legal fees, an ordeal observers say will chill others from speaking out.) This case exemplifies how deep-pocketed individuals can deploy legal harassment to threaten and exhaust journalists – even when the claims ultimately fail.
Rising tide of legal intimidation: Cadwalladr’s battle is part of a broader pattern of legal threats to civic discourse. Reporters Without Borders and media coalitions note an increase in powerful figures using defamation suits and privacy laws to muzzle journalists and critics. The UK, in particular, has become a hub for libel claims by the wealthy – so much so that London’s courts are a go-to destination for “libel tourism” and SLAPP actions. In 2021, the UK was downgraded to “partially open” on the global free expression index, with a key factor being the chilling effect of legal harassment on journalists. This trend of legal intimidation – effectively privatized censorship via the courts – shows how democratic checks and balances (a free press, open debate) are being eroded not by overt government crackdowns, but by expensive lawsuits that most citizens or reporters cannot afford to fight.
Living at the Crossroads
Cadwalladr acknowledges an existential dread in returning to TED after being sued, harassed, and overshadowed by legal threats. Yet she persists. That kind of fear is part of the climate we’re in—fear that we, too, will get hammered by a lawsuit or targeted online pile-on. But the alternative is silence, which is complicity.
We are at that crossroads. If we step back and wait for someone else to defend these freedoms, we risk losing them by default.
What Do We Do About It?
That question resonates through Carole Cadwalladr’s talks—and echoes in the hearts of those newly awakening to how close democracy is to the edge.
“Resisting not just politically, but culturally, socially, and psychologically. Because in the end, freedom and the survival of democracy isn’t just about laws or elections—it’s about people. It’s about whether enough of us are willing to push back, to hold the line, and to create the kind of world where freedom doesn’t just survive but thrives.”
—What Are We Supposed To Do?
We can channel that pushback into cultural resistance: building alliances across communities, stepping outside social media echo chambers to meet real neighbors, and staying vigilant about who shapes our information space. Whether or not we have legal backing at any given moment, persistent solidarity and resourcefulness can still hold power to account.
Endgame or New Beginning?
Cadwalladr’s final call isn’t about surrender; it’s about rejecting the helplessness that creeping authoritarianism wants us to feel. Yes, the situation is dire—and yes, we must choose whether to be silent or become a part of a broad-based, innovative resistance.
For our part at Torch & Tinder Press—despite being newly formed—we feel the same urgency. We’re forging what we can, trying to step outside purely reactive outrage, and working to build real connections. That’s our way of refusing to let freedom die. This is how we practice.
“Because when history looks back at this moment, there will be only two kinds of people: Those who fought. And those who lost everything—without even trying.”
—How To Lose Everything Without Even Trying
Freedom is a practice. Resistance is an ecosystem.
Learn More
Cadwalladr, Carole. “Facebook’s role in Brexit—and the threat to democracy.” TED Talk, April 2019.
Cohen, Lena. “Mad at Meta? Don’t Let Them Collect and Monetize Your Personal Data.” Electronic Frontier Foundation, January 17, 2025. https://www.eff.org/deeplinks/2025/01/mad-meta-dont-let-them-collect-and-monetize-your-personal-data.
Consumer Reports and The Markup. “Who Shares Your Information with Facebook?” Study release, January 2024. (Findings reported by) Wes Davis, “48,000 companies sent Facebook data on a single person,” The Verge, January 17, 2024. https://www.theverge.com/2024/1/17/24041897/facebook-meta-targeted-advertising-data-mining-study-privacy.
Edelman. 2021 Edelman Trust Barometer (Global Report). Edelman, 2021. (Key statistic reported in CBS News, April 1, 2021, “Tech sector sees global decline in trust.”)
Lomas, Natasha. “It’s official: Brexit campaign broke the law – with social media’s help.” TechCrunch, July 17, 2018. https://techcrunch.com/2018/07/17/its-official-brexit-campaign-broke-the-law-with-social-medias-help/.
Ingram, David. “Facebook says 126 million Americans may have seen Russia-linked posts.” Reuters, October 31, 2017. https://www.reuters.com/article/facebook-russia-126mn/facebook-says-126-million-americans-may-have-seen-russia-linked-political-posts-idUSKBN1CZ2OF.
Oxford Internet Institute (Howard, Philip et al.). “Industrialized Disinformation: 2020 Global Inventory of Organized Social Media Manipulation.” University of Oxford, 13 Jan 2021. (Press release:) “Social media manipulation by political actors an industrial scale problem – Oxford report.” https://www.ox.ac.uk/news/2021-01-13-social-media-manipulation-political-actors-industrial-scale-problem-oxford-report.
Paul, Kari. “‘It let white supremacists organize’: the toxic legacy of Facebook’s Groups.” The Guardian, February 4, 2021. https://www.theguardian.com/technology/2021/feb/04/facebook-groups-misinformation?source=techstories.org.
Phillips, Tom. “Bolsonaro business backers accused of illegal WhatsApp fake news campaign.” The Guardian, October 18, 2018. https://www.theguardian.com/world/2018/oct/18/brazil-jair-bolsonaro-whatsapp-fake-news-campaign.
Reporters Without Borders (RSF). “UK: RSF welcomes landmark judgement in SLAPP case against investigative journalist Carole Cadwalladr.” Reporters sans frontières press release, June 13, 2022. https://rsf.org/en/uk-rsf-welcomes-landmark-judgement-slapp-case-against-investigative-journalist-carole-cadwalladr.
Siddique, Haroon. “Arron Banks loses libel action against reporter Carole Cadwalladr.” The Guardian, June 13, 2022. https://www.theguardian.com/uk-news/2023/feb/28/arron-banks-loses-two-of-three-challenges-to-failed-libel-action-against-carole-cadwalladr.
Scott, Mark. “Cambridge Analytica helped ‘cheat’ Brexit vote and US election, claims whistleblower.” POLITICO, March 27, 2018. https://www.politico.eu/article/cambridge-analytica-chris-wylie-brexit-trump-britain-data-protection-privacy-facebook/.
UK Electoral Commission. “Vote Leave fined and referred to the police for breaking electoral law.” Official report, July 17, 2018. https://www.electoralcommission.org.uk/media-centre/vote-leave-fined-and-referred-police-breaking-electoral-law.
Waterson, Jim, and Matthew Weaver. “Leave.EU fined £70,000 over breaches of electoral law.” The Guardian, May 11, 2018. https://www.theguardian.com/politics/2018/may/11/leaveeu-fined-70k-breaches-of-electoral-law-eu-referendum.
Auxier, Brooke. “64% of Americans say social media have a mostly negative effect on the country.” Pew Research Center, October 15, 2020. https://www.pewresearch.org/fact-tank/2020/10/15/64-of-americans-say-social-media-have-a-mostly-negative-effect-on-the-way-things-are-going-in-the-u-s-today/.
Coster, Helen. “Fewer people trust traditional media, more turn to TikTok for news – report.” Reuters, June 13, 2023. (Summarizing the Reuters Institute Digital News Report 2023.) https://www.reuters.com/business/media-telecom/fewer-people-trust-traditional-media-more-turn-tiktok-news-report-says-2023-06-13/.
Explore more from Torch & Tinder Press
📣 Signal Dispatch — Signals from the field (you’re here)
🔧 Torchlight Praxis — Tools you can carry
🔥 Embers — Warmth for the long winter
Stay connected: Instagram · Bluesky · Facebook
Support & community editions: Ko-fi



