What Frances Haugen Revealed — and Why It Matters Beyond America

In October 2021, Frances Haugen walked into a United States Senate hearing room and changed the global conversation about social media. A former Facebook data scientist, Haugen had spent months before her resignation secretly copying tens of thousands of pages of internal company documents. What those documents showed was unambiguous: Facebook's own research confirmed the platform amplified misinformation, worsened teenage girls' mental health, stoked political division, and in some regions contributed directly to ethnic violence — and company leadership had repeatedly chosen to prioritise engagement metrics and profit over addressing these documented harms.

Haugen told senators that Facebook "harms children, stokes division and weakens our democracy." Internal Instagram research she disclosed showed that 32 percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse, and that 13.5 percent reported the platform made thoughts of suicide worse. Facebook's own researchers had concluded that they "make body image issues worse for one in three teen girls", findings the company had never made public. At the heart of Haugen's testimony was an architectural argument: Facebook's engagement-based algorithm was structurally designed to surface content that triggered strong emotional reactions, because those reactions maximised time on platform. Outrage, fear, and divisive content consistently outperformed calm, accurate information in the system's own metrics. The problem was not a handful of bad actors; it was a platform built to reward them.
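The mechanism is simple enough to sketch in a few lines of code. The toy model below is not Meta's system: the class, the weights, and the example posts are all invented for illustration. Reporting on the Facebook Papers described real weighting schemes of this general shape (at one point an "angry" reaction reportedly counted for several times a plain like), but the numbers here are hypothetical. The sketch shows the one property Haugen's argument turns on: a ranker that optimises a weighted engagement score alone hands the widest distribution to whatever provokes the strongest reactions, with no term for accuracy or harm.

```python
# Illustrative toy model of engagement-based ranking.
# All names, weights, and posts are hypothetical: this is not Meta's code,
# only a sketch of the dynamic Haugen described, in which content that
# provokes strong reactions earns wider distribution.

from dataclasses import dataclass, field

@dataclass
class Post:
    title: str
    reactions: dict = field(default_factory=dict)  # predicted reaction counts

# Hypothetical weights: emoji reactions, comments, and shares counting for
# more than a plain like. The exact numbers are made up.
ENGAGEMENT_WEIGHTS = {"like": 1.0, "comment": 4.0, "share": 5.0, "angry": 5.0}

def engagement_score(post: Post) -> float:
    """Score a post by weighted predicted engagement, the only signal
    a purely engagement-based ranker optimises for."""
    return sum(ENGAGEMENT_WEIGHTS.get(kind, 0.0) * count
               for kind, count in post.reactions.items())

feed = [
    Post("Calm, accurate public-health update",
         {"like": 120, "comment": 8, "share": 10}),
    Post("Outrage-bait rumour about a rival community",
         {"like": 40, "comment": 90, "angry": 150, "share": 60}),
]

# Rank purely by engagement: the rumour wins distribution, because nothing
# in the objective measures accuracy or downstream harm.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):7.1f}  {post.title}")
```

Run as written, the rumour scores roughly seven times higher than the accurate post, purely because anger, comments, and shares are the signals the objective counts.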

Haugen published a memoir in 2023, The Power of One, and founded the nonprofit Beyond the Screen to campaign for platform transparency and algorithmic accountability. A feature film based on the disclosures, The Social Reckoning, written by Aaron Sorkin, is currently in production. Four years after her Senate testimony, the critique Haugen articulated has only grown more urgent as these platforms continue expanding their footprint in countries with weaker regulatory infrastructure. Bangladesh sits squarely in that category.

Bangladesh's Scale of Exposure

Bangladesh is among the world's largest Facebook markets by raw user count. As of early 2025, Facebook had approximately 60 million users in the country according to DataReportal, representing 34.3 percent of the total population and 77 percent of the country's internet user base. By November 2025, NapoleonCat data placed that figure above 72 million. The 18-to-24 age group is consistently the largest segment, accounting for over 30 million users. Bangladesh's total social media user base grew 22.3 percent between January 2023 and January 2024, adding nearly 10 million user identities in a single year.

These numbers carry specific structural significance. For tens of millions of Bangladeshis, particularly those outside major urban centres, Facebook is not one platform among many. It functions as the primary gateway to news, public health information, political discourse, and community life. This is precisely the configuration Haugen identified as most dangerous: when a platform becomes the internet for a population, its design failures and content moderation gaps do not stay contained to individual users. They become structural features of the information environment an entire society depends on every day.

A Decade of Documented Harm

The evidence of Facebook's content moderation failures as they affect Bangladesh does not require Haugen's leaked documents — it is built from more than a decade of recorded incidents on the ground. Bangladesh has experienced repeated episodes of Facebook-amplified communal violence. In 2012, the country's first major Facebook-triggered incident saw thousands attack a Buddhist enclave in Cox's Bazar after a manipulated photograph circulated on the platform. In subsequent years, multiple episodes followed the same pattern: posts alleging Islamic blasphemy — frequently fabricated or stripped of context — spread rapidly through algorithmically amplified reach before crowds mobilised into real-world violence against Hindu, Buddhist, and Christian minority communities.

The structural dynamic is consistent with what Haugen described: in communities already marked by religious tension, content alleging desecration of Islam is exceptionally potent at generating the intense, high-velocity engagement Facebook's algorithm rewards with broader distribution. By the time moderation intervened — if it did — the content had already reached millions. Foreign Policy documented this pattern in detail, noting that Facebook's ad hoc approach to content moderation in Bangladesh was producing stark consequences and that the platform's Bengali-language moderation capacity remained chronically inadequate relative to user volume.

The Myanmar Rohingya genocide provides the most extreme documented case of Facebook algorithmic harm in this region, and its consequences fall directly on Bangladesh, which now hosts over one million Rohingya refugees in camps around Cox's Bazar. Amnesty International's 2022 report, drawing partly on Haugen's disclosed documents, found that Meta's algorithms had proactively amplified and promoted anti-Rohingya content from as early as 2012, years before the 2017 military campaign that drove more than 730,000 Rohingya across the border in a matter of months. A UN fact-finding mission had already concluded in 2018 that Facebook had been a "useful instrument" for vilifying the Rohingya. Internal documents showed that in mid-2014, Facebook had a single Burmese-speaking content moderator, based in Dublin, covering 1.2 million active users in Myanmar. By 2018 that number had grown to approximately 100, still wholly insufficient. An internal 2019 document noted that action was taken against only approximately two percent of hate speech on the platform.

Research examining Bangladesh's own July–August 2024 political crisis (the quota reform protests and the Monsoon Revolution) found that Facebook, WhatsApp, and Telegram collectively amplified ideological polarisation, propagated disinformation, and contributed to real-world violence and community persecution. A 2025 study posted on Preprints.org found Facebook's Bengali-language moderation remained under-resourced, allowing harmful content to circulate without systematic review. According to Carnegie Endowment research, as of 2020 approximately 84 percent of Meta's misinformation prevention efforts were directed toward the United States, leaving 16 percent for the rest of the world combined.

Regulation Aimed Inward, Not at Platforms

Bangladesh's legislative response to social media has been directed almost entirely at controlling citizens' speech on these platforms rather than holding the platforms themselves accountable. The Digital Security Act 2018 (DSA), widely condemned by press freedom organisations and described as "draconian" by Amnesty International, criminalised online speech critical of the government or deemed harmful to religious sentiments. From September 2018 to January 2023, 7,001 cases were filed under the DSA, 60 percent of them for Facebook activity. Ruling party affiliates were the largest group filing cases against journalists. Only two percent of the accused saw their cases resolved in court. The law functioned primarily as an instrument of political suppression, not as a framework for platform accountability.

In September 2023, the government replaced the DSA with the Cyber Security Act (CSA). Amnesty International described it as a "replication of the draconian" predecessor, retaining repressive provisions with only minor penalty adjustments. After the August 2024 Monsoon Revolution, the interim government adopted the Cyber Security Ordinance 2025, removing some of the most-prosecuted provisions, but the free-expression organisation ARTICLE 19 noted that the ordinance retained vaguely worded clauses that could still be used to suppress speech. More critically, the Ordinance extended regulatory scope to social media platforms under a framework granting sweeping powers to the Bangladesh Telecommunication Regulatory Commission (BTRC) without corresponding requirements for algorithmic transparency, human rights impact assessment, or accountability mechanisms governing how platforms actually operate.

The pattern is consistent: every major digital law Bangladesh has enacted targets what users say, not what platforms do to amplify, suppress, or algorithmically shape what users see. There is no legislation governing algorithmic accountability. There is no requirement for Meta to conduct or disclose human rights impact assessments for Bangladesh. There is no mandatory standard for Bengali-language content moderation capacity proportionate to Bangladesh's user base.

The Youth Dimension

The impact of Facebook's algorithmic design on Bangladeshi youth carries particular weight given the platform's dominant demographic. Users aged 18 to 24 constitute the largest single Facebook user group in Bangladesh — over 30 million people — and the actual population accessing the platform is younger still, given widespread underage account creation. Haugen's testimony specifically identified the feedback loop young users experience: Instagram's own research showed that as teenage girls consumed content related to eating disorders and body dissatisfaction, the algorithm served more of it, and their emotional engagement deepened while their wellbeing declined. The system was, in Haugen's framing, actively making vulnerable users more vulnerable in order to keep them on platform longer.
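The shape of that loop can be made visible with a toy simulation. Everything below is invented for illustration (the parameter names, the starting share, the linear engagement model); it is not a claim about Instagram's actual parameters. It shows only the structural point from Haugen's testimony: when the recommender shifts a feed toward whatever the user engaged with, and exposure itself raises engagement, the share of that content compounds until it saturates the feed.

```python
# Toy simulation of the feedback loop described in Haugen's testimony:
# an engagement-optimising recommender shifts a feed toward whatever the
# user engaged with, and exposure itself raises engagement. Every number
# here is invented for illustration; nothing is a real platform parameter.

def simulate(steps: int = 20) -> None:
    share = 0.10          # hypothetical starting share of the feed that is
                          # appearance-comparison content
    susceptibility = 0.6  # hypothetical rate at which exposure converts
                          # into measured engagement
    for step in range(steps):
        engagement = susceptibility * share
        # The recommender amplifies what was engaged with, capped at
        # the whole feed.
        share = min(1.0, share * (1.0 + engagement))
        print(f"step {step:2d}: {share:6.1%} of feed")

simulate()
```

Under these made-up numbers the share creeps up slowly at first, then accelerates and saturates the entire feed within twenty steps: the compounding, not any single recommendation, is what does the damage.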

For Bangladesh, where digital literacy education is not systematically embedded in secondary school curricula, where parental digital literacy frequently lags children's platform knowledge, and where the gender distribution of social media users already reflects significant structural inequalities — 63 percent male, 37 percent female as of early 2025 — these dynamics interact with existing social vulnerabilities in ways that demand dedicated policy attention. Research published in 2025 found that increased Facebook usage among Bangladeshi youth correlated with higher depression and anxiety scores, with young women aged 15 to 18 disproportionately affected by appearance-based social comparison content. The mechanisms Haugen described as harmful in the American context are operating identically — and with less regulatory protection — in Bangladesh.

What Accountability Would Actually Require

Haugen's consistent argument since 2021 is that content moderation alone cannot solve a problem rooted in algorithmic architecture. As long as Facebook's engagement-based system rewards outrage, fear, and divisive content with greater reach, reactive removal of individual pieces of harmful content will remain insufficient. Changing this requires either voluntary platform redesign — which Meta has shown no meaningful inclination toward — or regulatory requirements that mandate it.

For Bangladesh, meaningful accountability would start with algorithmic transparency: requiring Meta to disclose how its recommendation system functions in Bangladesh, what content receives amplification, and how Bengali-language content performs through the moderation pipeline. It would require demonstrated Bengali-language moderation capacity proportionate to a user base now exceeding 70 million. It would engage with the emerging global legal standard — including a 2024 US Court of Appeals ruling finding that Section 230 immunity protections do not apply to TikTok's recommendation algorithm, and ongoing litigation against Meta in the US and Kenya — establishing that platforms can be held liable for algorithmically amplified harm, not just user-uploaded content.

What Frances Haugen made undeniable with internal corporate evidence in 2021 — that Facebook knowingly deployed an algorithm it understood to cause harm, and chose profit over safety — remains the operating reality for 70 million Bangladeshi users today. The platform's architectural choices directly shape the information environment in which political discourse, religious tensions, minority safety, and public health communication all operate in Bangladesh. The disclosures changed the global conversation. In Bangladesh, the policy response has yet to follow.

win-tk.org is a wintk publication covering global and regional affairs with a focus on Bangladesh and South Asia.