From 'harass dash' to 'dark social,' 25 Facebook code words revealed in leaked whistleblower documents - and what they all mean

Facebook CEO Mark Zuckerberg testifies before the US Senate. Alex Wong/Getty Images
  • Facebook whistleblower Frances Haugen leaked a trove of documents that Insider has since reviewed.
  • The records provide an inside look at the tech giant's lexicon, from "harass dash" to "eat your veggies."

These Facebook terms and definitions are sourced from disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Frances Haugen's legal counsel. A consortium of news organizations, including Insider, obtained the redacted versions received by Congress.

The majority of this list is sourced from the internal glossary used by Facebook's integrity team. Other terms were found within various company presentations and employee posts included in the Facebook Papers. Facebook did not respond to Insider's request for comment.

1. Eat your veggies

Code word for Facebook's communications protocol surrounding "major/sensitive news feed launches."

2. Cannibalism

A general business concept describing how the growth of one company product may have negative implications for the growth of other company products.

Facebook employees are told to consider "incrementality": to what degree is the growth of one product, such as Instagram, considered "truly incremental growth" when compared with the growth of other products, such as the main Facebook app?

3. Dark social

Term for private encrypted social messaging such as WhatsApp. Misinformation shared on dark social is "hard to detect, track, or block."

4. Harass dash

Short for an internal tool called "the harassment dashboard." The harass dash uses classifiers such as "bullying and harassment," "hate speech," and "violence and incitement" to flag comments that may violate Facebook's community standards.

5. Waterfall

The "primary production pipeline for misinformation."

6. Surge

2020 was "the year of the surge," according to an internal Facebook presentation called "Working Sustainably in 2021 (and beyond)." One slide defined surges as "periods of intense refocusing and doubling down on a set of products, policies, and processes intended to meet the moments we found ourselves in." These "moments" have included the pandemic, racial justice, economic distress, and the presidential election.

7. Lockdown

When a Facebook team is "put into lockdown" to solve a problem, the effort is a "time-bounded, tightly organized" process "with clear ownership and one specific goal." Due to the high-priority status of a lockdown, the affected team "may request additional resources from neighboring teams."

The author of the presentation that defines terms such as surge, lockdown, and war room noted: "... many of our work terms have military connotations - while there is much to admire in the rigorous discipline and courage of these institutions, words matter. Using the language of battle likely subconsciously impacts the lens with which we view our mission and our output."

8. Astroturf

Facebook Groups that are initially populated by fake accounts but become self-sustaining as real accounts join.

9. Blackhole

A blacklisting service for URLs, domains, and IP addresses that may distribute spam or bad content. Facebook uses Blackhole to attach a label such as "OK," "BAD," or "IFFY" to each of these objects, and that label determines how the object can be used and viewed on the site.

10. FAIR

Acronym for Facebook AI Research.

11. Remove, Reduce, Inform

Facebook's framework for fighting misinformation on the integrity team: "we remove the content that violates our policies, reduce the spread of problematic content that does not violate our policies, and provide additional context to help people make informed choices about what they read or share."

12. Depolarizing group

A Facebook Group that enables positive interactions between people who are far apart on the ideological spectrum.

13. CORGI

Extremely complex math that looks for patterns in user behavior and identifies cliques on the platform, which is "very useful" for identifying large-scale attempts at manipulation.

14. Your money, your life

The acronym "YMYL" may refer to posts, pages, websites, or apps which touch on issues affecting a person's livelihood or welfare. "Spam or abuse on such topics can be especially harmful," according to Facebook's integrity team glossary.

15. Faux-tire

Refers to fake satire: "material meant to misinform/push propaganda while trying to self-portray as satire to avoid classification as newsworthy," thus avoiding fact checks.

16. Magnet users

Facebook users who are "hyper-engaged with bad content."

17. Mystery image

Clickbait images that may withhold information "involving superimposed synthetic graphics that put unnatural attention to parts of an image. These are typically low-quality, spammy user experiences."

18. Sacramento

Facebook's code name for "an A/C priv data project." This likely refers to "attorney-client privilege," a legal rule protecting confidential communication between an attorney and his or her client from discovery processes.

19. Speaking

The code name for the "approved talking points" of Facebook's integrity team: "In Integrity, be careful to always stay inside our department's approved talking points ... due to the sensitivity of our topics always preflight any possible messaging through appropriate channels."

20. Critical events

The term used by Facebook's integrity team to reference events such as "elections, crises, and chronic social problems." Critical events may require "operations and tooling to provide an extra focus on proactive sweeps and manually trigger actions to mitigate these highest risk episodes."

21. Divisive content

Facebook's integrity team defines divisive content as "content that ideologically similar people have positive responses to and ideologically different people have negative responses to."

22. False news

Facebook now refers to "fake news" as "false news," calling the former term for misleading public information "politicized."

23. Aligned friend and Cross-cutting friend

An aligned friend is a person who is near you on the ideological map, also known as the "in-group," while a cross-cutting friend is a person who is far from you on the ideological map, or in the "out-group."

24. Blame tool

An internal tool that provides data on "which sources and triggers recommended a bad post."

25. Blue

An adjective referring to the main Facebook app. For example, "blue time" would refer to the total time spent on the main app.
