The former employee who has accused Facebook of putting profit over personal safety testified before members of the U.S. Senate today.
On Sunday, 37-year-old Frances Haugen appeared on U.S. television, drawing high praise from many quarters, as the whistleblower behind a series of damaging reports in the Wall Street Journal that have heaped political pressure on social media giant Facebook.
A native of Iowa City, she studied electrical and computer engineering at Olin College and earned an M.B.A. from Harvard. She has worked at several Silicon Valley companies, including Google, Pinterest and Yelp.
Haugen told the CBS news program "60 Minutes" that Facebook prioritized making money over doing what was good for the public.
The whistleblower hailed as a “21st century American hero” by Senator Ed Markey today says the company’s platforms – including Instagram – knowingly harm children and fuel division.
"Facebook is like Big Tobacco, enticing young kids with that first cigarette," she said. "A first social media account designed to keep kids as users for life."
Frances Haugen told the U.S. Senate committee that the tech giant is aware of the damage its sites can do to mental health and democracy, but has not made changes because it is afraid of losing users.
Facebook has denied the claims. Haugen accused the company of studying children as young as eight for marketing purposes, telling committee members that Facebook's own research found its platforms, including Instagram, are negatively affecting young people's mental health.
Ms Haugen has also alleged the company prioritises profits over safety.
Here is a summary of the points that Ms. Haugen raised today:
- Facebook intentionally targets teens including children under the age of 13, Haugen says her documents show.
- The lack of transparency around how Facebook's algorithms work makes the platform impossible to regulate, Haugen says.
- Senators repeatedly compared Facebook to Big Tobacco, suggesting the platform may face regulation similar to what cigarettes received in the past.
- The platform does not dedicate equal research and resources to misinformation and hate speech in non-English content, Haugen says, fueling violence in places like Ethiopia.
- Haugen stressed that Facebook tends to rely on artificial intelligence to automate moderation because it is cheaper, even though it catches only about 10-20% of offending content.
- Haugen suggested a number of measures to regulate Facebook, including an independent government body staffed by former tech workers who understand how the algorithms work.