Facebook will cause more unrest as algorithms promote divisive content, whistleblower says
Facebook will fuel more violent unrest around the world because of the way its algorithms are designed to promote divisive content, whistleblower Frances Haugen told the British parliament on Monday.
Haugen, a former product manager on Facebook's civic misinformation team, appeared before a parliamentary select committee in Britain that is examining plans to regulate social media companies.
She said the social network treated safety as a cost centre, lionised a start-up culture in which cutting corners was acceptable, and was "unquestionably" making hate worse.
"The events we're seeing around the world, things like Myanmar and Ethiopia, those are the opening chapters because engagement-based ranking does two things: one, it prioritises and amplifies divisive and polarising extreme content and two it concentrates it," she said.
Facebook did not immediately comment on Haugen's appearance before the parliamentary committee.
Haugen in October told a Senate Commerce subcommittee hearing that Facebook had devised ways to keep users scrolling even if it was detrimental to their wellbeing, putting profit before people.
She also said she provided the documents used in a Wall Street Journal investigation and a Senate hearing on Instagram's harm to teenage girls. She compared the platform to addictive substances such as tobacco and opioids.
Facebook CEO Mark Zuckerberg has hit back against Haugen's accusations, saying earlier this month: "The argument that we deliberately push content that makes people angry for profit is deeply illogical."
BRITISH INTERIOR MINISTER SEEKS TOUGHER LAWS
Before Monday's hearing, Haugen met the country's interior minister, Priti Patel, who advocates tougher legislation for tech platforms that fail to keep users safe.
Haugen is scheduled to speak at the Web Summit, a major tech conference, next week, and to address European policymakers in Brussels.
"Facebook has been unwilling to accept even little slivers of profit being sacrificed for safety, and that's not acceptable," she said on Monday, singling out Instagram's impact on the mental health of some young users.
Britain is bringing forward laws that could fine social media companies up to 10% of their turnover if they fail to remove or limit the spread of illegal content, such as child sexual abuse material.
Platforms such as Facebook will also need to do more to protect children from exposure to grooming, bullying and pornography, the government has said.
Reuters, along with other news organisations, viewed documents released to the U.S. Securities and Exchange Commission and Congress by Haugen.
They showed Facebook knew it had not hired enough workers with both the language skills and the knowledge of local events needed to identify objectionable posts from users in a number of developing countries.