Facebook: New revelations put the company under further pressure



Exclusive

As of: 10/25/2021, 1:59 p.m.

Internal documents available to NDR, WDR, and SZ show persistent problems at Facebook: polarizing algorithms, weaknesses in moderation, especially outside the US, and little popularity among younger users.

By Svea Eckert, Lena Kampf and Georg Mascolo, WDR / NDR

More than a thousand internal documents, employee chats, and internal Facebook studies were made available to the US Congress by whistleblower Frances Haugen. Haugen, a data specialist, worked at Facebook from 2019 to 2021 as a product manager on the Civic Integrity team, which was tasked with developing strategies to protect users from false information ahead of the US election.

At the beginning of October, Haugen testified before the US Congress; today she will appear before the British Parliament. The whistleblower is now advised by a larger team of lawyers, some of them close to the Democratic Party.

Whistleblower Haugen drew attention to the problems. (Image: EPA)

A European media consortium shared the documents with NDR, WDR, and the “Süddeutsche Zeitung” (SZ), which evaluated them further. They reveal new details from the inner workings of the company. Facebook itself describes the leaked documents as incomplete, taken out of context, and no longer up to date. These are the main new findings:

1. Facebook itself knows how quickly users drift toward extreme content

In test runs with fictitious users, internal Facebook researchers demonstrated impressively how quickly Facebook suggests dangerous content. “Carol Smith” and “Karen Jones” were registered on Facebook in the summer of 2019. Both are 41 years old, come from a small town in the southeastern United States, and even stated the same interests on Facebook: parenting and Christianity. But Carol supports Donald Trump, while Karen is a fan of Bernie Sanders.

A handful of likes was enough: after just a few days, the platform was showing them extremely hateful, misogynistic, and derogatory content. “Carol” was also recommended pages from QAnon, a conspiracy narrative.

The Facebook researchers' conclusion: by “primarily following our own recommendation systems,” the content took on “rather worrying, polarizing traits” within an “extremely short period of time.” The researchers repeated the experiment in India, with the result that it took only a few days for the recommendations to turn into an endless stream of hatred and agitation against Muslims.

2. The artificial intelligence is worse than Facebook claims

Tens of thousands of people review problematic content on behalf of Facebook. These so-called content moderators often cannot keep up with deleting it. That is why Facebook wants to replace people with machines. For years, Facebook boss Mark Zuckerberg has praised the great progress that Facebook’s systems have made in detecting hate speech, atrocity videos, or terrorist content.

Some of Facebook’s employees distrust this promise of salvation. “We will probably never have a model that detects the majority of integrity violations, especially in sensitive areas,” wrote a researcher in 2019. He cites a detection rate of two percent; another internal study puts it at three to five percent. Facebook counters that it is becoming increasingly unlikely that users will even encounter hate speech on the platform, a metric the company considers more relevant than the detection rate.

3. Outside the US, things look even worse

Take Arabic as an example: although more than 220 million users speak Arabic, making them the third-largest language group on the platforms, there is an acute shortage of moderators for the various countries. An internal study criticizes the fact that the moderators come mainly from Morocco and Syria and cannot properly assess content from, for example, Yemen or the Gulf region.

The internal report makes clear that Facebook classifies almost every country in which Arabic is the main language as a “high risk country.” Developing artificial intelligence that is supposed to detect hatred and agitation before content is shown to users is expensive, and in languages other than English it performs very poorly. In Arabic, for example, the algorithms cannot distinguish between quotations from the Koran and actual calls for violence. Facebook says it employs 15,000 moderators for more than 70 languages around the world, including Arabic as spoken in Yemen, Libya, Saudi Arabia, and Iraq.

4. Facebook is losing young users

In a large number of charts and tables, internal evaluations show that the young target group is migrating away. Many have long regarded Facebook as their parents’ platform, and Instagram, too, is becoming increasingly “uncool.” The company wants to counteract this: so far, Messenger Kids is its only product for children; officially, a minimum age of 13 applies to all other apps. In reality, however, Facebook can hardly control who signs up. A children’s version of Instagram was therefore planned, though it has now been paused following the criticism.

Profits come first

Whistleblower Frances Haugen comments on the documents in an interview: “I have seen again and again how Facebook handles it when there is a conflict between profit and safety.” Facebook regularly resolves these conflicts in favor of profit, she says. According to Haugen, solutions already exist that would curb hatred and agitation more effectively. In the end, however, the marketing department almost always makes the decision, and it always bets on growth.

Asked by NDR, WDR, and SZ, Facebook contradicts this impression: “The amount of hate speech has decreased for three consecutive quarters since we started reporting on it. This is due to improvements in the proactive detection of hate speech and changes to the ranking in the news feed.”

The so-called #FacebookFiles, first reported on by the “Wall Street Journal,” were made available to journalists by an employee of the US Congress, in some cases redacted and anonymized. In Europe, the documents were evaluated by the “Süddeutsche Zeitung,” NDR, and WDR, together with “Le Monde” from France, the Swiss publishing house Tamedia, the Danish newspaper “Berlingske,” the Belgian magazine “Knack,” and the investigative platform OCCRP.


www.tagesschau.de
