
Facebook, Twitter, YouTube… It is not only machines and algorithms that filter the millions of posts, videos, photos and tweets. Artificial intelligence has not reached the point where it can replace human intervention and, paradoxically, the more users there are on the networks, the more a human eye is required to vet the huge flow of data that constantly zips across the planet.

Their work is hush-hush, carried out behind closed doors. But recently some have managed to pry those doors open, allowing us a glimpse inside.

They work at a breakneck pace, under stressful and isolated conditions, for low pay and on ambiguous contracts. A lot of the simple “filtering” takes place in countries like India and the Philippines, but there are also country-specific moderators, reflecting the restrictions, laws and particular characteristics of each individual nation. A checker in Manila cannot be expected to understand the offensive subtleties of an Italian dialect, just to give one example…

A furtive, submerged world, but one that is becoming increasingly decisive. Because social media exists and thrives on two basic, non-negotiable precepts: emotional engagement and the customer experience of those using the platform. In the case of Facebook especially, something is starting to give. The level of conflict, rage and division into “partisan packs” competing against one another has grown to the point that people are abandoning Zuckerberg’s little game in droves and turning to other social networks. This mechanism of rejection is also starting to develop on Twitter and YouTube, with some tentative inroads into what was always considered the most benign of the social networks, Instagram, which has long since become the preferred platform for the under-20s.

So, in the face of the steady rise of “haters” on social networks, the role of moderators and checkers is becoming ever more important.

One of the first people to analyse and study these workers – giving the job its specific name, Commercial Content Moderation (CCM) – was Sarah Roberts, Assistant Professor of Information Studies at UCLA (https://gseis.ucla.edu), who spoke exclusively to Estreme Conseguenze:

EC: How long have you been studying this phenomenon of “checkers”?

SR: Interest in these issues is relatively recent. My field of study and research centres on people who work as professional “checkers” in an organised manner. I make this distinction because, until a short time ago, this kind of work was often done by volunteers – the best-known example being Wikipedia. The major social networks, though, need to organise that staff and pay them. And in secret.

We are talking about huge numbers of people the world over. A rough estimate suggests there are at least 100,000 checkers. Strategies vary, and knowledge of the language and the cultural characteristics of every single nation is a vital factor. In all likelihood there is an “Italian team”, just as there will be a French one or a German one, because the trend among these big tech companies is to have ever…
