
Facebook Automatically Generates Extremist Content

Despite its repeated declarations about fighting extremist speech, Facebook has been accused of hosting and automatically generating content for terrorist and Nazi groups.


“Facebook has produced its famous commemorative videos for extremist groups, which have included images of terrorist attacks.”

The complaint was filed with the US authorities by an anonymous whistleblower. In it, the informant supports his allegations with a study conducted over five months, which found that during this period Facebook removed fewer than 30% of users publicly associated with terrorist and/or extremist groups.

In addition, the study shows that Facebook has used its automatic content-generation tools negligently.

The social network has allowed the creation of business and place pages for terrorist organizations, through which users affiliated with these groups share terrorist propaganda, are listed as “workers” of the organization, and even recruit by offering positions within the group.

An example is the page of a Syrian Salafist militant group called Hay’at Tahrir Al-Sham (HTS), known as the al-Qaeda of Syria. On this page, users listed themselves as “workers” who were mujahideen, that is, fighters who exercise force in the name of jihad.

In total, the research identified up to 31 business and place pages automatically generated by Facebook for these extremist groups, which also drew on Wikipedia information about the groups to fill in the pages’ information sections.


Coupled with this, Facebook has also produced its famous commemorative videos for extremist groups, which have ended up showcasing images of terrorist attacks and racist violence.

The informant’s study analyzed the profiles of 3,228 Facebook users who expressed some affiliation with terrorist or hate groups. Of these profiles, 317 displayed a flag or symbol indicative of their ideology, and the vast majority had published propaganda or violent content.

To carry out the study, the informant searched for the names of these terrorist organizations and then reviewed the friend lists of 12 users who publicly identified themselves as terrorists. In doing so, the informant located the other roughly 3,000 users, from different parts of the world and all with extremist affiliations.

Although the company has reported on its work against extremist content on multiple occasions, this complaint raises alarms about the lack of rigor in Facebook’s community policy enforcement systems.
