Source: http://www.mashable.com
Mashable: The regulator has given the social media giant a week to fix the problem, which includes Facebook's acceptance of advertisements promoting ethnic cleansing.
Facebook has been given seven days to address hate speech on its platform connected to next month's election, according to Kenya's National Cohesion and Integration Commission (NCIC), a government body that works to end racial and ethnic prejudice among the country's 45 tribes.
If it fails to comply, the platform could be banned from the country. The agency issued its warning immediately after the international NGO Global Witness and the legal non-profit Foxglove published a study showing that Facebook approved advertisements, in both English and Swahili, designed to incite ethnic violence.
Ahead of the Kenyan elections, the two groups collaborated on research examining Facebook's ability to detect hate speech and incitement to ethnic violence. In its analysis, Global Witness noted that the country's politics are divisive and organized along ethnic lines: during the 2007 elections, for instance, some 1,300 people were killed and hundreds of thousands more were driven from their homes. Far more people use social media today than in 2007, and more than 20% of Kenyans are on Facebook, where hate speech and misinformation are serious problems.
The organizations chose not to publish the specific test advertisements they submitted because they were so inflammatory, but the ads drew on real examples of hate speech commonly used in Kenya, including comparisons of specific tribal groups to animals and calls for the rape, beheading, and killing of their members. Global Witness noted that "all hate speech instances in both (English and Swahili) were accepted, much to our astonishment and alarm." The NCIC said the NGOs' study corroborates its own findings.
After the organizations notified Facebook of the research and asked for comment on the findings, Meta published a post outlining its election preparations for Kenya. In it, the company said it had built more sophisticated content-detection technology and hired dedicated teams of Swahili speakers in order to "delete dangerous information swiftly and at scale." To check whether Facebook had actually made the adjustments needed to improve its detection systems, the groups resubmitted their test advertisements. Once again, the ads were approved.
In a statement to Gizmodo, a Meta spokesperson said the company has taken "extensive measures" to detect hate speech on the platform and is "intensifying these efforts" ahead of Kenya's national elections. It also acknowledged, however, that some content might slip through, since "both robots and humans make errors."
Global Witness found comparable results when Facebook was used to spread calls for ethnic cleansing against Rohingya Muslims in Myanmar, and the findings fit a pattern the group observed in Ethiopia, where bad actors used Facebook to incite violence. The groups, along with Facebook whistleblower Frances Haugen, are now urging Facebook to implement the emergency "Break the Glass" measures it deployed after the January 6, 2021 attack on the US Capitol. They are also asking the social network to suspend paid digital advertising in Kenya until after the elections on August 9.