Facebook Workers Accuse Company Of Bias Against Arabs And Muslims

Earlier this month, a Facebook software engineer from Egypt sent an open message to his colleagues, warning them that “Facebook is losing trust among Arab users.”

During the Arab Spring of 2011, he said, Facebook was a “tremendous help” for activists who used it to communicate, but during the ongoing Palestinian–Israeli conflict, censorship — either perceived or documented — had made Arab and Muslim users skeptical of the platform. As evidence, the engineer included a screenshot of Gaza Now, a verified news outlet with nearly 4 million followers, which, when liked on Facebook, prompted a “disappointing” pop-up message that said, “You may want to review – Gaza Now to see the types of content it usually shares.”

“I tried liking as many Israeli news pages as I could, and ‘not a single time’ did I receive a similar message,” the engineer wrote, implying that the company’s systems were biased against Arabic content. “Are all of these occurrences the result of a model bias?”

Even after clicking the like button, Facebook users were asked if they were certain they wanted to follow a Gaza Now page, prompting one employee to wonder if this was an example of anti-Arab bias.

The post sparked a chain reaction of responses from other colleagues. One person inquired as to why an Instagram post by actor Mark Ruffalo about Palestinian displacement had been labeled as containing sensitive content. Another claimed that Facebook’s artificial intelligence and human moderators suspended ads from Muslim organizations raising funds during Ramadan with “completely benign content.”

“We may see our communities migrate to other platforms.”

“I fear we are at a tipping point where the next mistake will be the straw that breaks the camel’s back, and we may see our communities migrate to other platforms,” another Facebook employee wrote about the growing distrust among Arab and Muslim users.

While Israel and Hamas have agreed to a cease-fire, Facebook must now deal with a sizable number of employees who have been debating whether the world’s largest social network is anti-Muslim and anti-Arab. Some are concerned that Facebook is selectively enforcing its moderation policies regarding related content, while others believe it is overly enforcing them. Still others are concerned that it may be biased toward one side or the other. One thing they all have in common is the belief that Facebook is once again bungling enforcement decisions in the aftermath of a politically charged event.

While some instances of perceived censorship across Facebook products have been attributed to bugs, such as one that prevented users from posting Instagram stories about Palestinian displacement and other global events, others, such as Gaza-based journalists being blocked from WhatsApp and millions of accounts being made to follow a Facebook page supporting Israel, have not been explained. BuzzFeed News also reported earlier this month that Instagram had mistakenly banned content about the Al-Aqsa Mosque, the site where Israeli soldiers clashed with worshippers during Ramadan, because the platform associated its name with a terrorist organization.

“It truly feels like an uphill battle trying to get the company at large to acknowledge and put in real effort instead of empty platitudes into addressing the real grievances of Arab and Muslim communities,” one employee wrote in an internal human rights discussion group.

The situation has become so tense within the company that a group of about 30 employees banded together earlier this month to file internal appeals to restore content that they believe was improperly blocked or removed from Facebook and Instagram.

“This is extremely important content for us to have on our platform, and we have the impact that comes from social media showcasing the on-the-ground reality to the rest of the world,” one member of that group wrote on an internal forum. “People all over the world rely on us to provide a window into what is going on in the world.”

The perception of anti-Arab and anti-Muslim bias is affecting the company’s brands as well. According to one internal post, the Facebook and Instagram apps have recently been flooded with negative ratings on both the Apple and Google mobile app stores, owing to declines in user trust caused by “recent escalations between Israel and Palestine.”

According to NBC News, some employees contacted both Apple and Google in an attempt to remove the negative reviews.

“Are we responding to people’s censorship protests with more censorship? That is the root cause right here,” one commenter said in response to the post.

“This is the result of years and years of implementing policies that simply do not scale globally.”

“This is the result of years and years of implementing policies that simply do not scale globally,” they went on to say. “For example, internal definitions classify a sizable portion of some populations as terrorists. As a result, our manual enforcement systems and automations are biased.”

Facebook spokesperson Andy Stone acknowledged that the company had made mistakes and stated that a team of Arabic and Hebrew speakers is on the ground monitoring the situation.

“We are aware that there have been a number of issues that have impacted people’s ability to share on our apps,” he said in a statement. “While we have fixed them, they should never have happened in the first place, and we apologize to anyone who felt unable to bring attention to important events, or who believed this was a deliberate suppression of their voice. This was never our intention, and we never want to silence a specific community or point of view.”

On February 4, 2011, anti-government protesters in Cairo hold a sign referencing Facebook, which was instrumental in organizing protesters in Tahrir Square.

Social media companies, including Facebook, have long claimed that their platforms democratized information during the 2011 Arab Spring uprisings against repressive Middle Eastern regimes. Mai ElMahdy, a former Facebook employee who worked on content moderation and crisis management from 2012 to 2017, stated that the social network’s role in revolutionary movements was one of the primary reasons she joined the company.

“I was in Egypt during the revolution, and I saw how Facebook was a major tool for us to use to mobilize,” she explained. “Until now, whenever they want to brag about something in the region, they always bring up the Arab Spring.”

Her time at the company, however, soured her view of Facebook and Instagram. Based in Dublin, she oversaw the training of content moderators for the Middle East and criticized the company for being “US-centric” and for failing to hire enough people with management experience in the region.

“I remember someone saying in a meeting that maybe we should remove content that says ‘Allahu akbar’ because that could be related to terrorism.”

“I remember someone saying in a meeting that maybe we should remove content that says ‘Allahu akbar’ because that could be related to terrorism,” ElMahdy said, recalling a discussion from more than five years ago about the Muslim religious exclamation, which means “God is great.”

Stone stated that the phrase does not violate Facebook’s policies.

Jillian C. York, the Electronic Frontier Foundation’s director of international freedom of expression, has studied content moderation within the world’s largest social network and claims that the company’s approach to enforcement around content about Palestinians has always been haphazard. In her book Silicon Values: The Future of Free Speech Under Surveillance Capitalism, she writes that the company’s blunders, such as the blocking of accounts belonging to journalists and a political party in the West Bank, prompted users to create the hashtag #FBCensorsPalestine.

“I do agree that it may be worse now simply because of the conflict, as well as the pandemic and the subsequent increase in automation,” she said, noting how COVID-19 has impacted Facebook’s ability to hire and train human moderators.

Ashraf Zeitoon, Facebook’s former head of policy for the Middle East and North Africa region; ElMahdy; and two other former Facebook employees with policy and moderation expertise also attributed the lack of sensitivity to Palestinian content to the political environment and a lack of internal firewalls. Facebook’s public policy team, whose job includes keeping governments happy, also weighs in on the platform’s rules and what should or shouldn’t be allowed, an arrangement that can create conflicts of interest in which lobbyists pressure how content is moderated.

According to Zeitoon, this benefited Israel, where Facebook had dedicated more personnel and attention. Facebook hired Jordana Cutler, a former adviser to Israeli Prime Minister Benjamin Netanyahu, to oversee public policy for a country of about 9 million people; Zeitoon, as head of public policy for the Middle East and North Africa, was responsible for more than 220 million people across 25 Arab countries and territories, including the Palestinian territories.

Employees at Facebook have expressed concerns about Cutler’s role and the interests she prioritizes. In a September interview with the Jerusalem Post, she was referred to as “our woman at Facebook,” with Cutler explaining that her job “is to represent Facebook to Israel, and Israel to Facebook.”

“We have meetings every week to discuss everything from spam to pornography to hate speech, bullying, and violence, and how they relate to our community standards,” she explained in an interview. “In these meetings, I represent Israel. It is critical to me that Israel and the Jewish community in the Diaspora have a voice at these meetings.”

Zeitoon, who recalls arguing with Cutler about whether the West Bank should be considered “occupied territories” under Facebook’s policies, said he was “shocked” to see the interview. “At the end of the day, you are a Facebook employee, not an Israeli government employee,” he said. (The United Nations considers the West Bank and the Gaza Strip to be Israeli-occupied territory.)

According to Zeitoon and others, Facebook’s commitment to Israel shifted internal political dynamics. ElMahdy and another former member of Facebook’s community operations organization in Dublin claimed that Israeli members of the public policy team would frequently exert pressure on their team regarding content removal and policy decisions. During their time at Facebook, they claimed there was no real counterpart who directly represented Palestinian interests.

“The role of our public policy team around the world is to help ensure that governments, regulators, and civil society understand Facebook’s policies, and that we at Facebook understand the context of the countries where we operate,” said Stone, the company spokesperson. He also mentioned that the company now has a policy team member who is “focused on Palestine and Jordan.”

Cutler did not respond to an interview request.

ElMahdy recalls discussions at the company about how the platform would handle mentions of “Zionism” and “Zionist” — terms associated with the restoration of a Jewish state — as proxies for “Judaism” and “Jew.” Like many mainstream social media platforms, Facebook’s rules give mentions of “Jews” and other religious groups special protection, allowing the company to remove hate speech that targets people based on their religion.

According to ElMahdy, members of the policy team pushed for “Zionist” to be treated as synonymous with “Jew,” and guidelines extending special protections to the term were eventually put into effect after she left in 2017. Earlier this month, the Intercept published Facebook’s internal rules for content moderators on how to handle the term “Zionist,” suggesting that those rules created an environment that could stifle debate and criticism of the Israeli settler movement.

Facebook acknowledged in a statement that the term “Zionist” is used in political debate.

“Under our current policies, we allow the term ‘Zionist’ in political discourse, but we remove attacks on Zionists in specific circumstances, when there is context to show it’s being used as a proxy for Jews or Israelis, which are protected characteristics under our hate speech policy,” Stone explained.

Children hold Palestinian flags near the site of a Gaza house destroyed by Israeli airstrikes on May 23, 2021.

As users around the world complained that their content about Palestinians was being blocked or removed, Facebook’s growth team put together a document on May 17 to assess how the conflict in Gaza affected user sentiment.

Israel, with 5.8 million Facebook users, was the top country in the world for reporting terrorism-related content under the company’s rules.

Among its findings, the team concluded that Israel, with 5.8 million Facebook users, was the top country in the world in terms of reporting content under the company’s terrorism rules, with nearly 155,000 complaints in the previous week. With approximately 550,000 total user reports in that same time period, it ranked third in flagging content under Facebook’s policies for violence and hate violations, outpacing more populous countries such as the United States, India, and Brazil.

In an internal human rights discussion group, one Facebook employee wondered whether the Israeli reports had any impact on the company’s alleged overenforcement of Arabic and Muslim content. According to the employee, while Israel has slightly more than twice as many Facebook users as the Palestinian territories, people in the country reported 10 times as much content under the platform’s terrorism rules and filed more than eight times as many complaints for hate violations.

“When I look at all of the above, it made me wonder,” they wrote, including internal links and a 2016 news article about Facebook’s compliance with Israeli takedown requests, “are we ‘consistently, deliberately, and systematically silencing Palestinian voices?’”

For years, activists and civil society organizations have questioned whether pressure from the Israeli government, in the form of takedown requests, has influenced Facebook’s content decisions. In a report this month, the Arab Center for the Advancement of Social Media said it tracked 500 content takedowns across major social platforms during the conflict and suggested that “the efforts of the Israeli Ministry of Justice’s Cyber Unit — which over the past years submitted tens of thousands of cases to companies without any legal basis — is also behind many of these reported violations.”

“In accordance with our standard global process, when a government reports content that does not violate our rules but is illegal in their country, we may restrict access to it locally after conducting a legal review,” Stone explained. “We do not have a separate procedure for Israel.”

As the external pressure grew, an informal team of about 30 Facebook employees filing internal complaints attempted to triage a situation that their leaders have yet to publicly address. They had more than 80 appeals about content takedowns about the Israeli–Palestinian conflict as of last week and discovered that a “large majority of the decision reversals [were] due to false positives from our automated systems,” specifically around the misclassification of hate speech. In other cases, videos and photos of police and protesters were mistakenly removed due to “bullying/harassment.”

“This has increased people’s distrust of our platform and reaffirmed their concerns about censorship,” one engineer wrote.

The situation has also taken a toll on the company’s small number of Palestinian and Palestinian American employees. Earlier this week, an engineer who identified as a “Palestinian American Muslim” published a post titled “A Plea for Palestine,” in which they asked their colleagues to understand that “standing up for Palestinians does not equate to anti-Semitism.”

“I feel like my community has been silenced in a sort of societal censorship, and I feel like I am complicit in this oppression by not making my voice heard,” they wrote. “To be honest, it took me a while to even put my thoughts into words because I genuinely fear that if I speak up about how I feel or try to raise awareness amongst my peers, I will receive an unfortunate response that will be extremely disheartening.”

Though Facebook executives have since formed a special task force to expedite appeals of content takedowns related to the conflict, they appear satisfied with the company’s handling of Arabic and Muslim content amid the escalating tensions in the Middle East.

“We just told 2 billion Muslims that we mistook Al Aqsa, their third holiest site, for a dangerous organization.”

In an internal update last Friday, James Mitchell, a vice president in charge of content moderation, stated that while there had been “reports and perception of systemic over-enforcement,” Facebook had “not identified any ongoing systemic issues.” He also stated that the company had been using “high-accuracy precision” terms and classifiers to flag content for potential hate speech or incitement of violence, allowing it to be automatically removed.

He stated that his team was committed to conducting a review to determine what the company could do better in the future, but he only acknowledged a single error, “incorrectly enforcing on content that included the phrase ‘Al Aqsa,’ which we immediately fixed.”

According to internal documents obtained by BuzzFeed News, the fix was not immediate. A separate post from earlier this month revealed that Facebook’s automated systems and moderators “deleted” at least 470 posts mentioning Al-Aqsa over a five-day period, attributing the removals to terrorism and hate speech violations.

Mitchell’s update left some employees unsatisfied.

“I also find it deeply troubling that we have high-accuracy precision classifiers and yet we just told 2 billion Muslims that we mistook their third holiest site, Al Aqsa, for a dangerous organization,” one employee responded to Mitchell.

“At best, it sends a message to this large group of our audience that we don’t care enough to get something so basic and important to them right,” they went on to say. “At worst, it contributed to the stereotype that ‘Muslims are terrorists,’ as well as the notion that free speech is restricted for certain populations.”
