A former Meta engineer has launched a lawsuit against the tech giant, accusing the company of bias in its content moderation practices related to the conflict in Gaza. Ferras Hamad, a Palestinian-American who had been part of Meta's machine learning team since 2021, claims he was wrongfully terminated for attempting to fix bugs that caused the suppression of Palestinian posts on Instagram.
The lawsuit, filed in a California state court, outlines several allegations against Meta, including discrimination and wrongful termination. Hamad asserts that the company has exhibited a pattern of bias against Palestinian content, pointing to instances in which internal communications from employees mentioning the deaths of their relatives in Gaza were deleted. He also claims Meta investigated employees' use of the Palestinian flag emoji, while applying no similar scrutiny to the use of Israeli or Ukrainian flags in comparable contexts.
Hamad's dismissal in February 2024 followed his involvement in an emergency procedure, known within Meta as a SEV (site event), which is designed to troubleshoot severe platform issues. According to the complaint, Hamad identified procedural irregularities in the handling of a SEV related to content restrictions on posts by Palestinian Instagram users. One specific incident involved a video posted by Palestinian photojournalist Motaz Azaiza, which was misclassified as pornographic content despite depicting a destroyed building in Gaza.
Hamad contends that he received mixed messages from colleagues about whether he had the authority to address the SEV, even though he had previously worked on similarly sensitive issues concerning Israel, Gaza, and Ukraine, and his manager later confirmed that handling SEVs fell within his job responsibilities. The situation escalated in December 2023, when Hamad flagged these irregularities and tried to resolve them. He was then informed that he was under investigation and subsequently filed an internal discrimination complaint.
Days after lodging his complaint, Hamad was fired. Meta cited his violation of a policy prohibiting employees from working on accounts of people they know personally, referring to Azaiza. Hamad, however, maintains that he has no personal connection to the photojournalist.
Meta has not yet responded to a Reuters request for comment on the lawsuit. The company has faced longstanding criticism from human rights groups regarding its content moderation practices related to Israel and the Palestinian territories. An external investigation commissioned by Meta in 2021 also highlighted these concerns.
The war in Gaza, which erupted after the Hamas attack on Israel on October 7, 2023, has further intensified scrutiny of Meta's content moderation. The attack killed about 1,200 people and saw more than 250 hostages taken, according to Israeli authorities. Israel's subsequent offensive in Gaza has killed over 36,000 Palestinians, according to Gaza health officials, and led to a severe humanitarian crisis.
Amid the conflict, Meta has been accused of suppressing pro-Palestinian expression on its platforms. Nearly 200 Meta employees raised these concerns in an open letter to CEO Mark Zuckerberg and other company leaders earlier this year, describing what they saw as systematic bias in the moderation of content related to the Israeli-Palestinian conflict.
Hamad's lawsuit raises serious questions about Meta's internal policies and its handling of sensitive geopolitical content, and it underscores the broader challenge tech companies face in balancing content moderation with fairness and impartiality. As the case progresses, it could have significant implications for how social media platforms manage content related to contentious global issues, and how they address internal allegations of bias and discrimination.