Why Meta’s Hire of Jordana Cutler Could Erode Public Trust: A Reputation Perspective

Meta’s credibility as a platform that serves billions globally depends heavily on its ability to maintain neutrality, especially in moderating political content. In a world where geopolitical conflicts are complex and polarising, maintaining the trust of users, regardless of their political affiliations, is paramount. This has become a particular challenge for Meta with the controversial hire of Jordana Cutler as Facebook’s Public Policy Director for Israel and the Jewish diaspora. Her past role as a senior advisor to Israeli Prime Minister Benjamin Netanyahu is now raising significant concerns over bias, particularly with regard to the Israeli-Palestinian conflict.

The Perception of Bias

From a reputation management standpoint, the perception of bias can be as damaging as actual bias. When a platform that claims to be neutral hires someone with strong governmental ties—especially to one side of a deeply contentious issue like the Israeli-Palestinian conflict—it risks alienating entire user groups. Jordana Cutler’s background, especially her close ties to the Israeli government, raises legitimate concerns about how Meta handles content moderation. While her role might not directly influence specific content decisions, the optics alone are troubling.

Advocates for Palestinian rights have long accused Meta of disproportionately censoring pro-Palestinian content. Reports by digital rights organisations have documented numerous cases where Palestinian and Lebanese content has been flagged, removed, or suppressed across Facebook and Instagram. In some instances, posts critical of Israel or expressing solidarity with Palestine were classified as inciting violence or promoting “terrorist” organisations.

One prominent example involved the removal of content about the Sheikh Jarrah protests in East Jerusalem, where Palestinian activists documented violence and displacement. Digital rights groups such as 7amleh reported over 500 cases of such content being restricted or removed during a brief period. Similarly, posts containing the hashtag #AlAqsa, referring to the Al-Aqsa Mosque, were blocked on Instagram, with the platform labelling them as being associated with violence. These restrictions disproportionately affect Palestinian voices, making it harder for them to share their stories and advocate for their rights.

Further supporting this concern, Meta employees themselves have flagged instances where pro-Palestinian accounts were banned. According to internal reports, prominent pro-Palestinian activist groups like Columbia University Students for Justice in Palestine had their Instagram accounts suspended, supposedly for violating vague community standards. Meta employees accused Jordana Cutler of directly escalating the ban of these accounts. These examples reinforce the perception that Meta’s moderation policies favour Israeli narratives, even when this perception may not align with Meta’s intentions.

Shadow Banning and Content Suppression

Another critical issue is the allegation of “shadow banning” pro-Palestinian content, where users’ posts are either deprioritised or invisible to their audience without being formally removed. Multiple digital rights groups have flagged this issue, claiming that content tagged with certain phrases or emojis—such as those associated with Palestinian solidarity—may be subtly hidden from feeds, reducing engagement without alerting the user.

A report by Access Now, a global digital rights organisation, documented hundreds of complaints about the removal or down-ranking of pro-Palestinian posts, particularly during heightened moments of the Israeli-Palestinian conflict. Sada Social, a Palestinian digital rights group, echoed these findings, pointing out that content related to Palestinian rights was systematically removed or suppressed across platforms like Facebook and Instagram. In many cases, users were not informed about the removal of their posts, or they received generic explanations referencing vague “community standards” violations.

This lack of transparency and consistency in content moderation disproportionately impacts Palestinian and Lebanese users, creating an atmosphere where these groups feel silenced. The optics of having someone like Jordana Cutler, with her strong ties to the Israeli government, in charge of policy only deepens this mistrust. Even if there is no direct interference from Cutler, the perception that pro-Palestinian voices are being stifled because of her leadership is damaging enough.

Trust and Corporate Responsibility

Building and maintaining public trust is essential for any global platform. For Meta, this trust is especially critical in regions where content has the potential to inflame tensions and deepen political divides. When users believe that a platform is aligned with one political viewpoint—especially on issues as polarising as the Israeli-Palestinian conflict—it undermines Meta’s global credibility.

Meta’s handling of Palestinian and Lebanese content has been under scrutiny for years. The company’s reputation is further damaged when it fails to be transparent or fair in how it moderates content. With Jordana Cutler at the helm of policy for Israel, users and critics alike fear this bias may become institutionalised. It’s not merely about what actions Cutler might take; the perception of bias alone can erode trust and hurt Meta’s standing as a fair platform.

Given the nature of social media, content moderation must be perceived as unbiased. If certain voices feel stifled or censored—especially those advocating for Palestinian rights—public trust quickly disintegrates. A company like Meta must be acutely aware of this dynamic and take active steps to ensure that its policies, and those enforcing them, do not disproportionately harm any group, especially those advocating for human rights.

The Risks of Perception Over Reality

From a reputation perspective, it bears repeating: the perception of bias can be as harmful as actual bias. Even if Jordana Cutler’s conduct in the role is entirely above reproach, the fact that she served in high-ranking positions in the Israeli government raises legitimate concerns. Critics are right to question whether her presence will further tilt Meta’s policies towards favouring Israeli content at the expense of Palestinian voices.

The repeated removal of Palestinian and Lebanese content exacerbates these concerns. The lack of transparency in Meta’s decision-making processes, combined with the controversy surrounding Cutler’s appointment, adds fuel to the fire. In this highly charged environment, where users on both sides of the conflict are highly sensitive to perceived slights, Meta risks being seen as a partisan platform.

Damage Control: What Meta Should Do

Meta’s challenge now is to navigate this situation carefully, ensuring that both its actions and the public’s perception of its actions reflect fairness. This includes not only hiring practices but also transparent content moderation policies that allow for independent oversight and public accountability. Meta must take tangible steps to show that its policies will be applied fairly to both Israeli and Palestinian content, regardless of the political background of those enforcing these policies.

Additionally, Meta could engage with digital rights organisations, particularly those representing Palestinian and Lebanese voices, to rebuild trust and demonstrate a commitment to impartiality. Acknowledging mistakes, offering clear appeals processes for content that’s removed, and fostering transparency in moderation decisions are essential to reversing the damage caused by this hire.

In conclusion, while Jordana Cutler’s hire brings valuable experience to Meta, it also carries significant reputational risks. The combination of her political background and the pre-existing perception of bias in Meta’s content moderation creates a perfect storm for distrust. Meta must act swiftly and transparently to address these concerns, or it risks losing the trust of key user bases in regions already feeling marginalised.

References:

1. +972 Magazine: Criticisms regarding Meta’s content policies and the perception of bias against Palestinian voices.

2. AJC: Jordana Cutler’s role and background, including her time as a senior advisor to Benjamin Netanyahu.

3. Daily Dot: Reporting on the role of Cutler in content moderation decisions affecting pro-Palestinian accounts.

4. QUT DMRC: Analysis of content moderation biases during conflicts and the suppression of Palestinian voices.
