Hamas has been banned from Facebook, Instagram and TikTok. Yet posts supporting the group that carried out terrorist attacks in Israel this month are still reaching massive audiences on social networks, spreading horrifying footage and political messages to millions of people.
According to a review by The New York Times, many accounts sympathetic to Hamas have gained hundreds of thousands of followers on social platforms since the war between Israel and Hamas began on October 7.
One account on Telegram, the popular messaging app that has little moderation, reached more than 1.3 million followers this week, up from about 340,000 before the attacks. That account, Gaza Now, is associated with Hamas, according to the Atlantic Council, a research group focused on international relations.
“We have seen Hamas content on Telegram, such as bodycam footage of terrorists firing at Israeli soldiers,” said Jonathan A. Greenblatt, the chief executive of the Anti-Defamation League. “We have seen pictures of bloodied and dead soldiers not only on Telegram but also on other platforms.”
Such posts are the latest challenge for technology companies as many of them try to reduce the spread of false or extremist content while protecting content that doesn’t violate their rules. In past conflicts, like the genocide in Myanmar and previous violence between Palestinians and Israel, social media companies have struggled to strike the right balance, with watchdog groups criticizing their responses as too limited or, at times, overzealous.
Experts said Hamas and Hamas-linked social media accounts are now taking advantage of those challenges to evade restrictions and share their messages.
Terrorist organizations and extremist content have long been banned on most online platforms. Facebook, Instagram, TikTok, YouTube and X (formerly Twitter) have banned accounts linked to Hamas or posts sympathetic to its cause, saying they violate their content policies against extremism.
Gaza Now had more than 4.9 million followers on Facebook before it was banned last week, shortly after The Times contacted Facebook’s parent company Meta about the account. Gaza Now did not post the horrific content found on Telegram, but it did share allegations of wrongdoing against Israel and encouraged its Facebook followers to subscribe to its Telegram channel.
Gaza Now also had more than 800,000 collective followers on other social media sites before many of those accounts were deleted last week. It had more than 50,000 subscribers on its YouTube account before it was suspended on Tuesday.
In a statement, a YouTube spokesperson said Gaza Now violated company policies because the channel owner previously operated an account on YouTube that was terminated.
Experts said Telegram has emerged as the most obvious launching pad for pro-Hamas messaging. Accounts there have shared videos of captured prisoners, dead bodies and destroyed buildings, with followers often reacting with thumbs-up emojis. In one example, users instructed each other to upload gruesome footage of Israeli civilians being shot to platforms such as Facebook, TikTok, Twitter and YouTube. The comments also included suggestions on how to alter the footage to make it more difficult for social media companies to easily find and remove it.
Telegram also hosts an official account for the al-Qassam Brigades, the military wing of Hamas. Its follower count has tripled since the conflict began.
Telegram chief executive Pavel Durov wrote in a post last week that the company had “removed millions of clearly harmful pieces of content from our public platforms.” But he indicated that the app would not completely ban Hamas, saying that the accounts “serve as an unparalleled source of first-hand information for researchers, journalists and fact-checkers.”
“Although it would be easy for us to destroy this source of information, doing so risks escalating an already serious situation,” Mr. Durov wrote.
X, which is owned by Elon Musk, was filled with falsehoods and extremist content as soon as the conflict began. Researchers at the Institute for Strategic Dialogue, a political advocacy group, found that in one 24-hour period, a collection of posts supporting terrorist activities on X received more than 16 million views. The European Union said it would investigate whether X violated European law that requires large social networks to prevent the spread of harmful content. X did not respond to a request for comment.
Yet accounts not directly claimed by Hamas pose difficult challenges for social media companies, and users have criticized the platforms for being overzealous in removing pro-Palestinian content.
Thousands of Palestinian supporters said Facebook and Instagram suppressed or removed their posts, even though the messages did not violate the platforms’ rules. Others reported that Facebook had suppressed accounts that called for peaceful protests in cities around the United States, including a planned sit-in in the San Francisco area over the weekend.
Meta said in a blog post on Friday that Facebook may have inadvertently removed some content as it worked to respond to a surge in reports of content violating the site’s policies. The company said some of those posts were hidden because of a bug in Instagram’s system that caused pro-Palestinian content not to appear in its Stories feature.
Masoud Abdulatti, founder of the health care services company MedicalHub, who lives in Amman, Jordan, said Facebook and Instagram had blocked his posts supporting Palestinians, and that he had turned to LinkedIn to share support for civilians trapped in Gaza amid the conflict.
“The people of the world are ignorant of the truth,” Mr. Abdulatti said.
Iman Belassi, a copywriter who lives in Egypt’s Sharqiya governorate, said she usually used her LinkedIn account only for business networking, but when she felt Facebook and Instagram were not showing the full picture of the devastation in Gaza, she began posting about the war there.
“This may not be the place to share war news, but sorry, the level of injustice and hypocrisy is unbearable,” Ms. Belassi said.
The challenges reflect the blunt content moderation tools that social networks are increasingly relying on, said Kathleen Carley, a researcher and professor at the CyLab Security and Privacy Institute at Carnegie Mellon University.
Many companies rely on a mix of human moderators, who can be quickly overwhelmed during a crisis, and computer algorithms, she said, with no coordination between platforms.
“Unless you have consistent content moderation for the same story across all the major platforms, you’re just playing whack-a-mole,” Ms. Carley said. “It’s going to reemerge.”
Sheera Frenkel contributed reporting.