New report documents how easily children can access graphic war images


Violent, disturbing images related to the conflict between Hamas and Israel, including graphic posts showing dead children and adults, are readily available to young users on platforms such as Instagram, researchers have found.

The Institute for Strategic Dialogue, a research group that studies online platforms, created accounts on Instagram, TikTok and Snapchat under the guise of British 13-year-olds. Within a 48-hour period from October 14 to October 16, the researchers said, they found more than 300 problematic posts. More than 78 percent of the posts were on Instagram, and about 5 percent were on Snapchat. These figures were released in a report on Wednesday.

The researchers said they turned on Instagram’s Sensitive Content Control feature and TikTok’s Restricted Mode – meant to protect young users from potentially risky content – before running their search.

Despite these policies and features meant to keep young people safe online, researchers found that horrific content was not difficult to find: 16.9 percent of posts that surfaced when searching the “Gaza” hashtag on Instagram were graphic or violent, compared with 3 percent on TikTok and 1.5 percent on Snapchat. Researchers also found that TikTok’s search function sometimes autofilled phrases such as “dead baby Gaza” and “dead woman Gaza”.

“In times of conflict, where misinformation and disinformation run rampant, it becomes even more important to protect youth from the potential emotional impact of such content and provide the support they need to process and contextualize this type of content,” Isabelle Francis-Wright, an author of the report, said in an emailed statement.

Meta, which owns Instagram, addressed its efforts to balance safety and speech in a blog post about the war on Friday. It noted that it had established a special operations center with expert monitors working in Hebrew and Arabic, which removed or marked more than 795,000 pieces of harmful material in the first three days of the conflict. The company also said that Instagram allows users to control how much sensitive content is recommended to them.

In its own blog post last weekend, TikTok said it had also opened a command center and added more Arabic- and Hebrew-speaking moderators, removing more than 500,000 videos and shutting down 8,000 livestreams since the Hamas attack on October 7. The platform said it was automatically detecting and removing graphic and violent content, placing opt-in screens on disturbing images and adding restrictions to its livestreaming function amid the hostage situation.

Snapchat’s parent company, Snap, said in a statement that it is “strictly monitoring” the platform and “determining any additional measures needed to reduce harmful content.” The company said the platform does not have open newsfeeds or livestreaming capabilities, which it said helps prevent harmful content from going viral.

Amid the flood of posts about the war, some schools have urged parents to delete their children’s online accounts to protect them from Hamas’s psychological warfare efforts. (Hamas accounts are blocked by platforms like Instagram and TikTok but remain active on Telegram.) The chief executive of the parental-monitoring app BrightCanary told USA Today that online searches for hostages among users aged 9 to 13 had increased by 2,800 percent in recent days.

Thierry Breton, a European Commission official who works on issues such as disinformation and digital regulation, sent letters last week calling on TikTok, Meta and X, formerly known as Twitter, to curb the spread of false and violent images related to the conflict in the Middle East.
