Bug That Showed Violent Content in Instagram Feeds Is Fixed, Meta Says
Instagram’s parent company Meta apologized on Thursday for violent, graphic content that some users saw in their Instagram Reels feeds, a problem the company attributes to a bug.
“We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended,” a Meta spokesperson said in a statement provided to CNET. “We apologize for the mistake.”
Meta went on to say the incident had nothing to do with any content policy changes the company has made. At the beginning of this year, Instagram made some major changes to its user and content creation policies, but those changes did not specifically address content filtering or inappropriate content appearing in feeds.
Meta’s content moderation changes are recent: the company dismantled its fact-checking program in favor of community-driven moderation. Amnesty International warned earlier this month that Meta’s changes could increase the risk of fueling violence.
Meta says that most graphic or disturbing imagery is removed and replaced with a warning label that users must click through to view the image. Meta said some content is also filtered out for users under the age of 18. The company said it has developed its policies on violent and graphic imagery with the help of international experts, and that refining these policies is an ongoing process.
Users posted on social media and message boards, including Reddit, about unwanted images they saw on Instagram, presumably due to the glitch. These included shootings, beheadings, people struck by vehicles and other acts of violence.
Brooke Erin Duffy, a social media researcher and associate professor at Cornell University, said she doesn’t believe Meta’s claim that the violent content issue had nothing to do with its policy changes.
“Content moderation systems, whether driven by AI or human labor, are never failproof,” Duffy told CNET. “And while many speculated that Meta’s moderation overhaul (announced last month) would create heightened risks and vulnerabilities, yesterday’s ‘glitch’ provides firsthand evidence of the costs of a less-moderated platform.”
Duffy added that despite the difficulty of moderating social media platforms, “platforms’ moderation guidelines have served as a safety mechanism for users, especially those from marginalized communities. Meta’s replacement of its existing system with a ‘community notes’ feature represents a step backward in terms of user protection.”