Meta has issued an apology following a technical issue that led some users to see violent and graphic videos in their Instagram Reels feed.

“We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended,” a Meta spokesperson stated. “We apologize for the mistake.”

Several Instagram users reported on Tuesday that their Reels feed was filled with disturbing videos showing people being beaten or killed, many of which were marked as “sensitive content.”

Meta has not provided further details on what caused the glitch.

The glitch comes as Meta works to increase short-form video engagement across its platforms, especially with TikTok's future in the U.S. uncertain. TikTok has just over a month to secure a new, non-Chinese owner for its U.S. operations or face a potential ban under a law signed by President Joe Biden last year, a deadline President Donald Trump extended in January.

Instagram has been working to attract users concerned about the potential loss of TikTok by rolling out features similar to those on its biggest competitor, such as longer video time limits and a “tap to pause” option for Reels.

In addition, Meta plans to launch a new video creation app called Edits in the coming weeks, positioning it as a rival to CapCut, the video editing tool owned by TikTok parent ByteDance and widely used by TikTok creators.

Meta has also recently introduced significant and controversial changes to its content moderation policies. In January, the company announced it would eliminate fact-checkers and replace them with a user-generated “Community Notes” system to add context to posts. Meta also stated it would scale back its automated content removal systems, focusing only on the most extreme violations, such as terrorism, child sexual exploitation, drugs, fraud, and scams.

When CEO Mark Zuckerberg unveiled the moderation changes, he acknowledged that the company would likely “catch less bad stuff” on its platforms but emphasized that this would allow more free speech.

However, a Meta spokesperson said the technical error behind the violent videos was unrelated to the company's recent content moderation changes.
