Have you seen the video of the guy who, as Reddit put it, “got folded by an elephant” that’s been going around Instagram? If so, well, whoops, that wasn’t supposed to happen: Meta says an unexpected deluge of blood, guts, animal abuse, and dead bodies that recently turned Instagram feeds into a gory horror show was a “mistake,” and it’s sorry everyone had to see that.
It wasn’t just the elephant thing, to be clear: There were also reportedly videos of beheadings, shootings, trench warfare in Ukraine, someone getting gored by a bull, someone else having his head set on fire, a double homicide at a gas station, a guy falling out of a carnival ride and getting “splattered on the concrete,” and various other clips of graphic violence and its aftermath. Some porn too, apparently.
“I seen four hundred and forty three people get shot today,” one redditor wrote, and I would normally shrug that off as an exaggeration but in this particular case, perhaps not.
It’s obviously disturbing stuff, and most of the posts about it are appropriately horrified, although there are some darkly funny reactions too. One redditor, who said his Instagram account was suspended for “making a couple fat jokes,” said he’d witnessed “nudity, sex, rape, murder, stabbings, beheadings, [and] castration” videos and wanted to know how all of that wasn’t in violation of Instagram guidelines too.
“Somehow the video I saw of a homeless guy getting humped by a dog is the most normal video I saw today on IG,” another redditor wrote.
Quite a few people decided the best course was to lay off Instagram until the situation was sorted. “I’ve seen like five people get shot, seven fights, and one girl get stabbed and bled to death. Like a genuine river of blood came out of her,” one redditor wrote. “I’m off that shit for a min.”
Naturally, there was also confusion about why all this stuff was being surfaced at all. A number of redditors suspected something had gone very wrong with Instagram’s algorithms, and sure enough, that seems to be exactly what happened.
“We have fixed an error that caused some users to see content on their Instagram Reels feed that should not have been recommended,” a Meta rep told The Guardian. “We apologise for the mistake.” The company also said that the sudden influx of gore was unrelated to Meta’s recent decision to remove fact-checkers from Facebook and Instagram.
The problem is apparently fixed now, and frankly I’m glad I missed it. There was a time, a great many years ago, when sitting through stuff like this was kind of a rite of passage: You weren’t really on the internet until you’d watched the video of the guys getting necklaced or the extreme bicycle accident in the shopping mall parking lot. And I don’t know if it’s youthful folly I’ve outgrown, or if I’ve just had my fill of awfulness, but either way I’m tired of all that and I really don’t want to see any more.
Which makes it kind of infuriating that this happened in the first place: I’m not interested in policing what other people want to look at, but I really don’t want Mark Zuckerberg’s busted-ass machine feeding me material I specifically do not want blasted into my eyes, especially not as a result of a glitch in the matrix deciding that a video of a guy getting “turned into pink mist in a car crash” is what I really need in my life right now.
The incident might also lead one to wonder why all this stuff is on Instagram in the first place. It turns out that Meta does in fact have a detailed policy regulating “violent and graphic content,” but it’s surprisingly lenient: For instance, images or videos “depicting a person’s violent death” get a warning screen and an 18+ age gate, but are otherwise allowable.