JAMES CLAYTON (REPORTER): I decided to look at some of the content being created and uploaded about Kyle Rittenhouse on the platform and then asked the company what they thought about it.
...
CLAYTON: First off, that footage we played earlier: YouTube removed it after I showed it to them, saying it flouted the platform's rules. A pretty clear-cut case. Well, maybe not, because listen to this.
...
CLAYTON: It's very similar to the first clip, a training video recreating a mass shooting with targets. Yet YouTube decided this one was okay and left it up, although it did age-restrict it. There are dozens of other videos we found that clearly glorify Rittenhouse. Take this one: someone has created a video of a platform-style computer game with Rittenhouse as a character shooting protesters. The video was taken down when we showed it to YouTube. What about this song?
...
CLAYTON: It pretty obviously glorifies Kyle Rittenhouse. Again, YouTube took it down when I showed it to them. Here's the thing, though: there are plenty of posts on the platform using this exact song that still haven't been taken down. What you may be noticing by now is how arbitrary and inconsistent a lot of this moderation appears to be. In the absence of defined rules around Rittenhouse, users will inevitably push the envelope. This is Angelo Carusone from Media Matters, which monitors extremism online.
ANGELO CARUSONE: While some pro-Rittenhouse content still gets through on Facebook, the reason why YouTube is actually like a safe haven and truly an engine of pro-Rittenhouse content is because they haven't bothered to grapple with it at all, and it is a reflection of a much bigger issue with how YouTube is doing business these days.
CLAYTON: A business that includes allowing merchandise celebrating Kyle Rittenhouse and his actions to be sold via YouTube. We found these kinds of adverts and showed them to YouTube, which promptly removed them. But once again, it took a journalist to point this out to them. YouTube said: "Our community guidelines prohibit any violent or graphic content intended to shock viewers. We take swift action to remove content flagged by our community that violates those policies." The company may say that, but experts studying this, and our own research into Rittenhouse, suggest the platform could do a lot more to stop hosting extremism.