In an emailed statement, a YouTube spokesperson said the company has made “significant progress in our work to combat hate speech on YouTube since the tragic attack in Christchurch.” Citing the platform’s strengthened 2019 hate speech policy, the spokesperson said there had been “a 5-fold spike in the number of hate videos deleted from YouTube.” YouTube has also changed its recommendation system to “limit the spread of borderline content.”
YouTube claims that of the 1.8 million channels terminated for violating its policies last quarter, 54,000 were taken down for hate speech, the most ever. YouTube has also removed more than 9,000 channels and 200,000 videos for violating its rules against promoting violent extremism. In addition to Molyneux, YouTube’s June bans included David Duke and Richard Spencer. (The Christchurch terrorist donated to the National Policy Institute, which Spencer heads.)
“It’s clear that the core of the business model has an impact on the growth and development of this content,” says Lewis. “They tweaked their algorithm, they kicked some people off the platform, but they didn’t fix this underlying problem.”
By design, online culture doesn’t begin and end with YouTube. Cross-platform sharing is a fundamental part of the social media business model. “YouTube isn’t just a place people go to be entertained; they are sucked into these communities. These communities allow you to participate through commentary, of course, but also by donating and boosting content elsewhere,” says Joan Donovan, research director at Harvard University’s Shorenstein Center on Media, Politics and Public Policy. According to the New Zealand government report, the Christchurch terrorist regularly shared far-right Reddit posts, Wikipedia pages, and YouTube videos, including in the chat of an unnamed gaming site.
The Christchurch mosque terrorist also followed and posted in several white nationalist Facebook groups, sometimes making threatening comments about immigrants and minorities. According to the report’s authors, who interviewed him, the individual did not accept that his comments would have been of concern to counterterrorism agencies; he believed this because of the very large number of similar comments that can be found on the internet. (At the same time, he took steps to minimize his digital footprint, including deleting emails and removing his computer’s hard drive.)
Reposting or amplifying white supremacists without context or warning, Donovan says, sets the stage for fringe ideas to spread. “We need to look at how these platforms provide the capacity for diffusion and scale, which, unfortunately, have now started to serve negative purposes,” she says.
YouTube’s business incentives inevitably get in the way of this kind of transparency. There are no good ways for outside experts to assess or compare techniques for minimizing the spread of multiplatform extremism; instead, they often have to rely on the reports companies publish about their own platforms. Daniel Kelley, associate director of the Anti-Defamation League’s Center for Technology and Society, says that while YouTube reports an increase in removals of extremist content, the metric says nothing about how prevalent such content was before or is now. Researchers outside the company don’t know how the recommendation algorithm used to work, how it has changed, how it works now, or what effect those changes have had. Nor do they know how “borderline content” is defined, an important point given that many claim it remains prevalent on YouTube, Facebook, and elsewhere.
“It’s hard to say if their efforts have paid off,” Kelley says. “We don’t know if it really works or not.” The ADL has spoken with YouTube, but Kelley says he hasn’t seen any documentation of how the company defines extremism or trains its content moderators.
A genuine reckoning with the spread of extremist content has pushed big tech companies to pour money into finding solutions. Moderation does appear to be effective; plenty of banned YouTubers have faded into obscurity. But moderation doesn’t address how the fundamentals of social media as a business (influencer building, cross-platform sharing, and black-box policies) are also critical factors in perpetuating hate online.
Many of the YouTube links the Christchurch shooter shared have since been removed for violating YouTube’s moderation policies. But the networks of people and ideologies built through them, and through other social media, persist.