What's so good about YouTube? It allows anyone to upload and share videos with the whole world. And what's bad? It allows anyone to upload and share videos with the whole world. The proliferation of content with dubious aims, such as fake news, conspiracy theories and other beauties of the information age, is nothing new, but what some are now denouncing is how the company has handled the issue for years.
For years, YouTube has reportedly been ignoring its employees' requests to take down toxic videos from its platform. The reason? Traffic. According to more than 20 current and former employees, staff repeatedly proposed ways to curb the spread of content that was disruptive, false, extremist or promoted conspiracy theories. However, the platform's management turned a deaf ear to those proposals so as not to miss out on the views such videos generate.
For example, one proposal suggested removing problematic or borderline videos from the "Recommended" tab. According to a former company engineer, YouTube rejected the proposal in 2016 and kept recommending videos no matter how controversial they were. According to employees, the internal goal was to reach one billion hours of viewing per day.
Employees outside the moderation team also reported that the company discouraged them from searching for toxic videos on the platform, because lawyers had advised that YouTube's legal liability would be greater if its employees knew of the existence of such content.
At least five senior employees have left YouTube over its reluctance to address this problem. According to one former employee, CEO Susan Wojcicki "never got burned," arguing that her job was to "run the business forward," not to deal with misinformation or dangerous content. A YouTube spokesperson said the company began taking action in late 2016 and started demonetizing toxic content in 2017. Even so, by the end of 2017 its Trust and Safety team had only about 20 employees.
In 2018, the platform began trying to curb fake news and conspiracy theories with informative text boxes, and this year it decided to remove advertising from potentially dangerous content. It also assured the public, in an announcement made on 25 January this year, that fewer videos defending conspiracy theories would be recommended.
All indications are that YouTube is trying to rectify the situation, but it seems a little late, and the company has not presented a coherent strategy for dealing with the issue in all its complexity. If the platform wants to prevent the proliferation of problematic videos, it will have to tackle the essential issue of content moderation head-on; until then, it could remain infested with toxicity.
What do you think? Are views worth more than quality content?