YouTube’s “next video” feature is a profit-maximizing recommendation system: an A.I. that selects increasingly ‘engaging’ videos. And that’s the problem.
“Computer scientists and users began noticing that YouTube’s algorithm seemed to achieve its goal by recommending increasingly extreme and conspiratorial content. One researcher reported that after she viewed footage of Donald Trump campaign rallies, YouTube next offered her videos featuring ‘white supremacist rants, Holocaust denials and other disturbing content.’ The algorithm’s upping-the-ante approach went beyond politics, she said: ‘Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.’ As a result, research suggests, YouTube’s algorithm has been helping to polarize and radicalize people and spread misinformation, just to keep us watching.”
By teaching machines to understand our true desires, one scientist hopes to avoid the potentially disastrous consequences of having them do exactly what we command.