Cultural critics used to be essential curators of music, movies, books, and art. When there were only a couple of newspapers or radio stations or TV stations, the select few professional critics had enormous power in telling the public which art they should pay attention to. Then came the internet and everything changed. With blogs, message boards, and podcasts, anybody could become a critic. Both the creation and critique of art became more democratic. Traditional critics became less important. People preferred to take recommendations from like-minded people in their specific cultural niche. This gave real cultural power to bloggers and amateur critics on the internet who developed a following.
But now we are in the midst of a new revolution. Both the traditional critics and the internet curators are being replaced by algorithms: automated formulas written in computer code. Most people now get their movie recommendations from the Netflix algorithm, their music from the Spotify or Pandora algorithm, their news from the Facebook, Twitter, and Google algorithms, and their physical-product recommendations from the Amazon algorithm.
Like the human critics of the past, these algorithms tell you where to focus your attention, which is exceedingly valuable as the already enormous amount of content on the internet continues to grow. The best algorithms know exactly what you will like based on what you liked before. That's why Netflix is dominating the movie and television industry: they have the best algorithm for predicting what people will like, based on the shows and movies they liked in the past. As a result, most of their original productions are successful. They increase the odds of success for each movie or show they produce because their user data tells them how many people will watch and like that kind of content.
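Netflix's actual system is proprietary, but the core idea described here, predicting what you'll like from what you (and people like you) liked before, can be sketched as simple user-based collaborative filtering. All names and viewing histories below are hypothetical:

```python
# A minimal sketch of "recommend based on past likes":
# user-based collaborative filtering over hypothetical watch data.

def similarity(likes_a, likes_b):
    """Jaccard similarity between two sets of liked titles."""
    if not likes_a or not likes_b:
        return 0.0
    return len(likes_a & likes_b) / len(likes_a | likes_b)

def recommend(user, all_likes, top_n=3):
    """Score unseen titles by how strongly similar users liked them."""
    scores = {}
    for other, their_likes in all_likes.items():
        if other == user:
            continue
        sim = similarity(all_likes[user], their_likes)
        for title in their_likes - all_likes[user]:
            scores[title] = scores.get(title, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Hypothetical viewing histories:
likes = {
    "ana":  {"Stranger Things", "Dark", "Black Mirror"},
    "ben":  {"Stranger Things", "Dark", "The OA"},
    "cara": {"The Crown", "Bridgerton"},
}
print(recommend("ana", likes))  # "The OA" ranks first: ben shares ana's taste
```

Notice what this sketch can never do: it only surfaces titles that already sit close to a user's history, which is exactly the limitation the rest of this piece is about.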
I enjoy Netflix as much as anyone, but it's a slippery slope when the content curator is also the content creator. If we get all our TV recommendations from Netflix, and they only recommend Netflix-produced series modeled on past successes, how do we ever find something new and original? A world run by algorithms can lead to a dystopian future where all the art is the same: safe and comfortable.
Of course, we could go down an entire rabbit hole about whether any art is truly original, or simply amalgamations of the art that came before it. But aside from originality, how does someone find something completely different from everything they’ve seen/heard/read before? Something they never even knew they would like?
Human critics and bloggers can do this: recommend across completely different genres, but not randomly. They can recommend, for instance, two musicians who sound nothing alike, a rapper and a folk singer, whose only connection is the recommendation of a person with good (and eclectic) taste in music. This is much harder for an algorithm to achieve. I've discovered some great new music via the Spotify recommendation algorithm, but it tends to stay within the genres I most often listen to. Yet my tastes in music vary widely, from indie rock to electronic and hip-hop to jazz. Some of my favorite artists came out of left field, completely different from anything I'd heard before, which is exactly why I liked them so much. I discovered all of those artists through recommendations from other humans, whether online or in person. You can program randomness into an algorithm, and you can program taste based on previous likes, but how do you program randomness with good taste? The algorithm itself has no taste: it's just an algorithm. How do you program taste at all, when taste itself is so subjective? Subjectivity is by definition unprogrammable.
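The tension between randomness and taste can be made concrete. A standard way to inject surprise into a recommender is epsilon-greedy exploration: most of the time serve the user's favorite genre, occasionally pick one at random. The sketch below (hypothetical catalog and listening history, not any real service's method) shows that the "surprise" branch is drawn uniformly, with no judgment behind it:

```python
import random

random.seed(0)  # for reproducibility of this toy example

# Hypothetical catalog grouped by genre.
CATALOG = {
    "indie rock": ["Artist A", "Artist B"],
    "hip-hop":    ["Artist C", "Artist D"],
    "jazz":       ["Artist E", "Artist F"],
    "folk":       ["Artist G", "Artist H"],
}

def epsilon_greedy_pick(listen_counts, epsilon=0.2):
    """With probability epsilon, 'explore' a uniformly random genre;
    otherwise 'exploit' the user's most-listened genre.
    The explore branch has no notion of taste; it is pure noise."""
    if random.random() < epsilon:
        genre = random.choice(list(CATALOG))               # randomness without taste
    else:
        genre = max(listen_counts, key=listen_counts.get)  # taste from past likes
    return random.choice(CATALOG[genre])

history = {"indie rock": 40, "hip-hop": 25, "jazz": 10, "folk": 1}
picks = [epsilon_greedy_pick(history) for _ in range(1000)]
```

The exploit branch reproduces past taste; the explore branch is pure chance. What the human curator does, picking a surprising artist that is nonetheless good, lives in neither branch.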
YouTube is another example. I've discovered a lot of great new channels (like exurb1a) from the YouTube recommendation algorithm, based on videos (like Vsauce) I had watched before. But the algorithm can lead to trouble: someone watches one conspiracy-theory video and then gets recommendations for similar videos, each more extreme than the last. The algorithm has no agenda to make you a conspiracy theorist, nor to teach you the truth. Its only agenda is to serve you a video you will like so that you keep watching (and Google keeps collecting the ad revenue). That creates an obvious problem.
Facebook and other newsfeed algorithms suffer from the same issue. People's worldviews get skewed inside a bubble of confirmation bias. They follow only sources they already agree with and unfollow those they dislike. Eventually, all the content they see shares the same opinions, no matter how misguided those opinions may be, which makes false information seem more valid. This is how the flat-earth theory has somehow regained steam in recent years. Someone watches one YouTube video claiming the earth is flat, and the algorithm feeds them dozens more videos proclaiming the same thing. If the algorithm serves a video debunking the flat-earth theory, the user down-votes it angrily, so the algorithm stops recommending that type of video (the one with actual scientific facts). If it didn't, the user would stop watching YouTube, and Google would stop making ad revenue.
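This feedback loop can be simulated in a few lines. A recommender that optimizes only for predicted engagement, paired with a viewer who rewards agreement and punishes disagreement, collapses onto a single viewpoint after a single down-vote. Everything here is a toy model with made-up numbers, not how YouTube or Facebook actually score content:

```python
# Toy model of a confirmation-bias feedback loop (all numbers hypothetical).
# The recommender scores each topic from the user's past reactions and always
# serves the highest-scoring one; the simulated user up-votes agreement
# and down-votes disagreement.

def simulate(rounds=20):
    scores = {"flat earth": 1.0, "debunking": 2.0}   # debunking starts ahead
    user_likes = {"flat earth": +1, "debunking": -1}  # biased viewer
    served = []
    for _ in range(rounds):
        topic = max(scores, key=scores.get)          # pure engagement ranking
        served.append(topic)
        scores[topic] += user_likes[topic]           # feedback updates the score
    return served

feed = simulate()
```

One down-vote is enough: after the first round the debunking content never surfaces again, and every subsequent slot goes to the topic the viewer already agrees with.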
Facebook has received a lot of public criticism for their exploitation of user data, but they are only the tip of the iceberg. Every tech company is doing the same thing. Any app or service that collects personal data from its users has an enormous amount of power. It would be nice if the tech companies themselves decided to give their users more freedom and control over their own data, but it’s not in their economic interests to do so. I don’t think government regulation is the answer, either. Really, it’s up to the individual user to maintain their own freedom online. If you recognize that your life on the internet is being ruled by algorithms, then you can decide when those algorithms are and aren’t serving your best interests. As I said, many of these algorithms are useful. You just have to make sure you are using the algorithm and it’s not using you.