Recommender systems are present in almost all social media applications today, the biggest of them being giants like Facebook, YouTube and Twitter. Whatever we see on our social media feed evokes a certain feeling in us: happiness, envy, anger, depression, most often the negative ones, without us even being conscious of it. The feed affects our emotions while we are on it, and the algorithm generates more and more addictive content for us based on the mountain of data it has acquired about us. This makes us all the more hooked on our devices. What we do not realise while we are scrolling is that this happens because of the release of dopamine in our brain, often called the "feel good" hormone. It makes us want more of it, and that is why it is so difficult for us to get back to our everyday work.
Our other tasks then seem mundane and boring to our brain. A person starts doing their usual work for a short period and then drifts back to social media. When this happens multiple times, it leads to them losing a lot of productive work hours. Less work done means less satisfaction and more frustration, which leads the person to search for more ways to relax their mind, the easiest and most accessible of which is again social media (apart from smoking and drinking, both of which are bad for health). This becomes an endless loop, and unless the person consciously reflects on it, they will not realise what is causing this long-term situation. The more we are addicted to social media, the more we pick up our devices and start looking at them. The conscious part of this activity slowly and steadily shrinks because our brain gets used to it. It becomes more like muscle memory, something involuntary.
The act of grabbing our phones to scroll or check notifications gets paired with a trigger stimulus, most likely boredom (technically speaking, when dopamine levels in the brain are low). The entire task becomes an involuntary action so that the conscious part of the brain can devote itself to more critical tasks. Recall the times we reach for our phones automatically after working for a while, without consciously choosing to. Or how, whenever we wake up in the morning, we check our phones and read all the notifications, messages and news without thinking. This is one reason why, if we do not plan our schedule, we get lost and do not feel like starting our actual work again: the initial dopamine boost the brain receives from checking social media is much greater than what we would have gained from doing our work.
Now all this is fine, but why am I concerned about recommender systems? Recommender systems are the backbone of most of these applications. They are responsible for showing or suggesting content that we will most likely be interested in. The problem lies in the fact that these algorithms are optimised to increase our watch time on these platforms, and this can have grave implications. Prioritising watch time over every other parameter leads the algorithm to suggest more and more of certain types of content. Now, what if this content relates to violent radicalisation, fake news or some misinformation drive? Applications like Facebook and Twitter allow any person or organisation to create a piece of content and post it publicly.
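To make the problem concrete, here is a minimal sketch of what a ranker optimised purely for watch time looks like. Everything in it is hypothetical: the `Item` class, the `predicted_watch_seconds` field and the example scores are made up for illustration, not any real platform's model or API. The point is structural: when expected watch time is the only term in the objective, nothing about accuracy, well-being or diversity can influence what gets surfaced.

```python
# Hypothetical sketch of a watch-time-optimised feed ranker.
# All names and numbers are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    predicted_watch_seconds: float  # output of some engagement model

def rank_feed(candidates: list[Item], k: int = 3) -> list[Item]:
    # The only ranking signal is expected watch time; truthfulness or
    # societal impact never enters the objective at all.
    return sorted(candidates,
                  key=lambda it: it.predicted_watch_seconds,
                  reverse=True)[:k]

feed = rank_feed([
    Item("calm documentary", 120.0),
    Item("outrage clip", 300.0),
    Item("conspiracy deep-dive", 280.0),
    Item("news summary", 90.0),
])
print([it.title for it in feed])
# The most provocative items win the top slots purely because they
# hold attention longest.
```

Under this toy objective, the "outrage clip" and "conspiracy deep-dive" outrank everything else, which is exactly the amplification dynamic described above.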
In a democratic society, every citizen should have the right to voice themselves, but that is not the point. The point is being in control of what content we want to see. People naively scrolling through their Facebook feed will come across videos that support, say, a specific ideology, watch them and put their phones down. But the next time they open the app, they will be recommended more and more videos related to that. Facebook does not care whether that content is true or morally sound, or what its implications are for people and society, unless a huge number of people report it. The idea of Facebook verifying millions of such local pieces of content is also unrealistic, when many times even the people involved in an event are not sure what exactly happened. This leaves a huge unchecked void which is filled by conspiracy theorists on every possible topic on social media.
While members of the flat earth society and those who do not believe in the moon landings are not cause for much concern, the ones who maliciously incite violence, riots and unrest in society for their own cause are. Countries with authoritarian governments like Russia and China are not at risk, because of their governments' total control over the media, but free-flowing democracies like India and the US have to reflect. We have all witnessed in the past how Russia has tried to destabilise governments in countries like Ukraine and the US by injecting false propaganda through these platforms and creating dissent among their citizens.
But this is just one extreme scenario. The bigger distress is what is happening at the lower levels. People are increasingly enclosing themselves in a kind of information bubble, without being consciously in control of what appears next on their screens.
We have gone from recommender systems reflecting our personal preferences to them actually shaping our preferences. How does it feel to think that important choices you make in your life can be controlled and decided by a bunch of, say, ten nerds aged 25 to 30 sitting in some office in San Francisco? All of them constantly researching and optimising just to attract more and more of your attention, and none of them giving any thought to the humane side of things, such as the psychological and social impact on the person. The practice of paying for a licensed application is also, incidentally, on the decline. So why are utilities like Facebook apparently free for everyone to use? Have we ever put thought into this? We have come a long way from paying for products to ourselves (more specifically, our attention) becoming the product.
But what can be done about all this? Should we quit these platforms permanently? Or can there be a better solution? Well, I am not in favour of simply abandoning these applications, because as bad as their effects might be, they have also brought about many positive, substantial changes in our world, changes that were unthinkable even a decade back. A practical solution would be to democratise the process instead of leaving it at the mercy of a select few individuals. Each person should have full control over how they want the recommender system to work for them.
Let’s look at one possible solution. Suppose that on opening the app, the user can choose their session outcome from a list of predefined goals. A user might select, say, a learning track as their desired objective for their time on the platform, and the recommender system will then recommend videos that optimise the user's learning and nothing else. The user always has complete freedom to change their end goal at any time. This is just one solution among others. But as long as people are not sufficiently aware and do not voice their concerns about this issue, recommender systems will continue to pose a threat and remain a risky game.
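The goal-selection idea above can be sketched in a few lines. This is only an illustration of the design, under the assumption that each candidate video carries model-predicted scores for several objectives (the `Video` class, the objective names and all the score values here are invented for the example). The user's chosen goal becomes the ranking key, so switching goals mid-session is just a re-rank with a different key rather than a change to the whole system.

```python
# Sketch of a user-controlled recommender: the user picks which
# objective the system optimises for. All names and scores are
# illustrative assumptions, not a real platform's data.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    scores: dict  # objective name -> predicted value for this user

def recommend(candidates, objective: str, k: int = 2):
    # Rank purely by the user's chosen objective; changing one's
    # end goal is just a call with a different objective key.
    return sorted(candidates,
                  key=lambda v: v.scores.get(objective, 0.0),
                  reverse=True)[:k]

catalog = [
    Video("linear algebra lecture", {"learning": 0.9, "watch_time": 0.3}),
    Video("viral prank", {"learning": 0.1, "watch_time": 0.9}),
    Video("history explainer", {"learning": 0.7, "watch_time": 0.5}),
]

# The same catalog, two different user-chosen objectives:
print([v.title for v in recommend(catalog, "learning")])
print([v.title for v in recommend(catalog, "watch_time")])
```

With "learning" selected, the lecture and the explainer surface; with the default engagement objective, the prank video wins. The content pool never changes, only whose objective the ranking serves.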