Taxing gatekeepers ~ I
By 2016-17, Facebook had designed a business model aimed at maximising ‘user engagement’ ~ how much time users spent on its platform ~ while collecting data on what they liked and shared. The more time users spent, the more data Facebook collected, and its algorithms could then monetise that information by targeting ads at specific users. The more time users spent on Facebook, the richer Mark Zuckerberg became. The commercial success of this business model prompted Facebook to give its algorithms a single, no-holds-barred, overarching goal ~ to increase user engagement at all costs, to the exclusion of every other consideration.
By analysing data on millions of users, the Artificial Intelligence (AI)-based self-learning algorithms quickly discovered that outrage improves user engagement ~ that people are easily drawn to hate speech, extremism and news that triggers jealousy, envy, anger and indignation. Thus, to keep people engaged for longer, the algorithms needed only to promote emotionally charged material over neutral information. Many of our TV news networks have learnt the same trick to boost their TRPs. The algorithms were accordingly designed to display and highlight such content, place it at the top of users’ news feeds, and recommend it to users in a targeted manner.
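In mechanical terms, this is a feed ranked by a single engagement objective. The sketch below is purely illustrative ~ the post fields, the outrage score and the weighting are invented assumptions, not Facebook’s actual model ~ but it shows how optimising one engagement number pushes emotionally charged items to the top of a feed.

```python
# Illustrative sketch only: a hypothetical feed ranker that optimises a single
# engagement objective. All field names and weights are invented for illustration.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_watch_time: float  # hypothetical engagement signal, in seconds
    outrage_score: float         # hypothetical 0-1 measure of emotional charge

def engagement_score(post: Post) -> float:
    # A single, no-holds-barred objective: whatever keeps users engaged ranks
    # higher, regardless of whether the content is divisive or harmful.
    return post.predicted_watch_time * (1.0 + post.outrage_score)

posts = [
    Post("Neutral local news report", 30.0, 0.1),
    Post("Inflammatory rumour", 28.0, 0.9),
]

# The emotionally charged item wins the top slot despite similar watch time.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):6.1f}  {post.title}")
```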
By 2017, Facebook’s popularity in Myanmar was so overwhelming that it was almost synonymous with the internet. It was then that Facebook’s algorithms started promoting anti-Rohingya videos and material inciting violence against the community. As Yuval Noah Harari writes in his book “Nexus”, 70 per cent of views of such videos came from Facebook’s autoplay feature, and 53 per cent of all videos watched in Myanmar were auto-played for users by algorithms. Instead of people choosing what to see, the algorithms were choosing for them, which helped fan the flames of widespread anti-Rohingya sentiment, culminating in genocide in Myanmar that forced around a million Rohingya refugees to flee the country. Facebook was the chief instrument for organising this genocide, without any accountability to anyone.
And not simply Facebook ~ there are numerous instances where, by promoting emotionally charged, divisive and harmful digital content that users find irresistible, the entire social media ecosystem ~ YouTube, WhatsApp, Snapchat, Twitter, TikTok, etc. ~ has radicalised people, drawn them into the lynching of innocents, hate crimes and racial attacks, and even influenced voters through targeted political ads. As the Nobel laureate economist........