Recent discussions about the Internet and Internet intermediaries have revolved around the developments and potential of the Internet of Things (IoT), Big Data analytics and Artificial Intelligence (AI). The IoT refers to the concept of connecting any device (so long as it has an on/off switch) to the Internet and to other connectable devices. Big Data analytics is the application of advanced analytic techniques to massive and diverse data sets that include structured, semi-structured and unstructured data, drawn from multiple sources and in varying quantities. AI, meanwhile, refers to the simulation of human intelligence in machines or computational devices that are programmed to think like human beings and reproduce their actions. The term Artificial Intelligence is often associated with machines or devices that exhibit qualities linked to the human mind, such as learning and problem solving. These technological advancements have undoubtedly reduced human effort and saved a great deal of time in recent years. At the same time, there is widespread criticism of the dependency these technologies have created: humankind now relies heavily on the Internet and its products.
The smartphones we use, the smartwatches we wear, the Smart TVs we watch and the assistant services in our digital devices all comprise a small share of a much larger picture. What, however, drives these innovations to work so smoothly? Put simply, there is an algorithm behind every computer programme. Dr. Panos Parpas, of the department of computing at Imperial College London, says: "Wherever we use computers, we have to rely on algorithms. There are different types, but algorithms, in common, follow a series of instructions to solve a specific problem. It is almost like how a recipe helps you to bake a cake. Rather than settling on a generic flour or a generic oven temperature, the algorithm will try a variety of variants to produce the most effective cake possible from the available sets and variations." To be specific, an algorithm determines how a computer programme should work to achieve its goal as efficiently as possible.
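Parpas's recipe analogy can be sketched in a few lines of code. The "quality score" and the variants below are invented purely for illustration; the point is only that an algorithm applies the same fixed steps to every candidate and keeps the best one.

```python
# A minimal sketch of the "recipe" idea: fixed instructions applied to each
# variant, keeping whichever scores best. All numbers here are made up.

def bake_score(flour_grams: int, oven_temp_c: int) -> float:
    """Toy quality score: peaks at 500 g of flour and 180 degrees C."""
    return -abs(flour_grams - 500) - abs(oven_temp_c - 180)

def best_variant(variants):
    """Follow the same steps for every variant; return the highest-scoring one."""
    best, best_score = None, float("-inf")
    for flour, temp in variants:
        score = bake_score(flour, temp)
        if score > best_score:
            best, best_score = (flour, temp), score
    return best

print(best_variant([(400, 200), (500, 180), (600, 160)]))  # → (500, 180)
```

Nothing here is intelligent in itself; the "decision" is entirely determined by the scoring rule a human wrote.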
Currently, we are engaging with a new dimension of algorithms: the social algorithm. Today's social algorithms are so complex that it is difficult for individuals to understand them fully. Consider the list of recent posts by friends you see when you open Facebook; usually, you will not see all the posts. Instead, you are shown the contents the platform deems most significant, algorithmically curated. The rationale behind such algorithmic curation is that, without it, users would be overwhelmed by the sheer volume of content from their networks. In other words, Facebook as an intermediary strives to pick out the best and most appropriate contents, the ones you may want to see, or the ones the company wants you to see, based on probabilistic assumptions about what you will like and click on. Essentially, it is quite simple: Facebook wants you to stay online and keep clicking the 'like' and 'share' buttons for as long as possible, to sustain itself. At this point, picking and delivering the appropriate content enters another dimension in which algorithms are involved. It is not just about providing you with contents, but about keeping you online and thus profitable. This raises the question of how fair this computational curation process is. To keep you online, Facebook has to keep you content, and that requires a filter that picks only the posts you are expected to engage with. "It is possible that the friends Facebook assumes you want to keep up with tend to be ideologically aligned with you, amplifying the filtering effect. The impact of curation on other dimensions of deliberative quality on Facebook remains to be examined. Open questions include whether the curation privileges some voices over others, and whether it highlights specific subjects that systematically undermine the discussions of the day (pets over politics). Does a 'filter bubble' emerge from this algorithmic curation process, so that individuals only see posts with which they agree?"
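The curation logic described above can be illustrated with a toy sketch: score each post by a guessed engagement probability and surface only the top few. The signals and weights below are entirely hypothetical; Facebook's real ranking system is not public.

```python
# A toy model of engagement-driven feed curation. The fields, weights and
# posts are invented for illustration only, not drawn from any real system.

def engagement_score(post: dict) -> float:
    # Hypothetical signals: closeness to the friend and past click rate.
    return 0.6 * post["friend_closeness"] + 0.4 * post["past_click_rate"]

def curate_feed(posts, top_n=2):
    """Return only the posts the model predicts the user will engage with."""
    return sorted(posts, key=engagement_score, reverse=True)[:top_n]

feed = [
    {"id": "pet_video",  "friend_closeness": 0.9, "past_click_rate": 0.8},
    {"id": "news_story", "friend_closeness": 0.2, "past_click_rate": 0.3},
    {"id": "birthday",   "friend_closeness": 0.8, "past_click_rate": 0.5},
]
for post in curate_feed(feed):
    print(post["id"])  # the low-scoring news story never surfaces
```

Even in this trivial model, the filtering effect is visible: whatever the scoring rule deprioritises simply disappears from view, and the user never learns it existed.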
The same line can be drawn to analyse other Internet platforms: search engines, games, shopping sites, dating apps, banking, entertainment and so on. There are arguments that search engines, especially Google and Bing, showcase bias in their algorithms, whether along lines of gender, race, religion or other traits of discrimination. Google's influence on public discourse materialises primarily through algorithms; an example is the system that determines which results you see for a query in its search engine. Search for "terrorists" on Google, and the results are dominated by men of Muslim appearance bearing deadly weapons. It should be noted that Google does not explicitly disclose how its search engine works. German Chancellor Angela Merkel has a precise answer to the situation: "What influences my behaviour on the internet and that of others? Algorithms, when they are not transparent, can lead to a distortion of our perception; they can shrink our expanse of information." It is simple to understand: human beings are the ones instructing the algorithms, and there is every possibility that these people are biased, since their experiences are drawn from real-world situations.
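The mechanism by which bias enters is worth making concrete. A minimal sketch, with a wholly fabricated "click log", shows that a ranking rule can be neutral arithmetic and still reproduce whatever skew exists in the data humans feed it:

```python
# A minimal illustration: the ranking algorithm is neutral, but if its input
# data is skewed, its output is skewed too. The click log is fabricated.

from collections import Counter

click_log = ["image_a"] * 80 + ["image_b"] * 20  # skewed historical clicks

def rank_by_popularity(log):
    """Rank items by past clicks, faithfully reproducing any bias in the log."""
    return [item for item, _ in Counter(log).most_common()]

print(rank_by_popularity(click_log))  # → ['image_a', 'image_b']
```

The code contains no prejudice of its own; it simply amplifies the pattern it was given, which is precisely the concern with real search rankings trained on real-world behaviour.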
How algorithms reinforce divides is the other side of the coin. In his book Automate This: How Algorithms Came to Rule Our World, Christopher Steiner identifies a broad range of instances in which algorithms are used to deliver predictive insights. He argues: "We are already halfway towards a world where algorithms run nearly everything. As their power intensifies, wealth will concentrate around them. They will ensure the 1%-99% divide gets bigger. If you do not belong to the class attached to algorithms, you will struggle. The reason there is no popular outrage about Wall Street being run by algorithms is that most people do not yet know or understand it." When multinational companies with sophisticated technologies use algorithms for their communication, the rest, lacking access to the technological infrastructure, are forced to give up, and this often ends in the acquisition of the smaller players by the giants.
Algorithms are growing more influential every day, and the data generated alongside algorithmic processes raises serious concerns of its own. More or less, we are no longer able to manage our day-to-day lives without the involvement of computation. Rather than merely constructing our realities, algorithms are structuring them and limiting our choices. In these circumstances, it is vital to know this and to act accordingly, for we, the consumers of these experimental computations, are vulnerable to manipulation by these technologies and their masters (the real patriarchal-capitalist masters). Robust measures need to be taken, in both individual and institutional capacities, to tackle these challenges and to make the Internet a democratic and just space.