The Effects of Algorithms on Epistemology
In this post the effects of algorithms on epistemology are examined in the context of the ‘new new media ontology,’ or what has come to be known as the participatory culture of Web 2.0. This study follows in the tradition of theorists such as Burrows, Deleuze, Foucault, Graham, Guattari, Hayles, Jameson, Lessig and Turow, among others. Algorithms are considered to mark a shift toward performative infrastructure, underpinned by an ever-evolving multiplicity of power structures embedded in the ‘new new media’ space/place that has come to be known as ‘cyberspace.’ Here, algorithms are understood as techniques of ‘surveillance,’ ‘prioritisation’ and ‘inhibition,’ ones that are usually invisible and automated, operating in ways of which neither user nor beneficiary is fully aware, nor conscious of how or even whether they are operating (Hayles 2004: 239). In this context, the ‘embedded politics’ of this ‘haze’ of software, and its implications for structuring our life chances, are explored (Crang and Graham 2007: 67). Our perception and understanding of the world is influenced by the way information is presented to us, and indeed by how the medium requires us to interact with that information (Kelkar 2009: 1).
Algorithms are not static entities; rather, they are powerful tools that structure invisible processes of prioritisation and marginalisation. As software and code, they are used to judge people’s worth and eligibility with regard to levels of access to whole ranges of essential urban spaces and services. Turow (2007) describes this process as one of marginalisation as discrimination, suggesting that algorithms are used by marketing executives to structure, ever more carefully, the customer categories or niches that tag consumers as ‘desirable’ or ‘undesirable.’ Once these niches have been established, the business can then refine them further, in the service of new levels of efficiency, productivity, profit and other business agendas.
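The niche-tagging Turow describes can be caricatured in a few lines of code. This is a minimal sketch only: the scoring weights, field names, threshold and labels below are hypothetical illustrations, not the logic of any real marketing system.

```python
# Hypothetical illustration of algorithmic niche-tagging.
# All weights, fields and the cutoff are invented for this sketch.

def tag_consumer(profile):
    """Assign a consumer to a 'desirable' or 'undesirable' niche
    from an opaque score -- a judgement the consumer never sees."""
    score = (profile["spend_per_year"] * 0.7
             + profile["years_as_customer"] * 50)
    return "desirable" if score >= 1000 else "undesirable"

consumers = [
    {"name": "A", "spend_per_year": 2000, "years_as_customer": 3},
    {"name": "B", "spend_per_year": 300, "years_as_customer": 1},
]

# The sorting happens silently: each consumer is tagged without
# ever being shown the score or the threshold that produced the tag.
niches = {c["name"]: tag_consumer(c) for c in consumers}
```

The point of the sketch is the invisibility: both the formula and the cutoff are buried in code, so the person being tagged experiences only the downstream consequences of the label.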
Graham’s work on the ‘new new media ontology’ develops concrete illustrations of how software algorithms order and divide information. Graham suggests that algorithms act to ‘filter’ and ‘sort,’ enabling service providers to offer individuals differentiated services or service levels (Burrows and Ellison 2004: 22). The implication is that algorithms have gone beyond being tools for organisation: they now also make decisions for organisations about how to deal with information, and indeed about how information ‘deals’ with them.
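The ‘filter’ and ‘sort’ operations Graham points to can be made concrete in a short sketch. The tier names, value cutoffs and queueing rule below are assumptions for illustration, not a description of any actual provider’s system.

```python
# Illustrative sketch of filtering and sorting users into
# differentiated service levels. Tiers and cutoffs are invented.

SERVICE_TIERS = {"gold": 0, "standard": 1, "basic": 2}  # lower = served first

def tier_for(user):
    # Filter step: a hidden cutoff decides which tier a user falls into.
    if user["lifetime_value"] >= 5000:
        return "gold"
    if user["lifetime_value"] >= 500:
        return "standard"
    return "basic"

def prioritise(queue):
    # Sort step: the queue is silently reordered by tier; users
    # experience only the resulting wait, never the ordering rule.
    return sorted(queue, key=lambda u: SERVICE_TIERS[tier_for(u)])

queue = [
    {"id": "u1", "lifetime_value": 100},
    {"id": "u2", "lifetime_value": 9000},
    {"id": "u3", "lifetime_value": 800},
]
order = [u["id"] for u in prioritise(queue)]
```

Here the differentiation is entirely a property of the code: the same request receives a different service level depending on a classification the user cannot inspect or contest.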
We find a similar resonance in Katherine Hayles’s writing on the implications of algorithms and ‘new’ media technology. Hayles further highlights how, in the new media environment, these organisational and structuring processes are entirely invisible to the user (Hayles 2008: 34). In addition, she suggests that algorithms serve as the interface between the database and telecommunications networks, allocating different ‘levels’ of user on an increasingly automated basis (Hayles 2008: 34).