Sunday, May 2, 2010

The New New Media Ontology.


The Effects of Algorithms on Epistemology:

In this post the effects of algorithms on epistemology are examined in the context of the 'new new media ontology,' or what has come to be known as the participatory culture or phenomenon that is Web 2.0. This study follows in the tradition of theorists such as Burrows, Deleuze, Foucault, Graham, Guattari, Hayles, Jameson, Lessig and Turow. Algorithms are considered to be a shift toward performative infrastructure, underpinned by an ever-evolving multiplicity of power structures embedded in the 'new new media' space/place that has come to be known as 'cyberspace.' Here, algorithms are understood as techniques of 'surveillance,' 'prioritisation' and 'inhibition,' ones that are usually invisible and automated, operating in ways that neither user nor beneficiary is fully aware of, nor conscious of how, or even if, they are operating. (Hayles (2004): 239) In this context, the 'embedded politics' of this 'haze' of software, and its implications for structuring our life chances, are explored. (Crang and Graham (2007): 67) It is considered that our perception and understanding of the world are influenced by the way information is presented to us, and indeed by how the medium requires we interact with that information. (Kelkar (2009): 1)


Algorithms are not static entities; rather, they are powerful tools that structure invisible processes of prioritisation and marginalisation. As software and code, they are used to judge people's worth and eligibility with regard to access levels across whole ranges of essential urban spaces and services. Turow (2007) describes this process as one of marginalisation as discrimination, suggesting that algorithms are used by marketing executives to structure, ever more carefully, the customer categories or niches that tag consumers as 'desirable' or 'undesirable.' Once these niches have been established, the business can then create further niches that serve to constitute new levels of efficiency, productivity, profit and other business agendas.
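To make this niche-sorting concrete, here is a minimal, purely hypothetical Python sketch; the profile fields, weights and threshold are invented for illustration and are not drawn from Turow:

    # Illustrative only: an automated 'judgement of worth' that tags
    # consumers as desirable or undesirable. All fields and numbers
    # are hypothetical.

    def score_consumer(profile):
        """Reduce a consumer profile to a single 'worth' score."""
        return (profile["annual_spend"] * 0.5
                + profile["visit_frequency"] * 2.0
                - profile["returns"] * 5.0)

    def sort_into_niches(profiles, threshold=100.0):
        """Tag each consumer, silently and automatically."""
        niches = {"desirable": [], "undesirable": []}
        for p in profiles:
            tag = "desirable" if score_consumer(p) >= threshold else "undesirable"
            niches[tag].append(p["id"])
        return niches

    customers = [
        {"id": "c1", "annual_spend": 400, "visit_frequency": 12, "returns": 1},
        {"id": "c2", "annual_spend": 60, "visit_frequency": 2, "returns": 4},
    ]
    print(sort_into_niches(customers))
    # {'desirable': ['c1'], 'undesirable': ['c2']}

The point of the sketch is that neither consumer ever sees the score or the threshold: the categorisation happens entirely inside the marketer's code.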

Graham's work on the 'new new media ontology' develops concrete illustrations of how software algorithms order and divide information. Graham suggests that algorithms act to 'filter' and 'sort,' enabling service providers to offer individuals differentiated services or service levels. (Burrows and Ellison (2004): 22) The implication here is that algorithms have gone beyond being tools for organization: they now also make decisions for organizations about how to deal with information, and indeed how information 'deals' with them.

We find a similar resonance in Katherine Hayles' writing on the implications of algorithms and 'new' media technology. Hayles further highlights how, in the new media environment, these 'organizational' and structuring 'processes' are totally invisible to the user. (Hayles (2008): 34) In addition, she suggests that algorithms are used as the interface between the database and telecommunications networks to allocate different 'levels' of user, on an increasingly automated basis. (Hayles (2008): 34)
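A small illustrative sketch of the kind of 'filtering' and 'sorting' Graham describes, and of the automated allocation of user 'levels' Hayles points to; the tier names, postcodes and rules below are assumptions invented for this example, not anything from their texts:

    # Hypothetical 'software-sorting': database records are silently
    # routed into differentiated service levels. All rules are invented.

    def allocate_tier(record):
        """Filter and sort a user record into a service level."""
        if record["postcode"] in {"2000", "3000"}:  # favoured locations
            return "premium"
        if record["credit_score"] >= 600:
            return "standard"
        return "restricted"

    def sort_users(records):
        """The 'interface' between database and network: every user is
        assigned a level on an automated basis, invisible to them."""
        return {r["id"]: allocate_tier(r) for r in records}

    print(sort_users([
        {"id": "u1", "postcode": "2000", "credit_score": 450},
        {"id": "u2", "postcode": "4000", "credit_score": 720},
        {"id": "u3", "postcode": "4000", "credit_score": 500},
    ]))
    # {'u1': 'premium', 'u2': 'standard', 'u3': 'restricted'}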
(to be continued)

Thursday, April 8, 2010

WEB 2.0: wah wah you say?
Catering to Individual Needs and Whims: 'no such thing as an unchangeable document rather bit torrents to be broken apart and custom reassembled...' (Lessig (2006): Web 2.0)
Web 2.0 is a new environment in which the web serves as a platform offering services rather than software. Instead of documents, Web 2.0 offers bit torrents that can be broken up into a thousand pieces and 'custom' reassembled according to individual needs and whims. In addition, users can now download much larger amounts of information and share it with others on social networking sites such as Google Docs, Google Notebook, Scribd, Tumblr and Diigo, to name just a few. This is evident in the way people use their social media/online pages as expressions of their identity, and in how blogs and eTail sites make their money by customising their sites with a mash-up of brand imagery and content to fit their customers: hence the proliferation and growth of the online 'mall,' the 'Westfields' of the internet, built by 20-year-old students.
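As a rough illustration of the 'broken apart and custom reassembled' idea in the Lessig quote above, the following Python sketch splits a document into indexed, hashed pieces (loosely in the manner of BitTorrent) and rebuilds it from pieces arriving in any order; the piece size and sample document are arbitrary:

    import hashlib

    def split_into_pieces(data: bytes, piece_size: int = 4):
        """Break a 'document' into indexed pieces, each hashed so it
        can be verified when fetched from any peer."""
        pieces = {}
        for i in range(0, len(data), piece_size):
            chunk = data[i:i + piece_size]
            pieces[i // piece_size] = (hashlib.sha1(chunk).hexdigest(), chunk)
        return pieces

    def reassemble(pieces):
        """Pieces may arrive in any order; the index restores it."""
        return b"".join(chunk for _, (_, chunk) in sorted(pieces.items()))

    doc = b"no such thing as an unchangeable document"
    pieces = split_into_pieces(doc)
    assert reassemble(pieces) == doc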
This user-centred environment involves getting to know the user's tagging system of classification; these tags accumulate into data clouds, from which trends are formed and knowledge is shared and generated on a much larger and faster scale. This functionality has changed the way information on the internet is gathered and classified, and so has changed the way people interact online. These services place the user in control of data through an architecture of participation, data mixing, and the harnessing of collective intelligence. Centralised standardisation no longer applies; even institutions such as state libraries and Congress have had to open up their systems to shift towards Web 2.0 compatibility.
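A minimal sketch of how user tagging aggregates into a 'data cloud' from which trends surface; the bookmarks and tags below are invented for illustration:

    from collections import Counter

    # Each user-saved item carries free-form tags (a folksonomy).
    bookmarks = [
        {"url": "http://example.org/a", "tags": ["web2.0", "tagging"]},
        {"url": "http://example.org/b", "tags": ["web2.0", "sharing"]},
        {"url": "http://example.org/c", "tags": ["tagging", "folksonomy"]},
    ]

    def build_tag_cloud(items):
        """Count every tag across all users' classifications; the most
        frequent tags are the emerging 'trends'."""
        cloud = Counter()
        for item in items:
            cloud.update(item["tags"])
        return cloud

    print(build_tag_cloud(bookmarks).most_common(3))
    # [('web2.0', 2), ('tagging', 2), ('sharing', 1)]

No central authority supplies the categories here: the classification scheme emerges from the users' own tags, which is precisely the shift away from centralised standardisation described above.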