Umwelt and individuation: Digital signals and technical being

This chapter, which forms part of a deep and existentially far-reaching anthology on Digital Existence, is essentially a plea for a more responsive, cooperative information infrastructure. I address this by taking Facebook as an example.

Today’s digital landscape is quite literally premised on a theory of information that was originally intended for machines – Claude Shannon’s mathematical theory of communication from 1948. Thus, the digital imaginary of our time is unfortunately of a very rigid, mute, non-vitalist kind – essentially inhuman.

My chapter is an attempt at reaching towards a more integrated, dynamic, vitalistic, and inclusive theory of digital information, by adopting the theory of Gilbert Simondon, a French philosopher of technology whose major works date from the 1950s.

Simondon affirms technology as a symbiotic process, enabling a utopian future where humans and digital infrastructures can be allowed to truly co-habit this planet – in contrast with today’s mainstream paradigm, which rather seems to stipulate an alienated relationship to technology, humans in one corner of the ring and machines in the other. In Simondon’s theory, the individual is not a being but an act, and individuality is always an aspect of generation, ever-evolving, an ongoing genesis.

This stands in stark contrast to prevailing technocratic “solutions” (apps, platforms, databases) that are essentially systems of control, where users are deprived of genuine participation and are at best offered limited forms of co-creation, always conditional on the terms set by the proprietors in question. At worst, the participation allowed to users is only illusory. The very act of trying to encapsulate human being in predefined, finite, locked-down boxes – trying to “pin down” individuals and groups by recourse to palimpsests intended to “freeze” system states, as if these were reliable and objective snapshots of human behaviour – is reductive and regressive at its core.

Believe it or not: these rather outlandish epistemological convictions actually lie at the root of today’s tech companies that base their business models on behavioural data, leading the operatives inside these companies to pretend that the signals gathered are truthful and representative renditions of human behaviour.

What is more, once these operatives implement new applications based on the data they are constantly gathering and feeding into algorithmic systems of behavioural manipulation and control, these systems begin to actively shape the very world they interact with.

Soon, sinister feedback loops emerge: by observing the behaviours that these algorithmic systems prescribe, indeed dictate, users are taught to behave in specific ways in order to navigate the interface as expected. In doing so, they are enticed to make further interactions which will, in turn, be farmed as new, interesting content for other users to interact with: think of how Facebook users are compelled to publish and share content that is expected to be desirable among their peers.

More importantly, any move a user makes is monitored and recorded so as to enable the corporation to interpret these signals and select the content and advertisements that, based on what these signals are read to indicate, it believes the user will find interesting.
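To make this concrete, here is a minimal Python sketch of the kind of signal-compilation I have in mind. Everything in it is invented for illustration – the signal categories, the weights, and the inference rule do not correspond to any actual Facebook system:

```python
from collections import Counter

# Hypothetical signal log: (user, action, topic) tuples standing in for
# the clicks, likes and shares that the platform records.
signals = [
    ("alice", "like", "cycling"),
    ("alice", "click", "cycling"),
    ("alice", "share", "cooking"),
    ("alice", "click", "politics"),
]

# Assumed weights: the sketch treats some actions as stronger indicators
# of interest than others (the numbers are made up).
WEIGHTS = {"click": 1, "like": 2, "share": 3}

def infer_interests(log, user):
    """Compile raw signals into a weighted interest profile."""
    profile = Counter()
    for u, action, topic in log:
        if u == user:
            profile[topic] += WEIGHTS[action]
    return profile

def select_ads(profile, inventory, n=2):
    """Pick the ads whose topics score highest in the inferred profile."""
    return sorted(inventory, key=lambda ad: -profile[ad["topic"]])[:n]

inventory = [
    {"name": "Road bike sale", "topic": "cycling"},
    {"name": "Cookware set", "topic": "cooking"},
    {"name": "News subscription", "topic": "politics"},
]

profile = infer_interests(signals, "alice")
print(profile)                         # e.g. Counter({'cycling': 3, 'cooking': 3, 'politics': 1})
print(select_ads(profile, inventory))  # the two highest-scoring ads
```

The point is not the arithmetic but the epistemology: the profile is built solely from observable actions, and the system then acts as if this profile were the person.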

Moreover, users arguably adapt their own behaviours to suit the algorithmic infrastructure: in order to maintain peer visibility, users are compelled to design their posts in accordance with what the algorithmic interface tends to value as popular or recognizable to a large audience (Gillespie, 2014: 183). This precipitates a kind of built-in conformism – a popularity bias (Webster, 2014).

Algorithms indirectly construct culture by way of feedback loops like this: individuals act on what they observe these semi-automated systems to value.
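A toy simulation can make this conformist loop visible. In the sketch below – where every parameter, including the 80/20 split and the starting scores, is my own arbitrary assumption – posts are ranked by accumulated popularity, users mostly imitate whatever style currently ranks highest, and an initially tiny popularity edge compounds into near-uniformity:

```python
import random

random.seed(1)

# Two stylistic conventions users can post in; "B" starts with a tiny edge.
styles = {"A": 10, "B": 12}  # accumulated popularity scores (invented numbers)

def post_style():
    """Users design posts to match what the ranking currently rewards,
    with a little residual independence."""
    if random.random() < 0.8:           # 80% conform to the current leader
        return max(styles, key=styles.get)
    return random.choice(list(styles))  # 20% post independently

for round_ in range(10):
    # Each round, 100 users post; exposure (and thus new popularity)
    # follows the ranking, which closes the loop.
    for _ in range(100):
        styles[post_style()] += 1
    share_b = styles["B"] / (styles["A"] + styles["B"])
    print(f"round {round_}: share of style B = {share_b:.2f}")
```

The numbers are arbitrary; the structure is the point: the ranking does not merely measure popularity, it manufactures it.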

My argument, in brief

There is a funny thing about all this, though.

Do you see how the humans in the loop always have to second-guess what the system would prefer or predict? Essentially, the corporation makes educated guesses from the vast amounts of user signals it collects, and tries to construct target groups so as to increase the chances that advertisers can place ads that actually engage the users. Users, in turn, try to “game” the system so that they can reap as many benefits as possible from using it.
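As a deliberately naive sketch of what “making target groups” might amount to – the grouping rule here is entirely my own invention, far cruder than anything a real platform would use – users could simply be bucketed by the topic they signal most often:

```python
from collections import Counter, defaultdict

# Invented signal log: (user, topic) pairs standing in for clicks, likes, etc.
log = [
    ("alice", "cycling"), ("alice", "cycling"), ("alice", "cooking"),
    ("bob", "politics"), ("bob", "politics"), ("bob", "cycling"),
    ("carol", "cooking"), ("carol", "cooking"),
]

def target_groups(log):
    """Assign each user to the topic they signal most often."""
    per_user = defaultdict(Counter)
    for user, topic in log:
        per_user[user][topic] += 1
    groups = defaultdict(list)
    for user, counts in per_user.items():
        dominant, _ = counts.most_common(1)[0]
        groups[dominant].append(user)
    return dict(groups)

print(target_groups(log))
# {'cycling': ['alice'], 'politics': ['bob'], 'cooking': ['carol']}
```

An advertiser who buys the “cycling” group is buying a guess – which is precisely why users who learn the guessing rule can game it.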

Researchers like Taina Bucher and John Cheney-Lippold have come to similar conclusions.

In order to understand all of this better, let us think of these media-technological systems as Umwelten for individuals to roam through. The concept of Umwelt was developed in the early 20th century by the Baltic German biologist Jakob von Uexküll, and refers to the cognized environment, the “self-centered world” in which every organism lives. All organisms experience life in terms of subjective reference frames: a bumblebee is at the center of its own world, much like the Facebook user is at the center of her own world, uniquely personalised for her by Facebook the corporation™.

So, as users interact with environments that are unique to them and only them, they simultaneously give off signals – after all, this is an environment built on surveillance all the way through. These signals are instantly harvested by the platform proprietors and read as indicative of the assumed internal states of these individuals.

The really clever thing about this argument, though, is that we can think of the platform infrastructure’s intelligence, too, as a form of technical Umwelt unto itself!

Facebook doesn’t magically “know” you, as if we were dealing with some kind of sentient fairy-tale being, a Leviathan of sorts (although some critical scholars would certainly seem to want to frame it that way!). The platform operators and managers can only “see” that which takes place in direct interactions – the actual “clicks” and measurable movements made. This is, quite literally, all that the automated systems have to go on. A system is a sum of its inputs. It is by compiling signals, encoded in the form of “behavioural data,” that the engineers, behavioural scientists and marketing experts who build and maintain this infrastructure make their decisions.
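One way to dramatize “a system is a sum of its inputs”: two users whose inner lives differ completely are identical from the system’s point of view whenever their measurable traces coincide. A minimal sketch, with all data invented:

```python
# What the system records: observable events only, never intentions.
alice_trace = [("click", "protest-video"), ("share", "protest-video")]
# Bob shares the same video ironically, to mock it; the system cannot tell.
bob_trace   = [("click", "protest-video"), ("share", "protest-video")]

def system_view(trace):
    """The platform's entire 'knowledge' of a user is their event trace."""
    return sorted(trace)

# Identical inputs => identical users, as far as the infrastructure knows.
print(system_view(alice_trace) == system_view(bob_trace))  # True
```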

Consequently, we should not underestimate the degree to which the actual operatives inside the platform corporations are informed by estimations that risk being very reductive, if not outright blind to many aspects of human life.

A stunning addition!

After having finished this chapter in 2018, I was reminded of the concept of affordances, pioneered by the psychologist J. J. Gibson in 1979. It is a bit embarrassing that his work hadn’t crossed my mind before. I’m schooled in a field somewhat indebted to continental philosophy and the Frankfurt School, so the work of an American mid-20th-century psychologist hadn’t really cropped up on my radar.

Then again, Gibson himself made no reference to Umwelt either.

Andersson Schwarz, J. (2018). Umwelt and individuation: Digital signals and technical being. In A. Lagerkvist (Ed.), Digital Existence (pp. 61–80). London & New York: Routledge.

Paywalled / contact me for access
