“The easier it is to quantify, the less it’s worth” – Seth Godin
Could you spend a day without your phone? Would you be less smart without your tech gadgets, maybe less capable?
Our brains certainly have limits: they can hold only so much information, and they represent reality through restricted sensory inputs and processing capabilities.
Information technology extends these abilities and reflects important scientific advances when it helps us discern cause and effect and supports better decision-making.
A primary way it does so is through the seamless generation, combination and processing of new information. But just as the brain is constrained by its capacity to sense and process information, our technological artefacts are limited by the kind of information they generate and the processing logics programmed into them.
What follows offers some criticism of information technology, focusing specifically on the ‘datafication’ of our experiences and the algorithmic representation of phenomena. The idea is not to paint a sombre picture but to help us keep our bearings amid rapid socio-technical evolution.
Firstly, do the digital tentacles capture all corners of our society?
Smart technologies such as AI, IoT, machine learning, robotics and automated systems promise to deliver on efficiency and responsiveness, but it looks like they might be further exacerbating the “unequal distribution of digital benefits in their design and implementation” (Park & Humphry, 2019, p. 935).
It’s not only that access to digital services is largely intertwined with socio-economic status; service providers also tend towards relatively normative and simplified user responses when training their algorithms, with AI learning often skewed towards the majority (Park & Humphry, 2019).
Should we then be comfortable with AI solutions doing things like screening applications for jobs, for bank loans or insurance?
To what extent can data accurately represent existing realities?
From a phenomenological perspective, no single view can ever be complete, as the perception of a referent is altered by its relationship to context and to juxtaposed voids. We can never enumerate all possible views, yet neither should we design with a singular viewpoint in mind.
Monteiro & Parmiggiani (2019) demonstrate the limits of what they call ‘synthetic knowing’ with a longitudinal case study of digitally rendered environmental monitoring by an oil and gas company operating in the politically contested Arctic. They reveal how knowing is politically charged, and how sensors and algorithms increasingly stand in for phenomenological reality, ‘configurable’ to the interests of profit-seeking stakeholders.
With the liquification of phenomena into digital materiality, real-time knowing can become an algorithmic phenomenon, and the debate continues over whether such knowing is credible enough to base consequential decisions on.
Skewed algorithmic representation of people and environments may constrain the intelligence of our services and decision-making, but there is a close corollary when it comes to how we personally construct our daily experiences, particularly when dealing with digital representations of our own selves.
An example of ‘datafication’ of the self is found in self-tracking technologies such as fitness apps and smart wearables, which abound in a growing ‘healthist’ culture in both private and organizational domains.
Measuring activities like running or leisure time, and internal states such as cardiovascular performance, sleep, caloric intake and stress, is meant to support users in monitoring, managing and achieving desired performance levels.
What are possible implications of having a ‘quantified self’ on the definition of self, and consequently, on one’s lived experience?
The bright side of the discourse reports feelings of empowerment and control, growing self-awareness, as well as initial spikes in interest and engagement.
However, Kristensen & Ruckenstein (2018) point out that the initial intensification of experience, feelings of liberation, and developed inner sensitivity may later become limiting in terms of self-experience. Among the reasons proposed is the amplification that the data may bring to certain aspects of the self at the expense of others.
An augmented or a reduced experience?
From a psychoanalytic perspective, Andrieu (2015) argues that the ontological discontinuity between the ‘living person’, the ‘living person as reflected by the data’ (i.e. the quantified self), and the ‘lived experience’ marks a distinction between three levels of self-knowledge.
Unlike transformation through ‘normal’ self-knowledge, technology-mediated self-transformation does not take place only through conscious self-examination. It happens via a ‘confrontation’ between one’s lived self and the data reflected by the tracking technology.
Andrieu indeed poses some pertinent questions: is this a new form of ‘bio-power’, or a mode of subjectivation (i.e. a redefinition of the self)? Is it the ‘living self’ that is reflected, or the ‘living self as transformed by the digital self’?
To illustrate, well-being, once qualitative and in a way non-communicable, becomes numeric and translatable under the illusion of correspondence between one’s lived physical condition and the digital self. Is it then one’s felt well-being, or one’s well-being-as-mediated-by-the-data?
Like the above-mentioned limitations of datafication in our services, ‘algorithms of the skin’ and a particular ‘bio-pedagogy’ further reinforce certain desirable profiles of health and physical education (Andrieu, 2015). Here, the consequences relate to how we perceive and manage our bodies and lifestyles, in addition to the recurring concern of whether people have access to the resources needed to measure up.
In a nutshell, technological dependence may compromise the self-governance of one’s own body.
This runs counter to the ethos of ‘embodied experience’ where we must learn to rediscover our experiences through a more holistic integration of mind, body and emotion.
In a world of constant distraction, we are easily alienated from ourselves. It is no surprise that we end up serving other people’s agendas, working tirelessly like a cog in a machine (Godin, 2010).
But ‘living by the numbers’ can be counterbalanced by ‘living with the numbers’ in what Pantzar & Ruckenstein (2017) optimistically describe as a milieu of open-endedness and reflexivity.
In conclusion, it is advisable to be attuned to, and even wary of, the man-made things that mediate between the world and our consciousness.
Technology can be fascinating in augmenting our ability to internalize the world, and to externalize the mind. But we need to find that sweet spot in our quest for optimization, to make sure it doesn’t over-rationalize our institutions.
Fares Khalil
Doctoral Student
References
Godin, S. (2010). Linchpin: Are you indispensable? New York: Portfolio.
Monteiro, E. and Parmiggiani, E. (2019). Synthetic knowing: The politics of the internet of things. MIS Quarterly, 43(1), pp. 167-184.
Ng, I.C.L. and Wakenshaw, S.Y.L. (2017). The Internet-of-Things: Review and research directions. International Journal of Research in Marketing, 34(1), pp. 3-21.
Pantzar, M. and Ruckenstein, M. (2017). Living the metrics: Self-tracking and situated objectivity. Digital Health.
Park, S. and Humphry, J. (2019). Exclusion by design: Intersections of social, digital and data exclusion. Information, Communication & Society, 22(7), pp. 934-953.