Nicole Timmermans explores how TikTok exploits users’ behavioural biases.
‘Digital crack cocaine’ – the words used by a Forbes journalist to describe the Chinese app that allows users to share bite-sized clips. TikTok downloads have surpassed those of Facebook, Snapchat, WhatsApp and Instagram. Watching aesthetically pleasing pasta clips or frogs dressed as Harry Potter is innocuous fun, but how far will platforms go to retain user engagement?
The app’s popularity has undoubtedly gathered media attention, most recently amid fears over user privacy and Trump’s threats to ban it. Behind the publicised political tensions, however, lie covert mechanisms designed to exploit humans where they are most vulnerable – their cognitive biases and heuristics.
Development of Nudge Theory
Economics Nobel Prize laureate Daniel Kahneman identified a spectrum of systematic biases present in decision-making, contradicting the principles of rationality presented in neoclassical theory. His book, Thinking, Fast and Slow, presents a dualistic model of decision-making. A dichotomy lies between ‘System 1’ – intuitive, emotional thinking, subject to heuristics – and ‘System 2’ – slower, analytical, conscious reasoning. ‘System 1’ dominates day-to-day choices. ‘Nudges’, as proposed by Richard Thaler and Cass Sunstein, build on this model, altering “behaviour in a predictable way without forbidding any options or significantly changing their economic incentives”.
Recognising biases can encourage individuals towards ‘better’ choices, minimising hurdles and making it easier to save, recycle and exercise. By definition, a nudge is transparent, easy to opt out of and aims to improve welfare.
Stuck in the Sludge
Why, then, are nudges being used in the opposite direction? Whilst awareness of our cognitive vulnerabilities can prevent us from falling victim to them, that same knowledge can be used to exploit and manipulate us at a subconscious level. Such ‘Dark Nudges’ have been coined ‘Sludges’. There is little systematic analysis of how sludges are used to undermine public health efforts and to maintain environments that favour corporate interests. TikTok is a pertinent example, as it is primarily targeted towards children and teenagers.
TikTok’s recognisable interface is characterised by immersive full-screen clips which lack text. Users flick through their ‘for you page’ for hours as AutoPlay starts a new clip with just a single swipe. This minimised physical effort ensures ‘System 2’ remains deactivated; decreased friction, increased engagement.
Behavioural economics suggests that an absence of decision points increases consumption. By removing the decision to ‘play’ a video, developers exploit our tendency to procrastinate, crafting a prolonged, more engaged chain of consumption. This can explain the time-inconsistencies reported by users. By the time you look up from watching ‘things-in-my-Edinburgh-flat-that-just-make-sense’, you may not have noticed that an hour has already passed.
Most of us would just laugh this off; a more sobering observation, however, is the mechanism TikTok shares with gambling – intermittent reinforcement. Slot machines are designed to lack predictability, creating an equilibrium in which a user ‘wins’ every so often, preventing the boredom of continuous wins and the frustration of continual losses. TikTok is no different. A user watches a clip they enjoy (a ‘win’), then wonders whether the next clip will be as good, only to find it dull (a ‘loss’). Rational behaviour would dictate that the user leave the app. Instead, they ask, ‘what if the next clip is enjoyable?’, reinforcing the addictive cycle.
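The ‘what if the next clip is enjoyable?’ loop can be illustrated with a toy simulation – my own sketch, not TikTok’s actual algorithm. Assume a user swipes until they hit a run of consecutive dull clips, and each enjoyable clip (a ‘win’, with some probability) resets that frustration. Even a modest win rate dramatically extends the session compared with a feed of pure losses:

```python
import random

def session_length(p_win, patience, max_swipes=10_000, rng=None):
    """Swipe until `patience` dull clips arrive in a row.
    An enjoyable clip (probability p_win) resets the dull streak --
    this is the intermittent-reinforcement effect in miniature."""
    rng = rng or random.Random()
    dull_streak, swipes = 0, 0
    while dull_streak < patience and swipes < max_swipes:
        swipes += 1
        if rng.random() < p_win:
            dull_streak = 0   # a 'win' restores interest
        else:
            dull_streak += 1  # a 'loss' builds frustration
    return swipes

rng = random.Random(42)
avg = lambda p: sum(session_length(p, patience=3, rng=rng)
                    for _ in range(2000)) / 2000
no_wins = avg(0.0)    # every clip dull: quits after exactly 3 swipes
some_wins = avg(0.5)  # occasional wins: mean session roughly 14 swipes
print(no_wins, some_wins)
```

The hypothetical `patience` parameter stands in for the user’s tolerance for boredom; the point is simply that occasional, unpredictable rewards keep the loop running far longer than any fixed payoff would.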
To what extent can we hold such corporations accountable?
Nudging by firms – through advertising, default choices and reduced alternatives – is considered acceptable. There is a fine line between persuasion and manipulation, between a nudge and a sludge, and no objective, quantifiable value separates the two. The distinction is rooted, arguably, in the concept of libertarian paternalism. As Thaler and Sunstein advocate, “it is both possible and legitimate for private and public institutions to affect behaviour while also respecting freedom of choice”.
Social media is transgressing this boundary. Charitable bodies such as The Centre for Humane Technology, who question the ethics of consumer technology, are fighting to hold corporations accountable. But what can we do as consumers? We can set our own nudges to counter ‘Dark Nudges’. Next time you’re on the app, set timers or turn off notifications. Try creating your own decision points. In this way, we can recognise and fight those quiet and almost imperceptible changes in our behaviour. Help yourself say goodbye to Addison Rae and hello to autonomy.