How Too Much Information Can Make Us Stupid and Miserable
On the great paradox of the Information Age and why we can’t keep ignoring it
Air, water, and soil pollution are now well-recognised and extensively studied issues.
We are acutely aware of the damage humanity’s exponential growth over the last centuries has inflicted on natural environments, and of the implications for our future if we do nothing about it today. But, as a group of scientists recently argued in a letter published in Nature Human Behaviour, perhaps we also need to recognise and mitigate pollution in another kind of environment: the information space.
After all, in the last few years alone, we’ve created more data than in all of prior history. One estimate suggests that roughly 2.5 quintillion bytes of data are generated every day — a number that is difficult even to comprehend.
Most of us are immersed in an unprecedented amount of data, too. From the moment we look at our phone in the morning, we’re almost immediately assaulted by a barrage of information. Emails. Texts. News app alerts. Social media notifications. Weather forecast. Calendar appointments. New podcast episode nudges.
Some of it does make our lives easier.
Some of it, well, not necessarily.
Still, overall, has all this information actually helped us make more informed and logical decisions, as some once predicted? Has it made us more intelligent or happier?
There’s a concept in evolutionary biology called the ‘evolutionary mismatch.’ In essence, it’s an adaptive lag that occurs when an organism’s environment changes faster than it can adapt.
One classic example for humans is our ‘sweet tooth.’
Our hunter-gatherer ancestors needed plenty of calorie-rich foods to sustain their large brains and costly reproductive strategies during periods of nutritional scarcity. And since sweetness in nature signals the presence of sugars, a great source of calories, we evolved to crave sweets. But while our environments have changed dramatically since prehistoric times, we mostly retained our Stone Age bodies and brains. As a result, an otherwise helpful trait has proven quite problematic in the modern world, as evidenced by the high rates of obesity, diabetes, and tooth decay.
But that’s just one example of many that suggest our world might be changing too quickly for evolution to keep up.
The rapid rate of technological advancement and modernisation, including the increasing amount of information we process — or, at least, try to process — every single day is another.
Actually, new information is sort of like brain candy. Our prefrontal cortex — the part of our brain that regulates our thoughts, actions, and emotions — has a ‘novelty bias,’ meaning that its attention is easily hijacked by something new, whether it’s a bird landing on your balcony’s rail to rest (one did just as I was writing this sentence), a text from a friend, or a breaking news notification. This is also why, even when we try to stay focused, we might find it hard not to seek out new stimuli ourselves. As neuroscientist Daniel J. Levitin explained in an article for the Guardian:
We (…) look up something on the internet, check our email, send an SMS, and each of these things tweaks the novelty-seeking, reward-seeking centres of the brain, causing a burst of endogenous opioids.
Some experts have also compared our inboxes, news and social media feeds, etc., to slot machines. And while this might not sound so bad — especially if you’re into gambling — it’s no secret that Vegas wasn’t built on winners, is it?
So, what happens when your attention has been hijacked one too many times throughout the day? Or when you let yourself spiral down the vortex of the web for hours and hours on end?
A systematic review of nearly a hundred studies on this issue, published in Frontiers in Psychology last year, found that information overload can lead to emotional and physical exhaustion, burnout, low job satisfaction, poor decision-making, as well as changes in cognitive performance, sometimes significant ones. In other words, constantly giving the novelty-seeking portion of your brain what it craves results in something akin to the sugar rush some people experience after eating sweets. But you don’t just eventually crash and feel worse than before. You might also become unable to evaluate and process information and make reliable decisions, forcing you to default to feelings over facts.
On top of that, several studies have found that even knowing there’s a wealth of (seemingly) reliable information available 24/7 at our fingertips makes us less likely to store new knowledge in our own memory since we can just ‘look it up’ when needed. This is the so-called ‘Google Effect.’
All of this would be bad enough if the issue were only the quantity of the information and our Stone Age brains’ tendency to get easily distracted by novelty. Unfortunately, it’s not.
Even kids today have a name for it: ‘brain rot.’
Originally coined somewhere in the depths of TikTok, the term refers to the idea that the internet is ‘rotting’ the brains of people who use it too often. It also denotes content that further ‘rots’ the brain due to its self-repetition or low quality.
You don’t have to be on TikTok or any other social media platform, though, to notice that the quality of information online is rapidly diminishing. You just need to use Google’s search engine. Or any other mainstream search engine, for that matter. A study by researchers in Germany, published earlier this year, also confirms this. After analysing the highest-ranked pages for several product review search terms on Google, Bing and DuckDuckGo, the team found that many top results aren’t only low-quality, mass-produced information — they’re often just spam.
If you were counting on AI to fix the problem, I have disappointing news. Shortly after Google rolled out its new AI search tool, AI Overviews, to users across the US, several of its suggestions went viral on social media for how wildly nonsensical or dangerous they were. This included advice like telling people to eat one rock per day, use glue to get cheese to stick to pizza, and drink urine to pass kidney stones quickly.
Generative AI is also a massive contributor to the rising pollution of our information environments. Some estimates suggest that AI already produces up to 10% of all online content and that by 2026, this could increase to a staggering 90%. How much of that will be of decent quality? How much of it will be insightful and helpful?
And how much will just end up ‘rotting’ our brains?
Apart from low-quality information and spam, the internet is also rife with misinformation and, perhaps most worryingly, disinformation. A recent study in Nature found that checking the accuracy of misinformation encountered online, particularly on controversial topics, can often lead people to information spaces — dubbed ‘data voids’ — that contain corroborating evidence from low-quality, unverified sources. I’ve written extensively on a similar issue before. Even searching for terms related to body image or loneliness can lead young men and boys to ‘manosphere’ sites, which are full of misogynistic pseudo-science, while engaging with seemingly innocuous ‘tradwife’ content will expose you to a plethora of conspiracy theories, including far-right extremist claims.
Research also shows that falling into these disinformation rabbit holes — which are sadly becoming all too common — can have a detrimental impact on people’s physical and mental well-being.
Well, clearly, some of the previous media literacy campaigns that encouraged people to ‘just search’ for information online to verify things and become smarter weren’t exactly right.
This is even more dangerous given our brains’ other tendencies, such as confirmation bias, which leads us to pay more attention to information that confirms our existing beliefs while ignoring contradictory information, and the ‘illusory truth effect,’ which means the more we’re exposed to a particular piece of information, the more we perceive it to be true, regardless of its veracity.
I’ve become increasingly selective about what information I consume, where I consume it, and how much time I spend doing so. I also avoid getting all of it from screens; my preferred method is still good old ink and paper. Recently, I’ve bought several second-hand books on cooking, plants, and gardening so that whenever I have a question or an issue, I can consult them first before trying my luck with the search engines.
Paradoxically, finding reliable information in the Information Age has become a challenge and, to some extent, a privilege.
Many quality sources are paywalled, sometimes at quite high price points — including, for instance, climate intelligence on the risk of extreme weather events. You’d think people, especially those who live in areas most affected by climate change, should be able to access such information for free, but that’s not always the case.
Even having the time and mental energy to filter out irrelevant or low-quality information and find better sources is a privilege not everyone can afford. We’re not only living in an era of unprecedented information explosion but also an era of overwork. Most people are too busy trying to survive, juggling full-time jobs and side hustles with raising families, to then be particularly vigilant about what they consume online.
But it’s precisely our busyness and exhaustion from the increasingly unlivable reality of our world, combined with information pollution, that allows the simplest, loudest, and most emotionally charged — even if not remotely accurate or fair — ideas to become so popular nowadays. After all, rational thought requires some mental effort. But emotions are near-effortless.
It’s also no coincidence that as there are more and more ways to automate and amplify the spread of these ideas — for instance, with bot farms or various AI tools — we’re seeing a rise in the popularity of fringe extremist communities and ideologies, which even in small numbers can have an outsized impact on public perceptions and policy. (The Overton window concept explains well why.)
Too much information, whether good or bad, clearly poses significant personal and societal dangers.
On the one hand, the sheer quantity of information and the frequency with which we’re exposed to it can be detrimental to our mental and emotional well-being and, to some extent, even our overall intelligence. (Indeed, some recent studies show a worrying reversal of the 20th-century intelligence boom.) On the other, it can make people increasingly polarised and, ultimately, pose an existential threat to democracies.
Even the fact that we now have to waste scientific resources combating misinformation on topics where there’s been established scientific consensus for years shows how counter-productive this explosion of ‘knowledge’ can be.
The right approach would be to build technology with human nature and its limitations in mind rather than creating it first and worrying about its impact later — or worse, designing it specifically to exploit us most efficiently.
But can we hope for the right thing to happen when the capitalist addiction to relentless growth and prioritisation of profit over anything else remains the dominant force in our world?
Still, just as we’ve started to clean the air, water, and soil, it’s time we found ways to clean up the pollution that directly affects our minds, too.