We are not becoming ‘dumber’ because of Google, but we are becoming cognitively different

Vikram Singh
9 min read · Dec 3, 2018

When new technologies come around, people worry. Teeth gnash, hands wring. Not that worrying about the effects of new tech is unwarranted, but the worry normally results from changes in a small set of variables. People nervously monitor these variables in lab-based studies, with any change being reason to raise the alarm: a technologically-driven dystopia is at hand!

For instance, lately, there has been a great deal of concern about how the web and digital technology generally are affecting our memory and our thinking habits.

Socrates, pictured telling everyone that the newly invented writing will “introduce forgetfulness into the soul of those who learn it”. Look at them — they know that’s bullshit.

A popular article on this topic was written here by Erman Misirlisoy. There are tons of these articles — I’m picking on this one because it’s recent and because it’s also on Medium.com.

Neuroscientist Erman Misirlisoy is concerned that the web is making our memory worse. I understand his concern, but unfortunately, as is the case with much of what is written about lab-based science, it doesn’t take in the larger picture.

For example, he writes:

When we type out interesting trivia tidbits on a computer, our memory for the information is significantly better if we are told that the computer will delete rather than save the information.

Though Misirlisoy doesn’t always cite the studies he is discussing, he is almost certainly talking about this study.

First, there is the simple question of how generalisable this conclusion is. In the book Minds Online, the authors claim that the study:

“does not actually involve using the Internet, it only involves storing statements on a desktop computer. It is very likely that the Web transforms human biological memory systems in all kinds of ways, but the currently available empirical research in cognitive psychology does not seem to support strong negative conclusions.”

That’s the simple response to a study like this: it doesn’t factor in the actual circumstances of how we use the web in the world, and those differences make a difference, as there just aren’t strong negative conclusions overall.

But a much more interesting response is that, yes, our minds are changing — and that’s not a new or a bad thing. Our minds are plastic, changing and altering with the environment in micro (personal/cultural) and macro (evolutionary) ways.

There are many different theories about this; they can all be roughly grouped under what’s known as extended cognition. They all, in one way or another, discuss how one can’t think about the human mind without thinking about the context it currently exists within and the environment from which it has evolved. In other words, the mind is embedded and interwoven into the material world. Lambros Malafouris, a proponent of this view, puts it simply:

The Science of the Mind and the science of material culture are two sides of the same coin.

Think about it: you don’t remember phone numbers anymore because you know you don’t have to — your phone does that for you. The phone acts as a ‘surrogate structure’ in what can be classified as your Extended Mind. This isn’t something we do accidentally; it’s an evolved trait that allows us to work with and use the environment. This isn’t a ‘new’ method of thinking either. Consider basic activities within your life: if you’re on a bus, you don’t recall the particulars of the route because you know you can rely on the bus system to also get you back.

We know what information we don’t need to remember when we are on a bus

At its most basic, you can think of extended cognition as your understanding of what information is easily, robustly, and reliably accessible, so that your mind adapts. This is what’s known as an implicit metacognitive commitment: it’s a bit of information out there in the world that we unconsciously (implicit) understand how to work into our cognitive behaviour (metacognitive) because we know it is reliably accessible (commitment).

What’s more, we know what information is relevant and not relevant for us, and how to adapt to and interpret that relevance. Some information is more important than other information, and we only know through experience of the world what is important and what isn’t.

Think about when you enter a new skill-based environment, when you first learn to ski, for example. You don’t know what the relevant sensory input is. You’re reflexively aware of all the information and feedback you receive, rather than unconsciously processing the necessary feedback to keep you balanced. You are aware of each bump, your leg sliding slightly, and you overcompensate because you don’t know how to interpret the feedback, or the importance of each bit of it. You don’t have to be — and in many cases cannot be — consciously aware of yourself becoming acclimatised to appropriately weighting and responding to the feedback — your mind-body feedback loop does this automatically.

Which small bits of sensory input are important, and will let you know that you’re not going to land on your face? He knows; you probably don’t.

On a larger scale, we’ve become acclimatised to our environment in ways that make us much less aware of it.

Think about bringing someone from 1000 years ago to our time. Imagine how they would be hyper-aware of every car on the road. Our vehicular metal behemoths would be terrifying, unpredictable sources of danger. We modern humans, however, know we don’t need to track the location, vector, and behaviour of every car within sight.

So, in summary, we have evolved minds that allow us to understand:

  • which information and stimuli are important and which are unimportant
  • which information and stimuli are reliably and easily accessible
  • how to incorporate and interpret stimuli and information in ways that provide value

And, in altering our cognitive behaviour to take advantage of these understandings, we free up our minds to occupy themselves with activities that we deem important.

So imagine if your mind didn’t do this!

Imagine if you felt the need to remember and focus on all the stimuli and information around you to an equally high degree. You would not adapt to the environment; learning and utilising new skills that relied on knowledge of the environment (read: all skills) would be tremendously difficult; and you would not be able to use the environment or technology in ways that allowed you to externalise cognitive processes to free up your physical brain to do other things.

Luckily, our minds are plastic: they respond to our environment by changing how they respond to, prioritise, store, locate, and rely on feedback.

But Misirlisoy is also concerned about the nature of remembering. He writes:

And if we type out tidbits and save them to a specific folder, we are more likely to remember where we stashed that information than details of the actual contents.

There’s actually relatively good evidence that this is true: we remember where things are better than what things are. But the situation is much more complex than this. Often, you remember things only when you remember certain cues.

Think about a memory that is even slightly unrelated to what you are doing right now, or somehow distant from your thoughts — the name of an acquaintance from high school, or a book you read last year. You might have to think on it — ‘sorting’ through your memories, finding the right cues, or succession of cues that result in that memory. This ‘sorting’ through your memories is functionally much the same as opening your phone to find some information you know is there. You perform a sorting or searching activity that requires an understanding of what you might call the metadata or succession of cues that point to that ‘memory’. In both cases, you know you have the information, you just have to find it.

This is because we aren’t always thinking about everything we ‘know’. You’re not remembering every bit of knowledge you have at any given time. Instead, your mind is focussed on your stream of consciousness — but to actually remember is a distinct action.

What this means — at least from a functional perspective — is that you have a much larger memory, spread across the world, based on cues. We outsource memories and stimuli that can aid our cognition to the world when we know the outsourced material is reliably accessible.

I was just at the pub with some coworkers, and I asked one of them when the Christmas party was. She said “Oh, I know”, pulled out her phone, looked at it, and told me. Our idea of ‘knowing’ changes.

The functional act of remembering, then, is agnostic about where its cues come from. To remember, we need different cues. Some come from our own thoughts, but others come from our environment. In the case of our new cognitive environment, which is intertwined with devices, these cues can relate to a file system or a browser tab.

This isn’t a bunch of ‘stuff’ — it’s a niche we’ve built for ourselves, shaped to instigate, provoke, and remind us of important things.

Fundamentally, our memories and thoughts are, and always have been, embodied in the world, and we use the world to think.

What we do need to understand is that the material relationship of our cognitive interleaving with the world is changing. This has occurred before: when language came about, or when literacy increased, we began to shift how we embedded our thoughts into the world.

“So what,” you might say, “we still can’t actually use our meaty brains to remember things as well — we’ve become dumb as hell!”

But I would argue that is most certainly not the case.

As noted before, we know that when we are able to access particular data quickly and reliably our minds aren’t occupied with remembering it. This frees up our minds to focus on other aspects of information such as the relations, patterns and themes of this data. In other words, we don’t need to worry as much about the content as the larger questions about that content. In this way we are able to be smarter about the data.

So if you are separated from your tech, are you dumber? Do you have a worse memory? Well, perhaps, but perhaps not. There’s no evidence that if you know you can’t access your technology, your memory is worse. As far as I’m aware, studies have only pointed to the conclusion that our minds don’t bother remembering what we know we have easy, reliable access to.

Importantly, this idea that you are a being untethered from its environment, that you don’t rely on the environment to think, is a highly anthropocentric view. We like to think that we somehow float above the world, that we aren’t coupled to it in ways that define who we are. The fact is that we have always relied on the material world and culture to help us remember. Scientists point to what’s called our cognitive niche. We have co-evolved with our environment to occupy this niche. To separate ourselves from our environment is tantamount to making us not ourselves.

Our niche now is not only our physical environment but our socio-linguistic-cultural environment, which holds all human knowledge in books, in architecture, in traditions, in norms, in civilisation as a whole.

Surely knowing how to access, work with, and correlate the repository of all human knowledge is more effective and useful than endeavouring to remember it all. The vital task, then, is ensuring that this knowledge becomes embedded in the socio-technical structures we have access to.

Our goal for tech then, should be ensuring that:

  • we have easy and reliable access to what we can call our extended mind
  • our extended mind is trustworthy (i.e. doesn’t have false information — a big issue right now)
  • we can easily and robustly juxtapose, arrange, and structure information with other bits of information in our extended mind
  • people are educated in the new skills required for this type of cognition, which include thinking about where to find information and how to create effective assemblages of information that we can juxtapose and query

It’s not that there aren’t a lot of unknowns. Incredibly interesting questions are appearing. For instance, the previously mentioned Professor Malafouris argues that literacy came about through our interaction with clay tablets. Literacy here means everything involved in literacy — entirely new ways of conceptualising the world came into being: we were able to have knowledge take a solid, unchanging form, and our minds were freed to take up questions about other things. We, in essence, became able to think in new ways.

So the question isn’t ‘how are we getting dumber?’, it’s ‘how are we getting smarter?’
