Search engines like Google make it almost effortless to find the answer to nearly any question with a few taps of the fingertips, and that convenience may be changing the way our memory works, according to a study published recently in the research journal Science.
The report, co-authored by Betsy Sparrow, an assistant professor in the department of psychology at Columbia University; Jenny Liu at the University of Wisconsin-Madison; and Daniel Wegner at Harvard, suggests that the brain is much less likely to recall information when it knows that information can be found quickly online. Instead, the brain will more often remember where the information can be retrieved, rather than what the information actually is.
The study was conducted in four experiments.
Volunteers, all college students, were asked to answer a series of questions or perform cognitive tasks. In one experiment, the students were given trivia questions and asked to type their answers. Half the volunteers thought their information would be saved; the other half thought it would be erased. Sparrow said the students who believed their data would disappear remembered the answers better than the other group.
In another experiment, the volunteers were told their information was being saved in general folders such as “FACTS” and “NAMES.” The students remembered where their data was stored more often than they remembered the data itself.
The research suggests that people who feel they will have access to information later, as many do with the Internet, are less likely to memorize small pieces of information, since those facts are so readily available.
This doesn’t necessarily mean that memory or cognition is deteriorating in the information age. People have always relied on other tools to remember certain bits of information, and with the advent of search engines, the Internet has become the latest resource.
The research, according to Daniel Wegner, professor of psychology at Harvard and one of the co-authors of the study, was prompted in part by a theory he and his wife developed in the ’80s called “transactive memory.” The widely studied theory suggests that people in a group such as a married couple or close friends depend on each other to remember alternate pieces of information.
For instance, a husband may rely on his wife to remember what time dinner reservations are, or someone might turn to a history-buff friend to recall what year Lincoln issued the Emancipation Proclamation.
“Thinking with computers is a natural extension of that. In the same way you depend on a friend, now you depend on Google,” Wegner told TechNewsWorld.
Dependence on Google or other resources may indeed be an indication of an expansion of mental capability and potential rather than a dumbing-down effect on cognition.
“We’re a lot smarter now, and that’s why we use it. We’ve become somewhat addicted because it really extends mental capacities. It’s kind of like a mental prosthetic device that’s better than what you’ve had before,” Wegner said.
In fact, our brain is wisely strategizing, according to Paul Reber, professor and director of brain, behavior and cognition at Northwestern University.
“It indicates an impressive amount of effective strategic memory use. It’s not hard to overwhelm our memory abilities — if you have a large amount of information to try to learn, you’ll need to spend a long time at it or you are going to lose or forget some items. If some of those items can be looked up again easily, it makes a lot of sense to focus our attention and resources on the other items to remember,” Reber told TechNewsWorld.
With more data, images and information to process, our brains may actually be freeing up room for greater analytical capability rather than rote storage.
“There’s no evidence we are forgetting things more rapidly now than before the Internet. It seems likely that with a much larger amount of information generally around, we are probably trying to remember more. In addition to studying what we forget, it would be important to look at how much we remember,” Reber said.
That influx of information from technology, in other words, can coexist with cognition without damaging it.
“More information makes us smarter, not stupider,” said Reber.