You're only as old as you feel, someone said. Actually, you're only as old as your ability to feel – or, given that emotion rests in the brain, your ability to think.
For a long time, dementia was thought to become a real problem only for people over the age of 65. However, the results of a recently released ten-year research project suggest that people as young as 45 are beginning to experience the onset of dementia.
I wonder, though, whether another ten-year study launched today would eventually find that what we called 'dementia' in 2012 had become something like the normal state of mind by 2022. And would it conclude that the decline in important areas of mental function was in large part due to our reliance on digital gadgets?
Indeed, isn’t it conceivable that removing access to digital gadgets would leave some of us feeling as confused, troubled by language and withdrawn from other people as dementia sufferers often feel today?
My mother lives with a mild form of Alzheimer’s. Of all the modern forms of dementia, this one has become a particular by-word for all that we fear about growing old. As one commentator put it, Alzheimer’s now has the stigma that cancer carried ten years ago.
Like millions of others in this situation – more than 26 million people were living with Alzheimer’s in 2006 – I know that things may well get worse for my mother. It might happen very slowly or quite quickly.
The condition is still little understood, but one thing is sure: Alzheimer’s effectively kills off parts of the brain that are essential to memory and communication, and to our social independence.
Isn’t an over-reliance on digital, Cloud-driven tools over a long period of time liable to produce the same results, at least for some of us?
Michael Saling, a neuropsychologist at Melbourne University, says that increasing numbers of people are consulting medics about problems with memory. They’re afraid that they might be showing the early signs of dementia.
Many of these people, he says, are suffering from nothing more than ‘security protection code overload’. They’re simply feeling overwhelmed by all the numbers, codes and procedures they must remember in order to function in a computer-driven age.
Along with the stress of remembering PINs and storing virtual computer manuals in our heads, there is the pressure to multi-task in an age of media overload. Not long ago, the media regulator Ofcom found that the average British child will take in nine hours of digital media per day. However, he or she will cram this into just five hours of real time through multi-tasking.
Multi-tasking is such a cool word; it suggests an ability to juggle a plethora of activities simultaneously without muffing a single one. Studies in industry have shown, though, that multi-tasking is only effective in short bursts; over longer periods it leads to mistakes and falling productivity.
Multi-tasking can be nothing more than another word for distractedness. A 2008 study at Ohio University found that young people with Facebook accounts achieved lower average grade scores than those without accounts. What's more, those on Facebook apparently spent one to five hours a week studying, while their non-Facebook peers hit the books for between 11 and 15 hours per week.[i]
Doubtless, Facebook and other social networking platforms have helped busy students to stay connected with family and friends. One wonders, though, what the impact will be as more and more of these services emerge, each offering new features, and students attempt to maintain a presence on each one.
A 2009 study of faculty members and librarians in Ontario universities found that 55 percent felt that students were less prepared for university than they had been even three years earlier.
Professors and others felt that first-year students have become less mature, relying too much on Wikipedia and expecting ‘success without the requisite effort’. Interestingly, some of their young charges agreed with their assessment.
Closer to home, the leading judge in England and Wales has declared that today’s ‘internet generation’ are not well suited to jury duty, because they find it hard to take in complex and lengthy arguments in a courtroom.
He warned that while most young people are ‘technologically proficient’, they are not good at listening, preferring to read, particularly from screens. ‘One potential problem,’ he said, ‘is whether, learning as they do in this way, they will be accustomed, as we were, to listening for prolonged periods.’[ii]
So, with their phones in one hand, tablets or e-readers on their laps and internet TV streaming video in the background, our youth are adept at multi-tasking. But does this lead to a form of learned distraction and confusion, which potentially reduces one’s capacity to listen well and to think critically and cogently?
There are those who will argue, with some justification, that the digital experience itself involves very complex thinking skills. Not least among these is the ability to remember how to use all manner of complex operating systems. Yet as more and more studies emerge showing the downsides of a growing technology dependency, it seems that the disadvantages may outweigh the benefits.
It’s a truism, but true: our children learn more watching us than listening to us. We may bemoan their lack of concentration, or their perceived laziness when it comes to some thought-processes, but they’re probably learning some of it from us. In some regions of the world, the fastest growing demographics among social network users are the over-40s and over-50s.
Entranced by the speed and ease-of-use of globalised digital technologies, we live more and more of our lives online. We learn, bank, talk and entertain ourselves via digital tools. But our reliance on them may be reducing the downtime we allow for our brains.
The University of California physiology department conducts experiments into how rest for the brain impacts learning and memory.
Assistant professor Loren Frank says, ‘Almost certainly, downtime lets the brain go over experiences it’s had, solidify them and turn them into permanent long-term memories.’ He adds that when the brain is constantly stimulated, ‘you prevent this learning process.’[iii]
Continued in Part 2.