

Do experts simply know "more" than others, or is there something qualitatively different about an expert's knowledge compared to the knowledge of a non-expert?

While most of us are not aiming for an expert's knowledge in many of the subjects we study or learn about, it is worthwhile considering the ways in which expert knowledge is different, because it shows us how to learn, and teach, more effectively.

K. Anders Ericsson, the guru of research into expertise, makes a very convincing case for the absolutely critical importance of what he terms “deliberate practice”, and the minimal role of what is commonly termed “talent”. I have written about this question of talent and also about the principles of expertise. Here I would like to talk briefly about Ericsson’s concept of deliberate practice.

In the following case study, I explore in depth the issue of learning the geological time scale — names, dates, and defining events. The emphasis is on developing mnemonics, of course, but an important part of the discussion concerns when and when not to use mnemonics, and how to decide.


The Geological Time Scale

Phanerozoic Eon 542 mya—present

  Cenozoic Era 65 mya—present

The evidence that diet, physical exercise, and mental stimulation all help prevent age-related cognitive decline and reduce the risk of mild cognitive impairment and Alzheimer’s is now very convincing.

Studies of mice and (rather intriguingly) beagles have provided evidence that ‘enriched’ environments — ones that provide opportunities for regular exercise and mental stimulation — reduce or prevent age-related cognitive decline, and reduce the risk of Alzheimer’s.

Most people believe that an adult learner can't hope to replicate the fluency of someone who learned another language in childhood. And certainly there is research to support this. However, people tend to confuse these findings (that the age of acquisition affects your representation of grammar) with the idea that children can learn words vastly faster than adults. This is not true. In fact, adults have a number of advantages over children.

In 2002, a British study scanned the brains of ten "superior memorizers" — eight leading contenders in the World Memory Championships, and two individuals previously studied for their extraordinary memory accomplishments — all people who had demonstrated truly impressive feats of memory, such as the ability to quickly memorize hundreds of numbers or unrelated words. The ten "memory champions" were matched with ten controls whose memory abilities were nothing out of the ordinary.

Let me start by saying that if you're healthy and are eating a good balanced diet, there should be no need for you to take supplements. I also want to emphasize that the best way of meeting your body's needs for certain vitamins and minerals is to get them from food. In some cases, for one reason or another, this may not be possible. For example, as a (mostly) vegan, I take iron and B12 supplements, to make up for these deficiencies in my diet. Elderly adults with small appetites may also find it hard to get all the nutrients they need from their diet.

People are poor at assessing their own memory

One thing research seems to show rather consistently is that, for older adults in particular, beliefs about one's own memory performance have little to do with one's actual memory performance¹. People who believe they have a poor memory are usually no worse at remembering than those who believe they have a good memory.

Here’s an interesting study that’s just been reported: 72 seven- and eight-month-old infants watched video animations of familiar fun items being revealed from behind a set of colorful boxes (see the 3-minute YouTube video).

Just as a group of French parents and teachers has called for a two-week boycott of homework (despite the fact that homework is officially banned in French primary schools), and shortly after the British government scrapped its homework guidelines, a large, long-running British study has come out in support of homework.