Brain overload

Have you ever felt that you just can't learn any more, that your head is spinning and your brain must be full? And yet we are told that the brain's capacity is potentially limitless, built from around 86 billion neurons.

To understand why both of these may be true, we have to delve a little deeper into how the brain learns, or to be precise, how it manages information. In a previous blog, the learning brain, I outlined the key parts of the brain and discussed some of the implications for learning. As you might imagine this is a complex subject, but I should add a fascinating one.

Cognitive load and schemas

Building on the work of George (magic number 7) Miller and Jean Piaget's development of schemas, in 1988 John Sweller introduced us to cognitive load, the idea that there is a limit to the amount of information we can process.

Cognitive load relates to the amount of information that working memory can hold at one time

Human memory can be divided into working memory and long-term memory. Working memory, often referred to as short-term memory, is limited, capable of holding only 7 plus or minus 2 pieces of information at any one time, hence the magic number 7. Long-term memory, on the other hand, has arguably infinite capacity.

The limited nature of working memory can be highlighted by asking you to look at the 12 letters below. Take about 5 seconds. Look away from the screen and write down what you can remember on a blank piece of paper.

MBIAWTDHPIBF

Because there are more than 9 characters this will be difficult. 

Schemas – Information is stored in long-term memory in the form of schemas: frameworks or concepts that help organise and interpret new information. For example, when you think of a tree it is defined by a number of characteristics: it's green, has a trunk, and leaves at the end of branches. This is a schema. But when autumn comes, the tree is no longer green and loses its leaves, suggesting that it cannot be a tree. However, if you assimilate the new information with your existing schema and accommodate it in a revised version of how you think about a tree, you have effectively learned something new and stored it in long-term memory. By holding information in schemas, your brain can very quickly identify whether new information fits within an existing one, enabling rapid knowledge acquisition and understanding.

The problem therefore lies with working memory and its limited capacity, but if we can change the way we take in information so that it doesn't overload working memory, the whole process becomes more effective.

Avoiding cognitive overload

This is where it gets really interesting from a learning perspective. What can we do to avoid the brain becoming overloaded?

1. Simple first – this may sound like common sense: start with a simple example, e.g. 2 + 2 = 4, and move towards the more complex, e.g. 2,423 + 12,324,345. If you start with a complex calculation the brain will struggle to manipulate the numbers or find any pattern.

2. Direct Instruction not discovery – although there is significant merit in figuring things out for yourself, when learning something new it is better to follow guided instruction (teacher led) supported by several examples, starting simple and becoming more complex (as above). When you have created your own schema, you can begin to work independently.

3. Visual overload – a presentation point: avoid having too much information on a page or slide; reveal each part slowly. The secret is to break complexity down into smaller segments. This is the argument against having too much content on one page, which is often the case in textbooks. Read with a piece of paper or ruler, effectively underlining the words you are reading, then move the paper down to reveal one new line at a time.

4. Pictures and words (contiguity) – having “relevant” pictures alongside text helps avoid what’s called split attention. This is why creating your own notes with images as well as text when producing a mind map works so well.

5. Focus, avoid distraction (coherence) – similar to visual overload, remove all unnecessary images and information, keep focused on the task in hand. There may be some nice to know facts, but stick to the essential ones.

6. Key words (redundancy) – when reading or making notes don’t highlight or write down exactly what you read, simplify the sentence, focusing on the key words which will reduce the amount of input.

7. Use existing schemas – if you already have an understanding of a topic or subject, it will be sat within a schema, think how the new information changes your original understanding.

Remember the 12 characters from earlier? If we chunk them into 4 pieces of information and link each to an existing schema, you will find them much easier to remember. Here are the same 12 characters chunked down.

FBI – TWA – PHD – IBM

Each one sits within an existing schema, e.g. the Federal Bureau of Investigation, making it easier for the brain to learn the new information.
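As an aside, those 12 letters are simply the four acronyms written backwards and run together. A few lines of Python (purely illustrative; the function and variable names are my own) show the reduction from 12 separate items to 4 familiar chunks:

```python
def chunk(s, size):
    """Split a string into fixed-size groups - a crude model of how
    the brain chunks items to stay inside working memory's 7 +/- 2 limit."""
    return [s[i:i + size] for i in range(0, len(s), size)]

letters = "MBIAWTDHPIBF"     # 12 separate items - hard to hold at once
chunks = chunk(letters, 3)   # ['MBI', 'AWT', 'DHP', 'IBF'] - only 4 items
acronyms = [c[::-1] for c in chunks]
print(acronyms)              # ['IBM', 'TWA', 'PHD', 'FBI']
```

Four chunks sit comfortably within the 7 plus or minus 2 limit, which is why the second version is so much easier to remember.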

Note – the above ideas are based on Richard E. Mayer’s principles of multimedia learning.

In conclusion

Understanding more about how the brain works, in particular how to manage some of its limitations, as is the case with short-term memory, not only makes learning more efficient but also gives you confidence that the way you are learning is the most effective.

Double entry bookkeeping replaced by internet

There is an interesting question being asked at the moment, given that fact-based knowledge is so accessible using the internet, is there a case for not teaching facts at all?

According to Don Tapscott, a consultant and speaker, who specialises in organisations and technology, memorising facts and figures is a waste of time because such information is readily available. It would be far better to teach students to think creatively so that they can learn to interpret and apply the knowledge they discover online.

“Teachers are no longer the fountain of knowledge, the internet is”
Don Tapscott

Is this the solution for educators with an overfull curriculum, the result of having continually to add new content to ensure their qualification remains relevant and topical? Perhaps they can remove facts and focus on skills development? After all, it's skills that matter; knowing is useful, but it's the ability to apply that really counts ...right?

What makes you an accountant

When you start to learn about finance, you will be taught a number of underpinning foundational subjects including law, economics, costing and of course basic accounting. Sat stubbornly within the accounting section will be double entry bookkeeping. This axiom is fiercely protected by the finance community, such that anyone who questions its value or challenges its relevance will be met with pure contempt. And yet, is knowing how to move numbers around following a hugely simple rule, i.e. put a number on one side and an equivalent on the other, of any use in a world where most accounting is performed by computers and sophisticated algorithms? I am sure there will be similar examples from other professions and industries. The challenge being: do doctors really need to understand basic anatomy, or lawyers to read cases dating back to 1892?
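For anyone who has never met the rule, its symmetry can be sketched in a few lines of Python. This is a toy illustration only; the `Ledger` class and the account names are invented for the example:

```python
from collections import defaultdict

class Ledger:
    """Toy double-entry ledger: every transaction posts an equal
    debit and credit, so total debits always equal total credits."""
    def __init__(self):
        self.debits = defaultdict(float)
        self.credits = defaultdict(float)

    def post(self, debit_account, credit_account, amount):
        self.debits[debit_account] += amount
        self.credits[credit_account] += amount

    def balanced(self):
        return sum(self.debits.values()) == sum(self.credits.values())

ledger = Ledger()
ledger.post("Cash", "Sales", 500.0)      # sell goods for cash
ledger.post("Equipment", "Cash", 200.0)  # buy equipment with cash
print(ledger.balanced())                 # True - the books always balance
```

The point is not the code but the invariant: because each entry goes on both sides, the ledger cannot fail to balance, which is precisely what makes errors easy to detect.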

“Everyone is entitled to his own opinion, but not to his own facts”
Daniel Patrick Moynihan

But Knowledge is power

Daniel T. Willingham is a psychologist at the University of Virginia and the author of a number of books including Why Don't Students Like School?. His early research was on the brain, learning and memory, but more recently he has focused on the application of cognitive psychology in K-16 education.

Willingham argues that knowledge is not only cumulative, it grows exponentially. In addition, factual knowledge enhances cognitive processes like problem solving and reasoning, an argument he sets out in his article How Knowledge Helps.

Knowledge is cumulative – the more you know, the more you can learn. Individual chunks of knowledge will stick to new knowledge, because what you already know provides context and so aids comprehension. For example, knowing the definition of a bond, 'a fixed income instrument that represents a loan made by an investor to a borrower' (prior knowledge), enables the student to grasp the idea that anything fixed has to be paid by the company (the borrower) regardless of its profitability, and this is the reason debt is considered risky (new knowledge).

Knowledge helps you remember – the elaboration effect has featured in a previous blog. In essence it suggests that the brain finds it easier to remember something if it can be associated with existing information. Using the same example from above, it is easier to remember that bonds are risky if you already knew what a bond was.

Knowledge improves thinking – there are two reasons for this. Firstly, it helps with problem solving. Imagine you have a problem to solve; if you don't have sufficient background knowledge, understanding the problem can consume most of your working memory, leaving no space for you to consider solutions. This argument is based on the understanding that we have limited capacity in working memory (magic number 7), and so occupying it with grasping the problem at best slows down the problem-solving process, but at worst might result in walking away with no solution. Secondly, knowledge speeds up problem solving and thinking. People with prior knowledge are better at drawing analogies as they gain experience in a domain. Research by Bruce Burns in 2004 compared the performance of top chess players at normal and blitz tournaments. He found that what made some players better than others was differences in the speed of recognition, not faster processing skills. Players who had knowledge of prior games were far quicker in coming up with moves than those who were effectively solving the problem from first principles. Chess speed, at least, has a lot to do with the brain recognising pre-learned patterns.
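Burns's finding can be pictured with a caching analogy in code. This is only an analogy, and every name below is invented; the cache stands in for the experienced player's memory of prior games:

```python
import functools

first_principles_calls = 0

def solve_from_scratch(position):
    """Stand-in for working out a move from first principles (slow)."""
    global first_principles_calls
    first_principles_calls += 1
    return len(position) % 8  # dummy 'best move'

@functools.lru_cache(maxsize=None)
def experienced_player(position):
    """A position seen before is answered from memory (the cache)
    rather than re-solved - recognition, not faster processing."""
    return solve_from_scratch(position)

for position in ["e4 e5", "d4 d5", "e4 e5", "e4 e5"]:
    experienced_player(position)

print(first_principles_calls)  # 2 - the repeated position costs nothing
```

The repeated position is answered instantly from the cache, just as the strong players answered familiar positions from experience rather than calculation.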

Skills are domain specific – not transferable

There is one other important lesson from an understanding of knowledge – skills are domain specific. The implication is that teaching "transferable skills", i.e. skills that can be used in different areas, such as communication and critical thinking, doesn't work. A skill (Merriam-Webster) is the ability to use one's knowledge effectively and readily in execution or performance. The argument is that in order to use knowledge effectively, it needs to sit within a specific domain.
In July 2016 the Education Endowment Foundation in the UK released the results of a two-year study involving almost 100 schools that wanted to find out if playing chess would improve maths. The hypothesis was that the logical and systematic processes involved in being a good chess player would help students better understand maths, i.e. the skills would transfer. The study, however, found no significant differences in mathematical achievement between those having regular chess classes and the control group.

Long live double entry bookkeeping

This is an interesting topic and open to some degree of interpretation and debate but it highlights the difficult path curriculum designers have to tread when it comes to removing the old to make space for the new. In addition there is a strong argument to suggest that core principles and foundational knowledge are essential prerequisites for efficient learning.
But whatever happens, we need to keep double entry bookkeeping, not because knowing that every debit has a credit is important in itself, but because it helps structure a way of thinking and problem solving that has enabled finance professionals to navigate significant complexity and change since Luca Pacioli allegedly invented it in 1494.

And the case from 1893 – Carlill v Carbolic Smoke Ball Company

The independent learner – Metacognition

Metacognition is not a great word, but it's an important one when it comes to learning, especially if you are studying at higher academic levels or on your own. Cognition refers to the range of mental processes that help you acquire knowledge and understanding, or more simply, learn. These processes include the storage, manipulation and retrieval of information. Meta, on the other hand, means higher than or overarching; put the two together and we are talking about something that sits above learning, connecting it by way of thought. For this reason, it's often described as thinking about thinking or, in this context, thinking about how you learn.

Smarter not harder

When you have a lot of subject matter to learn, it may feel like a distraction to spend time learning anything other than what you must know, let alone reflecting on it. But this fits under the heading of working smarter not harder: if you can find more effective ways of learning, that must be helpful.
As mentioned earlier, cognition is about mental processes: storage and retrieval relate to memory; manipulation to the shifting of attention, changing perception and so on. The meta aspect creates distance, allowing us to become aware of what we are doing, standing back and observing how, for example, perception has changed. This reflection is a high-level skill that many believe is unique to humans. One final aspect is that we can take control of how we learn: planning tasks, changing strategies, monitoring those that work and evaluating the whole process.

Keeping it simple

It's very easy to overcomplicate metacognition; in some ways it's little more than asking a few simple questions, thinking about how you are learning, what works and what doesn't. Here are some examples of how you might do this.

  • Talk to yourself, asking questions at each stage: does this make sense? I have read it several times; maybe I should try writing it down.
  • Ask, have I set myself sensible goals?
  • Maybe it’s time to try something different, for example mind mapping, but remember to reflect on how effective it was or perhaps was not.
  • Do I need help from anyone? This could be a fellow student, or try YouTube, which is a great way to find a different explanation in a different format.

Clearly these skills are helpful for all students, but they are especially valuable when studying on your own, perhaps on a distance learning programme or engaged in long periods of self-study.

Benefits

There are many reasons for investing some time in this area.

  • Growing self-confidence – by finding out more about how you learn you will discover both your strengths and weaknesses. Confidence isn’t about being good at everything but understanding your limitations.  
  • Improves performance – research has shown that students who actively engage in metacognition do better in exams.
  • Gives control – you are no longer reliant on the way something is taught; you have the ability to teach yourself. Being an autonomous learner is also hugely motivational.
  • The skills are transferable – this knowledge will not only help with your current subjects but all that follow, not to mention what you will need to learn in the workplace.  

It will take some time initially but, in a way, metacognition is part of learning; it's an essential component, and you will end up knowing more about yourself at some point even if you don't want to, so why not do it sooner rather than later?

And just for fun – Sheldon knows everything about himself – even when he is wrong

Don't worry, be happy

It's so easy for well-meaning people to say don't worry; it's not bad advice, it's just not very helpful. Firstly, as I have mentioned in previous blogs, anything framed as a don't is difficult for the brain to process. It is far better to tell someone what to do than what not to do.

Secondly, if you look up a definition of worry it will say something like, "thinking about problems or unpleasant events that you don't want to happen but might, in a way that makes you feel unhappy and/or frightened." What a strange concept; why would anyone want to do this?

Having started but I hasten to add not yet finished the second of Yuval Noah Harari’s bestselling books Homo Deus, it’s hard not to question the reason we might have evolved to hold such a strange view. What possible evolutionary purpose could feeling bad or frightened serve?

Don't worry, be happy. In every life we have some trouble. When you worry you make it double.
Bobby McFerrin

Worry can be helpful
The truth is worry can be helpful; it's a means by which the brain helps you prioritise events. It's not a nice feeling, but ultimately humans have evolved to survive and reproduce; they are not meant to be vehicles for happiness. Think of all that goes through your head in a day: the words, the emotions, the noise. How can you possibly figure out what is important and what is not unless you have a little help? Worry does just that; it helps us think about an event in the future that might happen, and this heightened focus puts it above the events of the day, giving us a chance to do something about it.

Action is worry’s worst enemy – Proverb

Worry, stress and anxiety
Worry tends to be specific: I am worried that I won't be able to pass the maths exam on the 23rd of September. Worry is future based; it anticipates a problem that has not yet happened, mainly to make you do something about it today. Stress, on the other hand, is relatively short term and arises when there is a gap between what you need to do and what you are able to do. For example: I haven't got time to learn everything I need to pass this exam; there is just too much to learn. After the event, the stress level will fall. Anxiety is the big brother of them both; it is far more general than worry, for example: I am not very clever and never have been. You're not really sure what cleverness is, but you're still able to be anxious about it. Both stress and worry can lead to anxiety if they are intense or go on for too long.

Worry can wake you in the night, asking your brain to solve the problem. However, unless you are fully awake it's unlikely you will be able to do so; instead you will simply turn the problem over in your head again and again and deprive yourself of that all-important sleep. Best to put it to the back of your mind if possible and think of something else; the problem will feel less important in the morning, and after a good night's sleep you will be far better able to solve it.

It helps to write down half a dozen things which are worrying me. Two of them, say, disappear; about two of them nothing can be done, so it’s no use worrying; and two perhaps can be settled – Winston Churchill

What to worry about
The human mind is so creative it’s possible for it to worry about almost anything. As one worry is resolved another can appear.

  • Don’t know what to do – where do I start, what should I learn first
  • Don’t know how to do it – how can I get this into my head, what is the best way of learning?
  • Don’t know if I can do it, self-doubt – I am not clever enough. This can lead to anxiety.
  • Don’t know how long it will take, what if I don’t have enough time?

One technique to change these from unknowns to possibilities is to follow the advice of Carol Dweck who suggests you add a word to the end of the sentence – the word is YET. For example, I don’t know what to do YET! Although this may seem trivial it moves the worry from unsolvable to something that if you spend time on can be achieved.

The list of "don't knows" above are all triggers to help motivate you; they are calls to action. The only way to reduce the worry is to do something, even if, as Churchill suggests, you make a simple list. However, there are situations when you can't take action, or at least not an obvious one, perhaps when waiting for exam results. It might seem that all you can do is worry. The bad news is that putting yourself in what can feel like a permanent state of worry can result in anxiety and won't turn that fail into a pass. But all is not lost: planning for the worst whilst hoping for the best is sensible. Coming up with a plan that is achievable can remove the pressure, leaving the feeling that even if you do fail there is a way forward and you can do something about it.

We can end with another quote from Winston Churchill who I am sure had a few worries in his time.

Let our advance worrying become advance thinking and planning

The learning brain


There are a number of books that not only taught me something but helped shape the way I think and opened up a whole new world. One such book was Mapping the Mind by Rita Carter, not, as you might imagine, a book about mind mapping but about the brain. Rita Carter is a science journalist rather than a neuroscientist and understands that it's not about what she knows but what she can explain.

Having a better understanding of how the brain works will do far more than improve your grades in a biology exam; you will develop insight into why something works, not only that it does. As a result, you can be confident you are using the most effective, brain-friendly learning techniques.

The infrastructure
Rita Carter provides us with an excellent description of the brain, that it is as big as a coconut, the shape of a walnut, the colour of uncooked liver and consistency of firm jelly.

Imagine a cross section of the brain, taken from the side, alternatively look at the diagram opposite.

The cerebrum or cortex is the largest part of the human brain and is associated with higher brain function such as thought and action. It is divided into four sections.

  • Frontal lobe – associated with reasoning, planning, some speech, movement, emotions, and problem solving
  • Parietal Lobe – associated with movement, orientation, recognition, perception of stimuli
  • Occipital Lobe – associated with visual processing
  • Temporal Lobe – associated with perception and recognition of auditory stimuli, memory, and speech

The cerebellum coordinates movements such as posture, balance, and speech. Next to this is the brain stem, which includes the medulla and pons. These are the older parts of the brain and evolved over 500 million years ago. In fact, if you touch the back of your head and bring your hand forward over the top towards your nose, this effectively maps the ages in which the brain developed.

The Limbic system is largely associated with emotions but contains the hippocampus which is essential for long term memory and learning.

Synaptic gap – Cells that fire together wire together (Hebbian theory)
Although learning is complex, a large amount takes place in the limbic system because this is where the hippocampus sits. Here our memories are catalogued to be filed away in long-term storage across other parts of the cerebral cortex.

What comes next is important, because it's here, within the hippocampus, where neurons connect across what is called the synaptic gap, that learning arguably begins. Synaptic transmission is the process whereby a neuron sends a message, the result of a stimulus, across the synaptic gap to another neuron waiting to receive it. The neurons never touch; the gap is bridged by chemicals referred to as neurotransmitters, examples of which include dopamine and serotonin. These are often referred to as the body's chemical messengers.

Learning is making new connections, remembering is keeping them

When the stimulus is repeated the relationship between the neurons becomes stronger and so a memory is formed and learning has taken place. The whole process is called long term potentiation (LTP).
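The strengthening-with-repetition idea is often modelled with a simple Hebbian update rule. The sketch below is illustrative only; the learning rate and activity values are arbitrary, not physiological:

```python
def hebbian_update(weight, pre, post, learning_rate=0.1):
    """Strengthen a connection in proportion to joint pre- and
    post-synaptic activity: cells that fire together wire together."""
    return weight + learning_rate * pre * post

w = 0.0
for _ in range(5):                # the stimulus is repeated
    w = hebbian_update(w, pre=1.0, post=1.0)
print(round(w, 2))                # 0.5 - each repetition strengthens it
```

Every repetition nudges the weight upwards, a (very) simplified picture of the strengthening that long term potentiation describes.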

How does this help?
All a bit technical perhaps, but very important as it explains so much. It is the reason that repetition is so valuable; for example, if you are reading something and it's not going in, you need to fire those neurons again, but perhaps using a different stimulus. Try saying it out loud or drawing a picture alongside the text.

Don't forget the blog I wrote in January 2018 that explained brain plasticity and how the brain changes as those new neural connections are made. The related process by which the brain grows new neurons is called neurogenesis.

The neurotransmitters, those chemicals released to fill the synaptic gap, are also important, as each one is different. Serotonin, for example, is associated with mood and feelings of wellbeing, and imbalances in its levels have been linked to anxiety.

Although it's fair to say there is still much we don't understand about the brain, I hope this blog has helped remove some of the mystery of learning; it's not a magical process but a scientific one.


Dedicated to my dog Jack – our family dog and best friend

Mind Mapping – Tony Buzan, Learning leader


It was with some sadness that I read of the death of Tony Buzan last week. It's possible you have never heard of him, and yet you will be familiar with the technique he popularised to help students learn: Mind Mapping. He was born in the UK in 1942 and studied Psychology, English, Mathematics and Science at the University of British Columbia. In addition to his lifelong association with Mind Mapping, he worked for Mensa, set up the World Memory Championships in 1991 with Raymond Keene, and found time to write over 140 books. Two of them sit on my bookshelf; both furthered my knowledge and fuelled my interest in learning, memory and how the brain works. These are Use Your Head and The Mind Map Book.

Curiosity  

When Tony Buzan was at junior school, his curiosity was sparked by a boy who had an excellent knowledge of nature, in particular birds, but repeatedly failed the tests that were set in school. This led him to question what intelligence was. And although I hadn't read this at the time, it was something I had also been interested in. Society had/has somehow lost sight of the fact that people are different, falling into the trap of praising and promoting those who were "clever" and pitying those who were not. It seemed far more sensible to break intelligence down into a series of biological/neurological qualities, and in 1983, when Howard Gardner published his book on Multiple Intelligence Theory, this made perfect sense to me and suggested that Buzan was on to something.

Mind Mapping – does it work?

According to Tony Buzan, “Mind Mapping is a two-dimensional note-taking technique with which a Mind Map is made using all the relevant knowledge about a specific subject.”

I have written about how to Mind Map before, so please follow the links if you want to find out more – Mind Mapping unplugged – The De Vinci code – Mind Mapping to pass exams.

Remarkably, there is little direct evidence that Mind Mapping works; academics have focused instead on Concept Mapping, a hierarchical diagram that links conceptual knowledge. But the principles that underpin Mind Mapping are consistent with much of what we know is effective in learning.

This quote from Tony Buzan offers a deeper insight into why it works.

“I used to take formal notes in lines of blue, and underline the key words in red, and I realised I needed only the key words and the idea. Then to bring in connections, I drew arrows and put in images and codes. It was a picture outside my head of what was inside my head – ‘Mind Map’ is the language my brain spoke.”

In this narrative there are three important principles. Firstly, use only key words; this process of reduction is hugely valuable in learning. When the brain has to select one or two words it engages in a process of reflection and review, reading and re-reading, asking which one word should I pick, and why. Secondly, connections; it is well accepted that the brain finds storing unrelated chunks of information difficult, and a Mind Map requires the student to link information, in so doing forcing a connection. And lastly, arguably the most powerful, the use of images. The brain appears to have a limitless capacity to store pictures, and the brighter, more colourful and stranger the better.

In summary, it's not that Mind Mapping was invented by Tony Buzan and that before him we knew little about the best techniques to aid learning. What he did was pull together much of what we now know to be effective, taking as inspiration the drawings of Leonardo da Vinci, and create a tool that requires the student to know little of the theory behind how it works; simply preparing one engages them in a series of very effective techniques that will help them learn.

Critics

It would be wrong to suggest that everything Tony Buzan said or did was correct; he was responsible for promoting what many now recognise as pop psychology that has since been shown to be incorrect. For example:

“Did you know that you use less than 1% of your brain? The good news is that Mind Mapping can help you to access the other 99%.”

However, he also said

“Learning how to learn is life’s most important skill.”

And in this world rich with information, AI and robotics, this may be the only thing that will keep us ahead of the game.

Listen to Tony Buzan talking about Mind Maps

RIP Tony Buzan learning leader.

Making complex simple – the measure of a great teacher.

Richard Feynman, who featured in last month's blog, was known as the great explainer. This skill rested on two key qualities: the first, an intense curiosity and desire to understand the subject incredibly well; the second, an ability to make what was complex seem simple. These are of course not mutually exclusive; deep understanding is the foundation on which simplicity is built.

There was a time when getting access to knowledge was a barrier to learning. After all, how could you learn if you didn’t have the books from which to do it? But we no longer have this problem, knowledge is abundant, it is literally at the end of your fingertips.

The world's knowledge is just waiting for you to ask the right question. But how can you tell if what you're reading is shallow and without thought, or deep and profound?

Jardin's principle
In 1997 I read an article in the Financial Times written by Rob Eastaway, an English author whose books on everyday maths include Why Do Buses Come in Threes? and The Hidden Maths of Sport. For some reason the concept he outlined always stayed with me, sufficient that I wanted to track it down, which I have managed to do.

Jardin's Principle, as explained by Rob Eastaway: if you are trying to understand any subject or system, your level of understanding will pass through three stages. To start with, the way you see and describe a system (or subject) will be simplistic, i.e. over-simplified; then it will become complicated; but ultimately it will become simple again. He goes on to add that there are three other words that fit with this idea: Obvious, Sophisticated and Profound.

Make everything as simple as possible, but not simpler – Albert Einstein

Simple – Complex – Profound
As with all ideas there is more to it, below are what Rob refers to as the 5 caveats. I have added in my own thoughts and observations to some of them.

1. It is hard to differentiate between what is 'simple and profound' and what is 'simplistic and obvious'. This is one of the main problems with a process of reduction. For example, if you ask what the meaning of life is, you might be given the answer 42. The problem is knowing whether this is just a number snatched out of the air, or the correct answer, the result of hundreds of thousands of calculations undertaken over 200 years by the most sophisticated computer in the world.

2. Those at the ‘sophisticated/complicated’ level believe that there is no higher level than theirs – in other words you have to be sophisticated to understand fully. This is a clever observation on human nature, it suggests that some people believe you cannot fully appreciate a concept or idea unless you look at it through the lens of complexity. They effectively give up looking for a simpler perspective, because they don’t know one even exists.

3. You are probably wrong about the level of Jardin that you are at. An example perhaps of fish not seeing water.

4. In order to reach the profound level of understanding you usually pass through the other two levels first. This is my favourite because it shows that the route to simplicity is not easy and requires time and effort. You have to revisit your understanding many times before your brain springs into action with the blindingly obvious.

5. Unless you have a profound understanding of a subject, you will either over-complicate or over-simplify it. Perfect…..

Simple can be harder than complex: You have to work hard to get your thinking clean to make it simple. But it’s worth it in the end because once you get there, you can move mountains. – Steve Jobs

Great teaching – Taking something that is complicated and making it appear simple is in many ways the essence of great teaching. Breaking a subject down into easily understood, bite-sized chunks of information, or capturing the whole concept in one single leap through a metaphor or simple story, is genius. But the process of getting to these pearls of wisdom involves wading through the mire of complexity, in some instances for many years, before the obvious reveals itself.

What I didn't know at the time was that Rob had actually made this theory up. Not wanting to put his own name to it, he chose the French word for garden in homage to the Peter Sellers film Being There, about a simple gardener who ends up being tipped for US President.

Rob, you ask whether it will ever stick; maybe you should call it Eastaway's folly instead.