Motivation by Reward and Consequence – Behaviourism

Motivation is one of the most important aspects of learning and as a result has featured in many previous blogs. In its simplest form motivation can be defined as something that you want; you want to get fit or you want to pass the exam, and as a result that want directs your behaviour. For example, if I want to pass the exam, a good behaviour would be to attempt 5 more questions.

But do we ever really know what is motivating someone? We could ask Tom Dean, the gold medal winner in the 200-metre freestyle at this year's Tokyo Olympics. What motivated him to train even harder after he contracted Covid for a second time? I'm sure he would give us an answer; the problem is that it could well be something he has constructed to explain it to himself rather than the real reason.

Maybe we should think less about the cognitive reasoning behind motivation and consider only the actions of a motivated person? It's likely Tom had a few early mornings and went through some pretty painful training sessions in order to get fit for the games, but it could be that his ability to do this is more a consequence of conditioning than of his desire for a gold medal. There is also the question as to why a gold medal is motivational; after all, it's not even gold, the medals are 92.5% silver. Interestingly, the Tokyo medals include recycled metal from electrical devices. Maybe it's because he associates it with success and/or pride, something he has been conditioned to over many years.

Behaviourism
Behaviourism is a theory of learning which states that all behaviours are learned through interaction with the environment, by a process called conditioning. The implication is that your behaviour is simply the response to a stimulus: cause and effect.

The environment shapes people’s actions. B.F. Skinner

It's highly likely you will have experienced, and even been involved in, motivating someone in this way. For example, were you ever put on the naughty step as a child, or have you told your dog to sit and rewarded him when he did? These are examples of how changing the environment results in a different behaviour. The dog is motivated to sit not because it's a lifelong ambition but because he wants the reward. Tom Dean may well have got up early to go training, but that might have more to do with the conditioning resulting from his alarm going off than with a burning desire to get out of bed.

It is effectively motivation as a result of reward and consequence: if you do something, you get something.

Classical conditioning – association
Ivan Pavlov, a Russian physiologist, discovered that dogs could "learn" to salivate at the sound of a bell that rang before they were fed. He called this classical conditioning: the dog associating the bell with food. These types of associations can be the reason people are afraid of spiders, or even of chewing gum (yes, it's real, Oprah Winfrey is a sufferer). It also explains why having a designated study area can help you feel more like studying; you associate it with getting work done. Here are a few more examples: your smartphone bleeps and you pick it up, celebrities are used to associate a product with glamour, Christmas music makes you feel Christmassy and an exam hall brings on exam anxiety.

Operant conditioning – reinforcement
In contrast to classical conditioning, operant conditioning encourages or discourages a specific behaviour using reinforcement. The argument is that a good behaviour should be reinforced by a repeated reward and a bad behaviour stopped by a repeated punishment. This type of conditioning was developed by B.F. Skinner, who famously used pigeons in what became known as "Skinner boxes".

There are four types of reinforcement (a toy simulation follows the list):

  • Positive reinforcement – The behaviour is strengthened by adding something, a reward (praise/treats/prizes), which leads to repetition of the desired behaviour e.g. "Well done, Beth, that was a great question". Here praise is added to encourage students to ask questions.
  • Negative reinforcement – The removal of something to increase the response e.g. "I can't study because everyone is shouting". The shouting stops, which encourages the behaviour of studying.
  • Punishment – The opposite of reinforcement, it adds something that will reduce or eliminate the response e.g. "That's probably the worst answer I have ever heard, Beth, were you listening at all?". Here humiliation is added, which will reduce the likelihood of students asking questions.
  • Negative punishment (Extinction) – This involves removing or taking something away e.g. "You can have your mobile phone back when you have done your homework". In this situation, removing access to the mobile phone results in the homework being completed.
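If it helps to see the idea in code, below is a minimal toy simulation of positive reinforcement. It is entirely my own sketch with made-up numbers, not anything from Skinner's experiments: each time the behaviour happens to occur and is rewarded, the tendency to repeat it is strengthened.

```python
import random

random.seed(42)  # make the toy run repeatable

def train(trials=50, reward_boost=0.05):
    """Toy operant conditioning: each rewarded 'sit' strengthens the behaviour."""
    p_sit = 0.1  # the dog rarely sits on command at first
    for _ in range(trials):
        if random.random() < p_sit:                 # the dog happens to sit...
            p_sit = min(1.0, p_sit + reward_boost)  # ...and the treat makes sitting more likely
    return p_sit

print(f"Tendency to sit after 50 rewarded trials: {train():.2f}")
```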

A person who has been punished is not thereby simply less inclined to behave in a given way; at best, he learns how to avoid punishment. B.F. Skinner

Limitations
Skinner remained convinced anything could be taught with operant conditioning and went on to invent a teaching machine using the principles of reinforcement. It required students to fill in the blank; if the answer was correct, they were rewarded, and if incorrect they had to study the correct answer again to learn why they were wrong.

Give me a child and I’ll shape him into anything. B.F. Skinner

However, there are many limitations: the motivation is not always permanent, it's too basic to teach complex concepts, punishment can lead to a reinforcement of the undesirable behaviour, and it's possible the person is just pretending.

Operant conditioning is still hugely influential in the modern world. For example, have you ever watched someone play a fruit machine, the required behaviour rewarded to extract more money? What about online gaming, where points and leader boards provide rewards in terms of status and prizes?
Then there are the ideas surrounding behavioural economics popularised by Nudge theory, which suggests that you can influence the likelihood that one option is chosen over another by changing the environment.

And finally, have you ever seen how the military train? Check out this video.

So next time you think you are making a decision of your own free will, maybe you’re just responding to an external stimulus!

Becoming a better thinker – Edward de Bono learning leader

There are a number of people who have changed the way I think, but none more than Edward de Bono, who died this month aged 88, a great example of a learning leader.

Born in 1933, he graduated as a doctor from the University of Malta before studying physiology and psychology at Oxford as a Rhodes Scholar. He represented Oxford in polo, set two canoeing records and later gained a PhD in medicine from Trinity College, Cambridge.

But he is probably best known for two things, firstly as the creator of the term lateral thinking and secondly for his six thinking hats strategy that went on to influence business leaders around the world.

Lateral thinking

To understand lateral thinking, we first need to figure out what thinking is. There are many definitions but my own take is that it’s a reflective process involving the manipulation of knowledge, feelings and experiences as we seek to connect what we know with new information, normally focused on a problem.

There are two or maybe three modes of thinking!

1. Convergent – focuses on coming up with a single, "correct" answer to a question or problem. Examples of convergent thinking include critical thinking, a logical process that involves challenging underpinning assumptions and questioning accuracy, motivation and purpose in order to make sense of a situation or solve a problem. Its origins can be traced back to Socrates, Plato and Aristotle, or as De Bono disparagingly referred to them, the gang of three. There is also analytical thinking, where you break down complex information into its component parts; this can be, and often is, used in conjunction with critical thinking. Convergent thinking is logical thinking, meaning it's rule-based, systematic and linear. If for example we concluded that 2 + 2 = 4 and we decided to add another 2, logically the answer would be 6.

But logic can still be challenging, there is a logical answer to this question yet 80% of people get it wrong.

Jack is looking at Anne, but Anne is looking at George. Jack is married, but George is not. Is a married person looking at an unmarried person?
A: Yes
B: No
C: Cannot be determined

The correct answer is A. (click here for the explanation)
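If you would rather check the logic than follow the link, here is a short brute-force sketch that simply tries both possibilities for Anne:

```python
# Jack (married) looks at Anne; Anne (status unknown) looks at George (unmarried).
for anne_married in (True, False):
    married = {"Jack": True, "Anne": anne_married, "George": False}
    looking = [("Jack", "Anne"), ("Anne", "George")]
    found = any(married[a] and not married[b] for a, b in looking)
    print(f"If Anne is {'married' if anne_married else 'unmarried'}: {found}")

# Both cases print True: a married person is looking at an unmarried
# person whatever Anne's status, so the answer is A.
```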

The need to be right all the time is the biggest bar to new ideas. Edward de Bono

2. Divergent – is the opposite of convergent and involves coming up with many possible solutions, acknowledging that there may be no single correct answer. This type of thinking is often emergent, free flowing, illogical and requires creativity. Convergent and divergent thinking can be used together, divergent to generate ideas and convergent to make sense of those ideas and find a practical application. Click here for a video that explains how these two types of thinking work together.

3. Lateral thinking – is a way of solving problems by taking an indirect and creative approach, looking at the problem from different perspectives. Although there are similarities with divergent thinking, it is not the same. Divergent thinking starts with a problem at the centre, and random ideas are generated branching outwards in all directions; lateral thinking requires the individual to come up with a solution by generating different ideas that result from changing perspective. De Bono writes that lateral thinking forces the brain to break set patterns; it's a pattern-switching technique.

Let's consider one of his examples: Granny is sitting knitting and three-year-old Susan is upsetting Granny by playing with the wool. One parent suggests putting Susan into the playpen, a relatively creative (divergent) solution. A logical (convergent) answer might be to tell Susan not to do it, but anyone with a three-year-old will know how effective that will be, not that it will stop them trying!

The other parent suggests it might be a better idea to put Granny in the playpen to protect her from Susan; this is lateral thinking, looking at the problem from a different perspective. It's illogical, because Granny is bigger and surely you don't need to protect Granny from a three-year-old, but it is still a solution and it would work.

I am reminded of a question I was once asked whilst visiting Berlin. "Why did the East Germans build the Berlin Wall?" … "To keep people in, of course." It was a prison, not a defence. It's all about perspective.

Lateral thinking is not a substitute for logical thinking, and it can be used as a way of generating new divergent solutions; they complement each other and can be used interchangeably. Lateral thinking is generative, logical thinking selective.

In summary, lateral thinking is about changing perspective…

Most of the mistakes in thinking are inadequacies of perception rather than mistakes of logic. Edward de Bono

My own personal favourite perspective story

This is the transcript of a radio conversation between a US naval ship and Canadian authorities off the coast of Newfoundland in October 1995, released by the Chief of Naval Operations on 10-10-95.

Americans: Please divert your course 15 degrees to the North to avoid a collision.

Canadians: Recommend you divert YOUR course 15 degrees to the South to avoid a collision.

Americans: This is the Captain of a US Navy ship. I say again, divert YOUR course.

Canadians: No. I say again, you divert YOUR course.

Americans: This is the aircraft carrier USS Lincoln, the second largest ship in the United States’ Atlantic fleet. We are accompanied by three destroyers, three cruisers and numerous support vessels. I demand that YOU change your course 15 degrees north, that’s one five degrees north, or countermeasures will be undertaken to ensure the safety of this ship.

Canadians: This is a lighthouse. Your call.

Lateral thinking for learning

But what has this got to do with learning? Well, learning is not just about facts and knowing stuff; the reason we go to school is to gain an understanding of a wide range of issues, concepts and ideas so that, when faced with a problem, we can manipulate and cross-check them in order to form an opinion and come up with a solution. Learning is a consequence of thinking.

Learning without thinking is useless. Thinking without learning is dangerous.
Confucius

De Bono believed that thinking was a skill that could be learned, and because lateral thinking helps people develop creative ideas, creativity could also be learned. It is not an innate trait or a type of intelligence that you are born with; it's something we all possess, we just need the techniques to do it. He did however distinguish between artistic creativity and idea creativity: Michelangelo and Shakespeare were artistically creative, lateral thinking will only ever make you idea creative.

As to the techniques, maybe they will feature in another blog, but if you can't wait, here is a short video. Beware though, De Bono was the master of the acronym.

We tend to take thinking for granted, believing we are good at it or maybe never even questioning our ability. But what De Bono made popular was the idea that it was a skill and that we can improve. We live in a time when information is more accessible and freely available than ever, so the real value has to be in what we do with it.

Thank you, Edward De Bono 1933 – 2021.

And lastly….the blog would not be complete without one of De Bono’s lateral thinking puzzles.

A man lives on the tenth floor of a building. Every day he takes the elevator to go down to the ground floor to go to work or to go shopping. When he returns, he takes the elevator to the seventh floor and walks up the stairs to reach his apartment on the tenth floor. He hates walking so why does he do it?

The man is a dwarf and can't reach the higher buttons.

Note making, not note taking – it’s about effort

I have always been a believer in the idea that much of what you need to know is accessible; the answer is staring you in the face and yet you can't always see it, maybe because you're not asking the right question or you're looking at it from the wrong perspective.

Figuring out how learning works and the best way to study can seem complicated and yet if you watch what people do when they are trying to learn and ask the right questions there is much to see.

For example, watch a group of students in a class or lecture (remember that, pre-Covid?). What do they do? Where are they looking, what are they concentrating on and, most importantly, what activity are they engaged in? The answer to this last question is easy: they will all be making notes. Going forward, these notes will become the single most important learning resource the student has.

Why is note making important?
There are two basic reasons why you make notes. Firstly, it improves concentration and cognition; making notes gives you something to do that requires attention, so you become more focussed. Secondly, you will have created a permanent record of what was said to review later. Interestingly, if you asked students, they would probably say capturing the information is the sole reason for notes, when in reality it's the effort involved in making them that matters in terms of learning.

It's worth adding that making notes works just as well from a book as it does from a lecture.

How to make notes?

Blank paper notes – The simplest form of note making is to start with a blank piece of paper. Unfortunately, research tells us that most students' notes are incomplete; on average they capture only one third of what was deemed to be important. In addition, they are often inaccurate: in one study, Crawford (1925) found that only 53% of noted information was fully correct, 45% was vague, and 2% inaccurate.

Conclusion – making notes in class is a good idea but if you use those same notes afterwards, not only will you be missing some important information but some of what is there may well be wrong.

Full notes – An alternative to a blank piece of paper is to give students a full set of notes. In 1987 a study by Kiewra and Benton showed that students who reviewed full notes achieved 17% higher scores than students who reviewed their own. This of course may not be surprising given the lack of information captured by students in the first place. Interestingly there is even some evidence to show that reviewing a full set of notes is better than attending the lecture!

Just to be clear, the best way of learning is to attend the lecture, make notes but then review a full set of notes not your own. Unless of course there is another way….

Partial, Scaffolded, Skeletal and Gapped notes
Partial notes may offer the best solution, helping keep the student engaged in class while providing them with a sufficiently complete set of notes from which to study later. Partial notes contain the main ideas but leave blank spaces for students to complete, for example producing or labelling a diagram, adding in key definitions, working calculations etc. More research in 1995 from Kiewra and Benton, this time in collaboration with Kim, Risch and Christensen, showed a marked increase in completeness, from 38% for those who used a blank piece of paper to 56% for those who were given partial notes. What we don't know from this research is the level of detail that was missing, but it makes the point.

Note taking cues – One tip for teachers, the more cuing or signposting that is deployed the better. This might involve pausing and telling students they must pay attention to a particular point or simply writing out a key phrase or definition on the whiteboard. In one study, students recorded 86% of the information written on a blackboard (Locke, 1977).

Handwritten or typed?
This is a difficult one to answer, with some research to support both forms. We know that most students can type more quickly than they can write, and as a result they should have more comprehensive notes to study from. But in 2014 Mueller & Oppenheimer cast doubt on the viability of laptop note taking. They concluded that "whereas taking more notes can be beneficial, laptop note takers' tendency to transcribe lectures verbatim rather than processing information and reframing it in their own words is detrimental to learning". In addition, laptop users did not capture diagrams that well, thought to be the result of the difficulty of doing this digitally. Copying and pasting certainly captures information, but it is a relatively mindless activity and leads to a certain amount of unnecessary information being recorded which is of little value.

Conclusion
It would be wrong to conclude that making notes on a computer is worse than writing them out by hand. It's more that a computer makes it easier for students to disengage or become distracted, and if that happens, the learning is less effective. To a certain extent learning has to be difficult; it's all about the effort, the more you try the more you learn.

We can however say that partial notes are a very good compromise, offering the best of both worlds, helping students capture sufficient information to review later but requiring them to concentrate whilst sitting in class.

I’m now off to — in the —— and have a cold —-

For more links to the research, here is an excellent summary: Note-taking: A Research Roundup by Jennifer Gonzalez – Cult of Pedagogy

Who are you when learning? – Personality

Being a very agreeable kind of person, I was encouraged to find a piece of research that appears to be unanimously supported in terms of evidence as to its validity: the Five Factor Model, or the Big Five model of personality. Developed by McCrae and Costa in 1987, it simplifies personality, suggesting that we are all biologically predisposed towards the following five traits: Openness, Conscientiousness, Extraversion, Agreeableness and Neuroticism, easily remembered by the acronym OCEAN.

For those who are curious and would like to learn more about themselves, there is a test at the end of the blog to help you identify your preferences. But some may have little interest in finding out how well they perform; the reason for these two different attitudes could well be personality. What should however be of interest to everyone, partly because you are reading this blog, is what personality has to do with learning.

What is personality?
The term personality is derived from the Latin word 'persona', meaning mask or character. An actor might for example wear a mask (persona) to promote a particular quality in a character as part of a performance, or simply use it so as not to reveal too much of themselves.

The term is now more commonly used to describe an individual’s characteristics, patterns of thoughts, feelings and behaviours. In other words, your personality is what makes you, you!

Although intelligence is one of the strongest predictors of student success, there is evidence to show that personality is also responsible for individual differences in how well people learn.

“intellectual ability refers to what a person can do, whereas personality traits may provide information on what a person will do”.
O’Connor & Paunonen, 2007 and Furnham and Chamorro-Premuzic 2004

The Five Factor model
Many people will have heard of the Myers-Briggs Type Indicator, MBTI for short; it's one of the most well-known profiling techniques. In fact, you may well have been asked to take an MBTI test at some point in your career, although strictly it's not a test as there is no right or wrong answer. It requires the completion of a self-report questionnaire in an attempt to capture how people perceive the world, gather information and make decisions. It's based on Jung's theory of psychological types. The main problem with MBTI is that it's binary by design, meaning that a person is either an introvert or an extrovert, which on one level is helpful because it gives you an answer, but even Jung admitted that "there is no such thing as a pure extrovert or a pure introvert."

In contrast the Five Factor Model provides its results in the form of a measure as to how much of the trait you possess, it’s a personality trait rather than a type. But although this is more accurate it’s difficult to interpret, for example you may be 55% agreeable, but what conclusions can you draw from that?
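To make the idea of a trait score a little more concrete, here is a minimal sketch of how trait questionnaires are typically scored: average the Likert ratings, reverse-scoring any negatively keyed items. The items, numbers and function below are hypothetical, not the actual Five Factor instrument.

```python
def trait_score(responses, reverse_keyed=(), scale_max=5):
    """Average 1..scale_max Likert responses into a 0-100 trait score,
    reverse-scoring negatively keyed items first."""
    adjusted = [(scale_max + 1 - r) if i in reverse_keyed else r
                for i, r in enumerate(responses)]
    mean = sum(adjusted) / len(adjusted)
    return 100 * (mean - 1) / (scale_max - 1)

# Four hypothetical agreeableness items; item 3 ("I find it easy to
# criticise others") is reverse-keyed.
print(f"Agreeableness: {trait_score([4, 3, 4, 2], reverse_keyed={3}):.0f}%")
```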

However, many academics and practitioners consider the five-factor model superior partly because there is a lack of evidence to support the MBTI and the results can be unreliable. If you retake the test after 5 weeks, there is a 50% chance you will fall into a different type.

OCEAN

Openness to experiences – this personality trait denotes how receptive you might be to new ideas and new experiences, a willingness to try out the unknown. People who have low levels are generally sceptical about the unknown and happy with the status quo. It might be worth adding that there is no opposite to being open, you are not for example a closed person, just less open.

Conscientiousness – Individuals who are conscientious are able to control their impulses. They are more likely to be successful both in the classroom and in their careers, largely because they are organised, hardworking and determined in the pursuit of their goals.  People with low conscientiousness have a tendency to procrastinate and deviate from their objectives. They can also be impetuous and impulsive. As with openness there is no opposite, you are just less conscientious.

Extraversion – Unlike the above you can be introverted. However, the term introvert refers to where you get your energy from and has nothing to do with being shy. Extroverts gain their energy from activities and other people whereas Introverts prefer the world of ideas and internal thoughts.

Agreeableness – if you are agreeable, you are more likely to get along with others and be cooperative. People on the low end of agreeableness can at times be blunt and sometimes even rude, although they will probably view themselves as being honest and not afraid to call “a spade a spade”.

Neuroticism – this refers to how emotionally stable you are as a person. It often manifests itself in being confident and comfortable in your own skin, as opposed to suffering from anxiety, worry and low self-esteem. Instinctively it feels as if this is the worst trait to be strong in, and you would be right. But we all have some aspects of neuroticism, and higher levels are often associated with people who are very creative.

Personality and the connection with learning
It may come as no surprise that the research identifies two personality traits as being the most important from a learning perspective, conscientiousness as a positive and neuroticism as a negative. Students who are conscientious perform well academically whilst those that display higher levels of neuroticism can sometimes struggle, for example they are more likely to suffer from test anxiety and self-doubt. This particular aspect of personality might go some way to explaining why “clever” students don’t do so well.

Conscientious learners are more likely to engage in and succeed at learning.

Students with higher levels of Anxiety (a quality of Neuroticism) will face greater learning challenges than less Anxious students.

It was also found that agreeableness and openness helped students academically, suggesting that in addition to being conscientious, cooperation and inquisitiveness were also of value.

Interestingly, some research has even shown that personality accounts for a greater part of the variance in academic achievement over and above intelligence (1), and that personality may be better at predicting academic success at the post-secondary levels of education (2).

However, the more important message is that few of us sit at the extreme of any of these personality traits; as individuals we have elements of them all. And by recognising that we have a weakness in one and a strength in another, we can adapt, whilst at the same time acknowledging that these traits are important because they are what makes us who we are.

Take the test – if you are open and conscientious you may want to find out more about your personality.

And here is a fun quiz (the Buzz quiz) based on MBTI, popularised by long-time careers adviser David Hodgson. Rather than a four-letter code, David's idea is to present the results in the form of an animal.

1 (Bratko et al. 2006; Gilles and Bailleux 2001; Noftle and Robins 2007; Poropat 2009)
2 (Conard 2006; Di Fabio and Busoni 2007; Furnham and Chamorro-Premuzic 2004; Furnham et al. 2003; Petrides et al. 2005).

Feedback – The breakfast of champions

An interesting piece of research came out recently that referred to something called "temporary mark withholding". This, as the name might suggest, is providing students with written feedback but without marks. On the face of it this might seem odd and frankly unhelpful; how can you judge your performance if you don't know how you compare against what is expected?

To answer that question, you need to ask a far more fundamental one, what’s the purpose of giving feedback in the first place?

Feedback – task or ego
We need to separate feedback from criticism, which often implies that the person giving it is trying "to find fault", although it's possible to make it sound a little more positive by calling it constructive criticism. In simple terms, criticism is more about what was wrong in the past whilst feedback directs you towards what you should do to improve in the future. But when we are thinking in terms of learning it gets a little more complicated; Dylan Wiliam talks about whether feedback is ego-involving or task-involving. The first of these would include offering praise such as "well done, you have produced an excellent answer", but he states this is rarely effective and can actually lower achievement. However, when the feedback focuses on what the student needs to do to improve, and explains how they can do it, then you get a significant impact on student achievement.

He goes on to say that "good feedback causes thinking, the first thing a student needs to do when they receive feedback is not to react emotionally, not disengage – but think". It might be worth adding that Dylan Wiliam is talking about the impact of feedback on student learning, not on how the student might feel in terms of motivation, self-confidence etc. There is clearly a place for ego-type feedback; it's just not that effective when sat alongside a direct instruction, because the emotional response often blocks or detracts from what needs to be understood for the student to improve.

Formative and Summative assessment
There is one last piece of information that will help us make sense of the reasons why temporary mark withholding might work, the difference between formative and summative assessment.

Summative – The purpose of summative assessment is to "sum up" student learning at the end of a chunk of learning or on completion of a course and compare it against a standard, normally a pass rate. This is why exams are often criticised; it's not that testing is bad, it's how the results are used, often polarising and narrowing opinion as to an individual's performance: pass and you're a hero, fail and you're a villain. It gets worse when you then put those results into a league table and publish them, with the winners at the top and the losers at the bottom, for all to see and draw often incorrect conclusions.

Summative assessment is however valuable; if you score below the target, it tells you that more effort or work is needed, and that you are not performing well on a particular topic, but it provides no guidance as to what you need to do to improve.

Formative – The purpose of formative assessment is to monitor progress on an ongoing basis in order to help the teacher identify the "gap" between what the student knows and what they need to know. This is where the magic happens: firstly in finding out where the gap is, e.g. where is the student currently compared to where they need to be, then figuring out the best way of getting them to that higher standard, e.g. what do they need to do to improve. Formative assessment can be a test, a quiz or simply observation.

Lessons for students
And this is why holding back the marks works. What the research highlighted is that when students get their marks, they effectively prioritise the grades over the written comments. The good students ignore the comments because they don't think they have anything to learn, and the weaker students are demotivated so also ignore them.

The key point for students is this: by all means look at the mark, but resist the emotional (ego) reaction to pat yourself on the back or beat yourself up. Read all the comments with an open mind, asking two simple questions: can I see that there is a gap between my answer and the model answer, and do I know exactly what to do next to close it? The feedback, if it is good, should of course make this as easy a process as possible.

The fact that your script might only say "see model answer", or have a cross with the correct number written next to it, is an example of poor marking with little or no feedback. Perhaps you should return your script, providing the marker/teacher with some feedback highlighting the gap between good marking and bad marking, but most importantly what they should do to improve…

And if you're interested, here is the link to Dylan Wiliam explaining the importance of formative assessment.

References – Kuepper-Tetzel & Gardner; Jackson & Marks, 2016; Taras, 2001; Winstone et al., 2017; Ramaprasad, 1983

The single most important thing for students to know – Cognitive load

Back in 2017 Dylan Wiliam, Professor of Educational Assessment at UCL, described cognitive load theory (CLT) as 'the single most important thing for teachers to know'. His reasoning was simple: if learning is an alteration in long-term memory (OFSTED's definition), then it is essential for teachers to know the best ways of helping students achieve this. At this stage you might find it helpful to revisit my previous blog, Never forget, improving memory, which explains more about the relationship between long and short-term memory, but to help reduce your cognitive load… I have provided a short summary below.

But here is the point, if CLT is so important for teachers it must also be of benefit to students.

Cognitive load theory
The term cognitive load was coined by John Sweller in a paper published in the journal Cognitive Science in 1988. Cognitive load is the amount of information that working/short-term memory can process at any one time; when the load becomes too great, processing information slows down and so does learning. The implication is that because we can't do anything about the short-term nature of short-term memory (we can only retain around 4 ± 2 chunks of information before it's lost), learning should be designed, or studying methods changed, accordingly. The purpose is to reduce the 'load' so that information can more easily pass into long-term memory, where the storage capacity is effectively infinite.
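One way to picture those 'chunks' is the example below. The letter strings are my own illustration, not one of Sweller's: twelve raw letters overwhelm working memory, but grouped into familiar acronyms they collapse into just four items.

```python
letters = list("BBCITVCNNFBI")          # 12 separate items: too many to hold at once
chunks = ["BBC", "ITV", "CNN", "FBI"]   # the same letters as 4 familiar chunks

assert "".join(chunks) == "".join(letters)  # nothing added, nothing lost
print(f"{len(letters)} items as raw letters, {len(chunks)} items as chunks")
```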

CLT can be broken down into three categories:

Intrinsic cognitive load – this relates to the inherent difficulty of the material or complexity of the task. Some content will always have a high level of difficulty; for example, solving a complex equation is more difficult than adding two numbers together. However, the cognitive load arising from a complex task can be reduced by breaking it down into smaller and simpler steps. There is also evidence to show that prior knowledge makes the processing of complex tasks easier. In fact, it is one of the main differences between an expert and a novice: the expert requires less short-term memory capacity because they already have knowledge stored in long-term memory that they can draw upon. The new knowledge simply adds to what they already know. Bottom line – some stuff is just harder.

Extraneous cognitive load – this is the unnecessary mental effort required to process information for the task in hand, in effect the learning has been made overly difficult or confusing. For example, if you needed to learn about a square, it would be far easier to draw a picture and point to it, than use words to describe it. A more common example of extraneous load is when a presenter puts too much information on a PowerPoint slide, most of which adds little to what needs to be learned. Bottom line – don’t make learning harder by including unimportant stuff.

Germane cognitive load – increasing the load is not always bad, for example if you ask someone to think of a house, that will increase the load but when they have created that ‘schema’ or plan in their mind adding new information becomes easier. Following on with the house example, if you have a picture of a house in your mind, asking questions about what you might find in the kitchen is relatively simple. The argument is that learning can be enhanced when content is arranged or presented in a way that helps the learner construct new knowledge. Bottom line – increasing germane load is good because it makes learning new stuff easier.

In summary, both student and teacher should reduce intrinsic and extraneous load but increase germane.

Implications for learning
The three categories of cognitive load shown above provide some insight as to what you should and shouldn’t do if you want to learn more effectively. For example, break complex tasks down into simpler ones, focus on what’s important and avoid unnecessary information and use schemas (models) where possible to help deal with complexity. There are however a few specifics that relate to the categories worthy of mention.

The worked example effect – If you are trying to understand something and continual reading of the text is having little impact, it's possible your short-term memory has reached capacity. Finding an example of what you need to understand will help free up some of that memory. For example, if I wanted to explain that short-term memory is limited, I might ask you to memorise these 12 letters: SHNCCMTAVYID. Because this exceeds the 4 ± 2 rule it will be difficult, and hopefully as a result prove the point. In this situation the example is a far more effective way of transferring knowledge than pages of text.

The redundancy effect – This is most commonly found where there is simply too much unnecessary or redundant information. It might be irrelevant or not essential to what you're trying to learn. In addition, it could be the same information presented in multiple forms, for example an explanation and a diagram on the same page. The secret here is to be relatively ruthless in pursuing what you want to know; look for the answer to your question rather than getting distracted by adjacent information. You may also come across this online, where a PowerPoint presentation has far too much content and the presenter simply reads out loud what's on the slides. In these circumstances, it's a good idea to turn down the sound and simply read the slides for yourself. People can't focus when they hear and see the same verbal message during a presentation (Hoffman, 2006).

The split attention effect – This occurs when you have to refer to two different sources of information simultaneously when learning. Often in written texts and blogs, as I have done in this one, you will find a reference to something further to read or listen to. Ignore it and stick to the task in hand; grasp the principle and only afterwards follow up on the link. Another way of reducing the impact of split attention is to produce notes that reduce the conflict that arises when trying to listen to the teacher and make notes at the same time. You might want to use the Cornell note-taking method, click here to find out more.

But is it the single most important thing a student should know?
Well, maybe, maybe not, but it's certainly in the top three. The theory on its own will not make you a better learner, but it goes a long way in explaining why you can't understand something despite spending hours studying. It provides guidance as to what you can do to make learning more effective, but most importantly it can change your mindset from "I'm not clever enough" to "I just need to reduce the amount of information, and then I'll get it".

And believing that is priceless, not only for studying towards your next exam but in helping with all your learning in the years to come.

Motivated ignorance – is ignorance better than knowing?

If it’s true that the cat wasn’t killed by curiosity and that ignorance was to blame (see last month’s blog) then it follows that we should better educate the cat if it is to avoid an untimely death. But what if the cat chooses to remain ignorant?

Ignorant – lacking knowledge or awareness in general; uneducated or unsophisticated.

In a paper published last February, Daniel Williams puts forward a very challenging and slightly worrying proposition: that when the costs of acquiring knowledge outweigh the benefits of possessing it, ignorance is rational. In simple terms this suggests that people are not "stupid", or ignorant, when they are unaware of something; they are in fact being logical and rational, effectively choosing not to learn.

“Facts do not cease to exist because they are ignored.” – Aldous Huxley

Beware the man of a single book – St. Thomas Aquinas
In terms of education this is clearly very important, but it has far wider implications for some of the challenges we are facing in society today. There is an increasing divergence in opinion across the world, with people holding diametrically opposite views, each believing the other is wrong. We can probably attach personas to these groups: on the one side there are the knowledgeable and well educated, on the other those who may not be in possession of all the facts but trust their emotions and believe in community and identity. The two groups are clear to see: those that believe in climate change and those that don't, Trump supporters and anyone-but-Trump supporters, pro-vaccine and anti-vaccine.

The stakes could not be higher.

“Ignorance is a lot like alcohol. The more you have of it, the less you are able to see its effect on you.” – Jay Bylsma

Motivated ignorance
The idea that choosing to be ignorant could be both logical and rational is not new. In his book An Economic Theory of Democracy, first published in 1957, Anthony Downs used the term "rational ignorance" for the first time, to explain why voters chose to remain ignorant about the facts because their vote wouldn't count under the current political system. The logic being that it is rational to remain ignorant if the costs of becoming informed, in this case the effort to read and listen to all the political debate, outweigh the benefits, of which the voters saw none.

“If you think education is expensive, try ignorance.” – Robert Orben

Daniel Williams is making a slightly different point; he argues that motivated ignorance is a form of information avoidance. The individual is not remaining ignorant because the costs of obtaining the information are too high; they are actively avoiding knowledge for other reasons. He also goes on to say that if you are avoiding something, it follows that you were aware of its existence in the first place, what the former US Secretary of Defense Donald Rumsfeld so eloquently referred to as a known unknown.

We need one final piece of the jigsaw before we can better understand motivated ignorance, and that is motivated reasoning. Motivated reasoners reach pre-determined conclusions regardless of the evidence available to them. This is subtly different to confirmation bias, which is the tendency to only notice information that coincides with pre-existing beliefs and ignore information that doesn't.

If motivated reasoning is the desire to seek out knowledge to support the conclusions you want, motivated ignorance is the opposite, it is the desire to avoid knowledge in case it gives you the “wrong” answer. For example, although you might feel ill, you avoid going to the doctors to find out what’s wrong because you don’t want to know what the doctor might say.

The question that we should ask is: why don't you want to know the answer? The implication here is that something is stopping you; in this instance perhaps the emotional cost of the doctor's prognosis is greater than the gain. Similar examples can be found in other domains: the husband who doesn't ask as to his wife's whereabouts because he is afraid she is having an affair and doesn't want it confirmed, although in reality she might have just been late-night shopping!

“If ignorance is bliss, there should be more happy people.” – Victor Cousin

The idea that we should always seek out knowledge to be better informed clearly has its limitations; far from being illogical, motivated ignorance has some degree of rationality.

What have we learned?
Human beings do not strive to answer every question, nor have within their grasp all the knowledge that exists. We are selective, based on how much time we have available, how we might like to feel and, in some instances, the social groups we would like to belong to. There is always a sacrifice or trade-off for knowledge, and sometimes the price might be considered too high.

The answer to ignorance is not to throw more information at the problem in an attempt to make the ignorant more enlightened. If you don't believe in climate change, not even a well-crafted documentary by David Attenborough is likely to help if the motivation for choosing ignorance is not addressed. This oversupply of information was evident in the Brexit debate here in the UK. For those who had "made up their mind", very powerful arguments from equally powerful captains of industry as to why leaving Europe was a bad idea failed to educate, because most chose not to listen.

The role of education and learning has to be inspiration and curiosity, we need to get closer to those underlying motivational barriers and break them down. We have to help people appreciate the feeling you get as a result of challenging your views and coming out the other side with a better and possibly different answer. There is a need to move away from the competitive nature of right and wrong and the idea that changing your mind is a sign of weakness.

“When the facts change, I change my mind. What do you do, sir?”- attributed to J Maynard Keynes

And maybe we have to accept that although there is a price to pay, whatever it is, it will be worth it.

“no people can be both ignorant and free.” – Thomas Jefferson

If it wasn’t curiosity, what did kill the cat?

In 2006 Professor Dr. Ugur Şahin, an oncologist, was working on a curiosity-driven research project to find out if it might be possible to develop a vaccine to control and destroy cancerous tumours by activating the body's own immune system. This approach was fundamentally different to the more common treatments of radiation and chemotherapy. Curiosity-driven projects often have no clear goal but allow scientists to take risks and explore the art of the possible.

In 2008 Dr. Ugur Sahin and his wife Ozlem Tureci founded a small biotech company called BioNTech, which you may never have heard of if it wasn't for COVID-19, because together with Pfizer, BioNTech are the suppliers of the first Covid vaccine to be used in the UK. That early curiosity-driven research in 2006 provided Sahin and Tureci with the answers to our 2020 problem.

Curiosity is the wick in the candle of learning – William Arthur Ward
Curiosity is the desire to know or learn something in the absence of extrinsic rewards. The point being, there is no reward other than the answer itself. It is a psychological trait and, because of that, has a genetic component; some people are just born more curious. However, nurture has an equally important role to play, and although it's argued you can't teach curiosity, you can encourage people to become more curious by using different techniques (see below).

Sophie von Stumm, a professor of Psychology in Education from the University of York believes that curiosity is so important in terms of academic performance that it should sit alongside intelligence and effort (conscientiousness) as a third pillar. Her research found that intelligence, effort and curiosity are key attributes of exceptional students.

Curiosity follows an inverted U-shape when shown in graphical form. Imagine a graph with knowledge along the horizontal axis and curiosity on the vertical. When we first come across a new subject, we know very little, and our curiosity rises, as does the level of dopamine; but as we find out more and more, our curiosity reaches a peak before ultimately falling.
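For the visually minded, here is a toy rendering of that curve. The knowledge × (1 − knowledge) formula is simply my own stand-in for an inverted U, not a model taken from the research.

```python
import numpy as np
import matplotlib.pyplot as plt

knowledge = np.linspace(0, 1, 100)       # 0 = complete novice, 1 = expert
curiosity = knowledge * (1 - knowledge)  # rises, peaks in the middle, then falls

plt.plot(knowledge, curiosity)
plt.xlabel("Knowledge of the subject")
plt.ylabel("Curiosity")
plt.title("Curiosity peaks at partial knowledge")
plt.show()
```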

“When you’re curious you find lots of interesting things to do.” Walt Disney

Curiosity types – it would be far too simplistic to think that there is only one type of curiosity. Mario Livio, an astrophysicist, talks about a few of them in his book Why? What Makes Us Curious.

  • Epistemic curiosity is the one we have been talking about so far and relates to the type of curiosity that drives research and education. It’s generally a pleasurable state, the result of a release of dopamine that comes from mastery and the anticipation of reward.
  • Perceptual curiosity is primal and exists on a continuum between fear and satisfaction, it’s the curiosity we feel when something surprises us or when we get an answer that doesn’t quite fit with what we expected. The motivation is to seek out something novel although the curiosity will diminish with continued exposure.
  • Diversive curiosity is transient and superficial and is often experienced when swiping through your Twitter feed. It's effectively a means of jumping from topic to topic and normally fails to result in any form of meaningful insight or understanding.

You might think that as we grow older we become less curious, simply because we know more. However, although we may lose some elements of diversive curiosity, or the ability to be surprised, research shows that epistemic curiosity remains roughly constant across all age groups.

But why?
The roots of curiosity can be traced back to a form of neoteny, an evolutionary condition that means that although we reach maturity, we retain juvenile characteristics. Effectively we are more childlike than other mammals, continuing to be curious and playful throughout our lives. You can often tell if people are curious by looking at their eyes, which will become more dilated. This indicates that noradrenaline, a neurotransmitter, has been released in the brainstem's locus coeruleus, the part of the brain most strongly linked to arousal, vigilance and attention. In addition, noradrenaline is also integral to a number of higher cognitive functions ranging from motivation to working memory, and is therefore hugely valuable for learning.

This may well be a slightly complicated way of saying that if you are curious about something, you are more likely to pay attention, making it easier to remember and in so doing learn.

How to become more curious

“Millions saw the apple fall, but Newton asked why.” Bernard Baruch

Research into curiosity has confirmed some of what we might have already assumed to be correct; for example, a paper published in 2009 concluded that people were more likely to recall answers to questions they were especially curious about. However, it also showed that curiosity increased when answers were guessed incorrectly, suggesting that surprise was a factor in improved retention.

“I know you won’t believe me, but the highest form of human excellence is to question oneself and others.” Socrates

The concept that curiosity is based on an information gap was first put forward by George Loewenstein in 1994, which leads to one of the most powerful tools we can use to improve curiosity: asking questions. The best question to ask is probably WHY, but don't forget Kipling's other five honest serving men: WHAT, WHEN, HOW, WHERE and WHO. Below are a few more ideas.

  • Ask Socratic questions. This involves asking open ended questions that provoke a meaningful exploration of the subject, this process sits at the heart of critical thinking.
  • Create environments that promote curiosity. Challenges that need solving require a curious mind. Case studies are also of more interest, providing several different routes to explore.
  • Guess the answer first. As mentioned above, if you guess first it increases the surprise factor. Loewenstein also argued that guessing with feedback stimulates curiosity because it highlights the gap between what you thought you knew and the correct answer.
  • Failure is feedback. Finding out why you got something wrong can be just as interesting as knowing that you are right, it certainly increases curiosity.
  • Start with the curious part of a subject. You may not be curious about the whole subject, but try to find the part you are interested in and start there.

And if you would like to find out more

What’s the answer, what did kill the cat?

It was IGNORANCE…

Learning is emotional

We are all emotional; it's part of what it means to be human. Your emotions help you navigate uncertainty and experience the world. For some it's even considered an intelligence, requiring the ability to understand and manage your own emotions, as well as those of others.

For many years, emotions were considered something that "got in the way" of learning, effectively disrupting its efficiency, but it is now believed that emotion has a substantial influence on cognitive processes, including perception, attention, memory, reasoning and problem solving.

Emotions, feelings and mood

In last month's blog I outlined how sensory input impacts memory, and the story continues, because memories are a key part of emotion and both are found in something called the limbic system, a group of interconnected structures located deep within the brain. The limbic system plays an important part in controlling emotional responses (hypothalamus), coordinating those responses (amygdala) and laying down memories (hippocampus).

There is no single definition of emotion that everyone agrees upon. What we do know is that it relies upon the release of chemicals in response to a trigger, which in turn leads to three distinct phases. Firstly, a subjective experience, perhaps a feeling of anger, although not everyone would necessarily respond in the same way to the same stimulus. Secondly, a physiological response, for example raised blood pressure or increased heart rate. And lastly, a behavioural or expressive response, a furrowing of the brow, showing of teeth etc.

Although emotions are not believed to be hard-wired, in the 1970s Paul Ekman identified six emotions that were universally experienced in all human cultures: happiness, sadness, disgust, fear, surprise and anger. This list has however been expanded to include others, for example shame, embarrassment and excitement.

Feelings, on the other hand, arise from emotions; they are a conscious interpretation of the stimulus, asking questions as to what it might mean. Some refer to feelings as the human response to emotions. And finally there are moods, which are more general and longer term; an emotion might exist for a fraction of a second, but moods can last for hours, even days, and are sometimes a symptom of more worrying mental health issues. In addition, moods are not necessarily linked to a single event but are shaped by different events over time.

Impact on learning

Understanding what this means for students and educators is complex and in a short blog it’s only possible to introduce the subject. But there are a few lessons we can learn.

  • Emotions direct attention – if students can make an emotional connection with what they are learning it will improve levels of concentration and enjoyment.
  • Consider the emotional environment – the emotional context in which information is delivered can help students experience more positive emotions such as happiness and one of the most powerful emotions in learning, curiosity.
  • Avoid negative emotions – students who are in a continual state of anxiety or fearing failure whilst learning will find concentrating and retaining information difficult. This is partly the result of the brain going into its fight or flight mode which effectively narrows its focus to the task in hand.
  • Emotional state is contagious – the emotional state of the teacher can have a significant impact on students.
  • Memory and emotions are bound together – emotions have a considerable influence on memory. This is why we remember more emotionally charged events such as September 11 or the London Bridge attack in 2017.

And if you would like to find out more: How do emotions impact learning.

Dedication – in a lifetime we will all experience many emotions some good, some bad, but none are as powerful or more gratefully received than a mother’s love, for my mom.

Fairness and mutant algorithms

Back in 2014, I wrote two blogs (part 1 & part 2) about examinations and asked if they were fit for purpose. The conclusion: they provide students with a clear objective to work towards, the process is scalable and the resulting qualification is a transferable measure of competency. They are of course far from perfect; exams do not always test what is most needed or valued, and when results are presented in league tables they give too simplistic a measure of success.

However, I didn't ask if examinations were fair, that is, treating students equally without favouritism or discrimination.

In the last two weeks the question around fairness has been in the headlines following the government’s decision to cancel all A level and GCSE examinations in order to reduce the risk of spreading Covid-19. Whilst many agreed with this it did cause a problem, how could we fairly assess student performance without an examination?

Are examinations fair?

This is not a question about the fairness of an exam as a type of assessment; there are, for example, other ways of testing ability, course work, observations etc. It's asking whether the system of which an examination is part treats all students equally, without bias.

In the world of assessment, exams are not considered sufficiently well designed unless they are both reliable and valid. It might be interesting to use this as a framework to consider the fairness of the exam system.

  • Validity – the extent to which it measures what it was designed to measure, e.g. asking students to add 2+2 to assess mathematical ability.
  • Reliability – the extent to which it consistently and accurately measures learning. The test needs to give the same results when repeated, e.g. adding 2+2 is just as reliable as adding 2+3: the better students will get them both right and the weaker students both wrong (see the sketch below).
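
For anyone curious how reliability might be quantified in practice, here is a minimal sketch using test-retest correlation, one common approach; the marks are invented for illustration, and no exam board works quite this simply.

```python
# A minimal sketch of one way to quantify reliability: the test-retest
# correlation. All marks below are invented for illustration.
from statistics import correlation  # Python 3.10+

# Hypothetical marks for the same five students sitting two
# versions of the "same" exam.
sitting_1 = [55, 62, 70, 48, 81]
sitting_2 = [58, 60, 73, 45, 79]

# A coefficient near 1 suggests the exam ranks students consistently;
# a low value suggests the result depends on which paper you happened to sit.
print(f"test-retest reliability: {correlation(sitting_1, sitting_2):.2f}")
```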

The examining bodies will be very familiar with these requirements and have controls in place to ensure the questions they set are both valid and reliable. But even with sophisticated statistical controls, writing questions and producing an exam of the same standard over time is incredibly difficult. Every year the same questions are asked: have students performed better, or is it just grade inflation? Were A levels in 1951 easier or harder than today's? It's the reliability of the process that is most questionable.

If we step away from the design of the exam to consider the broader process, there are more problems. Because there are several awarding bodies, AQA, OCR, Edexcel to name but three, students are by definition sitting different examinations. And although this is recognised and partly dealt with by adjusting the grade boundaries, it’s not possible to completely eliminate bias. It would be much better to have one single body setting the same exam for all students.

There is also the question of comparability between subjects: is, for example, A level Maths the same as A level General Studies? Research conducted by Durham University in 2006 concluded that a pupil would be likely to get a pass two grades higher in "softer" subjects than harder ones. They added that "from a moral perspective, it is clear this is unfair". The implication being that students could miss out on university because they have chosen a harder subject.

In summary, exams are not fair, there is bias, and we haven't even mentioned the impact of the school you go to or the increased chances of success the private sector can offer. However, many of these issues have been known for some time and a considerable amount of effort goes into trying to resolve them. Examinations also have one other big advantage: they are accepted and, to a certain extent, the trusted norm, and as long as you don't look too closely they work, or at least appear to. Kylie might be right, "it's better the devil you know"... than the devil you don't.

The mutant algorithm

Boris Johnson is well known for his descriptive language, this time suggesting that the A level problem was the result of a mutant algorithm. But it was left to Gavin Williamson the Secretary of State for Education to make the announcement that the government’s planned method of allocating grades would need to change.

"We now believe it is better to offer young people and parents certainty by moving to teacher assessed grades for both A and AS level and GCSE results"

The government has come in for a lot of criticism and even their most ardent supporters can’t claim that this was handled well.

But was it ever going to be possible to replace an exam with something that everyone would think fair?

Clarification on grading

To help answer this question we should start with an understanding of the different methods of assessing performance.

  1. Predicted Grades (PG) – predicted by the school based on what they believe the individual is likely to achieve in positive circumstances. They are used by universities and colleges as part of the admissions process. There is no detailed official guidance as to how these should be calculated and in general they are overestimated: research from UCL showed that the vast majority of grades, some 75%, were over-predicted.
  2. Centre Assessed Grades (CAG) – these are the grades which schools and colleges believed students were most likely to achieve had the exams gone ahead. They were the original data source for Ofqual's algorithm and were based on a range of evidence including mock exams, non-exam assessment, homework assignments and any other record of student performance over the course of study. In addition, a rank order of all students within each grade for every subject was produced in order to provide a relative measure. These are now also being referred to as Teacher Assessed Grades (TAG).
  3. Calculated Grades (CG) – an important difference is that these are referred to as "calculated" rather than predicted! These are the grades awarded based on Ofqual's algorithm, which uses the CAGs but adjusts them to ensure they are more in line with prior year performance from that school. It is this that creates one of the main problems with the algorithm…

it effectively locks the performance of an individual student this year into the performance of students from the same school over the previous three years.

Ofqual claimed that if this standardisation had not taken place, we would have seen the percentage of A* grades at A level go up from 7.7% in 2019 to 13.9% this year. The overall impact was that the algorithm downgraded 39% of the A level grades predicted by teachers using their CAGs. Click here to read more about how the grading works.
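
For the technically minded, the basic mechanics can be sketched in a few lines of code. To be clear, this is not Ofqual's actual model, which was far more elaborate; the blending rule, the weighting and the numbers are all invented, purely to illustrate how tying an individual's grade to their school's history plays out.

```python
# A toy illustration of standardisation: pull this year's Centre Assessed
# Grade (CAG) towards the school's historical average. The 60/40 weighting
# and the 0-10 grade scale are invented for illustration only.

def standardise(cag: float, school_history_mean: float, weight: float = 0.6) -> float:
    """Blend a student's CAG with the school's mean grade over prior years."""
    return weight * school_history_mean + (1 - weight) * cag

# A strong student (CAG of 9) at a school whose past cohorts averaged 5:
print(standardise(cag=9, school_history_mean=5))   # 6.6 - heavily downgraded

# The same CAG at a historically high-performing school:
print(standardise(cag=9, school_history_mean=8))   # 8.4 - barely touched
```

However the weights are chosen, the individual's result is dragged towards their school's past, which is exactly the unfairness students objected to.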

Following the outcry by students and teachers, Gavin Williamson announced on the 17th of August that the Calculated Grades would no longer be used; instead the Centre Assessed Grades would form the basis for assessing student performance. But was this any fairer? Well, maybe a little, but it almost certainly resulted in some students getting higher grades than they should whilst others received lower, and that's not fair.

Better the devil you know

The Government could certainly have improved the way these changes were communicated and, having developed a method of allocating grades, should have scenario stress tested their proposal. Changing their mind so quickly at the first sign of criticism suggests they had not done this. It has also left the public and students with a belief that algorithms don't work or at the very least should not be trusted.

Perhaps the easiest thing to have done would have been to get all the students to sit the exam in September or October. The universities would then have started in January, effectively everything would move by three months, and no one would have complained about that, would they?

Food for thought – the impact of food on learning

According to the latest government statistics obesity is on the rise. There is also a link to Covid deaths, with nearly 8% of critically ill patients in intensive care being obese, compared with 2.9% of the general population. The WHO has stated that being overweight or obese is the fifth leading risk for global deaths, with at least 2.8 million adults dying each year.

Eating too much is clearly not good for your health, but what about what you eat: how might that impact your health, and in particular your brain?

Viva las Vagus

Have you ever used your gut instinct, had butterflies in your stomach or, when feeling nervous, had to rush to the toilet? If so then you already have some evidence of the connection and importance of your gut to the way you think and feel. The vagus nerve is the longest cranial nerve and runs from the brain stem to part of the colon, in effect making the connection. The biggest influence on the level of activity of the vagus nerve is the trillions of microbes that reside in the gut. The vagus nerve is able to sense the microbe activity and effectively transfer this gut information to the nervous system and ultimately the brain. Watch this 2-minute video that shows how this works.

Scientists refer to the relationship between the gut and the brain as the "gut brain axis". The brain sends chemical signals to the gut through the bloodstream, one such example being the feeling of being full or hungry. But, and this is the interesting part, the stomach talks back: gut bacteria send messages in the same way the brain communicates, using neurotransmission. Prior blog – The learning brain.

Exactly what the messages say depends on what you eat; a gut filled with fruit and vegetables will have different microbes to one that has just consumed a Big Mac. This is a very new area and most of the research has been conducted on rats, but there is already some evidence to suggest that junk food impairs memory.

Hopefully this gives you some idea as to the strong connection that exists between your stomach and your brain. We can now move on and consider what specific types of food can help when learning.

These Ted talks are well worth watching if you want to find out more – Your Gut Microbiome: The most important organ you’ve never heard of (11m), and Mind-altering microbes: How the microbiome affects brain and behaviour (6m).

What to eat when studying

The first thing to say is that I am far from an expert on nutrition, so the focus here is more on the impact food has on mood, concentration, cognition and memory. Secondly, to give this some context, it might be worth thinking about what you eat in the same way an athlete does. They pay close attention to their diet to make sure their body is in the best possible condition to compete, because if not they are reducing their chances of success. However, a good diet is no substitute for the hard work they have to put in at the gym; you have to do both. Short video on how nutrition is key to sports performance.

Brain foods

  1. Apples, berries and citrus – The British Journal of Nutrition published research in 2010 (The impact of fruit flavonoids on memory and cognition) indicating that consuming certain fruits such as berries, apples and citrus, which are rich in flavonoids, can help improve memory and cognition.
  2. Dark chocolate – Research published in Frontiers in Nutrition (Enhancing Human Cognition with Cocoa Flavonoids) found that dark chocolate, which also contains flavonoids, improved memory in both the short and long term. But remember many types of chocolate are high in sugar, fats, and calories so it's not all good news.
  3. Rosemary – Northumbria University’s Department of Psychology found that herbs such as rosemary and lavender impacted memory, with the scent of rosemary enhancing memory but lavender impairing it. Maybe Shakespeare knew what he was talking about when he said ‘rosemary is for remembrance’.
  4. Oily fish and walnuts (omega 3) – There is a much-publicised connection between omega 3 and improvements in learning and memory. However, many of these claims are exaggerated to promote a particular type of food or brand, with most containing doses too small to make any real difference. There is some evidence, published in the medical journal of the American Academy of Neurology, that people who ate more seafood, which naturally contains omega 3, had reduced rates of decline in semantic memory, but there is little evidence to show that supplements work at all. The best advice is to eat fish and nuts as part of a balanced diet but don't expect your exam results to improve by that much.
  5. Fruit and vegetables – A study conducted by Pennsylvania State University in April 2012 found an association between consuming fruit and vegetables and being in a positive mood.
  6. Water – Despite being the least exciting of them all, water remains one of the best ways in which you can improve brain functionality. Research published in the American Journal of Clinical Nutrition studied 101 participants to see if low water consumption impacted cognition. The result: those who had reduced amounts of water experienced poorer memory, reduced energy levels and feelings of anxiety, whilst those drinking water experienced the opposite.

The evidence on specific foods and their impact on cognition and learning is complex and nuanced. However, research into the connection between the stomach and the brain, although still in its early stages, has great potential to lead us to a better understanding of what we should eat to improve our mental wellbeing.

In the meantime, the best advice is to think about how your diet impacts you personally: identify when you feel best studying, is it before a meal or after, pay attention to snacking and of course drink lots of water and eat your greens, all as part of a balanced diet.

Lessons from lies – Fake news

There is little doubt that we live in an age with access to more information than any other. All you have to do is log onto your PC and type into Google whatever you want to know, and within 0.28 seconds you will get 3.44 million results; it really is science fiction. But having lots of information isn't the same as having reliable information: how do you know that what you're reading is true?

Fake news and false information

Fake news is certainly not new. In 1835 it was reported in a New York newspaper that a telescope "of vast dimensions" could see what was happening on the moon. It caused a sensation and the paper's circulation increased from 8,000 to more than 19,000. The only problem: it was a complete fiction, fake news concocted by the editor, Richard Adams Locke. It may not be new, but fake news is certainly faster moving and far more prolific, fuelled by the internet, the growth in social media, globalisation and a lack of regulation.

But before we go any further, let's take a step back and clarify what we mean by fake news. Firstly, there are completely false stories created to deliberately misinform; think here of the moon story, although even that contained some facts. There was an astronomer called Sir John Herschel who did indeed have a telescope "of vast dimensions" in his South African observatory, but he did not witness men with bat wings, unicorns, and bipedal beavers on the moon's surface. Secondly, there are stories that may have some truth to them but are not completely accurate, a much more sophisticated and convincing version of the above and probably harder to detect.

We will leave aside the motives for creating fake news, but they range from politics to pranks and, as was the case with Richard Adams Locke, commercial gain.

Here are a few headlines:

5G weakens the immune system, making us more vulnerable to catching the virus
If you can hold your breath for 10 seconds, then you don’t have the virus
Fuel pump handles pose a particularly high risk of spreading the Corona-19 infection
And, more controversially, Health Secretary Matt Hancock stating that testing figures had hit 122,347 on April 30

The first three are fake; the fourth is based on facts. Click here to make up your own mind as to its truth.

But why do we believe these stories?

Quick to judge – A study from the University of Toulouse Capitole found that when participants were asked to make a quick judgment about whether a news story was real or fake, they were more likely to get it wrong. This is somewhat worrying given the short attention span and patterns of behaviour displayed by those surfing the net.

We think more like lawyers than scientists – Commonly called confirmation bias, this is our tendency to favour information that confirms our existing beliefs. Lawyers examine evidence with a preconceived objective, to prove their client's innocence, whereas scientists remain open minded, in theory at least. An interesting aspect of this is that well educated people may be more susceptible because they have the ability to harness far more information to support their opinion. This is a bias of belief, not of knowledge.

Illusory truth effect – This is the tendency to believe false information after repeated exposure, first identified in a 1977 study at Villanova University and Temple University. It would be wrong to ignore the man many believe (wrongly, himself included) invented the term fake news: Donald Trump. He is a master of repetition; for example, Trump used the expression "Chinese virus" more than 20 times between March 16 and March 30, according to the website Factbase.

Gullibility, the failure to ask questions – We are prone to believe stories that "look right"; psychologists refer to this as "processing fluency". Experiments have found that "fluent information" tends to be regarded as more trustworthy and as such more likely to be true. Images are especially powerful; for example, researchers have found that people were more likely to believe that macadamia nuts were from the same family as peaches if there was a picture of a nut next to the text.

The same photo but from a different angle

Google it! But do so with care

Most educators will encourage students to become independent learners, to be curious, ask questions and solve their own problems; it is one of the most powerful educational lessons and, as Nelson Mandela said, education can be used to change the world. But we need to be careful that what is learned is not just a bunch of facts loosely gathered to prove one person's point of view. Mandela's vision of changing the world through education was based on the education being broad and complex, not narrow.

We are of course very fortunate to have such a vast amount of information from which to learn, but that curiosity needs to be tempered with a critical mindset. The questions asked should be thoughtfully constructed with knowledge of one's own personal bias, and the information analysed against the backdrop of its source and the possible motives of the authors.

Guidelines for students using Google

1. Develop a Critical Mindset – this is the ability to think logically: figuring out the connections, being active rather than passive, challenging what you read against what you already know and, perhaps most importantly, challenging your own ideas in the context of the new information. Are you simply finding information to support your own views, an example of confirmation bias?

2. Check the Source and get confirmation – for websites, always look at the URL for the identity of the organisation and the date of the story. Lots of fake news is news rehashed from the past to support the argument currently being made. What is the authority quoted? Why not cut that from the story and paste it into Google to find out who else is using that information and in what context. Look out too for spelling mistakes and generalisations, e.g. "most people agree"; these terms are vague and give the impression that this is a majority view.

3. Evaluate the evidence and don’t take images at face value – use your critical thinking skills to validate the evidence. Who is the authority quoted, do they have any reasons or motives for making these claims? Images as already mentioned are very powerful, but fake images are easy to create on the internet and a clever camera angle can easily mislead.

4. Does it make sense? – an extension of logical thinking but perhaps more emotional: how do you feel about this, what's your gut instinct? The unconscious part of your brain can help make complex decisions, sometimes more accurately than logical thought.

With large amounts of free knowledge, there are calls for schools to be doing more to better equip children to navigate the internet. In fact, back in 2017 the House of Lords published a report ‘Growing up with the internet’ which recommended that “Digital literacy should be the fourth pillar of a child’s education alongside reading, writing and mathematics”.

It’s not just school children that need this fourth pillar, we probably all do.

And of course the picture at the start of this blog is Fake!

The Covid gap year – a catalyst for change

At times it might seem difficult to find the positives in the current Covid crisis, but there are some. We may have had to change our travel plans but are benefiting from cleaner air and more time, staying closer to home is leading to a greater sense of community, and social media, which was becoming ever more toxic, has been used by many to keep in touch with friends and family. But how long will we continue to enjoy these healthy by-products once we can again jump on that aeroplane, tweet something without thinking and time becomes scarce, consumed by work? The downside is it can so easily revert back to how it was before.

However, some changes are likely to become permanent, people are beginning to call what comes after Covid the new norm, a kind of normality, familiar and yet different. We have all been given a glimpse of the future or to be precise the future has been brought forward not as a blurry image but with startling clarity because we are living it.

Change is easy
On the whole it’s difficult to get people to change their behaviour but if you change the environment it’s a different story. If we had asked people if they wanted to work from home they would have had to guess what it would be like, imagining not having to travel, imagining not seeing colleagues in the wok place but if you are forced into doing it, you experience it for real. And that’s what’s happened, people may not have chosen to work from home but having experienced it the change will be faster.

Neurologically, a habit, or learning for that matter, takes place when you do something repeatedly. In 1949 Donald Hebb, a Canadian neuroscientist, noted that once a circuit of neurons is formed, when one neuron fires so do the others, effectively strengthening the whole circuit. This has become known as Hebbian theory or Hebb's law and leads to long-term potentiation (LTP).

“Neurons that fire together wire together.”
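
For those who like to see an idea in code, here is a minimal sketch of a Hebbian-style update; the learning rate and activity values are arbitrary, chosen only to show how repetition strengthens a connection.

```python
# A minimal sketch of Hebb's idea: when two connected neurons are active
# at the same time, the connection between them strengthens.

def hebbian_update(weight: float, pre: float, post: float, lr: float = 0.1) -> float:
    """Strengthen the connection in proportion to the joint activity
    of the pre- and post-synaptic neurons (illustrative values only)."""
    return weight + lr * pre * post

weight = 0.2
for _ in range(10):  # repetition deepens the "groove"
    weight = hebbian_update(weight, pre=1.0, post=1.0)
print(f"connection strength after 10 repetitions: {weight:.1f}")  # 1.2
```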

Habits are patterns that can be thought of as grooves created over time by repetition; once formed they are hard to get out of, and the deeper the groove, the less we think about it at a conscious level. But if you change the environment you force the brain to reconsider those habits, effectively moving you out of that particular groove until you form another one. The secret is of course to create good habits and remove bad ones.

Many are suggesting that working from home will become far more common; Google and Facebook have already announced that they do not need their employees to go back into offices until at least the end of 2020, but who knows what that groove will be like by then. The other big changes on the horizon with potential for long term impact are the reduction in the use of cash as opposed to contactless, online shopping, already popular, driving a more drastic reshaping of the high street, and studying online becoming a new way of learning. Education has seen one of its biggest changes arguably since we have had access to the internet, with 1.3 billion students from 186 countries across the world now having to learn remotely. Even before Covid-19, global EdTech investment was $18.7 billion, and the overall market for online education is projected to reach $350 billion by 2025 (source: WEF).

This is what school in China looks like during coronavirus.

Changing attitudes to study
Given the choice, 1.3 billion students would not all have agreed to study online, but Covid-19 has made this a reality within a matter of months. It's an environmental change on a massive scale. The argument that online learning is better remains complex and confusing, requiring a clearer understanding of what is being measured and a much longer time period over which it can be evaluated. There are, for example, claims that retention rates are higher by somewhere between 25% and 60%, but I would remain sceptical despite their appeal and apparent common-sense logic.

Instead focus on your own learning, think less of how much more difficult it is to concentrate staring at a computer screen rather than being in a classroom and embrace the process. You are in a new “groove” and as a result it’s not going to feel comfortable.

Covid Gap year
Why not make 2020 your Covid gap year? UCAS says that one of the benefits of a gap year is that it "offers you the opportunity to gain skills and experiences, while giving you time to reflect and focus on what you want to do next". It's the changing environment in terms of geography and people, doing things that you might not have chosen, that makes the gap year so worthwhile. And despite what people say when they return, it wasn't enjoyable all of the time; you do get bored and frustrated, but it can open your mind to new possibilities, and ironically lockdown can do the same.

Online learning is a new environment; view it through the lens of new skills and experiences, and only when you reflect back should you decide how valuable it might have been.

Brain overload

Have you ever felt that you just can't learn any more, your head is spinning, your brain must be full? And yet we are told that the brain's capacity is potentially limitless, made up of around 86 billion neurons.

To understand why both of these may be true, we have to delve a little deeper into how the brain learns or, to be precise, how it manages information. In a previous blog, the learning brain, I outlined the key parts of the brain and discussed some of the implications for learning. As you might imagine this is a complex subject, but I should add a fascinating one.

Cognitive load and schemas

Building on the work of George (magic number 7) Miller and Jean Piaget's development of schemas, in 1988 John Sweller introduced us to cognitive load, the idea that there is a limit to the amount of information we can process.

Cognitive load relates to the amount of information that working memory can hold at one time

Human memory can be divided into working memory and long-term memory. Working memory, also called short-term memory, is limited, only capable of holding 7 plus or minus 2 pieces of information at any one time, hence the magic number 7, but long-term memory has arguably infinite capacity.

The limited nature of working memory can be highlighted by asking you to look at the 12 letters below. Take about 5 seconds. Look away from the screen and write down what you can remember on a blank piece of paper.

MBIAWTDHPIBF

Because there are more than 9 characters this will be difficult. 

Schemas – Information is stored in long-term memory in the form of schemas, frameworks or concepts that help organise and interpret new information. For example, when you think of a tree it is defined by a number of characteristics: it's green, has a trunk and leaves at the end of branches; this is a schema. But when it comes to autumn, the tree is no longer green and loses its leaves, suggesting that this cannot be a tree. However, if you assimilate the new information with your existing schema and accommodate it in a revised version of how you think about a tree, you have effectively learned something new and stored it in long-term memory. By holding information in schemas, when new information arrives your brain can very quickly identify if it fits within an existing one, enabling rapid knowledge acquisition and understanding.

The problem therefore lies with working memory and its limited capacity, but if we could change the way we take in information such that it doesn't overload working memory, the whole process would become more effective.

Avoiding cognitive overload

This is where it gets really interesting from a learning perspective. What can we do to avoid the brain becoming overloaded?

1. Simple first – this may sound like common sense: start with a simple example, e.g. 2+2 = 4, and move towards the more complex, e.g. 2,423 + 12,324,345. If you start with a complex calculation the brain will struggle to manipulate the numbers or find any pattern.

2. Direct Instruction not discovery – although there is significant merit in figuring things out for yourself, when learning something new it is better to follow guided instruction (teacher led) supported by several examples, starting simple and becoming more complex (as above). When you have created your own schema, you can begin to work independently.

3. Visual overload – a presentation point: avoid having too much information on a page or slide, revealing each part slowly. The secret is to break down complexity into smaller segments. This is the argument for not having too much content all on one page, which is often the case in textbooks. Read with a piece of paper or ruler, effectively underlining the words you are reading and moving the paper down to reveal one new line at a time.

4. Pictures and words (contiguity) – having “relevant” pictures alongside text helps avoid what’s called split attention. This is why creating your own notes with images as well as text when producing a mind map works so well.

5. Focus, avoid distraction (coherence) – similar to visual overload, remove all unnecessary images and information, keep focused on the task in hand. There may be some nice to know facts, but stick to the essential ones.

6. Key words (redundancy) – when reading or making notes don’t highlight or write down exactly what you read, simplify the sentence, focusing on the key words which will reduce the amount of input.

7. Use existing schemas – if you already have an understanding of a topic or subject, it will be sat within a schema, think how the new information changes your original understanding.

Remember the 12 characters from earlier? If we chunk them into 4 pieces of information and link each to an existing schema, you will find them much easier to remember. Here are the same 12 characters chunked down.

FBI – TWA – PHD – IBM

Each one sits within an existing schema e.g. Federal Bureau of Investigation etc, making it easier for the brain to learn the new information.
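
If you fancy seeing chunking spelled out, here is a small sketch; the grouping of three letters per chunk simply mirrors the example above.

```python
# A small sketch of chunking: the same 12 characters, either as 12 separate
# items (well beyond the 7 plus or minus 2 limit) or as 4 familiar chunks,
# each of which maps onto an existing schema.

letters = "FBITWAPHDIBM"

# 12 individual items - hard to hold in working memory.
print(list(letters))

# 4 chunks of 3 letters - each one a single, already-known unit.
chunks = [letters[i:i + 3] for i in range(0, len(letters), 3)]
print(chunks)  # ['FBI', 'TWA', 'PHD', 'IBM']
```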

Note – the above ideas are based on Richard E. Mayer’s principles of multimedia learning.

In conclusion

Understanding more about how the brain works, in particular how to manage some of its limitations, as is the case with short-term memory, not only makes learning more efficient but also gives you confidence that how you're learning is the most effective way.

Double entry bookkeeping replaced by internet

There is an interesting question being asked at the moment, given that fact-based knowledge is so accessible using the internet, is there a case for not teaching facts at all?

According to Don Tapscott, a consultant and speaker, who specialises in organisations and technology, memorising facts and figures is a waste of time because such information is readily available. It would be far better to teach students to think creatively so that they can learn to interpret and apply the knowledge they discover online.

“Teachers are no longer the fountain of knowledge, the internet is”
Don Tapscott

Is this the solution for educators with an overfull curriculum, the result of having continually to add new content to ensure their qualification remains relevant and topical? Perhaps they can remove facts and focus on skills development? After all, it's skills that matter; knowing is useful, but it's the ability to apply that really counts… right?

What makes you an accountant

When you start to learn about finance, you will be taught a number of underpinning foundational subjects including law, economics, costing and of course basic accounting. Sat stubbornly within the accounting section will be double entry bookkeeping. This axiom is so fiercely protected by the finance community that anyone who questions its value or challenges its relevance will be met with pure contempt. And yet, is the knowledge of how to move numbers around following a hugely simple rule, i.e. put a number on one side and an equivalent on the other, of any use in a world where most accounting is performed by computers and sophisticated algorithms? I am sure there will be similar examples from other professions and industries. The challenge being, do doctors really need to understand basic anatomy, or lawyers to read cases dating back to 1893?

“Everyone is entitled to his own opinion, but not to his own facts”
Daniel Patrick Moynihan

But Knowledge is power

Daniel T. Willingham is a psychologist at the University of Virginia and the author of a number of books including Why Students Don't Like School. His early research was on the brain, learning and memory, but more recently he has focused on the application of cognitive psychology in K-16 education.

Willingham argues that knowledge is not only cumulative, it grows exponentially. In addition, factual knowledge enhances cognitive processes like problem solving and reasoning. How Knowledge Helps.

Knowledge is cumulative – the more you know, the more you can learn. Individual chunks of knowledge will stick to new knowledge because what you already know provides context and so aids comprehension. For example, knowing the definition of a bond, 'a fixed income instrument that represents a loan made by an investor to a borrower' (prior knowledge), enables the student to grasp the idea that anything fixed has to be paid by the company (the borrower) regardless of its profitability, and this is the reason debt is considered risky (new knowledge).

Knowledge helps you remember – the elaboration effect has featured in a previous blog. In essence it suggests that the brain finds it easier to remember something if it can be associated with existing information. Using the same example from above, it is easier to remember that bonds are risky if you already knew what a bond was.

Knowledge improves thinking – there are two reasons for this. Firstly, it helps with problem solving. Imagine you have a problem to solve; if you don't have sufficient background knowledge, understanding the problem can consume most of your working memory, leaving no space for you to consider solutions. This argument is based on the understanding that we have limited capacity in working memory (magic number 7), so occupying it with grasping the problem at best slows down the problem-solving process, but at worst might result in walking away with no solution. Secondly, knowledge helps speed up problem solving and thinking. People with prior knowledge are better at drawing analogies as they gain experience in a domain. Research by Bruce Burns in 2004 compared the performance of top chess players at normal and blitz tournaments. He found that what was making some players better than others was differences in the speed of recognition, not faster processing skills. Players who had knowledge of prior games were far quicker in coming up with moves than those who were effectively solving the problem from first principles. Chess speed, at least, has a lot to do with the brain recognising pre-learned patterns.

Skills are domain specific – not transferable

There is one other important lesson from an understanding of knowledge – skills are domain specific. The implication is that teaching "transferable skills", e.g. skills that can be used in different areas such as communication and critical thinking, doesn't work. A skill (Merriam-Webster) is the ability to use one's knowledge effectively and readily in execution or performance. The argument being that in order to use knowledge effectively, it needs to be in a specific domain.
In July 2016 the Education Endowment Foundation in the UK released the results of a two-year study involving almost 100 schools that wanted to find out if playing chess would improve maths. The hypothesis was that the logical and systematic processes involved in being a good chess player would help students better understand maths i.e. the skills would transfer. The conclusion however found there were no significant differences in mathematical achievement between those having regular chess classes and the control group.

Long live double entry bookkeeping

This is an interesting topic, open to some degree of interpretation and debate, but it highlights the difficult path curriculum designers have to tread when it comes to removing the old to make space for the new. In addition, there is a strong argument that core principles and foundational knowledge are essential prerequisites for efficient learning.
But whatever happens, we need to keep double entry bookkeeping, not because knowing that every debit has a credit is important in itself, but because it helps structure a way of thinking and problem solving that has enabled finance professionals to navigate significant complexity and change since Luca Pacioli allegedly invented it in 1494.

And the case from 1893 – Carlill v Carbolic Smoke Ball Company

Synergy – Direct Instruction part 2

Last month’s blog introduced the idea that Direct Instruction (DI) which is a highly structured form of teaching was a very efficient way of delivering information. The challenge was that in a world where knowledge is largely free “drilling” information using rigid methods does little to develop the skills most valued by employers.

Earlier this year, in an attempt to identify some of these higher-level skills (I am not a fan of the term soft skills), LinkedIn analysed hundreds of thousands of job advertisements. They produced a top 5, which is as follows: creativity, persuasion, collaboration, adaptability and time management. We might add to this the ability to think for yourself, which in some ways underpins them all.

The modern world doesn’t reward you for what you know, but for what you can do with what you know. Andreas Schleicher

This month I want to expand on what DI is, but also add to the argument that DI (teacher led) and discovery based (student led) learning are not mutually exclusive; in fact, when used together they work better than either does on its own.

Direct Instruction is learning led
The main reason that, despite its many critics, DI fails to go away is the significant amount of evidence that proves it works. And the reason it works is that it presents information in a brain-friendly way.

Cognitive load is a very common instructional term and refers to the limitation of short-term or working memory in holding sufficient information at any one time. As a result, it's better not to bombard the brain with too much information, meaning it's more effective for students to reduce distraction and be presented with content broken down into smaller chunks, sequenced and taught individually before being linked together at a later date. This is one of the most important aspects of DI. Avoiding distraction refers not only to external distractions, e.g. your mobile phone, but also to information that is not required or is unnecessary in arriving at the desired learning outcome.

Retrieval and spaced practice are both used in direct instruction and have been mentioned in previous blogs. They are well researched and the evidence is compelling as to their effectiveness.

Using examples to teach is also something strongly promoted. It is argued that the brain has the ability to use examples to build connections, ironically without DI. For example, if we are told that a cat is an example of a pet, but we already knew a cat was also an animal, we can link the two; next time the term cat is mentioned we will know it is both a pet and an animal.

Discovery based (Student led – Autonomous – Constructivism)
Many of the discovery-based learning techniques have their roots in the work of psychologists Jean Piaget, Jerome Bruner, and Seymour Papert. The core argument is that self-discovery and the process of acquiring information for yourself makes that information more readily available when it comes to problem solving. In addition, it encourages creativity, motivation, promotes autonomy, independent learning and is self-paced.

It is not, however, without instruction. Teachers should guide and motivate learners to look for solutions by combining existing and new information, help students avoid distraction and simplify what to a student may appear complex. To expect the student to figure everything out for themselves would be incredibly inefficient and, although it might lead to a truly original idea, is more likely to result in a feeling of wasted time and in solutions that we already know or that are wrong.

Critical thinking processes such as reasoning and problem solving are intimately intertwined with factual knowledge that is stored in long-term memory. Daniel Willingham – Why Students Don't Like School.

2 + 2 = 5 = Synergy
DI and the many discovery-based learning methods can be used together, and together they are far more powerful and effective. Think of them in terms of a Venn diagram, with DI in one circle, discovery based learning in the other, and highly effective learning in the middle where the circles overlap. The mix is up to the teacher, which in turn is dependent on the time available, the nature of the subject, their judgment of the students and the desired outcome.

You cannot tell students how to think, but you can provide them with the building blocks, helping them learn along the way before giving them real world challenges with problems they will have to solve for themselves. Then it's into the workplace, where the real learning experience will begin.

Learn faster with Direct Instruction – Siegfried Engelmann

What we need to learn is changing: knowledge is free, and if you want the answer, just google it. According to the World Economic Forum's Future of Jobs Survey, there is an ever-greater need for cognitive abilities such as creativity, logical reasoning and problem solving, and with advances in AI, machine learning and robotics many of the skills previously valued will become redundant.

No need for the Sage on the stage
These demands have led to significant change in the way learning is happening. No longer should students be told what to think; they need to be encouraged to think for themselves. Socratic questioning, group work, experiential learning and problem based learning have all become popular, and Sir Ken Robinson's TED lecture, Do Schools Kill Creativity?, has had 63 million views.

Sir Ken's talk is funny and inspiring and I recommend you watch it, but I want to challenge the current direction of travel, or at least balance the debate, by promoting a type of teaching that has fallen out of fashion and yet ironically could form the foundation upon which creativity could be built – Direct Instruction.

Direct Instruction – the Sage is back
The term direct instruction was first used in 1968, when a young Zig Engelmann, a science research associate, proved that students could be taught more effectively if the teacher presented information in a prescriptive, structured and sequenced manner. This carefully planned and rigid process can help eliminate misinterpretation and misunderstanding, resulting in faster learning. Most importantly, it has been proven to work, as evidenced by a 2018 publication which looked at over half a century of analysis and 328 past studies on the effectiveness of Direct Instruction.

Direct Instruction was also evaluated by Project Follow Through, the most extensive educational experiment ever conducted. The conclusion – it produced significantly higher academic achievement for students than any of the other programmes.

The steps in direct instruction

It will come as no surprise that a method of teaching that advocates structure and process can be presented as a series of steps.

Step 1 Set the stage for learning – The purpose of this first session is to engage the student, explaining specifically what they should be able to do and understand as a result of this lesson. Where possible a link to prior knowledge should also be made.
Step 2 Present the material – (I DO) The lesson should be organised, broken down into a step-by-step process, each one building on the other with examples to show exactly how it can be applied. This can be done by lecture, demonstration or both.
Step 3 Guided practice – (WE DO) This is where the tutor demonstrates and the student follows closely, copying in some instances. Asking questions is an important aspect for the student if something doesn’t make sense.
Step 4 Independent practice – (YOU DO) Once students have mastered the content or skill, it is time to provide reinforcement and practice.

The Sage and the Guide
The goal of Direct Instruction is to “do more in less time” which is made possible because the learning is accelerated by clarity and process.

There are of course critics who consider it a type of rote learning that will stifle the creativity of both teacher and student and result in a workforce best suited to the industrial revolution rather than the fourth one. But for me it's an important, effective and practical method of teaching that, when combined with inspirational delivery and a creative mindset, will help students develop the skills to solve the problems of tomorrow, or at least a few of them.

The independent learner – Metacognition

Metacognition is not a great word, but it's an important one when it comes to learning, especially if you are studying at higher academic levels or on your own. Cognition refers to the range of mental processes that help you acquire knowledge and understanding or, more simply, learn. These processes include the storage, manipulation and retrieval of information. Meta, on the other hand, means higher than or overarching; put the two together and we are talking about something that sits above learning, connecting it by way of thought. For this reason it's often described as thinking about thinking or, in this context, thinking about how you learn.

Smarter not harder

When you have a lot to learn in terms of subject matter, it may feel like a distraction to spend any time learning something other than what you must know, let alone reflecting on it, but this fits under the heading of working smarter not harder: if you can find more effective ways of learning, that must be helpful.
As mentioned earlier, cognition is about mental processes: storage and retrieval relate to memory, manipulation to the shifting of attention, changing perception etc. But the meta aspect creates distance, allowing us to become aware of what we are doing, standing back and observing how, for example, perception has changed; this reflection is a high-level skill that many believe is unique to humans. One final aspect is that we can take control of how we learn: planning tasks, changing strategies, monitoring those that work and evaluating the whole process.

Keeping it simple

It's very easy to overcomplicate metacognition; in some ways it's little more than asking a few simple questions, thinking about how you are learning, what works and what doesn't. Here are some examples of how you might do this.

  • Talk to yourself, asking questions at each stage: does this make sense? I have read it several times; maybe I should try writing it down.
  • Ask, have I set myself sensible goals?
  • Maybe it’s time to try something different, for example mind mapping, but remember to reflect on how effective it was or perhaps was not.
  • Do I need help from anyone? This could be a fellow student, or try YouTube, which is a great way to find a different explanation in a different format.

Clearly these skills are helpful for all students but they are especially valuable when studying on your own perhaps on a distance learning programme or engaged in large periods of self-study.

Benefits

There are many reasons for investing some time in this area.

  • Growing self-confidence – by finding out more about how you learn you will discover both your strengths and weaknesses. Confidence isn't about being good at everything but about understanding your limitations.
  • Improves performance – research has shown that students who actively engage in metacognition do better in exams.
  • Gives control – you are no longer reliant on the way something is taught; you have the ability to teach yourself. Being an autonomous learner is also hugely motivational.
  • The skills are transferable – this knowledge will not only help with your current subjects but all that follow, not to mention what you will need to learn in the workplace.  

It will take some time initially but, in a way, metacognition is part of learning, an essential component. You will end up knowing more about yourself at some point, even if you don't want to, so why not do it sooner rather than later?

And just for fun – Sheldon knows everything about himself – even when he is wrong

Intelligence defined – Inspiring learning leaders – Howard Gardner

Intelligence is a term that is often used to define people: David is "clever" or "bright", maybe even "smart". But it can also be a way in which you define yourself. The problem is that accepting this identity can have a very limiting effect on motivation; for example, if someone believes they are not very clever, how hard will they try? Effort would seem futile. And yet it is that very effort that can make all the difference. See brain plasticity.
I wrote about an inspiring learning leader back in April this year following the death of Tony Buzan, the creator of mind maps. I want to continue the theme with Howard Gardner (Professor of Cognition and Education at the Harvard Graduate School of Education), who I would guess many have never heard of but who for me is an inspirational educator.

Multiple Intelligence Theory (MIT)
Now, in fairness, Howard Gardner is himself not especially inspiring, but his idea is. Gardner is famous for his theory that the traditional notion of intelligence, based on I.Q., is far too limited. Instead, he argues that there are in fact eight different intelligences. He first presented the theory in 1983, in the book Frames of Mind – The Theory of Multiple Intelligences.

This might also be a good point to clarify exactly how Gardner defines intelligence.

Intelligence – ‘the capacity to solve problems or to fashion products that are valued in one or more cultural setting’ (Gardner & Hatch, 1989).

Multiple intelligences

  1. SPATIAL – The ability to conceptualise and manipulate large-scale spatial arrays e.g. airplane pilot, sailor
  2. BODILY-KINESTHETIC – The ability to use one’s whole body, or parts of the body to solve problems or create products e.g. dancer
  3. MUSICAL – Sensitivity to rhythm, pitch, meter, tone, melody and timbre. May entail the ability to sing, play musical instruments, and/or compose music e.g. musical conductor
  4. LINGUISTIC – Sensitivity to the meaning of words, the order among words, and the sound, rhythms, inflections, and meter of words e.g. poet
  5. LOGICAL-MATHEMATICAL – The capacity to conceptualise the logical relations among actions or symbols e.g. mathematicians, scientists
  6. INTERPERSONAL – The ability to interact effectively with others. Sensitivity to others’ moods, feelings, temperaments and motivations e.g. negotiator
  7. INTRAPERSONAL – Sensitivity to one's own feelings, goals, and anxieties, and the capacity to plan and act in light of one's own traits.
  8. NATURALISTIC – The ability to make consequential distinctions in the world of nature as, for example, between one plant and another, or one cloud formation and another e.g. taxonomist

I have taken the definitions for the intelligences directly from the MI Oasis website.

It’s an interesting exercise to identify which ones you might favour but be careful, these are not learning styles, they are simply cognitive or intellectual strengths. For example, if someone has higher levels of linguistic intelligence, it doesn’t necessarily mean they prefer to learn through lectures alone.

You might also want to take this a stage further by having a go at this simple test. Please note this is for your personal use, its main purpose is to increase your understanding of the intelligences.

Implications – motivation and self-esteem
Gardner used his theory to highlight the fact that schools largely focused their attention on linguistic and logical-mathematical intelligence and rewarded those who excelled in these areas. The implication being that if you were more physically intelligent, the school would not consider you naturally gifted, not "clever" as they might if you were good at maths; the advice might be that you should consider a more manual job. I wonder how that works when someone with high levels of bodily-kinesthetic and spatial intelligence may well find themselves playing for Manchester United earning over £100,000 a week!

But for students this theory can really help build self-esteem and motivation when a subject or topic is proving hard to grasp. No longer do you have to say, "I don't understand this, I am just not clever enough". Change the words to "I don't understand this yet; I find some of these mathematical questions challenging, after all it's not my strongest intelligence", or "I know I have to work harder in this area, but when we get to the written aspects of the subject it will become easier".

This for me is what makes Gardner's MIT so powerful: it's not a question of how intelligent you are but which intelligence(s) you work best in.

“Discover your difference, the asynchrony with which you have been blessed or cursed and make the most of it.” Howard Gardner

As mentioned earlier Howard Gardner is not the most inspirational figure and here is an interview to prove it, but his theory can help you better understand yourself and others, and that might just change your perception of who you are and what you’re capable of – now that’s inspiring!

MI Oasis – The Official Authoritative Site of Multiple Intelligences 

Don't worry, be happy

It's so easy for well-meaning people to say don't worry; it's not bad advice, it's just not very helpful. Firstly, as I have mentioned in previous blogs, anything framed as a "don't" is difficult for the brain to process; it is far better to tell someone what to do than what not to do.

Secondly, if you look up a definition of worry it will say something like "thinking about problems or unpleasant events that you don't want to happen but might, in a way that makes you feel unhappy and or frightened". What a strange concept; why would anyone want to do this?

Having started, but I hasten to add not yet finished, the second of Yuval Noah Harari's bestselling books, Homo Deus, it's hard not to question the reason we might have evolved to hold such a strange view. What possible evolutionary purpose could feeling bad or frightened serve?

Don't worry, be happy. In every life we have some trouble; when you worry you make it double.

Worry can be helpful
The truth is worry can be helpful; it's a means by which the brain helps you prioritise events. It's not a nice feeling, but ultimately humans have evolved to survive and reproduce; they are not meant to be vehicles for happiness. Think of all that goes through your head in a day: the words, the emotions, the noise. How can you possibly figure out what is important and what is not unless you have a little help? Worry does just that, it helps us think about an event in the future that might happen; this heightened focus puts it above the events of the day, giving us a chance to do something about it.

Action is worry’s worst enemy – Proverb

Worry, stress and anxiety
Worry tends to be specific: I am worried that I won't be able to pass the maths exam on the 23rd of September. Worry is future based; it anticipates a problem that has not yet happened, and its main purpose is to make you do something about it today. Stress, on the other hand, is relatively short term and arises when there is a gap between what you need to do and what you are able to do. For example: I haven't got time to learn everything I need to pass this exam, there is just too much to learn. After the event, the stress level will fall. Anxiety is the big brother of them both; it is far more general than worry, for example: I am not very clever and never have been. You're not really sure what cleverness is, but you're still able to be anxious about it. Both stress and worry can lead to anxiety if they are intense or go on for too long.

Worry can wake you in the night, asking your brain to solve the problem. However, unless fully awake it's unlikely you will be able to do so; instead you will simply turn the problem over in your head again and again and deprive yourself of that all-important sleep. Best put it to the back of your mind if possible and think of something else; the problem will feel less important in the morning, and after a good night's sleep you will be far more able to solve it.

It helps to write down half a dozen things which are worrying me. Two of them, say, disappear; about two of them nothing can be done, so it’s no use worrying; and two perhaps can be settled – Winston Churchill

What to worry about
The human mind is so creative that it's possible for it to worry about almost anything; as one worry is resolved, another can appear.

  • Don’t know what to do – where do I start, what should I learn first
  • Don’t know how to do it – how can I get this into my head, what is the best way of learning?
  • Don’t know if I can do it, self-doubt – I am not clever enough. This can lead to anxiety.
  • Don’t know how long it will take, what if I don’t have enough time?

One technique to change these from unknowns to possibilities is to follow the advice of Carol Dweck who suggests you add a word to the end of the sentence – the word is YET. For example, I don’t know what to do YET! Although this may seem trivial it moves the worry from unsolvable to something that if you spend time on can be achieved.

The list of "don't knows" above are all triggers to help motivate you; they are calls to action, and the only way to reduce the worry is to do something, even if, as Churchill suggests, you make a simple list. However, there are situations when you can't take action, or at least not an obvious one, perhaps when waiting for exam results. It might seem that all you can do is worry. The bad news is, putting yourself in what can feel like a permanent state of worry can result in anxiety and won't turn that fail into a pass. But all is not lost: planning for the worst whilst hoping for the best is sensible, and coming up with a plan that is achievable can remove the pressure, leaving the feeling that even if you do fail there is a way forward and you can do something about it.

We can end with another quote from Winston Churchill who I am sure had a few worries in his time.

Let our advance worrying become advance thinking and planning