Feedback – The breakfast of champions

An interesting piece of research came out recently referring to something called “temporary mark withholding”. This, as the name suggests, means providing students with written feedback but without marks. On the face of it this might seem odd, frankly unhelpful even: how can you judge your performance if you don’t know how you compare against what is expected?

To answer that question, you need to ask a far more fundamental one: what’s the purpose of giving feedback in the first place?

Feedback – task or ego
We need to separate feedback from criticism, which often implies that the person giving it is trying “to find fault”, although it’s possible to make it sound a little more positive by calling it constructive criticism. In simple terms, criticism is about what went wrong in the past, whilst feedback directs you towards what you should do to improve in the future. But when we are thinking in terms of learning it gets a little more complicated. Dylan Wiliam talks about whether feedback is ego-involving or task-involving. The first of these would include offering praise such as “well done, you have produced an excellent answer”, but he states this is rarely effective and can actually lower achievement. However, when the feedback focuses on what the student needs to do to improve, and explains how they can do it, you get a significant impact on student achievement.

He goes on to say that “good feedback causes thinking, the first thing a student needs to do when they receive feedback is not to react emotionally, not disengage – but think”. It might be worth adding that Dylan Wiliam is talking about the impact of feedback on student learning, not on how the student might feel in terms of motivation, self-confidence and so on. There is clearly a place for ego-involving feedback; it’s just not very effective when sat alongside a direct instruction, because the emotional response often blocks or detracts from what needs to be understood for the student to improve.

Formative and Summative assessment
There is one last piece of information that will help us make sense of why temporary mark withholding might work: the difference between formative and summative assessment.

Summative – The purpose of summative assessment is to “sum up” student learning at the end of a chunk of learning or on completion of a course, and compare it against a standard, normally a pass mark. This is why exams are often criticised: it’s not that testing is bad, it’s how the results are used, often polarising and narrowing opinion about an individual’s performance, pass and you’re a hero, fail and you’re a villain. It gets worse when you then put those results into a league table and publish them, with the winners at the top and the losers at the bottom for all to see and draw often incorrect conclusions.

Summative assessment is, however, valuable: if you score below the target, it tells you that more effort or work is needed and that you are not performing well on a particular topic, but it provides no guidance as to what you need to do to improve.

Formative – The purpose of formative assessment is to monitor progress on an ongoing basis in order to help the teacher identify the “gap” between what the student knows and what they need to know. This is where the magic happens: firstly in finding out where the gap is, i.e. where the student is currently compared to where they need to be, then in figuring out the best way of getting them to that higher standard, i.e. what they need to do to improve. Formative assessment can be a test, a quiz or simply observation.

Lessons for students
And this is why holding back the marks works. What the research (Kuepper-Tetzel & Gardner) highlighted is that when students get their marks, they effectively prioritise the grades over the written comments. The strong students ignore the comments because they don’t think they have anything to learn, and the weaker students are demotivated so also ignore them.

The key point for students is this: by all means look at the mark, but resist the emotional (ego) reaction to pat yourself on the back or beat yourself up. Read all the comments with an open mind, asking two simple questions: can I see that there is a gap between my answer and the model answer, and do I know exactly what to do next to close it? The feedback, if it is good of course, should make this process as easy as possible.

The fact that your script might only say “see model answer” or have a cross with the correct number written next to it is more an example of poor marking with little or no feedback. Perhaps you should return your script, providing the marker/teacher with some feedback highlighting the gap between good marking and bad marking but, most importantly, what they should do to improve…

And if you’re interested, here is the link to Dylan Wiliam explaining the importance of formative assessment.

References – Kuepper-Tetzel & Gardner; Jackson & Marks, 2016; Taras, 2001; Winstone et al., 2017; Ramaprasad, 1983

The single most important thing for students to know – Cognitive load

Back in 2017 Dylan Wiliam, Professor of Educational Assessment at UCL, described cognitive load theory (CLT) as ‘the single most important thing for teachers to know’. His reasoning was simple: if learning is an alteration in long-term memory (OFSTED’s definition), then it is essential for teachers to know the best ways of helping students achieve this. At this stage you might find it helpful to revisit my previous blog, Never forget, improving memory, which explains more about the relationship between long and short-term memory, but to help reduce your cognitive load… I have provided a short summary below.

But here is the point: if CLT is so important for teachers, it must also be of benefit to students.

Cognitive load theory
The term cognitive load was coined by John Sweller in a paper published in the journal Cognitive Science in 1988. Cognitive load is the amount of information that working (short-term) memory can process at any one time; when the load becomes too great, processing slows down and so does learning. The implication is that, because we can’t do anything about the short-term nature of short-term memory (we can only retain 4 plus or minus 2 chunks of information before it’s lost), learning should be designed, or studying methods changed, accordingly. The purpose is to reduce the ‘load’ so that information can more easily pass into long-term memory, where the storage capacity is effectively infinite.

CLT can be broken down into three categories:

Intrinsic cognitive load – this relates to the inherent difficulty of the material or complexity of the task. Some content will always have a high level of difficulty; for example, solving a complex equation is more difficult than adding two numbers together. However, the cognitive load arising from a complex task can be reduced by breaking it down into smaller, simpler steps. There is also evidence to show that prior knowledge makes the processing of complex tasks easier. In fact, it is one of the main differences between an expert and a novice: the expert requires less short-term memory capacity because they already have knowledge stored in long-term memory that they can draw upon; the new knowledge simply adds to what they already know. Bottom line – some stuff is just harder.

Extraneous cognitive load – this is the unnecessary mental effort required to process information for the task in hand; in effect, the learning has been made overly difficult or confusing. For example, if you needed to learn about a square, it would be far easier to draw a picture and point to it than to use words to describe it. A more common example of extraneous load is when a presenter puts too much information on a PowerPoint slide, most of which adds little to what needs to be learned. Bottom line – don’t make learning harder by including unimportant stuff.

Germane cognitive load – increasing the load is not always bad. For example, if you ask someone to think of a house, that will increase the load, but once they have created that ‘schema’ or plan in their mind, adding new information becomes easier. Following on with the house example, if you have a picture of a house in your mind, answering questions about what you might find in the kitchen is relatively simple. The argument is that learning can be enhanced when content is arranged or presented in a way that helps the learner construct new knowledge. Bottom line – increasing germane load is good because it makes learning new stuff easier.

In summary, both student and teacher should reduce intrinsic and extraneous load but increase germane load.

Implications for learning
The three categories of cognitive load shown above provide some insight into what you should and shouldn’t do if you want to learn more effectively. For example: break complex tasks down into simpler ones, focus on what’s important, avoid unnecessary information and use schemas (models) where possible to help deal with complexity. There are, however, a few specifics relating to the categories that are worth mentioning.

The worked example effect – If you are trying to understand something and repeated reading of the text is having little impact, it’s possible your short-term memory has reached capacity. Finding an example of what you need to understand will help free up some of that memory. For example, if I wanted to explain that short-term memory is limited, I might ask you to memorise these 12 letters: SHNCCMTAVYID. Because this exceeds the 4 plus or minus 2 rule it will be difficult, and hopefully as a result prove the point. In this situation the example is a far more effective way of transferring knowledge than pages of text.

The redundancy effect – This is most commonly found where there is simply too much unnecessary or redundant information. It might be irrelevant or not essential to what you’re trying to learn. It could also be the same information presented in multiple forms, for example an explanation and a diagram on the same page. The secret here is to be relatively ruthless in pursuing what you want to know: look for the answer to your question rather than getting distracted by adjacent information. You may also come across this online, where a PowerPoint presentation has far too much content and the presenter simply reads out loud what’s on the slides. In these circumstances it’s a good idea to turn down the sound and simply read the slides for yourself; people can’t focus when they hear and see the same verbal message during a presentation (Hoffman, 2006).

The split attention effect – This occurs when you have to refer to two different sources of information simultaneously when learning. In written texts and blogs, as I have done in this one, you will often find a reference to something further to read or listen to: ignore it and stick to the task in hand, grasp the principle and only afterwards follow up on the link. Another way of reducing the impact of split attention is to produce notes in a way that reduces the conflict that arises when trying to listen to the teacher and make notes at the same time. You might want to use the Cornell note-taking method; click here to find out more.

But is it the single most important thing a student should know?
Well, maybe, maybe not, but it’s certainly in the top three. The theory on its own will not make you a better learner, but it goes a long way towards explaining why you can’t understand something despite spending hours studying. It provides guidance as to what you can do to make learning more effective, but most importantly it can change your mindset from “I’m not clever enough” to “I just need to reduce the amount of information, and then I’ll get it”.

And believing that is priceless, not only for studying towards your next exam but in helping with all your learning in the years to come.

Motivated ignorance – is ignorance better than knowing?

If it’s true that the cat wasn’t killed by curiosity and that ignorance was to blame (see last month’s blog) then it follows that we should better educate the cat if it is to avoid an untimely death. But what if the cat chooses to remain ignorant?

Ignorant – lacking knowledge or awareness in general; uneducated or unsophisticated.

In a paper published last February, Daniel Williams puts forward a very challenging and slightly worrying proposition: that when the costs of acquiring knowledge outweigh the benefits of possessing it, ignorance is rational. In simple terms, this suggests that people are not “stupid”, or ignorant, when they are unaware of something; they are in fact being logical and rational, effectively choosing not to learn.

“Facts do not cease to exist because they are ignored.” – Aldous Huxley

“Beware the man of a single book.” – St. Thomas Aquinas
In terms of education this is clearly very important, but it has far wider implications for some of the challenges we are facing in society today. There is an increasing divergence of opinion across the world, with people holding diametrically opposite views, each believing the other is wrong. We can probably attach personas to these groups: on the one side there are the knowledgeable and well educated, on the other those who may not be in possession of all the facts but trust their emotions and believe in community and identity. The two groups are clear to see: those who believe in climate change and those who don’t, Trump supporters and anyone-but-Trump supporters, pro-vaccine and anti-vaccine.

The stakes could not be higher.

“Ignorance is a lot like alcohol. The more you have of it, the less you are able to see its effect on you.” – Jay Bylsma

Motivated ignorance
The idea that choosing to be ignorant could be both logical and rational is not new. In his book An Economic Theory of Democracy, first published in 1957, Anthony Downs used the term “rational ignorance” for the first time, to explain why voters chose to remain ignorant about the facts: their individual vote wouldn’t count under the current political system. The logic being that it is rational to remain ignorant if the costs of becoming informed, in this case the effort to read and listen to all the political debate, outweigh the benefits, of which the voters saw none.

“If you think education is expensive, try ignorance.” – Robert Orben

Daniel Williams is making a slightly different point; he argues that motivated ignorance is a form of information avoidance. The individual is not remaining ignorant because the costs of obtaining the information are too high; they are actively avoiding knowledge for other reasons. He also goes on to say that if you are avoiding something, it follows that you were aware of its existence in the first place, what the former US Secretary of Defense Donald Rumsfeld so eloquently referred to as a known unknown.

We need one final piece of the jigsaw before we can better understand motivated ignorance, and that is motivated reasoning. Motivated reasoners reach pre-determined conclusions regardless of the evidence available to them. This is subtly different from confirmation bias, which is the tendency to notice only information that coincides with pre-existing beliefs and to ignore information that doesn’t.

If motivated reasoning is the desire to seek out knowledge to support the conclusions you want, motivated ignorance is the opposite: it is the desire to avoid knowledge in case it gives you the “wrong” answer. For example, although you might feel ill, you avoid going to the doctor to find out what’s wrong because you don’t want to know what the doctor might say.

The question we should ask is: why don’t you want to know the answer? The implication is that something is stopping you; in this instance, perhaps the emotional cost of the doctor’s prognosis is greater than the gain. Similar examples can be found in other domains: the husband who doesn’t ask about his wife’s whereabouts because he is afraid she is having an affair and doesn’t want it confirmed, although in reality she might just have been late-night shopping!

“If ignorance is bliss, there should be more happy people.” – Victor Cousin

The idea that we should always seek out knowledge to be better informed clearly has its limitations; far from being illogical, motivated ignorance has some degree of rationality.

What have we learned?
Human beings do not strive to answer every question, nor to have within their grasp all the knowledge that exists. We are selective, based on how much time we have available, how we might like to feel and, in some instances, the social groups we would like to belong to. There is always a sacrifice or trade-off for knowledge, and sometimes the price might be considered too high.

The answer to ignorance is not to throw more information at the problem in an attempt to make the ignorant more enlightened. If you don’t believe in climate change, not even a well-crafted documentary by David Attenborough is likely to help if the motivation for choosing ignorance is not addressed. This oversupply of information was evident in the Brexit debate here in the UK. For those who had “made up their mind”, very powerful arguments from equally powerful captains of industry as to why leaving Europe was a bad idea failed to educate, because most chose not to listen.

The role of education and learning has to be to inspire and arouse curiosity; we need to get closer to those underlying motivational barriers and break them down. We have to help people appreciate the feeling you get from challenging your views and coming out the other side with a better and possibly different answer. There is a need to move away from the competitive nature of right and wrong and the idea that changing your mind is a sign of weakness.

“When the facts change, I change my mind. What do you do, sir?” – attributed to John Maynard Keynes

And maybe we have to accept that although there is a price to pay, whatever it is, it will be worth it.

“No people can be both ignorant and free.” – Thomas Jefferson

If it wasn’t curiosity, what did kill the cat?

In 2006 Professor Uğur Şahin, an oncologist, was working on a curiosity-driven research project to find out whether it might be possible to develop a vaccine to control and destroy cancerous tumours by activating the body’s own immune system. This approach was fundamentally different from the more common treatments of radiation and chemotherapy. Curiosity-driven projects often have no clear goal but allow scientists to take risks and explore the art of the possible.

In 2008 Şahin and his wife Özlem Türeci founded a small biotech company called BioNTech, which you might never have heard of were it not for COVID-19, because together with Pfizer, BioNTech supplied the first Covid vaccine to be used in the UK. That early curiosity-driven research in 2006 provided Şahin and Türeci with the answers to our 2020 problem.

Curiosity is the wick in the candle of learning – William Arthur Ward
Curiosity is the desire to know or learn something in the absence of extrinsic rewards; the point being, there is no reward other than the answer itself. It is a psychological trait and, because of that, has a genetic component: some people are just born more curious. However, nurture has an equally important role to play, and although it’s argued you can’t teach curiosity, you can encourage people to become more curious by using different techniques (see below).

Sophie von Stumm, a professor of Psychology in Education at the University of York, believes that curiosity is so important to academic performance that it should sit alongside intelligence and effort (conscientiousness) as a third pillar. Her research found that intelligence, effort and curiosity are key attributes of exceptional students.

Curiosity follows an inverted U-shape when shown in graphical form. Imagine a graph with knowledge along the horizontal axis and curiosity on the vertical. When we first come across a new subject we know very little, and our curiosity rises, as does the level of dopamine; but as we find out more and more, our curiosity reaches a peak before ultimately falling.
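
If you like ideas made concrete, here is a minimal sketch of that inverted U in Python. The quadratic formula and the 0-to-1 scales are my own illustrative assumptions, not something taken from the curiosity research itself.

```python
# Illustrative only: a toy inverted-U curve relating knowledge to curiosity.
# The quadratic shape and the 0-1 scales are assumptions made for demonstration.

def curiosity(knowledge: float) -> float:
    """Toy model: curiosity is low at no/full knowledge and peaks in the middle."""
    return 4 * knowledge * (1 - knowledge)

for k in [0.0, 0.25, 0.5, 0.75, 1.0]:
    print(f"knowledge={k:.2f} -> curiosity={curiosity(k):.2f}")
```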

“When you’re curious you find lots of interesting things to do.” Walt Disney

Curiosity types – it would be far too simplistic to think that there is only one type of curiosity. Mario Livio, an astrophysicist, talks about a few of them in his book Why? What Makes Us Curious.

  • Epistemic curiosity is the one we have been talking about so far and relates to the type of curiosity that drives research and education. It’s generally a pleasurable state, the result of a release of dopamine that comes from mastery and the anticipation of reward.
  • Perceptual curiosity is primal and exists on a continuum between fear and satisfaction. It’s the curiosity we feel when something surprises us or when we get an answer that doesn’t quite fit with what we expected. The motivation is to seek out something novel, although the curiosity will diminish with continued exposure.
  • Diversive curiosity is transient and superficial and is often experienced when swiping through your Twitter feed. It’s effectively a means of jumping from topic to topic and normally fails to result in any form of meaningful insight or understanding.

You might think that as we grow older we become less curious, simply because we know more. However, although we may lose some elements of diversive curiosity, or the ability to be surprised, research shows that epistemic curiosity remains roughly constant across all age groups.

But why?
The roots of curiosity can be traced back to a form of neoteny, an evolutionary condition that means that although we reach maturity, we retain juvenile characteristics. Effectively we are more childlike than other mammals, continuing to be curious and playful throughout our lives. You can often tell if people are curious by looking at their eyes, which will become more dilated. This indicates that noradrenaline, a neurotransmitter, has been released in the brainstem’s locus coeruleus, the part of the brain most strongly linked to arousal, vigilance and attention. In addition, noradrenaline is integral to a number of higher cognitive functions ranging from motivation to working memory, and is therefore hugely valuable for learning.

This may well be a slightly complicated way of saying that if you are curious about something, you are more likely to pay attention, making it easier to remember and in so doing learn.

How to become more curious

“Millions saw the apple fall, but Newton asked why.” Bernard Baruch

Research into curiosity has confirmed some of what we might already have assumed to be correct. For example, a paper published in 2009 concluded that people were more likely to recall answers to questions they were especially curious about. However, it also showed that curiosity increased when answers were guessed incorrectly, suggesting that surprise was a factor in improved retention.

“I know you won’t believe me, but the highest form of human excellence is to question oneself and others.” Socrates

The concept that curiosity is based on an information gap was first put forward by George Loewenstein in 1994, which leads to one of the most powerful tools we can use to improve curiosity: asking questions. The best question to ask is probably WHY, but don’t forget Kipling’s other five honest serving men: WHAT, WHEN, HOW, WHERE and WHO. Below are a few more ideas.

  • Ask Socratic questions. This involves asking open-ended questions that provoke a meaningful exploration of the subject; this process sits at the heart of critical thinking.
  • Create environments that promote curiosity. Challenges that need solving require a curious mind. Case studies are also of more interest, providing several different routes to explore.
  • Guess the answer first. As mentioned above, guessing first increases the surprise factor. Loewenstein also argued that guessing with feedback stimulates curiosity because it highlights the gap between what you thought you knew and the correct answer.
  • Failure is feedback. Finding out why you got something wrong can be just as interesting as knowing that you are right; it certainly increases curiosity.
  • Start with the curious part of a subject. You may not be curious about the whole subject, but try to find the part you are interested in and start there.

And if you would like to find out more…

What’s the answer, what did kill the cat?

It was IGNORANCE…

Learning is emotional

We are all emotional, it’s part of what it means to be human; your emotions help you navigate uncertainty and experience the world. For some it’s even considered an intelligence, requiring the ability to understand and manage your own emotions, as well as those of others.

For many years, emotions were considered something that “got in the way” of learning, effectively disrupting its efficiency, but it is now believed that emotion has a substantial influence on cognitive processes, including perception, attention, memory, reasoning and problem solving.

Emotions, feelings and mood

In last month’s blog I outlined how sensory input impacts memory, and the story continues, because memories are a key part of emotion and both are found in something called the limbic system, a group of interconnected structures located deep within the brain. The limbic system plays an important part in controlling emotional responses (hypothalamus), coordinating those responses (amygdala) and laying down memories (hippocampus).

There is no single definition of emotion that everyone agrees upon. What we do know is that it relies upon the release of chemicals in response to a trigger, which in turn leads to three distinct phases. Firstly, a subjective experience, perhaps a feeling of anger, although not everyone will necessarily respond in the same way to the same stimulus. Secondly, a physiological response, for example raised blood pressure or increased heart rate. And lastly, a behavioural or expressive response: a furrowing of the brow, a showing of teeth and so on.

Although emotions are not believed to be hard-wired, in the 1970s Paul Ekman identified six emotions that were universally experienced in all human cultures: happiness, sadness, disgust, fear, surprise and anger. This list has since been expanded to include others, for example shame, embarrassment and excitement.

Feelings, on the other hand, arise from emotions; they are a conscious interpretation of the stimulus, asking questions as to what it might mean. Some refer to feelings as the human response to emotions. And finally moods, which are more general and longer term: an emotion might exist for a fraction of a second, but moods can last for hours, even days, and are sometimes a symptom of more worrying mental health issues. In addition, moods are not necessarily linked to a single event but shaped by different events over time.

Impact on learning

Understanding what this means for students and educators is complex and in a short blog it’s only possible to introduce the subject. But there are a few lessons we can learn.

  • Emotions direct attention – if students can make an emotional connection with what they are learning it will improve levels of concentration and enjoyment.
  • Consider the emotional environment – the emotional context in which information is delivered can help students experience more positive emotions such as happiness and one of the most powerful emotions in learning, curiosity.
  • Avoid negative emotions – students who are in a continual state of anxiety or fearing failure whilst learning will find concentrating and retaining information difficult. This is partly the result of the brain going into its fight or flight mode which effectively narrows its focus to the task in hand.
  • Emotional state is contagious – the emotional state of the teacher can have a significant impact on students.
  • Memory and emotions are bound together – emotions have a considerable influence on memory. This is why we remember emotionally charged events, such as September 11 or the London Bridge attack in 2017, more vividly.

And if you would like to find out more: How do emotions impact learning.

Dedication – in a lifetime we will all experience many emotions some good, some bad, but none are as powerful or more gratefully received than a mother’s love, for my mom.

Fairness and mutant algorithms

Back in 2014, I wrote two blogs (part 1 & part 2) about examinations and asked if they were fit for purpose. The conclusion: they provide students with a clear objective to work towards, the process is scalable, and the resulting qualification is a transferable measure of competency. They are of course far from perfect: exams do not always test what is most needed or valued, and when results are presented in league tables they give too simplistic a measure of success.

However, I didn’t ask if examinations were fair, that is, whether they treat students equally without favouritism or discrimination.

In the last two weeks the question of fairness has been in the headlines following the government’s decision to cancel all A level and GCSE examinations in order to reduce the risk of spreading Covid-19. Whilst many agreed with this, it did cause a problem: how could we fairly assess student performance without an examination?

Are examinations fair?

This is not a question about the fairness of an exam as a type of assessment; there are, for example, other ways of testing ability, such as coursework and observation. It is asking whether the system of which an examination is part treats all students equally, without bias.

In the world of assessment, exams are not considered sufficiently well designed unless they are both valid and reliable. It might be interesting to use these two criteria as a framework for considering the fairness of the exam system.

  • Validity – the extent to which an exam measures what it was designed to measure, e.g. asking students to add 2+2 to assess mathematical ability.
  • Reliability – the extent to which it consistently and accurately measures learning. The test needs to give the same results when repeated, e.g. adding 2+2 is just as reliable as adding 2+3: the better students will get both right and the weaker students both wrong.
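
To make reliability a little more concrete, here is a minimal sketch of one way you might check it: do two sittings of the same test rank students consistently? The marks below are invented, and awarding bodies use far more sophisticated statistics than a single correlation.

```python
# Toy illustration of reliability as consistency between two sittings of a test.
# The scores are invented; real test-retest analysis is far more involved.
from statistics import correlation  # requires Python 3.10+

sitting_1 = [55, 62, 70, 48, 81, 90]  # hypothetical marks, first attempt
sitting_2 = [58, 60, 73, 45, 79, 92]  # the same students, second attempt

r = correlation(sitting_1, sitting_2)
print(f"test-retest correlation: {r:.2f}")  # close to 1.0 suggests consistent ranking
```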

The examining bodies will be very familiar with these requirements and have controls in place to ensure the questions they set are both valid and reliable. But even with sophisticated statistical controls, writing questions and producing an exam of the same standard over time is incredibly difficult. Every year the same questions are asked: have students performed better, or is it just grade inflation? Were A levels in 1951 easier or harder than today? It’s the reliability of the process that is most questionable.

If we step away from the design of the exam to consider the broader process, there are more problems. Because there are several awarding bodies, AQA, OCR and Edexcel to name but three, students are by definition sitting different examinations. And although this is recognised and partly dealt with by adjusting grade boundaries, it’s not possible to completely eliminate bias. It would be much better to have one single body setting the same exam for all students.

There is also the question of comparability between subjects: is, for example, A level Maths the same standard as A level General Studies? Research conducted by Durham University in 2006 concluded that a pupil would be likely to get a grade two grades higher in “softer” subjects than in harder ones. They added that “from a moral perspective, it is clear this is unfair”. The implication is that students could miss out on university because they have chosen a harder subject.

In summary, exams are not fair; there is bias, and we haven’t even mentioned the impact of the school you go to or the increased chances of success the private sector can offer. However, many of these issues have been known for some time, and a considerable amount of effort goes into trying to resolve them. Examinations also have one other big advantage: they are accepted and, to a certain extent, the trusted norm, and as long as you don’t look too closely, they work, or at least appear to. Kylie might be right, “it’s better the devil you know”… than the devil you don’t.

The mutant algorithm

Boris Johnson is well known for his descriptive language, this time suggesting that the A level problem was the result of a mutant algorithm. But it was left to Gavin Williamson, the Secretary of State for Education, to announce that the government’s planned method of allocating grades would need to change.

“We now believe it is better to offer young people and parents certainty by moving to teacher assessed grades for both A and AS level and GCSE results.”

The government has come in for a lot of criticism and even their most ardent supporters can’t claim that this was handled well.

But was it ever going to be possible to replace an exam with something that everyone would think fair?

Clarification on grading

To help answer this question we should start with an understanding of the different methods of assessing performance.

  1. Predicted Grades (PG) – predicted by the school based on what they believe the individual is likely to achieve in positive circumstances. They are used by universities and colleges as part of the admissions process. There is no detailed official guidance as to how these should be calculated, and in general they are overestimated: research from UCL showed that the vast majority of grades, some 75%, were over-predicted.
  2. Centre Assessed Grades (CAG) – the grades which schools and colleges believed students were most likely to have achieved had the exams gone ahead. They were the original data source for Ofqual’s algorithm, based on a range of evidence including mock exams, non-exam assessment, homework assignments and any other record of student performance over the course of study. In addition, a rank order of all students within each grade for every subject was produced in order to provide a relative measure. These are now also being referred to as Teacher Assessed Grades (TAG).
  3. Calculated Grades (CG) – an important difference is that these are referred to as “calculated” rather than predicted! These are the grades awarded based on Ofqual’s algorithm, which takes the CAGs but adjusts them to bring them more into line with prior-year performance at the same school. It is this that creates one of the main problems with the algorithm…

it effectively locks the performance of an individual student this year into the performance of students from the same school over the previous three years.

Ofqual claimed that if this standardisation had not taken place, we would have seen the percentage of A* grades at A level go up from 7.7% in 2019 to 13.9% this year. The overall impact was that the algorithm downgraded 39% of the A level grades predicted by teachers using their CAGs. Click here to read more about how the grading works.
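
To see why anchoring this year’s students to their school’s history feels so unfair, here is a deliberately over-simplified toy sketch. It is emphatically not Ofqual’s actual algorithm, and all the names and grades are invented; it simply hands this year’s cohort last year’s grade distribution in teacher rank order, which is the essence of the complaint.

```python
# A toy "standardisation" sketch, NOT Ofqual's actual algorithm. It forces this
# year's cohort to reproduce the school's historical grade distribution, using
# the teachers' rank order. All names and grades below are invented.

def standardise(ranked_students, historical_grades):
    """Assign last year's grades, best first, to students in teacher rank order."""
    return dict(zip(ranked_students, historical_grades))

ranked = ["Asha", "Ben", "Cara", "Dev"]  # teacher's rank order, strongest first
history = ["A", "B", "C", "C"]           # what this school's cohort got last year
print(standardise(ranked, history))
# {'Asha': 'A', 'Ben': 'B', 'Cara': 'C', 'Dev': 'C'}
# Even if every student this year deserved an A, the toy model hands out last
# year's distribution, which is exactly the fairness complaint.
```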

Following the outcry from students and teachers, Gavin Williamson announced on the 17th of August that the Calculated Grades would no longer be used; instead, the Centre Assessed Grades would form the basis for assessing student performance. But was this any fairer? Well, maybe a little, but it almost certainly resulted in some students getting higher grades than they should whilst others received lower ones, and that’s not fair.

Better the devil you know

The Government could certainly have improved the way these changes were communicated and, having developed a method of allocating grades, stress tested their proposal against different scenarios. Changing their mind so quickly at the first sign of criticism suggests they had not done this. It has also left the public and students with the belief that algorithms don’t work, or at the very least should not be trusted.

Perhaps the easiest thing to have done would have been to get all the students to sit the exam in September or October. The universities would then have started in January; effectively everything would move by three months, and no one would have complained about that, would they?

Food for thoughts – the impact of food on learning

According to the latest government statistics, obesity is on the rise. There is also a link to Covid deaths, with nearly 8% of critically ill patients in intensive care being obese, compared with 2.9% of the general population. The WHO has stated that being overweight or obese is the fifth leading risk factor for global deaths, with at least 2.8 million adults dying each year as a result.

Eating too much is clearly not good for your health, but how about what you eat? How might that impact your health, in particular your brain?

Viva las Vagus

Have you ever used your gut instinct, had butterflies in your stomach or, when feeling nervous, had to rush to the toilet? If so, then you already have some evidence of the connection and importance of your gut to the way you think and feel. The vagus nerve is the longest cranial nerve, running from the brain stem to part of the colon, in effect making the connection. The biggest influence on the level of activity of the vagus nerve is the trillions of microbes that reside in the gut. The vagus nerve is able to sense this microbial activity and effectively transfer the gut’s information to the nervous system and ultimately the brain. Watch this 2-minute video that shows how this works.

Scientists refer to the relationship between the gut and the brain as the “gut-brain axis”. The brain sends chemical signals to the gut through the bloodstream; one such example is the feeling of being full or hungry. But, and this is the interesting part, the stomach talks back: gut bacteria send messages in the same way the brain communicates, using neurotransmission. Prior blog – The learning brain.

Exactly what the messages say depends on what you eat: a gut filled with fruit and vegetables will have different microbes to one that has just consumed a Big Mac. This is a very new area and most of the research has been conducted on rats, but there is already some evidence to suggest that junk food impairs memory.

Hopefully this gives you some idea of the strong connection that exists between your stomach and your brain. We can now move on and consider what specific types of food can help when learning.

These TED talks are well worth watching if you want to find out more – Your Gut Microbiome: The most important organ you’ve never heard of (11m), and Mind-altering microbes: How the microbiome affects brain and behaviour (6m).

What to eat when studying

The first thing to say is that I am far from an expert on nutrition, so the focus here is more on the impact food has on mood, concentration, cognition and memory. Secondly, to give this some context, it might be worth thinking about what you eat in the same way an athlete does. They pay close attention to their diet to make sure their body is in the best possible condition to compete, because if not, they are reducing their chances of success. However, a good diet is no substitute for the hard work they have to put in at the gym; you have to do both. Short video on how nutrition is key to sports performance.

Brain foods

  1. Apples, berries and citrus – The British Journal of Nutrition published research in 2010 (The impact of fruit flavonoids on memory and cognition) indicating that consuming certain fruits rich in flavonoids, such as berries, apples and citrus, can help improve memory and cognition.
  2. Dark chocolate – Research published in the Frontiers in Nutrition (Enhancing Human Cognition with Cocoa Flavonoids) found that dark chocolate which also contains flavonoids improved memory in both the short and long term. But remember many types of chocolate are high in sugar, fats, and calories so it’s not all good news.
  3. Rosemary – Northumbria University’s Department of Psychology found that herbs such as rosemary and lavender impacted memory, with the scent of rosemary enhancing memory but lavender impairing it. Maybe Shakespeare knew what he was talking about when he said ‘rosemary is for remembrance’.
  4. Oily fish and walnuts (omega-3) – There is a much-publicised connection between omega-3 and improvements in learning and memory. However, many of these claims are exaggerated to promote a particular type of food or brand, with most containing doses too small to make any difference. There is some evidence, published in the medical journal of the American Academy of Neurology, that people who ate more seafood, which naturally contains omega-3, had reduced rates of decline in semantic memory, but there is little evidence to show that supplements work at all. The best advice is to eat fish and nuts as part of a balanced diet, but don’t expect your exam results to improve by that much.
  5. Fruit and vegetables – A study conducted by Pennsylvania State University in April 2012 found an association between consuming fruit and vegetables and being in a positive mood.
  6. Water – Despite being the least exciting of them all, water remains one of the best ways to improve brain function. Research published in the American Journal of Clinical Nutrition studied 101 participants to see if low water consumption impacted cognition. Those who had reduced amounts of water experienced poorer memory, lower energy levels and feelings of anxiety, while those drinking water experienced the opposite.

The evidence on specific foods and their impact on cognition and learning is complex and nuanced. However, research into the connection between the stomach and the brain, although still in its early stages, has great potential to lead us to a better understanding of what we should eat to improve our mental wellbeing.

In the meantime, the best advice is to think about how your diet impacts you personally. Identify when you feel best studying: is it before a meal or after? Pay attention to snacking and, of course, drink lots of water and eat your greens, all as part of a balanced diet.

Lessons from lies – Fake news

There is little doubt that we live in an age with access to more information than any other. All you have to do is log onto your PC and type into Google whatever you want to know, and within 0.28 seconds you will get 3.44 million results; it really is science fiction. But having lots of information isn’t the same as having reliable information: how do you know that what you’re reading is true?

Fake news and false information

Fake news is certainly not new. In 1835 a New York newspaper reported that a telescope “of vast dimensions” could see what was happening on the moon. It caused a sensation, and the paper’s circulation increased from 8,000 to more than 19,000. The only problem: it was a complete fiction, fake news concocted by the editor, Richard Adams Locke. It may not be new, but fake news is certainly faster moving and far more prolific, fuelled by the internet, the growth of social media, globalisation and a lack of regulation.

But before we go any further, let’s take a step back and clarify what we mean by fake news. Firstly, there are completely false stories created to deliberately misinform; think here of the moon story, although even that contained some facts. There was an astronomer called Sir John Herschel who did indeed have a telescope “of vast dimensions” in his South African observatory, but he did not witness men with bat wings, unicorns and bipedal beavers on the moon’s surface. Secondly, there are stories that may have some truth to them but are not completely accurate, a much more sophisticated and convincing version of the above, and probably harder to detect.

We will leave aside the motives for creating fake news, but they range from politics to pranks and, as in the case of Richard Adams Locke, commercial gain.

Here are a few headlines:

5G weakens the immune system, making us more vulnerable to catching the virus
If you can hold your breath for 10 seconds, then you don’t have the virus
Fuel pump handles pose a particularly high risk of spreading the Corona-19 infection
And, more controversially, Health Secretary Matt Hancock stating that testing figures had hit 122,347 on April 30

The first three are fake; the fourth is based on facts. Click here to make up your own mind as to its truth.

But why do we believe these stories?

Quick to judge – A study from the University of Toulouse Capitole found that when participants were asked to make a quick judgment about whether a news story was real or fake, they were more likely to get it wrong. This is somewhat worrying given the short attention spans and patterns of behaviour displayed by those surfing the net.

We think more like lawyers than scientists – This is commonly called confirmation bias: our tendency to favour information that confirms our existing beliefs. Lawyers examine evidence with a preconceived objective, to prove their client’s innocence, whereas scientists remain open minded, in theory at least. An interesting aspect of this is that well-educated people may be more susceptible, because they have the ability to harness far more information to support their opinion. This is a bias of belief, not of knowledge.

Illusory truth effect – This is the tendency to believe false information after repeated exposure, first identified in a 1977 study at Villanova University and Temple University. It would be wrong to ignore the man who many believe (wrongly, including himself) invented the term fake news: Donald Trump. He is a master of repetition; for example, Trump used the expression “Chinese virus” more than 20 times between March 16 and March 30, according to the website Factbase.

Gullibility, the failure to ask questions – We are prone to believe stories that “look right”; psychologists refer to this as “processing fluency”. Experiments have found that “fluent information” tends to be regarded as more trustworthy and as such more likely to be true. Images are especially powerful: for example, researchers found that people were more likely to believe that macadamia nuts are from the same family as peaches if there was a picture of a nut next to the text.

The same photo but from a different angle

Google it! But do so with care

Most educators will encourage students to become independent learners: be curious, ask questions, solve your own problems. It is one of the most powerful educational lessons and, as Nelson Mandela said, education can be used to change the world. But we need to be careful that what is learned is not just a bunch of facts loosely gathered to prove one person’s point of view. Mandela’s vision of changing the world through education was based on education being broad and complex, not narrow.

We are of course very fortunate to have such a vast amount of information from which to learn, but that curiosity needs to be tempered with a critical mindset. The questions asked should be thoughtfully constructed with knowledge of one’s own personal biases, and the information analysed against the backdrop of its source and the possible motives of its authors.

Guidelines for students using Google

1. Develop a critical mindset – this is the ability to think logically: figuring out the connections, being active rather than passive, challenging what you read against what you already know and, perhaps most importantly, challenging your own ideas in the context of the new information. Are you simply finding information to support your own views, an example of confirmation bias?

2. Check the source and get confirmation – for websites, always look at the URL for the identity of the organisation (see the sketch after this list) and check the date of the story; lots of fake news is old news rehashed to support the argument currently being made. Who is the authority quoted? Why not cut that from the story and paste it into Google to find out who else is using that information and in what context. Look out for spelling mistakes and generalisations, e.g. “most people agree”; such terms are vague and give the impression that this is a majority view.

3. Evaluate the evidence and don’t take images at face value – use your critical thinking skills to validate the evidence. Who is the authority quoted, and do they have any reasons or motives for making these claims? Images, as already mentioned, are very powerful, but fake images are easy to create on the internet and a clever camera angle can easily mislead.

4. Does it make sense? – an extension of logical thinking but perhaps more emotional: how do you feel about this, what’s your gut instinct? The unconscious part of your brain can help make complex decisions, sometimes more accurately than logical thought.
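
As promised under point 2, here is a minimal sketch of how you might inspect a link before deciding how much to trust it. The URL and the “official” endings are made-up examples; the point is simply to look at who is publishing, not at the headline.

```python
# Minimal sketch for "check the source": pull the domain out of a URL so you
# can ask who is actually publishing the story. The URL is a made-up example.
from urllib.parse import urlparse

url = "https://news.example-site.co.uk/2020/05/miracle-cure"
domain = urlparse(url).netloc
print(domain)  # news.example-site.co.uk

# Crude heuristic (an assumption, not a rule): official or academic domains
# may deserve more initial trust than an unknown site.
print(domain.endswith((".gov.uk", ".ac.uk")))  # False: an unknown publisher
```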

With such large amounts of free knowledge available, there are calls for schools to do more to equip children to navigate the internet. In fact, back in 2017 the House of Lords published a report, ‘Growing up with the internet’, which recommended that “digital literacy should be the fourth pillar of a child’s education alongside reading, writing and mathematics”.

It’s not just school children that need this fourth pillar, we probably all do.

And of course the picture at the start of this blog is Fake!

The Covid gap year – a catalyst for change

At times it might seem difficult to find the positives in the current Covid crisis, but there are some. We may have had to change our travel plans, but we are benefiting from cleaner air and more time; staying closer to home is leading to a greater sense of community; and social media, which was becoming ever more toxic, has been used by many to keep in touch with friends and family. But how long will we continue to enjoy these healthy by-products once we can jump on that aeroplane, tweet something without thinking, and time once again becomes scarce, consumed by work? The downside is it can so easily revert back to how it was before.

However, some changes are likely to become permanent. People are beginning to call what comes after Covid the new norm, a kind of normality, familiar and yet different. We have all been given a glimpse of the future, or to be precise, the future has been brought forward, not as a blurry image but with startling clarity, because we are living it.

Change is easy
On the whole it’s difficult to get people to change their behaviour, but if you change the environment it’s a different story. If we had asked people whether they wanted to work from home, they would have had to guess what it would be like, imagining not having to travel, imagining not seeing colleagues in the workplace; but if you are forced into doing it, you experience it for real. And that’s what’s happened: people may not have chosen to work from home, but having experienced it, the change will be faster.

Neurologically, a habit, or learning for that matter, takes hold when you do something repeatedly. In 1949 Donald Hebb, a Canadian neuroscientist, noted that once a circuit of neurons is formed, when one neuron fires so do the others, effectively strengthening the whole circuit. This has become known as Hebbian theory or Hebb’s law, and it leads to long-term potentiation (LTP).

“Neurons that fire together wire together.”
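
For the technically minded, here is a minimal sketch of Hebb’s rule in action. The learning rate and firing values are arbitrary illustrations, and real models add decay or normalisation to stop the weights growing forever.

```python
# Toy Hebbian update: "neurons that fire together wire together".
# All numbers are arbitrary; this is an illustration, not a brain model.

weight = 0.1         # strength of the connection between neuron A and neuron B
learning_rate = 0.5

for repetition in range(1, 6):
    a_fires, b_fires = 1, 1                      # both neurons active together
    weight += learning_rate * a_fires * b_fires  # Hebb's rule: dw = eta * a * b
    print(f"repetition {repetition}: weight = {weight:.2f}")
# Each repetition deepens the "groove": the association gets stronger and
# therefore harder to unlearn.
```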

Habits are patterns that can be thought of as grooves created over time by repetition; once formed they are hard to get out of, and the deeper the groove, the less we think about it at a conscious level. But if you change the environment you force the brain to reconsider those habits, effectively moving you out of that particular groove until you form another one. The secret, of course, is to create good habits and remove bad ones.

Many are suggesting that working from home will become far more common; Google and Facebook have already announced that they do not need their employees to go back into offices until at least the end of 2020, but who knows what that groove will be like by then. The other big changes on the horizon with potential for long-term impact are the reduction in the use of cash as opposed to contactless, online shopping, already popular, driving a more drastic reshaping of the high street, and studying online becoming a new way of learning. Education has seen one of its biggest changes, arguably since we gained access to the internet, with 1.3 billion students from 186 countries across the world now having to learn remotely. Even before COVID-19, global EdTech investment was $18.7 billion, and the overall market for online education is projected to reach $350 billion by 2025 (source: WEF).

This is what school in China looks like during coronavirus.

Changing attitudes to study
Given the choice, 1.3 billion students would not all have agreed to study online, but Covid-19 has made this a reality within a matter of months. It’s an environmental change on a massive scale. The argument that online learning is better remains complex and confusing, requiring a clearer understanding of what is being measured and a much longer time period over which it can be evaluated. There are, for example, claims that retention rates are higher by somewhere between 25% and 60%, but I would remain sceptical despite their appeal and apparent common-sense logic.

Instead, focus on your own learning: think less about how much more difficult it is to concentrate staring at a computer screen rather than being in a classroom, and embrace the process. You are in a new “groove” and as a result it’s not going to feel comfortable.

Covid Gap year
Why not make 2020 your Covid gap year? UCAS says that one of the benefits of a gap year is that it “offers you the opportunity to gain skills and experiences, while giving you time to reflect and focus on what you want to do next”. It’s the changing environment in terms of geography and people, doing things that you might not have chosen, that makes the gap year so worthwhile. And despite what people say when they return, it wasn’t enjoyable all of the time; you do get bored and frustrated, but it can open your mind to new possibilities, and ironically lockdown can do the same.

Online learning is a new environment: view it through the lens of new skills and experiences, and only when you reflect back should you decide how valuable it might have been.

Brain overload

Have you ever felt that you just can’t learn any more, your head is spinning, your brain must be full? And yet we are told that the brain’s capacity is potentially limitless, made up of around 86 billion neurons.

To understand why both of these may be true, we have to delve a little deeper into how the brain learns, or to be precise, how it manages information. In a previous blog, The learning brain, I outlined the key parts of the brain and discussed some of the implications for learning. As you might imagine this is a complex subject, but I should add a fascinating one.

Cognitive load and schemas

Building on the work of George (magic number 7) Miller and Jean Piaget’s development of schemas, in 1988 John Sweller introduced us to cognitive load, the idea that there is a limit to the amount of information we can process.

Cognitive load relates to the amount of information that working memory can hold at one time

Human memory can be divided into working memory and long-term memory. Working memory, also called short-term memory, is limited, only capable of holding 7 plus or minus 2 pieces of information at any one time, hence the magic number 7, but long-term memory has arguably infinite capacity.

The limited nature of working memory can be highlighted by asking you to look at the 12 letters below. Take about 5 seconds, then look away from the screen and write down what you can remember on a blank piece of paper.

MBIAWTDHPIBF

Because there are more than 9 characters, this will be difficult.

Schemas – Information is stored in long-term memory in the form of schemas, frameworks or concepts that help organise and interpret new information. For example, when you think of a tree it is defined by a number of characteristics: it’s green, has a trunk and leaves at the end of branches; this is a schema. But when it comes to autumn, the tree is no longer green and loses its leaves, suggesting that this cannot be a tree. However, if you assimilate the new information with your existing schema and accommodate it in a revised version of how you think about a tree, you have effectively learned something new and stored it in long-term memory. By holding information in schemas, when new information arrives your brain can very quickly identify whether it fits within an existing one, enabling rapid knowledge acquisition and understanding.
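
As a loose analogy, and entirely my own rather than Sweller’s or Piaget’s, you could picture a schema as a structure that new observations are folded into rather than rejected:

```python
# Loose analogy only: a schema as a dictionary that new information revises.
tree_schema = {"has_trunk": True, "has_leaves": True, "colour": "green"}

autumn_observation = {"colour": "brown", "has_leaves": False}

# Assimilation/accommodation: fold the new observation into the existing
# schema rather than concluding "this cannot be a tree".
tree_schema.update(autumn_observation)
tree_schema["colour_varies_by_season"] = True  # the revised, richer schema
print(tree_schema)
```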

The problem therefore lies with working memory and its limited capacity, but if we can change the way we take in information so that it doesn’t overload working memory, the whole process becomes more effective.

Avoiding cognitive overload

This is where it gets really interesting from a learning perspective. What can we do to avoid the brain becoming overloaded?

1. Simple first – this may sound like common sense: start with a simple example, e.g. 2 + 2 = 4, and move towards the more complex, e.g. 2,423 + 12,324,345. If you start with a complex calculation the brain will struggle to manipulate the numbers or find any pattern.

2. Direct instruction, not discovery – although there is significant merit in figuring things out for yourself, when learning something new it is better to follow guided instruction (teacher led) supported by several examples, starting simple and becoming more complex (as above). Once you have created your own schema, you can begin to work independently.

3. Visual overload – a presentation point: avoid having too much information on a page or slide, and reveal each part slowly. The secret is to break complexity down into smaller segments. This is the argument for not having too much content all on one page, which is often the case in textbooks. Read with a piece of paper or ruler, effectively underlining the words you are reading and moving the paper down to reveal one new line at a time.

4. Pictures and words (contiguity) – having “relevant” pictures alongside text helps avoid what’s called split attention. This is why creating your own notes with images as well as text, as when producing a mind map, works so well.

5. Focus, avoid distraction (coherence) – similar to visual overload: remove all unnecessary images and information, and keep focused on the task in hand. There may be some nice-to-know facts, but stick to the essential ones.

6. Key words (redundancy) – when reading or making notes, don’t highlight or write down exactly what you read; simplify the sentence, focusing on the key words, which will reduce the amount of input.

7. Use existing schemas – if you already have an understanding of a topic or subject, it will sit within a schema; think about how the new information changes your original understanding.

Remember the 12 characters from earlier? If we chunk them into 4 pieces of information and link each chunk to an existing schema, you will find them much easier to remember. Here are the same 12 characters chunked down.

FBI – TWA – PHD – IBM

Each one sits within an existing schema, e.g. Federal Bureau of Investigation, making it easier for the brain to learn the new information.
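
Here is a small sketch of that chunking idea using the re-ordered letters; the helper function is hypothetical, written purely for illustration.

```python
# Demonstrates why chunking helps: 12 separate letters exceed working memory,
# but 4 familiar acronyms (existing schemas) are easy to hold.

def chunk(letters: str, size: int) -> list[str]:
    """Split a string into fixed-size chunks."""
    return [letters[i:i + size] for i in range(0, len(letters), size)]

raw = "FBITWAPHDIBM"
print(len(raw), "separate letters:", list(raw))
print(len(chunk(raw, 3)), "chunks:", chunk(raw, 3))  # ['FBI', 'TWA', 'PHD', 'IBM']
```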

Note – the above ideas are based on Richard E. Mayer’s principles of multimedia learning.

In conclusion

Understanding more about how the brain works, in particular how to manage some of its limitations, as is the case with short-term memory, not only makes learning more efficient but also gives you confidence that the way you are learning is the most effective.