Fairness and mutant algorithms

Back in 2014 I wrote two blogs (part 1 & part 2) about examinations and asked if they were fit for purpose. The conclusion – they provide students with a clear objective to work towards, the process is scalable and the resulting qualification is a transferable measure of competency. They are of course far from perfect: exams do not always test what is most needed or valued, and when results are presented in league tables they give an overly simplistic measure of success.

However, I didn’t ask if examinations were fair, that is, whether they treat students equally, without favouritism or discrimination.

In the last two weeks the question of fairness has been in the headlines following the government’s decision to cancel all A level and GCSE examinations in order to reduce the risk of spreading Covid-19. Whilst many agreed with this decision, it did cause a problem: how could we fairly assess student performance without an examination?

Are examinations fair?

This is not a question about the fairness of an exam as a type of assessment; there are, after all, other ways of testing ability, such as coursework and observations. It’s asking whether the system of which an examination is part treats all students equally, without bias.

In the world of assessment, exams are not considered sufficiently well designed unless they are both reliable and valid. It might be interesting to use this as a framework to consider the fairness of the exam system.

  • Validity – the extent to which it measures what it was designed to measure e.g. add 2+2 to assess mathematical ability.
  • Reliability – the extent to which it consistently and accurately measures learning. The test needs to give the same results when repeated, e.g. adding 2+2 is just as reliable as adding 2+3. The better students will get them both right and the weaker students both wrong.

The examining bodies will be very familiar with these requirements and have controls in place to ensure the questions they set are both valid and reliable. But even with sophisticated statistical controls, writing questions and producing an exam of the same standard over time is incredibly difficult. Every year the same questions are asked: have students performed better or is it just grade inflation? Were A levels in 1951 easier or harder than they are today? It’s the reliability of the process that is most questionable.

If we step away from the design of the exam to consider the broader process, there are more problems. Because there are several awarding bodies, AQA, OCR, Edexcel to name but three, students are by definition sitting different examinations. And although this is recognised and partly dealt with by adjusting the grade boundaries, it’s not possible to completely eliminate bias. It would be much better to have one single body setting the same exam for all students.

There is also the question of comparability between subjects: is, for example, A level Maths equivalent to A level General Studies? Research conducted by Durham University in 2006 concluded that a pupil would be likely to get a pass two grades higher in “softer” subjects than in harder ones. They added that “from a moral perspective, it is clear this is unfair”. The implication is that students could miss out on university because they have chosen a harder subject.

In summary, exams are not fair, there is bias, and we haven’t even mentioned the impact of the school you go to or the increased chances of success the private sector can offer. However, many of these issues have been known for some time and a considerable amount of effort goes into trying to resolve them. Examinations also have one other big advantage: they are accepted and, to a certain extent, the trusted norm, and as long as you don’t look too closely they work, or at least appear to. Kylie might be right, “it’s better the devil you know”….. than the devil you don’t.

The mutant algorithm

Boris Johnson is well known for his descriptive language, this time suggesting that the A level problem was the result of a “mutant algorithm”. But it was left to Gavin Williamson, the Secretary of State for Education, to make the announcement that the government’s planned method of allocating grades would need to change.

“We now believe it is better to offer young people and parents certainty by moving to teacher assessed grades for both A and AS level and GCSE results”

The government has come in for a lot of criticism and even their most ardent supporters can’t claim that this was handled well.

But was it ever going to be possible to replace an exam with something that everyone would think fair?

Clarification on grading

To help answer this question we should start with an understanding of the different methods of assessing performance.

  1. Predicted Grades (PG) – predicted by the school based on what they believe the individual is likely to achieve in positive circumstances. They are used by universities and colleges as part of the admissions process. There is no detailed official guidance as to how these should be calculated, and in general they are overestimated. Research from UCL showed that the vast majority of grades, some 75%, were over-predicted.
  2. Centre Assessed Grades (CAG) – These are the grades which schools and colleges believed students were most likely to achieve had the exams gone ahead. They were the original data source for Ofqual’s algorithm and were based on a range of evidence including mock exams, non-exam assessment, homework assignments and any other record of student performance over the course of study. In addition, a rank order of all students within each grade for every subject was produced in order to provide a relative measure. These are now also being referred to as Teacher Assessed Grades (TAG).
  3. Calculated Grades (CG) – an important difference is that these are referred to as “calculated” rather than predicted! These are the grades awarded based on Ofqual’s algorithm, which uses the CAGs but adjusts them to ensure they are more in line with prior year performance from that school. It is this that creates one of the main problems with the algorithm…

it effectively locks the performance of an individual student this year into the performance of students from the same school over the previous three years.

Ofqual claimed that if this standardisation had not taken place, we would have seen the percentage of A* grades at A level go up from 7.7% in 2019 to 13.9% this year. The overall impact was that the algorithm downgraded 39% of the A level grades predicted by teachers using their CAGs. Click here to read more about how the grading works.
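To make that locking effect concrete, here is a minimal sketch of the standardisation idea, my own illustration rather than Ofqual’s actual model: take the school’s grade distribution from previous years and impose it on this year’s cohort, using the teacher-supplied rank order to decide who gets which grade. The student names and proportions are entirely hypothetical.

```python
# A toy version of the standardisation idea (an illustration, not Ofqual's
# actual model): the school's historical grade distribution is imposed on
# this year's cohort, and the teacher's rank order decides who gets what.

def calculated_grades(ranked_students, historical_distribution):
    """ranked_students: best student first (the teacher-supplied rank order).
    historical_distribution: the school's grade proportions over prior years,
    e.g. {"A": 0.2, "B": 0.4, "C": 0.4} (hypothetical numbers)."""
    n = len(ranked_students)
    grades, assigned = {}, 0
    for grade, proportion in historical_distribution.items():
        quota = round(proportion * n)          # seats available at this grade
        for student in ranked_students[assigned:assigned + quota]:
            grades[student] = grade
        assigned += quota
    for student in ranked_students[assigned:]:  # rounding remainder
        grades[student] = list(historical_distribution)[-1]
    return grades

cohort = ["Amy", "Ben", "Cas", "Dev", "Eli"]    # hypothetical rank order
history = {"A": 0.2, "B": 0.4, "C": 0.4}        # hypothetical prior-year shares
print(calculated_grades(cohort, history))
# {'Amy': 'A', 'Ben': 'B', 'Cas': 'B', 'Dev': 'C', 'Eli': 'C'}
```

However able this year’s cohort actually is, the distribution cannot move; the rank order only changes who lands where within it.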

Following the outcry by students and teachers, Gavin Williamson announced on the 17th of August that the Calculated Grades would no longer be used; instead the Centre Assessed Grades would form the basis for assessing student performance. But was this any fairer? Well, maybe a little, but it almost certainly resulted in some students getting higher grades than they should whilst others received lower, and that’s not fair.

Better the devil you know

The Government could certainly have improved the way these changes were communicated and, having developed a method of allocating grades, should have stress tested their proposal against different scenarios. Changing their mind so quickly at the first sign of criticism suggests they had not done this. It has also left the public and students with a belief that algorithms don’t work or at the very least should not be trusted.

Perhaps the easiest thing to have done would have been to get all the students to sit the exam in September or October. The universities would then have started in January, effectively everything would move by three months, and no one would have complained about that, would they?

Food for thoughts – the impact of food on learning

According to the latest government statistics obesity is on the rise. There is also a link to Covid deaths, with nearly 8% of critically ill patients in intensive care being obese, compared with 2.9% of the general population. The WHO has stated that being overweight or obese is the fifth leading risk for global deaths, with at least 2.8 million adults dying each year.

Eating too much is clearly not good for your health, but how about what you eat? How might that impact your health, in particular your brain?

Viva las Vagus

Have you ever used your gut instinct, had butterflies in your stomach or, when feeling nervous, had to rush to the toilet? If so then you already have some evidence of the connection and importance of your gut to the way you think and feel. The vagus nerve is the longest cranial nerve and runs from the brain stem to part of the colon, in effect making the connection. The biggest influence on the level of activity of the vagus nerve is the trillions of microbes that reside in the gut. The vagus nerve is able to sense this microbial activity and effectively transfer the gut’s information to the nervous system and ultimately the brain. Watch this 2-minute video that shows how this works.

Scientists refer to the relationship between the gut and the brain as the “gut brain axis”. The brain sends chemical signals to the gut through the bloodstream; one such example is the feeling of being full or hungry. But, and this is the interesting part, the stomach talks back: gut bacteria send messages in the same way the brain communicates, using neurotransmission. Prior blog – The learning brain.

Exactly what the messages say depends on what you eat; a gut filled with fruit and vegetables will have different microbes to one that has just consumed a Big Mac. This is a very new area and most of the research has been conducted on rats, but there is already some evidence to suggest that junk food impairs memory.

Hopefully this gives you some idea of the strong connection that exists between your stomach and your brain. We can now move on and consider what specific types of food can help when learning.

These Ted talks are well worth watching if you want to find out more – Your Gut Microbiome: The most important organ you’ve never heard of (11m), and Mind-altering microbes: How the microbiome affects brain and behaviour (6m).

What to eat when studying

The first thing to say is that I am far from an expert on nutrition, so the focus here is on the impact food has on mood, concentration, cognition and memory. Secondly, to give this some context, it might be worth thinking about what you eat in the same way an athlete does. They pay close attention to their diet to make sure their body is in the best possible condition to compete, because if not they are reducing their chances of success. However, a good diet is no substitute for the hard work they have to put in at the gym; you have to do both. Short video on how nutrition is key to sports performance.

Brain foods

  1. Apples, berries and citrus – The British Journal of Nutrition published research in 2010 (The impact of fruit flavonoids on memory and cognition) indicating that consuming certain fruits such as berries, apples and citrus, which are rich in flavonoids, can help improve memory and cognition.
  2. Dark chocolate – Research published in the Frontiers in Nutrition (Enhancing Human Cognition with Cocoa Flavonoids) found that dark chocolate which also contains flavonoids improved memory in both the short and long term. But remember many types of chocolate are high in sugar, fats, and calories so it’s not all good news.
  3. Rosemary – Northumbria University’s Department of Psychology found that herbs such as rosemary and lavender impacted memory, with the scent of rosemary enhancing memory but lavender impairing it. Maybe Shakespeare knew what he was talking about when he said ‘rosemary is for remembrance’.
  4. Oily fish and walnuts (omega 3) – There is a much-publicised connection between omega 3 and improvements in learning and memory. However, many of these claims are exaggerated to promote a particular type of food or brand, with most containing doses too small to make any real difference. There is some evidence, published in the medical journal of the American Academy of Neurology, that people who ate more seafood, which naturally contains omega 3, had reduced rates of decline in semantic memory. But there is little evidence to show that supplements work at all. The best advice is to eat fish and nuts as part of a balanced diet but don’t expect your exam results to improve by that much.
  5. Fruit and vegetables – A study conducted by Pennsylvania State University in April 2012 found an association between consuming fruit and vegetables and being in a positive mood.
  6. Water – Despite being the least exciting of them all, water remains one of the best ways in which you can improve brain function. Research published in the American Journal of Clinical Nutrition studied 101 participants to see if low water consumption impacted cognition. The result: those who had reduced amounts of water experienced poorer memory, reduced energy levels and feelings of anxiety, while those drinking water experienced the opposite.

The evidence on specific foods and their impact on cognition and learning is complex and nuanced. However, research into the connection between the stomach and the brain, although still in its early stages, has great potential to lead us to a better understanding of what we should eat to improve our mental wellbeing.

In the meantime, the best advice is to think about how your diet impacts you personally: identify when you feel best studying, is it before a meal or after, pay attention to snacking and of course drink lots of water and eat your greens, all as part of a balanced diet.

Lessons from lies – Fake news

There is little doubt that we live in an age with access to more information than any other. All you have to do is log onto your PC and type into Google whatever you want to know, and within 0.28 seconds you will get 3.44 million results; it really is science fiction. But having lots of information isn’t the same as having reliable information. How do you know that what you’re reading is true?

Fake news and false information

Fake news is certainly not new; in 1835 it was reported in a New York newspaper that a telescope “of vast dimensions” could see what was happening on the moon. It caused a sensation and the paper’s circulation increased from 8,000 to more than 19,000. The only problem: it was a complete fiction, fake news concocted by the editor, Richard Adams Locke. It may not be new, but fake news is certainly faster moving and far more prolific, fuelled by the internet, the growth of social media, globalisation and a lack of regulation.

But before we go any further, let’s take a step back and clarify what we mean by fake news. Firstly, there are completely false stories created to deliberately misinform; think here of the moon story, although even that contained some facts. There was an astronomer called Sir John Herschel who did indeed have a telescope “of vast dimensions” in his South African observatory, but he did not witness men with bat wings, unicorns, and bipedal beavers on the moon’s surface. Secondly, there are stories that may have some truth to them but are not completely accurate, a much more sophisticated and convincing version of the above and probably harder to detect.

We will leave aside the motives for creating fake news, but they range from politics to pranks and, as in the case of Richard Adams Locke, commercial gain.

Here are a few headlines:

5G weakens the immune system, making us more vulnerable to catching the virus
If you can hold your breath for 10 seconds, then you don’t have the virus
Fuel pump handles pose a particularly high risk of spreading the Corona-19 infection
And, more controversially, Health Secretary Matt Hancock stating that testing figures had hit 122,347 on April 30

The first three are fake; the fourth is based on facts. Click here to make up your own mind as to its truth.

But why do we believe these stories?

Quick to judge – A study from the University of Toulouse Capitole found that when participants were asked to make a quick judgment about whether a news story was real or fake, they were more likely to get it wrong. This is somewhat worrying given the short attention span and patterns of behaviour displayed by those surfing the net.

We think more like lawyers than scientists – Commonly called confirmation bias, this is our tendency to favour information that confirms our existing beliefs. Lawyers examine evidence with a preconceived objective, to prove their client’s innocence, whereas scientists remain open minded, in theory at least. An interesting aspect of this is that well educated people may be more susceptible because they have the ability to harness far more information to support their opinion. This is a bias of belief, not of knowledge.

Illusory truth effect – This is the tendency to believe false information after repeated exposure, first identified in a 1977 study at Villanova University and Temple University. It would be wrong to ignore the man who many, including himself, wrongly believe invented the term fake news: Donald Trump. He is a master of repetition; for example, Trump used the expression “Chinese virus” more than 20 times between March 16 and March 30, according to the website Factbase.

Gullibility, the failure to ask questions – We are prone to believe stories that “look right”; psychologists refer to this as “processing fluency”. Experiments have found that “fluent information” tends to be regarded as more trustworthy and as such more likely to be true. Images are especially powerful; for example, researchers have found that people believed that macadamia nuts were from the same family as peaches if there was a picture of a nut next to the text.

The same photo but from a different angle

Google it! but do so with care

Most educators will encourage students to become independent learners, to be curious and ask questions, and to solve their own problems; it is one of the most powerful educational lessons, and as Nelson Mandela said, education can be used to change the world. But we need to be careful that what is learned is not just a bunch of facts loosely gathered to prove one person’s point of view. Mandela’s vision of changing the world through education was based on the education being broad and complex, not narrow.

We are of course very fortunate to have such a vast amount of information from which to learn, but that curiosity needs to be tempered with a critical mindset. The questions asked should be thoughtfully constructed with knowledge of one’s own personal bias, and the information analysed against the backdrop of its source and the possible motives of the authors.

Guidelines for students using Google

1. Develop a Critical Mindset – this is the ability to think logically, figuring out the connections, being active rather than passive, challenging what you read against what you already know and, perhaps most importantly, challenging your own ideas in the context of the new information. Are you simply finding information to support your own views, an example of confirmation bias?

2. Check the Source and get confirmation – for websites, always look at the URL for the identity of the organisation and the date of the story. Lots of fake news is news rehashed from the past to support the argument currently being made. What is the authority quoted? Why not cut that from the story and paste it into Google to find out who else is using that information and in what context. Look for spelling mistakes and generalisations, e.g. “most people agree”. These terms are vague and give the impression that this is a majority view.

3. Evaluate the evidence and don’t take images at face value – use your critical thinking skills to validate the evidence. Who is the authority quoted, do they have any reasons or motives for making these claims? Images as already mentioned are very powerful, but fake images are easy to create on the internet and a clever camera angle can easily mislead.

4. Does it make sense? – an extension of logical thinking but perhaps more emotional: how do you feel about this, what’s your gut instinct? The unconscious part of your brain can help make complex decisions, sometimes more accurately than logical thought.

With large amounts of free knowledge, there are calls for schools to be doing more to better equip children to navigate the internet. In fact, back in 2017 the House of Lords published a report ‘Growing up with the internet’ which recommended that “Digital literacy should be the fourth pillar of a child’s education alongside reading, writing and mathematics”.

It’s not just school children that need this fourth pillar, we probably all do.

And of course the picture at the start of this blog is Fake!

The Covid gap year – a catalyst for change

At times it might seem difficult to find the positives in the current Covid crisis, but there are some. We may have had to change our travel plans but are benefiting from cleaner air and more time; staying closer to home is leading to a greater sense of community; and social media, which was becoming ever more toxic, has been used by many to keep in touch with friends and family. But how long will we continue to enjoy these healthy by-products when we can jump on that aeroplane, tweet something without thinking, and once again time becomes scarce, consumed by work? The downside is it can so easily revert to how it was before.

However, some changes are likely to become permanent, people are beginning to call what comes after Covid the new norm, a kind of normality, familiar and yet different. We have all been given a glimpse of the future or to be precise the future has been brought forward not as a blurry image but with startling clarity because we are living it.

Change is easy
On the whole it’s difficult to get people to change their behaviour, but if you change the environment it’s a different story. If we had asked people whether they wanted to work from home, they would have had to guess what it would be like, imagining not having to travel, imagining not seeing colleagues in the workplace; but if you are forced into doing it, you experience it for real. And that’s what’s happened: people may not have chosen to work from home, but having experienced it the change will be faster.

Neurologically, a habit, or learning for that matter, takes place when you do something repeatedly. In 1949 Donald Hebb, a Canadian neuroscientist, noted that once a circuit of neurons is formed, when one neuron fires so do the others, effectively strengthening the whole circuit. This has become known as Hebbian theory or Hebb’s law and leads to long-term potentiation (LTP).

“Neurons that fire together wire together.”
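As a rough illustration of the idea, here is a minimal sketch of a Hebbian weight update, a toy model rather than real neuroscience: each time two connected neurons are active together, the connection between them strengthens. The learning rate and firing probability are hypothetical.

```python
import random

# Toy Hebbian update: when two connected neurons are active together,
# the weight (connection strength) between them increases.
eta = 0.1      # learning rate (hypothetical)
weight = 0.0   # strength of the connection between neurons x and y

for _ in range(100):                           # repetition, e.g. practising a habit
    x = 1.0                                    # pre-synaptic neuron fires
    y = 1.0 if random.random() < 0.9 else 0.0  # post-synaptic neuron usually follows
    weight += eta * x * y                      # fire together -> wire together

print(round(weight, 2))  # roughly 9.0 after 100 repetitions: a deepening groove
```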

Habits are patterns that can be thought of as grooves created over time by repetition. Once formed they are hard to get out of: the deeper the groove, the less we think about it at a conscious level. But if you change the environment you force the brain to reconsider those habits, effectively moving you out of that particular groove until you form another one. The secret of course is to create good habits and remove bad ones.

Many are suggesting that working from home will become far more common; Google and Facebook have already announced that they do not need their employees to go back into offices until at least the end of 2020, but who knows what that groove will be like by then. The other big changes on the horizon with potential for long term impact are the reduction in the use of cash as opposed to contactless payments, online shopping, already popular, driving a more drastic reshaping of the high street, and studying online becoming a new way of learning. Education has arguably seen its biggest change since we first had access to the internet, with 1.3 billion students from 186 countries across the world now having to learn remotely. Even before Covid-19, global EdTech investment was $18.7 billion, and the overall market for online education is projected to reach $350 billion by 2025 (source: WEF).

This is what school in China looks like during coronavirus.

Changing attitudes to study
Given the choice, 1.3 billion students would not have all agreed to study online, but Covid-19 has made this a reality within a matter of months. It’s an environmental change on a massive scale. The argument that online learning is better remains complex and confusing, requiring a clearer understanding of what is being measured and a much longer time period over which it can be evaluated. There are, for example, claims that retention rates are higher by somewhere between 25% and 60%, but I would remain sceptical despite their appeal and apparent common-sense logic.

Instead focus on your own learning: think less about how much more difficult it is to concentrate staring at a computer screen rather than sitting in a classroom, and embrace the process. You are in a new “groove” and as a result it’s not going to feel comfortable.

Covid Gap year
Why not make 2020 your Covid gap year? UCAS says that one of the benefits of a gap year is that it “offers you the opportunity to gain skills and experiences, while giving you time to reflect and focus on what you want to do next”. It’s the changing environment in terms of geography and people, doing things that you might not have chosen, that makes the gap year so worthwhile. And despite what people say when they return, it wasn’t enjoyable all of the time; you do get bored and frustrated, but it can open your mind to new possibilities, and ironically lockdown can do the same.

Online learning is a new environment; view it through the lens of new skills and experiences, and only when you reflect back should you decide how valuable it might have been.

Brain overload

Have you ever felt that you just can’t learn any more, your head is spinning, your brain must be full? And yet we are told that the brain’s capacity is potentially limitless, made up of around 86 billion neurons.

To understand why both of these may be true, we have to delve a little more into how the brain learns, or to be precise how it manages information. In a previous blog I outlined the key parts of the brain and discussed some of the implications for learning – the learning brain. As you might imagine this is a complex subject, but I should add a fascinating one.

Cognitive load and schemas

Building on the work of George (magic number 7) Miller and Jean Piaget’s development of schemas, in 1988 John Sweller introduced us to cognitive load, the idea that we have a limit to the amount of information we can process.

Cognitive load relates to the amount of information that working memory can hold at one time

Human memory can be divided into working memory and long-term memory. Working memory, also called short-term memory, is limited, only capable of holding 7 plus or minus 2 pieces of information at any one time, hence the magic number 7, but long-term memory has arguably infinite capacity.

The limited nature of working memory can be highlighted by asking you to look at the 12 letters below. Take about 5 seconds. Look away from the screen and write down what you can remember on a blank piece of paper.

MBIAWTDHPIBF

Because there are more than 9 characters, this will be difficult.

Schemas – Information is stored in long-term memory in the form of schemas, frameworks or concepts that help organise and interpret new information. For example, when you think of a tree it is defined by a number of characteristics: it’s green, has a trunk and leaves at the end of branches; this is a schema. But when it comes to autumn, the tree is no longer green and loses its leaves, suggesting that this cannot be a tree. However, if you assimilate the new information with your existing schema and accommodate it in a revised version of how you think about a tree, you have effectively learned something new and stored it in long-term memory. By holding information in schemas, when new information arrives your brain can very quickly identify if it fits within an existing one, and in so doing enable rapid knowledge acquisition and understanding.

The problem therefore lies with working memory and its limited capacity, but if we could change the way we take in information, such that it doesn’t overload working memory, the whole process would become more effective.

Avoiding cognitive overload

This is where it gets really interesting from a learning perspective. What can we do to avoid the brain becoming overloaded?

1. Simple first – this may sound like common sense, start with a simple example e.g. 2+2 = 4 and move towards the more complex e.g. 2,423 + 12,324,345. If you start with a complex calculation the brain will struggle to manipulate the numbers or find any pattern.

2. Direct Instruction not discovery – although there is significant merit in figuring things out for yourself, when learning something new it is better to follow guided instruction (teacher led) supported by several examples, starting simple and becoming more complex (as above). When you have created your own schema, you can begin to work independently.

3. Visual overload – a presentation point, avoid having too much information on a page or slide, reveal each part slowly. The secret is to break down complexity into smaller segments. This is the argument for not having too much content all on one page, which is often the case in textbooks. Read with a piece of paper or ruler effectively underlining the words you are reading, moving the paper down revealing a new line at a time.

4. Pictures and words (contiguity) – having “relevant” pictures alongside text helps avoid what’s called split attention. This is why creating your own notes with images as well as text when producing a mind map works so well.

5. Focus, avoid distraction (coherence) – similar to visual overload, remove all unnecessary images and information, keep focused on the task in hand. There may be some nice to know facts, but stick to the essential ones.

6. Key words (redundancy) – when reading or making notes don’t highlight or write down exactly what you read, simplify the sentence, focusing on the key words which will reduce the amount of input.

7. Use existing schemas – if you already have an understanding of a topic or subject, it will sit within a schema; think how the new information changes your original understanding.

Remember the 12 characters from earlier? If we chunk them into 4 pieces of information and link them to existing schemas, you will find them much easier to remember. Here are the same 12 characters chunked down.

FBI – TWA – PHD – IBM

Each one sits within an existing schema, e.g. Federal Bureau of Investigation etc., making it easier for the brain to learn the new information.
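For the programmatically minded, here is a minimal sketch of the chunking idea, purely illustrative: twelve letters become four three-letter groups, each of which maps onto something already stored. The set of “known schemas” is of course hypothetical.

```python
# Chunking: 12 individual letters become 4 meaningful units, each of which
# maps onto an existing schema (here, a known acronym).
KNOWN_SCHEMAS = {"FBI", "TWA", "PHD", "IBM"}  # stand-in for long-term memory

def chunk(letters, size=3):
    """Split a string of letters into fixed-size chunks."""
    return [letters[i:i + size] for i in range(0, len(letters), size)]

chunks = chunk("FBITWAPHDIBM")
print(chunks)                                    # ['FBI', 'TWA', 'PHD', 'IBM']
print(len(chunks), "items to hold, not 12")      # well within 7 plus or minus 2
print(all(c in KNOWN_SCHEMAS for c in chunks))   # True: each chunk fits a schema
```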

Note – the above ideas are based on Richard E. Mayer’s principles of multimedia learning.

In conclusion

Understanding more about how the brain works, in particular how to manage some of its limitations, as is the case with short-term memory, not only makes learning more efficient but also gives you confidence that the way you are learning is the most effective.

Double entry bookkeeping replaced by internet

There is an interesting question being asked at the moment, given that fact-based knowledge is so accessible using the internet, is there a case for not teaching facts at all?

According to Don Tapscott, a consultant and speaker who specialises in organisations and technology, memorising facts and figures is a waste of time because such information is readily available. It would be far better to teach students to think creatively so that they can learn to interpret and apply the knowledge they discover online.

“Teachers are no longer the fountain of knowledge, the internet is”
Don Tapscott

Is this the solution for educators with an overfull curriculum, the result of having to continually add new content to ensure their qualification remains relevant and topical? Perhaps they can remove facts and focus on skills development? After all, it’s skills that matter; knowing is useful but it’s the ability to apply that really matters …right?

What makes you an accountant

When you start to learn about finance, you will be taught a number of underpinning foundational subjects including law, economics, costing and of course basic accounting. Sat stubbornly within the accounting section will be double entry bookkeeping. This axiom is fiercely protected by the finance community, such that if anyone questions its value or challenges its relevance they will be met with pure contempt. And yet, is the knowledge of how you move numbers around following a hugely simple rule, i.e. put a number on one side and an equivalent on the other, of any use in a world where most accounting is performed by computers and sophisticated algorithms? I am sure there will be similar examples from other professions and industries. The challenge being: do doctors really need to understand basic anatomy, or lawyers read cases dating back to 1892?
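To show just how simple that rule is, here is a minimal sketch of a double entry ledger, an illustration only, with hypothetical accounts and amounts: every transaction debits one account and credits another, so the books always balance.

```python
from collections import defaultdict

# A toy ledger: positive balances are debits, negative balances are credits.
ledger = defaultdict(float)

def post(debit_account, credit_account, amount):
    """The double entry rule: a number on one side, its equivalent on the other."""
    ledger[debit_account] += amount
    ledger[credit_account] -= amount

post("Cash", "Sales", 500.0)       # sell goods for cash
post("Equipment", "Cash", 200.0)   # buy equipment with cash

# The defining invariant: debits and credits always net to zero.
assert abs(sum(ledger.values())) < 1e-9
print(dict(ledger))  # {'Cash': 300.0, 'Sales': -500.0, 'Equipment': 200.0}
```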

“Everyone is entitled to his own opinion, but not to his own facts”
Daniel Patrick Moynihan

But Knowledge is power

Daniel T. Willingham is a psychologist at the University of Virginia and the author of a number of books including Why Students Don’t Like School. His early research was on the brain, learning and memory, but more recently he has focused on the application of cognitive psychology in K-16 education.

Willingham argues that knowledge is not only cumulative, it grows exponentially. In addition, factual knowledge enhances cognitive processes like problem solving and reasoning. See How Knowledge Helps.

Knowledge is cumulative – the more you know, the more you can learn. Individual chunks of knowledge will stick to new knowledge, because what you already know provides context and so aids comprehension. For example, knowing the definition of a bond, ‘a fixed income instrument that represents a loan made by an investor to a borrower’ (prior knowledge), enables the student to grasp the idea that anything fixed has to be paid by the company (the borrower) regardless of its profitability, and this is the reason debt is considered risky (new knowledge).

Knowledge helps you remember – the elaboration effect has featured in a previous blog. In essence it suggests that the brain finds it easier to remember something if it can be associated with existing information. Using the same example from above, it is easier to remember that bonds are risky if you already knew what a bond was.

Knowledge improves thinking – there are two reasons for this. Firstly, it helps with problem solving. Imagine you have a problem to solve; if you don’t have sufficient background knowledge, understanding the problem can consume most of your working memory, leaving no space for you to consider solutions. This argument is based on the understanding that we have limited capacity in working memory (magic number 7), and so occupying it with grasping the problem at best slows down the problem-solving process, but at worst might result in walking away with no solution. Secondly, knowledge helps speed up problem solving and thinking. People with prior knowledge are better at drawing analogies as they gain experience in a domain. Research by Bruce Burns in 2004 compared the performance of top chess players at normal and blitz tournaments. He found that what was making some players better than others was differences in the speed of recognition, not faster processing skills. Players who had knowledge of prior games were far quicker in coming up with moves than those who were effectively solving the problem from first principles. Chess speed at least has a lot to do with the brain recognising pre-learned patterns.

Skills are domain specific – not transferable

There is one other important lesson from an understanding of knowledge: skills are domain specific. The implication is that teaching “transferable skills”, e.g. skills that can be used in different areas such as communication and critical thinking, doesn’t work. A skill (Merriam-Webster) is the ability to use one’s knowledge effectively and readily in execution or performance. The argument being that in order to use knowledge effectively, it needs to be in a specific domain.
In July 2016 the Education Endowment Foundation in the UK released the results of a two-year study involving almost 100 schools that wanted to find out if playing chess would improve maths. The hypothesis was that the logical and systematic processes involved in being a good chess player would help students better understand maths, i.e. the skills would transfer. The study, however, found no significant differences in mathematical achievement between those having regular chess classes and the control group.

Long live double entry bookkeeping

This is an interesting topic, open to some degree of interpretation and debate, but it highlights the difficult path curriculum designers have to tread when it comes to removing the old to make space for the new. In addition, there is a strong argument to suggest that core principles and foundational knowledge are essential prerequisites for efficient learning.
But whatever happens, we need to keep double entry bookkeeping, not because knowing that every debit has a credit is important, but because it helps structure a way of thinking and problem solving that has enabled finance professionals to navigate significant complexity and change since Luca Pacioli allegedly invented it in 1494.

And the case from 1892 – Carlill v Carbolic Smoke Ball Company

Synergy – Direct Instruction part 2

Last month’s blog introduced the idea that Direct Instruction (DI), a highly structured form of teaching, is a very efficient way of delivering information. The challenge is that in a world where knowledge is largely free, “drilling” information using rigid methods does little to develop the skills most valued by employers.

Earlier this year, in an attempt to identify some of these higher-level skills (I am not a fan of the term soft skills), LinkedIn analysed hundreds of thousands of job advertisements. They produced a top 5, which is as follows: creativity, persuasion, collaboration, adaptability and time management. We might add to this the ability to think for yourself, which in some ways underpins them all.

“The modern world doesn’t reward you for what you know, but for what you can do with what you know.” – Andreas Schleicher

This month I want to expand on what DI is, but also add to the argument that DI (teacher led) and discovery based learning (student led) are not mutually exclusive; in fact, when used together they work better than either on its own.

Direct Instruction is learning led
The main reason that, despite its many critics, DI fails to go away is the significant amount of evidence that proves it works. And the reason it works is because it presents information in a brain friendly way.

Cognitive load is a very common instructional term and refers to the limitation of short term or working memory in holding sufficient information at any one time. As a result, it’s better not to bombard the brain with too much information, meaning it’s more effective for students to reduce distraction and be presented with content broken down into smaller chunks, sequenced and taught individually before being linked together at a later date. This is one of the most important aspects of DI. Avoiding distraction refers not only to external distractions, e.g. your mobile phone, but to information that is not required or is unnecessary in arriving at the desired learning outcome.

Retrieval and spaced practice are both used in direct instruction and have been mentioned in previous blogs. They are well researched and the evidence is compelling as to their effectiveness.

Using examples to teach is also something strongly promoted. It is argued that the brain has the ability to use examples to build connections, ironically without DI; e.g. if we are talking about pets and we said that a cat is an example of a pet, but we already knew a cat was also an animal, we could link the two. Next time the term cat is mentioned we would know it was both a pet and an animal.

Discovery based (Student led – Autonomous – Constructivism)
Many of the discovery-based learning techniques have their roots in the work of psychologists Jean Piaget, Jerome Bruner, and Seymour Papert. The core argument is that self-discovery and the process of acquiring information for yourself makes that information more readily available when it comes to problem solving. In addition, it encourages creativity, motivation, promotes autonomy, independent learning and is self-paced.

It is not, however, without instruction. Teachers should guide and motivate learners to look for solutions by combining existing and new information, help students avoid distraction, and simplify what to a student may appear complex. To expect the student to figure everything out for themselves would be incredibly inefficient and, although it might lead to a truly original idea, is most likely to result in a feeling of wasted time and solutions that we already know or that are wrong.

“Critical thinking processes such as reasoning and problem solving are intimately intertwined with factual knowledge that is stored in long-term memory.” – Daniel Willingham, Why Students Don’t Like School

2 + 2 = 5 = Synergy
DI and the many discovery-based learning methods can be used together, because together they are far more powerful and effective. Think of them in terms of a Venn diagram, with DI in one circle, discovery based learning in the other, and highly effective learning in the middle where the circles overlap. The mix is up to the teacher, which in turn depends on the time available, the nature of the subject, their judgment of the students and the desired outcome.

You cannot tell students how to think, but you can provide them with the building blocks, helping them learn along the way before giving them real world challenges with problems they will have to solve for themselves. Then it’s into the workplace, where the real learning experience will begin.

Learn faster with Direct Instruction – Siegfried Engelmann

What we need to learn is changing: knowledge is free, and if you want the answer you can just google it. According to the World Economic Forum’s Future of Jobs Survey, there is an ever-greater need for cognitive abilities such as creativity, logical reasoning and problem solving. And with advances in AI, machine learning and robotics, many of the skills previously valued will become redundant.

No need for the Sage on the stage
These demands have led to significant change in the way learning is happening. No longer should students be told what to think; they need to be encouraged to think for themselves. Socratic questioning, group work, experiential learning and problem based learning have all become popular, and Sir Ken Robinson’s TED lecture, Do Schools Kill Creativity?, has had 63 million views.

Sir Ken’s talk is funny and inspiring and I recommend you watch it, but I want to challenge the current direction of travel, or at least balance the debate, by promoting a type of teaching that has fallen out of fashion and yet ironically could form the foundation upon which creativity could be built – Direct Instruction.

Direct Instruction – the Sage is back
The term direct instruction was first used in 1968, when a young Zig Engelmann, a science research associate, proved that students could be taught more effectively if the teacher presented information in a prescriptive, structured and sequenced manner. This carefully planned and rigid process can help eliminate misinterpretation and misunderstanding, resulting in faster learning. But most importantly it has been proven to work, as evidenced by a 2018 publication which looked at over half a century of analysis and 328 past studies on the effectiveness of Direct Instruction.

Direct Instruction was also evaluated by Project Follow Through, the most extensive educational experiment ever conducted. The conclusion: it produced significantly higher academic achievement for students than any of the other programmes.

The steps in direct instruction

It will come as no surprise that a method of teaching that advocates structure and process can be presented as a series of steps.

Step 1 Set the stage for learning – The purpose of this first session is to engage the student, explaining specifically what they should be able to do and understand as a result of this lesson. Where possible a link to prior knowledge should also be made.
Step 2 Present the material – (I DO) The lesson should be organised, broken down into a step-by-step process, each one building on the other with examples to show exactly how it can be applied. This can be done by lecture, demonstration or both.
Step 3 Guided practice – (WE DO) This is where the tutor demonstrates and the student follows closely, copying in some instances. Asking questions is an important aspect for the student if something doesn’t make sense.
Step 4 Independent practice – (YOU DO) Once students have mastered the content or skill, it is time to provide reinforcement and practice.

The Sage and the Guide
The goal of Direct Instruction is to “do more in less time” which is made possible because the learning is accelerated by clarity and process.

There are of course critics, who consider it a type of rote learning that will stifle the creativity of both teacher and student and result in a workforce best suited to the industrial revolution rather than the fourth one. But for me it’s an important, effective and practical method of teaching that, when combined with inspirational delivery and a creative mindset, will help students develop the skills to solve the problems of tomorrow, or at least a few of them.

The independent learner – Metacognition

Metacognition is not a great word, but it’s an important one when it comes to learning, especially if you are studying at higher academic levels or on your own. Cognition refers to the range of mental processes that help you acquire knowledge and understanding or, more simply, learn. These processes include the storage, manipulation and retrieval of information. Meta, on the other hand, means higher than or overarching; put the two together and we are talking about something that sits above learning, connecting it by way of thought. For this reason it’s often described as thinking about thinking or, in this context, thinking about how you learn.

Smarter not harder

When you have a lot to learn in terms of subject matter, it may feel like a distraction to spend any time learning something other than what you must know, let alone reflecting on it. But this fits under the heading of working smarter, not harder: if you can find more effective ways of learning, that must be helpful.
As mentioned earlier, cognition is about mental processes: storage and retrieval relate to memory; manipulation to the shifting of attention, changing perception and so on. But the meta aspect creates distance, allowing us to become aware of what we are doing, standing back and observing how, for example, perception has changed. This reflection is a high-level skill that many believe is unique to humans. One final aspect is that we can take control of how we learn, planning tasks, changing strategies, monitoring those that work and evaluating the whole process.

Keeping it simple

It’s very easy to overcomplicate metacognition; in some ways it’s little more than asking a few simple questions, thinking about how you are learning, what works and what doesn’t. Here are some examples of how you might do this.

  • Talk to yourself, asking questions at each stage: does this make sense? I have read it several times, maybe I should try writing it down.
  • Ask, have I set myself sensible goals?
  • Maybe it’s time to try something different, for example mind mapping, but remember to reflect on how effective it was or perhaps was not.
  • Do I need help from anyone? This could be a fellow student, or try YouTube, which is a great way to find a different explanation in a different format.

Clearly these skills are helpful for all students, but they are especially valuable when studying on your own, perhaps on a distance learning programme or engaged in long periods of self-study.

Benefits

There are many reasons for investing some time in this area.

  • Growing self-confidence – by finding out more about how you learn you will discover both your strengths and weaknesses. Confidence isn’t about being good at everything but understanding your limitations.  
  • Improves performance – research has shown that students who actively engage in metacognition do better in exams.
  • Gives control – you are no longer reliant on the way something is taught; you have the ability to teach yourself. Being an autonomous learner is also hugely motivational.
  • The skills are transferable – this knowledge will not only help with your current subjects but all that follow, not to mention what you will need to learn in the workplace.  

It will take some time initially but, in a way, metacognition is part of learning; it’s an essential component, and as such you will end up knowing more about yourself at some point, even if you don’t want to, so why not do it sooner rather than later.

And just for fun – Sheldon knows everything about himself – even when he is wrong

Intelligence defined – Inspiring learning leaders – Howard Gardner

Intelligence is a term that is often used to define people, David is “clever” or “bright” maybe even “smart”, but it can also be a way in which you define yourself. The problem is that accepting this identity can have a very limiting effect on motivation; for example, if someone believes they are not very clever, how hard will they try? Effort would seem futile. And yet it is that very effort that can make all the difference. See brain plasticity.
I wrote about an inspiring learning leader back in April this year following the death of Tony Buzan, the creator of mind maps. I want to continue the theme with Howard Gardner (Professor of Cognition and Education at the Harvard Graduate School of Education), who I would guess many have never heard of, but who for me is an inspirational educator.

Multiple Intelligence Theory (MIT)
Now, in fairness, Howard Gardner is himself not especially inspiring, but his idea is. Gardner is famous for his theory that the traditional notion of intelligence, based on IQ, is far too limited. Instead, he argues that there are in fact eight different intelligences. He first presented the theory in 1983, in the book Frames of Mind – The Theory of Multiple Intelligences.

This might also be a good point to clarify exactly how Gardner defines intelligence.

Intelligence – ‘the capacity to solve problems or to fashion products that are valued in one or more cultural setting’ (Gardner & Hatch, 1989).

Multiple intelligences

  1. SPATIAL – The ability to conceptualise and manipulate large-scale spatial arrays e.g. airplane pilot, sailor
  2. BODILY-KINESTHETIC – The ability to use one’s whole body, or parts of the body to solve problems or create products e.g. dancer
  3. MUSICAL – Sensitivity to rhythm, pitch, meter, tone, melody and timbre. May entail the ability to sing, play musical instruments, and/or compose music e.g. musical conductor
  4. LINGUISTIC – Sensitivity to the meaning of words, the order among words, and the sound, rhythms, inflections, and meter of words e.g. poet
  5. LOGICAL-MATHEMATICAL – The capacity to conceptualise the logical relations among actions or symbols e.g. mathematicians, scientists
  6. INTERPERSONAL – The ability to interact effectively with others. Sensitivity to others’ moods, feelings, temperaments and motivations e.g. negotiator
  7. INTRAPERSONAL – Sensitivity to one’s own feelings, goals, and anxieties, and the capacity to plan and act in light of one’s own traits.
  8. NATURALISTIC – The ability to make consequential distinctions in the world of nature as, for example, between one plant and another, or one cloud formation and another e.g. taxonomist

I have taken the definitions of the intelligences direct from the MI Oasis website.

It’s an interesting exercise to identify which ones you might favour but be careful, these are not learning styles, they are simply cognitive or intellectual strengths. For example, if someone has higher levels of linguistic intelligence, it doesn’t necessarily mean they prefer to learn through lectures alone.

You might also want to take this a stage further by having a go at this simple test. Please note this is for your personal use, its main purpose is to increase your understanding of the intelligences.

Implications – motivation and self-esteem
Gardner used his theory to highlight the fact that schools largely focus their attention on linguistic and logical-mathematical intelligence and reward those who excel in these areas. The implication is that if you were more physically intelligent, the school would not consider you naturally gifted, not “clever” as they might if you were good at maths. The advice might be that you should consider a more manual job. I wonder how that works when someone with high levels of bodily-kinesthetic and spatial intelligence may well find themselves playing for Manchester United earning over £100,000 a week!

But for students this theory can really help build self-esteem and motivation when a subject or topic is proving hard to grasp. No longer do you have to say, “I don’t understand this, I am just not clever enough”. Change the words to “I don’t understand this yet; I find some of these mathematical questions challenging, after all, it’s not my strongest intelligence. I know I have to work harder in this area, but when we get to the written aspects of the subject it will become easier”.

This for me is what makes Gardner’s MIT so powerful: it’s not a question of how intelligent you are but which intelligence(s) you work best in.

“Discover your difference, the asynchrony with which you have been blessed or cursed and make the most of it.” Howard Gardner

As mentioned earlier, Howard Gardner is not the most inspirational figure, and here is an interview to prove it, but his theory can help you better understand yourself and others, and that might just change your perception of who you are and what you’re capable of – now that’s inspiring!

MI Oasis – The Official Authoritative Site of Multiple Intelligences