The Covid gap year – a catalyst for change

At times it might seem difficult to find the positives in the current Covid crisis, but there are some. We may have had to change our travel plans, but we are benefiting from cleaner air and more time; staying closer to home is leading to a greater sense of community; and social media, which was becoming ever more toxic, has been used by many to keep in touch with friends and family. But how long will we continue to enjoy these healthy by-products once we can jump on that aeroplane, tweet something without thinking, and time once again becomes scarce, consumed by work? The downside is that it can so easily revert to how it was before.

However, some changes are likely to become permanent. People are beginning to call what comes after Covid the new normal, a kind of normality, familiar and yet different. We have all been given a glimpse of the future, or to be precise the future has been brought forward, not as a blurry image but with startling clarity, because we are living it.

Change is easy
On the whole it’s difficult to get people to change their behaviour, but if you change the environment it’s a different story. If we had asked people whether they wanted to work from home, they would have had to guess what it would be like, imagining not having to travel, imagining not seeing colleagues in the workplace. But if you are forced into doing it, you experience it for real. And that’s what’s happened: people may not have chosen to work from home, but having experienced it, the change will be faster.

Neurologically, a habit, or learning for that matter, takes place when you do something repeatedly. In 1949 Donald Hebb, a Canadian neuroscientist, noted that once a circuit of neurons is formed, when one neuron fires so do the others, effectively strengthening the whole circuit. This has become known as Hebbian theory, or Hebb’s law, and leads to long-term potentiation (LTP).

“Neurons that fire together wire together.”
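Hebb’s observation is often written as a simple weight-update rule: the connection between two neurons strengthens each time they are active together. A minimal sketch in Python (the function name, learning rate and values are illustrative, not taken from Hebb’s work):

```python
# Minimal Hebbian learning sketch: the weight between two neurons
# grows whenever both are active at the same time.

def hebbian_update(weight, pre_active, post_active, learning_rate=0.1):
    """Strengthen the connection only when both neurons fire together."""
    if pre_active and post_active:
        weight += learning_rate  # fire together -> wire together
    return weight

w = 0.0
# Repeated co-activation deepens the connection,
# just as repetition deepens the "groove" of a habit.
for _ in range(10):
    w = hebbian_update(w, pre_active=True, post_active=True)

print(round(w, 1))  # after 10 co-activations the weight has grown to 1.0
```

The point of the sketch is the asymmetry: activity on one side alone changes nothing, so only repeated joint firing deepens the groove.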

Habits are patterns that can be thought of as grooves created over time by repetition, but once formed they are hard to get out of: the deeper the groove, the less we think about it at a conscious level. If you change the environment, you force the brain to reconsider those habits, effectively moving you out of that particular groove until you form another one. The secret, of course, is to create good habits and remove bad ones.

Many are suggesting that working from home will become far more common. Google and Facebook have already announced that they do not need their employees to go back into offices until at least the end of 2020, but who knows what that groove will be like by then. The other big changes on the horizon with potential for long-term impact are the reduction in the use of cash as opposed to contactless, online shopping, already popular, driving a more drastic reshaping of the high street, and studying online becoming a new way of learning. Education has arguably seen its biggest change since we gained access to the internet, with 1.3 billion students from 186 countries across the world now having to learn remotely. Even before COVID-19, global EdTech investment was $18.7 billion, and the overall market for online education is projected to reach $350 billion by 2025 (source: WEF).

This is what school in China looks like during coronavirus.

Changing attitudes to study
Given the choice, 1.3 billion students would not all have agreed to study online, but Covid-19 has made this a reality within a matter of months. It’s an environmental change on a massive scale. The argument that online learning is better remains complex and confusing, requiring a clearer understanding of what is being measured and a much longer time period over which it can be evaluated. There are, for example, claims that retention rates are higher by somewhere between 25% and 60%, but I would remain sceptical despite the appeal and apparent common-sense logic.

Instead, focus on your own learning. Think less about how much more difficult it is to concentrate staring at a computer screen rather than being in a classroom, and embrace the process. You are in a new “groove”, and as a result it’s not going to feel comfortable.

Covid Gap year
Why not make 2020 your Covid gap year? UCAS says that one of the benefits of a gap year is that it “offers you the opportunity to gain skills and experiences, while giving you time to reflect and focus on what you want to do next”. It’s the changing environment, in terms of geography, people, and doing things that you might not have chosen, that makes the gap year so worthwhile. And despite what people say when they return, it wasn’t enjoyable all of the time; you do get bored and frustrated. But it can open your mind to new possibilities, and ironically lockdown can do the same.

Online learning is a new environment. View it through the lens of new skills and experiences, and only when you reflect back should you decide how valuable it might have been.

Brain overload

Have you ever felt that you just can’t learn any more, your head is spinning, your brain must be full? And yet we are told that the brain’s capacity is potentially limitless, made up of around 86 billion neurons.

To understand why both of these may be true, we have to delve a little deeper into how the brain learns, or to be precise, how it manages information. In a previous blog I outlined the key parts of the brain and discussed some of the implications for learning – the learning brain. As you might imagine, this is a complex subject, but I should add a fascinating one.

Cognitive load and schemas

Building on the work of George (magic number 7) Miller and Jean Piaget’s development of schemas, in 1988 John Sweller introduced us to cognitive load, the idea that there is a limit to the amount of information we can process.

Cognitive load relates to the amount of information that working memory can hold at one time

Human memory can be divided into working memory and long-term memory. Working memory, also called short-term memory, is limited, only capable of holding seven plus or minus two pieces of information at any one time, hence the magic number 7, whereas long-term memory has arguably infinite capacity.

The limited nature of working memory can be highlighted by asking you to look at the 12 letters below. Take about 5 seconds, then look away from the screen and write down what you can remember on a blank piece of paper.

MBIAWTDHPIBF

Because there are more than 9 characters this will be difficult. 

Schemas – Information is stored in long-term memory in the form of schemas; these are frameworks or concepts that help organise and interpret new information. For example, when you think of a tree it is defined by a number of characteristics: it’s green, has a trunk, and leaves at the end of branches. This is a schema. But when it comes to autumn, the tree is no longer green and loses its leaves, suggesting that this cannot be a tree. However, if you assimilate the new information with your existing schema and accommodate it in a revised version of how you think about a tree, you have effectively learned something new and stored it in long-term memory. By holding information in schemas, when new information arrives the brain can very quickly identify whether it fits within an existing one, enabling rapid knowledge acquisition and understanding.

The problem therefore lies with working memory and its limited capacity, but if we could change the way we take in information so that it doesn’t overload working memory, the whole process would become more effective.

Avoiding cognitive overload

This is where it gets really interesting from a learning perspective. What can we do to avoid the brain becoming overloaded?

1. Simple first – this may sound like common sense: start with a simple example, e.g. 2 + 2 = 4, and move towards the more complex, e.g. 2,423 + 12,324,345. If you start with a complex calculation the brain will struggle to manipulate the numbers or find any pattern.

2. Direct Instruction not discovery – although there is significant merit in figuring things out for yourself, when learning something new it is better to follow guided instruction (teacher led) supported by several examples, starting simple and becoming more complex (as above). When you have created your own schema, you can begin to work independently.

3. Visual overload – a presentation point: avoid having too much information on a page or slide, and reveal each part slowly. The secret is to break complexity down into smaller segments. This is the argument for not having too much content all on one page, which is often the case in textbooks. Read with a piece of paper or ruler, effectively underlining the words you are reading, moving the paper down to reveal one new line at a time.

4. Pictures and words (contiguity) – having “relevant” pictures alongside text helps avoid what’s called split attention. This is why creating your own notes with images as well as text when producing a mind map works so well.

5. Focus, avoid distraction (coherence) – similar to visual overload, remove all unnecessary images and information, keep focused on the task in hand. There may be some nice to know facts, but stick to the essential ones.

6. Key words (redundancy) – when reading or making notes don’t highlight or write down exactly what you read, simplify the sentence, focusing on the key words which will reduce the amount of input.

7. Use existing schemas – if you already have an understanding of a topic or subject, it will sit within a schema; think about how the new information changes your original understanding.

Remember the 12 characters from earlier? If we chunk them into 4 pieces of information and link each to an existing schema, you will find them much easier to remember. Here are the same 12 characters chunked down.

FBI – TWA – PHD – IBM

Each one sits within an existing schema e.g. Federal Bureau of Investigation etc, making it easier for the brain to learn the new information.
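The arithmetic of chunking can be sketched in a few lines of Python (the `chunk` helper is a hypothetical illustration, not a standard function): the same 12 letters become 4 items once they are grouped into acronyms that map onto existing schemas.

```python
# Chunking sketch: 12 individual letters exceed working memory's
# 7 +/- 2 limit, but 4 familiar chunks fit comfortably.

def chunk(text, size=3):
    """Split a string into fixed-size chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

scrambled = "MBIAWTDHPIBF"   # 12 separate items for working memory
familiar = "FBITWAPHDIBM"    # same letters, reordered into known acronyms

print(len(scrambled))        # 12 items to hold at once
print(chunk(familiar))       # ['FBI', 'TWA', 'PHD', 'IBM'] - only 4 items
```

Note that no information has been thrown away: the two strings contain exactly the same letters. Only the packaging has changed, which is the whole point of chunking.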

Note – the above ideas are based on Richard E. Mayer’s principles of multimedia learning.

In conclusion

Understanding more about how the brain works, in particular how to manage some of its limitations, as is the case with short-term memory, not only makes learning more efficient but also gives you confidence that the way you are learning is effective.

Double entry bookkeeping replaced by internet

There is an interesting question being asked at the moment: given that fact-based knowledge is so accessible via the internet, is there a case for not teaching facts at all?

According to Don Tapscott, a consultant and speaker, who specialises in organisations and technology, memorising facts and figures is a waste of time because such information is readily available. It would be far better to teach students to think creatively so that they can learn to interpret and apply the knowledge they discover online.

“Teachers are no longer the fountain of knowledge, the internet is”
Don Tapscott

Is this the solution for educators with an overfull curriculum, the result of having to continually add new content to ensure their qualification remains relevant and topical? Perhaps they can remove facts and focus on skills development? After all, it’s skills that matter; knowing is useful, but it’s the ability to apply that really matters … right?

What makes you an accountant

When you start to learn about finance, you will be taught a number of underpinning foundational subjects including law, economics, costing and of course basic accounting. Sat stubbornly within the accounting section will be double entry bookkeeping. This axiom is fiercely protected by the finance community, such that anyone who questions its value or challenges its relevance will be met with pure contempt. And yet, is the knowledge of how you move numbers around, following a hugely simple rule, i.e. put a number on one side and an equivalent on the other, of any use in a world where most accounting is performed by computers and sophisticated algorithms? I am sure there are similar examples from other professions and industries. The challenge being: do doctors really need to understand basic anatomy, or lawyers read cases dating back to 1892?

“Everyone is entitled to his own opinion, but not to his own facts”
Daniel Patrick Moynihan

But Knowledge is power

Daniel T. Willingham is a psychologist at the University of Virginia and the author of a number of books, including Why Students Don’t Like School. His early research was on the brain, learning and memory, but more recently he has focused on the application of cognitive psychology to K-16 education.

Willingham argues that knowledge is not only cumulative, it grows exponentially. In addition, factual knowledge enhances cognitive processes like problem solving and reasoning (see his article How Knowledge Helps).

Knowledge is cumulative – the more you know, the more you can learn. Individual chunks of knowledge will stick to new knowledge, because what you already know provides context and so aids comprehension. For example, knowing the definition of a bond, ‘a fixed income instrument that represents a loan made by an investor to a borrower’ (prior knowledge), enables the student to grasp the idea that anything fixed has to be paid by the company (the borrower) regardless of its profitability, and this is the reason debt is considered risky (new knowledge).

Knowledge helps you remember – the elaboration effect has featured in a previous blog. In essence, it suggests that the brain finds it easier to remember something if it can be associated with existing information. Using the same example as above, it is easier to remember that bonds are risky if you already know what a bond is.

Knowledge improves thinking – there are two reasons for this. Firstly, it helps with problem solving. Imagine you have a problem to solve; if you don’t have sufficient background knowledge, understanding the problem can consume most of your working memory, leaving no space for you to consider solutions. This argument is based on the understanding that we have limited capacity in working memory (magic number 7), so occupying it with grasping the problem at best slows down the problem-solving process, and at worst might result in walking away with no solution. Secondly, knowledge speeds up problem solving and thinking. People with prior knowledge are better at drawing analogies as they gain experience in a domain. Research by Bruce Burns in 2004 compared the performance of top chess players at normal and blitz tournaments. He found that what was making some players better than others was differences in the speed of recognition, not faster processing skills. Players who had knowledge of prior games were far quicker in coming up with moves than those who were effectively solving the problem from first principles. Chess speed, at least, has a lot to do with the brain recognising pre-learned patterns.

Skills are domain specific – not transferable

There is one other important lesson from an understanding of knowledge: skills are domain specific. The implication is that teaching “transferable skills”, i.e. skills that can be used in different areas, such as communication and critical thinking, doesn’t work. A skill (Merriam-Webster) is the ability to use one’s knowledge effectively and readily in execution or performance. The argument is that in order to use knowledge effectively, it needs to be in a specific domain.
In July 2016 the Education Endowment Foundation in the UK released the results of a two-year study involving almost 100 schools that wanted to find out if playing chess would improve maths. The hypothesis was that the logical and systematic processes involved in being a good chess player would help students better understand maths, i.e. the skills would transfer. The conclusion, however, was that there were no significant differences in mathematical achievement between those having regular chess classes and the control group.

Long live double entry bookkeeping

This is an interesting topic, open to some degree of interpretation and debate, but it highlights the difficult path curriculum designers have to tread when it comes to removing the old to make space for the new. In addition, there is a strong argument that core principles and foundational knowledge are essential prerequisites for efficient learning.
But whatever happens, we need to keep double entry bookkeeping, not because knowing that every debit has a credit is important in itself, but because it helps structure a way of thinking and problem solving that has enabled finance professionals to navigate significant complexity and change since Luca Pacioli allegedly invented it in 1494.
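The rule being defended here is, at its core, tiny: every transaction posts a debit and an equal credit, so the books always balance. A minimal sketch in Python (the Ledger class and account names are illustrative, with the simplifying sign convention that debits are positive and credits negative):

```python
# Minimal double-entry sketch: every debit has a matching credit,
# so the sum of all postings across the ledger is always zero.

from collections import defaultdict

class Ledger:
    def __init__(self):
        self.accounts = defaultdict(float)

    def post(self, debit_account, credit_account, amount):
        """Record one transaction: a debit on one side, an equal credit on the other."""
        self.accounts[debit_account] += amount
        self.accounts[credit_account] -= amount

    def balances(self):
        return dict(self.accounts)

ledger = Ledger()
ledger.post("Cash", "Share capital", 1000)  # owner invests cash
ledger.post("Equipment", "Cash", 400)       # buy equipment with cash

# The books balance: debits and credits cancel out.
print(sum(ledger.accounts.values()))  # 0.0
```

The discipline the sketch enforces, that you cannot record one side of a transaction without the other, is exactly the structured way of thinking the paragraph above argues is worth keeping.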

And the case from 1893 – Carlill v Carbolic Smoke Ball Company

Synergy – Direct Instruction part 2

Last month’s blog introduced the idea that Direct Instruction (DI), a highly structured form of teaching, is a very efficient way of delivering information. The challenge is that in a world where knowledge is largely free, “drilling” information using rigid methods does little to develop the skills most valued by employers.

Earlier this year, in an attempt to identify some of these higher-level skills (I am not a fan of the term soft skills), LinkedIn analysed hundreds of thousands of job advertisements. They produced a top 5, as follows: creativity, persuasion, collaboration, adaptability and time management. We might add to this the ability to think for yourself, which in some ways underpins them all.

The modern world doesn’t reward you for what you know, but for what you can do with what you know. Andreas Schleicher

This month I want to expand on what DI is, but also add to the argument that DI (teacher led) and discovery-based (student led) approaches are not mutually exclusive; in fact, when used together they work better than on their own.

Direct Instruction is learning led
The main reason that, despite its many critics, DI fails to go away is the significant amount of evidence that it works. And the reason it works is that it presents information in a brain-friendly way.

Cognitive load is a very common instructional term and refers to the limited ability of short-term or working memory to hold information at any one time. As a result, it’s better not to bombard the brain with too much information, meaning it’s more effective for students to reduce distraction and be presented with content broken down into smaller chunks, sequenced and taught individually before being linked together at a later date. This is one of the most important aspects of DI. Avoiding distraction refers not only to external distractions, e.g. your mobile phone, but also to information that is not required or is unnecessary in arriving at the desired learning outcome.

Retrieval and spaced practice are both used in direct instruction and have been mentioned in previous blogs. They are well researched and the evidence is compelling as to their effectiveness.

Using examples to teach is also something strongly promoted. It is argued that the brain has the ability to use examples to build connections, ironically without DI; e.g. if we are talking about pets and we say that a cat is an example of a pet, but we already know a cat is also an animal, we can link the two. Next time the term cat is mentioned, we would know it was both a pet and an animal.

Discovery based (Student led – Autonomous – Constructivism)
Many of the discovery-based learning techniques have their roots in the work of psychologists Jean Piaget, Jerome Bruner, and Seymour Papert. The core argument is that self-discovery and the process of acquiring information for yourself makes that information more readily available when it comes to problem solving. In addition, it encourages creativity and motivation, promotes autonomy and independent learning, and is self-paced.

It is not, however, without instruction. Teachers should guide and motivate learners to look for solutions by combining existing and new information, help students avoid distraction, and simplify what to a student may appear complex. To expect students to figure everything out for themselves would be incredibly inefficient, and although it might lead to a truly original idea, it is most likely to result in a feeling of wasted time and solutions that we already know or that are wrong.

“Critical thinking processes such as reasoning and problem solving are intimately intertwined with factual knowledge that is stored in long-term memory.” Daniel Willingham, Why Students Don’t Like School.

2 + 2 = 5 = Synergy
DI and the many discovery-based learning methods can be used together, because together they are far more powerful and effective. Think of them in terms of a Venn diagram, with DI in one circle, discovery-based in the other, and highly effective learning in the middle where the circles overlap. The mix is up to the teacher, which in turn depends on the time available, the nature of the subject, their judgment of the students and the desired outcome.

You cannot tell students how to think, but you can provide them with the building blocks, helping them learn along the way before giving them real-world challenges with problems they will have to solve for themselves. Then it’s into the workplace, where the real learning experience will begin.

Learn faster with Direct Instruction – Siegfried Engelmann

What we need to learn is changing; knowledge is free, and if you want the answer you can just google it. According to the World Economic Forum’s Future of Jobs Survey, there is an ever-greater need for cognitive abilities such as creativity, logical reasoning and problem solving. And with advances in AI, machine learning and robotics, many of the skills previously valued will become redundant.

No need for the Sage on the stage
These demands have led to significant change in the way learning is happening. No longer should students be told what to think; they need to be encouraged to think for themselves. Socratic questioning, group work, experiential learning and problem-based learning have all become popular, and Sir Ken Robinson’s TED talk, Do Schools Kill Creativity?, has had 63 million views.

Sir Ken’s talk is funny and inspiring and I recommend you watch it, but I want to challenge the current direction of travel, or at least balance the debate, by promoting a type of teaching that has fallen out of fashion and yet, ironically, could form the foundation upon which creativity is built – Direct Instruction.

Direct Instruction – the Sage is back
The term direct instruction was first used in 1968, when a young Zig Engelmann, a science research associate, proved that students could be taught more effectively if the teacher presented information in a prescriptive, structured and sequenced manner. This carefully planned and rigid process can help eliminate misinterpretation and misunderstanding, resulting in faster learning. Most importantly, it has been proven to work, as evidenced by a 2018 publication which looked at over half a century of analysis and 328 past studies on the effectiveness of Direct Instruction.

Direct Instruction was also evaluated by Project Follow Through, the most extensive educational experiment ever conducted. The conclusion: it produced significantly higher academic achievement for students than any of the other programmes.

The steps in direct instruction

It will come as no surprise that a method of teaching that advocates structure and process can be presented as a series of steps.

Step 1 Set the stage for learning – The purpose of this first session is to engage the student, explaining specifically what they should be able to do and understand as a result of this lesson. Where possible a link to prior knowledge should also be made.
Step 2 Present the material – (I DO) The lesson should be organised, broken down into a step-by-step process, each one building on the other with examples to show exactly how it can be applied. This can be done by lecture, demonstration or both.
Step 3 Guided practice – (WE DO) This is where the tutor demonstrates and the student follows closely, copying in some instances. Asking questions is an important aspect for the student if something doesn’t make sense.
Step 4 Independent practice – (YOU DO) Once students have mastered the content or skill, it is time to provide reinforcement and practice.

The Sage and the Guide
The goal of Direct Instruction is to “do more in less time” which is made possible because the learning is accelerated by clarity and process.

There are of course critics, who consider it a type of rote learning that will stifle the creativity of both teacher and student, and result in a workforce best suited to the industrial revolution rather than the fourth one. But for me it’s an important, effective and practical method of teaching that, when combined with inspirational delivery and a creative mindset, will help students develop the skills to solve the problems of tomorrow, or at least a few of them.

The independent learner – Metacognition

Metacognition is not a great word, but it’s an important one when it comes to learning, especially if you are studying at higher academic levels or on your own. Cognition refers to the range of mental processes that help you acquire knowledge and understanding, or more simply, learn. These processes include the storage, manipulation and retrieval of information. Meta, on the other hand, means higher than or overarching; put the two together and we are talking about something that sits above learning, connecting it by way of thought. For this reason it’s often described as thinking about thinking, or in this context, thinking about how you learn.

Smarter not harder

When you have a lot to learn in terms of subject matter, it may feel like a distraction to spend any time learning something other than what you must know, let alone reflecting on it. But this fits under the heading of working smarter, not harder: if you can find more effective ways of learning, that must be helpful.
As mentioned earlier, cognition is about mental processes: storage and retrieval relate to memory; manipulation to the shifting of attention, changing perception and so on. The meta aspect creates distance, allowing us to become aware of what we are doing, standing back and observing how, for example, perception has changed. This reflection is a high-level skill that many believe is unique to humans. One final aspect is that we can take control of how we learn: planning tasks, changing strategies, monitoring those that work and evaluating the whole process.

Keeping it simple

It’s very easy to overcomplicate metacognition; in some ways it’s little more than asking a few simple questions, thinking about how you are learning, what works and what doesn’t. Here are some examples of how you might do this.

  • Talk to yourself, asking questions at each stage: does this make sense? I have read it several times; maybe I should try writing it down.
  • Ask, have I set myself sensible goals?
  • Maybe it’s time to try something different, for example mind mapping, but remember to reflect on how effective it was, or perhaps was not.
  • Do I need help from anyone? This could be a fellow student, or try YouTube, which is a great way to find a different explanation in a different format.

Clearly these skills are helpful for all students, but they are especially valuable when studying on your own, perhaps on a distance learning programme or engaged in long periods of self-study.

Benefits

There are many reasons for investing some time in this area.

  • Growing self-confidence – by finding out more about how you learn you will discover both your strengths and weaknesses. Confidence isn’t about being good at everything but understanding your limitations.  
  • Improves performance – research has shown that students who actively engage in metacognition do better in exams.
  • Gives control – you are no longer reliant on the way something is taught; you have the ability to teach yourself. Being an autonomous learner is also hugely motivational.
  • The skills are transferable – this knowledge will not only help with your current subjects but all that follow, not to mention what you will need to learn in the workplace.  

It will take some time initially, but in a way metacognition is part of learning; it’s an essential component, and as such you will end up knowing more about yourself at some point, even if you don’t want to, so why not do it sooner rather than later?

And just for fun – Sheldon knows everything about himself – even when he is wrong

Intelligence defined – Inspiring learning leaders – Howard Gardner

Intelligence is a term that is often used to define people: David is “clever” or “bright”, maybe even “smart”. But it can also be a way in which you define yourself. The problem is that accepting this identity can have a very limiting effect on motivation; for example, if someone believes they are not very clever, how hard will they try? Effort would seem futile. And yet it is that very effort that can make all the difference. See brain plasticity.
I wrote about an inspiring learning leader back in April this year, following the death of Tony Buzan, the creator of mind maps. I want to continue the theme with Howard Gardner (Professor of Cognition and Education at the Harvard Graduate School of Education), who I would guess many have never heard of, but who for me is an inspirational educator.

Multiple Intelligence Theory (MIT)
Now, in fairness, Howard Gardner is himself not especially inspiring, but his idea is. Gardner is famous for his theory that the traditional notion of intelligence, based on IQ, is far too limited. Instead, he argues that there are in fact eight different intelligences. He first presented the theory in 1983, in the book Frames of Mind: The Theory of Multiple Intelligences.

This might also be a good point to clarify exactly how Gardner defines intelligence.

Intelligence – ‘the capacity to solve problems or to fashion products that are valued in one or more cultural setting’ (Gardner & Hatch, 1989).

Multiple intelligences

  1. SPATIAL – The ability to conceptualise and manipulate large-scale spatial arrays e.g. airplane pilot, sailor
  2. BODILY-KINESTHETIC – The ability to use one’s whole body, or parts of the body to solve problems or create products e.g. dancer
  3. MUSICAL – Sensitivity to rhythm, pitch, meter, tone, melody and timbre. May entail the ability to sing, play musical instruments, and/or compose music e.g. musical conductor
  4. LINGUISTIC – Sensitivity to the meaning of words, the order among words, and the sound, rhythms, inflections, and meter of words e.g. poet
  5. LOGICAL-MATHEMATICAL – The capacity to conceptualise the logical relations among actions or symbols e.g. mathematicians, scientists
  6. INTERPERSONAL – The ability to interact effectively with others. Sensitivity to others’ moods, feelings, temperaments and motivations e.g. negotiator
  7. INTRAPERSONAL – Sensitivity to one’s own feelings, goals, and anxieties, and the capacity to plan and act in light of one’s own traits.
  8. NATURALISTIC – The ability to make consequential distinctions in the world of nature as, for example, between one plant and another, or one cloud formation and another e.g. taxonomist

I have taken the definitions of the intelligences directly from the MI Oasis website.

It’s an interesting exercise to identify which ones you might favour but be careful, these are not learning styles, they are simply cognitive or intellectual strengths. For example, if someone has higher levels of linguistic intelligence, it doesn’t necessarily mean they prefer to learn through lectures alone.

You might also want to take this a stage further by having a go at this simple test. Please note this is for your personal use; its main purpose is to increase your understanding of the intelligences.

Implications – motivation and self-esteem
Gardner used his theory to highlight the fact that schools largely focus their attention on linguistic and logical-mathematical intelligence and reward those who excel in these areas. The implication is that if you were more physically intelligent, the school would not consider you naturally gifted, not “clever” as they might if you were good at maths. The advice might be that you should consider a more manual job. I wonder how that works when someone with high levels of bodily-kinesthetic and spatial intelligence may well find themselves playing for Manchester United, earning over £100,000 a week!

But for students this theory can really help build self-esteem and provide motivation when a subject or topic is proving hard to grasp. No longer do you have to say, “I don’t understand this, I am just not clever enough”. Change the words to: “I don’t understand this yet; I find some of these mathematical questions challenging, after all, it’s not my strongest intelligence”, or “I know I have to work harder in this area, but when we get to the written aspects of the subject it will become easier”.

This, for me, is what makes Gardner’s MIT so powerful: it’s not a question of how intelligent you are, but which intelligence(s) you work best in.

“Discover your difference, the asynchrony with which you have been blessed or cursed and make the most of it.” Howard Gardner

As mentioned earlier, Howard Gardner is not the most inspirational figure, and here is an interview to prove it, but his theory can help you better understand yourself and others, and that might just change your perception of who you are and what you’re capable of – now that’s inspiring!

MI Oasis – The Official Authoritative Site of Multiple Intelligences