Plastic fantastic – how the brain grows

A major new idea was presented to the world in 1991. To many it will mean very little, but in terms of improving our understanding of the brain it was a milestone.

Functional magnetic resonance imaging (fMRI) has its roots in the earlier MRI, but instead of creating images of organs and tissues, fMRI looks at blood flow in the brain to detect areas of activity and so show how the brain works in real time.

The implications of this for learning are significant, because for the first time we were able to identify which parts of the brain were reacting when different tasks were being performed. For example, we know that the cerebrum, the largest part of the brain, performs higher functions such as interpreting touch, vision, hearing, speech and emotions.

Brain plasticity

But it is the next discovery that is far more interesting from a learning perspective. For many years the common belief was that brain functionality (intelligence) was to a certain extent hard-wired and largely genetic, with a fixed number of neurons. It probably didn’t help that the computer gave us a misleading analogy for how the brain works.

That all changed when it became possible to observe the brain and watch how it responded to what it saw and was asked to do. What this showed was that the brain has the ability to generate new cells, a process called Neurogenesis.

Click here to listen to neuroscientist Sandrine Thuret explain how humans can generate new brain cells i.e. Neurogenesis.

This may make sense for children: given the basic brain functionality present when a child is born, something must be happening to turn them into caring and thoughtful adults. In fact, by adolescence the brain has produced so many synapses (the connections between cells) that they have to be cut back, or pruned; hence the term synaptic pruning. What was perhaps more of a surprise was that growing new brain cells is not just something children can do; adults are able to do it as well.

The classic example is the evidence from Professor Eleanor Maguire of the Wellcome Trust Centre and her colleague Dr Katherine Woollett, who followed a group of 79 trainee taxi drivers and 31 controls (non-taxi drivers). Their research showed that London taxi drivers developed a greater volume of grey matter (i.e. cell development) three to four years after passing “the Knowledge”, when compared to the control group.

Learning about learning

This may leave you thinking: all very interesting, but what does it mean for me as a student?

In the same way that people can develop a growth mindset, bringing it within their control, you can do the same with your academic performance. Just because you don’t understand something, or don’t pick it up very quickly, doesn’t mean that you won’t be able to. This is not to say that some people are not “brighter” than others (it is estimated that around 50–60% of your intelligence is genetic), but that estimate rests on the assumption that your brain cannot change, and what this research shows is that it can.

And here is one last interesting observation: knowing how the brain works can actually help rewire it. There is evidence that students who know more about how they learn (metacognition) will naturally reflect on what they are doing when they are learning, which in turn will help grow new cells. How good is that?


Artificial Intelligence in education (AIEd)


The original Blade Runner was released in 1982. It depicts a future in which synthetic humans known as replicants are bioengineered by a powerful corporation to work on off-world colonies. The final scene stands out because of the “tears in rain” speech given by Roy, the dying replicant.

I’ve seen things you people wouldn’t believe. Attack ships on fire off the shoulder of Orion. I watched C-beams glitter in the dark near the Tannhäuser Gate. All those moments will be lost in time, like tears in rain. Time to die.

This was the moment in which the artificial human had begun to think for himself. But what makes this so relevant is that the film is predicting what life will be like in 2019. And with 2018 only a few days away, 2019 is no longer science fiction, and neither is Artificial Intelligence (AI).

Artificial Intelligence and machine learning

There is no single agreed-upon definition of AI. “Machine learning”, on the other hand, is a field of computer science that enables computers to learn without being explicitly programmed. It does this by analysing large amounts of data in order to make accurate predictions; regression analysis does something very similar when it uses data to produce a line of best fit.
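To make that comparison concrete, here is a minimal sketch in Python of what “learning from data” looks like at its simplest: fit a line of best fit to past observations, then use it to predict a case the model has never seen. The numbers are invented purely for illustration.

```python
# A minimal, invented example of "learning" a relationship from data.
import numpy as np

hours_studied = np.array([1, 2, 3, 4, 5, 6])         # past observations (input)
exam_scores   = np.array([52, 55, 61, 64, 70, 74])   # past observations (outcome)

# Least-squares line of best fit: score ~ slope * hours + intercept
slope, intercept = np.polyfit(hours_studied, exam_scores, 1)

# "Prediction" for a value the data has never seen
print(f"Predicted score after 7 hours: {slope * 7 + intercept:.1f}")
```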

The problem with the term artificial intelligence is the word intelligence; defining this is key. If intelligence is “the ability to learn, understand, and make judgments or have opinions based on reason”, then you can see how difficult it might be to decide whether a computer has intelligence. So, for the time being, think of it like this:

AI is the intelligence; machine learning is the enabler making the machine smarter i.e. it helps the computer behave as if it is making intelligent decisions.

AI in education

As with many industries, AI is already having an impact in education, but given the right amount of investment it could do much more. For example:

Teaching – Freeing teachers from routine and time-consuming tasks like marking and basic content delivery. This will give them time to develop greater class engagement, address behavioural issues and work on higher-level skill development, all of which are far more valued by employers as industries become less reliant on knowledge itself and more dependent on those who can apply it to solve real-world problems. In some ways AI could be thought of as a technological teaching assistant. In addition, the quality and quantity of feedback available to the teacher will be greatly improved with AI, becoming far more detailed and personalised.

Learning – Personalised learning can become a reality by using AI to deliver a truly adaptive experience. AI will be able to present the student with a personalised pathway based on data gathered from their past activities and those of other students. It can scaffold the learning, allowing students to make just enough mistakes that they gain a better understanding. AI is also an incredibly patient teacher, helping the student learn through constant repetition and trial and error.
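By way of illustration only, here is one very simple way an adaptive pathway could be chosen in code. It assumes nothing more than a running “mastery” estimate per topic; the topic names, step size and starting value are invented, and real adaptive systems model far more than this.

```python
# A hypothetical sketch of an adaptive pathway: keep a rough mastery estimate
# per topic and always present the topic the learner currently finds hardest.
from collections import defaultdict

mastery = defaultdict(lambda: 0.5)   # start every topic at an assumed 50% mastery

def record_answer(topic, correct):
    """Nudge the mastery estimate up or down after each attempt."""
    step = 0.1 if correct else -0.1
    mastery[topic] = min(1.0, max(0.0, mastery[topic] + step))

def next_topic(topics):
    """Choose the weakest topic as the next step on the pathway."""
    return min(topics, key=lambda t: mastery[t])

record_answer("double entry", True)
record_answer("cash flow", False)
print(next_topic(["double entry", "cash flow", "ratios"]))   # -> cash flow
```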

Assessment and feedback – Feedback can also become rich, personalised and, most importantly, timely, offering commentary on what the individual student should do to improve rather than the bland comments often left on scripts, e.g. “see model answer” and “must try harder”. Although some teachers will almost certainly mark “better” than an AI-driven system would be capable of, the consistency of marking for ALL students would be considerably improved.

Chatbots are a relatively new development that use AI. In the autumn of 2015 Professor Ashok Goel built an AI teaching assistant called Jill Watson using IBM’s Watson platform. Jill was developed specifically to handle the high number of forum posts, over 10,000, made by students enrolled on an online course. The students were unable to tell the difference between Jill and a “real” teacher. Watch and listen to Professor Goel talk about how Jill Watson was built.

Pearson has produced an excellent report on AIEd – click to download.

Back on earth

AI still has some way to go, and as with many technologies, although there is much talk, getting it into the mainstream takes time and, most importantly, money. Although investors will happily finance driverless cars, they are less likely to do the same to improve education.

The good news is that Los Angeles is still more like La La Land than the dystopian vision created by Ridley Scott, and although we have embraced many new technologies, we have avoided many of the pitfalls predicted by the sci-fi writers of the past, so far at least.

But we have to be careful. Watch this: it’s a robot named “Sophia”, developed by AI specialist David Hanson, which has made history by becoming the first robot ever to be granted full Saudi Arabian citizenship. Honestly…


Concentration – the war in the brain


One of the most important skills in learning is the ability to concentrate. If you could focus your attention on a specific task for long periods of time, you would be able to absorb more content more quickly.

But concentrating is not easy, partly because we lack the ability to manage distraction. I have written before about focus, information overload and the problems with multi-tasking, but this is a large and fascinating subject.

The war in the brain

Improving concentration has a lot to do with attention, which in some ways is an invisible force, but as we have found before, neuroscience can help us gain insight into the previously unknown. For example, most of us will have what is called a priority map, a map of the most visited places in our brain. Its value is that it can be used to identify how we prioritise incoming information and, as such, where we place our attention. It’s worth stating that attention is a limited resource, so how we use it is important.

Take this attention test and find out your level of attention.

The problem is that these maps change based on how “relevant” the information is, and relevancy itself is dependent on three systems that continually compete with each other. I know this is getting complicated but stick with it, concentrate!

The executive system – Sitting in the frontal lobe, this is the main system and orients attention according to our current goals. For example, I need to learn about double entry bookkeeping, so I will place my attention on page 4 and start reading.

The reward system – As you might imagine, this is the system that offers us rewards. A reward can be as simple as the dopamine rush you get when checking your mobile phone; the problem is, you should be reading page 4! And it’s made worse by the fact that the brain’s attention naturally moves to flashing lights, which you often get when a text comes in.

The habit system – This system operates using fixed rules, often built up over time by repetition. Perhaps it’s the reason you keep looking at your phone just to check that you haven’t had a text, even though you know you haven’t because you would have seen the flashing light… But most importantly, the habit of checking, created by you, has once again distracted your attention when you should still be reading page 4!

Hence the term “war in the brain”: these systems are in competition for your attention. The result is exhausting; you don’t finish reading page 4, and you feel tired even though you have achieved very little.

How to improve concentration  

Some of the methods below will seem obvious and there is, of course, no magic bullet. However, because there is a scientific reason why these might work, I hope you will be more likely to give them a go.

  1. Reduce distraction –  if you have to make a huge amount of effort to check your mobile phone, the reward you get from checking it will diminish. The simple advice is don’t have your phone with you when studying or anything else that might occupy your thoughts. Also have a space to study that is quiet, with simple surroundings and nothing interesting that might be a distraction. Finally, although there is mixed evidence on playing music or listening to white noise in the background, it may be worth a try.
  2. Set goals – this is to support your executive system; write down your goals and don’t make them too ambitious.
  3. Relax and stay calm – it’s hard to concentrate when you are feeling high levels of anxiety. Methods to help with relaxation include deep breathing (click this video, it’s very helpful) and of course exercise, which I have written about in the past because it is a natural antidote to stress.
  4. Avoid too much stimulation – novelty-seeking behaviours, for example playing video games, can become embedded in your reward system. They can make studying appear very dull and unrewarding, especially if you have played a game immediately before getting down to study. Keep it for afterwards, by way of a reward perhaps.

And if you would like to find out more, watch these:

What’s the use of lectures?

The title of this month’s blog is not mine but taken from what many would consider a classic book about what can realistically be achieved by someone standing at the front of a classroom or lecture theatre, simply talking. Written some 25 years ago but updated recently, Donald A. Bligh’s book takes 346 pages to answer the question: what’s the use of lectures?

What makes this book interesting is the amount of research it brings to bear on a topic some consider an art form and so not easily measured or assessed.

With many in Higher education questioning what they get for their £9,250 per annum, and contact time being one way of measuring value, it’s as important a question as ever.

For clarity, we should define what we mean by lecturing; as ever, Wikipedia can help: “A lecture (from the French meaning ‘reading’) is an oral presentation intended to deliver information or teach people about a particular subject.”

What should happen in a lecture?

If you’re a student attending a lecture, you would hope to learn something. However, as many of my past blogs have discussed, learning is a complicated process, and so we may need to break this question down a little further by asking: what should a lecture actually achieve?

A lecture should….

  • Transmit information
  • Promote thought
  • Maybe change opinion or attitude
  • Inspire and motivate
  • Help you be able to do something i.e. develop a behavioural skill

Well, here is the bad news: according to Mr Bligh, a lecture is only really good for one of the above, transmitting information. And it’s not even better than many other methods, e.g. reading; it’s simply as effective, but no more.

Promoting thought, changing opinions

Lectures are relatively passive whereas a discussion requires that people listen, translate what is said into their own words, check if it makes sense with what is already understood, construct a sentence in response etc. In effect, a discussion is far more effective than a lecture in developing thought.

In addition, putting the student in a situation where they have to think is important, for example by giving them a problem or asking a question, as is the case when you have to answer a past exam question. A discussion can also help change opinions, especially where you can hear other people’s views, often different to your own. It has a longer-term impact when the group comes to a consensus.

Inspiration and motivation

Bligh also argues that on the whole lectures are not an effective means of inspiring or motivating. He suggests that it should certainly be the objective of the lecturer to try; it’s just that they rarely succeed. I find myself slightly disagreeing: lecturers can be inspirational, and yet maybe this is just my personal bias from having watched Sir Ken Robinson deliver his “do schools kill creativity“ talk or the last lecture delivered by Randy Pausch.

But perhaps, these are just the exceptions that prove the rule.

Developing skills

And finally, if you want to help people become good at a particular behaviour, you don’t tell them how to do it; you get them to practise, over and over again, with good feedback.

The end of the lecture?

I don’t think this is the end of the lecture; these criticisms have been around for many years. But I can’t help thinking that with new technologies and online learning, lectures are going to have to get a whole lot better in the future.

And what will Universities point to as value for money then?


The 5 top EdTech trends – summer of 2017

Glastonbury: a marginally more interesting gathering… but only just.

We are in the season when many learning and technology leaders gather to discuss what’s new and what’s trending in the world of education, and at two recent conferences, Learning Technologies and EdTechXEurope, there was plenty to see. Generally, the role of technology in learning seems to have found its place, with many acknowledging that it should support learning, not drive it. However, it’s still very easy to look at the latest shiny new offerings and think “this is great, how can I use it?” rather than “what learning problem does it solve?”

Here are a few of the most notable developments.

1. Video is getting even better – fuelled by the YouTube generation of learners, those who would rather watch a video than read a book as a means to consume knowledge, we have some new developments.

Firstly, using video to deliver micro learning. Not just small chunks of video, but untethered, just-in-time (JIT), three-minute courses that offer the learner digestible, easy-to-remember information. Think of micro learning as a series of very short courses that may or may not be linked to each other, and can even include assessment.

Secondly, interactive video. TV is no longer the all-commanding medium it once was; like other technologies, it has had to evolve. In recent years the shift has been towards better engagement: spin-off programmes with a live audience, websites that showcase the backstory of the characters, and programmes that require the audience to vote and so influence events. Now we have interactive video, where the individual can choose what they would do and so change what happens next. Check out this amazing example, used by Deloitte to attract new talent.

2. Gamification is becoming better understood. For the uninitiated, gamification is the use of game-based principles to improve motivation, concentration and learning. It typically uses Points (P) as a measure of reward, Badges (B) as a visual record of success, and Leaderboards (L) to create competition (a small sketch of this mechanic appears below).

We now believe that dopamine, the pleasure-inducing neurotransmitter (chemical), is not released simply as a result of a reward, e.g. being given a badge; it is the challenge and subsequent achievement that release the dopamine, which in turn leads to pleasure. This might seem obvious with hindsight: no one gets pleasure from being top of a leaderboard if they did nothing to get there. In addition, dopamine is released when you have a new experience, so think about changing pathways and setting different questions and tasks; it’s certainly not very motivational to go over the same content again.
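For anyone who likes to see the mechanics, here is a minimal sketch of the points-badges-leaderboards idea described above; the point values, badge threshold and player names are invented purely for illustration.

```python
# A toy PBL (points, badges, leaderboards) tracker.
from collections import defaultdict

points = defaultdict(int)
badges = defaultdict(set)

def complete_challenge(player, difficulty):
    """Award points for a completed challenge; harder challenges earn more."""
    points[player] += 10 * difficulty
    if points[player] >= 50:
        badges[player].add("Rising Star")   # a visual record of success

def leaderboard():
    """Rank players by points to create competition."""
    return sorted(points.items(), key=lambda kv: kv[1], reverse=True)

complete_challenge("Asha", difficulty=3)
complete_challenge("Ben", difficulty=2)
complete_challenge("Asha", difficulty=2)
print(leaderboard())    # [('Asha', 50), ('Ben', 20)]
print(badges["Asha"])   # {'Rising Star'}
```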

3. Information overload is leading to a need for Knowledge Curation – we are living in an age where information is abundant. You can learn almost anything from the internet. But therein lies the problem: we have too much information; we suffer from information overload. Curation is the collecting and sorting of meaningful content around a theme, and in some instances it is now being thought of as more valuable than the content itself.

Arguably, curation is not so much about what you curate and share as about what you don’t share. In addition to organising content, curators need expertise in the subject and an understanding of their audience and what they want.

Steven Rosenbaum in his book Curation Nation, offers up a good summary. “Curation replaces noise with clarity. And it’s the clarity of your choosing; it’s the things that people you trust help you find.”

4. The market is becoming more accepting of user generated content (UGC) – organisations are beginning to see the benefits of UGC for a whole host of reasons. It’s a very fast way of generating content; there is a lot of expertise that can be uncovered by allowing individuals to share what they know; it’s often user friendly; and, importantly, it’s cheap. It is of course not perfect, and there are concerns about quality, but by allowing the users to rate the content, the quality might just look after itself.

5. Virtual Reality (VR), Augmented Reality (AR) and Artificial Intelligence (AI) – not that these are all related, but this is just a simple way of summarising three areas to keep an eye on in the not too distant future. All of these technologies are becoming cheaper, largely because of the investment made and experience being gained in the gaming industry.

By way of a footnote, Google has released an open-source machine learning library called TensorFlow, something the company believes will help drive new initiatives in AI.
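To give a flavour of what that looks like in practice, here is a minimal sketch using TensorFlow’s Keras API, assuming a reasonably recent TensorFlow release; the data (which follows y = 2x - 1) is invented so the “learning” is easy to check, and real applications involve far larger datasets and models.

```python
# An invented toy example: a single-neuron model learns the rule y = 2x - 1 from examples.
import numpy as np
import tensorflow as tf

xs = np.array([[-1.0], [0.0], [1.0], [2.0], [3.0], [4.0]])
ys = np.array([[-3.0], [-1.0], [1.0], [3.0], [5.0], [7.0]])   # y = 2x - 1

model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(units=1),
])
model.compile(optimizer="sgd", loss="mean_squared_error")
model.fit(xs, ys, epochs=500, verbose=0)        # learn the relationship from the data

print(model.predict(np.array([[10.0]])))        # close to 19, a value it has never seen
```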

Putting the context into case study


I am still reading Sensemaking by Christian Madsbjerg and, as I always tend to do, I have been trying to reduce the 216 pages down to something that is both meaningful and memorable. The rationale for this is that if I can summarise the essence of what is being said into a single statement, then my level of understanding is reasonably good, and it makes it easier for me to use what I have learned in other situations.

So here goes: if I were to summarise what Sensemaking is all about in one word, it would be… context. In essence, in a world of complexity and an abundance of information, we are in danger of thinking that the “fact” we see on our computer screen, offered up by a search engine driven by an algorithm, is the truth, when in reality it’s only one version of it. Without the context from which this information came, we are fooling ourselves as to its true meaning.

As a result of this discovery, I wandered into an area I have wanted to write about before: the importance context plays in changing what something means, especially in examinations. Getting the meaning wrong could be the reason you fail the exam rather than pass it. Even objective tests will have some form of context setting just before the actual question. But the type of exam where you are most likely to have a problem with context is a case study.

Jokes play with context

A hamburger and a french fry walk into a bar.

The bartender says, “I’m sorry, we don’t serve food here.”

The importance of context in case study

I have written about case studies before, in “Passing case study by thinking in words”, but focused more on the process of how you think and write rather than how you interpret the information presented. Case studies are becoming an increasingly popular way of assessing a student’s ability to apply knowledge from several different subjects (synoptic) in the context of a real-life situation. This shift towards case studies is understandable given the need for improved employability skills. Here is a great story to illustrate how context changes the decision you would make or, as is often required in a case study, the advice you would give.

A battleship had been at sea on routine manoeuvres in heavy weather for days. The captain, who was worried about the deteriorating conditions, stayed on the bridge to keep an eye on all activities.

One night, the lookout on the bridge suddenly shouted, “Captain! A light, bearing on the starboard bow.”

“Is it stationary or moving astern?” the captain asked.

The lookout replied that it was stationary. This meant a collision would result unless something changed. The captain immediately ordered a signal to be sent to the other ship: “We are on a collision course. I advise you to change course 20 degrees east.”

Back came a response from the other ship: “advise you change your course 20 degrees west.”

Agitated by the arrogance of the response, the captain asked his signalman to shoot out another message: “I am the captain of one of the most powerful battleships in the British navy, you change course 20 degrees east now.”

Back came the second response: “I am a second-class seaman, you had still better change course 20 degrees west.”

The captain was furious this time! He shouted to the signalman to send back a final message: “Change course 20 degrees east right now or you will leave me no choice!”

Back came the flashing response: “I am a lighthouse – your move.”

How to deal with context

It is easy even in the example above to think you know what is going to happen or what you would do. But when the context is revealed, your advice fundamentally changes. Case studies are created to see how well you respond in certain situations, so it’s important not to jump to conclusions.

And this is where sensemaking plays its part: use your senses. Don’t just look at what is there; think in opposites: what is not there, what’s missing? Use visualisation: see yourself in that situation, look around, free up your thoughts, what do you see now? But most of all, be curious. Ask questions of the scenario: how big is the ship, how long has the captain been in charge, what is the weather like, are there others close by?

Another excellent tool to use in these situations is called perceptual positions. Think of the event from different positions: firstly yours (what does the event look like through your eyes?), secondly the other person’s (what would you do if you were them?), and thirdly an observer’s (what would the event look like to someone looking in on both parties?).

Case studies in the future will become even more sophisticated. Virtual reality offers up so many opportunities to create real-world environments in which to test students. And when that happens, you will definitely need to use all of your senses to get you through – take a look at this 360 VR surgical training, it’s amazing.

And one last joke

Thomas Edison walks into a bar and orders a beer.

The bartender says, “Okay, I’ll serve you a beer, just don’t get any ideas.”


Sensemaking, humility and the humanities


For a variety of reasons, I have been engaged this month in thinking not so much about examinations but what subjects should be examined.

Whilst the news has been dominated by terrorism, Trump and Brexit, we may be facing a far bigger problem, of which these news stories are a good example: how can we be sure of making the right decisions in a world of mass information, complexity and change?

People voted for Brexit for a whole variety of reasons. Many “facts” were presented in simple terms: we will save £350m a week and this money will go into the NHS; immigration will be reduced as we gain control over our borders. Yet these facts are far too simplistic; any level of analysis, critical thinking and challenge would have revealed the difficulty of delivering them, and in many instances they won’t be delivered. If this is the case, did people vote to leave or stay not on the facts as presented but using other criteria? Maybe they were just naive and placed far too much trust in politicians, or perhaps they had never been taught about sensemaking or humility, or studied the humanities.

Sensemaking

An interesting article caught my eye earlier in the month, “Silicon Valley needs to get schooled”. It was by Christian Madsbjerg, author of the book Sensemaking and senior partner in ReD, a strategy consulting company based on the human sciences. In the article Madsbjerg argues that the reason for a lack of new and exciting products from Silicon Valley is not a shortage of ideas but a complete failure to understand people.

In the book Sensemaking he expands on the problem. In order to cope with complexity, we look to science, logic and the algorithm (a rules-based process) for a solution. On the face of it, crunching big data so that it spews out the correct answer seems perfect, but Madsbjerg makes a very important point, and this is a quote from the book: “we stop seeing numbers and models as a representation of the world and we start to see them as the truth – the only truth”. We are in fact looking at the numbers without the context of the world from which they came, or a sufficiently deep understanding of the behaviours we are measuring.

We rely on science and the scientific method for so much of what we do, but where people are involved we need a different approach. To put it another way: “When human beings enter the equation, things go non-linear” (Neil deGrasse Tyson).

Sensemaking is “how we make sense of the world so we can perform better in it”. It recognises that situations are complex and information ambiguous. It requires people to make a continuous effort to understand the connectivity that exists between people, places, and events in order to anticipate their trajectories and act accordingly.

Humility

Intellectual humility, as defined by the authors of a recent paper entitled Cognitive and Interpersonal Features of Intellectual Humility, is the opposite of intellectual arrogance or conceit. It is, in effect, recognising that you could be wrong. One of the findings from the research was that people who displayed intellectual humility were better than the control group at evaluating the quality of evidence they had been presented with. A very useful skill indeed, given the world of fake news in which we currently find ourselves.

Humanities

And what job will you get after studying History for three years……

The humanities (English, History, Philosophy etc.) have been given a bad press in recent times. Overshadowed by the drive to develop coding skills, and with the constant chanting of STEM (Science, Technology, Engineering and Mathematics) in the background, it’s not surprising that fewer people are studying them. They were at an all-time low in 2014, at 6.1% of all bachelor’s degrees, a long way off the 1967 record of 17.2%.

But it is generally recognised that the humanities can teach us a lot. In another reference from Christian Madsbjerg’s book Sensemaking, he suggests the humanities teach us three things: one, that other worlds exist; two, that they are different; and three, how to imagine other worlds, which in turn helps us better understand our own.

As with sensemaking and humility, are these not the types of skills we need to learn?

Examinations – what to examine?

What subjects should be examined depends to a large extent on what job you would like to do. But with the claim that 60% of 11-year-olds will leave school to do jobs which have not yet been invented, it’s hard to know the answer. What we do know is that the world is unlikely to slow down, change is unlikely to stop, data will not become less available, and complexity will not give way to simplicity. As a result, we need to teach people, and so examine, the skills that will help them better navigate this world. Maybe when those primary school children go on to higher education they will be studying sensemaking, humility and the humanities.

Even though the ink is barely dry on the letter sent by Theresa May beginning our formal negotiations to exit Europe, the interesting thing is that we will never know if this was a good or bad decision, because post-Brexit people will behave differently: some will work hard to make the impossible possible whilst others will continue to frustrate the process, and none of that could have been foreseen at the time.

So, let’s hope the basis for the original decision to leave was not because of the headline – We will save £350m a week and this money will go into the NHS!