Listening to the radio recently I heard an author say she wrote two sorts of book. One was to find out what she thought about a subject and the other was to tell people what she thought. Something clicked for me, so I’m going to follow suit. This blog is to explore what I think about the evidence that exists for learning and development.
I’m also keen to see what I can learn from the conversation this will hopefully provoke.
I frame this discussion around L&D, because that’s the field GoodPractice works in. However, if learning and development is about improving performance and changing behaviour (and I firmly believe it is), then we enter the arena of evidence-based management.
Jeffrey Pfeffer and Robert I. Sutton wrote a great introduction to Evidence-Based Management for Harvard Business Review ten years ago. They examine the rise of evidence-based approaches in medicine and highlight the need for an ‘evidence-based movement’ within organisations and the ranks of managers.
Pfeffer and Sutton acknowledge that this is an extremely difficult challenge, not least because:
“Managers are much more ignorant about which prescriptions are reliable and they’re less eager to find out.”
However, they also say it is surely better to follow empirical evidence than blindly accept what passes for conventional wisdom.
I wholeheartedly agree.
Recently, Sukh Pabial, who has written very thoughtfully about L&D practice, posted a blog entitled ‘Understanding Evidence Based Management better’. There he explores the idea from an L&D perspective:
“From what I’ve understood about Evidence Based Management so far is that it relies on research and evidence to inform exactly what is effective and what is not. Right, that makes complete sense to me. So, for example, we are able to know that NLP, MBTI, learning styles, employee engagement initiatives, learning and development all have weak evidence bases. Now, before my fellow L&Ders get all defensive about this, it is well worth you taking your time to understand why they have a weak evidence base.
What this means in the broader Evidence Based Management piece is if Consultant A suggests Intervention A, then they should be doing so with a decent evidence base that Intervention A will work. Now there’s a whole piece there about the education of HR/L&D professionals better understanding what to look and ask for when it comes to that evidence base…
So far, I’m in agreement with the approach, what it can inform us, and what that means for the interventions we choose to implement.
Where I come unstuck is when Evidence Based Management approach can’t provide an answer for a solution that is effective.”
Where Sukh becomes “unstuck” got me thinking more about the use of evidence and what to do when it doesn’t exist.
One of the advantages medicine enjoys is the relative depth of evidence it holds on specific conditions, their treatments and their outcomes. With several decades of evidence gathering to draw on, conclusions can be drawn from the data that a certain condition will normally respond well to a particular treatment.
As yet, the world of management and leadership isn’t so advanced. It has neither the historical data, nor the same compelling link between a problem (in medicine, people die, are in pain or are uncomfortable) and the solution. Management, at present, is more an art than an exact science. What works in one situation often fails in different circumstances. This is why we should be wary of examples from well-regarded companies: a sample size of one is simply an anecdote.
However, medicine faced the same problem before the 1970s. It tackled it by setting out, systematically, to find out what worked and what didn’t.
Evidence-based approaches aren’t limited to science and medicine. In ‘Black Box Thinking’, Matthew Syed explores how the aviation industry has managed to drive down errors and accidents to such an extent that flying has become the safest form of travel. He shows how a culture of facing up to and learning from failure has been a fundamental component of this success. Much of the evidence used to examine failure comes from the data collected by the famous ‘black box’ flight recorder.
In aviation, when something goes wrong, the whole industry is focused on finding out why it happened and how to prevent it from happening again. There is no fear of discovering new problems or making incorrect assumptions. There is a large cultural element to this, but at the heart of the process is the power of being able to learn from the evidence gathered, to analyse it systematically and apply lessons learned across the industry.
I’m intrigued by the idea of building a ‘black box’ for organisational improvement.
The good thing is that Towards Maturity has already started this process for L&D. I’d like to see if we can work with them to make it more granular in its application. It will take time to gather the weight of evidence to allow us to create the “decent evidence base” that Sukh talks about. That leaves us with his question: “What do you do when that evidence base doesn’t exist?” I think the answer is threefold:
1. As consultants, if there isn’t a strong evidence base and we have to base our recommendations and solutions on the best evidence we do have (our experience, hunches and best guesses), then we should assume that the proposed solution may well not work. We should be honest about this, both with ourselves and with our clients.
2. If we can, we should test the idea(s) in a robust way to see if they create the desired performance outcome(s). There should be no fear of finding out that an approach hasn’t worked – surely it’s better to know for sure than to live in ignorance.
3. We need to find a way to codify and record each situation and the outcome of the treatment applied.
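Point 2, testing robustly, needn’t be elaborate to start with. As a minimal sketch (the group sizes and success counts below are invented for illustration, not real data), if we can count how many people in a pilot group and a control group hit a performance target, a two-proportion z-test gives a first indication of whether an intervention made a genuine difference:

```python
# A minimal sketch of testing an intervention against a control group.
# All numbers here are hypothetical, purely to illustrate the method.
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Return (z, two-sided p-value) for the difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value via the normal approximation.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical pilot: 62 of 100 people hit the performance target after the
# intervention, versus 48 of 100 in the control group.
z, p = two_proportion_z(62, 100, 48, 100)
print(f"z = {z:.2f}, p = {p:.3f}")  # p below 0.05 suggests a real difference
```

This is deliberately simple; with small groups or multiple comparisons a more careful design would be needed, but even a rough test like this is a step beyond relying on hunches alone.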
At GoodPractice, we’re looking at creating a new way of working which moves away from conventional approaches, and instead focuses on building evidence for what we do. Our ambition is to create a Learning Innovation Lab (and bear with us, as that name might change!) where solutions that improve performance in organisations are developed and tested in a way that determines what works and what doesn’t.
Our long-term ambition for the Lab is to create a set of solutions that are constantly being tested in real world situations in order to drive innovation. A crucial component of this approach is collating the results of these experiments in real life into a ‘Black Box’ evidence database. This would be open to L&D professionals to interact with, but I’m getting ahead of myself.
To do this requires a brave pill for both consultant and client. If L&D is to improve its performance so that it significantly adds to organisational outcomes, the same culture of candour grounded in evidence that the aviation industry has adopted is crucial.
The question I hold at the moment is:
How do we make a start on this?
I would welcome your views.
Jeffrey Pfeffer and Robert I. Sutton, ‘Evidence-Based Management’, Harvard Business Review (January 2006). Available at: https://hbr.org/2006/01/evidence-based-management
Sukh Pabial, ‘Understanding Evidence Based Management better’, Thinking About Learning (23 March 2016). Available at: https://pabial.wordpress.com/2016/03/23/understanding-evidence-based-management-better/
Matthew Syed, Black Box Thinking: The Surprising Truth About Success (John Murray, 2015).