Most people who manage people or processes are used to the constant flow of new ideas coming their way. Whether you work in human resources, finance or sales, there's a management guru ready to tell you there's a newer, better way of doing things. These new management ideas and strategies are like iPhone apps: there are thousands of them, but only a few are any good. Implementing the latest fad or revolutionary concept without evaluating its true worth can be costly. What's more, for those of us who work in the knowledge economy, getting the right information can be critical to success. No matter how excellent a concept sounds at first, it pays to look beneath the surface. Whenever I read about the latest cure-all initiative, I ask the following questions:
If a 'new' management idea seems too wonderfully simple and easy to implement, then it probably is. In the words of the late, great Douglas Adams: "If it looks like a duck, and quacks like a duck, we have at least to consider the possibility that we have a small aquatic bird of the family Anatidae on our hands." Very occasionally, someone will come up with a simple, genuinely effective idea that no one else has thought of. Most of the time, however, it's been tried already and found wanting.
'Credible' will mean different things to different people. A credible source to me means one or more of the following:
If the author doesn't tick any of those boxes, it doesn't mean that I automatically discount their idea; it just means that I'm less likely to forgive other oversights.
If a new book, article or column does not offer any evidence to back up its claims, I treat it with extreme caution. It's quite likely in this case that the concept being shared is theoretical, or based purely on anecdotal evidence, and hasn't been tested in the chaotic arena of real life. The evidence should also be referenced or easily discoverable in the public domain: oblique references to 'studies' or 'research' don't count as evidence unless there's a way of examining the original study itself. For instance, everyone knows that you should drink eight glasses of water a day, right? This idea floated around for decades, and no one questioned its original source. If they had, as two researchers writing in the British Medical Journal did, they would have found that it's a total myth. This is not to say that new concepts lacking evidence shouldn't be tried out, but a more circumspect approach is advisable, such as trialling the new model/strategy with a small number of people/teams/departments first. I'd never recommend implementing a costly initiative that has never been tested. That's what pilots are for.
It's an ingrained human trait to pay more attention to evidence that supports our personal viewpoint than to evidence that contradicts it. Let's say I claim that the most creative companies give their staff the freedom to spend a certain amount of their time on personal projects that have nothing to do with their main job. I would point you to Google and its famous 20% time, and 3M with its 15% rule, as examples. I could probably uncover a few more, and give you some historical examples, such as Edison's Menlo Park laboratory. But what's missing from this wonderful picture are the organisations that implemented something similar and are no more creative for it. This is because, as a management guru with an idea, I'm unlikely to go looking for contradictory evidence. Great ideas go through a rigorous process of challenge and come out stronger as a result. If the evidence supporting a new technique or concept seems to be cherry-picked, it's always worth looking for the evidence against it.
Often, the author of a business book will back up their claims with statistical evidence, or use a well-known 'fact' as the basis of their idea. These numbers can come from research studies, real-life surveys or, occasionally, thin air. We're predisposed to be persuaded by numbers, but nine times out of ten[1] these statistics don't stand up to scrutiny. Before accepting an idea based on the statistics that 'support' it, you should prod the data to see if it's of any value. Typical problems that can be found are:
We see numbers used this way all the time in business and politics:
Using evidence that falls foul of these problems, without mentioning it, can mean one of two things: sloppiness or deceitful intent. In either case, it's a mark against the author's ideas.
Popular ideas always attract some form of criticism. This can range from the wild ramblings of ill-informed bloggers through to devastating point-by-point refutations of the author's key arguments by highly respected individuals. Even if you like an idea and it seems well considered, it always pays to investigate whether anyone else has found problems with it. The chances are that if you're interested in the author and their idea, someone else has been too, and they've probably done some research you can make use of. The skill here is to be able to sort the useful critiques from the ill-informed ones, and to do this I tend to ask myself ... yep, the same questions.
I'm sure there are plenty of other methods people use to sort through the myriad of new concepts that come our way. If you've got any that you find particularly effective, I'd love to hear them.
[1] This statement is made with ironic intent.