MBTwhy?

form filling

Here are two biiiig reasons why personality or psychometric tests suck…

You know the drill. Once in a while the boss decides you’ve all got to get together and find out about each other. What makes you tick, and all that jazz. So how’re ya gonna do that? Lucky you – it’s MBTI time (other tests are available). Forms are filled, scores are totted up, foreheads are wrinkled. And then you are ready for the illumination – the results (cue fanfare).

Guess what…the test results tell you a bunch of stuff you already know about yourself, and importantly, it’s stuff you like. ‘Ooh, isn’t it uncannily accurate’ titters the excited recipient. Well derrr, you filled out the form, what did you expect? This is the first reason these tests suck. They tell you what you already know.

And guess what else…the test results tell you a bunch of stuff you already know about yourself and don’t like. And being human, you choose to disregard it all. Yeah OK, you might buy it for a wee while, but as soon as the L&D expert’s back is turned – you’ve flipped back to the real you. This is the second reason these tests suck. You ignore the stuff you don’t like.

And please – don’t get me started on Belbin. If you want to know your ‘team role’, just take a look at your desk. Its state of organisation, or otherwise, will tell you pretty much all you need to know. There you go – I’ve just saved you a bucket of cash to spend on something ‘useful’.

In my experience, many folks in the world of work agree with these observations, and I’m struggling to think of anyone who approaches these assessments with anything higher on the excitement scale than vague (and often forced) interest. What do you think?

photo c/o dumbledad

Vampire HR

The #zombiehr series continues…

Oh happy day! The UK civil service has just published its 2010 staff survey results. 325,119 people (62% of the 528,729 who were invited) took part. What did they tell us? Well for starters 32% believe “I have the opportunity to contribute my views before decisions are made that affect me”. A whopping 38% of the people who replied believe that senior management will take action based on these survey results, and 27% believe change is managed well in their organisation. What that says in relation to achieving the seismic changes that are about to hit the civil service is anyone’s guess. Pick a number – and make it a low one. There are more questions and answers in the survey than you can shake a stick at. Each one of them registers considerably lower on my interest scale than the last.

I posted a link to these results over on David Zinger’s Employee Engagement Network. Jean Douglas was kind enough to get in touch. She notes:

You have to wade through the methodology to find out that the engagement index is calculated in a manner different than what you might think – I am still trying to understand what they did – and this is my field.

Here is their description:

The employee engagement index is calculated as a weighted average of the response to the five employee engagement questions and ranges from 0 to 100. An index score of 0 indicates all respondents strongly disagree to all five engagement questions and a score of 100 represents all respondents strongly agree to all five engagement questions. The 2010 benchmark is the median (midpoint) engagement index of the 103 organisations that participated in the CSPS 2010.

The engagement score is listed as 56%; however, the “%” is misleading. There is no 56% of something. The score is simply 56 (the highest number is 100 – which does not automatically mean it is a percent). It could have been from a range of scores running from 0 to 157.

They have also “mooshed” together the scores in a department (“moosh” is my new statistical term when numbers are added and divided to come up with another difficult to understand index).

They missed some real opportunities here to get at some good predictive results.

The individual departmental results are more meaningful (except for the engagement score) as they have not done all that mooshing.

Thanks Jean. So basically the civil service is frigging around with numbers and mooshing stuff. That figures.
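Jean’s point about the weighting is easy to illustrate. One plausible reading of the published description – and to be clear, the specific Likert-to-score mapping and equal weights below are my assumptions, not the civil service’s actual methodology – is that each five-point response is mapped onto 0–100 and the five questions are then averaged:

```python
# Sketch of how a 0-100 engagement index might be built from
# five-point Likert responses. The score mapping and the default
# equal weights are assumptions for illustration only -- the
# survey's methodology document defines the real ones.

LIKERT_TO_SCORE = {
    "strongly disagree": 0,
    "disagree": 25,
    "neither": 50,
    "agree": 75,
    "strongly agree": 100,
}

def engagement_index(responses, weights=None):
    """Weighted average of Likert responses, scaled 0 to 100."""
    scores = [LIKERT_TO_SCORE[r] for r in responses]
    if weights is None:
        weights = [1] * len(scores)  # equal weighting assumed
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# The extremes match the published description: all "strongly agree"
# gives 100, all "strongly disagree" gives 0.
print(engagement_index(["strongly agree"] * 5))       # 100.0
print(engagement_index(["strongly disagree"] * 5))    # 0.0
```

Note what this shows: the resulting 56 is a point on an arbitrary constructed scale, not a percentage of anything – which is exactly Jean’s complaint.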

Beyond the survey we find…the initial findings. The initial findings – there’s a title to stir the soul. The initial findings are about the rationale behind the survey and why it is important to measure engagement. Apparently it is important to measure engagement because:

Engaged employees in the UK take an average of 2.7 sick days per year, the disengaged 6.2 days (Gallup Research, 2003)
59% of engaged employees say their work “brings out creative ideas”, compared to just 3% of disengaged employees (Gallup Research, 2003)
70% of engaged employees indicate that they have a good understanding of how to meet customer needs, compared to 17% of non-engaged employees (Right Management Research, 2006)
Branches of Standard Chartered bank with high levels of engagement have a 16% higher profit margin than branches where it is low (evidence submitted to MacLeod and Clarke, 2009)
Improving engagement levels in branches of the Co-op supermarket has been estimated to save the organisation £600,000 per annum from reduced food wastage (evidence submitted to MacLeod and Clarke, 2009)

Rotten vegetables aside – this whole project is dull and unimaginative. Trying to measure engagement sucks. Sucks like a vampire. It sucks cost and it sucks time (I estimate that the completion of the survey alone took over 6,000 person days). And having gone to all the trouble to measure – the evidence shows us that few believe action will be taken, and fewer still believe that any action taken will be managed well. This sucks. Sucks in a way that the good Count Dracula himself would be proud of.
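For what it’s worth, that person-days figure is a straightforward back-of-envelope calculation. Assuming roughly ten minutes per completed survey and an eight-hour working day (both my assumptions – the survey publishes neither figure):

```python
# Sanity check on the "over 6,000 person days" estimate.
# The 10 minutes per survey and the 8-hour day are assumptions.
respondents = 325_119        # completed surveys, per the published results
minutes_per_survey = 10      # assumed time to fill in the form
hours_per_day = 8            # assumed working day

person_days = respondents * minutes_per_survey / 60 / hours_per_day
print(round(person_days))    # roughly 6,773 person days
```

Even on these modest assumptions the cost clears 6,000 person days – and that is before anyone reads, analyses, or “mooshes” a single result.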

Stop measuring engagement and just start doing it.