News story

Why do courses only have a limited impact?

Learning News | Fuse Universal

Does e-learning make a difference? A difficult but important question, and one that Fuse Universal is answering together with Carpetright and the AI institute of UCL. What insights did they take from the data, and what's next?

 

Does e-learning make a difference? This was the question put to Fuse Universal by Carpetright's former HR director, Lyn Warren. Fuse had been championing the benefits of bottling greatness, social learning and continuous performance coaching, but does any of it really make a difference to the business, in real performance numbers?

Together with Carpetright (a European retailer specialising in carpets and beds) and Fuse's measurement partner, the AI institute of UCL, whose team includes professors of mathematics and neuroscience, they set about answering the question one hypothesis at a time, outsourcing the measurement methodology to UCL's experts.

Spoiler alert: the quick answer was yes.

And whilst it made an impact, that wasn't the most interesting finding. The most interesting finding is something that should change how we think about designing learning.

Together, we wanted to take just one modern learning concept and test whether our hypothesis of a positive business impact held. We took the 'bottling greatness' concept for our first test. Our joint hypothesis was: if an organisation finds the best person in the organisation at doing something (such as selling a certain product better than anyone else), digitally extracts that knowledge, and quickly post-produces it through agile techniques into an interesting, authentic, passionate format explained by the expert (but codified by the L&D team)…

…and then distributes it frictionlessly as a small number of bitesized, three-minute videos via employees' own mobile phones, then that alone should produce a measurable business benefit, likely in the region of 10% (as historical data from other organisations suggests).

So we did just that.

UCL guided us on the best area to choose, the one that would give the purest measurement. We took an existing product that Carpetright had been selling for a number of years, which gave us the historical data we needed and allowed us to account for variables such as seasonality and economic climate. We measured learning activity against revenue impact over a nine-month period and compared it with the previous three years of data.

The result showed a 13% improvement in revenue over that nine-month period, attributable to the codified knowledge being embedded across the organisation. In data science terms the difference was statistically significant, meaning the scientists stood behind the results.
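To make the idea of a statistically significant uplift concrete, here is a minimal sketch in Python of the kind of check involved. It assumes monthly revenue figures, a simple seasonality adjustment against each month's historical average, and a one-sample t-test; the article does not describe UCL's actual methodology, so the data and approach below are purely illustrative.

```python
# Minimal sketch of a seasonality-adjusted significance check.
# Assumptions (not from the article): monthly revenue figures, a baseline
# built from the same nine calendar months in three prior years, and a
# one-sample t-test on the per-month relative uplifts. Purely illustrative.

import numpy as np
from scipy import stats

# Hypothetical monthly revenue (in £k) for the same nine calendar months.
historical = np.array([
    [100, 95, 110, 105, 98, 102, 115, 108, 99],    # prior year 1
    [103, 97, 112, 108, 100, 105, 118, 110, 101],  # prior year 2
    [105, 99, 115, 110, 102, 107, 120, 113, 103],  # prior year 3
])
measured = np.array([118, 112, 130, 124, 116, 121, 136, 128, 117])  # test period

# Seasonality adjustment: compare each month in the test period against the
# historical average for that same calendar month.
baseline = historical.mean(axis=0)
uplift = (measured - baseline) / baseline  # per-month relative uplift

# One-sample t-test: is the mean uplift significantly different from zero?
t_stat, p_value = stats.ttest_1samp(uplift, popmean=0.0)

print(f"Mean uplift: {uplift.mean():.1%}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("The uplift is statistically significant at the 5% level.")
```

A real study would also need to rule out confounders such as price changes or store openings, which is why the measurement design was handed to UCL's experts rather than done in-house.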

The full story by Steve Dineen from Fuse Universal is available on LinkedIn: Why do courses only have limited impact?