MEASURING TRAINING EFFECTIVENESS
Digital learning builds business capabilities and drives performance. However, across L&D, the metrics that demonstrate this impact are often overlooked. At Fenturi, we fully understand the importance of measuring the impact of learning, and we work with our clients to do so cost-effectively and efficiently.
Measuring training effectiveness helps maximise the impact of the learning and supports the case for future L&D budget. Understandably, the board require fact-based evidence of the training’s impact to justify the cost. As a result, we measure training effectiveness and create scalable learning that suits the workforce and delivers tangible benefits to the bottom line, company culture and the end user.
MEASURING DIGITAL LEARNING
On every project we track our success by surveying the commissioning client, asking them for their level of satisfaction with our work. Our average score for 2020 was 5 out of 5; for more information about our quality score, click here. We’re hugely proud of this score as proof of our consistent ability to deliver great-quality design.
We also recommend working with our clients to measure the impact of the learning content itself on the end learners. There are two main ways this can be done, and we encourage you to discuss these options with us early in the briefing process so that we can understand the logistical and budgetary implications.
One option to consider at pilot stage is tracking the impact of the training against a control sample of employees who have not received it. We appreciate this takes additional effort, and capturing the impact will require co-ordination with your other tracking studies, but we are keen to support this activity whenever possible. It can be a powerful part of a pilot study that proves the business case for large-scale roll-out.
LEARNER FEEDBACK LOOPS
TRACKING LEARNER FEEDBACK IN REAL TIME
A global client engaged us for a pilot study to launch a new initiative around their CSR programme. We tracked learner feedback in real time through a sample of participant interviews and surveys. Through this work we identified the key interactions that were strongest in creating open-ended discussions, a primary learning goal. We also found an appetite for further involvement, which allowed us to optimise the next-steps section. After making these changes, the client pressed ahead with a full roll-out.