Every year, companies make massive investments in employee training — from leadership and customer service to teamwork and technical skills. In 2019, the Association for Talent Development reported that the average organization spent $1,299 per employee on learning and development (L&D), and Statista Research reported that total expenditure on workplace training in the United States in 2020 was $82.5 billion.
Even though companies are making major investments in employee training, it is often delivered by plugging everyone into the same classroom for hazmat preparedness. Looking at a classroom filled with employees, it’s obvious that many are bored, a handful are confused and only some are engaged. Although it’s easy to forget that people come to our organizations with varying levels of existing skills, the differences in existing knowledge are plainly visible when employees attend training.
Sure, there are benefits to be gained by having an eclectic group rub elbows, but learning scientists suggest that the brain will only absorb so much new information. When there is a room full of learners with vast differences in knowledge, some individuals will hit capacity before the rest.
What happens when learners receive information that pushes their limits? Do they struggle to retain that information? Do they experience frustration? Do more knowledgeable participants also experience frustration at the slow pace? The answer to all three is yes.
Gaining Effectiveness from Personalized Learning
Professional development satisfaction and efficacy largely come from applicability for the learner. Organizations are looking for a positive impact on their business: higher retention, greater productivity, happier customers and greater sales.
As expected, qualitative data from live training courses regularly reveals that learners on the fringes give high marks to facilitators, but the information being covered sometimes misses the mark. The content is either too much of a stretch, or the learner already understands and uses the information being offered.
While one-size-fits-all training is easiest to offer, meeting individual employees where they are ensures training satisfaction, efficacy and application of new knowledge. One solution to consider is to structure professional training to assess current knowledge first, then place employees into training along a continuum or at a specific level.
A Case Study on Pre-assessments to Direct Individualized Learning
This case study was developed to determine if the benefits of learning effectiveness would be greater if the participants were intentionally segmented, and each person was directed to a training opportunity aligned to their current abilities.
To test this solution, pre-assessment self-evaluations were conducted in two different training areas — Microsoft Excel and management. These pre-assessment tests were developed to enable learners to self-evaluate competency levels.
Learners accessed these pre-assessments through an eLearning platform. Then, two methods were used to help learners identify appropriate eLearning starting points.
Microsoft Excel Pre-assessment Explained
The Microsoft Excel pre-training evaluation was built using video clips of a facilitator discussing varying shortcuts, functions and operations.
Learners were first given a brief description of content for each of the three levels and then asked to guess their possible starting point (X100 – beginner, X200 – intermediate or X300 – advanced). Within each level, they were then offered six 15-second video clips of training content appropriate to that level. Following each clip, learners were asked to rate their familiarity with the material in the video on a scale from 1 to 5 (with 1 being not at all familiar and 5 being very familiar).
Results were calculated, and learners were given a recommendation either confirming their original level choice or suggesting another level. Learners were then offered a single comprehensive course at their assigned level or were able to access more granular microlearning materials one at a time using the X100, X200 or X300 course codes.
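The scoring step described above can be sketched in code. The article does not specify how results were calculated, so the averaging approach, the 3.5 cutoff and the function name below are all illustrative assumptions rather than the study's actual method:

```python
# Hypothetical sketch of the Excel pre-assessment scoring described above.
# Assumption: a learner's six 1-5 familiarity ratings per level are averaged,
# and training starts at the first level where familiarity is low.

def recommend_level(ratings_by_level: dict[str, list[int]]) -> str:
    """Given six familiarity ratings (1-5) for each level code,
    recommend the first level the learner is not yet familiar with."""
    for level in ("X100", "X200", "X300"):
        ratings = ratings_by_level[level]
        avg = sum(ratings) / len(ratings)
        if avg < 3.5:  # assumed cutoff for "not yet familiar"
            return level
    return "X300"  # familiar with everything: offer the advanced course

# Example: comfortable with beginner content, shaky at intermediate.
print(recommend_level({
    "X100": [5, 5, 4, 5, 4, 5],
    "X200": [3, 2, 3, 4, 2, 3],
    "X300": [1, 1, 2, 1, 1, 1],
}))  # → X200
```

The output either confirms the learner's original guess or redirects them, matching the confirm-or-suggest behavior described above.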
Participants in the pre-assessment study were all heavy Excel users who reported using the program daily. A survey conducted after learners took the pre-assessment and completed some training placed them at 60% beginner, 20% intermediate and 20% advanced. Participants using the Excel learning assets said they would apply 88% of the information covered in the training. The majority (60%) said they preferred the microlearning Excel options to the longer course.
All participants said they were either somewhat likely or extremely likely to use self-assessments if they were available in other professional development areas. Further evidence that assessments appeal to learners in their own right: several participants took the self-assessment but accessed no further training.
Prior to adding the Excel pre-assessment into the eLearning platform, users had access to the same Excel training materials, and usage patterns were not greatly affected by the addition of the self-assessment. Satisfaction levels and predicted application of skills learned were largely unchanged by self-assessments, indicating that, while learners saw merit in taking the assessment, it didn’t greatly affect how they searched for or used the eLearning platform.
Management Pre-assessment Explained
To develop the management pre-assessment, research was conducted to identify 10 competency areas needed in leadership. Scenario-based questions were then developed to help assess existing expertise, such as: “When delegating, I am likely to …,” or “When dealing with an employee who seems resistant to change, I should … .” This style of question was selected to decrease responses based on what the learner might assume is the right answer.
The final self-assessment consisted of 56 questions (five to seven per competency area), which were accessed one at a time; results in one competency did not affect results in another. Participants in this study self-identified as low- to mid-level managers with more than one year of experience.
Unlike the Excel pre-assessment, each competency area was assessed individually. Learners received a rating of expert, proficient, intermediate or novice, along with training prescribed based on the learners’ responses.
For instance, a low answer on only one question would prompt one related training suggestion; low answers on multiple questions within the competency area resulted in multiple training recommendations. While the eLearning platform has extensive items within each competency, suggestions were intentionally held to one per question to avoid overwhelming learners with too much prescribed training.
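That one-suggestion-per-low-answer rule might look like the following sketch. The question-to-course mapping, the scoring scale and the cutoff for a "low" answer are assumptions for illustration, not details from the study:

```python
# Hypothetical sketch of the management pre-assessment's recommendation rule:
# each low-scored question maps to at most one training item, so a learner
# never receives more suggestions than the number of questions scored low.

LOW_CUTOFF = 2  # assumed: answers scored 1-5, with 2 or below counting as "low"

# Assumed one-to-one mapping from question IDs to course titles (all invented).
TRAINING_FOR_QUESTION = {
    "delegation_q1": "Delegating with Clear Expectations",
    "delegation_q2": "Matching Tasks to Team Strengths",
    "change_q1": "Leading Employees Through Change",
}

def recommend_training(scores: dict[str, int]) -> list[str]:
    """Return exactly one training suggestion per low-scored question."""
    return [TRAINING_FOR_QUESTION[q]
            for q, score in scores.items()
            if score <= LOW_CUTOFF and q in TRAINING_FOR_QUESTION]

# One low answer yields one suggestion; high answers yield none.
print(recommend_training({"delegation_q1": 2, "delegation_q2": 4, "change_q1": 5}))
```

Capping suggestions at one per question, as the passage above explains, keeps the prescribed list proportional to the gaps found rather than dumping the platform's full catalog on the learner.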
Following the addition of management assessments to the eLearning platform, participants were surveyed on satisfaction and intent to apply learning. On a scale of 0 to 10, participants rated the management self-assessment at a 7 for accuracy in assessing their management skills. However, 85% said they were satisfied with the assessment in enabling them to pinpoint the training they needed within the platform.
All participants reported satisfaction with the volume of recommendations. Satisfaction with prescribed training content was 8.5 on a 0 to 10 scale from “not satisfied” to “extremely satisfied.” And intent to apply the information learned averaged 8.9 on a 0 to 10 scale ranging from “none of the information was helpful” to “all of the information was helpful.”
The use of self-assessments also received high marks. When asked how likely participants were to recommend this assessment to other learners interested in building management skills, it received a 9 on a 0 to 10 scale of “not likely” to “extremely likely.” When surveyed about the general helpfulness of self-assessment in targeting management training, all participants reported it was “probably needed” or “definitely needed.” Additionally, 80% said they would use similar self-assessments if they were available for other professional learning and development areas.
This study and subsequent survey did not have a control group, because the eLearning platform used was not previously segmented by management competency.
Four Notable Outcomes from Self-assessments
The execution of, and findings from, these two self-assessments suggest that some topics are better suited than others to self-assessment as a tool for individualizing training. For technical skills like Microsoft Excel, self-assessments are simpler to construct, and abilities and knowledge can be measured more objectively. You either know how to do something, or you don’t. Self-assessing the diverse soft skills required in management is trickier and less objective.
Participation in, and perceived value of, self-assessments to pinpoint or direct training were high. In both studies, learners viewed the assessment as helpful and reported interest in additional assessments and a high likelihood of recommending them to others. Assessments worked best when learners were directed to specific prescribed training, with specific training titles and accompanying links.
Following the self-assessment and prescribed training, reported intent to apply information learned was high in both studies. This data can serve as a baseline as adjustments are made to both the self-assessments and related recommendations.
A one-size-fits-all training solution is often the easiest training option for organizations. However, using self-assessments with supporting training recommendations enables learners to pinpoint their own strengths and weaknesses, giving them greater control and incentive to apply their learning.