An employee did something, someone observed her doing it, and so a box was checked.
This oversimplification of on-the-job training (OJT) evaluation is all too often the only performance feedback employees and managers see, yet many companies rely on this much-used method to teach new employees their jobs. Shadowing an experienced employee is a valuable way to learn a new role; however, without clear performance objectives and documented, observable evidence, is it really the most effective training approach, especially for jobs that are safety-critical or extremely dangerous?
If you are asked to improve an OJT program, where should you start? And is there a more effective approach than simply using a checklist to document that an OJT task has been completed and observed? Here is an approach that may help improve OJT performance feedback at your organization.
We have both worked in industries from mining and transportation to allied health, where safety-critical skills must be documented for accrediting agencies and government regulators and, of course, reported back to employees and supervisors. We have both also seen checklists used to document an observation, yet all too often, these checklists neither capture the trainee's actual level of performance nor accurately assess his or her competency on a given skill. Our experience has shown that a performance rubric gives an organization better insight into an employee's level of training – a crucial component if the organization wants to be both safe and effective.
This new and improved OJT checklist can be more accurately called a performance evaluation form. But a new form alone will not guarantee efficacy, so we also want to share a best practice for incorporating constructive feedback, practice time and, when needed, remediation. Below, we illustrate a two-stage process, along with the performance forms needed to accurately and effectively capture an employee's performance during on-the-job training.
Part 1: Practice Exercise
The practice exercise document lists the performance objectives for completing each task, any safety equipment required and any additional references (e.g., policies, standard operating procedures and operations manuals). The detailed steps for the task appear in a checklist, and each step is evaluated against defined performance criteria. Instead of vague or unmeasurable outcomes like "demonstrates an understanding," "understands" or "knows," this format uses a rating scale from 1 ("cannot perform the step without assistance") to 4 ("can perform the step without assistance and with initiative").
Trainees should first shadow the senior or expert employee as he or she demonstrates the proper way to complete the task. Next, the trainee begins practicing the steps using the checklist while the expert employee rates him or her against the performance criteria. The expert employee can also use these practice sessions to coach and remediate the trainee in areas that fall below acceptable performance levels.
Below is an example of the format used to capture performance during the practice phase:
Part 2: Performance Assessment
Once the trainee has performed every step of the practice exercise at a level of 3 or 4, he or she is ready for a performance assessment. This assessment uses the same steps as the practice exercise but rates each step as either "pass" or "fail." For safety-critical technical tasks and equipment use, the trainee may be required to pass 100 percent of the steps. Once the assessment is complete, the expert employee documents that the on-the-job training is finished – a key step in ensuring compliance and effective OJT program evaluation.
Below is an example of the performance assessment used to capture the pass/fail for each task:
The improvements we have seen at our own organization using this method have prompted us to explore new and even better approaches to training and development beyond traditional on-the-job functions. For example, we are looking at ways to help mentors and trainees assess performance while in the field. This type of training differs from OJT in that trainees in the field are concurrently receiving formal, usually face-to-face, instruction, so facilitators, trainees, mentors and managers need a new way to assess how trainees are progressing. As we develop additional best practices and more effective approaches, we will share our findings. As L&D professionals, we are always looking for ways to give back to the L&D community, where collaboration is a true best practice.