The course has been created and delivered. Now is the time to assess what did and didn’t work in the process through the lens of learner outcomes, which is the ultimate measure of success or failure. How does the monitoring specialist report on what happened?

What Is a Monitoring Specialist?

We live in a world of analytics, where we try to capture data from users and draw meaningful results out of the seemingly random noise in the signals we have. The monitoring specialist sits at the center of the data analysis tools and log files, providing meaningful information to the stakeholders, who want to know what worked – and what didn’t – in the training.

What Is Expected of the Monitoring Specialist?

Monitoring specialists are, first and foremost, data people. They look at the reporting coming out of the system and convert it into insights about instructors, courses and learners. While they may be called upon to help develop a new survey instrument or connect a survey tool to the learning management platform, their core competencies are around extracting meaning out of the data.

They’re expected to be so intimately familiar with the data that they can spot when a course isn’t performing as well as it should (poor satisfaction scores from typically strong instructors), when instructors aren’t performing well (poor satisfaction scores on typically strong courses) and when an individual learner’s feedback should be set aside as an unreasonable outlier.
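The course-versus-instructor comparison above can be sketched in a few lines. This is a hypothetical illustration, not a real reporting pipeline: the course names, scores and the 0.5-point gap threshold are all invented for the example.

```python
# Hypothetical sketch: flag courses whose satisfaction scores fall well
# below what their instructors typically earn. Data and the gap threshold
# are illustrative only.
from statistics import mean

# (course, instructor, avg_satisfaction on a 1-5 scale) per delivery
deliveries = [
    ("Onboarding 101", "Avery", 4.6),
    ("Onboarding 101", "Blake", 4.4),
    ("Security Basics", "Avery", 3.1),
    ("Security Basics", "Blake", 3.0),
]

def instructor_baseline(records):
    """Mean satisfaction per instructor across all their deliveries."""
    totals = {}
    for _, instructor, score in records:
        totals.setdefault(instructor, []).append(score)
    return {name: mean(scores) for name, scores in totals.items()}

def flag_courses(records, gap=0.5):
    """Courses that score notably below their instructors' baselines."""
    baseline = instructor_baseline(records)
    flagged = set()
    for course, instructor, score in records:
        if baseline[instructor] - score > gap:
            flagged.add(course)
    return flagged

print(flag_courses(deliveries))
```

The same logic, with course and instructor swapped, points at under-performing instructors instead of under-performing courses.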

Monitoring specialists ideally have a background in statistics and psychometrics, so they can separate confounding factors and identify poorly performing assessment questions.
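One classic psychometric check for a poorly performing question is the item discrimination index: if a question is missed by high scorers but answered correctly by low scorers, it isn’t measuring what the rest of the assessment measures. A minimal sketch, with a made-up response matrix:

```python
# Hypothetical sketch of item discrimination: upper-group minus
# lower-group proportion correct for a single item. The response
# matrix below is invented for illustration.

def discrimination_index(responses, item, top_frac=0.5):
    """Return a value in [-1, 1] for one item.

    responses: list of per-learner lists of 0/1 item scores.
    Values near or below zero signal an item worth reviewing.
    """
    ranked = sorted(responses, key=sum, reverse=True)
    cut = max(1, int(len(ranked) * top_frac))
    upper, lower = ranked[:cut], ranked[-cut:]
    p_upper = sum(r[item] for r in upper) / len(upper)
    p_lower = sum(r[item] for r in lower) / len(lower)
    return p_upper - p_lower

# Four learners, three items; item 2 is missed by the strongest learners.
matrix = [
    [1, 1, 0],  # high scorer
    [1, 1, 0],  # high scorer
    [0, 0, 1],  # low scorer
    [1, 0, 1],  # low scorer
]
print(discrimination_index(matrix, item=2))  # negative -> flag for review
```

Real item analysis typically uses the top and bottom 27% of scorers and point-biserial correlations, but the idea is the same: an item that disagrees with the overall score is a candidate for revision.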

What Is Not Expected of the Monitoring Specialist?

Monitoring specialists can’t make up answers; they must report what the data tell them. They’re never expected to shade the truth about what did and didn’t work – though they are asked to hypothesize (and, ideally, test) why the data say what they do. Monitoring specialists aren’t expected to create content or directly evaluate its quality, which frees them to focus on choosing the right metrics and interpreting what those metrics say.

Where’s the Role Going?

Monitoring is becoming more sophisticated and more important. With the rise of large content library subscriptions, it’s both more important and more feasible to evaluate quality inside a course. For example, each video clip can be evaluated to see exactly where users abandon it to do something else. As a result, monitoring specialists’ skills are in high demand and will remain so for the foreseeable future.
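Finding where viewers abandon a clip can be as simple as bucketing each viewer’s last-watched second. A hypothetical sketch, with invented timestamps:

```python
# Hypothetical sketch: locate the stretch of a video where the most
# viewers give up, by bucketing last-watched timestamps into 30-second
# windows. The stop times below are invented for illustration.
from collections import Counter

def dropoff_histogram(last_seconds, bucket=30):
    """Count abandonments per fixed-width time bucket of a clip."""
    return Counter((s // bucket) * bucket for s in last_seconds)

def worst_bucket(last_seconds, bucket=30):
    """Start (in seconds) of the bucket where the most viewers stopped."""
    hist = dropoff_histogram(last_seconds, bucket)
    return max(hist, key=hist.get)

# Second at which each viewer stopped watching a five-minute clip.
stops = [95, 100, 102, 110, 290, 45, 99, 118]
print(worst_bucket(stops))  # the 90-119s window loses the most viewers
```

A spike like this might send the specialist back to the content team with a concrete question: what happens around the 90-second mark of that clip?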

The Good, the Bad and the Ugly

Since this role isn’t involved in producing content or maintaining platforms, monitoring specialists are often in a relatively low-stress position. However, like every role, this one has its good, bad and ugly:

  • Good: Vacations are rarely rescheduled, because this role isn’t on the critical path of projects and doesn’t have to keep the system running.
  • Good: Day-to-day work allows for high degrees of flow because of infrequent interruptions by other team members.
  • Bad: The tools are still emerging, and much of the specialist’s time may be spent implementing the systems he or she needs for reporting.
  • Ugly: Even with the rise of data, there’s never enough to answer all the questions that the business wants answered.
  • Ugly: Conclusions aren’t foolproof, and specialists must always be ready to accept that they may have drawn the wrong conclusion from the data they had.

Want to learn more? Our Midwinter Month of Measurement is leading up to our next virtual conference, TICE Virtual Conference: Metrics Matter, a Focus on Strategic Planning, Analytics and Alignment. Learn more and register for the free event here.