Published in July/Aug 2019
In 2017, my sales enablement organization tapped me, a long-time performance consultant, to lead and form a team whose focus would be learning measurement and analytics. While many organizations claim they focus on measuring learning, few truly dedicate a team to support it.
At that time, we had a two-member reporting function within the enablement operations team that provided reports on course completions via Excel. We relied on those individuals and the enablement leads across departments to use filters and pivot tables for further analysis. Prior to our team’s creation, the enablement operations team distributed non-standardized questionnaires to course and event participants without leveraging any measurement approach. Then, the same two individuals reported the results. Our leaders recognized a need.
Getting Started
Tasked not only with establishing the initial focus and objective but with structuring the team as well, I sought guidance from mentors and set my mind on building a collaborative and innovative team. Within a few days, we had a team name – learning measurement, analytics and seller experience – and a general focus.
Our team – comprising our two business analysts, a program manager for our seller experience program, a strategic initiatives lead, an outgoing contractor who supported our dashboard creation and me – gathered in person for a few days, taking advantage of a rare chance for a globally dispersed team to meet at our headquarters. This made the beginning steps doable.
First, we needed to define our team’s mission statement. What wording would serve as our focus and our reference point for direction? While each of us relied upon technology to work and connect virtually, we appreciated and leveraged our face-to-face time. We discussed ideas, shared thoughts and wordsmithed. The mission statement itself didn’t take long, as we shared this common vision:
To provide actionable insight into enablement’s impact toward the achievement of business outcomes aligned to company priorities.
There were many additional, valuable words left on that whiteboard. Words like strategy, passion, zeal, optimistic, connect and balance. These words were ours, not for external communications but to remind ourselves what we brought to this new team.
Although not mentioned specifically in our mission or principles, we all agreed that we did not want to be considered a managed service. Our team’s focus was to guide the metrics maturity of our organization. Through measurement and analysis, we could arm our performance consultants with clear insights, as well as the data behind the stories they tell. We knew that would take dedication and change management.
Measurement Approach
To my good fortune, every person in that room had a background in the Phillips Levels of Evaluation, and most had achieved certification. The true foundation of this team would be those measurement levels.
Internal Focus
After our face-to-face meeting, the real work began. How we functioned as a new team was crucial to our success. Our basic procedures and principles included:
• Team meetings – At first weekly and later transitioning to every two weeks, these were the cornerstone events that brought our geographically dispersed team together.
• Communication – We never wanted unilateral decisions made on how to improve or remove processes, tools, and approaches to measurement and analytics. Each team member provided crucial input in these areas.
• Collaboration – As processes and tools needed to evolve, we joined forces to implement that change. Each team member led, and each team member supported multiple areas as we moved forward.
External Focus
Over the next year, we balanced completion and compliance metrics while maturing the reaction and satisfaction, application, and business impact measurements – knowing that in the future we could stand by return on investment (ROI) measurements.
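For context on that final level, the ROI calculation in the Phillips methodology compares program benefits to fully loaded program costs; a standard formulation, independent of any specific program data of ours, is ROI (%) = (program benefits − program costs) ÷ program costs × 100.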
Change management served as a key aspect of our external focus. Renaming what had formerly been referred to as the reporting team to the measurement and analytics team was significant. To our enablement organization, this change clarified our intention to be a partner in the business. To other industry professionals, it indicated the level of seriousness our enablement organization brought to the table. On each conference call we attended, we no longer used the word reporting. Our team scheduled lunch-and-learns and attended other teams’ calls to clearly represent who we were, our mission and how to engage our support.
Successes
We knew that as an enablement organization and as a team, we were moving toward deeper, more insightful learning measurement. Each small victory led to the larger victories chronicled below:
Implementation of measurement levels across the enablement organization: The implementation of Phillips’ measurement methodology was, and remains, vital. The pyramid of measurement levels became the symbol for the intended journey both internally and externally. In conversations with field leaders, we communicate our intention to provide value to, and partner with, the business.
Introduction of measurable objectives: Fundamental to our efforts, and helped in great part by team leaders and partner centers of excellence, was the demand for more detailed objectives when planning a course, quarterly plans and yearly goals. Our team provided examples and worked with teams to turn needs into clear, measurable objectives. This alone helped our other enablement teams be clear on expectations as we planned, as well as during quarterly business reviews where we detailed what was accomplished.
Standardization of surveys: Although surveys had been developed, distributed and reported on previously, there was no uniformity. Creating survey templates was among our first tasks as a team. Both our team and enablement operations owned building them in the survey platform. We identified key metrics for measurement. This way, regardless of the target audience, the course or the event, we had standard metrics across the board (a brief sketch of computing the first two appears after this list):
• Net promoter score (NPS)
• Confidence change
• Relevancy to role
• Anticipated business impact
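To make the first two metrics concrete, here is a minimal sketch of how they can be computed from raw responses; the function names, the 0-to-10 NPS scale and the 1-to-5 confidence scale are illustrative assumptions rather than our actual survey schema.

    def net_promoter_score(scores):
        # NPS = percent promoters (9-10) minus percent detractors (0-6) on a 0-10 scale
        promoters = sum(1 for s in scores if s >= 9)
        detractors = sum(1 for s in scores if s <= 6)
        return 100 * (promoters - detractors) / len(scores)

    def confidence_change(before, after):
        # Average shift in self-reported confidence (e.g., 1-5 scale), pre- vs. post-course
        return sum(a - b for a, b in zip(after, before)) / len(before)

    # Example with made-up responses
    print(net_promoter_score([10, 9, 8, 7, 10, 6, 9]))   # about 42.9
    print(confidence_change([2, 3, 3, 4], [4, 4, 3, 5]))  # 1.0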
Progress from a reporting managed service to a consultative business partner: Our team’s efforts included layering the beginning measurement level of completion and compliance with survey data, and then guiding the enablement leads toward identifying the behaviors that would indicate application had occurred. In many cases, we were able to connect those to the business changes expected by field leaders. In one instance, we leveraged data related to the value selling workshops held to enable a methodology and tool for our sales force. We combined the available workshop completions with survey metrics, monitored tool usage as an application metric across teams and worked with an extended team to use average deal size to show impact.
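As an illustration of how those sources can be layered into a single view, a sketch along the following lines could be used; the file names and columns are hypothetical stand-ins, not our actual data model.

    import pandas as pd

    # Hypothetical extracts; file and column names are illustrative assumptions.
    completions = pd.read_csv("workshop_completions.csv")  # team, completions
    surveys = pd.read_csv("survey_metrics.csv")            # team, nps, confidence_change
    usage = pd.read_csv("tool_usage.csv")                  # team, active_users (application proxy)
    deals = pd.read_csv("deal_sizes.csv")                  # team, avg_deal_size (impact proxy)

    # Layer completion, reaction, application and impact measures into one view per team.
    view = (completions.merge(surveys, on="team")
                       .merge(usage, on="team")
                       .merge(deals, on="team"))

    print(view.sort_values("avg_deal_size", ascending=False))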
With our corporate business insight team, we partnered to move the measurement of this program from Level 1 to Level 4. With the business impact data, our team progressed through the levels of measurement to go beyond reporting. We, along with enablement leads who represented various participant groups, could analyze these results for changes over time and achievement of set targets, and correlate enablement to the business through them. These accomplishments represent our position as a true partner to our enablement leads as they seek to satisfy the demands of the business and their field leaders.
While the team now consists of additional members and the enablement organization changes constantly, we continue in the right direction, with our methodology grounding our focus on measuring and analyzing – not only reporting.