Published in July/August 2020
It’s Tuesday morning, and managers from the federal government gather virtually for an online data literacy course. Some hail from human resources; others come from policy teams. But all of them share the same objective: to learn how to use data more effectively in their roles. With that, the instructor launches into material covering the data analytics maturity model, describing the stages an organization passes through on its way to being data-driven.
The maturity model starts with descriptive analytics, which can answer questions about what has already happened. For example, a customer service center may track and report on the number of calls received in an hour, day or week. A learning and development (L&D) organization may track the number of enrolled students or percentage of passing grades. We are then responsible for taking this information and using it in our decision-making. This type of analysis is valuable and relatively easy to deploy. Nobody in the class has any questions.
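For readers who want to see what this looks like in practice, a descriptive report like the call counts above can be produced in a few lines of Python. This is a minimal sketch only; the pandas library and the sample timestamps are assumptions chosen for illustration, not part of the course.

```python
import pandas as pd

# Hypothetical call log: one row per call received (illustrative data only).
calls = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2020-06-01 09:15", "2020-06-01 10:40",
        "2020-06-02 11:05", "2020-06-08 14:30",
    ])
})

# Descriptive analytics: summarize what has already happened.
calls_per_day = calls.resample("D", on="timestamp").size()
calls_per_week = calls.resample("W", on="timestamp").size()

print(calls_per_day)
print(calls_per_week)
```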
As we move up the continuum, the instructor says, analytics become more valuable yet more difficult. Diagnostic analytics, for example, can explain why something happened and help managers identify root causes. However, more detailed data is required. A customer service center may tie sudden spikes in call volume to events such as recall announcements or software updates. Participants start to murmur about how difficult it is for them to access data.
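A hypothetical sketch of that kind of diagnostic analysis, flagging unusually busy days and matching them against a log of known events, might look like this (both tables are invented for illustration):

```python
import pandas as pd

# Hypothetical daily call volumes and a log of known events (both invented).
volume = pd.DataFrame({
    "date": pd.to_datetime(["2020-06-01", "2020-06-02", "2020-06-03"]),
    "calls": [120, 480, 130],
})
events = pd.DataFrame({
    "date": pd.to_datetime(["2020-06-02"]),
    "event": ["recall announcement"],
})

# Diagnostic analytics: flag spike days, then look for a root cause.
spikes = volume[volume["calls"] > 1.5 * volume["calls"].mean()]
diagnosis = spikes.merge(events, on="date", how="left")
print(diagnosis)
```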
Once discussion moves into the realm of data science — which includes predictive analytics, prescriptive analytics and artificial intelligence — the instructor remarks that more than just detailed data is required. The data must also be clean, stable and supported by a coordinated governance strategy. There must also be an innovative environment, the right tools and knowledgeable employees within the organization. The benefits, however, are great. You can leverage data to predict what will happen in the future, let data prescribe how to make decisions, and do so continuously and at scale. Participants rattle off great ideas for utilizing data in these ways but don’t know where to start.
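Predictive analytics can be as approachable as fitting a simple model to history. The sketch below assumes the scikit-learn library and invented practice-versus-score data; real projects would, of course, rely on far richer datasets.

```python
from sklearn.linear_model import LinearRegression

# Invented history: hours of practice vs. final assessment score.
hours = [[1], [2], [3], [4]]
scores = [55, 65, 78, 88]

# Predictive analytics in miniature: learn from the past, project the future.
model = LinearRegression().fit(hours, scores)
print(model.predict([[5]]))  # projected score after five hours of practice
```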
As the course unfolds, participants are equipped with the knowledge and skills necessary to start planning and executing data projects. However, chances are many of you currently feel the same way these students did in the first hour of the class.
The idea of applying analytics to learning data isn’t new, and the benefits of doing so have been covered extensively. More and more, instructional designers and content creators are called upon to incorporate analytics into their processes, particularly relating to learner analysis and evaluation. However, most learning professionals don’t see themselves as data scientists and have not been exposed to extensive data analysis.
Meanwhile, data is integral to the instructional designer’s role. When designing curricula or choosing class activities to reinforce learning, instructional designers often rely on research-based best practices. Likewise, they find that the highest levels of the Kirkpatrick Model of Evaluation can be measured only with data-driven assessment approaches.
There is little difference between learning analytics and data analytics as a whole, so there are lessons to be learned from our data science brethren. Here are some steps learning leaders can take to start moving up the data analytics maturity model without returning to school for a degree in data science.
Utilize Seasoned Data Professionals
There’s no need to reinvent the wheel. Any good data scientist will routinely borrow and reuse tested code from colleagues and other projects. If your organization already has a data team, tap into that resource. Bring them a specific, measurable and objective question, as well as an idea of the available data. They’re sure to have some ideas. For instance, to determine the best syllabus for an executive-level course, you might ask your data team to look at which topics are most frequently cited, both favorably and unfavorably, in course evaluations. The text mining methods needed to answer that question are beyond the in-house expertise of many L&D teams.
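Real text mining projects use proper natural language processing, but a stripped-down sketch of the idea, counting topic mentions across evaluation comments, can be written in plain Python (the comments and topic list here are invented):

```python
from collections import Counter

# Invented free-text course evaluations and topics of interest.
comments = [
    "The negotiation module was excellent",
    "More time on negotiation, less on budgeting",
    "The budgeting section felt rushed",
]
topics = ["negotiation", "budgeting"]

# Count how often each topic is mentioned across all evaluations.
mentions = Counter(
    topic for c in comments for topic in topics if topic in c.lower()
)
print(mentions.most_common())
```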
Start with Pain Points
Data Society offers highly customized courses that teach complex data science concepts and techniques. Whether it’s a class in R, Python or another programming language, learners need to get their hands dirty and practice new techniques on real data. This once posed a huge challenge for our content creators: programming languages are dynamic, and datasets change constantly to meet client needs, which meant continually updating our training materials with fresh code. Fortunately, data scientists like to optimize. In this case, they built a bespoke content authoring tool that allows us to automatically update code snippets in instructional materials and standardize formatting. The result was improved quality and huge time savings: the team reduced its creation time for an hour of content from 40 hours of debugging, topic alignment and formatting to just five. If that doesn’t get content developers lined up for a ride on the data train, nothing will.
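Data Society’s actual authoring tool is proprietary, but the core idea, keeping each code snippet in one place and regenerating materials from templates, can be sketched in a few hypothetical lines:

```python
from string import Template

# Hypothetical slide template with a placeholder for a live code snippet.
slide = Template("To load the client data, run:\n\n    $snippet\n")

# The snippet lives in one place; regenerating the materials always pulls
# the latest version, instead of hand-editing every slide deck.
latest_snippet = 'df = pd.read_csv("clients.csv")'
print(slide.substitute(snippet=latest_snippet))
```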
If you don’t have a team of data scientists on hand to help you innovate at this scale, start smaller. The point is to start looking at your pain points and thinking about how technology and data can help you find a better, more efficient way to work.
Set Interim Milestones
While you should never shy away from developing a long-term vision for leveraging data, set interim milestones to hit along the way. Data scientists learn basic programming syntax and modeling methods long before they learn advanced neural networks, and some never do if neural networks aren’t crucial to their work. If your instruction runs on Google Docs and PowerPoint, don’t expect to set up an advanced analytics program by the end of the week. You may need to start with a task as simple as cataloging the types and descriptions of the data you have access to, known to data scientists as a data dictionary. Likewise, you may never need to employ artificial intelligence if your business needs stop at predictive analytics.
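A data dictionary does not require special software; even a simple table maintained in code or a spreadsheet works. The fields below are hypothetical examples of what an L&D team might catalog:

```python
# Hypothetical starter data dictionary: one entry per field you can access.
data_dictionary = [
    {"field": "learner_id",  "type": "string",  "description": "Unique ID assigned by the LMS"},
    {"field": "course_code", "type": "string",  "description": "Catalog code for the course"},
    {"field": "final_score", "type": "float",   "description": "Final assessment score, 0-100"},
    {"field": "completed",   "type": "boolean", "description": "Whether the learner finished"},
]

for entry in data_dictionary:
    print(f"{entry['field']:<12} {entry['type']:<8} {entry['description']}")
```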
Understand and Advocate for Data Governance
Data governance is the management of the overall quality, integrity, relevance and security of available data. L&D teams must get data governance right before any high-powered analytics are possible, and it is where many learning organizations are currently stuck. Instructional designers may not be privy to what relevant data is collected, how it is stored and secured, who owns it, or its level of quality. This is especially true if useful data is collected by another department or external entity. Asking around for this information is a good first step, as what you find may trigger ideas for exciting analytics.
Many organizations have designated learning data as low priority, but you can advocate for improved governance if you demonstrate how the data will be useful. Other organizations, especially small ones, may have only newly established governance strategies, and you can lead the charge to set standards for data entry, data checking and data ownership relevant to your goals.
Get the Tools You Need
By this point, most L&D organizations have implemented content authoring tools and learning management systems. These tools spit out useful data, such as the number of enrolled students, time spent online and final test scores. Some companies may have even invested in the latest technologies, like the experience application programming interface (xAPI). These are all great tools for collecting data, but analytics involves much more. The canned reports and dashboards these tools provide may not give you what you need; additional tools for storing, cleaning, analyzing, collaborating on and visualizing data may prove more useful. Ask around your organization to see what tools are available to you, and consider taking a data literacy course to understand how they can enable your work.
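As one small, hypothetical illustration of going beyond a canned report, a general-purpose tool like Python’s pandas library can clean a messy LMS export before any analysis:

```python
import pandas as pd

# Invented LMS export, with the messiness real exports often have:
# a missing name, a duplicate row and scores stored as text.
df = pd.DataFrame({
    "learner": ["Ana", "Ben", "Ben", None],
    "score": ["85", "72", "72", "90"],
})

# Cleaning a canned dashboard won't do for you: drop blanks and duplicates,
# fix the data types, then analyze however you like.
clean = df.dropna().drop_duplicates().copy()
clean["score"] = clean["score"].astype(float)
print(clean["score"].describe())
```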
Build a Data-Driven Culture
One of the biggest factors influencing a data program’s success is the culture built around it. Whether the program is enterprise-wide or focused specifically on learning, you need a leader to champion the use of data through actions such as highlighting successful data projects in newsletters or at events, asking for data in meetings, or bringing in experts for “lunch and learns.” You and others may want to attend data-related trainings and conferences or start a community of practice.
With time and practice, L&D organizations will begin to move up the data analytics maturity model. As they do so, content creators will have more information available to optimize learning experiences than ever before. What would you like your data to tell you?