Learning Analytics at Learning 2018
In advance of his Learning 2018 conference, Elliott Masie called for a deeper understanding of learning analytics. While at the conference, I summarized lessons learned from several presentations on this topic. Learning analytics in corporate learning and development is at a very early stage.
Elliott Masie has hosted a succession of events at Disney over almost 30 years. A colleague of mine attended one of the early events, when the focus was on computer training, and came back transformed. I went myself in 2000, when it was called TechLearn, and I was excited to be back for the newest version. My primary interest was learning analytics: I wanted to see how corporate learning and development departments were approaching the issue, in contrast to my experiences in higher education.
Do we really need a learning blockchain? (Opening Session)
At the opening of Learning 2018, Elliott Masie promised that a major focus of the new year for his company would be developing a learning blockchain. The Twilight Zone aspect of this was that the last time I attended a Masie conference, in 2000, the focus was on SCORM with the same idea: providing access to distributed learning content. SCORM failed to achieve this vision, and despite the fancy new buzzword, blockchain will also fail.
Why? Because content is not learning. Content is an ingredient of learning, but only a fool mistakes the finger pointing at the moon for the moon. Elsewhere in the opening, Masie introduced a web-designing Broadway star, who sang a couple of songs, and a graphics effects artist. In both cases, the artists described the portfolio that represented their work. In the graphics effects example, Masie showed a before-and-after eight-second scene from a zombie show. Neither speaker talked about the content they used to learn their craft. The Broadway star has a degree in industrial engineering, so her learning blockchain would include content unrelated to her ability to sing and perform on stage.
So what do we need instead?
We do not need to know what content someone has consumed. We need to know what they can do with that content. We need more portfolios in more career fields and other forms of authentic assessment that demonstrate how learners can apply content.
What does this mean for learning design?
Masie asked participants to assemble some building blocks at each table, then trade pieces with another table, to illustrate his metaphorical vision of a learning blockchain. The idea is that a learner can assemble content from a variety of sources to create a personalized learning record. In practice, people do this all the time. Faced with a challenge around the house, I head to my preferred search engine, which usually turns up a slew of YouTube videos along with help forums and other resources. In the case of YouTube, there is a built-in ordering based on how people have engaged with the content in the past (an example of big data and machine learning). We also have access to metadata such as the number of views, another clue to the authority of the content.
If we design learning with the end in mind, we begin with the skill or competency we want learners to develop as a result of the learning. Learning design should focus on developing the task prompt that triggers the performance. Suggested learning resources can and should be provided, but learners should be free to discover their own resources as needed. At the end of the learning, the learner creates a solution to the problem. This solution becomes the demonstration of what the learner has learned.
In higher education, one of the more interesting developments in teaching is the flipped classroom. Rather than using limited classroom time for the professor to lecture, lecture is shifted to online video and other content. The classroom becomes a place where students work on problems (what was known as homework), with the professor serving as a mentor and coach, facilitating solutions and providing just-in-time support.
What about compliance?
I have worked in a highly regulated industry (higher education) for over two decades and have served as a compliance officer, so I understand the concern that this approach leads to variation. The solution is to build compliance requirements into the specifications for the final product. It is less important that training covers compliance requirements and more important that behaviors are compliant. When compliance or other standards matter, build them into the specifications, assess them, and provide feedback during the learning.
The reality is that learning has a short shelf life anyway. If learners do not use what they learn right away and see its value, it is hard for them to retain it and even harder to apply it on the job. Problem-based learning and similar approaches that connect learning to application during the learning process provide a more effective model in the first place.
Metrics and Measurement: Evolving your strategy (Kathy Tague – Guardian)
This was my first session at Learning 2018. There was plenty of empty seating at an 8 a.m. session on metrics. Learning professionals tend to be people-oriented, not data-oriented. Many people have been traumatized by numbers; those who are not gravitate towards engineering and other numbers-based fields. Tague's key point: learning drives business results, and metrics demonstrate those results.
Four Lessons Learned
One: Key performance
- Start by defining the goal: what does great look like? Metrics are an agreement on measurable outcomes aligned to business results. Involve stakeholders in the process.
- Keep it simple to find the right balance of not too much and not too little data.
- Be aware of the difference between learning operations data and business results. Learning operations data can be useful but does not demonstrate the value of learning.
Two: Business case
- Build a business case for learning using metrics. What are the business goals? What are the pain points? Stakeholders need immediate results, not just four-year impact.
- Add a financial model to show the business impact of increasing productivity (or whatever the learning goals are). Show improvement towards the goal even if targets are missed.
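The kind of financial model described here can be small. The sketch below is my own illustration, not Tague's model; all figures, the function name, and its parameters are invented for the example.

```python
# Hypothetical financial model for the business impact of a training program.
# Every number below is an illustrative assumption, not data from the session.

def training_roi(employees, hours_saved_per_week, hourly_cost,
                 weeks_per_year, program_cost):
    """Annual return on a training program that increases productivity."""
    annual_benefit = (employees * hours_saved_per_week
                      * hourly_cost * weeks_per_year)
    net_benefit = annual_benefit - program_cost
    return net_benefit / program_cost  # ROI as a multiple of program cost

# Example: 200 employees each save 1 hour/week at $50/hour over 48 weeks,
# against a $120,000 program cost.
roi = training_roi(employees=200, hours_saved_per_week=1, hourly_cost=50,
                   weeks_per_year=48, program_cost=120_000)
print(f"ROI: {roi:.1f}x")  # prints "ROI: 3.0x"
```

Even a rough model like this makes the conversation with stakeholders concrete: the inputs are assumptions you can debate, and the output shows progress towards the goal even when targets are missed.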
Three: Interpret the data
- Combine quantitative and qualitative data, using a consistent process for collecting qualitative data from conversations.
- Use the results to make changes to the program and drive improvement.
Four: Fear of numbers
- Develop analytics competency within the team rather than hiring it in. Everyone needs data fluency.
Tague works in insurance, a highly quantitative environment, and that context has been important in developing an approach that remains focused on learning while using metrics to demonstrate its value and to guide improvement.
Maximizing the impact of learning & development (Deloitte)
In higher education we have focused on assessing student learning for over twenty years. I have been personally responsible for developing learning assessment at three different universities. Our primary interest is in measuring what students learn in their courses and programs, but we have also looked at outcomes after graduation, along with student satisfaction and retention. Unlike corporate learners, our students pay for their learning, and it is essential that we keep them enrolled and progressing towards graduation.
One of my goals at the conference was building bridges between corporate learning and higher education, especially around the issue of learning analytics. This session from Deloitte provided a great example of the parallels between higher education and corporate learning.
Framework for Measuring Learning Impact
Deloitte created a framework for internal talent and development including:
- Value and impact on business – alignment
- Solution effectiveness – strategically look at all offerings and how they meet business needs, as well as how individual learning products meet specific objectives.
- Impact mindset – the capability of the talent and development team to focus on improving the business rather than being order takers. Shift from taking an order to deliver a live meeting (a solution without a defined purpose) to understanding business needs and collaboratively finding a solution.

Create a macro/strategic-level language:
- Business alignment and impact
- Innovation – creative problem solving to advance learning
- Effectiveness – are people learning and developing?
- Efficiency – how to do more with less

Use this framework to support the annual planning process and to review new projects.
Learning Design addresses:
- What are we designing and why?
- What are the business objectives?
- What are learners expected to know?

These questions are primarily addressed through an end-of-course survey covering four areas of assessment:
- Improved learner performance
- Content relevance, usefulness, and applicability
- Acquisition of new skills or knowledge
- Learner perception of the learning process
In one project, this was augmented by additional measures including review of knowledge check results and supervisor focus group feedback on learner performance after training. A capstone exercise was also used to integrate learning and evaluate application of learning.
In higher education, we talk about direct measures of learning including student performance on tests, projects, and other results of learning and indirect measures including survey data. While indirect measures are important, direct measures are essential.
A missed opportunity in the presentation was segmentation of learners. In the context of corporate learning, new hires have different needs and opportunities for growth than experienced employees. A training program will often create wins for new employees that other employees do not share. In higher education, we have students who are starting college, in the middle, and at the end. Effective assessment measures these different stages of learning.
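To make the segmentation idea concrete, here is a minimal sketch of comparing training outcomes by learner segment. The data, segment labels, and the `performance_gain` field are all invented for illustration.

```python
# Sketch: segment training outcomes by learner cohort (new hire vs.
# experienced). All records below are invented illustrative data.
from statistics import mean

responses = [
    {"segment": "new hire",    "performance_gain": 0.30},
    {"segment": "new hire",    "performance_gain": 0.25},
    {"segment": "experienced", "performance_gain": 0.05},
    {"segment": "experienced", "performance_gain": 0.10},
]

# Group the gains by segment so each cohort can be assessed separately.
by_segment = {}
for r in responses:
    by_segment.setdefault(r["segment"], []).append(r["performance_gain"])

for segment, gains in by_segment.items():
    print(f"{segment}: mean gain {mean(gains):.3f}")
```

An aggregate mean across all four learners would hide exactly the pattern the segmentation reveals: the program works well for new hires and barely moves experienced employees.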
You are not your learner! (Becca Sharon – Expedia Group)
In a world of information overload, one of the challenges for learning producers is to stand out. A product approach provides both context and process for designing and packaging learning products in response to learner needs and desires. The model has its origins in digital product development and management.
- What do you do when learning is not mandatory?
- How do you get to know your learner?
Products should meet an immediate need for users who engage with and benefit from the product.

Move beyond the requirements of stakeholders:
- Increased effectiveness – engagement
- Decreased wasted resources
- Influencing power – focus on the user rather than the stakeholder or designer perspective; influence upward with data
Start with clear problems, not solutions:
- Problem statement
- Hypothesis – what is the expected outcome of the product?
- Competitive intelligence – has anyone solved this before?
Consistently engage with users:
- User interviews – empathy interviews with 5-7 people
- User personas – goals, pain points, and values: a general picture of a group of users for each product
- Ideation – a group of 5-10 people working on how to solve the problem for 90 minutes to 2 hours
- Product testing – an iterative process with partial products such as an outline
Develop iteratively to build, measure, and learn:
- Minimum viable product (MVP) – solve a problem for immediate impact and feedback
- Prototyping – pre-MVP
- Prioritized features – what goes into the MVP
The product approach draws on design thinking, which engages users in collaborative problem solving, along with agile project management. Agile minimizes the risk of investing resources in a project that misses the target: projects are developed in stages, with delivery to the user at each stage, enabling immediate course corrections. The common theme is collaborative development that engages users at every step of the process.
Analytics-driven curation: Connect to your learners (PwC)
My doctorate is in planning. Somewhere in planning school, I learned that you make sidewalks where people walk across the grass. The best design adheres to existing behaviors. Rather than trying to change human behavior (which is hard), design your product to meet their needs.
In learning, how do we know where people are walking through the grass? PwC's curation team uses analytics to identify how people access content and to improve its services accordingly. They look at content by strategic area, modality, and topic. The data distinguishes searches that led to matches from those that left the learner with no satisfaction, which helped identify the gaps that needed to be filled.
- Be in the moment – fit their workflow with the location and modality of assets.
- Pairing is caring – mix technical and nontechnical assets (people look for technical subjects, so pair them with the nontechnical skills people are not looking for…top down and bottom up).
- Ride the wave – what is the next big thing? Build content in advance of future hot topics.
- Don't lose your heroes – use packaging and marketing to showcase existing assets.
- Use your sundial – align with the timing of business activities. Different topics will be in demand at different times of the year.
- Curate local, think global – collaborate with other parts of the organization to track usage outside of known channels.
- Have a piece of flair – use visuals to increase engagement.
- Be a super sleuth – discover the meaning of hidden behaviors, wants, and needs.
The overarching theme here is the idea of learning as a product. Rather than building learning and pushing it out to learners, build learning that users want and need, and deliver it in the way and at the time they need it.
The session also demonstrates the use of analytics to support the learning design process. The data provided useful indicators of what learning products to create and how and when it is best to deliver the learning.
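The gap analysis behind that kind of curation can start very simply: count searches that returned nothing. The sketch below is my own illustration, not PwC's system; the log format, field names, and queries are all assumptions.

```python
# Sketch: find content gaps by counting searches that returned no matches.
# The log structure and queries below are invented for illustration.
from collections import Counter

search_log = [
    {"query": "python basics",       "results": 14},
    {"query": "blockchain audit",    "results": 0},
    {"query": "blockchain audit",    "results": 0},
    {"query": "presentation skills", "results": 9},
    {"query": "rpa controls",        "results": 0},
]

# Frequently repeated empty searches are the strongest gap signals.
gaps = Counter(e["query"] for e in search_log if e["results"] == 0)
for query, misses in gaps.most_common():
    print(f"{query}: {misses} unmatched searches")
```

Sorting gaps by frequency turns raw search logs into a prioritized backlog of learning products to build, which is exactly the "where people walk through the grass" signal described above.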
The most original take-away was the tactic of pairing content learners seek out with the learning they need but do not realize they need. Like hiding vegetables in another dish, this strategy gets learners to willingly consume content they might otherwise avoid.
The next step in this process is to add automation to provide a recommendation service like Netflix or Spotify.
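One common way such services work is item co-occurrence: recommend content that other learners consumed alongside what you consumed. The sketch below is a deliberately minimal version of that idea, not how Netflix or Spotify actually implement it; the viewing histories and titles are invented.

```python
# Sketch: "learners who viewed X also viewed Y" via item co-occurrence.
# Histories and titles below are invented for illustration.
from collections import Counter
from itertools import combinations

histories = [
    {"excel basics", "pivot tables", "data viz"},
    {"excel basics", "pivot tables"},
    {"data viz", "storytelling"},
]

# Count how often each ordered pair of items shares a learner's history.
co_occurs = Counter()
for h in histories:
    for a, b in combinations(sorted(h), 2):
        co_occurs[(a, b)] += 1
        co_occurs[(b, a)] += 1

def recommend(seen, k=2):
    """Rank unseen items by co-occurrence with what the learner has seen."""
    scores = Counter()
    for item in seen:
        for (a, b), n in co_occurs.items():
            if a == item and b not in seen:
                scores[b] += n
    return [item for item, _ in scores.most_common(k)]

print(recommend({"excel basics"}))  # prints ['pivot tables', 'data viz']
```

Real recommenders add normalization, recency, and scale, but even this toy version shows how existing consumption data can drive the automated referrals the session envisioned.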
Summary from Learning 2018
In general, learning analytics was not a prominent topic at the conference, which failed to deliver on the promised deeper examination of analytics. Greater emphasis was placed on blockchain and on tracking what learning someone had been exposed to than on measuring the impact of that learning.