Analytics 2.0 for e-learning

My previous post on analytics for online learning made a few simple but effective suggestions. These were all focused on using data from Google Analytics to gain meaningful insights which you could then use to improve the online learning experience.

However, this is only one aspect of a robust measurement plan for online learning courses. Much of the following post is an adaptation of work by Google’s Analytics Evangelist, Avinash Kaushik, in his bestselling book Web Analytics 2.0. Although he focuses on for-profit businesses, many of his insights can easily be translated to the field of online learning.

The 5 pillars of online learning analysis

1. The What: Clickstream

This is the data you get from Google Analytics. Basic metrics such as:

  • what learners click on (assignments, resources, downloads, etc.)
  • how long it takes them to complete a particular task
  • where learners drop off in a course
  • whether learners return to particular pages or watch a particular lecture more than once
  • whether learners complete the course in one go or in separate sessions
  • the order in which a learner completes the course
  • which devices and browsers they use
  • and so on – see more in my previous post on Google Analytics for e-learning
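
If your course pages already carry the standard Google Analytics (analytics.js) snippet, capturing interactions like these takes only a few lines of code. Here is a minimal sketch; the event category, action and label names are my own illustrative choices, not a prescribed scheme:

```typescript
// Minimal sketch: sending learner interactions to Google Analytics as
// events, assuming the standard analytics.js snippet is already loaded.
// Category/action/label names below are illustrative.
declare const ga: (...args: unknown[]) => void;

// Fired when a learner starts a lecture video.
function trackLecturePlay(lectureTitle: string): void {
  ga('send', 'event', 'Lectures', 'play', lectureTitle);
}

// Fired when a learner downloads a resource.
function trackResourceDownload(fileName: string): void {
  ga('send', 'event', 'Resources', 'download', fileName);
}

// Example wiring: report every PDF download link click.
document.querySelectorAll<HTMLAnchorElement>('a[href$=".pdf"]')
  .forEach((link) => {
    link.addEventListener('click', () => trackResourceDownload(link.href));
  });
```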

2. The How Much: Multiple outcomes analysis

This basically means tying data to outcomes. According to Avinash Kaushik, in the world of business:

A website attempts to deliver just three types of Outcomes:

  • Increase revenue
  • Reduce cost
  • Improve customer satisfaction/loyalty

In the world of learning, a web-based course attempts to deliver different outcomes:

  • Increase achievement of learning objectives
  • Increase retention rates
  • Increase efficiency of learning process
  • Improve learning experience
  • Encourage learners to try other courses

It is interesting that we are not just replacing “outcomes” with “learning objectives.” Multiple outcomes analysis forces us to think about the purpose of the online learning experience in a broader way. We have to look not just at Google Analytics but at data from social media and surveys. This is ultimately better for learners.
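
As a rough sketch of what this broader view might look like in practice, here is a hypothetical “outcomes scorecard” that pulls numbers from several sources into one place. Every field name and figure below is illustrative, not a prescribed standard:

```typescript
// Illustrative sketch only: an "outcomes scorecard" combining numbers
// from different sources (clickstream, surveys) so a course is judged
// on more than one outcome. All names and figures are hypothetical.
interface CourseOutcomes {
  objectivesAchievedRate: number; // share of assessed objectives passed (0-1)
  retentionRate: number;          // completions / enrolments (0-1)
  avgHoursToComplete: number;     // efficiency of the learning process
  satisfactionScore: number;      // e.g. mean survey rating out of 5
  followOnEnrolmentRate: number;  // learners who tried another course (0-1)
}

function retentionRate(enrolled: number, completed: number): number {
  return enrolled > 0 ? completed / enrolled : 0;
}

// Example: 412 learners enrolled, 67 completed.
const outcomes: CourseOutcomes = {
  objectivesAchievedRate: 0.74,
  retentionRate: retentionRate(412, 67), // ~0.16
  avgHoursToComplete: 9.5,
  satisfactionScore: 4.1,
  followOnEnrolmentRate: 0.22,
};
console.log(outcomes);
```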

3. The Why: Experimentation and testing

With Google Analytics and Google’s free Content Experiments tool (which has replaced its Website Optimizer), it is remarkably easy to create two or more slightly different versions of a course with the same learning objectives. You can then drill down into the data to see which is the most effective – and, more importantly, why.
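
Under the hood, such an experiment comes down to something like the sketch below. This is not the Content Experiments API itself, just the bare idea: assign each learner to a variant at random, keep that assignment stable between visits, and record it in Google Analytics so the results can be segmented by variant:

```typescript
// Minimal sketch of the underlying idea (not the Content Experiments
// API itself): randomly assign each learner to variant A or B, keep the
// assignment sticky via localStorage, and report it as an event so
// completion, drop-off and quiz results can be compared per variant.
declare const ga: (...args: unknown[]) => void;

function getCourseVariant(): 'A' | 'B' {
  const stored = localStorage.getItem('courseVariant');
  if (stored === 'A' || stored === 'B') return stored;
  const variant = Math.random() < 0.5 ? 'A' : 'B';
  localStorage.setItem('courseVariant', variant);
  return variant;
}

const variant = getCourseVariant();
ga('send', 'event', 'Experiment', 'variant-assigned', variant);
```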

Not testing or experimenting means stagnating – as Kaushik puts it, “Experiment or die.” With so many e-learning providers flooding the market, innovation is essential.

Testing and experimentation allow us to try new ideas, fail fast, iterate and improve in a continuous upward spiral. And the testing process itself will teach us valuable lessons that we can use to improve the outcomes of future courses.

4. The Why: Voice of the Learner

Although analytics programs are becoming increasingly sophisticated, they will never tell you what the learner is actually thinking. So it is important to listen to what Kaushik calls the “Voice of Customer” or, in our case, the “Voice of the Learner.”

Surveys, usability tests and other qualitative data collection can help to relate the “what” to the “why.”
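
Even a single well-chosen survey question can be turned into a number you can track across course runs. The sketch below scores a 0-10 “would you recommend this course?” question Net-Promoter-style; the question wording and the thresholds are assumptions for illustration:

```typescript
// Hedged sketch: turning one "voice of the learner" question into a
// trackable number. A 0-10 recommendation question is scored
// Net-Promoter-style; thresholds are conventional assumptions.
function recommendScore(responses: number[]): number {
  const promoters = responses.filter((r) => r >= 9).length;
  const detractors = responses.filter((r) => r <= 6).length;
  return ((promoters - detractors) / responses.length) * 100;
}

// Example: ten survey responses.
console.log(recommendScore([10, 9, 8, 7, 6, 9, 10, 5, 8, 9])); // 30
```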

5. The What Else: Competitive Intelligence

How are similar e-learning courses performing, both within your institution and in the wider field? It is surprisingly difficult to find published data on the effectiveness of different e-learning techniques (you can browse my archives on instructional design for my best attempts). But data needs context in order to make sense.

Say the completion rate of your Massive Open Online Course (MOOC) is 13 percent. You think that’s terrible – and compared to a campus-based course it is! But compared to an average of 7 percent (as my article on the Quinnipiac Interactive Media blog discusses), it’s seriously impressive.

The context – or “competitive intelligence” as Kaushik calls it – could be internal. You could compare your data with data from e-learning courses in other departments, or with previous iterations of the same course. Or it could be external, if you can get hold of data from other e-learning providers.
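
The comparison itself is simple arithmetic once you have benchmark figures to hand. In the sketch below, the 7 percent external average comes from the MOOC example above, while the internal benchmark is a placeholder:

```typescript
// Simple sketch: putting a completion rate in context against internal
// and external benchmarks. The internal figure is a placeholder; the
// 7 percent external average is the MOOC figure cited above.
interface Benchmark { label: string; completionRate: number }

function compareToBenchmarks(ours: number, benchmarks: Benchmark[]): void {
  for (const b of benchmarks) {
    const diff = (ours - b.completionRate) * 100;
    const sign = diff >= 0 ? '+' : '';
    console.log(`${b.label}: ${sign}${diff.toFixed(1)} percentage points`);
  }
}

compareToBenchmarks(0.13, [
  { label: 'Same course, last year', completionRate: 0.09 },
  { label: 'MOOC average (external)', completionRate: 0.07 },
]);
// Same course, last year: +4.0 percentage points
// MOOC average (external): +6.0 percentage points
```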

It might be worth your while developing relationships with other e-learning providers in order to share useful data. This will help everyone to improve and drive the field of e-learning forward.
