What Google Analytics can tell us about e-learning

Web analytics for online learning – a revised definition

Web Analytics 2.0 is the analysis of qualitative and quantitative data from your learning management system and other e-learning providers, to drive a continual improvement of the online learning experience that your learners have, which translates into your desired educational outcomes (online and offline).

This is the most widely accepted definition of modern web analytics – by Avinash Kaushik, Analytics Evangelist for Google – adapted for e-learning. Easy. So:

Why don’t we talk more about analytics for e-learning?

As part of my day job in content marketing, I regularly help clients dive deep into their website and social media analytics data. The aims are generally to:

  • improve the customer experience
  • increase conversion rates
  • identify potential new avenues for marketing.

For online learning analytics, the aims would be to:

  • improve the learning experience
  • increase completion and retention rates
  • identify potential new learning methods and motivational techniques to attract new students and help existing ones complete the course.

Many learning management systems report little more than the number of participants who have started and completed a course, along with their test grades. That barely tells you the “what,” never mind the “how” or the “why.”

Here are some simple ideas for using Google Analytics, or any robust analytics program, to draw meaningful insights that can be used to improve the online learning experience. Remember that Google Analytics can be configured not just for websites but for apps – slightly different in technical terms, but the principles are the same.

Google Analytics for e-learning

Here are some straightforward insights from Google Analytics that can improve the online learning experience and help students achieve their learning objectives.

User issues

  1. Time and chunking. How long does it take learners to complete not just the entire course but each individual element? Does that match your estimates? If not, is there a usability problem with one of the sections – or is it simply more difficult than you anticipated? Are learners completing the course in one go (say, the night before the deadline) or visiting regularly and completing separate chunks each time? How does that compare to learning outcomes – do the regular visitors have better or worse outcomes than those who log on once for a marathon session?
  2. Where do learners have problems? I.e. where do they drop off? Bounce rates, exit pages and funnel tracking in Google Analytics allow businesses to see where customers drop out of the conversion process. They equally allow educational technologists to see where learners drop out of the learning process. If one particular section has a high drop-off rate, it’s time to revisit it. (A minimal query for this – and for the timings in #1 – is sketched after this list.)
  3. How can the resources be improved? What percentage of students download or click on the external resources offered in each section? If the percentage is low, why is this – and should you include the most important points in the actual lecture or assignment?
  4. Which motivational techniques are most effective? If you send out email reminders or post social media messages with hyperlinks to the course, you can tag these hyperlinks using the Google Analytics URL builder. The tags allow you to see exactly how effective your messages have been – i.e. how many people have clicked through. The first numbers may be sobering, but they will allow you to test the effectiveness of different wordings and incentives, as well as to identify the strongest channels for motivating learners. (Tagged links are easy to generate in bulk – see the sketch after this list.)
  5. Identify the best students. Do students who participate in class forums have better learning outcomes than those who simply watch the lectures and complete the mandatory assignments? Do students who complete the course in stages achieve better results than those who do it all in one go? We may think we know the answers to these questions already – but until we have seen the data we can’t draw any conclusions. (A rough cohort comparison is sketched below.)
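
To make #1 and #2 concrete, here is a minimal sketch using the official Python client for the GA4 Data API (the google-analytics-data package). It assumes a GA4 property is already collecting data from your course pages and that API credentials are configured; the property ID is a placeholder.

    # Per-page engagement and bounce rate – a starting point for spotting
    # sections where learners spend too long or drop off entirely.
    from google.analytics.data_v1beta import BetaAnalyticsDataClient
    from google.analytics.data_v1beta.types import (
        DateRange, Dimension, Metric, RunReportRequest,
    )

    client = BetaAnalyticsDataClient()
    request = RunReportRequest(
        property="properties/123456789",  # placeholder: your GA4 property ID
        dimensions=[Dimension(name="pagePath")],
        metrics=[
            Metric(name="averageSessionDuration"),  # time spent (seconds)
            Metric(name="bounceRate"),              # immediate exits (fraction)
        ],
        date_ranges=[DateRange(start_date="30daysAgo", end_date="today")],
    )
    response = client.run_report(request)

    for row in response.rows:
        path = row.dimension_values[0].value
        duration = float(row.metric_values[0].value)
        bounce = float(row.metric_values[1].value)
        print(f"{path}: avg session {duration:.0f}s, bounce rate {bounce:.0%}")

Compare the durations against your own per-section estimates; a page that combines long durations with a high bounce rate is a strong candidate for a usability review.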
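
For #4, the Google Analytics URL builder simply appends standard UTM campaign parameters to a link, so if you send out many reminders you can generate tagged links in bulk with nothing more than the Python standard library. The URL and campaign values below are made-up examples.

    from urllib.parse import urlencode

    course_url = "https://example.com/course/module-3"  # placeholder link

    # Standard Google Analytics campaign parameters; values are illustrative.
    params = {
        "utm_source": "newsletter",        # where the link appears
        "utm_medium": "email",             # the channel
        "utm_campaign": "week3-reminder",  # which campaign or reminder
        "utm_content": "deadline-nudge",   # which wording variant
    }

    print(f"{course_url}?{urlencode(params)}")
    # https://example.com/course/module-3?utm_source=newsletter&utm_medium=email&...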
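
Finally, for #5: once you can export per-learner session counts (from Google Analytics via a user ID, or from the LMS itself) alongside grades, the cohort comparison is a few lines of pandas. The file and column names here are hypothetical – substitute whatever your exports actually contain.

    import pandas as pd

    # Hypothetical exports: per-learner session counts and final grades.
    visits = pd.read_csv("visits.csv")  # columns: learner_id, sessions
    grades = pd.read_csv("grades.csv")  # columns: learner_id, final_grade

    df = visits.merge(grades, on="learner_id")

    # Split learners into "crammers" (one or two sessions) and "regulars".
    df["cohort"] = df["sessions"].apply(lambda s: "regular" if s >= 3 else "crammer")

    # Look at the data before drawing conclusions about which habit wins.
    print(df.groupby("cohort")["final_grade"].agg(["mean", "count"]))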

Technical issues

  1. Browsers. Ideally, your learning management system would be compatible with every browser. In practice, it never will be. You can see whether users of a particular browser are encountering more difficulties than others by filtering metrics such as “time on page” or “exit page” by the “browser” dimension in Google Analytics. This will help you identify potential issues which you can then test in real life (see the sketch after this list).
  2. Devices. Learning management systems are generally designed for desktop or laptop use. But what if many of your learners are attempting to log on to your course from a tablet or smartphone? If so – and if they’re bouncing straight off – there’s a demand for a mobile-friendly platform and users are getting frustrated. You can check this by filtering metrics such as “bounce rate” and “exit page” by the “devices” dimension. Even better, if you have set up funnels (see also #2 of “User issues”), you can check how far learners on mobile devices get through the process before dropping out.
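
Both checks come down to the same kind of query – group a metric by browser or by device category. Here is a sketch using the same GA4 Data API client and placeholder property ID as above (in GA4, the relevant dimensions are named “browser” and “deviceCategory”).

    from google.analytics.data_v1beta import BetaAnalyticsDataClient
    from google.analytics.data_v1beta.types import (
        DateRange, Dimension, Metric, RunReportRequest,
    )

    client = BetaAnalyticsDataClient()
    request = RunReportRequest(
        property="properties/123456789",  # placeholder: your GA4 property ID
        # Swap "browser" for "deviceCategory" to compare desktop/mobile/tablet.
        dimensions=[Dimension(name="browser")],
        metrics=[Metric(name="bounceRate"), Metric(name="averageSessionDuration")],
        date_ranges=[DateRange(start_date="30daysAgo", end_date="today")],
    )
    response = client.run_report(request)

    # A browser or device category with an unusually high bounce rate is a
    # candidate for real-life compatibility testing.
    for row in response.rows:
        print(row.dimension_values[0].value,
              f"bounce {float(row.metric_values[0].value):.0%}",
              f"avg session {float(row.metric_values[1].value):.0f}s")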

There is some movement in this direction. Stanford’s Lytics Lab is using data from Massive Open Online Courses (MOOCs) to study how students learn. Valuable research – but we’ll have to wait for the papers on that.

Meanwhile, take the time to plan and implement a Google Analytics strategy to improve your e-learning course.
