Saturday, July 21, 2012

Measuring performance online: How can you evaluate online learning?

  • Savings of €20,000 per year and €1,500 per participant;
  • A course attendee who experiments with virtual tools for team collaboration, contributing to shorter, more effective meetings in the organisation and to savings in travel costs;
  • Satisfied participants.
Three practical examples resulting from an evaluation of a 'blended learning' course at a training agency. One of the major challenges in online learning is to visualise the development of a participant or organisation. To what extent does online learning contribute to improvements in efficiency? And to what extent does it contribute to improving the competence of the individual or the team in the organisation? How can you monitor and evaluate online?
Practical models
In general practice, the following three models are commonly applied in monitoring and evaluating online performance:
1. The 8-Fields model
The 8-Fields model (Kessels, Smit, Keursten) examines the effects of training on an individual or group. With a goal or problem as the point of departure, the development of knowledge, skills and attitude is followed over a fixed period of time. The process and outcomes serve as benchmarks, which help to measure long-term effects and impact in the organisation and in service delivery to the target group. Read more about the 8-Fields model.
See presentation: How can you monitor and evaluate online?


2. Value creation in communities and networks
The Value Creation model (Wenger, Trayner, De Laat) goes through five phases in which the development of online learning is followed: phase 1, immediate value; phase 2, potential value; phase 3, applied value; phase 4, realized value; and finally phase 5, reframing value, at the level of the target group and the strategy of the organisation. Read more about the Value Creation model.

3. Valid Metrics Framework
The Valid Metrics Framework is based on established learning models (e.g. Kolb, Bloom's learning domains), in which the process and developments in the awareness, knowledge, skills and attitude of the student are followed. Support from the organisation, changes in behaviour and ultimately changes in actions and service delivery are followed at the organisational level, and customer value at the target-group level. The challenge in this model is to measure attribution: to what extent has changed behaviour in the organisation contributed to change in the target population and the consumer? Not all changes in consumers or the target group can be directly attributed to changes by the service provider. Read more about the Valid Metrics Framework.

Tools for monitoring and evaluation
Online learning can be followed both quantitatively and qualitatively.

Quantitative
Most mailing lists have a function that measures the number of hits and visitors. Yammer has leaderboards and response pages, where the participation of the most active members can be followed. The poll function of Yammer is an excellent tool for taking an interim poll of members' opinions on a topic. With Google Analytics, the number of visitors and the intensity of their visits can be quantified over a defined period of time.
Quantitative data can also be collected through online surveys. Both SurveyMonkey and Free Online Surveys offer features for collecting statistics.
Google Docs is a convenient tool for time tracking and for keeping and analyzing financial records.
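As a minimal illustration of the kind of record-keeping such a spreadsheet supports, here is a small Python sketch that estimates travel-cost savings per participant from costs tracked before and after a course. All figures are hypothetical examples, not data from the evaluation described above.

```python
# Hypothetical sketch: estimate travel-cost savings after a blended course.
# The figures below are invented for illustration only.

def savings_per_participant(costs_before, costs_after):
    """Return (total savings, savings per participant) from two lists of
    per-participant annual travel costs in euros."""
    total = sum(costs_before) - sum(costs_after)
    per_participant = total / len(costs_before)
    return total, per_participant

before = [2000, 1800, 2200, 2000]  # annual travel costs per participant
after = [500, 400, 600, 500]       # after switching to virtual meetings

total, per_head = savings_per_participant(before, after)
print(f"Total savings: EUR {total}, per participant: EUR {per_head:.0f}")
```

The same calculation is easy to set up as a spreadsheet formula; the point is simply that targets expressed in money (as in the examples at the top of this post) become measurable once costs are tracked consistently before and after the intervention.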

Qualitative
At the level of activities (e.g. a training or an event), Twitter, Mood Panda and Poll Everywhere are appropriate tools for gathering feedback from participants. Mood Panda assesses the feelings and experience of participants with a score of 1 to 10. Both Twitter and Poll Everywhere offer opportunities to ask open-ended questions and share the answers at the same time. Wordle is a creative tool to collect people's one-word impressions.
Storytelling is an excellent tool for measuring long-term impact and development. A good example is Zaitun's story, a compilation of pictures which visualises the outcomes of a training and coaching activity with a local farmer.

Getting Started: Choose a model and a mix of tools, and keep it simple!
Are you a supervisor of an online or blended learning course looking for a model to monitor the development of your participants or a group? Get started by selecting a practical model for monitoring and evaluation. Then choose the appropriate tools with which you can collect data at each of the monitoring levels.
Consider using the following questions to help develop your M&E system:
  • What targets need to be accomplished? Set challenging targets as a departure point for monitoring and evaluation. Dare to define targets in terms of cost savings and improved learning capacity.
  • What is the purpose of the monitoring and evaluation system? To what does it contribute?
  • What information is needed (the questions)?
  • From whom is the information obtained?
  • When is the data collected and how often?
  • How and with which tool is the information collected?
  • Who collects the information?
  • How is the information analyzed, recorded and shared?
  • How much time needs to be invested?
  • How much should it cost?
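The checklist above can be captured as a simple monitoring plan. Here is a sketch of one in Python, with one field per question; the field names and all values are my own illustrative choices, not a standard schema.

```python
# Hypothetical monitoring plan built from the M&E checklist above.
# Field names and values are illustrative, not a standard schema.

monitoring_plan = {
    "target": "Reduce travel costs by 20% within one year",
    "purpose": "Show the course's contribution to cost savings",
    "information_needed": ["travel costs", "meeting frequency", "satisfaction"],
    "source": "course participants and finance records",
    "frequency": "quarterly",
    "tools": ["online survey", "Google Analytics", "spreadsheet"],
    "collector": "course supervisor",
    "analysis_and_sharing": "summary report shared with management",
    "time_budget_hours": 8,
    "cost_budget_eur": 500,
}

# A quick completeness check: every question should have an answer.
assert all(monitoring_plan.values()), "unanswered M&E question"
```

Whether you keep such a plan in code, a spreadsheet or a one-page document matters less than answering every question before data collection starts.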
A tip: keep it simple!
