February 13, 2006
Tracking of student activity in VLEs and e-learning tools
Adam Marshall is doing some research on tracking in VLEs; I thought I'd post my response here as well as on the mailing list.
Tracking is something that quite frankly baffles me. The assessment features mentioned make sense - assessing the products of learning is, after all, what we use as the basis for grading and accreditation. Tracking, however, is about measuring the process rather than the product, and it has an ambiguous status in e-learning generally. Vendors make a big point of including tracking capabilities, but it's basically a log of activity pretty similar to the regular server log, with some graphs and so on.
Clearly there is feedback in the sense of "did any students actually read this thing I posted?", but this is much less indicative than students themselves writing something along the lines of "this was really useful! Thanks!". Relying on hits to gauge the effectiveness of resources and posts is pretty unsatisfying compared to qualitative data from actual responses - although correlations may be useful (see below).
There is tracking in the sense of reporting presence for funding purposes - that is, trying to assess drop-out and withdrawal by level of participation. But that's typically done at a rolled-up level rather than with the detailed items of server logs - basically an activity percentile. I can see a use for this, especially in relation to some of the FE funding processes, but it's not really what you're after, is it? It's not really about the learning and teaching per se.
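To make the "rolled-up" idea concrete, here's a minimal sketch of turning raw log events into an activity percentile per student. The data shape (a flat list of student ids, one per logged action) is an assumption standing in for a real VLE server log, not any particular product's format:

```python
from collections import Counter

def activity_percentiles(events):
    """Roll raw log events up to a per-student activity percentile.

    `events` is a list of student ids, one entry per logged action --
    an assumed, simplified stand-in for a real VLE server log.
    """
    counts = Counter(events)
    ranked = list(counts.values())
    n = len(ranked)
    percentiles = {}
    for student, c in counts.items():
        # percentile = share of students with strictly lower activity
        below = sum(1 for v in ranked if v < c)
        percentiles[student] = 100.0 * below / n
    return percentiles

log = ["ann", "ann", "ann", "bob", "bob", "cat"]
pct = activity_percentiles(log)
# "cat" logged the fewest actions, so sits at the 0th percentile
```

A funding or retention report would then just flag students below some percentile cut-off, without anyone ever looking at the individual log lines.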
Finally there is the use of tracking data for intelligence analysis - so using the VLE as a data source for analytical CRM to identify trends that can be fed back in terms of strategy and policy development. I'm not sure how much of this goes on.
The example of correlating resource use with success factors is interesting; however, I think this is only really going to be useful if provided as a direct feedback signal to the learner. There was some research at OUNL recently on providing "trails" - "people who successfully completed the course also did..." indicators weighted using collaborative filtering techniques. This appears to be reasonably successful from the cursory glance I gave the statistics.
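The trails idea can be sketched in a few lines. This is my own toy version, not the OUNL implementation: it assumes each successful completer's activity is available as a set of resource ids, finds completers whose trails overlap the current learner's, and weights unseen resources by how many of those completers used them:

```python
from collections import Counter

def trail_suggestions(completer_trails, learner_resources, top_n=3):
    """Suggest 'people who completed the course also did...' resources.

    `completer_trails` is a list of resource-id sets, one per student
    who successfully completed the course; `learner_resources` is the
    set the current learner has already used. Both shapes are assumed
    for illustration, not a real VLE API.
    """
    weights = Counter()
    for trail in completer_trails:
        # only count trails that overlap the learner's own activity -
        # a crude collaborative-filtering neighbourhood
        if trail & learner_resources:
            weights.update(trail - learner_resources)
    return [resource for resource, _ in weights.most_common(top_n)]

trails = [{"quiz1", "reading2", "forum"},
          {"quiz1", "reading2"},
          {"quiz1", "podcast"}]
print(trail_suggestions(trails, {"quiz1"}))
```

The point is that the tracking data never surfaces as a report for staff; it comes back to the learner as a navigational hint.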
Maybe the best way is to look at this backwards - if your VLE didn't have ANY tracking, what would you miss?