CETIS: the centre for educational technology interoperability standards

SiX plugfest report: encouraging, but could do better

Dutch educational standards working group SiX has just published the results of its plugfest. Over the day, various managed/virtual learning environments were required to import, export and display a standardised set of ADL SCORM 1.2, IMS Content Packaging 1.3, IMS QTI 1.2 and IMS Enterprise 1.1 data. The result: familiar problems and remarkable differences between products.

SiX's plugfest set-up was to send participants a set of relatively house-trained packages two weeks in advance, and then have the participating VLE/MLE vendors show and tell the results during the plugfest day itself. The advantage is that there is a known, level playing field and no arguing over whose tool is to blame if things don't quite go according to plan ("it's your editor!" "no, it's your &#@ VLE!"). The disadvantage of the approach is that it may be a little removed from the real world, where bad packages get shunted from one MLE to another.

The weak points identified in the report include a relative lack of support for the latest versions of the specs: the SiX packages were made against the latest versions, while many VLEs supported only older ones. Many tools also had uneven support for the specs: they could import a spec but not export it, or vice versa.
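The version mismatches the report describes are detectable before a package ever reaches a VLE: both SCORM 1.2 and IMS Content Packaging are zip files carrying an imsmanifest.xml at the root, and the manifest's metadata section declares a schema and schemaversion. A minimal sketch of such a pre-flight check (the function name is illustrative, and the sample manifest is a stripped-down stand-in for a real package):

```python
import io
import zipfile
import xml.etree.ElementTree as ET

def manifest_version(package_bytes):
    """Return the (schema, schemaversion) declared in a package's imsmanifest.xml."""
    with zipfile.ZipFile(io.BytesIO(package_bytes)) as zf:
        manifest = zf.read("imsmanifest.xml")
    root = ET.fromstring(manifest)
    schema = schemaversion = None
    # Search namespace-agnostically: SCORM 1.2 and the IMS CP versions
    # put the manifest elements in different XML namespaces.
    for elem in root.iter():
        tag = elem.tag.rsplit("}", 1)[-1]
        if tag == "schema":
            schema = (elem.text or "").strip()
        elif tag == "schemaversion":
            schemaversion = (elem.text or "").strip()
    return schema, schemaversion

# Build a minimal SCORM 1.2-style package in memory for demonstration.
SAMPLE_MANIFEST = """<?xml version="1.0"?>
<manifest xmlns="http://www.imsproject.org/xsd/imscp_rootv1p1p2" identifier="demo">
  <metadata>
    <schema>ADL SCORM</schema>
    <schemaversion>1.2</schemaversion>
  </metadata>
  <organizations/>
  <resources/>
</manifest>"""

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("imsmanifest.xml", SAMPLE_MANIFEST)

print(manifest_version(buf.getvalue()))  # ('ADL SCORM', '1.2')
```

A tool that ran this kind of check on import could at least tell the user "this is a SCORM 1.2 package, but I only speak IMS CP 1.1" instead of failing silently, which was one of the recurring frustrations at the plugfest.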

In terms of degree of support, SCORM 1.2 and IMS Content Packaging 1.3 were supported best, with QTI and Enterprise more variable. Not surprising, really, considering that IMS Content Packaging 1.3 is merely a later development of the spec that SCORM 1.2 uses, and that QTI is a very large, all-encompassing spec. Enterprise ought to be better supported, but perhaps VLE vendors feel (erroneously) that it is more of an MLE concern.

Bearing in mind that not all products are meant to be complete VLEs (e.g. Questionmark Perception did very well in its niche), the best pupils in the class were Threeships and Giunti's Learn eXact 1.5; the worst were the HarvestRoad Hive repository and Blackboard 6. The former supported none of the polled specs, even if it does nifty things with IMS Meta-Data and has plans to support proper package aggregation and disaggregation in the near future. The latter supports SCORM 1.2 out of the box, and that's it: anything else requires extra extensions (building blocks), third-party programs, custom solutions or, perhaps, a new version.

SiX published a summary of results (390 Kb, PDF, English). The vendors' presentations, and much else that was discussed during the SURF education days conference, are available on the SURF Onderwijsdagen 2002 page.


This work is licensed under a Creative Commons License.
