
Interoperability state of play at IMS Melbourne meeting

The open tech forum was one of a whole series of e-learning events that together formed IDEA Summer 2005. Yes, 'Summer' - no place for hemispherism here. Other events included a two-day workshop on the Content Object Repository Discovery and Registration/Resolution Architecture (CORDRA), an IMS workgroup meeting and several workshops on IMS Simple Sequencing and SCORM 2004.

After a brief intro and an admonition by host Neil MacLean of Australia's DEST to keep it informal, IMS CEO Ed Walker kicked off by giving his version of the goals of IMS: it is all about convenience and task focus, and keeping things affordable, effective and profitable.

To help get there, three trends are emerging, Ed continued. First of all, the service oriented approach (soa) - the effort to break down big, monolithic applications into smaller, more manageable functions. This is not limited to technology; it also covers pedagogic approaches. Second, the realisation that e-learning isn't going to go away. We've had it for a while now, and its use is no longer in doubt. Lastly, a shift in focus from technology development to users, be they learners or teachers.

Service Oriented E-learning Frameworks

Picking up the soa meme, Scott Wilson of CETIS outlined the ins and outs of the ELF. That was initially meant to stand for 'E-Learning Framework', but Scott indicated that he was "not sure what ELF means anymore, it's just ELF now". The reason is that a number of people realised that it made little sense to limit it to e-learning and leave out areas such as scholarly information, research and administration.

The aim of the effort is to bring about "new synergies from the combination of tools and functions", or what others have called the 'beware of the unintentional integration' effect. Once you have relatively small and well defined functions available on a network, it becomes easier to stitch them together in new and interesting ways. It also means that things can be cheaper, and more flexible.
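To make the 'stitching together' idea a bit more concrete, here is a minimal sketch in Java. The two service interfaces and the composite tool are purely hypothetical, invented for illustration rather than taken from the ELF documents; the point is simply that once small, well defined functions are available on a network, a new tool can be built by wiring them together.

    // A hypothetical stitching-together of two small network functions:
    // a search service and a chat service combined into a new tool that
    // neither was designed for.
    import java.util.List;

    interface SearchService {
        List<String> search(String query);          // e.g. queries a repository over the network
    }

    interface ChatService {
        void post(String channel, String message);  // e.g. posts to a discussion channel
    }

    class SearchAndShare {
        private final SearchService search;
        private final ChatService chat;

        SearchAndShare(SearchService search, ChatService chat) {
            this.search = search;
            this.chat = chat;
        }

        // Share the first search hit in a chat channel - a small 'unintentional integration'.
        void shareTopHit(String query, String channel) {
            List<String> hits = search.search(query);
            if (!hits.isEmpty()) {
                chat.post(channel, "Found: " + hits.get(0));
            }
        }
    }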

The approach itself, according to Scott, came out of the lessons learned from past attempts at integrating big, 'black box' systems. A process akin to "docking the LMS battleship with the MIS aircraft carrier".

Getting there with smaller, independent services is not a magic overnight solution, though. Scott's worst case scenario involved application developers having to devote as much time and effort to security and 'workflow' (the stitching together of services) as to the actual service consumption and functionality. The more optimistic view is that these will be slim layers of generic tech, with reality likely to be somewhere in the middle.

Such 'whole of fabric' issues were explored in more detail by MELCOE's James Dalziel. His main argument is that some interfaces matter more than others. Where an ELF service such as 'Chat' can happily live on its own without affecting any other service, things like 'Authentication', 'Authorisation', 'Access control' and 'Workflow' have an impact on all the other services, which makes them pretty hard to design.
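As a rough illustration of why those fabric services are so awkward, consider the sketch below. The names are invented (they are not taken from the ELF, MAMS or any specification); the point is that an authorisation check ends up wrapped around every other service call, whereas a service like chat can be consumed on its own.

    // Illustrative only: a cross-cutting authorisation check that every other
    // service call has to pass through, in contrast to a stand-alone service.
    interface AuthorisationService {
        boolean isAllowed(String user, String operation);
    }

    // Generic guard that puts the cross-cutting check in front of any service.
    class Guarded<T> {
        private final AuthorisationService authz;
        private final String operation;
        private final T service;

        Guarded(AuthorisationService authz, String operation, T service) {
            this.authz = authz;
            this.operation = operation;
            this.service = service;
        }

        T forUser(String user) {
            if (!authz.isAllowed(user, operation)) {
                throw new SecurityException(user + " may not " + operation);
            }
            return service;   // chat, search, whatever - all routed through the same check
        }
    }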

Fortunately, James' team (which includes Scott Wilson) is working on the fabric issues in the MAMS project. The preliminary demos were certainly impressive, even if they indicated that the fervent wish for MAMS' complete solution to appear magically tomorrow is a little optimistic. Worse, we can't just leave all the worrying to James and his team either.

Where the ELF is mostly oriented towards web services as a technology (both the 'heavy' WS-something stuff, and lighter things such as RSS and FOAF), another approach involves the use of adapters. MIT's Open Knowledge Initiative (OKI) has been working on this for some time now, and Scott Thorne of MIT gave his take on soa.

To his mind, the essence of system interoperability isn't about being able to go point to point, but to be able to swap both user tools, and the systems that service them, with maximal flexibility. That is, the user should be able to use whatever search tool she likes for whatever repository she needs to search, without having to worry about how the tool and the repository manage to communicate.

According to Scott, that can best be done by agreeing an OSID (Open Service Interface Definition) and using adapters that plug into it. The outside end of such an adapter is used to talk across the network, and the inside to talk to the application. Adapters would then be loaded into a particular tool much like a print driver on a PC.
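A rough sketch of the adapter idea in Java: the tool codes against one abstract, OSID-style interface, and a backend-specific adapter is loaded at runtime, much as a print driver would be. The interface and class names below are invented for illustration and are not taken from the actual OKI OSIDs.

    // The tool only ever sees the abstract definition; the adapter hides
    // how a particular repository is reached across the network.
    import java.util.List;

    interface RepositorySearch {                    // stands in for an OSID-style definition
        List<String> findByKeyword(String keyword);
    }

    class SearchTool {
        private final RepositorySearch repository;

        SearchTool(RepositorySearch repository) {   // the tool never sees the wire protocol
            this.repository = repository;
        }

        void show(String keyword) {
            repository.findByKeyword(keyword).forEach(System.out::println);
        }
    }

    class AdapterLoader {
        // Load whichever adapter implementation is configured, e.g. one that
        // speaks SOAP to repository A or plain HTTP to repository B.
        static RepositorySearch load(String adapterClassName) throws Exception {
            return (RepositorySearch) Class.forName(adapterClassName)
                    .getDeclaredConstructor().newInstance();
        }
    }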

The two approaches are intended to be complementary, but as Sandy Britain of Tairawhiti Polytechnic wondered in the discussion afterwards, the question remains how practical that is. MIT's Scott replied that OKI has put out fifteen specs in the area, and that ELF doesn't do specs itself. CETIS' Scott confirmed that, and added that OKI's OSIDs were often the only specs in some service areas. Furthermore, ELF focuses on how data flows across the network, which is of little concern to the OSIDs.

Addressing the issue independently, Colin Smythe of IMS outlined the new way in which IMS builds its specs: from a fairly abstract but common representation of data and behaviour in UML to more specific technology such as web services and XOSIDs. Other technologies are perfectly possible, but you may have to roll your own from the UML.

Yet another Scott (Penrose, of Australia's myinternet) wondered how the ELF thing fitted with existing approaches such as the Schools Interoperability Framework (SIF) that is widely used in US primary and secondary education. CETIS Scott characterised that as a particular take on soa that fits well in the ELF; it is just that SIF uses a zone integration server as a kind of traffic cop to regulate the data flows.
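That 'traffic cop' pattern can be sketched as a small message broker: agents never talk to each other directly, everything is routed through the zone integration server, which knows who has subscribed to what. This is only an illustration of the routing idea, not the actual SIF protocol or API.

    // Toy zone integration server: agents register interest in a topic and
    // all messages are fanned out by the central broker.
    import java.util.*;
    import java.util.function.Consumer;

    class ZoneIntegrationServer {
        private final Map<String, List<Consumer<String>>> subscribers = new HashMap<>();

        void subscribe(String topic, Consumer<String> agent) {
            subscribers.computeIfAbsent(topic, t -> new ArrayList<>()).add(agent);
        }

        void publish(String topic, String message) {
            subscribers.getOrDefault(topic, List.of())
                       .forEach(agent -> agent.accept(message));   // fan out to registered agents
        }
    }

    class Demo {
        public static void main(String[] args) {
            ZoneIntegrationServer zis = new ZoneIntegrationServer();
            zis.subscribe("StudentRecord", msg -> System.out.println("Timetable system got: " + msg));
            zis.subscribe("StudentRecord", msg -> System.out.println("Library system got: " + msg));
            zis.publish("StudentRecord", "new enrolment for student 42");
        }
    }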

Fred Beshears of UC Berkeley went one step further by suggesting that, given how critical some IMS specs have become, it may be time to adopt a rule that requires two independent implementations of any spec before it gets full final status. In James Dalziel's opinion, that depends on what you think IMS is: a proper, end-of-the-line standards body, or a research and development body that also does specs. In either case, he thought that IMS needs to do more open trial and error before declaring a spec 'final'.

Repositories

2004 was the year of the repository, according to DEST and IMS Australia's Kerry Blinco. There's plenty of local evidence for that in such initiatives as ARROW, a pioneering set of projects that looks at the implementation and federation of institutional repositories.

Kerry also came up with a definition of repositories that is refreshingly simple: "somewhere where you can put stuff that someone else can later find and download". Trouble is, that covers a wide variety of sins, as illustrated by her famous repository wheel of fortune diagram [1].
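Kerry's definition boils down to three operations, which a deliberately naive in-memory sketch makes plain. None of this corresponds to a real repository product or specification; everything else on the wheel of fortune (access rules, metadata formats, federation) is layered on top of this core.

    // Put stuff, find it later, download it: the definition as an interface.
    import java.util.*;
    import java.util.stream.Collectors;

    class SimpleRepository {
        private final Map<String, byte[]> store = new HashMap<>();

        String put(String title, byte[] content) {          // 'somewhere you can put stuff'
            String key = title + "|" + UUID.randomUUID();
            store.put(key, content);
            return key;
        }

        List<String> find(String keyword) {                 // '...that someone else can later find'
            return store.keySet().stream()
                        .filter(key -> key.contains(keyword))
                        .collect(Collectors.toList());
        }

        byte[] download(String key) {                       // '...and download'
            return store.get(key);
        }
    }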

HarvestRoad's John Townsend added his "5.8 Euro cents worth" by challenging Kerry's notion that 2004 was indeed the year of the repository. As a repository vendor, he sees every year as the year of the repository, chiefly because "managing your content is always in fashion".

Having said all of that, John's main argument is that e-learning is in the middle of a second wave. The first was about the spread of VLEs, which was necessary and useful, but VLEs are often "deliberately restricted in their ability to share and re-use". The second wave is about the spread of repositories, or, in John's admirably self-deprecating phrase, a "Content bridge over a river of uncertainty [of VLE vendor viability and choice]".

Griff Richards of Canada's Simon Fraser University has been at the e-learning repository game for a while now, and gave his own, pithy take on the phenomenon.

To start with the current direction of development: one trend is to have the most objects, but only those that are appropriate to a particular sector, subject, language and accessibility need. Second is to have the widest audience, but with restrictions such as different access rules for different users.

Griff also gave his take on success indicators. Before anything else, that means instant gratification for the user: immediate results from easy, robust tools. Add a sufficient quantity of quality materials, and we'll know we've made it.

Discussion afterwards immediately went to brass tacks: what is the motivation for VLE vendors to allow you to share content? Representatives of Blackboard, one of the few major VLE vendors present, picked up the challenge and emphasised that it should be up to the customer to decide what to do with their own content. They're willing to help out, hence the development of some Blackboard Building Blocks that add sharing capability. Blackboard's Jan Day also referred to a new opportunity for Blackboard customers to share content between multiple installations without an increase in the licence fee. There was no indication whether such a federation could include other kinds of VLEs or repositories, however.

Implementation technology infrastructure

While these initiatives are all very well, there is still an implementation 'last mile', as Steve Griffin of IMS pointed out. Standards got us halfway to interoperability, and frameworks will get us a bit further still, but implementations need support to complete the picture.

IMS is exploring the possibilities in this area with ideas such as supporting the IDEA-lab codebash, producing high level documentation, supporting particular communities of practice, and a revamped, lighter weight conformance programme.

Starting with the observation that we don't yet have full interoperability, Dan Rehak of Carnegie Mellon's LSAL continued to explore the issue from a more historical perspective. For example, the practice of authoring metadata has not actually changed all that much since the days of the card catalogue. People still use a welter of (local) formats that, though on screen, would look instantly familiar to a librarian of about a century ago.

Since Dan's organisation is the steward of the CORDRA initiative, his intention is to augment all that with a vision founded in the general lessons of successful technological infrastructure development. The development of things such as the railways and the 'phone system has a lot to tell us about issues that include scalability from local to global, separation of functions, targeting the right users and much more.

That we're not there with targeting the right users at all times was expounded in greater detail by Liddy Neville of DEST. A participant in the accessibility efforts in IMS, the W3C and other gremia, Liddy had some interesting statistics about the size of the accessibility problem. A study by Microsoft (not known for its extensive accessibility support) found that as much as sixty percent of the workforce require some form of assistance in accessing resources on-line. Mind, these are not just people officially classified as 'disabled', but also those of us who work in a noisy environment, or who just need to increase the font size a notch or two.

Crucially, supporting all these people is not a matter of beating up on your webjockey. Real solutions involve better tools, a better distribution of the burden (so that others can make alternative resources, for example), making greater demands on publishing systems and making content properly re-usable using standards.
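As a very rough sketch of what that could look like in code, the example below matches a user's declared needs against the alternatives a resource carries. All field and class names are invented for illustration; in practice this kind of matching is what the accessibility specifications Liddy described are meant to standardise.

    // A resource with alternative renderings, picked according to user needs.
    import java.util.*;

    class Resource {
        final String defaultFormat;
        final Map<String, String> alternatives;     // declared need -> alternative rendering

        Resource(String defaultFormat, Map<String, String> alternatives) {
            this.defaultFormat = defaultFormat;
            this.alternatives = alternatives;
        }

        String renderFor(Set<String> userNeeds) {
            return userNeeds.stream()
                            .filter(alternatives::containsKey)
                            .findFirst()
                            .map(alternatives::get)
                            .orElse(defaultFormat);     // fall back to the default rendering
        }
    }

    class Example {
        public static void main(String[] args) {
            Resource lecture = new Resource("video.mp4",
                    Map.of("captions", "video-captioned.mp4", "largePrint", "slides-24pt.pdf"));
            // A user in a noisy office asks for captions rather than audio.
            System.out.println(lecture.renderFor(Set.of("captions")));
        }
    }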

That way, appropriate resources for everyone can be seamlessly deployed in everyone's environment. Which, as Ed Walker pointed out, is why all these organisations started collaborating on interoperability standards in the first place.

Resources

The IDEA-lab website, with a good many presentations on the open tech forum page.

Notes

1. The repository wheel of fortune is a series of technical and social aspects of a repository that can be combined to form one category such as 'formal digital library' or 'personal repository'.

