IMS and OKI, the wire and the socket
There was a reason last week's alt-i-lab e-learning standards conference took place at MIT in Boston: OKI's main developers are there, and their baby is just about ready to stand on its own two feet. The OKI specifications are not very sexy in themselves, and they are surrounded by a fair amount of confusion, but it is well worth exploring what they can do for your institution's MLE.
Because OKI, unlike most other e-learning interoperability specs, has been developed pretty much behind closed doors, a couple of persistent whole and half truths have been doing the rounds about it. The first thing to know about OKI, therefore, is what it is not.
First, it is not an open source VLE or MLE. You can build an open source MLE with it, and MIT, Stanford and the universities of Michigan and Indiana are doing just that, but the OKI Open Service Interface Definitions (OSIDs) and reference code are just one part of building an MLE.
Second, and this is probably controversial, the heart of OKI is not really its architecture either. It has one, and its layered cake diagram of services is quite famous by now, but it is more of a necessary means than a goal. Most crucially, you can make good use of OKI without buying into the whole OKI way of slicing networked MLE applications.
The really crucial part of OKI is the OSIDs: definitions of particular slots in a computer program. In an application that allows you to search for learning objects in repositories, for example, the programmer can just say 'put the code that talks to the repository here' without actually having to program very much beyond calls to the list of commands that the OSID specifies.
When the search program is run, the idea is that it loads the appropriate adapter (the code that takes the OSID commands and does the actual talking to the repository) and gives standardised commands to the adapter, which then passes them on to the repository.
Geoff Merriman, chief strategist of OKI, compares the adapter to a print driver: an application like Word doesn't know how to print anything on any printer. It only gives a set of standardised commands to whatever print driver is selected. The driver will then take care of the actual communication between the PC and the printer.
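The slot-and-adapter idea above can be sketched in code. The real OSIDs are Java interface definitions; the sketch below is a minimal Python analogue, and every name in it (the interface, the adapter, the catalogue) is invented for illustration, not taken from OKI itself. The point is only that the application talks to an abstract slot, never to a concrete repository.

```python
from abc import ABC, abstractmethod

# Hypothetical stand-in for an OSID: the application knows only this
# list of standardised commands, nothing about any particular repository.
class RepositorySearchOSID(ABC):
    @abstractmethod
    def search(self, query: str) -> list:
        """Return titles of learning objects matching the query."""

# One adapter per back-end; like a print driver, it does the actual talking.
class InMemoryAdapter(RepositorySearchOSID):
    def __init__(self, catalogue):
        self.catalogue = catalogue

    def search(self, query):
        return [t for t in self.catalogue if query.lower() in t.lower()]

def run_search(adapter: RepositorySearchOSID, query: str):
    # The application issues only OSID calls; whichever adapter is
    # plugged into the slot handles the repository-specific work.
    return adapter.search(query)

adapter = InMemoryAdapter(
    ["Intro to Chemistry", "Chemistry Lab Safety", "Music Theory"])
print(run_search(adapter, "chemistry"))
```

Swapping repositories then means swapping the adapter object, with no change to the application code, which is exactly the Word-and-print-driver situation described above.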
This adapter-and-OSID, plug-and-socket relation also means that OKI is not a competitor to IMS or to any other existing e-learning interoperability standard.
What IMS has done in the past is define just data models: what bits of information something like an IMS Enterprise compliant student record can contain, and where those bits of information go. What IMS is increasingly doing in specs like DRI is specifying behaviour as well: how applications at different ends of the network can talk to each other. A search program can pass a message with a query in it to a learning object repository, for example, and expect one of a standardised clutch of more or less useful replies in return, and so on.
If that sounds similar to OKI, it's because it is. But it is not the same. An 'on-the-wire' protocol like IMS DRI requires that the applications at either end of the network talk in very similar ways. Not a big deal if both systems were bought at the same time, and built to the same version of the spec, much more of a deal if one was never built with the spec in mind, especially when there are more than two different systems involved in the same conversation.
What OKI allows you to do in such cases is build a bridge across the differences. Adapters can be written for almost any on-the-wire protocol in the same sphere. So if search systems need to talk to an IMS DRI compliant repository, but don't know how to talk IMS DRI (or any of the specs it encompasses), an adapter can be written that takes OSID commands from the search program and talks DRI to the repository. Provided the search program has an OSID adapter socket in it, of course.
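Such a bridging adapter can be sketched too. Everything here is simulated and the names are invented: real IMS DRI messages are XML exchanged over a network, whereas the stand-in endpoint below is just a local function. The shape of the idea survives, though: the adapter presents the same standardised search call to the application, but behind it translates each call into the repository's own wire protocol and translates the reply back.

```python
# Simulated stand-in for a DRI-speaking repository endpoint.
# (Real DRI traffic is XML over the network; this dict-based message
# format is invented purely for illustration.)
def fake_dri_endpoint(message: dict) -> dict:
    store = {
        "chemistry": ["Intro to Chemistry", "Chemistry Lab Safety"],
        "music": ["Music Theory"],
    }
    return {"status": "ok", "results": store.get(message["query"], [])}

class DRIBridgeAdapter:
    """Accepts OSID-style search calls from the application and does
    the wire-protocol talking to the repository on its behalf."""

    def __init__(self, endpoint):
        self.endpoint = endpoint

    def search(self, query):
        # Translate the standardised call into a wire-protocol message...
        reply = self.endpoint({"op": "search", "query": query.lower()})
        # ...and translate the wire-protocol reply back into plain results.
        return reply["results"] if reply["status"] == "ok" else []

bridge = DRIBridgeAdapter(fake_dri_endpoint)
print(bridge.search("Chemistry"))
```

The search program never learns that DRI is involved; it only sees the socket, which is what makes one adapter reusable across every client that has the same socket.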
It gets better. When combining OKI with on-the-wire protocols like IMS', it becomes much easier to integrate the whole lot. If, for example, a digital repository talks IMS DRI, it becomes economical for the vendor to write one or two OKI adapters to go with it (depending on the different computer languages the vendor wants to support). Any system that needs to talk to the repository can then make use of that one adapter; there is no need for every system to write its own.
Put differently, you can integrate an MLE without any specs, but that pretty much means ripping out any old systems you already have and buying everything from one vendor, in one go. And praying that you don't need to change anything for a number of years. Or else facing a lot of expensive customisation and some maintenance headaches.
You can also integrate an MLE with just IMS specs. But that still means that everything needs to talk the same version of that spec. If the spec changes, or if there are systems that can't or won't talk IMS, some potentially fragile, ad hoc customisation is needed.
Integrating an MLE with both IMS and OKI means maximum flexibility in how you plan and execute the MLE. If one system speaks just IMS, it can still participate. If another system only has an OKI slot, that's fine too; you just have to find an adapter that talks IMS. If you have a system that doesn't have an OKI slot, and doesn't talk IMS either, you can still gain by having just one custom adapter that talks whatever on-the-wire protocol most other systems have OKI adapters for. Which is most likely to be IMS.
Other advantages include, and this is where the OKI architecture comes back in, much greater flexibility in learning-related applications. If all the hard work is done in standardised, widely used adapters, then it becomes much easier and cheaper to make applications that are very innovative or finely tuned to teachers' and learners' needs.
All of this doesn't mean that OKI is some kind of magic, though. Initial development of applications with OSID slots is not going to be significantly cheaper than regular systems; the benefit is longer term. Also, there are limits to the amount of translation an adapter can do between an on-the-wire protocol on the one hand and the OSID on the other. If the data model or interface model on one end requires things that the other end cannot provide, a rewrite of one of the systems may still be necessary. Furthermore, rather a lot depends on the OSIDs: if they don't provide for what is required, you're stuck. The same goes, to a lesser extent, for the wider architecture.
The biggest risk of all, though, is that the success of OKI depends almost entirely on the support it gets. If a sufficient number of people start demanding OSID slots in their MLE components, and writing adapters, the benefits can quickly scale. If not, the benefit of implementing OSIDs will be limited to just a nice clean insulation of the application from the network.
More information about OKI is available on the OKI website.