January 21, 2005
In Praise of Simpler Standards?
I've been wondering a lot recently about the virtue of simplicity in specifications and standards. How important is it in the scale of values that determine a successful specification?
The other values I think are important in specifications work are:
- utility: the specification can fulfill the requirement it was created for
- precision: the specification is unambiguous in its definitions
- free: the specification does not incur licensing costs
- open: the specification has an open process where any person or organisation has a chance to influence its future
- clarity: the specification is clearly explained
In adoption, I think developers also favour:
- popularity: lots of other people use it, or a few influential people publicly endorse it
- toolset: there are some reasonably OK libraries to help you get coding, and so you don't need to read the fine print in the spec
- coolness: it uses some cool technology, or has some cool applications, that make it more appealing to experiment with
Simplicity supports several of these values. A simple specification has fewer items to define (classes, properties, methods, bindings etc), so precision is easier to accomplish. Simple specifications are also easier to experiment with, so provided they have utility (and coolness if at all possible) it's likely that at least a few developers will give them a go. That in turn helps with popularity, and usually generates a few open-source software libraries.
I'm tempted to think that simplicity is one of the key virtues for a specification to reach critical adoption. I'm thinking of HTML, RSS and vCard as being good examples of this.
So why are some of the specifications in the field of e-learning relatively complex?
Scope is an easy culprit - if the requirements are broad, then in order to achieve utility, simplicity has to be sacrificed. IEEE Learning Object Metadata (LOM) would be an example here, as would IMS Learner Information Package. Both of these specifications originated from IMS work; however, others such as IMS ePortfolio and IMS Content Packaging are far simpler, so clearly this isn't just a result of the process in that particular specification body.
Something I've seen a few times is a tendency to spread the requirements for a specification into 'bordering' areas, typically protocol stack layers such as security and management. This is sometimes a symptom of insufficient research into, and awareness of, other specifications, or of a lack of such specifications at the time the work is produced. A good example would be the "proto-SOAP" HTTP protocols in the Schools Interoperability Framework (SIF), or the message processing rules included inside the first IMS Enterprise specification. In neither case are the definitions in any meaningful way "education specific"; they would have been better expressed in protocols with cross-industry support. However, this is only visible with the benefit of hindsight - at the time the specifications were authored, things like SOAP did not exist.
Another tendency is something I characterise as excessive "what if?" analysis: trying to address, in a theoretical discussion, all the possible consequences of the use of the specification, and building features into the specification to accommodate them. A certain amount of "what if?" is critical to ensuring utility, but it can easily stray too far into speculation. For example, the QTI 1.0 specification provided for a vast range of assessment types, covering every conceivable combination of answer type and rendering mechanism, even ones that didn't make any real sense (e.g. a true/false answer rendered as a slider, or a fill-in-the-blanks rendered using a drop-down list). The QTI 2.0 group recognised this wasn't terribly helpful, and "unrefactored" the specification into a small number of concrete assessment types that are actually used. It's a much simpler specification as a result!
One other way complexity is introduced is "premature optimisation" - the term is usually attributed to Donald Knuth, who famously called it "the root of all evil", rather than Martin Fowler - which is trying to increase the performance of an implementation, or reduce file size, by introducing new elements, often before this has been demonstrated to even be a problem.
For example, the "Dependency" element in IMS Content Packaging exists solely to enable reuse of resources within a package, and thereby save a few lines of XML. To achieve this it introduced both ambiguity and complexity. Ambiguity, because some people implemented the specification incorrectly: "dependency" sounds as if it should do the job that the "file" element actually performs (referencing dependent files). Complexity, because accounting for "dependency" means implementing two completely different ways of referencing a "Resource" from an "Item", both of which are identical in terms of final behaviour. "Dependency" is logically redundant - it's a performance fix for a theoretical performance problem. Similar optimisation redundancies can be found elsewhere in Content Packaging ("path"), and in IMS Learning Design.
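To illustrate the two routes, here is a hypothetical Content Packaging manifest fragment - the identifiers and filenames are invented for this sketch, though the element names ("item", "resource", "file", "dependency") are those of the specification. Both resources end up delivering the same stylesheet; one lists the file directly, the other pulls it in via "dependency":

```xml
<manifest identifier="MANIFEST1">
  <organizations default="ORG1">
    <organization identifier="ORG1">
      <item identifier="ITEM1" identifierref="RES1"><title>Page 1</title></item>
      <item identifier="ITEM2" identifierref="RES2"><title>Page 2</title></item>
    </organization>
  </organizations>
  <resources>
    <!-- Route 1: the resource lists all its files directly -->
    <resource identifier="RES1" type="webcontent" href="page1.html">
      <file href="page1.html"/>
      <file href="shared/style.css"/>
    </resource>
    <!-- Route 2: the shared file is factored out into its own
         resource and referenced via a dependency element -->
    <resource identifier="RES2" type="webcontent" href="page2.html">
      <file href="page2.html"/>
      <dependency identifierref="RES3"/>
    </resource>
    <resource identifier="RES3" type="webcontent">
      <file href="shared/style.css"/>
    </resource>
  </resources>
</manifest>
```

An implementation has to resolve both routes to exactly the same set of files, which is the duplicated effort described above.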
I've been using IMS as an example primarily because of my own experiences in the IMS working groups; the same kinds of problem occur in other specifications - in RSS 2.0, for example.
I don't think there can be hard and fast rules about complexity - some things are naturally very complex (authorization being a good example) - but I wonder if we can use some simple heuristics to ensure the work we do doesn't result in overly complex specifications. Perhaps:
- a "notepad test": can you create a valid data instance in Notepad in less than 4k?
- a "reading test": can you read and understand the basics of the specification in under an hour?
- a "scripting test": can you create a simple client or service provider in a scripting language in a day?
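The "notepad test" could even be partly automated. Here's a minimal sketch: the 4k threshold comes from the heuristic above, while the sample document and the function name are my own invention, not part of any specification:

```python
# A rough, automated version of the "notepad test": is a hand-written
# sample instance both well-formed XML and under 4 KB?
import xml.etree.ElementTree as ET

# A hypothetical, hand-typed feed instance used as the test subject.
sample = """<?xml version="1.0"?>
<feed>
  <title>My weblog</title>
  <entry>
    <title>In Praise of Simpler Standards?</title>
    <link>http://example.org/2005/01/21/simpler-standards</link>
  </entry>
</feed>
"""

def notepad_test(instance: str, limit: int = 4096) -> bool:
    """Return True if the instance parses as XML and fits in `limit` bytes."""
    try:
        ET.fromstring(instance)
    except ET.ParseError:
        return False
    return len(instance.encode("utf-8")) <= limit

print(notepad_test(sample))  # a simple instance should pass
```

Of course, the real point of the test is what a human can type and understand, not what a parser will accept, so this is only a crude lower bound.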
Complexity can be hard to argue against in a specification meeting, as utility is such a high priority, but we need to guard against it wherever we can, if the specifications we create are to become widely adopted, and thereby justify our investment of time working on them.