
Why SCORM 2004 failed & what that means for Tin Can

“SCORM 2004 is dying (if not already dead!).” Now, that might seem like a strong statement, but it’s the sad truth. For the careful observer there are many signs supporting this view; here are a few of them:

Sign #1: 75% of packages are still on SCORM 1.2, 10 years after the initial release of SCORM 2004 [1] [2]

Sign #2: There is no certification process for tools and packages for the latest SCORM 2004 4th Edition, even though several years have passed since its release. Currently, someone can be a 4th Edition adopter but *not* certified. [3]

Sign #3: ADL itself heavily supports Tin Can as the successor of SCORM. [4]

In essence, SCORM 2004 always lived in the shadow of SCORM 1.2. Now, with the introduction of the Tin Can API, it seems certain that its adoption rate will decline even further.

Reasons SCORM 2004 Failed

There are a multitude of reasons why SCORM 2004 failed. Here are the most prominent (and yes, we refer to SCORM 2004 in the past tense quite deliberately):

Complexity

The major contribution of SCORM 2004 was the “simple sequencing model”. In fact, it was anything but simple. It was a lot of work for LMS vendors to implement and, more importantly, it was too complex for many courseware developers to use. Even the simplest sequencing required a room full of flow-chart diagrams and dozens of field settings, and even then you needed to be an expert to actually understand what it was doing.

The sad fact is that SCORM 2004 included some nice extensions over SCORM 1.2 that generally made sense, but they were hidden under the sequencing-model nightmare.

For example, a major problem with SCORM 1.2 was that when you took a SCORM quiz there was no way for the LMS to know what the actual questions were. You could access the kind of question, the correct response, the student response and the score, but not the question text itself. This is one of the areas where SCORM 2004 was profoundly better than SCORM 1.2: it included a full-text question description and descriptive identifiers for answers. This meant that you could do some effective reporting on questions and the distribution of answers. It was a dramatic improvement, but only a few took notice.
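To make the difference concrete, here is a minimal sketch of how a SCO might record the same quiz question under each run-time. The question text, interaction id and answer values are illustrative; only the data-model element names come from the two specifications.

```typescript
// The API objects (API for SCORM 1.2, API_1484_11 for SCORM 2004) are provided
// by the LMS window; the declarations here only exist so the sketch type-checks.
declare const API: { LMSSetValue(element: string, value: string): string };
declare const API_1484_11: { SetValue(element: string, value: string): string };

// SCORM 1.2: type, responses and result can be stored, but there is no element
// for the question text itself.
API.LMSSetValue("cmi.interactions.0.id", "q-photosynthesis");
API.LMSSetValue("cmi.interactions.0.type", "choice");
API.LMSSetValue("cmi.interactions.0.student_response", "b");
API.LMSSetValue("cmi.interactions.0.correct_responses.0.pattern", "c");
API.LMSSetValue("cmi.interactions.0.result", "wrong");

// SCORM 2004: the same data plus a human-readable description of the question,
// which is what makes question-level reporting possible.
API_1484_11.SetValue("cmi.interactions.0.id", "q-photosynthesis");
API_1484_11.SetValue("cmi.interactions.0.type", "choice");
API_1484_11.SetValue("cmi.interactions.0.description", "Which gas do plants absorb during photosynthesis?");
API_1484_11.SetValue("cmi.interactions.0.learner_response", "b");
API_1484_11.SetValue("cmi.interactions.0.correct_responses.0.pattern", "c");
API_1484_11.SetValue("cmi.interactions.0.result", "incorrect");
```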

Low adoption

This was a side-effect of the high complexity. Pedagogically, SCORM 2004 offered important new opportunities, but at a disproportionate cost. In other words, the added benefits of the standard were outweighed by its complexity. The end result was low adoption by vendors and instructional designers.

Even when vendors did offer support for SCORM 2004, that support was handicapped to a great extent. For example, many of the rapid eLearning tools available for creating courses do not let you easily build anything other than a basic SCORM 2004 package. Almost none of them have an interface for creating a dynamically sequenced multi-SCO package.

Technology shift

Ten years is a long time. Since SCORM 2004 was introduced, new technologies have come and gone, smartphones have become mainstream, gamification has taken hold, and cloud and lean solutions are hot topics. We are living in a much different, and more connected, world, yet SCORM is still an isolated, browser-based, LMS-centered standard. SCORM 2004 had to change in order to adapt to such a dramatically different environment. Rather than do that, ADL decided to save itself the trouble and start from scratch with what we now know as Tin Can, or the Experience API (xAPI).

On not being pragmatic

When SCORM was first introduced it answered a real-world problem: the standardization of learning-content packaging and delivery. It succeeded because, until that point in time, there was no adequate way to do that job. SCORM 2004, on the other hand, tried to address less obvious problems. It had a higher vision and tried to allow, or even enforce, sound pedagogical concepts, but it proved less pragmatic.

There is one important real-world issue that SCORM 2004 deliberately avoided dealing with, and that is making SCORM a concrete standard. SCORM is a reference model, not a true standard; you can’t plug it into a wall and have everyone work the same way. There is still too much variation in how compliant LMSs implement the UI associated with the SCORM engine. Will content be loaded in a new window? A frameset? How large a window? How will the table of contents be presented? What navigation request does closing the browser imply? Content authors should be able to rely on a consistent set of UI expectations.

Lessons to be learned and the Tin Can future

Tin Can is trying to succeed where SCORM 2004 failed. Nothing is perfect, though, as ADL admits [5]. Ongoing compromises are not a bad thing per se, but they can certainly be tricky. [6]

Simplicity matters

It seems that the need for simplicity is something Tin Can endorses. Simplicity drives adoption, and without adoption no standard can succeed. In essence, Tin Can is much simpler than SCORM 2004, and even simpler than SCORM 1.2. (Still, in the latest 0.95 and upcoming version 1 of the standard some complexity creeps back in, such as support for multiple languages; this is good in principle but comes with higher complexity for technology providers.)
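As a rough illustration of that simplicity, here is what a typical Tin Can statement looks like. The learner, course and French translation below are made up; the actor/verb/object shape and the verb’s language map are the parts of the specification referred to above.

```typescript
// A minimal Tin Can (xAPI) statement: who did what to which activity.
// The verb's "display" field is a language map, which is where the
// multi-language support mentioned above comes in.
const statement = {
  actor: {
    name: "George",
    mbox: "mailto:george@example.com",
  },
  verb: {
    id: "http://adlnet.gov/expapi/verbs/completed",
    display: { "en-US": "completed", "fr-FR": "a terminé" }, // language map
  },
  object: {
    id: "http://example.com/courses/algebra-101",
    definition: {
      name: { "en-US": "Algebra 101" },
    },
  },
};
```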

Technology shifts can render you irrelevant

Another important characteristic of Tin Can is that it is technologically ‘agnostic’. It can be used inside the LMS, outside the LMS, embedded in a mobile app or in a video game. That provides some insurance against technology shifts and opens new possibilities for capturing interesting learning interactions from informal activities.
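A minimal sketch of what that agnosticism means in practice: any app that can speak HTTP can report a learning experience by posting a statement to a Learning Record Store (LRS). The endpoint URL, credentials and version value below are placeholders, not real values.

```typescript
// Send a statement to an LRS over plain HTTP. Because there is no browser/LMS
// JavaScript API involved, the same call works from a mobile app, a game or a
// server-side process.
async function sendStatement(statement: object): Promise<void> {
  const response = await fetch("https://lrs.example.com/xapi/statements", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Experience-API-Version": "1.0.0",                      // version header
      Authorization: "Basic " + btoa("lrs-user:lrs-password"),  // placeholder credentials
    },
    body: JSON.stringify(statement),
  });
  if (!response.ok) {
    throw new Error(`LRS rejected the statement: ${response.status}`);
  }
}
```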

Ongoing project support is important

An interesting decision regarding Tin Can is that ADL hired a company, Rustici Software, to drive the Tin Can project. What that means for the future of the standard is still not clear; for now, however, the marketing and support effort is much improved.

Freedom and Standardization are opposite forces

Unfortunately, Tin Can does not help on the path towards standardization. It does, however, offer even more freedom to content creators, for example by letting them define their own verbs for use in statements. Interoperability of content between LMSs is somewhat improved thanks to the simpler messaging system and the absence of the JavaScript run-time API; standardization of presentation, however, will not benefit from Tin Can as it is shaped today.
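As a small, hypothetical illustration of that freedom: the two statements below describe the same event with different vendor-defined verbs, and any cross-LMS report has to reconcile them on its own. Both verb URIs are invented for the example.

```typescript
// Two tools reporting the same activity with their own verbs.
const fromVendorA = {
  actor: { mbox: "mailto:george@example.com" },
  verb: { id: "http://vendor-a.example.com/verbs/watched", display: { "en-US": "watched" } },
  object: { id: "http://example.com/videos/intro-to-fractions" },
};

const fromVendorB = {
  actor: { mbox: "mailto:george@example.com" },
  verb: { id: "http://vendor-b.example.com/verbs/viewed", display: { "en-US": "viewed" } },
  object: { id: "http://example.com/videos/intro-to-fractions" },
};
```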

Tin Can chose freedom over standardization. It remains to be seen if this was a good move.

Reporting is critical for eLearning

The need for reporting is one of the main driving forces behind eLearning: without reporting you cannot calculate the return on investment (ROI) of your learning activities. Reporting was not a strong point of SCORM, but it is at the core of Tin Can. In principle, Tin Can is built around descriptors of actions (‘evidence of training’) that can be translated into better reports. Still, the reporting itself depends on each vendor’s interpretation. Also, for reporting to be useful it may help to merge statements into higher-level descriptors. For example, Tin Can can report on what you experienced or completed, but those are low-level statements that cannot easily be rendered into something like “George is good at mathematics”.
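As a sketch of what such merging might look like, the snippet below rolls a set of simplified statements up into a higher-level claim. The subject tag, the score threshold and the summary wording are all assumptions on our part; nothing in the Tin Can specification defines them.

```typescript
// A deliberately simplified view of a statement, flattened for aggregation.
interface SimpleStatement {
  actor: string;     // e.g. "mailto:george@example.com"
  verb: string;      // e.g. "passed", "completed", "experienced"
  activity: string;  // activity id
  subject?: string;  // hypothetical subject tag, e.g. "mathematics"
  score?: number;    // scaled score between 0 and 1, when present
}

// Turn many low-level statements into one higher-level descriptor.
function summarizeCompetence(statements: SimpleStatement[], actor: string, subject: string): string {
  const relevant = statements.filter(
    (s) => s.actor === actor && s.subject === subject && s.score !== undefined,
  );
  if (relevant.length === 0) {
    return `No evidence about ${subject} for this learner yet.`;
  }
  const avg = relevant.reduce((sum, s) => sum + (s.score ?? 0), 0) / relevant.length;
  return avg >= 0.8 // arbitrary threshold, for illustration only
    ? `Learner appears strong in ${subject} (average score ${(avg * 100).toFixed(0)}%).`
    : `Learner needs more practice in ${subject} (average score ${(avg * 100).toFixed(0)}%).`;
}
```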

Summing up

To say that SCORM 2004 failed because it was too complex is an over-simplification; a number of forces led to this outcome. Tin Can tries to succeed where SCORM 2004 failed by addressing several, but not all, of the ongoing issues. It also comes with a fresh view of the technology landscape.

It seems that the compromises were calculated ones, made in order to simplify the standard, but we anticipate that in the near future Tin Can will introduce several new elements in favor of standardization. Hopefully its simplicity won’t take too much of a beating in the process. It is still early, but a good way to introduce standardization might be through a new standard that builds on top of Tin Can (and thus does not make it more complex) and addresses visual and reporting concerns. Let’s call it “Tin-Can X”.


Resources

  1. http://scorm.com/blog/2011/08/scorm-stats-then-and-now/
  2. http://scorm.com/scorm-stats/
  3. http://www.adlnet.gov/scorm/scorm-certification
  4. http://www.adlnet.gov/the-definite-indefinite-future-of-scorm
  5. http://scorm.com/project-tin-can-phase-3-known-weaknesses/
  6. http://dspace.dial.pipex.com/town/street/pl38/comp.htm
