Quinnsights: State of the Acronyms – by Clark N. Quinn


When I began this column, my editor suggested several topics. One thing they had in common was that they all involved evaluating things with acronyms. I wasn’t too keen, because each of them was somewhat on the wane. However, I decided I could cover them all in one swell foop (as the expression goes).
So, here’s an update on those subjects, and I will close with some ideas about trends.
Is ADDIE dead?
ADDIE was developed in 1975 by Florida State University for the US Army. Since then, it has become a mainstream design process used widely across instructional design. It has not, of course, remained static; it has evolved with new understandings and external pressures. Is it still relevant?
One issue needs to be made clear: ADDIE isn’t an instructional theory. ADDIE makes no claims about which practices best achieve learning outcomes. It’s not Elaboration Theory, Cognitive Load Theory, Four-Component Instructional Design, or any other approach to determining instructional requirements. Instead, it’s a process model: a path to take from a need to a solution. And you can slot whatever theory you like into the design stage!
When you look across design fields (interface, industrial, graphic, and more), you’ll mostly see three- and four-step models. Some assume that analysis is part of design, or that implementation is lumped in with development. ADDIE separates out every step, and with plausible reason. Design Thinking, the new umbrella term for design approaches, asks you to diverge and converge on the problem (analysis) before you do the same for design solutions. And the issues involved in implementation can plausibly be separated from development issues.
As interface design recognized back in the ’80s, an iterative approach tends to uncover flawed initial assumptions and adjusts to a growing understanding of users through testing. Not surprisingly, one of ADDIE’s adaptations has been to become iterative.
So what does this imply for answering the question? ADDIE, in principle, is fine. If, however, it has made it easy for your organization to do one pass instead of testing and refining, then it’s a burden, not a blessing. My inclination is to leave ADDIE behind, because of its baggage, and adopt a new approach simply to keep the focus on iteration. As the saying goes, “your mileage may vary.”

Are MOOCs dead?
A more recent term, one that has ascended (and descended) more quickly, is the MOOC (massive open online course). These began as higher-education asynchronous courses for self-study that large numbers of people could take. There might be synchronous lectures, but at the scale they were being consumed (10K students at a time), assessment was all auto-marking and self-evaluation. And they changed over time.
There were two fairly immediate phenomena. One was that people signed up, but the completion rate tended to hover around 10 percent. This wasn’t viewed favorably. The second was that the courses were free, but if you wanted certification of completion, there was typically a fee to pay.
Another development was the emergence of MOOC platforms. While in theory any LMS might be used, the scale tended to demand re-engineering, and dedicated platforms emerged. Some were collaborations; others arose from projects within particular labs. These platforms quickly coupled capabilities with particular business models. Businesses emerged around different offerings, such as selling lists of students to potential employers (e.g., software course completers to tech firms).
To be fair, variations of this model were designed with learning in mind. Others contended that operating at such a scale meant there were problems. And some of those problems soon became evident.
The shortage of evaluation, owing to the scale of these courses, was problematic. Students quickly formed their own groups to at least share their understandings. Many MOOC platforms added social capabilities, including peer evaluation. This emerged as a way to manage the scale, and instructors could supervise the peer review to keep it on track.
Responding to canned questions is a limited way to learn. Typically, a complex subject demands complex responses, which are still hard to evaluate automatically. Critics rightly noted that such courses might teach you something about, say, AI (a popular topic), but wouldn’t make you an AI engineer.
Ultimately, I think MOOCs have morphed into just more courses. The ‘free’ model did not last, and the lack of interaction with instructors mattered. Instead, we’re seeing a move to other formats and smarter pedagogical approaches to meet pressing needs. I think MOOCs are, thankfully, gone, replaced by more suitable forms of delivering learning experiences. They can work for simple subjects, but those are seldom of meaningful interest.
Can xAPI replace SCORM?
This raises the question of whether there’s still a role for SCORM. And that question requires examining the motivations and background of the two standards.
Because of the ‘angels dancing on the head of a pin’ arguments surrounding the attempts to create an interoperability standard, ADL picked a ‘good enough’ interpretation. Labeled SCORM, and with the weight of the US government behind it, it became the standard. With effort put into creating awareness and uptake, it gained a foothold.
And it worked; while there were initial hiccups, SCORM eventually became a reasonable bet that content would be migratable. There was a catch, though: the reporting was at the ‘course’ level. If you wanted finer granularity (for example, to learn what people were getting right or wrong, how individual elements were performing, etc.), you were out of luck. Or, rather, you had to create your own mechanisms.
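To illustrate the granularity issue, here is a rough sketch (not a definitive implementation) of the kind of runtime calls a SCORM 1.2 package makes against the API object an LMS exposes to the content. The specific values are placeholders; the point is that what gets reported back is essentially a course-level status and score.

```typescript
// Sketch of SCORM 1.2 runtime calls; the LMS injects an API object into the page.
// All values below are placeholders for illustration.
declare const API: {
  LMSInitialize(param: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(param: string): string;
  LMSFinish(param: string): string;
};

API.LMSInitialize("");
// Reporting is coarse: a completion status and a score for the course as a whole.
API.LMSSetValue("cmi.core.lesson_status", "completed");
API.LMSSetValue("cmi.core.score.raw", "85");
API.LMSCommit("");
API.LMSFinish("");
```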
Inspired by the rich data available through web activity tracking, there was a push for finer granularity. The eventual outcome was xAPI, a very simple standard for reporting information in an actor-verb-object format. These statements required a new mechanism to aggregate the data, and the LRS (Learning Record Store) was born. Individual statements aren’t necessarily useful on their own, but correlating information such as ‘who does what’ starts to give a richer picture of performance. xAPI isn’t the only such standard; IMS, for example, has Caliper, which is targeted more at higher ed, while xAPI is more workplace-focused.
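For contrast, here is a minimal sketch of an xAPI statement and how it might be posted to an LRS, in TypeScript using fetch. The LRS URL, credentials, learner, and activity ID are all hypothetical placeholders, and a given LRS may require different authentication.

```typescript
// A minimal xAPI statement: actor, verb, object. Identifiers are placeholders.
const statement = {
  actor: { name: "Pat Learner", mbox: "mailto:pat.learner@example.com" },
  verb: {
    id: "http://adlnet.gov/expapi/verbs/completed",
    display: { "en-US": "completed" },
  },
  object: {
    id: "https://example.com/courses/safety-101",
    definition: {
      name: { "en-US": "Safety 101" },
      type: "http://adlnet.gov/expapi/activities/course",
    },
  },
};

// Post the statement to a hypothetical LRS endpoint.
async function sendStatement(): Promise<void> {
  const response = await fetch("https://lrs.example.com/xapi/statements", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Experience-API-Version": "1.0.3",
      Authorization: "Basic " + btoa("lrs_key:lrs_secret"),
    },
    body: JSON.stringify(statement),
  });
  if (!response.ok) {
    throw new Error(`LRS rejected statement: ${response.status}`);
  }
}

sendStatement();
```

Because any tool or activity can emit statements like this, not just formal courses, an LRS can correlate ‘who did what’ across systems, which is where the richer picture of performance comes from.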
What’s more is that ADL has released cmi5. The cmi5 specification is, at its core, a set of xAPI statement rules that amount to a better SCORM. Ultimately, xAPI is a richer format for more types of information, and cmi5 is positioned to supersede SCORM. xAPI was the enabler, but yes, SCORM is dead.
From acronym to buzzword
As a small aside, it seems there’s a move away from acronyms. (Perhaps we’ve hit acronym fatigue!) We’re not lacking for buzzwords; that’s just what the labels are becoming. Though microlearning isn’t dead yet, right now the hot topic is workflow learning. And the aforementioned Design Thinking (I’m guilty of using it myself) is in vogue.
The point is that there will always be shiny objects with the associated hype. It’s worth it to track the trends, do the due diligence, understand the actual opportunities, and engage when it makes sense. Make sense to you?
