Fundamental flaws in government-sanctioned e-learning

#thinkdifferent

This post is somewhat off-topic for my regular audience, but some rants must be vented, so here goes...

 

CONTEXT

The government has endorsed a new eLearning quality model.

Readers are urged to take a look and then return to the post.

To be fair, or brutal, depending on your point of view, the flaw in this model is to be expected, because it is built on a flaw that runs through the entire vocational education system here in Australia: the competency-based system itself.

My company used to operate an RTO, mainly because our clients asked about the money. We tried to do the right thing: giving them the training they needed while maintaining compliance with what the system demanded.

In the end we quit because we could not do justice to the commercial reality and continue the box ticking. It was uneconomical for us (as a small business) and tedious for the client. Now our training products must succeed or fail on their own merits and generate sufficient value to be judged against the plethora of free stuff on the internet and cowboy RTOs subsidised by the government.

This note is not written because we are disappointed in any way. (We stopped taking trainees more than two years before we let our registration expire, so it all happened a long time ago.)

I mention this as context simply to make the point that we are intimately familiar with the system. We continue to operate in the Learning & Development space as a private entity but not as an RTO.

The whole formal educational system is under pressure from a rapidly evolving learning landscape – and I am not talking about MOOCs. Some of the issues raised here, and the solutions suggested, are therefore broader than just the VET system.

The reason I raise them in response to the eLearning Quality Model is that the model represents a missed opportunity: it merely builds on what is already a flawed foundation.

The analogy of driving a car can be used to understand the flaws in competency-based education. If this new ‘model’ were a car, it would work as follows.

There is a detailed manual of how the car functions, much like an owner’s manual. This one even includes some driving instructions. This is the new ‘quality model’ issued by FLAG.

If you were generous, you would say it even references a rule book. Except that this rule book is not generally available to all users, is not fully documented, and even where it is, it is subject to interpretation. The rule book is enforced by the feared auditors, who each have their own pet approach, and much of an RTO’s success in an audit comes down to whether your preparation aligned with the auditor’s pet hates.

But I digress.

The job of the owner’s manual (the quality model) is to provide the drivers of the car with the knowledge and the structure to successfully navigate the car from point A to point B.

And this is where the wheels fall off – pun intended.

Knowing where every safety switch is, knowing where the spare tyre is located, knowing which type of fuel to use and how to tune the radio may all be important parts of driving the car, but that clearly won’t make you a competent driver.

Even if the rules were clear (which they are not), you still would not be a competent driver. Is someone who is travelling at 80 km/h in an 80 km/h zone therefore competent?

Clearly not.

Even if the rules were clear and the assessment of those rules was effective, it still would not work.

These standards (like speed limits) are necessary, but compliance with standards does not indicate that you are a good driver.

Testing against these standards achieves nothing.

Even taking a driver’s test supervised by an experienced instructor achieves nothing.

Firstly, consider the accident rates that persist despite the driver’s test, which means the test is an inadequate indicator of driver competency or of the desired outcomes.

Secondly, I would argue that good drivers would have fewer accidents whether or not they took the driver’s test.

Thirdly, the number of failed drivers is negligible. Even those that fail once or twice obtain their license within a few weeks of failure. Their driving skills could not possibly have improved; all that happened is that they successfully avoided the technicality that prevented them from getting their license in the first instance. (All citizens of this country are familiar with stories of how the supervisor found a silly transgression and failed you on it.)

And let’s be honest here: the whole system is geared towards finding these technical failures, which in reality contribute nothing to raising the standard of education, just as those technical failures do not make our roads any safer in a material way. (All RTOs will have stories of failing audits on the same kinds of technicalities that, in the scheme of things, make absolutely no difference to the standard of education.)

The conscientious will argue that it is all we have, and that even if it makes only a small difference, it helps. And just as a car needs an owner’s manual, we need those technical standards.

They are wrong.

The problems with the system are strategic on one level; on another, they are domain-specific.

DOMAIN SPECIFIC ISSUES

There is a difference between the issues apprentices face and those trainees face.

Apprentices are engaged by employers because they are cheap labour. They know that they will learn the job on the job. What happens at TAFE… well, whatever… as long as they are on time and do what they are told.

But by the nature of the fields of work where you can serve as an apprentice, there is some merit in having a formal component of learning. These jobs are OLD jobs. Best practice is well known. These practices can be codified as standards relatively easily.

There is very little change in most of these types of jobs. Having the ‘school’ take care of some of the learning provides mechanisms that ensure that shortcuts and bad habits learned at one employer do not become standard practice.

But these jobs are dying, falling out of favour or being automated. Spending a lot of time and resources on fixing this part of the system will gain very little. The proposed quality model may be all that is required to allow an antiquated system to use some new tools. (For eLearning is a new tool and not much more.)

The traineeship system is different.

It is easy to codify in standards the job of an electrician. (Apprentice.)

It is extremely difficult to codify the standards of a manager or a marketer. (Trainee.)

The traineeship system (of white-collar apprentices) is a gigantic wealth-transfer system that produces very little in terms of tangible results. Employers go through the system to make money, or at least to cover a big part of their training expense, and to absolve themselves of the responsibility for training.

As the diligent and responsible people in government know full well, the (private) RTOs do it purely for the money, and the business model requires ticking the boxes and cashing the cheques. This causes problems they are well aware of, and these problems do not exist to the same extent in the TAFE system.

Everybody in the system knows it is flawed. And they feel they can’t do anything about it except create more boxes.

The eLearning quality model does not address this fundamental issue. FLAG has now simply added another owner’s manual to the car – this one for the electronics and all the fancy new systems – but the car is still (a) heading in the wrong direction and (b) driven by a driver who passed a test instead of one who is competent.

STRATEGIC ISSUES

Consider the quality model. It is a gigantic engine that will produce more tick-the-box requirements than is humanly possible to manage.

The ladder of this learning is leaning against the wrong wall. It is there. It works. People are climbing. But when they arrive at the top rung and hoist themselves onto the roof, they are mighty surprised that they are in the wrong place.

Their teachers ticked every box, and they have a piece of paper that is, to all intents and purposes, useless, as they soon find out when they start wandering the streets.

What VET needs is not a quality model – that merely helps manage the output on the current production line.

We need a new vision – a whole new factory – which may not even be a conventional factory. It is beyond the scope of this note to explore an entirely new vision; suffice it to say that the notions of social learning, performance enablement and behavioural analytics would provide the key planks of a new strategy. (Read that link, it is an eye-opener for most people.)

PAINTING A BIG PICTURE

Is the ladder of education leaning against the right wall? The nature of education and the delineation between education, training and learning should be completely re-thought.

Is a driving test the best way to determine readiness? Are we assessing the learner’s ability to be assessed or the real skill? The whole notion of competency should be revisited and considered in the light of changing needs and changing technologies.

Is a checklist-wielding instructor an effective and objective way to test the real skill required? Are we testing the teacher’s ability to comply or their real andragogical effectiveness? The role of teachers/trainers/assessors/auditors should be fundamentally reviewed.

Do we really need a bigger owner’s manual? The benefits of the current system should be questioned without fear or favour. (Nigh impossible in the equally antiquated unionised environment that is designed to maintain the status quo.)

CONCLUSION

There is nothing ‘wrong’ with the quality model, as it ticks all the boxes.

But it is a model of the wrong thing, because the system of education it is supposed to function in has already changed.

I would argue that this is a pretty fundamental flaw.