
A submission to ASQA review

The current rapid review of ASQA will make recommendations to the government on how to ‘position ASQA as an effective modern regulator’. Modernisation is a big word. It implies change at a fundamental level across an entire organisation. Successful modernisation of ASQA could also drive positive change across the wider VET sector. The review provides an opportunity to remedy some of the major issues that make VET such a murky work environment. This submission offers some suggestions for the consideration of the reviewer.

The reform is intended to position ASQA as an effective modern regulator and to deliver on future reform directions agreed through the COAG reform road map.

1. Modernise ASQA’s world view

The COAG Vision for VET (9 August 2019) acknowledges that ‘economic change and transformation will continue to shape the way we work, and the skills needed to succeed as workers, as employers and as a nation.’
 
ASQA’s current modus operandi is underpinned by the assumption that the apprenticeship/traineeship form of delivery is the norm. 29.6% of students who enrolled in a VET subject last year did so to get employed (NCVER 2019). Many of these students are on apprenticeships or traineeships.
 
However, 70.4% of VET students were not looking for a job. They wanted a new career or extra skills, were required to upskill for their work role, or enrolled for other personal reasons.
 
The traditional craft-based apprenticeship/traineeship approach inherited from the early 20th century lacks the flexibility to cope with the adults who are turning to VET in droves to stay skill-relevant in today’s work landscape, or with the raft of training options that cater for this majority of VET clients.
 
This is a macro-issue that needs to be addressed if we want synergy between the work of the regulator and the work of registered training organisations.
 

2. Modernise ASQA’s interpretation of ‘compliance’
Modernisation began in 2017 with the adoption of the student-centred audit approach. This was widely recognised as a successful innovation.

Perhaps it is time to examine the belief that compliance is binary and must be applied at a micro-level. It is a simplistic approach that causes a phenomenal amount of work for both RTOs and the regulator.

An example. Standard 1 has 27 clauses, most with numerous subsets of requirements. An RTO must be compliant with every item in the very long list linked to this standard. One non-compliance in the list and the whole standard is ‘non-compliant’. The chances of being ‘compliant with Standard 1’ are minimal.

Educators know that when a failure rate is consistently high, there is most likely a problem with the design of the assessment instrument. The same principle applies here.
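The arithmetic behind those ‘minimal’ chances can be sketched. Assuming, purely for illustration, that an RTO satisfies each individual requirement 98% of the time and each item is judged independently (neither figure comes from ASQA data), the probability of a fully ‘compliant’ standard collapses as the checklist grows:

```python
# Illustrative only: the 98% per-item pass rate and the item counts
# are assumptions for the sake of the argument, not ASQA figures.

def full_compliance_probability(items: int, per_item_pass: float = 0.98) -> float:
    """Probability of passing every item when compliance is binary
    and each item is judged independently."""
    return per_item_pass ** items

# Standard 1 has 27 clauses; with their subsets of requirements the
# checklist can easily run to 100+ individual items.
print(round(full_compliance_probability(27), 2))   # ≈ 0.58
print(round(full_compliance_probability(100), 2))  # ≈ 0.13
```

Even a very good RTO, under these assumptions, would be fully ‘compliant with Standard 1’ barely more than half the time at the clause level, and rarely at the item level.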

3. Modernise ASQA’s assessment audit methodology
ASQA has stayed with the granular approach to auditing an RTO’s assessment instruments that it inherited at its inception. The format of units of competency changed about five years ago, but the method of auditing assessment instruments has not.

It is not hard to find a small phrase in one of the many performance criteria in a unit that has not been covered sufficiently, as far as the auditor is concerned. BAM! Non-compliance.

An example. A theoretical unit of competency has four elements. Each element has five performance criteria. If an assessment does not sufficiently cover one of these twenty performance criteria, then the assessment instrument is non-compliant, even though it addressed 95% of performance criteria successfully.

The modern units of competency include ‘Performance Evidence’ and ‘Knowledge Evidence’. Clearly the designers of the units consider the items in these sections of utmost importance for proving competency. Surely this evidence should be the focus of the auditor’s attention. Performance criteria still feature, but if the fifth word in the fifteenth line is not covered, is this really a reason to issue a non-compliance, assuming ‘Performance Evidence’ and ‘Knowledge Evidence’ are covered sufficiently?

The reform is intended to improve and expand ASQA’s engagement with the VET sector and educative role to ensure training providers are aware of, and supported to understand, expectations and requirements.

1. Re-model the audit tool
This exchange happened in 2019, after an RTO showed an auditor the evidence it had gathered to cover a specific requirement and was informed that it was not sufficient.

RTO representative: What else would you be expecting to see here?
Auditor: It is not my place to tell you what is required. I simply decide if what you present meets the standards. I’m marking it as a non-compliance. Let’s move on.

This example reveals one of the major flaws in the current regulatory process. There is no shared understanding of terminology or expectations across the sector. Most RTOs want to do the right thing, if only they knew what that was. No wonder consultants of variable quality proliferate in the sector. The only hope is to find out what the individual auditor wants and put that into the ‘response to the audit report’. The subjectivity of it is scary.

A possible solution
• Design a criteria-based auditing instrument set out as a rubric.
• Make this audit instrument publicly available in the interests of transparency and fairness.
• Conduct an educative campaign to ensure understanding across the sector.

This will work if there is a modernised interpretation of ‘compliance’, as mentioned above. Educators in Queensland secondary schools have been developing criteria-based assessment rubrics for decades, so they are not a mystery.

Criteria-based rubrics would also remove much of the subjectivity associated with the audit process and contribute significantly to consistency across auditors.
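To make the idea concrete, here is a minimal sketch of what one row of such a rubric might look like if modelled in code. The criterion name, the three-level scale and the descriptors are all invented for illustration; they are not drawn from any actual ASQA instrument.

```python
# Hypothetical rubric structure -- the criterion, levels and
# descriptors are illustrative, not an actual ASQA audit instrument.
from dataclasses import dataclass

LEVELS = ("Does not meet", "Partially meets", "Fully meets")

@dataclass
class Criterion:
    name: str
    descriptors: dict  # maps each level to observable evidence

    def judge(self, level: str) -> str:
        """Return the published descriptor that justifies a judgement."""
        if level not in self.descriptors:
            raise ValueError(f"Unknown level: {level}")
        return f"{self.name}: {level} -- {self.descriptors[level]}"

assessment_coverage = Criterion(
    name="Coverage of Performance Evidence",
    descriptors={
        "Fully meets": "All Performance Evidence items are assessed.",
        "Partially meets": "Most items assessed; minor gaps identified.",
        "Does not meet": "Substantial Performance Evidence is not assessed.",
    },
)

print(assessment_coverage.judge("Partially meets"))
```

The point of the structure is that every judgement an auditor records must map to a published descriptor, so an RTO can see in advance exactly what ‘sufficient’ looks like at each level, rather than discovering it at audit time.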

2. Build and focus on VET’s body of knowledge
Language matters. Considerable confusion at the operational level of the sector results from the lack of shared understanding of terminology, specifically the language of training and assessment. This was not always the case.

Whilst the modernisation of units of competency clarified what was required to be competent in a unit, the re-design omitted some of the information that RTOs and auditors relied on to understand terminology or expectations.

The Users’ Guide to the 2015 Standards is a comprehensive document, but it uses terms that are no longer defined anywhere. When it was compiled, many of these terms were widely understood because they had been in the VET vocabulary for decades. This is no longer the case. The years slip by and new players join the sector.

One example
‘Assessment tools’ and ‘assessment methods’ are both mentioned in the legislation. There is no definition of them anywhere. Not a big deal? Maybe not, until it comes to audit time and the auditor says ‘There is no difference between a tool, a task and a method. Unless there is a definition in the legislation, then it is just semantics.’

These explanations were included in the unit of competency TAEASS402B Assess competence and were generally accepted. However, this unit was superseded in 2016, and the explanations have not been published or explained anywhere else since.

Assessment tools include:
• the learning or competency unit(s) to be assessed
• the target group, context and conditions for the assessment
• the tasks to be administered to the candidate
• an outline of the evidence to be gathered from the candidate
• the evidence criteria used to judge the quality of performance (i.e. the assessment decision-making rules)
• the administration, recording and reporting requirements
• the evidence of how validity and reliability have been tested and built into the design and use of the tool.

Assessment methods include techniques used to gather different types of evidence, such as:
• direct observation
• structured activities
• oral or written questioning
• portfolios of evidence
• review of products
• third-party feedback.

Both RTOs and the regulator need some solid ground to stand on in relation to this issue. There is plenty of room for an educative approach to assessment. Here is one example of how another regulator educates its sector on the meaning of core terminology.

Aged Care Commission: Open disclosure

The reform is intended to improve ASQA’s collection and use of data to assist with identifying poor quality training providers, and better enable training providers to give feedback on ASQA (including directly to the Department of Employment, Skills, Small and Family Business).

Currently, a complaint against ASQA must be lodged using the form on the ASQA website and submitted to ASQA itself. Seriously? What part of this process honours the principles of impartiality or privacy, engenders confidence in a fair hearing, or protects against possible negative attention? This is unacceptable.

Interested in adding your two-bob’s worth?
If you would like to support any of these suggestions, or have something else that you want to say, please let the reviewer know. It does not have to be a long response. RTOs are the main audience for ASQA activities and the review would be enriched by experiences from the frontline.

Simply email ASQAreform@employment.gov.au

Feedback should be sent by COB Friday 24 January 2020. Feedback will be provided directly to the regulatory expert engaged to undertake the review.

Government announcement of review: click here