The Process of Assessment Validation for Training Organisations in Australia: A Full Guide

Intro to Assessment Validation

RTOs take on a range of obligations once registered, such as annual declarations, AVETMISS reporting, and marketing compliance. Among these tasks, assessment validation is notably challenging. While validation has been covered in several publications, it is worth revisiting the fundamentals. The Australian Skills Quality Authority (ASQA) describes assessment validation as a quality review of the assessment process.

Principally, validation of assessments is aimed at identifying which parts of an RTO’s assessment procedures are effective and which need improvement. With a proper grasp of its key aspects, validation becomes less daunting. According to Clause 1.8 of the 2015 Standards for RTOs, RTOs must ensure their assessment systems, including RPL, comply with the training package requirements and are conducted according to the Principles of Assessment and Rules of Evidence.

The clause effectively requires two types of validation. The first checks that assessments conform to the requirements of the training packages on your organisation's scope. The second verifies that assessments are conducted in accordance with the Principles of Assessment and Rules of Evidence. In other words, validation happens both before and after assessment. This article focuses on the first type: assessment tool validation.

Two Types of Assessment Validation

- Assessment Tool Validation: Also referred to as pre-assessment validation or verification, this addresses the first part of the clause, checking that the tools meet all unit requirements.
- Post-Assessment Validation: This addresses how assessments are conducted, ensuring RTOs apply the Principles of Assessment and Rules of Evidence in practice.

Process of Conducting Assessment Tool Validation

Timing for Assessment Tool Validation

The aim of validating assessment tools is to make sure that all elements, performance criteria, and performance and knowledge evidence are covered by your assessment tools. Therefore, whenever you obtain new learning resources, you must validate the assessment tools before students use them. There is no need to wait for your next scheduled validation; validate new tools right away to confirm they are fit for purpose.

Nevertheless, this isn't the only time to perform this type of validation. Perform assessment tool validation also when you:

- Improve your resources
- Add new training products to your scope of registration
- Review your course when a training product is updated
- Identify potential risks in your learning resources during your risk assessment

The Australian Skills Quality Authority takes a risk-based approach to regulating RTOs and expects regular risk assessments. A student complaint about your learning resources is therefore another good trigger for assessment tool validation.

What Training Products Require Validation

Bear in mind that this validation is about confirming your assessment resources are compliant before they are used. Every RTO must validate the tools for each unit of every training product it delivers.

Necessary Resources for Assessment Tool Validation

To validate your assessment tools, you will need the complete set of your learning and assessment resources:

- Mapping Tool: The first document to review. It shows which assessment items address which unit requirements, which speeds up validation considerably (a simple coverage check is sketched just after this list).
- Learner Workbook: Confirm it is suitable as an assessment resource. Check whether the instructions are clear and the answer fields give students enough room to respond; insufficient space for answers is a common issue.
- Assessor Guide: Also verify if instructions for assessors are sufficient and if clear criteria for each assessment task are provided. Clear standards are crucial for reliable evaluation results.
- Supplementary Resources: These may include lists, registers, and evaluation templates designed separately from the learner workbook and assessor guide. Validate these to ensure they match the assessment activity and address unit requirements.

Assessment Validation Panel

Clause 1.11 sets out the requirements for the people who conduct validation. It states that assessment validation can be performed by one or more persons. In practice, RTOs usually ask all of their trainers and assessors to participate, sometimes alongside industry experts.

Collectively, your assessment validation panel must have:

- Vocational competencies and current industry skills relevant to the assessment being validated.
- Current knowledge and skills in vocational teaching and learning.
- One of the following training and assessment credentials:
- TAE40116 Certificate IV in Training and Assessment (or its successor), or the assessor skill set referred to in Schedule 1 of the Standards.

Principles of Assessment

- Fairness: Does the assessment process give every candidate equal opportunity and access?
- Flexibility: Can the assessment be adapted to the different needs and circumstances of candidates?
- Validity: Does the assessment actually assess the skills and knowledge it claims to assess?
- Reliability: Would different assessors reach the same decision about a candidate's competence?

Rules of Evidence

- Validity: Does the evidence demonstrate that the candidate has the skills, knowledge, and attributes described in the unit of competency and associated assessment requirements?
- Sufficiency: Is there enough evidence to ensure that the learner has the skills and knowledge required?
- Authenticity: Does the evidence confirm the originality of the candidate's work?
- Currency: Is the evidence up-to-date with current industry practices?

Key Considerations for Assessment Validation

Pay attention to the verbs in the unit requirements and make sure the assessment task actually addresses them. For example, in the unit CHCECE032 Nurture babies and toddlers, the performance evidence asks students to:

- Change nappies
- Prepare and feed bottles, clean feeding equipment
- Feed babies with solid food
- Respond appropriately to baby signs and cues
- Get babies ready for sleep and settle them
- Observe and promote suitable physical activities and motor skills for babies

Common Pitfalls

Asking students to describe the nappy-changing process for babies under 12 months does not meet this requirement. Unless the requirement is specifically about underpinning knowledge (i.e., knowledge evidence), students must actually perform the tasks.

Mind the Plurals!

Pay attention to the frequency. In our example, one of the unit requirements of CHCECE032 calls for the students to complete the tasks at least once on two different babies under 12 months of age. Having students complete the tasks listed twice on just one baby is not sufficient.

Competent or Not Yet Competent

Pay attention to itemised requirements. As noted above, if students only complete half of the listed tasks, the requirement is not met. Each assessment task must address every part of the specification; otherwise the student cannot be deemed competent, and the assessment tool itself is non-compliant.

Provide Specific Details

Each assessment task needs clear, specific benchmark answers to guide the assessor's judgement of the student's competence, and the instructions must leave no room for students or assessors to misinterpret what is required.

Avoid Double-Barrelled Questions

Avoiding double-barrelled questions makes it easier for students to respond and for assessors to judge competence accurately. For example, "Explain how you would change a nappy and why hygiene procedures matter" is really two questions; splitting it lets each response be judged on its own.

Ensuring Audit Compliance

Considering these requirements, you might wonder, "Don't learning resource developers offer audit guarantees?" They often do, but a guarantee only helps after an audit has already identified non-compliance, and by then the finding sits on your compliance record. It is better to take a preventative approach and validate before the tools are used.

By following these steps and understanding the Principles of Assessment and Rules of Evidence, you can ensure that your assessment tools comply with the Standards for RTOs 2015 and ASQA's expectations.
