Assessment is often identified as the critical component of a competency-based system. In competency-based training, assessment can be defined as follows:
Assessment is the process of collecting evidence and making judgments on the nature and extent of progress towards performance requirements set out in a standard or a learning outcome, and at the appropriate point, making the judgment on whether competency has been achieved (VEETAC 1993, cited in Smith and Keating 2003).
Assessment is the process of forming a judgement about whether an individual meets a specified standard (Rumsey 1991).
Competency comprises the specification of the knowledge and skill and the application of that knowledge and skill to the standard of performance required in employment (Rumsey 1991).
Elements describe the essential outcomes of a unit of competency, and
Performance Criteria describe the performance needed to demonstrate achievement of the element.
DETERMINE THE FOCUS OF THE ASSESSMENT TOOL
Assessment has several purposes (Smith and Keating 2003):
To let the student know how he or she is going
To help teachers or trainers assess the student’s learning needs
To determine whether learning outcomes have been achieved, for the purpose of formal recognition
To determine whether a person has achieved the standards of competency, and also for gaining entry, credit transfer or advanced standing for recognised courses (including university).
There are four major PRINCIPLES OF ASSESSMENT:
The following principles have been developed from three sources (Smith and Keating 2003; Killen 2005, pp. 118-139; Australian Government 2011). For a guide on how to check for these principles, refer to p. 57 of the User Guide.
1 Validity – assessment assesses what it claims to assess
2 Reliability – it can be interpreted and applied consistently in different cases
3 Flexibility – it is appropriate to the whole range of learners, sites and delivery methods
4 Fairness – it does not disadvantage any learners
THE RULES OF EVIDENCE include:
Validity: Content validity means the evidence covers the knowledge and skills that are essential to competent performance as set out in the Unit of Competency.
Sufficiency: There is sufficient quality and quantity of evidence, as set out in the Critical Aspects of Evidence for the Unit of Competency.
Authenticity: The assessor is assured that the evidence presented for assessment is the candidate’s own work
Currency: The evidence demonstrates the candidate's current competency.
(Sourced from: The Australian Government User Guide for the VET Diploma course TAE50111, 2011).
NB: Assessment should tell teachers and students something they do not already know (Killen 2005, p.137). It should stretch students to the limits of their understanding and ability to apply knowledge. Roy Killen (2005) provides a full list of assessment principles on p.137.
In designing the assessment tools for learning, one must consider the target group, the purposes of the assessment and the context of the learning and assessment.
Assessment instruments must be revised to reflect the learning environment and context in which they are used, and the audience to whom they are directed. Validation and moderation activities that feed into continuous improvement processes can also be applied to the revision and updating of these learning and assessment resources.
There are a number of considerations to be taken into account when designing and delivering assessment tools:
Are there industry benchmarks which must be met?
What workplace standards must be met and what are the criteria for meeting these standards or benchmarks?
Are assessments to be self-directed individual learning, project based, structured knowledge or information gathering, group instruction and capacity building, on-the-job assessment, workplace-experience-based assignments, or off the worksite (external)?
Where will the learning from the assessment be used?
What outcomes are the assessment tools expected to deliver?
Will the outcomes be satisfying to the learner and the organisation?
The above questions will direct the development of assessment tasks and the choice of assessment tools. Examples of tools to fit with the learning strategy and the principles of assessment may include:
Discussion and reflection e.g. meetings, review panels, appraisals
Small-group/team work or co-operative learning
Problem solving through programmed self-paced learning
Learner research: Applied Research, Action Research
Report writing
Role-plays where trainees act out roles with other trainees
Case studies
Short answer questions
Work task process and completion.
DESIGN THE ASSESSMENT TOOL
When designed properly, UNDERLYING KNOWLEDGE and UNDERSTANDING can be assessed as well as SKILLS (Smith and Keating 2003).
All assessments should be designed with these three components of learning in mind. The tools should stretch the imagination, encourage creativity in learning and encourage lateral thinking.
De Bono’s learning concept based on six thinking hats of different colours was first put forward in 1985. It is designed to help people think clearly by directing their thinking in one direction at a time, limiting their reliance on emotion and gut reactions.
Each hat requires a different perspective on a topic or issue, e.g. green is for creative thinking. ‘Full-colour’ thinking is achieved by putting on or taking off a hat, depending on what type of thinking is required, focusing on one aspect of thinking at a time. This type of learning is collaborative and participative.
(Look up De Bono’s six thinking hats in a Google search. What type of approach do the red, yellow, black, white and blue hats require?)
Learning will be more interesting, and is certainly more useful when assessment tools requiring the different types of thinking are used.
The tools available for training and assessment are numerous and depend on the trainees, the type and function of the organisation, and the type of training and training strategy. One model for the training and the tools to use was developed by Alan Chapman. It cannot be reproduced here due to copyright, but you can access it on businessballs.com (https://www.businessballs.com/trainingprocessdiagram.pdf).
Alan Chapman’s model is really worth looking at as it shows a process of identifying all of the components involved in selecting appropriate tools in the training and development process including:
self-analysis
understanding a trainee's needs
breaking down the identified needs
developing tools for the skills to be learned
developing tools for evaluation, appraisal and potential tools for future training and development.
(If you can’t access the diagram, contact your teacher).
Considerations in designing the assessment tools include:
Selecting methods that support the evidence collected on gaps in knowledge, skills and competencies
Taking into account the context in which the assessment will take place
Ensuring the design fits with the principles of good assessment (e.g. meets performance criteria and evidence guide)
Enabling candidates to show or support their claim for recognition of current competency by generating options for the collection of evidence (i.e. it may be necessary to vary the design and tools for different candidates; consider different assessment instruments for the same assessment).
Consider different ways of administering the same assessment instruments. Some examples are:
face-to-face (workplace assessment)
social media interactive sites
forums on Moodle or other online delivery systems
written responses
presentations to groups
Use a range of tools to develop different learning approaches and lateral thinking – there is more than one way to solve a problem, or do a job! Examples include the above points and earlier suggestions (De Bono’s six thinking hats), plus:
reflection, questioning, Recognition of Prior Learning (RPL) knowledge, role plays, case studies, simulations, e-learning and social networking tools, observation logs/journals, interviews, counselling, mentoring or coaching, group work and projects, preparing reports, newsletters, company documents and manuals (OHS, Standards, policy), video and so on.
I hope this information is helpful; please feel free to add a comment.
Carolyn Fletcher
Director ITTA