As chairman of SAICA’s Accreditation and Monitoring subcommittee, Graeme O’Reilly (one of our intrepid LTS directors) gets to see where firms generally go wrong in their assessment processes. He recently shared with us his list of the top 3 challenges faced by training offices when it comes to successfully running the SAICA assessment process…
Challenge number 1: Insufficient number of TSRs / PSRs being generated:
SAICA impose a minimum completion requirement of at least one TSR and one PSR entry in every two-month period, yet a number of trainees within training offices still fail to meet even this minimum.
Trainees need to become more accountable for meeting this requirement and should actually be seeking to submit much more evidence than just this minimum requirement. Trainees who fail to meet this minimum requirement are in breach of the training regulations (through not complying with their duties set out in Annexure 7 – duty 2.7) and trainees who consistently fail to meet these requirements can actually have their training contracts cancelled at the discretion of the training office (regulation 22.2).
I am constantly amazed by the number of trainees who do not take responsibility for meeting this requirement, and it is the training office that suffers during the SAICA review visit.
LTS provides our clients with reports that enable them to quickly and easily identify trainees who have not met these requirements so that they can be followed up.
Challenge number 2: Inappropriate evidence submitted in support of PSR ratings:
Many trainees fail to provide adequate evidence in support of their PSR ratings.
SAICA require that this evidence is:
- Positive – i.e. represents something the trainee actually did (and not what they didn’t do).
- Specific – i.e. detail about what the trainee actually did to demonstrate their ability (and not broad sweeping general statements of competence!).
- Verifiable – i.e. is capable of being validated by the reviewer.
It is common for trainees to record evidence such as “I displayed honesty throughout the engagement” or “I am a good team player”. Neither of these examples is acceptable as evidence in support of a rating of competence. What specifically did the trainee actually do that indicated they were honest or were a good team player?
Why do reviewers rate this kind of unacceptable evidence? Surely if reviewers are doing their jobs properly they would reject evidence that is not positive, specific, and verifiable? Please refer to challenge number 3 below for more about this.
LTS support is able to “roll back” a PSR at a reviewer’s request, enabling trainees to re-draft their sub-standard evidence to the reviewer’s satisfaction.
Challenge number 3: Inconsistent understanding of assessment principles within the office:
There still appear to be large gaps in the understanding of assessment principles among trainees, reviewers, and evaluators.
In general I believe:
- Trainees are taught about the assessment instruments and rating scale too early. This material is typically covered during the firm’s induction programme, before the trainee has done any actual work. Trainees thus seldom have a context within which to appreciate assessments, and by the time they complete their first TSRs and PSRs (often many weeks after induction) they have forgotten everything they were taught about the documents and process, which leads to wasted time.
Why not deliver separate training on the completion of these documents just before you first expect trainees to complete them?
- Reviewers and evaluators don’t always understand their roles and responsibilities. Trainees get induction training and assessors have to go through assessor training, but what about reviewers and evaluators?
I believe it is critical that reviewers and evaluators receive specific instruction regarding their roles and responsibilities. Evaluators rely on reviewers getting it right, and assessors rely on evaluators getting it right, so why are these role-players not receiving adequate instruction about what is expected of them?
If reviewers were taught about the nature of acceptable PSR evidence, for example, surely they would then prevent unacceptable evidence from accumulating?
If training offices can get these three aspects right, I have no doubt that their assessment processes will flow a lot more smoothly and efficiently, and when SAICA come visiting, the review findings should be a lot more pleasant than they might otherwise be…