
The SACCO had issued a request for proposal for staff training on loan monitoring and recovery. On the surface, the need seemed straightforward. Portfolio quality was under pressure, and management wanted staff to sharpen recovery practices.
In our proposal, however, we suggested something that initially surprised them: before training, we needed to validate the actual training need.
During the proposal presentation, the Head of Credit repeatedly emphasized one point.
“The training must be practical — not theory.”
When we asked him to elaborate, his response was telling. He explained that many training programs the team had previously attended had been well delivered, technically sound, and even engaging — yet they had led to little or no change in behavior. Staff returned to work and continued doing things exactly as before.
The issue was not resistance. It was not lack of interest. It was not even poor facilitation.
It was what happened after training — or more accurately, what did not happen.
This is where the link between training and performance usually breaks.
First, in many institutions, training begins without a clear diagnosis. The assumption is that because a problem exists, training must be the solution. But without unpacking why loan recovery is weak — whether it is skills, systems, incentives, decision authority, or supervision — training addresses symptoms, not causes.
Second, training is rarely tied to specific performance expectations. Staff attend sessions without clarity on what must change in their daily work. There are no defined behavioral standards, no revised processes, and no measurable outcomes tied to the learning.
Third, training often delivers knowledge but not tools. Participants understand concepts, yet leave without practical templates, monitoring dashboards, recovery scripts, escalation guidelines, or revised policies that would enable them to act differently the next day.
Fourth, accountability for implementation is weak. Once the training ends, responsibility quietly shifts back to individuals. Managers are not tasked to reinforce new practices, and supervisors are not equipped to monitor adoption. Training becomes something staff “attended,” not something the institution “implemented.”
Finally, follow-up is almost always missing. Few SACCOs schedule structured reviews or coaching sessions 60–90 days after training to assess whether behavior has changed, what obstacles remain, and what support is needed.
The Head of Credit was right to demand practical training — but practicality does not begin in the classroom. It begins with diagnosis, ownership, and follow-through.
Training only delivers value when it is deliberately connected to strategy, risk exposure, and day-to-day execution. Otherwise, it becomes an expensive ritual that reassures leadership without changing outcomes.
The uncomfortable truth is this: training rarely fails because of poor content.
It fails because institutions treat it as an event, not an intervention.
Where does training fail most in your institution — diagnosis, delivery, or follow-through?