What gets measured, gets done.
So when we measure throughput and define efficiency by throughput, our throughput speed increases. This usually means adopting various levels of artificial intelligence and becoming a factory. Perhaps not an actual factory, but we set ourselves up as an assembly line nonetheless. Most recruitment is done this way, and most proposal review processes are, or are moving, this way. Keywords need to be present to get you to the next stage. You need to tick multiple boxes in a ‘machine readable’ manner before you can get in front of a human.
This helps with volume, but doesn’t help get the best candidates or projects. It gets you people who know how the system works.
The other thing it results in is a lack of discussion. Proposals are just that: proposals, not end results. All proposals have blind spots, and therefore all of them can be improved. However, discussions and conversations slow down throughput. And yet, as I tried to point out in yesterday’s post, we need more discussions about the potential for harm.
Most donor reps and people in NGO HQs who review proposals are interested in discussion and in reducing harm, but their performance metrics don’t allow the ‘time’ for it.
What gets measured gets done, and has unintended consequences. When we view organisations as machines or factories, we view processes and people in a certain way. When they are seen as organic, learning systems, perhaps efficiency is measured differently.
The choice is up to us.