The truth: decisions based on algorithms are also biased. The bias is simply buried where no one can see it. Worse, algorithms rob organizations of the courage to make the humane exceptions that build culture.

The Dreaded Phone Call
When the phone rang on Tuesday, S looked at the number and knew it was her new employer. She had just accepted a new role and was all set to start on Monday. It would be her first role as a people manager, leading a team of four. She couldn't wait to join. On Sunday evening, as she was deciding what to wear on her first day at work, her mother called to say that her brother had passed away.
The next day was spent organizing the funeral. Amidst everything that had to be done, she had missed informing the office that she would miss the new-hire orientation. S was understandably apprehensive when she heard the voice at the other end: it was the HR manager. S wiped away her tears and explained that she had every intention of keeping the office informed, but it had slipped through.
“You prioritized your family. That is how it should be. Take the time you need. We’ll see you in two weeks.”
This wasn’t in the employee handbook. Someone made a call. This exception said more about the company’s values than any mission statement could.
In Billions We Trust, In One We Won’t
Organizations trust leaders to take decisions that cost millions, even billions: where to locate plants, which mix of products or services to offer, the earnings guidance they share with analysts and investors. These decisions often start out qualitative and only later get dressed in data (quantitative, and hence “defensible”) to back them up. Every decision has a cost. We trust leaders to exercise judgement and rigor as they decide. If there is data and logic to back a decision, we are comfortable.
Yet for people decisions—hiring, performance reviews, promotions—we reach for algorithms. The promise: objectivity, consistency, efficiency, bias-free outcomes. The uncomfortable question: Why do we trust human judgment for strategy but not for people?
Why “Bias-Free” Is An Illusion
Most hiring today is mediated by algorithms. Algorithms are not math; they are “opinions embedded in code.” They don’t eliminate bias; they automate it. The bias simply moves upstream: into the minds of those who design the model, the selection of what data is collected, and the narrow definition of what gets measured.
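To make "opinions embedded in code" concrete, here is a minimal sketch of a hypothetical resume-scoring function. The feature names, weights, and thresholds are all invented for illustration; the point is that each one is a designer's opinion, not a mathematical fact.

```python
# A hypothetical resume-scoring function. The code looks neutral,
# but every numeric choice below is an opinion made upstream by its designers.
def score_candidate(candidate: dict) -> float:
    score = 0.0
    # Opinion 1: which features count at all (employment gaps are penalized)
    if candidate.get("employment_gap_months", 0) > 6:
        score -= 10
    # Opinion 2: how each feature is weighted
    score += 5 * candidate.get("years_experience", 0)
    # Opinion 3: a proxy (alma mater) that can smuggle in socioeconomic bias
    if candidate.get("university") in {"School A", "School B"}:
        score += 20
    return score

# Two candidates with identical experience diverge purely on the designers' opinions.
a = score_candidate({"years_experience": 4, "university": "School A"})
b = score_candidate({"years_experience": 4, "employment_gap_months": 12})
print(a, b)  # the gap between them was decided before either candidate applied
```

Nothing in this function is "biased" in an obvious way, yet the choice to penalize gaps and reward certain schools was made by a person and then frozen into code, where it silently repeats at scale.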
Goodhart’s Law
When a measure becomes a target, it ceases to be a good measure.
An organization I know well gave its training team "number of training sessions conducted" as its measure. The result: the team started finding creative ways to corner a few unsuspecting employees, herd them into a classroom, talk to them on some random topic like "Communication Skills," and declare it a training session held. Employees quickly learn to game the metric, not do the work.
Even now, organizations measure the effectiveness of the training department in terms of butts in seats. The year-end report is quantitatively pleasing. Each employee's time spent scrolling through self-service slides on an online portal is treated as a proxy for skills built. Yet the skills gap remains a challenge for every CEO.
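The training-team story can be sketched as a toy model of Goodhart's Law. The numbers below are invented, and "minutes of real practice" is a hypothetical stand-in for actual learning; the sketch only shows how the reported metric and the underlying goal can move in opposite directions.

```python
# A toy illustration of Goodhart's Law using the training-team story.
# All numbers are hypothetical.
def quarter_report(sessions_held: int, practice_minutes_per_session: int):
    # The metric the team is judged on: sessions conducted.
    metric = sessions_held
    # A rough proxy for what the organization actually wanted: skills built,
    # here approximated as total hours of real practice.
    skills_built = sessions_held * practice_minutes_per_session / 60
    return metric, skills_built

# Before the metric became a target: fewer sessions, real content.
before = quarter_report(sessions_held=10, practice_minutes_per_session=90)
# After: the team maximizes sessions by herding employees into token classes.
after = quarter_report(sessions_held=40, practice_minutes_per_session=5)

print(before, after)  # the metric quadruples while skill-building collapses
```

The year-end report shows only the first element of each tuple, which is exactly why it is "quantitatively pleasing" while the skills gap persists.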
Focusing only on quantitative measures misses the point. Algorithms optimize for thin data: numbers, metrics, scores. That misses human behavior, which is by definition hard to quantify without a deeper context to put the data in. Organizations run on thick data: stories, context, relationships, trust.
Algorithms Cannot Make Exceptions
When leaders have to bet on putting someone into a critical role, they must make a judgement call. You can't reduce "Is this person ready?" to a score without losing what matters. No algorithm can predict whether a person will succeed or fail in a role, even with a previous record of success.
Algorithms are defensible in court, but that’s not leadership. Leaders are supposed to take accountability, not hide behind systems. When decisions are automated, responsibility diffuses. Discretion disappears: “The system says no” becomes the answer.
Culture erodes quickly. There is no one to appeal to, no room for context, no humanity. If an automated camera at a traffic light tickets a driver rushing someone to a hospital, the speeding ticket is technically correct and humanly wrong. Only humans can truly make exceptions; machines cannot.
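The difference between "the system says no" and a leader who owns the call can be sketched in a few lines. The policy, rule names, and override condition below are all hypothetical; the contrast is the point: the pure-rule path has no appeal, while the human-in-the-loop path records a reasoned exception that a named person stands behind.

```python
# A minimal sketch (hypothetical policy and rules) of automated decisions
# versus human discretion.
from dataclasses import dataclass

@dataclass
class Decision:
    approved: bool
    reason: str

def automated_leave_request(days_employed: int) -> Decision:
    # "The system says no": policy encoded with no room for context.
    if days_employed < 90:
        return Decision(False, "Policy 4.2: no leave in first 90 days")
    return Decision(True, "Within policy")

def with_human_override(days_employed: int, context: str, manager: str) -> Decision:
    decision = automated_leave_request(days_employed)
    # A manager can override with a documented reason. The exception becomes
    # a story, and accountability stays with a person, not a system.
    if not decision.approved and "bereavement" in context:
        return Decision(True, f"Exception granted by {manager}: {context}")
    return decision

print(automated_leave_request(0))                     # the system says no
print(with_human_override(0, "bereavement in family", "HR manager"))  # a human says yes
```

S's story is the second path: the handbook said no, a person said yes, and the reason was spoken out loud.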
Human Judgement Needs Courage
When leaders use their judgement to make exceptions, they create stories, because anything predictable is NOT a story. When the manager of the newly hired S makes an exception because of her situation, the human side of the enterprise shines through.
What can organizations do to build courageous leaders?
- Hire for what algorithms miss: The candidate who doesn't tick the boxes but overcame extraordinary odds to get here is someone to celebrate. Allow 20% of hires to be atypical.
- Treat metrics as prompts, not verdicts: Data points should start conversations, not end them. Treat each metric as an opportunity to understand the context behind it.
- Document and celebrate exceptions: When you override the algorithm, explain why publicly. It builds trust in the system. It tells people that they will be viewed as unique individuals.
If your organization’s people decisions could be fully automated tomorrow, would you still need leaders—or just IT support?