When AI Becomes Your Digital Yes‑Man
By Warwick Hampden-Woodfall
The hidden threat of AI sycophancy – and how to stop it before it costs you
It’s flattering when people agree with you.
It’s also dangerous.
Samuel Goldwyn once said that he didn’t want “yes men” around him. He was right.
But many leaders do want yes‑men - even if they won’t admit it. Sometimes it’s ego. We all like to be told we’re right. Sometimes it’s about control - wanting people (or systems) to just do what we say without pushing back. And now, thanks to AI, you can have a yes‑man at your disposal 24/7.
The trouble is, an AI that mirrors your confirmation bias is strategic sabotage dressed up as productivity. It feeds you exactly what you want to hear, and it does it convincingly.
That’s AI sycophancy - and if you don’t tackle it, it will quietly undermine your decisions, your innovation, and your competitive edge.
The danger isn’t obvious - and that’s the problem
AI sycophancy rarely shows up as a dramatic failure or an obvious hallucination. You won’t get a blinking red light telling you your AI is lying to please you.
Instead, you’ll see:
- Reports that always back your gut feeling.
- Forecasts that echo last quarter’s thinking.
- “Insights” that are just better‑worded versions of your own opinions.
It feels efficient. It feels like alignment.
In reality, it’s narrowing your perspective and skewing your judgement.
Why SMEs are especially exposed
In SMEs, decisions are often made fast. That’s a strength - but it’s also why sycophantic AI is so risky.
One flawed assumption, amplified by an agreeable AI, can ripple through strategy, operations, and budgets before anyone stops to question it.
And because SMEs often run lean, there’s a temptation to trust AI output at face value. That trust is misplaced.
AI Governance isn’t red tape. It’s your insurance policy.
Good AI governance is the mechanism that forces your AI to tell you the truth - even when it’s uncomfortable.
It’s not about slowing things down. It’s about:
- Setting rules for what “helpful” actually means in your business.
- Building deliberate challenge into your AI prompts and review processes.
- Making space for dissenting data and alternative scenarios.
In other words, it’s about making sure your AI isn’t just smart - it’s honest.
6 ways to stop AI sycophancy in its tracks
Here are six ways to embed truth-seeking into your AI interactions and safeguard your business:
1. Define truth
Agree what “truth” means for your AI outputs - factual accuracy, balanced perspective, or evidence‑backed dissent. Write it down and use it to judge outputs. If you want balanced views, train and test AI on data that reflects them.
2. Force the counter‑argument
Always ask your AI for the opposite view. Treat it as seriously as the recommendation you asked for. This pushes the AI (and you) beyond comfortable answers.
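If your team pulls AI recommendations through an API rather than a chat window, you can force the counter-argument programmatically. Below is a minimal sketch of that two-pass pattern, assuming the OpenAI Python SDK (v1+); the model name and the example question are placeholders you would swap for your own, and the same idea works with any chat-style API.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
MODEL = "gpt-4o-mini"  # placeholder - use whichever model your business runs


def ask(prompt: str) -> str:
    """Send a single prompt and return the model's reply."""
    response = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


question = "Should we expand into the Benelux market next quarter?"

# Pass 1: the answer you asked for.
recommendation = ask(question)

# Pass 2: force the opposite view, grounded in the first answer.
counter = ask(
    "Here is a recommendation:\n\n"
    f"{recommendation}\n\n"
    "Argue the strongest possible case AGAINST it. "
    "List the assumptions it relies on and the evidence that would prove it wrong."
)

print("RECOMMENDATION:\n", recommendation)
print("\nCOUNTER-ARGUMENT:\n", counter)
```

The point of the second pass is that the counter-argument is anchored to the specific recommendation, not asked in the abstract - which makes it much harder for the AI to produce a token objection and move on.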
3. Make someone accountable for pushback
Never let AI outputs flow straight into strategy. Assign a human to challenge them before they drive decisions. Without this, flawed outputs will slip through unchecked.
4. Demand transparency
Choose AI tools that explain their reasoning and key assumptions. This lets you see what’s driving the recommendation and spot weak or biased thinking.
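Even when a tool doesn’t expose its reasoning natively, you can demand it in the prompt. Here is a rough sketch of a reusable wrapper that appends a transparency requirement to every request; the wording is illustrative, not prescriptive, and the example question is made up.

```python
TRANSPARENCY_SUFFIX = (
    "\n\nBefore giving your recommendation, list:\n"
    "1. The key assumptions you are making.\n"
    "2. The data or evidence each assumption rests on.\n"
    "3. What would have to be true for this recommendation to be wrong."
)


def transparent_prompt(question: str) -> str:
    """Wrap any business question so the AI must show its working."""
    return question + TRANSPARENCY_SUFFIX


print(transparent_prompt("Should we raise prices by 8% in January?"))
```

A wrapper like this costs nothing to adopt, and it gives your human reviewer (point 3 above) something concrete to challenge: the assumptions, not just the conclusion.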
5. Diversify your data
Train and update your AI on diverse, representative datasets. Audit regularly for bias and gaps - otherwise, you’ll just reinforce yesterday’s mistakes.
6. Teach AI its limits
Train your AI to flag when it’s guessing, working from incomplete data, or outside its expertise. It should be able to say, “I don’t know” or “this is based on assumptions that may not hold.”
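For teams wiring AI into their own tools, this usually lives in the system prompt. A minimal sketch follows, again assuming the OpenAI Python SDK and a placeholder model name; the honesty rules themselves are an illustration you would adapt to your own risk appetite.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

HONESTY_RULES = (
    "You are a business analyst. Follow these rules strictly:\n"
    "- If you are guessing, say 'This is a guess' and explain why.\n"
    "- If the data you have is incomplete, name what is missing.\n"
    "- If a question is outside your knowledge, say 'I don't know' "
    "rather than producing a plausible-sounding answer.\n"
    "- State the assumptions behind every recommendation."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder - substitute your own model
    messages=[
        {"role": "system", "content": HONESTY_RULES},
        {"role": "user", "content": "What will our churn rate be next year?"},
    ],
)
print(response.choices[0].message.content)
```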
The real payoff
When your AI stops being a cheerleader and starts being a sparring partner, you’ll:
- Spot risks earlier.
- Make bolder but safer decisions.
- Avoid the costly groupthink that takes down companies.
If you’re serious about getting business value from AI, this is where you start.
Not with fancy features. Not with the next big model.
With the integrity to demand that your AI tells you what you need to hear - not what you want to hear.
If that’s a conversation worth having in your business, we should talk.