Wednesday, October 1, 2025

Weekly Voice Insights #59 – Bias Alert Check-In: Guarding Clarity in Voice and Awareness

Noticing the stories we carry into the room


Red glasses on. Bias radar active!

We often walk into conversations carrying assumptions we don’t notice. The same thing happens with the voice. A singer may repeat a phrase without clear intention, or a speaker may let adrenaline carry them through too quickly, without giving the words space to land. These slips aren’t deliberate — they’re habits. In the same way, our minds fall into biases that shape how we sound before we’ve even chosen the words.

The idea of a Bias Alert Check-In came to me while working with AI. I realized the tool often gave back exactly what I wanted to hear. That was flattering, but it also meant I could miss what was really there. So I started asking it to flag possible bias in my drafts. Erasing bias isn’t possible, but paying attention to when it shows up — that is.

And what I mean by bias here isn’t just politics or surveys. It’s the small, everyday judgments that creep into our tone. Speaking too quickly because we want to get everything out without leaving a pause. Losing track of intention. Assuming the other person won’t follow us. Those cues show up long before the words are finished.

I’ve also noticed this reflected in some recent work on communication. A June 2025 article, Why Tone of Voice Matters in Communication, described how small changes in tone can shift the level of trust people place in what’s being said. Another piece, The Emotional Signature of Your Voice (July 2025), pointed out that clarity depends on whether your tone matches what you mean. That matches my experience too: when intention isn’t clear, or when you overlook how much breath you actually have for a phrase, the sound reveals more than the words alone.

“When any impression comes upon you, remember to say: you are an impression, not the thing you appear to be.”
— Epictetus, Discourses 1.20

Impressions arrive quickly and feel convincing, but they’re not always the full story.

That reminds me of Chimamanda Ngozi Adichie’s TED Talk, The Danger of a Single Story. Her point wasn’t that stereotypes are always false, but that they are incomplete — they leave out the rest of the picture. I used her talk often in a first-year seminar course for new college students. Every time I watched it, I learned something myself. It helped students see their own single stories, and it gave me the chance to check mine too.

What made it so powerful was her honesty. She wasn't lecturing about bias; she was telling her own story. She spoke about Fide, the boy who helped with chores in her home. She had always been told his family was very poor, and that became her single story. Then she visited their home and saw the beautiful baskets the family had made. In that moment she realized she had reduced the whole family to one narrative of poverty. The story wasn't false, but it was incomplete.

I’ve seen the same pattern in my own work. In performance, it can be tempting to repeat a phrase the same way because that’s how it’s always been done. Imitation has its place — choosing a strong model can teach us a great deal — but if we stop there, something gets lost. In my experience, the listener is more engaged when the interpretation has been made our own. Epictetus described this in his own way when he wrote about the art of the speaker and the art of the listener: communication is most alive when what we bring is genuinely ours and when it connects across both sides.

Teaching was the same. Templates and old syllabi gave me a starting point, but unless I reworked them for the students in the room, they didn’t quite fit. AI is no different. It can provide useful material, but if I take it as-is, I’m not really bringing my own voice to it. The Bias Alert Check-In makes me stop and ask: is this just a convenient version, or have I adapted it into something that truly reflects me?

And we know bias can be heard, even unconsciously. An article called Silent Signals: How AI Can Read Between the Lines in Your Voice (July 2025) described how machines detect hesitation, stress, or shifts in pacing that humans often miss consciously yet still respond to. If AI can hear our hesitation, so can people.

That’s why I use a simple Bias Alert Check-In, a short set of prompts to reset tone and attention:

  • Who or what am I reacting to?

  • How’s my breath?

  • Am I leaving space for the other side?

First impressions and single stories will always come. They tell us something, but not everything. The Bias Alert Check-In is a reminder to pause, notice, and stay aware of what comes out of our mouths. That small pause also opens the door for the listener — it makes space for a real exchange, where both sides are part of the conversation.



Elias Mokole
Keynote Speaker, BA & Beyond 2025 | Voice Presence & Change
Founder, Developing Your Authentic Voice Newsletter

Please subscribe here

#DevelopingYourAuthenticVoice #VoiceMatters #Clarity #Attention #Awareness #StoicWisdom #Epictetus #BiasAlert #SingleStory #Listening
