When a chatbot can't help, what should that moment feel like? I designed the conversational AI experience for homedepot.com, defining the line between automation and human connection.
Home Depot's existing chat system could answer questions. It could route tickets. By every internal metric, it was performing well. But customers weren't satisfied, and the support team was overwhelmed by escalations that felt abrupt and confusing.
The problem wasn't the AI's accuracy. It was the experience around it. Customers didn't know when they were talking to a bot versus a human. The handoff between them felt like falling through a trapdoor. And when the AI couldn't help, it just... stopped. No explanation. No warmth. No path forward.
I was brought in to redesign the entire conversational experience, end to end.
Before sketching anything, I needed to understand what the AI could actually do and where it broke. I spent two weeks mapping the existing system: what triggered the bot, how it parsed intent, where confidence scores dropped, and what happened when they did.
I also interviewed support associates. They had a perspective no dashboard could give me. They knew which escalations felt earned (complex problems that genuinely needed a human) and which felt broken (the bot just gave up mid-conversation).
"Every time the bot transfers me, I have to repeat everything. It's like calling a doctor's office and being put on hold three times."
Paraphrased from customer feedback.

When should the AI handle a conversation, and when should a human? This was the core design decision. Not a UI question. A product philosophy question. The answer shaped everything downstream: the dialog flows, the escalation logic, the tone of every message.
I mapped every conversation type against two axes: complexity (can the AI reliably handle this?) and emotional stakes (how frustrated or anxious is the customer likely to be?). Simple and low-stakes? AI handles it. Complex or high-stakes? Route to a human. The interesting work was everything in between.
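To make that matrix concrete, here's a rough sketch of the routing logic in TypeScript. It's illustrative only: the real system was configured in Voiceflow rather than written this way, and the score names and thresholds are my assumptions, not production values.

```typescript
// Illustrative sketch of the two-axis routing matrix, not the production system.
// Assumes the intent classifier yields a complexity score and an
// emotional-stakes score, both normalized to 0..1.

type Route = "ai" | "human" | "ai_with_escalation_offer";

interface ConversationSignal {
  complexity: number;      // can the AI reliably handle this? (0 = trivial)
  emotionalStakes: number; // how frustrated or anxious is the customer likely to be?
}

function routeConversation({ complexity, emotionalStakes }: ConversationSignal): Route {
  // Simple and low-stakes: the AI handles it end to end.
  if (complexity < 0.4 && emotionalStakes < 0.4) return "ai";

  // Complex or high-stakes: route straight to a human.
  if (complexity > 0.7 || emotionalStakes > 0.7) return "human";

  // The middle band, where the interesting work lived: the AI attempts,
  // but escalation is always one tap away.
  return "ai_with_escalation_offer";
}
```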
Three decisions defined the experience. For each, here is what I considered and what I chose.
Some companies obscure the line between AI and human. We went the opposite direction. The bot always introduces itself clearly. Transparency upfront actually increased trust and reduced frustration when escalations happened.
The old experience was a hard cut: "Transferring you to an agent." Then silence. The redesigned handoff passes context forward so the customer never repeats themselves, sets expectations, and maintains conversational continuity.
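A hypothetical shape for the context packet a warm handoff could pass forward is sketched below. The field names are illustrative assumptions, not Home Depot's actual schema, but they capture the principle: the agent gets the full history, and the customer gets expectations instead of silence.

```typescript
// Hypothetical context packet forwarded at handoff so the customer
// never repeats themselves. Field names are illustrative.

interface HandoffContext {
  conversationTranscript: { role: "customer" | "bot"; text: string }[];
  detectedIntent: string;          // e.g. "order_status"
  orderId?: string;                // any entities the bot already collected
  escalationReason: "low_confidence" | "customer_request" | "high_stakes";
  customerWaitEstimateSec: number; // used to set expectations in the UI
}

function buildHandoffMessage(ctx: HandoffContext): string {
  // The customer-facing message sets expectations instead of a hard cut.
  const mins = Math.ceil(ctx.customerWaitEstimateSec / 60);
  return (
    `I'm connecting you with an associate who can help with this. ` +
    `They'll have our full conversation, so you won't need to repeat anything. ` +
    `Typical wait right now is about ${mins} minute${mins === 1 ? "" : "s"}.`
  );
}
```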
The hardest edge case: the model's confidence is low, but the query doesn't clearly need a human. The old system would loop: "I'm sorry, can you rephrase that?" The new system gives the AI one honest attempt, then offers a clear choice.
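In code terms, the fallback behaves roughly like the sketch below. The confidence floor and the `model.infer` interface are assumptions for illustration; the actual thresholds and wording were tuned in the live system.

```typescript
// Sketch of the low-confidence fallback: one honest attempt, then a
// clear choice instead of a rephrase loop. Thresholds are illustrative.

interface Model {
  infer(query: string): Promise<{ answer: string; confidence: number }>;
}
declare const model: Model; // assumed NLU/generation interface

const CONFIDENCE_FLOOR = 0.55; // below this, don't pretend certainty

async function respond(query: string, attemptCount: number): Promise<string> {
  const { answer, confidence } = await model.infer(query);

  if (confidence >= CONFIDENCE_FLOOR) return answer;

  if (attemptCount === 0) {
    // One honest attempt: say what the bot *thinks*, not "please rephrase".
    return (
      `I might be off here, but here's what I found: ${answer} ` +
      `Did that answer your question?`
    );
  }

  // No second loop. Offer the customer a clear choice.
  return (
    "I don't want to guess on this one. I can connect you with an associate " +
    "now, or you can try asking a different way. Which would you prefer?"
  );
}
```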
The patterns I established for the chatbot (transparency about AI identity, warm handoffs, graceful failure) became the foundation for Magic Apron, Home Depot's generative AI shopping assistant. I joined the early design team and helped shape the interaction model before the product had established patterns.
The core question was the same, just bigger: when a generative AI can answer almost anything, how do you design for the moments it shouldn't?
The redesigned conversational experience launched across homedepot.com, handling both pre- and post-transaction inquiries through a unified chatbot and live chat system built in Voiceflow.
Placeholder: Escalation rate change, e.g. "X% reduction in unnecessary escalations"
Placeholder: Customer satisfaction signal, e.g. CSAT improvement or qualitative associate feedback
Placeholder: Adoption metric, e.g. "Patterns became standard for all new conversational AI work"
Placeholder: Associate feedback, e.g. "Less context-switching during handoffs"
I would have pushed harder for a feedback loop from customers after each AI interaction. We measured handle time and escalation rates, but we didn't systematically capture whether the customer felt heard. Metrics told us the system was faster. They didn't tell us if it felt better. In my next conversational AI project, I'd design the measurement into the experience from day one.