
In Healthcare, CX Experts Balance Technical and Tactical to De-Risk AI

Cresta News Desk

Published September 28, 2025
Credit: Outlever

Key Points

  • The rise of AI in customer experience is exposing new operational risks and business concerns.

  • Eric Edwards, Senior Service Line Director of IT Sourcing at Vizient, explains how AI adoption can erode institutional knowledge, leaving companies vulnerable during tech failures.

  • He describes how smaller organizations often face more acute risks because they're more likely to eliminate human roles and lose internal knowledge as a result.

  • Edwards concludes that rigorous vendor due diligence and a concrete contingency plan are essential to mitigate AI-related operational risks.

The benefit of AI is reducing labor costs, but that is also its central risk. Eliminating a human resource to save money also eliminates the internal knowledge that person holds. If the AI goes down, you've lost your only source of truth. You're left with a huge hole in your operation and nowhere to turn.

Eric Edwards

Senior Service Line Director of IT Sourcing, Vizient, Inc.

The promise of AI efficiency is driving a surge in adoption across customer experience. But in the rush to deploy, many leaders could be overlooking operational risks, like the erosion of institutional knowledge, which can leave companies without a fallback when technology fails. Now, the risk is so palpable that a new insurance market has emerged to cover the fallout from malfunctioning AI.

The move is a clear sign of measurable business concern, according to Eric Edwards, Senior Service Line Director of IT Sourcing at Vizient, Inc. A retired U.S. Navy Non-Commissioned Officer with a 20-year career in global logistics and a Certified Professional in Supply Management (CPSM), Edwards brings a unique perspective to the subject.

For Edwards, the central risk lies in the very benefit that makes AI so attractive: its impact on human labor. When leaders refocus or eliminate human roles, the institutional knowledge that keeps a business running often disappears as well, he explains. As a result, leaders are left facing two equally damaging recovery paths, right when their customers need support the most.

  • Mind the knowledge gap: The risk is most acute for smaller organizations, which are more likely to eliminate human roles than refocus them. "The benefit of AI is reducing labor costs, but that is also its central risk. Eliminating a human resource to save money also eliminates the internal knowledge that person holds. If the AI goes down, you've lost your only source of truth. You're left with a huge hole in your operation and nowhere to turn."

  • A rock and a hard place: Neither path offers a practical solution when customers are actively experiencing issues, Edwards explains. "After a failure, you face two bad choices. Pull an existing employee from somewhere else, and you risk burning out someone who is already at their breaking point. Decide to re-hire, and you begin a slow, expensive process while your paying customers wait for support."

According to Edwards, the first line of defense is a rigorous approach to vendor due diligence, a thorough examination of a potential partner's technology and operations.

  • Look under the hood: A vendor's reliance on outside technology introduces risks they cannot fully control, he continues. "The first question to ask a vendor is simple. Is this your own technology or a third-party solution? A native AI means the vendor has full control. But if they're white-labeling another company's technology, that exposes you to risks beyond their control. The weakest link is often the connection between platforms. Personally, I feel much better putting my chips on a native solution."

  • The investigative mindset: Because vendors may not volunteer information about their technology stack, a proactive, skeptical approach is necessary, Edwards says. "You need an investigative journalist's mindset. If you don't get a straight answer about where the technology comes from, you have to keep digging. You must be willing to ask the tough questions until you get to the bottom of it."

However, a familiar challenge can undermine even the most careful due diligence, Edwards warns. Intense pressure from leadership to meet budgets and hit aggressive deadlines can compromise the integrity of a thorough technical evaluation.

  • The pressure cooker: The battle to negotiate favorable financial terms can consume time and energy that should be reserved for technical vetting, he explains. "Technical due diligence often gets sidelined by financial and political pressure. You get so focused on negotiating the contract that you run out of runway for the technical evaluation. A leader demands the deal get done by year-end to satisfy shareholders, so you get it done. Protect the company financially first, and the technical risks get missed as a result."

A solid vendor is only half the battle, Edwards says. The next step is creating a concrete fallback plan. But the right approach depends on an organization's scale.

  • The 'Janice' playbook: Companies with sufficient internal resources need a straightforward, pre-defined standard operating procedure that can be activated immediately, Edwards explains. "A larger organization needs a disaster mitigation plan with clear SOPs ready to go. The playbook must be simple. 'If the AI goes down, Janice from this department will step over and run support using this guide.' It needs to be that specific to avoid confusion in a crisis."

  • Phone a friend: Organizations without staff to spare should line up outside help before a crisis hits, Edwards advises. "A smaller organization usually can't afford to have staff on standby. Instead, they should leverage their existing professional services partners, like Deloitte or Accenture. Explore their CX capabilities ahead of time. Arrange a contract that allows them to step in on a short-term basis to fill the gap. Then, that third party becomes a resilience resource, not a risk."

For Edwards, the analysis must go deeper than company size. Leaders must also assess risk appetite at the departmental level, where cultural differences can be significant.

  • Operate unabashed: Marketing is a prime example of a department often culturally primed to embrace risk, Edwards explains. Here, decisions tend to reflect customer "feel" rather than empirical data, suggesting more comfort with ambiguity. "Risk tolerance varies more by department than by industry," Edwards says. "Marketing decisions are often made on emotion and feel. Because they excel at selling how a solution will make a customer feel, they want to operate unabashedly, with no constraints. That culture can lead them to take on more risk than other, more measured parts of the business."

Underneath it all, Edwards sees a more fundamental issue in many organizations: a culture of complacency, in which post-crisis vigilance is only temporary. Even after high-profile incidents like the 2013 Target data breach and the recent cyberattack on Change Healthcare, complacency inevitably returns, he says. "The weak point for organizations is that they don't go back and reevaluate what they once knew."

The issue is not a moral failing, however, but the sheer volume of new requests, Edwards concludes. "It's incredibly difficult to find the time to go back and perform that re-due diligence on systems that seem to be working. But that's how things slip through the cracks. It isn't until a problem happens somewhere else that you're prompted to look under the hood of your own operations."