AI Integration Healthcare Patient Operations 2024

Regional Urgent Care Network: 64% faster intake. No migration required.

A regional urgent care network was losing patients to wait times and inconsistent triage. We built an AI-assisted intake and routing system that plugged into their existing EHR. Nine weeks, no platform migration, measurable outcomes from week one of go-live.

64% intake time cut
9-week delivery timeline
0 platform migrations
4 clinic locations

The situation

The client operated four urgent care clinics across the Houston metro. Their front-desk intake process was bottlenecked and inconsistent: patients filled out paper forms or typed into a generic intake portal, a front-desk staff member reviewed the responses and made a judgment call about triage priority, and clinical staff often disagreed with those calls after the fact.

The average intake-to-room time across all four locations was 23 minutes. For walk-in urgent care, that number alone was driving patients to competitors. The operations director had looked at three different patient management platforms and rejected all of them because each would have required replacing their existing EHR — a full migration estimated at 18 months and $400K.

The ask: fix the intake problem without touching the EHR.

The constraint that shaped everything

The network used athenahealth for their EHR. The system was deeply embedded — billing, clinical notes, prescriptions, compliance reporting all lived there. Replacing it was not a real option for a four-location independent network. Whatever we built had to read and write to athenahealth through its API, stay inside HIPAA boundaries, and feel like a native part of the front-desk workflow rather than a separate tool the staff had to remember to use.
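The read-and-write boundary can be sketched roughly as below. The endpoint path, field names, and base URL are hypothetical placeholders for illustration, not athenahealth's actual API surface; the real integration went through athenahealth's documented API inside the HIPAA boundary.

```python
# Sketch of the integration boundary: read a patient record from the EHR
# API, then normalize only the fields the intake flow needs.
# NOTE: the base URL, endpoint path, and field names are placeholders.
import json
import urllib.request

BASE_URL = "https://ehr.example.com/v1"  # placeholder, not a real endpoint


def fetch_patient(practice_id, patient_id, token):
    """Read-only pull of a patient record (hypothetical endpoint)."""
    req = urllib.request.Request(
        f"{BASE_URL}/{practice_id}/patients/{patient_id}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)


def summarize_for_intake(record):
    """Keep only what the front desk and triage layer need to see."""
    return {
        "name": record.get("name"),
        "last_visit": record.get("last_visit"),
        "allergies": record.get("allergies", []),
        "changed_since_last_visit": record.get("changed_fields", []),
    }
```

Keeping the normalization step separate from the fetch is what lets the front-desk tool stay read-mostly: the EHR remains the system of record, and the intake layer only surfaces a narrow slice of it.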

We spent the first week on-site at two locations watching intake happen in real time. What we observed: front-desk staff were making triage calls based on incomplete information, chief complaint descriptions were inconsistently recorded, and the downstream clinical team had no visibility into what was coming before a patient reached the exam room.

What we built

The system had three components. First, a patient-facing intake flow — a structured symptom and history collection screen that replaced the generic form. It asked the right follow-up questions based on the chief complaint and flagged anything that clinical guidelines suggested warranted immediate attention. It ran on a tablet at the front desk and on patients' phones via a QR code in the waiting area.
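The branching follow-up logic can be sketched as below. The complaint names, questions, and red-flag criteria here are illustrative stand-ins, not the clinical guidelines the system actually encoded.

```python
# Hypothetical sketch of a branching intake flow: follow-up questions are
# selected by chief complaint, and answers matching a red-flag criterion
# are surfaced for immediate attention. All content is illustrative.

FOLLOW_UPS = {
    # chief complaint -> list of (question, is_red_flag_if_yes)
    "chest pain": [
        ("Does the pain radiate to your arm, jaw, or back?", True),
        ("Did the pain start during physical activity?", False),
    ],
    "laceration": [
        ("Is the bleeding controlled with direct pressure?", False),
        ("Was the wound caused by an animal or human bite?", True),
    ],
}


def run_intake(chief_complaint, answers):
    """Return the questions asked and any red-flag responses.

    `answers` maps question text -> bool (the patient's yes/no response).
    A red-flag question answered "yes" is escalated for immediate review.
    """
    questions = FOLLOW_UPS.get(chief_complaint.lower(), [])
    red_flags = [
        q for q, is_red_flag in questions
        if is_red_flag and answers.get(q) is True
    ]
    return {
        "questions_asked": [q for q, _ in questions],
        "red_flags": red_flags,
    }
```

Because the question set is data rather than code, clinical staff can review and amend it without a redeploy, which matters for the sign-off process described later.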

Second, an AI-assisted triage scoring layer. Based on the structured intake data, the system produced a triage recommendation — not a clinical decision, but a structured input to the clinical team's decision. It surfaced relevant history from the EHR, highlighted anything that had changed since the patient's last visit, and formatted everything in the way the network's clinical leads had told us they actually wanted to see it.

Third, a real-time queue view for clinical staff — a read-only screen showing every patient currently in intake, their triage score, and estimated acuity. This meant the clinical team could see what was coming before a patient left the waiting room.
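The queue's ordering logic amounts to a simple sort: higher triage scores first, ties broken by longest wait. The field names below are illustrative, not the system's actual schema.

```python
# Sketch of the queue view's ordering: patients currently in intake,
# highest triage score first, ties broken by longest wait time.
# Field names ('triage_score', 'wait_minutes') are illustrative.

def queue_snapshot(patients):
    """patients: list of dicts with 'name', 'triage_score', 'wait_minutes'.

    Returns a new list sorted for display on the clinical queue screen.
    """
    return sorted(
        patients,
        key=lambda p: (-p["triage_score"], -p["wait_minutes"]),
    )
```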

"Our nurses stopped arguing with the front desk about triage calls within the first week. The system gave everyone the same information at the same time."
Director of Clinical Operations

HIPAA and the trust problem

Healthcare AI has a credibility problem. Staff are correctly skeptical of systems that make recommendations they cannot explain. We addressed this directly by designing the triage score to always show its reasoning — not a black-box confidence number, but a plain-language summary of which intake responses contributed to the score and why.
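The shape of an explainable score like this is straightforward: each rule contributes points and a plain-language reason, so the total can always be traced back to specific intake responses. The rules and weights below are illustrative, not the network's actual scoring logic.

```python
# Minimal sketch of an explainable triage score: every contributing rule
# records a human-readable reason alongside its points. Rules, thresholds,
# and weights here are illustrative placeholders.

RULES = [
    # (intake field, predicate, points, plain-language reason template)
    ("temp_f", lambda v: v >= 103.0, 3, "High fever ({} F)"),
    ("pain_level", lambda v: v >= 8, 2, "Severe self-reported pain ({}/10)"),
    ("symptom_days", lambda v: v >= 7, 1, "Symptoms persisting {} days"),
]


def triage_score(intake):
    """Return (score, reasons) for a dict of structured intake responses."""
    score, reasons = 0, []
    for field, predicate, points, template in RULES:
        value = intake.get(field)
        if value is not None and predicate(value):
            score += points
            reasons.append(template.format(value))
    return score, reasons
```

Because the reasons are generated from the same rules that produce the score, the display can never show a number the staff cannot interrogate — which is the point of the design.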

The clinical leads reviewed and signed off on the scoring logic before go-live. That review process took two weeks and was worth every hour. It meant that on day one of live deployment, the clinical staff already trusted the system because they had been part of building it.

Outcome

Average intake-to-room time dropped from 23 minutes to 8.3 minutes across all four locations within the first 30 days of operation. Triage disagreements between front-desk and clinical staff — tracked as incident reports — dropped by 87% in the same period. Patient satisfaction scores across all four locations improved in the following quarter's survey.

The network has since extended the engagement to build a follow-up care coordination module that notifies patients with outstanding lab results and routes them to the appropriate follow-on care. No additional platform migration required.
