Autonomous Ultrasound: From Episodic Imaging to Continuous Patient Monitoring
- Hani Eskandari
- Oct 11
- 7 min read
Updated: Oct 14

The New Face of Monitoring
It is two in the morning on a med‑surg floor. A nurse gets a silent alert from a patient two rooms down: right‑sided pressures are creeping up. No one paged a sonographer. A palm‑sized ultrasound sensor has been scanning in the background, trending signals and analyzing on its own. The nurse checks the stream, adjusts fluids, and the patient stabilizes. No escalation.
A few hours later, a patient with heart failure finishes his coffee at home. He places an ultrasound patch on his chest. Yesterday’s scan looked fine. Today’s trend shows early signs of congestion. An AI agent flags the change, and a remote clinician tweaks diuretics during a five‑minute call. No travel. No waiting room.
In a crowded ED, stretchers line the hallway while monitors beep in counterpoint. A junior clinician angles a probe over a patient’s chest, following on-screen guidance that gets them close enough for a usable image, but not a definitive one. The patient still waits for the one expert who can find the perfect window. The queue grows. Across town in another hospital, a nurse places an autonomous ultrasound patch on a patient in seconds. The system captures the view, trends results, and alerts when intervention is needed. Experts focus on decisions, not acquisition. Throughput rises, burnout eases, and patients move sooner.
These scenes are not science fiction. They combine two mature ideas: ultrasound and machine learning. What changes everything is how they work together.
Ultrasound has always been episodic and operator‑dependent. Recent AI guidance has lowered the barrier by coaching non‑experts to standard views. Helpful, but still “driver‑assist.” Monitoring demands autonomy.
From Diagnosis to Monitoring
Healthcare needs imaging that behaves like a vital sign: continuous, trended, and actionable. The goal is not just a better image at a single moment. It is better timing, earlier detection, and broader access. When imaging becomes autonomous, it stops being a test and starts being a monitor.
Every industry that relies on human skill eventually reaches a point where automation needs a better interface. Medical imaging is arriving there now, much as transportation did two decades ago. In the early 2000s, the DARPA Grand Challenge and Urban Challenge marked the dawn of vehicular autonomy.
Stanford’s Stanley (2005) and Carnegie Mellon’s Boss (2007) fused basic laser scanners, cameras, and radar to navigate desert and urban courses. Impressive, but mechanical: they turned steering wheels and pressed pedals with servos. The real breakthrough came when vehicles went fully electronic. Drive-by-wire systems let software steer, brake, and accelerate directly through digital control, using dense 3D sensors for full situational awareness. Ultrasound stands at the same inflection. Guided scanning proved the concept; true autonomy, powered by a new generation of 3D ultrasound sensors and wearable patches, will make imaging continuous, consistent, and effortless. “Tell a human how to aim” proved that semi-autonomous ultrasound imaging is feasible; “let software aim” is the operating model of the autonomy to come.
Autonomous driving offers a useful parallel for understanding this progression. Guided scanning today sits at the equivalent of Level 1-2 autonomy, comparable to cruise control, lane keeping, and advanced driver-assistance systems. These capabilities support the operator but still rely on constant human oversight. True autonomous ultrasound represents the leap to Level 4-5 autonomy, where software controls the process end-to-end, handles its own quality checks, and escalates only when human judgment is needed.
That shift is a paradigm change. Just as the next generation of humans will grow up in a world where driving without autonomy feels archaic, the future generation of clinicians will see manual image acquisition as a relic of the past. This is how ultrasound will evolve from a skilled diagnostic tool to an intelligent, ever-present monitor.
Autonomous imaging is a stack, not a single feature:
Full‑field instrumentation – Capture enough acoustic data that software can see the whole field, reacquire structures, and keep tracking without asking a human to “find the window.”
Acquisition autopilot – Decide when and how to capture views, repeat on schedule, and check quality in real time.
Measurement and trending – Extract biomarkers consistently and track them against personalized baselines.
Alerting and review – Surface meaningful change, route to the right clinician, and document the decision trail.
The right instrumentation is the foundation: without it, the autonomy stack stalls; with it, acquisition becomes background work and clinical attention shifts to interpretation and intervention.
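The “measurement and trending” and “alerting” layers of the stack above can be made concrete with a small sketch. This is a hypothetical illustration, not Sonus’s actual algorithm: it flags a new biomarker reading when it deviates from the patient’s own recent baseline by more than a chosen number of standard deviations. The `Reading` type, threshold, and biomarker are all assumptions for illustration.

```python
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class Reading:
    day: int        # days since monitoring began
    value: float    # biomarker value, e.g. a vessel diameter in mm (illustrative)

def flag_change(history: list[Reading], latest: float, z_threshold: float = 2.0) -> bool:
    """Flag `latest` if it deviates from the patient's personalized baseline
    (mean of prior readings) by more than `z_threshold` standard deviations."""
    values = [r.value for r in history]
    if len(values) < 5:              # too little data for a stable baseline
        return False
    baseline, spread = mean(values), stdev(values)
    if spread == 0:
        return latest != baseline
    return abs(latest - baseline) / spread > z_threshold

# Usage: a week of stable readings, then one large deviation.
history = [Reading(d, v) for d, v in enumerate([20.1, 19.8, 20.3, 20.0, 19.9, 20.2, 20.0])]
print(flag_change(history, 20.1))  # within baseline -> False
print(flag_change(history, 23.5))  # large deviation -> True
```

In practice a real system would use richer baselines (circadian patterns, medication context) and route alerts rather than print them, but the shape of the loop — measure, compare to a personal baseline, escalate only on meaningful change — is the same.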
Policy and Practice Aligning
Policy and regulation are evolving in parallel with technology. Regulators already evaluate AI in ultrasound. FDA clearances for guided acquisition and automated echo measurements show the agency’s familiarity with both workflow support and quantitative analysis. Examples include the de novo authorization of Caption Guidance for echo image acquisition and multiple 510(k) clearances for AI‑assisted echo measurement software. These decisions establish precedents for assessing safety, accuracy, and reproducibility.
On care delivery, Medicare’s September 2024 Hospital‑at‑Home study reported lower mortality for beneficiaries treated under the program relative to inpatients, with mixed readmission findings across conditions and lower Medicare spending in the 30 days after discharge for more than half of the top MS‑DRGs studied. The report also noted positive patient and caregiver experiences. These results are encouraging, while calling for continued study.
In July 2025, bipartisan bills were introduced in the U.S. Congress to continue Acute Hospital Care at Home flexibilities through 2030 and to require additional study and standards. Debate continues, but the direction is clear. Even with the relatively low-fidelity sensing modalities used in today’s remote patient-monitoring programs, compared with the in-depth instrumentation available in hospitals, data show meaningful value in keeping patients safely at home while assessing their condition regularly. Hospitals remain reserved for the most acute cases, while home-based monitoring supports early detection, timely intervention, and a better patient experience.
In parallel, the 2025 Physician Fee Schedule kept key virtual supervision rules that let doctors oversee care remotely for routine services delivered by nurses or clinical staff. It also made some of these flexibilities permanent and extended remote oversight for most services. The updates expand how clinics, especially in rural and community settings, can bill for remote care. These practical steps show continued momentum toward policies that support safe, distributed models of care.
The underlying remote‑care reimbursement mechanisms exist. Medicare maintains RPM and RTM pathways with defined requirements, such as 16‑day data capture for most RPM services. Programs will evolve, but the scaffolding for monitoring‑as‑a‑service is already in place, and will only expand and improve.
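As a back-of-envelope illustration of that scaffolding (not billing guidance), the common 16-days-in-30 data-capture threshold for RPM device-supply billing can be checked mechanically. The function name and defaults below are hypothetical:

```python
from datetime import date, timedelta

def rpm_threshold_met(transmission_dates: set[date], period_start: date,
                      required_days: int = 16, period_days: int = 30) -> bool:
    """Return True if data was captured on at least `required_days` distinct
    calendar days within the `period_days`-day window starting `period_start`."""
    period_end = period_start + timedelta(days=period_days)
    in_window = {d for d in transmission_dates if period_start <= d < period_end}
    return len(in_window) >= required_days

start = date(2025, 1, 1)
# A patch transmitting every other day yields only 15 distinct days -> threshold missed.
every_other_day = {start + timedelta(days=i) for i in range(0, 30, 2)}
print(rpm_threshold_met(every_other_day, start))   # False
# Daily transmission for the first 16 days -> threshold met.
daily = {start + timedelta(days=i) for i in range(16)}
print(rpm_threshold_met(daily, start))             # True
```

The point is that an always-on sensor trivially clears a bar that intermittent, patient-initiated measurement often misses.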
The Workforce Crisis
Labor shortages have become the defining constraint across healthcare, and imaging is no exception. A 2024 survey of 1,389 sonographers across the U.S. and Canada found that more than half reported moderate to severe burnout. Chronic fatigue, repetitive strain, and limited recovery time are now defining features of the profession. Sonographers experience some of the highest rates of musculoskeletal injury in healthcare. Studies show more than 70% report pain in the neck, shoulders, or back from sustained probe manipulation and static posture. These are not isolated complaints but structural risks embedded in a manual, operator-dependent workflow.
Reducing that dependence is more than a productivity measure. It is a safety, quality, and retention imperative. Automation that removes repetitive motion and minimizes scanning time protects clinicians from injury and burnout while preserving image consistency. In a field where expertise takes years to build and attrition is accelerating, rethinking the human-machine balance is essential to keep imaging sustainable.
Economics and Impact
When imaging shifts from episodic tests to continuous monitoring, the economics shift too. Capital sales give way to service models tied to outcomes and uptime.
Hospitals gain earlier warnings of deterioration, shorter stays in the ED and inpatient units, and a practical path to Hospital-at-Home programs that do not require an expert sonographer at every bedside. Payers gain trend data that helps prevent readmissions and complications. Existing RPM, RTM, and care-management codes already provide the framework for how these services are documented and reimbursed, with clear guardrails for quality and oversight.
For clinicians, this shift is not about replacing expertise but reclaiming it. When image acquisition becomes autonomous, skilled providers spend less time searching for windows and more time interpreting, intervening, and managing outcomes. Workflows become lighter, and burnout eases.
For hospitals, the impact is both operational and clinical. Autonomous ultrasound monitoring improves throughput, optimizes bed utilization, and provides an early-warning safety net for patient deterioration. In Hospital-at-Home settings, it extends hospital-grade monitoring into living rooms without requiring full-time staff.
For patients, the experience transforms from long waits and repeated visits to imaging labs into simple, non-invasive monitoring at home—no more complex than using a digital blood-pressure cuff.
As with all clinical automation, the path to imaging-based monitoring and autonomous ultrasound must be staged and evidence-driven. Confidence in that path rests on ultrasound’s proven record, built on decades of safe use, well-understood physics, and clear clinical value. The FDA is already familiar with AI in ultrasound, having cleared systems for both guided acquisition and image interpretation, so the framework for evaluating safety and efficacy is well established. The sequence starts with supervised acquisition, continues with validation of autonomous measurements against expert benchmarks, and finally expands to closed-loop monitoring under regulatory oversight. This approach mirrors how autonomy has matured in other fields: incremental validation, transparent performance thresholds, and gradual domain expansion, producing not a black box but a trusted copilot for clinical care.
Next-Generation Monitoring
The future of healthcare is about expanding access, simplifying use, and making advanced tools available beyond specialists. The technology, policy, and economics are already aligned. What’s needed now is innovation that enables software, not staffing limits, to determine how closely we can watch those at risk.
This is the future we are unlocking at Sonus. Our polymer-MEMS ultrasound technology provides the foundation for true autonomous imaging, combining advanced sensing with a scalable architecture that supports continuous, AI-driven monitoring.
Innovation in wearable ultrasound is accelerating across the industry. Academic teams, device startups, and established imaging companies are all converging on the same goal: making continuous, noninvasive monitoring part of everyday care. At Sonus, we are building on that momentum by developing the next generation of wearable ultrasound, using proprietary polymer-MEMS technology that combines flexibility, performance, and scalability. This platform provides the foundation for autonomous imaging and for the broader shift toward always-available, AI-driven monitoring in both hospitals and homes.
It’s time to make imaging a vital sign, and broadly accessible.
References
CMS Hospital‑at‑Home study summary and findings on mortality, readmissions, and post‑discharge spending. [link]
CY2025 Physician Fee Schedule: virtual direct supervision policy details and timelines. [link]
RPM/RTM requirements overview. [link]
Sonographer burnout prevalence. [link]
Bipartisan bills introduced in July 2025 to extend Hospital‑at‑Home through 2030. [link]