Accessible expert ultrasound, anywhere.

Bring expert ultrasound imaging and diagnostics to rural communities and other remote settings through our patent-pending mixed reality system without anyone needing to travel.

The Issue

Diagnostic ultrasound is used in every branch of medicine, yet rural communities have little to no access to it. This affects about 1 in 5 people globally. Today, the only practical solution is for patients to travel for scans, which means time away from work and family, the risks and environmental impact of travel, and billions of dollars per year in costs to healthcare systems. The tele-ultrasound market is estimated at $100 billion (USD), but current solutions do not solve the problem: video guidance is imprecise and ineffective, while robotics is too complex, expensive, and inflexible.

Solution

Guide Medical is addressing the problem of ultrasound access by enabling expert ultrasound guidance in remote communities. Our system bridges the gap between robotics and video guidance, providing a solution that is simultaneously easy to use, low-cost, and flexible, while giving the physician precise control and immersive feedback.

We use a patent-pending combination of real-time, hand-over-hand augmented reality guidance with high-speed communication and force feedback to give the expert the sensation of performing a scan in person, while in fact they are guiding the motions of a novice in a remote location.

Our approach is based on hundreds of conversations with physicians, health authorities, remote and Indigenous communities, and industry experts, and has been validated in award-winning published research.

At a glance

The novice wears a mixed reality headset and aligns their real ultrasound probe with a virtual one, whose motion is controlled in real time by a remote expert manipulating a haptic input device. The expert sees the live-streamed video and ultrasound, communicates verbally with the novice, and receives force feedback through the haptic device, so it feels as if they are performing the scan in person.

  • Cloud interface:
    • Secure, robust, end-to-end encrypted, ultra-fast, peer-to-peer connection between expert and novice
    • Two-way verbal communication
    • High-quality, low-latency video and ultrasound streaming
    • Advanced teleoperation algorithms
  • Expert-side interface:
    • High definition video and ultrasound streams
    • Device-agnostic remote control of the ultrasound machine settings
    • Input device with an ultrasound probe-like handle for six-degree-of-freedom input with sub-millimeter, sub-degree precision and realistic force feedback
  • Patient-side system:
    • Intuitive mixed reality guidance system
    • Integrated transducer force, position, and orientation tracking and feedback
    • Patient-specific modeling and feedback
    • AI assistance
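To make the loop described above concrete, here is a minimal, illustrative sketch of one cycle of bilateral teleoperation in Python. It is a simplification under stated assumptions (one-dimensional poses and made-up gains `k_force` and `k_error`), not our actual patent-pending implementation: the expert's input device pose drives the virtual probe shown to the novice, while the novice's measured probe pose and contact force determine the force the expert feels.

```python
# Illustrative sketch only: assumed gains and 1-D poses, not the actual
# Guide Medical control system.

def step(expert_pose, novice_pose, novice_force, k_force=0.5, k_error=0.2):
    """One cycle of the guidance loop.

    expert_pose / novice_pose: probe position along one axis (mm).
    novice_force: contact force measured at the novice's probe (N).
    Returns (virtual_probe_pose, feedback_force).
    """
    # Forward path: the virtual probe in the headset mirrors the expert's
    # haptic input device.
    virtual_probe_pose = expert_pose

    # Backward path: the expert feels a blend of the novice's measured
    # contact force and the novice's tracking error, giving the
    # "scanning in person" sensation.
    tracking_error = novice_pose - expert_pose
    feedback_force = k_force * novice_force + k_error * tracking_error
    return virtual_probe_pose, feedback_force

# Toy example: the novice lags the expert by 1 mm while pressing with 2 N.
virtual, force = step(expert_pose=10.0, novice_pose=9.0, novice_force=2.0)
print(virtual, round(force, 2))  # 10.0 0.8
```

In the real system this loop runs at high rate over the peer-to-peer connection, with full six-degree-of-freedom poses and the teleoperation algorithms described in our publications below.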

Key benefits

  • Intuitive and easy to use for expert and novice
  • Precise, real-time guidance for an efficient, high-quality scan
  • Easy, fast setup (less than 1 minute)
  • Completely flexible and mobile (one pair of smart glasses and a tablet)
  • Probe-agnostic: compatible with most point-of-care ultrasound devices
  • No robot-related safety concerns
  • Massive cost savings compared to travel or robotics

Partners and Sponsors

Research

This venture was born in the Robotics and Control Laboratory at the University of British Columbia, Vancouver, Canada. We are carrying out extensive research on the technical system, instrumentation, control, human-computer interaction, vision, clinical feasibility, and more.

Papers:

Human Teleoperation Dissertation

Bilateral human teleoperation: a mixed reality system for remote ultrasound
David G. Black
PhD Dissertation

Human Teleoperation

Human Teleoperation - a Haptically-Enabled Mixed Reality System for Tele-Ultrasound
David G. Black, Yas O. Yazdi, Amir H. H. Hosseinabadi, Septimiu E. Salcudean
Human-Computer Interaction (2023)

Remote Ultrasound Control

Mixed Reality Human Teleoperation with Device-Agnostic Remote Ultrasound: Communication and User Interaction
David G. Black, Mika Nogami, Septimiu E. Salcudean
Computers & Graphics, Vol. 118 (2024)

Pilot Study in Haida Gwaii

Mixed Reality Tele-Ultrasound over 750 km: A Clinical Study
Ryan Yeung, David G. Black, Patrick Chen, Vickie Lessoway, Jan Reid, Sergio Rangel-Suarez, Silvia Chang, Septimiu E. Salcudean
IEEE International Conference on Telepresence, Leiden (2025)

Communication

Evaluation of Communication and Human Response Latency for (Human) Teleoperation
David G. Black, Dragan Andjelic, Septimiu E. Salcudean
IEEE Transactions on Medical Robotics & Bionics (2024)

Human vs. Robotic Teleoperation

Robotic versus Human Teleoperation for Remote Ultrasound
David G. Black, Septimiu E. Salcudean
Hamlyn Symposium for Medical Robotics (2025)

Novice Performance

Human-as-a-Robot Performance in Augmented Reality Tele-Ultrasound
David G. Black, Hamid Moradi, Septimiu E. Salcudean
International Journal of Computer Assisted Radiology & Surgery (2023)

Bilateral Human Teleoperation

Stability and Transparency in Mixed Reality Bilateral Human Teleoperation
David G. Black, Septimiu E. Salcudean
IEEE Transactions on Robotics (2025)

Model-Mediated Teleoperation

Visual-Haptic Model-Mediated Teleoperation for Ultrasound
David G. Black, Maria Tirindelli, Septimiu E. Salcudean, Wolfgang Wein, Marco Esposito
IEEE Conference on Intelligent Robots and Systems (IROS), Hangzhou (2025)

Linearity & Passivity of the Novice

Linearity, Time Invariance, and Passivity of a Novice Person in Human Teleoperation
David G. Black, Septimiu E. Salcudean
IEEE Transactions on Robotics (submitted, 2025)

Model Based Haptics

Measurement and Potential Field-Based Patient Modeling for Model-Mediated Tele-ultrasound
Ryan S. Yeung, David G. Black, Septimiu E. Salcudean
IEEE International Conference on Robotics and Automation (submitted, 2025)

Force Sensing

Low-Profile 6-Axis Differential Magnetic Force/Torque Sensing
David G. Black, Amir H. H. Hosseinabadi, Mika Nogami, Nicholas Rangga, Septimiu E. Salcudean
IEEE Transactions on Medical Robotics & Bionics (2024)

Force Sensing

Towards Differential Magnetic Force Sensing for Ultrasound Teleoperation
David G. Black, Amir H. H. Hosseinabadi, Mika Nogami, Maxime Pol, Nicholas Rangga, Septimiu E. Salcudean
IEEE World Haptics Conference, Delft (2023)

Probe Pose Tracking

Robust Object Pose Tracking for Augmented Reality Guidance and Teleoperation
David G. Black, Septimiu E. Salcudean
IEEE Transactions on Instrumentation & Measurement (2024)

Mixed Reality Interface

Mixed Reality Human Teleoperation
David G. Black, Septimiu E. Salcudean
IEEE VR (2023)

Awards:

Mitacs Outstanding Innovation Award

Human Teleoperation Mixed Reality system
Mitacs
2024
Prestigious award given to the most innovative of the thousands of projects funded by Mitacs in 2024.

Dean's Award for Best PhD Thesis

University of British Columbia
2025
Awarded for the best dissertation in engineering.

Best Student Paper Award

IEEE International Conference on Telepresence
2025
Given to the best paper presented by a graduate student.

Best Presentation Award

Hamlyn Symposium for Medical Robotics
2022
Awarded for the best paper and presentation at Imperial College London.

Technical Achievement Award

BC Medical Device Design Centre
2021
Awarded after presentations to judges at the BC MDDC.

Patents:

Media Coverage

Team

We are scientists and engineers with deep expertise in medical imaging, teleoperation, and mixed reality, and a shared vision to improve health equity.

Dr. David Black
Technical Lead
LinkedIn Link
Professor Tim Salcudean
Scientific Lead
LinkedIn Link
Dr. Qi Zeng
AI & Ultrasound Lead
LinkedIn Link
Valentina Toro Chinchilla
Business Lead
LinkedIn Link

Contact

Interested in a pilot, research partnership, or otherwise getting involved? Let's talk.