Meet Dr. Robot

Advanced robotics bring speed and accuracy to operating rooms

Longreads | Sep 23, 2022 | By Russell Nichols

This story is part of our September 2022 print issue.

In pathology, ROSE stands for rapid on-site evaluation, a common procedure where pathologists verify whether a biopsy sample contains adequate cells for processing and diagnosis. But the R in ROSE can be misleading.

The procedure itself might take only a few minutes. A doctor inserts a needle into the patient’s tumor, aspirates a sample and squirts the specimen onto a slide. Then a pathologist smears the sample with another glass slide, applies color dyes (or stains) and studies the cells under a microscope. But the tissue sample might not contain enough cells to evaluate anything.

“The problem is the short supply of pathologists who can perform ROSE,” says Dr. Alejandro S. Mendoza, founder and CEO of Davis-based medtech startup, AmCyt. “If inadequate samples are submitted, patients have to come back for a repeat procedure.”

These repeat visits waste precious time, Mendoza says, not only for patients, but also for pathologists like himself, who typically have to travel to a different site to perform this procedure again and again. But a few years ago, Mendoza had a thought: Why can’t robots do ROSE?

For decades, the question of whether robots will replace doctors has been looming. Silicon Valley investor Vinod Khosla believes this could happen by 2035. The usual argument against this prediction highlights the invaluable human trait of empathy. Still, as medical technology keeps advancing, one thing is clear: AI-controlled machines are as good as, if not better than, humans at certain medical tasks, such as detecting high-risk cancer lesions in mammograms or examining retinal images of patients with diabetes. Robots may not possess empathy, but they provide precision, consistency, speed and accuracy that’s hard for even the most highly trained health professionals to replicate.

The Capital Region has a storied history in this arena. Back in 1992 in Davis, Robodoc became the first robotic system cleared by the FDA to begin clinical trials for surgeries (see sidebar). Thirty years later, robotics — with the help of optical imaging tools and sensors — continues to evolve. This summer, UC Davis received a $6.3 million grant to support a new center coming to Aggie Square designed to bolster medtech startups and bring AI-informed optical imaging technologies to the medical mainstream.

Mendoza launched AmCyt in 2019 to develop a device called eROSE, which processes and captures live images of tissue samples automatically in two minutes or less. But he understands that, like with any medical device, proving its value isn’t a quick process.

“If you have a medical device, it’s always a challenge bringing the product to market,” Mendoza says. “It involves a lot of customer engagement and interviews. Who will buy this? Who will need this?”

Learning curve

With eROSE, a pathologist can receive the live scan for real-time assessment online without ever leaving the office. If the sample isn’t adequate, the patient can have it redone immediately instead of having to come back later to try again. This ROSE-in-a-box machine would save time and save hospitals about $400,000 a year, Mendoza estimates, by cutting down the sampling error in fine needle aspiration biopsies. AmCyt plans to have a commercial product ready for the public by 2025.

“I don’t think robotics will replace the work of doctors,” Mendoza says. “It enhances their work. In general, in medicine, I believe robotics is promising to improve workflow.”

Just because robots can work automatically doesn’t mean they work alone. They assist medical professionals in the field, making their jobs easier through data collection and assessment. Optical instruments, for instance, allow robots to touch, sense and see things that help surgeons make more informed decisions, according to Laura Marcu, professor of biomedical engineering and neurological surgery at UC Davis.

Here’s an example: Say a neurosurgeon wants to remove a tumor from the brain. Ideally, she would want to remove the entire tumor to prevent malignant cells from growing again. But during the operation, she can’t truly know the exact boundary. There is no line that shows where the tumor starts and where it ends. If she removes too much, the surgery might impact brain function, such as affecting the patient’s ability to speak. In this case, Marcu says, optical instruments used in robotics could help surgeons know how aggressive to be.


At UC Davis, Marcu’s graduate students and postdoctoral researchers build these types of optical devices in laboratories. Initially, this is done using tissue phantoms or samples in well-controlled static situations. There is no body on an operating table. No distracting blood. Nothing moves. But the technology developed in Marcu’s laboratory also makes its way into real-life situations with patients in the operating room, where the students can see these devices in action, giving them a unique experience.

“It cannot be described in a book or in a paper,” Marcu says. “You really have to see it with your own eyes.”

Surgeons plan for operations and robots can be programmed for various tasks. But the fact that conditions vary from patient to patient means there’s a learning curve for both the human and the machine. Adjustments need to be made. Factors such as age need to be considered. Surgeons already know how to adapt, Marcu says, so the technology needs to be adaptable to work in dynamic environments.

“A surgeon has encountered many situations and uses that experience,” Marcu says. “It’s the same with the instrument. You have to use it in many patients to account for many variables.”

Extra eyes

Variables also play a role on the financing side. In order to have a big impact, technology needs to go through a commercialization pathway, which means first cloning the devices, conducting multicenter trials and getting them approved by the FDA, an arduous process that takes time and requires funding. In addition to non-dilutive funding (financing without the business giving up equity), private investment supports the creation of a “technological paradigm,” Marcu says, that will make medtech devices more affordable and easily deployable. 

The COVID-19 pandemic highlighted even more opportunities in medtech, with investment themes shifting from a focus on products to services, according to global management consulting firm Bain & Company. Its 2022 report showed that investors completed 96 medtech deals in 2021, almost doubling the 2020 total of 55 and surpassing the previous high of 60 in 2018.

Medtech looks appealing to investors for several reasons. Typically, medtech startups have strong IP potential, often backed by university connections and non-dilutive funding, says John Peters, board chairman and president for the Sacramento Angels, a group of individuals who invest in early-stage emerging technology companies in Northern California. Also, medtech is rich with potential for diverse applications: surgical, wearables, diagnostics, business processes, therapy-related tech and more.

The catastrophe that was Theranos hasn’t deterred investors, either. The scandal-plagued blood-testing startup was seen as a wake-up call, Peters says; a reminder for angels to do their due diligence investigations. The Sacramento Angels favor putting their money behind companies where the heavy lifting of research and development has mostly been done. They avoid companies with long, multiyear timelines toward regulatory approval. The further a company is from approval, Peters says, the greater the risk of future delays and additional capital raises, which dilute the investment of earlier investors.

Risk management is crucial for angels, he adds. Members build diverse portfolios with 20 to 40 companies in various industries and at different stages. They also stay connected to more than 20 other angel groups and swap notes for support to “avoid the Theranos-type of experience,” he says.

“It helps to have peer groups,” Peters says, “to get more eyes on what you’re looking at when you’re investing.”

‘The operating room of the future’

Before broad adoption can happen, medical technologies and health practitioners need training. This is a key component of the National Center for Interventional Biophotonic Technologies planned for Aggie Square. The P41 grant from the National Institutes of Health’s National Institute of Biomedical Imaging and Bioengineering was awarded in June. The next phase involves creating the infrastructure, says Marcu, who is also the founding director of NCIBT.

The new center will focus on research and development as well as education and training for two optical imaging technologies: interventional fluorescence lifetime imaging and interferometric diffuse optical spectroscopy. Both noninvasive approaches measure how light interacts with tissues, which helps surgeons distinguish healthy from altered cells. This data will then be integrated into deep-learning AI platforms to relay information on the tissue’s structure, composition, blood flow and metabolism.

Currently, physicians use CT scans, X-rays or MRIs prior to an operation to gather information and plan the procedure. These new technologies will provide real-time guidance during medical and surgical procedures. The premise is simple: If a clinician has access to easily readable, imaging information during an operation or patient monitoring, she will be able to make better decisions that improve outcomes. The data could be projected on a wall for the surgeon to see or visible through AR (augmented reality) goggles. The exact delivery method is yet to be determined, Marcu says, but surgeons of tomorrow will have such high-tech tools at their fingertips.

“This is what the operating room of the future looks like,” Marcu says. “It’s a mix of technology, which is easily interfaceable, miniaturized, combined with a software platform that enables real-time processing of information and display in a readable, easily visualized format.”

