This tiny device lets you see your heart while exercising

It's powered by ultrasound and AI, and it's the size of a postage stamp.

The subtle signs of heart disease rarely reveal themselves when we are relaxed. Yet current heart imaging technology requires patients to lie calmly on a vinyl-covered bed while a technician holds a cold, lube-coated wand steady over the heart.

For years, there has been a growing demand for a device that measures heart performance while the patient is in motion. A team of scientists at UC San Diego has now satisfied this demand, with the first wearable device that can provide accurate and continuous measurements of cardiac performance, regardless of whether the patient is static or exercising.

The three metrics of the heart: The root of most cardiovascular diseases is the heart not pumping enough blood. So, when searching for signs of heart failure, physicians look at three blood-pumping factors: stroke volume (the volume of blood the heart pumps out each beat), cardiac output (the total volume of blood the heart pumps out each minute), and ejection fraction (the percentage of blood pumped out of the main pumping chamber every beat).

Stroke volume and cardiac output provide insight into how much blood the heart is pumping, and ejection fraction tells physicians how hard the heart is working.
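To make the relationships between the three metrics concrete, here is a minimal sketch using textbook definitions and hypothetical example values (the function name and numbers are illustrative, not from the study):

```python
def cardiac_metrics(end_diastolic_ml, end_systolic_ml, heart_rate_bpm):
    """Compute the three blood-pumping metrics from chamber volumes.

    end_diastolic_ml: left-ventricle volume when fullest (mL)
    end_systolic_ml: left-ventricle volume when emptiest (mL)
    """
    stroke_volume = end_diastolic_ml - end_systolic_ml          # mL per beat
    cardiac_output = stroke_volume * heart_rate_bpm / 1000      # L per minute
    ejection_fraction = 100 * stroke_volume / end_diastolic_ml  # percent
    return stroke_volume, cardiac_output, ejection_fraction

# Hypothetical resting values for a healthy adult:
sv, co, ef = cardiac_metrics(end_diastolic_ml=120, end_systolic_ml=50, heart_rate_bpm=70)
print(sv, co, ef)  # 70 mL/beat, 4.9 L/min, ~58%
```

Note how cardiac output is just stroke volume scaled by heart rate, which is why physicians need all three numbers: a heart can hold output steady by beating faster even as each beat grows weaker.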

We need to be able to see the heart before, during, and after exercise.

These three metrics reveal a lot about a heart’s performance capabilities. However, they can only be directly measured by slicing open a patient’s chest cavity and fiddling around with their heart. To avoid this messy affair, technicians use ultrasound to visualize the heart’s structural changes as it beats.

Although this doesn’t lead to particularly thrilling cinematography, these highly skilled videographers can use the grainy, black-and-white videos to calculate the heart’s performance metrics. However, this system is far from perfect.

The problem: First, people need to breathe, and this is the bane of the echocardiogram technician. With every breath a patient takes, their chest expands and contracts, and a technician must scramble to keep the ultrasonic wand focused on the patient’s heart. Every moment the heart is not in focus is an opportunity to miss a sign of heart disease.

So, the patient must remain as still as possible.

However, this creates another problem, because this doesn’t reflect the heart’s behavior in the real world. The heart constantly fluctuates between states of rest, work, and recovery as it responds to our body’s changing needs. Early signs of heart disease are more likely to present themselves as the heart transitions from one state to another. In other words, to fully assess a heart’s health, you need to be able to collect information on cardiac activities before, during, and after exercise.

The algorithm analyzes every frame of the video and automatically isolates images of the heart.

The UC San Diego team decided to address both of these problems with one device. They wanted a tool that didn’t need to be stabilized by a technician and could be used across different physical states, including at rest and after exercise, something that had never been achieved before.

Ultrasonic AI: The team had a solid starting point going into this project. In 2018, they developed an ultrasonic device that conformed to the skin and could capture the blood pressure of deeply embedded arteries and veins. 

They began exploring the possibility of modifying this technology to create a wearable heart monitoring system that uses ultrasound to capture images of the heart continuously.

The postage stamp-sized device is soft, stretchable, and adheres well to human skin. Credit: Hu, H. et al. 2023

Their device was essentially a miniature version of the technician’s clunky, ultrasonic wand. It sends and receives ultrasound waves, which it uses to generate a constant stream of images of the heart’s structure in real time. It is soft and stretchable, adheres well to human skin, and it’s the size of a postage stamp, making it ideal for bodies in motion. 

Of course, a technician today could hold their wand to a patient’s chest for hours while the patient exercised and rested, but that would result in worthless, indecipherable videos. Well, worthless and indecipherable to a human, at least. 

So, in addition to this wearable heart monitor, the team developed an algorithm to facilitate continuous, AI-assisted automatic processing of the data transmitted by the patch. The algorithm analyzes every frame of the video and automatically isolates images of the heart. It uses those images to calculate the heart’s stroke volume, cardiac output, and ejection fraction — without human intervention.
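The downstream arithmetic of that pipeline can be sketched in a few lines. This is a hypothetical simplification: the team’s actual system isolates the heart in every ultrasound frame with a neural network, a step omitted here, and the function and variable names are illustrative:

```python
def metrics_from_volume_trace(volume_trace_ml, heart_rate_bpm):
    """Turn a per-frame left-ventricle volume trace (one heartbeat)
    into the three cardiac metrics, after the AI has already
    segmented the chamber in each frame."""
    edv = max(volume_trace_ml)  # end-diastolic (fullest) volume, mL
    esv = min(volume_trace_ml)  # end-systolic (emptiest) volume, mL
    stroke_volume = edv - esv                               # mL per beat
    cardiac_output = stroke_volume * heart_rate_bpm / 1000  # L per minute
    ejection_fraction = 100 * stroke_volume / edv           # percent
    return stroke_volume, cardiac_output, ejection_fraction

# Synthetic one-beat trace: the chamber empties from 120 mL to 50 mL, then refills.
trace = [120, 110, 90, 70, 50, 60, 85, 105, 120]
print(metrics_from_volume_trace(trace, heart_rate_bpm=60))
```

Because the patch streams frames continuously, a real system would repeat this per beat, producing a running record of cardiac performance rather than the single snapshot a bedside exam provides.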

The researchers believe the implications of this technology go far beyond imaging the heart, as it can be generalized to image other important deep tissues, such as the inferior vena cava, abdominal aorta, spine, and liver. The team plans to commercialize this technology through Softsonics, a company spun off from UC San Diego.

We’d love to hear from you! If you have a comment about this article or if you have a tip for a future Freethink story, please email us at tips@freethink.com.
