Phone camera and flash can measure blood oxygen
Researchers have shown that smartphones can detect blood oxygen saturation levels down to the same lower limit as dedicated pulse oximeters.
First, pause and take a deep breath.
When we breathe in, our lungs fill with oxygen, which is distributed to our red blood cells for transportation throughout our bodies. Our bodies need a lot of oxygen to function, and healthy people have at least 95% oxygen saturation all the time.
Conditions like asthma or Covid-19 make it harder for bodies to absorb oxygen from the lungs. This leads to oxygen saturation percentages that drop to 90% or below, an indication that medical attention is needed.
In a clinic, doctors monitor oxygen saturation using pulse oximeters — those clips you put over your fingertip or ear. But monitoring oxygen saturation at home multiple times a day could help patients keep an eye on Covid symptoms, for example.
In a proof-of-principle study, University of Washington and University of California San Diego researchers have shown that smartphones are capable of detecting blood oxygen saturation levels down to 70%. This is the lowest value that pulse oximeters should be able to measure, as recommended by the U.S. Food and Drug Administration.
The technique involves participants placing their finger over the camera and flash of a smartphone, which uses a deep-learning algorithm to decipher the blood oxygen levels. When the team delivered a controlled mixture of nitrogen and oxygen to six subjects to artificially bring their blood oxygen levels down, the smartphone correctly predicted whether the subject had low blood oxygen levels 80% of the time.
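To make that 80% figure concrete, here is a rough, hypothetical illustration (not the team's code): smartphone-predicted and reference oxygen saturation readings are each thresholded at the 90% "low oxygen" level mentioned earlier, and the rate of agreement is reported.

```python
import numpy as np

# Hypothetical illustration, not the study's code: score how often a
# smartphone-predicted SpO2 reading and a pulse-oximeter reference agree
# on the "low oxygen" call, using the 90% concern level mentioned above.
def low_oxygen_agreement(predicted_spo2, reference_spo2, threshold=90.0):
    predicted_low = np.asarray(predicted_spo2) <= threshold
    reference_low = np.asarray(reference_spo2) <= threshold
    return float(np.mean(predicted_low == reference_low))

# Toy numbers for illustration only (not data from the paper):
print(low_oxygen_agreement([97, 88, 92, 85, 95], [96, 91, 93, 89, 94]))  # 0.8
```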
The team published these results September 19 in npj Digital Medicine.
“Other smartphone apps that do this were developed by asking people to hold their breath,” said co-lead author Jason Hoffman, a UW doctoral student in the Paul G. Allen School of Computer Science and Engineering. “But people get very uncomfortable and have to breathe after a minute or so, and that’s before their blood-oxygen levels have gone down far enough to represent the full range of clinically relevant data. With our test, we’re able to gather 15 minutes of data from each subject. Our data shows that smartphones could work well right in the critical threshold range.”
Another benefit of measuring blood oxygen levels on a smartphone is that almost everyone has one.
Co-author Dr Matthew Thompson, professor of family medicine in the UW School of Medicine, said: “This way you could have multiple measurements with your own device at either no cost or low cost. In an ideal world, this information could be seamlessly transmitted to a doctor’s office. This would be really beneficial for telemedicine appointments or for triage nurses to be able to quickly determine whether patients need to go to the emergency department or if they can continue to rest at home and make an appointment with their primary care provider later.”
The team recruited six participants ranging in age from 20 to 34. Three identified as female, three identified as male. One participant identified as being African American, while the rest identified as being Caucasian.
To gather data to train and test the algorithm, the researchers had each participant wear a standard pulse oximeter on one finger and then place another finger on the same hand over a smartphone’s camera and flash. Each participant had this same setup on both hands simultaneously.
“The camera is recording a video: every time your heart beats, fresh blood flows through the part illuminated by the flash,” said senior author Edward Wang, who started this project as a UW doctoral student studying electrical and computer engineering and is now an assistant professor at UC San Diego’s Design Lab and the Department of Electrical and Computer Engineering.
“The camera records how much that blood absorbs the light from the flash in each of the three colour channels it measures: red, green and blue,” said Wang, who also directs the UC San Diego DigiHealth Lab. “Then we can feed those intensity measurements into our deep-learning model.”
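As a rough sketch of the signal Wang describes (not the study's actual pipeline), the snippet below assumes OpenCV is available and extracts the per-frame average red, green and blue intensities from a fingertip video; a time series like this is the kind of input a deep-learning model could then consume.

```python
import cv2  # OpenCV, assumed here for video decoding
import numpy as np

def rgb_intensity_series(video_path):
    """Return per-frame mean red, green and blue intensities from a
    fingertip video, i.e. the kind of signal described above."""
    capture = cv2.VideoCapture(video_path)
    series = []
    while True:
        ok, frame = capture.read()  # frame is a BGR image, shape (H, W, 3)
        if not ok:
            break
        blue, green, red = frame.mean(axis=(0, 1))  # average over all pixels
        series.append((red, green, blue))
    capture.release()
    return np.array(series)  # shape (num_frames, 3)
```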
Each participant breathed in a controlled mixture of oxygen and nitrogen to slowly reduce oxygen levels. The process took about 15 minutes. For all six participants, the team acquired more than 10,000 blood oxygen level readings between 61% and 100%.
The researchers used data from four of the participants to train a deep learning algorithm to pull out the blood oxygen levels. The remainder of the data was used to validate the method and then test it to see how well it performed on new subjects.
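The important detail is that the split is done by participant rather than by individual reading, so the model is always evaluated on people it never saw during training. The snippet below is a hypothetical sketch of such a subject-level split; the IDs and sizes are illustrative, not the study's.

```python
import numpy as np

# Hypothetical sketch of a subject-level split (IDs and sizes are
# illustrative): readings are grouped by participant so that validation
# and testing always use people the model never saw during training.
subject_ids = np.array([1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6])  # one ID per reading
train_subjects = [1, 2, 3, 4]   # data used to fit the deep-learning model
held_out_subjects = [5, 6]      # data used to validate and then test it

train_mask = np.isin(subject_ids, train_subjects)
held_out_mask = np.isin(subject_ids, held_out_subjects)
# features[train_mask] / labels[train_mask] would train the model;
# features[held_out_mask] / labels[held_out_mask] would estimate how it
# performs on new subjects.
```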
“Smartphone light can get scattered by all these other components in your finger, which means there’s a lot of noise in the data that we’re looking at,” said co-lead author Varun Viswanath, a UW alumnus who is now a doctoral student advised by Wang at UC San Diego. “Deep learning is a really helpful technique here because it can see these really complex and nuanced features and helps you find patterns that you wouldn’t otherwise be able to see.”
The team hopes to continue this research by testing the algorithm on more people.
“One of our subjects had thick calluses on their fingers, which made it harder for our algorithm to accurately determine their blood oxygen levels,” said Hoffman. “If we were to expand this study to more subjects, we would likely see more people with calluses and more people with different skin tones. Then we could potentially have an algorithm with enough complexity to be able to better model all these differences.”
But, the researchers said, this is a good first step toward developing biomedical devices that are aided by machine learning.
“It’s so important to do a study like this,” said Wang. “Traditional medical devices go through rigorous testing. But computer science research is still just starting to dig its teeth into using machine learning for biomedical device development and we’re all still learning. By forcing ourselves to be rigorous, we’re forcing ourselves to learn how to do things right.”
Additional co-authors are Xinyi Ding, a doctoral student at Southern Methodist University; Eric Larson, associate professor of computer science at Southern Methodist University; Caiwei Tian, who completed this research as a UW undergraduate student; and Shwetak Patel, UW professor in both the Allen School and the electrical and computer engineering department. This research was funded by the University of Washington.