The potential of smartphone cameras in detecting risks of diabetes complications and cardiovascular disease

Over the years, the team at Google has focused on researching how to apply technology, especially artificial intelligence (AI) and other innovations, to improve access to equitable, high-quality healthcare globally.

Access to good healthcare can be a challenge depending on where people live and whether local health workers have specialized equipment or specific clinical training. Google Health has expanded its research and applications to help improve care, which could allow doctors to monitor patients' health remotely in cases where visits to a hospital or clinic are not possible.

Today, at the Google Health event, Google shared new research and innovations in artificial intelligence, and how Google is providing doctors with easy-to-use tools to help them better care for their patients. Here are some of the updates.

The potential of smartphone cameras in protecting cardiovascular health and maintaining vision

One of Google's first AI health projects was ARDA. This project aims to support the screening and detection of diabetic retinopathy, a complication of diabetes that can cause vision loss if it is not diagnosed and treated in time. ARDA screens 350 patients daily, and to date nearly 100,000 patients have been screened. Recently, Google completed a prospective study with a national screening program in Thailand. The research shows that ARDA produces accurate results and can be safely deployed across multiple regions to facilitate clinical eye exams.

In addition to diabetic retinopathy, Google has also previously shown, backed by extensive research, how fundus photographs can reveal cardiovascular risk factors such as elevated blood sugar and cholesterol levels. A recent Google study addressed the detection of diabetes-related conditions from photos of the outside of the eye, taken with the tabletop cameras found in clinics. Building on these promising results, Google is partnering with EyePACS and Chang Gung Memorial Hospital (CGMH) to study whether smartphone camera photos of the external eye can detect diabetes and non-diabetes-related diseases. Although this research is at an early stage, Google engineers and scientists envision a future where people, with the help of their doctors, can clearly understand and make decisions about their health remotely.

Record and listen to your heart sounds with your smartphone

Recently, Google shared how phone sensors combined with machine learning can personalize health metrics and provide daily health and fitness insights. This feature is now available on more than 100 Android device models as well as iOS devices. A Google manuscript describing a prospective validation study has also been accepted at Nature Communications Medicine.

Today, Google shared a new area of research that explores how a smartphone's microphone can capture a person's heart sounds when the phone is placed on the chest. Listening to heart and lung sounds with a stethoscope (known as auscultation) is an important part of the clinical examination, and it can help doctors detect heart valve disorders such as aortic stenosis, which needs to be caught early.

Screening for aortic stenosis typically requires specialized equipment, such as a stethoscope or an ultrasound machine, together with a physical exam.

Recent Google research explores whether a phone can detect heartbeats and heart murmurs. Google is still in the early stages of clinical study, but it hopes the research could one day let people use their phones as an additional, accessible tool for assessing their own health.
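The idea of treating a phone-microphone recording as an ordinary audio signal and looking for the rhythmic S1/S2 heart sounds can be illustrated with a toy signal-processing sketch. This is not Google's method or code; it is a minimal, hypothetical example (all function names are illustrative) that band-pass filters a recording to the rough 20-200 Hz band where heart sounds concentrate, then estimates the heart rate from peaks in the signal envelope.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def bandpass(signal, fs, low=20.0, high=200.0, order=4):
    """Keep roughly the 20-200 Hz band, where S1/S2 heart sounds concentrate."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

def estimate_heart_rate(signal, fs):
    """Estimate beats per minute from a chest-microphone recording."""
    filtered = bandpass(signal, fs)
    # Rectify, then smooth with a 50 ms moving average to form an envelope.
    envelope = np.abs(filtered)
    win = int(0.05 * fs)
    envelope = np.convolve(envelope, np.ones(win) / win, mode="same")
    # Assume beats are at least 0.4 s apart (i.e. under ~150 bpm).
    peaks, _ = find_peaks(envelope,
                          distance=int(0.4 * fs),
                          height=envelope.max() * 0.5)
    if len(peaks) < 2:
        return 0.0
    intervals = np.diff(peaks) / fs  # seconds between beats
    return 60.0 / np.mean(intervals)
```

A real system would of course go much further: detecting a murmur, as opposed to estimating a rate, would require machine-learning models trained on labeled clinical recordings, plus handling of noise, phone placement, and microphone variability.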

Partnering with Northwestern Medicine to apply artificial intelligence to improve the health of pregnant women around the world

Ultrasound is a diagnostic imaging method that uses high-frequency sound waves to create real-time images or video of internal organs and tissues, such as blood vessels or a fetus. Research shows that ultrasound is safe for use in prenatal care and effective at detecting problems early in pregnancy. However, more than half of birthing parents in low- to middle-income countries never receive an ultrasound, in part due to a shortage of expertise in reading the results. Google believes its machine learning expertise can help address this problem and allow mothers and babies to have healthier pregnancies.

Google will soon publish early research validating the use of artificial intelligence to help care providers conduct ultrasounds and perform assessments. Google is excited to work with Northwestern Medicine to develop and further test these models so that they generalize across different levels of experience and technology. With automated and accurate assessments of maternal and fetal health risks, Google hopes to lower barriers and help people get timely care in the right settings.

To learn more about the technology efforts in healthcare that Google shared at the Google Health event, you can check out the post from Chief Medical Officer Karen DeSalvo. And stay tuned for more health-related research achievements from Google.
