Machine Learning vs. COVID-19
- BY ILENE LELCHUK
- June 8, 2021
In the spring of 2020, Cal State East Bay undergraduate Emmanuel Gallegos was walking through Costco in Livermore and aiming his smartphone at random shoppers while, as he put it, trying not to look creepy.
“I definitely was approached by security guards,” he said with a laugh.
But it was all in the name of science.
Gallegos, who has since graduated, was part of the Cal State East Bay computer science team that created an ambitious smartphone application called Covid ID.
Covid ID is what that team calls a “health situation awareness” app. Using computer vision and machine learning, Covid ID detects four critical coronavirus risk factors in public spaces: fever indicators, mask usage, social distancing and crowd density.
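To give a concrete, purely illustrative sense of how four separate detections could roll up into a single reading, here is a minimal Python sketch. The class name, weights and thresholds below are assumptions for illustration, not drawn from the team’s code.

```python
from dataclasses import dataclass

@dataclass
class FrameRisk:
    """Hypothetical per-frame signals of the kind Covid ID gathers."""
    fever_detections: int      # people whose skin temperature reads above a fever threshold
    uncovered_faces: int       # people with no mask, or a mask worn incorrectly
    distance_violations: int   # pairs of people closer than the distancing limit
    people_in_frame: int       # crowd-density proxy

def risk_score(frame: FrameRisk) -> float:
    """Fold the four signals into a rough 0-to-1 score (weights are illustrative)."""
    if frame.people_in_frame == 0:
        return 0.0
    fever = min(frame.fever_detections / frame.people_in_frame, 1.0)
    masks = min(frame.uncovered_faces / frame.people_in_frame, 1.0)
    spacing = min(frame.distance_violations / frame.people_in_frame, 1.0)
    density = min(frame.people_in_frame / 20, 1.0)   # assume 20+ people means maximum density
    return 0.4 * fever + 0.25 * masks + 0.2 * spacing + 0.15 * density

# A busy checkout area: one possible fever, three uncovered faces, two close pairs
print(risk_score(FrameRisk(1, 3, 2, 12)))   # roughly 0.22
```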
The app helps users assess infection risks so they can make safer choices – such as deciding which market checkout line to wait in or gauging how “safe” a crowd is at a party.
With broad use, Covid ID also can consolidate crowdsourced data to create live maps. Users could see if high crowd density is detected at their favorite park or if high body temperatures are detected at the corner market.
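How might scattered readings become a live map? One common pattern is to snap each report to a coarse map cell and keep the busiest, riskiest reading seen there. The sketch below is hypothetical; the report fields and grid size are assumptions, not the app’s actual backend.

```python
from collections import defaultdict

def map_cell(lat: float, lon: float, size: float = 0.001) -> tuple:
    """Snap a coordinate onto a coarse grid cell (roughly 100 m at this size)."""
    return (round(lat / size), round(lon / size))

def build_live_map(reports: list[dict]) -> dict:
    """Pool crowdsourced reports so each cell summarizes what has been seen there."""
    cells = defaultdict(lambda: {"reports": 0, "max_people": 0, "fever_seen": False})
    for r in reports:
        cell = cells[map_cell(r["lat"], r["lon"])]
        cell["reports"] += 1
        cell["max_people"] = max(cell["max_people"], r["people_in_frame"])
        cell["fever_seen"] = cell["fever_seen"] or r["fever_detections"] > 0
    return dict(cells)

# Two reports from roughly the same corner market land in one cell
sample = [
    {"lat": 37.6819, "lon": -121.7680, "people_in_frame": 14, "fever_detections": 0},
    {"lat": 37.6820, "lon": -121.7681, "people_in_frame": 22, "fever_detections": 1},
]
print(build_live_map(sample))
```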
In short, Covid ID helps users avoid potential close encounters with the virus.
“I think this could be a great benefit for public health,” Ryan Gamba, assistant professor of health sciences, said about the program. “I don’t think anything is zero risk, but everyone could benefit from knowing how much risk they are subjecting themselves to and make decisions based on their comfort level.”
Gallegos and 15 other graduate and undergrad students embarked on the project in May 2020 under the guidance of computer science professor Lynne Grewe, a computer vision and assistive technology specialist.
“Covid ID was our response to a very challenging time,” Grewe said. “I thought this could be a way for students to feel like they were doing something to help their communities while also engaging in their field.”
A year later, the prototype app for Android phones has earned attention and accolades. One participating grad student, Shivali Choudhary, was named a finalist for the National Center for Women & Information Technology Collegiate Award. For other students, participating in the project led to prestigious grad school acceptances, soon-to-be-published papers and job offers. Gallegos, for example, has been invited into master’s programs at Carnegie Mellon and Cornell universities, the University of Southern California and the University of Illinois, among others.
“This research project definitely played a huge part in my applications being accepted,” Gallegos said.
So, how exactly did they create Covid ID? With cutting-edge computer vision, deep machine learning technology, and a lot of patience.
The Infrared Fever Indicator System (IRFIS) turned out to be the most complex and challenging function of Covid ID.
The IRFIS team – including Choudhary, Gallegos and grad student Dikshant Pravin Jain – started with a basic question: Could they use the coronavirus’s most common symptom against it?
Their goal: Create an app for a smartphone equipped with a small, commercially available infrared camera to detect feverish skin temperatures within 10-20 yards.
The program needed not only to distinguish people from the background within the camera frame, but also to identify heads and then capture the highest skin temperature in a particular region – around the cheeks, eyes or forehead, or the ear if a face is turned sideways.
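As a rough illustration of that last step – not the team’s actual code – here is how a detected head box and a thermal frame could yield a peak skin temperature. The 38.0 C fever cutoff and the array and box formats are assumptions.

```python
import numpy as np

FEVER_THRESHOLD_C = 38.0   # assumed cutoff, for illustration only

def peak_temperature(thermal_frame: np.ndarray, head_box: tuple) -> float:
    """thermal_frame: 2-D array of per-pixel temperatures in Celsius.
    head_box: (x_min, y_min, x_max, y_max) region from the head detector."""
    x0, y0, x1, y1 = head_box
    return float(thermal_frame[y0:y1, x0:x1].max())

def is_fever_candidate(thermal_frame: np.ndarray, head_box: tuple) -> bool:
    return peak_temperature(thermal_frame, head_box) >= FEVER_THRESHOLD_C

# Synthetic 120x160 thermal frame with one warm, forehead-sized patch
frame = np.full((120, 160), 33.5)
frame[40:60, 70:90] = 38.4
print(is_fever_candidate(frame, (60, 30, 100, 70)))   # True
```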
Using smartphones with FLIR One infrared cameras attached, team members set out to capture 1,000 images at parks and stores, including Gallegos’ Costco in Livermore.
“Then you have to go in with a program that’s similar to Microsoft Paint and draw little rectangles (bounding boxes) around every single head to teach the artificial intelligence model what a head is,” Gallegos said. “It’s a tedious process. We also had to decide what ‘kind’ of heads we wanted the model to identify. What if they are too far away? What if they are turned sideways? What if they are wearing glasses?”
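For readers curious what those hand-drawn rectangles turn into, the sketch below shows one widely used label format for object detectors; it is a generic example, not necessarily the exact format the team used.

```python
def to_label_line(class_id: int, box: tuple, img_w: int, img_h: int) -> str:
    """Turn a pixel-space (x_min, y_min, x_max, y_max) head box into the normalized
    'class x_center y_center width height' line many detectors train on (YOLO style)."""
    x0, y0, x1, y1 = box
    x_center = (x0 + x1) / 2 / img_w
    y_center = (y0 + y1) / 2 / img_h
    width = (x1 - x0) / img_w
    height = (y1 - y0) / img_h
    return f"{class_id} {x_center:.6f} {y_center:.6f} {width:.6f} {height:.6f}"

# One head drawn on a 640x480 photo; class 0 = "head"
print(to_label_line(0, (250, 80, 330, 180), 640, 480))
# -> "0 0.453125 0.270833 0.125000 0.208333"
```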
Grewe explains it this way: “It’s a lot like how you teach a child to read. We are showing it pictures and teaching it to only see specific things.”
In the end, they surprised themselves with their success – a 95.6 percent accuracy rate for identifying heads.
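A figure like that is typically checked by comparing each predicted box against the hand-labeled one and counting a prediction as correct when the overlap (intersection over union) is high enough. The snippet below is a generic sketch of that check, not the team’s evaluation script.

```python
def iou(a: tuple, b: tuple) -> float:
    """Intersection over union of two (x_min, y_min, x_max, y_max) boxes."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    inter_w = max(0, min(ax1, bx1) - max(ax0, bx0))
    inter_h = max(0, min(ay1, by1) - max(ay0, by0))
    inter = inter_w * inter_h
    union = (ax1 - ax0) * (ay1 - ay0) + (bx1 - bx0) * (by1 - by0) - inter
    return inter / union if union else 0.0

# A predicted head box shifted a few pixels from the labeled one still counts as a hit
print(iou((60, 30, 100, 70), (65, 35, 105, 75)) >= 0.5)   # True (IoU is about 0.62)
```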
Choudhary said another major IRFIS challenge was finding a high-resolution, highly accurate infrared camera accessory with a consumer-friendly price. Unfortunately, the higher the resolution, the higher the cost. The team eventually settled on a $300 camera with less-than-ideal resolution.
“While working on the project, one thing I learned was that failure is not a failure if you learn from it,” said Choudhary, who completes her master’s program in May. “Learning from your failures contributes towards your successes.”
Programming the mask detection module presented similar challenges: that team also had to collect more than 1,000 images.
“We created a data set full of images of people not wearing masks, wearing masks and wearing masks incorrectly to train the machine,” explained Maithri Chullakani House, a recent master’s program graduate who now works as a software engineer in Silicon Valley.
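As an illustration of how such a dataset is often organized – the folder names below are assumptions, not the team’s layout – images grouped by label can be turned into (path, class) training pairs like this:

```python
from pathlib import Path

CLASSES = ["no_mask", "mask_correct", "mask_incorrect"]   # assumed label names

def gather_examples(root: str) -> list[tuple[str, int]]:
    """Expects root/<class_name>/*.jpg and returns (image_path, class_index) pairs
    ready to feed to whatever classifier is being trained."""
    examples = []
    for class_index, class_name in enumerate(CLASSES):
        for image_path in sorted(Path(root, class_name).glob("*.jpg")):
            examples.append((str(image_path), class_index))
    return examples

# e.g. gather_examples("mask_dataset")
# -> [("mask_dataset/no_mask/0001.jpg", 0), ("mask_dataset/mask_correct/0001.jpg", 1), ...]
```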
The Covid ID project felt deeply personal to all the students. House’s parents in India fell ill with COVID-19, and her mother was hospitalized for a few days.
“I think if everyone can use this app, it could definitely make an impact,” House said.
Because of funding constraints, however, Cal State East Bay’s iLab is not currently developing the project more broadly.
“We would love to deploy this, but it’s not free to do it,” Grewe said, explaining that backend systems such as cloud data storage are costly.
Instead, the team posted their open source code on GitHub, a public code-sharing and collaboration platform.
“Even if we can’t afford to fund further development, I still feel extremely proud of the students,” Grewe said. “They did something that felt very timely and vital. It was a great motivator.”
She said she is also proud of the many “micro-outcomes” for her students: Gallegos and undergrad Phillip Aguilera from CSU Dominguez Hills were invited to present at the Great Minds in STEM conference. The experience also spurred Gallegos to attend graduate school next year to delve deeper into how large-scale machine learning and AI programs can be used to combat climate change and develop smart cities. Choudhary was named a finalist for the prestigious National Center for Women & Information Technology Collegiate Award. And House said this experience helped her showcase her skills and secure a post-graduate job at Intuit.
“I really want to instill in my students that this work is about ongoing research and experimentation,” Grewe said. “You don't develop the perfect system the first time; you develop ‘a’ system. And then you keep refining it or take what you've learned and use it somewhere else.”
The Covid ID team also included Cal State East Bay graduate students Subhangi Asati, Divya Gupta, Cemil Kes, Buhmit Patel, Kunjkumar Patel, Dikshant Pravin Jain and Manasi Rajiv Weginwar; Cal State East Bay undergraduate Jamie Nguyen; Santa Clara University undergrad Allen Shahshahani; and high school student Jake Shahshahani. National Science Foundation grants supported two students in the program.
To learn more about the Covid ID app, see the team’s open source code on GitHub.