UCF Researcher Receives DOE Funding to Advance Human Understanding of AI Reasoning
A University of Central Florida researcher has received funding from the U.S. Department of Energy (DOE) to enhance the current understanding of artificial intelligence (AI) reasoning.
The project focuses on developing algorithms to create robust multi-modal explanations for foundation, or large, AI models through the exploration of several novel explainable AI methods. The DOE recently awarded $400,000 to fund the project.
The project was one of 22 proposals selected for the DOE’s 2022 Exploratory Research for Extreme-Scale Science (EXPRESS) program, which funds innovative, high-impact ideas for advancing scientific discovery.
Unlike task-specific models, foundation models are trained with a large set of data and can be applied to different tasks.
These models are more efficient than humans at many challenging tasks and are already used in real-world applications such as autonomous vehicles and scientific research. However, few methods exist for explaining AI decisions to humans, which hinders the wide adoption of AI in fields that ultimately require human trust, such as science.
Algorithms that provide meaningful explanations for a model’s decision-making would allow AI systems to be deployed with higher levels of human trust and understanding, the researchers say.
Rickard Ewetz, lead researcher of the project and an associate professor in UCF’s Department of Electrical and Computer Engineering, says AI models need to be transparent in order to be trusted by humans.
“It’s not just a black box that takes an input and gives an output. You need to be able to explain how the neural network reasons,” Ewetz says.
Rather than examining model gradients, which have been the emphasis of most explainable AI efforts over the past decade, the project seeks meaningful explanations of AI models through innovations such as symbolic reasoning, which describes a model’s reasoning with trees, graphs, automata and equations.
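To illustrate the gradient-based style of explanation the project aims to move beyond, the sketch below computes input saliency (the gradient of a model’s output with respect to its input) for a toy two-layer network. The network, its weights, and the input are invented for illustration only; they are not part of the UCF project.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

# Toy two-layer network with fixed, illustrative weights.
W1 = np.array([[1.0, -2.0],
               [0.5,  1.0]])
W2 = np.array([2.0, -1.0])

def forward(x):
    """Scalar prediction of the toy network."""
    return W2 @ relu(W1 @ x)

def input_gradient(x):
    """Gradient of the output w.r.t. the input: a per-feature
    'saliency' score indicating how each input feature influences
    this particular prediction."""
    mask = (W1 @ x > 0).astype(float)  # ReLU derivative per hidden unit
    return W1.T @ (W2 * mask)

x = np.array([1.0, 1.0])
print(forward(x))         # model output for this input
print(input_gradient(x))  # saliency of each input feature
```

A saliency vector like this says only which input features mattered for one prediction; it does not describe the model’s reasoning in human terms, which is the gap symbolic explanations aim to fill.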
The researchers aim not only to provide needed explanations for a model’s decision-making but also to estimate the model’s explanation accuracy and knowledge limits.
Sumit Jha, co-researcher of the project and a computer science professor at the University of Texas at San Antonio, says that explainable AI is especially necessary with the rapid deployment of AI models.
“In general, AI will not tell you why it made a mistake or provide explanations for what it is doing,” Jha says. “People are accepting AI with a sort of blind trust that it is going to work. This is very worrying because eventually there will be good AI and bad AI.”
Ewetz received his doctorate in electrical and computer engineering from Purdue University and joined UCF’s College of Engineering and Computer Science in 2016. His primary research focuses include AI and machine learning, emerging computing paradigms and future computing systems, and computer-aided design for very large-scale integration.