
UCF Researcher Receives DOE Funding to Advance Human Understanding of AI Reasoning

By: Office of Research

A University of Central Florida researcher has received funding from the U.S. Department of Energy (DOE) to enhance the current understanding of artificial intelligence (AI) reasoning.

The project focuses on developing algorithms to create robust multi-modal explanations for foundation, or large, AI models through the exploration of several novel explainable AI methods. The DOE recently awarded $400,000 to fund the project.

The project was one of 22 proposals selected for the DOE’s 2022 Exploratory Research for Extreme-Scale Science (EXPRESS) grant, which promotes the study of innovative, high-impact ideas for advancing scientific discovery.

Unlike task-specific models, foundation models are trained on broad sets of data and can be applied to many different tasks.

These models are more efficient than humans at many challenging tasks and are being used in real-world applications such as autonomous vehicles and scientific research. However, few methods exist for explaining AI decisions to humans, which blocks the wide adoption of AI in fields that ultimately require human trust, such as science.

Algorithms that provide meaningful explanations of a model’s decision-making would allow AI systems to be deployed with higher levels of human trust and understanding, the researchers say.

Rickard Ewetz, lead researcher of the project and an associate professor in UCF’s Department of Electrical and Computer Engineering, says AI models need to be transparent in order to be trusted by humans.

“It’s not just a black box that takes an input and gives an output. You need to be able to explain how the neural network reasons,” Ewetz says.

Many explainable AI efforts over the past decade have centered on examining model gradients. This project instead focuses on providing meaningful explanations of AI models through innovations such as symbolic reasoning, which describes a model’s reasoning with trees, graphs, automata and equations.
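For context, the gradient-based approach the project departs from works roughly as follows: the model’s output is backpropagated to the input, and each input feature is scored by the magnitude of its gradient, producing a “saliency map.” The sketch below is a minimal illustration of that idea only; the model, input and shapes are hypothetical placeholders, not the project’s actual code.

```python
# Minimal sketch of a gradient-based saliency explanation: score each input
# pixel by how strongly the model's predicted class responds to it.
# The model and input here are illustrative stand-ins, not the project's code.
import torch
import torch.nn as nn

model = nn.Sequential(  # stand-in for any differentiable image classifier
    nn.Flatten(), nn.Linear(28 * 28, 64), nn.ReLU(), nn.Linear(64, 10)
)
model.eval()

image = torch.rand(1, 1, 28, 28, requires_grad=True)  # hypothetical input
logits = model(image)
predicted_class = logits.argmax(dim=1).item()

# Backpropagate the predicted class score to the input; each pixel's gradient
# magnitude serves as its "importance" in the explanation.
logits[0, predicted_class].backward()
saliency = image.grad.abs().squeeze()
print(saliency.shape)  # torch.Size([28, 28]): one importance score per pixel
```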

The researchers aim not only to provide needed explanations of a model’s decision-making but also to estimate the accuracy of those explanations and the limits of the model’s knowledge.

Sumit Jha, co-researcher of the project and a computer science professor at the University of Texas at San Antonio, says that explainable AI is especially necessary with the rapid deployment of AI models.

“In general, AI will not tell you why it made a mistake or provide explanations for what it is doing,” Jha says. “People are accepting AI with a sort of blind trust that it is going to work. This is very worrying because eventually there will be good AI and bad AI.”

Ewetz received his doctorate in electrical and computer engineering from Purdue University and joined UCF’s College of Engineering and Computer Science in 2016. His primary research areas include AI and machine learning, emerging computing paradigms and future computing systems, and computer-aided design for very-large-scale integration.
