Towards AI Democratization: From Efficient Large AI Models to Network Compression Theory

Monday, March 10, 2025 11 a.m. to noon

Speaker: Mr. Yuzhang Shang

From: Illinois Institute of Technology

Abstract

Artificial Intelligence (AI) models are growing increasingly complex and resource-intensive, posing significant challenges for widespread deployment. This research focuses on efficient AI as a crucial pathway toward democratizing large AI models, making their benefits accessible to a global audience. This presentation will explore three key areas of efficient AI: (i) Accelerating Generative Models: I will discuss novel training-free methods to enhance the efficiency of state-of-the-art generative models, specifically diffusion models. (ii) Compressing Large Language Models: My research addresses the acceleration of large language models (LLMs), improving their performance and accessibility by applying low-rank decomposition to their weights. (iii) Neural Network Compression Theory: My research aims not only to propose effective approaches for deep neural network compression, but also to explore the possibility of well-defined and explainable compression methods. By improving the efficiency of these critical AI domains, we aim to lower the barriers to entry for AI adoption and foster innovation across diverse fields and industries.
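The low-rank decomposition idea mentioned in (ii) can be sketched briefly. This is an illustrative example, not the speaker's method: a dense weight matrix is replaced by two thin factors obtained from a truncated SVD, which cuts both storage and matrix-multiply cost. The matrix sizes and target rank here are hypothetical choices.

```python
import numpy as np

# Illustrative sketch: rank-r truncated SVD of a dense weight matrix,
# the basic idea behind low-rank decomposition of LLM weights.
rng = np.random.default_rng(0)
W = rng.standard_normal((512, 512))  # stand-in for one LLM weight matrix

r = 64  # target rank (hypothetical choice)
U, S, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :r] * S[:r]  # 512 x r factor (singular values folded in)
B = Vt[:r, :]         # r x 512 factor

# Storing the pair (A, B) keeps 2 * 512 * r values instead of 512 * 512,
# and the forward pass x @ W becomes the cheaper (x @ A) @ B.
params_before = W.size
params_after = A.size + B.size
print(params_before, params_after)  # 262144 65536
```

The rank r trades accuracy for compression: smaller r means fewer parameters but a larger approximation error in A @ B relative to W.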


Locations:

L3Harris Engineering Center, Room 101A

Calendar:

CS/CRCV Seminars

Category:

Speaker/Lecture/Seminar

Tags:

UCFCRCV