Speaker: Dr. Arman Zharmagambetov
From: Fundamental AI Research (FAIR) group at Meta
Abstract
Modern machine learning (ML) models, trained on real-world data, now underpin a broad spectrum of applications. Behind the success of these models, discrete optimization often lays the foundation for modeling and decision making, and enables downstream applications. Broadly, discrete optimization serves several major roles in ML applications: 1) modeling the discreteness of output and input spaces; 2) investigating model interpretability; 3) constructing efficient models (e.g., via compression). In this presentation, focusing on the first point, I will reflect on my past research in these areas and explore the challenges and opportunities that lie ahead, particularly in the context of modern foundation models.
In the context of the output space, I will discuss Constrained Generation for deep generative models, such as GANs and VAEs. These models have shown impressive capabilities in generating unconstrained objects, such as images. However, many design scenarios require the generated objects to satisfy hard combinatorial constraints, in addition to modeling a data distribution. To address this challenge, I will introduce GenCO, a generative framework that ensures constraint satisfaction by leveraging differentiable combinatorial solvers. Turning to the input space, I will then discuss Prompt Optimization for Large Language Models and its importance in building more secure and trustworthy AI systems.
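To make the idea of a differentiable combinatorial solver concrete, here is a minimal, hypothetical sketch (not GenCO's actual implementation): a generator emits continuous scores, a toy discrete "solver" projects them onto a feasible set (here, selecting exactly k items), and a straight-through estimator lets gradients flow back through the non-differentiable projection. The function name `solve_top_k` and the architecture are illustrative assumptions only.

```python
# Hypothetical sketch of a differentiable-combinatorial-solver layer.
# Forward pass: hard, constraint-satisfying outputs; backward pass:
# gradients via a continuous surrogate (straight-through estimator).
import torch
import torch.nn as nn


def solve_top_k(scores: torch.Tensor, k: int) -> torch.Tensor:
    """Toy combinatorial 'solver': select exactly k items (a hard constraint)."""
    idx = scores.topk(k, dim=-1).indices
    hard = torch.zeros_like(scores)
    hard.scatter_(-1, idx, 1.0)
    return hard


class ConstrainedGenerator(nn.Module):
    """Generator whose samples satisfy a cardinality constraint by construction."""

    def __init__(self, latent_dim: int, n_items: int, k: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, n_items)
        )
        self.k = k

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        scores = self.net(z)
        hard = solve_top_k(scores, self.k)   # feasible discrete solution
        soft = torch.sigmoid(scores)         # differentiable surrogate
        # Straight-through: forward uses the hard solution,
        # backward uses the surrogate's gradient.
        return hard + (soft - soft.detach())


gen = ConstrainedGenerator(latent_dim=16, n_items=10, k=3)
x = gen(torch.randn(4, 16))
assert torch.all(x.sum(dim=-1) == 3)  # every sample satisfies the constraint
```

In practice, frameworks in this space replace the toy top-k projection with a real combinatorial solver (e.g., for matching or routing) and use more principled gradient estimators than plain straight-through, but the overall structure is the same: the solver sits inside the generative model, so feasibility is guaranteed rather than merely encouraged by a penalty.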
For more info, please follow this link.