BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//UNIFY
BEGIN:VTIMEZONE
TZID:America/New_York
X-LIC-LOCATION:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:19700308T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:19701101T020000
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
UID:https://events.ucf.edu/event/3756072/understanding-distribution-learning-of-diffusion-models-via-low-dimensional-modeling/
DTSTAMP:20250313T110000Z
DTSTART;TZID=America/New_York:20250313T110000
DTEND;TZID=America/New_York:20250313T120000
LOCATION:Virtual and TC2: 222
SUMMARY:Understanding Distribution Learning of Diffusion Models via Low-Dimensional Modeling
URL:https://events.ucf.edu/event/3756072/understanding-distribution-learning-of-diffusion-models-via-low-dimensional-modeling/
DESCRIPTION:Speaker: Dr. Peng Wang\n\nFrom: University of Michigan\n\nAbstract\n\nRecent empirical studies have demonstrated that diffusion models can effectively learn the image distribution and generate new samples. Remarkably, these models can achieve this even with a small number of training samples despite high image dimensionality, circumventing the curse of dimensionality. In this work, we provide theoretical insights into this phenomenon by leveraging two key empirical observations: (i) the low intrinsic dimensionality of image datasets and (ii) the low-rank property of the denoising autoencoder in trained diffusion models. These observations motivate us to model the underlying data distribution as a mixture of low-rank Gaussians and to parameterize the denoising autoencoder as a low-rank model. With this setup, we rigorously show that optimizing the training loss of diffusion models is equivalent to solving the canonical subspace clustering problem over the training samples. This insight carries practical implications for training and controlling diffusion models. Specifically, it allows us to precisely characterize the minimal number of samples needed to correctly learn the low-rank data support, shedding light on the phase transition from memorization to generalization. Moreover, we empirically establish a correspondence between the subspaces and the semantic representations of image data, facilitating image editing. We corroborate these findings with experiments on both simulated distributions and image datasets.\n\nFor more info, please follow this [link](https://ai.ucf.edu/wp-content/uploads/2025/03/Peng-Wang-Flyer.pdf).\n\nVirtual Location URL: https://ucf.zoom.us/j/91346028171?pwd=Bvx6dtQb8bHq6jTRIGRzyMahScl92k.1&from=addon
END:VEVENT
END:VCALENDAR