Dissertation Defense: Towards Optimization and Robustification of Data-Driven Models

Tuesday, March 28, 2023 11 a.m. to 1 p.m.

Announcing the Final Examination of Ehsan Kazemi Foroushani for the degree of Doctor of Philosophy

In the past two decades, data-driven models have experienced a renaissance, with notable success achieved by models such as deep neural networks (DNNs) across a range of applications. However, complete reliance on intelligent machine learning systems remains a distant goal. Nevertheless, the initial success of data-driven approaches offers a promising path toward building trustworthy data-oriented models. This thesis takes several steps toward improving the performance of existing data-driven frameworks in both the training and testing phases. Specifically, we focus on four key questions: 1) How can optimization methods for learning algorithms be designed efficiently for parallel settings and for cases where first-order information is unavailable? 2) How can existing adversarial attacks on DNNs be revised into structured attacks with minimal distortion of benign samples? 3) How can attention models such as Transformers be integrated into data-driven inertial navigation systems? 4) How can the data-scarcity problem of existing data-driven models be addressed, and the performance of existing semi-supervised learning (SSL) methods be enhanced?

In terms of parallel optimization methods, our research investigates a delay-aware asynchronous variance-reduced coordinate descent approach. We also develop a proximal zeroth-order algorithm for nonsmooth nonconvex problems where first-order information is unavailable, and extend this study to zeroth-order stochastic gradient descent. On the robustness front, we develop a structured white-box adversarial attack to advance research on robust machine learning schemes, and we investigate a group threat model in which adversaries can perturb only image segments, rather than the entire image, to generate adversarial examples. We further explore attention models, specifically Transformers, for deep inertial navigation systems based on the Inertial Measurement Unit (IMU). To address data scarcity during training, we propose quantifying the uncertainty of the unlabeled data and their corresponding pseudo-labels and incorporating it into the loss term to compensate for noisy pseudo-labeling. Finally, we extend the generic semi-supervised method to data-driven noise-suppression frameworks by using a reinforcement learning (RL) model to learn contrastive features in an SSL fashion.
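To give a flavor of the zeroth-order setting mentioned above, the sketch below shows a standard two-point random-direction gradient estimator driving plain gradient descent on a smooth test function. This is a generic illustration of the zeroth-order idea, not the specific algorithm from the thesis; the function `zo_gradient`, its parameters, and the quadratic objective are all illustrative choices.

```python
import random

def zo_gradient(f, x, mu=1e-4, n_samples=50):
    """Two-point zeroth-order gradient estimate of f at x.

    Uses only function evaluations (no derivatives): each sample
    perturbs x along a random Gaussian direction u and treats
    (f(x + mu*u) - f(x)) / mu as a directional slope along u.
    """
    d = len(x)
    g = [0.0] * d
    fx = f(x)
    for _ in range(n_samples):
        u = [random.gauss(0.0, 1.0) for _ in range(d)]
        slope = (f([xi + mu * ui for xi, ui in zip(x, u)]) - fx) / mu
        for i in range(d):
            g[i] += slope * u[i]
    return [gi / n_samples for gi in g]

# Minimize f(x) = sum(x_i^2) with zeroth-order gradient descent.
random.seed(0)
f = lambda x: sum(xi * xi for xi in x)
x = [1.0, -2.0, 0.5]
for _ in range(300):
    g = zo_gradient(f, x)
    x = [xi - 0.05 * gi for xi, gi in zip(x, g)]
```

Because the estimator is unbiased only up to a smoothing term of order `mu`, practical schemes trade off `mu` and `n_samples` against query cost; the thesis's proximal variant additionally handles a nonsmooth regularizer.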

Each chapter of the thesis presents the problem and our solutions using concrete algorithms. We verify our approach through comparisons with existing methods on different benchmarks and discuss future research directions.

Committee in Charge: Liqiang Wang, Gita Sukthankar, George Atia, Chen Chen


Contact:

College of Graduate Studies, 407-823-2766, editor@ucf.edu

Calendar:

Graduate Thesis and Dissertation

Category:

Uncategorized/Other

Tags:

Thesis and Dissertation Defense, Computer Science