Computer Science and Center for Research in Computer Vision

Monday, May 01, 2023

  • Peeling Back the Layers of Deep Neural Networks—A Venture from Implementation Perspective

    MSB: 318 and Virtual

    Speaker: Dr. Aritra Dutta
    From: University of Southern Denmark

    Abstract: When there is a lot of training data or the deep neural network is too large, distributed parallel training becomes essential, which refers to either data or model parallelism. In both cases, parallelism introduces various overheads. Network communication is one such significant overhead in large-scale distributed deep learning. To mitigate …
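    The abstract is truncated, but one widely used way to mitigate communication overhead in data-parallel training is gradient compression, e.g. top-k sparsification: each worker transmits only the k largest-magnitude gradient entries as (index, value) pairs instead of the full dense gradient. A minimal illustrative sketch (not the speaker's specific method; all names are hypothetical):

    ```python
    # Hypothetical sketch of top-k gradient sparsification, one common way to
    # reduce communication volume in data-parallel distributed training.

    def topk_sparsify(grad, k):
        """Keep the k largest-magnitude entries; return (index, value) pairs."""
        idx = sorted(range(len(grad)), key=lambda i: abs(grad[i]), reverse=True)[:k]
        return [(i, grad[i]) for i in idx]

    def densify(pairs, n):
        """Rebuild a dense gradient of length n from the communicated pairs."""
        out = [0.0] * n
        for i, v in pairs:
            out[i] = v
        return out

    grad = [0.1, -2.0, 0.05, 3.0, -0.2]
    pairs = topk_sparsify(grad, 2)          # only 2 of 5 values are sent
    print(densify(pairs, len(grad)))        # [0.0, -2.0, 0.0, 3.0, 0.0]
    ```

    Sending 2 of 5 entries cuts communication by 60% here; in practice the savings are far larger (e.g. k at 0.1% of gradient size), at the cost of approximation error that methods such as error feedback compensate for.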

    CS/CRCV Seminars