Location: Engineering building (1103), room 329
Title: Exploring Deep Neural Collapse via Extended and Controlled Unconstrained Features Models
Abstract: Training deep neural networks for classification often involves minimizing the training loss beyond the zero-training-error point. In this phase of training, a "neural collapse" (NC) behavior has been empirically observed: the variability of features (outputs of the penultimate layer) ...
Zoom link: https://us02web.zoom.us/j/4685913265
Title: Graph Representation Learning Through Recoverability
Abstract: Self-supervised learning methods have become popular for graph representation learning because they do not rely on manual labels and offer better generalization. Contrastive methods, based on maximizing mutual information between augmented instances of the same object, are widely used in self-supervised representation learning. For graph-structured data, however, there are two ...
Zoom link: https://us02web.zoom.us/j/4685913265
Title: Toward Fast and Efficient Deep Learning
Abstract: Deep Neural Networks (DNNs) are now irreplaceable in various applications. However, DNNs require vast computational resources. In most cases, training complex DNNs requires several machines working in parallel (most commonly using data parallelism). Moreover, deploying DNNs on devices with limited computational power can be challenging, and ...
Location: Engineering building (1103), room 329
Title: Exploring the Successes and Limitations of Deep Learning
Abstract: In this talk, we will explore the successes and limitations of deep learning networks and highlight the need for more rigorous evaluation. Tree ensemble models often outperform deep learning models on tabular data. However, there have recently been claims that some new deep learning models ...