Learning Efficiently from Data using Sparse Neural Networks



Zahra Atashgahi (University of Twente)

Zahra is a PhD candidate at the University of Twente, The Netherlands. She completed her bachelor's and master's degrees in computer science at Amirkabir University of Technology, Iran, in 2017 and 2019, respectively, and is currently a visiting PhD student at the University of Cambridge. Her PhD research focuses on deep learning and, in particular, sparse neural networks: she seeks to design environmentally friendly AI systems by developing computationally efficient deep learning models. During her PhD, she has published in top-tier machine learning conferences and journals, including NeurIPS, ICLR, MLJ, and TMLR.



Short Abstract: Sparse neural networks (SNNs) address the high computational cost of deep neural networks by using sparse connectivity between their layers while aiming to match the predictive performance of their dense counterparts. Pruning dense neural networks is among the most widely used methods to obtain SNNs. However, the training cost of such methods can be unaffordable on low-resource devices; this has recently drawn attention to training SNNs sparsely from scratch, known in the literature as "sparse training". In this talk, I will give a brief introduction to SNNs and recent advances in the field of sparse training. I will then present how SNNs can be used to perform different tasks efficiently, with a focus on feature selection.
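
To make the sparse-training idea concrete, the sketch below illustrates the prune-and-regrow cycle used by one well-known sparse-training algorithm, Sparse Evolutionary Training (SET, Mocanu et al., 2018): a layer starts from a random sparse topology, and between training epochs the smallest-magnitude connections are dropped while an equal number of new connections are grown at random positions. This is a minimal illustrative NumPy sketch, not the speaker's exact method; the pruning fraction zeta, the layer shape, and the function names are assumptions made for the example.

    import numpy as np

    rng = np.random.default_rng(0)

    def sparse_init(shape, density):
        # Random sparse topology: keep roughly `density` of the connections.
        mask = rng.random(shape) < density
        weights = rng.standard_normal(shape) * 0.1 * mask
        return weights, mask

    def prune_and_regrow(weights, mask, zeta=0.3):
        # One SET-style topology update, run between training epochs.
        active = np.flatnonzero(mask)
        n_drop = int(zeta * active.size)
        # Prune: drop the zeta fraction of active weights closest to zero.
        drop = active[np.argsort(np.abs(weights.ravel()[active]))[:n_drop]]
        mask.ravel()[drop] = False
        weights.ravel()[drop] = 0.0
        # Regrow: activate an equal number of currently inactive connections,
        # so the overall density stays constant throughout training.
        grow = rng.choice(np.flatnonzero(~mask), size=n_drop, replace=False)
        mask.ravel()[grow] = True
        weights.ravel()[grow] = rng.standard_normal(n_drop) * 0.1
        return weights, mask

    # Example: a 784 x 100 layer kept at ~5% density.
    W, M = sparse_init((784, 100), density=0.05)
    W, M = prune_and_regrow(W, M)
    print(f"density after update: {M.mean():.3f}")  # stays ~0.05

The gradient-based weight updates that happen between these topology updates are omitted; only the connectivity evolution, which is what distinguishes sparse training from pruning a trained dense network, is shown.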