ECE Guest Lecturer Series
Deep Learning Inference with Limited Resources
Vincent Gripon
Wednesday, September 4, 2019
Noon–1 p.m.
Wegmans Hall 1400
Abstract:
Deep learning architectures are the gold standard for many machine learning problems. Thanks to their large number of trainable parameters, they are able to capture complex dependencies in the input data and, when trained appropriately, produce correct decisions. However, this dependency on a very large number of parameters is also a weakness: their computation and memory footprints are considerable, and it is hard, if not impossible, to guarantee their ability to perform well when dealing with corrupted and noisy inputs. In this talk, we shall review the main strategies proposed in the literature to reduce the computation and memory requirements of deep learning systems, including quantization, factorization, and pruning. We shall also discuss how robust these systems are to faulty implementations. In a final part, we will discuss the susceptibility of deep learning architectures to deviations in their inputs, which appears to have become a major open question.
Biography:
Vincent Gripon (S'10, M'12) is a permanent researcher with IMT Atlantique, a leading French technical university. He obtained his M.S. from École Normale Supérieure Paris-Saclay in 2008 and his PhD from IMT Atlantique in 2011. From 2011 to 2012 he was a visiting scientist at McGill University, and he is currently an invited professor at Mila and Université de Montréal. His research mainly focuses on the efficient implementation of artificial neural networks, graph signal processing, deep learning robustness, and associative memories. He has co-authored more than 70 papers in these domains in prestigious venues (AAAI, TPAMI, Statistical Physics, TNNLS, TSP, ...).
Refreshments will be provided