LANS Seminar
December 4, 2024 @ 10:30 - 11:30 CST
Seminar Title: Energy Efficient Implementation of the Transformer Architecture for Sequence Modeling using Spiking Neural Networks (SNN)
Speaker: Adarsha Balaji, Postdoc, Mathematics and Computer Science Division, Argonne National Laboratory
Date/Time: December 4, 2024, 10:30 AM-11:30 AM CST
Location: See the meeting URL on the cels-seminars website, which requires an Argonne login.
Description: Neuromorphic architectures implement biological neurons and synapses to execute machine learning algorithms built from spiking neural network (SNN) neurons, spike-based encoding, and bio-inspired learning algorithms. These architectures are typically energy efficient and therefore suitable for cognitive information processing in resource- and power-constrained environments, such as those in which the sensor and edge nodes of the internet of things (IoT) operate. In this talk, we will explore SNNs for designing transformer models for vision and natural language modeling tasks. We will address the challenge of the inefficient and time-consuming training of large-scale SNNs with surrogate-gradient learning methods on existing digital accelerators. The proposed methodology applies principles of knowledge distillation to design transformer-based SNNs for inference.
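For readers unfamiliar with knowledge distillation in this setting, the sketch below (plain PyTorch; the model names, rate decoding, and hyperparameters are illustrative assumptions, not details from the talk) shows the standard distillation loss that transfers the soft predictions of a pretrained non-spiking transformer "teacher" to an SNN "student":

    # Illustrative sketch of knowledge distillation for an SNN student
    # (hypothetical models; not the speaker's implementation).
    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels,
                          temperature=4.0, alpha=0.7):
        """Blend a soft-target KL term (from the ANN teacher) with the
        usual cross-entropy on the ground-truth labels."""
        soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
        soft_student = F.log_softmax(student_logits / temperature, dim=-1)
        kd = F.kl_div(soft_student, soft_targets,
                      reduction="batchmean") * temperature ** 2
        ce = F.cross_entropy(student_logits, labels)
        return alpha * kd + (1.0 - alpha) * ce

    # Usage (shapes only; ann_teacher and snn_student are placeholders):
    # with torch.no_grad():
    #     teacher_logits = ann_teacher(tokens)          # standard transformer
    # student_logits = snn_student(tokens).mean(dim=0)  # rate-decode spikes over time steps
    # loss = distillation_loss(student_logits, teacher_logits, labels)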
Bio: Adarsha Balaji is a postdoc in the Mathematics and Computer Science Division at Argonne National Laboratory. He received his master's and Ph.D. degrees from Drexel University, Philadelphia, in 2021, with a focus on the hardware-software co-design of neuromorphic systems. His current research interests include modeling large-scale SNNs for scientific tasks and the hardware-software design space exploration of neuromorphic computing systems, particularly the data-flow and power optimization of SNN hardware.
Please note that the meeting URL for this event can be found on the cels-seminars website, which requires an Argonne login.
See all upcoming talks at https://www.anl.gov/mcs/lans-seminars