CS Distinguished Colloquium
James E. Smith
University of Wisconsin-Madison (Emeritus)
Title: A Temporal Neural Network Architecture for Online Learning
Zoom link: Join from PC, Mac, Linux, iOS or Android: https://yale.zoom.us/j/98080001575
Or Telephone: 203-432-9666 (2-ZOOM if on-campus) or 646-568-7788
Meeting ID: 980 8000 1575
International numbers available: https://yale.zoom.us/u/acO3VIdKco
Host: Abhishek Bhattacharjee
A long-standing proposition is that by emulating the operation of the brain’s neocortex, a spiking neural network (SNN) can achieve similar desirable features: flexible learning, speed, and efficiency. Temporal neural networks (TNNs) are SNNs that communicate and process information encoded as relative spike times (in contrast to spike rates). A TNN architecture is proposed, and as a proof of concept, TNN operation is demonstrated within the larger context of online supervised classification. First, through unsupervised learning, a TNN partitions input patterns into clusters based on similarity. The TNN learning process adjusts synaptic weights using only signals local to each synapse, and global clustering behavior emerges. The TNN then passes a cluster identifier to a simple online supervised decoder, which finishes the classification task. Beyond the overall architecture, several TNN components and methods are new to this work. A long-term research objective is a direct hardware implementation. Consequently, the architecture is described at a level analogous to the gate and register transfer levels used in conventional digital design, and processing is done at very low precision.
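The two ideas at the heart of the abstract, encoding information as relative spike times and learning with only synapse-local signals, can be illustrated with a minimal sketch. This is not Prof. Smith's actual design; the model, function names, thresholds, and update amounts below are all illustrative assumptions:

```python
# Illustrative sketch of temporal (spike-time) coding with a local,
# STDP-like weight update. Not the architecture described in the talk;
# all parameters and names are hypothetical.

def response_time(weights, spike_times, threshold=2.0):
    """Return the time at which accumulated weight crosses the threshold.

    Inputs are encoded as relative spike times (smaller = earlier).
    The neuron integrates synaptic weights in spike-time order and
    "fires" when the running sum reaches the threshold.
    """
    total = 0.0
    for t, w in sorted(zip(spike_times, weights)):
        total += w
        if total >= threshold:
            return t
    return float("inf")  # the neuron never fires for this input

def local_update(weights, spike_times, fire_time, up=0.25, down=0.125):
    """Adjust each weight using only signals local to that synapse:
    the input spike time and the neuron's own output spike time."""
    for i, t in enumerate(spike_times):
        if t <= fire_time:  # input arrived before (or at) the output spike
            weights[i] = min(1.0, weights[i] + up)    # potentiate
        else:
            weights[i] = max(0.0, weights[i] - down)  # depress
    return weights
```

In a clustering layer of such neurons, a winner-take-all rule would let the earliest-firing neuron claim the input and apply `local_update` to its own synapses only; over many inputs, each neuron's weights would come to favor one cluster of similar spike-time patterns. That global behavior emerges even though each update reads nothing but its own synapse's input time and the neuron's output time.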
James E. Smith is Professor Emeritus in the Department of Electrical and Computer Engineering at the University of Wisconsin-Madison. He received his PhD from the University of Illinois in 1976. He then joined the faculty of the University of Wisconsin-Madison, teaching and conducting research, first in fault-tolerant computing, then in computer architecture. He has been involved in a number of computer research and development projects both as a faculty member at Wisconsin and in industry.
Prof. Smith made a number of contributions to the development of superscalar processors. These contributions include basic mechanisms for dynamic branch prediction and implementing precise traps. He has also studied vector processor architectures and worked on the development of innovative microarchitecture paradigms. He received the 1999 ACM/IEEE Eckert-Mauchly Award for these contributions.
For the past several years, he has been studying neuron-based computing paradigms at home along the Clark Fork near Missoula, Montana.