
IEEE Signal Processing Society Santa Clara Valley Chapter




Click here to see the full list of upcoming events.


Tuesday, Sep 26, 2017

Hebbian Learning and the LMS Algorithm

This event is hosted and sponsored by the IEEE CIS Chapter and co-sponsored by the IEEE SPS Chapter.


Speaker:

Prof. Bernard Widrow, Department of Electrical Engineering, Stanford University

 

Location: ** WARNING: venue is not our usual AMD location **

Texas Instruments, Building E Conference Center, 2900 Semiconductor Drive, Santa Clara, CA 95051 (Google Maps)

 

Schedule:

6:30pm-7:00pm: Registration, Food, Networking

7:00pm-8:00pm: Talk

8:00pm-8:30pm: Q&A and Networking

 

Cost:

FREE for IEEE members

$5 for non-members (pay at door)


 

Abstract:

Hebb's learning rule can be summarized as "neurons that fire together wire together." "Wire together" means that the weight of the synaptic connection between any two neurons is increased when both are firing. Hebb's rule is a form of unsupervised learning. Hebb introduced the concept of synaptic plasticity, and his rule is widely accepted in the field of neurobiology.
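For concreteness, a minimal sketch of a plain Hebbian update in code might look like the following; the learning rate, the 0/1 firing convention, and the function name are illustrative assumptions, not details from the talk:

```python
import numpy as np

def hebbian_update(w, x, y, mu=0.01):
    """Plain Hebbian update: strengthen w[i] when input x[i]
    and the neuron's output y are active ("fire") together.

    w  : weight vector (synaptic strengths)
    x  : input activity vector (0/1 firing indicators, assumed here)
    y  : scalar output activity of the neuron
    mu : learning rate (illustrative choice)
    """
    # Weights grow only where pre- and post-synaptic activity coincide.
    return w + mu * y * x

# Example: two of three inputs fire together with the output.
w = np.zeros(3)
x = np.array([1.0, 0.0, 1.0])
y = 1.0
w = hebbian_update(w, x, y)
print(w)  # [0.01 0.   0.01] -- only the coincidentally active synapses grew
```

Note that in this strict form the update is never negative, which is exactly the saturation problem raised below.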

When imagining a neural network trained with this rule, a question naturally arises: what is learned with "fire together wire together," and what purpose could this rule actually serve? The lack of a good answer has long kept Hebbian learning out of engineering applications. This talk takes up the issue and offers possible answers.

Strictly following Hebb's rule, weights could only increase, never decrease. This would eventually cause all weights to saturate, yielding a useless network. When extending Hebb's rule to make it workable, it was discovered that extended Hebbian learning could be implemented by means of the LMS algorithm. The result was the Hebbian-LMS algorithm.

The LMS (least mean square) algorithm was discovered by Widrow and Hoff in 1959, ten years after Hebb's classic book first appeared. The LMS algorithm optimizes with gradient descent. It is the most widely used learning algorithm today. It has been applied in telecommunications systems, control systems, signal processing, adaptive noise cancelling, adaptive antenna arrays, etc. It is at the foundation of the backpropagation algorithm of Paul Werbos.
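To make the algorithm concrete, here is a minimal sketch of LMS in a toy system-identification setting; the step size, the signal shapes, and the "unknown" 3-tap filter are illustrative assumptions:

```python
import numpy as np

def lms_step(w, x, d, mu=0.05):
    """One LMS iteration: steepest descent on the instantaneous squared error.

    w  : current weight (filter coefficient) vector
    x  : input vector
    d  : desired response (supervision signal)
    mu : step size, trading convergence speed against misadjustment
    """
    y = w @ x                 # filter output
    e = d - y                 # instantaneous error
    w = w + 2 * mu * e * x    # move along the negative gradient estimate
    return w, e

# Toy example: learn to mimic an unknown 3-tap filter from input/output pairs.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.2])
w = np.zeros(3)
for _ in range(2000):
    x = rng.standard_normal(3)
    d = w_true @ x            # the "unknown" system's output
    w, e = lms_step(w, x, d)
print(np.round(w, 3))         # approaches w_true
```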

Hebb's rule notwithstanding, the nature of the learning algorithm(s) that adapt and control the strength of synaptic connections in animal brains is for the most part unknown. The biochemistry of synaptic plasticity is largely understood, but the overall control algorithm is not. A solution to this mystery might be the Hebbian-LMS algorithm, a control process for the unsupervised training of neural networks that perform clustering. Considering the structure of neurons, synapses, and neurotransmitters, the electrical and chemical signals necessary for implementing the Hebbian-LMS algorithm all seem to be there. Hebbian-LMS seems to be a natural algorithm. It is proving to be a simple, useful algorithm that is easy to make work. Neuron-to-neuron connections are as simple as can be. All this raises a question: could a brain, or a major portion of one, be implemented with basic building blocks that perform clustering? Is clustering nature's fundamental neurological building block?
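The abstract does not state the update rule itself. For orientation, the sketch below follows one published form of the algorithm (Widrow et al., "The Hebbian-LMS Learning Algorithm," 2015), in which the error is generated internally from the neuron's own sum rather than supplied by a teacher; the choice of tanh as the sigmoid and the values of gamma and mu are assumptions made here for illustration:

```python
import numpy as np

def hebbian_lms_step(w, x, mu=0.01, gamma=0.5):
    """One unsupervised Hebbian-LMS step (after Widrow et al., 2015).

    The error is self-generated: e = SGM(s) - gamma * s with s = w @ x,
    where SGM is a sigmoid (tanh assumed here). Training drives each
    input pattern's sum s toward a stable saturation point of the
    sigmoid, which is what produces clustering.
    """
    s = w @ x                    # neuron's weighted sum
    e = np.tanh(s) - gamma * s   # internally generated error signal
    return w + 2 * mu * e * x, e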

On the engineering side, layered neural networks trained with Hebbian-LMS have been simulated. The hidden layers are trained with Hebbian-LMS, unsupervised, while the output layer is trained with classic LMS, supervised. The hidden layers perform clustering; the output layer is fed the clustered inputs and makes the final classification decisions from them. Networks that are not layered, for example randomly connected networks, can also be implemented with Hebbian-LMS neurons to provide inputs to an output classifier, using the same training algorithm.
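A hedged sketch of that hybrid training scheme, reusing the update forms assumed above (the layer sizes, step sizes, and tanh sigmoid are illustrative choices, not details from the talk):

```python
import numpy as np

def train_step(W_h, w_o, x, d, mu_h=0.01, mu_o=0.05, gamma=0.5):
    """One combined step: unsupervised Hebbian-LMS for the hidden
    layer, classic supervised LMS for the output layer."""
    # Hidden layer: each neuron generates its own error signal.
    s = W_h @ x                             # hidden sums
    h = np.tanh(s)                          # hidden outputs (clustered features)
    e_h = h - gamma * s                     # self-generated Hebbian-LMS errors
    W_h = W_h + 2 * mu_h * np.outer(e_h, x)
    # Output layer: supervised LMS on the clustered features.
    y = w_o @ h
    e_o = d - y                             # error against the desired label d
    w_o = w_o + 2 * mu_o * e_o * h
    return W_h, w_o, y

# Illustrative shapes: 8 inputs -> 16 hidden neurons -> 1 output.
rng = np.random.default_rng(1)
W_h = 0.1 * rng.standard_normal((16, 8))
w_o = np.zeros(16)
x = rng.standard_normal(8)
W_h, w_o, y = train_step(W_h, w_o, x, d=1.0)
```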

The Hebbian-LMS network is a general-purpose trainable classifier and gives performance comparable to a layered network trained with the backpropagation algorithm, while being much simpler to implement and easier to make work. Although it is early to make predictions, it seems highly likely that Hebbian-LMS will find many engineering applications in clustering, pattern classification, signal processing, control systems, and machine learning.



Biography:

Bernard Widrow received the S.B., S.M., and Sc.D. degrees in Electrical Engineering from the Massachusetts Institute of Technology in 1951, 1953, and 1956, respectively. He joined the MIT faculty and taught there from 1956 to 1959. In 1959, he joined the faculty of Stanford University, where he is currently Professor of Electrical Engineering, Emeritus.

He began research on adaptive digital filters, learning processes, and artificial neural networks in 1957. Together with M. E. Hoff, Jr., his first doctoral student at Stanford, he invented the LMS algorithm in the autumn of 1959. Today it is the most widely used learning algorithm, found in every modem in the world. Since that time, he has continued working on adaptive signal processing, adaptive noise cancelling, adaptive antennas, adaptive controls, and neural networks.

Dr. Widrow is a Life Fellow of the IEEE and a Fellow of the AAAS. He received the IEEE Centennial Medal in 1984, the IEEE Alexander Graham Bell Medal in 1986, the IEEE Signal Processing Society Medal in 1986, the IEEE Neural Networks Pioneer Medal in 1991, the IEEE Millennium Medal in 2000, and the Benjamin Franklin Medal for Engineering from the Franklin Institute of Philadelphia in 2001. He was elected to the National Academy of Engineering in 1995 and inducted into the Silicon Valley Engineering Hall of Fame in 1999.

Dr. Widrow is a past president and a past member of the Governing Board of the International Neural Network Society. He is an associate editor of several journals and the author of more than 125 technical papers and 21 patents. He is co-author of the Prentice-Hall book "Adaptive Signal Processing," the IEEE Press book "Adaptive Inverse Control," and the Cambridge University Press book "Quantization Noise."




Subscribe to future announcements: link