The field was known as cybernetics from the 1940s through the 1960s. However, Marvin Minsky and Seymour Papert soon proved that the perceptron could not learn some simple functions, such as XOR (exclusive OR), and interest in the field waned. This loss of interest resulted in an extended period of little or no research in the field.

The next wave of deep learning was known as connectionism and occurred in the 1980s and early 1990s, when the backpropagation algorithm was developed by David Rumelhart, Geoffrey Hinton, and Ronald Williams. Backpropagation showed that a neural network could be trained by propagating errors backward through the network, providing credit assignment: a measure of how much each node contributed to the final prediction or output. Backpropagation uses the chain rule and, together with gradient descent, reduces the prediction error by adjusting the weights slightly at each iteration. In 1989, Yann LeCun demonstrated the first practical use of the backpropagation algorithm when, at Bell Labs, he trained convolutional neural networks (CNNs) to recognize handwritten digits. However, the promise of artificial neural networks for solving various learning tasks went unfulfilled because training datasets were very small and computers were too slow. This led to another winter during which research in these techniques was stifled.
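To make this concrete, here is a minimal sketch of backpropagation and gradient descent training a tiny two-layer network on the very XOR function a single perceptron cannot learn. It assumes Python with NumPy; the architecture, learning rate, and iteration count are illustrative choices, not taken from the original papers.

```python
import numpy as np

# XOR inputs and targets: the function a single perceptron cannot represent.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 1.0, (2, 4))  # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(0.0, 1.0, (4, 1))  # hidden -> output weights
b2 = np.zeros((1, 1))

lr = 0.5  # learning rate for gradient descent (illustrative choice)

for _ in range(10000):
    # Forward pass: compute the network's prediction.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: the chain rule propagates the error back through the
    # network, assigning credit to each node for the final prediction.
    d_out = (out - y) * out * (1 - out)   # error signal at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)    # error signal at the hidden layer

    # Gradient descent: adjust each weight slightly against its gradient.
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

# Predictions should approach the XOR truth table [0, 1, 1, 0]; backprop can
# occasionally settle in a poor local minimum, so results vary with the seed.
pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print(pred.round(3))
```

After training, the hidden layer learns an internal representation that makes XOR linearly separable at the output, which is exactly the kind of credit assignment across layers that backpropagation made practical.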
