Neural Networks for Machine Learning by Geoffrey Hinton




    • Why do we need machine learning? [13 min]
    • What are neural networks? [8 min]
    • Some simple models of neurons [8 min]
    • A simple example of learning [6 min]
    • Three types of learning [8 min]


    • Types of neural network architectures [7 min]
    • Perceptrons: The first generation of neural networks [8 min]
    • A geometrical view of perceptrons [6 min]
    • Why the learning works [5 min]
    • What perceptrons can’t do [15 min]
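The perceptron lectures above center on a very simple learning rule: when a training case is misclassified, add the input vector to the weights for a positive target and subtract it for a negative one. A minimal sketch (the toy data and training setup are my own, not from the course):

```python
# Perceptron learning rule: update the weights only on mistakes, moving
# them toward positive examples and away from negative ones.
def train_perceptron(samples, epochs=10):
    # samples: list of (input_vector, target) with target in {+1, -1}
    n = len(samples[0][0])
    w = [0.0] * (n + 1)  # last entry acts as the bias weight
    for _ in range(epochs):
        for x, t in samples:
            xb = list(x) + [1.0]  # append a constant bias input
            out = 1 if sum(wi * xi for wi, xi in zip(w, xb)) >= 0 else -1
            if out != t:  # mistake: add or subtract the input vector
                w = [wi + t * xi for wi, xi in zip(w, xb)]
    return w

# Linearly separable toy data (an AND-like function), so the rule converges
data = [((1, 1), 1), ((1, 0), -1), ((0, 1), -1), ((0, 0), -1)]
w = train_perceptron(data)
```

The convergence guarantee discussed in "Why the learning works" only holds for linearly separable data like this; "What perceptrons can't do" covers the cases where no such weight vector exists.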


    • Learning the weights of a linear neuron [12 min]
    • The error surface for a linear neuron [5 min]
    • Learning the weights of a logistic output neuron [4 min]
    • The backpropagation algorithm [12 min]
    • Using the derivatives computed by backpropagation [10 min]
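As a concrete instance of the gradient computations these lectures build toward, here is gradient descent on the squared error of a single logistic neuron, the case covered before the full backpropagation algorithm (toy data, learning rate, and step count are my own choices):

```python
import math

# One logistic neuron: y = sigmoid(w.x + b).
# Squared-error gradient via the chain rule: dE/dw_i = (y - t) * y * (1 - y) * x_i
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(samples, lr=0.5, steps=2000):
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(steps):
        for x, t in samples:
            y = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            delta = (y - t) * y * (1 - y)  # error signal through the logistic
            w = [wi - lr * delta * xi for wi, xi in zip(w, x)]
            b -= lr * delta
    return w, b

# Learn OR: output near 1 unless both inputs are 0
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_logistic(data)
```

Backpropagation generalizes exactly this chain-rule computation to neurons in hidden layers, where the error signal has to be passed backwards through the weights.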


    • Learning to predict the next word [13 min]
    • A brief diversion into cognitive science [4 min]
    • Another diversion: The softmax output function [7 min]
    • Neuro-probabilistic language models [8 min]
    • Ways to deal with the large number of possible outputs [15 min]
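The softmax diversion above concerns the function that turns a vector of logits into a probability distribution over outputs (e.g. over possible next words). A standard numerically stable formulation, as a snippet of my own:

```python
import math

# Softmax: exponentiate each logit and normalize. Subtracting the max
# logit first avoids overflow and cancels out in the ratio.
def softmax(logits):
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([1.0, 2.0, 3.0])  # larger logits get larger probabilities
```

The last lecture in this group is about why computing this normalization over a vocabulary-sized output layer is expensive, and how to avoid it.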


    • Why object recognition is difficult [5 min]
    • Achieving viewpoint invariance [6 min]
    • Convolutional nets for digit recognition [16 min]
    • Convolutional nets for object recognition [17 min]


    • Overview of mini-batch gradient descent
    • A bag of tricks for mini-batch gradient descent
    • The momentum method
    • Adaptive learning rates for each connection
    • Rmsprop: Divide the gradient by a running average of its recent magnitude
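The rmsprop update in the last lecture divides each gradient by a running root-mean-square of its recent magnitudes, so every weight takes steps of roughly the same size regardless of how large its raw gradient is. A one-dimensional sketch (the decay rate, learning rate, and test function are my own choices, not course settings):

```python
import math

# rmsprop on f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
def rmsprop_minimize(grad, w=0.0, lr=0.1, decay=0.9, eps=1e-8, steps=200):
    ms = 0.0  # running average of squared gradients
    for _ in range(steps):
        g = grad(w)
        ms = decay * ms + (1 - decay) * g * g
        w -= lr * g / (math.sqrt(ms) + eps)  # normalized gradient step
    return w

w = rmsprop_minimize(lambda w: 2 * (w - 3))  # ends up near the minimum at 3
```

Because the gradient is normalized by its own recent magnitude, the effective step size stays near `lr`, which is the point of the method: it keeps mini-batch updates from being dominated by a few weights with large gradients.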


    • Modeling sequences: A brief overview
    • Training RNNs with backpropagation
    • A toy example of training an RNN
    • Why it is difficult to train an RNN
    • Long short-term memory


    • A brief overview of Hessian Free optimization
    • Modeling character strings with multiplicative connections [14 min]
    • Learning to predict the next character using HF [12 min]
    • Echo State Networks [9 min]


    • Overview of ways to improve generalization [12 min]
    • Limiting the size of the weights [6 min]
    • Using noise as a regularizer [7 min]
    • Introduction to the full Bayesian approach [12 min]
    • The Bayesian interpretation of weight decay [11 min]
    • MacKay’s quick and dirty method of setting weight costs [4 min]


    • Why it helps to combine models [13 min]
    • Mixtures of Experts [13 min]
    • The idea of full Bayesian learning [7 min]
    • Making full Bayesian learning practical [7 min]
    • Dropout [9 min]
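Dropout, the last topic in this group, can be read as an extreme form of model combination: each training case is processed by a randomly thinned network. A minimal "inverted dropout" sketch (the drop probability and data are my own, not from the lecture):

```python
import random

# Inverted dropout: zero each activation with probability p during
# training and scale the survivors by 1/(1-p), so the expected
# activation is unchanged and no rescaling is needed at test time.
def dropout(activations, p=0.5, training=True, rng=random):
    if not training or p == 0.0:
        return list(activations)
    keep = 1.0 - p
    return [a / keep if rng.random() < keep else 0.0 for a in activations]

random.seed(0)
out = dropout([1.0, 2.0, 3.0, 4.0], p=0.5)  # survivors are doubled, rest zeroed
```

At test time (`training=False`) all units are kept, which approximates averaging over the exponentially many thinned networks sampled during training.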


    • Hopfield Nets [13 min]
    • Dealing with spurious minima [11 min]
    • Hopfield nets with hidden units [10 min]
    • Using stochastic units to improve search [11 min]
    • How a Boltzmann machine models data [12 min]
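A Hopfield net, introduced above, settles into minima of an energy function E = -Σ_{i<j} w_ij s_i s_j by flipping one binary unit at a time. A tiny deterministic sketch (the weights and starting state are my own toy example; the stochastic version in the later lectures replaces the hard threshold with a probabilistic one):

```python
# Asynchronous Hopfield updates: set each unit to whichever state
# lowers the energy E = -sum over i<j of w[i][j] * s[i] * s[j].
def hopfield_settle(w, s, sweeps=5):
    n = len(s)
    for _ in range(sweeps):
        for i in range(n):
            field = sum(w[i][j] * s[j] for j in range(n) if j != i)
            s[i] = 1 if field >= 0 else -1  # deterministic energy descent
    return s

def energy(w, s):
    n = len(s)
    return -sum(w[i][j] * s[i] * s[j]
                for i in range(n) for j in range(i + 1, n))

# Symmetric weights that store the pattern (+1, +1, -1);
# settling from a corrupted state recovers it.
w = [[0, 1, -1], [1, 0, -1], [-1, -1, 0]]
s = hopfield_settle(w, [1, -1, -1])
```

Because the weights are symmetric, every flip can only lower (or leave unchanged) the energy, which is why the net is guaranteed to settle; the "spurious minima" lecture is about the unwanted extra minima this energy landscape can contain.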


    • Boltzmann machine learning [12 min]
    • OPTIONAL VIDEO: More efficient ways to get the statistics [15 min]
    • Restricted Boltzmann Machines [11 min]
    • An example of RBM learning [7 min]
    • RBMs for collaborative filtering [8 min]


    • The ups and downs of backpropagation [10 min]
    • Belief Nets [13 min]
    • Learning sigmoid belief nets [12 min]
    • The wake-sleep algorithm [13 min]


    • Learning layers of features by stacking RBMs [17 min]
    • Discriminative learning for DBNs [9 min]
    • What happens during discriminative fine-tuning? [8 min]
    • Modeling real-valued data with an RBM [10 min]
    • OPTIONAL VIDEO: RBMs are infinite sigmoid belief nets [17 min]


    • From PCA to autoencoders [5 min]
    • Deep autoencoders [4 min]
    • Deep autoencoders for document retrieval [8 min]
    • Semantic Hashing [9 min]
    • Learning binary codes for image retrieval [9 min]
    • Shallow autoencoders for pre-training [7 min]



Author: iotmaker

I am interested in IoT, robotics, figures, and leadership. I have spent almost every day of the past 15 years building robots, electronic inventions, or computer programs.
