The Fall 2017 meeting activities are given below:

• Lecture of September 22: level-undergraduate
• An introduction to neural networks-slides here
This is a bird's-eye view of the history of machine learning, including its three waves: cybernetics, connectionism, and deep learning. Some applications are also discussed.
• A few problems leading to machine learning algorithms-slides here
The goal of this lecture was to introduce a few real-life problems and model them as machine learning problems. This involves identifying the inputs, setting up the weights and the activation functions, and choosing an appropriate error function.
This material describes in detail the Adaline neuron, introduced by Widrow and Hoff in 1960. The inputs are random variables and the activation function is just the identity. The learning algorithm uses the least mean squares rule. Despite its simplicity, the model has an appealing feature: the optimal weights can first be worked out in closed form, and then the same formula is recovered using the gradient descent method.
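The agreement between the closed-form solution and gradient descent can be sketched in a few lines of numpy. The data, learning rate, and iteration count below are illustrative, not taken from the lecture:

```python
import numpy as np

# Minimal Adaline sketch (Widrow-Hoff, 1960): identity activation,
# least-mean-squares cost. Synthetic data, for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                  # random inputs
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=200)   # nearly linear targets

# Closed-form least-squares solution: w = (X^T X)^{-1} X^T y
w_closed = np.linalg.solve(X.T @ X, X.T @ y)

# The same solution recovered by gradient descent on the mean squared error
w = np.zeros(3)
eta = 0.05                                     # learning rate (a chosen constant)
for _ in range(500):
    grad = (X @ w - y) @ X / len(y)            # gradient of 0.5*mean((Xw - y)^2)
    w -= eta * grad

print(np.allclose(w, w_closed, atol=1e-3))     # both methods agree
```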

• Error functions-slides here
The lecture presents different types of error functions used by neural networks. We consider the case when the training data are drawn from two random variables. The sum of squared differences, the cross entropy, and the Kullback-Leibler divergence are a few of the examples we discussed in detail.
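For concreteness, the three error functions can be evaluated on a pair of toy discrete distributions (the numbers below are made up for illustration):

```python
import numpy as np

# p is a target distribution, q the network's prediction (both illustrative)
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.6, 0.3, 0.1])

sse = np.sum((p - q) ** 2)                # sum of squared differences
cross_entropy = -np.sum(p * np.log(q))    # H(p, q)
kl = np.sum(p * np.log(p / q))            # Kullback-Leibler divergence

# The identity H(p, q) = H(p) + D_KL(p || q) ties the last two together
entropy = -np.sum(p * np.log(p))
print(np.isclose(cross_entropy, entropy + kl))
```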
• The sigmoid neuron, Learning by Logistic Regression and Neurons with continuum inputs-slides here
The sigmoid neuron is the paradigm of the neuron model. Logistic regression is used to show how a sigmoid neuron can serve as a classifier. Other neuron models will be discussed, including neurons with continuum input; the weights in this case are modeled by a measure that has to be determined.
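A minimal numpy sketch of a single sigmoid neuron trained as a logistic-regression classifier; the data are two synthetic Gaussian blobs, and all constants are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
y = np.concatenate([np.zeros(100), np.ones(100)])   # class labels

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, b = np.zeros(2), 0.0
eta = 0.5
for _ in range(300):
    p = sigmoid(X @ w + b)             # neuron output = P(y = 1 | x)
    # the cross-entropy gradient has the simple form (p - y) x
    w -= eta * (p - y) @ X / len(y)
    b -= eta * np.mean(p - y)

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(accuracy)
```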
• Backpropagation algorithm, part I. The gradient descent method needs formulae for the gradient of the cost function. These formulae depend on the delta sensitivities, which can be obtained using the backpropagation method. See video_part1, video_part2, video_part3
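A short numpy sketch of the delta sensitivities for a network with one hidden layer, checked against a numerical derivative. The sizes, quadratic cost, and data are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=4)              # a single input
t = np.array([1.0])                 # its target

W1, b1 = rng.normal(size=(5, 4)), np.zeros(5)
W2, b2 = rng.normal(size=(1, 5)), np.zeros(1)
sigmoid = lambda z: 1 / (1 + np.exp(-z))

# forward pass
z1 = W1 @ x + b1; a1 = sigmoid(z1)
z2 = W2 @ a1 + b2; a2 = sigmoid(z2)

# backward pass: delta^L = dC/da * sigma'(z), then delta^l = (W^T delta) * sigma'(z)
delta2 = (a2 - t) * a2 * (1 - a2)          # output sensitivity (quadratic cost)
delta1 = (W2.T @ delta2) * a1 * (1 - a1)   # hidden-layer sensitivity

grad_W2 = np.outer(delta2, a1)             # dC/dW2
grad_W1 = np.outer(delta1, x)              # dC/dW1

# sanity check: compare dC/dW1[0,0] with a finite-difference derivative
eps = 1e-6
W1p = W1.copy(); W1p[0, 0] += eps
a2p = sigmoid(W2 @ sigmoid(W1p @ x + b1) + b2)
numeric = (0.5 * (a2p - t) ** 2 - 0.5 * (a2 - t) ** 2).item() / eps
print(np.isclose(grad_W1[0, 0], numeric, atol=1e-5))
```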
• Lecture of October 28: level-graduate
• Neural networks as universal approximators. We have talked about the following results:
A simple two-layer neural network cannot approximate complicated functions. However, a neural network with one hidden layer and enough hidden units can approximate any continuous function, provided that the input space is compact. Similar results hold for target functions in the L^1 and L^2 spaces, as well as for measurable target functions. See a sample of the discussion at video_part1, video_part2
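The constructive idea behind the approximation result can be illustrated numerically on [0, 1]: stack steep sigmoid "steps" so that their weighted sum forms a staircase tracking a continuous target. The target function and all constants below are illustrative; more hidden units give a finer staircase and a smaller error:

```python
import numpy as np

f = lambda x: np.sin(2 * np.pi * x)   # a continuous target on [0, 1]

n = 200                               # number of hidden units
c = np.linspace(0, 1, n + 1)          # grid of knots
increments = np.diff(f(c))            # output weights: f(c_i) - f(c_{i-1})

def network(x, steep=2000.0):
    # hidden layer: one steep sigmoid step centered at each interior knot
    z = np.clip(steep * (x[:, None] - c[None, 1:]), -500, 500)
    h = 1 / (1 + np.exp(-z))
    return f(0.0) + h @ increments    # output layer: weighted sum + bias

xs = np.linspace(0, 1, 1000)
max_err = np.max(np.abs(network(xs) - f(xs)))
print(max_err < 0.1)                  # uniform error shrinks as n grows
```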
• Lecture of November 4 (this is a Saturday!) : level-advanced graduate
• Learning with one-dimensional inputs
This lecture deals with the case when the input variable is bounded and one-dimensional, x in [0,1]. Besides its simplicity, this case brings a couple of new things: (i) it can be treated in an elementary way, without the arsenal of functional analysis, and (ii) due to its constructive nature, it provides an explicit algorithm for finding the network weights. Both the perceptron and the sigmoid neural network with one hidden layer will be covered.

• Lecture of November 11 - This is an online lecture about how to build a 2-layer neural network using TensorFlow.
• A short intro to the MNIST data - video
• The construction of the 2-layer NN that classifies the MNIST data at 92% accuracy - video
• The Python file can be found Here.
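The model from this lecture (a two-layer, softmax-regression network) can be sketched in plain numpy. Random data stand in for MNIST here, so this only memorizes its tiny batch; with the real dataset and the TensorFlow code from the link, the same architecture reaches the roughly 92% accuracy quoted above:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(64, 784))          # a fake batch of 28x28 "images"
labels = rng.integers(0, 10, size=64)
Y = np.eye(10)[labels]                  # one-hot targets

W, b = np.zeros((784, 10)), np.zeros(10)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))   # shift for stability
    return e / e.sum(axis=1, keepdims=True)

eta = 0.5
for _ in range(100):
    P = softmax(X @ W + b)              # predicted class probabilities
    # (P - Y) is the cross-entropy gradient with respect to the logits
    W -= eta * X.T @ (P - Y) / len(X)
    b -= eta * (P - Y).mean(axis=0)

P = softmax(X @ W + b)
accuracy = np.mean(P.argmax(axis=1) == labels)
print(accuracy)
```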
• Lecture of November 18 - This is an online lecture about:
• how to build a 1-hidden-layer neural network using TensorFlow-video
• how to fix an unstable method -video
• The Python file is Here
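The instability is not spelled out here, but a common one in this setting is computing log(softmax(z)) naively, which overflows for large logits. A sketch of the standard remedy, the log-sum-exp trick (this is essentially what TensorFlow's softmax_cross_entropy_with_logits applies internally):

```python
import numpy as np

z = np.array([1000.0, 999.0, 998.0])   # large logits, as deep nets can produce
target = 0                             # index of the correct class

# naive log(softmax): exp overflows, the quotient becomes nan
with np.errstate(over="ignore", invalid="ignore"):
    naive = -np.log(np.exp(z)[target] / np.exp(z).sum())

# stable log-softmax: subtract the max before exponentiating
m = z.max()
log_softmax = z - m - np.log(np.exp(z - m).sum())
stable = -log_softmax[target]

print(naive, stable)                   # nan vs. a finite loss value
```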
• Lecture of November 25 - This is an online lecture about how to build and train a deep neural network.
• How to build and train a deep neural network with 7 hidden layers. You can experiment with changing the size of the hidden layers (the default is set at 100 neurons). The video can be found here.
• The Python file is Here.
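The architecture from this lecture, 7 hidden layers of 100 neurons each, can be sketched as a forward pass in numpy (the TensorFlow file linked above is the actual implementation; the initialization scheme and ReLU activation here are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(4)
sizes = [784] + [100] * 7 + [10]   # input, 7 hidden layers of 100, output

# one (weights, biases) pair per layer; He-style initialization for ReLU
params = [(rng.normal(0, np.sqrt(2 / fan_in), (fan_in, fan_out)), np.zeros(fan_out))
          for fan_in, fan_out in zip(sizes[:-1], sizes[1:])]

def forward(x):
    a = x
    for i, (W, b) in enumerate(params):
        z = a @ W + b
        # ReLU on hidden layers; the last layer outputs raw logits
        a = np.maximum(z, 0) if i < len(params) - 1 else z
    return a

logits = forward(rng.normal(size=(32, 784)))   # a fake batch of 32 inputs
print(logits.shape)                            # (32, 10)
```

Changing the hidden size is a matter of editing the `[100] * 7` list, which mirrors the experiment suggested in the lecture.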
• Lecture of December 1 - This is a lecture about building CNNs.
• An introduction to the idea of convolution and pooling - video.
• How to build a CNN with one convolutional layer in TensorFlow - video. The file is Here.
• How to build a CNN with two convolutional layers in TensorFlow - video. The file is Here.
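The two CNN building blocks introduced above, convolution and pooling, can be written out in plain numpy to make the sliding-window idea concrete (the image and kernel are toy values for illustration; TensorFlow's own ops are what the lecture files use):

```python
import numpy as np

def conv2d(image, kernel):
    # "valid" 2D convolution: slide the kernel over every position
    kh, kw = kernel.shape
    out = np.empty((image.shape[0] - kh + 1, image.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

def max_pool(x, size=2):
    # non-overlapping max pooling; trailing rows/columns are dropped
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h*size, :w*size].reshape(h, size, w, size).max(axis=(1, 3))

image = np.arange(36, dtype=float).reshape(6, 6)
kernel = np.array([[1.0, 0.0], [0.0, -1.0]])   # a simple diagonal filter
feature_map = conv2d(image, kernel)            # shape (5, 5)
pooled = max_pool(feature_map)                 # shape (2, 2)
print(feature_map.shape, pooled.shape)
```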