Problem of the week

TensorFlow Tutorial

Python Tutorial

Numpy Tutorial

The Fall 2017 meeting activities are given below:

  • Lecture of September 22 (level: undergraduate)
      • An introduction to neural networks (slides here)
        This is a bird's-eye view of the history of machine learning, including its three waves: cybernetics, connectionism, and deep learning. Some applications of deep learning are also discussed.
  • Lecture of September 29 (level: undergraduate and advanced undergraduate)
      • A few problems leading to machine learning algorithms (slides here)
        The goal of this lecture was to introduce a few real-life problems and model them as machine learning problems. This involves identifying the inputs, setting up the weights and the activation functions, and choosing an appropriate error function.
      • The Adaline neuron (slides here)
        This material describes in detail the Adaline neuron, introduced by Widrow and Hoff in 1960. The inputs are random variables and the activation function is simply the identity. The learning algorithm is the least mean squares (LMS) rule. In spite of its simplicity, the beauty is that the optimal weights can first be worked out in closed form, and then the same formula is recovered by the gradient descent method.
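The two routes mentioned above, closed form versus gradient descent, can be sketched in a few lines of NumPy. This is a minimal illustration, not the lecture's code: the data, the true weights, and the learning rate are all made up for the example.

```python
import numpy as np

# Hypothetical training data: 100 samples, 3 input features (illustrative values).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=100)  # linear targets plus small noise

# Closed-form least-squares solution via the normal equations:
# minimize ||Xw - y||^2  =>  (X^T X) w = X^T y
w_closed = np.linalg.solve(X.T @ X, X.T @ y)

# Gradient descent on the same mean-squared-error cost recovers the same weights.
w = np.zeros(3)
lr = 0.05  # illustrative step size
for _ in range(2000):
    grad = (2 / len(y)) * X.T @ (X @ w - y)  # gradient of the MSE cost
    w -= lr * grad

print(np.allclose(w, w_closed, atol=1e-3))  # the two solutions agree
```

Since the cost is quadratic, gradient descent with a small enough step size converges to the unique minimizer given by the normal equations, which is the point of the Adaline example.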

  • Lecture of October 6 (level: advanced undergraduate and graduate)
      • Error functions (slides here)
        The lecture presents different types of error functions used by neural networks. We consider the case when the training data are drawn from two random variables. The sum of squared differences, the cross-entropy, and the Kullback-Leibler divergence are a few of the examples we discussed in detail.
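The error functions named above can be compared numerically on a pair of discrete distributions. A minimal sketch (the probability values are invented for illustration):

```python
import numpy as np

# Two hypothetical discrete distributions over three outcomes (illustrative values).
p = np.array([0.5, 0.3, 0.2])  # target distribution
q = np.array([0.4, 0.4, 0.2])  # model distribution

sse = np.sum((p - q) ** 2)              # sum of squared differences
cross_entropy = -np.sum(p * np.log(q))  # H(p, q)
entropy = -np.sum(p * np.log(p))        # H(p), the entropy of the target
kl = np.sum(p * np.log(p / q))          # D_KL(p || q)

# Standard identity relating the three: H(p, q) = H(p) + D_KL(p || q),
# so minimizing cross-entropy in q is the same as minimizing the KL divergence.
print(np.isclose(cross_entropy, entropy + kl))
```

The identity in the last comment is why the cross-entropy and the Kullback-Leibler divergence lead to the same training objective, even though only the latter vanishes when p = q.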