Problem of the week

Problem of the week (February 2, 2018)
In an FNN (or CNN), each neuron is connected to all (or some) neurons in the previous and next layers, but never to neurons in the same layer. What might be the reason that neurons are not connected to other neurons in the same layer?

------------------------------------------------------------------------------------------------------------------------

Problem of the week (January 20, 2018)
Among all the feedforward neural networks with a given number of hidden neurons, N, which architecture is the most prone to overfitting the training data?
(By "architecture" we mean here the number of neurons in each hidden layer. Assume also that N is larger than 4.)

Answer: The neural network with 2 hidden layers of N/2 neurons each is the FNN with the largest capacity and is therefore the most prone to overfitting. Splitting the N neurons into two layers of sizes n and N - n creates n(N - n) connections between the hidden layers, and this product is maximized at n = N/2. The detailed explanations are given in the video, slide 1, and slide 2.
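
As a quick sanity check (a sketch, not the course's own derivation): one crude proxy for capacity is the number of connection weights. The code below enumerates every ordered split of N hidden neurons into layers and reports the split with the most weights. To keep the hidden-to-hidden connections dominant it assumes a single input and a single output, which is a simplification.

def count_weights(hidden, d_in=1, d_out=1):
    # Number of connection weights (biases ignored) in an FNN
    # whose hidden layers have the given sizes.
    sizes = [d_in] + list(hidden) + [d_out]
    return sum(a * b for a, b in zip(sizes, sizes[1:]))

def architectures(n):
    # All ordered splits of n hidden neurons into nonempty layers.
    if n == 0:
        yield ()
        return
    for first in range(1, n + 1):
        for rest in architectures(n - first):
            yield (first,) + rest

N = 12
best = max(architectures(N), key=count_weights)
print(best, count_weights(best))   # prints: (6, 6) 48

For N = 12 the winner is indeed (6, 6), i.e., two layers of N/2 neurons each.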
------------------------------------------------------------------------------------------------------------------------

Problem of the week (January 12, 2018)

Assume we have the choice between:

- an FNN with a single hidden layer of 500 neurons, or
- an FNN with 10 hidden layers of 50 neurons each.

In both cases we have 500 hidden neurons in total. Which one would you choose? (Assume, for instance, that you need this network for MNIST classification.)

Answer: You should choose the 1-hidden-layer network. The 10-hidden-layer FNN overfits the MNIST data, while the 1-hidden-layer network is more accurate and generalizes better. The 10-hidden-layer network does run faster, though. For detailed explanations see the video, slide 1, slide 2, and the code for the 1-hidden-layer and 10-hidden-layer FNNs.
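
For reference, here is a minimal sketch of the two candidate networks, written in Keras (the linked course code may use a different framework, and make_fnn is a name introduced just for this illustration):

from tensorflow import keras
from tensorflow.keras import layers

def make_fnn(hidden_sizes, d_in=784, d_out=10):
    # Dense feedforward classifier with the given hidden-layer sizes.
    model = keras.Sequential()
    model.add(keras.Input(shape=(d_in,)))
    for n in hidden_sizes:
        model.add(layers.Dense(n, activation="relu"))
    model.add(layers.Dense(d_out, activation="softmax"))
    return model

shallow = make_fnn([500])      # 1 hidden layer with 500 neurons
deep    = make_fnn([50] * 10)  # 10 hidden layers with 50 neurons each

shallow.summary()  # 397,510 trainable parameters
deep.summary()     # 62,710 trainable parameters

Note that the 10-hidden-layer network has far fewer weights (the 50x50 inter-layer blocks are small compared with the 784x500 input block), which is one reason it runs faster.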