Directions:

Question #1 (Neural Networks and Back-propagation)

Consider a simple neural network consisting of 1 hidden layer containing 4 neurons that use the sigmoid activation function. The network will predict a numeric outcome, so the weighted outputs of each neuron contribute directly to the outcome (rather than being passed into another sigmoid function).

For future reference, this network will be applied to the Iowa City home sales data set, using sale.amount as the outcome and area.living, area.lot, and bedrooms as the predictors.

Because the outcome is numeric, you will use the squared error cost function: \[Cost = \tfrac{1}{n}(\mathbf{y} - \mathbf{\hat{y}})^T(\mathbf{y} - \mathbf{\hat{y}})\]
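The architecture and cost described above can be sketched as follows. This is only an illustrative forward pass, not a required implementation: the weight shapes, the random stand-in data, and the function names are all assumptions for demonstration.

```python
import numpy as np

def sigmoid(z):
    # Sigmoid activation used by each of the 4 hidden neurons
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, b1, w2, b2):
    # X: (n, 3) matrix of the 3 predictors
    # W1: (3, 4) hidden-layer weights, b1: (4,) biases
    H = sigmoid(X @ W1 + b1)   # hidden activations, shape (n, 4)
    # Output is a direct weighted sum of the hidden neurons
    # (no second sigmoid, since the outcome is numeric)
    return H @ w2 + b2

def cost(y, y_hat):
    # Squared error cost: (1/n) (y - y_hat)^T (y - y_hat)
    r = y - y_hat
    return (r @ r) / len(y)

# Stand-in data (hypothetical, not the Iowa City data)
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
y = rng.normal(size=5)
W1 = rng.normal(size=(3, 4)); b1 = np.zeros(4)
w2 = rng.normal(size=4);      b2 = 0.0

print(cost(y, forward(X, W1, b1, w2, b2)))
```

With real data, X would hold the area.living, area.lot, and bedrooms columns and y the sale.amount column.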

Question #2 (Application)

Lab 2, part 2 introduced the MNIST handwritten digit data, which is available at this link:

https://remiller1450.github.io/data/mnist_small.csv

Recall that this CSV file contains a flattened version of the handwritten examples, which were originally 28x28 pixel grayscale images.
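The relationship between a flattened row and the original image can be sketched as below. The synthetic 784-value vector is a stand-in for one record of the CSV (an assumption: in practice the file would be read with something like pandas.read_csv, with one image per row).

```python
import numpy as np

# Hypothetical flattened record: 28 * 28 = 784 grayscale pixel values
flat = np.arange(784, dtype=float)

# Undo the flattening to recover the original 28x28 image
img = flat.reshape(28, 28)

print(img.shape)   # (28, 28)
```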