Gradient Descent Neural Network
Written in C++ to gain an understanding of exactly how neural networks work, this network learned to read handwritten digits. The data was provided by MNIST, for which I wrote a small C program that swaps the endianness of the data and parses it for the neural network (this project can be found here).
Upon startup, the neural network forks into two processes, applying a producer/consumer model: while the child process uses the parser program to load the data into shared memory, the parent process uses the gradient to modify the weights of the neurons throughout the network. The shared memory between the child and parent is explicitly handled by an object dedicated to the digits being read, and this object also owns the mutex that keeps the read and write operations from interfering with one another. This project can be found here on my GitHub.