# Neural networks

In this session we finally venture into the field of neural networks. First we will train a simple perceptron to learn the NAND function, as the perceptron was historically the first (single-layer) neural network to be implemented as a working machine. While the underlying idea was already developed in 1943 by McCulloch and Pitts, the first hardware-built perceptron was created in 1957 by Frank Rosenblatt: the _Mark I Perceptron_. After that we will explore the basics of neural networks based on the interactive examples in Noack & Sanner (2023).

Relevant background reading:

* Some general background information about the [Perceptron](https://en.wikipedia.org/wiki/Perceptron).
* A general explanation of [Boolean algebra](https://en.wikipedia.org/wiki/Boolean_algebra).
* What is an [Artificial Neuron](https://en.wikipedia.org/wiki/Artificial_neuron), and what does it actually do?
* Pasquinelli, M. (2017): "[Machines that Morph Logic: Neural Networks and the Distorted Automation of Intelligence as Statistical Inference.](https://www.glass-bead.org/article/machines-that-morph-logic/?lang=enview)" In: Glass Bead Journal 1, pp. 1-17.
* Chapter 10 and following, about neural networks, in Noack & Sanner (2023) (listed in the [](./references.md)).

## Basic Python Perceptron Demo

The following code is based on a snippet found on the Wikipedia page [Perceptron, in its version from 24 Dec. 2013](https://en.wikipedia.org/w/index.php?title=Perceptron&oldid=587515588), and extends it with more debugging output, a limit on the number of iterations, and some alternative training sets and weights to play around with.

```python
threshold = 0.5
learning_rate = 0.1

# the original training set and weights for a NAND of 2 values,
# with an additional helper value (a constant 1 acting as a bias)
training_set = [
    ((1, 0, 0), 1),
    ((1, 0, 1), 1),
    ((1, 1, 0), 1),
    ((1, 1, 1), 0),
]
weights = [0, 0, 0]

# a training set and weights for only the two values to be NANDed, without
# any additional helper input; note that the (0, 0) input always yields a
# dot product of 0, which can never exceed the threshold, so this set
# will never converge and training stops at the iteration limit
training_set_dual = [
    ((0, 0), 1),
    ((0, 1), 1),
    ((1, 0), 1),
    ((1, 1), 0),
]
weights_dual = [0, 0]

# actual NAND of 3 values (same caveat: without a helper input,
# the (0, 0, 0) case can never fire)
training_set_triple = [
    ((1, 0, 0), 1),
    ((1, 0, 1), 1),
    ((1, 1, 0), 1),
    ((1, 1, 1), 0),
    ((0, 0, 0), 1),
    ((0, 0, 1), 1),
    ((0, 1, 0), 1),
    ((0, 1, 1), 1),
]

# again NAND of 3 values, now with an additional 4th helper input
# (the leading constant 1 in each tuple)
training_set_triple_with_helper = [
    ((1, 1, 0, 0), 1),
    ((1, 1, 0, 1), 1),
    ((1, 1, 1, 0), 1),
    ((1, 1, 1, 1), 0),
    ((1, 0, 0, 0), 1),
    ((1, 0, 0, 1), 1),
    ((1, 0, 1, 0), 1),
    ((1, 0, 1, 1), 1),
]

# we also have to add a 4th weight for the additional helper input
weights_triple_with_helper = [0, 0, 0, 0]

# set alternative training sets and weights here
#training_set = training_set_triple_with_helper
#weights = weights_triple_with_helper

def dot_product(values, weights):
    return sum(value * weight for value, weight in zip(values, weights))

iteration = 0
while True:
    print('-' * 60)
    iteration += 1
    print(f'iteration: {iteration}')
    error_count = 0
    for input_vector, desired_output in training_set:
        # the boolean result counts as 0 or 1 in the error arithmetic
        result = dot_product(input_vector, weights) > threshold
        error = desired_output - result
        print(f'Weights: {weights} Input: {input_vector} Result: {result} Error: {error}')
        if error != 0:
            error_count += 1
            for index, value in enumerate(input_vector):
                weights[index] += learning_rate * error * value
    if error_count == 0 or iteration > 100:
        break
```
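Once the loop above converges, the learned weights implement NAND as a simple thresholded dot product. Below is a minimal, self-contained sketch of how such weights can be checked against the full truth table; the `predict` helper and the example vector `learned_weights` are our own illustrative choices (a weight vector that happens to satisfy NAND with a threshold of 0.5, not necessarily what your training run ends up with):

```python
def predict(input_vector, weights, threshold=0.5):
    # fire (output 1) if the weighted sum exceeds the threshold
    activation = sum(v * w for v, w in zip(input_vector, weights))
    return int(activation > threshold)

# one example weight vector that realises NAND with threshold 0.5;
# the first weight belongs to the constant helper (bias) input
learned_weights = [1.0, -0.3, -0.3]

# check every row of the NAND truth table (helper input first)
for a in (0, 1):
    for b in (0, 1):
        print(f'NAND({a}, {b}) = {predict((1, a, b), learned_weights)}')
```

Because a single perceptron only draws one linear boundary, this works for NAND (which is linearly separable) but would fail for a function like XOR, which is one motivation for the multi-layer networks discussed below.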
## More realistic basic neural networks

When it comes to how artificial digital neurons are actually used in neural networks, we already saw the basic learning principle in the linear regression code in [](./markov_and_linreg.md#getting-into-linear-regression). Kylie Ying gives us some more insights in her [Machine Learning for Everybody](https://www.freecodecamp.org/news/machine-learning-for-everybody/) course, specifically in the following parts:

* [Training Model](https://youtu.be/i_LwzRVP7bg?t=1197) (20:41 to 30:45)
* [Neural networks](https://youtu.be/i_LwzRVP7bg?t=5984) (1:40:00 to 1:47:55)

And a nicely visualised explanation of how a very simple network structure, a kind of multi-layer perceptron, can recognise handwritten digits is provided by _3Blue1Brown_: [But what is a neural network? | Chapter 1, Deep learning](https://www.youtube.com/watch?v=aircAruvnKk)
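As a bridge between the hard-threshold perceptron above and the neurons discussed in these videos: modern artificial neurons typically replace the step function with a smooth activation such as the sigmoid, so that small weight changes produce small output changes, which is what makes gradient-based training possible. The following is a minimal, self-contained sketch of such a neuron and a tiny two-layer forward pass; the function names (`sigmoid`, `neuron`, `forward`) and all weights are made-up illustrative values, not trained ones:

```python
import math

def sigmoid(x):
    # smooth "squashing" activation: maps any real number into (0, 1)
    return 1 / (1 + math.exp(-x))

def neuron(inputs, weights, bias):
    # an artificial neuron: weighted sum plus bias, passed through the activation
    return sigmoid(sum(i * w for i, w in zip(inputs, weights)) + bias)

def forward(x):
    # a tiny 2-2-1 network with hand-picked (hypothetical) weights,
    # just to show how activations flow from layer to layer
    h1 = neuron(x, [0.5, -0.6], 0.1)            # hidden neuron 1
    h2 = neuron(x, [-0.3, 0.8], 0.0)            # hidden neuron 2
    return neuron([h1, h2], [1.0, -1.0], 0.2)   # output neuron

print(forward([1.0, 0.0]))  # some value strictly between 0 and 1
```

Training such a network means nudging all the weights and biases to reduce an error measure, just as the perceptron rule nudged its weights, and this is the gradient-descent idea picked up in the videos above and in Noack & Sanner (2023).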