title

10.4: Neural Networks: Multilayer Perceptron Part 1 - The Nature of Code

description

In this video, I move beyond the Simple Perceptron and discuss what happens when you build multiple layers of interconnected perceptrons (a "fully connected network") for machine learning.
Next video: https://youtu.be/IlmNhFxre0w
This video is part of Chapter 10 of The Nature of Code (http://natureofcode.com/book/chapter-10-neural-networks/)
This video is also part of session 4 of my Spring 2017 ITP "Intelligence and Learning" course (https://github.com/shiffman/NOC-S17-2-Intelligence-Learning/tree/master/week4-neural-networks)
Support this channel on Patreon: https://patreon.com/codingtrain
To buy Coding Train merchandise: https://www.designbyhumans.com/shop/codingtrain/
To donate to the Processing Foundation: https://processingfoundation.org/
Send me your questions and coding challenges!: https://github.com/CodingTrain/Rainbow-Topics
Contact:
Twitter: https://twitter.com/shiffman
The Coding Train website: http://thecodingtrain.com/
Links discussed in this video:
The Nature of Code: http://natureofcode.com/
Session 4 of Intelligence and Learning: https://github.com/shiffman/NOC-S17-2-Intelligence-Learning/tree/master/week4-neural-networks
Perceptron on Wikipedia: https://en.wikipedia.org/wiki/Perceptron
My Simple Artificial Neural Network JavaScript Library: https://github.com/shiffman/Neural-Network-p5
My video on AND and OR: https://youtu.be/r2S7j54I68c
My video on Perceptrons: https://youtu.be/ntKn5TPHHAk
kwichmann's Learning XOR with a neural net: https://kwichmann.github.io/ml_sandbox/XOR_nn/
Books discussed in this video:
Tariq Rashid's Make Your Own Neural Network: http://amzn.to/2tcVeFS
Marvin Minsky's Perceptrons: http://amzn.to/2u8Jv8f
Source Code for all the Video Lessons: https://github.com/CodingTrain/Rainbow-Code
p5.js: https://p5js.org/
Processing: https://processing.org
The Nature of Code playlist: https://www.youtube.com/user/shiffman/playlists?view_as=subscriber&shelf_id=6&view=50&sort=dd
For More Coding Challenges: https://www.youtube.com/playlist?list=PLRqwX-V7Uu6ZiZxtDDRCi6uhfTH4FilpH
For More Intelligence and Learning: https://www.youtube.com/playlist?list=PLRqwX-V7Uu6YJ3XfHhT2Mm4Y5I99nrIKX
Help us caption & translate this video!
http://amara.org/v/8JC0/
📄 Code of Conduct: https://github.com/CodingTrain/Code-of-Conduct

detail

{'title': '10.4: Neural Networks: Multilayer Perceptron Part 1 - The Nature of Code', 'heatmap': [{'end': 130.068, 'start': 25.462, 'weight': 0.743}, {'end': 204.239, 'start': 185.233, 'weight': 0.73}, {'end': 709.002, 'start': 622.563, 'weight': 0.729}], 'summary': 'Explores the limitations of single perceptrons and the need for multi-layered perceptrons to solve non-linearly separable problems, discussing the structure of multi-layered perceptrons, the significance of hidden layers, and the feed forward algorithm.', 'chapters': [{'end': 191.555, 'segs': [{'end': 50.777, 'src': 'embed', 'start': 25.462, 'weight': 0, 'content': [{'end': 34.19, 'text': 'And we drew a line in between and we were trying to classify some points that are on one side of the line and some other points that are on another side of the line.', 'start': 25.462, 'duration': 8.728}, {'end': 40.072, 'text': 'So that was a scenario where we had the single perceptron, the sort of like processing unit.', 'start': 35.089, 'duration': 4.983}, {'end': 42.973, 'text': 'We can call it the neuron or the processor.', 'start': 40.112, 'duration': 2.861}, {'end': 44.594, 'text': 'And it received inputs.', 'start': 43.353, 'duration': 1.241}, {'end': 50.777, 'text': 'It had like x0 and x1 were like the x and y coordinates of the point.', 'start': 45.494, 'duration': 5.283}], 'summary': 'Using single perceptron to classify points based on a line', 'duration': 25.315, 'max_score': 25.462, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/u5GAVdLQyIg/pics/u5GAVdLQyIg25462.jpg'}, {'end': 169.323, 'src': 'heatmap', 'start': 25.462, 'weight': 2, 'content': [{'end': 34.19, 'text': 'And we drew a line in between and we were trying to classify some points that are on one side of the line and some other points that are on another side of the line.', 'start': 25.462, 'duration': 8.728}, {'end': 40.072, 'text': 'So that was a scenario where we had the single perceptron, the sort of like 
processing unit.', 'start': 35.089, 'duration': 4.983}, {'end': 42.973, 'text': 'We can call it the neuron or the processor.', 'start': 40.112, 'duration': 2.861}, {'end': 44.594, 'text': 'And it received inputs.', 'start': 43.353, 'duration': 1.241}, {'end': 50.777, 'text': 'It had like x0 and x1 were like the x and y coordinates of the point.', 'start': 45.494, 'duration': 5.283}, {'end': 53.258, 'text': 'It also had this thing called a bias.', 'start': 51.537, 'duration': 1.721}, {'end': 56.32, 'text': 'And then it generated an output.', 'start': 54.459, 'duration': 1.861}, {'end': 62.083, 'text': 'Each one of these inputs was connected to the processor with a weight.', 'start': 58.881, 'duration': 3.202}, {'end': 67.497, 'text': 'weight 1, weight 2, or whatever, weight, weight, weight.', 'start': 65.174, 'duration': 2.323}, {'end': 73.625, 'text': 'And the processor creates a weighted sum of all the inputs multiplied by the weights.', 'start': 68.238, 'duration': 5.387}, {'end': 80.413, 'text': 'That weighted sum is passed through an activation function to generate the output.', 'start': 74.125, 'duration': 6.288}, {'end': 83.256, 'text': "So why isn't this good enough?", 'start': 81.194, 'duration': 2.062}, {'end': 90.113, 'text': "Let's first think about what's the limit here?", 'start': 86.371, 'duration': 3.742}, {'end': 97.837, 'text': 'So the idea is that what if I want any number of inputs to generate any number of outputs?', 'start': 90.173, 'duration': 7.664}, {'end': 105.16, 'text': "That's the essence of what I want to do in a lot of different machine learning applications.", 'start': 99.537, 'duration': 5.623}, {'end': 115.025, 'text': "Let's take a very classic classification algorithm, which is to say OK, well, what if I have a handwritten digit like the number 8,", 'start': 105.2, 'duration': 9.825}, {'end': 121.849, 'text': 'and I have all of the pixels of this digit and I want those to be the inputs to this perceptron?', 'start': 115.025, 
'duration': 6.824}, {'end': 130.068, 'text': 'and I want the output to tell me a set of probabilities as to which digit it is?', 'start': 121.849, 'duration': 8.219}, {'end': 133.43, 'text': "So the output should look something like there's a 0.1 chance it's a 0.", 'start': 130.469, 'duration': 2.961}, {'end': 134.85, 'text': "There's a 0.2 chance it's a 1.", 'start': 133.43, 'duration': 1.42}, {'end': 135.988, 'text': "There's a 0.1 chance it's a 2, 0, 3, 4, 5, 6, 7.", 'start': 134.85, 'duration': 1.138}, {'end': 138.391, 'text': "Oh, and there's like a 0.99 chance it's an 8 and a 0.05 chance it's a 10.", 'start': 135.991, 'duration': 2.4}, {'end': 140.992, 'text': "And I don't think I got those to add up to 1, but you get the idea.", 'start': 138.391, 'duration': 2.601}, {'end': 161.619, 'text': 'So the idea here is that we want to be able to have some type of processing unit that can take an arbitrary amount of inputs.', 'start': 153.675, 'duration': 7.944}, {'end': 166.021, 'text': 'Like maybe this is a 28 by 28 pixel image.', 'start': 161.859, 'duration': 4.162}, {'end': 169.323, 'text': "So there's 784 grayscale values.", 'start': 166.381, 'duration': 2.942}], 'summary': 'Single perceptron processes inputs to generate outputs, but not suitable for handling arbitrary number of inputs and outputs, as needed in many machine learning applications.', 'duration': 88.129, 'max_score': 25.462, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/u5GAVdLQyIg/pics/u5GAVdLQyIg25462.jpg'}], 'start': 0.289, 'title': 'Perceptron limitations and neural network exploration', 'summary': 'Delves into the limitations of a single perceptron, its components, and activation function, stressing the necessity to explore advanced neural network architectures. 
it addresses the challenges of using a single processing unit for handling multiple inputs and outputs in machine learning, exemplified by a handwritten digit classification algorithm.', 'chapters': [{'end': 80.413, 'start': 0.289, 'title': 'Coding a perceptron and beyond', 'summary': 'Discusses the concept of a single perceptron, its components (inputs, weights, bias), and the activation function, emphasizing the potential limitations and the need to explore more advanced neural network architectures.', 'duration': 80.124, 'highlights': ['The chapter emphasizes the concept of a single perceptron, its components (inputs, weights, bias), and the activation function. It explains the functionality of a single perceptron, including its inputs, weights, bias, and the activation function.', 'It discusses the limitations of a single perceptron and the need to explore more advanced neural network architectures. The chapter highlights the potential limitations of a single perceptron and underscores the importance of exploring more advanced neural network architectures.']}, {'end': 191.555, 'start': 81.194, 'title': 'Limitations of single processing unit', 'summary': 'Discusses the limitations of using a single processing unit to handle an arbitrary number of inputs and outputs in machine learning applications, using the example of a handwritten digit classification algorithm that requires a perceptron to produce a set of probabilities for each digit based on its input pixels.', 'duration': 110.361, 'highlights': ['The essence of the limitation lies in the need to handle any number of inputs to generate any number of outputs in various machine learning applications. 
This highlights the fundamental challenge of handling a variable number of inputs and outputs in machine learning, emphasizing the need for flexible processing units.', 'The example of a handwritten digit classification algorithm showcases the requirement for a processing unit to provide a set of probabilities for each digit based on its input pixels. The specific example of classifying handwritten digits illustrates the practical need for a processing unit to produce a range of probabilities based on input data, demonstrating the real-world relevance of the discussed limitation.', 'The mention of a 28 by 28 pixel image, containing 784 grayscale values, emphasizes the challenge of handling a large number of inputs in machine learning applications. This highlights the specific technical challenge of dealing with a high volume of input data in machine learning, illustrating the complexity and scale of the input requirements that the processing unit needs to handle.']}], 'duration': 191.266, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/u5GAVdLQyIg/pics/u5GAVdLQyIg289.jpg', 'highlights': ['The chapter emphasizes the concept of a single perceptron, its components (inputs, weights, bias), and the activation function.', 'The chapter highlights the potential limitations of a single perceptron and underscores the importance of exploring more advanced neural network architectures.', 'The example of a handwritten digit classification algorithm showcases the requirement for a processing unit to provide a set of probabilities for each digit based on its input pixels.', 'The specific example of classifying handwritten digits illustrates the practical need for a processing unit to produce a range of probabilities based on input data, demonstrating the real-world relevance of the discussed limitation.', 'The mention of a 28 by 28 pixel image, containing 784 grayscale values, emphasizes the challenge of handling a large number of inputs in machine 
learning applications.']}, {'end': 739.654, 'segs': [{'end': 258.079, 'src': 'embed', 'start': 226.312, 'weight': 1, 'content': [{'end': 230.336, 'text': "So what does that mean anyway? And why should you care about that? So let's think about this.", 'start': 226.312, 'duration': 4.024}, {'end': 244.309, 'text': 'This over here is a linearly separable problem, meaning I need to classify this stuff And if I were to visualize all that stuff,', 'start': 231.276, 'duration': 13.033}, {'end': 253.757, 'text': "I can draw a line in between this stuff that's of this class and this stuff that's with this class.", 'start': 244.309, 'duration': 9.448}, {'end': 256.779, 'text': 'The stuff itself is separable by a line.', 'start': 254.137, 'duration': 2.642}, {'end': 258.079, 'text': 'In three dimensions.', 'start': 256.978, 'duration': 1.101}], 'summary': 'The problem is linearly separable, requiring classification and visualized as separable by a line in three dimensions.', 'duration': 31.767, 'max_score': 226.312, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/u5GAVdLQyIg/pics/u5GAVdLQyIg226312.jpg'}, {'end': 374.208, 'src': 'embed', 'start': 333.673, 'weight': 2, 'content': [{'end': 345.358, 'text': 'Well, in those videos I talked about operations like AND and OR, which in computer programming syntax are often written double ampersand or two pipes.', 'start': 333.673, 'duration': 11.685}, {'end': 357.364, 'text': 'The idea being that if I were to make a truth table, true, true, false, false.', 'start': 346.079, 'duration': 11.285}, {'end': 360.385, 'text': "So what I'm doing now is I'm showing you a truth table.", 'start': 357.864, 'duration': 2.521}, {'end': 363.487, 'text': 'I have two elements.', 'start': 360.525, 'duration': 2.962}, {'end': 374.208, 'text': "I'm saying, what if I say A and the B? 
So if A is true, whoa, whoa, whoa.", 'start': 366.761, 'duration': 7.447}], 'summary': 'Explained logical operations and truth tables in computer programming.', 'duration': 40.535, 'max_score': 333.673, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/u5GAVdLQyIg/pics/u5GAVdLQyIg333673.jpg'}, {'end': 488.588, 'src': 'embed', 'start': 459.819, 'weight': 3, 'content': [{'end': 461.82, 'text': 'And or now, ah, all of these become true.', 'start': 459.819, 'duration': 2.001}, {'end': 470.244, 'text': 'Because with an or operation, A or B, I only need one of these to be true in order to get true.', 'start': 462.1, 'duration': 8.144}, {'end': 471.965, 'text': 'But if both are false, I get false.', 'start': 470.304, 'duration': 1.661}, {'end': 475.527, 'text': 'And guess what? Still a linearly separable problem.', 'start': 472.085, 'duration': 3.442}, {'end': 478.888, 'text': 'And is literally separable.', 'start': 476.967, 'duration': 1.921}, {'end': 481.049, 'text': 'Or is literally separable.', 'start': 479.469, 'duration': 1.58}, {'end': 484.931, 'text': 'We could have a perceptron learn to do both of those things.', 'start': 481.31, 'duration': 3.621}, {'end': 488.588, 'text': 'Hold on a second.', 'start': 487.668, 'duration': 0.92}], 'summary': 'An or operation requires only one true input, still a linearly separable problem.', 'duration': 28.769, 'max_score': 459.819, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/u5GAVdLQyIg/pics/u5GAVdLQyIg459819.jpg'}, {'end': 726.499, 'src': 'heatmap', 'start': 622.563, 'weight': 0, 'content': [{'end': 625.944, 'text': 'And this is a perceptron that knows how to solve OR.', 'start': 622.563, 'duration': 3.381}, {'end': 633.905, 'text': 'What if I took those same inputs and sent them into both? 
And then I got the output here.', 'start': 626.944, 'duration': 6.961}, {'end': 640.187, 'text': 'So this output would give me the result of AND.', 'start': 634.305, 'duration': 5.882}, {'end': 643.347, 'text': 'And this output would give me the result of OR.', 'start': 640.727, 'duration': 2.62}, {'end': 650.689, 'text': 'Well, what is XOR really? XOR is actually OR but not AND.', 'start': 643.767, 'duration': 6.922}, {'end': 659.748, 'text': 'Right? So if I can solve something and is linearly separable, not and is also linearly separable.', 'start': 651.983, 'duration': 7.765}, {'end': 670.188, 'text': 'So what I want then is for both of these outputs actually to go into another perceptron that would then be and.', 'start': 660.449, 'duration': 9.739}, {'end': 680.53, 'text': 'So if this perceptron can solve not and and this perceptron can solve or and those outputs can come into here, then this would be the result of both,', 'start': 671.068, 'duration': 9.462}, {'end': 683.95, 'text': 'or is true and not, and is true, which is actually this', 'start': 680.53, 'duration': 3.42}, {'end': 688.471, 'text': 'These are the only two things where or is true but not and.', 'start': 684.411, 'duration': 4.06}, {'end': 698.893, 'text': 'And so the idea here is that more complex problems that are not linearly separable can be solved by linking multiple perceptrons together.', 'start': 688.891, 'duration': 10.002}, {'end': 704.238, 'text': 'And this is the idea of a multi-layered perceptron.', 'start': 699.133, 'duration': 5.105}, {'end': 709.002, 'text': 'We have multiple layers.', 'start': 707.501, 'duration': 1.501}, {'end': 711.244, 'text': 'And this is still a very simple diagram.', 'start': 709.042, 'duration': 2.202}, {'end': 714.687, 'text': 'You could think of this almost as like if you were designing a circuit right?', 'start': 711.284, 'duration': 3.403}, {'end': 726.499, 'text': 'If you decide whether electricity should flow and these were switches, how could 
you have an LED turn on with exclusive ore?', 'start': 714.908, 'duration': 11.591}], 'summary': "Perceptrons combined for xor, showing multi-layered perception's capabilities.", 'duration': 103.936, 'max_score': 622.563, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/u5GAVdLQyIg/pics/u5GAVdLQyIg622563.jpg'}], 'start': 192.376, 'title': 'Perceptrons and multi-layered perceptron', 'summary': 'Discusses limitations of simple perceptrons in solving linearly separable problems, the need for multi-layered perceptrons to solve non-linearly separable problems, and the comparison of and and or operations in a truth table. it also covers the concept of linearly separable problems and how perceptrons can be used to solve boolean operations like and, or, and xor, and the idea of multi-layered perceptrons to solve more complex problems that are not linearly separable.', 'chapters': [{'end': 409.546, 'start': 192.376, 'title': 'Perceptrons and multi-layered perceptron', 'summary': 'Highlights the limitations of a simple perceptron in solving linearly separable problems, the need for multi-layered perceptrons to solve non-linearly separable problems, and the comparison of and and or operations in a truth table.', 'duration': 217.17, 'highlights': ["The book 'Perceptrons' by Marvin Minsky and Seymour Papert published in 1969 discusses the limitations of a simple perceptron in solving linearly separable problems.", 'Linearly separable problems can be solved by drawing a line or plane to separate the classes, while most interesting problems are not linearly separable.', 'The need for multi-layered perceptrons is emphasized to address non-linearly separable problems such as the XOR problem, in contrast to the limitations of a simple perceptron.', 'Comparison of AND and OR operations in a truth table is demonstrated, showcasing the different results based on logical inputs.']}, {'end': 739.654, 'start': 409.566, 'title': 'Boolean operations and 
perceptrons', 'summary': 'Explains the concept of linearly separable problems and how perceptrons can be used to solve boolean operations like and, or, and xor, and the idea of multi-layered perceptrons to solve more complex problems that are not linearly separable.', 'duration': 330.088, 'highlights': ['Perceptrons can solve linearly separable problems like AND and OR by drawing a line to separate the true and false values, and XOR is not linearly separable, requiring multiple perceptrons and a multi-layered perceptron to solve. Perceptrons can solve linearly separable problems like AND and OR.', 'XOR is an exclusive OR operation, and it is not linearly separable, requiring multiple perceptrons and a multi-layered perceptron to solve. XOR is not linearly separable, requiring multiple perceptrons and a multi-layered perceptron to solve.', 'The concept of multi-layered perceptrons is introduced as a solution for solving more complex problems that are not linearly separable by linking multiple perceptrons together. 
Introduction of multi-layered perceptrons as a solution for solving more complex problems.']}], 'duration': 547.278, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/u5GAVdLQyIg/pics/u5GAVdLQyIg192376.jpg', 'highlights': ['The need for multi-layered perceptrons to address non-linearly separable problems is emphasized.', 'Linearly separable problems can be solved by drawing a line or plane to separate the classes.', 'Comparison of AND and OR operations in a truth table is demonstrated, showcasing different results based on logical inputs.', 'Perceptrons can solve linearly separable problems like AND and OR by drawing a line to separate the true and false values.', 'The concept of multi-layered perceptrons is introduced as a solution for solving more complex problems that are not linearly separable.']}, {'end': 943.008, 'segs': [{'end': 799.583, 'src': 'embed', 'start': 739.654, 'weight': 2, 'content': [{'end': 743.779, 'text': 'take that previous perceptron example and just take it a few steps farther to do exactly this.', 'start': 739.654, 'duration': 4.125}, {'end': 755.146, 'text': "But what I'm going to do actually in the next videos is diagram out this structure of a multi-layered perceptron how the inputs, how the outputs work,", 'start': 744.724, 'duration': 10.422}, {'end': 762.868, 'text': 'how the feed forward algorithm works, where the inputs come in, get multiplied by weights, get summed together and generate an output,', 'start': 755.146, 'duration': 7.722}, {'end': 768.889, 'text': 'and build a simple JavaScript library that has all the pieces of that neural network system in it.', 'start': 762.868, 'duration': 6.021}, {'end': 776.9, 'text': 'So I hope that this video kind of gives you a nice follow up from the perceptron and a sense of why this is important.', 'start': 770.21, 'duration': 6.69}, {'end': 779.503, 'text': "And I'm not sure if I'm done yet.", 'start': 777.761, 'duration': 1.742}, {'end': 783.349, 'text': "I'm 
going to go check the live chat and see if there are any questions or important things that I missed.", 'start': 779.643, 'duration': 3.706}, {'end': 784.61, 'text': 'And then this video will be over.', 'start': 783.649, 'duration': 0.961}, {'end': 785.571, 'text': "OK, I'm back.", 'start': 785.251, 'duration': 0.32}, {'end': 787.273, 'text': 'So there was one question which is important.', 'start': 785.611, 'duration': 1.662}, {'end': 790.655, 'text': 'Like oh, I heard somebody in the chat ask what about the hidden layer?', 'start': 787.293, 'duration': 3.362}, {'end': 794.939, 'text': "And so this is jumping ahead a little bit, because I'm going to get to this in more detail in the next video.", 'start': 790.936, 'duration': 4.003}, {'end': 797.741, 'text': 'The way that I drew this diagram is pretty awkward.', 'start': 795.319, 'duration': 2.422}, {'end': 799.583, 'text': 'Let me try to fix this up for a second.', 'start': 797.761, 'duration': 1.822}], 'summary': 'Exploring multi-layered perceptron structure and feed forward algorithm in javascript library development.', 'duration': 59.929, 'max_score': 739.654, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/u5GAVdLQyIg/pics/u5GAVdLQyIg739654.jpg'}, {'end': 883.851, 'src': 'embed', 'start': 851.831, 'weight': 0, 'content': [{'end': 856.414, 'text': "And they're called hidden because as a kind of user of the system, we don't necessarily see them.", 'start': 851.831, 'duration': 4.583}, {'end': 860.017, 'text': 'A user of the system is feeding in data and looking at the output.', 'start': 856.734, 'duration': 3.283}, {'end': 862.939, 'text': 'The hidden layer, in a sense, is where the magic happens.', 'start': 860.277, 'duration': 2.662}, {'end': 868.523, 'text': 'The hidden layer is what allows one to get around this sort of linearly separable question.', 'start': 863.219, 'duration': 5.304}, {'end': 876.667, 'text': 'So, the more hidden layers, the more neurons, the more amount of 
complexity, in a way that the system the more weights,', 'start': 868.983, 'duration': 7.684}, {'end': 878.468, 'text': 'the more parameters that need to be tweaked.', 'start': 876.667, 'duration': 1.801}, {'end': 881.569, 'text': "And we'll see that as they start to build the neural network library.", 'start': 878.688, 'duration': 2.881}, {'end': 883.851, 'text': 'The way that I want that library to be set up.', 'start': 881.87, 'duration': 1.981}], 'summary': 'Hidden layers in neural networks enable complexity and flexibility, requiring more parameters and weights to be adjusted.', 'duration': 32.02, 'max_score': 851.831, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/u5GAVdLQyIg/pics/u5GAVdLQyIg851831.jpg'}, {'end': 936.582, 'src': 'embed', 'start': 904.24, 'weight': 3, 'content': [{'end': 905.62, 'text': "That's something called a recurrent network.", 'start': 904.24, 'duration': 1.38}, {'end': 914.023, 'text': 'Convolutional network is if this kind of like set of image processing operations almost happens early on before as one of the layers.', 'start': 905.9, 'duration': 8.123}, {'end': 915.344, 'text': "So there's a lot of stuff.", 'start': 914.163, 'duration': 1.181}, {'end': 917.786, 'text': 'in the grand scheme of things to get to.', 'start': 916.104, 'duration': 1.682}, {'end': 920.208, 'text': 'But this is the fundamental building blocks.', 'start': 918.046, 'duration': 2.162}, {'end': 921.309, 'text': 'So OK.', 'start': 920.808, 'duration': 0.501}, {'end': 924.672, 'text': "So in the next video, I'm going to start building the library.", 'start': 921.409, 'duration': 3.263}, {'end': 927.374, 'text': 'And to be honest, I think what I need to do.', 'start': 924.792, 'duration': 2.582}, {'end': 928.695, 'text': 'No, no, no, no.', 'start': 927.394, 'duration': 1.301}, {'end': 936.582, 'text': "Yeah In the next video, I'm going to set up the basic skeleton of the neural network library and look at all the pieces that we need.", 
'start': 929.236, 'duration': 7.346}], 'summary': 'Discussion about recurrent and convolutional networks, and plans to build a neural network library in the next video.', 'duration': 32.342, 'max_score': 904.24, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/u5GAVdLQyIg/pics/u5GAVdLQyIg904240.jpg'}], 'start': 739.654, 'title': 'Neural network structure and building blocks', 'summary': 'Discusses the structure of a multi-layered perceptron, the concept of a hidden layer, and the impact of hidden layers on system complexity. it also explores the feed forward algorithm and emphasizes the need to tweak more parameters with an increase in hidden layers.', 'chapters': [{'end': 799.583, 'start': 739.654, 'title': 'Understanding multi-layered perceptron', 'summary': 'Discusses the structure of a multi-layered perceptron, the feed forward algorithm, and the importance of building a simple javascript library for neural network system. it also addresses the concept of a hidden layer and potential changes to the diagram.', 'duration': 59.929, 'highlights': ['The chapter discusses the structure of a multi-layered perceptron and the feed forward algorithm.', 'The importance of building a simple JavaScript library for a neural network system is emphasized.', 'The concept of a hidden layer and potential changes to the diagram are mentioned.']}, {'end': 943.008, 'start': 801.804, 'title': 'Neural network building blocks', 'summary': 'Introduces the concept of a three-layer neural network, explaining the input, hidden, and output layers, and emphasizes the impact of hidden layers on the complexity of the system, indicating the need to tweak more parameters as the number of hidden layers increases.', 'duration': 141.204, 'highlights': ["The hidden layer in a neural network is where the 'magic' happens, allowing the system to solve non-linearly separable problems with more complexity and parameters to be tweaked.", 'The number of hidden layers and neurons 
impacts the complexity of the system, requiring more weights and parameters to be adjusted, as demonstrated when building the neural network library.', 'The chapter sets the stage for further exploration of network configurations, including multiple hidden layers, recurrent networks, and convolutional networks, providing a glimpse of the fundamental building blocks of neural networks.']}], 'duration': 203.354, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/u5GAVdLQyIg/pics/u5GAVdLQyIg739654.jpg', 'highlights': ["The hidden layer in a neural network is where the 'magic' happens, allowing the system to solve non-linearly separable problems with more complexity and parameters to be tweaked.", 'The number of hidden layers and neurons impacts the complexity of the system, requiring more weights and parameters to be adjusted, as demonstrated when building the neural network library.', 'The chapter discusses the structure of a multi-layered perceptron and the feed forward algorithm.', 'The chapter sets the stage for further exploration of network configurations, including multiple hidden layers, recurrent networks, and convolutional networks, providing a glimpse of the fundamental building blocks of neural networks.', 'The concept of a hidden layer and potential changes to the diagram are mentioned.', 'The importance of building a simple JavaScript library for a neural network system is emphasized.']}], 'highlights': ["The hidden layer in a neural network is where the 'magic' happens, allowing the system to solve non-linearly separable problems with more complexity and parameters to be tweaked.", 'The need for multi-layered perceptrons to address non-linearly separable problems is emphasized.', 'The chapter sets the stage for further exploration of network configurations, including multiple hidden layers, recurrent networks, and convolutional networks, providing a glimpse of the fundamental building blocks of neural networks.', 'The number of 
hidden layers and neurons impacts the complexity of the system, requiring more weights and parameters to be adjusted, as demonstrated when building the neural network library.', 'The chapter discusses the structure of a multi-layered perceptron and the feed forward algorithm.', 'The concept of multi-layered perceptrons is introduced as a solution for solving more complex problems that are not linearly separable.', 'The chapter emphasizes the concept of a single perceptron, its components (inputs, weights, bias), and the activation function.', 'The chapter highlights the potential limitations of a single perceptron and underscores the importance of exploring more advanced neural network architectures.', 'The example of a handwritten digit classification algorithm showcases the requirement for a processing unit to provide a set of probabilities for each digit based on its input pixels.', 'The specific example of classifying handwritten digits illustrates the practical need for a processing unit to produce a range of probabilities based on input data, demonstrating the real-world relevance of the discussed limitation.']}
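The transcript above describes the single perceptron: each input is multiplied by a weight, the weighted values are summed together with a bias, and the sum is passed through an activation function to produce the output. A minimal sketch in JavaScript (the language of the video's library; the function names and step activation here are illustrative, not code from the actual Neural-Network-p5 library):

```javascript
// Step activation: fires 1 when the weighted sum is non-negative.
function activate(sum) {
  return sum >= 0 ? 1 : 0;
}

// A single perceptron: weighted sum of inputs plus bias, then activation.
function perceptron(inputs, weights, bias) {
  let sum = bias;
  for (let i = 0; i < inputs.length; i++) {
    sum += inputs[i] * weights[i];
  }
  return activate(sum);
}

// With weights [1, 1] and bias -1.5, this single unit computes AND:
console.log(perceptron([1, 1], [1, 1], -1.5)); // 1
console.log(perceptron([1, 0], [1, 1], -1.5)); // 0
```

This is exactly the "weighted sum passed through an activation function" pipeline the transcript walks through, and it is enough for linearly separable problems like AND and OR.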
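The transcript's key argument is that XOR is "OR but not AND": feed the same two inputs into an OR unit and a NOT-AND unit, then send both outputs into an AND unit. A hedged sketch of that wiring, with hand-picked gate weights (these particular weight and bias values are my own illustrative choices, not values learned or shown in the video):

```javascript
// Step activation shared by every unit.
function step(sum) {
  return sum >= 0 ? 1 : 0;
}

// One perceptron-like unit: weighted sum plus bias, then step activation.
function unit(inputs, weights, bias) {
  const sum = inputs.reduce((acc, x, i) => acc + x * weights[i], bias);
  return step(sum);
}

// Hand-picked weights that make single units behave as logic gates:
const OR   = (a, b) => unit([a, b], [1, 1], -0.5);
const NAND = (a, b) => unit([a, b], [-1, -1], 1.5);
const AND  = (a, b) => unit([a, b], [1, 1], -1.5);

// XOR = (A OR B) AND NOT (A AND B): two first-layer units feed an AND unit.
const XOR = (a, b) => AND(OR(a, b), NAND(a, b));

for (const [a, b] of [[0, 0], [0, 1], [1, 0], [1, 1]]) {
  console.log(`${a} XOR ${b} = ${XOR(a, b)}`);
}
```

No single unit here can compute XOR on its own, since XOR is not linearly separable; linking the units into two layers is what makes it solvable, which is the whole point of the multi-layered perceptron.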
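The final chapters describe the feed forward algorithm through a three-layer network (input, hidden, output): each layer's neurons compute weighted sums of the previous layer's values and pass them through an activation function, with the hidden layer being "where the magic happens." A minimal sketch of that flow (the sigmoid activation, `feedForward` name, and all weight values are illustrative assumptions, not taken from the video's library):

```javascript
// Sigmoid squashes a weighted sum into the range (0, 1).
function sigmoid(x) {
  return 1 / (1 + Math.exp(-x));
}

// One layer: neuron j computes sigmoid(bias[j] + sum_i inputs[i] * weights[j][i]).
function layer(inputs, weights, biases) {
  return weights.map((row, j) =>
    sigmoid(row.reduce((sum, w, i) => sum + w * inputs[i], biases[j]))
  );
}

// Feed forward: inputs -> hidden layer -> output layer.
function feedForward(inputs, net) {
  const hidden = layer(inputs, net.hiddenWeights, net.hiddenBiases);
  return layer(hidden, net.outputWeights, net.outputBiases);
}

// A tiny 2-input, 2-hidden, 1-output network with arbitrary weights:
const net = {
  hiddenWeights: [[0.5, -0.3], [0.8, 0.2]],
  hiddenBiases: [0.1, -0.1],
  outputWeights: [[1.0, -1.0]],
  outputBiases: [0.0],
};
console.log(feedForward([1, 0], net)); // one value between 0 and 1
```

Scaling this same structure up is what the digit example in the transcript needs: 784 inputs (one per grayscale pixel of a 28-by-28 image) feeding a hidden layer, with 10 outputs giving a probability-like score for each digit.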