title

Neural Networks from Scratch - P.2 Coding a Layer

description

Expanding from a single neuron with 3 inputs to a layer of neurons, each with 4 inputs.
Neural Networks from Scratch book: https://nnfs.io
Playlist for this series: https://www.youtube.com/playlist?list=PLQVvvaa0QuDcjD5BAw2DxE6OF2tius3V3
Python 3 basics: https://pythonprogramming.net/introduction-learn-python-3-tutorials/
Intermediate Python (w/ OOP): https://pythonprogramming.net/introduction-intermediate-python-tutorial/
Mug link for fellow mug aficionados: https://amzn.to/2Vz9Hs0
Channel membership: https://www.youtube.com/channel/UCfzlCWGWYyIQ0aLC5w48gBQ/join
Discord: https://discord.gg/sentdex
Support the content: https://pythonprogramming.net/support-donate/
Twitter: https://twitter.com/sentdex
Instagram: https://instagram.com/sentdex
Facebook: https://www.facebook.com/pythonprogramming.net/
Twitch: https://www.twitch.tv/sentdex
#nnfs #python #neuralnetworks

detail

Summary: Covers coding a layer of neurons from scratch: updating the inputs to 1, 2, and 3 and the weights to match the book, initializing weights and biases, explaining the relationship between biases and weights, adding a fourth input, modeling a layer of three neurons, and discussing how tweaking weights and biases changes the output values.

Chapter 1 (0:05-0:41): Building upon the basics
The inputs are changed to 1, 2, and 3 and the weights are updated to match the book exactly, so that anyone following along sees the videos and the book line up.

Chapter 2 (0:41-8:45): Neural network fundamentals
The weights are set to the book's made-up values of 0.2, 0.8, and -0.5 with a bias of 2, and running the code gives the expected output of 2.3. Later the network will randomly initialize its weights, and backpropagation is what will tweak them. A single neuron with three inputs has three weights, one per connection, but only one bias, because every neuron has exactly one bias. Inputs can be raw values from the input layer (for example, readings from a sensor or group of sensors) or the outputs of other neurons. Adding a fourth input of 2.5 still models a single neuron, and therefore still a single bias; the four-input neuron produces an output of 4.8.

Chapter 3 (8:45-14:34): Modeling a layer of three neurons
With three neurons, the output changes from a single value to a list of three values, much like the input, which itself could come from the input layer or from another layer in the network. Each neuron receives the same four inputs but has its own unique weight set and its own bias, and each output is simply the inputs times the weights plus that neuron's bias. Running the code gives the three outputs 4.8, 1.21, and 2.385. Tweaking the weights and biases is how the output values are influenced; figuring out how best to tune them, via backpropagation and gradient calculation, is the crux of deep learning and will be addressed later in the series, as will optimizing this process with NumPy.
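The single neuron described in the video can be sketched in plain Python, using the book's made-up values (inputs 1, 2, 3; weights 0.2, 0.8, -0.5; bias 2):

```python
# A single neuron: one weight per input, one bias per neuron.
inputs = [1, 2, 3]
weights = [0.2, 0.8, -0.5]
bias = 2

# Output = each input times its corresponding weight, summed, plus the bias.
output = (inputs[0] * weights[0]
          + inputs[1] * weights[1]
          + inputs[2] * weights[2]
          + bias)
print(output)  # 2.3
```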
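Adding the fourth input of 2.5 means the neuron needs a fourth weight; the summary above doesn't list it, so the 1.0 below follows the book's value. There is still only one bias, because it's still a single neuron:

```python
# The same neuron with a fourth input: four weights now, still one bias.
inputs = [1, 2, 3, 2.5]
weights = [0.2, 0.8, -0.5, 1.0]  # fourth weight taken from the book
bias = 2

output = (inputs[0] * weights[0]
          + inputs[1] * weights[1]
          + inputs[2] * weights[2]
          + inputs[3] * weights[3]
          + bias)
print(output)  # 4.8
```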
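The layer of three neurons can be sketched with a loop: every neuron sees the same four inputs, but each has its own weight set and its own bias. The weight sets for the second and third neurons aren't spelled out in the summary above, so the lists below follow the book's values:

```python
# A layer of three neurons sharing the same four inputs.
inputs = [1, 2, 3, 2.5]

# One weight set per neuron (one weight per input), plus one bias per neuron.
weights = [[0.2, 0.8, -0.5, 1.0],      # neuron 1
           [0.5, -0.91, 0.26, -0.5],   # neuron 2 (values from the book)
           [-0.26, -0.27, 0.17, 0.87]] # neuron 3 (values from the book)
biases = [2, 3, 0.5]

layer_outputs = []
for neuron_weights, neuron_bias in zip(weights, biases):
    # Output = sum of input*weight pairs, plus this neuron's bias.
    neuron_output = sum(i * w for i, w in zip(inputs, neuron_weights)) + neuron_bias
    layer_outputs.append(neuron_output)

print(layer_outputs)  # [4.8, 1.21, 2.385]
```

Writing the loop this way makes the later move to NumPy natural: the whole layer is just a matrix-vector product plus a bias vector.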