title
10.14: Neural Networks: Backpropagation Part 1 - The Nature of Code

description
In this video, I discuss the backpropagation algorithm as it relates to supervised learning and neural networks.
Next Video: https://youtu.be/r2-P1Fi1g60

This video is part of Chapter 10 of The Nature of Code (http://natureofcode.com/book/chapter-10-neural-networks/)
This video is also part of session 4 of my Spring 2017 ITP "Intelligence and Learning" course (https://github.com/shiffman/NOC-S17-2-Intelligence-Learning/tree/master/week4-neural-networks)

Support this channel on Patreon: https://patreon.com/codingtrain
To buy Coding Train merchandise: https://www.designbyhumans.com/shop/codingtrain/
To donate to the Processing Foundation: https://processingfoundation.org/

Send me your questions and coding challenges!: https://github.com/CodingTrain/Rainbow-Topics

Contact:
Twitter: https://twitter.com/shiffman
The Coding Train website: http://thecodingtrain.com/

Links discussed in this video:
The Coding Train on Amazon: https://www.amazon.com/shop/thecodingtrain
Deeplearn.js: https://deeplearnjs.org/
Sigmoid function on Wikipedia: https://en.wikipedia.org/wiki/Sigmoid_function

Videos mentioned in this video:
My Neural Networks series: https://www.youtube.com/playlist?list=PLRqwX-V7Uu6aCibgK1PTWWu9by6XFdCfh
3Blue1Brown Neural Networks playlist: https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi
3Blue1Brown's Linear Algebra playlist: https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab
Gradient Descent by 3Blue1Brown: https://youtu.be/IHZwWFHWa-w
My Video on Gradient Descent: https://youtu.be/jc2IthslyzM

Source Code for all the Video Lessons: https://github.com/CodingTrain/Rainbow-Code

p5.js: https://p5js.org/
Processing: https://processing.org

The Nature of Code playlist: https://www.youtube.com/user/shiffman/playlists?view_as=subscriber&shelf_id=6&view=50&sort=dd
For More Coding Challenges: https://www.youtube.com/playlist?list=PLRqwX-V7Uu6ZiZxtDDRCi6uhfTH4FilpH
For More Intelligence and Learning: https://www.youtube.com/playlist?list=PLRqwX-V7Uu6YJ3XfHhT2Mm4Y5I99nrIKX

📄 Code of Conduct: https://github.com/CodingTrain/Code-of-Conduct

detail
Summary: Covers supervised learning and introduces backpropagation as the technique for adjusting a neural network's weights based on its output errors. It also discusses the plan for implementing a neural network, how errors are calculated for the hidden layer, and the iterative nature of error calculation and weight adjustment in gradient descent.

Chapter 1 (0:00-2:07): Supervised learning and backpropagation
Supervised learning is introduced: the network's output is compared against a known, correct answer, and the resulting error is used to adjust the system's settings to make it more accurate, the same style of learning seen in earlier videos on linear regression, gradient descent, a simple perceptron, and genetic algorithms. All of the weights inside the network can be thought of as little knobs or settings to be tuned. The catch is that the hidden weights are not connected to the error directly: "They're connected to the thing that's connected to the error. So I need to propagate backwards." Feedforward is the process of moving all the data forward through the network; backpropagation is the process of taking the error and feeding it backwards through the network so that those interior weights can be tuned as well.
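Since the chapter leans on the distinction between the forward pass and the backward pass, here is a minimal sketch of the feedforward step for a single neuron, assuming the sigmoid activation linked in the description. The video's own code is written in JavaScript/p5.js; this sketch uses Python for brevity, and all names and numbers are illustrative, not the video's.

import math

def sigmoid(x):
    # Squashes any input into the range (0, 1); see the sigmoid link above.
    return 1.0 / (1.0 + math.exp(-x))

def feed_forward(inputs, weights, bias):
    # Weighted sum of the inputs, passed through the activation function.
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(total)

# Illustrative values, just to show the shape of the computation.
output = feed_forward([1.0, 0.5], [0.2, 0.1], bias=0.3)
print(output)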
Chapter 2 (2:09-12:32): Implementation overview and the backpropagation algorithm
The speaker admits up front that this is probably the hardest topic tackled in any of these videos, and that his own deep understanding of it is still imperfect. The plan is to give a general overview of how the algorithm works, then look at the matrix math needed to implement it in code, spread over two to three videos. Recommended resources for going deeper include Tariq Rashid's book, 3Blue1Brown's neural network playlist, an online book, and Siraj Raval's YouTube channel, especially for Python and the TensorFlow library.

With the whiteboard erased, the setup is a single output neuron fed, for the sake of simplicity at this moment, by one hidden neuron. The output error is the difference between the desired output and the actual output; in the running example it comes out to 0.3.
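The 0.3 figure comes from that simple formula, error = desired - actual. A one-line sketch, with assumed values since the summary does not record the exact inputs:

desired = 1.0   # the known correct answer from the training data (assumed value)
actual = 0.7    # what the network actually produced (assumed value)
error = desired - actual
print(error)    # the 0.3 output error used in the example (floats may show 0.30000000000000004)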
A key aspect of how the learning process works with gradient descent and backpropagation is figuring out who is responsible for the error. Suppose one incoming connection has weight 0.2 and the other has weight 0.1: the first connection is more responsible for the error, because it has the higher weight, so it should receive a larger share of the correction. The error at the output adjusts the weights feeding the output; if the hidden errors were known, say E2, the error of hidden neuron 2, then the weights coming into that neuron could be adjusted the same way, and with more layers the process would simply keep working backwards. So the real question is: how do you calculate that hidden error?
Chapter 3 (12:32-19:23): Calculating the hidden errors
The answer is to give each hidden neuron a portion of the output error in proportion to its weight, which is almost back to the perceptron again. With weights w1 and w2 feeding the output:

error_hidden1 = w1 / (w1 + w2) * error_output
error_hidden2 = w2 / (w1 + w2) * error_output

With the example weights of 0.2 and 0.1, that split is two thirds and one third (this passage, roughly 13:16-14:00, is the most replayed moment of the video). Once each hidden neuron has an error of its own, the same gradient descent algorithm can be used to tune the weights feeding it.
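A minimal sketch of that split, using the 0.2 and 0.1 weights from the example so the shares come out to two thirds and one third; the variable names are illustrative:

w1, w2 = 0.2, 0.1    # weights from the two hidden neurons into the output
e_output = 0.3       # output error from the earlier example (desired - actual)

# Each hidden neuron receives a share of the output error,
# proportional to how much its weight contributed.
e_hidden1 = w1 / (w1 + w2) * e_output   # two thirds of the error, about 0.2
e_hidden2 = w2 / (w1 + w2) * e_output   # one third of the error, about 0.1
print(e_hidden1, e_hidden2)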
With more than one output neuron, the same idea holds: a hidden neuron's error is still its weighted portion of each output error. How much is a given connection contributing to that 0.3 error at output 1? Its weight divided by the sum of all the weights feeding that output ("weight one one divided by one one plus one two"), times that error. But that is only its contribution to error 1; its contribution to error 2 has to be calculated as well, and the shares combined. That is the heart of backpropagation: the errors at the output layer are divided up according to the connection weights, the resulting hidden errors feed the same gradient descent weight adjustment, and error calculation and weight adjustment repeat, layer by layer, backwards through the network.
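A sketch of that multi-output case, under the assumed convention that weights[j][i] is the weight from hidden neuron i into output neuron j; all numbers are illustrative:

# weights[j][i]: weight from hidden neuron i into output neuron j
weights = [[0.2, 0.1],
           [0.4, 0.3]]
e_out = [0.3, 0.2]   # one error per output neuron

n_hidden = len(weights[0])
e_hidden = [0.0] * n_hidden
for j, e in enumerate(e_out):
    total = sum(weights[j])   # all the weight feeding output j
    for i in range(n_hidden):
        # hidden neuron i's share of output j's error, summed across outputs
        e_hidden[i] += weights[j][i] / total * e
print(e_hidden)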
Highlights:
- Supervised learning corrects the network's output against known answers, the same style of learning as the earlier linear regression, gradient descent, and genetic algorithm examples.
- Backpropagation feeds the output error backwards through the network so that weights not directly connected to the error can still be adjusted.
- The output error is the difference between the desired output and the actual output (0.3 in the running example).
- Hidden errors are portions of the output error, distributed in proportion to the connection weights (two thirds and one third in the example).
- Error calculation and gradient descent weight adjustment repeat backwards through the layers until the whole network can be tuned.
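Finally, since the chapters repeatedly note that the same gradient descent rule tunes the weights once each neuron has an error of its own, here is a hedged sketch of such an update in the perceptron style the video alludes to. The learning rate and all values are illustrative assumptions, and the full neural network update (which also involves the derivative of the sigmoid) is deferred to the later videos in this series.

def adjust_weights(weights, inputs, error, lr=0.1):
    # Perceptron-style nudge: each weight moves in proportion to its
    # input and to the error it helped produce (lr is an assumed learning rate).
    return [w + lr * error * x for w, x in zip(weights, inputs)]

# Once e_hidden1 and e_hidden2 are known (see the sketches above), the same
# rule can nudge the weights feeding each hidden neuron, layer by layer.
print(adjust_weights([0.2, 0.1], [1.0, 0.5], error=0.3))   # [0.23, 0.115]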