title
10.3: Neural Networks: Perceptron Part 2 - The Nature of Code

description
This is a follow-up to my Perceptron Video (https://youtu.be/ntKn5TPHHAk)
This video is part of Chapter 10 of The Nature of Code (http://natureofcode.com/book/chapter-10-neural-networks/)
This video is also part of session 4 of my Spring 2017 ITP "Intelligence and Learning" course (https://github.com/shiffman/NOC-S17-2-Intelligence-Learning/tree/master/week4-neural-networks)

Source Code from my first Perceptron Coding Challenge: https://github.com/CodingTrain/Rainbow-Code/tree/master/CodingChallenges/CC_72_SimplePerceptron

Simple Perceptron code examples:
p5.js: https://github.com/shiffman/The-Nature-of-Code-Examples-p5.js/tree/master/chp10_nn/NOC_10_01_Perceptron
Processing: https://github.com/shiffman/The-Nature-of-Code-Examples/tree/master/chp10_nn/NOC_10_01_SimplePerceptron

Support this channel on Patreon: https://patreon.com/codingtrain
To buy Coding Train merchandise: https://www.designbyhumans.com/shop/codingtrain/
To donate to the Processing Foundation: https://processingfoundation.org/

Send me your questions and coding challenges!: https://github.com/CodingTrain/Rainbow-Topics

Contact:
Twitter: https://twitter.com/shiffman
The Coding Train website: http://thecodingtrain.com/

Links discussed in this video:
My video on the map() function: https://youtu.be/nicMAoW6u1g
My video explaining constructor overloading: http://youtu.be/V7k5bFQbhG0
My Perceptron Coding Challenge: https://youtu.be/ntKn5TPHHAk
Session 4 of Intelligence and Learning: https://github.com/shiffman/NOC-S17-2-Intelligence-Learning/tree/master/week4-neural-networks
Perceptron on Wikipedia: https://en.wikipedia.org/wiki/Perceptron

Source Code for all Video Lessons: https://github.com/CodingTrain/Rainbow-Code

p5.js: https://p5js.org/
Processing: https://processing.org

For More Coding Challenges: https://www.youtube.com/playlist?list=PLRqwX-V7Uu6ZiZxtDDRCi6uhfTH4FilpH
For More Intelligence and Learning: https://www.youtube.com/playlist?list=PLRqwX-V7Uu6YJ3XfHhT2Mm4Y5I99nrIKX

Help us caption & translate this video! http://amara.org/v/7wh0/

📄 Code of Conduct: https://github.com/CodingTrain/Code-of-Conduct

detail
{'title': '10.3: Neural Networks: Perceptron Part 2 - The Nature of Code', 'heatmap': [{'end': 1130.962, 'start': 1074.088, 'weight': 0.738}], 'summary': 'Series explores refining a simple perceptron model for creating more sophisticated possibilities, using neural network perceptron to process raw pixel coordinates, training perceptron to recognize linear functions, and debugging the perceptron algorithm while exploring the impact of learning rates in machine learning.', 'chapters': [{'end': 57.456, 'segs': [{'end': 29.34, 'src': 'embed', 'start': 1.527, 'weight': 0, 'content': [{'end': 7.71, 'text': 'Hello Welcome to a follow up on my previous perceptron coding challenge.', 'start': 1.527, 'duration': 6.183}, {'end': 11.812, 'text': "So if you happen to watch the previous one, and if you hadn't, you probably should go back and watch it.", 'start': 7.77, 'duration': 4.042}, {'end': 12.612, 'text': 'Link in the description.', 'start': 11.852, 'duration': 0.76}, {'end': 15.334, 'text': 'I created a simple perceptron.', 'start': 13.373, 'duration': 1.961}, {'end': 21.617, 'text': 'A perceptron is a model of a single neuron that receives inputs and then produces an output.', 'start': 15.374, 'duration': 6.243}, {'end': 29.34, 'text': 'And this is a very simple scenario where the output is only trying to guess whether a point is on one side of the line or the other.', 'start': 21.717, 'duration': 7.623}], 'summary': "Follow-up on perceptron coding challenge, creating a simple model to guess points' positions.", 'duration': 27.813, 'max_score': 1.527, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac1527.jpg'}, {'end': 65.76, 'src': 'embed', 'start': 38.812, 'weight': 1, 'content': [{'end': 46.323, 'text': 'but make this example a bit more sophisticated and allow for some more hopefully some possibly some more creative possibilities.', 'start': 38.812, 'duration': 7.511}, {'end': 51.273, 'text': "So first thing, I got a bunch of things I'm going to do.", 'start': 47.791, 'duration': 3.482}, {'end': 52.974, 'text': "I would list them all, but I can't remember what they are.", 'start': 51.293, 'duration': 1.681}, {'end': 54.414, 'text': "So I'm just going to tell you what the first thing is.", 'start': 52.994, 'duration': 1.42}, {'end': 57.456, 'text': 'The first thing is, let me come over here to the whiteboard.', 'start': 54.795, 'duration': 2.661}, {'end': 65.76, 'text': 'So what I did with the first perceptron was just use the raw pixel coordinates of the processing window.', 'start': 58.016, 'duration': 7.744}], 'summary': 'Introducing more sophisticated and creative possibilities for the first perceptron.', 'duration': 26.948, 'max_score': 38.812, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac38812.jpg'}], 'start': 1.527, 'title': 'Refining the perceptron model', 'summary': 'Discusses refining a simple perceptron model to allow for more creative possibilities and sophistication, without delving into the mechanics of the perceptron itself.', 'chapters': [{'end': 57.456, 'start': 1.527, 'title': 'Refining the perceptron model', 'summary': 'Discusses refining a simple perceptron model to allow for more creative possibilities and sophistication, without delving into the mechanics of the perceptron itself.', 'duration': 55.929, 'highlights': ['The chapter discusses refining a simple perceptron model to allow for more creative possibilities and sophistication without delving into the 
mechanics of the perceptron itself. Refinement of perceptron model, focus on creativity and sophistication, not delving into mechanics', 'The perceptron is a model of a single neuron that receives inputs and then produces an output. Explanation of perceptron as a model of a single neuron receiving inputs and producing output', 'The output is only trying to guess whether a point is on one side of the line or the other, above or below a line. Purpose of output: determining if a point is above or below a line']}], 'duration': 55.929, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac1527.jpg', 'highlights': ['Refinement of perceptron model, focus on creativity and sophistication, not delving into mechanics', 'Explanation of perceptron as a model of a single neuron receiving inputs and producing output', 'Purpose of output: determining if a point is above or below a line']}, {'end': 382.53, 'segs': [{'end': 165.991, 'src': 'embed', 'start': 131.378, 'weight': 2, 'content': [{'end': 133.559, 'text': 'So how do I make that change? Hmm.', 'start': 131.378, 'duration': 2.181}, {'end': 138.746, 'text': 'Well, where do I make the points? I think I had this point object.', 'start': 134.941, 'duration': 3.805}, {'end': 145.194, 'text': 'And the point object makes a random point with a random x value and a random y value.', 'start': 139.066, 'duration': 6.128}, {'end': 152.316, 'text': "What I'm going to do, and I think I'm going to just make these now random values between negative 1 and 1.", 'start': 145.235, 'duration': 7.081}, {'end': 154.138, 'text': 'Negative 1 and 1.', 'start': 152.316, 'duration': 1.822}, {'end': 156.401, 'text': 'So the random values are between negative 1 and 1.', 'start': 154.138, 'duration': 2.263}, {'end': 157.282, 'text': "In some ways, that's it.", 'start': 156.401, 'duration': 0.881}, {'end': 158.102, 'text': 'Done Aha.', 'start': 157.522, 'duration': 0.58}, {'end': 165.991, 'text': "But now right before I show them, what I want to do is I'm going to say px for pixel x.", 'start': 158.583, 'duration': 7.408}], 'summary': 'Creating random point objects with x and y values between -1 and 1', 'duration': 34.613, 'max_score': 131.378, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac131378.jpg'}, {'end': 335.91, 'src': 'embed', 'start': 280.041, 'weight': 0, 'content': [{'end': 285.313, 'text': 'So I need to also just take the same exact math.', 'start': 280.041, 'duration': 5.272}, {'end': 287.994, 'text': 'And probably I should package that into a function or make it part.', 'start': 285.393, 'duration': 2.601}, {'end': 299.281, 'text': "Oh, you know what I should do? 
I should make, aha, the point object should just have a function that's called getPixelX.", 'start': 288.235, 'duration': 11.046}, {'end': 304.183, 'text': "It's a little bit of an awkward naming, getPixelY.", 'start': 300.261, 'duration': 3.922}, {'end': 308.446, 'text': "Let's just call it, forget about PixelX and PixelY.", 'start': 304.243, 'duration': 4.203}, {'end': 312.048, 'text': "So I'm just going to calculate these on the fly whenever I need them.", 'start': 309.406, 'duration': 2.642}, {'end': 317.279, 'text': 'And pixel x will do this, return that mapping.', 'start': 313.157, 'duration': 4.122}, {'end': 323.783, 'text': 'And pixel y will return that other mapping.', 'start': 317.9, 'duration': 5.883}, {'end': 324.603, 'text': 'There we go.', 'start': 324.163, 'duration': 0.44}, {'end': 329.826, 'text': 'And then I could just get those values here and get those values here.', 'start': 324.904, 'duration': 4.922}, {'end': 331.067, 'text': 'This should be the same.', 'start': 330.087, 'duration': 0.98}, {'end': 335.91, 'text': 'But at least I took that mapping and put it into its separate function.', 'start': 332.188, 'duration': 3.722}], 'summary': 'Creating functions to calculate pixel x and y mappings for the point object.', 'duration': 55.869, 'max_score': 280.041, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac280041.jpg'}], 'start': 58.016, 'title': 'Using neural network perceptron and mapping points in processing', 'summary': 'Covers using a neural network perceptron to process raw pixel coordinates and intends to create a perceptron that can learn any division of data points. additionally, it discusses mapping random point values to coordinates in the pixel space using the map function in processing, with a focus on updating the code to incorporate separate pixel x and y functions.', 'chapters': [{'end': 130.638, 'start': 58.016, 'title': 'Neural network perceptron example', 'summary': 'Covers the process of using a neural network perceptron to process raw pixel coordinates, the plan to redo the example using a cartesian plane, and the intention to create a perceptron that can learn any division of data points.', 'duration': 72.622, 'highlights': ['The chapter discusses the process of using raw pixel coordinates of a 640 by 480 window for the first perceptron.', 'The speaker plans to redo the example using a Cartesian plane with 0, 0 in the center and y pointing up.', 'The intention is to create a perceptron that can learn any division of data points, not just slicing it down the middle.']}, {'end': 382.53, 'start': 131.378, 'title': 'Mapping points in processing', 'summary': 'Discusses the process of mapping random point values between -1 and 1 to coordinates in the pixel space using the map function in processing, with a focus on updating the code to incorporate separate pixel x and y functions.', 'duration': 251.152, 'highlights': ['The process involves mapping random point values between -1 and 1 to coordinates in the pixel space using the map function in Processing, allowing for a traditional Cartesian plane representation.', 'The code is updated to incorporate separate pixel x and y functions, enabling the mapping of points into the pixel space for drawing, leading to a more organized and flexible approach.', 'The chapter also emphasizes the importance of incorporating the mapping into separate functions for ease of use and efficiency, demonstrating a structured coding practice for future modifications.']}], 
'duration': 324.514, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac58016.jpg', 'highlights': ['The intention is to create a perceptron that can learn any division of data points, not just slicing it down the middle.', 'The process involves mapping random point values between -1 and 1 to coordinates in the pixel space using the map function in Processing, allowing for a traditional Cartesian plane representation.', 'The code is updated to incorporate separate pixel x and y functions, enabling the mapping of points into the pixel space for drawing, leading to a more organized and flexible approach.']}, {'end': 913.202, 'segs': [{'end': 453.954, 'src': 'embed', 'start': 423.417, 'weight': 0, 'content': [{'end': 424.958, 'text': 'And I want to be able to change this formula.', 'start': 423.417, 'duration': 1.541}, {'end': 427.459, 'text': 'I can make it negative 2x minus 3.7.', 'start': 425.158, 'duration': 2.301}, {'end': 434.544, 'text': 'I want to be able to have any generic formula for a line work with this example.', 'start': 427.459, 'duration': 7.085}, {'end': 436.085, 'text': "So let's add that into the code.", 'start': 434.684, 'duration': 1.401}, {'end': 442.608, 'text': "So I think the easiest thing for me to do, and I'm just going to put it in this tab, is to write a function.", 'start': 438.246, 'duration': 4.362}, {'end': 444.469, 'text': "I'm going to call it, it's going to return a float.", 'start': 442.628, 'duration': 1.841}, {'end': 446.51, 'text': "And I'm actually just going to call it f.", 'start': 444.829, 'duration': 1.681}, {'end': 453.954, 'text': "Is that like a terrible thing to do? f of x being the function for, I think I'll call this like line, whatever.", 'start': 446.51, 'duration': 7.444}], 'summary': 'Modifying the formula to -2x-3.7 and creating a generic line function.', 'duration': 30.537, 'max_score': 423.417, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac423417.jpg'}, {'end': 547.267, 'src': 'embed', 'start': 512.455, 'weight': 1, 'content': [{'end': 518.019, 'text': 'Again, I could have changed the mapping between negative 10 and 10, but I kind of like this idea of between negative 1 and 1 for whatever reason.', 'start': 512.455, 'duration': 5.564}, {'end': 519.08, 'text': "So I'm going to change it to that.", 'start': 518.058, 'duration': 1.022}, {'end': 522.725, 'text': "And now what I'm going to do, I at least just want to draw that line.", 'start': 519.799, 'duration': 2.926}, {'end': 525.389, 'text': "So I'm actually not going to change any of the code in this example.", 'start': 522.765, 'duration': 2.624}, {'end': 530.196, 'text': "I'm just going to see, can I draw that line? So this is where I previously drew the line.", 'start': 525.409, 'duration': 4.787}, {'end': 547.267, 'text': 'So what I want to do is I want to draw a line from From where? I want to get the y value for when x is negative 1 and the y value for when x is 1.', 'start': 530.556, 'duration': 16.711}], 'summary': 'Mapping changed to range from -1 to 1 for drawing a line.', 'duration': 34.812, 'max_score': 512.455, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac512455.jpg'}, {'end': 709.222, 'src': 'embed', 'start': 682.952, 'weight': 4, 'content': [{'end': 691.294, 'text': 'Well, guess what? Guess what? 
In Java, processing is Java, you can do something called overloading.', 'start': 682.952, 'duration': 8.342}, {'end': 695.574, 'text': "Just so happens that I'm going to cover overloading, constructor overloading in this case.", 'start': 692.274, 'duration': 3.3}, {'end': 700.94, 'text': 'I could say, I want to have another way of creating a point.', 'start': 695.875, 'duration': 5.065}, {'end': 705.661, 'text': "And I'm going to use the underscore, kind of ugly underscore notation.", 'start': 701.6, 'duration': 4.061}, {'end': 709.222, 'text': 'But what I want to do is pass, have some arguments to the constructor.', 'start': 706.021, 'duration': 3.201}], 'summary': 'Java allows constructor overloading to create points with different arguments.', 'duration': 26.27, 'max_score': 682.952, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac682952.jpg'}, {'end': 757.796, 'src': 'embed', 'start': 729.235, 'weight': 3, 'content': [{'end': 732.077, 'text': 'Okay, so now that went away, and I should be able to.', 'start': 729.235, 'duration': 2.842}, {'end': 733.298, 'text': "There's that line.", 'start': 732.437, 'duration': 0.861}, {'end': 734.579, 'text': "So there's the line.", 'start': 733.658, 'duration': 0.921}, {'end': 736.901, 'text': 'Boy, this is a lot of work just to draw that one line.', 'start': 734.599, 'duration': 2.302}, {'end': 744.906, 'text': 'And you can see, though, I can change the formula for the line if I say, you know, minus negative 0.2.', 'start': 738.522, 'duration': 6.384}, {'end': 746.267, 'text': 'Now the line is further down.', 'start': 744.906, 'duration': 1.361}, {'end': 750.53, 'text': 'If I say minus 0.3, now the line is pointing the other direction.', 'start': 746.628, 'duration': 3.902}, {'end': 755.994, 'text': 'So I can now create any formula for the line and visualize that in the window.', 'start': 750.871, 'duration': 5.123}, {'end': 757.796, 'text': 'Okay, so now..', 'start': 756.595, 'duration': 1.201}], 'summary': 'Demonstrating flexibility in creating different line formulas for visualization.', 'duration': 28.561, 'max_score': 729.235, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac729235.jpg'}, {'end': 919.928, 'src': 'embed', 'start': 891.043, 'weight': 2, 'content': [{'end': 897.57, 'text': "You can see that it's, and this probably has to do with learning rate and how it started, how the initial weights were started.", 'start': 891.043, 'duration': 6.527}, {'end': 900.493, 'text': 'So I want to kind of figure out why is it kind of stuck.', 'start': 898.211, 'duration': 2.282}, {'end': 904.776, 'text': "Here's the reason why this is getting stuck.", 'start': 902.995, 'duration': 1.781}, {'end': 910.3, 'text': 'And I mentioned it in the previous coding challenge, and I completely forgot about it until now, the bias.', 'start': 905.437, 'duration': 4.863}, {'end': 913.202, 'text': "So let's talk about why does there need to be a bias.", 'start': 910.8, 'duration': 2.402}, {'end': 914.463, 'text': "Here's the thing.", 'start': 913.702, 'duration': 0.761}, {'end': 919.928, 'text': "Let's consider the point 0, 0.", 'start': 915.863, 'duration': 4.065}], 'summary': 'Discussion on the impact of bias in training neural networks.', 'duration': 28.885, 'max_score': 891.043, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac891043.jpg'}], 'start': 383.31, 'title': 'Linear function training', 
'summary': 'Discusses training a perceptron to recognize a linear function and creating a generic formula for a line in a two-dimensional space, using specific examples and visualizations. it also covers visualizing a line formula and determining points above or below a line using quantifiable data and examples with changing line formulas and calculating y values for given x.', 'chapters': [{'end': 489.958, 'start': 383.31, 'title': 'Linear function training', 'summary': 'Discusses training a perceptron to recognize a linear function and creating a generic formula for a line in a two-dimensional space, using specific examples and visualizations.', 'duration': 106.648, 'highlights': ['The chapter discusses training a perceptron to recognize a linear function and creating a generic formula for a line in a two-dimensional space, using specific examples and visualizations.', 'The function f(x) = 3x + 2 is used as an example, showing the relationship between x and y values, with x=0 corresponding to y=2 and x=1 corresponding to y=5.', 'The concept of a linear formula y=mx+b is introduced, with the specific example of y=3x+2, illustrating the formula for a line in a two-dimensional space.']}, {'end': 913.202, 'start': 490.118, 'title': 'Visualizing line formula and determining points above or below', 'summary': 'Discusses visualizing a line formula and determining points above or below a line using examples with quantifiable data such as changing the line formula and calculating the y value for a given x.', 'duration': 423.084, 'highlights': ["Visualizing the line formula and its impact on the line's position The speaker demonstrates how changing the line formula, such as altering it to 0.3 times x plus 0.8, affects the line's position and direction, showcasing the flexibility in creating different line formulas for visualization.", 'Determining points above or below a line using the line formula The chapter explains the process of determining whether a given point is above or below a line by calculating the point on the line and comparing the y value, providing a clear method for categorizing points with quantifiable examples.', "Importance of bias in the perceptron's learning process The speaker identifies the significance of bias in the perceptron's learning process, highlighting its role in preventing the perceptron from getting stuck and hinting at the impact of initial weights and learning rate on the model's performance."]}], 'duration': 529.892, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac383310.jpg', 'highlights': ['The chapter discusses training a perceptron to recognize a linear function and creating a generic formula for a line in a two-dimensional space, using specific examples and visualizations.', 'The concept of a linear formula y=mx+b is introduced, with the specific example of y=3x+2, illustrating the formula for a line in a two-dimensional space.', 'Determining points above or below a line using the line formula The chapter explains the process of determining whether a given point is above or below a line by calculating the point on the line and comparing the y value, providing a clear method for categorizing points with quantifiable examples.', "Visualizing the line formula and its impact on the line's position The speaker demonstrates how changing the line formula, such as altering it to 0.3 times x plus 0.8, affects the line's position and direction, showcasing the flexibility in creating different line formulas 
for visualization.", "Importance of bias in the perceptron's learning process The speaker identifies the significance of bias in the perceptron's learning process, highlighting its role in preventing the perceptron from getting stuck and hinting at the impact of initial weights and learning rate on the model's performance.", 'The function f(x) = 3x + 2 is used as an example, showing the relationship between x and y values, with x=0 corresponding to y=2 and x=1 corresponding to y=5.']}, {'end': 1319.416, 'segs': [{'end': 1043.989, 'src': 'embed', 'start': 1010.08, 'weight': 0, 'content': [{'end': 1012.041, 'text': 'This is times x.', 'start': 1010.08, 'duration': 1.961}, {'end': 1016.605, 'text': 'Does this make sense? So this is really what the perceptron is learning.', 'start': 1012.041, 'duration': 4.564}, {'end': 1020.289, 'text': 'We know the formula for the line, and we could do all this with math.', 'start': 1016.665, 'duration': 3.624}, {'end': 1026.035, 'text': 'A neural network is often referred to as a universal function approximator.', 'start': 1021.15, 'duration': 4.885}, {'end': 1029.199, 'text': 'Thank you to the chat who just posted that terminology.', 'start': 1026.115, 'duration': 3.084}, {'end': 1034.623, 'text': 'With a simple two-dimensional space and a formula for a line, we can do the math directly.', 'start': 1030.901, 'duration': 3.722}, {'end': 1043.989, 'text': 'But here you can imagine, once our data input set gets, once we have all this data with lots and lots of inputs that are in n-dimensional space,', 'start': 1035.042, 'duration': 8.947}], 'summary': 'The perceptron learns line formula to approximate universal functions in neural networks.', 'duration': 33.909, 'max_score': 1010.08, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac1010080.jpg'}, {'end': 1130.962, 'src': 'heatmap', 'start': 1074.088, 'weight': 0.738, 'content': [{'end': 1083.301, 'text': 'The perceptron now needs to have three weights, needs to have the weights for the two inputs, the x and y and the bias.', 'start': 1074.088, 'duration': 9.213}, {'end': 1090.771, 'text': 'And one thing I might do is give the perceptron constructor a number of arguments.', 'start': 1083.621, 'duration': 7.15}, {'end': 1095.734, 'text': 'so that we can have a sort of more generic perceptron.', 'start': 1092.372, 'duration': 3.362}, {'end': 1099.075, 'text': 'And I can say weights equals new float n.', 'start': 1095.774, 'duration': 3.301}, {'end': 1105.098, 'text': 'And then when I create the perceptron, I want to say perceptron three.', 'start': 1099.075, 'duration': 6.023}, {'end': 1109.841, 'text': 'And then here, oh, this was just something I had just to test the code.', 'start': 1106.179, 'duration': 3.662}, {'end': 1118.505, 'text': 'Now the inputs should always be an array of three things.', 'start': 1111.641, 'duration': 6.864}, {'end': 1123.356, 'text': 'And is there another here, training.', 'start': 1121.074, 'duration': 2.282}, {'end': 1126.379, 'text': 'I should always make sure to include that bias.', 'start': 1123.376, 'duration': 3.003}, {'end': 1130.962, 'text': "I think I was able to get all of the, and so let's run this again.", 'start': 1127.219, 'duration': 3.743}], 'summary': 'Implement perceptron with three weights for two inputs and a bias', 'duration': 56.874, 'max_score': 1074.088, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac1074088.jpg'}], 'start': 913.702, 'title': 
'Perceptron and line visualization', 'summary': 'Covers the concept of perceptron, its limitations in a 2d space, implementing bias, and adjustments in constructor. it also discusses visualization challenges and calculation of slope and y-intercept of the perceptron line.', 'chapters': [{'end': 1141.891, 'start': 913.702, 'title': 'Understanding perceptron and implementing bias', 'summary': 'Discusses the concept of perceptron, its limitations when applied to a simple 2d space, the need for implementing bias, and the adjustments required in the perceptron constructor to accommodate the bias, ultimately aiming to approximate a function in a multidimensional space.', 'duration': 228.189, 'highlights': ["The perceptron's limitation in a 2D space is highlighted, where it struggles to accurately represent the function of a line, leading to the introduction of the concept of bias. The perceptron struggles to accurately represent the function of a line in a simple 2D space, necessitating the introduction of the concept of bias to address this limitation.", 'The need for implementing bias in the perceptron is discussed, emphasizing the addition of a bias term with its own weight to enable better function approximation in multidimensional space. The discussion revolves around the need to implement bias in the perceptron, highlighting the addition of a bias term with its own weight to enhance function approximation in multidimensional space.', 'The adjustments required in the perceptron constructor to accommodate the bias, including the addition of weights for the inputs (x and y) and the bias, are explained. The adjustments needed in the perceptron constructor to accommodate the bias, such as adding weights for the inputs (x and y) and the bias, are thoroughly explained.']}, {'end': 1319.416, 'start': 1143.052, 'title': 'Visualization of perceptron line', 'summary': 'Discusses the visualization of the line that the perceptron is learning, and the challenges faced in accurately representing the line, with a focus on calculating the slope and y-intercept of the line.', 'duration': 176.364, 'highlights': ['The formula for the line that the perceptron is learning is 0 weight times x plus weight index 1 times y plus weight index 2 times b, with the equation equalling 0.', 'The chapter highlights the challenge faced in accurately representing the line that the perceptron is learning and the need to calculate the slope and y-intercept of the line.', 'The chapter discusses the attempt to visualize the line that the perceptron thinks, and the challenges faced in accurately representing the line, with a focus on calculating the slope and y-intercept of the line.']}], 'duration': 405.714, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac913702.jpg', 'highlights': ['The need for implementing bias in the perceptron is discussed, emphasizing the addition of a bias term with its own weight to enable better function approximation in multidimensional space.', 'The adjustments required in the perceptron constructor to accommodate the bias, including the addition of weights for the inputs (x and y) and the bias, are explained.', 'The chapter discusses the attempt to visualize the line that the perceptron thinks, and the challenges faced in accurately representing the line, with a focus on calculating the slope and y-intercept of the line.', "The perceptron's limitation in a 2D space is highlighted, where it struggles to accurately represent the function of a 
line, leading to the introduction of the concept of bias.", 'The formula for the line that the perceptron is learning is 0 weight times x plus weight index 1 times y plus weight index 2 times b, with the equation equalling 0.', 'The chapter highlights the challenge faced in accurately representing the line that the perceptron is learning and the need to calculate the slope and y-intercept of the line.']}, {'end': 1648.292, 'segs': [{'end': 1390.229, 'src': 'embed', 'start': 1353.355, 'weight': 0, 'content': [{'end': 1354.976, 'text': 'So I never actually even had a bias.', 'start': 1353.355, 'duration': 1.621}, {'end': 1361.791, 'text': "so I think this is actually working, but I'm not visualizing this line perhaps correctly.", 'start': 1356.027, 'duration': 5.764}, {'end': 1366.735, 'text': "so let's go back now and let's see where do I want to do that?", 'start': 1361.791, 'duration': 4.944}, {'end': 1380.745, 'text': "I'm going to say y equals, so let me use the formula that I just used return.", 'start': 1366.735, 'duration': 14.01}, {'end': 1381.486, 'text': 'so let me do this.', 'start': 1380.745, 'duration': 0.741}, {'end': 1390.229, 'text': "I'm just going to create some shorter variable names to make this easier to look at.", 'start': 1384.403, 'duration': 5.826}], 'summary': 'Discussing code optimization and variable naming.', 'duration': 36.874, 'max_score': 1353.355, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac1353355.jpg'}, {'end': 1545.817, 'src': 'embed', 'start': 1468.752, 'weight': 2, 'content': [{'end': 1481.095, 'text': 'So I need to make this equivalent y1 times y equals negative w1 times y equals negative w2 minus negative w0x.', 'start': 1468.752, 'duration': 12.343}, {'end': 1482.695, 'text': "That's going to fix it.", 'start': 1481.855, 'duration': 0.84}, {'end': 1485.076, 'text': 'OK, so now.', 'start': 1484.256, 'duration': 0.82}, {'end': 1495.459, 'text': "I'm going to say negative here and we're going to run this and we're going to watch the perceptron over time and you can see as it's correcting.", 'start': 1486.195, 'duration': 9.264}, {'end': 1499.04, 'text': 'all those circles are turning green.', 'start': 1495.459, 'duration': 3.581}, {'end': 1504.643, 'text': 'with this learning rate, slowly over time, the line is converging to the correct spot.', 'start': 1499.04, 'duration': 5.603}, {'end': 1512.676, 'text': 'Perceptron learned the line.', 'start': 1511.635, 'duration': 1.041}, {'end': 1518.266, 'text': "So, you know, there's going to be a part three and a part four to this because I think I'm going to wrap up this particular video.", 'start': 1512.716, 'duration': 5.55}, {'end': 1520.149, 'text': 'There were some other things that I wanted to add to this.', 'start': 1518.286, 'duration': 1.863}, {'end': 1523.074, 'text': "But at least now in this video, I've added the bias.", 'start': 1520.49, 'duration': 2.584}, {'end': 1527.786, 'text': "I've made it so that I have a Cartesian space I can work with.", 'start': 1524.284, 'duration': 3.502}, {'end': 1530.668, 'text': 'And I can work with any formula for a line.', 'start': 1527.846, 'duration': 2.822}, {'end': 1534.47, 'text': 'As you can see, life is just one big refactoring.', 'start': 1530.848, 'duration': 3.622}, {'end': 1536.391, 'text': 'And there are so many things I could do to refactor this.', 'start': 1534.49, 'duration': 1.901}, {'end': 1541.995, 'text': "And visually, I still don't necessarily love what I have here in terms of 
explaining what's going on.", 'start': 1536.831, 'duration': 5.164}, {'end': 1545.817, 'text': "But at least now, let's just, before I go, let's make a new formula.", 'start': 1542.355, 'duration': 3.462}], 'summary': 'Perceptron learning with visual feedback, improving over time.', 'duration': 77.065, 'max_score': 1468.752, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac1468752.jpg'}], 'start': 1319.416, 'title': 'Debugging perceptron algorithm and understanding learning rates in machine learning', 'summary': 'Follows the process of debugging a perceptron algorithm, correcting major errors and implementing changes to achieve the correct line convergence, resulting in all circles turning green over time. it also explores the concept of learning rates in machine learning, demonstrating the impact of different learning rates on the performance of a formula for a line, emphasizing the trade-off between speed and accuracy, and teasing future follow-up videos.', 'chapters': [{'end': 1523.074, 'start': 1319.416, 'title': 'Debugging perceptron algorithm', 'summary': 'Follows the process of debugging a perceptron algorithm, correcting major errors and implementing changes to achieve the correct line convergence, resulting in all circles turning green over time.', 'duration': 203.658, 'highlights': ['The perceptron algorithm was debugged to correct major errors, such as initializing the bias and fixing equations, resulting in the convergence of the line over time with all circles turning green.', 'The importance of initializing the bias in the constructor was emphasized, as it played a crucial role in the correct functioning of the perceptron algorithm.', 'The process involved implementing changes, such as correcting equations through negative values, to achieve the correct line convergence over time, demonstrating the effectiveness of the debugging process.']}, {'end': 1648.292, 'start': 1524.284, 'title': 'Understanding learning rates in machine learning', 'summary': 'Explores the concept of learning rates in machine learning, demonstrating the impact of different learning rates on the performance of a formula for a line, emphasizing the trade-off between speed and accuracy, and teasing future follow-up videos.', 'duration': 124.008, 'highlights': ['The learning rate in the formula for a line can significantly impact the speed and accuracy of reaching the correct solution, as demonstrated by adjusting the learning rate from 0.2 to a ridiculously small value, showcasing the trade-off between speed and fine detail.', 'Different learning rates can lead to drastically different movements of the line in the Cartesian space, with a larger learning rate enabling quick convergence to the correct answer, while a smaller learning rate results in slow and refined movements to find the accurate spot.', 'Future follow-up videos are planned to address important aspects such as adding data that is not part of the training set, emphasizing the relevance and significance of this topic in machine learning.']}], 'duration': 328.876, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/DGxIcDjPzac/pics/DGxIcDjPzac1319416.jpg', 'highlights': ['The perceptron algorithm was debugged to correct major errors, resulting in line convergence with all circles turning green.', 'The importance of initializing the bias in the constructor was emphasized for the correct functioning of the perceptron algorithm.', 'Implementing changes, such as 
correcting equations through negative values, demonstrated the effectiveness of the debugging process.', 'Different learning rates can lead to drastically different movements of the line in the Cartesian space, showcasing the trade-off between speed and fine detail.', 'Adjusting the learning rate from 0.2 to a ridiculously small value demonstrated the impact on speed and accuracy of reaching the correct solution.', 'Future follow-up videos are planned to address important aspects such as adding data not part of the training set, emphasizing the relevance and significance of this topic in machine learning.']}], 'highlights': ['The intention is to create a perceptron that can learn any division of data points, not just slicing it down the middle.', 'Different learning rates can lead to drastically different movements of the line in the Cartesian space, showcasing the trade-off between speed and fine detail.', 'The perceptron algorithm was debugged to correct major errors, resulting in line convergence with all circles turning green.', 'The chapter discusses training a perceptron to recognize a linear function and creating a generic formula for a line in a two-dimensional space, using specific examples and visualizations.', 'The need for implementing bias in the perceptron is discussed, emphasizing the addition of a bias term with its own weight to enable better function approximation in multidimensional space.', 'The process involves mapping random point values between -1 and 1 to coordinates in the pixel space using the map function in Processing, allowing for a traditional Cartesian plane representation.', 'The adjustments required in the perceptron constructor to accommodate the bias, including the addition of weights for the inputs (x and y) and the bias, are explained.', 'The chapter explains the process of determining whether a given point is above or below a line by calculating the point on the line and comparing the y value, providing a clear method for categorizing points with quantifiable examples.', 'The importance of initializing the bias in the constructor was emphasized for the correct functioning of the perceptron algorithm.', 'The chapter discusses the attempt to visualize the line that the perceptron thinks, and the challenges faced in accurately representing the line, with a focus on calculating the slope and y-intercept of the line.']}
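
To make the chapter summaries above concrete, here is a minimal Processing (Java) sketch of the Point class described in the mapping and line chapters: each point lives in a Cartesian (-1, 1) space, pixelX() and pixelY() use map() to place it in the pixel window, and a second, overloaded constructor creates a point at an exact location (the constructor-overloading idea mentioned in the video). This is an illustrative reconstruction rather than the video's exact source; f() is the target line function sketched next, and the +1/-1 labels follow the above/below-the-line convention from the summary.

class Point {
  float x, y;   // Cartesian coordinates, each in the range (-1, 1)
  int label;    // +1 if the point lies above the line f(x), -1 if below

  // Random training point somewhere in the (-1, 1) square
  Point() {
    x = random(-1, 1);
    y = random(-1, 1);
    label = (y > f(x)) ? 1 : -1;
  }

  // Overloaded constructor: a point at an exact location,
  // used for things like the endpoints of a drawn line
  Point(float x_, float y_) {
    x = x_;
    y = y_;
  }

  // Map Cartesian x in (-1, 1) to a pixel column in (0, width)
  float pixelX() {
    return map(x, -1, 1, 0, width);
  }

  // Map Cartesian y in (-1, 1) to a pixel row; the output range is flipped
  // because pixel y grows downward while Cartesian y grows upward
  float pixelY() {
    return map(y, -1, 1, height, 0);
  }

  // Draw the point at its pixel-space position, shaded by its known label
  void show() {
    stroke(0);
    fill(label == 1 ? 255 : 0);
    ellipse(pixelX(), pixelY(), 8, 8);
  }
}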
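The generic line y = mx + b from the line-training chapter, and the drawing of that line by finding the y value when x is -1 and when x is 1, might look like the sketch below. The coefficients 0.3 and 0.8 are example values of the kind tried in the video; how the code is split into functions here is an assumption.

float f(float x) {
  // y = mx + b
  return 0.3 * x + 0.8;
}

// Draw the known line by evaluating f() at the left and right
// edges of the Cartesian space (x = -1 and x = 1)
void drawTargetLine() {
  Point p1 = new Point(-1, f(-1));
  Point p2 = new Point(1, f(1));
  stroke(0);
  line(p1.pixelX(), p1.pixelY(), p2.pixelX(), p2.pixelY());
}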
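The three-weight perceptron described above (weights for x, y, and the bias) and the learning rate discussed in the final chapter fit into a small class like this sketch. Field and method names are assumptions; the structure follows the summary: a constructor that takes the number of weights, a guess() that passes the weighted sum of the inputs through a sign activation, and a train() step scaled by the learning rate.

class Perceptron {
  float[] weights;
  float lr = 0.2;   // learning rate: larger steps converge faster but more coarsely

  // n is the number of inputs, e.g. 3 for x, y, and the bias
  Perceptron(int n) {
    weights = new float[n];
    for (int i = 0; i < weights.length; i++) {
      weights[i] = random(-1, 1);   // random initial weights, bias weight included
    }
  }

  // Weighted sum of the inputs passed through the sign activation function
  int guess(float[] inputs) {
    float sum = 0;
    for (int i = 0; i < weights.length; i++) {
      sum += inputs[i] * weights[i];
    }
    return (sum >= 0) ? 1 : -1;
  }

  // Supervised learning rule: nudge every weight by error * input * learning rate
  void train(float[] inputs, int target) {
    int prediction = guess(inputs);
    int error = target - prediction;
    for (int i = 0; i < weights.length; i++) {
      weights[i] += error * inputs[i] * lr;
    }
  }
}

Each training point would be presented as {x, y, 1}, with the constant 1 acting as the bias input. This is the reason the bias matters: at the point (0, 0) the weighted sum of x and y is zero no matter what those two weights are, so without a bias term the learned boundary would be forced through the origin.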
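The visualization chapter states the relation w0*x + w1*y + w2*b = 0 for the line the perceptron currently represents (with the bias input b = 1). Solving for y gives y = -(w2/w1) - (w0/w1)*x, i.e. a y-intercept of -w2/w1 and a slope of -w0/w1. A small method added inside the Perceptron class above (the name guessY is an assumption) can return that y for any x so the guessed line can be drawn the same way as the target line:

// Inside the Perceptron class: the y value on the perceptron's current line
// for a given x, from w0*x + w1*y + w2*1 = 0 rearranged to
// y = -(w2/w1) - (w0/w1) * x
float guessY(float x) {
  float w0 = weights[0];
  float w1 = weights[1];
  float w2 = weights[2];
  return -(w2 / w1) - (w0 / w1) * x;
}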
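Finally, the debugging chapter describes training until the line converges and the circles turn green. A draw() loop in that spirit, assuming global variables Point[] points and Perceptron brain created in setup(), is sketched below; the exact visualization choices here are illustrative, not the video's code.

void draw() {
  background(255);
  drawTargetLine();

  // Draw the line the perceptron currently believes in
  Point g1 = new Point(-1, brain.guessY(-1));
  Point g2 = new Point(1, brain.guessY(1));
  stroke(150);
  line(g1.pixelX(), g1.pixelY(), g2.pixelX(), g2.pixelY());

  // Train on every known point each frame
  for (Point pt : points) {
    float[] inputs = { pt.x, pt.y, 1 };   // the trailing 1 is the bias input
    brain.train(inputs, pt.label);
  }

  // Show each point and mark the current guess: green when it matches the label
  for (Point pt : points) {
    pt.show();
    float[] inputs = { pt.x, pt.y, 1 };
    int guess = brain.guess(inputs);
    noStroke();
    fill(guess == pt.label ? color(0, 255, 0) : color(255, 0, 0));
    ellipse(pt.pixelX(), pt.pixelY(), 4, 4);
  }
}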