title
Heroes of Deep Learning: Andrew Ng interviews Geoffrey Hinton
description
detail
{'title': 'Heroes of Deep Learning: Andrew Ng interviews Geoffrey Hinton', 'heatmap': [{'end': 1674.074, 'start': 1645.765, 'weight': 0.707}, {'end': 1933.45, 'start': 1836.875, 'weight': 0.732}, {'end': 2382.597, 'start': 2359.31, 'weight': 0.75}], 'summary': "Andrew ng interviews geoffrey hinton, the 'godfather of deep learning,' covering the history of ai and neural networks, the evolution of deep learning with a 100-fold performance improvement, the impact of restricted boltzmann machines, potential brain-related aspects of backpropagation, capsules for neural networks, and deep learning insights including advice on breaking into deep learning and jeff dean's perspectives.", 'chapters': [{'end': 41.133, 'segs': [{'end': 29.467, 'src': 'embed', 'start': 1.789, 'weight': 0, 'content': [{'end': 5.514, 'text': 'Welcome Jeff, and thank you for doing this interview with DeepLearning.ai.', 'start': 1.789, 'duration': 3.725}, {'end': 8.358, 'text': 'Thank you for inviting me.', 'start': 7.377, 'duration': 0.981}, {'end': 15.759, 'text': 'I think that at this point you more than anyone else on this planet has invented so many of the ideas behind deep learning.', 'start': 9.735, 'duration': 6.024}, {'end': 20.702, 'text': 'And a lot of people have been calling you the godfather of deep learning.', 'start': 16.359, 'duration': 4.343}, {'end': 27.406, 'text': "Although it wasn't until we're just chatting a few minutes ago that I realized you think I'm the first one to call you that,", 'start': 20.842, 'duration': 6.564}, {'end': 29.467, 'text': "which I'm quite happy to have done.", 'start': 27.406, 'duration': 2.061}], 'summary': 'Jeff is considered the godfather of deep learning, having invented many of its ideas.', 'duration': 27.678, 'max_score': 1.789, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE1789.jpg'}], 'start': 1.789, 'title': 'An interview with deep learning pioneer', 'summary': "Delves into an interview with jeff, hailed as the 'godfather of deep learning,' exploring his personal story and insights into deep learning.", 'chapters': [{'end': 41.133, 'start': 1.789, 'title': 'Interview with deep learning pioneer', 'summary': "Explores an interview with jeff, a pioneer in deep learning, who is hailed as the 'godfather of deep learning' and delves into his personal story behind the legend.", 'duration': 39.344, 'highlights': ["Jeff is recognized as a pioneer in deep learning and has been referred to as the 'godfather of deep learning'.", "The interview focuses on Jeff's personal story behind the legend and delves into how he got involved in deep learning."]}], 'duration': 39.344, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE1789.jpg', 'highlights': ["Jeff is recognized as a pioneer in deep learning and has been referred to as the 'godfather of deep learning'.", "The interview focuses on Jeff's personal story behind the legend and delves into how he got involved in deep learning."]}, {'end': 437.815, 'segs': [{'end': 91.913, 'src': 'embed', 'start': 65.876, 'weight': 1, 'content': [{'end': 72.22, 'text': "And I said, sort of, what's a hologram? And he explained that in a hologram, you can chop off half of it and you still get the whole picture.", 'start': 65.876, 'duration': 6.344}, {'end': 75.782, 'text': 'And that memories in the brain might be distributed over the whole brain.', 'start': 72.981, 'duration': 2.801}, {'end': 84.788, 'text': "And so I guess he'd read about Lashley's experiments where you chop out bits of a rat's brain and discover it's very hard to find one bit where it stores one particular memory.", 'start': 76.623, 'duration': 8.165}, {'end': 91.913, 'text': "So that's what first got me interested in how does the brain store memories?", 'start': 87.91, 'duration': 4.003}], 'summary': 'Brain stores memories as distributed information, like a hologram.', 'duration': 26.037, 'max_score': 65.876, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE65876.jpg'}, {'end': 213.569, 'src': 'embed', 'start': 190.373, 'weight': 4, 'content': [{'end': 199.798, 'text': 'And in California, Don Norman and David Ronald Hart were very open to ideas about neural nets.', 'start': 190.373, 'duration': 9.425}, {'end': 208.846, 'text': "It was the first time I'd been somewhere where thinking about how the brain works and thinking about how that might relate to psychology was seen as a very positive thing,", 'start': 199.858, 'duration': 8.988}, {'end': 209.827, 'text': 'and it was a lot of fun there.', 'start': 208.846, 'duration': 0.981}, {'end': 212.589, 'text': 'In particular, collaborating with David Rommelhart was great.', 'start': 210.267, 'duration': 2.322}, {'end': 213.569, 'text': 'I see, great.', 'start': 213.049, 'duration': 0.52}], 'summary': 'In california, don norman and david ronald hart were open to neural net ideas; it was a positive and enjoyable experience.', 'duration': 23.196, 'max_score': 190.373, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE190373.jpg'}, {'end': 299.723, 'src': 'embed', 'start': 272.389, 'weight': 3, 'content': [{'end': 279.111, 'text': 'Why do you think it was your paper that helped so much the community latch onto backprop?', 'start': 272.389, 'duration': 6.722}, {'end': 284.233, 'text': 'It feels like your paper marked an inflection in the acceptance of this algorithm, whoever accepted it.', 'start': 279.171, 'duration': 5.062}, {'end': 290.117, 'text': 'So we managed to get a paper into Nature in 1986.', 'start': 285.373, 'duration': 4.744}, {'end': 293.159, 'text': 'And I did quite a lot of political work to get the paper accepted.', 'start': 290.117, 'duration': 3.042}, {'end': 299.723, 'text': 'I figured out that one of the referees was probably going to be Stuart Sutherland, who was a well-known psychologist in Britain.', 'start': 293.799, 'duration': 5.924}], 'summary': 'Paper in nature, 1986, marked inflection in backprop acceptance.', 'duration': 27.334, 'max_score': 272.389, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE272389.jpg'}, {'end': 416.664, 'src': 'embed', 'start': 394.963, 'weight': 0, 'content': [{'end': 403.589, 'text': 'And what this back propagation example showed was you could give it the information that would go into a graph structure or, in this case,', 'start': 394.963, 'duration': 8.626}, {'end': 413.643, 'text': 'a family tree, and it could convert that information into features In such a way that it could then use the features to derive new,', 'start': 403.589, 'duration': 10.054}, {'end': 415.424, 'text': 'consistent information.', 'start': 413.643, 'duration': 1.781}, {'end': 416.664, 'text': 'I generalize,', 'start': 415.424, 'duration': 1.24}], 'summary': 'Back propagation example converts family tree info into features for deriving consistent new information.', 'duration': 21.701, 'max_score': 394.963, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE394963.jpg'}], 'start': 41.133, 'title': 'The history of ai and neural networks', 'summary': 'Delves into the origins of ai and neural networks, tracing back to 1966 and the development of the backpropagation algorithm in 1986, demonstrating its impact on memory distribution and knowledge representation.', 'chapters': [{'end': 84.788, 'start': 41.133, 'title': 'Ai and neural networks origin', 'summary': 'Discusses the origin of ai and neural networks, dating back to 1966, when a classmate introduced the concept of holograms in the brain, potentially influencing the understanding of memory distribution.', 'duration': 43.655, 'highlights': ['The concept of holograms in the brain, introduced by a classmate in 1966, sparked an interest in understanding memory distribution.', "The classmate's explanation of holograms, where chopping off half still presents the whole picture, contributed to the exploration of memory distribution in the brain.", "The mention of Lashley's experiments on rat brains and the difficulty in isolating specific memories further piqued interest in understanding memory storage in the brain."]}, {'end': 437.815, 'start': 87.91, 'title': 'Memory, ai, and backpropagation', 'summary': "Discusses the journey of a scientist from studying physiology and physics to eventually developing the backpropagation algorithm in ai, leading to the acceptance of neural nets through a nature paper in 1986, showcasing the algorithm's ability to learn representations for words and unifying different strands of ideas about knowledge.", 'duration': 349.905, 'highlights': ["The scientist's journey from physiology and physics to developing the backpropagation algorithm in AI and getting a Nature paper accepted in 1986. The scientist initially studied physiology and physics, then switched to psychology and eventually got a PhD in AI. They faced challenges in getting a job in Britain but managed to get a Sloan fellowship in California. The scientist's persistence led to the acceptance of the backpropagation algorithm through a Nature paper in 1986.", "The acceptance of neural nets through a Nature paper in 1986, showcasing the algorithm's ability to learn representations for words. The paper demonstrated that backpropagation could learn representations for words, which impressed a well-known psychologist and led to the paper's acceptance. The algorithm could derive new, consistent information from the graphical representation or the tree structured representation of the family tree.", "The unification of different strands of ideas about knowledge through the backpropagation algorithm. The algorithm unified the old psychologist's view of a concept as a bundle of features with the AI view of a far more structuralist view, showcasing the ability to convert information into features and derive new, consistent information."]}], 'duration': 396.682, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE41133.jpg', 'highlights': ["The backpropagation algorithm's development in AI and its impact on memory distribution and knowledge representation.", 'The concept of holograms in the brain, introduced in 1966, sparked an interest in understanding memory distribution.', "The mention of Lashley's experiments on rat brains and the difficulty in isolating specific memories further piqued interest in understanding memory storage in the brain.", "The scientist's journey from physiology and physics to developing the backpropagation algorithm in AI and getting a Nature paper accepted in 1986.", "The acceptance of neural nets through a Nature paper in 1986, showcasing the algorithm's ability to learn representations for words.", 'The unification of different strands of ideas about knowledge through the backpropagation algorithm.']}, {'end': 918.833, 'segs': [{'end': 492.008, 'src': 'embed', 'start': 438.235, 'weight': 0, 'content': [{'end': 441.076, 'text': 'So this is 1986.', 'start': 438.235, 'duration': 2.841}, {'end': 445.697, 'text': 'In the early 90s, Bengio showed that you could actually take real data.', 'start': 441.076, 'duration': 4.621}, {'end': 453.48, 'text': 'you could take English text and apply the same techniques there and get embeddings for real words from English text.', 'start': 445.697, 'duration': 7.783}, {'end': 456.242, 'text': 'And that impressed people a lot.', 'start': 454.381, 'duration': 1.861}, {'end': 464.825, 'text': "I guess recently we've been talking a lot about how fast computers like GPUs and supercomputers is driving deep learning.", 'start': 457.142, 'duration': 7.683}, {'end': 472.867, 'text': "I didn't realize that back in between 1986 and the early 90s, it sounds like between you and Denjo, there was already the beginnings of this trend.", 'start': 464.945, 'duration': 7.922}, {'end': 475.128, 'text': 'Yes, there was a huge advance.', 'start': 473.768, 'duration': 1.36}, {'end': 483.751, 'text': 'I mean, in 1986, I was using a list machine which was less than a tenth of a megaflop.', 'start': 475.288, 'duration': 8.463}, {'end': 490.527, 'text': 'And by about 1993 or thereabouts, people were seeing like 10 megaflops.', 'start': 484.662, 'duration': 5.865}, {'end': 492.008, 'text': 'So it was a factor of 100.', 'start': 490.987, 'duration': 1.021}], 'summary': 'Between 1986 and the early 90s, computing power for deep learning increased by a factor of 100.', 'duration': 53.773, 'max_score': 438.235, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE438235.jpg'}, {'end': 539.115, 'src': 'embed', 'start': 510.109, 'weight': 1, 'content': [{'end': 514.669, 'text': 'So I think the most beautiful one is the work I did with Terry Sinofsky on Baltimore machines.', 'start': 510.109, 'duration': 4.56}, {'end': 524.092, 'text': 'So we discovered there was this really really simple learning algorithm that applied to great big, densely connected nets,', 'start': 515.809, 'duration': 8.283}, {'end': 525.892, 'text': 'where you could only see a few of the nodes.', 'start': 524.092, 'duration': 1.8}, {'end': 528.593, 'text': 'So it would learn hidden representations.', 'start': 526.712, 'duration': 1.881}, {'end': 530.433, 'text': 'And it was a very simple algorithm.', 'start': 528.933, 'duration': 1.5}, {'end': 533.354, 'text': 'And it looked like the kind of thing you should be able to get in a brain,', 'start': 530.913, 'duration': 2.441}, {'end': 539.115, 'text': 'because each synapse only needed to know about the behavior of the two neurons it was directly connected to.', 'start': 533.354, 'duration': 5.761}], 'summary': 'Discovered simple learning algorithm for densely connected nets, yielding hidden representations.', 'duration': 29.006, 'max_score': 510.109, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE510109.jpg'}, {'end': 601.036, 'src': 'embed', 'start': 572.851, 'weight': 2, 'content': [{'end': 577.273, 'text': 'And instead of letting things settle down, just use one iteration in a somewhat simpler net.', 'start': 572.851, 'duration': 4.422}, {'end': 582.656, 'text': 'And that gave restricted pulsing machines, which actually worked effectively in practice.', 'start': 577.833, 'duration': 4.823}, {'end': 588.631, 'text': 'So in the Netflix competition, for example, Restricted Boltzmann machines were one of the ingredients of the winning entry.', 'start': 582.716, 'duration': 5.915}, {'end': 595.934, 'text': 'In fact, a lot of the recent resurgence of neural networks and deep learning starting about, I guess 2007,', 'start': 589.212, 'duration': 6.722}, {'end': 601.036, 'text': 'was the restricted Boltzmann machine and de-restricted Boltzmann machine work that you and your lab did.', 'start': 595.934, 'duration': 5.102}], 'summary': 'Restricted boltzmann machines contributed to the success in the netflix competition and the resurgence of deep learning.', 'duration': 28.185, 'max_score': 572.851, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE572851.jpg'}, {'end': 678.889, 'src': 'embed', 'start': 648.474, 'weight': 3, 'content': [{'end': 655.938, 'text': "And what we'd managed to come up with by training these restricted boson machines was an efficient way of doing inference in sigmoid belief nets.", 'start': 648.474, 'duration': 7.464}, {'end': 667.704, 'text': "So around that time there were people doing neural nets who would use densely connected nets but didn't have any good ways of doing probabilistic inference in them.", 'start': 657.078, 'duration': 10.626}, {'end': 678.889, 'text': 'And you had people doing graphical models, unlike Mike Jordan, who could do inference properly, but only in sparsely connected nets.', 'start': 668.701, 'duration': 10.188}], 'summary': 'Training restricted boson machines allows efficient inference in sigmoid belief nets.', 'duration': 30.415, 'max_score': 648.474, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE648474.jpg'}, {'end': 740.41, 'src': 'embed', 'start': 712.54, 'weight': 4, 'content': [{'end': 718.064, 'text': 'And I guess the third thing was the work I did with Bradford Neal on variational methods.', 'start': 712.54, 'duration': 5.524}, {'end': 725.33, 'text': "It turns out people in statistics had done similar work earlier, but we didn't know about that.", 'start': 719.926, 'duration': 5.404}, {'end': 733.965, 'text': "So We managed to make EM work a whole lot better by showing you didn't need to do a perfect E-step.", 'start': 725.35, 'duration': 8.615}, {'end': 735.686, 'text': 'You could do an approximate E-step.', 'start': 734.245, 'duration': 1.441}, {'end': 740.41, 'text': "And EM was a big algorithm in statistics, and we'd showed a big generalization of it.", 'start': 735.986, 'duration': 4.424}], 'summary': 'Improved em algorithm with approximate e-step, a big generalization in statistics.', 'duration': 27.87, 'max_score': 712.54, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE712540.jpg'}], 'start': 438.235, 'title': 'Evolution of deep learning and impact of restricted boltzmann machines', 'summary': 'Discusses the evolution of deep learning from the 90s to recent advancements, noting a notable 100-fold performance improvement from 1986 to 1993. it also highlights the significance of restricted boltzmann machines in revolutionizing deep learning, including their role in the netflix competition and the improvement of em algorithm in statistics.', 'chapters': [{'end': 552.05, 'start': 438.235, 'title': 'Evolution of deep learning', 'summary': 'Discusses the evolution of deep learning, starting from the early 90s to the recent advancements, highlighting the significant increase in computing power and the development of simple learning algorithms, with a notable 100-fold performance improvement from 1986 to 1993.', 'duration': 113.815, 'highlights': ['In 1993, there was a 100-fold increase in computing power, with machines reaching 10 megaflops, making it easier to use for deep learning.', 'In the early 90s, Bengio demonstrated the application of deep learning techniques to real data, such as English text, leading to the generation of word embeddings.', 'The most exciting invention, according to the speaker, is the work done on Baltimore machines, which involved a simple learning algorithm for densely connected nets, enabling the learning of hidden representations.']}, {'end': 918.833, 'start': 552.51, 'title': 'The impact of restricted boltzmann machines on deep learning', 'summary': 'Highlights the significance of restricted boltzmann machines in revolutionizing deep learning, including their role in the netflix competition, the efficient inference in deep belief nets, and the improvement of em algorithm in statistics.', 'duration': 366.323, 'highlights': ['The Netflix competition winning entry involved the use of Restricted Boltzmann machines. Restricted Boltzmann machines were a key component of the winning entry in the Netflix competition, showcasing their effectiveness in practice.', 'Efficient inference in deep belief nets was made possible through trained restricted Boltzmann machines. Trained restricted Boltzmann machines provided an efficient way of performing inference in deep belief nets, enabling fast approximate inference in a single forward pass.', 'Significant improvement of EM algorithm in statistics was achieved through variational methods. The work on variational methods significantly improved the EM algorithm in statistics, demonstrating that an approximate E-step could be used to make EM work a whole lot better.']}], 'duration': 480.598, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE438235.jpg', 'highlights': ['In 1993, there was a 100-fold increase in computing power, with machines reaching 10 megaflops, making it easier to use for deep learning.', 'The most exciting invention, according to the speaker, is the work done on Baltimore machines, which involved a simple learning algorithm for densely connected nets, enabling the learning of hidden representations.', 'The Netflix competition winning entry involved the use of Restricted Boltzmann machines. Restricted Boltzmann machines were a key component of the winning entry in the Netflix competition, showcasing their effectiveness in practice.', 'Efficient inference in deep belief nets was made possible through trained restricted Boltzmann machines. Trained restricted Boltzmann machines provided an efficient way of performing inference in deep belief nets, enabling fast approximate inference in a single forward pass.', 'Significant improvement of EM algorithm in statistics was achieved through variational methods. The work on variational methods significantly improved the EM algorithm in statistics, demonstrating that an approximate E-step could be used to make EM work a whole lot better.', 'In the early 90s, Bengio demonstrated the application of deep learning techniques to real data, such as English text, leading to the generation of word embeddings.']}, {'end': 1264.862, 'segs': [{'end': 994.517, 'src': 'embed', 'start': 944.414, 'weight': 0, 'content': [{'end': 954.803, 'text': 'If it turns out the backprop is a really good algorithm for doing learning, then for sure, evolution could have figured out how to implement it.', 'start': 944.414, 'duration': 10.389}, {'end': 959.767, 'text': 'I mean, you have cells that can turn into either eyeballs or teeth.', 'start': 954.823, 'duration': 4.944}, {'end': 964.775, 'text': 'Now, if cells can do that, they can for sure implement backpropagation.', 'start': 960.508, 'duration': 4.267}, {'end': 968.096, 'text': "Presumably, there's huge selective pressure for it.", 'start': 965.675, 'duration': 2.421}, {'end': 973.059, 'text': "I think the neuroscientists idea that it doesn't look plausible is just silly.", 'start': 969.317, 'duration': 3.742}, {'end': 975.481, 'text': 'There may be some subtle implementation of it.', 'start': 973.7, 'duration': 1.781}, {'end': 981.164, 'text': "I think the brain probably has something that may not be exactly backpropagation, but it's quite close to it.", 'start': 976.181, 'duration': 4.983}, {'end': 986.247, 'text': "Over the years, I've come up with a number of ideas about how this might work.", 'start': 982.085, 'duration': 4.162}, {'end': 994.517, 'text': 'In 1987, working with Jay McClelland, I came up with the recirculation algorithm,', 'start': 986.287, 'duration': 8.23}], 'summary': 'Backpropagation could be implemented by cells, with potential subtle variations, supported by selective pressure and resembling the recirculation algorithm.', 'duration': 50.103, 'max_score': 944.414, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE944414.jpg'}, {'end': 1077.717, 'src': 'embed', 'start': 1044.963, 'weight': 3, 'content': [{'end': 1050.527, 'text': "But in recirculation, you're trying to make the post-synaptic input, you're trying to make the old one be good and the new one be bad.", 'start': 1044.963, 'duration': 5.564}, {'end': 1052.809, 'text': "So you're changing it in that direction.", 'start': 1051.428, 'duration': 1.381}, {'end': 1059.214, 'text': 'And we invented this algorithm before neuroscientists had come up with spike time-dependent plasticity.', 'start': 1053.87, 'duration': 5.344}, {'end': 1064.418, 'text': 'Spike. time-dependent plasticity is actually the same algorithm, but the other way around,', 'start': 1060.054, 'duration': 4.364}, {'end': 1068.621, 'text': 'where the new thing is good and the old thing is bad in the learning rule.', 'start': 1064.418, 'duration': 4.203}, {'end': 1077.717, 'text': "So you're changing the weight in proportion to the pre-synaptic activity times the new post-synaptic activity minus the old one.", 'start': 1069.482, 'duration': 8.235}], 'summary': 'Recirculation involves changing synaptic input direction for learning. invented algorithm pre-scientist, similar to spike time-dependent plasticity.', 'duration': 32.754, 'max_score': 1044.963, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE1044963.jpg'}, {'end': 1197.625, 'src': 'embed', 'start': 1167.269, 'weight': 4, 'content': [{'end': 1172.273, 'text': 'So weights that adapt rapidly but decay rapidly and therefore can hold short-term memory.', 'start': 1167.269, 'duration': 5.004}, {'end': 1179.319, 'text': 'And I showed in a very simple system in 1973 that you could do true recursion with those weights.', 'start': 1173.114, 'duration': 6.205}, {'end': 1191.742, 'text': 'And what I mean by true recursion is that the neurons that are used for representing things get reused for representing things in the recursive call.', 'start': 1179.799, 'duration': 11.943}, {'end': 1197.625, 'text': 'And the weights that are used representing knowledge get reused in the recursive call.', 'start': 1193.503, 'duration': 4.122}], 'summary': 'Weights that adapt and decay rapidly enable true recursion in neural networks.', 'duration': 30.356, 'max_score': 1167.269, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE1167269.jpg'}, {'end': 1264.862, 'src': 'embed', 'start': 1233.142, 'weight': 5, 'content': [{'end': 1234.923, 'text': "So it's about 40 years later.", 'start': 1233.142, 'duration': 1.781}, {'end': 1249.292, 'text': "And I guess one other idea I've heard you talk about for quite a few years now, over five years, I think, is capsules.", 'start': 1241.047, 'duration': 8.245}, {'end': 1250.933, 'text': 'Where are you with that?', 'start': 1249.753, 'duration': 1.18}, {'end': 1261.74, 'text': "Okay, so I'm back to the state I'm used to being in, which is I have this idea I really believe in, and nobody else believes it.", 'start': 1252.494, 'duration': 9.246}, {'end': 1264.862, 'text': 'And I submit papers about it and they all get rejected.', 'start': 1262.561, 'duration': 2.301}], 'summary': 'Speaker has an idea about capsules for over 5 years but faces rejection.', 'duration': 31.72, 'max_score': 1233.142, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE1233142.jpg'}], 'start': 919.433, 'title': 'Backpropagation, recirculation, and the brain', 'summary': 'Explores the potential relationship between backpropagation and the brain, and discusses recirculation learning, fast weights, and the development of spike time-dependent plasticity in deep learning.', 'chapters': [{'end': 994.517, 'start': 919.433, 'title': 'Backpropagation and the brain', 'summary': 'Discusses the potential relationship between backpropagation and the brain, suggesting that if backpropagation is a good algorithm for learning, then it could have been implemented by evolution, with the speaker proposing several ideas about how this might work.', 'duration': 75.084, 'highlights': ['The speaker believes that if backpropagation is a good algorithm for learning, then evolution could have figured out how to implement it, suggesting a potential relationship between backpropagation and the brain.', 'The speaker suggests that there is a huge selective pressure for the implementation of backpropagation in the brain, indicating the significance of this potential relationship.', "The speaker proposes that while the brain may not exactly implement backpropagation, it is quite close to it, highlighting the speaker's belief in the plausibility of the relationship between backpropagation and the brain."]}, {'end': 1264.862, 'start': 994.517, 'title': 'Recirculation and fast weights in deep learning', 'summary': 'Discusses the concepts of recirculation learning and fast weights in deep learning, including the development of spike time-dependent plasticity and the use of fast weights for true recursion, with a mention of the idea of capsules.', 'duration': 270.345, 'highlights': ['The idea of recirculation learning and its connection to spike time-dependent plasticity, which changes the weight in proportion to the pre-synaptic activity times the new post-synaptic activity minus the old one. Recirculation learning and spike time-dependent plasticity are connected through the change of weight in proportion to the pre-synaptic activity times the new post-synaptic activity minus the old one.', 'The use of fast weights for true recursion, allowing the reuse of neurons and weights in recursive calls, with the ability to store memory in fast weights and recover activity states of neurons from them. Fast weights enable true recursion by reusing neurons and weights in recursive calls, with the capacity to store memory in fast weights and retrieve activity states of neurons from them.', 'The idea of capsules in deep learning, where the speaker expresses belief in the concept and their struggle in getting it accepted through paper submissions. The speaker expresses belief in the concept of capsules in deep learning and discusses the challenges faced in getting the idea accepted through paper submissions.']}], 'duration': 345.429, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE919433.jpg', 'highlights': ['The brain may not exactly implement backpropagation, but it is quite close to it, highlighting the plausibility of the relationship.', 'There is a huge selective pressure for the implementation of backpropagation in the brain, indicating the significance of this potential relationship.', 'If backpropagation is a good algorithm for learning, then evolution could have figured out how to implement it, suggesting a potential relationship between backpropagation and the brain.', 'Recirculation learning and spike time-dependent plasticity are connected through the change of weight in proportion to the pre-synaptic activity times the new post-synaptic activity minus the old one.', 'Fast weights enable true recursion by reusing neurons and weights in recursive calls, with the capacity to store memory in fast weights and retrieve activity states of neurons from them.', 'The speaker expresses belief in the concept of capsules in deep learning and discusses the challenges faced in getting the idea accepted through paper submissions.']}, {'end': 1779.167, 'segs': [{'end': 1315.184, 'src': 'embed', 'start': 1290.813, 'weight': 2, 'content': [{'end': 1297.158, 'text': "So the idea is in each region of the image, you'll assume there's at most one of a particular kind of feature.", 'start': 1290.813, 'duration': 6.345}, {'end': 1306.4, 'text': "And then you'll use a bunch of neurons and their activities will represent the different aspects of that feature.", 'start': 1298.376, 'duration': 8.024}, {'end': 1308.781, 'text': 'Like within that region?', 'start': 1307.421, 'duration': 1.36}, {'end': 1310.422, 'text': 'exactly what are its X and Y coordinates?', 'start': 1308.781, 'duration': 1.641}, {'end': 1311.603, 'text': 'What orientation is it at?', 'start': 1310.462, 'duration': 1.141}, {'end': 1313.083, 'text': 'How fast is it moving?', 'start': 1311.963, 'duration': 1.12}, {'end': 1313.844, 'text': 'What color is it?', 'start': 1313.163, 'duration': 0.681}, {'end': 1315.184, 'text': 'How bright is it? And stuff like that.', 'start': 1313.864, 'duration': 1.32}], 'summary': 'Neurons represent image features like coordinates, orientation, speed, color, and brightness.', 'duration': 24.371, 'max_score': 1290.813, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE1290813.jpg'}, {'end': 1387.151, 'src': 'embed', 'start': 1356.552, 'weight': 0, 'content': [{'end': 1358.854, 'text': 'Rather than I call each of those subsets a capsule.', 'start': 1356.552, 'duration': 2.302}, {'end': 1364.638, 'text': 'And the idea is a capsule is able to represent an instance of a feature, but only one.', 'start': 1359.394, 'duration': 5.244}, {'end': 1369.701, 'text': 'And it represents all the different properties of that feature.', 'start': 1364.658, 'duration': 5.043}, {'end': 1377.449, 'text': "So it's a, it's a feature that has lots of properties as opposed to a normal neuron in a normal neural net, which is just as one scale of property.", 'start': 1370.228, 'duration': 7.221}, {'end': 1378.31, 'text': 'Sure I see.', 'start': 1377.75, 'duration': 0.56}, {'end': 1379.35, 'text': 'Yep Right.', 'start': 1378.51, 'duration': 0.84}, {'end': 1387.151, 'text': "And then what you can do, if you've got that is, you can do something that normal neural nets are very bad at,", 'start': 1379.45, 'duration': 7.701}], 'summary': 'Capsules represent features with multiple properties, improving neural net performance.', 'duration': 30.599, 'max_score': 1356.552, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE1356552.jpg'}, {'end': 1475.241, 'src': 'embed', 'start': 1449.426, 'weight': 3, 'content': [{'end': 1459.17, 'text': 'So I think this routing by agreement is going to be crucial for getting neural nets to generalize much better from limited data.', 'start': 1449.426, 'duration': 9.744}, {'end': 1464.215, 'text': "I think it'd be very good at dealing with changes in viewpoint, very good at doing segmentation,", 'start': 1459.911, 'duration': 4.304}, {'end': 1469.14, 'text': "and I'm hoping it'll be much more statistically efficient than what we currently do in neural nets.", 'start': 1464.215, 'duration': 4.925}, {'end': 1475.241, 'text': 'which is if you want to deal with changes in viewpoint, you just give it a whole bunch of changes in viewpoint and train it on them all.', 'start': 1469.758, 'duration': 5.483}], 'summary': 'Routing by agreement crucial for neural nets to generalize better from limited data, good at dealing with changes in viewpoint and segmentation, and more statistically efficient.', 'duration': 25.815, 'max_score': 1449.426, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE1449426.jpg'}, {'end': 1609.02, 'src': 'embed', 'start': 1581.555, 'weight': 4, 'content': [{'end': 1588.219, 'text': 'So when I was leading Google Brain, our first project, I spent a lot of work on unsupervised learning because of your influence.', 'start': 1581.555, 'duration': 6.664}, {'end': 1592.502, 'text': 'Right And I may have misled you.', 'start': 1589.04, 'duration': 3.462}, {'end': 1597.205, 'text': 'That is, in the long run, I think unsupervised learning is going to be absolutely crucial.', 'start': 1592.802, 'duration': 4.403}, {'end': 1600.688, 'text': 'But you have to sort of face reality.', 'start': 1598.386, 'duration': 2.302}, {'end': 1609.02, 'text': "And what's worked over the last 10 years or so is supervised learning, discriminative training,", 'start': 1602.809, 'duration': 6.211}], 'summary': 'Unsupervised learning crucial in long run, but supervised learning has worked in the last 10 years.', 'duration': 27.465, 'max_score': 1581.555, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE1581555.jpg'}, {'end': 1686.957, 'src': 'heatmap', 'start': 1645.765, 'weight': 5, 'content': [{'end': 1647.486, 'text': "I don't feel like I do.", 'start': 1645.765, 'duration': 1.721}, {'end': 1652.747, 'text': 'Variational autoencoders where you use the reparameterization trick seem to be a really nice idea.', 'start': 1648.206, 'duration': 4.541}, {'end': 1658.168, 'text': 'And generative adversarial nets also seem to me to be a really nice idea.', 'start': 1653.567, 'duration': 4.601}, {'end': 1665.471, 'text': "I think generative adversarial nets are one of the sort of biggest ideas in deep learning that's really new.", 'start': 1658.469, 'duration': 7.002}, {'end': 1669.452, 'text': "I'm hoping I can make capsules that successful.", 'start': 1667.512, 'duration': 1.94}, {'end': 1674.074, 'text': 'But right now, general adversarial nets, I think, have been a big breakthrough.', 'start': 1669.492, 'duration': 4.582}, {'end': 1681.616, 'text': 'What happened to sparsity and slow features, which were two of the other principles for building unsupervised models?', 'start': 1674.954, 'duration': 6.662}, {'end': 1686.957, 'text': 'I was never as big on sparsity as you were.', 'start': 1684.856, 'duration': 2.101}], 'summary': 'Generative adversarial nets are a big breakthrough in deep learning.', 'duration': 41.192, 'max_score': 1645.765, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE1645765.jpg'}, {'end': 1735.628, 'src': 'embed', 'start': 1711.823, 'weight': 6, 'content': [{'end': 1723.846, 'text': 'You take your measurements and you apply nonlinear transformations to your measurements until you get to a representation as a state vector in which the action is linear.', 'start': 1711.823, 'duration': 12.023}, {'end': 1729.247, 'text': "So you don't just pretend it's linear, like you do with Kalman filters,", 'start': 1725.866, 'duration': 3.381}, {'end': 1735.628, 'text': 'but you actually find a transformation from the observables to the underlying variables,', 'start': 1729.247, 'duration': 6.381}], 'summary': 'Apply nonlinear transformations to measurements to achieve linear action representation.', 'duration': 23.805, 'max_score': 1711.823, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE1711823.jpg'}], 'start': 1266.683, 'title': 'Capsules for neural networks', 'summary': 'Explores using capsules for representing multidimensional entities in neural networks, with a focus on routing by agreement for improved generalization and potential applications in unsupervised learning.', 'chapters': [{'end': 1379.35, 'start': 1266.683, 'title': 'Multidimensional entity representation', 'summary': 'Discusses a new approach to representing multidimensional entities using capsules, which can represent different aspects of a feature in neural networks, with the key idea being to use vectors of activities to capture various properties and coordinates of a feature.', 'duration': 112.667, 'highlights': ['Capsules are used to represent different aspects of a feature, such as X and Y coordinates, orientation, speed, and color, using a distributed representation. This new approach uses capsules to represent multiple dimensions of a feature, allowing for the representation of various properties and coordinates within a region of the image.', 'The concept of capsules allows for the representation of an instance of a feature with multiple properties, unlike a normal neuron in a neural net, which represents just one scale of property. Capsules enable the representation of a feature with multiple properties, distinguishing them from normal neurons in neural nets that represent only one scale of property.', 'The proposed approach involves bundling up neurons into little groups to represent different coordinates of the same thing, thus introducing an extra structure into the representation. The new representation approach involves grouping neurons to capture different coordinates of a feature, introducing an additional structure not present in traditional neural networks.']}, {'end': 1779.167, 'start': 1379.45, 'title': 'Capsules for neural net generalization', 'summary': "Discusses the concept of routing by agreement using capsules for segmentation, aiming to improve neural nets' generalization from limited data and deal with changes in viewpoint, with a focus on the potential of unsupervised learning.", 'duration': 399.717, 'highlights': ["The concept of routing by agreement using capsules for segmentation is seen as crucial for improving neural nets' generalization from limited data and dealing with changes in viewpoint, potentially making it more statistically efficient than current neural net methods. routing by agreement using capsules, improving generalization, dealing with changes in viewpoint, statistical efficiency", 'The shift towards unsupervised learning is deemed crucial for the future of AI, although supervised learning has shown significant success in the past decade. shift towards unsupervised learning, importance of supervised learning success', 'The potential of variational autoencoders and generative adversarial nets in unsupervised learning is acknowledged, with generative adversarial nets being highlighted as a significant breakthrough in deep learning. variational autoencoders, generative adversarial nets, breakthrough in deep learning', 'The concept of modeling using nonlinear transformations to achieve a linear representation state vector is discussed, with a focus on the potential of capsules in finding coordinate representations for changing viewpoints. modeling using nonlinear transformations, potential of capsules in finding coordinate representations, changing viewpoints']}], 'duration': 512.484, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE1266683.jpg', 'highlights': ['Capsules represent multiple dimensions of a feature, allowing for the representation of various properties and coordinates within a region of the image.', 'Capsules enable the representation of a feature with multiple properties, distinguishing them from normal neurons in neural nets that represent only one scale of property.', 'The new representation approach involves grouping neurons to capture different coordinates of a feature, introducing an additional structure not present in traditional neural networks.', "Routing by agreement using capsules is crucial for improving neural nets' generalization from limited data and dealing with changes in viewpoint, potentially making it more statistically efficient than current neural net methods.", 'The shift towards unsupervised learning is deemed crucial for the future of AI, although supervised learning has shown significant success in the past decade.', 'The potential of variational autoencoders and generative adversarial nets in unsupervised learning is acknowledged, with generative adversarial nets being highlighted as a significant breakthrough in deep learning.', 'The concept of modeling using nonlinear transformations to achieve a linear representation state vector is discussed, with a focus on the potential of capsules in finding coordinate representations for changing viewpoints.']}, {'end': 2383.399, 'segs': [{'end': 1933.45, 'src': 'heatmap', 'start': 1823.637, 'weight': 0, 'content': [{'end': 1832.96, 'text': 'I think what you want to do is read a little bit of the literature and notice something that you think everybody is doing wrong and contrarian in that sense.', 'start': 1823.637, 'duration': 9.323}, {'end': 1835.534, 'text': "You look at it and it just doesn't feel right.", 'start': 1833.533, 'duration': 2.001}, {'end': 1838.795, 'text': 'And then figure out how to do it right.', 'start': 1836.875, 'duration': 1.92}, {'end': 1843.738, 'text': "And then when people tell you that's no good, just keep at it.", 'start': 1840.056, 'duration': 3.682}, {'end': 1852.442, 'text': "And I have a very good principle for helping people keep at it, which is either your intuitions are good or they're not.", 'start': 1845.779, 'duration': 6.663}, {'end': 1856.623, 'text': "If your intuitions are good, you should follow them and you'll eventually be successful.", 'start': 1853.202, 'duration': 3.421}, {'end': 1859.765, 'text': "If your intuitions are not good, it doesn't matter what you do.", 'start': 1857.304, 'duration': 2.461}, {'end': 1864.396, 'text': 'Right Inspiring advice.', 'start': 1863.095, 'duration': 1.301}, {'end': 1865.596, 'text': 'So might as well go for it.', 'start': 1864.416, 'duration': 1.18}, {'end': 1868.257, 'text': 'You might as well trust your intuitions.', 'start': 1866.617, 'duration': 1.64}, {'end': 1870.438, 'text': "There's no point not trusting them.", 'start': 1868.497, 'duration': 1.941}, {'end': 1878.241, 'text': 'I usually advise people to not just read, but replicate published papers.', 'start': 1871.138, 'duration': 7.103}, {'end': 1884.043, 'text': 'And maybe that puts a natural limiter on how many you could do because replicating results is pretty time consuming.', 'start': 1878.441, 'duration': 5.602}, {'end': 1890.326, 'text': "Yes, it's true that when you try and replicate a published paper, you discover all the little tricks necessary to make it to work.", 'start': 1885.104, 'duration': 5.222}, {'end': 1895.273, 'text': 'The other advice I have is never stop programming.', 'start': 1891.671, 'duration': 3.602}, {'end': 1901.236, 'text': "Because if you give a student something to do, if they're a bad student, they'll come back and say it didn't work.", 'start': 1895.373, 'duration': 5.863}, {'end': 1907.52, 'text': "And the reason it didn't work would be some little decision they made that they didn't realize was crucial.", 'start': 1901.837, 'duration': 5.683}, {'end': 1915.144, 'text': "And if you give it to a good student, like Yi-Yi Tei, for example, you can give him anything and he'll come back and he'll say it worked.", 'start': 1908.28, 'duration': 6.864}, {'end': 1916.885, 'text': 'I remember doing this once.', 'start': 1915.864, 'duration': 1.021}, {'end': 1918.838, 'text': 'And I said, but wait a minute, EY.', 'start': 1917.697, 'duration': 1.141}, {'end': 1922.701, 'text': "Since we last talked, I realized it couldn't possibly work for the following reason.", 'start': 1919.739, 'duration': 2.962}, {'end': 1925.704, 'text': 'And EY said, oh, yeah, well, I realized that right away.', 'start': 1923.502, 'duration': 2.202}, {'end': 1927.025, 'text': "So I assumed you didn't mean that.", 'start': 1925.724, 'duration': 1.301}, {'end': 1928.105, 'text': 'I see.', 'start': 1927.765, 'duration': 0.34}, {'end': 1929.987, 'text': "Yeah, that's good.", 'start': 1929.427, 'duration': 0.56}, {'end': 1933.45, 'text': "Yeah Let's see.", 'start': 1930.027, 'duration': 3.423}], 'summary': 'Trust your intuitions, replicate published papers, and never stop programming.', 'duration': 83.883, 'max_score': 1823.637, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE1823637.jpg'}, {'end': 2049.676, 'src': 'embed', 'start': 2023.478, 'weight': 4, 'content': [{'end': 2032.303, 'text': 'Oh and and research topics you know new grad students should work on what capsules and maybe unsupervised learning any other.', 'start': 2023.478, 'duration': 8.825}, {'end': 2042.872, 'text': 'One good piece of advice for new grad students is see if you can find an advisor who has beliefs similar to yours,', 'start': 2034.287, 'duration': 8.585}, {'end': 2049.676, 'text': "because if you work on stuff that your advisor feels deeply about, you'll get a lot of good advice and time from your advisor.", 'start': 2042.872, 'duration': 6.804}], 'summary': 'New grad students should focus on capsules, unsupervised learning, and find an advisor with similar beliefs for better guidance and support.', 'duration': 26.198, 'max_score': 2023.478, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE2023478.jpg'}, {'end': 2119.171, 'src': 'embed', 'start': 2074.782, 'weight': 5, 'content': [{'end': 2082.888, 'text': "I think right now what's happening is there aren't enough academics trained in deep learning to educate all the people.", 'start': 2074.782, 'duration': 8.106}, {'end': 2084.708, 'text': 'we need educated in universities.', 'start': 2082.888, 'duration': 1.82}, {'end': 2087.431, 'text': "There just isn't the faculty bandwidth there.", 'start': 2085.208, 'duration': 2.223}, {'end': 2090.426, 'text': "But I think that's going to be temporary.", 'start': 2089.045, 'duration': 1.381}, {'end': 2097.49, 'text': "I think what's happened is most departments are being very slow to understand the kind of revolution that's going on.", 'start': 2090.947, 'duration': 6.543}, {'end': 2103.574, 'text': "I kind of agree with you that it's not quite a second industrial revolution, but it's something on nearly that scale.", 'start': 2098.111, 'duration': 5.463}, {'end': 2111.638, 'text': "And there's a huge sea change going on, basically because our relationship to computers has changed.", 'start': 2104.354, 'duration': 7.284}, {'end': 2116.441, 'text': 'Instead of programming them, we now show them and they figure it out.', 'start': 2111.778, 'duration': 4.663}, {'end': 2119.171, 'text': "That's a completely different way of using computers.", 'start': 2117.13, 'duration': 2.041}], 'summary': 'Academic shortage in deep learning, slow university adoption, but change is coming.', 'duration': 44.389, 'max_score': 2074.782, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE2074782.jpg'}, {'end': 2310.035, 'src': 'embed', 'start': 2284.617, 'weight': 6, 'content': [{'end': 2293.726, 'text': "What's happened now is there's a completely different view, which is that What a thought is, is just a great big vector of neural activity.", 'start': 2284.617, 'duration': 9.109}, {'end': 2297.888, 'text': 'So contrast that with a thought being a symbolic expression.', 'start': 2295.287, 'duration': 2.601}, {'end': 2303.391, 'text': 'And I think the people who thought that thoughts were symbolic expressions just made a huge mistake.', 'start': 2298.889, 'duration': 4.502}, {'end': 2310.035, 'text': 'What comes in is a string of words, and what comes out is a string of words.', 'start': 2304.452, 'duration': 5.583}], 'summary': 'Thoughts are now viewed as neural vectors, not symbolic expressions.', 'duration': 25.418, 'max_score': 2284.617, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE2284617.jpg'}, {'end': 2383.399, 'src': 'heatmap', 'start': 2359.31, 'weight': 0.75, 'content': [{'end': 2364.913, 'text': 'i see, yep, i guess ai is certainly coming around to this new point of view these days.', 'start': 2359.31, 'duration': 5.603}, {'end': 2370.416, 'text': 'some of it i see, i think a lot of people in there i still think thoughts have to be symbolic expressions.', 'start': 2364.913, 'duration': 5.503}, {'end': 2372.98, 'text': 'Thank you very much for doing this interview.', 'start': 2371.457, 'duration': 1.523}, {'end': 2379.051, 'text': "It's fascinating to hear how deep learning has evolved over the years, as well as how you're still helping drive it into the future.", 'start': 2373.04, 'duration': 6.011}, {'end': 2379.933, 'text': 'So, thank you, Jeff.', 'start': 2379.191, 'duration': 0.742}, {'end': 2382.597, 'text': 'Well, thank you for giving me this opportunity.', 'start': 2380.914, 'duration': 1.683}, {'end': 2383.399, 'text': 'Okay, thank you.', 'start': 2382.858, 'duration': 0.541}], 'summary': 'Ai has evolved, deep learning has advanced, jeff is driving it into the future', 'duration': 24.089, 'max_score': 2359.31, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE2359310.jpg'}], 'start': 1780.467, 'title': 'Deep learning insights', 'summary': 'Provides key advice on breaking into deep learning, emphasizing the importance of developing intuitions, replicating published papers, and trusting instincts. it also discusses insights from jeff dean on deep learning evolution, including advice for new grad students, challenges in academia, and the paradigm shift in ai.', 'chapters': [{'end': 2021.818, 'start': 1780.467, 'title': 'Breaking into deep learning: key advice', 'summary': "Emphasizes the importance of developing intuitions, replicating published papers, and trusting one's instincts to break into deep learning, with a notable example of a successful idea initially deemed 'complete nonsense'.", 'duration': 241.351, 'highlights': ['Replicate published papers to develop a deep understanding of the subject. The speaker advises people to not just read but replicate published papers as it helps in developing intuitions and understanding the subject deeply.', "Trust your intuitions and go for it, even if others disagree. The speaker encourages trusting one's intuitions and pursuing ideas, even if they are considered nonsense by others, citing an example of a successful idea initially dismissed by others.", "Importance of noticing and correcting what is perceived as wrong in the literature. The speaker suggests that creative researchers should notice something in the literature that they think is wrong, contrarian, and doesn't feel right, and then figure out how to do it right.", 'Never stop programming to develop problem-solving skills. The speaker emphasizes the importance of continuous programming to develop problem-solving skills, distinguishing between good and bad students based on their ability to identify crucial decisions.']}, {'end': 2383.399, 'start': 2023.478, 'title': 'Jeff dean on deep learning evolution', 'summary': 'Discusses the advice for new grad students, the challenges in academia for deep learning education, the shift in the relationship with computers, and the paradigm shift in ai from symbolic expressions to neural activity vectors.', 'duration': 359.921, 'highlights': ['The need for new grad students to find an advisor with similar beliefs for valuable advice and time. New grad students should seek advisors with similar beliefs to receive valuable advice and time, as working on topics of interest to the advisor leads to more useful guidance.', 'Challenges in academia due to the slow understanding of the deep learning revolution and the need for faculty trained in deep learning. There is a shortage of academics trained in deep learning to educate the required number of individuals, leading to a reliance on big companies for training.', 'The paradigm shift in AI from symbolic expressions to neural activity vectors and the misconception of thoughts as symbolic expressions. There is a shift in the understanding of AI, with the belief that thoughts are represented as neural activity vectors rather than symbolic expressions, challenging the traditional view of intelligence and reasoning.', 'The shift in the relationship with computers from programming to showing them, causing a sea change in the use of computers. The shift in the relationship with computers from programming to showing them signifies a significant change in computer usage, with a growing emphasis on showing computers to perform tasks rather than programming them.']}], 'duration': 602.932, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/-eyhCTvrEtE/pics/-eyhCTvrEtE1780467.jpg', 'highlights': ['Replicate published papers to develop a deep understanding of the subject.', 'Trust your intuitions and go for it, even if others disagree.', 'Importance of noticing and correcting what is perceived as wrong in the literature.', 'Never stop programming to develop problem-solving skills.', 'The need for new grad students to find an advisor with similar beliefs for valuable advice and time.', 'Challenges in academia due to the slow understanding of the deep learning revolution and the need for faculty trained in deep learning.', 'The paradigm shift in AI from symbolic expressions to neural activity vectors and the misconception of thoughts as symbolic expressions.', 'The shift in the relationship with computers from programming to showing them, causing a sea change in the use of computers.']}], 'highlights': ['The evolution of deep learning with a 100-fold performance improvement.', 'The impact of restricted Boltzmann machines and potential brain-related aspects of backpropagation.', 'The concept of capsules for neural networks and deep learning insights.', "The backpropagation algorithm's development in AI and its impact on memory distribution and knowledge representation.", 'The unification of different strands of ideas about knowledge through the backpropagation algorithm.', "The acceptance of neural nets through a Nature paper in 1986, showcasing the algorithm's ability to learn representations for words.", 'In 1993, there was a 100-fold increase in computing power, with machines reaching 10 megaflops, making it easier to use for deep learning.', 'The most exciting invention, according to the speaker, is the work done on Baltimore machines, which involved a simple learning algorithm for densely connected nets, enabling the learning of hidden representations.', 'The Netflix competition winning entry involved the use of Restricted Boltzmann machines, showcasing their effectiveness in practice.', 'Efficient inference in deep belief nets was made possible through trained restricted Boltzmann machines, enabling fast approximate inference in a single forward pass.', 'Significant improvement of EM algorithm in statistics was achieved through variational methods.', 'Bengio demonstrated the application of deep learning techniques to real data, such as English text, leading to the generation of word embeddings.', 'The brain may not exactly implement backpropagation, but it is quite close to it, highlighting the plausibility of the relationship.', 'There is a huge selective pressure for the implementation of backpropagation in the brain, indicating the significance of this potential relationship.', 'If backpropagation is a good algorithm for learning, then evolution could have figured out how to implement it, suggesting a potential relationship between backpropagation and the brain.', 'Recirculation learning and spike time-dependent plasticity are connected through the change of weight in proportion to the pre-synaptic activity times the new post-synaptic activity minus the old one.', 'Fast weights enable true recursion by reusing neurons and weights in recursive calls, with the capacity to store memory in fast weights and retrieve activity states of neurons from them.', 'The concept of capsules in deep learning and the challenges faced in getting the idea accepted through paper submissions.', 'Capsules represent multiple dimensions of a feature, allowing for the representation of various properties and coordinates within a region of the image.', "Routing by agreement using capsules is crucial for improving neural nets' generalization from limited data and dealing with changes in viewpoint, potentially making it more statistically efficient than current neural net methods.", 'The shift towards unsupervised learning is deemed crucial for the future of AI, although supervised learning has shown significant success in the past decade.', 'The potential of variational autoencoders and generative adversarial nets in unsupervised learning is acknowledged, with generative adversarial nets being highlighted as a significant breakthrough in deep learning.', 'The concept of modeling using nonlinear transformations to achieve a linear representation state vector is discussed, with a focus on the potential of capsules in finding coordinate representations for changing viewpoints.', 'Replicate published papers to develop a deep understanding of the subject.', 'Trust your intuitions and go for it, even if others disagree.', 'Importance of noticing and correcting what is perceived as wrong in the literature.', 'Never stop programming to develop problem-solving skills.', 'The need for new grad students to find an advisor with similar beliefs for valuable advice and time.', 'Challenges in academia due to the slow understanding of the deep learning revolution and the need for faculty trained in deep learning.', 'The paradigm shift in AI from symbolic expressions to neural activity vectors and the misconception of thoughts as symbolic expressions.', 'The shift in the relationship with computers from programming to showing them, causing a sea change in the use of computers.']}