title
Analyzing Models with TensorBoard - Deep Learning with Python, TensorFlow and Keras p.4

description
Welcome to part 4 of the deep learning basics with Python, TensorFlow, and Keras tutorial series. In this part, we're going to be talking about TensorBoard. TensorBoard is a handy application that allows you to view aspects of your model, or models, in your browser. Text tutorials and sample code: https://pythonprogramming.net/tensorboard-analysis-deep-learning-python-tensorflow-keras/ Discord: https://discord.gg/sentdex Support the content: https://pythonprogramming.net/support-donate/ Twitter: https://twitter.com/sentdex Facebook: https://www.facebook.com/pythonprogramming.net/ Twitch: https://www.twitch.tv/sentdex G+: https://plus.google.com/+sentdex

detail
{'title': 'Analyzing Models with TensorBoard - Deep Learning with Python, TensorFlow and Keras p.4', 'heatmap': [{'end': 532.014, 'start': 472.481, 'weight': 0.706}, {'end': 593.984, 'start': 561.366, 'weight': 0.737}], 'summary': 'Covers optimizing deep learning models using tensorboard for visualization, allocating gpu resources, the importance of activation functions, and utilizing tensorboard callback in keras to achieve a validation accuracy of 72% and in-sample accuracy of 79% through various parameter optimizations.', 'chapters': [{'end': 99.518, 'segs': [{'end': 45.961, 'src': 'embed', 'start': 1.841, 'weight': 0, 'content': [{'end': 9.144, 'text': "what's going on everybody and welcome to part four of the deep learning with python, tensorflow and keras tutorial series.", 'start': 1.841, 'duration': 7.303}, {'end': 17.748, 'text': "in this video and the next video, what we're gonna be doing is talking about how we can analyze and optimize our models using tensor board.", 'start': 9.144, 'duration': 8.604}, {'end': 23.871, 'text': 'now, tensorboard is a way for us to visualize basically the training of our model over time,', 'start': 17.748, 'duration': 6.123}, {'end': 32.235, 'text': "and basically you're mostly going to be using this to watch things like accuracy versus validation, accuracy and loss versus validation loss.", 'start': 24.651, 'duration': 7.584}, {'end': 36.779, 'text': "but there's also some more tricky, tricky things that we can do that maybe we'll cover more in time,", 'start': 32.235, 'duration': 4.544}, {'end': 44.6, 'text': "but The idea of tensor board is to just kind of help you shed some light on what's going on in your model.", 'start': 36.779, 'duration': 7.821}, {'end': 45.961, 'text': "So with that, let's go ahead and get started.", 'start': 44.6, 'duration': 1.361}], 'summary': 'Learn how to analyze and optimize models using tensorboard for visualizing training progress and accuracy.', 'duration': 44.12, 'max_score': 1.841, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/BqgTU7_cBnk/pics/BqgTU7_cBnk1841.jpg'}, {'end': 81.151, 'src': 'embed', 'start': 61.166, 'weight': 2, 'content': [{'end': 71.289, 'text': 'so but if you wanted to run like multiple models at the exact same time and You could specify a fraction of your gpu that you want that model to take up in this case a third Mostly,', 'start': 61.166, 'duration': 10.123}, {'end': 71.969, 'text': "i'm just doing this.", 'start': 71.289, 'duration': 0.68}, {'end': 78.39, 'text': "So when I rerun a model, sometimes it takes a minute for whatever reason, Especially if i'm using like sublime or something like that,", 'start': 72.029, 'duration': 6.361}, {'end': 81.151, 'text': 'And it just takes a minute to clear out the the vram.', 'start': 78.39, 'duration': 2.761}], 'summary': 'You can run multiple models at the same time, specifying a fraction of gpu usage, mainly 1/3, to clear out vram which may take a minute.', 'duration': 19.985, 'max_score': 61.166, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/BqgTU7_cBnk/pics/BqgTU7_cBnk61166.jpg'}], 'start': 1.841, 'title': 'Optimizing deep learning models', 'summary': 'Delves into optimizing deep learning models through the use of tensorboard for visualizing training progress and model performance, along with techniques for allocating gpu resources to run multiple models concurrently.', 'chapters': [{'end': 99.518, 'start': 1.841, 'title': 'Deep learning model optimization', 'summary': 'Discusses 
optimizing deep learning models using tensorboard to visualize training progress and model performance, also mentioning the ability to allocate gpu resources for running multiple models simultaneously.', 'duration': 97.677, 'highlights': ['The chapter discusses optimizing deep learning models using TensorBoard to visualize training progress and model performance TensorBoard is used to visualize training progress, accuracy versus validation accuracy, and loss versus validation loss.', 'Mention of allocating GPU resources for running multiple models simultaneously The tutorial mentions the ability to specify a fraction of GPU for running multiple models at the same time, providing an example of allocating a third of the GPU for a model.']}], 'duration': 97.677, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/BqgTU7_cBnk/pics/BqgTU7_cBnk1841.jpg', 'highlights': ['The chapter delves into optimizing deep learning models using TensorBoard to visualize training progress and model performance.', 'TensorBoard is used to visualize training progress, accuracy versus validation accuracy, and loss versus validation loss.', 'The tutorial mentions the ability to specify a fraction of GPU for running multiple models at the same time, providing an example of allocating a third of the GPU for a model.']}, {'end': 582.813, 'segs': [{'end': 163.577, 'src': 'embed', 'start': 100.581, 'weight': 0, 'content': [{'end': 105.062, 'text': "a little tip if you find yourself needing that anyway, let's get started.", 'start': 100.581, 'duration': 4.481}, {'end': 109.924, 'text': 'so the first thing i want to go ahead and do is also a little bit of housekeeping.', 'start': 105.062, 'duration': 4.862}, {'end': 112.625, 'text': 'we need an activation function after this dense layer.', 'start': 109.924, 'duration': 2.701}, {'end': 118.227, 'text': "it was kind of silly of me to forget that someone pointed that out, and yeah, that's definitely going to be very problematic.", 'start': 112.625, 'duration': 5.602}, {'end': 125.33, 'text': 'basically, without the activation function, it becomes like a linear activation function, which is pretty much useless in this case.', 'start': 118.227, 'duration': 7.103}, {'end': 127.411, 'text': "If anything, it's probably causing trouble.", 'start': 125.89, 'duration': 1.521}, {'end': 130.353, 'text': "We're definitely not trying to do any sort of regression or anything like that.", 'start': 127.431, 'duration': 2.922}, {'end': 133.675, 'text': 'So doing that right before the dense layer, ouch.', 'start': 130.453, 'duration': 3.222}, {'end': 135.256, 'text': "So we don't really want to do that.", 'start': 134.095, 'duration': 1.161}, {'end': 137.377, 'text': "So let's go ahead and save this.", 'start': 135.736, 'duration': 1.641}, {'end': 142.54, 'text': 'And with just that change, we should find that accuracy is quite a bit better.', 'start': 137.757, 'duration': 4.783}, {'end': 152.166, 'text': "Now I'm going to let that run, and then I'm going to bop on over to the TensorFlow documentation for the various Keras callbacks.", 'start': 142.58, 'duration': 9.586}, {'end': 160.635, 'text': "So the way that we can interface with TensorBoard is with a Keras callback, and specifically we're going for the TensorBoard one.", 'start': 153.393, 'duration': 7.242}, {'end': 163.577, 'text': 'But take note of the other ones that exist here.', 'start': 161.436, 'duration': 2.141}], 'summary': 'Adding an activation function improved accuracy significantly, leading to a 
better model performance.', 'duration': 62.996, 'max_score': 100.581, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/BqgTU7_cBnk/pics/BqgTU7_cBnk100581.jpg'}, {'end': 277.376, 'src': 'embed', 'start': 242.37, 'weight': 4, 'content': [{'end': 246.612, 'text': 'So the first thing that we want to go ahead and do is bring in TensorBoard.', 'start': 242.37, 'duration': 4.242}, {'end': 252.153, 'text': "And so we're going to go from tensorflow.keras.callbacks.", 'start': 247.27, 'duration': 4.883}, {'end': 258.077, 'text': 'We want to import tensorboard.', 'start': 252.313, 'duration': 5.764}, {'end': 260.619, 'text': "Okay, so we've brought that in.", 'start': 259.498, 'duration': 1.121}, {'end': 265.362, 'text': 'And I just want to point out validation accuracy, 72.', 'start': 261.139, 'duration': 4.223}, {'end': 268.664, 'text': 'And then in sample accuracy, basically 79.', 'start': 265.362, 'duration': 3.302}, {'end': 270.025, 'text': 'So much better than before.', 'start': 268.664, 'duration': 1.361}, {'end': 274.248, 'text': 'And again, that was only three epochs ago.', 'start': 270.085, 'duration': 4.163}, {'end': 277.376, 'text': 'Right? Yeah.', 'start': 276.431, 'duration': 0.945}], 'summary': 'Imported tensorboard, achieved validation accuracy of 72% and in-sample accuracy of 79% in just three epochs.', 'duration': 35.006, 'max_score': 242.37, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/BqgTU7_cBnk/pics/BqgTU7_cBnk242370.jpg'}, {'end': 532.014, 'src': 'heatmap', 'start': 472.481, 'weight': 0.706, 'content': [{'end': 480.826, 'text': 'Okay, so as we train, you should now have a new directory right here called logs.', 'start': 472.481, 'duration': 8.345}, {'end': 484.749, 'text': "And if we click on that, we can see, oh, there's our model right now.", 'start': 481.207, 'duration': 3.542}, {'end': 492.534, 'text': "So to view it as it's going, you want to open up your command window, either a terminal or command prompt rather.", 'start': 485.389, 'duration': 7.145}, {'end': 497.556, 'text': 'At least on Windows, the way you do this and you can just type cmd from there.', 'start': 493.693, 'duration': 3.863}, {'end': 500.399, 'text': "just make sure you're in that main directory right?", 'start': 497.556, 'duration': 2.843}, {'end': 502.901, 'text': "So you're in the directory that houses logs.", 'start': 500.439, 'duration': 2.462}, {'end': 514.41, 'text': 'Now, to get TensorBoard running, just type tensorboard dash dash logder equals and then logs.', 'start': 503.782, 'duration': 10.628}, {'end': 517.354, 'text': 'I wonder if that has an underscore too.', 'start': 514.431, 'duration': 2.923}, {'end': 519.436, 'text': "Yeah, okay, that must be why I didn't do the underscore.", 'start': 517.374, 'duration': 2.062}, {'end': 530.853, 'text': "okay. 
so then it says okay, it's running at this address, so you can just like copy and paste that address, if you want, into a browser,", 'start': 520.365, 'duration': 10.488}, {'end': 532.014, 'text': 'or you could just manually type it.', 'start': 530.853, 'duration': 1.161}], 'summary': "Training generates a new directory 'logs' with the model; to view it, open command window, type 'tensorboard --logdir=logs', and access the given address in a browser.", 'duration': 59.533, 'max_score': 472.481, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/BqgTU7_cBnk/pics/BqgTU7_cBnk472481.jpg'}], 'start': 100.581, 'title': 'Neural network activation function and improving model accuracy with tensorboard', 'summary': 'Discusses the importance of an activation function in a neural network and the consequences of not having one, emphasizing the need to avoid linear activation functions. it also demonstrates using the tensorboard callback in keras to improve model accuracy, achieving a validation accuracy of 72% and in-sample accuracy of 79% by optimizing various parameters such as learning rate, early stopping, and model checkpointing.', 'chapters': [{'end': 137.377, 'start': 100.581, 'title': 'Neural network activation function', 'summary': 'Discusses the importance of an activation function in a neural network, highlighting the consequences of not having one and the need to avoid linear activation functions to prevent regression issues.', 'duration': 36.796, 'highlights': ['Without an activation function, the neural network defaults to a linear activation function, which is not suitable for the task at hand and can cause problems.', 'The absence of an activation function was pointed out as an oversight, highlighting the significance of this requirement in neural network configurations.', 'Emphasizes the need to avoid using linear activation functions to prevent regression issues and ensure the effectiveness of the neural network.', 'Housekeeping note: an activation function is essential after the dense layer to avoid potential problems and ensure the proper functioning of the neural network.']}, {'end': 582.813, 'start': 137.757, 'title': 'Improving model accuracy with tensorboard', 'summary': 'Demonstrates how to use the tensorboard callback in keras to improve model accuracy, achieving a validation accuracy of 72% and in-sample accuracy of 79% by optimizing various parameters such as learning rate, early stopping, and model checkpointing.', 'duration': 445.056, 'highlights': ['The chapter demonstrates how to use the TensorBoard callback in Keras to improve model accuracy. The chapter focuses on using the TensorBoard callback in Keras to enhance model accuracy.', 'Achieving a validation accuracy of 72% and in-sample accuracy of 79% by optimizing various parameters such as learning rate, early stopping, and model checkpointing. The model achieved a validation accuracy of 72% and in-sample accuracy of 79% by optimizing parameters like learning rate, early stopping, and model checkpointing.', 'The way to interface with TensorBoard is through a Keras callback, specifically the TensorBoard one, and there are other useful callbacks available. 
The interface with TensorBoard is through a Keras callback, specifically the TensorBoard callback, with other useful callbacks also available.']}], 'duration': 482.232, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/BqgTU7_cBnk/pics/BqgTU7_cBnk100581.jpg', 'highlights': ['An activation function is essential after the dense layer to avoid potential problems and ensure the proper functioning of the neural network.', 'The absence of an activation function was pointed out as an oversight, highlighting the significance of this requirement in neural network configurations.', 'Emphasizes the need to avoid using linear activation functions to prevent regression issues and ensure the effectiveness of the neural network.', 'Without an activation function, the neural network defaults to a linear activation function, which is not suitable for the task at hand and can cause problems.', 'Achieving a validation accuracy of 72% and in-sample accuracy of 79% by optimizing various parameters such as learning rate, early stopping, and model checkpointing.', 'The chapter demonstrates how to use the TensorBoard callback in Keras to improve model accuracy.', 'The way to interface with TensorBoard is through a Keras callback, specifically the TensorBoard callback, with other useful callbacks also available.']}, {'end': 968.231, 'segs': [{'end': 695.084, 'src': 'embed', 'start': 648.603, 'weight': 0, 'content': [{'end': 653.407, 'text': 'now the question you might ask is well, why, why after, why is this happening?', 'start': 648.603, 'duration': 4.804}, {'end': 657.77, 'text': 'well, the model is only getting the you know in sample training data.', 'start': 653.407, 'duration': 4.363}, {'end': 667.254, 'text': 'so at some point it goes from generalizing and really seeing patterns to memorizing all of the input samples or many of the input samples.', 'start': 657.77, 'duration': 9.484}, {'end': 672.235, 'text': "So what you're going to want to do is watch always validation loss.", 'start': 667.974, 'duration': 4.261}, {'end': 679.058, 'text': "At least for me when I'm like evaluating a model, I'm looking almost purely at validation loss and really nothing else.", 'start': 672.275, 'duration': 6.783}, {'end': 683.419, 'text': 'so now what you could do is you could begin to kind of change things up.', 'start': 679.978, 'duration': 3.441}, {'end': 687.121, 'text': 'so, for example, we could do one, we could do a model.', 'start': 683.419, 'duration': 3.702}, {'end': 689.302, 'text': 'uh, without this dense layer at all.', 'start': 687.121, 'duration': 2.181}, {'end': 691.663, 'text': "so let's go ahead and just remove the dense layer.", 'start': 689.302, 'duration': 2.361}, {'end': 692.903, 'text': "and this time let's do.", 'start': 691.663, 'duration': 1.24}, {'end': 695.084, 'text': "uh, let's do 20 epochs.", 'start': 692.903, 'duration': 2.181}], 'summary': 'Model overfits due to memorizing input samples. 
focus on validation loss for evaluation.', 'duration': 46.481, 'max_score': 648.603, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/BqgTU7_cBnk/pics/BqgTU7_cBnk648603.jpg'}, {'end': 892.014, 'src': 'embed', 'start': 868.396, 'weight': 3, 'content': [{'end': 875.208, 'text': "Like you can always, It's almost like the better the in-sample looks to the out-of-sample, that's clearly the worse the model is.", 'start': 868.396, 'duration': 6.812}, {'end': 877.409, 'text': 'That means over-fitment is occurring.', 'start': 875.448, 'duration': 1.961}, {'end': 879.63, 'text': 'So this is what I tend to look for.', 'start': 878.029, 'duration': 1.601}, {'end': 887.192, 'text': 'Just for some other quick notes, if you just wanted to look at one, you could check or uncheck, or you can also just click this.', 'start': 880.39, 'duration': 6.802}, {'end': 890.073, 'text': 'If you had 50, you could click just one quickly.', 'start': 887.392, 'duration': 2.681}, {'end': 892.014, 'text': 'You can toggle all runs down here.', 'start': 890.253, 'duration': 1.761}], 'summary': 'Over-fitment occurs when in-sample looks better than out-of-sample.', 'duration': 23.618, 'max_score': 868.396, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/BqgTU7_cBnk/pics/BqgTU7_cBnk868396.jpg'}], 'start': 584.256, 'title': 'Model training and validation', 'summary': 'Covers monitoring validation loss, the impact of overfitting, and modifying model architecture for improved evaluation. it also discusses training a model with 20 epochs, observing performance without a dense layer, and using tensorboard for analyzing validation accuracy and loss.', 'chapters': [{'end': 695.084, 'start': 584.256, 'title': 'Model evaluation and validation', 'summary': 'Discusses the importance of monitoring validation loss in model evaluation, as well as the impact of overfitting on validation accuracy, with a suggestion to modify the model architecture for improved evaluation.', 'duration': 110.828, 'highlights': ['The importance of monitoring validation loss in evaluating a model, emphasizing it as a key metric for assessment. Emphasizes the significance of monitoring validation loss for model evaluation, stating that it is the primary metric considered during evaluation.', 'Discussion on the impact of overfitting, with validation loss creeping back up after the fifth epoch despite an improvement in validation accuracy. Highlights the impact of overfitting on validation loss, despite an improvement in validation accuracy, indicating the need to address overfitting in model training.', 'Suggestion to modify the model architecture by removing a dense layer and running the model for 20 epochs to address overfitting issues. 
Proposes modifying the model architecture by removing a dense layer and conducting training for 20 epochs as a potential solution to address overfitting issues.']}, {'end': 968.231, 'start': 695.084, 'title': 'Optimizing model training', 'summary': 'Covers training a model with 20 epochs, observing the performance of the model without a dense layer, and using tensorboard to analyze validation accuracy and loss, emphasizing the importance of out-of-sample performance over in-sample performance and explaining the functionalities of tensorboard for model evaluation.', 'duration': 273.147, 'highlights': ['The chapter emphasizes the importance of out-of-sample performance over in-sample performance, highlighting that the model without a dense layer showed better out-of-sample performance, indicating potential overfitting in the in-sample data.', 'The tutorial introduces the use of TensorBoard to analyze validation accuracy and loss, showing the plateauing of validation accuracy after the eighth epoch and the better performance of out-of-sample loss compared to in-sample loss.', 'The speaker discusses the functionalities of TensorBoard, including the option to toggle all runs, adjust smoothing for data visualization, and filter through models based on specific criteria, providing insights into the practical use of TensorBoard for model evaluation.']}], 'duration': 383.975, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/BqgTU7_cBnk/pics/BqgTU7_cBnk584256.jpg', 'highlights': ['Monitoring validation loss is crucial for evaluating a model, serving as the primary metric during assessment.', 'Overfitting impacts validation loss despite improvements in accuracy, necessitating its mitigation during model training.', 'Modifying model architecture by removing a dense layer and running for 20 epochs can address overfitting issues.', 'Out-of-sample performance is emphasized over in-sample performance, indicating potential overfitting in the in-sample data.', 'TensorBoard is introduced for analyzing validation accuracy and loss, providing insights into model performance and practical use.']}], 'highlights': ['The chapter delves into optimizing deep learning models using TensorBoard to visualize training progress and model performance. 0.9', 'TensorBoard is used to visualize training progress, accuracy versus validation accuracy, and loss versus validation loss. 0.85', 'The tutorial mentions the ability to specify a fraction of GPU for running multiple models at the same time, providing an example of allocating a third of the GPU for a model. 0.8', 'The chapter demonstrates how to use the TensorBoard callback in Keras to improve model accuracy. 0.75', 'An activation function is essential after the dense layer to avoid potential problems and ensure the proper functioning of the neural network. 0.7', 'Monitoring validation loss is crucial for evaluating a model, serving as the primary metric during assessment. 0.65', 'Overfitting impacts validation loss despite improvements in accuracy, necessitating its mitigation during model training. 0.6', 'Modifying model architecture by removing a dense layer and running for 20 epochs can address overfitting issues. 0.55', 'Out-of-sample performance is emphasized over in-sample performance, indicating potential overfitting in the in-sample data. 0.5', 'The way to interface with TensorBoard is through a Keras callback, specifically the TensorBoard callback, with other useful callbacks also available. 0.45']}
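
notes

The walkthrough opens with a tip about reserving only a fraction of the GPU (roughly a third) so several models can share the card, or so a re-run doesn't stall while waiting for VRAM to clear. Below is a minimal sketch of how that is typically done with TensorFlow 1.x-style Keras; the 0.333 fraction and the session setup follow the description in the video, not code copied from it.

import tensorflow as tf
from tensorflow.keras.backend import set_session

# Cap this process at roughly a third of the GPU's memory so other models
# (or a previous run that hasn't released VRAM yet) can coexist on the card.
gpu_options = tf.GPUOptions(per_process_gpu_memory_fraction=0.333)
sess = tf.Session(config=tf.ConfigProto(gpu_options=gpu_options))
set_session(sess)  # tell Keras to use this session

In TensorFlow 2 the same idea is handled through tf.config (per-device memory limits or memory growth) rather than sessions.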
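The first fix in the video itself is housekeeping: the dense layer near the end of the model had no activation, which leaves it effectively linear and hurts accuracy. Here is a sketch of what the corrected model tail might look like; the layer sizes, input shape, and loss/optimizer choices are illustrative assumptions, not a transcription of the exact model on screen.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Activation

model = Sequential()

# Convolutional front end (filter count and input shape are placeholders).
model.add(Conv2D(64, (3, 3), input_shape=(50, 50, 1)))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Flatten())

model.add(Dense(64))
model.add(Activation('relu'))   # the activation that was missing after the dense layer

model.add(Dense(1))
model.add(Activation('sigmoid'))  # assuming a single-output binary classifier

model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])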
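To log training to TensorBoard, the video imports the TensorBoard callback from tensorflow.keras.callbacks, points it at a directory under logs/, and passes it to model.fit; giving every run its own name is what lets different experiments (for example, with and without the extra dense layer) show up as separate curves. The run name, stand-in data, batch size, epoch count, and validation split below are assumptions for illustration.

import time
import numpy as np
from tensorflow.keras.callbacks import TensorBoard

# Stand-in data so the snippet runs on its own; substitute the real training set.
X = np.random.random((100, 50, 50, 1))
y = np.random.randint(0, 2, size=(100,))

# Unique per-run name so TensorBoard shows each experiment as its own curve.
NAME = f"conv-64-no-dense-{int(time.time())}"  # hypothetical naming scheme

tensorboard = TensorBoard(log_dir=f"logs/{NAME}")

# 'model' is the compiled model from the previous sketch.
model.fit(X, y,
          batch_size=32,
          epochs=20,
          validation_split=0.3,
          callbacks=[tensorboard])

Once training starts, a logs/ directory appears next to the script. From the directory that contains it, run tensorboard --logdir=logs and open the address it prints (http://localhost:6006 by default) to watch accuracy, validation accuracy, loss, and validation loss as the model trains.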
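The callbacks documentation the video points to lists more than just TensorBoard, and the chapter summary mentions early stopping and model checkpointing; the video's main advice, though, is to watch validation loss, since in-sample accuracy keeps improving even once the model starts memorizing the training data. A hedged sketch of how those two callbacks could sit alongside the TensorBoard one, under the same illustrative assumptions as above (this combination is not shown in the video):

from tensorflow.keras.callbacks import EarlyStopping, ModelCheckpoint

# Both callbacks key off validation loss, the metric the video says to watch.
early_stop = EarlyStopping(monitor='val_loss', patience=3)  # stop once val_loss stalls
checkpoint = ModelCheckpoint('best-model.h5', monitor='val_loss',
                             save_best_only=True)  # keep the best weights seen so far

model.fit(X, y,
          batch_size=32,
          epochs=20,
          validation_split=0.3,
          callbacks=[tensorboard, early_stop, checkpoint])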