title
End-to-End Machine Learning Project Implementation With Docker, GitHub Actions And Deployment
description
GitHub code link: https://github.com/krishnaik06/bostonhousepricing
Visit https://krishnaik.in for data science blogs
In this video we will implement an end-to-end ML project with Docker and GitHub Actions.
Timestamps:
Understanding the dataset 00:00:00
Preparing Dataset And Basic Analysis 00:08:14
Preparing Dataset For Model Training 00:22:10
Training The Model 00:36:54
Performance Metrics 00:52:41
Prediction Of New Data 00:55:33
Pickling the model file 00:59:45
Setting Up Github And VS Code 01:04:40
Tools And Software Required 01:12:00
Creating A New Environment 01:18:35
Setting up Git 01:26:07
Creating A FLASK Web Application 01:39:04
Running And Testing Our Application 01:53:14
Prediction From Front End Application 02:01:08
Procfile for Heroku Deployment 02:11:35
Deploying The App To Heroku 02:16:39
Deploying The App Using Dockers 02:23:20
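The training steps in the timestamps above (train/test split, standard scaling, linear regression, performance metrics, pickling) can be sketched roughly as follows. This is a minimal sketch, not the video's exact code: `load_boston` was removed in scikit-learn 1.2, so a synthetic regression dataset stands in for the Boston housing data here, and the pickle file names are assumptions.

```python
import pickle
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score

# Stand-in for the Boston housing data (506 rows, 13 features in the video).
X, y = make_regression(n_samples=506, n_features=13, noise=10.0, random_state=42)

# Hold out 30% as a test set, as in the video (test_size=0.3, random_state=42).
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

# Standardize: fit on the training data only, then apply the SAME
# transform to the test data so the model sees nothing about the test set.
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

regression = LinearRegression()
regression.fit(X_train, y_train)
pred = regression.predict(X_test)

rmse = np.sqrt(mean_squared_error(y_test, pred))
r2 = r2_score(y_test, pred)
print("RMSE:", rmse)
print("R^2 :", r2)

# Pickle the model and the scaler for the Flask app (file names assumed).
pickle.dump(regression, open("regmodel.pkl", "wb"))
pickle.dump(scaler, open("scaling.pkl", "wb"))
```

On the real Boston data the video reports an R-squared of about 0.71 and an adjusted R-squared of about 0.68; the synthetic stand-in here is nearly linear, so its score will be much higher.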
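The Flask serving side (timestamps 01:39:04 onward) follows the usual pattern: load the pickled model and scaler, run incoming data through the same scaler that was fitted on the training set, then predict. A minimal sketch with assumed names (`/predict_api` route, `regmodel.pkl`/`scaling.pkl` files); to stay self-contained it fits stand-in objects instead of loading pickles:

```python
import numpy as np
from flask import Flask, request, jsonify
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression

app = Flask(__name__)

# In the real app these would come from the pickled artifacts, e.g.:
#   regmodel = pickle.load(open("regmodel.pkl", "rb"))
#   scaler   = pickle.load(open("scaling.pkl", "rb"))
# For a self-contained sketch, fit stand-ins on random 13-feature data.
rng = np.random.default_rng(0)
X_fit = rng.normal(size=(50, 13))
scaler = StandardScaler().fit(X_fit)
regmodel = LinearRegression().fit(scaler.transform(X_fit), rng.normal(size=50))

@app.route("/predict_api", methods=["POST"])
def predict_api():
    # Expect JSON like {"data": {"CRIM": 0.1, ...}} with the 13 feature values.
    data = request.json["data"]
    features = np.array(list(data.values()), dtype=float).reshape(1, -1)
    # New data must go through the SAME scaler fitted on the training set.
    scaled = scaler.transform(features)
    output = regmodel.predict(scaled)[0]
    return jsonify(float(output))

# To serve locally: app.run(debug=True), or `flask run`.
```

For Heroku the video adds a Procfile pointing gunicorn at this app; for Docker, a Dockerfile that installs requirements and runs the same command.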
-------------------------------------------------------------------------------------------------------------
All Playlist in my channel
Github Tutorials : https://www.youtube.com/watch?v=GW7B6vwktPA&list=PLZoTAELRMXVOSsBerFZKsdCaA4RYr4RGW
Live NLP Playlist: https://www.youtube.com/watch?v=w3coRFpyddQ&list=PLZoTAELRMXVNNrHSKv36Lr3_156yCo6Nn
Live Deep Learning Playlist: https://www.youtube.com/watch?v=8arGWdq_KL0&list=PLZoTAELRMXVPiyueAqA_eQnsycC_DSBns
Live EDA Playlist: https://www.youtube.com/watch?v=bTN-6VPe8c0&list=PLZoTAELRMXVPzj1D0i_6ajJ6gyD22b3jh
Live ML Playlist: https://www.youtube.com/watch?v=z8sxaUw_f-M&list=PLZoTAELRMXVPjaAzURB77Kz0YXxj65tYz
Live Stats Playlist: https://www.youtube.com/watch?v=11unm2hmvOQ&list=PLZoTAELRMXVMgtxAboeAx-D9qbnY94Yay
My SQL Playlist: https://www.youtube.com/watch?v=us1XyayQ6fU&list=PLZoTAELRMXVNMRWlVf0bDDSxNEn38u9Cl
---------------------------------------------------------------------------------------------------------------
Please donate if you want to support the channel through the GPay UPI ID:
Gpay: krishnaik06@okicici
Telegram link: https://t.me/joinchat/N77M7xRvYUd403DgfE4TWw
-------------------------------------------------------------------------------------------------------------
Please join my channel as a member to get additional benefits like Data Science materials, live streaming for members, and more:
https://www.youtube.com/channel/UCNU_lfiiWBdtULKOw6X0Dig/join
-----------------------------------------------------------------------------------------------------------
Please do subscribe to my other channel too:
https://www.youtube.com/channel/UCjWY5hREA6FFYrthD0rZNIw
---------------------------------------------------------------------------------------------------------
Connect with me here:
Twitter: https://twitter.com/Krishnaik06
Facebook: https://www.facebook.com/krishnaik06
Instagram: https://www.instagram.com/krishnaik06
detail
{'title': 'End To End Machine Learning Project Implementation With Dockers,Github Actions And Deployment', 'heatmap': [{'end': 1981.536, 'start': 1878.459, 'weight': 0.745}, {'end': 2178.649, 'start': 2076.226, 'weight': 1}, {'end': 4164.408, 'start': 3958.3, 'weight': 0.915}, {'end': 5145.934, 'start': 4844.781, 'weight': 0.72}, {'end': 6928.124, 'start': 6827.276, 'weight': 0.894}], 'summary': 'Demonstrates practical implementation of linear regression using the boston housing data set, regression model evaluation, setting up tools and environments, git and flask setup, flask web app and api development, deploying api and web application, and deploying applications on heroku and docker, achieving a 71% and 68% performance score, and emphasizing the use of docker for efficient application running.', 'chapters': [{'end': 370.349, 'segs': [{'end': 63.923, 'src': 'embed', 'start': 6.024, 'weight': 1, 'content': [{'end': 9.908, 'text': 'In this video, we are going to start our first practical implementation.', 'start': 6.024, 'duration': 3.884}, {'end': 14.413, 'text': 'Already. 
we have covered up the theoretical part of linear regression.', 'start': 10.609, 'duration': 3.804}, {'end': 17.336, 'text': 'we have understood the maths in-depth intuition.', 'start': 14.413, 'duration': 2.923}, {'end': 19.037, 'text': 'we have understood about gradient descent.', 'start': 17.336, 'duration': 1.701}, {'end': 24.063, 'text': 'we have understood about the performance metrics that we are going to use like R-squared and Existed R-squared.', 'start': 19.037, 'duration': 5.026}, {'end': 29.128, 'text': "Now let's go ahead and implement a project with the help of linear regression.", 'start': 24.643, 'duration': 4.485}, {'end': 37.156, 'text': "And we'll try to see that whenever we are given a data set and this is a very simple project, guys, just to begin with,", 'start': 29.908, 'duration': 7.248}, {'end': 40.799, 'text': 'so that it will actually help you to build up some amount of confidence.', 'start': 37.156, 'duration': 3.643}, {'end': 48.827, 'text': 'So first of all what we are going to do in this is that we are taking an amazing data set which is called as Boston Housing Data Set.', 'start': 41.34, 'duration': 7.487}, {'end': 56.636, 'text': 'And the main aim of that particular data set is that we really need to predict the price of the house based on various features.', 'start': 49.628, 'duration': 7.008}, {'end': 63.923, 'text': 'Now we are going to see what all features are there and what all things are not there, which are the important features and all.', 'start': 57.176, 'duration': 6.747}], 'summary': 'Practical implementation of linear regression using boston housing data set to predict house prices.', 'duration': 57.899, 'max_score': 6.024, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM6024.jpg'}, {'end': 106.011, 'src': 'embed', 'start': 82.121, 'weight': 3, 'content': [{'end': 88.786, 'text': 'first of all, what we are going to do is that we are going to import some 
libraries like import um pandas,', 'start': 82.121, 'duration': 6.665}, {'end': 96.284, 'text': "because pandas are super important in this import, And I'm going to write each and every line of code and parallelly try to explain them.", 'start': 88.786, 'duration': 7.498}, {'end': 106.011, 'text': 'Import numpy as np and some more libraries I want is that import matplotlib.pyplot as plt.', 'start': 96.825, 'duration': 9.186}], 'summary': 'Importing pandas and numpy libraries for code explanation.', 'duration': 23.89, 'max_score': 82.121, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM82121.jpg'}, {'end': 178.227, 'src': 'embed', 'start': 154.625, 'weight': 5, 'content': [{'end': 163.253, 'text': 'so today in this video we are going to use from a scalar, because scale learn also has a lot of data sets all together within themselves.', 'start': 154.625, 'duration': 8.628}, {'end': 168.518, 'text': "so i'm going to import the data set and i'm going to say import load boston.", 'start': 163.253, 'duration': 5.265}, {'end': 171.821, 'text': 'okay. 
so there is something called as load boston.', 'start': 168.518, 'duration': 3.303}, {'end': 175.845, 'text': 'along with that there are different, different data sets which you can definitely explore.', 'start': 171.821, 'duration': 4.024}, {'end': 178.227, 'text': 'then this will be my boston underscore, df.', 'start': 175.845, 'duration': 2.382}], 'summary': 'Using scalers in machine learning to import and explore boston dataset.', 'duration': 23.602, 'max_score': 154.625, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM154625.jpg'}, {'end': 304.229, 'src': 'embed', 'start': 273.557, 'weight': 0, 'content': [{'end': 276.077, 'text': 'that basically means how many number of data points are there?', 'start': 273.557, 'duration': 2.52}, {'end': 278.058, 'text': 'they are 506.', 'start': 276.077, 'duration': 1.981}, {'end': 280.319, 'text': 'number of attributes they are 13.', 'start': 278.058, 'duration': 2.261}, {'end': 282.6, 'text': 'numerical or categorical.', 'start': 280.319, 'duration': 2.281}, {'end': 286.681, 'text': 'predictive Median value attribute is usually used as a target.', 'start': 282.6, 'duration': 4.081}, {'end': 293.184, 'text': 'Here you will be able to see various features that are present in that specific data set like CRIN.', 'start': 287.282, 'duration': 5.902}, {'end': 296.506, 'text': 'This basically means per capita crime rate by town.', 'start': 293.685, 'duration': 2.821}, {'end': 304.229, 'text': 'So in that the house in which town it basically exists, how much per capita crime rate is, and obviously, if this increases,', 'start': 296.546, 'duration': 7.683}], 'summary': 'Data set contains 506 data points and 13 attributes, including predictive median value. 
features include per capita crime rate by town.', 'duration': 30.672, 'max_score': 273.557, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM273557.jpg'}], 'start': 6.024, 'title': 'Linear regression practical implementation', 'summary': 'Covers the practical implementation of linear regression using the boston housing data set with 506 instances and 13 attributes to predict house prices based on various features, aiming to build confidence in the audience.', 'chapters': [{'end': 63.923, 'start': 6.024, 'title': 'Linear regression practical implementation', 'summary': 'Covers the practical implementation of linear regression using the boston housing data set to predict house prices based on various features, aiming to build confidence in the audience.', 'duration': 57.899, 'highlights': ['The video covers the practical implementation of linear regression using the Boston Housing Data Set to predict the price of the house based on various features.', 'The audience has already covered the theoretical part of linear regression, gradient descent, and performance metrics like R-squared and Existed R-squared.', 'The project aims to help the audience build confidence with a simple implementation.']}, {'end': 370.349, 'start': 63.964, 'title': 'Data analysis & linear regression', 'summary': 'Covers importing libraries, loading the boston housing price dataset, and exploring its attributes, including 506 instances and 13 attributes, such as per capita crime rate, residential land zone proportion, and nitric oxide concentration, in preparation for implementing a linear regression machine learning model.', 'duration': 306.385, 'highlights': ['The dataset contains 506 instances and 13 attributes, including features like per capita crime rate, residential land zone proportion, and nitric oxide concentration.', 'Importing libraries such as pandas, numpy, and matplotlib for data visualization and analysis.', 'Loading the 
Boston housing price dataset using the load_boston function from the scikit-learn library.']}], 'duration': 364.325, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM6024.jpg', 'highlights': ['The dataset contains 506 instances and 13 attributes, including features like per capita crime rate, residential land zone proportion, and nitric oxide concentration.', 'The video covers the practical implementation of linear regression using the Boston Housing Data Set to predict the price of the house based on various features.', 'The audience has already covered the theoretical part of linear regression, gradient descent, and performance metrics like R-squared and Existed R-squared.', 'Importing libraries such as pandas, numpy, and matplotlib for data visualization and analysis.', 'The project aims to help the audience build confidence with a simple implementation.', 'Loading the Boston housing price dataset using the load_boston function from the scikit-learn library.']}, {'end': 2448.914, 'segs': [{'end': 669.475, 'src': 'embed', 'start': 646.296, 'weight': 0, 'content': [{'end': 654.483, 'text': 'Now, with respect to this dataset, you know what we have to do is that quickly, do some kind of quick analysis, you know, and that analysis is super,', 'start': 646.296, 'duration': 8.187}, {'end': 655.264, 'text': 'super important.', 'start': 654.483, 'duration': 0.781}, {'end': 661.229, 'text': "Now, first of all, what I'm actually going to do is that I'm going to use some of the inbuilt function like dataset.info.", 'start': 655.304, 'duration': 5.925}, {'end': 669.475, 'text': 'So if I write dataset on info, it is basically going to talk about like what are the data types of my all the columns that are present over here.', 'start': 662.21, 'duration': 7.265}], 'summary': 'Quickly analyze dataset using dataset.info function.', 'duration': 23.179, 'max_score': 646.296, 'thumbnail': 
'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM646296.jpg'}, {'end': 1151.118, 'src': 'embed', 'start': 1109.489, 'weight': 1, 'content': [{'end': 1117.576, 'text': 'So here, obviously, from some of the features here you can see, RM is somewhere around 69% correlated, positively correlated,', 'start': 1109.489, 'duration': 8.087}, {'end': 1121.158, 'text': 'and this age is somewhere around minus 0.37, negatively correlated.', 'start': 1117.576, 'duration': 3.582}, {'end': 1126.743, 'text': "So what we'll do in the next video is that we will try to analyze this correlation.", 'start': 1121.519, 'duration': 5.224}, {'end': 1131.947, 'text': "We'll try to analyze this correlation by constructing some of the important plots, okay?", 'start': 1126.783, 'duration': 5.164}, {'end': 1135.55, 'text': 'Some of the plots, like scattered plots or heat map.', 'start': 1132.527, 'duration': 3.023}, {'end': 1140.032, 'text': "or we'll also see some of the outliers, whether there is some outliers in the price or not.", 'start': 1136.17, 'duration': 3.862}, {'end': 1142.493, 'text': 'but this correlation is super, super important.', 'start': 1140.032, 'duration': 2.461}, {'end': 1144.655, 'text': 'two things that you really need to check.', 'start': 1142.493, 'duration': 2.162}, {'end': 1148.437, 'text': 'one is for multicollinearity, uh, and to check for multicollinearity.', 'start': 1144.655, 'duration': 3.782}, {'end': 1151.118, 'text': 'there are also some other methods which we will discuss as we go ahead.', 'start': 1148.437, 'duration': 2.681}], 'summary': 'Rm is 69% positively correlated; age is -0.37 negatively correlated. 
next, analysis will include constructing plots, checking for outliers, and addressing multicollinearity.', 'duration': 41.629, 'max_score': 1109.489, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM1109489.jpg'}, {'end': 1444.996, 'src': 'embed', 'start': 1416.037, 'weight': 2, 'content': [{'end': 1421.141, 'text': "i'm just writing something like crime rate and plot dot, y label,", 'start': 1416.037, 'duration': 5.104}, {'end': 1427.356, 'text': "you'll be also able to see the crime rate plus on the Y label you'll be able to see the price.", 'start': 1421.141, 'duration': 6.215}, {'end': 1435.834, 'text': 'Now here you can definitely see the relationship between the crime rate and the price.', 'start': 1431.553, 'duration': 4.281}, {'end': 1443.176, 'text': 'Obviously, when the crime rate is increasing, the price will also keep on decreasing.', 'start': 1436.754, 'duration': 6.422}, {'end': 1444.996, 'text': 'This is inversely correlated.', 'start': 1443.596, 'duration': 1.4}], 'summary': 'Crime rate inversely correlated with price, affecting real estate market', 'duration': 28.959, 'max_score': 1416.037, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM1416037.jpg'}, {'end': 1605.862, 'src': 'embed', 'start': 1579.222, 'weight': 4, 'content': [{'end': 1583.003, 'text': 'So one interesting feature that I can see is LSTAT and price.', 'start': 1579.222, 'duration': 3.781}, {'end': 1584.784, 'text': 'This is somewhere on negative correlation.', 'start': 1583.044, 'duration': 1.74}, {'end': 1592.026, 'text': 'Right So again, to analyze it, it will always be better that I will just copy this part and paste it over here.', 'start': 1584.844, 'duration': 7.182}, {'end': 1594.547, 'text': "Here I'm just going to use LSTAT.", 'start': 1592.626, 'duration': 1.921}, {'end': 1597.777, 'text': 'And this will be my price.', 'start': 1596.696, 'duration': 1.081}, 
{'end': 1605.862, 'text': 'So, if I execute this here, you can see, obviously this is negatively correlated and you can basically create a regression plot like this, right?', 'start': 1597.817, 'duration': 8.045}], 'summary': 'The lstat and price show negative correlation, as seen in the regression plot.', 'duration': 26.64, 'max_score': 1579.222, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM1579222.jpg'}, {'end': 1981.536, 'src': 'heatmap', 'start': 1878.459, 'weight': 0.745, 'content': [{'end': 1883.062, 'text': 'So here we are going to do something called a train test split, okay?', 'start': 1878.459, 'duration': 4.603}, {'end': 1887.665, 'text': 'Now, inside this train test split, what our plan is see.', 'start': 1883.682, 'duration': 3.983}, {'end': 1893.588, 'text': 'whenever we discussed about the performance metrics at that time, you know that whenever we get our data set,', 'start': 1887.665, 'duration': 5.923}, {'end': 1895.329, 'text': 'we should definitely do a train test split.', 'start': 1893.588, 'duration': 1.741}, {'end': 1900.272, 'text': 'We should keep that entire test data separately so that we are not going to touch it anytime.', 'start': 1895.749, 'duration': 4.523}, {'end': 1907.898, 'text': 'because once we create our model and then once we test with that new, with that test data set, we will try to see the performance of the model.', 'start': 1900.792, 'duration': 7.106}, {'end': 1914.303, 'text': "then in there, other than that, we'll use this entire training data set to see that how my model is basically performing.", 'start': 1907.898, 'duration': 6.405}, {'end': 1924.17, 'text': "okay. 
so for doing the train test split, i'm going to use from sk learn dot model selection and here i'm going to basically import train test split.", 'start': 1914.303, 'duration': 9.867}, {'end': 1926.572, 'text': "so everything i'm basically using from sk learn.", 'start': 1924.17, 'duration': 2.402}, {'end': 1935.816, 'text': "here you have, and here i'm going to use extreme comma x test, comma y train, comma y test.", 'start': 1926.572, 'duration': 9.244}, {'end': 1940.237, 'text': 'so these are all my features that it is going to create with respect to the train test plate.', 'start': 1935.816, 'duration': 4.421}, {'end': 1952.58, 'text': 'so here you have x, comma y and then here you basically have test size or i can also push by test size as train size or test size.', 'start': 1940.237, 'duration': 12.343}, {'end': 1955.42, 'text': "so let's say, test size is nothing but 0.3.", 'start': 1952.58, 'duration': 2.84}, {'end': 1963.064, 'text': "that basically means 30 percent, i'm going to put over there, And some random state is equal to 42..", 'start': 1955.42, 'duration': 7.644}, {'end': 1964.565, 'text': "I'm just taking any random state.", 'start': 1963.064, 'duration': 1.501}, {'end': 1969.728, 'text': 'If probably you also take the same random state value, you may also get the same train test split, okay?', 'start': 1964.965, 'duration': 4.763}, {'end': 1972.55, 'text': 'So now you can basically check out my X-train.', 'start': 1970.289, 'duration': 2.261}, {'end': 1975.512, 'text': 'So these are my entire data set with respect to X-train.', 'start': 1973.151, 'duration': 2.361}, {'end': 1978.534, 'text': 'And this is your X-test.', 'start': 1976.233, 'duration': 2.301}, {'end': 1981.536, 'text': 'and similarly you can check out your y train and y test.', 'start': 1979.435, 'duration': 2.101}], 'summary': 'Performing train test split using sklearn model selection with 30% test size and random state 42.', 'duration': 103.077, 'max_score': 1878.459, 'thumbnail': 
'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM1878459.jpg'}, {'end': 2178.649, 'src': 'heatmap', 'start': 2076.226, 'weight': 1, 'content': [{'end': 2080.909, 'text': "okay, and here I'm going to basically use scalar is equal to standard scalar.", 'start': 2076.226, 'duration': 4.683}, {'end': 2082.471, 'text': 'and here we execute this.', 'start': 2080.909, 'duration': 1.562}, {'end': 2087.755, 'text': 'and now this is the standard scaling object.', 'start': 2082.471, 'duration': 5.284}, {'end': 2092.877, 'text': 'now all i have to do is that i have to pass and write fit underscore, transform.', 'start': 2087.755, 'duration': 5.122}, {'end': 2102.782, 'text': 'i have to basically pass my x training data set, X training dataset and transform it completely right to the same scale.', 'start': 2092.877, 'duration': 9.905}, {'end': 2105.724, 'text': 'So here I will probably get my X training.', 'start': 2103.242, 'duration': 2.482}, {'end': 2110.327, 'text': 'Now similarly for the X test also we have to do it.', 'start': 2106.705, 'duration': 3.622}, {'end': 2115.51, 'text': "But in X test, whenever we try to apply, we don't have to write fit underscore transform,", 'start': 2110.827, 'duration': 4.683}, {'end': 2124.299, 'text': 'because We are going to make sure that whatever information I have with respect to the training data set and whatever techniques I have applied for transforming it,', 'start': 2115.51, 'duration': 8.789}, {'end': 2126.902, 'text': 'the same techniques needs to be applied to the test data set.', 'start': 2124.299, 'duration': 2.603}, {'end': 2133.508, 'text': 'This is done because to make sure that my model does not know much information about the test data set.', 'start': 2127.342, 'duration': 6.166}, {'end': 2137.252, 'text': "So here I'm basically going to write transform on my X test.", 'start': 2133.909, 'duration': 3.343}, {'end': 2143.579, 'text': "And once we do this, and if I probably try 
to see my x-train, here you'll be able to see all the data points.", 'start': 2138.733, 'duration': 4.846}, {'end': 2147.824, 'text': "Similarly, if I go ahead and see my x-test, here you'll be able to see all the data points.", 'start': 2143.619, 'duration': 4.205}, {'end': 2152.809, 'text': "We don't have to do with respect to the output feature, but definitely for this specific step.", 'start': 2148.424, 'duration': 4.385}, {'end': 2158.475, 'text': 'And whenever we get our new data set, there also we have to probably apply transform and try to do the prediction.', 'start': 2153.41, 'duration': 5.065}, {'end': 2161.137, 'text': 'now, these are the steps that we did it.', 'start': 2159.196, 'duration': 1.941}, {'end': 2164.74, 'text': 'now, in the next video, we are going to directly train our model.', 'start': 2161.137, 'duration': 3.603}, {'end': 2166.441, 'text': 'this is super, super important steps.', 'start': 2164.74, 'duration': 1.701}, {'end': 2168.823, 'text': 'again, understand, this can be an interview question.', 'start': 2166.441, 'duration': 2.382}, {'end': 2171.765, 'text': "they'll say that why do you standardize a data set in linear regression?", 'start': 2168.823, 'duration': 2.942}, {'end': 2174.726, 'text': 'you just have to say that internally, we use gradient descent.', 'start': 2171.765, 'duration': 2.961}, {'end': 2178.649, 'text': 'our main aim is to come to the global minima and to come to the global media.', 'start': 2174.726, 'duration': 3.923}], 'summary': 'Standardize training and test data to same scale for model training. 
important step in machine learning process.', 'duration': 102.423, 'max_score': 2076.226, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM2076226.jpg'}], 'start': 370.349, 'title': 'Understanding boston house pricing data', 'summary': 'Provides insights on boston house pricing data, data analysis and preparation, importance of correlation in linear regression, scattered plot analysis, and model training and standard scaling.', 'chapters': [{'end': 564.466, 'start': 370.349, 'title': 'Understanding boston house pricing data', 'summary': 'Introduces boston house pricing data, providing insights on input features, output feature (price of the house), and the process to convert the dataset into a dataframe, emphasizing the importance of understanding the dataset for further analysis.', 'duration': 194.117, 'highlights': ['The chapter introduces Boston house pricing data, providing insights on input features, output feature (price of the house), and the process to convert the dataset into a dataframe.', 'The importance of understanding the dataset for further analysis is emphasized.', 'Demonstrates the process of creating a dataframe using the Boston house pricing data.']}, {'end': 883.77, 'start': 564.466, 'title': 'Data analysis and preparation', 'summary': 'Covers the creation of a data frame with independent and output features, performing quick analysis using inbuilt functions like dataset.info and dataset.describe, checking for missing values, and preparing for exploratory data analysis.', 'duration': 319.304, 'highlights': ['Creation of data frame with independent and output features', 'Performing quick analysis using inbuilt functions like dataset.info and dataset.describe', 'Checking for missing values using dataset.isnull and applying an aggregate function called dot sum']}, {'end': 1216.896, 'start': 884.19, 'title': 'Importance of correlation in linear regression', 'summary': 'Emphasizes the importance 
of correlation in linear regression, highlighting the use of correlation to identify multicollinearity and its impact on model performance, with an explanation of pearson correlation and its significance in determining feature correlation and the relationship with the output feature.', 'duration': 332.706, 'highlights': ['The significance of Pearson correlation and its impact on determining feature correlation and the relationship with the output feature, with a range from -1 to +1, indicating the degree of correlation.', 'The importance of identifying multicollinearity in linear regression and its impact on model performance, with a recommendation to remove highly correlated independent features to avoid multicollinearity.', 'The use of scatter plots and pair plots to visualize the correlation between features and identify outliers, providing a visual representation of the relationships between variables.']}, {'end': 1869.053, 'start': 1216.896, 'title': 'Scattered plot analysis of boston housing price dataset', 'summary': 'Explores the correlation analysis of features in the boston housing price dataset, finding the relationship between various features and the price, including negative and positive correlations, such as crime rate and price being negatively correlated at -0.388, and the number of rooms (rm) and price being positively correlated at 69%, in preparation for creating models.', 'duration': 652.157, 'highlights': ['The relationship between crime rate and price is negatively correlated at -0.388, indicating that as the crime rate increases, the price decreases.', 'The number of rooms (RM) and price are positively correlated at 69%, suggesting that as the number of rooms increases, the price also increases.', 'The feature LSTAT and price show a negative correlation, indicating that as LSTAT decreases, the price increases.']}, {'end': 2448.914, 'start': 1869.434, 'title': 'Model training and standard scaling', 'summary': 'Covers the implementation of 
model training and standard scaling, including train-test split with a test size of 30%, and the importance of standard scaling in linear regression to ensure convergence to the global minima for faster model performance.', 'duration': 579.48, 'highlights': ['Importance of standard scaling in linear regression', 'Train-test split with a test size of 30%', 'Model training using linear regression']}], 'duration': 2078.565, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM370349.jpg', 'highlights': ['The importance of understanding the dataset for further analysis is emphasized.', 'The use of scatter plots and pair plots to visualize the correlation between features and identify outliers.', 'The relationship between crime rate and price is negatively correlated at -0.388.', 'The number of rooms (RM) and price are positively correlated at 69%.', 'The feature LSTAT and price show a negative correlation.']}, {'end': 3341.416, 'segs': [{'end': 2647.22, 'src': 'embed', 'start': 2622.223, 'weight': 2, 'content': [{'end': 2633.044, 'text': "so now, if i plot these two values, or i can also change the order, let's say i'm going to write y underscore test and reg underscore pred.", 'start': 2622.223, 'duration': 10.821}, {'end': 2638.57, 'text': "okay, if i plot this, guys here, you'll be able to see that the plotting of the values, that will be seen.", 'start': 2633.044, 'duration': 5.526}, {'end': 2640.732, 'text': 'it is basically linear, right.', 'start': 2638.57, 'duration': 2.162}, {'end': 2647.22, 'text': 'so when this plotting is basically linear, that basically means yes, your model has actually performed well, right,', 'start': 2640.732, 'duration': 6.488}], 'summary': 'Plotting the values shows a linear pattern, indicating good model performance.', 'duration': 24.997, 'max_score': 2622.223, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM2622223.jpg'}, 
{'end': 2841.624, 'src': 'embed', 'start': 2793.332, 'weight': 0, 'content': [{'end': 2795.915, 'text': 'And there are some points that are ranging between 10 to 30, okay?', 'start': 2793.332, 'duration': 2.583}, {'end': 2802.982, 'text': 'So I still feel that, yes, my model is performing well, because whenever I do this kind of plot, with respect to the residuals,', 'start': 2796.736, 'duration': 6.246}, {'end': 2804.523, 'text': 'I should be getting a normal distribution.', 'start': 2802.982, 'duration': 1.541}, {'end': 2809.067, 'text': 'So this is the second assumption that you can probably consider and you can basically find it out.', 'start': 2804.884, 'duration': 4.183}, {'end': 2813.672, 'text': "Now let's go and probably assume one more scatter plot.", 'start': 2810.349, 'duration': 3.323}, {'end': 2824.371, 'text': 'And this scatter plot is with respect to, with respect to predictions and residuals.', 'start': 2814.884, 'duration': 9.487}, {'end': 2825.872, 'text': 'Okay Residuals.', 'start': 2824.491, 'duration': 1.381}, {'end': 2828.374, 'text': 'Now if I probably try to plot this.', 'start': 2826.473, 'duration': 1.901}, {'end': 2836.52, 'text': "so let's say I'm plotting plot dot scatter with respect to the regression prediction and my residuals, that is this error.", 'start': 2828.374, 'duration': 8.146}, {'end': 2841.624, 'text': "If I plot this here, you'll be able to see that my my, this,", 'start': 2836.9, 'duration': 4.724}], 'summary': 'Model performance evaluated using residual plots and scatter plots for predictions and residuals.', 'duration': 48.292, 'max_score': 2793.332, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM2793332.jpg'}, {'end': 3046.513, 'src': 'embed', 'start': 3014.06, 'weight': 3, 'content': [{'end': 3016.981, 'text': "So here also we'll be able to get the root mean squared error.", 'start': 3014.06, 'duration': 2.921}, {'end': 3026.023, 'text': 'So these are all my 
values and always remember that, you know, these values will basically indicate how my model is basically performing.', 'start': 3017.421, 'duration': 8.602}, {'end': 3032.484, 'text': "Now apart from this, here we have definitely understood, okay, if I plot the residuals, I'm getting a normal distribution.", 'start': 3026.623, 'duration': 5.861}, {'end': 3035.525, 'text': 'Yes, with some kind of outliers with respect to the differences,', 'start': 3032.544, 'duration': 2.981}, {'end': 3039.686, 'text': 'but most of the points that are having the differences is written minus 10 to plus 10, okay?', 'start': 3035.525, 'duration': 4.161}, {'end': 3046.513, 'text': 'And one more performance metrics that I can definitely use is something called as adjusted square.', 'start': 3040.927, 'duration': 5.586}], 'summary': 'Model performance: rmse, normal distribution of residuals, outliers, and adjusted r-squared.', 'duration': 32.453, 'max_score': 3014.06, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM3014060.jpg'}, {'end': 3178.377, 'src': 'embed', 'start': 3140.838, 'weight': 1, 'content': [{'end': 3146.541, 'text': 'And similarly, we can use mean squared error, mean absolute error and root mean squared error to find out how your model is performing well.', 'start': 3140.838, 'duration': 5.703}, {'end': 3150.303, 'text': 'Now in the next video, which is super important R square and adjusted R square.', 'start': 3146.901, 'duration': 3.402}, {'end': 3154.185, 'text': "we'll try to implement it and we'll try to see what kind of score we are able to get.", 'start': 3150.303, 'duration': 3.882}, {'end': 3155.966, 'text': 'Okay So thank you.', 'start': 3154.426, 'duration': 1.54}, {'end': 3158.388, 'text': 'Yes, I will see you all in the next video.', 'start': 3156.707, 'duration': 1.681}, {'end': 3158.848, 'text': 'Thank you.', 'start': 3158.588, 'duration': 0.26}, {'end': 3178.377, 'text': 'guys, in this video we are 
going to check out more performance metrics and we are going to see whether our regression model that we have created is good or not okay.', 'start': 3169.892, 'duration': 8.485}], 'summary': 'Exploring performance metrics for regression models, including mse, mae, rmse, r square, and adjusted r square.', 'duration': 37.539, 'max_score': 3140.838, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM3140838.jpg'}], 'start': 2448.914, 'title': 'Regression model evaluation and performance metrics', 'summary': 'Covers regression model evaluation, including model training, prediction, and performance assessment through scatter plots and residual analysis. it also discusses performance metrics such as mean squared error, mean absolute error, root mean squared error, and adjusted r square, achieving a 71% and 68% score respectively, emphasizing their importance in assessing model performance.', 'chapters': [{'end': 2916.636, 'start': 2448.914, 'title': 'Regression model evaluation', 'summary': 'Covers the evaluation of a regression model, including model training, prediction, and the assessment of model performance through scatter plots and residual analysis.', 'duration': 467.722, 'highlights': ['The chapter covers the evaluation of a regression model, including model training, prediction, and the assessment of model performance through scatter plots and residual analysis.', "The scatter plot for the prediction indicates the model's performance, with linear plotting suggesting a good predictive capability.", "Residual analysis shows the distribution of errors, indicating the model's performance, with the aim of achieving a normal distribution of errors.", "The scatter plot of predicted values and residuals shows a uniform distribution, indicating the model's good performance."]}, {'end': 3341.416, 'start': 2916.636, 'title': 'Performance metrics for regression model', 'summary': 'Discusses the use of performance 
metrics, including mean squared error, mean absolute error, root mean squared error, and adjusted r square, to evaluate a regression model. it also demonstrates the calculation of r square and adjusted r square, achieving a 71% and 68% score respectively, and emphasizes the importance of these metrics in assessing model performance.', 'duration': 424.78, 'highlights': ['The chapter discusses the use of performance metrics, including mean squared error, mean absolute error, root mean squared error, and adjusted R square, to evaluate a regression model.', 'It demonstrates the calculation of R square and adjusted R square, achieving a 71% and 68% score respectively.', 'Emphasizes the importance of these metrics in assessing model performance.']}], 'duration': 892.502, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM2448914.jpg', 'highlights': ['The chapter covers the evaluation of a regression model, including model training, prediction, and performance assessment through scatter plots and residual analysis.', 'The chapter discusses the use of performance metrics such as mean squared error, mean absolute error, root mean squared error, and adjusted R square, achieving a 71% and 68% score respectively, emphasizing their importance in assessing model performance.', "The scatter plot for the prediction indicates the model's performance, with linear plotting suggesting a good predictive capability.", "Residual analysis shows the distribution of errors, indicating the model's performance, with the aim of achieving a normal distribution of errors.", "The scatter plot of predicted values and residuals shows a uniform distribution, indicating the model's good performance."]}, {'end': 4337.69, 'segs': [{'end': 3366.58, 'src': 'embed', 'start': 3341.416, 'weight': 7, 'content': [{'end': 3350.037, 'text': 'we are now going to take up a new data and probably predict it through our regression model and see what output we are 
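The metrics discussed in this chapter (MAE, MSE/RMSE, R square and adjusted R square, where the video reports roughly 0.71 and 0.68 on the real data) can be sketched as follows; synthetic data stands in for the Boston housing set, and all names are illustrative:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in: same shape as Boston (506 samples, 13 features)
X, y = make_regression(n_samples=506, n_features=13, noise=25.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = LinearRegression().fit(X_train, y_train)
y_pred = model.predict(X_test)

mae = mean_absolute_error(y_test, y_pred)
rmse = np.sqrt(mean_squared_error(y_test, y_pred))
r2 = r2_score(y_test, y_pred)

# Adjusted R^2 penalises extra features: 1 - (1 - R^2) * (n - 1) / (n - k - 1)
n, k = X_test.shape
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(f"MAE={mae:.2f}  RMSE={rmse:.2f}  R^2={r2:.3f}  adj R^2={adj_r2:.3f}")
```

Adjusted R square is always at most R square, so it is the safer score to quote when comparing models with different feature counts.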
able to get.', 'start': 3341.416, 'duration': 8.621}, {'end': 3351.457, 'text': 'now, in this kind of predictions,', 'start': 3350.037, 'duration': 1.42}, {'end': 3357.639, 'text': 'we can either get bulk of data or we can get single single data points and we have to probably do the prediction already.', 'start': 3351.457, 'duration': 6.182}, {'end': 3361.679, 'text': 'with respect to the bulk of data, like in this particular example i have taken, you know,', 'start': 3357.639, 'duration': 4.04}, {'end': 3366.58, 'text': 'in x test data and probably we have done the prediction and we have found out the Here you can see.', 'start': 3361.679, 'duration': 4.901}], 'summary': 'Using regression model to predict new data and analyze the output.', 'duration': 25.164, 'max_score': 3341.416, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM3341416.jpg'}, {'end': 3447.791, 'src': 'embed', 'start': 3418.697, 'weight': 6, 'content': [{'end': 3422.358, 'text': 'So here I will just reshape this into one comma minus one.', 'start': 3418.697, 'duration': 3.661}, {'end': 3428.822, 'text': "Now, if I see this and if I probably try to see the shape of it here, you'll be seeing that one row and 13 columns.", 'start': 3422.799, 'duration': 6.023}, {'end': 3430.662, 'text': 'So this is how I have to give my data set.', 'start': 3428.862, 'duration': 1.8}, {'end': 3437.626, 'text': 'So if I reshape it like this and probably execute it, this is how I have to give my entire data set for my model prediction.', 'start': 3431.103, 'duration': 6.523}, {'end': 3447.791, 'text': "so the regression, that model that we have created, and if i write predict, dot, this entire, let's say i'm just going to give the first data.", 'start': 3438.206, 'duration': 9.585}], 'summary': 'Reshaping the data into one row and 13 columns for model prediction.', 'duration': 29.094, 'max_score': 3418.697, 'thumbnail': 
'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM3418697.jpg'}, {'end': 3493.486, 'src': 'embed', 'start': 3461.437, 'weight': 4, 'content': [{'end': 3462.678, 'text': 'uh, something related to this.', 'start': 3461.437, 'duration': 1.241}, {'end': 3465.321, 'text': 'But why I am getting minus 45?', 'start': 3463.258, 'duration': 2.063}, {'end': 3467.403, 'text': 'You may be thinking, Krish, this is just a negative value.', 'start': 3465.321, 'duration': 2.082}, {'end': 3469.786, 'text': 'Because we missed one very important step.', 'start': 3467.843, 'duration': 1.943}, {'end': 3474.571, 'text': 'That is, whenever we get our new data, we have to first of all do standardization.', 'start': 3470.066, 'duration': 4.505}, {'end': 3476.373, 'text': 'That you have probably missed it.', 'start': 3474.931, 'duration': 1.442}, {'end': 3478.856, 'text': "So don't ever forget about standardization.", 'start': 3476.773, 'duration': 2.083}, {'end': 3482.619, 'text': 'If you remember, standardization over here has some values.', 'start': 3479.236, 'duration': 3.383}, {'end': 3485.961, 'text': 'Here, you can basically see that we have used standard scaler.', 'start': 3483.179, 'duration': 2.782}, {'end': 3487.942, 'text': 'And this was my object name.', 'start': 3486.401, 'duration': 1.541}, {'end': 3489.664, 'text': "So what I'm actually going to first of all do?", 'start': 3488.002, 'duration': 1.662}, {'end': 3493.486, 'text': "I'm just going to write scaler.transform of these values, right?", 'start': 3489.664, 'duration': 3.822}], 'summary': 'Importance of standardization in data preprocessing to avoid negative values.', 'duration': 32.049, 'max_score': 3461.437, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM3461437.jpg'}, {'end': 3527.178, 'src': 'embed', 'start': 3500.571, 'weight': 3, 'content': [{'end': 3506.536, 'text': 'So this is the same thing we have to do it 
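The fix described here (the stray minus-45 prediction came from skipping standardization) can be sketched like this; synthetic data with mixed column scales stands in for the Boston set, and every name is illustrative:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

X, y = make_regression(n_samples=200, n_features=13, noise=5.0, random_state=0)
X = X * np.linspace(1.0, 100.0, 13)   # give the 13 columns very different scales

scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)            # fit the scaler on training data
model = LinearRegression().fit(X_scaled, y)   # model only ever sees scaled inputs

new_point = X[0]                  # one raw, unscaled observation arriving later
row = new_point.reshape(1, -1)    # (13,) -> (1, 13): one row, 13 columns
unscaled_pred = model.predict(row)                  # forgets standardization
scaled_pred = model.predict(scaler.transform(row))  # transform first, then predict
print(row.shape, unscaled_pred[0], scaled_pred[0])
```

The same `scaler` object fitted on the training data must be reused for every new point; refitting it on a single row would produce garbage.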
for our new data, transformation of new data.', 'start': 3500.571, 'duration': 5.965}, {'end': 3519.729, 'text': 'So if I write scaler.transform and now, if i use this entire data set, i will be getting my scaled data set, right, scaled data set.', 'start': 3507.096, 'duration': 12.633}, {'end': 3523.434, 'text': "so let's see now this is basically my scaled data set.", 'start': 3519.729, 'duration': 3.705}, {'end': 3527.178, 'text': 'now i have to use the same data to basically do the prediction.', 'start': 3523.434, 'duration': 3.744}], 'summary': 'Transformed new data using scaler.transform to get scaled data set.', 'duration': 26.607, 'max_score': 3500.571, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM3500571.jpg'}, {'end': 3620.442, 'src': 'embed', 'start': 3596.846, 'weight': 1, 'content': [{'end': 3603.389, 'text': "Because all these projects that I'll be showing you at the end of the day, I will be doing the deployment with the help of Dockers and GitHub Actions.", 'start': 3596.846, 'duration': 6.543}, {'end': 3608.252, 'text': 'So to begin with, what we are going to do is that, first of all, we need to import pickle.', 'start': 3603.95, 'duration': 4.302}, {'end': 3614.175, 'text': 'This is a very, very important file altogether because this will actually help us to do the pickling of this regression model.', 'start': 3608.432, 'duration': 5.743}, {'end': 3620.442, 'text': 'Now, in order to convert this into a pickle file, I will be using something like pickle.dump.', 'start': 3614.855, 'duration': 5.587}], 'summary': 'Projects will be deployed using dockers and github actions, involving pickling of a regression model.', 'duration': 23.596, 'max_score': 3596.846, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM3596846.jpg'}, {'end': 3830.599, 'src': 'embed', 'start': 3790.913, 'weight': 0, 'content': [{'end': 3797.32, 
'text': 'so, in short, what i have actually done is that once you create your model file, you can definitely do the pickling.', 'start': 3790.913, 'duration': 6.407}, {'end': 3799.541, 'text': 'You can do the pickling over there.', 'start': 3798.281, 'duration': 1.26}, {'end': 3803.765, 'text': 'You can save it as a file in your local storage, in the cloud, wherever you want.', 'start': 3799.562, 'duration': 4.203}, {'end': 3805.626, 'text': 'And this is what happens in the real world scenario.', 'start': 3803.825, 'duration': 1.801}, {'end': 3809.869, 'text': "Once we create a pickle file, let's say I'm using an AWS S3 bucket.", 'start': 3806.066, 'duration': 3.803}, {'end': 3815.153, 'text': "So I will probably be storing over there and later on I'll be loading it and I'll be doing the prediction.", 'start': 3809.909, 'duration': 5.244}, {'end': 3818.674, 'text': 'okay. so here you can see we can also do the same thing.', 'start': 3815.593, 'duration': 3.081}, {'end': 3825.297, 'text': 'we can create that into a pickle file, which will basically be storing all this information in a serialized format,', 'start': 3818.674, 'duration': 6.623}, {'end': 3830.599, 'text': 'all the information regarding that specific model, and then i can again load it and again i can do the prediction.', 'start': 3825.297, 'duration': 5.302}], 'summary': 'After creating a model file, pickling allows saving and loading it for making predictions, applicable in real-world scenarios.', 'duration': 39.686, 'max_score': 3790.913, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM3790913.jpg'}, {'end': 4164.408, 'src': 'heatmap', 'start': 3958.3, 'weight': 0.915, 'content': [{'end': 3961.181, 'text': 'Okay, so this is what is my repository name.', 'start': 3958.3, 'duration': 2.881}, {'end': 3964.302, 'text': 'I also need to add a readme file.', 'start': 3961.921, 'duration': 2.381}, {'end': 3970.905, 'text': 'I also require a 
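The pickling round trip described here, dump the fitted model, load it back later, predict again, can be sketched as follows; the file name and data are illustrative stand-ins:

```python
import pickle

import numpy as np
from sklearn.linear_model import LinearRegression

# Toy stand-in for the trained Boston-housing regressor
rng = np.random.default_rng(42)
X = rng.normal(size=(50, 13))
y = X @ rng.normal(size=13)
model = LinearRegression().fit(X, y)

with open("regmodel.pkl", "wb") as f:   # binary mode: pickle is a byte format
    pickle.dump(model, f)               # serialise the whole fitted estimator

with open("regmodel.pkl", "rb") as f:
    loaded = pickle.load(f)             # restore it later: local disk, S3, etc.

print(loaded.predict(X[:1]))
```

The loaded estimator carries its fitted coefficients with it, so its predictions match the original model's exactly; any scaler used at training time should be pickled alongside it.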
gitignore file because I will be doing all my coding with the help of Python.', 'start': 3965.122, 'duration': 5.783}, {'end': 3977.727, 'text': "So some of the files that are already present with the help of Python, I don't have to commit in GitHub again and again.", 'start': 3971.385, 'duration': 6.342}, {'end': 3980.368, 'text': "So that is the reason why I'm specifically using this.", 'start': 3978.067, 'duration': 2.301}, {'end': 3983.849, 'text': 'And you can probably pick up any kind of license.', 'start': 3981.128, 'duration': 2.721}, {'end': 3989.19, 'text': "Let's say that I am picking up some Apache license 2.0.", 'start': 3983.929, 'duration': 5.261}, {'end': 3994.351, 'text': 'And then finally, I will go ahead and create my repository, okay? Now, this is the first step.', 'start': 3989.19, 'duration': 5.161}, {'end': 3999.572, 'text': 'Please go ahead and create your own GitHub account along with that create a repository.', 'start': 3994.751, 'duration': 4.821}, {'end': 4007.714, 'text': 'Now, here you are able to see that, okay, fine, I can see that my entire repository is over here.', 'start': 4000.272, 'duration': 7.442}, {'end': 4013.256, 'text': "And whatever code I'm going to write, Going ahead, you know, I'm going to commit in this repository.", 'start': 4008.114, 'duration': 5.142}, {'end': 4017.838, 'text': 'Okay, now, in order to you know, clone this repository,', 'start': 4013.656, 'duration': 4.182}, {'end': 4022.961, 'text': "because I really need to clone this repository in my Local so that I'll be able to make a commit on to it.", 'start': 4017.838, 'duration': 5.123}, {'end': 4029.004, 'text': 'Okay, So, in order to make a clone, what all things we can do is that first of all just go over here.', 'start': 4023.081, 'duration': 5.923}, {'end': 4034.627, 'text': "You'll be able to see local over here local as an option, and here you have something called as clone.", 'start': 4029.004, 'duration': 5.623}, {'end': 4040.692, 'text': "Okay, 
so I'm going to click on this and I have created one folder which is called as end-to-end project,", 'start': 4034.627, 'duration': 6.065}, {'end': 4044.997, 'text': 'and my first project will be related to Boston house pricing data set.', 'start': 4040.692, 'duration': 4.305}, {'end': 4048.842, 'text': "okay, so here I'll just open my command prompt.", 'start': 4044.997, 'duration': 3.845}, {'end': 4058.414, 'text': "So here you'll be able to see I can see my command prompt and then I will go to this specific path, because I want my repository over here.", 'start': 4050.191, 'duration': 8.223}, {'end': 4060.655, 'text': 'So this will go to E.', 'start': 4058.954, 'duration': 1.701}, {'end': 4070.418, 'text': 'Again, you open command prompt, you open terminal from your Mac or Linux and go to that specific folder where you want to clone that repository.', 'start': 4060.655, 'duration': 9.763}, {'end': 4072.339, 'text': 'So I will go to that particular folder.', 'start': 4070.478, 'duration': 1.861}, {'end': 4076.322, 'text': "And then I'm just going to write git clone.", 'start': 4073.039, 'duration': 3.283}, {'end': 4080.925, 'text': 'Okay And whatever URL I have actually copied over here.', 'start': 4077.022, 'duration': 3.903}, {'end': 4083.767, 'text': "So this entire URL, I'm going to paste it over here.", 'start': 4081.445, 'duration': 2.322}, {'end': 4089.911, 'text': 'So as soon as I execute this here, you can see that total, the cloning has actually happened.', 'start': 4084.327, 'duration': 5.584}, {'end': 4092.373, 'text': 'Now, if I probably go and see my folder.', 'start': 4090.011, 'duration': 2.362}, {'end': 4099.837, 'text': 'So Boston house pricing is basically here and you can see all your git ignore license file and readme file.', 'start': 4092.873, 'duration': 6.964}, {'end': 4104.479, 'text': 'Okay So in short, what we have done is that we have cloned that entire repository.', 'start': 4100.037, 'duration': 4.442}, {'end': 4105.819, 'text': 'Now 
from this.', 'start': 4104.64, 'duration': 1.179}, {'end': 4112.622, 'text': "whatever files I will probably be creating, whatever things I'll be creating, I will be publishing over there right?", 'start': 4105.819, 'duration': 6.803}, {'end': 4122.663, 'text': 'Now, if you remember, guys, already in my practicals, you will be able to see that I have actually created one IPYNB file and pickle file right?', 'start': 4113.182, 'duration': 9.481}, {'end': 4125.805, 'text': 'So just copy this file back to your.', 'start': 4123.124, 'duration': 2.681}, {'end': 4129.067, 'text': 'the same folder over here which you have actually created.', 'start': 4126.585, 'duration': 2.482}, {'end': 4135.171, 'text': 'so this two file should also be over here and we will be using this pickle file in order to do the prediction.', 'start': 4129.067, 'duration': 6.104}, {'end': 4137.254, 'text': 'okay, super easy, till here.', 'start': 4135.171, 'duration': 2.083}, {'end': 4144.74, 'text': 'uh, if you find that your git clone is not working, just go to your browser and just download git cli.', 'start': 4137.254, 'duration': 7.486}, {'end': 4148.881, 'text': 'okay, so git cli can be downloaded for windows.', 'start': 4144.74, 'duration': 4.141}, {'end': 4152.084, 'text': 'it can be downloaded for mac or linux.', 'start': 4148.881, 'duration': 3.203}, {'end': 4155.305, 'text': 'okay, so here you will be able to see just go and click on download.', 'start': 4152.084, 'duration': 3.221}, {'end': 4164.408, 'text': 'so mac, os, windows, linux, whatever is your um os, just go and download it and then it will download an exe file and just continue.', 'start': 4155.305, 'duration': 9.103}], 'summary': 'Setting up a github repository, cloning it, and preparing for project work.', 'duration': 206.108, 'max_score': 3958.3, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM3958300.jpg'}, {'end': 4112.622, 'src': 'embed', 'start': 4084.327, 
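The clone step walked through here can be rehearsed offline like this; a local repository stands in for the GitHub URL (the video clones https://github.com/krishnaik06/bostonhousepricing), and all paths are illustrative:

```shell
set -e
work=$(mktemp -d)

# Stand-in for the repository created on github.com with README/.gitignore/LICENSE
git init -q "$work/remote"
cd "$work/remote"
git config user.name demo && git config user.email demo@example.com
touch README.md .gitignore LICENSE
git add -A && git commit -qm "Initial commit"

# The step from the video: go to your projects folder, then `git clone <url>`
cd "$work"
git clone -q "$work/remote" bostonhousepricing
ls -a bostonhousepricing    # now contains .gitignore, LICENSE, README.md
```

With a real GitHub repository the only change is the argument to `git clone`; copy the HTTPS URL from the repository's "Code" button.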
'weight': 2, 'content': [{'end': 4089.911, 'text': 'So as soon as I execute this here, you can see that total, the cloning has actually happened.', 'start': 4084.327, 'duration': 5.584}, {'end': 4092.373, 'text': 'Now, if I probably go and see my folder.', 'start': 4090.011, 'duration': 2.362}, {'end': 4099.837, 'text': 'So Boston house pricing is basically here and you can see all your git ignore license file and readme file.', 'start': 4092.873, 'duration': 6.964}, {'end': 4104.479, 'text': 'Okay So in short, what we have done is that we have cloned that entire repository.', 'start': 4100.037, 'duration': 4.442}, {'end': 4105.819, 'text': 'Now from this.', 'start': 4104.64, 'duration': 1.179}, {'end': 4112.622, 'text': "whatever files I will probably be creating, whatever things I'll be creating, I will be publishing over there right?", 'start': 4105.819, 'duration': 6.803}], 'summary': 'Successfully executed cloning of boston house pricing repository.', 'duration': 28.295, 'max_score': 4084.327, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM4084327.jpg'}, {'end': 4227.034, 'src': 'embed', 'start': 4199.712, 'weight': 5, 'content': [{'end': 4202.553, 'text': 'Installation part is very much easy with respect to Visual Studio Code.', 'start': 4199.712, 'duration': 2.841}, {'end': 4208.395, 'text': 'Now once you have downloaded the Visual Studio Code, what you can do is that you can just open that Visual Studio Code.', 'start': 4203.173, 'duration': 5.222}, {'end': 4209.916, 'text': 'I have already downloaded it.', 'start': 4208.495, 'duration': 1.421}, {'end': 4212.298, 'text': 'so over here you can see this.', 'start': 4210.476, 'duration': 1.822}, {'end': 4214.761, 'text': 'so i will open this right now.', 'start': 4212.298, 'duration': 2.463}, {'end': 4217.324, 'text': 'some of the projects is visible over here.', 'start': 4214.761, 'duration': 2.563}, {'end': 4221.249, 'text': 'so let me just open the 
folder.', 'start': 4217.324, 'duration': 3.925}, {'end': 4227.034, 'text': 'uh, and the folder is nothing but the same location that we are going to go ahead with.', 'start': 4221.249, 'duration': 5.785}], 'summary': 'Installing and using visual studio code is easy and straightforward.', 'duration': 27.322, 'max_score': 4199.712, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM4199712.jpg'}], 'start': 3341.416, 'title': 'Regression model predictions', 'summary': 'Covers using a regression model to predict outcomes, including reshaping the dataset, emphasizing standardization, pickling the model for deployment, and setting up an end-to-end project with git and github actions.', 'chapters': [{'end': 3482.619, 'start': 3341.416, 'title': 'Regression model predictions', 'summary': 'Discusses using a regression model to predict outcomes, demonstrating how to predict both bulk data and single data points, reshaping the dataset for model prediction, and emphasizing the importance of standardization in obtaining accurate predictions.', 'duration': 141.203, 'highlights': ['The importance of standardization in obtaining accurate predictions.', 'Demonstrating how to predict both bulk data and single data points using a regression model.', 'Reshaping the dataset for model prediction to ensure proper input for the model.']}, {'end': 3790.913, 'start': 3483.179, 'title': 'Model pickling and deployment', 'summary': 'Explains the process of using standard scalar for data transformation and the importance of pickling the regression model for deployment, including the steps of pickling the model and loading the pickled model for prediction.', 'duration': 307.734, 'highlights': ['The importance of pickling the regression model for deployment with the help of Dockers and GitHub Actions is emphasized.', 'The process of using standard scalar for data transformation is explained, including the steps of transforming the data and 
using the transformed data for prediction.', 'The steps of pickling the model and loading the pickled model for prediction are detailed, including the use of pickle.dump and pickle.load functions.']}, {'end': 4337.69, 'start': 3790.913, 'title': 'Pickling model for deployment', 'summary': 'Covers pickling the model file for deployment, creating a github repository, cloning the repository, and setting up visual studio code for the end-to-end project, with a focus on converting the entire project into an end-to-end project, using git, github actions, and deploying the application to a cloud platform.', 'duration': 546.777, 'highlights': ['The process of pickling the model file for deployment is explained, including the ability to save it as a file in local storage or the cloud, and the process of storing all information in a serialized format for later use, demonstrating the practical application in creating an end-to-end project.', 'The step-by-step process of creating a GitHub repository, adding a README file, Gitignore file, and selecting a license, and then cloning the repository to the local machine is detailed, providing clear instructions for setting up the project on GitHub.', 'Setting up Visual Studio Code for the end-to-end project is explained, including the process of opening the folder, saving the workspace, and the availability of all project files for coding and committing to the repository, highlighting the importance of installing Git CLI for committing changes.']}], 'duration': 996.274, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM3341416.jpg', 'highlights': ['The process of pickling the model file for deployment is explained, including the ability to save it as a file in local storage or the cloud, and the process of storing all information in a serialized format for later use, demonstrating the practical application in creating an end-to-end project.', 'The importance of pickling the 
regression model for deployment with the help of Dockers and GitHub Actions is emphasized.', 'The step-by-step process of creating a GitHub repository, adding a README file, Gitignore file, and selecting a license, and then cloning the repository to the local machine is detailed, providing clear instructions for setting up the project on GitHub.', 'The process of using standard scalar for data transformation is explained, including the steps of transforming the data and using the transformed data for prediction.', 'The importance of standardization in obtaining accurate predictions.', 'Setting up Visual Studio Code for the end-to-end project is explained, including the process of opening the folder, saving the workspace, and the availability of all project files for coding and committing to the repository, highlighting the importance of installing Git CLI for committing changes.', 'Reshaping the dataset for model prediction to ensure proper input for the model.', 'Demonstrating how to predict both bulk data and single data points using a regression model.', 'The steps of pickling the model and loading the pickled model for prediction are detailed, including the use of pickle.dump and pickle.load functions.']}, {'end': 5086.725, 'segs': [{'end': 4444.436, 'src': 'embed', 'start': 4400.218, 'weight': 1, 'content': [{'end': 4407.202, 'text': "so here i'm just going to write down github account And the same readme file will be updated in the GitHub repository also.", 'start': 4400.218, 'duration': 6.984}, {'end': 4415.89, 'text': "So that tomorrow, if you're probably planning for the interviews, if you give your GitHub repository, if the interviewer sees you know like okay,", 'start': 4407.743, 'duration': 8.147}, {'end': 4417.472, 'text': 'you have written all the steps.', 'start': 4415.89, 'duration': 1.582}, {'end': 4420.715, 'text': 'you have written all the steps to basically create the specific projects.', 'start': 4417.472, 'duration': 3.243}, {'end': 4422.597, 
'text': 'they will definitely get impressed by you.', 'start': 4420.715, 'duration': 1.882}, {'end': 4429.683, 'text': "so here i'm just going to put all my links of the tools and softwares that we are going to use.", 'start': 4423.277, 'duration': 6.406}, {'end': 4433.887, 'text': 'so first is github.com definitely coming to the next one.', 'start': 4429.683, 'duration': 4.204}, {'end': 4438.491, 'text': 'the next one that we are going to use for this particular project is vs code ide.', 'start': 4433.887, 'duration': 4.604}, {'end': 4444.436, 'text': "so finally, you will be able to see, i'm just going to write vs code ide.", 'start': 4438.491, 'duration': 5.945}], 'summary': 'Updating github repository with project steps and tools for interviews.', 'duration': 44.218, 'max_score': 4400.218, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM4400218.jpg'}, {'end': 4649.562, 'src': 'embed', 'start': 4624.209, 'weight': 0, 'content': [{'end': 4633.412, 'text': 'So you also have to create an account with respect to Heroku, because this is the cloud platform we will be using in order to deploy our application.', 'start': 4624.209, 'duration': 9.203}, {'end': 4637.473, 'text': 'And it actually provides you five applications to deploy completely for free.', 'start': 4633.952, 'duration': 3.521}, {'end': 4639.915, 'text': 'okay, github.', 'start': 4637.953, 'duration': 1.962}, {'end': 4643.818, 'text': "there will be also something called as github action, which we'll be discussing as we go ahead.", 'start': 4639.915, 'duration': 3.903}, {'end': 4649.562, 'text': 'um, but these are the four tools that we are going to use with respect to our implementation,', 'start': 4643.818, 'duration': 5.744}], 'summary': 'Heroku offers five free deployments; github and github action also used.', 'duration': 25.353, 'max_score': 4624.209, 'thumbnail': 
'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM4624209.jpg'}, {'end': 4830.972, 'src': 'embed', 'start': 4801.742, 'weight': 4, 'content': [{'end': 4810.43, 'text': 'so here create a new environment for the project, and it is always a better step that whenever we create a new project,', 'start': 4801.742, 'duration': 8.688}, {'end': 4812.571, 'text': 'we have to create a new environment.', 'start': 4810.43, 'duration': 2.141}, {'end': 4821.739, 'text': 'so to create a new environment, the command that we are specifically going to use is something called as, and obviously, if you are using anaconda,', 'start': 4812.571, 'duration': 9.168}, {'end': 4823.801, 'text': 'i think this is the command that you should use.', 'start': 4821.739, 'duration': 2.062}, {'end': 4830.972, 'text': 'so conda create -p venv.', 'start': 4823.801, 'duration': 7.171}], 'summary': "Creating a new environment for a project is essential using 'conda create -p venv' command.", 'duration': 29.23, 'max_score': 4801.742, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM4801742.jpg'}, {'end': 5073.417, 'src': 'embed', 'start': 5047.662, 'weight': 3, 'content': [{'end': 5055.186, 'text': 'dot txt file i will be creating over here and in this, whatever libraries you require you know,', 'start': 5047.662, 'duration': 7.524}, {'end': 5061.67, 'text': 'like sklearn or any kind of libraries that you require, you can basically write it over here, like, if i take an example,', 'start': 5055.186, 'duration': 6.484}, {'end': 5066.873, 'text': "like when i'm creating this end-to-end projects, i will be using a library which is called as flask.", 'start': 5061.67, 'duration': 5.203}, {'end': 5073.417, 'text': "so flask will be the library that i'm actually going to use and what we are going to do with respect to flask.", 'start': 5067.413, 'duration': 6.004}], 'summary': 'Creating a .txt file for 
using libraries like sklearn, flask for end-to-end projects.', 'duration': 25.755, 'max_score': 5047.662, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM5047662.jpg'}], 'start': 4337.69, 'title': 'Setting up tools and environments', 'summary': 'Discusses setting up software and tools for a project, emphasizing creating a github account, downloading vs code ide, and creating a heroku account, as well as creating and activating a new environment using conda command, and the importance of creating a new environment for each project.', 'chapters': [{'end': 4643.818, 'start': 4337.69, 'title': 'Setting up tools for project implementation', 'summary': 'Discusses the process of setting up software and tools for a project, emphasizing the importance of creating a github account, downloading vs code ide, and creating a heroku account for project deployment.', 'duration': 306.128, 'highlights': ['The importance of creating a GitHub account for project implementation and showcasing steps to potential interviewers is emphasized for impressing them.', 'Emphasis is placed on downloading the VS Code IDE and providing links for access, demonstrating the significance of this tool for project development.', 'The necessity of creating a Heroku account for project deployment is stressed, with the advantage of deploying five applications for free highlighted.']}, {'end': 5086.725, 'start': 4643.818, 'title': 'Creating and activating new environment', 'summary': "Covers the process of creating a new environment using conda command 'conda create -p venv 3.7 -y' and activating the environment using 'conda activate venv', emphasizing the importance of creating a new environment for each project and creating a requirement.txt file to list the required libraries.", 'duration': 442.907, 'highlights': ["The chapter covers the process of creating a new environment using conda command 'conda create -p venv 3.7 -y' and activating the 
environment using 'conda activate venv', emphasizing the importance of creating a new environment for each project", 'Creating a requirement.txt file to list the required libraries such as flask, sklearn, pandas, numpy, and matplotlib for the end-to-end projects']}], 'duration': 749.035, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM4337690.jpg', 'highlights': ['The necessity of creating a Heroku account for project deployment is stressed, with the advantage of deploying five applications for free highlighted.', 'Emphasis is placed on downloading the VS Code IDE and providing links for access, demonstrating the significance of this tool for project development.', 'The importance of creating a GitHub account for project implementation and showcasing steps to potential interviewers is emphasized for impressing them.', 'Creating a requirement.txt file to list the required libraries such as flask, sklearn, pandas, numpy, and matplotlib for the end-to-end projects', "The chapter covers the process of creating a new environment using conda command 'conda create -p venv 3.7 -y' and activating the environment using 'conda activate venv', emphasizing the importance of creating a new environment for each project"]}, {'end': 5890.44, 'segs': [{'end': 5115.297, 'src': 'embed', 'start': 5086.725, 'weight': 0, 'content': [{'end': 5088.346, 'text': "so i'm going to import this.", 'start': 5086.725, 'duration': 1.621}, {'end': 5089.447, 'text': 'flask will be capital f.', 'start': 5088.346, 'duration': 1.101}, {'end': 5092.97, 'text': 'so these all requirements.', 'start': 5091.108, 'duration': 1.862}, {'end': 5100.516, 'text': 'now, if probably i want to install this library, all i have to do is that write down a command over here as python.', 'start': 5092.97, 'duration': 7.546}, {'end': 5105.66, 'text': 'uh, python, uh, sorry, not python.', 'start': 5100.516, 'duration': 5.144}, {'end': 5111.285, 'text': 'it says like 
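The requirements file being described might look like the sketch below; note that the pip package for sklearn is named `scikit-learn`, and version pins are left out for brevity:

```text
flask
scikit-learn
pandas
numpy
matplotlib
```

Inside the activated environment (`conda activate venv/`), `pip install -r requirements.txt` installs everything listed in one go.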
pip install, okay, minus r requirements, dot txt.', 'start': 5105.66, 'duration': 5.625}, {'end': 5115.297, 'text': 'So this will be requirements.txt.', 'start': 5112.796, 'duration': 2.501}], 'summary': 'Using pip to install library requirements in flask', 'duration': 28.572, 'max_score': 5086.725, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM5086725.jpg'}, {'end': 5253.783, 'src': 'embed', 'start': 5225.58, 'weight': 1, 'content': [{'end': 5229.522, 'text': 'In order to set it up, first of all, just go and write git config.', 'start': 5225.58, 'duration': 3.942}, {'end': 5234.084, 'text': "And when you write git, it's git config global user.name.", 'start': 5230.623, 'duration': 3.461}, {'end': 5235.525, 'text': 'So two things you have to set up.', 'start': 5234.184, 'duration': 1.341}, {'end': 5237.806, 'text': 'So let me just again clear it.', 'start': 5236.365, 'duration': 1.441}, {'end': 5241.188, 'text': 'Okay So one is git config.', 'start': 5237.966, 'duration': 3.222}, {'end': 5244.196, 'text': 'minus, minus global.', 'start': 5242.955, 'duration': 1.241}, {'end': 5248.859, 'text': 'so we are going to set up the entire global user dot name.', 'start': 5244.196, 'duration': 4.663}, {'end': 5251.481, 'text': 'so if i execute it, i have already configured it.', 'start': 5248.859, 'duration': 2.622}, {'end': 5253.783, 'text': "so here you will be able to see that i'll be getting krishna.", 'start': 5251.481, 'duration': 2.302}], 'summary': 'Set up git config with global user.name: krishna.', 'duration': 28.203, 'max_score': 5225.58, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM5225580.jpg'}], 'start': 5086.725, 'title': 'Git and flask setup', 'summary': "Covers the installation of flask library using pip and setting up git cli with user name and email. 
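The environment and dependency setup described in this chapter can be sketched as a few commands. The `venv` folder name, Python version, and library list follow what the video mentions; note that `scikit-learn` is the installable package name for sklearn:

```shell
# Create a project-local environment (run inside the project folder)
conda create -p venv python==3.7 -y

# Activate it by path, since -p created it in ./venv rather than by name
conda activate venv/

# requirements.txt lists one library per line, e.g.:
#   flask
#   scikit-learn
#   pandas
#   numpy
#   matplotlib

# Install everything listed in the file in one step
pip install -r requirements.txt
```

Using `-p` keeps the environment inside the repository folder, which is why activation uses the path `venv/` instead of a registered environment name.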
it also discusses the process of adding, committing, and pushing files to a git repository, including steps such as using 'git add' to stage files, 'git commit' to create a snapshot of changes, and 'git push' to push changes to the repository, with a focus on setting up the commit message and pushing to the main branch.", 'chapters': [{'end': 5331.968, 'start': 5086.725, 'title': 'Flask library installation and git cli configuration', 'summary': 'Covers the installation of flask library using pip and setting up git cli with user name and email for pushing code to the github repository.', 'duration': 245.243, 'highlights': ["Flask library installation using pip: The transcript explains the process of installing the Flask library using 'pip install -r requirements.txt' command within a virtual environment, ensuring the installation takes place from the specified requirements file.", "Setting up Git CLI with user name and email: It details the configuration of user name and email using 'git config --global user.name' and 'git config --global user.email' commands, emphasizing the significance of setting the email associated with the GitHub account for seamless repository commits."]}, {'end': 5890.44, 'start': 5332.669, 'title': 'Git commit and push process', 'summary': "Discusses the process of adding, committing, and pushing files to a git repository, including steps such as using 'git add' to stage files, 'git commit' to create a snapshot of changes, and 'git push' to push changes to the repository, with a focus on setting up the commit message and pushing to the main branch.", 'duration': 557.771, 'highlights': ["The process of adding, committing, and pushing files to a Git repository involves steps such as using 'git add' to stage files, 'git commit' to create a snapshot of changes, and 'git push' to push changes to the repository.", "Setting up the commit message and pushing to the main branch is emphasized, with a focus on using 'git commit -m' to include a 
meaningful commit message and 'git push' to push changes to the main branch.", "The detailed explanation of using 'git add' to stage files, 'git commit' to create a snapshot of changes, and 'git push' to push changes to the repository is provided."]}], 'duration': 803.715, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM5086725.jpg', 'highlights': ["Flask library installation using pip: The process of installing the Flask library using 'pip install -r requirements.txt' command within a virtual environment.", "Setting up Git CLI with user name and email: Configuration of user name and email using 'git config --global user.name' and 'git config --global user.email' commands."]}, {'end': 7235.209, 'segs': [{'end': 5929.984, 'src': 'embed', 'start': 5890.44, 'weight': 1, 'content': [{'end': 5897.823, 'text': 'remember that once we do this git commit right, it is basically going to take from the stage environment and put it in something like origin right.', 'start': 5890.44, 'duration': 7.383}, {'end': 5900.724, 'text': 'so that is the reason we wrote git push origin main.', 'start': 5897.823, 'duration': 2.901}, {'end': 5907.588, 'text': "okay.
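The Git workflow summarized above, as a command sketch; the name, email, and commit message are placeholders for your own:

```shell
# One-time identity setup (use the email tied to your GitHub account)
git config --global user.name "Your Name"
git config --global user.email "you@example.com"

# Inside the project folder: stage, snapshot, and publish
git add .                      # stage all changed files
git commit -m "first commit"   # snapshot the staged changes with a meaningful message
git push origin main           # publish the commits to the main branch on GitHub
```

Strictly speaking, `git commit` only records the snapshot in the local repository; it is `git push` that sends the commits to the `origin` remote on GitHub.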
and finally, when we do git push here, you'll be able to see that all the information has been pushed over here.", 'start': 5900.724, 'duration': 6.864}, {'end': 5912.59, 'text': "so finally, you'll be able to see successful github repository which have all these files.", 'start': 5907.588, 'duration': 5.002}, {'end': 5918.672, 'text': "now, in my next step, what i'm actually going to do is that i want to create an app.py file.", 'start': 5913.746, 'duration': 4.926}, {'end': 5920.994, 'text': 'i will create a front-end application.', 'start': 5918.672, 'duration': 2.322}, {'end': 5927.522, 'text': "i'll use this pickle file for the you know prediction purpose and then i'll be able to get the output.", 'start': 5920.994, 'duration': 6.528}, {'end': 5929.984, 'text': 'so that is what we are going to do in the upcoming videos.', 'start': 5927.522, 'duration': 2.462}], 'summary': 'Demonstrating git commit, push, and file creation for prediction app.', 'duration': 39.544, 'max_score': 5890.44, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM5890440.jpg'}, {'end': 6928.124, 'src': 'heatmap', 'start': 6827.276, 'weight': 0.894, 'content': [{'end': 6829.337, 'text': "I'm just going to go to my command prompt.", 'start': 6827.276, 'duration': 2.061}, {'end': 6831.759, 'text': 'It is still inside my venv environment.', 'start': 6829.377, 'duration': 2.382}, {'end': 6835.181, 'text': "So let's go and run python app.py.", 'start': 6831.819, 'duration': 3.362}, {'end': 6840.225, 'text': "So once I run this, you'll be seeing that you have to run in this way, right? 
Python app.py.", 'start': 6835.201, 'duration': 5.024}, {'end': 6845.409, 'text': "And this is your IP, which you'll be able to see 127.0.0.1", 'start': 6840.985, 'duration': 4.424}, {'end': 6847.01, 'text': ':5000, right.', 'start': 6845.409, 'duration': 1.601}, {'end': 6852.055, 'text': 'so i will click on this and here you can see hello world, you are able to get perfect.', 'start': 6847.01, 'duration': 5.045}, {'end': 6856.379, 'text': 'uh, now the thing is that i had created an api right over here.', 'start': 6852.055, 'duration': 4.324}, {'end': 6862.245, 'text': 'so here you can see that i created an api and the api name was nothing but slash predict underscore api.', 'start': 6856.379, 'duration': 5.866}, {'end': 6867.63, 'text': 'right, so what happens if i just execute this slash predict underscore api.', 'start': 6862.245, 'duration': 5.385}, {'end': 6872.288, 'text': 'obviously it is not going to run because method is not allowed.', 'start': 6868.766, 'duration': 3.522}, {'end': 6874.009, 'text': 'method is not allowed for the requested url.', 'start': 6872.288, 'duration': 1.721}, {'end': 6875.931, 'text': 'why?
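A minimal sketch of the app.py being described here. The pickle file names are illustrative, and a placeholder computation stands in for the real scaler and regression model so the sketch is self-contained:

```python
import numpy as np
from flask import Flask, request, jsonify

app = Flask(__name__)

# In the video, a pickled scaler and regression model are loaded here, e.g.:
#   regmodel = pickle.load(open('regmodel.pkl', 'rb'))
#   scalar   = pickle.load(open('scaling.pkl', 'rb'))
# (file names are illustrative; use whatever you pickled earlier)

@app.route('/')
def home():
    return 'Hello World'

@app.route('/predict_api', methods=['POST'])   # POST only: a browser GET returns 405 Method Not Allowed
def predict_api():
    data = request.json['data']                          # body shape: {"data": {feature: value, ...}}
    arr = np.array(list(data.values())).reshape(1, -1)   # one row, n features
    # new_data = scalar.transform(arr)                   # standardize with the fitted scaler
    # output = regmodel.predict(new_data)[0]             # real prediction
    output = float(arr.sum())                            # placeholder so the sketch runs without the pickles
    return jsonify(output)

# Run with `python app.py`; app.run(debug=True) serves on http://127.0.0.1:5000
# if __name__ == '__main__':
#     app.run(debug=True)
```

Opening `/predict_api` directly in the browser reproduces the "method is not allowed" response from the video, because the route accepts POST requests only.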
because the method is post.', 'start': 6874.009, 'duration': 1.922}, {'end': 6881.434, 'text': 'so we have to give some information, some post from the client, some, some information.', 'start': 6875.931, 'duration': 5.503}, {'end': 6883.315, 'text': 'what is the information?', 'start': 6881.434, 'duration': 1.881}, {'end': 6884.876, 'text': 'that data it is looking at.', 'start': 6883.315, 'duration': 1.561}, {'end': 6888.699, 'text': 'so for doing this, one one thing we are going to use is something called as postman.', 'start': 6884.876, 'duration': 3.823}, {'end': 6893.622, 'text': 'so please go ahead and install postman, so you can go ahead and click on post and download for windows.', 'start': 6888.699, 'duration': 4.923}, {'end': 6897.865, 'text': "So once you go and click over here here, you'll be able to see something like postman.", 'start': 6894.182, 'duration': 3.683}, {'end': 6901.607, 'text': 'Windows bit 64 bit, you can download it and you can install it.', 'start': 6898.525, 'duration': 3.082}, {'end': 6903.328, 'text': 'And again, installation is very simple.', 'start': 6901.867, 'duration': 1.461}, {'end': 6907.731, 'text': "After you download this, just click on this, you'll be getting an exe file.", 'start': 6903.789, 'duration': 3.942}, {'end': 6910.753, 'text': 'Okay And you can start the installation.', 'start': 6908.232, 'duration': 2.521}, {'end': 6914.476, 'text': 'So here you can see it is 143 MB file, which is getting downloaded.', 'start': 6910.773, 'duration': 3.703}, {'end': 6918.979, 'text': 'And you can start the installation just by clicking, clicking next, next, next, next, next.', 'start': 6914.996, 'duration': 3.983}, {'end': 6922.822, 'text': "Now, once you install it here, you'll be able to see something like this.", 'start': 6919.519, 'duration': 3.303}, {'end': 6928.124, 'text': "Okay, I have created a lot many, but it's okay, not a problem.", 'start': 6922.922, 'duration': 5.202}], 'summary': 'Demonstrating running python 
app, creating & testing api using postman.', 'duration': 100.848, 'max_score': 6827.276, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM6827276.jpg'}, {'end': 7045.293, 'src': 'embed', 'start': 7016.36, 'weight': 2, 'content': [{'end': 7022.242, 'text': 'so here you will be able to see this is absolutely a validated json and it is quite accurate.', 'start': 7016.36, 'duration': 5.882}, {'end': 7025.943, 'text': "okay. So I'm going to go again back to the JSON.", 'start': 7022.242, 'duration': 3.701}, {'end': 7027.144, 'text': 'prepare my data like this', 'start': 7025.943, 'duration': 1.201}, {'end': 7029.185, 'text': 'You can put up any values over here.', 'start': 7027.464, 'duration': 1.721}, {'end': 7033.947, 'text': 'And here you can see that this is the post, this is my API.', 'start': 7029.885, 'duration': 4.062}, {'end': 7035.868, 'text': "I'm just going to go and hit on send.", 'start': 7034.107, 'duration': 1.761}, {'end': 7039.55, 'text': 'Once I hit on send, I should be getting my response as an output.', 'start': 7036.388, 'duration': 3.162}, {'end': 7045.293, 'text': 'Okay, so once I click on send, now here you can see I am getting my output that is 30.08.', 'start': 7039.97, 'duration': 5.323}], 'summary': 'The validated json data produced an output of 30.08 upon request.', 'duration': 28.933, 'max_score': 7016.36, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM7016360.jpg'}, {'end': 7203.584, 'src': 'embed', 'start': 7176.415, 'weight': 0, 'content': [{'end': 7182.356, 'text': 'and final step is basically to get push from origin to the main branch right.', 'start': 7176.415, 'duration': 5.941}, {'end': 7186.877, 'text': "so once we push it here you'll be able to see all the pushes basically happening.", 'start': 7182.356, 'duration': 4.521}, {'end': 7188.617, 'text': 'perfect, it has done, completed.', 'start': 7186.877, 'duration': 
1.74}, {'end': 7190.418, 'text': "now i'll go to my browser.", 'start': 7188.617, 'duration': 1.801}, {'end': 7193.238, 'text': "let's see whether it has been updated in my github file or not.", 'start': 7190.418, 'duration': 2.82}, {'end': 7200.201, 'text': "now, here you can see that i'll just go and reload it, okay, so here all the files has been updated.", 'start': 7193.238, 'duration': 6.963}, {'end': 7203.584, 'text': "i'll also make sure that i'll update all the readme file, what all steps we did.", 'start': 7200.201, 'duration': 3.383}], 'summary': 'Pushed changes to main branch, verified updates in github.', 'duration': 27.169, 'max_score': 7176.415, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM7176415.jpg'}], 'start': 5890.44, 'title': 'Flask web app and api development', 'summary': 'Covers committing changes to github, creating a flask app for model deployment, developing a flask web application for predictive modeling, and using postman for api testing, including key steps and tools such as github, flask, postman, and the use of pickle files and json data.', 'chapters': [{'end': 5959.226, 'start': 5890.44, 'title': 'Committing changes and building an end-to-end application', 'summary': 'Details the process of committing changes to a github repository, including pushing to the origin, and outlines the plan to create an app.py file for front-end application and prediction purposes in the upcoming videos.', 'duration': 68.786, 'highlights': ['The process of committing changes to a GitHub repository, including using git push to push all the information to the repository, is emphasized as essential for successful implementation.', 'The plan to create an app.py file for front-end application, utilizing a pickle file for prediction, is outlined as the next step in the upcoming videos.', 'The importance of following good practices and configuring tools for committing to the GitHub repository is 
stressed as a crucial part of the process.']}, {'end': 6307.263, 'start': 5959.326, 'title': 'Creating flask app for model deployment', 'summary': 'Covers the process of creating a flask app for model deployment, including importing necessary libraries, loading the regression model pickle file, defining routes for home page and predict api, and utilizing an extension called github copilot for code generation.', 'duration': 347.937, 'highlights': ['Utilizing github copilot extension for code generation in visual studio code', 'Defining routes for home page and predict API in the Flask app', 'Loading the regression model pickle file and creating a Flask app for model deployment', 'Importing necessary libraries such as numpy and pandas for the Flask app']}, {'end': 6875.931, 'start': 6307.803, 'title': 'Flask web app development', 'summary': 'Covers the development of a flask web application for predictive modeling, including the creation of a pickle file for standardization, transformation of data for prediction, and the creation of an api for prediction using flask. it also mentions the use of postman for testing the prediction api.', 'duration': 568.128, 'highlights': ['Creation of a pickle file for standardization', 'Transformation of data for prediction using Flask', 'Use of Postman for testing the prediction API']}, {'end': 7235.209, 'start': 6875.931, 'title': 'Using postman for api testing', 'summary': 'Discusses the process of using postman to send a post request to a local api, transforming data into json format, and validating the json data. 
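The JSON body prepared in Postman has this shape: a top-level "data" key mapping feature names to values. The feature names below follow the Boston Housing dataset used in the video; the values are arbitrary sample inputs:

```python
import json

# Request body for /predict_api in the shape the Flask app expects
payload = {
    "data": {
        "CRIM": 0.00632, "ZN": 18.0, "INDUS": 2.31, "CHAS": 0.0,
        "NOX": 0.538, "RM": 6.575, "AGE": 65.2, "DIS": 4.09,
        "RAD": 1.0, "TAX": 296.0, "PTRATIO": 15.3, "B": 396.9,
        "LSTAT": 4.98,
    }
}

# A dumps/loads round trip is a quick local stand-in for pasting the body
# into a JSON validator before sending it from Postman.
body = json.dumps(payload, indent=2)
assert json.loads(body) == payload
print(body)
```

In Postman, the same text goes into Body → raw → JSON, with the request method set to POST and the URL pointing at the local `/predict_api` endpoint.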
it also demonstrates how to commit and push the code to a main branch in a repository, resulting in successful updates on github.', 'duration': 359.278, 'highlights': ['The chapter demonstrates using Postman to send a post request to a local API, receiving an output of 30.08, and exploring the impact of different input values on the output, providing a practical understanding of API testing.', 'It explains the process of transforming data into JSON format, ensuring the validity of the JSON data through JSON validation, and sending the transformed data to the API, showcasing a meticulous approach to data preparation.', 'The chapter provides a step-by-step guide on committing and pushing code to a main branch in a repository, culminating in successful updates on GitHub, showcasing proficiency in version control and collaboration tools.', 'It emphasizes the importance of ensuring the validity of JSON data by using a JSON validator, demonstrating a commitment to data accuracy and quality in API testing.', 'The chapter demonstrates the use of Postman to send a post request to a local API, receiving an output of 30.08, and explores the impact of different input values on the output, providing a practical understanding of API testing.']}], 'duration': 1344.769, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM5890440.jpg', 'highlights': ['The chapter provides a step-by-step guide on committing and pushing code to a main branch in a repository, showcasing proficiency in version control and collaboration tools.', 'The process of committing changes to a GitHub repository, including using git push to push all the information to the repository, is emphasized as essential for successful implementation.', 'The chapter demonstrates using Postman to send a post request to a local API, receiving an output of 30.08, and exploring the impact of different input values on the output, providing a practical understanding of API 
testing.', 'The plan to create an app.py file for front-end application, utilizing a pickle file for prediction, is outlined as the next step in the upcoming videos.', 'The chapter emphasizes the importance of ensuring the validity of JSON data by using a JSON validator, demonstrating a commitment to data accuracy and quality in API testing.']}, {'end': 7915.016, 'segs': [{'end': 7279.165, 'src': 'embed', 'start': 7235.209, 'weight': 1, 'content': [{'end': 7238.994, 'text': 'okay, now, this is perfectly fine, till here.', 'start': 7235.209, 'duration': 3.785}, {'end': 7241.256, 'text': "now you'll think of deploying this into a cloud.", 'start': 7238.994, 'duration': 2.262}, {'end': 7245.238, 'text': "once we deploy this into a cloud, then we'll be dockerizing it.", 'start': 7242.157, 'duration': 3.081}, {'end': 7246.678, 'text': 'that will be using github action.', 'start': 7245.238, 'duration': 1.44}, {'end': 7249.679, 'text': 'so there are many things that we are going to learn as we go ahead.', 'start': 7246.678, 'duration': 3.001}, {'end': 7254.881, 'text': "be with me, guys, because one project if you're able to do properly all the projects, you'll be able to do it okay.", 'start': 7249.679, 'duration': 5.202}, {'end': 7255.481, 'text': 'so, thank you.', 'start': 7254.881, 'duration': 0.6}, {'end': 7256.341, 'text': 'this was it from my side.', 'start': 7255.481, 'duration': 0.86}, {'end': 7257.742, 'text': "i'll see you in the next video.", 'start': 7256.341, 'duration': 1.401}, {'end': 7260.783, 'text': "in the next video, we'll try to see how we can actually deploy it.", 'start': 7257.742, 'duration': 3.041}, {'end': 7261.003, 'text': 'thank you.', 'start': 7260.783, 'duration': 0.22}, {'end': 7279.165, 'text': 'So, guys, in our previous video, we have actually created an API which is like predict underscore API, which was a post request.', 'start': 7272.341, 'duration': 6.824}], 'summary': 'Learning about cloud deployment, dockerization, and github 
actions for project development.', 'duration': 43.956, 'max_score': 7235.209, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM7235209.jpg'}, {'end': 7310.914, 'src': 'embed', 'start': 7284.408, 'weight': 0, 'content': [{'end': 7288.31, 'text': 'After, we are passing it through scalar transformation, that is standardization.', 'start': 7284.408, 'duration': 3.902}, {'end': 7294.865, 'text': 'And from that we are also passing it through the regression model, which is basically giving me the output.', 'start': 7288.83, 'duration': 6.035}, {'end': 7304.551, 'text': 'but instead of just creating in the form of api, why not just create a small web application wherein we just provide the inputs, we submit the form.', 'start': 7294.865, 'duration': 9.686}, {'end': 7310.914, 'text': 'as soon as we submit the form, we we basically take the data over here and do the prediction with the help of, uh,', 'start': 7304.551, 'duration': 6.363}], 'summary': 'Data is standardized, then used in regression model to predict output. 
a web app is suggested for input and prediction.', 'duration': 26.506, 'max_score': 7284.408, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM7284408.jpg'}, {'end': 7647.882, 'src': 'embed', 'start': 7620.955, 'weight': 3, 'content': [{'end': 7626.537, 'text': 'basic HTML is more than sufficient to just create this Front-end application right.', 'start': 7620.955, 'duration': 5.582}, {'end': 7628.617, 'text': 'and then, finally, I have my predict button right?', 'start': 7626.537, 'duration': 2.08}, {'end': 7635.139, 'text': 'So so, inside this form, when i click on this predict button, it will go to this particular url that is predict, and in short,', 'start': 7628.637, 'duration': 6.502}, {'end': 7640.06, 'text': 'it will basically call this slash predict, and then entire this function will be called.', 'start': 7635.139, 'duration': 4.921}, {'end': 7641.22, 'text': "after that, you'll be seeing that.", 'start': 7640.06, 'duration': 1.16}, {'end': 7642.881, 'text': 'where does this output get displayed?', 'start': 7641.22, 'duration': 1.661}, {'end': 7647.882, 'text': 'so if you go decide right, there will be something called as a placeholder, which is in double flower brackets.', 'start': 7642.881, 'duration': 5.001}], 'summary': "Basic html is sufficient for creating the front-end; 'predict' button calls '/predict' url and displays output in a placeholder.", 'duration': 26.927, 'max_score': 7620.955, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM7620955.jpg'}, {'end': 7761.445, 'src': 'embed', 'start': 7736.778, 'weight': 2, 'content': [{'end': 7744.28, 'text': "here we are going to basically take up all the values that we are going to get from that form and then here i'm going to convert this into an array with this shape,", 'start': 7736.778, 'duration': 7.502}, {'end': 7748.761, 'text': 'and then we are finally transforming it and then finally, 
we are predicting it and getting the output,', 'start': 7744.28, 'duration': 4.481}, {'end': 7753.923, 'text': 'and we are displaying the output in the same home.html with this new text.', 'start': 7748.761, 'duration': 5.162}, {'end': 7755.383, 'text': 'okay, that is what we are doing.', 'start': 7753.923, 'duration': 1.46}, {'end': 7756.763, 'text': "so let's go ahead and quickly.", 'start': 7755.383, 'duration': 1.38}, {'end': 7761.445, 'text': "let's go ahead and click predict and let's see whether we get the output and something has happened.", 'start': 7756.763, 'duration': 4.682}], 'summary': 'Converting form values into an array, transforming, predicting, and displaying output in home.html.', 'duration': 24.667, 'max_score': 7736.778, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM7736778.jpg'}, {'end': 7839.682, 'src': 'embed', 'start': 7809.951, 'weight': 5, 'content': [{'end': 7820.956, 'text': "and then i will just add git commit, minus m, and i'm going to give the message and here you can see the web app is ready.", 'start': 7809.951, 'duration': 11.005}, {'end': 7831.783, 'text': "okay, and finally, just execute this two file changes and then finally i'll just do git push along with that from origin to main.", 'start': 7820.956, 'duration': 10.827}, {'end': 7834.019, 'text': 'So here it is.', 'start': 7833.058, 'duration': 0.961}, {'end': 7839.682, 'text': 'The entire Git push is basically happening and we are properly fine, right? 
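The /predict route described here can be sketched as follows. The scaler and model are stubbed with placeholders (the real app loads them from pickle files), and a plain string stands in for rendering home.html with its `{{ prediction_text }}` placeholder:

```python
import numpy as np
from flask import Flask, request

app = Flask(__name__)

# Stand-ins so the sketch runs on its own; the real app uses the
# pickled scaler's transform and the pickled regression model's predict.
class StubModel:
    def predict(self, X):
        return [float(np.asarray(X).sum())]   # placeholder for regmodel.predict

def scale(X):
    return X                                  # placeholder for scalar.transform

regmodel = StubModel()

@app.route('/predict', methods=['POST'])      # the form's action="/predict" posts here
def predict():
    # Read every submitted form field, in order, as a float
    data = [float(x) for x in request.form.values()]
    final_input = scale(np.array(data).reshape(1, -1))
    output = regmodel.predict(final_input)[0]
    # The real app returns render_template('home.html', prediction_text=...);
    # a plain string keeps the sketch template-free.
    return f"The predicted house price is {output}"
```

Submitting the HTML form triggers this route; the returned text is what the template's double-curly-brace placeholder displays on the page.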
So everything is done.', 'start': 7834.539, 'duration': 5.143}], 'summary': 'Web app ready, git push executed successfully.', 'duration': 29.731, 'max_score': 7809.951, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM7809951.jpg'}], 'start': 7235.209, 'title': 'Developing api and web application', 'summary': 'Covers deploying an api into the cloud, dockerizing it with github action, and creating a web application to provide inputs, submit a form, and obtain predictions from a regression model. it also details creating a web application to predict house prices using flask, capturing form input, processing it, and successfully deploying the web app.', 'chapters': [{'end': 7345.596, 'start': 7235.209, 'title': 'Deploying api and creating web application', 'summary': 'Covers the deployment of an api into a cloud, dockerizing it using github action, and creating a web application to provide inputs, submit a form, and obtain predictions from a regression model.', 'duration': 110.387, 'highlights': ['Covering deployment of an API into a cloud and dockerizing it using GitHub Action.', 'Creating a web application to provide inputs, submit a form, and obtain predictions from a regression model.', 'Explanation of predicting outcomes using a regression model and scalar transformation.']}, {'end': 7915.016, 'start': 7346.116, 'title': 'Predicting house prices web app', 'summary': 'Details the process of creating a web application to predict house prices using flask, where the form input is captured, processed, and used to make predictions, resulting in the successful deployment of the web app.', 'duration': 568.9, 'highlights': ['The process of capturing form input, converting it into an array, transforming it, and making predictions is explained in detail.', 'The creation and utilization of a simple front-end application to interact with the model and obtain predictions is demonstrated.', 'The process of committing the 
changes to Git and deploying the web app is outlined, including adding, committing, and pushing the changes to the repository.']}], 'duration': 679.807, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM7235209.jpg', 'highlights': ['Creating a web application to provide inputs, submit a form, and obtain predictions from a regression model.', 'Covering deployment of an API into a cloud and dockerizing it using GitHub Action.', 'The process of capturing form input, converting it into an array, transforming it, and making predictions is explained in detail.', 'The creation and utilization of a simple front-end application to interact with the model and obtain predictions is demonstrated.', 'Explanation of predicting outcomes using a regression model and scalar transformation.', 'The process of committing the changes to Git and deploying the web app is outlined, including adding, committing, and pushing the changes to the repository.']}, {'end': 9890.119, 'segs': [{'end': 8081.295, 'src': 'embed', 'start': 8053.821, 'weight': 0, 'content': [{'end': 8060.182, 'text': "so, based on the number of requests whenever the client is putting or probably, let's say, thousands of users are, uh,", 'start': 8053.821, 'duration': 6.361}, {'end': 8061.582, 'text': 'hitting this particular website.', 'start': 8060.182, 'duration': 1.4}, {'end': 8063.963, 'text': 'whatever website, whatever web application we are creating,', 'start': 8061.582, 'duration': 2.381}, {'end': 8071.688, 'text': 'It makes sure that you know it distributes the entire request through multiple instances and many more things.', 'start': 8064.443, 'duration': 7.245}, {'end': 8074.47, 'text': 'right?. 
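The Heroku Procfile discussed in this section is a single line (assuming the Flask instance inside app.py is named `app`):

```
web: gunicorn app:app
```

`web` is the Heroku process type; in `app:app`, the first `app` is the module (app.py) and the second is the Flask object defined inside it, which gunicorn serves as a WSGI application.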
That is the importance of gunicorn.', 'start': 8071.688, 'duration': 2.782}, {'end': 8075.131, 'text': 'in short, right?', 'start': 8074.47, 'duration': 0.661}, {'end': 8081.295, 'text': 'So, over here, when we call this gunicorn app, it is nothing but basically calling this app.py.', 'start': 8075.591, 'duration': 5.704}], 'summary': 'Gunicorn distributes requests for scalability, handling thousands of users hitting websites.', 'duration': 27.474, 'max_score': 8053.821, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM8053821.jpg'}, {'end': 8902.625, 'src': 'embed', 'start': 8876.688, 'weight': 2, 'content': [{'end': 8882.791, 'text': 'There may be some kind of configuration, some kind of dependencies, some kind of hardware issue or some kind of operating systems issues.', 'start': 8876.688, 'duration': 6.103}, {'end': 8883.512, 'text': 'also right?', 'start': 8882.791, 'duration': 0.721}, {'end': 8887.754, 'text': "Because let's say, I'm running an application in Windows machine and suddenly I deploy that in a Linux machine.", 'start': 8883.732, 'duration': 4.022}, {'end': 8889.415, 'text': 'I may get some kind of issues over there.', 'start': 8887.754, 'duration': 1.661}, {'end': 8890.216, 'text': 'also right?', 'start': 8889.415, 'duration': 0.801}, {'end': 8892.898, 'text': 'So Docker helps us to prevent that,', 'start': 8890.916, 'duration': 1.982}, {'end': 8902.625, 'text': "because here in the Docker image we will make sure that we have all the basic configurations set up then and there and we'll try to use that same base configuration in every machine we want to deploy it.", 'start': 8892.898, 'duration': 9.727}], 'summary': 'Docker ensures consistent configurations to prevent issues when deploying across different operating systems.', 'duration': 25.937, 'max_score': 8876.688, 'thumbnail':
'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM8876688.jpg'}, {'end': 9520.23, 'src': 'embed', 'start': 9495.68, 'weight': 4, 'content': [{'end': 9501.504, 'text': 'Now, as soon as this folder is being seen and whatever file is basically written over here, all these things will get deployed.', 'start': 9495.68, 'duration': 5.824}, {'end': 9504.266, 'text': 'All these things, all this process will automatically happen.', 'start': 9501.845, 'duration': 2.421}, {'end': 9511.812, 'text': 'In short, the build, push and release of a Docker container to Heroku will happen automatically from the GitHub repository itself.', 'start': 9504.787, 'duration': 7.025}, {'end': 9520.23, 'text': 'And this is super important, this build push and release, because there will be many people who will be working in a team right?', 'start': 9512.593, 'duration': 7.637}], 'summary': 'Automated deployment of docker container to heroku from github for team collaboration.', 'duration': 24.55, 'max_score': 9495.68, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM9495680.jpg'}, {'end': 9890.119, 'src': 'embed', 'start': 9882.408, 'weight': 1, 'content': [{'end': 9890.119, 'text': 'So I hope you like this particular video of deploying your data science application into Heroku Cloud with the help of Dockers and GitHub Actions.', 'start': 9882.408, 'duration': 7.711}], 'summary': 'Deploy data science app on heroku with dockers & github actions', 'duration': 7.711, 'max_score': 9882.408, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM9882408.jpg'}], 'start': 7915.556, 'title': 'Deploying applications on heroku and docker', 'summary': 'Discusses deploying python and machine learning applications on heroku using gunicorn, github, and docker, emphasizing the use of a Procfile, updating requirements.txt, and successful deployment with ci/cd
pipelines, showcasing the benefits of docker for preventing configuration and dependency issues and enabling concurrent and efficient application running.', 'chapters': [{'end': 8075.131, 'start': 7915.556, 'title': 'Deploying python application on heroku with gunicorn', 'summary': 'Discusses deploying a python application on heroku, emphasizing the importance of using a Procfile and specifying commands for the heroku instance to run, particularly highlighting the use of gunicorn for running python applications concurrently and efficiently.', 'duration': 159.575, 'highlights': ['The importance of using Gunicorn for running Python applications concurrently by running multiple processes, ensuring efficient distribution of requests and handling thousands of users.', 'The significance of creating a Procfile when deploying on Heroku, specifying commands for the app to execute upon startup.', 'The explanation of Gunicorn as a pure Python HTTP server for WSGI applications, allowing for concurrent execution of Python applications.']}, {'end': 8435.137, 'start': 8075.591, 'title': 'Deployment on heroku with github', 'summary': 'Covers the deployment process on heroku using github, including creating a Procfile and updating the requirements.txt, and deploying the application on heroku, showcasing the process of connecting the github repository, manual deployment, and the installation process of the required packages.', 'duration': 359.546, 'highlights': ['The deployment process on Heroku using Github, creating a Procfile and updating the requirements.txt, and deploying the application on Heroku.', 'The process of connecting the Github repository and the manual deployment on Heroku.', 'The installation process of the required packages such as Python, Flask, pandas, matplotlib, gunicorn, and scikit-learn.']}, {'end': 8841.858, 'start': 8435.137, 'title': 'Deploying ml application with docker on heroku', 'summary': 'Details the deployment of a machine learning application on
Heroku using Docker, showcasing successful deployment and API testing, and also discusses the advantages of using Docker for deployment.', 'duration': 406.721, 'highlights': ['The deployment of a machine learning application on Heroku using Docker showcases successful deployment and API testing, with predictions working fine.', 'The advantages of using Docker for deployment are discussed, emphasizing its time-saving nature and its ability to run as a container anywhere.', 'The process of creating a Dockerfile is explained, detailing commands such as FROM, COPY, WORKDIR, RUN, EXPOSE, and CMD.']}, {'end': 9170.102, 'start': 8842.218, 'title': 'Using docker for application deployment', 'summary': 'Explains the challenges of manual setup, the benefits of using Docker to prevent configuration and dependency issues, and the step-by-step process of building and running a Docker image for application deployment, including copying files, installing dependencies, exposing ports, and running the application with multiple workers.', 'duration': 327.884, 'highlights': ['Docker helps prevent configuration and dependency issues by baking all basic configuration into the Docker image, allowing the same base configuration to be used on every machine it is deployed to.', 'The step-by-step process of building and running a Docker image includes selecting a base image with FROM, copying code into the image with COPY, installing dependencies with RUN, and exposing a port and running the application with multiple workers.']}, {'end': 9890.119, 'start': 9170.443, 'title': 'Deploying data science app to heroku with docker & github actions', 'summary': 'Details the process of deploying a data science application to Heroku using Docker and GitHub Actions, including
configuring port bindings, creating GitHub workflows, adding repository secrets, and running the CI/CD pipeline, resulting in the successful deployment of the application as a Docker container on Heroku.', 'duration': 719.676, 'highlights': ['The process of deploying a data science application to Heroku using Docker and GitHub Actions is detailed.', 'The configuration of port bindings and the importance of Gunicorn for deployment on the Heroku cloud platform are emphasized.', 'The steps for creating GitHub workflows, including the creation of the .github/workflows folder and the main.yaml file, are explained.', 'The process of adding and managing repository secrets for the Heroku API key, email, and app name in GitHub Actions is detailed.', 'The significance of the CI/CD pipeline and the automatic build, push, and release of a Docker container to Heroku from the GitHub repository are highlighted.']}], 'duration': 1974.563, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/MJ1vWb1rGwM/pics/MJ1vWb1rGwM7915556.jpg', 'highlights': ['The importance of using Gunicorn for running Python applications concurrently by running multiple worker processes, ensuring efficient distribution of requests and handling thousands of users.', 'The deployment of a machine learning application on Heroku using Docker showcases successful deployment and API testing, with predictions working fine.', 'Docker helps prevent configuration and dependency issues by baking all basic configuration into the Docker image, allowing the same base configuration to be used on every machine it is deployed to.', 'The process of deploying a data science application to Heroku using Docker and GitHub Actions is detailed.', 'The significance of the CI/CD pipeline and the automatic build, push, and release of a Docker container to Heroku from the GitHub repository are highlighted.']}], 'highlights': ['Demonstrates practical implementation of linear regression using the Boston Housing Data Set', 'Achieving
71% and 68% performance scores in regression model evaluation', 'Emphasizing the use of Docker for running applications efficiently', 'The dataset contains 506 instances and 13 attributes, including features like per capita crime rate', 'Importing libraries such as pandas, numpy, and matplotlib for data visualization and analysis', 'The importance of understanding the dataset before further analysis is emphasized', 'The process of pickling the model file for deployment is explained', 'The necessity of creating a Heroku account for project deployment is stressed', 'Flask library installation using pip within a virtual environment', 'The chapter provides a step-by-step guide on committing and pushing code to the main branch of a repository', 'Creating a web application to provide inputs, submit a form, and obtain predictions from a regression model', 'The importance of using Gunicorn for running Python applications concurrently by running multiple worker processes', 'The deployment of a machine learning application on Heroku using Docker showcases successful deployment and API testing', 'The significance of the CI/CD pipeline and the automatic build, push, and release of a Docker container to Heroku from the GitHub repository']}
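The Procfile step described in the summary can be sketched in one line; the module and app object names (`app:app`) are assumptions, since they depend on how the Flask application file is actually named in the project:

```
web: gunicorn app:app
```

Heroku reads this file on startup: `web` declares the process type, and Gunicorn serves the WSGI `app` object from `app.py`, forking multiple worker processes so requests are distributed concurrently.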
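The Dockerfile commands listed in the summary (FROM, COPY, WORKDIR, RUN, EXPOSE, CMD) can be sketched as below. The base image tag, exposed port, and `app:app` module path are assumptions; on Heroku the container must bind to the `$PORT` value the platform injects at runtime:

```dockerfile
FROM python:3.9-slim              # base image
COPY . /app                       # copy the code into the image
WORKDIR /app                      # set the working directory
RUN pip install -r requirements.txt   # install dependencies
EXPOSE 5000                       # document the port used locally
CMD gunicorn --workers=4 --bind 0.0.0.0:$PORT app:app
```

The shell-form CMD is used deliberately so that `$PORT` is expanded when the container starts.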
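The GitHub Actions workflow described above (`.github/workflows/main.yaml`, repository secrets for the Heroku API key, email, and app name) can be sketched as follows. The specific community action `gonuit/heroku-docker-deploy` and its version, and the secret names, are assumptions; only secrets actually added under the repository's Settings → Secrets are resolvable in the workflow:

```yaml
name: Deploy to Heroku

on:
  push:
    branches:
      - main

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Build, push and release Docker container to Heroku
        uses: gonuit/heroku-docker-deploy@v1.3.3
        with:
          email: ${{ secrets.HEROKU_EMAIL }}
          heroku_api_key: ${{ secrets.HEROKU_API_KEY }}
          heroku_app_name: ${{ secrets.HEROKU_APP_NAME }}
          dockerfile_directory: ./
          dockerfile_name: Dockerfile
          process_type: web
```

Every push to `main` then triggers the CI/CD pipeline: the container is built from the Dockerfile, pushed to Heroku's registry, and released automatically.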
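The pickling and Flask prediction steps summarized above can be combined into one minimal sketch. This is an illustration, not the video's exact code: the model here is a tiny synthetic LinearRegression rather than the Boston Housing model, and the file name `regmodel.pkl` and the route `/predict_api` are assumptions.

```python
import pickle

import numpy as np
from flask import Flask, jsonify, request
from sklearn.linear_model import LinearRegression

# Stand-in for the trained regression model (the video trains on the
# Boston Housing data; a tiny synthetic set keeps this self-contained).
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([2.0, 4.0, 6.0, 8.0])
model = LinearRegression().fit(X, y)

# Pickle the model to disk, then reload it as the deployed app would.
with open("regmodel.pkl", "wb") as f:
    pickle.dump(model, f)
with open("regmodel.pkl", "rb") as f:
    regmodel = pickle.load(f)

app = Flask(__name__)

@app.route("/predict_api", methods=["POST"])
def predict_api():
    # Expect JSON like {"data": {"feature_name": value, ...}}
    data = request.json["data"]
    features = np.array(list(data.values())).reshape(1, -1)
    return jsonify(float(regmodel.predict(features)[0]))

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

Posting `{"data": {"x": 5.0}}` to `/predict_api` returns the model's prediction as a JSON number; a front-end form can submit the same payload.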