Hypothesis Space and Inductive Bias

Summary

This first module on hypothesis space and inductive bias gives a brief introduction to these concepts so that, when different machine learning algorithms are discussed later, we can refer back to this discussion. It covers inductive learning, feature space, terminology, and inductive bias in the context of classification problems.

Inductive Learning

In inductive learning, or prediction, we are given examples of the form (x, y): an input x and the corresponding output y. We want to learn a function f that maps x to y. Depending on whether the output attribute is discrete-valued or continuous-valued, we get different problem types:

- Classification: f(x) is discrete-valued.
- Regression: f(x) is continuous-valued.
- Probability estimation: f(x) is the probability of a particular value of y.

This is called inductive learning because, given some data, we try to do induction: to identify a function that can explain the data, as opposed to deduction. Unless we can see all possible data points, or we make some restrictive assumption about the language in which the hypothesis is expressed (some bias), the problem is not well defined.

Feature Space

Instances are described quantitatively using features, collected into a feature vector. The number of features defines the feature space: two features define a two-dimensional space, and n features define an n-dimensional space. In a two-class classification problem, each instance in the training set is marked as a positive or negative point in the feature space according to its class.
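A minimal sketch of this setup, with invented feature values and labels: each training example is a feature vector paired with a class label, and the length of the feature vector gives the dimensionality of the feature space.

```python
# Hypothetical two-class training set in a 2-D feature space.
# Each instance is a feature vector (x1, x2); the label y is +1 or -1.
# All values are made up for illustration.
training_set = [
    ((2.0, 3.0), +1),
    ((1.0, 1.5), +1),
    ((6.0, 2.0), -1),
    ((7.5, 4.0), -1),
]

def dimensions(example):
    """Number of features = dimensionality of the feature space."""
    features, _label = example
    return len(features)

def positives(data):
    """The instances marked as positive points in the feature space."""
    return [x for x, y in data if y == +1]
```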
Learning a Separating Function

We now want to learn a function that predicts whether a new instance is positive or negative. The learned function can be a curve or a line that separates the positive instances from the negative ones: any point lying on one side of it is classified positive, any point on the other side negative. In the lecture's example (taken from a slide by Jessa Davis of the University of Washington), a new yellow point lying to the left of the learned function is classified negative. This is what inductive learning is about.

Many different functions could have been found for the same data. The set of all such legal functions defines the hypothesis space. In a particular learning problem, you first define the hypothesis space, that is, the class of functions you are going to consider, and then look for the best hypothesis given the data points. This requires two decisions: the features (the vocabulary) and the function class, the language in which the function is expressed. Together, the features and the language define the hypothesis space. Various types of representations have been considered for making predictions.
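One of the simplest choices of function class is a line in the feature space. A minimal sketch follows; the weights and bias are illustrative values, not learned from data.

```python
# A linear hypothesis in a 2-D feature space: the line
# w1*x1 + w2*x2 + b = 0 separates the two classes, and the sign of
# the score decides which side a point falls on.
def h(x, w=(1.0, -1.0), b=0.0):
    """Classify x as +1 or -1 depending on which side of the line it lies."""
    score = w[0] * x[0] + w[1] * x[1] + b
    return +1 if score >= 0 else -1
```

A point such as (3.0, 1.0) scores 2.0 and is classified positive, while (1.0, 5.0) scores -4.0 and is classified negative.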
Representations

Possible representations include a decision tree, a linear function, a multivariate linear function, a single-layer perceptron (the basic unit of a neural network), and a multilayer neural network. These will be discussed later in the course.

Once you have chosen the features and the language (the class of functions), what you have is a hypothesis space: the set of all legal hypotheses that can be described using the chosen features and language. We denote the hypothesis space by H; the learning algorithm outputs a hypothesis h belonging to H. Given the training set, which hypothesis the algorithm comes up with depends on the data and on the restrictions or biases that have been imposed, described later. Supervised learning can thus be thought of as a device that searches the hypothesis space for a hypothesis satisfying certain criteria.
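The search view can be sketched as follows, with a tiny invented hypothesis space of threshold functions; the data and thresholds are made up, and real learners search far larger spaces far more cleverly.

```python
# Supervised learning as search: pick the hypothesis in H with the
# fewest errors on the training data.
def errors(g, data):
    """Count training examples that hypothesis g misclassifies."""
    return sum(1 for x, y in data if g(x) != y)

def learn(H, data):
    """Search the hypothesis space H for a minimum-error hypothesis."""
    return min(H, key=lambda g: errors(g, data))

# A toy hypothesis space: threshold functions on the single feature x[0].
H = [lambda x, t=t: +1 if x[0] >= t else -1 for t in (0.0, 1.0, 2.0, 3.0)]
data = [((0.5,), -1), ((2.5,), +1), ((3.5,), +1)]
best = learn(H, data)
```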
Terminology

- Example: an (x, y) pair, the value of the input together with the value of the output.
- Training data: a set of examples that has been observed by, that is, input to, the learning algorithm.
- Instance space (feature space): the set of all possible objects that can be described by the features. For instance, with two features where x1 takes values between 0 and 100 and x2 takes values between 0 and 50, every point in that rectangle describes an instance.
- Concept: out of all objects in the instance space, the subset that is positive, that is, belongs to the concept. The concept c is a subset of the instance space X; it is unknown to us, and it is what we are trying to find.
- Target function: in order to find c, we try to find a function f that maps every input x to an output y.

Given a hypothesis space H, learning means coming up with a hypothesis h belonging to H that approximates f based on the training data you have been given. The set of hypotheses that can be produced may be restricted further by specifying a language bias.
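These definitions can be written down directly. A minimal sketch: the instance-space bounds follow the example above, while the concept c itself is invented here, since in practice it is exactly what is unknown.

```python
# The instance space X: all (x1, x2) with 0 <= x1 <= 100 and 0 <= x2 <= 50.
def in_instance_space(x):
    x1, x2 = x
    return 0 <= x1 <= 100 and 0 <= x2 <= 50

# A made-up concept c (a subset of X); normally this is unknown.
def c(x):
    return x[0] + x[1] > 60

# The target function f induced by the concept: +1 inside c, -1 outside.
def f(x):
    return +1 if c(x) else -1
```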
The Size of the Hypothesis Space

How large can the hypothesis space be? Consider instances described by 4 Boolean features. There are 2^4 = 16 possible instances. A function classifies some of these points as positive and the others as negative, so each function corresponds to a subset of the 16 instances, and the number of possible functions is the number of subsets: 2^16, that is, 2^(2^4). In general, with n Boolean features there are 2^n possible instances and 2^(2^n) possible functions. This is the size of the hypothesis space.

The hypothesis space is thus very large, and it is not possible to look at every hypothesis individually in order to select the best one. So what do you do? You put restrictions on the hypothesis space by selecting a hypothesis language: for example, conjunctive Boolean formulas, CNF Boolean formulas, or unrestricted Boolean formulas. Restricting the hypothesis language reflects a bias, the inductive bias of the learner.
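These counts are easy to verify. The conjunction count below assumes a pure conjunctive language in which each feature appears as a positive literal, a negated literal, or not at all (three choices per feature), which illustrates how sharply a restricted language shrinks the space.

```python
# With n Boolean features:
def n_instances(n):
    """Distinct instances: one truth assignment per feature."""
    return 2 ** n

def n_functions(n):
    """Distinct Boolean functions: one per subset of the instance space."""
    return 2 ** (2 ** n)

def n_conjunctions(n):
    """Pure conjunctions: each feature is positive, negated, or absent."""
    return 3 ** n
```

For n = 4 this gives 16 instances, 65536 unrestricted functions, but only 81 conjunctive hypotheses.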
'start': 1217.466, 'title': 'Learning problems and hypothesis space', 'summary': 'Discusses the concept of learning problems, training set, instance space, hypothesis space, and the size of the hypothesis space, emphasizing that with n Boolean features, the size of the hypothesis space is 2 to the power 2 to the power n.', 'duration': 197.899, 'highlights': ['The size of the hypothesis space with n Boolean features is 2 to the power 2 to the power n, indicating exponential growth in the number of possible functions as the number of features increases.', 'The number of possible functions is the number of possible subsets of instances, which is 2 to the power 2 to the power n, demonstrating the exponential growth in the complexity of the hypothesis space.', 'The number of possible instances with 4 Boolean features is 2 to the power 4, resulting in 16 possible instances, providing a quantitative understanding of the instance space.', 'The input in a learning problem is a training set, a subset of the instance space X, and the output is a hypothesis h belonging to the hypothesis space H, outlining the fundamental components of a learning problem.']}, {'end': 1610.925, 'start': 1415.365, 'title': 'Inductive bias in hypothesis space', 'summary': 'Discusses the concept of hypothesis space, hypothesis language, and inductive bias in machine learning, emphasizing the importance of bias in selecting a hypothesis language and the types of bias: restriction bias and preference bias.', 'duration': 195.56, 'highlights': ['The hypothesis space is very large and it is not possible to look at every hypothesis individually in order to select the best hypothesis. The hypothesis space is too large to evaluate every hypothesis individually for selecting the best one.', 'Selection of a hypothesis language reflects a bias or inductive bias of the learner. 
Choosing a hypothesis language reflects the inductive bias of the learner, indicating the importance of bias in machine learning.', 'Two types of bias in hypothesis space: restriction bias and preference bias, where the former involves putting restrictions on the type of functions and the latter involves preferring certain characteristics within the chosen language. Two types of bias exist: restriction bias, which involves putting restrictions on the type of functions, and preference bias, which involves preferring certain characteristics within the chosen language.']}], 'duration': 393.459, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/dYMCwxgl3vk/pics/dYMCwxgl3vk1217466.jpg', 'highlights': ['The size of the hypothesis space with n Boolean features is 2 to the power 2 to the power n, indicating exponential growth in the number of possible functions as the number of features increases.', 'The number of possible functions is the number of possible subsets of instances, which is 2 to the power 2 to the power n, demonstrating the exponential growth in the complexity of the hypothesis space.', 'The hypothesis space is very large and it is not possible to look at every hypothesis individually in order to select the best hypothesis. The hypothesis space is too large to evaluate every hypothesis individually for selecting the best one.', 'Selection of a hypothesis language reflects a bias or inductive bias of the learner. 
Choosing a hypothesis language reflects the inductive bias of the learner, indicating the importance of bias in machine learning.']}, {'end': 2001.604, 'segs': [{'end': 1640.674, 'src': 'embed', 'start': 1613.055, 'weight': 0, 'content': [{'end': 1623.942, 'text': 'So, inductive learning means to come up with a general function from training examples; given some training examples, you want to generalize.', 'start': 1613.055, 'duration': 10.887}, {'end': 1635.83, 'text': 'So, you are given some training examples which come from a concept C, and you want to find a hypothesis H.', 'start': 1624.863, 'duration': 10.967}, {'end': 1640.674, 'text': 'You can come up with a hypothesis that is consistent with all the training examples given.', 'start': 1635.83, 'duration': 4.844}], 'summary': 'Inductive learning: derive general function from training examples to generalize, consistent with all examples.', 'duration': 27.619, 'max_score': 1613.055, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/dYMCwxgl3vk/pics/dYMCwxgl3vk1613055.jpg'}, {'end': 1780.614, 'src': 'embed', 'start': 1745.387, 'weight': 1, 'content': [{'end': 1753.769, 'text': 'So, inductive learning is an ill-posed problem; you are looking for generalization guided by some bias or some criteria.', 'start': 1745.387, 'duration': 8.382}, {'end': 1766.472, 'text': 'So, why are you able to generalize? 
It is based on an assumption; we call this assumption the inductive learning hypothesis.', 'start': 1757.91, 'duration': 8.562}, {'end': 1780.614, 'text': 'The hypothesis states that if a hypothesis H is found to approximate the target function C well over a sufficiently large set of training examples, it will also approximate the target function well over unobserved examples.', 'start': 1768.069, 'duration': 12.545}], 'summary': 'Inductive learning relies on the hypothesis to generalize well over a large set of examples.', 'duration': 35.227, 'max_score': 1745.387, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/dYMCwxgl3vk/pics/dYMCwxgl3vk1745387.jpg'}, {'end': 1911.863, 'src': 'heatmap', 'start': 1885.731, 'weight': 0.743, 'content': [{'end': 1891.396, 'text': 'So, machine learning, coming up with a function, is all about doing generalization.', 'start': 1885.731, 'duration': 5.665}, {'end': 1900.977, 'text': 'And when you are doing generalization, you can make some errors, and the errors are of two types: bias errors and variance errors.', 'start': 1892.432, 'duration': 8.545}, {'end': 1907.161, 'text': 'So, bias, as we saw, is a restriction on the hypothesis space or the preference in choosing a hypothesis.', 'start': 1901.537, 'duration': 5.624}, {'end': 1911.863, 'text': 'By deciding on a particular hypothesis you impose a bias.', 'start': 1908.101, 'duration': 3.762}], 'summary': 'Machine learning involves generalization, leading to bias and variance errors in hypothesis space selection.', 'duration': 26.132, 'max_score': 1885.731, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/dYMCwxgl3vk/pics/dYMCwxgl3vk1885731.jpg'}, {'end': 1935.307, 'src': 'embed', 'start': 1912.424, 'weight': 2, 'content': [{'end': 1922.43, 'text': 'So, this is error due to incorrect assumptions or restrictions on the hypothesis space; the error introduced by that is called bias error.', 'start': 1912.424, 'duration': 10.006}, {'end': 1927.824, 'text': 'Variance error is introduced when you have a small training 
set.', 'start': 1923.402, 'duration': 4.422}, {'end': 1935.307, 'text': 'So, variance error means the models that you estimate from different training sets will differ from each other.', 'start': 1928.244, 'duration': 7.063}], 'summary': 'Bias error due to incorrect assumptions, variance error from small training sets.', 'duration': 22.883, 'max_score': 1912.424, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/dYMCwxgl3vk/pics/dYMCwxgl3vk1912424.jpg'}, {'end': 1991.6, 'src': 'embed', 'start': 1958.807, 'weight': 3, 'content': [{'end': 1963.429, 'text': 'This is a very important concept, but we will talk about it when we talk about the algorithms.', 'start': 1958.807, 'duration': 4.622}, {'end': 1966.03, 'text': 'This is overfitting and underfitting.', 'start': 1963.429, 'duration': 2.601}, {'end': 1974.192, 'text': 'You may come up with a hypothesis that does well over the training examples but does very poorly over the test examples.', 'start': 1966.41, 'duration': 7.782}, {'end': 1976.233, 'text': 'Then we say overfitting has occurred.', 'start': 1974.192, 'duration': 2.041}, {'end': 1982.055, 'text': 'Overfitting comes from using very complex functions or using too little training data.', 'start': 1976.873, 'duration': 5.182}, {'end': 1991.6, 'text': 'The reverse of overfitting is underfitting: if you have a very simple function, then it cannot capture all the nuances of the data.', 'start': 1983.036, 'duration': 8.564}], 'summary': 'Overfitting occurs when a hypothesis does well on training examples but poorly on test examples, caused by using complex functions or too little training data.', 'duration': 32.793, 'max_score': 1958.807, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/dYMCwxgl3vk/pics/dYMCwxgl3vk1958807.jpg'}], 'start': 1613.055, 'title': 'Inductive learning', 'summary': 'Introduces the concept of inductive learning, emphasizing the construction of hypotheses to generalize from training 
examples, with a focus on consistent hypotheses and preference bias for generalizing well over unseen examples. It also explores the inductive learning hypothesis, bias errors, variance errors, overfitting, and underfitting in the context of generalization on unseen data.', 'chapters': [{'end': 1684.553, 'start': 1613.055, 'title': 'Inductive learning process', 'summary': 'Introduces inductive learning, where a hypothesis is constructed to generalize from training examples, with an emphasis on consistent hypotheses and the selection based on preference bias to generalize well over unseen examples.', 'duration': 71.498, 'highlights': ['The process of inductive learning involves constructing a hypothesis to generalize from training examples, aiming to come up with a consistent hypothesis that is guided by preference bias to generalize well over unseen examples.', 'In inductive learning, it is not always possible to come up with a consistent hypothesis, and even when possible, multiple consistent hypotheses can exist within the hypothesis space.']}, {'end': 2001.604, 'start': 1686.173, 'title': 'Inductive learning and generalization', 'summary': 'Discusses inductive learning, which involves finding a hypothesis that generalizes well on unseen data based on a given training set, and explores concepts such as the inductive learning hypothesis, bias errors, variance errors, overfitting, and underfitting.', 'duration': 315.431, 'highlights': ["Inductive learning involves finding a hypothesis that generalizes well on unseen data based on a given training set, and is guided by some bias or criteria, such as Occam's Razor and minimum description length. 
Guided by bias or criteria, e.g., Occam's Razor and minimum description length.", 'The inductive learning hypothesis states that a hypothesis H approximates the target function well over a sufficiently large set of training examples, and a hypothesis with low training error over a large training set is expected to perform well on unseen examples. Inductive learning hypothesis: H approximates target function well over large training set.', 'Machine learning involves coming up with a good hypothesis space, finding an algorithm that works well with the hypothesis space, and understanding the confidence in the hypothesis. Focus on good hypothesis space, algorithm selection, and understanding confidence in the hypothesis.', 'Bias errors occur due to incorrect assumptions or restrictions on the hypothesis space, while variance errors arise when models estimated from different training sets differ from each other. Bias errors due to incorrect assumptions or restrictions, variance errors due to differing models from training sets.', 'Overfitting happens when a hypothesis performs well on training examples but poorly on test examples, often caused by using very complex functions or too little training data, while underfitting occurs when a very simple function cannot capture all the nuances of the data. 
Overfitting and underfitting due to complexity of functions and training data.']}], 'duration': 388.549, 'thumbnail': 'https://coursnap.oss-ap-southeast-1.aliyuncs.com/video-capture/dYMCwxgl3vk/pics/dYMCwxgl3vk1613055.jpg', 'highlights': ['Inductive learning involves constructing a hypothesis to generalize from training examples, aiming for a consistent hypothesis guided by preference bias.', 'Inductive learning hypothesis: H approximates target function well over large training set.', 'Bias errors occur due to incorrect assumptions or restrictions on the hypothesis space, while variance errors arise when models estimated from different training sets differ from each other.', 'Overfitting happens when a hypothesis performs well on training examples but poorly on test examples, often caused by using very complex functions or too little training data, while underfitting occurs when a very simple function cannot capture all the nuances of the data.']}], 'highlights': ['The chapter introduces the concept of inductive learning, emphasizing the significance of inductive bias and hypothesis space.', 'The hypothesis space is defined as the set of all legal functions considered for making predictions, given the data points.', 'The chapter discusses different types of representations in machine learning, such as decision tree, linear function, multivariate linear function, single layer perceptron, basic unit of a neural network, and multilayer neural network.', 'The chapter introduces the concept of learning a function f to approximate the unknown concept C in a two-class classification problem, with the hypothesis space H and language bias further restricting the set of hypotheses, aiming to find a hypothesis h that approximates f based on training data.', 'The size of the hypothesis space with n Boolean features is 2 to the power 2 to the power n, indicating exponential growth in the number of possible functions as the number of features increases.', 'Inductive learning 
involves constructing a hypothesis to generalize from training examples, aiming for a consistent hypothesis guided by preference bias.']}
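The counting argument that recurs in the highlights above (n Boolean features give 2^n possible instances, and 2^(2^n) possible Boolean functions over them) can be sanity-checked with a short sketch. This is an illustrative snippet, not code from the lecture; `enumerate_boolean_functions` is a name chosen here for clarity:

```python
from itertools import product

def enumerate_boolean_functions(n):
    """Enumerate every Boolean function over n Boolean features.

    Each function is represented by its truth table: a tuple with
    one 0/1 output per possible instance.
    """
    instances = list(product([0, 1], repeat=n))          # 2**n possible instances
    return list(product([0, 1], repeat=len(instances)))  # 2**(2**n) possible functions

# 4 Boolean features -> 2**4 = 16 possible instances, as in the transcript.
assert len(list(product([0, 1], repeat=4))) == 16

# 2 Boolean features -> 2**(2**2) = 16 candidate functions;
# already at n = 4 this grows to 2**16 = 65536.
assert len(enumerate_boolean_functions(2)) == 2 ** (2 ** 2)
```

This doubly exponential growth is why the lecture restricts the hypothesis space: choosing a hypothesis language (conjunctions, CNF formulas, etc.) means the learner searches a far smaller set than all 2^(2^n) functions, and that choice is exactly its inductive bias.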