Supervised Learning — Classification, Regression, Generalizing, Overfitting, Underfitting

Ibrahim Kovan
5 min read · Jun 5, 2021

This article covers fundamental terms used in machine learning, especially in supervised learning. The difference between regression and classification, along with related terms such as overfitting, underfitting, and generalization, is explained clearly with examples.

When a machine learning system is modeled at a basic level, the dataset features serve as the inputs, the algorithm acts as the model (a black box or gray box), and the target of the features (the result of the inputs) is the output. After the model is trained by introducing inputs and outputs, it is set aside, and the features (inputs) of the test data are then applied to it. The model's predictions are stored and compared with the real results, and from this comparison the model accuracy is measured. If the accuracy percentage is sufficient, the model can be used to predict the output for inputs supplied externally. There are two major types of supervised machine learning problems, called classification and regression.
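A minimal sketch of this train/evaluate workflow is shown below using scikit-learn; the iris dataset and the k-nearest-neighbors model are illustrative assumptions, not choices made in the article.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Features (inputs) and targets (outputs)
X, y = load_iris(return_X_y=True)

# Hold back part of the data so the model can be evaluated on unseen examples
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Train the model by introducing inputs and outputs
model = KNeighborsClassifier(n_neighbors=3)
model.fit(X_train, y_train)

# Apply the test features to the trained model, then compare the
# predictions with the real results to measure accuracy
predictions = model.predict(X_test)
print("Accuracy:", accuracy_score(y_test, predictions))
```

If the accuracy measured this way is acceptable, the same fitted model can be reused to predict outputs for new, externally supplied inputs.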

Classification vs Regression

In classification, a combination of inputs corresponds to one of a fixed number of outputs. If this number is 2, it is called binary classification. For example, asking whether an e-mail is spam has two possible answers: yes or no. Similarly, suppose a dataset consists of dog images and cat images; the machine learns from these labeled images and predicts, for a new image, whether it shows a dog or a cat.
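Below is a hedged sketch of binary classification, where every prediction is one of exactly two outputs. The breast-cancer dataset and logistic regression are stand-ins chosen for illustration; they are not the spam example from the article.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# A dataset with exactly two target classes (0 or 1)
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Scale the features, then fit a binary classifier
clf = make_pipeline(StandardScaler(), LogisticRegression())
clf.fit(X_train, y_train)

# Each prediction is one of the two possible answers, analogous to spam / not spam
print(clf.predict(X_test[:5]))
print("Test accuracy:", clf.score(X_test, y_test))
```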
