Logistic regression helps us find how probabilities change in response to the input features. It involves fitting the "best fit" S-curve (the sigmoid) to the data.

**Cross Entropy: The Loss Function**

Cross entropy is the loss function we minimize to improve the model parameters and obtain a robust model. In linear regression, the cost function used to find the best fit line is the Mean Squared Error (which measures how far the line is from the actual points), while in logistic regression, the cost function we use is cross entropy.

The output of our classification model is a probability score, and cross entropy measures how well the estimated probabilities of our model match the actual labels of the categories to which the data points belong.
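As a minimal sketch of this idea, the binary cross-entropy loss can be computed directly from the actual labels and the predicted probabilities. The helper name `binary_cross_entropy` and the clipping constant `eps` below are illustrative choices, not part of the original text:

```python
import math

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Average binary cross-entropy between actual labels (0/1)
    and predicted probabilities."""
    total = 0.0
    for y, p in zip(y_true, y_pred):
        # Clip probabilities away from 0 and 1 to avoid log(0).
        p = min(max(p, eps), 1 - eps)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

# Confident, correct probability estimates produce a small loss.
print(binary_cross_entropy([1, 0, 1], [0.9, 0.1, 0.8]))
```

Note how the loss only rewards probability mass placed on the true class: each term reads off `log(p)` when the label is 1 and `log(1 - p)` when it is 0.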

**Low cross entropy**

It means that the probability distributions of the actual Y and the predicted Y are in sync.

**High cross entropy**

It is the opposite: the actual Y and the predicted Y are not in sync. So, when we build a classifier, we try to minimize the cross entropy, because we want the actual Y and the predicted Y to be in sync. When applying logistic regression, we want to find the best S-curve that fits our underlying data, that is, the one that minimizes the cross entropy between our actual labels and the labels predicted by our model.
