Loss function
A loss function is a mathematical measure of how close a model’s output is to the target result. It returns a single numerical value, called the loss, that represents prediction error—smaller values mean more accurate predictions, while larger values indicate greater deviation.
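The idea can be illustrated with mean squared error (MSE), one common choice of loss function; this is a minimal sketch, not tied to any particular library:

```python
def mse_loss(predictions, targets):
    """Mean squared error: the average of the squared differences
    between predicted and target values."""
    assert len(predictions) == len(targets)
    n = len(predictions)
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / n

# A perfect prediction yields zero loss; larger deviations yield larger loss.
print(mse_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # → 0.0
print(mse_loss([1.5, 2.5, 3.0], [1.0, 2.0, 3.0]))  # larger error, larger loss
```

As the second call shows, the loss grows as the predictions drift from the targets, which is exactly the behavior the paragraph above describes.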
Loss functions play a key role in training machine learning models. During training, optimization algorithms such as gradient descent use the loss value to update the model's parameters—such as weights and biases in neural networks. These updates are guided by backpropagation, which computes the gradient of the loss with respect to each parameter, i.e., how changes in each parameter affect the loss.
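The training loop described above can be sketched for a hypothetical one-parameter model y = w·x fitted with MSE loss. Here the gradient is written out analytically; in a real neural network, backpropagation would compute it automatically:

```python
# Toy dataset generated by the "true" weight w = 2.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]

def mse(w):
    """MSE loss of the model y = w * x on the toy dataset."""
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def grad(w):
    """Analytic gradient of the loss: dL/dw = (2/n) * sum((w*x - y) * x)."""
    return 2 * sum((w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

w = 0.0                # initial parameter guess
learning_rate = 0.05
for step in range(100):
    # Gradient descent: step the parameter against the gradient,
    # which decreases the loss.
    w -= learning_rate * grad(w)

print(round(w, 4))  # → 2.0 (converges to the true weight)
```

Each iteration moves the weight in the direction that reduces the loss, so after enough steps the learned parameter approaches the value that generated the data.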