Anonymous
Bias is the average error of a model's predictions relative to the true values, while variance measures how much those predictions fluctuate across different training sets. A model with high bias and low variance maps its predictions into the same region of the output space; a model with low bias and high variance produces scattered predictions. The first case suggests the model is too simple (underfitting): it has high error on both the training and test sets. The second suggests overfitting: the model has low training error but high test error. I have seen this happen when a deep learning model starts overfitting, and in those cases I have used mitigation techniques such as early stopping, regularization, and dropout layers.
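The underfitting/overfitting contrast above can be sketched with a small toy experiment; this is a minimal illustration assuming made-up noisy sine data, fitting a too-simple (degree-1) and a too-flexible (degree-15) polynomial and comparing train/test error:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: noisy samples of a sine curve.
x = rng.uniform(0, 1, 40)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, 40)

x_train, y_train = x[:30], y[:30]
x_test, y_test = x[30:], y[30:]

def poly_mse(degree):
    # Least-squares polynomial fit, then mean squared error on each split.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

# Degree 1: too simple -> high bias, large error on both splits.
# Degree 15: too flexible -> high variance, low train error but worse test error.
for degree in (1, 15):
    train_mse, test_mse = poly_mse(degree)
    print(f"degree={degree:2d}  train MSE={train_mse:.3f}  test MSE={test_mse:.3f}")
```

The degree-1 model shows similar (and large) error on both splits, while the degree-15 model drives training error down at the cost of a larger gap to test error, which is the signature that would trigger early stopping or stronger regularization in practice.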