Simply put, underfitting and overfitting describe how well a model fits the training data. Bias and variance, on the other hand, describe the range of values the model's predictions can cover.
- underfitting => you can imagine a simple flat line that predicts every situation as roughly one value on the y-axis => which means the model is heavily biased towards that one value => so we can also call it "high bias".
- overfitting => you can imagine a very complicated curve that wiggles like a snake => which means the model predicts a wide variety of values => so we can also say "it's high variance!" (see the sketch after this list).
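
Here is a minimal sketch of both situations, assuming NumPy and scikit-learn are available (the original text doesn't name a library, so that choice is mine). It fits the same noisy sine data with a degree-1 polynomial (the "flat line" => underfitting / high bias) and a degree-15 polynomial (the "snake" => overfitting / high variance), then compares train and test error.

```python
# Sketch: underfitting (degree 1) vs. overfitting (degree 15) on noisy sine data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 1, 30)).reshape(-1, 1)            # training inputs
y = np.sin(2 * np.pi * X).ravel() + rng.normal(0, 0.2, 30)   # noisy targets

X_test = np.linspace(0, 1, 100).reshape(-1, 1)               # held-out grid
y_test = np.sin(2 * np.pi * X_test).ravel()                  # true function values

for degree in (1, 15):
    # Polynomial regression: expand x into powers, then fit a linear model.
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X, y)
    train_err = mean_squared_error(y, model.predict(X))
    test_err = mean_squared_error(y_test, model.predict(X_test))
    print(f"degree={degree:2d}  train MSE={train_err:.3f}  test MSE={test_err:.3f}")

# Typical outcome: degree 1 has high error on both sets (underfitting / high bias),
# while degree 15 has very low train error but worse test error (overfitting / high variance).
```

The exact numbers depend on the random seed, but the pattern is the point: the too-simple model misses the shape of the data everywhere, while the too-flexible model memorises the noise in the training set and fails to generalise.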