Bias vs Variance Tradeoff

Two ways models can be wrong

Bias (systematic error)

High bias means the model is too simple and misses real patterns.

  • underfits
  • training error is high
  • validation error is high

Variance (sensitivity to noise)

High variance means the model is too complex and learns noise.

  • overfits
  • training error is low
  • validation error is high
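Both failure modes can be seen on a toy dataset. This is a sketch with made-up data: a constant "predict the mean" model stands in for "too simple", and 1-nearest-neighbor (which memorizes the training set) stands in for "too complex".

```python
import random

random.seed(0)

# Toy data: y = 2x + noise (the data and models here are illustrative)
def make_data(n):
    xs = [random.uniform(0, 1) for _ in range(n)]
    ys = [2 * x + random.gauss(0, 0.1) for x in xs]
    return xs, ys

train_x, train_y = make_data(30)
val_x, val_y = make_data(30)

def mse(preds, ys):
    return sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(ys)

# High-bias model: ignore x and always predict the training mean (too simple)
mean_y = sum(train_y) / len(train_y)
bias_train = mse([mean_y] * len(train_y), train_y)
bias_val = mse([mean_y] * len(val_y), val_y)

# High-variance model: 1-nearest-neighbor, which memorizes the training set
def nn_predict(x, xs, ys):
    i = min(range(len(xs)), key=lambda j: abs(xs[j] - x))
    return ys[i]

var_train = mse([nn_predict(x, train_x, train_y) for x in train_x], train_y)
var_val = mse([nn_predict(x, train_x, train_y) for x in val_x], val_y)

print(f"mean model: train MSE {bias_train:.3f}, val MSE {bias_val:.3f}")
print(f"1-NN model: train MSE {var_train:.3f}, val MSE {var_val:.3f}")
```

The mean model shows the bias signature (train and validation error both high); 1-NN shows the variance signature (training error is zero because every point is its own nearest neighbor, but validation error is not).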



  flowchart LR
  B[High bias] --> U[Underfitting]
  V[High variance] --> O[Overfitting]


The tradeoff

As model complexity increases:

  • bias tends to decrease
  • variance tends to increase

Goal: find the sweet spot in between, where validation error is lowest.
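One way to see the tradeoff is to turn a single complexity knob. In this sketch the knob is k in k-nearest-neighbors regression (an illustrative choice, not from the text): small k is flexible (low bias, high variance), large k is rigid (high bias), and training error grows as the model gets simpler.

```python
import random

random.seed(1)

# Made-up data on a linear trend with noise
xs = [i / 29 for i in range(30)]
ys = [2 * x + random.gauss(0, 0.2) for x in xs]

def knn_predict(x, k):
    # Average the y-values of the k nearest training points
    idx = sorted(range(len(xs)), key=lambda i: abs(xs[i] - x))[:k]
    return sum(ys[i] for i in idx) / k

def train_mse(k):
    return sum((knn_predict(x, k) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

# k=1 memorizes (training error 0); k=len(xs) predicts the global mean
errs = {k: train_mse(k) for k in (1, 5, len(xs))}
print(errs)
```

Training error alone is misleading here: it always favors the most complex model (k=1). The sweet spot has to be found on held-out validation data.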

How to reduce bias

  • add more features
  • use a more flexible model
  • reduce regularization
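A sketch of the "add more features" bullet, on made-up quadratic data: a straight line in x is biased no matter how it is fit, but adding an x² feature removes that bias entirely.

```python
# Hypothetical setup: the true relationship is y = x^2
xs = [i / 9 for i in range(10)]
ys = [x ** 2 for x in xs]

def fit_simple(zs, ys):
    # Closed-form least squares for a single-feature model y = a + b*z
    z_mean = sum(zs) / len(zs)
    y_mean = sum(ys) / len(ys)
    b = (sum((z - z_mean) * (y - y_mean) for z, y in zip(zs, ys))
         / sum((z - z_mean) ** 2 for z in zs))
    a = y_mean - b * z_mean
    return a, b

def fit_mse(zs):
    a, b = fit_simple(zs, ys)
    return sum((a + b * z - y) ** 2 for z, y in zip(zs, ys)) / len(ys)

err_linear = fit_mse(xs)                      # a line can't match a parabola
err_squared = fit_mse([x ** 2 for x in xs])   # add the x^2 feature: bias gone
```

The model class got more flexible (in the original feature), so the systematic error disappears.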

How to reduce variance

  • collect more data
  • increase regularization
  • simplify the model
  • use bagging/ensembles
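The bagging bullet works because averaging lowers variance. A minimal illustration of that principle, where each noisy "model" is just an unbiased estimate with added noise (numbers are made up):

```python
import random
import statistics

random.seed(2)

# Each "model" returns the right answer on average, but with high variance
def noisy_model():
    return 10 + random.gauss(0, 1)

# Variance of a single model vs. an average of 25 models
singles = [noisy_model() for _ in range(2000)]
ensembles = [sum(noisy_model() for _ in range(25)) / 25 for _ in range(2000)]

var_single = statistics.pvariance(singles)      # roughly sigma^2
var_ensemble = statistics.pvariance(ensembles)  # roughly sigma^2 / 25
```

Averaging n independent, unbiased predictors divides variance by about n while leaving bias unchanged; bagging approximates this by training each predictor on a bootstrap resample of the data.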

Mini-checkpoint

If training and validation accuracy are both low:

  • bias or variance?

(Usually bias / underfitting.)
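The checkpoint rule can be written down as a tiny helper. The thresholds here are arbitrary, problem-dependent stand-ins, and the labels are crude (a real diagnosis should also look at learning curves):

```python
def diagnose(train_acc, val_acc, good=0.9, gap=0.1):
    # Thresholds are illustrative: "good" accuracy and acceptable train-val gap
    if train_acc < good and val_acc < good:
        return "high bias (underfitting)"
    if train_acc - val_acc > gap:
        return "high variance (overfitting)"
    return "looks balanced"
```

For example, `diagnose(0.60, 0.58)` reports underfitting, while `diagnose(0.99, 0.70)` reports overfitting.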

🧪 Try It Yourself

Exercise 1 – Train-Test Split
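One possible pure-Python sketch of this exercise (scikit-learn's `train_test_split` is the usual tool, but it isn't assumed here):

```python
import random

random.seed(0)

data = list(range(100))      # stand-in for your (x, y) pairs
random.shuffle(data)         # shuffle so the split isn't ordered
cut = int(0.8 * len(data))   # 80/20 split
train, test = data[:cut], data[cut:]
```

Shuffling before splitting matters: without it, any ordering in the data (by time, by class, by magnitude) leaks into the split.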

Exercise 2 – Fit a Linear Model
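A closed-form sketch of simple linear regression, on tiny noise-free data chosen for illustration (true line: y = 2x + 1):

```python
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]   # y = 2x + 1, noise-free for clarity

x_mean = sum(xs) / len(xs)
y_mean = sum(ys) / len(ys)

# Least-squares slope and intercept for y = intercept + slope * x
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
         / sum((x - x_mean) ** 2 for x in xs))
intercept = y_mean - slope * x_mean
```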

Exercise 3 – Evaluate with MSE
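A minimal MSE implementation to check your results against:

```python
def mse(y_true, y_pred):
    # Mean of the squared differences between targets and predictions
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
```

Comparing MSE on the training split and the test split is exactly the bias/variance diagnostic from the checkpoint above: both high suggests bias, a large gap suggests variance.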
