overfitting

Feb 11, 2025 · 1 min read

  • AI

ML Specialization

Overfitting occurs when a machine learning model fits a small sample of training data too closely and generalizes poorly to unseen or real-world data. Such a model is said to have high variance.

It is the opposite of underfitting.
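The train/test gap is the usual symptom. A minimal numpy sketch (the sine target, sample sizes, and polynomial degree are illustrative choices, not from the specialization): a degree-9 polynomial fit to only 10 noisy points nearly memorizes the training set but does much worse on held-out data.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_data(n):
    # Noisy samples from a simple underlying curve (illustrative).
    x = rng.uniform(0.0, 1.0, n)
    return x, np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=n)

x_tr, y_tr = make_data(10)    # small training sample
x_te, y_te = make_data(200)   # held-out data stands in for "unseen" data

# A degree-9 polynomial has enough capacity to pass through all 10 points.
w = np.polyfit(x_tr, y_tr, deg=9)
mse_tr = np.mean((np.polyval(w, x_tr) - y_tr) ** 2)
mse_te = np.mean((np.polyval(w, x_te) - y_te) ** 2)

# High variance: tiny training error, much larger test error.
print(mse_tr, mse_te)
```

Running this shows the signature of high variance: training error near zero while test error is far larger.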

Approaches to address overfitting

  1. Collect more data. In practice, though, more data is often hard to obtain (e.g. inaccessible, limited, expensive) or outright impossible
  2. Simplify the model
    1. Select a smaller set of relevant features
    2. Shrink the parameter values by applying regularization
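Step 2.2 can be sketched with closed-form ridge regression (an L2-regularized linear fit; the data, degree, and lambda values below are made-up assumptions for illustration). Increasing the penalty lambda shrinks the parameter vector while trading away some training fit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Small noisy sample from a simple underlying trend (y = 2x + noise).
x = np.linspace(0.0, 1.0, 8)
y = 2.0 * x + rng.normal(scale=0.3, size=x.size)

# Degree-7 polynomial features: enough capacity to memorize 8 points.
X = np.vander(x, 8)

def ridge_fit(X, y, lam):
    """Closed-form ridge regression: w = (X^T X + lam*I)^{-1} X^T y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

w_loose = ridge_fit(X, y, lam=1e-8)  # almost no regularization: overfits
w_ridge = ridge_fit(X, y, lam=1.0)   # strong regularization: smaller weights

# The penalty shrinks the parameter magnitudes.
print(np.linalg.norm(w_loose), np.linalg.norm(w_ridge))
```

The regularized weights have a much smaller norm; the unregularized fit buys its near-perfect training error with huge, unstable coefficients, which is exactly what regularization suppresses.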
