Chapter 7: Supervised Learning¶
Master regression and classification—linear models, ensembles, SVMs, and tuning for production.
Metadata¶
| Field | Value |
|---|---|
| Track | Practitioner |
| Time | 10 hours |
| Prerequisites | Chapters 1–6 |
Learning Objectives¶
- Build regression and classification models
- Apply regularization (L1, L2)
- Use decision trees, random forests, and gradient boosting
- Apply SVM and kernel methods
- Evaluate models with ROC, precision-recall, and regression metrics
- Tune hyperparameters effectively
- Handle imbalanced data and credit-risk scenarios
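To preview the workflow these objectives build toward, here is a minimal sketch of regression with L1/L2 regularization. It assumes scikit-learn is among the installed dependencies; the dataset and hyperparameters are illustrative, not taken from the chapter's notebooks.

```python
# Illustrative sketch: ordinary least squares vs. L2 (Ridge) and L1 (Lasso)
# regularization. Assumes scikit-learn is installed via requirements.txt.
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Ridge, Lasso
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic regression data (values here are arbitrary for demonstration)
X, y = make_regression(n_samples=200, n_features=20, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Ridge (L2) shrinks coefficients toward zero; Lasso (L1) can zero some
# out entirely, acting as a form of feature selection.
for model in (LinearRegression(), Ridge(alpha=1.0), Lasso(alpha=1.0)):
    model.fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(type(model).__name__, "test MSE:", round(mse, 1))
```

The notebooks cover these models in depth; this snippet only shows the common fit/predict/evaluate pattern they share.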
What's Included¶
Notebooks¶
| Notebook | Description |
|---|---|
| 01_introduction.ipynb | Regression, regularization, model basics |
| 02_intermediate.ipynb | Classification, SVM, ROC curves |
| 03_advanced.ipynb | Ensembles, tuning, credit-risk case study |
Scripts¶
supervised_toolkit.py — Core implementations and plotting utilities
Exercises¶
- 5 exercises with solutions (in the solutions/ branch)
SVG Diagrams¶
- 3 visual diagrams for model architectures and evaluation
Read Online¶
You can read the full chapter content right here on the website:
- 07.1 Introduction -- Linear/polynomial regression, regularization (Ridge, Lasso)
- 07.2 Intermediate -- Classification, logistic regression, SVM, ROC curves
- 07.3 Advanced -- Ensemble methods, hyperparameter tuning, credit risk capstone
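The intermediate section (07.2) pairs classifiers with ROC-based evaluation. A hedged sketch of that idea, assuming scikit-learn (the data, model settings, and metric choice here are illustrative, not the notebook's actual code):

```python
# Illustrative sketch of 07.2 topics: logistic regression and an RBF-kernel
# SVM, each scored by area under the ROC curve. Assumes scikit-learn.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# probability=True lets the SVM emit class probabilities for ROC scoring
for model in (LogisticRegression(max_iter=1000),
              SVC(kernel="rbf", probability=True)):
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(type(model).__name__, f"ROC AUC = {auc:.3f}")
```

ROC AUC summarizes the trade-off between true-positive and false-positive rates across all thresholds, which is why 07.2 treats it alongside the classifiers themselves.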
Or try the code in the Playground.
How to Use This Chapter¶
Quick Start

Follow these steps to get coding in minutes.

1. Clone and install dependencies

   ```bash
   git clone https://github.com/luigipascal/berta-chapters.git
   cd berta-chapters
   pip install -r requirements.txt
   ```

2. Navigate to the chapter

   ```bash
   cd chapters/chapter-07-supervised-learning
   ```

3. Launch Jupyter

   ```bash
   jupyter notebook
   ```
GitHub Folder
All chapter materials live in: chapters/chapter-07-supervised-learning/
XGBoost
This chapter uses XGBoost for gradient boosting. Ensure it's installed: pip install xgboost
Created by Luigi Pascal Rondanini | Generated by Berta AI