Statistical Pattern Recognition: PCA and Bayesian Classifiers

This repository contains the implementation of a homework assignment for the Pattern Recognition course. The project covers two fundamental topics in machine learning and pattern recognition:

  1. Eigenvalues and Eigenvectors - Principal Component Analysis (PCA) from scratch
  2. Bayesian Decision Rules - ML, MAP, and Risk-based classifiers with decision boundaries

Usage

Part 1: Eigenvalues and Eigenvectors

What it does:

  • Generates synthetic data for 2 classes (100 samples each)
  • Computes mean vectors and covariance matrices from scratch
  • Calculates eigenvalues and eigenvectors manually
  • Verifies eigenvector orthogonality
  • Computes explained variance ratios
  • Reconstructs data using 1 and 2 principal components
  • Visualizes data with eigenvectors
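The Part 1 steps above can be sketched roughly as follows. This is an illustrative outline, not the notebook's actual code; the synthetic data parameters and function names are assumptions:

```python
import numpy as np

# Illustrative synthetic data for one class (parameters are assumptions)
rng = np.random.default_rng(0)
X = rng.multivariate_normal([2.0, 3.0], [[2.0, 1.2], [1.2, 1.0]], size=100)

# Mean vector and covariance matrix "from scratch"
mu = X.mean(axis=0)
Xc = X - mu
cov = Xc.T @ Xc / (X.shape[0] - 1)

# Eigendecomposition of the symmetric covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # sort eigenvalues descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Orthogonality check: V^T V should be the identity
assert np.allclose(eigvecs.T @ eigvecs, np.eye(2), atol=1e-8)

# Explained variance ratios
evr = eigvals / eigvals.sum()

def reconstruct(X, mu, V, k):
    """Project onto the top k principal components, then map back."""
    Z = (X - mu) @ V[:, :k]
    return Z @ V[:, :k].T + mu

# Reconstruction error with 1 vs. 2 components
err1 = np.linalg.norm(X - reconstruct(X, mu, eigvecs, 1))
err2 = np.linalg.norm(X - reconstruct(X, mu, eigvecs, 2))
```

With all 2 components retained, the reconstruction is exact up to floating-point error, so `err2` is effectively zero while `err1` measures the variance discarded by dropping the second component.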

Part 2: Bayesian Decision Rules

What it does:

  • Implements multivariate Gaussian log-PDF from scratch
  • Builds three classifiers:
    • Maximum Likelihood (ML) - decides by the likelihood p(x | ω) alone
    • Maximum A Posteriori (MAP) - weights the likelihoods by class priors
    • Risk-based MAP - uses a loss matrix to minimize expected (conditional) risk
  • Visualizes decision boundaries
  • Compares classification decisions on test points
  • Computes misclassification rates and empirical risk
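A minimal sketch of the three decision rules, assuming two Gaussian classes (the class parameters, priors, and loss matrix below are illustrative, not the notebook's values):

```python
import numpy as np

def gaussian_logpdf(x, mu, cov):
    """Log-density of a multivariate normal, computed from scratch."""
    d = len(mu)
    diff = x - mu
    inv = np.linalg.inv(cov)
    logdet = np.log(np.linalg.det(cov))
    return -0.5 * (d * np.log(2 * np.pi) + logdet + diff @ inv @ diff)

# Two illustrative classes (means, covariances, priors are assumptions)
params = [(np.array([0.0, 0.0]), np.eye(2)),
          (np.array([2.0, 2.0]), np.eye(2) * 1.5)]
priors = np.array([0.7, 0.3])
# loss[i, j]: loss of deciding class i when the true class is j
loss = np.array([[0.0, 2.0],
                 [1.0, 0.0]])

def classify(x, rule="ml"):
    x = np.asarray(x)
    loglik = np.array([gaussian_logpdf(x, m, c) for m, c in params])
    if rule == "ml":                        # argmax p(x | ω_i)
        return int(np.argmax(loglik))
    logpost = loglik + np.log(priors)       # unnormalized log-posterior
    if rule == "map":                       # argmax p(ω_i | x)
        return int(np.argmax(logpost))
    post = np.exp(logpost - logpost.max())  # normalize stably
    post /= post.sum()
    risk = loss @ post                      # conditional risk R(α_i | x)
    return int(np.argmin(risk))             # minimum-risk decision
```

The three rules coincide when priors are equal and the loss is zero-one; differing priors shift the MAP boundary toward the less probable class, and an asymmetric loss matrix shifts the risk-based boundary further still.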

Project Structure

homework/
├── README.md                           # This file
├── eigen_vals_vects.ipynb              # Part 1: Eigenvalues & Eigenvectors
├── bayes_decision_boundary.ipynb       # Part 2: Bayes Classifier (Basic)
├── ml_mlp_risk.ipynb                   # Part 2: ML, MAP & Risk-based Classifiers

References

  1. Duda, R. O., Hart, P. E., & Stork, D. G. (2001). Pattern Classification (2nd ed.). Wiley-Interscience.
  2. Bishop, C. M. (2006). Pattern Recognition and Machine Learning. Springer.
