Pattern Recognition
Pattern Recognition. Instructor: Prof. P.S. Sastry, Department of Electronics and Communication Engineering, IISc Bangalore. This course provides a fairly comprehensive view of the fundamentals of pattern classification and regression. Topics covered in the lectures include: overview of pattern classification and regression; Bayesian decision making and the Bayes classifier; parametric estimation of densities; mixture densities and the EM algorithm; nonparametric density estimation; linear models for classification and regression; overview of statistical learning theory; empirical risk minimization and VC-dimension; artificial neural networks for classification and regression; support vector machines and kernel-based methods; feature selection, model assessment and cross-validation; boosting and classifier ensembles.
(from nptel.ac.in)
Lecture 31 - Support Vector Machines - Introduction, Obtaining the Optimal Hyperplane
VIDEO
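For reference, the "optimal hyperplane" in the lecture title is the maximum-margin separating hyperplane of the hard-margin SVM. A brief sketch in common textbook notation (not transcribed from the video): given training pairs (x_i, y_i), i = 1, ..., n, with labels y_i in {-1, +1}, the separating hyperplane w^T x + b = 0 is obtained by solving

    minimize over w, b:   (1/2) ||w||^2
    subject to:           y_i (w^T x_i + b) >= 1,   i = 1, ..., n.

The margin of the resulting hyperplane is 2 / ||w||, and the training points that satisfy the constraint with equality are the support vectors.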
Go to the Course Home or watch other lectures:
Overview of Pattern Classification and Regression
Lecture 01 - Introduction to Statistical Pattern Recognition
Lecture 02 - Overview of Pattern Classifiers
Bayesian Decision Making and Bayes Classifier
Lecture 03 - The Bayes Classifier for Minimizing Risk
Lecture 04 - Estimating Bayes Error; Minimax and Neyman-Pearson Classifiers
Parametric Estimation of Densities
Lecture 05 - Implementing Bayes Classifier; Estimation of Class Conditional Densities
Lecture 06 - Maximum Likelihood Estimation of Different Densities
Lecture 07 - Bayesian Estimation of Parameters of Density Functions, MAP Estimates
Lecture 08 - Bayesian Estimation Examples; the Exponential Family of Densities and ML Estimates
Lecture 09 - Sufficient Statistics; Recursive Formulation of ML and Bayesian Estimates
Mixture Densities and EM Algorithm, Nonparametric Density Estimation
Lecture 10 - Mixture Densities, ML Estimation and EM Algorithm
Lecture 11 - Convergence of EM Algorithm; Overview of Nonparametric Density Estimation
Lecture 12 - Nonparametric Estimation, Parzen Windows, Nearest Neighbor Methods
Linear Models for Classification and Regression
Lecture 13 - Linear Discriminant Function; Perceptron - Learning Algorithm and Convergence Proof
Lecture 14 - Linear Least Squares Regression; LMS Algorithm
Lecture 15 - ADALINE and LMS Algorithm; General Nonlinear Least Squares Regression
Lecture 16 - Logistic Regression; Statistics of Least Squares Method; Regularized Least Squares
Lecture 17 - Fisher Linear Discriminant
Lecture 18 - Linear Discriminant Functions for Multi-Class Case; Multi-Class Logistic Regression
Overview of Statistical Learning Theory, Empirical Risk Minimization and VC-Dimension
Lecture 19 - Learning and Generalization; PAC Learning Framework
Lecture 20 - Overview of Statistical Learning Theory; Empirical Risk Minimization
Lecture 21 - Consistency of Empirical Risk Minimization
Lecture 22 - Consistency of Empirical Risk Minimization; VC-Dimension
Lecture 23 - Complexity of Learning Problems and VC-Dimension
Lecture 24 - VC-Dimension Examples; VC-Dimension of Hyperplanes
Artificial Neural Networks for Classification and Regression
Lecture 25 - Overview of Artificial Neural Networks
Lecture 26 - Multilayer Feedforward Neural Networks with Sigmoidal Activation Functions
Lecture 27 - Backpropagation Algorithm; Representational Abilities of Feedforward Networks
Lecture 28 - Feedforward Networks for Classification and Regression; Backpropagation in Practice
Lecture 29 - Radial Basis Function Networks; Gaussian RBF Networks
Lecture 30 - Learning Weights in RBF Networks; K-means Clustering Algorithm
Support Vector Machines and Kernel Based Methods
Lecture 31 - Support Vector Machines - Introduction, Obtaining the Optimal Hyperplane
Lecture 32 - SVM Formulation with Slack Variables; Nonlinear SVM Classifiers
Lecture 33 - Kernel Functions for Nonlinear SVMs; Mercer and Positive Definite Kernels
Lecture 34 - Support Vector Regression and the ε-insensitive Loss Function; Examples of SVM Learning
Lecture 35 - Overview of SMO and Other Algorithms for SVM; ν-SVM and ν-SVR; SVM as a Risk Minimizer
Lecture 36 - Positive Definite Kernels; RKHS; Representer Theorem
Feature Selection, Model Assessment and Cross-validation
Lecture 37 - Feature Selection and Dimensionality Reduction; Principal Component Analysis
Lecture 38 - No Free Lunch Theorem; Model Selection and Model Estimation; Bias-variance Trade-off
Lecture 39 - Assessing Learnt Classifiers; Cross Validation
Boosting and Classifier Ensembles
Lecture 40 - Bootstrap, Bagging and Boosting; Classifier Ensembles; AdaBoost
Lecture 41 - Risk Minimization View of AdaBoost