ECTS: 2
Course coordinator: PIERRE BRUGIERE
Teaching hours: 18
Course content:
1. Supervised and unsupervised learning
2. Calibration versus prediction: how to avoid over-fitting
3. Measuring the complexity of a model according to Vapnik-Chervonenkis
4. The Vapnik-Chervonenkis inequality and the control of the prediction error
5. Maximum-margin SVMs and gap-tolerant classifiers
6. C-SVMs and duality (see the sketch after this list)
7. SVMs with kernels and Mercer's theorem
8. The simplex case
9. Mu-SVMs, duality and reduced convex hulls
10. Single-class SVMs, anomaly detection and clustering
11. An introduction to the bootstrap, decision trees and random forests
12. Ridge regression, penalization and yield-curve smoothing
13. The representer theorem, lasso, parsimony and duality
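As a minimal sketch of topics 6-7 and 10, the following illustrates a kernel C-SVM classifier and a single-class SVM for anomaly detection. The use of scikit-learn, the toy dataset, and all parameter values (C, gamma, nu) are illustrative assumptions, not prescribed by the course.

import numpy as np
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC, OneClassSVM

# Toy non-linearly-separable data (assumption: any 2-class dataset works here).
X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# C-SVM with a Gaussian (RBF) kernel: C trades margin width against
# training errors; the kernel gives a non-linear decision boundary.
clf = SVC(C=1.0, kernel="rbf", gamma=2.0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))

# Single-class SVM: fit a region around the "normal" training points and
# flag points outside it; nu upper-bounds the fraction of training outliers.
ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma=2.0)
ocsvm.fit(X_train)
print("points flagged as anomalies:", np.sum(ocsvm.predict(X_test) == -1))

In practice C, gamma and nu would be chosen by cross-validation, which connects back to topic 2 (calibration versus prediction and over-fitting).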
Learning outcomes: Understand how to use Support Vector Machines for supervised and unsupervised learning. Some applications of penalized regression methods (illustrated in the sketch below).
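A minimal sketch of the penalized regressions mentioned above (topics 12-13): ridge (L2 penalty) shrinks coefficients, while lasso (L1 penalty) drives some exactly to zero, yielding parsimony. The scikit-learn calls, the synthetic data and the alpha values are illustrative assumptions.

import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
beta = np.array([3.0, -2.0, 0, 0, 0, 0, 0, 0, 0, 0])  # sparse ground truth
y = X @ beta + 0.1 * rng.normal(size=100)

# Ridge: minimizes ||y - Xb||^2 + alpha * ||b||_2^2 (shrinks all coefficients).
ridge = Ridge(alpha=1.0).fit(X, y)
# Lasso: minimizes (1/2n) ||y - Xb||^2 + alpha * ||b||_1 (sets some to zero).
lasso = Lasso(alpha=0.1).fit(X, y)

print("ridge coefficients:", np.round(ridge.coef_, 2))
print("lasso coefficients:", np.round(lasso.coef_, 2))  # mostly exact zeros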
Assessment: Exam
Bibliography / recommended reading:
[1] Pierre Brugière: hal.archives-ouvertes.fr/cel-01390383v2
[2] Wolfgang Karl Härdle, Rouslan Moro, Linda Hoffmann: Learning Machines Supporting Bankruptcy Prediction, SFB 649 Discussion Paper 2010-032
[3] Dave DeBarr and Harry Wechsler: Fraud Detection Using Reputation Features, SVMs, and Random Forests
[4] Trevor Hastie, Robert Tibshirani, Jerome Friedman: The Elements of Statistical Learning
[5] Christopher Bishop: Pattern Recognition and Machine Learning