Machine Learning 101
Instructors: Dr. Michael Bowles & Dr. Patricia Hoffman
Overview of the course
Machine Learning 101 deals primarily with supervised learning problems. Machine Learning 102 will cover unsupervised learning and fault detection.
Both 101 and 102 begin at the level of elementary probability and statistics and, from that background, survey a broad array of machine learning techniques. The classes will give participants a working knowledge of these techniques and leave them prepared to apply them to real problems. To get the most out of the class, participants will need to work through the homework assignments.
Prerequisites
This class assumes a moderate level of computer programming proficiency. We will use R (the open source statistics language) for the homework and for the examples in class. We will cover some of the basics of R and do not assume any prior knowledge of it. References on how to use R are available on this website, and we will hand out sample code during class to help you get started.
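To give a sense of the kind of R basics we will cover, here is a minimal sketch (illustrative only, not part of the course materials) of working with vectors, data frames, and a built-in data set:

# A few R basics: vectors, data frames, and quick summaries.
x <- c(1.5, 2.0, 3.5)          # a numeric vector
mean(x)                        # functions operate on whole vectors

df <- data.frame(height = c(65, 70, 72), weight = c(130, 160, 180))
str(df)                        # inspect the structure of a data frame
summary(df)                    # summary statistics for each column

head(iris)                     # iris is a small data set shipped with R
plot(iris$Sepal.Length, iris$Petal.Length)   # quick scatter plot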
You'll need a general beginner-level background in probability, calculus, linear algebra, and vector calculus. We will cover most of what is required during the lectures, and the appendices in the back of the Tan text cover this material at a level that is more than sufficient for this class.
Machine Learning 101 and 102 can be taken in either order; the prerequisites for the two classes are the same. The second five-week session (Machine Learning 102) will culminate in students giving presentations on papers they have read.
Why use R?
We're going to use R as our lingua franca for working through homework problems, discussing them, and comparing different solution approaches. Install R on your laptop or desktop computer from http://cran.r-project.org/ before you come to the first class. We will include some introductory material on R in the first two lectures to get everyone up to speed. This site also links to instructions for integrating R with Eclipse and to R reference pages (References for R, More R references); comments on those references can be posted on the Reference for R Comments page.
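As a quick check that your installation works, you can run something like the following in the R console (a minimal sketch; the package and file names below are only examples, not course requirements):

# Confirm which version of R is installed.
R.version.string

# Install and load a package from CRAN (ggplot2 is just an example package).
install.packages("ggplot2")
library(ggplot2)

# Read a comma-separated data file into a data frame (file name is hypothetical).
mydata <- read.csv("mydata.csv", header = TRUE)

# Built-in help is available for any function.
help(read.csv)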
General Sequence of Classes:
Machine Learning 101: Supervised learning
Machine Learning 102: Unsupervised Learning and Fault Detection
Text: "Introduction to Data Mining", by Pang-Ning Tan, Michael Steinbach and Vipin Kumar
Machine Learning 201: Advanced Regression Techniques, Generalized Linear Models, and Generalized Additive Models
Machine Learning 202: Collaborative Filtering, Bayesian Belief Networks, and Advanced Trees
Text: "The Elements of Statistical Learning - Data Mining, Inference, and Prediction" by Trevor Hastie, Robert Tibshirani, and Jerome Friedman
Future Topics
Data Mining Social Networks
Text Mining
Recommender Methods
Big Data
Machine Learning 101 Syllabus:
Week | Topics | Homework | Links
1st Week (5/14/2011) | Exploring Data; Data Quality; Aggregation, Sampling | | FirstWeekNotes
(5/15/2011) | Beginning with R, held at Hacker Dojo, 10 AM - Noon | | Notes
2nd Week (5/21/2011) | Supervised Classification and Prediction: General Background; Performance Evaluation; Trees | HW #1 Due; HW02.pdf | SecondWeekNotes
3rd Week (5/28/2011) | Regression: Ordinary Least Squares; Ridge Regression | HW #2 Due; HW03.pdf | ThirdWeekNotes
4th Week (6/4/2011) | Classification and Regression Techniques: k Nearest Neighbors; Naïve Bayes | HW #3 Due; HW04 | FourthWeekNotes
5th Week (6/18/2011) | Support Vector Machines: Linear & Nonlinear; Separable & Nonseparable | HW #4 Due | FifthWeekNotes
More machine learning references are available on Patricia's web site: http://patriciahoffmanphd.com/
Anyone can read this web site; however, only the instructors have permission to edit it.