Introduction to Machine Learning

Spring 2019 — CMSC 678

Announcements
Who, What, When, and Where

See the syllabus for all of this information, including policies on academic honesty, accommodations, and late assignments.

Meeting Times
Sondheim, 103
Monday & Wednesday, 2:30pm - 3:45pm
Instructor
Frank Ferraro
ferraro [at] umbc [dot] edu
ITE 358
Monday 3:45 - 4:30
Tuesday 11:00 - 11:30
by appointment
TA
Caroline Kery
ckery1 [at] umbc [dot] edu
ITE 349
Monday 1:00 - 2:00
Thursday 4:00 - 5:00
Topics
The topics covered will include, but are not limited to:
  • perceptrons
  • regression (linear, logistic, and non-linear)
  • spectral and dimensionality reduction techniques
  • support vector machines and kernel methods
  • neural networks, including deep learning, recurrent neural networks, and convolutional neural networks
  • Bayesian networks and probabilistic graphical models
  • clustering
  • evaluation methodologies and experiment design.
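As a taste of the first topic above, here is a minimal perceptron sketch (illustrative only, not course-provided code): it learns a linear separator with the classic mistake-driven update rule, using labels in {-1, +1}.

```python
def perceptron_train(data, epochs=10):
    """Train a perceptron. data: list of (features, label) pairs, label in {-1, +1}."""
    dim = len(data[0][0])
    w = [0.0] * dim  # weight vector
    b = 0.0          # bias term
    for _ in range(epochs):
        for x, y in data:
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:  # mistake: nudge the separator toward this example
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
    return w, b

def perceptron_predict(w, b, x):
    """Predict +1 or -1 for a feature vector x."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
```

On linearly separable data the update rule is guaranteed to converge; we will see why when we cover perceptrons in week 2.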
Goals
After taking this course, you will
  • be introduced to some of the core problems and solutions of ML;
  • learn different ways that success and progress can be measured in ML;
  • be exposed to how these problems relate to those in statistics, artificial intelligence, and specialized areas of ML (such as natural language processing and computer vision);
  • have experience implementing a number of ML programs;
  • read and analyze research papers;
  • practice your (written) communication skills.
Schedule

The following schedule of topics is subject to change.

Legend:

Date | Topic | Main Reading (read all) | Advanced Reading (optionally read some) | Assignment Out | Assignment Due
Monday, 1/28 Introduction: what is ML?
  • ESL Ch 1
  • ESL Ch 2
out: A1: Math & Programming Review
Wednesday, 1/30 Probability, loss functions, and decision theory
  • CIML Ch 2
  • if in need of a probability refresh: ITILA Ch 2
  • UML Ch 2
  • UML Ch 14
  • ITILA Ch 36
Monday, 2/4 Linear regression, classification, and perceptrons (+ more on gradient optimization)
  • CIML Ch 7 (linear models)
  • CIML Ch 4 (perceptrons)
  • ESL Ch 3
  • UML Ch 9.2
Wednesday, 2/6
Friday, 2/8 due: A1
Monday, 2/11
Wednesday, 2/13
  • CIML Ch 9.5-9.7
  • ESL Ch 4.4
  • UML Ch 9.3
  • ITILA Ch 39, 41.1-41.3
Monday, 2/18
Wednesday, 2/20 Neural networks, backpropagation, & autodifferentiation
  • ESL Ch 11
  • UML Ch 20
  • ITILA Ch 38-39
Monday, 2/25
Wednesday, 2/27 Recurrent & convolutional neural networks
  • Goodfellow et al. (2016), Ch 11 (Practical Methodology)
Friday, 3/1 due: A2
Monday, 3/11 Midterm Review
Wednesday, 3/13 Midterm
Monday, 4/8; Wednesday, 4/10 Dimensionality Reduction: Linear Discriminant Analysis & Principal Component Analysis
  • CIML Ch 15.2
  • ESL Ch 4.3
  • ESL Ch 14.5-14.10
  • UML Ch 23, 24.3
out: A4: Neural Networks
Monday, 4/15; Wednesday, 4/17 Prototype vs. exemplar learning: k-means and k-nearest neighbor
  • CIML Ch 15.1
  • ESL Ch 13, 14.3
  • UML Ch 22
  • ITILA Ch 20
Monday, 4/22 Kernel methods & support vector machines
  • CIML Ch 11 (Kernel + SVM), 7.7 (SVM)
  • ESL Ch 6 (Kernel)
  • UML Ch 16 (Kernel)
  • ESL Ch 12 (SVM)
  • UML Ch 15 (SVM)
Monday, 4/29 Expectation Maximization & Probabilistic Modeling
  • CIML Ch 16
  • ESL Ch 8.5
  • UML Ch 24.0-24.1 (Maximum likelihood)
  • UML Ch 24.4 (EM)
  • ITILA Ch 20
Wednesday, 5/1
Monday, 5/6 Graphical Models
  • ITILA Ch 16 (Message Passing, e.g., for forward-backward)
  • CIML Ch 9
  • ITILA Ch 25 and 26 (~16 pages)
  • ESL Ch 17 (excluding 17.3.2, 17.4.2-end; ~17 pages)
Wednesday, 5/8
Monday, 5/13
  • CIML Ch 1 (Decision Trees)
  • CIML Ch 13 (Ensemble Methods)