Check out the syllabus for all this information, including policies on academic honesty, accommodations, and late assignments.
- Meeting Times
Sondheim 101
Monday & Wednesday, 1:00 - 2:15pm
- Instructor
Frank Ferraro
ferraro [at] umbc [dot] edu
ITE 358
Office hours: Monday 2:15 - 3:00pm, Tuesday 11:00 - 11:30, or by appointment
- TA
Devajit Asem
devajit.asem [at] umbc [dot] edu
ITE 334
Office hours: Wednesday 4:00 - 5:00, Friday 2:00 - 3:00, or by appointment
- Topics
The topics covered will include:
- probability, classification, and the efficacy of simple counting methods
- language modeling (n-gram models, smoothing heuristics, maxent/log-linear models, and distributed/vector-valued representations)
- sequences of latent variables (e.g., hidden Markov models, some basic machine translation alignment)
- trees and graphs, as applied to syntax and semantics
- some discourse-related applications (coreference resolution, textual entailment), and
- special and current topics (e.g., fairness and ethics in NLP).
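As a taste of the "simple counting methods" and smoothing heuristics listed above, here is a minimal illustrative sketch (not course-provided code) of a bigram language model with add-one smoothing:

```python
from collections import Counter

def train_bigram(corpus):
    """Count unigrams and bigrams over sentence-marked token lists."""
    unigrams, bigrams = Counter(), Counter()
    for sent in corpus:
        toks = ["<s>"] + sent + ["</s>"]
        unigrams.update(toks[:-1])               # contexts: anything that can precede a word
        bigrams.update(zip(toks[:-1], toks[1:])) # adjacent word pairs
    return unigrams, bigrams

def prob(prev, word, unigrams, bigrams, vocab_size):
    """P(word | prev) with add-one (Laplace) smoothing."""
    return (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab_size)
```

Because unseen bigrams still get probability mass, the smoothed distribution over the vocabulary (plus the end-of-sentence marker) sums to one for any observed context; assignments in the course may use different conventions for markers and smoothing.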
- Goals
In this course, you will:
- be introduced to some of the core problems and solutions of NLP;
- learn different ways that success and progress can be measured in NLP;
- be exposed to how these problems relate to those in statistics, machine learning, and linguistics;
- gain experience implementing a number of NLP programs;
- read and analyze research papers;
- practice your (written) communication skills.
The following schedule of topics is subject to change.
Legend:
- 2SLP: 2nd Edition of Jurafsky and Martin's Speech and Language Processing.
- 3SLP: draft 3rd Edition of Jurafsky and Martin's Speech and Language Processing.
The authors have made the draft chapters available as they are rewritten: https://web.stanford.edu/~jurafsky/slp3/.
Date | Topic | Suggested Reading | Assignment Out | Assignment Due
Wednesday, 8/28 | Intro/administrivia; What is NLP? | 2SLP: Ch 1 | [473/673] Assignment 1 | —
Wednesday, 9/4 | Probability Review; Count-based Language Modeling | Language modeling: 3SLP Ch 3 (2SLP Ch 4) | [473/673] Assignment 2 | Assignment 1
Monday, 9/9 | — | — | —
Wednesday, 9/11 | — | — | —
Monday, 9/16 | — | — | —
Wednesday, 9/18 | Intro to ML: the Noisy Channel Model, Classification, & Evaluation | Machine Learning: 3SLP Ch 4.0, 4.1, 4.7, 4.8 | [673] Graduate Paper | Assignment 2
Monday, 9/23 | — | [473/673] Assignment 3 | —
Wednesday, 9/25 | Naive Bayes Classifiers | Naive Bayes: 3SLP Ch 4.1--4.6 | — | —
Monday, 9/30 | (finish up Naive Bayes); Maxent and Neural Language Models (part 1) | — | — | —
Wednesday, 10/2 | — | —
Friday, 10/4 | — | [673] Graduate Paper: Initial List
Monday, 10/7 | — | —
Wednesday, 10/9 | — | —
Friday, 10/11 | — | [473/673] Assignment 3
Monday, 10/14 | (finish up Neural Language Models, part 1); Distributed Representations | — | [473/673] Assignment 4 | —
Wednesday, 10/16 | 3SLP Ch 6 | — | —
Monday, 10/21 | — | —
Wednesday, 10/23 | — | —
Monday, 10/28 | Overview of Latent Variable Problems and Modeling | — | — | [673] First Draft of Grad Paper
Wednesday, 10/30 | Part-of-Speech Tagging and Hidden Markov Models | 3SLP Ch 8 (Part-of-Speech Tagging); 3SLP Appendix A (Hidden Markov Models) | — | [473/673] Project Proposal
Monday, 11/4 | [473/673] Assignment 5 | —
Wednesday, 11/6 | — | —
Monday, 11/11 | — | —
Wednesday, 11/13 | — | —
Monday, 11/18 | Other Latent Variable Models | — | — | —
Wednesday, 11/20 | Syntax: Constituency Grammars and Parsing | 3SLP Ch 12 (Constituency Grammars) | — | [473/673] Project Update
Monday, 11/25 | [473/673] Assignment 6 | —
Monday, 12/2 | Dependency Parsing | 3SLP Ch 15 (Dependency Parsing) | — | —
Wednesday, 12/4 | Semantics | — | — | —
Monday, 12/9 | Question Answering, and recap | — | — | Assignment 6