Natural Language Processing

Fall 2017 — CMSC 473/673

Who, What, When, and Where

Check out the full syllabus for all this information, including policies on academic honesty, accommodations, and late assignments.

Meeting Times
Performing Arts and Humanities, Room 107
Monday & Wednesday, 1:00pm - 2:15pm

Instructor
Frank Ferraro
ferraro [at] umbc [dot] edu
Office: ITE 358
Office hours: Monday 2:15pm - 3:00pm, Tuesday 11:30am - 12:00pm, or by appointment

Teaching Assistant
Chi Zhang
chzhang1 [at] umbc [dot] edu
Office: ITE 353
Office hours: Thursday 2:00pm - 4:00pm
The topics covered will include:
  • probability, classification, and the efficacy of simple counting methods
  • language modeling (n-gram models, smoothing heuristics, maxent/log-linear models, and distributed/vector-valued representations)
  • sequences of latent variables (e.g., hidden Markov models, some basic machine translation alignment)
  • trees and graphs, as applied to syntax and semantics
  • some discourse-related applications (coreference resolution, textual entailment), and
  • special and current topics (e.g., fairness and ethics in NLP).
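To give a flavor of the first topics above (simple counting methods for language modeling), here is a minimal sketch, not part of the course materials, of an add-one (Laplace) smoothed bigram language model; the function and variable names are illustrative, not from any course code:

```python
import math
from collections import Counter

def bigram_logprob(corpus_tokens, sentence_tokens):
    """Add-one-smoothed bigram log-probability of a sentence.

    Both arguments are lists of word tokens; <s> and </s> are
    hypothetical sentence-boundary markers. Illustrative sketch only.
    """
    tokens = ["<s>"] + corpus_tokens + ["</s>"]
    unigrams = Counter(tokens)                  # count each word
    bigrams = Counter(zip(tokens, tokens[1:]))  # count adjacent pairs
    vocab_size = len(unigrams)
    logp = 0.0
    prev = "<s>"
    for w in sentence_tokens + ["</s>"]:
        # add-one smoothing: every possible bigram gets a pseudo-count of 1,
        # so unseen pairs still receive nonzero probability
        logp += math.log((bigrams[(prev, w)] + 1) / (unigrams[prev] + vocab_size))
        prev = w
    return logp
```

Even this tiny model shows the key behavior covered early in the course: word sequences seen in the training text score higher than unseen ones, and smoothing keeps unseen sequences from getting zero probability.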
After taking this course, you will
  • be introduced to some of the core problems and solutions of NLP;
  • learn different ways that success and progress can be measured in NLP;
  • be exposed to how these problems relate to those in statistics, machine learning, and linguistics;
  • have experience implementing a number of NLP programs;
  • read and analyze research papers; and
  • practice your (written) communication skills.

The following schedule of topics is subject to change.


Wednesday, 8/30 - Intro: what is NLP?
  Reading: 2SLP Ch 1
  Out: Assignment 1; Graduate Paper 1

Wednesday, 9/6 - Probability Concepts & Language Modeling (I)
  Reading: 3SLP Ch 2.2, Ch 4; 2SLP Ch 4

Monday, 9/11 - Language Modeling (II)
  Reading: 3SLP Ch 6; 2SLP Ch 9.0-9.1, 20.0-20.3

Wednesday, 9/13 - Intro to ML: the Noisy Channel Model & Classification
  Reading: 3SLP Ch 6; 2SLP Ch 9.0-9.1, 20.0-20.3

Monday, 9/18 - Naïve Bayes & Maximum Entropy (Log-linear) Models (I)
  Reading: 3SLP Ch 7; 2SLP Ch 6.6-6.7; Ferraro and Eisner (2013)

Wednesday, 9/20 - Maxent (II)
  Reading: 3SLP Ch 8; 2SLP Ch 6.6-6.7; Mnih and Hinton (2007)

Friday, 9/22
  Out: Course Project
Saturday, 9/23
  Due: Assignment 1

Monday, 9/25 - Maxent (III) & Neural Language Models
  Reading: 3SLP Ch 15, Ch 16; Blei and Lafferty (2009)
  Out: Assignment 2

Wednesday, 9/27 - Distributed Representations

Monday, 10/2 - Intro to Latent Sequences & Expectation Maximization

Wednesday, 10/4 - Machine Translation Alignment (Guest Lecturer: Rebecca Knowles)

Monday, 10/9 - HMM, I: Intro and Part of Speech Tagging

Wednesday, 10/11 - HMM, II: The Forward Algorithm
  Out: Graduate Paper 2
  Due: Graduate Paper 1 (Friday)

Monday, 10/16 - HMM, III: The Viterbi and EM Algorithms

Wednesday, 10/18 - HMM, IV: EM (continued), MEMMs, and CRFs
  Due: Assignment 2 (Saturday, 10/21)

Monday, 10/23 - MEMMs and CRFs

Wednesday, 10/25 - Midterm Review
  Out: Assignment 3

Monday, 10/30 - Midterm

Wednesday, 11/1 - RNNs and Catchup

Monday, 11/6 - Intro to Syntax and Probabilistic Context-Free Grammars (PCFGs)
  Reading: 3SLP Ch 11, Ch 12; 2SLP Ch 12-13
  Project Update

Wednesday, 11/8 - Parsing PCFGs
  Reading: 3SLP Ch 12, Ch 13; 2SLP Ch 13-14

Monday, 11/13 - PCFGs: CKY and Viterbi

Wednesday, 11/15 - Inside-Outside: Expectation Maximization in PCFGs

Monday, 11/20 - Dependency Parsing and Intro to Semantics from Syntax
  Reading: 3SLP Ch 14

Wednesday, 11/22
  Out: Assignment 4
  Due: Assignment 3

Monday, 11/27 - Semantic Frames, Semantic Role Labeling, and Featurized Semantics
  Reading: 3SLP Ch 22.0-22.6
  Due: Graduate Paper 2

Wednesday, 11/29

Monday, 12/4 - Textual Entailment and Logical Inference

Wednesday, 12/6 - Entity Coreference

Monday, 12/11 - Information Extraction and Question Answering
  Due: Assignment 4

Wednesday, 12/20 - Optional Second Midterm
  Due: Course Project