Natural Language Processing

Fall 2018 — CMSC 473/673

Announcements
Who, What, When, and Where

Check out the syllabus (to be released soon) for all this information, including policies on academic honesty, accommodations, and late assignments.

Meeting Times
Sherman Hall, 015
Monday & Wednesday, 1pm - 2:15pm
Instructor
Frank Ferraro
ferraro [at] umbc [dot] edu
ITE 358
Monday 2:15 - 3:00pm
Tuesday 11:00 - 11:30am
by appointment
TA
Caroline Kery
ckery1 [at] umbc [dot] edu
Tuesday 2:00 - 3:30
Thursday 1:00 - 2:30
by appointment (TBD)
Topics
The topics covered will include:
  • probability, classification, and the efficacy of simple counting methods
  • language modeling (n-gram models, smoothing heuristics, maxent/log-linear models, and distributed/vector-valued representations)
  • sequences of latent variables (e.g., hidden Markov models, some basic machine translation alignment)
  • trees and graphs, as applied to syntax and semantics
  • some discourse-related applications (coreference resolution, textual entailment), and
  • special and current topics (e.g., fairness and ethics in NLP).
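For a flavor of the language-modeling unit, here is a minimal bigram model with add-one (Laplace) smoothing, one of the simple counting methods listed above. This is an illustrative sketch only, not course material; the toy corpus and variable names are made up for the example.

```python
from collections import Counter

# Toy corpus; sentences are padded with <s> and </s> boundary markers.
corpus = [["<s>", "the", "cat", "sat", "</s>"],
          ["<s>", "the", "dog", "sat", "</s>"]]

# Count unigrams and adjacent word pairs (bigrams).
unigrams = Counter(w for sent in corpus for w in sent)
bigrams = Counter(pair for sent in corpus for pair in zip(sent, sent[1:]))

vocab = set(unigrams)
V = len(vocab)

def p(w, prev):
    """Add-one (Laplace) smoothed bigram probability P(w | prev)."""
    return (bigrams[(prev, w)] + 1) / (unigrams[prev] + V)
```

With this tiny corpus, `p("cat", "the")` works out to (1 + 1) / (2 + 6) = 0.25, and the smoothed distribution over the vocabulary sums to 1 for any observed history.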
Goals
After taking this course, you will
  • be familiar with some of the core problems and solutions of NLP;
  • know different ways that success and progress can be measured in NLP;
  • understand how these problems relate to those in statistics, machine learning, and linguistics;
  • have experience implementing a number of NLP programs;
  • have read and analyzed research papers; and
  • have practiced your (written) communication skills.
Schedule

The following schedule of topics is subject to change.

Date | Topic | Suggested Reading | Assignment Out | Assignment Due
Wednesday, 8/29 | Intro: what is NLP? | 2SLP: Ch 1 | [473/673] Assignment 1; [673] Graduate Paper |
Wednesday, 9/5 | Probability Concepts & Language Modeling | 3SLP: Ch 2.2; 3SLP: Ch 4; 2SLP: Ch 4 | [473/673] Course Project |
Monday, 9/10 | | 3SLP: Ch 6; 2SLP: Ch 9.0-9.1, 20.0-20.3 | [473/673] Assignment 2 | [473/673] Assignment 1
Wednesday, 9/12 | | | |
Monday, 9/17 | Intro to ML: the Noisy Channel Model & Classification | 3SLP: Ch 6; 2SLP: Ch 9.0-9.1, 20.0-20.3 | |
Wednesday, 9/19 | Naïve Bayes, Maximum Entropy (Log-linear) Models, and Neural Language Models | 3SLP: Ch 7; Ferraro and Eisner (2013) | |
Monday, 9/24 | Maxent (II) & Neural Language Models | 3SLP: Ch 8 (2SLP: Ch 6.6-6.7); Mnih and Hinton (2007) | |
Wednesday, 9/26 | Distributed Representations | | |
Monday, 10/1 | Intro to Latent Sequences & Expectation Maximization | | | [473/673] Assignment 2
Wednesday, 10/3 | Machine Translation Alignment | | [473/673] Assignment 3 |
Monday, 10/8 | HMMs, I: Intro and Part-of-Speech Tagging | | |
Wednesday, 10/10 | HMMs, II: The Forward Algorithm, the Viterbi Algorithm, and EM | | |
Friday, 10/12 | | | | [473/673] Project Proposal
Monday, 10/15 | | | |
Wednesday, 10/17 | | | | [673] Graduate Paper
Monday, 10/22 | MEMMs and CRFs | | [473/673] Assignment 4 |
Wednesday, 10/24 | RNNs and Catchup | | |
Friday, 10/26 | | | | [473/673] Assignment 3
Monday, 10/29 | Midterm Review | | |
Wednesday, 10/31 | Midterm | | |
Monday, 11/5 | Intro to Syntax and Probabilistic Context-Free Grammars (PCFGs) | 3SLP: Ch 11, Ch 12; 2SLP: Ch 12-13 | |
Wednesday, 11/7 | Parsing PCFGs | 3SLP: Ch 12, Ch 13; 2SLP: Ch 13-14 | |
Friday, 11/9 | | | | [473/673] Project Update
Monday, 11/12 | PCFGs: CKY and Viterbi | | |
Wednesday, 11/14 | Inside-Outside: Expectation Maximization in PCFGs | | | [673] Graduate Paper Reviews
Monday, 11/19 | Dependency Parsing and Intro to Semantics from Syntax | 3SLP: Ch 14 | [473/673] Assignment 5 | [473/673] Assignment 4
Wednesday, 11/21 | | | |
Monday, 11/26 | Semantic Frames, Semantic Role Labeling, and Featurized Semantics | 3SLP: Ch 22.0-22.6 | |
Wednesday, 11/28 | | | |
Monday, 12/3 | Textual Entailment and Logical Inference | | |
Wednesday, 12/5 | Entity Coreference | | | [673] Graduate Paper Revisions
Monday, 12/10 | Information Extraction and Question Answering | | | [473/673] Assignment 5
Wednesday, 12/19 | | | | [473/673] Course Project