- 12/6 [473/673] Assignment 5 is available.
It's due 12/15, by 11:59 PM.
- 11/12 [473/673] Assignment 4 is available.
It's due 12/1, by 11:59 PM.
- 10/6 [473/673] Assignment 3 is available.
It's due 10/20, by 11:59 PM.
- 10/4 [473/673] The project prompt is available.
There are multiple milestones and due dates.
- 9/17 [473/673] Assignment 2 is available.
It's due 9/29, by 11:59 PM.
- 9/15 [673] The Graduate Assessment prompt is available.
You can read the short overview document, just the portions relevant to the "Implementation Track," or just the portions relevant to the "Paper Track." (There's also a single PDF that combines all three of these documents; it repeats the same information in one place.)
There are multiple milestones and due dates.
- 9/1 [473/673] Assignment 1 is available.
It's due 9/14, by 11:59 PM.
- 9/1 [473/673] The Discord server is available.
- 8/31 [473/673] Assignments and coursework can be submitted through the submission site, available at https://www.csee.umbc.edu/courses/undergraduate/473/f21/submit.
You must be logged in with your UMBC ID.
- 8/31 [473/673] The syllabus (version 0.9) is available.
See the syllabus for full course information, including policies on academic honesty, accommodations, and late assignments.
- Meeting Times
  - ILSB 201
  - Monday & Wednesday, 1:00pm - 2:15pm
- Instructor
  - Frank Ferraro
  - ferraro [at] umbc [dot] edu
  - ITE 358/remote
  - Monday 2:15pm - 3:00pm
  - Thursday 1:00pm - 1:30pm
  - by appointment
- TA
  - Ryan Barron
  - ryanb4 [at] umbc [dot] edu
  - ITE 334/remote
  - Monday 4:30pm - 5:30pm
  - Wednesday 4:30pm - 5:30pm
  - by appointment
- Topics
  The topics covered will include:
- probability, classification, and the efficacy of simple counting methods
- language modeling (n-gram models, smoothing heuristics, maxent/log-linear models, and distributed/vector-valued representations; see the illustrative sketch after this list)
- sequences of latent variables (e.g., hidden Markov models, some basic machine translation alignment)
- trees and graphs, as applied to syntax and semantics
- some discourse-related applications (coreference resolution, textual entailment),
- special and current topics (e.g., fairness and ethics in NLP), and
- modern, neural approaches to NLP, such as recurrent neural networks and transformers (e.g., BERT or GPT-2).
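To give a concrete flavor of one of these topics, here is a minimal sketch of a bigram language model with add-one (Laplace) smoothing. This is illustrative only, not course-provided code; the function names and tiny corpus are made up for the example.

```python
# A minimal sketch (not course code): a bigram language model with
# add-one (Laplace) smoothing, of the kind covered in the language
# modeling unit. All names here are illustrative.
from collections import Counter

def train_bigram_counts(sentences):
    """Count unigrams and bigrams, padding each sentence with <s> and </s>."""
    unigrams, bigrams = Counter(), Counter()
    for sent in sentences:
        tokens = ["<s>"] + sent + ["</s>"]
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def bigram_prob(unigrams, bigrams, prev, word):
    """P(word | prev) with add-one smoothing over the observed vocabulary."""
    vocab_size = len(unigrams)
    return (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab_size)

# Toy usage on a made-up two-sentence corpus.
corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
uni, bi = train_bigram_counts(corpus)
print(bigram_prob(uni, bi, "the", "cat"))  # smoothed P(cat | the)
```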
- Goals
  After taking this course, you will:
- be introduced to some of the core problems and solutions of NLP;
- learn different ways that success and progress can be measured in NLP;
- be exposed to how these problems relate to those in statistics, machine learning, and linguistics;
- have experience implementing a number of NLP programs;
- read and analyze research papers; and
- practice your (written) communication skills.
The following schedule of topics is subject to change.
| Date | Topic | Suggested Reading | Assignment Out | Assignment Due |
| --- | --- | --- | --- | --- |
| Wednesday, 9/1 | Intro/administrivia | Eisenstein: Ch. 1 (Optional: 1.2.2) | [473/673] Assignment 1 | — |
| Wednesday, 9/8 | What is NLP?; Probability Review | What is NLP?: Eisenstein: Ch. 1 (Optional: 1.2.2); Probability: Eisenstein: Appendix A | — | — |
| Monday, 9/13 | — | None required, though this will cover approximately the intro of every chapter in either Eisenstein or SLP. | — | — |
| Monday, 9/20 | Intro to NLP Research and Research Community | — | — | — |
| Wednesday, 9/22 | (Lossy) Overview of Different Types of NLP Tasks | — | — | — |
| Wednesday, 9/29 | Machine Learning: Methodology and Evaluation | Eisenstein: Ch. 4 (optional: 4.4.3, 4.4.4, and 4.5, unless planning to do doctoral work); 3SLP: intro of Ch. 4, 4.7, 4.8, and 4.10 | — | — |
| Wednesday, 10/6 | Classification with Maxent Models | Eisenstein: Ch. 2.5; 3SLP: Ch. 5 | — | — |
| Wednesday, 10/20 | Distributed Representations | 3SLP: Ch. 6; Eisenstein: Ch. 14, with exceptions listed on Discord | — | — |
| Monday, 11/1 | (Generative) Language Modeling | Eisenstein: Ch. 6 (except 6.2.2-6.2.4, 6.3.1) | — | — |
| Wednesday, 11/10 | Recurrent Neural Language Modeling | Eisenstein: Ch. 6 (except 6.2.2-6.2.4, 6.3.1) | — | — |
| Monday, 11/29 | General Language Modeling: Attention and Transformers | — | — | — |