CMSC 471, Spring 2014 - Pre-Reading Guide
Last revised 1/19/14

This document is a reference for the pre-reading material, summarizing the main concepts and ideas that you should understand before coming to class each day. Generally speaking, you should have a good grasp of the boldfaced concepts in the indicated sections. The "Reading" notes remind you of the reading that you are expected to do for the exam and provide some additional guidance on what is important to know.

Each entry below gives the class number and date, the pre-reading notes, and the reading notes.

Class 1, Tue 1/28
  Pre-reading: --
  Reading: Ch. 1; Lisp Ch. 1; McCarthy paper

Class 2, Thu 1/30
  Pre-reading: Read 2.1, 2.2 intro, and 2.2.1; skim 2.3.1-2.3.2 (task environments).
  Reading: Ch. 2; Lisp Ch. 2-3; Graham article. You should have a good understanding of all of the concepts in Ch. 2, which are foundational. Lisp won't be on the exam.

Class 3, Tue 2/4
  Pre-reading: Read 3.1 intro and 3.1.1; skim 3.3 intro for the main concepts.
  Reading: Ch. 3.1-3.3; Lisp Ch. 4-5, App. A

Class 4, Thu 2/6
  Pre-reading: Read 3.4 intro and 3.4.1-3.4.3.
  Reading: Ch. 3.4

Class 5, Tue 2/11
  Pre-reading: Read 3.5 intro and 3.5.1; skim 3.5.2 for the main concepts.
  Reading: Ch. 3.5-3.7; Lisp Ch. 7. The exam won't get into the subtleties of 3.5.3 (IDA*) or the details of the heuristic-generation methods in 3.6.3-3.6.4, but you should understand why those ideas make sense.

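For a preview of where 3.5 is headed, here is a minimal Python sketch of A* graph search (illustrative only, not the course's Lisp code: `neighbors` and `h` are hypothetical caller-supplied functions, and the toy graph at the bottom is invented):

    import heapq
    from itertools import count

    def a_star(start, goal, neighbors, h):
        """A* graph search. neighbors(s) yields (successor, step_cost) pairs;
        h(s) is an admissible heuristic estimate of the cost from s to goal."""
        tie = count()  # tie-breaker so the heap never compares two states
        frontier = [(h(start), next(tie), 0, start, [start])]  # (f = g + h, tie, g, state, path)
        best_g = {start: 0}
        while frontier:
            _, _, g, state, path = heapq.heappop(frontier)
            if state == goal:
                return path, g
            for succ, cost in neighbors(state):
                g2 = g + cost
                if g2 < best_g.get(succ, float("inf")):
                    best_g[succ] = g2
                    heapq.heappush(frontier, (g2 + h(succ), next(tie), g2, succ, path + [succ]))
        return None, float("inf")

    # Toy usage on a made-up 3-node graph, with a zero heuristic
    # (which reduces A* to uniform-cost search):
    graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 1)], "C": []}
    print(a_star("A", "C", lambda s: graph[s], lambda s: 0))  # (['A', 'B', 'C'], 2)
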
Class 6, Thu 2/13: NO CLASS - SNOW DAY

Class 7, Tue 2/18
  Pre-reading: Read 4.1 intro and 4.1.1.
  Reading: Ch. 4.1-4.2

Class 8, Thu 2/20
  Pre-reading: Read Ch. 6 intro, 6.1 intro, and 6.1.1. Don't worry about the mathematical notation, but be sure you understand the basic concepts and the map-coloring example.
  Reading: Ch. 6.1-6.4 (skip 6.3.3); supplementary: Vipin Kumar, "Algorithms for Constraint Satisfaction Problems: A Survey". Kumar is useful because it explains things a bit differently than R&N does, but you aren't responsible for any concepts or material in Kumar that aren't also in R&N.

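If you'd like to see the map-coloring example as a program, here is a minimal Python sketch of plain chronological backtracking on the chapter's Australia map (no forward checking, constraint propagation, or ordering heuristics, which Ch. 6 adds later; the color list is arbitrary):

    # Australia map coloring (the Ch. 6 example) by plain backtracking search.
    NEIGHBORS = {
        "WA": ["NT", "SA"], "NT": ["WA", "SA", "Q"],
        "SA": ["WA", "NT", "Q", "NSW", "V"], "Q": ["NT", "SA", "NSW"],
        "NSW": ["Q", "SA", "V"], "V": ["SA", "NSW"], "T": [],
    }
    COLORS = ["red", "green", "blue"]

    def backtrack(assignment):
        """Extend a partial {region: color} assignment until complete or stuck."""
        if len(assignment) == len(NEIGHBORS):
            return assignment
        var = next(v for v in NEIGHBORS if v not in assignment)
        for color in COLORS:
            # Constraint check: no neighbor may already have this color.
            if all(assignment.get(n) != color for n in NEIGHBORS[var]):
                result = backtrack({**assignment, var: color})
                if result:
                    return result
        return None  # dead end; undo (backtrack) at the caller

    print(backtrack({}))  # e.g. {'WA': 'red', 'NT': 'green', 'SA': 'blue', ...}
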
Class 9, Tue 2/25
  Pre-reading: Read Ch. 5 intro and 5.1, and make sure you understand the concept of a game tree and the meaning of "minimax".
  Reading: Ch. 5.1-5.3, 5.4.1, 5.5. Section 5.3 (alpha-beta pruning) is especially tricky and will demand your close attention. Working in a study group to solve some examples of alpha-beta game trees will be helpful!!

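If it helps to see the procedure in code as well as in trees, here is a minimal Python sketch of minimax with alpha-beta pruning (illustrative, not the course's Lisp code; a "node" here is just a nested list whose leaves are utility values, with leaf utilities matching the chapter's standard two-ply example):

    def alphabeta(node, alpha, beta, maximizing):
        """Minimax value of node with alpha-beta pruning. A node is either
        a number (a leaf's utility) or a list of child nodes."""
        if not isinstance(node, list):
            return node
        if maximizing:
            value = float("-inf")
            for child in node:
                value = max(value, alphabeta(child, alpha, beta, False))
                alpha = max(alpha, value)
                if alpha >= beta:   # remaining children can't change the answer: prune
                    break
            return value
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, alpha, beta, True))
            beta = min(beta, value)
            if beta <= alpha:       # prune
                break
        return value

    # Two-ply tree: MAX at the root, three MIN nodes below, utilities at the leaves.
    tree = [[3, 12, 8], [2, 4, 6], [14, 5, 2]]
    print(alphabeta(tree, float("-inf"), float("inf"), True))  # -> 3

Tracing which leaves this call never examines is exactly the kind of exercise worth doing in a study group.
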
Class 10, Thu 2/27
  Pre-reading: Read Ch. 13.2.1-13.2.2 and be sure that you understand the concepts of random variables, prior probabilities, conditional probabilities, the product rule, and the joint probability distribution.
  Reading: Ch. 13. This chapter and Ch. 14 contain some of the most mathematical material that we'll cover. It is essential that you understand all of the math in Ch. 13, or Ch. 14 will be very hard going!

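In particular, make sure you can state and use the product rule. Here it is with a small worked instance (the numbers are illustrative, in the spirit of the chapter's cavity/toothache example):

    P(a \land b) = P(a \mid b)\,P(b) = P(b \mid a)\,P(a)    % product rule

    % e.g., if P(cavity) = 0.2 and P(toothache | cavity) = 0.6, then
    P(cavity \land toothache) = P(toothache \mid cavity)\,P(cavity) = 0.6 \times 0.2 = 0.12
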
Class 11, Tue 3/4
  Pre-reading: Be sure that you really understand Ch. 13!! Glance briefly at Ch. 14.1 to see where we're headed with representing different kinds of conditional independence relationships across a set of random variables.
  Reading: Ch. 14.1-14.4.2. Sections 14.1, 14.2, and 14.4.1-2 are very important; you should understand everything in those sections thoroughly and be able to apply those concepts. You only need to skim 14.3 and understand the idea of noisy-OR and continuous distributions, not the math. (You are not responsible for 14.4.3-4 or beyond.)

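The one equation to internalize from 14.1-14.2 is the factorization of the full joint distribution by each variable's parents; the second line instantiates it for the chapter's burglary-alarm example:

    P(x_1, \ldots, x_n) = \prod_{i=1}^{n} P\bigl(x_i \mid parents(X_i)\bigr)

    P(j \land m \land a \land \lnot b \land \lnot e)
        = P(j \mid a)\,P(m \mid a)\,P(a \mid \lnot b, \lnot e)\,P(\lnot b)\,P(\lnot e)
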
Class 12, Thu 3/6
  Pre-reading: Read 15.1 to understand the concepts of the Markov assumption and a Markov process. Don't worry too much about the math just yet.
  Reading: Ch. 15.1-15.2.1, 16.1-16.3. There are a lot of other very interesting ideas and important AI techniques in the rest of Ch. 15, but we just don't have time to cover them. Feel free to dabble if you are interested in reading more (and in starting to understand, say, how a self-driving car might work...). Similarly, multiattribute utility theory, preferences, and the value of information (the rest of Ch. 16) are basic methods that let us start to think about decision-making in really complex domains.

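In symbols, the two assumptions that make Ch. 15's temporal models tractable (X_t is the state and E_t the evidence at time t):

    P(X_t \mid X_{0:t-1}) = P(X_t \mid X_{t-1})              % first-order Markov assumption
    P(E_t \mid X_{0:t}, E_{0:t-1}) = P(E_t \mid X_t)          % sensor Markov assumption
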
Class 13, Tue 3/11
  Pre-reading: (No pre-reading!)
  Reading: Ch. 17.5-17.6. The concepts covered in the slides are the most important to know for the exams. Other material will not be tested in depth.

Class 14, Thu 3/13: NO CLASS - WATER DAY

Class 15, Tue 3/25
  Pre-reading: Read Ch. 18.2 to understand the basic idea behind supervised learning: the problems of classification and regression, search through a hypothesis space, and the test set / training set distinction.
  Reading: Ch. 18.1-18.3. You only need to give Ch. 18.3.5 a cursory read.

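To make the test set / training set distinction concrete before you read, here is a tiny Python sketch (the data and the "majority label" hypothesis are invented for illustration; the point is only that the hypothesis is chosen on one set and scored on the other):

    import random
    from collections import Counter

    # Hypothetical labeled examples: (feature vector, label) pairs.
    examples = [((x, x % 3), x % 2) for x in range(40)]

    # Hold out a test set: learn ONLY from the training set,
    # measure accuracy ONLY on the held-out test set.
    random.Random(0).shuffle(examples)
    train, test = examples[:30], examples[30:]

    # A trivial "hypothesis": always predict the most common training label.
    majority = Counter(label for _, label in train).most_common(1)[0][0]
    accuracy = sum(label == majority for _, label in test) / len(test)
    print(f"test-set accuracy of the majority baseline: {accuracy:.2f}")
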
Class 16, Thu 3/27
  Pre-reading: Glance at Ch. 20.1 to see how the basic probability theory that we've already covered can be used to think of machine learning as a Bayesian update problem and the discovery of the most likely hypothesis.
  Reading: Ch. 20.1-20.2. You only need to skim 20.2.6.

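In symbols, the Bayesian-learning view of 20.1: apply Bayes' rule over hypotheses h given data d, and the "most likely hypothesis" is the one that maximizes the posterior:

    P(h \mid \mathbf{d}) \propto P(\mathbf{d} \mid h)\,P(h)

    h_{MAP} = \operatorname{argmax}_{h} P(\mathbf{d} \mid h)\,P(h)
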
Class 17, Tue 4/1: MIDTERM

Class 18, Thu 4/3
  Pre-reading: --
  Reading: --

Class 19, Tue 4/8
  Pre-reading: Review Ch. 7.4.1-7.4.2 (basics of propositional logic), especially if you had any difficulty with the logic part of the pretest.
  Reading: Ch. 7. You need to fully understand 7.1-7.5 and be able to apply resolution theorem proving to a new problem. You only need general knowledge of 7.6. Ch. 7.7 is very important as the foundation for the planning methods that we will study later.

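The key inference rule you'll need to apply is resolution, stated generally and then on a tiny instance:

    \frac{\ell_1 \lor \cdots \lor \ell_k \qquad m_1 \lor \cdots \lor m_n}
         {\ell_1 \lor \cdots \lor \ell_{i-1} \lor \ell_{i+1} \lor \cdots \lor \ell_k
          \lor m_1 \lor \cdots \lor m_{j-1} \lor m_{j+1} \lor \cdots \lor m_n}

where \ell_i and m_j are complementary literals. For example, resolving (P \lor Q) with (\lnot Q \lor R) on the pair Q, \lnot Q yields the resolvent (P \lor R).
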
Class 20, Thu 4/10
  Pre-reading: Review Ch. 8.2, especially if you had difficulty with the logic part of the pretest. First-order logic is supposed to be included in CMSC 203, but not every section covers it, so we will review it carefully. Some of the terminology here may be new even if the concepts are familiar.
  Reading: Ch. 8.1-8.3. You should be very familiar with this terminology and comfortable with applying first-order logic representations to encode a domain.

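As a warm-up for the terminology, here is what encoding domain facts looks like in first-order logic, in the style of the chapter's kinship examples (these particular axioms are illustrative):

    \forall x, y\;\; Sibling(x, y) \iff Sibling(y, x)

    \forall m, c\;\; Mother(c) = m \iff Female(m) \land Parent(m, c)

Note the pieces you should be able to name: quantifiers, variables, predicates (Female, Parent), a function symbol (Mother), and connectives.
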
Class 21, Tue 4/15
  Pre-reading: Skim Ch. 9.5 to get a sense of how resolution theorem proving is done for first-order logic representations.
  Reading: Ch. 9. You can skip 9.2.2, only skim 9.4.3, and skip 9.4.4-9.4.6. It's important that you understand Ch. 9.5 well enough to be able to apply resolution methods to a new first-order logic domain, but you can just skim the proof concepts in 9.5.4 and the discussion of equality in 9.5.5. Ch. 9.5.6 is important, though, and you should thoroughly understand these different search strategies for resolution theorem proving!

Class 22, Thu 4/17
  Pre-reading: --
  Reading: Ch. 12.1-12.2, 12.5-12.6

Class 23, Tue 4/22: PHASE I TOURNAMENT

Class 24, Thu 4/24
  Pre-reading: --
  Reading: Ch. 10.1-10.2, 10.4.2-10.4.4. All of this material is important!

Class 25, Tue 4/29
  Pre-reading: --
  Reading: Ch. 17.1-17.3, but you can skip 17.2.3 (convergence of value iteration) unless you're interested in gaining a deeper understanding.

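It may help to see how short value iteration actually is before reading Ch. 17. A minimal Python sketch (the two-state MDP at the bottom is invented for illustration; this in-place variant just stops when the largest update is tiny, whereas the chapter's version uses synchronous updates and a convergence bound tied to gamma):

    def value_iteration(states, actions, T, R, gamma=0.9, eps=1e-6):
        """Repeated Bellman updates: U(s) <- R(s) + gamma * max_a sum_s' T(s,a,s') U(s').
        T[s][a] is a list of (probability, next_state); R[s] is the reward in s."""
        U = {s: 0.0 for s in states}
        while True:
            delta = 0.0
            for s in states:
                best = max(sum(p * U[s2] for p, s2 in T[s][a]) for a in actions)
                new_u = R[s] + gamma * best
                delta = max(delta, abs(new_u - U[s]))
                U[s] = new_u
            if delta < eps:
                return U

    # A made-up two-state MDP: "stay" is safe, "go" risks staying put in s0.
    states, actions = ["s0", "s1"], ["stay", "go"]
    T = {"s0": {"stay": [(1.0, "s0")], "go": [(0.8, "s1"), (0.2, "s0")]},
         "s1": {"stay": [(1.0, "s1")], "go": [(1.0, "s1")]}}
    R = {"s0": 0.0, "s1": 1.0}
    print(value_iteration(states, actions, T, R))
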
Class 26, Thu 5/1
  Pre-reading: --
  Reading: Ch. 21.1-21.3. You can skim 21.2.1, 21.2.2, and 21.3.1. Be sure you understand temporal-difference learning (21.2.3) and Q-learning (21.3.2). However, you can skip the latter part of 21.3.3, where it discusses the SARSA variation of Q-learning.

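The heart of Q-learning is a one-line temporal-difference update. A minimal Python sketch (all names are illustrative; q is an ordinary dict keyed by (state, action) pairs, and the environment that produces rewards and next states is left to the caller):

    import random

    def q_update(q, s, a, reward, s2, actions, alpha=0.1, gamma=0.9):
        """One tabular Q-learning backup:
        Q(s,a) <- Q(s,a) + alpha * [r + gamma * max_a' Q(s',a') - Q(s,a)].
        s2 is None at the end of an episode (no future value)."""
        best_next = 0.0 if s2 is None else max(q.get((s2, a2), 0.0) for a2 in actions)
        old = q.get((s, a), 0.0)
        q[(s, a)] = old + alpha * (reward + gamma * best_next - old)

    def epsilon_greedy(q, s, actions, epsilon=0.1):
        """Explore with probability epsilon; otherwise exploit the current Q estimates."""
        if random.random() < epsilon:
            return random.choice(actions)
        return max(actions, key=lambda a: q.get((s, a), 0.0))

Note that the backup uses max over next actions regardless of which action is actually taken next; that "off-policy" choice is exactly what distinguishes Q-learning from the SARSA variation mentioned above.
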
Class 27, Tue 5/6
  Pre-reading: SEE 4/24 CLASS SLIDES FOR INSTRUCTIONS!!
  Reading: Turing article; Searle article; summary of Kurzweil's book; Kevin Kelly's critique

Class 28, Thu 5/8: TBA / catch-up / review

Class 29, Tue 5/13: PHASE II TOURNAMENT