Exams and Quizzes
This page will be updated shortly before the midterm and final exams to reflect what we actually covered this semester.
We will have a midterm exam in mid-March and a final at the end of the semester at the regularly scheduled time. We may also have some short online quizzes throughout the semester, intended to help you keep up with the reading. Any quizzes and both exams will be held on Blackboard.
Here are some general notes on the midterm exam.
(1) There may be questions that ask you to write or understand very simple Python code.
(2) The exam will be based on the concepts, techniques, and algorithms discussed in our textbook (AIMA, 4th edition) and in class.
(3) It's important to have read chapters 1-6 in our text. Specifically:
- 4.1 (not 4.2-4.5)
- 5.1-5.5, 5.7 (not 5.6)
- 6.1-6.4 (not 6.5)
This will fill in some of the gaps in our class coverage and discussions and also provide more background knowledge.
(4) You can look at the old midterm exams for 471 and 671 linked from this page. Note that there will be no questions on Lisp, Prolog or any topics related to them (e.g., unification).
(5) Listed below are things you should be prepared to do.
Chapter 1: Artificial Intelligence
- Be familiar with AI's foundations, history, and state of the art, and with both its risks and benefits.
Chapter 2: Intelligent agents
- Understand the basic frameworks and characteristics of environments and agents introduced in chapter 2.
Chapters 3 and 4: Search
- Take a problem description and come up with a way to represent it as a search problem by developing a way to represent the states, actions, and recognize a goal state.
- Be able to analyze a state space to estimate the number of states, the 'branching factor' of its actions, etc.
- Know the properties that characterize a state space (finite, infinite, etc.) and the properties of the algorithms used to find solutions (completeness, soundness, optimality, etc.).
- Understand both uninformed and informed search algorithms, including breadth-first, depth-first, best-first, algorithm A, algorithm A*, iterative deepening, depth-limited, bidirectional search, beam search, uniform-cost search, etc.
- Understand local search algorithms including hill-climbing and its variants, simulated annealing and genetic algorithms.
- Know how to simulate these algorithms.
- Be able to develop heuristic functions and tell which ones are admissible.
- Understand how to tell if one heuristic is more informed than another.
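As a concrete aid for simulating informed search, here is a minimal Python sketch of A*; the toy problem (reach 5 on a number line with +1/+2 moves), the heuristic, and all function names are illustrative choices, not from the text:

```python
import heapq

def astar(start, goal_test, successors, h):
    """A* search: expand nodes in order of f(n) = g(n) + h(n).
    successors(state) yields (action, next_state, step_cost) triples."""
    frontier = [(h(start), 0, start, [])]   # (f, g, state, path)
    best_g = {}
    while frontier:
        f, g, state, path = heapq.heappop(frontier)
        if goal_test(state):
            return path, g
        if state in best_g and best_g[state] <= g:
            continue                        # already reached more cheaply
        best_g[state] = g
        for action, nxt, cost in successors(state):
            heapq.heappush(frontier,
                           (g + cost + h(nxt), g + cost, nxt, path + [action]))
    return None, float("inf")

# Toy problem: walk from 0 to 5 with moves +1 and +2, each costing 1.
# The heuristic is admissible: each step advances at most 2, so at
# least (5 - s) / 2 more unit-cost steps are always needed.
path, cost = astar(
    0,
    lambda s: s == 5,
    lambda s: [("+1", s + 1, 1), ("+2", s + 2, 1)],
    lambda s: max(0, (5 - s) / 2),
)
```

Setting h to zero everywhere turns this into uniform-cost search, which is a handy cross-check when simulating the algorithms by hand.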
Chapter 6: Constraint Satisfaction Problems
- Understand the basics of CSP, including variables, domains, constraints.
- Be able to take a problem description and set it up as a CSP. For example, identify a reasonable set of variables, indicate the domain of each, and describe the constraints that must hold on variables or sets of variables.
- Understand the forward checking and AC-3 algorithms and be able to simulate them.
- Understand the min-conflicts algorithm and be able to simulate it.
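For practicing constraint propagation, here is a minimal Python sketch of arc consistency (AC-3) on a toy two-variable CSP; the representation (a domain set per variable plus a predicate per directed arc) and all names are illustrative:

```python
from collections import deque

def ac3(domains, constraints):
    """AC-3 arc consistency. domains: {var: set of values};
    constraints: {(x, y): predicate(vx, vy)} for each directed arc.
    Returns False if some domain is wiped out (no solution possible)."""
    queue = deque(constraints)
    while queue:
        x, y = queue.popleft()
        revised = False
        for vx in set(domains[x]):          # revise x's domain against y
            if not any(constraints[(x, y)](vx, vy) for vy in domains[y]):
                domains[x].discard(vx)      # vx has no support in y
                revised = True
        if revised:
            if not domains[x]:
                return False
            for (a, b) in constraints:      # recheck arcs pointing at x
                if b == x and a != y:
                    queue.append((a, b))
    return True

# Toy CSP: X < Y, both with domain {1, 2, 3}.
doms = {"X": {1, 2, 3}, "Y": {1, 2, 3}}
cons = {("X", "Y"): lambda vx, vy: vx < vy,
        ("Y", "X"): lambda vy, vx: vx < vy}
ac3(doms, cons)        # prunes X to {1, 2} and Y to {2, 3}
```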
Chapter 5: Adversarial search and Game theory
- Understand the basic characteristics of games
- Understand and be able to simulate Minimax with and without alpha-beta given a game tree and the values of the static evaluation function on the leaves.
- Be able to take a game and develop a representation for it and its moves and to describe a reasonable static evaluation function.
- Understand how to handle games with uncertainty.
- Be familiar with the basic concepts of game theory -- strategies, payoffs, Nash equilibrium, prisoner's dilemma, dominant strategies, etc. and how and when they come into play.
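The minimax simulations asked for above can be checked against a short Python sketch of minimax with alpha-beta pruning; the game tree is encoded as nested lists with static evaluation values at the leaves (an illustrative representation, not from the text):

```python
def alphabeta(node, maximizing, alpha=float("-inf"), beta=float("inf")):
    """Minimax with alpha-beta pruning. A leaf is a number (the value of
    the static evaluation function); an internal node is a list of children."""
    if isinstance(node, (int, float)):
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, False, alpha, beta))
            alpha = max(alpha, value)
            if alpha >= beta:
                break          # beta cutoff: MIN will never allow this line
        return value
    value = float("inf")
    for child in node:
        value = min(value, alphabeta(child, True, alpha, beta))
        beta = min(beta, value)
        if alpha >= beta:
            break              # alpha cutoff
    return value

# MAX to move at the root; the value here is 3, and pruning skips the
# remaining leaves of the middle subtree once its 2 is seen.
tree = [[3, 12, 8], [2, 4, 6], [14, 5, 2]]
```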
Some old midterms
Here are some old 471 midterm exams and a few from 671, the graduate version of 471.
The spring 2021 final exam will be given online via Blackboard starting at 3:30 pm on Tuesday, May 18
You will have up to two hours to complete the exam once you have started it and you must finish by 6:00 pm. If you start at 3:45, for example, you must finish by 5:45. I will be monitoring the final channel on our class Discord server and my email (firstname.lastname@example.org) if you have any questions during the exam. You are not allowed to collaborate with other students while taking the exam.
The final will be comprehensive with more emphasis on material since the midterm exam. Review the slides we showed in class, the homework assignments, and the old exams I've given in previous semesters.
Chapters 7 and 8: Logical Agents 7.1-7.7; 8.1-8.3; 9.1; notes
- Understand how an agent can use logic to model its environment, make decisions and achieve goals, e.g., a player in the Wumpus World
- Understand the syntax and semantics of propositional logic
- Understand the concept of a model for a set of propositional sentences
- Understand the concept of a valid sentence (tautology) and an inconsistent sentence
- Know how to find all models of a set of propositions
- Understand the concepts of soundness and completeness for a logic reasoner
- Understand the resolution inference rule and how a resolution theorem prover works
- Know what a Horn clause is in propositional logic, how to determine whether a propositional sentence is a Horn clause, and why Horn clauses are significant
- Know how to convert a set of propositional sentences to conjunctive normal form and then use resolution to try to prove that an additional propositional sentence is true
- Understand the limitation of propositional logic as a representation and reasoning tool
- Know what it means for a proposition to be satisfiable
- Understand first order logic (FOL), its notation(s), and quantifiers
- Understand how FOL differs from higher-order logics
- Be able to represent the meaning of an English sentence in FOL and to paraphrase a FOL sentence in English
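Several of the propositional-logic skills above (finding models, checking entailment) can be practiced with a brute-force truth-table enumerator; sentences here are encoded as Python functions over an assignment dict, which is an illustrative encoding, not the book's notation:

```python
from itertools import product

def models(symbols, sentence):
    """All models (truth assignments) that satisfy `sentence`."""
    result = []
    for vals in product([True, False], repeat=len(symbols)):
        m = dict(zip(symbols, vals))
        if sentence(m):
            result.append(m)
    return result

def entails(symbols, kb, query):
    """KB |= query iff query is true in every model of KB."""
    return all(query(m) for m in models(symbols, kb))

# KB: (P -> Q) and P;  query: Q.  Modus ponens: the entailment holds,
# and the KB has exactly one model (P and Q both true).
kb = lambda m: (not m["P"] or m["Q"]) and m["P"]
holds = entails(["P", "Q"], kb, lambda m: m["Q"])
```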
Chapter 11: Planning 11.1-11.2; notes
- Understand the blocks world domain
- Understand classical STRIPS planning
- Be familiar with the PDDL planning representation
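A minimal way to see classical STRIPS mechanics is to represent a state as a set of ground facts and an action as precondition/add/delete sets; the blocks-world action below is a hand-rolled illustration in Python, not PDDL syntax:

```python
def applicable(state, action):
    """A STRIPS action is applicable when its preconditions hold."""
    return action["pre"] <= state           # subset test

def apply_action(state, action):
    """Successor state: remove the delete list, then add the add list."""
    return (state - action["del"]) | action["add"]

# Blocks world: A is on B; move A to the table.
state = {("on", "A", "B"), ("clear", "A"), ("ontable", "B")}
move_A_to_table = {
    "pre": {("on", "A", "B"), ("clear", "A")},
    "add": {("ontable", "A"), ("clear", "B")},
    "del": {("on", "A", "B")},
}
new_state = apply_action(state, move_A_to_table)
# Afterwards A is on the table and B is clear.
```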
Chapter 12: Bayesian reasoning 12.1-12.7; notes
- Basic probability notation
- Full joint probabilities
- Independence and conditional probabilities
- Bayes rule and how it can be used
- Simple naive Bayesian models
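Bayes rule computations of the kind above reduce to a few lines; the disease-test numbers below are made up for illustration:

```python
def posterior(prior, likelihood, false_alarm):
    """Bayes rule: P(H | E) = P(E | H) P(H) / P(E), where the evidence
    probability P(E) sums over the H and not-H cases."""
    p_e = likelihood * prior + false_alarm * (1 - prior)
    return likelihood * prior / p_e

# A disease with a 1% base rate and a test with 90% sensitivity and a
# 5% false-positive rate: a positive result still gives only about a
# 15% chance of disease -- the classic base-rate effect.
p = posterior(prior=0.01, likelihood=0.9, false_alarm=0.05)
```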
Chapter 13: Bayesian reasoning 13.1; notes
- Bayesian belief models
- Bayesian networks
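A Bayesian network factors the full joint distribution into per-node conditional probabilities; a two-node sketch (Rain -> WetGrass, with made-up numbers) shows the idea:

```python
def two_node_joint(p_rain, p_wet_given_rain):
    """Joint for the network Rain -> WetGrass: P(r, w) = P(r) * P(w | r)."""
    return {(r, w): (p_rain if r else 1 - p_rain)
                    * (p_wet_given_rain[r] if w else 1 - p_wet_given_rain[r])
            for r in (True, False) for w in (True, False)}

# P(Rain) = 0.2; P(Wet | Rain) = 0.9; P(Wet | not Rain) = 0.1
j = two_node_joint(0.2, {True: 0.9, False: 0.1})
p_wet = j[(True, True)] + j[(False, True)]   # marginalize out Rain: 0.26
```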
Chapter 19: Learning from Examples 19.1-4, 19.6, 19.8-9; notes
- supervised vs. unsupervised ML
- Tasks: regression and classification
- Regression: linear and logistic
- Decision trees
  - entropy and information gain
  - ID3 algorithm using information gain
  - advantages and disadvantages of decision trees
- Support Vector Machines (SVMs)
  - linear separability of data
  - use of kernels
  - margin and support vectors
  - soft margins for allowing non-separable data
- ML methodology
  - separate training, development (validation), and test data
  - k-fold cross validation
  - metrics: precision, recall, accuracy, F1
  - learning curves
  - confusion matrices
- ML ensembling
  - bagging and its variations
  - random forests of decision trees
- Unsupervised ML
  - clustering data
  - k-means clustering
  - hierarchical clustering
    - bottom-up agglomerative vs. top-down divisive
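Entropy and information gain, the core of ID3, fit in a few lines of Python; the tiny 4-yes/4-no dataset is invented for illustration:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """H = -sum p_i log2 p_i over the class frequencies in `labels`."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, children):
    """Gain = H(parent) - weighted average entropy of the child subsets
    produced by splitting on some attribute."""
    n = len(labels)
    return entropy(labels) - sum(len(s) / n * entropy(s) for s in children)

# A 4 yes / 4 no parent (H = 1 bit) split perfectly by an attribute
# into two pure children yields the maximum possible gain of 1 bit.
labels = ["yes"] * 4 + ["no"] * 4
gain = information_gain(labels, [["yes"] * 4, ["no"] * 4])
```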
Chapter 21: Neural Networks 21.1-6; notes
- Basic elements: nodes (inputs, hidden layers, outputs), connections, weights, activation function
- Types of neural networks and their advantages/disadvantages/purposes
  - basic perceptron (single layer, step activation function)
  - MLP: multi-layer perceptron
  - feed-forward networks
  - RNN: recurrent neural network
  - CNN: convolutional neural network
- Training process
  - loss function
  - activation functions (step, ReLU, sigmoid, tanh)
  - batches and epochs
- Awareness of tools
- Advantages and disadvantages of neural networks for supervised machine learning compared to other methods (e.g., decision trees, SVMs)
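The basic single-layer perceptron with a step activation can be trained with the classic error-correction rule; learning the AND function below is a standard illustration (the names and learning rate are my choices):

```python
def train_perceptron(data, epochs=10, lr=1.0):
    """Single-layer perceptron with a step activation. data is a list of
    (inputs, target) pairs with 0/1 targets; returns weights and bias."""
    n = len(data[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, target in data:
            out = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = target - out                    # error-correction rule
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# AND is linearly separable, so the perceptron converges on it;
# XOR, famously, is not, and a single layer cannot learn it.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
```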
Here are old exams that you can use as examples of what to expect. The content has varied over the years, so you should ignore anything that we did not cover this semester.