
UMBC CMSC 471.02 Spring 2023
Introduction to Artificial Intelligence

Exams and Quizzes

This page will be updated shortly before the midterm and final exams to reflect what we actually covered this semester.

We will have a midterm exam in mid-March and a final exam at the end of the semester at the regularly scheduled time.

Quizzes

We may have some short online quizzes throughout the semester, intended to help you keep up with the reading. Any quizzes will be held on Blackboard.

Midterm exam

Here are some general notes on the midterm exam.

(1) There may be questions that ask you to write or understand very simple Python code.

(2) The exam will be based on the concepts, techniques and algorithms discussed in our textbook and in class.

(3) It's important to have read all of chapters 1-6 and sections 18.1-2 in our text. This will fill in some of the gaps in our class coverage and discussions and also provide more background knowledge.

(4) You can look at the old midterm exams for 471 and 671 linked from this page. Note that there will be no questions on Lisp, Prolog, or any topics related to them (e.g., unification).

(5) Listed below are the things you should be prepared to do.

Chapter 2: Intelligent agents

  • Understand the basic frameworks and characteristics of environments and agents introduced in chapter 2.

Chapters 3 and 4: Search

  • Take a problem description and formulate it as a search problem: develop a representation for the states and actions and a way to recognize a goal state.
  • Be able to analyze a state space to estimate the number of states, the 'branching factor' of its actions, etc.
  • Know the properties that characterize a state space (finite, infinite, etc.) and those that characterize the algorithms used to find solutions (completeness, soundness, optimality, etc.).
  • Understand both uninformed and informed search algorithms, including breadth-first, depth-first, best-first, algorithm A, algorithm A*, iterative deepening, depth-limited, bi-directional search, beam search, uniform-cost search, etc. (a short A* sketch follows this list).
  • Understand local search algorithms including hill-climbing and its variants, simulated annealing, and genetic algorithms.
  • Know how to simulate these algorithms.
  • Be able to develop heuristic functions and tell which ones are admissible.
  • Understand how to tell if one heuristic is more informed than another.
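
To make the informed-search items concrete, here is a minimal sketch of algorithm A* on an explicit graph. The function name, the toy graph, its step costs, and the heuristic values are all invented for illustration; they are not from the text or the slides.

    import heapq

    def astar(start, goal, neighbors, h):
        """A* search. neighbors(n) yields (successor, step_cost) pairs;
        h(n) estimates the remaining cost from n to the goal."""
        frontier = [(h(start), 0, start, [start])]    # entries are (f, g, node, path)
        best_g = {start: 0}
        while frontier:
            f, g, node, path = heapq.heappop(frontier)
            if node == goal:
                return path, g
            for succ, cost in neighbors(node):
                g2 = g + cost
                if g2 < best_g.get(succ, float('inf')):  # found a cheaper route to succ
                    best_g[succ] = g2
                    heapq.heappush(frontier, (g2 + h(succ), g2, succ, path + [succ]))
        return None, float('inf')

    # Toy graph and admissible heuristic, invented for illustration.
    graph = {'S': [('A', 1), ('B', 4)], 'A': [('G', 5)], 'B': [('G', 1)], 'G': []}
    hvals = {'S': 3, 'A': 4, 'B': 1, 'G': 0}
    print(astar('S', 'G', lambda n: graph[n], lambda n: hvals[n]))
    # (['S', 'B', 'G'], 5)

Setting h(n) = 0 everywhere turns this into uniform-cost search, which is one way to see the relationship between the two algorithms.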

Chapter 6: Constraint Satisfaction Problems

  • Understand the basics of CSP, including variables, domains, constraints.
  • Be able to take a problem description and set it up as a CSP: for example, identify a reasonable set of variables, give the domain of each, and describe the constraints that must hold on individual variables or sets of variables.
  • Understand the forward checking and AC-3 algorithms and be able to simulate them (a short AC-3 sketch follows this list).
  • Understand the min-conflicts algorithm and be able to simulate it.
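
Here is a minimal sketch of AC-3, assuming the CSP is given as a dict of variable domains plus a predicate for each directed arc; the function name, this encoding, and the toy X < Y constraint are invented for illustration.

    from collections import deque

    def ac3(domains, constraints):
        """AC-3 arc consistency. domains: variable -> set of values;
        constraints: (X, Y) -> predicate ok(x, y), one per directed arc."""
        queue = deque(constraints)                    # start with every arc
        while queue:
            X, Y = queue.popleft()
            ok = constraints[(X, Y)]
            # Revise X: drop values with no supporting value in Y's domain.
            revised = {x for x in domains[X] if not any(ok(x, y) for y in domains[Y])}
            if revised:
                domains[X] -= revised
                if not domains[X]:
                    return False                      # emptied a domain: inconsistent
                queue.extend((Z, W) for (Z, W) in constraints if W == X and Z != Y)
        return True

    # Toy CSP, invented for illustration: X < Y over small integer domains.
    doms = {'X': {1, 2, 3}, 'Y': {1, 2, 3}}
    cons = {('X', 'Y'): lambda x, y: x < y, ('Y', 'X'): lambda y, x: x < y}
    print(ac3(doms, cons), doms)                      # True {'X': {1, 2}, 'Y': {2, 3}}

Forward checking can be viewed as the restricted case where, after each assignment, only the arcs from unassigned neighbors to the just-assigned variable are revised, with no further propagation.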

Chapter 5: Adversarial search; Chapter 17.6: Game theory

  • Understand the basic characteristics of games
  • Understand and be able to simulate Minimax with and without alpha-beta pruning, given a game tree and the values of the static evaluation function on the leaves (a short alpha-beta sketch follows this list).
  • Be able to take a game and develop a representation for it and its moves and to describe a reasonable static evaluation function.
  • Understand how to handle games with uncertainty.
  • Be familiar with the basic concepts of game theory -- strategies, payoffs, Nash equilibrium, prisoner's dilemma, dominant strategies, etc. and how and when they come into play.
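
Here is a minimal sketch of minimax with alpha-beta pruning on an explicit game tree; the encoding (leaves as numbers, internal nodes as lists) and the leaf values, which mirror a standard textbook example, are invented here for illustration.

    def alphabeta(node, maximizing, alpha=float('-inf'), beta=float('inf')):
        """Minimax with alpha-beta pruning. A leaf is a number (its static
        evaluation); an internal node is a list of child nodes."""
        if isinstance(node, (int, float)):
            return node
        if maximizing:
            best = float('-inf')
            for child in node:
                best = max(best, alphabeta(child, False, alpha, beta))
                alpha = max(alpha, best)
                if alpha >= beta:
                    break                  # beta cutoff: MIN would never allow this node
            return best
        else:
            best = float('inf')
            for child in node:
                best = min(best, alphabeta(child, True, alpha, beta))
                beta = min(beta, best)
                if alpha >= beta:
                    break                  # alpha cutoff: MAX would never allow this node
            return best

    # MAX to move at the root, MIN at the next level down.
    tree = [[3, 12, 8], [2, 4, 6], [14, 5, 2]]
    print(alphabeta(tree, True))           # 3

Removing the two cutoff checks turns this back into plain minimax, which is a handy way to verify a hand simulation.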

Some old midterms

Here are some old 471 midterm exams and a few from 671, the graduate version of 471.

Final exam

The Spring 2023 final exam will be given at its scheduled time: TBD

The final will be comprehensive with more emphasis on material since the midterm exam. Review the slides we showed in class, the homework assignments, and the old exams I've given in previous semesters.

Chapters 7 and 8: Logical Agents 7.1-7.7; 8.1-8.3; 9.1; notes

  • Understand how an agent can use logic to model its environment, make decisions and achieve goals, e.g. a player in the Wumpus World
  • Understand the syntax and semantics of propositional logic
  • Understand the concept of a model for a set of propositional sentences
  • Understand the concept of a valid sentence (tautology) and an inconsistent sentence
  • Know how to find all models of a set of propositional sentences (a model-enumeration sketch follows this list)
  • Understand the concepts of soundness and completeness for a logic reasoner
  • Understand the resolution inference rule and how a resolution theorem prover works
  • Know what a Horn clause is in propositional logic, how to determine whether a propositional sentence is a Horn clause, and why Horn clauses are significant
  • Know how to convert a set of propositional sentences to conjunctive normal form and then use resolution to try to prove that an additional propositional sentence follows
  • Understand the limitations of propositional logic as a representation and reasoning tool
  • Know what it means for a proposition to be satisfiable
  • Understand first-order logic (FOL), its notation(s) and quantifiers
  • Understand how FOL differs from higher-order logics
  • Be able to represent the meaning of an English sentence in FOL and to paraphrase a FOL sentence in English
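
Since several of the items above reduce to enumerating models, here is a minimal truth-table sketch of entailment by model checking. Encoding sentences as Python predicates over an assignment dict is my own device for illustration, not a standard library interface, and the KB and query are made up.

    import itertools

    def models(sentence, symbols):
        """Yield every assignment over symbols that satisfies sentence,
        where sentence is a function from an assignment dict to bool."""
        for values in itertools.product([True, False], repeat=len(symbols)):
            m = dict(zip(symbols, values))
            if sentence(m):
                yield m

    # Toy KB: (P or Q) and (not P or R). Query: Q or R.
    kb = lambda m: (m['P'] or m['Q']) and (not m['P'] or m['R'])
    query = lambda m: m['Q'] or m['R']

    # The KB entails the query iff the query holds in every model of the KB.
    print(all(query(m) for m in models(kb, ['P', 'Q', 'R'])))   # True

This check is exponential in the number of symbols, which is one motivation for the resolution and Horn-clause methods listed above.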

Chapter 10: Knowledge Representation 10.1, 10.2, notes

  • Ontologies and data
  • Semantic Web technology
  • Knowledge graph technology
  • Reasoners, rules and query languages

Chapter 11: Planning 11.1-11.3; notes

  • Understand the blocks world domain
  • Understand classical STRIPS planning (a small STRIPS sketch follows this list)
  • Understand algorithms for state-space search
  • Understand the partial-order planning approach and its advantages
  • Be familiar with the PDDL planning representation
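
Here is a minimal sketch of a STRIPS-style action and state progression in the blocks world. Representing ground facts as strings in a Python set is an illustrative encoding of the idea, not PDDL and not any particular planner's format.

    def progress(state, action):
        """Apply a ground STRIPS action: check preconditions, then
        remove the delete list and add the add list."""
        assert action['precond'] <= state, 'preconditions not satisfied'
        return (state - action['delete']) | action['add']

    # One ground instance of Move(b, x, y): move block A from B onto C.
    move_A_B_C = {
        'precond': {'On(A,B)', 'Clear(A)', 'Clear(C)'},
        'add':     {'On(A,C)', 'Clear(B)'},
        'delete':  {'On(A,B)', 'Clear(C)'},
    }
    state = {'On(A,B)', 'On(B,Table)', 'On(C,Table)', 'Clear(A)', 'Clear(C)'}
    print(progress(state, move_A_B_C))
    # {'On(B,Table)', 'On(C,Table)', 'Clear(A)', 'On(A,C)', 'Clear(B)'}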

Chapter 12: Bayesian reasoning 12.1-12.7; notes

  • Basic probability notation
  • Full joint probabilities
  • Independence and conditional probability
  • Bayes' rule and simple naive Bayesian models (a short worked example follows this list)
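
Here is a short worked example of Bayes' rule; the numbers are made up for illustration.

    # A diagnostic test: P(disease) = 0.01, P(pos | disease) = 0.9,
    # P(pos | no disease) = 0.05.
    p_d, p_pos_d, p_pos_nd = 0.01, 0.9, 0.05

    # Total probability of a positive result, then Bayes' rule.
    p_pos = p_pos_d * p_d + p_pos_nd * (1 - p_d)
    p_d_pos = p_pos_d * p_d / p_pos
    print(round(p_d_pos, 3))   # 0.154: a positive test is still mostly a false alarm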

Chapter 13: Bayesian reasoning 13.1, 13.2; notes

  • Bayesian networks

Chapter 19: Learning from Examples 19.1-4, 19.6, 19.8-9; notes

  • supervised vs. unsupervised ML
  • Tasks: regression and classification
  • Regression: linear and logistic
  • Decision trees
    • entropy and information gain (a short sketch follows this list)
    • ID3 algorithm using information gain
    • pruning
    • advantages and disadvantages of decision trees
  • Support Vector Machine (SVM)
    • linear separability of data
    • use of kernels
    • margin and support vectors
    • soft margin for allowing non-separable data
  • Tools
    • numpy array basics
  • ML methodology
    • Separate training, development/validation, and test data
    • k-fold cross validation
    • Metrics: precision, recall, accuracy, F1
    • Learning curve
    • Confusion matrix
  • ML ensembling
    • bagging, various ways
    • random forest of decision trees
    • Advantages
  • Unsupervised ML
    • Clustering data
    • k-means clustering
    • hierarchical clustering
      • dendrogram
      • bottom-up agglomerative vs. top-down divisive
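
Here is a minimal sketch of entropy and information gain, the quantities ID3 uses to choose split attributes; the function names and the four-row dataset are made up for illustration.

    import math
    from collections import Counter

    def entropy(labels):
        """Entropy in bits of a list of class labels."""
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    def information_gain(rows, labels, attr):
        """Expected entropy reduction from splitting on attribute attr.
        rows is a list of dicts; labels the matching class labels."""
        remainder = 0.0
        for value in set(r[attr] for r in rows):
            subset = [l for r, l in zip(rows, labels) if r[attr] == value]
            remainder += len(subset) / len(labels) * entropy(subset)
        return entropy(labels) - remainder

    rows = [{'outlook': 'sunny'}, {'outlook': 'sunny'},
            {'outlook': 'rain'},  {'outlook': 'rain'}]
    labels = ['no', 'no', 'yes', 'yes']
    print(entropy(labels), information_gain(rows, labels, 'outlook'))   # 1.0 1.0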

Chapter 21: Neural Networks 21.1-8; notes

  • Basic elements: nodes (inputs, hidden layers, outputs), connections, weights, activation function
  • Types of neural networks and their advantages/disadvantages/purpose
    • Basic perceptron (single layer, step activation function; a tiny training sketch follows this list)
    • MLP: Multi-layer perceptron
    • Feed Forward networks
    • RNN: recurrent neural network
    • CNN: convolutional neural network
  • Training process
    • Loss function
    • Backpropagation
    • Activation functions (step, ReLU, sigmoid, tanh)
    • batches and epochs
    • dropout
  • Awareness of tools
    • TensorFlow
    • Keras
  • Advantages and disadvantages of neural networks for supervised machine learning compared to other methods (e.g., decision trees, SVMs)
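
Here is a tiny sketch of a single-layer perceptron with a step activation, trained with the classic perceptron update rule; the AND dataset is a standard linearly separable toy problem, and the function name and hyperparameters are my own choices.

    import numpy as np

    def train_perceptron(X, y, epochs=20, lr=0.1):
        """Single-layer perceptron: step activation, mistake-driven updates.
        X: (n, d) array of inputs; y: array of 0/1 labels."""
        w, b = np.zeros(X.shape[1]), 0.0
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                pred = 1 if xi @ w + b > 0 else 0     # step activation
                w += lr * (yi - pred) * xi            # updates only on mistakes
                b += lr * (yi - pred)
        return w, b

    # Learn logical AND, which is linearly separable.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0, 0, 0, 1])
    w, b = train_perceptron(X, y)
    print([1 if x @ w + b > 0 else 0 for x in X])     # [0, 0, 0, 1]

A single-layer perceptron cannot learn XOR, which is not linearly separable; that limitation is one motivation for the multi-layer networks listed above.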

Here are old exams that you can use as examples of what to expect. The content has varied over the years, so you should ignore anything that we did not cover this semester.