CMSC 471

Artificial Intelligence -- Fall 2014

HOMEWORK TWO

Out 4/21, due 5/4




The group homework policy from the syllabus applies to this assignment.

I. Learning in the wild

Russell & Norvig, Exercise 18.1 (page 763). Consider the problem faced by an infant learning to speak and understand a language. Explain how this problem fits into the general learning model. Describe the percepts and actions of the infant, and the types of learning the infant must do. Describe the subfunctions the infant is trying to learn in terms of inputs and outputs, and available example data.

II. Decision tree learning

The following table gives a data set for deciding whether to play or cancel a ball game, depending on the weather conditions.
 
Outlook   Temp (F)  Humidity (%)  Windy?  Class
sunny     75        70            true    Play
sunny     80        90            true    Don't Play
sunny     85        85            false   Don't Play
sunny     72        95            false   Don't Play
sunny     69        70            false   Play
overcast  72        90            true    Play
overcast  83        78            false   Play
overcast  64        65            true    Play
overcast  81        75            false   Play
rain      71        80            true    Don't Play
rain      65        70            true    Don't Play
rain      75        80            false   Play
rain      68        80            false   Play
rain      70        96            false   Play
  1. Decision Tree (10 pts.) Suppose you build a decision tree that splits on the Outlook attribute at the root node. How many child nodes are there at the first level of the decision tree? Which branches require a further split in order to create leaf nodes whose instances all belong to a single class? For each of these branches, which attribute can you split on to complete the tree-building process at the next level (i.e., so that at level 2, there are only leaf nodes)? Draw the resulting decision tree, showing the decisions (class predictions) at the leaves.
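To sanity-check your answer to the first part of the question, the class distribution within each Outlook branch can be tallied with a short script. This is an optional sketch using only the Python standard library; the tuples simply transcribe the table above. A branch whose tally contains a single class becomes a leaf immediately; any other branch needs a further split.

```python
from collections import Counter

# The weather data set from the table above: (outlook, temp, humidity, windy, class)
data = [
    ("sunny",    75, 70, True,  "Play"),
    ("sunny",    80, 90, True,  "Don't Play"),
    ("sunny",    85, 85, False, "Don't Play"),
    ("sunny",    72, 95, False, "Don't Play"),
    ("sunny",    69, 70, False, "Play"),
    ("overcast", 72, 90, True,  "Play"),
    ("overcast", 83, 78, False, "Play"),
    ("overcast", 64, 65, True,  "Play"),
    ("overcast", 81, 75, False, "Play"),
    ("rain",     71, 80, True,  "Don't Play"),
    ("rain",     65, 70, True,  "Don't Play"),
    ("rain",     75, 80, False, "Play"),
    ("rain",     68, 80, False, "Play"),
    ("rain",     70, 96, False, "Play"),
]

# Tally the class labels that fall into each Outlook branch.
branches = {}
for outlook, temp, humidity, windy, cls in data:
    branches.setdefault(outlook, Counter())[cls] += 1

for outlook, counts in branches.items():
    pure = len(counts) == 1
    print(outlook, dict(counts), "pure leaf" if pure else "needs further split")
```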

III. Learning Bayes nets

Consider an arbitrary Bayesian network, a complete data set for that network, and the likelihood for the data set according to the network. Give a clear explanation (i.e., an informal proof, in words) of why the likelihood of the data cannot decrease if we add a new link to the network and recompute the maximum-likelihood parameter values.
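Before writing your explanation, you may find it helpful to see the claim verified numerically on a toy case. The sketch below (illustrative only; the two-variable data set is made up) compares the maximum-likelihood log-likelihood of the empty network over binary variables A and B against the network with the single link A -> B, where ML parameters are just empirical frequencies:

```python
import math
from collections import Counter

# Hypothetical complete data set over two binary variables (A, B).
data = [(0, 0), (0, 0), (0, 1), (1, 0), (1, 1), (1, 1), (1, 1), (0, 0)]
n = len(data)

def loglik_independent(data):
    # Network with no link: P(A, B) = P(A) P(B); ML parameters are the
    # marginal frequencies of A and of B.
    ca = Counter(a for a, b in data)
    cb = Counter(b for a, b in data)
    return sum(math.log(ca[a] / n) + math.log(cb[b] / n) for a, b in data)

def loglik_linked(data):
    # Network with the link A -> B: P(A, B) = P(A) P(B | A); ML parameters
    # are the marginal frequency of A and the conditional frequency of B
    # given each value of A.
    ca = Counter(a for a, b in data)
    cab = Counter(data)
    return sum(math.log(ca[a] / n) + math.log(cab[(a, b)] / ca[a])
               for a, b in data)

print("no link:", loglik_independent(data))
print("A -> B: ", loglik_linked(data))
```

The linked network can always reproduce the unlinked network's distribution by ignoring the extra parent, so its ML log-likelihood is never smaller; the code merely illustrates this on one data set.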

IV. Knowledge-Based Agents

(Adapted from Russell & Norvig 2nd edition, Exercise 7.1.) Describe the Wumpus world according to the properties of task environments listed in Chapter 2 (i.e., the seven characteristics described in Section 2.3.2). Your answer should include a brief (single sentence or phrase) justification for each of the seven characterizations.

How would your answer change in a world in which the wumpus could move according to fixed rules (i.e., rules that are known to the agent)? How would your answer change for a world in which the wumpus moved using an unknown mechanism? Note: Use the description of the Wumpus world from the book (not the online variations that we saw in class).

V. Logic

(a) Russell & Norvig Exercise 7.7, page 281 (15 pts).

(b) Russell & Norvig Exercise 7.8 (b,c), page 281 (15 pts).

(c) Russell & Norvig Exercise 7.22 (a), page 284 (10 pts).

(d) Russell & Norvig Exercise 8.28 (c,f,h,k,l), pages 320-321 (15 pts).

VI. Resolution Theorem Proving


(a) Represent the following knowledge base in first-order logic. Use predicates where arguments x have the domain of all people, and arguments t have the domain of all tests.
  1. Everyone who is smart, studies, and attends class will be prepared.
  2. Everyone who is prepared will pass a test if it is fair.
  3. A student passes a test if and only if they don't fail it.
  4. Every UMBC student is smart.
  5. If a test isn't fair, everyone will fail the test.
  6. Aidan is a UMBC student.
  7. Sandy passed the 471 exam.
  8. Aidan attends class.

(b) Convert the KB to conjunctive normal form.

(c) We wish to prove that

study(Aidan) -> pass(Aidan, 471-exam)

Express the negation of this goal in conjunctive normal form.

(d) Add the negated goal to the KB, and use resolution refutation to prove the original goal. You may show your proof as a series of sentences to be added to the KB or as a proof tree.  In either case, you must clearly show which sentences are resolved to produce each new sentence, and what the unifier is for each resolution step.
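Each resolution step requires a most general unifier (MGU) for the two literals being resolved. If you want to double-check your unifiers, here is a minimal unification sketch. The term encoding is my own, not required notation: variables are '?'-prefixed strings, constants are plain strings, and atoms are tuples; the occurs check is omitted for brevity.

```python
def is_var(t):
    # Variables are '?'-prefixed strings, e.g. '?x'; constants are plain
    # strings; atoms/compound terms are tuples like ('pass', '?x', '?t').
    return isinstance(t, str) and t.startswith("?")

def unify(x, y, s):
    """Return a most general unifier extending substitution s, or None."""
    if s is None:
        return None
    if x == y:
        return s
    if is_var(x):
        return unify_var(x, y, s)
    if is_var(y):
        return unify_var(y, x, s)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):
            s = unify(xi, yi, s)
        return s
    return None

def unify_var(v, t, s):
    if v in s:
        return unify(s[v], t, s)
    if is_var(t) and t in s:
        return unify(v, s[t], s)
    # Occurs check omitted for brevity.
    return {**s, v: t}

# Example: the unifier needed to resolve pass(?x, ?t) with pass(Aidan, 471-exam).
mgu = unify(("pass", "?x", "?t"), ("pass", "Aidan", "471-exam"), {})
print(mgu)
```

Unification fails (returns None) when the predicate symbols differ or a variable would have to take two incompatible values, which is exactly when a resolution step is not allowed.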