CS 188, Fall 2005, Introduction to Artificial Intelligence
Assignment 5 Part 1, due 11/14, total value 3.2% of grade

This assignment should be done in pairs. Don't leave it to the last minute! Consult early and often with me and/or your TA.

This assignment comes in two parts. The first part is worth 40 points out of 100 and is mainly intended to help you become familiar with the AIMA Bayes net software and the basics of Bayes net representation and inference. The second part, to be posted shortly, is worth 60 points and deals with DBNs.

The first thing you need to do is load the CS188 AIMA code in the usual way: load aima.lisp and then do (aima-load 'probability).

Be sure to use the latest version from ~cs188. Several things have changed and new code has been added. As always, remember to compile the code.

The AIMA code includes a general facility for defining and using Bayes nets. One file defines the Bayes net data structures and all the basic operations on them (including I/O); there are also files for exact and approximate inference, and for basic functions dealing with distributions, random numbers, and so on.

One useful point to note is that discrete probability distributions come in two forms: the "standard" form is just a vector of probabilities, and the "interface" form is an association list, each element of which is a value-probability pair. Except in the interface functions for the user, the values of all discrete random variables are integers starting at 0; e.g., if a variable has values "false" and "true", these are represented internally as 0 and 1.
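To make the two forms concrete, here is a hypothetical helper (not part of the AIMA code) that converts a "standard" vector, indexed by the integer values 0, 1, ..., into the "interface" association-list form, given the value names:

```lisp
;; Hypothetical helper, NOT part of the AIMA code: convert a "standard"
;; distribution (a vector of probabilities, indexed by the integer
;; values 0, 1, ...) into the "interface" form, an alist of
;; value-probability pairs.
(defun standard->interface (probs value-names)
  (loop for pr across probs
        for name in value-names
        collect (cons name pr)))

(standard->interface #(0.7 0.3) '(false true))
;; => ((FALSE . 0.7) (TRUE . 0.3))
```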

There are some ready-made Bayes nets in the probability/domains directory. For example, try

>> (setq burglary-net (load-bn "probability/domains/burglary.bn"))
...
                          :CHILDREN # :ARITY 2 :VALUE-NAMES #
                          :CPT #0A #)

The raw data structure for a Bayes net is circular and very ugly to look at. (Because it is circular, make sure *print-circle* is set to t; otherwise printing a Bayes net causes an infinite loop. If this does happen, you can hit control-C a few times to cause a break, then type :res to restart.) You can display a Bayes net readably as follows:

>> (display-bayes-net burglary-net)
...
            | 0.998000 0.002000
...

Another very useful function, especially for debugging, is (bnode-by-name name bn), which returns the actual node with the given name from bn.

Once you have a Bayes net, you can create evidence for it by creating a data structure called an event. Do the following:

>> (setq be1 (create-event burglary-net))
Name of node to set (nil if none)?  JohnCalls
Value to set it to?  true
Name of node to set (nil if none)?  MaryCalls
Value to set it to?  true
Name of node to set (nil if none)?  nil

#(NIL NIL NIL 1 1)

Notice that an event is actually a vector of integers (or NILs) representing the values of all the nodes. (The integers correspond to the values defined for the node, so the 1s here mean "true".) Once you have evidence, you can ask for the probability of a variable given the evidence. The simplest exact inference algorithm is (enumeration-ask-by-name X e bn), which takes a variable name X and an event e and returns a distribution over X:

>> (enumeration-ask-by-name 'Burglary be1 burglary-net)
((FALSE . 0.7158281651489582d0) (TRUE . 0.2841718348510418d0))

Enumeration works by enumerating all entries in the joint distribution. The variable elimination algorithm is usually more efficient:

>> (elimination-ask-by-name 'Burglary be1 burglary-net)
((FALSE . 0.7158281651489582d0) (TRUE . 0.2841718348510418d0))
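To see what enumeration is doing, here is a self-contained toy version for this query, with the burglary network's CPT numbers hard-coded from AIMA. This is a sketch for intuition only, not the course's enumeration-ask; it simply sums the full joint distribution over the hidden variables:

```lisp
;; Toy inference by enumeration for the burglary network; CPT numbers
;; hard-coded from AIMA.  A sketch, independent of the course code.
(defun pr (p value) (if value p (- 1d0 p)))   ; P(var=value) given P(var=true)

(defun burglary-joint (b e a j m)
  (* (pr 0.001d0 b)                           ; P(B)
     (pr 0.002d0 e)                           ; P(E)
     (pr (cond ((and b e) 0.95d0)             ; P(A | B, E)
               (b 0.94d0)
               (e 0.29d0)
               (t 0.001d0))
         a)
     (pr (if a 0.90d0 0.05d0) j)              ; P(J | A)
     (pr (if a 0.70d0 0.01d0) m)))            ; P(M | A)

(defun p-burglary-given (j m)
  "P(Burglary=true | JohnCalls=j, MaryCalls=m), by summing the joint."
  (let ((true-part 0d0) (total 0d0))
    (dolist (b '(t nil) (/ true-part total))
      (dolist (e '(t nil))
        (dolist (a '(t nil))
          (let ((x (burglary-joint b e a j m)))
            (incf total x)
            (when b (incf true-part x))))))))

(p-burglary-given t t)   ; => 0.2841718..., matching the transcript above
```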

The three sampling algorithms that are provided take an extra argument indicating the number of samples to generate (defaults to 1000):

>> (rejection-sampling-ask-by-name 'Burglary be1 burglary-net 100000)
((FALSE . 0.6839622641509434d0) (TRUE . 0.3160377358490566d0))
>> (likelihood-weighting-ask-by-name 'Burglary be1 burglary-net 100000)
((FALSE . 0.6929455965405458d0) (TRUE . 0.3070544034594543d0))
>> (mcmc-ask-by-name 'Burglary be1 burglary-net 100000)
((FALSE . 0.71652d0) (TRUE . 0.28348d0))

Notice that even with 100000 samples, the sampling algorithms are a bit off.

A Bayes net can be constructed using the interactive function create-bayes-net. For example, the following transcript shows the construction of a two-node network; you should try this for practice:

>> (display-bayes-net (setq ab-net (create-bayes-net)))
*****************Creating New Node*******************
What is the node's name (nil if none)?  A
Variable type? One of tabulated, deterministic, noisy-or, linear-gaussian, probit.
Enter list of values  (false true)
*****************Creating New Node*******************
What is the node's name (nil if none)?  B
Variable type? One of tabulated, deterministic, noisy-or, linear-gaussian, probit.
Enter list of values  (false true)
*****************Creating New Node*******************
What is the node's name (nil if none)?  nil
************Creating bayes net arcs*****************
Enter list of parent node names for B  (A)
Enter list of parent node names for A  nil
****Creating distribution for node A of type TABULATED-BNODE****
     Enter probabilities for A = (FALSE TRUE)
     (0.4 0.6)
****Creating distribution for node B of type TABULATED-BNODE****
   Given A = FALSE
       Enter probabilities for B = (FALSE TRUE)
       (0.9 0.1)
   Given A = TRUE
       Enter probabilities for B = (FALSE TRUE)
       (0.1 0.9)

Node: A  Parents: NIL  Type: TABULATED-BNODE
...

The interactive entry process is somewhat error-prone, so you may choose to use some of the functions called by create-bayes-net to create and save intermediate stages in the process. For example, you might create the network structure (with variables and values) first, then create and edit a file of lisp commands to set the CPT entries. In general, it's a good idea to draw out your Bayes net and CPTs on paper first. Once a network has been created, you can save it to a file; for example, do

>> (save-bn ab-net "probability/domains/ab.bn")

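As a quick sanity check on the CPTs entered above, you can compute the marginal on B by hand; a sketch, using exact rationals so the arithmetic comes out cleanly:

```lisp
;; P(B=true) = P(A=false) P(B=true|A=false) + P(A=true) P(B=true|A=true),
;; using the CPT values entered in the transcript above.
(+ (* 4/10 1/10)    ; A = false branch
   (* 6/10 9/10))   ; A = true branch
;; => 29/50, i.e. 0.58
```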
A Bayes net for car diagnosis

Consider the network in Figure 4.18 on p533 of AIMA2e. We will use this as a starting point. Because we will use a noisy-OR node for whether the car starts, we will have to make all the variables into "diseases" and "symptoms", which means "changing the sign" of each variable. Thus, the variables in your network will include RadioDead, BatteryDead, IgnitionFailure, WontStart, WontMove, NoGas.

Question 1 (5 pts). The variables IcyWeather, EngineBlockFrozen, and GasGaugeOnEmpty should also be included. Explain which parents and children they should have. (You may find it helpful at this point to draw out your network, but you need not turn in the drawing.)

Question 2 (10 pts). Now use the function create-bayes-net to create the Bayes net itself. Remember that WontStart should be a noisy-OR node. It is up to you to pick reasonable probabilities and coefficients; avoid determinism! Use the function display-bayes-net to display your network readably. (You may also want to use save-bn to save the network to a file.)

Question 3 (15 pts). The inference algorithms access a Bayes net through two methods:

Write these methods for the noisy-OR bnode type, using the definitions in Ch. 14. (You may find it useful to look at the methods defined for probit bnodes.)
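As a reminder of the Ch. 14 definition (this is a standalone sketch, not the actual AIMA method): each parent j that is "on" independently inhibits the effect with probability q_j, so P(effect=false | parents) is the product of q_j over the active parents. Rationals are used here just to keep the example exact:

```lisp
;; Sketch of the noisy-OR conditional probability from Ch. 14; standalone,
;; NOT the actual AIMA method.  Q-LIST holds the inhibition probability
;; q_j for each parent; PARENT-VALUES is a list of t/nil.
(defun noisy-or-p-true (q-list parent-values)
  (- 1 (reduce #'* (mapcar (lambda (q on) (if on q 1))
                           q-list parent-values))))

(noisy-or-p-true '(3/5 1/2 2/5) '(t t nil))
;; => 7/10: only the first two parents are active, so
;;    P(false) = 3/5 * 1/2 = 3/10 and P(true) = 7/10
```

Note that with no active parents the product is empty (i.e., 1), so the effect is false with certainty; your real CPTs should avoid such determinism where the assignment asks for it.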

Question 4 (10 pts). Now use enumeration-ask-by-name to calculate the probability that the radio is dead, given that the car won't move and the gas gauge is not on empty. Then run each of the three approximate inference algorithms 100 times, with 1000 samples each time, to answer the same query. Calculate the mean squared error (see ms-error in utilities/utilities.lisp) for each algorithm and comment briefly on the results.
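For reference, the mean squared error averages the squared deviation of each run's estimate from the exact answer. A sketch of such a computation (see ms-error in utilities/utilities.lisp for the course's version; the numbers below are made up for illustration):

```lisp
;; Sketch of a mean-squared-error computation, NOT the course's ms-error.
;; EXACT is the enumeration answer; ESTIMATES are the per-run estimates.
(defun mean-squared-error (estimates exact)
  (/ (reduce #'+ (mapcar (lambda (x) (expt (- x exact) 2)) estimates))
     (length estimates)))

(mean-squared-error '(30/100 26/100 29/100) 28/100)
;; => 3/10000 (rationals used here just to keep the example exact)
```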