### CSC 550: Introduction to Artificial Intelligence Fall 2004 HW5: Semantics Nets, Expert Systems, and Neural Nets

1. Consider the following information:

In general, fish can swim and are stung when they touch an anemone.
Dorie is a Blue Tang; Blue Tangs are blue in color.
Nemo and Marlin are clown fish; clown fish are orange with white stripes.
Clown fish are not stung by anemones.
Marlin is Nemo's father.
1. Draw a semantic network representation of the above information.
2. Your representation should allow you to conclude that Dorie is stung when she touches an anemone. Does it? Can Marlin be stung by an anemone? Why?
3. Suppose that Nemo admires Blue Tangs. How would you represent this in your network? Would this allow you to conclude that he admires Dorie?
4. Once you have answered these questions, implement your semantic net as Scheme expressions (similar to the semantic net defined in net.scm). Verify your answers to the above questions using the lookup relation defined in that file.
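The assignment calls for Scheme (see net.scm), but the idea behind a semantic-net lookup with inheritance can be sketched in any language. The triple representation and `lookup` function below are illustrative assumptions, not the code from net.scm: facts are stored as (node, relation, value) triples, and a property not found on a node is inherited by climbing `isa` links.

```python
# Illustrative sketch (not the net.scm code): a semantic net as
# (node, relation, value) triples, with a lookup that climbs
# "isa" links so instances inherit from their classes.

NET = [
    ("fish", "can", "swim"),
    ("fish", "when-touching-anemone", "stung"),
    ("blue-tang", "isa", "fish"),
    ("blue-tang", "color", "blue"),
    ("clown-fish", "isa", "fish"),
    ("clown-fish", "color", "orange-with-white-stripes"),
    ("clown-fish", "when-touching-anemone", "not-stung"),  # overrides fish
    ("dorie", "isa", "blue-tang"),
    ("nemo", "isa", "clown-fish"),
    ("marlin", "isa", "clown-fish"),
    ("marlin", "father-of", "nemo"),
]

def lookup(node, relation):
    """Return the value for (node, relation), inheriting via isa links.
    Checking the node itself first gives more specific facts priority,
    which is why clown fish are not stung even though fish in general are."""
    for (n, r, v) in NET:
        if n == node and r == relation:
            return v
    for (n, r, v) in NET:
        if n == node and r == "isa":
            return lookup(v, relation)
    return None

print(lookup("dorie", "when-touching-anemone"))   # stung (inherited from fish)
print(lookup("marlin", "when-touching-anemone"))  # not-stung (clown-fish overrides)
```

The order of the two loops is the design decision that matters: local facts shadow inherited ones, which is exactly the exception-handling behavior question 2 probes.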

2. In class, we studied a simple expert system, auto.html, which was used to diagnose problems with automobiles. Using this Web-based expert system, answer the following questions:

1. Suppose you have a car that won't run. When you try to start the car, the engine cranks slowly and the headlights dim. Assuming you are willing to pay any amount to fix the car, what would the expert system recommend?
2. When dealing with concepts such as "slow" and "normal", it can often be difficult to commit to an answer with absolute certainty. Suppose you try to start the car and it seems like it cranks slowly, but you are only 90% sure. Assuming the other symptoms are unchanged (lights dim and unlimited cash), will this uncertainty affect the answer you receive from the expert system? Explain.
3. Now that doubt is creeping into your mind, suppose you further doubt whether the headlights really are dimming. Does a 90% certainty that the lights dim change the result (assuming the 90% certainty of slow cranking and unlimited funds still hold)? How about an 80% certainty of dim lights? Explain.
4. Would the order in which the rules are written in the knowledge base affect the answers you might obtain? Explain why or why not. If the system is sensitive to rule ordering, give an example where the same symptoms would yield a different result, simply because the rules have been swapped around. You can view the knowledge base for the expert system in auto.kb.

3. In class, we trained a backpropagation network to predict a person's political affiliation based on survey data. For this assignment, you are to train a similar backpropagation network to recommend potential majors for students. Your network will need to choose from at least three majors (with outputs at 0.0, 0.5, and 1.0). You will need to design a survey with at least five questions that will identify characteristics recommending one major over another. Questions might refer to the person's personality, interests, aptitudes, and/or lifestyle goals.

To train and test your network, you will need to give the survey to at least three people per major. One person per major will be reserved to test the network once it is trained. The rest will be used to train the network, using the applet from http://www.cs.ubc.ca/labs/lci/CIspace/Version4/neural/.

You will need to turn in your survey data, a picture of the trained network, and the classification of each person (both the training examples and test examples).
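Since a single output unit produces a continuous value, classifying a person means snapping the network's output to the nearest of the three target activations (0.0, 0.5, 1.0). The sketch below shows one way to do that; the major names are placeholders, not part of the assignment.

```python
# Sketch of the three-major output encoding described above: each major
# maps to a target activation, and a trained network's continuous output
# is classified to the nearest target. Major names are placeholders.

MAJORS = {0.0: "Biology", 0.5: "Computer Science", 1.0: "English"}

def classify(output):
    """Snap a network output in [0, 1] to the nearest of the three targets."""
    target = min(MAJORS, key=lambda t: abs(t - output))
    return MAJORS[target]

print(classify(0.07))  # Biology
print(classify(0.62))  # Computer Science
print(classify(0.90))  # English
```

This is also a reasonable way to report the classification of your test examples: list the raw network output alongside the snapped target so it is clear how confident each prediction is.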