CSC 550: Introduction to Artificial Intelligence
CSC 650: Advanced Artificial Intelligence
Spring 2003

HW5: Semantic Nets and Expert Systems


  1. Consider the following information:

    The captain of the Enterprise is Jean-Luc Picard (a human).
    By default, life forms throughout the galaxy have organic brains.
    Androids are life forms with artificial (inorganic) brains.
    Lt. Commander Data of the Starship Enterprise is an android.
    1. Draw a semantic network representation of the above information.
    2. Your representation should allow you to conclude that Jean-Luc Picard has an organic brain. Does it? What can you conclude about Data's brain? Why?
    3. Suppose that Captain Picard admires androids. How would you represent this in your network? Would this allow you to conclude that he admires Data?
    4. Once you have answered these questions, implement your semantic net as Scheme expressions (similar to the semantic net defined in net.scm). Verify your answers to the above questions using the lookup relation defined in that file. (A sketch of one possible encoding follows this question.)
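
    For reference, one possible encoding represents the network as a list of (object relation value) triples, with a lookup that climbs isa links so that defaults are inherited unless a more specific node overrides them. This is only a sketch: the triple layout and the helper procedures below are assumptions made for illustration, and your implementation should follow the representation and the lookup relation actually defined in net.scm.

        ;; Semantic net as (object relation value) triples.
        ;; NOTE: illustrative layout only -- follow net.scm's actual
        ;; representation in your solution.
        (define semantic-net
          '((picard    isa        human)
            (picard    captain-of enterprise)
            (human     isa        life-form)
            (life-form has-brain  organic)      ; default for life forms
            (data      isa        android)
            (data      member-of  enterprise)
            (android   isa        life-form)
            (android   has-brain  inorganic)))  ; overrides the default

        ;; Return the first triple matching an object and relation, or #f.
        (define (find-triple object relation net)
          (cond ((null? net) #f)
                ((and (equal? (caar net) object)
                      (equal? (cadar net) relation))
                 (car net))
                (else (find-triple object relation (cdr net)))))

        ;; Look up a property, inheriting along isa links when the object
        ;; has no direct value for the relation.
        (define (lookup object relation net)
          (or (find-triple object relation net)
              (let ((parent (find-triple object 'isa net)))
                (and parent
                     (lookup (caddr parent) relation net)))))

    With this sketch, (lookup 'picard 'has-brain semantic-net) inherits (life-form has-brain organic) through the human node, while (lookup 'data 'has-brain semantic-net) reaches (android has-brain inorganic) before the default is ever consulted. For part 3, a triple such as (picard admires android) records the attitude at the class level; whether that licenses the conclusion that Picard admires Data is exactly what the question asks you to consider.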

  2. In class, we studied a sample expert system knowledge base, auto.kb, which was used to diagnose problems with automobiles. Download a copy of this knowledge base, as well as the inference engine page and applet (auto.html and e2glite.jar). Then, use the complete expert system to answer the following questions:

    1. Suppose you have a car that won't run. When you try to start the car, the engine cranks slowly and the headlights dim. Assuming you are willing to pay any amount to fix the car, what would the expert system recommend?
    2. When dealing with concepts such as "slow" and "normal", it can often be difficult to commit with absolute certainty. Suppose you try to start the car and it seems to crank slowly, but you are only 90% sure. Assuming the other conditions are unchanged (the headlights dim and you are still willing to pay any amount), will this uncertainty affect the answer you receive from the expert system? Explain.
    3. Now that doubt is creeping into your mind, suppose you further doubt whether the headlights really are dimming. Does a 90% certainty that the lights dim change the result (assuming the 90% certainty of slow cranking and unlimited funds still hold)? How about an 80% certainty of dim lights? Explain.
    4. Does the order in which the rules are written in the knowledge base affect the answers you might obtain? Explain why or why not. If the system is sensitive to rule ordering, give an example where the same symptoms yield a different result, simply because the rules have been swapped around.
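
    A note that may help with parts 2 and 3 above: shells in the MYCIN tradition commonly scale a rule's stated confidence by the lowest confidence among its satisfied premises, and they report only those conclusions whose combined confidence reaches a minimum threshold. Under such a scheme, a premise you are 90% sure of, feeding a rule stated with 100% confidence, yields a conclusion held at 90%, while an 80% sure premise yields only 80%. Whether e2gLite combines certainties in exactly this way, and what threshold the knowledge base sets (look for a MINCF line in auto.kb), is something to check against auto.kb and the running system.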

  3. The diagram below shows a collection of rules that can be used to identify animals based on observable characteristics. Some of these rules have already been encoded in the knowledge base animal.kb. Download this file and add additional rules to capture the rest of the information in the diagram (a sketch of the rule style appears after the diagram). To test the system, you will need to create a Web page, animal.html, that includes the inference engine applet with animal.kb as the specified knowledge base parameter. Use the source of auto.html as your template.

    Note: You should strive to make your rules as general as possible, so that rules for additional animals could be easily added.

    [Diagram: Animal knowledge base]
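
    As a starting point, the rules in this kind of knowledge base follow a RULE / If / Then pattern along the lines of the hypothetical fragment below. The rule names, attributes, and values here are invented for illustration only; copy the exact e2gLite syntax (including how prompts, goals, and confidence values are declared) from auto.kb and from the rules already present in animal.kb.

        REM Hypothetical rules -- match the exact syntax used in auto.kb.
        RULE [Animals with hair are mammals]
        If [the animal has hair] = "yes"
        Then [animal class] = "mammal"

        RULE [Identifying a cheetah]
        If [animal class] = "mammal" and
        [the animal eats meat] = "yes" and
        [animal coloring] = "tawny with dark spots"
        Then [animal identity] = "cheetah"

    Keeping intermediate conclusions such as the animal's class in separate rules, rather than repeating the raw observations in every identification rule, is what makes it easy to add further animals later, as the note above suggests.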