Journal of Computing Sciences in Colleges, 26(5), 2011.
Fluctuating computer science enrollments and the realization that many disciplines require basic computing knowledge are leading to an increased interest in non-majors (CS0) computing courses. Currently, there is no single dominant model of CS0 that simultaneously teaches the big ideas underlying computing, provides practical experience with programming, and makes computing relevant to a wide spectrum of students. What has become clear, however, is that meaningful, real-world applications are needed to drive student interest and learning. This paper describes one particular application, the Prisoner's Dilemma, that bridges game theory and evolutionary biology and provides a framework for critical thinking, modeling, and interdisciplinary problem solving within a CS0 course.
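As a concrete illustration of the kind of model such a course might build (a minimal sketch, not code from the paper), an iterated Prisoner's Dilemma can be simulated in a few lines. The strategy names and payoff values below are the standard textbook ones, not taken from the article:

```python
# Minimal iterated Prisoner's Dilemma sketch (illustrative; not the paper's code).
# Standard payoffs: mutual cooperation 3/3, mutual defection 1/1,
# defecting against a cooperator 5 (temptation) vs. 0 (sucker).
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(opponent_history):
    """Cooperate first, then copy the opponent's previous move."""
    return opponent_history[-1] if opponent_history else 'C'

def always_defect(opponent_history):
    return 'D'

def play(strat1, strat2, rounds=10):
    """Play repeated rounds; each strategy sees only the opponent's moves."""
    moves1, moves2 = [], []
    score1 = score2 = 0
    for _ in range(rounds):
        m1, m2 = strat1(moves2), strat2(moves1)
        p1, p2 = PAYOFF[(m1, m2)]
        score1 += p1
        score2 += p2
        moves1.append(m1)
        moves2.append(m2)
    return score1, score2
```

Running tournaments between such strategies is what connects the game to evolutionary biology: strategies that accumulate higher scores can be treated as reproducing more successfully.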
Journal of Computing Sciences in Colleges, 25(5), 2010.
Programming, like any creative endeavor, involves some personal style choices on the part of the programmer. Within the computer science education community, some programming style conventions have been widely agreed upon because following them unequivocally leads to programs that are easier to read and maintain. In other cases, a programmer might reasonably choose between competing styles, each of which provides similar advantages. This paper describes three instances where programming style is more than just aesthetics: following these conventions can actually help beginning programmers avoid mistakes and better understand the underlying programming concepts they are using.
ACM Journal on Educational Resources in Computing, 7(2), 2007.
In 1986, guidelines for a computer science major degree program offered in the context of the liberal arts were developed by the Liberal Arts Computer Science Consortium (LACS) [Gibbs and Tucker 1986]. In 1996 the same group offered a revised curriculum reflecting advances in the discipline, the accompanying technology, and teaching pedagogy [Walker and Schneider 1996]. In each case, the LACS models represented, at least in part, a response to the recommendations of the ACM/IEEE-CS [1979, 1991]. Continuing change in the discipline, technology, and pedagogy, coupled with the appearance of Computing Curriculum 2001, has led to the 2007 Model Curriculum described here.
This report begins by considering just what computer science is and what goals are appropriate for the study of computer science in the landscape of the liberal arts. A curricular model for this setting follows, updating the 1996 revision. As in previous LACS curricula [Gibbs and Tucker 1986; Walker and Schneider 1996], the model is practical; that is, students can schedule it, it can be taught by a relatively small faculty, and it contributes to the foundation of an excellent liberal arts education. Finally, this 2007 Model Curriculum is compared with the recommendations of CC2001.
Journal of Computing Sciences in Colleges, 21(3), 2006
Traditionally, computer science curricula have focused on the algorithmic development of software, while design issues such as typography, text layout, and image manipulation have been the domain of graphic design or fine arts programs. With the emergence of the Web as a publishing and programming medium, as well as the availability of high-level development tools, the distinctions between algorithm and presentation are blurring. Today, a programmer needs to understand and apply principles of graphic design in order to develop applications that are usable and attractive to the user. Likewise, a graphic designer developing electronic media must understand and apply programming principles to control the dynamic behavior of the media. This paper addresses the convergence of programming and graphic design, from both the perspectives of a computer scientist and a graphic designer. Simple and practical design principles are presented that can be integrated into a variety of computer science courses.
Proceedings of the 35th SIGCSE Technical Symposium on Computer Science Education
SIGCSE Bulletin 36(1), 2004
Educators are increasingly acknowledging that practical problems in computer science demand basic competencies in experimentation and data analysis. However, little effort has been made to explicitly identify the empirical concepts and skills needed by computer scientists, or to develop methods of integrating those concepts and skills into CS curricula. In this paper, we identify a core list of empirical competencies and motivate them based on established courses outside of computer science, their potential use in standard CS courses, and their application to real-world problems. Sample assignments that integrate these competencies into the CS curriculum are also discussed.
Journal of Computing Sciences in Colleges, 18(1), 2002
Education research has shown that an effective technique for developing problem-solving and critical-thinking skills is to expose students early and often to "ill-defined" problems in their field. An ill-defined problem is one that addresses complex issues and thus cannot easily be described in a concise, complete manner. Furthermore, competing factors suggest several approaches to the problem, requiring careful analysis to determine the best approach. This paper describes a specific ill-defined problem that was successfully used as an assignment in a recent CS1 course. In completing this assignment, students actively participated in the entire process of problem solving and scientific inquiry, from the formulation of a hypothesis, to the design and implementation of experiments (via a program), to the collection and analysis of the experimental data. As a result, students developed empirical and critical-thinking skills, while also experiencing the use of programming as a tool for investigative inquiry. Experiences using this particular assignment will be discussed, as well as general approaches to identifying ill-defined problems and integrating them into a CS1 course.
Proceedings of the 33rd SIGCSE Technical Symposium on Computer Science Education
SIGCSE Bulletin 34(1), 2002
We present several introductory computer science laboratory assignments designed to reinforce the use of the scientific method. These assignments require students to make predictions, write simulations, perform experiments, collect data, analyze the results, and possibly revise their predictions, simulations, and/or experiments. The assignments are specifically designed to place student predictions in conflict with the observed results, producing disequilibration. As a result, students are encouraged to rethink their assumptions, perform repeated experiments, and critically examine their code. The potential benefits of disequilibration are discussed, and additional ways to apply disequilibration in computer science education are suggested.
ACM Journal of Educational Resources in Computing, 1(4), 2001
The Knob & Switch Computer is a computer architecture simulator designed to teach beginning students the basics of computer organization. It differs from existing simulators in two significant ways: (1) it incorporates "cognitive hooks" in the form of knobs and switches that encourage exploration and discovery on the part of the student, and (2) it can be presented one component at a time, starting with a simple interactive data path and building incrementally to a full-featured stored program machine. Both of these features make it possible to engage beginning students and effectively convey an understanding of how computers work. The Knob & Switch Computer Simulator can also motivate the study of other computing topics such as data representation, assembly language programming, and RISC vs. CISC architectures. In addition to describing the Knob & Switch Computer, experiences using the simulator in breadth-based introductory courses both at Dickinson College and Creighton University are discussed.
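The datapath-first, one-component-at-a-time progression the abstract describes can be suggested with a toy register-to-ALU datapath (an illustrative sketch only; the class name, register count, and operation set below are assumptions, not the actual Knob & Switch Computer):

```python
# Toy register-to-register datapath in the spirit of a knob-and-switch
# simulator (hypothetical sketch; not the real Knob & Switch Computer).
class DataPath:
    def __init__(self):
        self.regs = [0, 0, 0, 0]   # four general-purpose registers

    def cycle(self, src_a, src_b, alu_op, dest):
        """One clock cycle: the arguments play the role of the knobs,
        selecting two source registers, an ALU operation, and a
        destination register."""
        a, b = self.regs[src_a], self.regs[src_b]
        ops = {'ADD': a + b, 'SUB': a - b, 'AND': a & b, 'OR': a | b}
        self.regs[dest] = ops[alu_op]
        return self.regs[dest]
```

A student turning these "knobs" by hand for a few cycles sees how a sequence of datapath settings is itself a program, which motivates the step up to a stored-program machine.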
Proceedings of the 32nd SIGCSE Technical Symposium on Computer Science Education
SIGCSE Bulletin 33(1), 2001
Proceedings of the 34th Midwest Instruction and Computing Symposium, 2001
Proceedings of the 31st SIGCSE Technical Symposium on Computer Science Education
SIGCSE Bulletin 32(1), 2000
Empirical skills are playing an increasingly important role in the computing profession and our society. But while traditional computer science curricula are effective in teaching software design skills, little attention has been paid to developing empirical investigative skills such as forming testable hypotheses, designing experiments, critiquing their validity, collecting data, explaining results, and drawing conclusions. In this paper, we describe an initiative at Dickinson College that integrates the development of empirical skills throughout the computer science curriculum. At the introductory level, students perform experiments, analyze the results, and discuss their conclusions. In subsequent courses, they develop their skills at designing, conducting and critiquing experiments through incrementally more open-ended assignments. By their senior year, they are capable of forming hypotheses, designing and conducting experiments, and presenting conclusions based on the results.
Proceedings of the 29th SIGCSE Technical Symposium on Computer Science Education
SIGCSE Bulletin 30(1), 1998
At SIGCSE '96, Wallingford described an approach to introductory courses that is based on programming patterns, i.e., algorithms or problem-solving approaches that can be applied to various applications. By focusing on patterns, students reason at a higher level of abstraction and can learn to develop programs by building upon existing code schemas. Closely related to the patterns approach is the use of themes in a programming course. Selecting a particular idea, methodology, or application domain provides a framework for learning new techniques and concepts. This paper describes the use of a particular problem-solving pattern, binary reduction, as a recurring theme in the CS1 course. By introducing problem-solving patterns such as this one early in the course and then revisiting them in different contexts, students learn to look for common characteristics in problems, and to use an existing solution as a framework for solving related problems. Perhaps more importantly, understanding the behavior of one problem solution can simplify the analysis of other problem solutions based on the same pattern.
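Two textbook instances of the binary-reduction pattern make the shared structure concrete (a generic sketch of the pattern, not code from the paper):

```python
# Two instances of the binary-reduction pattern: each iteration halves
# the remaining problem, so both run in O(log n) steps.

def binary_search(items, target):
    """Each comparison discards half of the remaining sorted list."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1          # target not present

def power(base, exp):
    """Exponentiation by squaring: halve the exponent each step."""
    result = 1
    while exp > 0:
        if exp % 2 == 1:
            result *= base
        base *= base
        exp //= 2
    return result
```

Once a student has analyzed the logarithmic behavior of one of these, the analysis of the other comes almost for free, which is exactly the payoff of teaching the pattern rather than the individual programs.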
Annals of Mathematics and Artificial Intelligence, 14, 1995
The Near-Horn Prolog procedures have been proposed as effective procedures in the area of disjunctive logic programming, an extension of logic programming to the (first-order) non-Horn clause domain. In this paper, we show that these case-analysis based procedures may be viewed as members of a class of procedures called the "ancestry family," which also includes Model Elimination (and its variants), the Positive Refinement of Model Elimination, and SLWV-resolution. The common feature which binds these procedures is the extension of SLD-resolution to full first-order logic with the addition of an ancestor cancellation rule. Procedures in the ancestry family are all sound and complete first-order procedures that can be seen to vary along three parameters: (1) the number of clause contrapositives required, (2) the amount of ancestor checking that must occur, and (3) the use (if any) of a Restart rule. Using a sequent-style presentation format for all procedures, we highlight the close relationships among these procedures and compare their relative advantages.
In Proceedings of the 26th SIGCSE Technical Symposium on Computer Science Education
SIGCSE Bulletin 27(1), 1995
We have developed an application-based approach to introductory courses in computer science. This approach follows an apprenticeship model of learning, where students begin by reading, studying, and extending programs written by experienced and expert programmers. Applications play a central role since programming constructs are motivated and introduced in the context of applications, not the other way around as is the tradition in most texts and courses. Under our applied approach, students are able to learn from interesting real-world examples, the synthesis of different programming constructs is supported using incremental examples, and good design is stressed via code reuse. In this paper, we provide several examples of our method as well as pointers to all the material we have developed which is freely available electronically. The philosophy underlying this method transcends a particular programming language, but we present our examples using C++ since that is the language we use in our CS 1 and CS 2 courses. We have also used this method in a Pascal-based CS 0.5 class with equal success.
Journal of Automated Reasoning, 14:325-351, 1995
We introduce a relevancy detection algorithm to be used in conjunction with the SATCHMO prover. The version of SATCHMO considered here is essentially a bidirectional prover, utilizing Prolog (back chaining) on Horn clauses and forward chaining on non-Horn clauses. Our extension, SATCHMORE (SATCHMO with RElevancy), addresses the major weakness of SATCHMO: the uncontrolled use of forward chaining. By marking potentially relevant clause head literals, and then requiring that all the head literals be marked relevant (be "totally relevant") before a clause is used for forward chaining, SATCHMORE is able to guide the use of these rules. Furthermore, the relevancy testing is performed without extending the proof search beyond what is done in SATCHMO. In addition, a very simple implementation of the extended SATCHMO can be written in Prolog. We describe our relevancy testing approach, present the implementation, prove soundness and completeness, and provide examples which demonstrate the power of relevancy testing.
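The "total relevancy" gate can be suggested with a heavily simplified propositional sketch (my own illustration, with hypothetical rule encodings; the real system is first-order and written in Prolog):

```python
# Propositional sketch of relevancy-gated forward chaining in the spirit
# of SATCHMORE (illustrative only; not the actual system).
# A Horn rule is (head, [body literals]); a non-Horn rule is
# ([head literals], [body literals]).

def relevant_literals(horn_rules, goals):
    """Backward-chain from the goals, marking every literal that could
    contribute to proving them (the 'potentially relevant' marks)."""
    marked = set()
    frontier = list(goals)
    while frontier:
        g = frontier.pop()
        if g in marked:
            continue
        marked.add(g)
        for head, body in horn_rules:
            if head == g:
                frontier.extend(body)
    return marked

def forward_chain(nonhorn_rules, facts, marked):
    """Fire a disjunctive rule only when its body holds AND every head
    literal is marked relevant -- the 'total relevancy' condition."""
    fired = []
    for heads, body in nonhorn_rules:
        if all(b in facts for b in body) and all(h in marked for h in heads):
            fired.append(heads)
    return fired
```

The point of the gate is visible even at this scale: a non-Horn rule whose disjunctive conclusion includes any literal that backward chaining could never use is simply never fired, pruning the case splits that make uncontrolled forward chaining expensive.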
Lecture Notes in AI 596, Springer-Verlag, 1992
This paper presents an overview of the near-Horn Prolog project at Duke University. The basic goal behind this project has been to extend Prolog to disjunctive logic programs (and thus full first-order expressibility) while retaining as much of the clarity and procedural simplicity of Prolog as possible. The approach taken to achieve this goal has been to combine Prolog with case-analysis reasoning. The research work within the project can roughly be divided into three areas: procedure design, semantics, and implementation. Three different variants of near-Horn Prolog have been devised, of which the most recent, Inheritance near-Horn Prolog (InH-Prolog), is currently favored. The semantics for the near-Horn Prologs, specifically for InH-Prolog, have been investigated, resulting in a case-analysis based fixpoint semantics which mimics the procedural behavior of InH-Prolog. Also, both classical and default negation have been incorporated into the near-Horn Prolog systems. Finally, an interpreter for the original near-Horn Prolog variant has been implemented, and a compiler for the InH-Prolog variant is currently nearing completion.
Journal of Logic Programming, 12(1), 1992
We present three extensions of Prolog, discussing their similarities and differences. The systems -- near-Horn Prolog (Loveland), Simplified Problem Reduction Format (Plaisted), and N-Prolog (Gabbay & Reyle) -- differ in their approach to the extension of Prolog, yet each utilizes case analysis as the mechanism for non-Horn reasoning. The fact that these systems with quite different origins, purposes, and presentation forms utilize the case analysis method in a strikingly similar fashion suggests that their underlying reasoning is general and intuitive. This paper describes the three systems and outlines the close relationship between them. While they are closely related, the systems also appear to have essential differences, properties of one system that cannot be incorporated in another system without serious distortion of the unique properties of the receiving system. Highlighting those tradeoffs aids our understanding of the systems involved.
In Proceedings of the 1991 International Logic Programming Symposium, 1991
We present an alternative characterization of disjunctive logic programs. We first review Inheritance near-Horn Prolog (InH-Prolog), an intuitive and computationally effective procedure that extends Prolog using case-analysis. We then describe a fixpoint characterization of disjunctive logic programs that is similarly based on case-analysis. This fixpoint characterization closely corresponds to the InH-Prolog procedure, and so gives needed insight into the difficult problem of recognizing the minimal disjunctive answers obtainable from the procedure. Due to its case-analysis nature, this characterization maintains a natural and close relationship to the standard characterization of Horn programs. It also gives added insight into the role of definite answers for disjunctive programs and enables one to neatly focus attention on the interesting class of definite (single atom) consequences.
In Computational Logic: Essays in Honor of Alan Robinson, MIT Press, 1991
Near-Horn Prolog is a logic programming language which extends Prolog to handle non-Horn clauses. It was designed with the goal of minimizing the performance loss for programs with very few non-Horn clauses, while preserving the Prolog format. In this paper, we present a version of near-Horn Prolog that provides a stronger proof system than used by previous near-Horn procedures, and takes advantage of the preprocessing capability of compilers to reduce the accompanying performance penalty. In fact, for a sizable class of non-Horn programs the added inference rule strength incurs no performance penalty at all. In addition to describing this variant, called Inheritance near-Horn Prolog, we prove its soundness and (classical) completeness.