List of publications in computer science

Computing Machinery and Intelligence

  • Alan Turing
  • Mind, 59:433–460, 1950.
  • Online copy


Description: This paper discusses whether machines can think and suggests the Turing test as a method for checking it.

A Proposal for the Dartmouth Summer Research Project on Artificial Intelligence

  • John McCarthy
  • Marvin Minsky
  • N. Rochester
  • C.E. Shannon
  • Online copy


Description: This summer research proposal inaugurated and defined the field. It contains the first use of the term artificial intelligence and this succinct description of the philosophical foundation of the field: "every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it." (See philosophy of AI.) The proposal invited researchers to the Dartmouth conference, which is widely considered the "birth of AI". (See history of AI.)

Fuzzy sets

  • Lotfi Zadeh
  • Information and Control, Vol. 8, pp. 338–353 (1965).
  • Online copy


Description: The seminal paper published in 1965 provides details on the mathematics of fuzzy set theory.
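
Membership in a fuzzy set is a matter of degree in [0, 1] rather than a binary yes/no. A minimal Python sketch with an invented triangular membership function for "warm temperature" (the shape and the numbers are illustrative assumptions, not from Zadeh's paper):

    def triangular(x, a, b, c):
        """Triangular membership: 0 outside [a, c], rising to 1 at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Degrees to which 18, 25 and 34 degrees Celsius count as "warm".
    for t in (18, 25, 34):
        print(t, triangular(t, 15, 25, 35))   # -> 0.3, 1.0, 0.1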

Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference

  • Judea Pearl
  • Morgan Kaufmann, 1988. ISBN 1-55860-479-0


Description: This book introduced Bayesian methods to AI.

Artificial Intelligence: A Modern Approach

  • Stuart J. Russell and Peter Norvig
  • Prentice Hall, Englewood Cliffs, New Jersey, 1995, ISBN 0-13-080302-2
  • Textbook's website


Description: The standard textbook in Artificial Intelligence. The book's web site lists over 1100 colleges and universities in 102 countries using it.

An Inductive Inference Machine

  • Ray Solomonoff
  • IRE Convention Record, Section on Information Theory, Part 2, pp. 56–62, 1957
  • (A longer version of this, a privately circulated report, 1956, is online).


Description: The first paper written on machine learning. It emphasized the importance of training sequences, and the use of parts of previous solutions to problems in constructing trial solutions to new problems.

Language identification in the limit

  • E. Mark Gold
  • Information and Control, 10(5):447–474, 1967


Description: This paper created algorithmic learning theory.

On the uniform convergence of relative frequencies of events to their probabilities

  • V. Vapnik, A. Chervonenkis
  • Theory of Probability and its Applications, 16(2):264–280, 1971


Description: Computational learning theory, VC theory, statistical uniform convergence, and the VC dimension.
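
For a concrete sense of the VC dimension, a standard worked example (not taken from the paper itself): threshold functions on the real line shatter any single point but no pair of points, while intervals shatter pairs but no triple, so

    \mathrm{VCdim}(\{x \mapsto \mathbf{1}[x \ge \theta]\}) = 1, \qquad
    \mathrm{VCdim}(\{x \mapsto \mathbf{1}[a \le x \le b]\}) = 2

since for two points x_1 < x_2 no threshold can label x_1 positive and x_2 negative, and for three points no interval can label the outer two positive and the middle one negative.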

A theory of the learnable

  • Leslie Valiant
  • Communications of the ACM, 27(11):1134–1142 (1984)


Description: The Probably Approximately Correct (PAC) learning framework.
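
Informally, a concept class is PAC-learnable if a learner, given examples from an unknown distribution, can with high probability find a nearly correct hypothesis. In standard notation (a common paraphrase, not Valiant's exact wording): for any accuracy ε and confidence δ, the learner must output a hypothesis h with

    \Pr[\mathrm{err}(h) \le \varepsilon] \ge 1 - \delta

using sample size and running time polynomial in 1/ε, 1/δ, and the size of the target concept.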

Learning representations by back-propagating errors

  • David E. Rumelhart, Geoffrey E. Hinton and Ronald J. Williams
  • Nature, 323, 533–536, 1986


Description: Development of the backpropagation algorithm for artificial neural networks. Note that the algorithm was first described by Paul Werbos in 1974.
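
The core of the algorithm is propagating the output error backwards through the chain rule to obtain weight gradients. A minimal numpy sketch for one hidden layer, trained on XOR (the layer sizes, learning rate and iteration count are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)      # XOR targets
    W1, W2 = rng.normal(size=(2, 4)), rng.normal(size=(4, 1))

    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(5000):
        h = sigmoid(X @ W1)                   # forward pass
        out = sigmoid(h @ W2)
        d_out = (out - y) * out * (1 - out)   # output error times sigmoid'
        d_h = (d_out @ W2.T) * h * (1 - h)    # error propagated backwards
        W2 -= 0.5 * h.T @ d_out               # gradient descent steps
        W1 -= 0.5 * X.T @ d_h
    print(out.round(2))                       # typically approaches [0, 1, 1, 0]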

Induction of Decision Trees

  • J.R. Quinlan
  • Machine Learning, 1:81–106, 1986.

Description: Decision trees are a common learning algorithm and a decision representation tool. Development of decision trees was done by many researchers in many areas even before this paper, though this paper is one of the most influential in the field.
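
ID3, the algorithm this paper introduced, grows a tree by repeatedly splitting on the attribute with the highest information gain. In standard notation (not copied from the paper):

    \mathrm{Gain}(S, A) = H(S) - \sum_{v \in \mathrm{Values}(A)} \frac{|S_v|}{|S|}\, H(S_v),
    \qquad H(S) = -\sum_{c} p_c \log_2 p_c

where S is the example set, S_v the subset with attribute A equal to v, and p_c the fraction of examples in class c.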

Learning Quickly When Irrelevant Attributes Abound: A New Linear-threshold Algorithm

  • N. Littlestone
  • Machine Learning, 2(4):285–318, 1988

Description: One of the papers that started the field of on-line learning. In this learning setting, a learner receives a sequence of examples, making predictions after each one, and receiving feedback after each prediction. Research in this area is remarkable because (1) the algorithms and proofs tend to be very simple and beautiful, and (2) the model makes no statistical assumptions about the data. In other words, the data need not be random (as in nearly all other learning models), but can be chosen arbitrarily by "nature" or even an adversary. Specifically, this paper introduced the winnow algorithm.
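
Winnow keeps one weight per attribute, starting at 1, and updates multiplicatively only on mistakes, which is what lets it tolerate many irrelevant attributes. A sketch of the Winnow2-style update (the toy data and parameters are invented for illustration):

    def predict(w, x, theta):
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0

    def update(w, x, y_true, y_pred, alpha=2.0):
        if y_pred == y_true:
            return w                                   # no mistake, no change
        if y_true == 1:                                # promotion on a missed positive
            return [wi * alpha if xi else wi for wi, xi in zip(w, x)]
        return [wi / alpha if xi else wi for wi, xi in zip(w, x)]   # demotion

    n = 20                        # 20 boolean attributes, most irrelevant
    w, theta = [1.0] * n, n / 2   # target concept: x[0] OR x[1]
    examples = [([1] + [0] * (n - 1), 1), ([0] * n, 0), ([0, 1] + [0] * (n - 2), 1)]
    for x, y in examples * 20:
        w = update(w, x, y, predict(w, x, theta))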

Learning to Predict by the Methods of Temporal Differences

  • Richard S. Sutton
  • Machine Learning 3(1): 9–44
  • Online copy


Description: The temporal difference method for reinforcement learning.
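
The TD(0) update moves the value estimate of a state toward the immediate reward plus the discounted estimate of its successor, learning from the difference between successive predictions. A sketch on an invented five-state chain with a reward at the end (the parameters are arbitrary):

    states = range(5)                 # linear chain; state 4 is terminal
    V = {s: 0.0 for s in states}
    alpha, gamma = 0.1, 0.9

    for _ in range(1000):
        s = 0
        while s != 4:
            s2 = s + 1                # deterministic step to the right
            r = 1.0 if s2 == 4 else 0.0
            V[s] += alpha * (r + gamma * V[s2] - V[s])   # TD(0) update
            s = s2
    print({s: round(v, 2) for s, v in V.items()})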

Learnability and the Vapnik–Chervonenkis dimension

  • A. Blumer
  • A. Ehrenfeucht
  • D. Haussler
  • M. K. Warmuth
  • Journal of the ACM, 36(4):929–965, 1989.


Description: The complete characterization of PAC learnability using the VC dimension.

Cryptographic limitations on learning boolean formulae and finite automata

  • M. Kearns
  • L. G. Valiant
  • In Proceedings of the 21st Annual ACM Symposium on Theory of Computing, pages 433–444, New York. ACM.
  • Online version (HTML)


Description: Proving negative results for PAC learning.

The strength of weak learnability

  • Robert E. Schapire
  • Machine Learning, 5(2):197–227, 1990

Description: Proving that weak and strong learnability are equivalent in the noise-free PAC framework. The proof was done by introducing the boosting method.

Learning in the presence of malicious errors

  • M. Kearns, M. Li
  • SIAM Journal on Computing, 22(4):807–837, 1993


Description: Proving possibility and impossibility results in the malicious errors framework.

A training algorithm for optimum margin classifiers

  • Bernhard E. Boser
  • Isabelle M. Guyon
  • Vladimir N. Vapnik
  • Proceedings of the Fifth Annual Workshop on Computational Learning Theory, pages 144–152, Pittsburgh (1992).
  • Online version (HTML)


Description: This paper presented support vector machines, a practical and popular machine learning algorithm. Support vector machines utilize the kernel trick, a widely used method.
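
The kernel trick replaces every inner product in a learning algorithm with a kernel function that equals an inner product in an implicit, possibly very high-dimensional feature space, so the mapping is never computed explicitly. A sketch computing the Gram matrix an SVM dual solver would consume (the RBF kernel and the gamma value are common choices, not prescribed by this paper):

    import numpy as np

    def rbf_kernel(X, Y, gamma=0.5):
        """K[i, j] = exp(-gamma * ||X[i] - Y[j]||^2)."""
        sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)

    X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
    K = rbf_kernel(X, X)        # Gram matrix used by the SVM dual problem
    print(K.round(3))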

Knowledge-based analysis of microarray gene expression data by using support vector machines

  • MP Brown
  • WN Grundy
  • D Lin
  • Nello Cristianini
  • CW Sugnet
  • TS Furey
  • M. Ares Jr.
  • David Haussler
  • PNAS, 2000 Jan 4;97(1):262–7 (http://www.pnas.org/cgi/content/abstract/97/1/262)


Description: The first application of supervised learning to gene expression data, in particular Support Vector Machines. The method is now standard, and the paper one of the most cited in the area.

Collaborative networks

  • Camarinha-Matos, L. M.; Afsarmanesh, H. (2005). Collaborative networks: A new scientific discipline, J. Intelligent Manufacturing, vol. 16, no. 4–5, pp. 439–452.
  • Camarinha-Matos, L. M.; Afsarmanesh, H. (2008). Collaborative Networks: Reference Modeling, Springer: New York.

On the translation of languages from left to right

  • Donald Knuth
  • Information and Control 8 (1965), 607–639.


Description: Bottom-up parsing for deterministic context-free languages, from which the LALR approach of Yacc later developed.

Semantics of Context-Free Languages.

  • Donald Knuth
  • Math. Systems Theory 2:2 (1968), 127–145.


Description: About grammar attribution, the basis for yacc's S-attributed and zyacc's LR-attributed approaches.

A program data flow analysis procedure

  • F.E. Allen, J. Cocke
  • Commun. ACM, 19, 137–147.


Description: From the abstract: "The global data relationships in a program can be exposed and codified by the static analysis methods described in this paper. A procedure is given which determines all the definitions which can possibly reach each node of the control flow graph of the program and all the definitions that are live on each edge of the graph."

A Unified Approach to Global Program Optimization

  • Gary Kildall
  • Proceedings of ACM SIGACT-SIGPLAN 1973 Symposium on Principles of Programming Languages.


Description: Formalized the concept of data-flow analysis as fixpoint computation over lattices, and showed that most static analyses used for program optimization can be uniformly expressed within this framework.
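
The framework iterates monotone transfer functions over the control-flow graph until the facts at every node stop changing; because the facts live in a lattice of finite height, the iteration must reach a fixpoint. A sketch for reaching definitions on an invented four-node CFG (the gen/kill sets are illustrative):

    cfg  = {"entry": ["a"], "a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
    gen  = {"entry": set(), "a": {"x1"}, "b": {"x2"}, "c": set(), "d": set()}
    kill = {"entry": set(), "a": set(), "b": {"x1"}, "c": set(), "d": set()}

    preds = {n: [m for m in cfg if n in cfg[m]] for n in cfg}
    out = {n: set() for n in cfg}
    changed = True
    while changed:                      # iterate to a fixpoint
        changed = False
        for n in cfg:
            in_n = set.union(*(out[p] for p in preds[n])) if preds[n] else set()
            new = gen[n] | (in_n - kill[n])          # transfer function
            if new != out[n]:
                out[n], changed = new, True
    print(out)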

YACC: Yet another compiler-compiler

  • Stephen C. Johnson
  • Unix Programmer's Manual Vol 2b, 1979
  • Online copy (HTML)


Description: Yacc is a tool that made compiler writing much easier.

gprof: A Call Graph Execution Profiler

  • Susan L. Graham, Peter B. Kessler, Marshall Kirk McKusick
  • Proceedings of the ACM SIGPLAN 1982 Symposium on Compiler Construction, SIGPLAN Notices 17, 6, Boston, MA. June 1982.
  • Online copy


Description: The gprof call-graph profiler.

Compilers: Principles, Techniques and Tools

  • Alfred V. Aho
  • Ravi Sethi
  • Jeffrey D. Ullman
  • Monica Lam
  • Addison-Wesley, 1986. ISBN 0-201-10088-6


Description: This book became a classic in compiler writing. It is also known as the Dragon Book, after the (red) dragon that appears on its cover.

Colossus computer

  • T. H. Flowers
  • Annals of the History of Computing, Vol. 5 (No. 3), 1983, pp. 239–252.
  • The Design of Colossus


Description: The Colossus machines were early computing devices used by British codebreakers to break German messages encrypted with the Lorenz cipher during World War II. Colossus was an early binary electronic digital computer. The design of Colossus was later described in the referenced paper.

First Draft of a Report on the EDVAC

  • John von Neumann
  • June 30, 1945, the ENIAC project.
  • First Draft of a report on the EDVAC (PDF)


Description: It contains the first published description of the logical design of a computer using the stored-program concept, which has come to be known as the von Neumann architecture.

Architecture of the IBM System/360

  • Gene Amdahl, Fred Brooks, G. A. Blaauw
  • IBM Journal of Research and Development, 1964.
  • Architecture of the IBM System/360


Description: The IBM System/360 (S/360) is a mainframe computer system family announced by IBM on April 7, 1964. It was the first family of computers making a clear distinction between architecture and implementation.

The case for the reduced instruction set computer

  • DA Patterson, DR Ditzel
  • Computer Architecture News, vol. 8, no. 6, October 1980, pp. 25–33.
  • Online version (PDF)


Description: The reduced instruction set computer (RISC) CPU design philosophy, which favors a reduced set of simpler instructions.

Comments on "the Case for the Reduced Instruction Set Computer"

  • DW Clark, WD Strecker


Description: A critical response to the preceding RISC paper.

The CRAY-1 Computer System

  • Richard M. Russell
  • Communications of the ACM, January 1978, volume 21, number 1, pages 63–72.
  • Online version (PDF)


Description: The Cray-1 was a supercomputer designed by a team including Seymour Cray for Cray Research. The first Cray-1 system was installed at Los Alamos National Laboratory in 1976, and it went on to become one of the best known and most successful supercomputers in history.

Validity of the Single Processor Approach to Achieving Large Scale Computing Capabilities

  • Gene Amdahl
  • AFIPS 1967 Spring Joint Computer Conference, Atlantic City, N.J.
  • Online version (PDF)


Description: Amdahl's Law.
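
In its usual modern statement (the paper itself makes the argument in prose), if a fraction p of a program's work can be parallelized across n processors, the overall speedup is

    S(n) = \frac{1}{(1 - p) + p/n}

For example, with p = 0.9 and n = 16 the speedup is 1/(0.1 + 0.9/16) = 6.4, and no number of processors can push it past 1/(1 - p) = 10.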

A Case for Redundant Arrays of Inexpensive Disks (RAID)

  • David A. Patterson, Garth Gibson, Randy H. Katz
  • In International Conference on Management of Data, pages 109–116, 1988.
  • Online version (PDF)


Description: This paper discusses the concept of RAID disks, outlines the different levels of RAID, and the benefits of each level. It is a good paper for discussing issues of reliability and fault tolerance of computer systems, and the cost of providing such fault-tolerance.
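
The redundancy behind the parity-based RAID levels reduces to XOR: the parity block is the XOR of the data blocks, so any single lost block can be rebuilt from the survivors. A toy sketch with invented byte values:

    # One parity byte protecting three data bytes (RAID-5 style).
    a, b, c = 0b10110010, 0b01101100, 0b11100001
    parity = a ^ b ^ c            # stored on a different disk

    # The disk holding b fails; reconstruct it from the survivors.
    recovered_b = a ^ c ^ parity
    assert recovered_b == b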

The case for a single-chip multiprocessor

  • Kunle Olukotun, Basem Nayfeh, Lance Hammond, Ken Wilson, Kunyung Chang
  • In SIGOPS Oper. Syst. Rev. 30, pages 2–11, 1996.
  • Online version (PDF)


Description: This paper argues that the approach taken to improving the performance of processors by adding multiple instruction issue and out-of-order execution cannot continue to provide speedups indefinitely. It lays out the case for making single chip processors that contain multiple "cores". With the mainstream introduction of multicore processors by Intel in 2005, and their subsequent domination of the market, this paper was shown to be prescient.

The Rendering Equation

  • J. Kajiya
  • SIGGRAPH 1986, pages 143–150. http://doi.acm.org/10.1145/15922.15902
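
Description: Introduced the equation that gives the paper its title. In its common modern form (notation varies between sources), the radiance leaving point x in direction ω_o is the emitted radiance plus incoming radiance weighted by the surface's BRDF and the cosine of the incidence angle:

    L_o(x, \omega_o) = L_e(x, \omega_o)
      + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, d\omega_i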

Elastically deformable models

  • D. Terzopoulos, J. Platt, A. Barr, K. Fleischer
  • Computer Graphics, 21(4), 1987, 205–214, Proc. ACM SIGGRAPH '87 Conference, Anaheim, CA, July, 1987.
  • Online version (PDF)


Description: The Academy of Motion Picture Arts and Sciences cited this paper as a "milestone in computer graphics".

The Phase Correlation Image Alignment Method

  • C.D. Kuglin and D.C. Hines
  • IEEE 1975 Conference on Cybernetics and Society, 1975, New York, pp. 163–165, September


Description: A correlation method based upon the inverse Fourier transform.
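
The method multiplies the Fourier transform of one image by the conjugate transform of the other, normalizes away the magnitudes, and inverse-transforms; the result peaks at the translation between the images. A numpy sketch on an invented image with a known circular shift:

    import numpy as np

    rng = np.random.default_rng(1)
    img = rng.random((64, 64))
    shifted = np.roll(img, (5, 12), axis=(0, 1))   # known shift

    F1, F2 = np.fft.fft2(img), np.fft.fft2(shifted)
    cross = np.conj(F1) * F2
    r = np.fft.ifft2(cross / np.abs(cross))        # normalized cross-power spectrum
    peak = np.unravel_index(np.argmax(np.abs(r)), r.shape)
    print(peak)                                    # -> (5, 12), the shift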


Determining Optical Flow

  • B.K.P. Horn and B.G. Schunck
  • Artificial Intelligence, Volume 17, 185–203, 1981


Description: A method for estimating the image motion of world points between two frames of a video sequence.
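
The method starts from the brightness constancy assumption: a moving point keeps its intensity, so the flow (u, v) at each pixel satisfies the single linear constraint

    I_x u + I_y v + I_t = 0

where I_x, I_y, I_t are the image derivatives. One equation cannot determine two unknowns (the aperture problem), so Horn and Schunck add a global smoothness term and minimize the combined energy over the whole image.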

An Iterative Image Registration Technique with an Application to Stereo Vision

  • Lucas, B.D. and Kanade, T.
  • Proceedings of the 7th International Joint Conference on Artificial Intelligence, 674–679, Vancouver, Canada, 1981
  • Online version


Description: This paper provides an efficient technique for image registration.

The Laplacian Pyramid as a compact image code

  • Peter J. Burt, Edward H. Adelson
  • IEEE Transactions on Communications, 31(4):532–540, 1983

Description: A technique for image encoding using local operators of many scales.

Snakes: Active contour models

  • Michael Kass, Andrew Witkin, and Demetri Terzopoulos
  • International Journal of Computer Vision, 1(4):321–331, 1988. (Marr Prize Special Issue)
  • Online version


Description: An interactive variational technique for image segmentation and visual tracking.

Condensation – conditional density propagation for visual tracking

  • M. Isard and A. Blake
  • International Journal of Computer Vision, 29(1):5–28, 1998.
  • Online version


Description: A technique for visual tracking.

Object recognition from local scale-invariant features

  • David G. Lowe
  • Proceedings of the International Conference on Computer Vision, 1999

Description: A technique (scale-invariant feature transform, SIFT) for robust feature description.

Concurrent, parallel, and distributed computing

Topics covered: concurrent computing, parallel computing, and distributed computing.

A relational model for large shared data banks

  • E. F. Codd
  • Communications of the ACM, 13(6):377–387, June 1970


Description: This paper introduced the relational model for databases, which became the dominant model for database systems.

Binary B-Trees for Virtual Memory

  • Rudolf Bayer
  • ACM-SIGFIDET Workshop 1971, San Diego, California, Session 5B, pp. 219–235.


Description: This paper introduced the B-tree data structure, which became the standard index structure for databases and file systems.

Relational Completeness of Data Base Sublanguages

  • E. F. Codd
  • In: R. Rustin (ed.): Database Systems: 65–98, Prentice Hall and IBM Research Report RJ 987, San Jose, California (1972)
  • Online version (PDF)


Description: Defined the relational completeness of database sublanguages.

The Entity Relationship Model – Towards a Unified View of Data

  • Peter Chen
  • ACM Transactions on Database Systems, Vol. 1, No. 1, March 1976, pp. 9–36


Description: This paper introduced the entity-relationship diagram (ERD) method of database design.

SEQUEL: A structured English query language

  • Donald D. Chamberlin, Raymond F. Boyce
  • International Conference on Management of Data, Proceedings of the 1974 ACM SIGFIDET (now SIGMOD) workshop on Data description, access and control, Ann Arbor, Michigan, pp. 249–264


Description: This paper introduced the SQL language.
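
SEQUEL's legacy is the declarative, English-like query style that modern SQL inherits. A small sketch using Python's built-in sqlite3 module (the table and data are invented; the syntax shown is modern SQL, a descendant of SEQUEL rather than the 1974 dialect itself):

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE emp (name TEXT, dept TEXT, salary INT)")
    con.executemany("INSERT INTO emp VALUES (?, ?, ?)",
                    [("ada", "eng", 120), ("bob", "ops", 90)])
    for row in con.execute("SELECT name FROM emp WHERE salary > 100"):
        print(row)          # -> ('ada',)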

The notions of consistency and predicate locks in a database system

  • K.P. Eswaran, J. Gray, R.A. Lorie, I.L. Traiger
  • Communications of the ACM 19, 1976, 624–633


Description: This paper defined the concepts of transaction, consistency and schedule. It also argued that a transaction needs to lock a logical rather than a physical subset of the database.

Mining association rules between sets of items in large databases

  • Rakesh Agrawal, Tomasz Imielinski, Arun Swami
  • Proc. of the ACM SIGMOD Conference on Management of Data, pages 207–216, Washington, D.C., May 1993
  • Online copy (HTML)


Description: Association rules, a very common method for data mining.
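
An association rule X ⇒ Y is judged by two standard measures, support (how often X and Y occur together in the transaction set T) and confidence (how often Y occurs given X):

    \mathrm{supp}(X \Rightarrow Y) = \frac{|\{t \in T : X \cup Y \subseteq t\}|}{|T|},
    \qquad
    \mathrm{conf}(X \Rightarrow Y) = \frac{\mathrm{supp}(X \cup Y)}{\mathrm{supp}(X)}

Mining then amounts to finding all rules whose support and confidence exceed user-chosen thresholds.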

The Computer from Pascal to von Neumann

  • Herman H. Goldstine
  • Princeton University Press, 1972

Description: Perhaps the first book on the history of computation.

A History of Computing in the Twentieth Century

edited by:
  • Nicholas Metropolis
  • J. Howlett
  • Gian-Carlo Rota
  • Academic Press, 1980, ISBN 0-12-491650-3


Description: Several chapters by pioneers of computing.

A Vector Space Model for Automatic Indexing

  • Gerard Salton, A. Wong, C. S. Yang
  • Commun. ACM 18(11): 613–620 (1975)


Description: Presented the vector space model.
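
In the vector space model, documents become term vectors and relevance is the cosine of the angle between them. A minimal sketch using raw term counts (the paper weights terms more carefully; the documents are invented):

    import math
    from collections import Counter

    def cosine(d1, d2):
        dot = sum(d1[t] * d2[t] for t in d1.keys() & d2.keys())
        norm = lambda d: math.sqrt(sum(v * v for v in d.values()))
        return dot / (norm(d1) * norm(d2))

    doc_a = Counter("the cat sat on the mat".split())
    doc_b = Counter("the cat ate the rat".split())
    print(round(cosine(doc_a, doc_b), 3))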

Extended Boolean Information Retrieval

  • Gerard Salton, Edward A. Fox, Harry Wu
  • Commun. ACM 26(11): 1022–1036 (1983)


Description: Presented the inverted index.
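
An inverted index maps each term to the set of documents containing it, turning boolean queries into set operations. A minimal sketch with invented documents:

    from collections import defaultdict

    docs = {1: "information retrieval systems",
            2: "boolean retrieval model",
            3: "vector space model"}

    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.split():
            index[term].add(doc_id)

    print(index["retrieval"] & index["model"])   # retrieval AND model -> {2}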


Networks and security

Topics covered: cryptography and computer security, computer networks and the Internet.

Operating systems

An experimental timesharing system.

  • Fernando J. Corbató, M. Merwin-Daggett, and R.C. Daley
  • Proceedings of the AFIPS FJCC, pages 335–344, 1962.
  • Online copy (HTML)


Description: This paper discusses time-sharing as a method of sharing computer resources. This idea changed the interaction with computer systems.

The Working Set Model for Program Behavior

  • Peter J. Denning
  • Communications of the ACM, Vol. 11, No. 5, May 1968, pp 323–333
  • Online version (PDF)


Description: The beginning of cache. For more information see the SIGOPS Hall of Fame.
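
The paper's central definition: the working set W(t, τ) is the set of pages a process has referenced in the last τ time units,

    W(t, \tau) = \{\, \text{pages referenced in the interval } (t - \tau,\, t] \,\}

and a process should be scheduled only when its working set fits in memory, which is what defeats thrashing.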

Virtual Memory, Processes, and Sharing in MULTICS

  • Robert C. Daley, Jack B. Dennis
  • Communications of the ACM, Vol. 11, No. 5, May 1968, pp. 306–312.
  • Online version (PDF)


Description: The classic paper on Multics, the most ambitious operating system in the early history of computing. Difficult reading, but it describes the implications of trying to build a system that takes information sharing to its logical extreme. Most operating systems since Multics have incorporated a subset of its facilities.

A note on the confinement problem

  • Butler W. Lampson
  • Communications of the ACM, 16(10):613–615, October 1973.
  • Online version (PDF)


Description: This paper addresses issues in constraining the flow of information from untrusted programs. It discusses covert channels, but more importantly it addresses the difficulty in obtaining full confinement without making the program itself effectively unusable. The ideas are important when trying to understand containment of malicious code, as well as aspects of trusted computing.

The UNIX Time-Sharing System

  • Dennis M. Ritchie and Ken Thompson
  • Communications of the ACM, 17(7), July 1974.
  • Online copy (several formats)


Description: The Unix operating system and its principles were described in this paper. Its main importance lies not in the paper itself but in the operating system, which had a tremendous effect on operating systems and computer technology.

Weighted voting for replicated data

  • David K. Gifford
  • Proceedings of the 7th ACM Symposium on Operating Systems Principles, pages 150–159, December 1979. Pacific Grove, California
  • Online copy (several formats)


Description: This paper describes the consistency mechanism known as quorum consensus. It is a good example of algorithms that provide a continuous set of options between two alternatives (in this case, between the read-one write-all, and the write-one read-all consistency methods). There have been many variations and improvements by researchers in the years that followed, and it is one of the consistency algorithms that should be understood by all. The options available by choosing different size quorums provide a useful structure for discussing the core requirements for consistency in distributed systems.
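
The core rule is simple: with n votes in total, a read needs a quorum of r votes and a write a quorum of w votes such that

    r + w > n \qquad \text{and} \qquad 2w > n

The first condition makes every read quorum overlap every write quorum, so a read always sees the most recent write; the second prevents two disjoint write quorums, so writes are serialized. Sliding r and w between the extremes yields the continuum of options the description mentions.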

Experiences with Processes and Monitors in Mesa

  • Butler W. Lampson, David D. Redell
  • Communications of the ACM, Vol. 23, No. 2, February, 1980, pp. 105–117.
  • Online copy (PDF)


Description: This is the classic paper on synchronization techniques, including both alternate approaches and pitfalls.

Scheduling Techniques for Concurrent Systems

  • J. K. Ousterhout
  • Proceedings of Third International Conference on Distributed Computing Systems
    International Conference on Distributed Computing Systems
    The International Conference on Distributed Computing Systems is the oldest conference in the field of distributed computing systems in the world. It was launched by the IEEE Computer Society Technical Committee on Distributed Processing in October 1979, and is sponsored by such committee...

    , 1982, 22–30.


Description: Algorithms for coscheduling
Coscheduling
Coscheduling is a mechanism proposed for concurrent systems that schedules related processes to run on different processors at the same time. If an application consists of a collection of processes working closely together, and if some but not all of the processes are scheduled for execution, the...

 of related processes were given.
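
As a toy model of the idea (hypothetical code, not the paper's algorithms): a gang of related processes is dispatched only if all of its members can be placed on processors at once.

    # Toy gang-scheduling sketch: a gang of related processes is placed only
    # when enough processors are free for every member (all-or-nothing).
    def schedule_timeslice(gangs, processors):
        """gangs: list of (name, size) pairs; returns gangs run this slice."""
        free, chosen = processors, []
        for name, size in sorted(gangs, key=lambda g: -g[1]):  # biggest first
            if size <= free:
                chosen.append(name)
                free -= size
        return chosen

    # With 5 processors, gangs A (3 processes) and B (2) run together, while
    # C must wait for a later time slice rather than run partially.
    print(schedule_timeslice([("A", 3), ("B", 2), ("C", 2)], processors=5))
    # -> ['A', 'B']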

A Fast File System for UNIX

  • Marshall Kirk Mckusick
    Marshall Kirk McKusick
    Marshall Kirk McKusick is a computer scientist, known for his extensive work on BSD, from the 1980s to FreeBSD in the present day. He was president of the USENIX Association from 1990 to 1992 and again from 2002 to 2004, and still serves on the board. He is also on the editorial board of...

    , William N. Joy, Samuel J. Leffler, Robert S. Fabry
  • ACM Transactions on Computer Systems, Vol. 2, No. 3, August 1984, pp. 181–197.
  • Online copy (PDF)


Description: The file system
File system
A file system is a means to organize data expected to be retained after a program terminates by providing procedures to store, retrieve and update data, as well as manage the available space on the device which contain it. A file system organizes data in an efficient manner and is tuned to the...

 of UNIX
Unix
Unix is a multitasking, multi-user computer operating system originally developed in 1969 by a group of AT&T employees at Bell Labs, including Ken Thompson, Dennis Ritchie, Brian Kernighan, Douglas McIlroy, and Joe Ossanna...

. One of the first papers discussing how to manage disk storage for high-performance file systems. Most file-system research since this paper has been influenced by it, and most high-performance file systems of the last 20 years incorporate techniques from this paper.

The Design and Implementation of a Log-Structured File System

  • Mendel Rosenblum
    Mendel Rosenblum
    Mendel Rosenblum is an associate professor of Computer Science at Stanford University, and one of the co-founders of VMware. Since 2008 he is a Fellow of the Association for Computing Machinery "for contributions to reinventing virtual machines", and had previously received the ACM SIGOPS Mark...

    , J. K. Ousterhout
    John Ousterhout
    John Kenneth Ousterhout is the chairman of Electric Cloud, Inc. and a professor of computer science at Stanford University. He founded Electric Cloud with John Graham-Cumming. Ousterhout previously was a professor of computer science at University of California, Berkeley where he created the Tcl...

  • ACM Transactions on Computer Systems, Vol. 10, No. 1 (February 1992), pp. 26–52.
  • Online version


Description: Log-structured file system
Log-structured file system
A log-structured filesystem is a file system design first proposed in 1988 by John K. Ousterhout and Fred Douglis. Designed for high write throughput, all updates to data and metadata are written sequentially to a continuous stream, called a log...

.
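
A minimal Python sketch of the log-structured idea, with a deliberately simplistic record format (the layout and names are hypothetical, not the paper's): every update is appended to a single log, and an in-memory index maps each key to the offset of its latest record. A real log-structured file system adds a segment cleaner to reclaim the space occupied by superseded records.

    import os

    class LogStore:
        def __init__(self, path):
            self.path, self.index = path, {}
            open(path, "ab").close()          # create the log if missing

        def put(self, key, value):
            # All writes are sequential appends; nothing is updated in place.
            with open(self.path, "ab") as log:
                log.seek(0, os.SEEK_END)
                self.index[key] = log.tell()  # newest record wins
                log.write(f"{key}\t{value}\n".encode())

        def get(self, key):
            with open(self.path, "rb") as log:
                log.seek(self.index[key])
                _key, value = log.readline().decode().rstrip("\n").split("\t")
            return value

    store = LogStore("demo.log")
    store.put("a", "1")
    store.put("a", "2")        # the old record becomes garbage for a cleaner
    print(store.get("a"))      # -> 2
    os.remove("demo.log")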

Microkernel
Microkernel
In computer science, a microkernel is the near-minimum amount of software that can provide the mechanisms needed to implement an operating system . These mechanisms include low-level address space management, thread management, and inter-process communication...

 operating system architecture and Mach
Mach
Mach is an operating system kernel developed at Carnegie Mellon University to support operating systems research, primarily distributed and parallel computation. It is an early example of a microkernel and underlies the kernels of NeXTSTEP and Mac OS X...

  • David L. Black, David B. Golub, Daniel P. Julin, Richard F. Rashid, Richard P. Draves, Randall W. Dean, Alessandro Forin, Joseph Barrera, Hideyuki Tokuda, Gerald Malan, David Bohman
  • Proceedings of the USENIX Workshop on Microkernels and Other Kernel Architectures, pages 11–30, April 1992.


Description: This is a good paper discussing one particular microkernel
Microkernel
In computer science, a microkernel is the near-minimum amount of software that can provide the mechanisms needed to implement an operating system . These mechanisms include low-level address space management, thread management, and inter-process communication...

 architecture and contrasting it with monolithic kernel design. Mach underlies Mac OS X
Mac OS X
Mac OS X is a series of Unix-based operating systems and graphical user interfaces developed, marketed, and sold by Apple Inc. Since 2002, has been included with all new Macintosh computer systems...

, and its layered architecture had a significant impact on the design of the Windows NT kernel and modern microkernels like L4
L4 microkernel family
L4 is a family of second-generation microkernels, generally used to implement Unix-like operating systems, but also used in a variety of other systems.L4 was a response to the poor performance of earlier microkernel-base operating systems...

. In addition, its memory-mapped files feature was added to many monolithic kernels.

An Implementation of a Log-Structured File System for UNIX

  • Margo Seltzer
    Margo Seltzer
    Margo Ilene Seltzer is a professor and researcher in computer systems. Currently she is the Herchel Smith Professor of Computer Science and a Harvard College Professor in the School of Engineering and Applied Sciences at Harvard University, where she is active in the Systems Research Group.Dr...

    , Keith Bostic
    Keith Bostic
    Keith Bostic is a computer programmer from the United States.In 1986, Bostic joined the Computer Systems Research Group at the University of California, Berkeley. He was one of the principal architects of the Berkeley 4.4BSD and 4.4BSD-Lite releases...

    , Marshall Kirk McKusick
    Marshall Kirk McKusick
    Marshall Kirk McKusick is a computer scientist, known for his extensive work on BSD, from the 1980s to FreeBSD in the present day. He was president of the USENIX Association from 1990 to 1992 and again from 2002 to 2004, and still serves on the board. He is also on the editorial board of...

    , Carl Staelin
  • Proceedings of the Winter 1993 USENIX Conference, San Diego, CA, January 1993, pages 307–326.
  • Online version


Description: This paper describes the first production-quality implementation of a log-structured file system, and it spawned much additional discussion of the viability and shortcomings of log-structured filesystems. While "The Design and Implementation of a Log-Structured File System" was certainly first, this paper was important in bringing the research idea to a usable system.

Soft Updates: A Solution to the Metadata Update problem in File Systems

  • G. Ganger, M. McKusick
    Marshall Kirk McKusick
    Marshall Kirk McKusick is a computer scientist, known for his extensive work on BSD, from the 1980s to FreeBSD in the present day. He was president of the USENIX Association from 1990 to 1992 and again from 2002 to 2004, and still serves on the board. He is also on the editorial board of...

    , C. Soules, Y. Patt
    Yale Patt
    Yale Nance Patt is an American professor of electrical and computer engineering at The University of Texas at Austin. He holds the Ernest Cockrell, Jr. Centennial Chair in Engineering. In 1965, Patt introduced the WOS module, the first complex logic gate implemented on a single piece of silicon...

  • ACM Transactions on Computer Systems, 18(2):127–153, May 2000.
  • Online version


Description: Soft updates, a new way of maintaining filesystem metadata consistency by carefully ordering buffered writes.
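
A hedged sketch of the idea (not the paper's FFS implementation): the buffer cache records dependencies between dirty blocks and flushes a block only after everything it depends on has reached disk, so on-disk metadata never points at uninitialized structures.

    class Cache:
        def __init__(self):
            self.dirty = {}      # block name -> names it depends on
            self.flushed = []    # order in which blocks reached "disk"

        def write(self, block, depends_on=()):
            self.dirty[block] = set(depends_on)

        def flush(self):
            while self.dirty:
                ready = [b for b, deps in self.dirty.items()
                         if deps <= set(self.flushed)]
                if not ready:    # real soft updates breaks cycles by rollback
                    raise RuntimeError("dependency cycle")
                for block in ready:
                    self.flushed.append(block)
                    del self.dirty[block]
            return self.flushed

    cache = Cache()
    cache.write("inode#7")                               # init inode first
    cache.write("dir-entry->7", depends_on=["inode#7"])  # then reference it
    print(cache.flush())  # -> ['inode#7', 'dir-entry->7']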

The FORTRAN Automatic Coding System

  • John Backus
    John Backus
    John Warner Backus was an American computer scientist. He directed the team that invented the first widely used high-level programming language and was the inventor of the Backus-Naur form , the almost universally used notation to define formal language syntax.He also did research in...

     et al.
  • Proceedings of the WJCC (Western Joint Computer Conference), Los Angeles, California, February, 1957.
  • Online version (PDF)


Description: This paper describes the design and implementation of the first FORTRAN
Fortran
Fortran is a general-purpose, procedural, imperative programming language that is especially suited to numeric computation and scientific computing...

 compiler by the IBM
IBM
International Business Machines Corporation or IBM is an American multinational technology and consulting corporation headquartered in Armonk, New York, United States. IBM manufactures and sells computer hardware and software, and it offers infrastructure, hosting and consulting services in areas...

 team. Fortran is a general-purpose
General-purpose programming language
A general-purpose programming language is a programming language designed to be used for writing software in a wide variety of application domains, rather than being specialized for a particular domain...

, procedural
Procedural programming
Procedural programming can sometimes be used as a synonym for imperative programming , but can also refer to a programming paradigm, derived from structured programming, based upon the concept of the procedure call...

, imperative programming
Imperative programming
In computer science, imperative programming is a programming paradigm that describes computation in terms of statements that change a program state...

 language that is especially suited to numeric computation and scientific computing.

Recursive functions of symbolic expressions and their computation by machine, part I

  • John McCarthy
    John McCarthy (computer scientist)
    John McCarthy was an American computer scientist and cognitive scientist. He coined the term "artificial intelligence" , invented the Lisp programming language and was highly influential in the early development of AI.McCarthy also influenced other areas of computing such as time sharing systems...

    .
  • Communications of the ACM, 3(4):184–195, April 1960.
  • Several online versions


Description: This paper introduced LISP, the first functional programming language, which was used heavily in many areas of computer science, especially in AI. LISP also has powerful features for manipulating LISP programs within the language.

ALGOL 60
ALGOL
ALGOL is a family of imperative computer programming languages originally developed in the mid 1950s which greatly influenced many other languages and became the de facto way algorithms were described in textbooks and academic works for almost the next 30 years...

  • Revised Report on the Algorithmic Language Algol 60 by Peter Naur, et al. – The very influential ALGOL definition, with the first formally defined syntax.
  • B. Randell
    Brian Randell
    Brian Randell is a British computer scientist, and Emeritus Professor at the School of Computing Science, Newcastle University, U.K. He specializes in research in software fault tolerance and dependability, and is a noted authority on the early prior to 1950 history of computers.- Biography...

     and L.J. Russell, ALGOL 60 Implementation: The Translation and Use of ALGOL 60 Programs on a Computer. Academic Press, 1964. The design of the Whetstone Compiler. One of the early published descriptions of implementing a compiler
    Compiler
    A compiler is a computer program that transforms source code written in a programming language into another computer language...

    . See the related papers: Whetstone Algol Revisited, and The Whetstone KDF9 Algol Translator by B. Randell
  • Edsger W. Dijkstra, Algol 60 translation: an Algol 60 translator for the x1 and making a translator for Algol 60, report MR 35/61. Mathematisch Centrum, Amsterdam, 1961. http://www.cs.utexas.edu/users/EWD/MCReps/MR35.PDF


Description: Algol 60 introduced block structure.

Pascal
Pascal (programming language)
Pascal is an influential imperative and procedural programming language, designed in 1968/9 and published in 1970 by Niklaus Wirth as a small and efficient language intended to encourage good programming practices using structured programming and data structuring.A derivative known as Object Pascal...

  • Niklaus Wirth
    Niklaus Wirth
    Niklaus Emil Wirth is a Swiss computer scientist, best known for designing several programming languages, including Pascal, and for pioneering several classic topics in software engineering. In 1984 he won the Turing Award for developing a sequence of innovative computer languages.-Biography:Wirth...

    : The Programming Language Pascal. Acta Informatica, Volume 1, pages 35–63, 1971.
  • Kathleen Jensen and Niklaus Wirth: PASCAL - User Manual and Report. Springer-Verlag, 1974, 1985, 1991, ISBN 0-387-97649-3 and ISBN 3-540-97649-3 http://www.cs.inf.ethz.ch/~wirth/books/Pascal/
  • Niklaus Wirth: Algorithms + Data Structures = Programs. Prentice–Hall, 1975, ISBN 0-13-022418-9 http://www.cs.inf.ethz.ch/~wirth/books/AlgorithmE0/


Description: Pascal introduced good programming practices using structured programming
Structured programming
Structured programming is a programming paradigm aimed on improving the clarity, quality, and development time of a computer program by making extensive use of subroutines, block structures and for and while loops - in contrast to using simple tests and jumps such as the goto statement which could...

 and data structuring
Data structure
In computer science, a data structure is a particular way of storing and organizing data in a computer so that it can be used efficiently.Different kinds of data structures are suited to different kinds of applications, and some are highly specialized to specific tasks...

.

The next 700 programming languages

  • Peter J. Landin
  • Communications of the ACM, 9(3):157–166, March 1966.

Description: This seminal paper proposed an ideal language ISWIM
ISWIM
ISWIM is an abstract computer programming language devised by Peter J. Landin and first described in his article, The Next 700 Programming Languages, published in the Communications of the ACM in 1966...

, which, though never implemented, influenced the whole later development of programming languages.

Lambda Papers

  • Gerald Jay Sussman
    Gerald Jay Sussman
    Gerald Jay Sussman is the Panasonic Professor of Electrical Engineering at the Massachusetts Institute of Technology . He received his S.B. and Ph.D. degrees in mathematics from MIT in 1968 and 1973 respectively. He has been involved in artificial intelligence research at MIT since 1964...

     and Guy L. Steele, Jr.
    Guy L. Steele, Jr.
    Guy Lewis Steele Jr. , also known as "The Great Quux", and GLS , is an American computer scientist who has played an important role in designing and documenting several computer programming languages.-Biography:...

  • AI Memo
    AI Memo
    The AI Memos are a series of influential memorandums and technical reports published by the MIT AI Lab, Massachusetts Institute of Technology, USA...

    s, 1975–1980


Description: This series of papers and reports first defined the influential Scheme programming language and questioned the prevailing practices in programming language design, employing lambda calculus
Lambda calculus
In mathematical logic and computer science, lambda calculus, also written as λ-calculus, is a formal system for function definition, function application and recursion. The portion of lambda calculus relevant to computation is now called the untyped lambda calculus...

 extensively to model programming language concepts and guide efficient implementation without sacrificing expressive power
Expressive power
In computer science, the expressive power of a language describes the ideas expressible in that language.For example, the Web Ontology Language expression language profile lacks ideas which can be expressed in OWL2 RL . OWL2 EL may therefore be said to have less expressive power than OWL2 RL...

.
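
A small illustration of that approach (Python closures standing in for the lambda calculus; not an excerpt from the memos): Booleans and conditionals are modelled as ordinary functions, the Church encoding.

    # Church-encoded booleans: control flow modelled purely with functions.
    TRUE  = lambda a: lambda b: a        # selects its first argument
    FALSE = lambda a: lambda b: b        # selects its second argument
    IF    = lambda c: lambda t: lambda f: c(t)(f)

    AND = lambda p: lambda q: p(q)(p)    # if p then q else p
    NOT = lambda p: p(FALSE)(TRUE)

    def to_bool(church):                 # decode for printing only
        return church(True)(False)

    print(to_bool(AND(TRUE)(NOT(FALSE))))           # -> True
    print(IF(FALSE)("then-branch")("else-branch"))  # -> else-branch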

Structure and Interpretation of Computer Programs
Structure and Interpretation of Computer Programs
Structure and Interpretation of Computer Programs is a textbook published in 1984 about general computer programming concepts from MIT Press written by Massachusetts Institute of Technology professors Harold Abelson and Gerald Jay Sussman, with Julie Sussman...

  • Harold Abelson and Gerald Jay Sussman
    Gerald Jay Sussman
    Gerald Jay Sussman is the Panasonic Professor of Electrical Engineering at the Massachusetts Institute of Technology . He received his S.B. and Ph.D. degrees in mathematics from MIT in 1968 and 1973 respectively. He has been involved in artificial intelligence research at MIT since 1964...

  • MIT Press
    MIT Press
    The MIT Press is a university press affiliated with the Massachusetts Institute of Technology in Cambridge, Massachusetts .-History:...

    , 1984, 1996


Description: This textbook explains core computer programming concepts, and is widely considered a classic text in computer science.

The C Programming Language

  • Brian Kernighan
    Brian Kernighan
    Brian Wilson Kernighan is a Canadian computer scientist who worked at Bell Labs alongside Unix creators Ken Thompson and Dennis Ritchie and contributed to the development of Unix. He is also coauthor of the AWK and AMPL programming languages. The 'K' of K&R C and the 'K' in AWK both stand for...

     and Dennis Ritchie
    Dennis Ritchie
    Dennis MacAlistair Ritchie , was an American computer scientist who "helped shape the digital era." He created the C programming language and, with long-time colleague Ken Thompson, the UNIX operating system...

  • Prentice Hall
    Prentice Hall
    Prentice Hall is a major educational publisher. It is an imprint of Pearson Education, Inc., based in Upper Saddle River, New Jersey, USA. Prentice Hall publishes print and digital content for the 6-12 and higher-education market. Prentice Hall distributes its technical titles through the Safari...

    , 1978, 1988


Description: Co-authored by the man who designed the C programming language
C (programming language)
C is a general-purpose computer programming language developed between 1969 and 1973 by Dennis Ritchie at the Bell Telephone Laboratories for use with the Unix operating system....

, the first edition of this book served for many years as the language's de facto standard. As such, the book is regarded by many as the authoritative reference on C.

The C++ Programming Language
The C++ Programming Language
The C++ Programming Language was the first book to describe the C++ programming language, written by the language’s creator, Bjarne Stroustrup, and first published in October 1985...

  • Bjarne Stroustrup
    Bjarne Stroustrup
    Bjarne Stroustrup ; born December 30, 1950 in Århus, Denmark) is a Danish computer scientist, most notable for the creation and the development of the widely used C++ programming language...

  • Addison–Wesley, 1986, 1997, 2000


Description: Written by the man who designed the C++ programming language
C++
C++ is a statically typed, free-form, multi-paradigm, compiled, general-purpose programming language. It is regarded as an intermediate-level language, as it comprises a combination of both high-level and low-level language features. It was developed by Bjarne Stroustrup starting in 1979 at Bell...

, the first edition of this book served for many years as the language's de facto standard until the publication of the ISO/IEC 14882:1998: Programming Language C++ standard on 1 September 1998.

The Java Programming Language
Java (programming language)
Java is a programming language originally developed by James Gosling at Sun Microsystems and released in 1995 as a core component of Sun Microsystems' Java platform. The language derives much of its syntax from C and C++ but has a simpler object model and fewer low-level facilities...

  • Ken Arnold
    Ken Arnold
    Kenneth Cutts Richard Cabot Arnold is an American computer programmer well known as one of the developers of the 1980s dungeon-crawling computer game Rogue, for his contributions to the original Berkeley distribution of Unix, for his books and articles about C and C++ Kenneth Cutts Richard Cabot ...

    , James Gosling
    James Gosling
    James A. Gosling, OC is a computer scientist, best known as the father of the Java programming language.-Education and career:In 1977, Gosling received a B.Sc in Computer Science from the University of Calgary...

    , David Holmes, The Java Programming Language, Fourth Edition, Addison-Wesley Professional, 2005, ISBN 0-321-34980-6

Computational linguistics
Computational linguistics
Computational linguistics is an interdisciplinary field dealing with the statistical or rule-based modeling of natural language from a computational perspective....

Contains the first presentation of stochastic context-free grammar
Stochastic context-free grammar
A stochastic context-free grammar is a context-free grammar in which each production is augmented with a probability...

s.
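
As a toy Python example (the grammar and names are invented for illustration): each nonterminal's productions carry probabilities, and a derivation is sampled by drawing productions at random.

    # Toy stochastic context-free grammar: probabilities annotate productions.
    import random

    GRAMMAR = {
        "S":  [(0.9, ["NP", "VP"]), (0.1, ["VP"])],
        "NP": [(0.6, ["dogs"]), (0.4, ["cats"])],
        "VP": [(0.7, ["sleep"]), (0.3, ["run"])],
    }

    def sample(symbol):
        if symbol not in GRAMMAR:                 # terminal symbol
            return [symbol]
        rules = GRAMMAR[symbol]
        rhs = random.choices([r for _, r in rules],
                             weights=[p for p, _ in rules])[0]
        return [word for sym in rhs for word in sample(sym)]

    print(" ".join(sample("S")))  # e.g. -> "dogs sleep"
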
The first published description of computational morphology
Morphology (linguistics)
In linguistics, morphology is the identification, analysis and description, in a language, of the structure of morphemes and other linguistic units, such as words, affixes, parts of speech, intonation/stress, or implied context...

 using finite state transducer
Finite state transducer
A finite state transducer is a finite state machine with two tapes: an input tape and an output tape. This contrasts with an ordinary finite state automaton , which has a single tape.-Overview:...

s. (Kaplan and Kay had previously done work in this field and presented it at a conference; the linguist Johnson had remarked on the possibility in 1972, but had not produced any implementation.)
An overview of hidden Markov model
Hidden Markov model
A hidden Markov model is a statistical Markov model in which the system being modeled is assumed to be a Markov process with unobserved states. An HMM can be considered as the simplest dynamic Bayesian network. The mathematics behind the HMM was developed by L. E...

s geared toward speech recognition
Speech recognition
Speech recognition converts spoken words to text. The term "voice recognition" is sometimes used to refer to recognition systems that must be trained to a particular speaker—as is the case for most desktop recognition software...

 and other NLP fields, describing the Viterbi
Viterbi algorithm
The Viterbi algorithm is a dynamic programming algorithm for finding the most likely sequence of hidden states – called the Viterbi path – that results in a sequence of observed events, especially in the context of Markov information sources, and more generally, hidden Markov models...

 and forward-backward algorithms.
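
A minimal Viterbi decoder over a toy HMM, with invented parameters, shows the dynamic program the tutorial describes: each cell keeps the probability of the best path ending in a given state, plus a back-pointer.

    # Minimal Viterbi decoder: most probable hidden-state sequence.
    def viterbi(obs, states, start_p, trans_p, emit_p):
        # best[t][s] = probability of the best path ending in s at time t
        best = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
        back = [{}]
        for t in range(1, len(obs)):
            best.append({}); back.append({})
            for s in states:
                prob, prev = max(
                    (best[t-1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                    for p in states)
                best[t][s], back[t][s] = prob, prev
        # Trace back from the most probable final state.
        state = max(states, key=lambda s: best[-1][s])
        path = [state]
        for t in range(len(obs) - 1, 0, -1):
            state = back[t][state]
            path.append(state)
        return list(reversed(path))

    states  = ["Rain", "Sun"]
    start_p = {"Rain": 0.6, "Sun": 0.4}
    trans_p = {"Rain": {"Rain": 0.7, "Sun": 0.3},
               "Sun":  {"Rain": 0.4, "Sun": 0.6}}
    emit_p  = {"Rain": {"walk": 0.1, "umbrella": 0.9},
               "Sun":  {"walk": 0.8, "umbrella": 0.2}}
    print(viterbi(["walk", "umbrella", "umbrella"],
                  states, start_p, trans_p, emit_p))
    # -> ['Sun', 'Rain', 'Rain']
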
Describes a now commonly used part-of-speech (POS) tagger based on transformation-based learning.
Textbook on statistical and probabilistic methods in NLP.
This survey documents the comparatively under-researched use of lazy functional programming languages (e.g. Haskell
Haskell (programming language)
Haskell is a standardized, general-purpose purely functional programming language, with non-strict semantics and strong static typing. It is named after logician Haskell Curry. In Haskell, "a function is a first-class citizen" of the programming language. As a functional programming language, the...

) to construct natural language processors and to accommodate many linguistic theories.

Software engineering
Software engineering
Software Engineering is the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software, and the study of these approaches; that is, the application of engineering to software...

: Report of a conference sponsored by the NATO Science Committee

  • Peter Naur
    Peter Naur
    Peter Naur is a Danish pioneer in computer science and Turing award winner. His last name is the N in the BNF notation , used in the description of the syntax for most programming languages...

    , Brian Randell
    Brian Randell
    Brian Randell is a British computer scientist, and Emeritus Professor at the School of Computing Science, Newcastle University, U.K. He specializes in research in software fault tolerance and dependability, and is a noted authority on the early prior to 1950 history of computers.- Biography...

     (eds.)
  • Garmisch, Germany, 7–11 October 1968, Brussels, Scientific Affairs Division, NATO (1969) 231pp.
  • Online copy (PDF)


Description: Report of a conference of leading figures in the software field, circa 1968.

The report defined the field of Software engineering
Software engineering
Software Engineering is the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software, and the study of these approaches; that is, the application of engineering to software...


Go To
Goto
goto is a statement found in many computer programming languages. It is a combination of the English words go and to. It performs a one-way transfer of control to another line of code; in contrast a function call normally returns control...

 Statement Considered Harmful
Considered harmful
In computer science and related disciplines, considered harmful is a phrase popularly used in the titles of diatribes and other critical essays ....

  • Dijkstra, E. W.
    Edsger Dijkstra
    Edsger Wybe Dijkstra ; ) was a Dutch computer scientist. He received the 1972 Turing Award for fundamental contributions to developing programming languages, and was the Schlumberger Centennial Chair of Computer Sciences at The University of Texas at Austin from 1984 until 2000.Shortly before his...

  • Communications of the ACM
    Communications of the ACM
    Communications of the ACM is the flagship monthly journal of the Association for Computing Machinery . First published in 1957, CACM is sent to all ACM members, currently numbering about 80,000. The articles are intended for readers with backgrounds in all areas of computer science and information...

    , 11(3):147–148, March 1968
  • Online copy (PDF)


Description: Don't use goto – the beginning of structured programming
Structured programming
Structured programming is a programming paradigm aimed on improving the clarity, quality, and development time of a computer program by making extensive use of subroutines, block structures and for and while loops - in contrast to using simple tests and jumps such as the goto statement which could...

.

On the criteria to be used in decomposing systems into modules

  • David Parnas
    David Parnas
    David Lorge Parnas is a Canadian early pioneer of software engineering, who developed the concept of information hiding in modular programming, which is an important element of object-oriented programming today. He is also noted for his advocacy of precise documentation.- Biography :Parnas earned...

  • Communications of the ACM
    Communications of the ACM
    Communications of the ACM is the flagship monthly journal of the Association for Computing Machinery . First published in 1957, CACM is sent to all ACM members, currently numbering about 80,000. The articles are intended for readers with backgrounds in all areas of computer science and information...

    , Volume 15, Issue 12:1053–1058, December 1972.
  • Online copy (PDF)


Description: The importance of modularization and information hiding
Information hiding
In computer science, information hiding is the principle of segregation of the design decisions in a computer program that are most likely to change, thus protecting other parts of the program from extensive modification if the design decision is changed...

. Note that information hiding was first presented in an earlier paper by the same author – "Information Distribution Aspects of Design Methodology", Proceedings of IFIP Congress '71, 1971, Booklet TA-3, pp. 26–30.

Hierarchical Program Structures

  • Ole-Johan Dahl
    Ole-Johan Dahl
    Ole-Johan Dahl was a Norwegian computer scientist and is considered to be one of the fathers of Simula and object-oriented programming along with Kristen Nygaard.- Career :...

    , C. A. R. Hoare
    C. A. R. Hoare
    Sir Charles Antony Richard Hoare , commonly known as Tony Hoare or C. A. R. Hoare, is a British computer scientist best known for the development of Quicksort, one of the world's most widely used sorting algorithms...

  • in Dahl, Dijkstra and Hoare, Structured Programming, Academic Press, London and New York, pp. 175–220, 1972.


Description: The beginning of Object-oriented programming
Object-oriented programming
Object-oriented programming is a programming paradigm using "objects" – data structures consisting of data fields and methods together with their interactions – to design applications and computer programs. Programming techniques may include features such as data abstraction,...

. This paper argued that programs should be decomposed into independent components with small and simple interfaces. The authors also argued that objects should have both data and related methods.
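
As a tiny modern illustration (Python rather than the paper's Simula-style examples): the component bundles its data with the related methods and exposes only a small interface.

    # A component packages its data with the methods that operate on it.
    class Counter:
        def __init__(self):
            self._count = 0          # data hidden behind the interface

        def increment(self):         # related method packaged with the data
            self._count += 1

        def value(self):
            return self._count

    c = Counter()
    c.increment(); c.increment()
    print(c.value())  # -> 2; callers never touch _count directly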

A technique for software module specification with examples

  • David Parnas
    David Parnas
    David Lorge Parnas is a Canadian early pioneer of software engineering, who developed the concept of information hiding in modular programming, which is an important element of object-oriented programming today. He is also noted for his advocacy of precise documentation.- Biography :Parnas earned...

  • Communications of the ACM, 15(5):330–336, May 1972.
  • Online copy (PDF)


Description: A technique for the precise specification of software modules.

Structured Design

  • Wayne Stevens
    Wayne Stevens
    Wayne P. Stevens was an American software engineer, consultant, author, pioneer, and advocate of the practical application of software methods and tools.- Life & Work :...

    , Glenford Myers, and Larry Constantine
    Larry Constantine
    Larry LeRoy Constantine is an American software engineer and professor in the Mathematics and Engineering Department at the University of Madeira Portugal, who is considered one of the pioneers of computing...

  • IBM Systems Journal, 13 (2), 115–139, 1974.
  • On-line copy (PDF)


Description: Seminal paper on Structured Design
Structured Systems Analysis and Design Method
Structured systems analysis and design method is a systems approach to the analysis and design of information systems. SSADM was produced for the Central Computer and Telecommunications Agency , a UK government office concerned with the use of technology in government, from 1980 onwards.- Overview...

, data flow diagram
Data flow diagram
A data flow diagram is a graphical representation of the "flow" of data through an information system, modelling its process aspects. Often they are a preliminary step used to create an overview of the system which can later be elaborated...

, coupling
Coupling (computer science)
In computer science, coupling or dependency is the degree to which each program module relies on each one of the other modules.Coupling is usually contrasted with cohesion. Low coupling often correlates with high cohesion, and vice versa...

, and cohesion
Cohesion (computer science)
In computer programming, cohesion is a measure of how strongly-related each piece of functionality expressed by the source code of a software module is...

.

The Emperor's Old Clothes

  • C.A.R. Hoare
  • Communications of the ACM, Vol. 24, No. 2, February 1981, pp. 75–83.
  • Archived copy (PDF)


Description: A lovely story of how large software projects can go right, and then wrong, and then right again, told with humility and humor. Illustrates the "second-system effect
Second-system effect
The second-system effect refers to the tendency of small, elegant, and successful systems to have elephantine, feature-laden monstrosities as their successors. The term was first used by Fred Brooks in his classic The Mythical Man-Month...

" and the importance of simplicity.

The Mythical Man-Month
The Mythical Man-Month
The Mythical Man-Month: Essays on Software Engineering is a book on software engineering and project management by Fred Brooks, whose central theme is that "adding manpower to a late software project makes it later"...

: Essays on Software Engineering

  • Brooks, Jr., F. P.
    Fred Brooks
    Frederick Phillips Brooks, Jr. is a software engineer and computer scientist, best known for managing the development of IBM's System/360 family of computers and the OS/360 software support package, then later writing candidly about the process in his seminal book The Mythical Man-Month...

  • Addison Wesley Professional. 2nd edition, 1995.


Description: Throwing more people at the task will not speed its completion...

No Silver Bullet
No Silver Bullet
"No Silver Bullet — Essence and Accidents of Software Engineering" is a widely discussed paper on software engineering written by Fred Brooks in 1986...

: Essence and Accidents of Software Engineering

Description: We will keep having problems with software...

The Cathedral and the Bazaar
The Cathedral and the Bazaar
The Cathedral and the Bazaar is an essay by Eric S. Raymond on software engineering methods, based on his observations of the Linux kernel development process and his experiences managing an open source project, fetchmail. It examines the struggle between top-down and bottom-up design...

  • Raymond, E.S.
  • First Monday
    First Monday (journal)
    First Monday is an open-access electronic peer-reviewed scientific journal for articles about the Internet.-Publication:First Monday is sponsored and hosted by the University of Illinois at Chicago...

    , 3, 3 (March 1998)
  • Online copy (HTML)


Description: Open source
Open source
The term open source describes practices in production and development that promote access to the end product's source materials. Some consider open source a philosophy, others consider it a pragmatic methodology...

 methodology.

Design Patterns: Elements of Reusable Object Oriented Software

  • E. Gamma
    Erich Gamma
    Erich Gamma is Swiss computer scientist and co-author of the influential Software engineering textbook, Design Patterns: Elements of Reusable Object-Oriented Software. He co-wrote the JUnit software testing framework with Kent Beck and led the design of the Eclipse platform's Java Development Tools...

    , R. Helm, R. Johnson, J. Vlissides
    John Vlissides
    John Matthew Vlissides was a software scientist known mainly as one of the four authors of the book Design Patterns: Elements of Reusable Object-Oriented Software...

  • Addison–Wesley, Reading, Massachusetts, 1995.


Description: This book was the first to define and list design pattern
Design pattern (computer science)
In software engineering, a design pattern is a general reusable solution to a commonly occurring problem within a given context in software design. A design pattern is not a finished design that can be transformed directly into code. It is a description or template for how to solve a problem that...

s in computer science.
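
As one hedged illustration of what a pattern from the book's catalogue looks like in practice, here is a minimal Observer in Python (the class names are invented): a subject notifies registered observers of events without knowing their concrete types.

    # Observer pattern sketch: loose coupling via a single update() method.
    class Subject:
        def __init__(self):
            self._observers = []

        def attach(self, observer):
            self._observers.append(observer)

        def notify(self, event):
            for observer in self._observers:
                observer.update(event)

    class Logger:
        def update(self, event):      # the only coupling is this method
            print(f"logged: {event}")

    subject = Subject()
    subject.attach(Logger())
    subject.notify("state changed")   # -> logged: state changed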

Statecharts: A Visual Formalism For Complex Systems

  • David Harel
    David Harel
    David Harel is a professor of computer science at the Weizmann Institute of Science in Israel. Born in London, England, he was Dean of the Faculty of Mathematics and Computer Science at the institute for seven years.-Biography:...

  • D. Harel. Statecharts: A visual formalism for complex systems. Science of Computer Programming, 8:231–274, 1987.
  • Online version


Description: Statecharts are a visual modeling method. They extend conventional state machines and can be exponentially more compact, which makes it practical to model applications that were previously too complex for formal modeling. Statecharts are part of the UML
Unified Modeling Language
Unified Modeling Language is a standardized general-purpose modeling language in the field of object-oriented software engineering. The standard is managed, and was created, by the Object Management Group...

 diagrams.
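
A toy sketch of one statechart feature, state hierarchy, in Python (invented model, not Harel's notation): a substate inherits the transitions of its parent, so a shared transition such as powering off is written once rather than once per substate.

    # Hierarchy keeps the machine compact: "playing" and "paused" both
    # inherit the POWER transition from their parent state "on".
    STATES = {
        #  state      parent   transitions: event -> target state
        "on":      (None,  {"POWER": "off"}),
        "playing": ("on",  {"PAUSE": "paused"}),
        "paused":  ("on",  {"PLAY":  "playing"}),
        "off":     (None,  {"POWER": "playing"}),
    }

    def step(state, event):
        # Walk up the hierarchy until some ancestor handles the event.
        current = state
        while current is not None:
            parent, transitions = STATES[current]
            if event in transitions:
                return transitions[event]
            current = parent
        return state                   # unhandled event is ignored

    s = "playing"
    s = step(s, "PAUSE")   # -> paused
    s = step(s, "POWER")   # handled by parent "on" -> off
    print(s)               # -> off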

Theoretical computer science

Topics covered: theoretical computer science
Theoretical computer science
Theoretical computer science is a division or subset of general computer science and mathematics which focuses on more abstract or mathematical aspects of computing....

, including computability theory
Computability theory
Computability theory, also called recursion theory, is a branch of mathematical logic that originated in the 1930s with the study of computable functions and Turing degrees. The field has grown to include the study of generalized computability and definability...

, computational complexity theory
Computational complexity theory
Computational complexity theory is a branch of the theory of computation in theoretical computer science and mathematics that focuses on classifying computational problems according to their inherent difficulty, and relating those classes to each other...

, algorithm
Algorithm
In mathematics and computer science, an algorithm is an effective method expressed as a finite list of well-defined instructions for calculating a function. Algorithms are used for calculation, data processing, and automated reasoning...

s, algorithmic information theory
Algorithmic information theory
Algorithmic information theory is a subfield of information theory and computer science that concerns itself with the relationship between computation and information...

, information theory
Information theory
Information theory is a branch of applied mathematics and electrical engineering involving the quantification of information. Information theory was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data and on reliably storing and...

 and formal verification
Formal verification
In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of intended algorithms underlying a system with respect to a certain formal specification or property, using formal methods of mathematics .- Usage :Formal verification can be...

.

See also

  • DBLP
    DBLP
    DBLP is a computer science bibliography website hosted at Universität Trier, in Germany. It was originally a database and logic programming bibliography site, and has existed at least since the 1980s. DBLP listed more than 1.3 million articles on computer science in January 2010...

     (Digital Bibliography & Library Project in computer science)
  • Lists of important publications in science
  • List of open problems in computer science
  • The Collection of Computer Science Bibliographies
    The Collection of Computer Science Bibliographies
    The Collection of Computer Science Bibliographies is one of the oldest bibliography collections freely accessible on the Internet. It is a collection of bibliographies of scientific literature in computer science and mathematics from various sources, covering most aspects of computer science...

  • Paris Kanellakis Award
    Paris Kanellakis Award
    The Paris Kanellakis Theory and Practice Award is granted yearly by the Association for Computing Machinery to honor specific theoretical accomplishments that have had a significant and demonstrable effect on the practice of computing...

    , a prize given to honor specific theoretical accomplishments that have had a significant and demonstrable effect on the practice of computing.

External links


Academic Search Engines
