Von Neumann architecture
The term von Neumann architecture, also known as the von Neumann model, derives from a computer architecture proposal by the mathematician and early computer scientist John von Neumann and others, dated June 30, 1945, entitled First Draft of a Report on the EDVAC. This describes a design architecture for an electronic digital computer with subdivisions of a processing unit consisting of an arithmetic logic unit and processor registers; a control unit containing an instruction register and program counter; a memory to store both data and instructions; external mass storage; and input and output mechanisms. The meaning of the term has since evolved to mean a stored-program computer in which an instruction fetch and a data operation cannot occur at the same time, because they share a common bus. This is referred to as the von Neumann bottleneck, and it often limits the performance of the system.
The design of a von Neumann architecture is simpler than that of the more modern Harvard architecture, which is also a stored-program system but has one dedicated set of address and data buses for reading and writing to memory, and another set of address and data buses for fetching instructions.
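The difference the two bus arrangements make can be sketched with a toy cycle-count model. Everything here is an illustrative assumption (one bus transfer per cycle, perfect overlap on the Harvard side), not a model of any real machine:

```python
# Toy cycle-count model comparing a shared von Neumann bus with the
# separate instruction/data buses of a Harvard machine.
# Assumption: every bus transfer takes exactly one cycle.

def cycles_von_neumann(n_instructions, data_accesses_per_instr=1):
    # Shared bus: each instruction fetch and each data access is a
    # separate transfer, serialized on the single bus.
    return n_instructions * (1 + data_accesses_per_instr)

def cycles_harvard(n_instructions, data_accesses_per_instr=1):
    # Separate buses: instruction fetches can overlap data accesses,
    # so the busier of the two streams dominates.
    return max(n_instructions, n_instructions * data_accesses_per_instr)

print(cycles_von_neumann(1000))  # 2000: fetches and data serialized
print(cycles_harvard(1000))      # 1000: the two streams overlap
```

Under these assumptions the shared bus costs twice as many cycles for a one-load-per-instruction workload, which is the bottleneck effect described later in the article.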
A stored-program digital computer is one that keeps its programmed instructions, as well as its data, in read-write, random-access memory (RAM). Stored-program computers were an advancement over the program-controlled computers of the 1940s, such as the Colossus and the ENIAC, which were programmed by setting switches and inserting patch leads to route data and control signals between various functional units. In the vast majority of modern computers, the same memory is used for both data and program instructions.
History
The earliest computing machines had fixed programs, and some very simple computers still use this design, either for simplicity or for training purposes. For example, a desk calculator (in principle) is a fixed-program computer: it can do basic mathematics, but it cannot be used as a word processor or a gaming console. Changing the program of a fixed-program machine requires re-wiring, re-structuring, or re-designing the machine. The earliest computers were not so much "programmed" as they were "designed". "Reprogramming", when it was possible at all, was a laborious process, starting with flowcharts and paper notes, followed by detailed engineering designs, and then the often-arduous process of physically re-wiring and re-building the machine. It could take three weeks to set up a program on ENIAC and get it working.
With the proposal of the stored-program computer, this changed. A stored-program computer includes, by design, an instruction set, and it can store in memory a set of instructions (a program) that details the computation.
A stored-program design also allows for self-modifying code. One early motivation for such a facility was the need for a program to increment or otherwise modify the address portion of instructions, which had to be done manually in early designs. This became less important when index registers and indirect addressing became usual features of machine architecture. Self-modifying code has largely fallen out of favor, since it is usually hard to understand and debug, as well as being inefficient under modern processor pipelining and caching schemes.
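The "modify the address portion of an instruction" idiom mentioned above can be made concrete: the loop below steps through an array by having the program rewrite its own LOAD instruction's address field, standing in for a machine without index registers or indirect addressing. The opcodes are invented for this sketch:

```python
# Self-modifying-code sketch: the INCI instruction bumps the address
# field of the LOAD at memory[0], so each loop pass reads the next
# array element. No index registers, no indirect addressing.

memory = [
    ["LOAD", 6],    # 0: acc <- memory[operand]; operand starts at 6
    ["ADDA"],       # 1: total <- total + acc
    ["INCI"],       # 2: memory[0][1] += 1  (the self-modifying step)
    ["JLT", 9, 0],  # 3: if memory[0][1] < 9, jump back to 0
    ["HALT"],       # 4: stop
    None,           # 5: padding
    2, 3, 4,        # 6..8: the data array being summed
]

pc, acc, total = 0, 0, 0
while True:
    instr = memory[pc]
    pc += 1
    if instr[0] == "LOAD":
        acc = memory[instr[1]]
    elif instr[0] == "ADDA":
        total += acc
    elif instr[0] == "INCI":
        memory[0][1] += 1        # rewrite the LOAD's address field
    elif instr[0] == "JLT":
        if memory[0][1] < instr[1]:
            pc = instr[2]
    elif instr[0] == "HALT":
        break

print(total)  # 2 + 3 + 4 = 9
```

The drawback the article notes is also visible here: after the run, the program in memory is no longer the program that was loaded, which is what makes such code hard to debug and hostile to instruction caches.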
On a large scale, the ability to treat instructions as data is what makes assemblers, compilers, and other automated programming tools possible: one can "write programs which write programs". On a smaller scale, I/O-intensive machine operations such as the BITBLT primitive, used to modify images on a bitmap display, were once thought to be impossible to implement without custom hardware. It was shown later that these operations could be implemented efficiently by "on-the-fly compilation" ("just-in-time compilation") technology, e.g., code-generating programs, one form of self-modifying code that has remained popular.
There are drawbacks to the von Neumann design. Aside from the von Neumann bottleneck described below, program modifications can be quite harmful, either by accident or by design. In some simple stored-program computer designs, a malfunctioning program can damage itself, other programs, or the operating system, possibly leading to a computer crash. Memory protection and other forms of access control can usually protect against both accidental and malicious program modification.
Development of the stored-program concept
The mathematician Alan Turing, who had been alerted to a problem of mathematical logic by the lectures of Max Newman at the University of Cambridge, wrote a paper in 1936 entitled On Computable Numbers, with an Application to the Entscheidungsproblem, which was published in the Proceedings of the London Mathematical Society. In it he described a hypothetical machine which he called a "universal computing machine", now known as the "universal Turing machine". The hypothetical machine had an infinite store (memory, in today's terminology) that contained both instructions and data. The German engineer Konrad Zuse independently wrote about this concept in 1936. John von Neumann became acquainted with Turing when he was a visiting professor at Cambridge in 1935, and also during Turing's PhD year at the Institute for Advanced Study in Princeton in 1936-37. Whether he knew of Turing's 1936 paper at that time is not clear.
Independently, J. Presper Eckert and John Mauchly, who were developing the ENIAC at the Moore School of Electrical Engineering of the University of Pennsylvania, wrote about the stored-program concept in December 1943. In planning a new machine, EDVAC, Eckert wrote in January 1944 that they would store data and programs in a new addressable memory device, a mercury metal delay line memory. This was the first time the construction of a practical stored-program machine was proposed. At that time, he and Mauchly were not aware of Turing's work.
Von Neumann was involved in the Manhattan Project at the Los Alamos National Laboratory, which required huge amounts of calculation. This drew him to the ENIAC project in the summer of 1944. There he joined the ongoing discussions on the design of a stored-program computer, the EDVAC. As part of that group, he volunteered to write up a description of it and produced the First Draft of a Report on the EDVAC, which included ideas from Eckert and Mauchly. It was unfinished when his colleague Herman Goldstine circulated it with only von Neumann's name on it, to the consternation of Eckert and Mauchly. The paper was read by dozens of von Neumann's colleagues in America and Europe, and it influenced the next round of computer designs.
Von Neumann was, then, not alone in putting forward the idea of the stored-program architecture, and Jack Copeland considers that it is "historically inappropriate, to refer to electronic stored-program digital computers as 'von Neumann machines'". His Los Alamos colleague Stan Frankel said of von Neumann's regard for Turing's ideas:
At the time that the First Draft report was circulated, Turing was producing a report entitled Proposed Electronic Calculator, which described, in engineering and programming detail, his idea of a machine called the Automatic Computing Engine (ACE). He presented this to the executive committee of the British National Physical Laboratory on February 19, 1946. Although Turing knew from his wartime experience at Bletchley Park that what he proposed was feasible, the secrecy surrounding Colossus, which was subsequently maintained for several decades, prevented him from saying so. Various successful implementations of the ACE design were produced.
Both von Neumann's and Turing's papers described stored-program computers, but von Neumann's earlier paper achieved greater circulation and the computer architecture it outlined became known as the "von Neumann architecture". In the 1953 publication Faster than Thought: A Symposium on Digital Computing Machines (edited by B.V. Bowden), a section in the chapter on Computers in America reads as follows:
In the same book, the first two paragraphs of a chapter on ACE read as follows:
The von Neumann design also allows input and output devices to be treated the same as memory: a single system bus could be used to provide a modular system with lower cost. This is sometimes called a "streamlining" of the architecture.
In subsequent decades, simple microcontrollers would sometimes omit features of the model to lower cost and size.
Larger computers added features for higher performance.
Von Neumann bottleneck
The shared bus between the program memory and the data memory limits the throughput (data transfer rate) between the CPU and memory relative to the amount of memory. Because program memory and data memory cannot be accessed at the same time, throughput is much smaller than the rate at which the CPU can work. This seriously limits the effective processing speed when the CPU is required to perform minimal processing on large amounts of data: the CPU is continuously forced to wait for needed data to be transferred to or from memory. Since CPU speed and memory size have increased much faster than the throughput between them, the bottleneck has become more of a problem, one whose severity increases with every new generation of CPU.
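A back-of-envelope calculation shows why minimal processing on large data is the worst case. All of the figures below are made-up round numbers chosen for illustration, not measurements of any real system:

```python
# Hypothetical machine: a CPU that could retire 1e9 simple operations
# per second, fed by a memory bus that moves 1e9 bytes per second.
# If every operation needs one 8-byte operand from memory, the bus,
# not the CPU, sets the effective speed.

cpu_ops_per_s = 1e9       # assumed peak instruction rate
bus_bytes_per_s = 1e9     # assumed memory throughput
bytes_per_operand = 8

operands_per_s = bus_bytes_per_s / bytes_per_operand
effective_ops_per_s = min(cpu_ops_per_s, operands_per_s)

print(f"memory-limited to {effective_ops_per_s:.2e} ops/s, "
      f"{effective_ops_per_s / cpu_ops_per_s:.1%} of CPU peak")
```

Under these assumptions the CPU spends nearly ninety percent of its time waiting on the bus; doubling CPU speed would change nothing, while doubling bus throughput would double effective speed, which is the asymmetry the paragraph above describes.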
The term "von Neumann bottleneck" was coined by John Backus
in his 1977 ACM Turing Award
lecture. According to Backus:
The performance problem can be alleviated, to some extent, by several mechanisms. Providing a cache between the CPU and the main memory; providing separate caches or separate access paths for data and instructions (the so-called modified Harvard architecture); using branch-predictor algorithms and logic; and providing a limited CPU stack to reduce memory accesses are four of the ways performance is increased. The problem can also be sidestepped somewhat by using parallel computing, for example with the Non-Uniform Memory Access (NUMA) architecture; this approach is commonly employed by supercomputers. It is less clear whether the intellectual bottleneck that Backus criticized has changed much since 1977. Backus's proposed solution has not had a major influence. Modern functional programming and object-oriented programming are much less geared towards "pushing vast numbers of words back and forth" than earlier languages like Fortran were, but internally, that is still what computers spend much of their time doing, even highly parallel supercomputers.
Non-von Neumann processors
The National Semiconductor COP8 was introduced in 1986; it has a modified Harvard architecture.
Perhaps the most common kind of non-von Neumann structure used in modern computers is content-addressable memory (CAM).
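The contrast between conventional and content-addressable lookup can be shown with a software analogue. A Python dict here only mimics the access pattern; a hardware CAM compares all entries in parallel rather than hashing:

```python
# Conventional RAM lookup: address in, content out.
ram = ["cat", "dog", "emu"]
print(ram[1])        # "dog"

# CAM-style lookup: content in, location out. We model this by keying
# a table on the stored content itself.
cam = {word: addr for addr, word in enumerate(ram)}
print(cam["dog"])    # 1
```

Hardware CAMs use this content-in, match-out pattern in TLBs and network routing tables, where the query is "where is this value?" rather than "what is at this address?".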
In some cases, emerging memristor technology may be able to circumvent the von Neumann bottleneck.
Computer architecture
In computer science and engineering, computer architecture is the practical art of selecting and interconnecting hardware components to create computers that meet functional, performance and cost goals and the formal modelling of those systems....
proposal by the mathematician
Mathematician
A mathematician is a person whose primary area of study is the field of mathematics. Mathematicians are concerned with quantity, structure, space, and change....
and early computer scientist
Computer scientist
A computer scientist is a scientist who has acquired knowledge of computer science, the study of the theoretical foundations of information and computation and their application in computer systems....
John von Neumann
John von Neumann
John von Neumann was a Hungarian-American mathematician and polymath who made major contributions to a vast number of fields, including set theory, functional analysis, quantum mechanics, ergodic theory, geometry, fluid dynamics, economics and game theory, computer science, numerical analysis,...
and others, dated June 30, 1945, entitled First Draft of a Report on the EDVAC
First Draft of a Report on the EDVAC
The First Draft of a Report on the EDVAC was an incomplete 101-page document written by John von Neumann and distributed on June 30, 1945 by Herman Goldstine, security officer on the classified ENIAC project...
. This describes a design architecture for an electronic digital computer with subdivisions of a processing unit consisting of an arithmetic logic unit
Arithmetic logic unit
In computing, an arithmetic logic unit is a digital circuit that performs arithmetic and logical operations.The ALU is a fundamental building block of the central processing unit of a computer, and even the simplest microprocessors contain one for purposes such as maintaining timers...
and processor register
Processor register
In computer architecture, a processor register is a small amount of storage available as part of a CPU or other digital processor. Such registers are addressed by mechanisms other than main memory and can be accessed more quickly...
s, a control unit
Control unit
A control unit in general is a central part of the machinery that controls its operation, provided that a piece of machinery is complex and organized enough to contain any such unit. One domain in which the term is specifically used is the area of computer design...
containing an instruction register
Instruction register
In computing, an instruction register is the part of a CPU's control unit that stores the instruction currently being executed or decoded. In simple processors each instruction to be executed is loaded into the instruction register which holds it while it is decoded, prepared and ultimately...
and program counter
Program counter
The program counter , commonly called the instruction pointer in Intel x86 microprocessors, and sometimes called the instruction address register, or just part of the instruction sequencer in some computers, is a processor register that indicates where the computer is in its instruction sequence...
, a memory
Computer memory
In computing, memory refers to the physical devices used to store programs or data on a temporary or permanent basis for use in a computer or other digital electronic device. The term primary memory is used for the information in physical systems which are fast In computing, memory refers to the...
to store both data and instructions, external mass storage
Mass storage
In computing, mass storage refers to the storage of large amounts of data in a persisting and machine-readable fashion. Devices and/or systems that have been described as mass storage include tape libraries, RAID systems, hard disk drives, magnetic tape drives, optical disc drives, magneto-optical...
, and input and output mechanisms. The meaning of the term has evolved to mean a stored-program computer
Stored-program computer
A stored-program computer is one which stores program instructions in electronic memory. Often the definition is extended with the requirement that the treatment of programs and data in memory be interchangeable or uniform....
in which an instruction fetch and a data operation cannot occur at the same time because they share a common bus. This is referred to as the Von Neumann bottleneck and often limits the performance of the system.
The design of a Von Neumann architecture is simpler than the more modern Harvard architecture
Harvard architecture
The Harvard architecture is a computer architecture with physically separate storage and signal pathways for instructions and data. The term originated from the Harvard Mark I relay-based computer, which stored instructions on punched tape and data in electro-mechanical counters...
which is also a stored-program system but has one dedicated address and data buses for memory, and another set of address and data buses for fetching instructions.
A stored-program digital computer is one that keeps its programmed
Computer program
A computer program is a sequence of instructions written to perform a specified task with a computer. A computer requires programs to function, typically executing the program's instructions in a central processor. The program has an executable form that the computer can use directly to execute...
instructions, as well as its data, in read-write
Read-write memory
Read-write memory is a type of computer memory that may be relatively easily written to as well as read from . The term RAM is often used to describe writable memory. RAM actually referring to memory that can be accessed at any "location"....
, random-access memory
Random-access memory
Random access memory is a form of computer data storage. Today, it takes the form of integrated circuits that allow stored data to be accessed in any order with a worst case performance of constant time. Strictly speaking, modern types of DRAM are therefore not random access, as data is read in...
(RAM). Stored-program computers were an advancement over the program-controlled computers of the 1940s, such as the Colossus
Colossus computer
Not to be confused with the fictional computer of the same name in the movie Colossus: The Forbin Project.Colossus was the world's first electronic, digital, programmable computer. Colossus and its successors were used by British codebreakers to help read encrypted German messages during World War II...
and the ENIAC
ENIAC
ENIAC was the first general-purpose electronic computer. It was a Turing-complete digital computer capable of being reprogrammed to solve a full range of computing problems....
, which were programmed by setting switches and inserting patch leads to route data and to control signals between various functional units. In the vast majority of modern computers, the same memory is used for both data and program instructions.
History
The earliest computing machines had fixed programs. Some very simple computers still use this design, either for simplicity or training purposes. For example, a desk calculatorCalculator
An electronic calculator is a small, portable, usually inexpensive electronic device used to perform the basic operations of arithmetic. Modern calculators are more portable than most computers, though most PDAs are comparable in size to handheld calculators.The first solid-state electronic...
(in principle) is a fixed program computer. It can do basic mathematics
Mathematics
Mathematics is the study of quantity, space, structure, and change. Mathematicians seek out patterns and formulate new conjectures. Mathematicians resolve the truth or falsity of conjectures by mathematical proofs, which are arguments sufficient to convince other mathematicians of their validity...
, but it cannot be used as a word processor
Word processor
A word processor is a computer application used for the production of any sort of printable material....
or a gaming console. Changing the program of a fixed-program machine requires re-wiring, re-structuring, or re-designing the machine. The earliest computers were not so much "programmed" as they were "designed". "Reprogramming", when it was possible at all, was a laborious process, starting with flowchart
Flowchart
A flowchart is a type of diagram that represents an algorithm or process, showing the steps as boxes of various kinds, and their order by connecting these with arrows. This diagrammatic representation can give a step-by-step solution to a given problem. Process operations are represented in these...
s and paper notes, followed by detailed engineering designs, and then the often-arduous process of physically re-wiring and re-building the machine. It could take three weeks to set up a program on ENIAC
ENIAC
ENIAC was the first general-purpose electronic computer. It was a Turing-complete digital computer capable of being reprogrammed to solve a full range of computing problems....
and get it working.
With the proposal of the stored-program computer this changed. A stored-program computer includes by design an instruction set
Instruction set
An instruction set, or instruction set architecture , is the part of the computer architecture related to programming, including the native data types, instructions, registers, addressing modes, memory architecture, interrupt and exception handling, and external I/O...
and can store in memory a set of instructions (a program
Computer program
A computer program is a sequence of instructions written to perform a specified task with a computer. A computer requires programs to function, typically executing the program's instructions in a central processor. The program has an executable form that the computer can use directly to execute...
) that details the computation
Computation
Computation is defined as any type of calculation. Also defined as use of computer technology in Information processing.Computation is a process following a well-defined model understood and expressed in an algorithm, protocol, network topology, etc...
.
A stored-program design also allows for self-modifying code
Self-modifying code
In computer science, self-modifying code is code that alters its own instructions while it is executing - usually to reduce the instruction path length and improve performance or simply to reduce otherwise repetitively similar code, thus simplifying maintenance...
. One early motivation for such a facility was the need for a program to increment or otherwise modify the address portion of instructions, which had to be done manually in early designs. This became less important when index register
Index register
An index registerCommonly known as a B-line in early British computers. in a computer's CPU is a processor register used for modifying operand addresses during the run of a program, typically for doing vector/array operations...
s and indirect address
Addressing mode
Addressing modes are an aspect of the instruction set architecture in most central processing unit designs. The various addressing modes that are defined in a given instruction set architecture define how machine language instructions in that architecture identify the operand of each instruction...
ing became usual features of machine architecture. Self-modifying code has largely fallen out of favor, since it is usually hard to understand and debug
Debugging
Debugging is a methodical process of finding and reducing the number of bugs, or defects, in a computer program or a piece of electronic hardware, thus making it behave as expected. Debugging tends to be harder when various subsystems are tightly coupled, as changes in one may cause bugs to emerge...
, as well as being inefficient under modern processor pipelining and caching schemes.
On a large scale, the ability to treat instructions as data is what makes assemblers, compiler
Compiler
A compiler is a computer program that transforms source code written in a programming language into another computer language...
s and other automated programming tools possible. One can "write programs which write programs". On a smaller scale, I/O-intensive machine instructions such as the BITBLT
Bit blit
Bit BLIT is a computer graphics operation in which several bitmaps are combined into one using a raster operator....
primitive used to modify images on a bitmap display, were once thought to be impossible to implement without custom hardware. It was shown later that these instructions could be implemented efficiently by "on the fly compilation" ("just-in-time compilation
Just-in-time compilation
In computing, just-in-time compilation , also known as dynamic translation, is a method to improve the runtime performance of computer programs. Historically, computer programs had two modes of runtime operation, either interpreted or static compilation...
") technology, e.g., code-generating programs—one form of self-modifying code that has remained popular.
There are drawbacks to the Von Neumann design. Aside from the Von Neumann bottleneck described below, program modifications can be quite harmful, either by accident or design. In some simple stored-program computer designs, a malfunctioning program can damage itself, other programs, or the operating system
Operating system
An operating system is a set of programs that manage computer hardware resources and provide common services for application software. The operating system is the most important type of system software in a computer system...
, possibly leading to a computer crash
Crash (computing)
A crash in computing is a condition where a computer or a program, either an application or part of the operating system, ceases to function properly, often exiting after encountering errors. Often the offending program may appear to freeze or hang until a crash reporting service documents...
. Memory protection
Memory protection
Memory protection is a way to control memory access rights on a computer, and is a part of most modern operating systems. The main purpose of memory protection is to prevent a process from accessing memory that has not been allocated to it. This prevents a bug within a process from affecting...
and other forms of access control
Access control
Access control refers to exerting control over who can interact with a resource. Often but not always, this involves an authority, who does the controlling. The resource can be a given building, group of buildings, or computer-based information system...
can usually protect against both accidental and malicious program modification.
Development of the stored-program concept
The mathematician Alan TuringAlan Turing
Alan Mathison Turing, OBE, FRS , was an English mathematician, logician, cryptanalyst, and computer scientist. He was highly influential in the development of computer science, providing a formalisation of the concepts of "algorithm" and "computation" with the Turing machine, which played a...
, who had been alerted to a problem of mathematical logic by the lectures of Max Newman
Max Newman
Maxwell Herman Alexander "Max" Newman, FRS was a British mathematician and codebreaker.-Pre–World War II:Max Newman was born Maxwell Neumann in Chelsea, London, England, on 7 February 1897...
at the University of Cambridge
University of Cambridge
The University of Cambridge is a public research university located in Cambridge, United Kingdom. It is the second-oldest university in both the United Kingdom and the English-speaking world , and the seventh-oldest globally...
, wrote a paper in 1936 entitled On Computable Numbers, with an Application to the Entscheidungsproblem, which was published in the Proceedings of the London Mathematical Society. In it he described a hypothetical machine which he called a "universal computing machine", and which is now known as the "universal Turing machine
Universal Turing machine
In computer science, a universal Turing machine is a Turing machine that can simulate an arbitrary Turing machine on arbitrary input. The universal machine essentially achieves this by reading both the description of the machine to be simulated as well as the input thereof from its own tape. Alan...
". The hypothetical machine had an infinite store (memory in today's terminology) that contained both instructions and data. The German engineer Konrad Zuse
Konrad Zuse
Konrad Zuse was a German civil engineer and computer pioneer. His greatest achievement was the world's first functional program-controlled Turing-complete computer, the Z3, which became operational in May 1941....
independently wrote about this concept in 1936. John von Neumann
John von Neumann
John von Neumann was a Hungarian-American mathematician and polymath who made major contributions to a vast number of fields, including set theory, functional analysis, quantum mechanics, ergodic theory, geometry, fluid dynamics, economics and game theory, computer science, numerical analysis,...
became acquainted with Turing when he was a visiting professor at Cambridge in 1935 and also during Turing's PhD year at the Institute for Advanced Study
Institute for Advanced Study
The Institute for Advanced Study, located in Princeton, New Jersey, United States, is an independent postgraduate center for theoretical research and intellectual inquiry. It was founded in 1930 by Abraham Flexner...
, Princeton
Princeton, New Jersey
Princeton is a community located in Mercer County, New Jersey, United States. It is best known as the location of Princeton University, which has been sited in the community since 1756...
in 1936-37. Whether he knew of Turing's 1936 paper at that time is not clear.
Independently, J. Presper Eckert
J. Presper Eckert
John Adam Presper "Pres" Eckert Jr. was an American electrical engineer and computer pioneer. With John Mauchly he invented the first general-purpose electronic digital computer , presented the first course in computing topics , founded the first commercial computer company , and...
and John Mauchly
John Mauchly
John William Mauchly was an American physicist who, along with J. Presper Eckert, designed ENIAC, the first general purpose electronic digital computer, as well as EDVAC, BINAC and UNIVAC I, the first commercial computer made in the United States.Together they started the first computer company,...
, who were developing the ENIAC
ENIAC
ENIAC was the first general-purpose electronic computer. It was a Turing-complete digital computer capable of being reprogrammed to solve a full range of computing problems....
at the Moore School of Electrical Engineering
Moore School of Electrical Engineering
The Moore School of Electrical Engineering at the University of Pennsylvania came into existence as a result of an endowment from Alfred Fitler Moore on June 4, 1923. It was granted to Penn's School of Electrical Engineering, located in the Towne Building...
, at the University of Pennsylvania
University of Pennsylvania
The University of Pennsylvania is a private, Ivy League university located in Philadelphia, Pennsylvania, United States. Penn is the fourth-oldest institution of higher education in the United States,Penn is the fourth-oldest using the founding dates claimed by each institution...
, wrote about the stored-program concept in December 1943. In planning a new machine, EDVAC
EDVAC
EDVAC was one of the earliest electronic computers. Unlike its predecessor the ENIAC, it was binary rather than decimal, and was a stored program computer....
, Eckert wrote in January 1944 that they would store data and programs in a new addressable memory device, a mercury metal delay line memory
Delay line memory
Delay line memory was a form of computer memory used on some of the earliest digital computers. Like many modern forms of electronic computer memory, delay line memory was a refreshable memory, but as opposed to modern random-access memory, delay line memory was serial-access...
. This was the first time the construction of a practical stored-program machine was proposed. At that time, he and Mauchly were not aware of Turing's work.
Von Neumann was involved in the Manhattan Project
Manhattan Project
The Manhattan Project was a research and development program, led by the United States with participation from the United Kingdom and Canada, that produced the first atomic bomb during World War II. From 1942 to 1946, the project was under the direction of Major General Leslie Groves of the US Army...
at the Los Alamos National Laboratory
Los Alamos National Laboratory
Los Alamos National Laboratory is a United States Department of Energy national laboratory, managed and operated by Los Alamos National Security , located in Los Alamos, New Mexico...
, which required huge amounts of calculation. This drew him to the ENIAC project, in the Summer of 1944. There he joined into the ongoing discussions on the design of this stored-program computer, the EDVAC. As part of that group, he volunteered to write up a description of it and produced the First Draft of a Report on the EDVAC which included ideas from Eckert and Mauchly. It was unfinished when his colleague Herman Goldstine circulated it with only von Neumann's name on it, to the consternation of Eckert and Mauchly. The paper was read by dozens of von Neumann's colleagues in America and Europe, and influenced the next round of computer designs.
Von Neumann was, then, not alone in putting forward the idea of the stored-program architecture, and Jack Copeland
Jack Copeland
Brian Jack Copeland is Professor of Philosophy at the University of Canterbury, Christchurch, New Zealand.Copeland received a BPhil and DPhil from the University of Oxford in philosophy, where he undertook research on modal and non-classical logic.He is the Director of the Turing Archive for the...
considers that it is "historically inappropriate, to refer to electronic stored-program digital computers as 'von Neumann machines'". His Los Alamos colleague Stan Frankel
said of von Neumann's regard for Turing's ideas:
At the time that the First Draft report was circulated, Turing was producing a report entitled Proposed Electronic Calculator, which described, in engineering and programming detail, his idea of a machine called the Automatic Computing Engine (ACE). He presented this to the Executive Committee of the British National Physical Laboratory
on February 19, 1946. Although Turing knew from his wartime experience at Bletchley Park that what he proposed was feasible, the secrecy surrounding Colossus
, which was subsequently maintained for several decades, prevented him from saying so. Various successful implementations of the ACE design were produced.
Both von Neumann's and Turing's papers described stored-program computers, but von Neumann's earlier paper achieved greater circulation and the computer architecture it outlined became known as the "von Neumann architecture". In the 1953 publication Faster than Thought: A Symposium on Digital Computing Machines (edited by B.V. Bowden), a section in the chapter on Computers in America reads as follows:
THE MACHINE OF THE INSTITUTE FOR ADVANCED STUDIES, PRINCETON
In 1945, Professor J. von Neumann, who was then working at the Moore School of Engineering in Philadelphia, where the E.N.I.A.C. had been built, issued on behalf of a group of his co-workers a report on the logical design of digital computers. The report contained a fairly detailed proposal for the design of the machine which has since become known as the E.D.V.A.C. (electronic discrete variable automatic computer). This machine has only recently been completed in America, but the von Neumann report inspired the construction of the E.D.S.A.C. (electronic delay-storage automatic calculator) in Cambridge (see page 130).
In 1947, Burks, Goldstine and von Neumann published another report which outlined the design of another type of machine (a parallel machine this time) which should be exceedingly fast, capable perhaps of 20,000 operations per second. They pointed out that the outstanding problem in constructing such a machine was in the development of a suitable memory, all the contents of which were instantaneously accessible, and at first they suggested the use of a special tube—called the Selectron, which had been invented by the Princeton Laboratories of the R.C.A. These tubes were expensive and difficult to make, so von Neumann subsequently decided to build a machine based on the Williams memory. This machine, which was completed in June, 1952 in Princeton has become popularly known as the Maniac. The design of this machine has inspired that of half a dozen or more machines which are now being built in America, all of which are known affectionately as "Johniacs."
In the same book, the first two paragraphs of a chapter on ACE read as follows:
AUTOMATIC COMPUTATION AT THE NATIONAL PHYSICAL LABORATORY
One of the most modern digital computers which embodies developments and improvements in the technique of automatic electronic computing was recently demonstrated at the National Physical Laboratory, Teddington, where it has been designed and built by a small team of mathematicians and electronics research engineers on the staff of the Laboratory, assisted by a number of production engineers from the English Electric Company, Limited. The equipment so far erected at the Laboratory is only the pilot model of a much larger installation which will be known as the Automatic Computing Engine, but although comparatively small in bulk and containing only about 800 thermionic valves, as can be judged from Plates XII, XIII and XIV, it is an extremely rapid and versatile calculating machine.
The basic concepts and abstract principles of computation by a machine were formulated by Dr. A. M. Turing, F.R.S., in a paper read before the London Mathematical Society in 1936, but work on such machines in Britain was delayed by the war. In 1945, however, an examination of the problems was made at the National Physical Laboratory by Mr. J. R. Womersley, then superintendent of the Mathematics Division of the Laboratory. He was joined by Dr. Turing and a small staff of specialists, and, by 1947, the preliminary planning was sufficiently advanced to warrant the establishment of the special group already mentioned. In April, 1948, the latter became the Electronics Section of the Laboratory, under the charge of Mr. F. M. Colebrook.
Early von Neumann-architecture computers
The First Draft described a design that was used by many universities and corporations to construct their computers. Among these various computers, only ILLIAC and ORDVAC had compatible instruction sets.
- Manchester Small-Scale Experimental Machine (SSEM), nicknamed "Baby" (University of Manchester, England) made its first successful run of a stored program on June 21, 1948
- EDSAC (University of Cambridge, England) was the first practical stored-program electronic computer (May 1949)
- Manchester Mark 1 (University of Manchester, England), developed from the SSEM (June 1949)
- CSIRAC (Council for Scientific and Industrial Research, Australia) (November 1949)
- ORDVAC (University of Illinois) at Aberdeen Proving Ground, Maryland (completed November 1951)
- IAS machine at Princeton University (January 1952)
- MANIAC I at Los Alamos Scientific Laboratory (March 1952)
- ILLIAC at the University of Illinois (September 1952)
- AVIDAC at Argonne National Laboratory (1953)
- ORACLE at Oak Ridge National Laboratory (June 1953)
- JOHNNIAC at RAND Corporation (January 1954)
- BESK in Stockholm (1953)
- BESM-1 in Moscow (1952)
- DASK in Denmark (1955)
- PERM in Munich (1956?)
- SILLIAC in Sydney (1956)
- WEIZAC in Rehovot (1955)
Early stored-program computers
The date information in the following chronology is difficult to put into proper order. Some dates are for first running a test program, some dates are the first time the computer was demonstrated or completed, and some dates are for the first delivery or installation.
- The IBM SSEC had the ability to treat instructions as data, and was publicly demonstrated on January 27, 1948. This ability was claimed in a US patent. However, it was partially electromechanical, not fully electronic. In practice, instructions were read from paper tape due to its limited memory.
- The Manchester SSEM (the Baby) was the first fully electronic computer to run a stored program. It ran a factoring program for 52 minutes on June 21, 1948, after running a simple division program and a program to show that two numbers were relatively prime.
- The ENIAC was modified to run as a primitive read-only stored-program computer (using the Function Tables for program ROM) and was demonstrated as such on September 16, 1948, running a program by Adele Goldstine for von Neumann.
- The BINAC ran some test programs in February, March, and April 1949, although it was not completed until September 1949.
- The Manchester Mark 1 developed from the SSEM project. An intermediate version of the Mark 1 was available to run programs in April 1949, but it was not completed until October 1949.
- The EDSAC ran its first program on May 6, 1949.
- The EDVAC was delivered in August 1949, but it had problems that kept it from being put into regular operation until 1951.
- The CSIR Mk I ran its first program in November 1949.
- The SEAC was demonstrated in April 1950.
- The Pilot ACE ran its first program on May 10, 1950 and was demonstrated in December 1950.
- The SWAC was completed in July 1950.
- The Whirlwind was completed in December 1950 and was in actual use in April 1951.
- The first ERA Atlas (later the commercial ERA 1101/UNIVAC 1101) was installed in December 1950.
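The defining property shared by all the machines listed above, a single memory holding both program and data, can be sketched in a few lines. The tiny three-instruction machine below is an invented teaching model in the spirit of the Little Man Computer, not the instruction set of any historical computer:

```python
# A minimal stored-program machine sketch: instructions and data share one
# memory, and a program counter fetches each instruction in turn. The
# three-opcode instruction set is invented for illustration.

def run(memory):
    """Execute instructions starting at memory[0]; return the memory."""
    pc = 0          # program counter
    acc = 0         # accumulator
    while True:
        op, addr = memory[pc]          # fetch one word: (opcode, address)
        pc += 1
        if op == "LOAD":               # acc <- memory[addr]
            acc = memory[addr]
        elif op == "ADD":              # acc <- acc + memory[addr]
            acc = acc + memory[addr]
        elif op == "STORE":            # memory[addr] <- acc
            memory[addr] = acc
        elif op == "HALT":
            return memory

# Program and data occupy the same memory: cells 0-3 hold instructions,
# cells 4-6 hold data.
mem = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
run(mem)
print(mem[6])  # 5
```

Because the program lives in the same memory as its data, a STORE aimed at cells 0 through 3 would overwrite instructions, which is exactly the property that makes stored-program machines reprogrammable (and, for better or worse, self-modifying).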
Evolution
Through the 1960s and 1970s, computers generally became both smaller and faster, which led to some evolutions in their architecture. For example, memory-mapped I/O
allows input and output devices to be treated the same as memory. A single system bus
could be used to provide a modular system with lower cost. This is sometimes called a "streamlining" of the architecture.
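The idea of memory-mapped I/O can be modelled in a few lines: one reserved address is routed to a device, so the same store operation drives both memory and peripherals. The address map below is invented for illustration:

```python
# Sketch of memory-mapped I/O: stores to one address range go to RAM,
# while a reserved address is wired to an output device, so a program
# drives the device with ordinary store instructions. The address map
# (OUTPUT_PORT) is an assumption made for this example.

OUTPUT_PORT = 0xFF00    # hypothetical device address

class Bus:
    def __init__(self):
        self.ram = {}
        self.printed = []
    def store(self, addr, value):
        if addr == OUTPUT_PORT:
            self.printed.append(chr(value))   # the "device" emits a character
        else:
            self.ram[addr] = value
    def load(self, addr):
        return self.ram.get(addr, 0)

bus = Bus()
for ch in "ok":
    bus.store(OUTPUT_PORT, ord(ch))   # same store used for the device...
bus.store(0x0010, 42)                 # ...and for ordinary memory
print("".join(bus.printed), bus.load(0x0010))  # ok 42
```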
In subsequent decades, simple microcontrollers would sometimes omit features of the model to lower cost and size, while larger computers added features for higher performance.
Von Neumann bottleneck
The shared bus between the program memory and data memory leads to the Von Neumann bottleneck, the limited throughput
(data transfer rate) between the CPU and memory compared to the amount of memory. Because program memory and data memory cannot be accessed at the same time, throughput is much smaller than the rate at which the CPU can work. This seriously limits the effective processing speed when the CPU is required to perform minimal processing on large amounts of data. The CPU is continuously forced to wait
for needed data to be transferred to or from memory. Since CPU speed and memory size have increased much faster than the throughput between them, the bottleneck has become more of a problem, one whose severity increases with every new generation of CPU.
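The arithmetic behind the bottleneck can be made concrete with a toy model. The rates below are assumptions chosen for illustration, not measurements of any real system; the point is only that when instruction fetches and data transfers share one bus, the bus rate rather than the CPU rate sets the effective speed:

```python
# Back-of-the-envelope sketch of the bottleneck: however fast the CPU is,
# the shared bus caps how many instruction-plus-operand pairs reach it per
# second. All figures below are illustrative assumptions, not measurements.

cpu_rate = 1_000_000_000      # instructions/s the CPU could retire (assumed)
bus_rate = 400_000_000        # words/s the shared bus can move (assumed)
words_per_instruction = 2     # one instruction fetch + one data operand

# Instruction fetches and data transfers share the one bus, so they
# serialize; the machine runs at whichever rate is lower.
effective_rate = min(cpu_rate, bus_rate / words_per_instruction)
print(effective_rate)         # 200000000.0: the bus, not the CPU, sets the pace
```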
The term "von Neumann bottleneck" was coined by John Backus
in his 1977 ACM Turing Award
lecture. According to Backus:
Surely there must be a less primitive way of making big changes in the store than by pushing vast numbers of words back and forth through the von Neumann bottleneck. Not only is this tube a literal bottleneck for the data traffic of a problem, but, more importantly, it is an intellectual bottleneck that has kept us tied to word-at-a-time thinking instead of encouraging us to think in terms of the larger conceptual units of the task at hand. Thus programming is basically planning and detailing the enormous traffic of words through the von Neumann bottleneck, and much of that traffic concerns not significant data itself, but where to find it.
The performance problem can be alleviated (to some extent) by several mechanisms. Providing a cache
between the CPU and the main memory, providing separate caches or separate access paths for data and instructions (the so-called Modified Harvard architecture
), using branch predictor
algorithms and logic, and providing a limited CPU stack to reduce memory access are four of the ways performance is increased. The problem can also be sidestepped somewhat by using parallel computing
, using for example the Non-Uniform Memory Access
(NUMA) architecture—this approach is commonly employed by supercomputers. It is less clear whether the intellectual bottleneck that Backus criticized has changed much since 1977. Backus's proposed solution has not had a major influence. Modern functional programming
and object-oriented programming
are much less geared towards "pushing vast numbers of words back and forth" than earlier languages like Fortran
were, but internally, that is still what computers spend much of their time doing, even highly parallel supercomputers.
Non-von Neumann processors
The National Semiconductor
COP8
was introduced in 1986; it has a modified Harvard architecture
.
Perhaps the most common kind of non-von Neumann structure used in modern computers is content-addressable memory
(CAM).
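The contrast with conventional RAM can be shown with a small software model: a CAM is queried with a value and answers with the matching addresses, a search that real CAM hardware performs across all words in parallel. This model is illustrative only:

```python
# Software model of a content-addressable memory: instead of supplying an
# address and reading back data, you supply data and get back the
# addresses (if any) holding it. Hardware CAM compares all words at once;
# this sketch merely models the interface, not the parallelism.

class CAM:
    def __init__(self, words):
        self.words = list(words)
    def search(self, value):
        """Return every address whose stored word matches the value."""
        return [addr for addr, word in enumerate(self.words) if word == value]

cam = CAM([0xDEAD, 0xBEEF, 0xDEAD, 0x0042])
print(cam.search(0xDEAD))  # [0, 2]
print(cam.search(0x7777))  # []
```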
In some cases, emerging memristor
technology may be able to circumvent the von Neumann bottleneck.
See also
- CARDboard Illustrative Aid to Computation
- Harvard architecture
- Interconnect bottleneck
- Little man computer
- Modified Harvard architecture
- Random access machine
- Turing machine
General
- Backus, John (1977). "Can Programming be Liberated from the von Neumann Style?". 1977 ACM Turing Award Lecture. Communications of the ACM, August 1978, Volume 21, Number 8.
- Bell, C. Gordon and Newell, Allen (1971). Computer Structures: Readings and Examples. McGraw-Hill Book Company, New York. 668 pages.