Computer simulation
A computer simulation, a computer model, or a computational model is a computer program, or network of computers, that attempts to simulate an abstract model of a particular system. Computer simulations have become a useful part of the mathematical modeling of many natural systems in physics (computational physics), astrophysics, chemistry and biology; of human systems in economics, psychology, and social science; and of engineering. Simulations can be used to explore and gain new insights into new technology, and to estimate the performance of systems too complex for analytical solutions.
Computer simulations vary from computer programs that run a few minutes, to network-based groups of computers running for hours, to ongoing simulations that run for days. The scale of events being simulated by computer simulations has far exceeded anything possible (or perhaps even imaginable) using traditional paper-and-pencil mathematical modeling. Over 10 years ago, a desert-battle simulation of one force invading another involved the modeling of 66,239 tanks, trucks and other vehicles on simulated terrain around Kuwait, using multiple supercomputers in the DoD High Performance Computer Modernization Program. Other examples include a 1-billion-atom model of material deformation (2002); a 2.64-million-atom model of the ribosome, the complex protein-maker of all organisms, in 2005; and the Blue Brain project at EPFL (Switzerland), begun in May 2005, to create the first computer simulation of the entire human brain, right down to the molecular level.
Simulation versus modeling

Traditionally, building large models of systems has been done via a statistical model, which attempts to find analytical solutions to problems and thereby enable the prediction of the behavior of the system from a set of parameters and initial conditions.
The term computer simulation is broader than computer modeling; the latter implies that all aspects are being modeled in the computer representation. However, computer simulation also includes generating inputs from simulated users in order to run actual computer software or equipment, with only part of the system being modeled. An example would be a flight simulator that can run machines as well as actual flight software.
Computer simulations are used in many fields, including science, technology, entertainment, health care, and business planning and scheduling.
History

Computer simulation developed hand-in-hand with the rapid growth of the computer, following its first large-scale deployment during the Manhattan Project in World War II to model the process of nuclear detonation. It was a simulation of 12 hard spheres using a Monte Carlo algorithm. Computer simulation is often used as an adjunct to, or substitute for, modeling systems for which simple closed-form analytic solutions are not possible. There are many types of computer simulations; the common feature they all share is the attempt to generate a sample of representative scenarios for a model in which a complete enumeration of all possible states would be prohibitive or impossible.
Data preparation

The external data requirements of simulations and models vary widely. For some, the input might be just a few numbers (for example, simulation of a waveform of AC electricity on a wire), while others might require terabytes of information (such as weather and climate models).

Input sources also vary widely:
- sensors and other physical devices connected to the model;
- control surfaces used to direct the progress of the simulation in some way;
- current or historical data entered by hand;
- values extracted as a by-product from other processes;
- values output for the purpose by other simulations, models, or processes.

Lastly, the time at which data is available varies:
- "invariant" data is often built into the model code, either because the value is truly invariant (e.g. the value of π) or because the designers consider the value to be invariant for all cases of interest;
- data can be entered into the simulation when it starts up, for example by reading one or more files, or by reading data from a preprocessor;
- data can be provided during the simulation run, for example by a sensor network.
Because of this variety, and because many common elements exist between diverse simulation systems, there are a large number of specialized simulation languages. The best-known of these may be Simula (sometimes Simula-67, after the year 1967 when it was proposed). There are now many others.
Systems that accept data from external sources must be very careful in knowing what they are receiving. While it is easy for computers to read values from text or binary files, it is much harder to know what the accuracy of the values is (compared to measurement resolution and precision). Often the accuracy is expressed as "error bars", a minimum and maximum deviation from the reported value within which the true value is expected to lie. Because digital computer arithmetic is not exact, rounding and truncation errors compound this error, so it is useful to perform an error analysis to confirm that values output by the simulation are still usefully accurate.
Even small errors in the original data can accumulate into substantial error later in the simulation. While all computer analysis is subject to the "GIGO" (garbage in, garbage out) restriction, this is especially true of digital simulation. Indeed, it was the observation of this inherent, cumulative error in digital systems that is the origin of chaos theory.
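The compounding of rounding error can be made concrete with a short sketch. This is an illustrative example, not from the text above: summing 0.1 many times drifts because 0.1 has no exact binary representation, while compensated summation (Python's `math.fsum`) stays close to the exact total.

```python
import math

# Illustrative sketch: rounding error compounding over many steps.
# 0.1 cannot be represented exactly in binary floating point, so a
# running sum drifts away from the mathematically exact total.
N = 10_000_000
naive = 0.0
for _ in range(N):
    naive += 0.1          # each step adds a tiny representation error

careful = math.fsum(0.1 for _ in range(N))  # compensated summation

print(naive, careful, abs(naive - careful))
```

An error analysis for a real simulation does the same kind of comparison: run the computation with a higher-accuracy method (or higher precision) and check that the cheap method's output is still within the required error bars.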
Types

Computer models can be classified according to several independent pairs of attributes, including:
- stochastic or deterministic (and, as a special case of deterministic, chaotic);
- steady-state or dynamic;
- continuous or discrete (and, as an important special case of discrete, discrete-event or DE models);
- local or distributed.

Another way of categorizing models is to look at the underlying data structures. For time-stepped simulations, there are two main classes:
- Simulations which store their data in regular grids and require only next-neighbor access are called stencil codes. Many CFD applications belong to this category.
- If the underlying graph is not a regular grid, the model may belong to the meshfree method class.

Steady-state models use equations that define the relationships between elements of the modeled system and attempt to find a state in which the system is in equilibrium. Such models are often used in simulating physical systems, as a simpler modeling case before dynamic simulation is attempted.
- Dynamic simulations model changes in a system in response to (usually changing) input signals.
- Stochastic models use random number generators to model chance or random events.
- A discrete event simulation (DES) manages events in time. Most computer, logic-test and fault-tree simulations are of this type. In this type of simulation, the simulator maintains a queue of events sorted by the simulated time they should occur. The simulator reads the queue and triggers new events as each event is processed. It is not important to execute the simulation in real time; it is often more important to be able to access the data produced by the simulation and to discover logic defects in the design or the sequence of events.
- A continuous dynamic simulation performs numerical solution of differential-algebraic equations or differential equations (either partial or ordinary).
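A discrete event simulator maintains a queue of future events ordered by simulated time, processing the earliest event first; processing one event may schedule further events. A minimal sketch of that loop (the event names and times here are invented for illustration, with Python's `heapq` as the priority queue):

```python
import heapq

# Minimal discrete-event simulation loop: events are (time, name) pairs
# kept in a priority queue ordered by simulated time.
events = []          # the future-event queue
trace = []           # log of processed events

def schedule(time, name):
    heapq.heappush(events, (time, name))

schedule(5.0, "arrival")
schedule(2.0, "power_on")
schedule(9.0, "departure")

now = 0.0
while events:
    now, name = heapq.heappop(events)   # always the earliest event next
    trace.append((now, name))
    if name == "arrival":               # processing an event may schedule new ones
        schedule(now + 1.5, "service_done")

print(trace)
```

Note that simulated time jumps directly from one event to the next; nothing is computed for the empty intervals in between, which is what distinguishes this style from time-stepped simulation.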
Formerly, the output data from a computer simulation was sometimes presented in a table or a matrix showing how data were affected by numerous changes in the simulation parameters; however, psychologists and others noted that humans could quickly perceive trends by looking at graphs or even moving images or motion pictures generated from the data, as displayed by computer-generated imagery (CGI) animation. Although observers could not necessarily read out numbers or quote math formulas from observing a moving weather chart, they might be able to predict events (and "see that rain was headed their way") much faster than by scanning tables of rain-cloud coordinates. Such intense graphical displays, which transcended the world of numbers and formulae, sometimes also led to output that lacked a coordinate grid or omitted timestamps, as if straying too far from numeric data displays. Today, weather forecasting models tend to balance the view of moving rain/snow clouds against a map that uses numeric coordinates and numeric timestamps of events.
Similarly, CGI computer simulations of CAT scans can simulate how a tumor might shrink or change during an extended period of medical treatment, presenting the passage of time as a spinning view of the visible human head as the tumor changes.
Other applications of CGI computer simulations are being developed to graphically display large amounts of data, in motion, as changes occur during a simulation run.
Specific examples of computer simulations follow.

Notable, and sometimes controversial, computer simulations used in science include Donella Meadows' World3, used in the Limits to Growth study; James Lovelock's Daisyworld; and Thomas Ray's Tierra.
The Open Source Physics project was started to develop reusable libraries for simulations in Java, together with Easy Java Simulations, a complete graphical environment that generates code based on these libraries.
The reliability of, and the trust people put in, computer simulations depends on the validity of the simulation model; therefore, verification and validation are of crucial importance in the development of computer simulations. Another important aspect of computer simulations is reproducibility of the results, meaning that a simulation model should not provide a different answer for each execution. Although this might seem obvious, it is a special point of attention in stochastic simulations, where the random numbers should actually be pseudo-random (and therefore repeatable) numbers. An exception to reproducibility is human-in-the-loop simulations such as flight simulations and computer games. Here a human is part of the simulation and thus influences the outcome in a way that is hard, if not impossible, to reproduce exactly.
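Reproducibility of a stochastic simulation usually comes down to seeding the pseudo-random generator. A minimal sketch (the "simulation" here is a toy average of random draws, invented for illustration):

```python
import random

# Reproducibility sketch: seeding the pseudo-random generator makes a
# stochastic simulation return the same answer on every execution.
def stochastic_sim(seed):
    rng = random.Random(seed)           # private, seeded generator
    # toy "simulation": average of 1000 random draws
    return sum(rng.random() for _ in range(1000)) / 1000

run1 = stochastic_sim(42)
run2 = stochastic_sim(42)   # identical seed -> identical result
run3 = stochastic_sim(43)   # different seed -> a different result
print(run1 == run2, run1 == run3)
```

Using a private `random.Random` instance rather than the module-level functions also keeps the simulation reproducible when other code draws random numbers in the same process.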
Vehicle manufacturers make use of computer simulation to test safety features in new designs. By building a copy of the car in a physics simulation environment, they can save the hundreds of thousands of dollars that would otherwise be required to build a unique prototype and test it. Engineers can step through the simulation milliseconds at a time to determine the exact stresses being put upon each section of the prototype.
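Stepping a physics model milliseconds at a time can be sketched with simple Euler integration. All the numbers below (mass, stiffness, impact speed) are invented for illustration and do not come from any real crash model:

```python
# Toy crash sketch: a mass hitting a spring "crumple zone", integrated
# in 1 ms steps so the peak force can be inspected step by step.
m = 1200.0      # vehicle mass, kg (illustrative value)
k = 2.0e5       # crumple-zone stiffness, N/m (illustrative value)
v = 15.0        # impact speed, m/s
x = 0.0         # crumple deformation, m
dt = 0.001      # 1 ms time step

peak_force = 0.0
while v > 0.0:              # until the vehicle momentarily stops
    force = k * x           # spring force resisting deformation
    a = -force / m
    v += a * dt             # Euler update: velocity, then position
    x += v * dt
    peak_force = max(peak_force, force)

print(round(peak_force), "N peak force at 1 ms resolution")
```

Real crash codes solve far richer material models over finite-element meshes, but the engineering workflow is the same: integrate in small time steps and inspect the stresses at each step.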
Computer graphics can be used to display the results of a computer simulation. Animations can be used to experience a simulation in real time, e.g. in training simulations. In some cases animations may also be useful in faster-than-real-time or even slower-than-real-time modes. For example, faster-than-real-time animations can be useful in visualizing the buildup of queues in the simulation of humans evacuating a building. Furthermore, simulation results are often aggregated into static images using various techniques of scientific visualization.
In debugging, simulating the execution of a program under test (rather than executing it natively) can detect far more errors than the hardware itself can, and at the same time log useful debugging information such as an instruction trace, memory alterations and instruction counts. This technique can also detect buffer overflows and similar hard-to-detect errors, as well as produce performance information and tuning data.
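The idea of catching errors by simulating execution rather than running natively can be sketched with a toy interpreter. Everything here (the instruction set, the program) is invented for illustration:

```python
# Toy simulated machine: executing under an interpreter lets us count
# instructions, keep a trace, and catch out-of-bounds writes that real
# hardware might silently allow.
def run(program, mem_size=4):
    mem = [0] * mem_size
    trace = []                      # instruction trace
    overflows = []                  # detected out-of-bounds writes
    for count, (op, addr, val) in enumerate(program, start=1):
        trace.append(op)
        if op == "STORE":
            if 0 <= addr < mem_size:
                mem[addr] = val
            else:
                overflows.append((count, addr))   # buffer overflow caught here
    return mem, trace, overflows

prog = [("STORE", 0, 7), ("STORE", 3, 9), ("STORE", 4, 1)]  # last write is out of bounds
mem, trace, overflows = run(prog)
print(mem, len(trace), overflows)
```

Production tools such as instruction-set simulators and memory checkers apply the same principle at full machine scale: every access goes through the simulator, so it can be validated and logged.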
Although sometimes ignored in computer simulations, it is very important to perform a sensitivity analysis to ensure that the accuracy of the results is properly understood. For example, the probabilistic risk analysis of factors determining the success of an oilfield exploration program involves combining samples from a variety of statistical distributions using the Monte Carlo method. If, for instance, one of the key parameters (e.g. the net ratio of oil-bearing strata) is known to only one significant figure, then the result of the simulation might not be more precise than one significant figure, although it might (misleadingly) be presented as having four significant figures.
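The oilfield example can be made concrete with a small Monte Carlo sketch. The distributions and numbers below are invented for illustration; the point is that an input known only roughly caps the meaningful precision of the output:

```python
import random

# Monte Carlo sketch: combine samples from input distributions and see
# how much the result spreads. All distributions here are illustrative.
rng = random.Random(0)              # seeded for reproducibility
N = 100_000

results = []
for _ in range(N):
    volume = rng.uniform(0.9e6, 1.1e6)      # reservoir volume, m^3
    net_ratio = rng.uniform(0.25, 0.35)     # oil-bearing ratio, known only roughly
    results.append(volume * net_ratio)

mean = sum(results) / N
spread = (max(results) - min(results)) / mean
print(f"mean = {mean:.3e}, relative spread = {spread:.0%}")
```

The wide relative spread is the sensitivity analysis in miniature: quoting the mean to four significant figures would misrepresent what the inputs actually support.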
Model Calibration Techniques
The following three steps should be used to produce accurate simulation models: calibration, verification, and validation. Computer simulations are good at portraying and comparing theoretical scenarios, but to accurately model actual case studies a model has to match what is actually happening today. A base model should be created and calibrated so that it matches the area being studied. The calibrated model should then be verified to ensure that the model is operating as expected based on the inputs. Once the model has been verified, the final step is to validate the model by comparing its outputs to historical data from the study area. This can be done by using statistical techniques and ensuring an adequate R-squared value. Unless these techniques are employed, the simulation model will produce inaccurate results and will not be a useful prediction tool.
Model calibration is achieved by adjusting any available parameters in order to adjust how the model operates and simulates the process. For example, in traffic simulation, typical parameters include look-ahead distance, car-following sensitivity, discharge headway, and start-up lost time. These parameters influence driver behavior such as when and how long it takes a driver to change lanes, how much distance a driver leaves between their vehicle and the car in front of it, and how quickly a driver accelerates through an intersection. Adjusting these parameters has a direct effect on the traffic volume that can traverse the modeled roadway network, by making drivers more or less aggressive. These are examples of calibration parameters that can be fine-tuned to match characteristics observed in the field at the study location. Most traffic models have typical default values, but they may need to be adjusted to better match the driver behavior at the location being studied.
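Calibration can be sketched as searching parameter values until a model statistic matches a field observation. The "model" below is a deliberately simple stand-in (throughput falls as discharge headway grows), and the observed count is invented:

```python
# Calibration sketch: tune one parameter of a toy traffic model until
# simulated throughput matches an observed count.
def toy_throughput(headway_s):
    """Vehicles per hour per lane for a given discharge headway (toy model)."""
    return 3600.0 / headway_s

observed = 1800.0                       # field-measured veh/h (illustrative)
candidates = [1.6 + 0.1 * i for i in range(10)]   # headways 1.6 .. 2.5 s

best = min(candidates, key=lambda h: abs(toy_throughput(h) - observed))
print(best, toy_throughput(best))
```

Real calibration tunes many interacting parameters at once, typically with optimization routines rather than a grid, but the objective is the same: minimize the gap between a simulated statistic and its field-measured counterpart.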
Model verification is achieved by obtaining output data from the model and comparing it to what is expected from the input data. For example in traffic simulation, traffic volume can be verified to ensure that actual volume throughput in the model is reasonably close to traffic volumes input into the model. Ten percent is a typical threshold used in traffic simulation to determine if output volumes are reasonably close to input volumes. Simulation models handle model inputs in different ways so traffic that enters the network, for example, may or may not reach its desired destination. Additionally, traffic that wants to enter the network may not be able to, if any congestion exists. This is why model verification is a very important part of the modeling process.
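The throughput check described above reduces to a simple tolerance comparison per link. A sketch, using the ten percent threshold mentioned and invented volume numbers:

```python
# Verification sketch: flag links whose modeled volume deviates more
# than a tolerance (10% here) from the input demand.
def verify_volumes(inputs, outputs, tolerance=0.10):
    """Return the links whose output volume deviates more than `tolerance`."""
    failures = []
    for link, vin in inputs.items():
        vout = outputs.get(link, 0)
        if abs(vout - vin) / vin > tolerance:
            failures.append(link)
    return failures

demand = {"A": 1000, "B": 500, "C": 800}     # illustrative input volumes
served = {"A": 970,  "B": 430, "C": 795}     # illustrative model outputs

print(verify_volumes(demand, served))        # links that fail the check
```

A failing link does not say why the volume was lost, only where to look: congestion blocking entry, routing errors, and demand loading are common culprits.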
The final step is to validate the model by comparing the results with what is expected based on historical data from the study area. Ideally, the model should produce similar results to what has happened historically. This is often checked by nothing more than quoting the R-squared statistic from the fit, which measures the fraction of variability that is accounted for by the model; however, a high R-squared value does not necessarily mean the model fits the data well. Another tool used to validate models is graphical residual analysis. If model output values are drastically different from historical values, it probably means there is an error in the model. This is an important step to verify before using the model as a base to produce additional models for different scenarios, to ensure each one is accurate. If the outputs do not reasonably match historic values during the validation process, the model should be reviewed and updated to produce results more in line with expectations. It is an iterative process that helps produce more realistic models.
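Both of the validation tools just named, R-squared and residual analysis, are short computations. A sketch with invented observed and modeled counts:

```python
# Validation sketch: compare model outputs to historical observations via
# R-squared and the residuals themselves. Numbers are illustrative.
def r_squared(observed, modeled):
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - m) ** 2 for o, m in zip(observed, modeled))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

historical = [120.0, 150.0, 180.0, 210.0, 240.0]   # observed counts
simulated  = [118.0, 155.0, 176.0, 215.0, 238.0]   # model outputs

r2 = r_squared(historical, simulated)
residuals = [o - m for o, m in zip(historical, simulated)]
print(round(r2, 4), residuals)
```

Plotting the residuals against the observed values is what catches problems a single R-squared number hides, such as systematic over-prediction at high volumes.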
Validating traffic simulation models requires comparing traffic estimated by the model to observed traffic on the roadway and transit systems. Initial comparisons are for trip interchanges between quadrants, sectors, or other large areas of interest. The next step is to compare traffic estimated by the models to traffic counts, including transit ridership, crossing contrived barriers in the study area. These are typically called screenlines, cutlines, and cordon lines and may be imaginary or actual physical barriers. Cordon lines surround particular areas such as the central business district or other major activity centers. Transit ridership estimates are commonly validated by comparing them to actual patronage crossing cordon lines around the central business district.
Three sources of error can cause weak correlation during calibration: input error, model error, and parameter error. In general, input error and parameter error can be adjusted easily by the user. Model error, however, is caused by the methodology used in the model and may not be as easy to fix. Simulation models are typically built using several different modeling theories that can produce conflicting results. Some models are more generalized while others are more detailed. If model error occurs as a result, it may be necessary to adjust the model methodology to make the results more consistent.
To produce good models that yield realistic results, these steps must be taken to ensure that simulation models are functioning properly. Simulation models can be used as a tool to verify engineering theories, but they are only valid if calibrated properly. Once satisfactory estimates of the parameters for all models have been obtained, the models must be checked to assure that they adequately perform their intended functions. The validation process establishes the credibility of the model by demonstrating its ability to replicate actual traffic patterns. The importance of model validation underscores the need for careful planning, thoroughness and accuracy in the input data collection program conducted for this purpose. Efforts should be made to ensure collected data is consistent with expected values. For example, in traffic analysis it is common for a traffic engineer to perform a site visit to verify traffic counts and become familiar with traffic patterns in the area. The resulting models and forecasts will be no better than the data used for model estimation and validation.
Types
Computer models can be classified according to several independent pairs of attributes, including:- StochasticStochastic processIn probability theory, a stochastic process , or sometimes random process, is the counterpart to a deterministic process...
or deterministicDeterministic algorithmIn computer science, a deterministic algorithm is an algorithm which, in informal terms, behaves predictably. Given a particular input, it will always produce the same output, and the underlying machine will always pass through the same sequence of states...
(and as a special case of deterministic, chaotic) - see External links below for examples of stochastic vs. deterministic simulations - Steady-state or dynamic
- ContinuousContinuous functionIn mathematics, a continuous function is a function for which, intuitively, "small" changes in the input result in "small" changes in the output. Otherwise, a function is said to be "discontinuous". A continuous function with a continuous inverse function is called "bicontinuous".Continuity of...
or discreteDiscrete mathematicsDiscrete mathematics is the study of mathematical structures that are fundamentally discrete rather than continuous. In contrast to real numbers that have the property of varying "smoothly", the objects studied in discrete mathematics – such as integers, graphs, and statements in logic – do not...
(and as an important special case of discrete, discrete eventDiscrete Event SimulationIn discrete-event simulation, the operation of a system is represented as a chronological sequence of events. Each event occurs at an instant in time and marks a change of state in the system...
or DE models) - Local or distributedDistributed computingDistributed computing is a field of computer science that studies distributed systems. A distributed system consists of multiple autonomous computers that communicate through a computer network. The computers interact with each other in order to achieve a common goal...
.
Another way of categorizing models is to look at the underlying data structures. For time-stepped simulations, there are two main classes:
- Simulations which store their data in regular grids and require only next-neighbor access are called stencil codes. Many CFD applications belong to this category.
- If the underlying graph is not a regular grid, the model may belong to the meshfree method class.
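The stencil pattern can be illustrated with a minimal sketch: an explicit heat-diffusion sweep on a regular 2D grid, where each interior cell is updated from only its four nearest neighbors. The grid size and diffusion coefficient below are illustrative choices, not taken from any particular code.

```python
# Minimal sketch of a stencil code: an explicit heat-diffusion sweep on a
# regular 2D grid. Each interior cell is updated using only its four
# nearest neighbors (the "stencil"); boundary cells are held fixed.

ALPHA = 0.1  # diffusion coefficient; must be <= 0.25 for stability

def stencil_step(grid):
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            neighbors = (grid[i - 1][j] + grid[i + 1][j] +
                         grid[i][j - 1] + grid[i][j + 1])
            new[i][j] = grid[i][j] + ALPHA * (neighbors - 4 * grid[i][j])
    return new

# A single hot cell in a cold plate diffuses outward over the time steps.
grid = [[0.0] * 5 for _ in range(5)]
grid[2][2] = 100.0
for _ in range(10):
    grid = stencil_step(grid)
```

Because each update touches only fixed neighbor offsets, such codes parallelize well by splitting the grid into blocks, which is one reason many CFD applications take this form.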
- Steady-state simulations use equations that define the relationships between elements of the modeled system and attempt to find a state in which the system is in equilibrium. Such models are often used in simulating physical systems, as a simpler modeling case before dynamic simulation is attempted.
- Dynamic simulations model changes in a system in response to (usually changing) input signals.
- Stochastic models use random number generators to model chance or random events.
- A discrete event simulation (DES) manages events in time. Most computer, logic-test and fault-tree simulations are of this type. In this type of simulation, the simulator maintains a queue of events sorted by the simulated time they should occur. The simulator reads the queue and triggers new events as each event is processed. It is not important to execute the simulation in real time. It is often more important to be able to access the data produced by the simulation, to discover logic defects in the design or the sequence of events.
- A continuous dynamic simulation performs numerical solution of differential-algebraic equations or differential equations (either partial or ordinary). Periodically, the simulation program solves all the equations and uses the numbers to change the state and output of the simulation. Applications include flight simulators, construction and management simulation games, chemical process modeling, and simulations of electrical circuits. Originally, these kinds of simulations were actually implemented on analog computers, where the differential equations could be represented directly by various electrical components such as op-amps. By the late 1980s, however, most "analog" simulations were run on conventional digital computers that emulate the behavior of an analog computer.
- A special type of discrete simulation that does not rely on a model with an underlying equation, but can nonetheless be represented formally, is agent-based simulation. In agent-based simulation, the individual entities (such as molecules, cells, trees or consumers) in the model are represented directly (rather than by their density or concentration) and possess an internal state and set of behaviors or rules that determine how the agent's state is updated from one time-step to the next.
- Distributed models run on a network of interconnected computers, possibly through the Internet. Simulations dispersed across multiple host computers like this are often referred to as "distributed simulations". There are several standards for distributed simulation, including the Aggregate Level Simulation Protocol (ALSP), Distributed Interactive Simulation (DIS), the High Level Architecture (HLA) and the Test and Training Enabling Architecture (TENA).
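The event-queue mechanism described for discrete event simulation can be sketched with a priority queue: the simulator repeatedly pops the earliest event, jumps the clock straight to it, and lets a handler schedule follow-on events. The event names and timings below are hypothetical, chosen only for illustration.

```python
import heapq

# Minimal sketch of a discrete event simulation (DES): events sit in a
# queue sorted by simulated time; processing an event may schedule new
# ones. The clock jumps from event to event rather than ticking in real
# time.

def run_des(schedule, handlers, until):
    seq = 0                                  # tie-breaker for equal times
    queue = []
    for t, name in schedule:
        heapq.heappush(queue, (t, seq, name))
        seq += 1
    log = []
    while queue:
        t, _, name = heapq.heappop(queue)    # earliest event first
        if t > until:
            break
        log.append((t, name))                # clock advances straight to t
        handler = handlers.get(name, lambda clock: [])
        for new_t, new_name in handler(t):
            heapq.heappush(queue, (new_t, seq, new_name))
            seq += 1
    return log

# A machine breaks down, is repaired one time unit later, and breaks
# down again two time units after each repair.
handlers = {
    "breakdown": lambda t: [(t + 1, "repair")],
    "repair": lambda t: [(t + 2, "breakdown")],
}
log = run_des([(3, "breakdown")], handlers, until=10)
```

Note that between events nothing happens in the model, so simulated days can pass in microseconds of wall-clock time.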
CGI computer simulation
Formerly, the output data from a computer simulation was sometimes presented in a table or a matrix showing how data was affected by numerous changes in the simulation parameters. The use of the matrix format was related to traditional use of the matrix concept in mathematical models; however, psychologists and others noted that humans could quickly perceive trends by looking at graphs or even moving images or motion pictures generated from the data, as displayed by computer-generated imagery (CGI) animation. Although observers could not necessarily read out numbers or quote math formulas from observing a moving weather chart, they might be able to predict events (and "see that rain was headed their way") much faster than by scanning tables of rain-cloud coordinates. Such intense graphical displays, which transcended the world of numbers and formulae, sometimes also led to output that lacked a coordinate grid or omitted timestamps, as if straying too far from numeric data displays. Today, weather forecasting models tend to balance the view of moving rain/snow clouds against a map that uses numeric coordinates and numeric timestamps of events.
Similarly, CGI computer simulations of CAT scans can simulate how a tumor might shrink or change, during an extended period of medical treatment, presenting the passage of time as a spinning view of the visible human head, as the tumor changes.
Other applications of CGI computer simulations are being developed to graphically display large amounts of data, in motion, as changes occur during a simulation run.
Computer simulation in science
Generic examples of types of computer simulations in science, which are derived from an underlying mathematical description:
- a numerical simulation of differential equations that cannot be solved analytically. Theories that involve continuous systems, such as phenomena in physical cosmology, fluid dynamics (e.g. climate models, roadway noise models, roadway air dispersion models), continuum mechanics and chemical kinetics, fall into this category.
- a stochastic simulation, typically used for discrete systems where events occur probabilistically and which cannot be described directly with differential equations (this is a discrete simulation in the above sense). Phenomena in this category include genetic drift, and biochemical or gene regulatory networks with small numbers of molecules (see also: Monte Carlo method).
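The stochastic category can be sketched with the genetic drift example above: a simplified Wright-Fisher-style simulation in which an allele frequency wanders by random sampling alone. The population size, number of generations, and seed are illustrative.

```python
import random

# Sketch of a stochastic simulation: genetic drift in a small population,
# modeled Wright-Fisher style. Each generation, every one of the N
# offspring inherits its allele from a randomly chosen parent, so the
# allele frequency changes by chance alone until the allele is fixed
# (frequency 1) or lost (frequency 0).

def drift(pop_size, initial_freq, generations, rng):
    freq = initial_freq
    history = [freq]
    for _ in range(generations):
        # Count copies of the allele among pop_size random draws.
        copies = sum(1 for _ in range(pop_size) if rng.random() < freq)
        freq = copies / pop_size
        history.append(freq)
    return history

rng = random.Random(42)   # seeded for a reproducible run
history = drift(pop_size=20, initial_freq=0.5, generations=100, rng=rng)
```

No differential equation describes an individual run; only the statistics of many runs are predictable, which is why such systems are simulated rather than solved.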
Specific examples of computer simulations follow:
- statistical simulations based upon an agglomeration of a large number of input profiles, such as the forecasting of the equilibrium temperature of receiving waters, allowing the gamut of meteorological data to be input for a specific locale. This technique was developed for thermal pollution forecasting.
- agent-based simulation has been used effectively in ecology, where it is often called individual-based modeling and is used in situations for which individual variability in the agents cannot be neglected, such as population dynamics of salmon and trout (most purely mathematical models assume all trout behave identically).
- time-stepped dynamic models. In hydrology there are several such hydrology transport models, such as the SWMM and DSSAM models developed by the U.S. Environmental Protection Agency for river water quality forecasting.
- computer simulations have also been used to formally model theories of human cognition and performance, e.g. ACT-R.
- computer simulation using molecular modeling for drug discovery.
- computer simulation for studying the selective sensitivity of bonds by mechanochemistry during grinding of organic molecules.
- Computational fluid dynamics (CFD) simulations are used to simulate the behaviour of flowing air, water and other fluids. One-, two- and three-dimensional models are used. A one-dimensional model might simulate the effects of water hammer in a pipe. A two-dimensional model might be used to simulate the drag forces on the cross-section of an aeroplane wing. A three-dimensional simulation might estimate the heating and cooling requirements of a large building.
- An understanding of statistical thermodynamic molecular theory is fundamental to the appreciation of molecular solutions. Development of the Potential Distribution Theorem (PDT) allows one to simplify this complex subject to down-to-earth presentations of molecular theory.
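The individual-based modeling idea from the ecology example above can be sketched as follows: each agent carries its own state and rules, so individual variability (here in fish size) is represented directly rather than by a population average. All parameters are made up for illustration, not field-calibrated.

```python
import random

# Sketch of an agent-based (individual-based) model: each fish agent has
# its own state (size, alive) and per-individual rules, so variability
# between individuals is represented directly instead of by a single
# population-level density or average.

class Fish:
    def __init__(self, rng):
        self.size = rng.uniform(1.0, 3.0)   # individuals start out different
        self.alive = True

    def step(self, rng):
        if not self.alive:
            return
        self.size += rng.uniform(0.0, 0.5)  # individual growth
        # Smaller individuals face a higher chance of dying this step.
        if rng.random() < 0.1 / self.size:
            self.alive = False

rng = random.Random(0)
population = [Fish(rng) for _ in range(50)]
for _ in range(20):                         # 20 time steps
    for fish in population:
        fish.step(rng)
survivors = [f for f in population if f.alive]
```

A purely mathematical model would track only the population count; here, which individuals survive depends on their own sizes and histories.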
Notable, and sometimes controversial, computer simulations used in science include: Donella Meadows' World3, used in The Limits to Growth; James Lovelock's Daisyworld; and Thomas Ray's Tierra.
Simulation environments for physics and engineering
Graphical environments to design simulations have been developed. Special care was taken to handle events (situations in which the simulation equations are not valid and have to be changed). The open project Open Source Physics was started to develop reusable libraries for simulations in Java, together with Easy Java Simulations, a complete graphical environment that generates code based on these libraries.
Computer simulation in practical contexts
Computer simulations are used in a wide variety of practical contexts, such as:
- analysis of air pollutant dispersion using atmospheric dispersion modeling
- design of complex systems such as aircraft and also logistics systems
- design of noise barriers to effect roadway noise mitigation
- flight simulators to train pilots
- weather forecasting
- emulation, the simulation of other computers
- forecasting of prices on financial markets (for example Adaptive Modeler)
- behavior of structures (such as buildings and industrial parts) under stress and other conditions
- design of industrial processes, such as chemical processing plants
- strategic management and organizational studies
- reservoir simulation for petroleum engineering, to model the subsurface reservoir
- process engineering simulation tools
- robot simulators for the design of robots and robot control algorithms
- urban simulation models that simulate dynamic patterns of urban development and responses to urban land use and transportation policies (see the more detailed article on urban environment simulation)
- traffic engineering, to plan or redesign parts of the street network from single junctions through cities to a national highway network, for transportation system planning, design and operations (see the more detailed article on traffic simulation)
- modeling car crashes to test safety mechanisms in new vehicle models
The reliability and the trust people put in computer simulations depend on the validity of the simulation model; therefore, verification and validation are of crucial importance in the development of computer simulations. Another important aspect of computer simulations is reproducibility of the results, meaning that a simulation model should not provide a different answer for each execution. Although this might seem obvious, it is a special point of attention in stochastic simulations, where the random numbers should actually be pseudo-random numbers drawn from a repeatable sequence. An exception to reproducibility are human-in-the-loop simulations such as flight simulations and computer games. Here a human is part of the simulation and thus influences the outcome in a way that is hard, if not impossible, to reproduce exactly.
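The reproducibility requirement for stochastic simulations is usually met by seeding the pseudo-random number generator: two runs with the same seed draw the identical "random" sequence and so produce identical results. A minimal sketch:

```python
import random

# Reproducibility in a stochastic simulation: seeding the pseudo-random
# number generator makes a run repeatable, because the "random" draws are
# actually a deterministic sequence fixed by the seed.

def noisy_run(seed, steps=100):
    rng = random.Random(seed)        # independent, seeded generator
    x = 0.0
    for _ in range(steps):
        x += rng.gauss(0.0, 1.0)     # one random-walk step
    return x

same_a = noisy_run(seed=123)
same_b = noisy_run(seed=123)         # same seed, identical trajectory
different = noisy_run(seed=456)      # a new seed gives a new trajectory
```

Recording the seed alongside the results is what lets another researcher rerun the exact simulation later.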
Vehicle manufacturers make use of computer simulation to test safety features in new designs. By building a copy of the car in a physics simulation environment, they can save the hundreds of thousands of dollars that would otherwise be required to build a unique prototype and test it. Engineers can step through the simulation milliseconds at a time to determine the exact stresses being put upon each section of the prototype.
Computer graphics can be used to display the results of a computer simulation. Animations can be used to experience a simulation in real time, e.g. in training simulations. In some cases animations may also be useful in faster-than-real-time or even slower-than-real-time modes. For example, faster-than-real-time animations can be useful in visualizing the buildup of queues in the simulation of humans evacuating a building. Furthermore, simulation results are often aggregated into static images using various methods of scientific visualization.
In debugging, simulating a program execution under test (rather than executing natively) can detect far more errors than the hardware itself can detect and, at the same time, log useful debugging information such as instruction traces, memory alterations and instruction counts. This technique can also detect buffer overflows and similar "hard to detect" errors as well as produce performance information and tuning data.
Pitfalls
Although sometimes ignored in computer simulations, it is very important to perform sensitivity analysis to ensure that the accuracy of the results is properly understood. For example, the probabilistic risk analysis of factors determining the success of an oilfield exploration program involves combining samples from a variety of statistical distributions using the Monte Carlo method. If, for instance, one of the key parameters (e.g. the net ratio of oil-bearing strata) is known to only one significant figure, then the result of the simulation might not be more precise than one significant figure, although it might (misleadingly) be presented as having four significant figures.
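This pitfall can be illustrated with a toy Monte Carlo combination of uncertain inputs; all distributions and numbers here are hypothetical, not real oilfield data:

```python
import random
import statistics

# Toy Monte Carlo risk analysis: combine samples from several input
# distributions to estimate an output. If one input (here net_ratio) is
# only known to one significant figure, the spread of the output shows
# that quoting the mean to four figures would overstate its precision.

rng = random.Random(7)

def one_trial():
    volume = rng.uniform(90.0, 110.0)       # reservoir volume (arbitrary units)
    net_ratio = rng.uniform(0.25, 0.35)     # known only as "about 0.3"
    recovery = rng.uniform(0.18, 0.22)      # recovery factor
    return volume * net_ratio * recovery    # recoverable oil estimate

samples = [one_trial() for _ in range(10_000)]
mean = statistics.mean(samples)
spread = statistics.stdev(samples)
```

The spread of the output, not just its mean, should drive how many significant figures are reported.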
Model Calibration Techniques
The following three steps should be used to produce accurate simulation models: calibration, verification, and validation. Computer simulations are good at portraying and comparing theoretical scenarios, but to accurately model an actual case study a simulation has to match what is actually happening. A base model should be created and calibrated so that it matches the area being studied. The calibrated model should then be verified to ensure that the model is operating as expected based on the inputs. Once the model has been verified, the final step is to validate the model by comparing its outputs to historical data from the study area. This can be done by using statistical techniques and ensuring an adequate R-squared value. Unless these techniques are employed, the simulation model will produce inaccurate results and will not be a useful prediction tool.
Model calibration is achieved by adjusting the available parameters to change how the model operates and simulates the process. For example, in traffic simulation, typical parameters include look-ahead distance, car-following sensitivity, discharge headway, and start-up lost time. These parameters influence driver behavior, such as when and how long it takes a driver to change lanes, how much distance a driver leaves between their vehicle and the car in front of it, and how quickly a driver accelerates through an intersection. Adjusting these parameters has a direct effect on the traffic volume that can traverse the modeled roadway network by making drivers more or less aggressive. These calibration parameters can be fine-tuned to match characteristics observed in the field at the study location. Most traffic models have typical default values, but these may need to be adjusted to better match driver behavior at the location being studied.
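The calibration loop described above can be sketched as a parameter search against a field measurement. The toy model below is a deliberate oversimplification (lane throughput as 3600 divided by discharge headway); a real microsimulator is far more complex, and the observed volume is an assumed figure. The sketch only shows the shape of the process: adjust one parameter until model output matches the field.

```python
def simulated_throughput(headway_s):
    """Toy stand-in for a traffic model: vehicles/hour through one
    lane as a function of discharge headway in seconds (assumed form)."""
    return 3600.0 / headway_s

def calibrate(observed_vph, lo=1.0, hi=4.0, tol=1.0):
    """Bisect on the headway parameter until simulated throughput
    matches the observed volume (vehicles per hour)."""
    while hi - lo > 1e-6:
        mid = (lo + hi) / 2
        out = simulated_throughput(mid)
        if out > observed_vph + tol:
            lo = mid  # throughput too high -> lengthen headway
        elif out < observed_vph - tol:
            hi = mid  # throughput too low -> shorten headway
        else:
            return mid
    return (lo + hi) / 2

headway = calibrate(observed_vph=1800)
print(f"calibrated discharge headway: {headway:.2f} s")
```

With several interacting parameters (as in real traffic models), this one-dimensional search would be replaced by a multi-parameter optimization, but the principle of matching field observations is the same.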
Model verification is achieved by obtaining output data from the model and comparing it to what is expected from the input data. For example, in traffic simulation, traffic volume can be verified to ensure that the actual volume throughput in the model is reasonably close to the traffic volumes input into the model. Ten percent is a typical threshold used in traffic simulation to determine whether output volumes are reasonably close to input volumes. Simulation models handle inputs in different ways, so traffic that enters the network, for example, may or may not reach its desired destination. Additionally, traffic that wants to enter the network may not be able to if any congestion exists. This is why model verification is a very important part of the modeling process.
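The ten-percent check is simple to mechanize. In this sketch the link names and volumes are made up for illustration; the function just flags links whose simulated throughput deviates from the coded input volume by more than the threshold.

```python
def verify_volumes(input_volumes, output_volumes, threshold=0.10):
    """Return links whose model output deviates from the input
    volume by more than the threshold (10% is the common rule
    of thumb in traffic simulation)."""
    failures = {}
    for link, expected in input_volumes.items():
        actual = output_volumes.get(link, 0)
        deviation = abs(actual - expected) / expected
        if deviation > threshold:
            failures[link] = round(deviation, 3)
    return failures

# Illustrative input (coded) vs. output (simulated) hourly volumes
input_volumes = {"NB_main": 1200, "SB_main": 1100, "EB_ramp": 400}
output_volumes = {"NB_main": 1150, "SB_main": 1230, "EB_ramp": 395}

bad = verify_volumes(input_volumes, output_volumes)
print(bad)  # SB_main deviates by ~11.8% and fails verification
```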
The final step is to validate the model by comparing the results with what is expected based on historical data from the study area. Ideally, the model should produce results similar to what has happened historically. This is often checked by nothing more than quoting the R-squared statistic from the fit, which measures the fraction of variability that is accounted for by the model. However, a high R-squared value does not necessarily mean the model fits the data well; graphical residual analysis is another tool used to validate models. If model output values are drastically different from historical values, there is probably an error in the model. Checking this before using the model as a base for additional scenario models ensures that each one is accurate. If the outputs do not reasonably match historic values during validation, the model should be reviewed and updated to produce results more in line with expectations. It is an iterative process that helps produce more realistic models.
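Both validation tools mentioned above, the R-squared statistic and residual analysis, can be sketched directly from their definitions. The observed and predicted counts below are invented for illustration; the residuals would normally be plotted rather than printed, to reveal systematic patterns such as the model consistently under-predicting peaks.

```python
def r_squared(observed, predicted):
    """Fraction of the variability in the observations accounted
    for by the model: 1 - SS_res / SS_tot."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - ss_res / ss_tot

def residuals(observed, predicted):
    """Observed minus predicted; inspect for systematic patterns."""
    return [o - p for o, p in zip(observed, predicted)]

# Illustrative hourly traffic counts vs. model output
observed  = [820, 940, 1210, 1480, 1390, 1020]
predicted = [800, 960, 1180, 1500, 1350, 1060]

r2 = r_squared(observed, predicted)
res = residuals(observed, predicted)
print(f"R-squared = {r2:.3f}, residuals = {res}")
```

Note that a near-perfect R-squared with residuals that trend in one direction (all positive during peaks, say) would still indicate a model error, which is exactly why residual analysis complements the single summary statistic.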
Validating traffic simulation models requires comparing traffic estimated by the model to observed traffic on the roadway and transit systems. Initial comparisons are for trip interchanges between quadrants, sectors, or other large areas of interest. The next step is to compare traffic estimated by the models to traffic counts, including transit ridership, crossing contrived barriers in the study area. These are typically called screenlines, cutlines, and cordon lines and may be imaginary or actual physical barriers. Cordon lines surround particular areas such as the central business district or other major activity centers. Transit ridership estimates are commonly validated by comparing them to actual patronage crossing cordon lines around the central business district.
Three sources of error can cause weak correlation during calibration: input error, model error, and parameter error. In general, input error and parameter error can be adjusted easily by the user. Model error, however, is caused by the methodology used in the model and may not be as easy to fix. Simulation models are typically built using several different modeling theories that can produce conflicting results; some models are more generalized while others are more detailed. If model error occurs as a result of this, it may be necessary to adjust the model methodology to make the results more consistent.
These steps are necessary to ensure that simulation models function properly and produce realistic results. Simulation models can be used as a tool to verify engineering theories, but they are only valid if calibrated properly. Once satisfactory estimates of the parameters for all models have been obtained, the models must be checked to assure that they adequately perform their intended functions. The validation process establishes the credibility of the model by demonstrating its ability to replicate actual traffic patterns. The importance of model validation underscores the need for careful planning, thoroughness, and accuracy in the input data collection program undertaken for this purpose. Efforts should be made to ensure collected data is consistent with expected values. For example, in traffic analysis it is common for a traffic engineer to perform a site visit to verify traffic counts and become familiar with traffic patterns in the area. The resulting models and forecasts will be no better than the data used for model estimation and validation.
See also
- Virtual prototyping
- Stencil codes
- Meshfree methods
- Web-based simulation
- Emulator
- Procedural animation