Model of computation

In computer science, and more specifically in computability theory and computational complexity theory, a model of computation is a model that describes how the output of a mathematical function is computed given an input. A model describes how units of computation, memory, and communication are organized.[1] The computational complexity of an algorithm can be measured given a model of computation. Using a model allows the performance of algorithms to be studied independently of the variations that are specific to particular implementations and technologies.

Models

Models of computation can be classified into three categories: sequential models, functional models, and concurrent models.

Sequential models

Sequential models include:

  • Finite-state machines
  • Pushdown automata
  • Register machines
  • Random-access machines
  • Turing machines
  • Decision tree models
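
As an illustration, the following is a minimal sketch of a deterministic finite-state machine, one of the simplest sequential models. The state names and transition table are invented for this example; the machine accepts binary strings containing an even number of 1s.

```python
# Minimal deterministic finite-state machine (a sequential model).
# It reads the input one symbol at a time and keeps only a single
# piece of state: whether the number of 1s seen so far is even.

TRANSITIONS = {
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd", "0"): "odd",
    ("odd", "1"): "even",
}

def accepts(word: str) -> bool:
    """Return True if the binary string contains an even number of 1s."""
    state = "even"                      # start state
    for symbol in word:
        state = TRANSITIONS[(state, symbol)]
    return state == "even"             # accepting state

if __name__ == "__main__":
    print(accepts("1011"))   # False: three 1s
    print(accepts("1001"))   # True:  two 1s
```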

Functional models

Functional models include:

  • Lambda calculus
  • General recursive functions
  • Combinatory logic
  • Abstract rewriting systems
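
For a flavour of the functional style, here is a brief sketch of Church numerals, the encoding of natural numbers used in the untyped lambda calculus, written as Python lambda expressions purely for illustration: a number n is represented by the function that applies its first argument n times, and arithmetic becomes function application.

```python
# Church numerals: the natural number n is the higher-order function
# that applies f to x exactly n times.  Computation proceeds entirely
# by applying functions to functions, as in the lambda calculus.

zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Convert a Church numeral back to a Python int for inspection."""
    return n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
print(to_int(add(two)(two)))   # 4
```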

Concurrent models

Concurrent models include:

  • Cellular automata
  • Petri nets
  • Process calculi
  • The actor model
  • Kahn process networks
  • Interaction nets
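
As one sketch of a concurrent model, the following implements an elementary one-dimensional cellular automaton (here Rule 110, with a wrap-around boundary chosen only for the example): at each step every cell updates simultaneously as a function of its own state and those of its two neighbours.

```python
# One-dimensional elementary cellular automaton (a concurrent model):
# all cells are updated simultaneously from their local neighbourhood.

RULE_110 = {  # new state as a function of (left, self, right)
    (1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 1, (1, 0, 0): 0,
    (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0,
}

def step(cells):
    """Apply one synchronous update; the row wraps around at the edges."""
    n = len(cells)
    return [RULE_110[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])]
            for i in range(n)]

row = [0] * 15 + [1] + [0] * 15        # a single live cell in the middle
for _ in range(10):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```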

Some of these models have both deterministic and nondeterministic variants. Nondeterministic models are not useful for practical computation;[citation needed] they are used mainly in the study of the computational complexity of algorithms.

Models differ in their expressive power; for example, every function that can be computed by a finite-state machine can also be computed by a Turing machine, but not vice versa. For instance, a Turing machine can decide the language of balanced parentheses, whereas no finite-state machine can, since recognizing it requires an unbounded amount of memory.

Uses

In the field of runtime analysis of algorithms, it is common to specify the computational model in terms of the primitive operations that are allowed and that are assumed to have unit cost (unit-cost operations). A commonly used example is the random-access machine, which has unit cost for read and write access to all of its memory cells. In this respect it differs from the Turing machine model mentioned above, in which the cost of accessing a cell depends on how far the head must move to reach it.
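
To make the unit-cost convention concrete, here is a brief sketch that counts the primitive operations performed when summing an n-element array on a machine where every memory read, arithmetic operation, and comparison is charged one unit; the particular charging scheme is an assumption of the example, not a fixed standard.

```python
# Counting unit-cost operations for summing an n-element array under
# the random-access machine convention: each read, addition, and
# comparison is charged one unit, regardless of which cell is touched.

def summed_with_cost(values):
    cost = 0
    total = 0
    i = 0
    while True:
        cost += 1                 # comparison i < len(values)
        if i >= len(values):
            break
        cost += 1                 # read values[i] (unit cost at any address)
        cost += 1                 # addition total + values[i]
        total += values[i]
        cost += 1                 # increment i
        i += 1
    return total, cost

for n in (10, 100, 1000):
    _, cost = summed_with_cost(list(range(n)))
    print(n, cost)                # cost grows as roughly 4n + 1
```

Under these assumptions the operation count grows linearly with the input size, which is the sense in which summing an array takes linear time on a random-access machine.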

References

  1. "Models of Computation" (PDF).

Further reading

  • Fernández, Maribel (2009). Models of Computation: An Introduction to Computability Theory. Undergraduate Topics in Computer Science. Springer. ISBN 978-1-84882-433-1.
  • Savage, John E. (1998). Models Of Computation: Exploring the Power of Computing. Addison-Wesley. ISBN 978-0201895391.