Algorithms and Pseudocode

To be written...

Algorithms

In mathematics, computing, linguistics and related subjects, an algorithm is a finite sequence of instructions, an explicit, step-by-step procedure for solving a problem, often used for calculation and data processing. It is formally a type of effective method in which a list of well-defined instructions for completing a task will, when given an initial state, proceed through a well-defined series of successive states, eventually terminating in an end-state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as probabilistic algorithms, incorporate randomness.
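
To make the state-based picture concrete, here is a small Python sketch (our own illustration, not from the text): the first function walks through a well-defined series of successive states deterministically, while the second makes its transition at random, in the spirit of the probabilistic algorithms mentioned above.

 import random

 def sum_by_states(xs):
     """Deterministic: the state is (position, running total); each
     step has exactly one well-defined successor state."""
     state = (0, 0)                      # the initial state
     while state[0] < len(xs):           # not yet at the end-state
         i, total = state
         state = (i + 1, total + xs[i])  # the unique next state
     return state[1]                     # the end-state holds the result

 def random_pick(xs):
     """Probabilistic: the next state is chosen at random, so two runs
     on the same input may pass through different states."""
     return xs[random.randrange(len(xs))]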

Why algorithms are necessary: an informal definition

While there is no generally accepted formal definition of "algorithm", an informal definition could be "an algorithm is a process that performs some sequence of operations." For some, a program counts as an algorithm only if it eventually stops; for others, only if it stops within a given number of calculation steps.
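
The difference between these views is not academic. The following Python sketch (a standard illustration, not taken from the text above) is a short program whose termination for every starting value is a famous open problem, so whether it qualifies as an algorithm depends on which view one adopts.

 def collatz_steps(n):
     """Counts steps of the Collatz process: halve even numbers,
     map odd n to 3n + 1, until 1 is reached. No one has proved
     that this loop stops for every n > 0."""
     steps = 0
     while n != 1:
         n = n // 2 if n % 2 == 0 else 3 * n + 1
         steps += 1
     return steps

 print(collatz_steps(27))  # 111 steps, though this was not obvious in advance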

A prototypical example of an "algorithm" is Euclid's algorithm for determining the greatest common divisor of two integers greater than one: "subtract the smaller number from the larger one; repeat until you get a zero or a one." This procedure always stops, and the number of subtractions needed is always smaller than the larger of the two numbers.
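
A minimal Python sketch of this subtraction-based procedure might look as follows (the function name and the handling of the two stopping cases are our own):

 def gcd_by_subtraction(m, n):
     """Euclid's algorithm as stated above: repeatedly subtract the
     smaller number from the larger until a 0 or a 1 appears."""
     while m > 1 and n > 1:
         if m > n:
             m = m - n
         else:
             n = n - m
     if m == 0 or n == 0:
         return max(m, n)  # one number divided the other evenly
     return 1              # a 1 appeared: the inputs are coprime

 print(gcd_by_subtraction(12, 8))  # 4
 print(gcd_by_subtraction(9, 4))   # 1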

We can derive clues to the issues involved and an informal meaning of the word from the following quotation from Boolos & Jeffrey (1974, 1999):

No human being can write fast enough or long enough or small enough to list all members of an enumerably infinite set by writing out their names, one after another, in some notation. But humans can do something equally useful, in the case of certain enumerably infinite sets: They can give explicit instructions for determining the nth member of the set, for arbitrary finite n. Such instructions are to be given quite explicitly, in a form in which they could be followed by a computing machine, or by a human who is capable of carrying out only very elementary operations on symbols. (Boolos & Jeffrey 1974, 1999, p. 19)

The words "enumerably infinite" mean "countable using integers, perhaps extending to infinity." Boolos and Jeffrey are thus saying that an algorithm implies instructions for a process that "creates" output integers from an arbitrary "input" integer or integers that, in theory, can be chosen from 0 to infinity. We might therefore expect an algorithm to be an algebraic equation such as y = m + n, with two arbitrary "input variables" m and n that produce an output y. But as we see in Algorithm characterizations, the word algorithm implies much more than this, something on the order of (for our addition example):

Precise instructions (in language understood by "the computer") for a "fast, efficient, good" process that specifies the "moves" of "the computer" (machine or human, equipped with the necessary internally-contained information and capabilities) to find, decode, and then munch arbitrary input integers/symbols m and n, symbols + and = ... and (reliably, correctly, "effectively") produce, in a "reasonable" time, output-integer y at a specified place and in a specified format.
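
To see how much more an algorithm demands than the bare equation y = m + n, consider the following toy Python sketch (our own construction, not from the text): the "computer" must locate and decode the symbols m, +, n and = on its input, and then deliver the output at a specified place in a specified format.

 def add_machine(tape):
     """Reads a tape of the form 'm+n=', decodes the two input
     integers, and writes the decimal result after the '=' sign."""
     plus = tape.index("+")          # find the '+' symbol
     equals = tape.index("=")        # find the '=' symbol
     m = int(tape[:plus])            # decode the first input integer
     n = int(tape[plus + 1:equals])  # decode the second input integer
     # Produce the output at the specified place, in decimal format.
     return tape[:equals + 1] + str(m + n)

 print(add_machine("3+4="))  # 3+4=7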

The concept of algorithm is also used to define the notion of decidability, which is central to explaining how formal systems come into being from a small set of axioms and rules. In logic, the time that an algorithm requires to complete cannot be measured, as it is not obviously related to our customary physical dimensions. From such uncertainties, which characterize ongoing work, stems the lack of a definition of algorithm that suits both concrete (in some sense) and abstract usage of the term.
