Introduction to Algorithms: The Computer Science of Human Decisions

  • Length: 157 pages
  • Edition: 1
  • Publication Date: 2021-01-25
  • ISBN-10: B08V4GPSMG
Description

What are algorithms and why should you care? We’ll start with an overview of algorithms and then discuss two games that you could use an algorithm to solve more efficiently – the number guessing game and a route-finding game.
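Take the number guessing game as a quick illustration. A naïve player guesses 1, 2, 3, and so on in order; a better algorithm is binary search, which halves the range of possible answers with every guess. The sketch below is purely illustrative Python (it is not code from the book), and the names `guess_number`, `secret`, `low`, and `high` are our own:

```python
def guess_number(secret, low=1, high=100):
    """Guess `secret` by repeatedly halving the remaining range (binary search)."""
    guesses = 0
    while low <= high:
        guess = (low + high) // 2    # always probe the midpoint
        guesses += 1
        if guess == secret:
            return guess, guesses
        elif guess < secret:
            low = guess + 1          # secret is higher: discard the lower half
        else:
            high = guess - 1         # secret is lower: discard the upper half
    raise ValueError("secret was outside the initial range")

print(guess_number(73))  # (73, 6): found in 6 guesses, never more than 7 for 1..100
```

For a range of 1 to 100 this never needs more than seven guesses, whereas guessing in order can take up to a hundred.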
In mathematics and computer science, an algorithm (/ˈælɡərɪðəm/) is a finite sequence of well-defined, computer-implementable instructions, typically to solve a class of problems or to perform a computation. Algorithms are always unambiguous and are used as specifications for performing calculations, data processing, automated reasoning, and other tasks.
As an effective method, an algorithm can be expressed within a finite amount of space and time, and in a well-defined formal language for calculating a function. Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states, eventually producing “output” and terminating at a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input.
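That last point is worth a concrete example. The sketch below is a minimal randomized algorithm written in Python for illustration (it is not taken from the book): it estimates π by sampling random points in the unit square. The procedure still runs through a finite, well-defined sequence of steps and terminates, but because its input includes random numbers, the exact output varies from run to run.

```python
import random

def estimate_pi(samples=1_000_000):
    """Randomized (Monte Carlo) algorithm: estimate pi from random samples.

    Each sample is a random point in the unit square; the fraction of points
    that land inside the quarter circle of radius 1 approximates pi / 4.
    """
    inside = 0
    for _ in range(samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4 * inside / samples

print(estimate_pi())  # close to 3.14159, but slightly different on every run
```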
The concept of algorithm has existed since antiquity. Arithmetic algorithms, such as division algorithms, were used by ancient Babylonian mathematicians c. 2500 BC and Egyptian mathematicians c. 1550 BC. Greek mathematicians later used algorithms around 240 BC in the sieve of Eratosthenes for finding prime numbers and in the Euclidean algorithm for finding the greatest common divisor of two numbers. Arabic mathematicians such as al-Kindi in the 9th century used cryptographic algorithms for code-breaking, based on frequency analysis.
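Both of those ancient Greek algorithms are short enough to state directly. The following Python versions are our own illustrative transcriptions, not code from the book:

```python
def gcd(a, b):
    """Euclidean algorithm: repeatedly replace (a, b) with (b, a mod b)."""
    while b:
        a, b = b, a % b
    return a

def primes_up_to(n):
    """Sieve of Eratosthenes: cross out the multiples of each prime up to n."""
    is_prime = [True] * (n + 1)
    is_prime[0:2] = [False, False]          # 0 and 1 are not prime
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return [i for i, prime in enumerate(is_prime) if prime]

print(gcd(252, 105))     # 21
print(primes_up_to(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```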
The word algorithm itself is derived from the name of the 9th-century mathematician Muḥammad ibn Mūsā al-Khwārizmī, whose nisba (identifying him as from Khwarazm) was Latinized as Algoritmi. A partial formalization of what would become the modern concept of algorithm began with attempts to solve the Entscheidungsproblem (decision problem) posed by David Hilbert in 1928. Later formalizations were framed as attempts to define “effective calculability” or “effective method”. Those formalizations included the Gödel–Herbrand–Kleene recursive functions of 1930, 1934 and 1935, Alonzo Church’s lambda calculus of 1936, Emil Post’s Formulation 1 of 1936, and Alan Turing’s Turing machines of 1936–37 and 1939.

Probably the best way to understand an algorithm is to think of it as a recipe. There are many ways to bake cookies, but by following a recipe a baker knows to first preheat the oven, then measure out the flour, add the butter, the chocolate chips, and so on, until the desired cookies are done.
Using algorithms, a programmer or computer scientist can tell a machine to query database A for last month’s sales figures, compare them to the prior month and to the same month last year, and then display the results in a bar graph.
Mix multiple algorithms together and you have a working computer program.
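A toy version of that sales-reporting pipeline makes the query, compare, and display steps concrete. Everything below is hypothetical: the in-memory database, the sales table, and the figures are invented for illustration, and the “bar graph” is reduced to a row of # characters so the sketch stays self-contained.

```python
import sqlite3

# Hypothetical data standing in for "database A".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (month TEXT, total REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", [
    ("2020-01", 118000.0),   # same month last year
    ("2020-12", 131500.0),   # prior month
    ("2021-01", 142250.0),   # last month
])

def monthly_total(month):
    """Query step: fetch one month's sales total from the database."""
    row = conn.execute("SELECT total FROM sales WHERE month = ?", (month,)).fetchone()
    return row[0] if row else 0.0

# Comparison step: gather the three months side by side.
figures = {m: monthly_total(m) for m in ("2020-01", "2020-12", "2021-01")}

# Display step: a crude text bar graph, one '#' per 5,000 in sales.
for month, total in figures.items():
    print(f"{month} {'#' * int(total // 5000):<30} {total:>10,.0f}")
```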
As can be expected, there are numerous types of algorithms for virtually every kind of mathematical problem there is to solve. There are:
• Numerical algorithms.
• Algebraic algorithms.
• Geometric algorithms.
• Sequential algorithms.
• Operational algorithms.
• Theoretical algorithms.
There are also various algorithms named after the leading mathematicians who invented them:
• Shor’s algorithm.
• Girvan-Newman algorithm.
• Several Euclidean algorithms.
There are also those named after the specific problem they solve, such as:
• Bidirectional search algorithm.
• K-way merge algorithm (a minimal sketch appears below).
In the computing field, most algorithms tend to solve data management and analysis problems.
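As one example from the list above, a k-way merge combines several already-sorted lists into a single sorted list, a routine step in external sorting and in database engines. The following is a minimal illustrative sketch in Python, not code from the book (Python’s standard library also offers heapq.merge for the same job):

```python
import heapq

def k_way_merge(*sorted_lists):
    """K-way merge: combine several sorted lists into one sorted list.

    A min-heap holds one candidate element per list, so every step pops the
    globally smallest remaining element in O(log k) time.
    """
    heap = [(lst[0], i, 0) for i, lst in enumerate(sorted_lists) if lst]
    heapq.heapify(heap)
    merged = []
    while heap:
        value, list_idx, elem_idx = heapq.heappop(heap)
        merged.append(value)
        if elem_idx + 1 < len(sorted_lists[list_idx]):
            next_value = sorted_lists[list_idx][elem_idx + 1]
            heapq.heappush(heap, (next_value, list_idx, elem_idx + 1))
    return merged

print(k_way_merge([1, 4, 9], [2, 3, 8], [5, 6, 7]))  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
```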
