Algorithms

Index

  1. Algorithm
  2. Analyzing Algorithms
  3. Order Arithmetic
  4. Complexity of Algorithms
    • Time Complexity
    • Space Complexity
  5. Case Complexity
    • Worst-case
    • Best-case
    • Average-case
  6. Orders of Growth
  7. Asymptotic Analysis
  8. Asymptotic Notation
  9. Polynomial Running Time
  10. Exponential Running Time
  11. Principles of Algorithm Design
  12. Algorithm Design and Analysis Process

Algorithm

  • An algorithm is defined as a step-by-step procedure to perform a specific task within a finite number of steps.
  • It can be defined as a sequence of definite and effective instructions which terminates with the production of correct output from the given input.
  • An algorithm is a set of rules for carrying out a calculation either by hand or on a machine.
  • An algorithm is a sequence of computational steps that transform the input into the output.
  • An algorithm is a sequence of operations performed on data that have to be organized in data structures.
  • A finite set of instructions that specifies a sequence of operations to be carried out in order to solve a specific problem or class of problems is called an algorithm.

Analyzing Algorithms

The analysis of an algorithm provides background information that gives us a general idea of how long an algorithm will take for a given problem set.

For each algorithm considered, we will come up with an estimate of how long it will take to solve a problem that has a set of N input values.

The purpose of analysis of algorithms is not to give a formula that will tell us exactly how many seconds or computer cycles a particular algorithm will take.

This is not useful information because we would then need to talk about the -

  • Type of computer
  • Whether it has one or many users at a time
  • What processor it has
  • How fast its clock is
  • Whether it has a complex or reduced instruction set processor chip
  • How well the compiler optimizes the executable code.

All of these will have an impact on how fast a program for an algorithm will run.

To talk about analysis in those terms would mean that by moving a program to a faster computer, the algorithm would become better because it now completes its job faster.

Order Arithmetic

In mathematics and computer programming, the order of operations (also called operator precedence or order arithmetic) is a collection of rules that reflect conventions about which procedures to perform first when evaluating a given mathematical expression or algorithm.

How to Calculate Running Time of an Algorithm?

One way to determine the running time of an algorithm is to run an implementation of it on a computer and measure the time empirically; as noted above, the result then depends on the particular machine, compiler, and load.

Alternatively, we can estimate the running time by using a technique called algorithm analysis.

We can estimate an algorithm's performance by counting the number of basic operations performed by the algorithm to process an input of a certain size.

  • Basic Operation: the time to complete a basic operation does not depend on the particular values of its operands, so it takes a constant amount of time.
  • Input Size: the number of inputs processed by the algorithm. For example, for a sorting algorithm the input size is measured by the number of records to be sorted.
  • Growth Rate: the rate at which the running time (cost) of the algorithm grows as the size of the input grows. The growth rate has a tremendous effect on the resources consumed by the algorithm.
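
As a minimal sketch of this counting approach (the function and input are illustrative, not from the text), we can count comparisons, the basic operation here, while finding the maximum of a list:

```python
def max_with_count(values):
    """Return the maximum of a non-empty list together with the
    number of comparisons (the basic operation) performed."""
    comparisons = 0
    best = values[0]
    for v in values[1:]:
        comparisons += 1      # one comparison per remaining element
        if v > best:
            best = v
    return best, comparisons

# For an input of size n the loop always performs n - 1 comparisons,
# so the cost grows linearly with the input size.
print(max_with_count([3, 7, 2, 9, 4]))   # (9, 4)
```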

Complexity of Algorithms

It is very convenient to classify algorithms based on the relative amount of time or relative amount of space they require and specify the growth of time/space requirements as a function of the input size.

  • Time Complexity: the running time of the program as a function of the size of the input.
  • Space Complexity: the amount of memory an algorithm needs to complete its task, as a function of the size of the input.
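
As an illustrative sketch (both functions are hypothetical examples, not from the text), two ways of processing the same list can have the same O(n) time complexity but different space complexity:

```python
def total_constant_space(values):
    # O(n) time, O(1) extra space: a single accumulator
    # regardless of the input size.
    total = 0
    for v in values:
        total += v
    return total

def prefix_sums_linear_space(values):
    # O(n) time, O(n) extra space: stores one partial sum
    # per input element.
    sums = []
    total = 0
    for v in values:
        total += v
        sums.append(total)
    return sums

print(total_constant_space([1, 2, 3, 4]))      # 10
print(prefix_sums_linear_space([1, 2, 3, 4]))  # [1, 3, 6, 10]
```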

Case Complexity

Worst-case Complexity

The worst case complexity of the algorithm is the function defined by the maximum number of steps taken on any instance of size n.

The worst case describes the behaviour of the algorithm on the worst possible input instance. The worst-case running time of an algorithm is an upper bound on the running time for any input; knowing it gives us a guarantee that the algorithm will never take any longer.

Best-case Complexity

The best-case Complexity of the algorithm is the function defined by the minimum number of steps taken on any instance of size n.

Average-case complexity

The average-case complexity of the algorithm is the function defined by the average number of steps taken on any instance of size n.

It describes the expected behaviour when the input is randomly drawn from a given distribution. The average-case running time of an algorithm is an estimate of the running time for an "average" input. Computing an average-case running time entails knowing all possible input sequences, the probability distribution of occurrence of these sequences, and the running times for the individual sequences.
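
The three cases can be illustrated with a linear search sketch (the function and data below are assumed for illustration):

```python
def linear_search(values, target):
    """Return (index, steps); index is -1 when target is absent.
    A step is one element comparison, the basic operation here."""
    steps = 0
    for i, v in enumerate(values):
        steps += 1
        if v == target:
            return i, steps
    return -1, steps

data = [5, 1, 8, 3, 9]
print(linear_search(data, 5))   # best case: first element, 1 step
print(linear_search(data, 7))   # worst case: absent, n = 5 steps
# Average case (target present and equally likely in any position):
# about (n + 1) / 2 steps.
```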

Growth Rate of Function/ Order of growth

Resources for an algorithm are usually expressed as a function of the input size. Often this function is messy and difficult to work with. To study function growth easily, we reduce the function down to its important part.

Let:    f(n) = an² + bn + c

In this function, the n² term dominates: when n gets sufficiently large, the other terms barely factor into the result.

The dominant term is what we are interested in. To reduce a function, we ignore all constants and coefficients and look only at the highest-order term in relation to n.
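
A small numeric sketch (with hypothetical coefficients a = 2, b = 100, c = 1000) shows the n² term's share of f(n) approaching 1 as n grows:

```python
a, b, c = 2, 100, 1000          # hypothetical coefficients

def f(n):
    return a * n**2 + b * n + c

for n in (10, 1_000, 100_000):
    # fraction of f(n) contributed by the dominant n^2 term
    share = (a * n**2) / f(n)
    print(n, round(share, 4))
# At n = 10 the n^2 term is under 10% of f(n); by n = 100,000
# it accounts for more than 99.9%, so the lower-order terms
# barely matter for large n.
```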

Asymptotic Analysis

An asymptote is a line that a curve approaches ever more closely, and may or may not eventually touch. It is a line that stays within bounds.

Asymptotic notation is a shorthand way to write down and talk about the 'fastest possible' and 'slowest possible' running times for an algorithm, using high and low bounds on speed. These are also referred to as 'best case' and 'worst case' scenarios respectively.

Why are Asymptotic Notation Important?

  • They give a simple characterization of an algorithm's efficiency.
  • They allow the comparison of the performances of various algorithms.

Asymptotic Notations

Big-oh Notations (O)

Big-oh is the formal method of expressing the upper bound of an algorithm's running time. It is a measure of the longest amount of time it could possibly take for the algorithm to complete. More formally, for non-negative functions f(n) and g(n), if there exist an integer n0 and a constant c > 0 such that for all integers n > n0,

f(n) ≤ cg(n)

then f(n) is Big-O of g(n). This is denoted as:

f(n) ∊ O(g(n))

i.e., O(g(n)) is the set of functions which, as n gets large, grow no faster than a constant times g(n).
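
As a sketch of the definition, we can check a particular claim numerically; the function f and the witnesses c and n0 below are assumed for illustration, and are not the only valid choices:

```python
# Claim: f(n) = 3n^2 + 10n is in O(n^2).
# Assumed witnesses: c = 4, n0 = 10, since
# 3n^2 + 10n <= 4n^2  <=>  10n <= n^2  <=>  n >= 10.
def f(n):
    return 3 * n**2 + 10 * n

def g(n):
    return n**2

c, n0 = 4, 10
# Verify f(n) <= c * g(n) for a range of n > n0.
assert all(f(n) <= c * g(n) for n in range(n0 + 1, 5000))
# Below n0 the inequality can fail, which is why n0 is needed:
assert f(5) > c * g(5)
```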

Big-omega Notation (Ω)

For non-negative functions, f(n) and g(n) if there exists an integer n0 and a constant c > 0 such that all integers n > n0, f(n) ≥ cg(n) then f(n) is Big-omega of g(n). This is denoted as:

f(n) ∈ Ω(g(n))

This is almost the same as the definition of Big-oh, except that here f(n) ≥ cg(n); this makes g(n) a lower bound function instead of an upper bound function. It describes the best that can happen for a given data size.

Theta Notation (θ)

The theta notation (θ) provides both a lower and an upper bound for the function f. For non-negative functions f(n) and g(n), if there exist an integer n0 and positive constants c1 and c2 (c1 > 0 and c2 > 0) such that for all integers n > n0, c1g(n) ≤ f(n) ≤ c2g(n), then f(n) is theta of g(n). This is denoted as f(n) ∈ θ(g(n)), and we say f is order g.
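
As an illustrative sketch, both bounds of the theta definition can be checked numerically; the function f and the witnesses c1, c2, and n0 are assumed for illustration:

```python
# Claim: f(n) = 3n^2 + 10n is in theta(n^2).
# Assumed witnesses: c1 = 3, c2 = 4, n0 = 10, because
# 3n^2 <= 3n^2 + 10n for all n >= 0 (lower bound), and
# 3n^2 + 10n <= 4n^2 for all n >= 10 (upper bound).
def f(n):
    return 3 * n**2 + 10 * n

def g(n):
    return n**2

c1, c2, n0 = 3, 4, 10
# Verify c1 * g(n) <= f(n) <= c2 * g(n) for a range of n > n0.
assert all(c1 * g(n) <= f(n) <= c2 * g(n)
           for n in range(n0 + 1, 5000))
```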

Polynomial Running Time

An algorithm is said to be solvable in polynomial time if the number of steps required to complete the algorithm for a given input is O(nᵏ) for some non-negative integer k, where n is the size of the input. Polynomial-time algorithms are said to be "fast." Most familiar mathematical operations such as addition, subtraction, multiplication, and division, as well as computing square roots, powers, and logarithms, can be performed in polynomial time.

If the running time is O(nᶜ), where n is the number of elements in the input (i.e., the size of the input) and c is some constant integer, then we say the algorithm has polynomial time complexity.

Exponential Running Time

Exponential-time problems are those which can be solved by an exponential-time algorithm, but for which no polynomial-time algorithm is known. An algorithm is said to be exponential time if T(n) is upper bounded by 2^poly(n), where poly(n) is some polynomial in n. More formally, an algorithm is exponential time if T(n) is bounded by O(2^(n^k)) for some constant k.

If the running time is O(cⁿ), where n is the number of elements in the input (i.e., the size of the input) and c is some constant greater than 1, then we say the algorithm has exponential time complexity.
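
A sketch contrasting the two growth rates (both step-counting functions are illustrative): examining every pair of n items takes on the order of n² steps, while enumerating every subset of n items takes 2ⁿ steps:

```python
# Polynomial: examining every pair of n items,
# n * (n - 1) / 2 steps, i.e. O(n^2).
def count_pairs(n):
    steps = 0
    for i in range(n):
        for j in range(i + 1, n):
            steps += 1
    return steps

# Exponential: enumerating every subset of n items, 2^n steps.
def count_subsets(n):
    steps = 0
    for _ in range(2 ** n):
        steps += 1
    return steps

for n in (5, 10, 20):
    print(n, count_pairs(n), count_subsets(n))
# At n = 20 the pair count is only 190 steps, while the subset
# count has already exploded to 1,048,576 steps.
```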

Principles of Algorithm Design/ Algorithm Properties

  • Input: an algorithm must have zero or more well-defined inputs.
  • Output: whenever any input is given to an algorithm, it should produce at least one output.
  • Finiteness: the algorithm must complete after a finite number of instructions have been executed.
  • Absence of ambiguity: each step must be clearly defined, having only one interpretation.
  • Definiteness of sequence: each step must have a uniquely defined preceding and succeeding step. The first step (start step) and last step (halt step) must be clearly noted.
  • Feasibility: it must be possible to perform each instruction.

Algorithm Design and Analysis Process

  • Understand the problem
  • Decide on: Computational means, Exact vs. Approximate solving, Algorithm design technique
  • Design an algorithm
  • Prove correctness
  • Analyze the algorithm
  • Code the algorithm