
Worst, Average and Best Case Analysis of Algorithms

Last Updated : 14 Mar, 2024

In the previous post, we discussed how asymptotic analysis overcomes the problems of the naive way of analyzing algorithms. Now let's take an overview of asymptotic notation and learn what the worst, average, and best cases of an algorithm are:

Popular Notations in Complexity Analysis of Algorithms

1. Big-O Notation

We define an algorithm's worst-case time complexity using Big-O notation, which describes the set of functions that grow slower than or at the same rate as the given expression. In other words, it bounds the maximum amount of time an algorithm can require, considering all possible input values.

2. Omega Notation

Omega notation defines the best case of an algorithm's time complexity: it describes the set of functions that grow faster than or at the same rate as the given expression. In other words, it bounds the minimum amount of time an algorithm requires, considering all possible input values.

3. Theta Notation

Theta notation is used for the average case of an algorithm's time complexity: when the set of functions lies in both O(expression) and Omega(expression), Theta notation applies, meaning the functions grow at the same rate as the expression. This is how we define the average-case time complexity of an algorithm.
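
For reference, the standard formal definitions behind these three notations (as given, for example, in Introduction to Algorithms by Cormen et al., listed in the references at the end of this post) are:

  • O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
  • Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
  • Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }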

Measurement of Complexity of an Algorithm

Based on the above three notations of time complexity, there are three cases in which to analyze an algorithm:

1. Worst Case Analysis (Mostly used) 

In the worst-case analysis, we calculate the upper bound on the running time of an algorithm. We must know the case that causes the maximum number of operations to be executed. For Linear Search, the worst case happens when the element to be searched (x) is not present in the array. When x is not present, the search() function compares it with all the elements of arr[] one by one. Therefore, the worst-case time complexity of linear search would be O(n).

2. Best Case Analysis (Very Rarely used) 

In the best-case analysis, we calculate the lower bound on the running time of an algorithm. We must know the case that causes the minimum number of operations to be executed. In the linear search problem, the best case occurs when x is present at the first location. The number of operations in the best case is constant (not dependent on n). So the time complexity in the best case would be Ω(1).

3. Average Case Analysis (Rarely used) 

In average case analysis, we take all possible inputs, calculate the computing time for each of them, sum all the calculated values, and divide the sum by the total number of inputs. We must know (or predict) the distribution of cases. For the linear search problem, let us assume that all cases are uniformly distributed (including the case of x not being present in the array). So we sum the cost of all the cases and divide the sum by (n+1). Following is the value of the average-case time complexity.
 

Average Case Time = \sum_{i=1}^{n+1}\frac{\Theta(i)}{(n+1)} = \frac{\Theta\left(\frac{(n+1)(n+2)}{2}\right)}{(n+1)} = \Theta(n)
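
To make this concrete, the following minimal C++ sketch (not part of the original example; the helper comparisonsFor is just illustrative) sums the comparison counts of linear search over all n+1 equally likely cases and prints their average, which grows linearly with n:

C++
// Sketch: average number of comparisons made by linear search over
// all n+1 equally likely cases (target at index 0..n-1, or absent).
#include <iostream>
using namespace std;

// Comparisons performed when the target sits at index pos;
// pos == n models the "not present" case (all n elements compared).
int comparisonsFor(int n, int pos)
{
    return (pos < n) ? pos + 1 : n;
}

int main()
{
    int n = 1000;
    long long total = 0;

    // Sum the cost of all n + 1 cases.
    for (int pos = 0; pos <= n; pos++)
        total += comparisonsFor(n, pos);

    // The average is about n/2, i.e. Theta(n), matching the formula above.
    cout << "Average comparisons for n = " << n << ": "
         << (double)total / (n + 1) << endl;
    return 0;
}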

Which Complexity analysis is generally used?

Below, the three cases of complexity analysis are ranked by how often they are used in practice:

1. Worst Case Analysis: 

Most of the time, we do worst-case analysis to analyze algorithms. In worst-case analysis, we guarantee an upper bound on the running time of an algorithm, which is useful information.

2. Average Case Analysis 

The average case analysis is not easy to do in most practical cases and it is rarely done. In the average case analysis, we must know (or predict) the mathematical distribution of all possible inputs. 

3. Best Case Analysis 

The best case analysis is of little use. Guaranteeing a lower bound on an algorithm's running time doesn't provide much information: even with a good best case, the algorithm may still take years to run on a worst-case input.

Interesting information about asymptotic notations:

A) For some algorithms, all the cases (worst, best, average) are asymptotically the same, i.e., there are no distinct worst and best cases.

  • Example: Merge Sort does Θ(n log n) operations in all cases.

B) Whereas most of the other sorting algorithms have distinct worst and best cases.

  • Example 1: In the typical implementation of Quick Sort (where the pivot is chosen as a corner element), the worst case occurs when the input array is already sorted and the best case occurs when the pivot always divides the array into two halves.
  • Example 2: For insertion sort, the worst case occurs when the array is reverse sorted and the best case occurs when the array is already sorted in the same order as the output (see the sketch after this list).
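
To see Example 2 concretely, here is a minimal C++ sketch (not from the original article) that counts the key comparisons insertion sort performs on an already sorted array versus a reverse-sorted one:

C++
// Sketch: count key comparisons made by insertion sort to show the gap
// between its best case (sorted input) and worst case (reverse-sorted input).
#include <iostream>
#include <vector>
using namespace std;

// Insertion sort on a copy of the input; returns the number of
// key comparisons performed.
long long insertionSortComparisons(vector<int> a)
{
    long long comparisons = 0;
    for (int i = 1; i < (int)a.size(); i++) {
        int key = a[i];
        int j = i - 1;
        while (j >= 0) {
            comparisons++;          // compare a[j] with key
            if (a[j] <= key)
                break;
            a[j + 1] = a[j];        // shift the larger element right
            j--;
        }
        a[j + 1] = key;
    }
    return comparisons;
}

int main()
{
    int n = 1000;
    vector<int> asc(n), desc(n);
    for (int i = 0; i < n; i++) {
        asc[i] = i;        // already sorted: best case
        desc[i] = n - i;   // reverse sorted: worst case
    }

    cout << "Already sorted: " << insertionSortComparisons(asc)
         << " comparisons (about n)" << endl;
    cout << "Reverse sorted: " << insertionSortComparisons(desc)
         << " comparisons (about n*n/2)" << endl;
    return 0;
}

On sorted input every element needs only one comparison, while on reverse-sorted input each element is compared with all the elements before it, which is where the Θ(n) versus Θ(n²) gap comes from.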

Examples with their complexity analysis:

1. Linear search algorithm:

C++
// C++ implementation of the approach
#include <bits/stdc++.h>
using namespace std;

// Linearly search x in arr[].
// If x is present then return the index,
// otherwise return -1
int search(int arr[], int n, int x)
{
    int i;
    for (i = 0; i < n; i++) {
        if (arr[i] == x)
            return i;
    }
    return -1;
}

// Driver's Code
int main()
{
    int arr[] = { 1, 10, 30, 15 };
    int x = 30;
    int n = sizeof(arr) / sizeof(arr[0]);

    // Function call
    cout << x << " is present at index "
         << search(arr, n, x);

    return 0;
}
C
// C implementation of the approach
#include <stdio.h>

// Linearly search x in arr[].
// If x is present then return the index,
// otherwise return -1
int search(int arr[], int n, int x)
{
    int i;
    for (i = 0; i < n; i++) {
        if (arr[i] == x)
            return i;
    }
    return -1;
}

/* Driver's code*/
int main()
{
    int arr[] = { 1, 10, 30, 15 };
    int x = 30;
    int n = sizeof(arr) / sizeof(arr[0]);

    // Function call
    printf("%d is present at index %d", x,
           search(arr, n, x));

    getchar();
    return 0;
}
Java
// Java implementation of the approach

public class GFG {

    // Linearly search x in arr[].  If x is present then
    // return the index, otherwise return -1
    static int search(int arr[], int n, int x)
    {
        int i;
        for (i = 0; i < n; i++) {
            if (arr[i] == x) {
                return i;
            }
        }
        return -1;
    }

    /* Driver's code*/
    public static void main(String[] args)
    {
        int arr[] = { 1, 10, 30, 15 };
        int x = 30;
        int n = arr.length;

        // Function call
        System.out.printf("%d is present at index %d", x,
                          search(arr, n, x));
    }
}
C#
// C# implementation of the approach
using System;
public class GFG {

    // Linearly search x in arr[].  If x is present then
    // return the index, otherwise return -1
    static int search(int[] arr, int n, int x)
    {
        int i;
        for (i = 0; i < n; i++) {
            if (arr[i] == x) {
                return i;
            }
        }
        return -1;
    }

    /* Driver's code*/
    public static void Main()
    {
        int[] arr = { 1, 10, 30, 15 };
        int x = 30;
        int n = arr.Length;

        // Function call
        Console.WriteLine(x + " is present at index "
                          + search(arr, n, x));
    }
}
JavaScript
// JavaScript implementation of the approach

// Linearly search x in arr. If x is present then
// return the index, otherwise return -1
function search(arr, n, x) {
    var i;
    for (i = 0; i < n; i++) {
        if (arr[i] == x) {
            return i;
        }
    }
    return -1;
}

/* Driver's code */
var arr = [ 1, 10, 30, 15 ];
var x = 30;
var n = arr.length;

// Function call
console.log(x + " is present at index " + search(arr, n, x));
PHP
<?php
// PHP implementation of the approach

// Linearly search x in arr[]. If x 
// is present then return the index,
// otherwise return -1
function search($arr, $n, $x)
{
    for ($i = 0; $i < $n; $i++)
    {
        if ($arr[$i] == $x)
            return $i;
    }
    return -1;
}

// Driver's Code
$arr = array(1, 10, 30, 15);
$x = 30;
$n = sizeof($arr);

// Function call
echo $x . " is present at index ". 
             search($arr, $n, $x);
Python3
# Python 3 implementation of the approach

# Linearly search x in arr[]. If x is present
# then return the index, otherwise return -1


def search(arr, x):
    for index, value in enumerate(arr):
        if value == x:
            return index
    return -1


# Driver's Code
if __name__ == '__main__':
    arr = [1, 10, 30, 15]
    x = 30

    # Function call
    print(x, "is present at index",
          search(arr, x))

Output
30 is present at index 2


Time Complexity Analysis: (In Big-O notation)

  • Best Case: O(1). This occurs when the element to be searched is at the first index of the given list, so the number of comparisons in this case is 1.
  • Average Case: O(n). On average, the element to be searched lies around the middle of the list, so roughly half of the elements are compared, which is still linear in n.
  • Worst Case: O(n). This occurs when:
    • The element to be searched is at the last index
    • The element to be searched is not present in the list

2. In this example, we will take an array of length (n) and deal with the following cases:

  • If (n) is even then our output will be 0
  • If (n) is odd then our output will be the sum of the elements of the array.

Below is the implementation of the given problem:

C++
// C++ implementation of the approach
#include <bits/stdc++.h>
using namespace std;

int getSum(int arr[], int n)
{
    if (n % 2 == 0) // (n) is even
    {
        return 0;
    }
    int sum = 0;
    for (int i = 0; i < n; i++) {
        sum += arr[i];
    }
    return sum; //  (n) is odd
}

// Driver's Code
int main()
{
    // Declaring two array one of length odd and other of
    // length even;
    int arr[4] = { 1, 2, 3, 4 };
    int a[5] = { 1, 2, 3, 4, 5 };

    // Function call
    cout << getSum(arr, 4)
         << endl; // print 0 because (n) is even
    cout << getSum(a, 5)
         << endl; // print sum because (n) is odd
}
// This code is contributed by Suruchi Kumari
Java
// Java implementation of the approach

public class GFG {
    static int getsum(int arr[], int n)
    {
        if (n % 2 == 0) // if (n) is even
        {
            return 0;
        }
        int sum = 0;
        for (int i = 0; i < n; i++) {
            sum += arr[i];
        }
        return sum; // if (n) is odd
    }

    /* Driver's code*/
    public static void main(String[] args)
    {
        int arr1[]
            = { 1, 2, 3,
                4 }; // Declaring an array of even length
        int n1 = arr1.length;

        int arr2[]
            = { 1, 2, 3, 4,
                5 }; // Declaring an array of odd length
        int n2 = arr2.length;

        // Function call
        System.out.println(getsum(
            arr1, n1)); // print 0 because (n) is even
        System.out.println(getsum(
            arr2,
            n2)); // print sum of array because (n) is odd
    }
} // This code is contributed by Syed Maruf Ali (Sdmrf)
C#
using System;

public class Gfg
{
  static int getSum(int[] arr, int n)
  {
    if (n % 2 == 0) // (n) is even
    {
      return 0;
    }
    int sum = 0;
    for (int i = 0; i < n; i++) {
      sum += arr[i];
    }
    return sum; //  (n) is odd
  }

  // Driver's Code
  public static void Main(string[] args)
  {
    // Declaring two array one of length odd and other of
    // length even;
    int[] arr = { 1, 2, 3, 4 };
    int[] a = { 1, 2, 3, 4, 5 };

    // Function call
    Console.Write(getSum(arr, 4)+"\n"); // print 0 because (n) is even
    Console.Write(getSum(a, 5)); // print sum because (n) is odd
  }
}

// This code is contributed by poojaagarwal2.
JavaScript
// JavaScript implementation of the approach
function getSum(arr, n) {
    if (n % 2 == 0) {
        // (n) is even
        return 0;
    }
    var sum = 0;
    for (var i = 0; i < n; i++) {
        sum += arr[i];
    }
    return sum; // (n) is odd
}

// Driver's Code

// Declaring two arrays, one of even length and the other of odd length
var arr = [1, 2, 3, 4];
var a = [1, 2, 3, 4, 5];

// Function call
console.log(getSum(arr, 4)); // print 0 because (n) is even
console.log(getSum(a, 5)); // print sum because (n) is odd

// This code is contributed by satwiksuman.
PHP
<?php
// PHP implementation of the approach

function getSum($arr, $n) {
    if ($n % 2 == 0) {
        return 0;
    }

    $sum = 0;
    for ($i = 0; $i < $n; $i++) {
        $sum += $arr[$i];
    }
    return $sum;
}

// Driver's Code
$arr1 = [1, 2, 3, 4];  // Declaring an array of even length
$n1 = count($arr1);
$arr2 = [1, 2, 3, 4, 5];  // Declaring an array of odd length
$n2 = count($arr2);

// Function calls
echo getSum($arr1, $n1) . "\n";  // print 0 because (n) is even
echo getSum($arr2, $n2) . "\n";  // print sum of array because (n) is odd
?>
Python3
# Python 3 implementation of the approach


def getsum(arr, n):
    if n % 2 == 0:  # if (n) is even
        return 0

    Sum = 0
    for i in range(n):
        Sum += arr[i]
    return Sum  # if (n) is odd


# Driver's Code
if __name__ == '__main__':
    arr1 = [1, 2, 3, 4]  # Declaring an array of even length
    n1 = len(arr1)
    arr2 = [1, 2, 3, 4, 5]  # Declaring an array of odd length
    n2 = len(arr2)

    # Function call
    print(getsum(arr1, n1))  # print 0 because (n) is even
    print(getsum(arr2, n2))  # print sum of array because (n) is odd

# This code is contributed by Syed Maruf Ali

Output
0
15


Time Complexity Analysis:

  • Best Case: The order of growth will be constant because in the best case we assume that (n) is even, so the function returns immediately.
  • Average Case: Here we assume that even and odd lengths are equally likely, so the order of growth will be linear (a quick expected-cost check follows this list).
  • Worst Case: The order of growth will be linear because in this case we assume that (n) is odd, so the whole array must be summed.
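
As a quick sanity check of the average case above, if even and odd lengths each occur with probability 1/2, the expected cost works out to:

Average Case Time = \frac{1}{2}\Theta(1) + \frac{1}{2}\Theta(n) = \Theta(n)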

For more details, please refer to Design and Analysis of Algorithms.

Worst, Average, and Best Case Analysis of Algorithms is a technique used to analyze the performance of algorithms under different conditions. Here are some advantages, disadvantages, important points, and reference books related to this analysis technique:

Advantages:

  1. This technique allows developers to understand the performance of algorithms under different scenarios, which can help in making informed decisions about which algorithm to use for a specific task.
  2. Worst case analysis provides a guarantee on the upper bound of the running time of an algorithm, which can help in designing reliable and efficient algorithms.
  3. Average case analysis provides a more realistic estimate of the running time of an algorithm, which can be useful in real-world scenarios.


Disadvantages:

  1. This technique can be time-consuming and requires a good understanding of the algorithm being analyzed.
  2. Worst case analysis does not provide any information about the typical running time of an algorithm, which can be a disadvantage in real-world scenarios.
  3. Average case analysis requires knowledge of the probability distribution of input data, which may not always be available.


Important points:

  1. The worst case analysis of an algorithm provides an upper bound on the running time of the algorithm for any input of a given size.
  2. The average case analysis of an algorithm provides an estimate of the running time of the algorithm for a random input.
  3. The best case analysis of an algorithm provides a lower bound on the running time of the algorithm for any input of a given size.
  4. The big O notation is commonly used to express the worst case running time of an algorithm.
  5. Different algorithms may have different best, average, and worst case running times.


Reference books:

  1. “Introduction to Algorithms” by Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein is a comprehensive guide to algorithm analysis, including worst, average, and best case analysis.
  2. “Algorithm Design” by Jon Kleinberg and Éva Tardos provides a modern approach to algorithm design, with a focus on worst case analysis.
  3. “The Art of Computer Programming” by Donald Knuth is a classic text on algorithms and programming, which includes a detailed discussion of worst case analysis.
  4. “Algorithms Unlocked” by Thomas H. Cormen provides a gentle introduction to algorithm analysis, including worst, average, and best case analysis.

