Worst, Average and Best Case Analysis of Algorithms
Last Updated: 14 Mar, 2024
In the previous post, we discussed how asymptotic analysis overcomes the problems of the naive way of analyzing algorithms. Let us now take an overview of asymptotic notation and learn what the worst, average, and best cases of an algorithm are:
Popular Notations in Complexity Analysis of Algorithms
1. Big-O Notation
We define an algorithm's worst-case time complexity using Big-O notation, which describes the set of functions that grow slower than or at the same rate as the given expression. In other words, it gives the maximum amount of time an algorithm can require, considering all input values.
2. Omega Notation
Omega notation defines an algorithm's best-case time complexity: it describes the set of functions that grow faster than or at the same rate as the given expression. In other words, it gives the minimum amount of time an algorithm requires, considering all input values.
3. Theta Notation
Theta notation is used for the average case of an algorithm's time complexity: when a function lies in both O(expression) and Omega(expression), it lies in Theta(expression), i.e., the bound is tight. This is how we typically express the average-case time complexity of an algorithm.
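For reference, these notations have standard set-based definitions (stated here in general form, independent of any particular algorithm):
O(g(n)) = \{ f(n) : \text{there exist constants } c > 0,\ n_0 > 0 \text{ such that } 0 \le f(n) \le c \cdot g(n) \text{ for all } n \ge n_0 \}
\Omega(g(n)) = \{ f(n) : \text{there exist constants } c > 0,\ n_0 > 0 \text{ such that } 0 \le c \cdot g(n) \le f(n) \text{ for all } n \ge n_0 \}
\Theta(g(n)) = O(g(n)) \cap \Omega(g(n))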
Measurement of Complexity of an Algorithm
Based on the above three notations of time complexity, there are three cases to analyze an algorithm:
1. Worst Case Analysis (Mostly used)
In worst-case analysis, we calculate an upper bound on the running time of an algorithm. We must know the case that causes the maximum number of operations to be executed. For linear search, the worst case happens when the element to be searched (x) is not present in the array. When x is not present, the search() function compares it with all the elements of arr[] one by one. Therefore, the worst-case time complexity of linear search is O(n).
2. Best Case Analysis (Very Rarely used)
In best-case analysis, we calculate a lower bound on the running time of an algorithm. We must know the case that causes the minimum number of operations to be executed. In the linear search problem, the best case occurs when x is present at the first location. The number of operations in the best case is constant (not dependent on n), so the time complexity in the best case would be Ω(1).
3. Average Case Analysis (Rarely used)
In average-case analysis, we take all possible inputs, calculate the computing time for each of them, sum all the calculated values, and divide the sum by the total number of inputs. We must know (or predict) the distribution of cases. For the linear search problem, let us assume that all cases are uniformly distributed (including the case of x not being present in the array). So we sum all the cases and divide the sum by (n+1). Following is the value of the average-case time complexity.
Average Case Time = \frac{\sum_{i=1}^{n+1} \Theta(i)}{n+1} = \frac{\Theta\left(\frac{(n+1)(n+2)}{2}\right)}{n+1} = \Theta(n)
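To make this concrete, here is a small illustrative Python sketch (not part of the article's implementations below; the helper names are our own) that empirically estimates the average number of comparisons linear search makes when each of the n positions, plus the "not present" case, is equally likely. The measured average grows roughly like n/2, i.e., linearly in n.
Python3
import random

def count_comparisons(arr, x):
    # Return how many comparisons linear search makes for x in arr
    count = 0
    for value in arr:
        count += 1
        if value == x:
            break
    return count

def average_comparisons(n, trials=10000):
    # Estimate the average comparisons assuming all n+1 cases
    # (x at each of the n positions, or x absent) are equally likely
    arr = list(range(n))  # distinct keys 0 .. n-1
    total = 0
    for _ in range(trials):
        case = random.randint(0, n)   # 0..n-1: present at that index, n: absent
        x = case if case < n else -1  # -1 never occurs in arr
        total += count_comparisons(arr, x)
    return total / trials

if __name__ == '__main__':
    for n in (10, 100, 1000):
        # The printed averages grow roughly like n/2, i.e., Theta(n)
        print(n, average_comparisons(n))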
Which Complexity analysis is generally used?
Below, the three kinds of complexity analysis are ranked by how commonly they are used:
1. Worst Case Analysis:
Most of the time, we do worst-case analysis to analyze algorithms. In worst-case analysis, we guarantee an upper bound on the running time of an algorithm, which is useful information.
2. Average Case Analysis
Average-case analysis is not easy to do in most practical cases, so it is rarely done. In average-case analysis, we must know (or predict) the mathematical distribution of all possible inputs.
3. Best Case Analysis
The best-case analysis is bogus. Guaranteeing a lower bound on an algorithm doesn't provide any useful information, as in the worst case the same algorithm may still take years to run.
Interesting information about asymptotic notations:
A) For some algorithms, all the cases (worst, best, average) are asymptotically the same, i.e., there are no separate worst and best cases.
- Example: Merge Sort does Θ(n log(n)) operations in all cases.
B) Whereas most of the other sorting algorithms have distinct worst and best cases.
- Example 1: In the typical implementation of Quick Sort (where the pivot is chosen as a corner element), the worst case occurs when the input array is already sorted, and the best case occurs when the pivots always divide the array into two halves.
- Example 2: For insertion sort, the worst case occurs when the array is reverse sorted, and the best case occurs when the array is already sorted in the same order as the output (see the sketch after this list).
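As an illustration of Example 2 (a minimal sketch with our own helper name, not taken from the article's code below), the following Python snippet counts the key comparisons insertion sort performs on a sorted versus a reverse-sorted array; the counts behave like Θ(n) in the best case and Θ(n²) in the worst case.
Python3
def insertion_sort_comparisons(arr):
    # Sort a copy of arr with insertion sort and return
    # the number of key comparisons performed
    a = list(arr)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1       # compare key with a[j]
            if a[j] > key:
                a[j + 1] = a[j]    # shift the larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

if __name__ == '__main__':
    n = 1000
    best = insertion_sort_comparisons(list(range(n)))           # already sorted
    worst = insertion_sort_comparisons(list(range(n, 0, -1)))   # reverse sorted
    print("best case comparisons:", best)     # about n - 1
    print("worst case comparisons:", worst)   # about n * (n - 1) / 2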
Examples with their complexity analysis:
1. Linear search algorithm:
C++
// C++ implementation of the approach
#include <bits/stdc++.h>
using namespace std;

// Linearly search x in arr[].
// If x is present then return the index,
// otherwise return -1
int search(int arr[], int n, int x)
{
    int i;
    for (i = 0; i < n; i++) {
        if (arr[i] == x)
            return i;
    }
    return -1;
}

// Driver's Code
int main()
{
    int arr[] = { 1, 10, 30, 15 };
    int x = 30;
    int n = sizeof(arr) / sizeof(arr[0]);

    // Function call
    cout << x << " is present at index "
         << search(arr, n, x);

    return 0;
}
C
// C implementation of the approach
#include <stdio.h>

// Linearly search x in arr[].
// If x is present then return the index,
// otherwise return -1
int search(int arr[], int n, int x)
{
    int i;
    for (i = 0; i < n; i++) {
        if (arr[i] == x)
            return i;
    }
    return -1;
}

/* Driver's code */
int main()
{
    int arr[] = { 1, 10, 30, 15 };
    int x = 30;
    int n = sizeof(arr) / sizeof(arr[0]);

    // Function call
    printf("%d is present at index %d", x,
           search(arr, n, x));
    getchar();
    return 0;
}
Java
// Java implementation of the approach
public class GFG {

    // Linearly search x in arr[]. If x is present then
    // return the index, otherwise return -1
    static int search(int arr[], int n, int x)
    {
        int i;
        for (i = 0; i < n; i++) {
            if (arr[i] == x) {
                return i;
            }
        }
        return -1;
    }

    /* Driver's code */
    public static void main(String[] args)
    {
        int arr[] = { 1, 10, 30, 15 };
        int x = 30;
        int n = arr.length;

        // Function call
        System.out.printf("%d is present at index %d", x,
                          search(arr, n, x));
    }
}
C#
// C# implementation of the approach
using System;

public class GFG {

    // Linearly search x in arr[]. If x is present then
    // return the index, otherwise return -1
    static int search(int[] arr, int n, int x)
    {
        int i;
        for (i = 0; i < n; i++) {
            if (arr[i] == x) {
                return i;
            }
        }
        return -1;
    }

    /* Driver's code */
    public static void Main()
    {
        int[] arr = { 1, 10, 30, 15 };
        int x = 30;
        int n = arr.Length;

        // Function call
        Console.WriteLine(x + " is present at index "
                          + search(arr, n, x));
    }
}
JavaScript
// JavaScript implementation of the approach

// Linearly search x in arr. If x is present then
// return the index, otherwise return -1
function search(arr, n, x) {
    var i;
    for (i = 0; i < n; i++) {
        if (arr[i] == x) {
            return i;
        }
    }
    return -1;
}

/* Driver program to test the above function */
var arr = [ 1, 10, 30, 15 ];
var x = 30;
var n = arr.length;

document.write(x + " is present at index " + search(arr, n, x));
PHP
<?php
// PHP implementation of the approach

// Linearly search x in arr[]. If x
// is present then return the index,
// otherwise return -1
function search($arr, $n, $x)
{
    for ($i = 0; $i < $n; $i++)
    {
        if ($arr[$i] == $x)
            return $i;
    }
    return -1;
}

// Driver's Code
$arr = array(1, 10, 30, 15);
$x = 30;
$n = sizeof($arr);

// Function call
echo $x . " is present at index " .
     search($arr, $n, $x);
Python3
# Python 3 implementation of the approach

# Linearly search x in arr[]. If x is present
# then return the index, otherwise return -1
def search(arr, x):
    for index, value in enumerate(arr):
        if value == x:
            return index
    return -1

# Driver's Code
if __name__ == '__main__':
    arr = [1, 10, 30, 15]
    x = 30

    # Function call
    print(x, "is present at index",
          search(arr, x))
Output: 30 is present at index 2
Time Complexity Analysis: (In Big-O notation)
- Best Case: O(1). This occurs when the element to be searched is at the first index of the given list, so the number of comparisons in this case is 1.
- Average Case: O(n). On average (for example, when the element to be searched is around the middle index of the given list), about n/2 comparisons are needed, which is still linear in n.
- Worst Case: O(n). This occurs when:
- The element to be searched is at the last index
- The element to be searched is not present in the list
2. In this example, we take an array of length (n) and deal with the following cases:
- If (n) is even, then our output will be 0.
- If (n) is odd, then our output will be the sum of the elements of the array.
Below is the implementation of the given problem:
C++
// C++ implementation of the approach
#include <bits/stdc++.h>
using namespace std;

int getSum(int arr[], int n)
{
    if (n % 2 == 0) // (n) is even
    {
        return 0;
    }
    int sum = 0;
    for (int i = 0; i < n; i++) {
        sum += arr[i];
    }
    return sum; // (n) is odd
}

// Driver's Code
int main()
{
    // Declaring two arrays, one of even length and the other of odd length
    int arr[4] = { 1, 2, 3, 4 };
    int a[5] = { 1, 2, 3, 4, 5 };

    // Function calls
    cout << getSum(arr, 4)
         << endl; // prints 0 because (n) is even
    cout << getSum(a, 5)
         << endl; // prints the sum because (n) is odd
}
// This code is contributed by Suruchi Kumari
Java
// Java implementation of the approach
public class GFG {

    static int getsum(int arr[], int n)
    {
        if (n % 2 == 0) // if (n) is even
        {
            return 0;
        }
        int sum = 0;
        for (int i = 0; i < n; i++) {
            sum += arr[i];
        }
        return sum; // if (n) is odd
    }

    /* Driver's code */
    public static void main(String[] args)
    {
        // Declaring an array of even length
        int arr1[] = { 1, 2, 3, 4 };
        int n1 = arr1.length;

        // Declaring an array of odd length
        int arr2[] = { 1, 2, 3, 4, 5 };
        int n2 = arr2.length;

        // Function calls
        System.out.println(getsum(arr1, n1)); // prints 0 because (n) is even
        System.out.println(getsum(arr2, n2)); // prints the sum of the array because (n) is odd
    }
}
// This code is contributed by Syed Maruf Ali (Sdmrf)
C#
using System;

public class Gfg {

    static int getSum(int[] arr, int n)
    {
        if (n % 2 == 0) // (n) is even
        {
            return 0;
        }
        int sum = 0;
        for (int i = 0; i < n; i++) {
            sum += arr[i];
        }
        return sum; // (n) is odd
    }

    // Driver's Code
    public static void Main(string[] args)
    {
        // Declaring two arrays, one of even length and the other of odd length
        int[] arr = { 1, 2, 3, 4 };
        int[] a = { 1, 2, 3, 4, 5 };

        // Function calls
        Console.Write(getSum(arr, 4) + "\n"); // prints 0 because (n) is even
        Console.Write(getSum(a, 5));          // prints the sum because (n) is odd
    }
}
// This code is contributed by poojaagarwal2.
JavaScript
// JavaScript implementation of the approach
function getSum(arr, n) {
    if (n % 2 == 0) {
        // (n) is even
        return 0;
    }
    var sum = 0;
    for (var i = 0; i < n; i++) {
        sum += arr[i];
    }
    return sum; // (n) is odd
}

// Driver's Code
// Declaring two arrays, one of even length and the other of odd length
var arr = [1, 2, 3, 4];
var a = [1, 2, 3, 4, 5];

// Function calls
console.log(getSum(arr, 4)); // prints 0 because (n) is even
console.log(getSum(a, 5));   // prints the sum because (n) is odd
// This code is contributed by satwiksuman.
PHP
<?php
// PHP implementation of the approach
function getSum($arr, $n) {
    if ($n % 2 == 0) { // (n) is even
        return 0;
    }
    $sum = 0;
    for ($i = 0; $i < $n; $i++) {
        $sum += $arr[$i];
    }
    return $sum; // (n) is odd
}

// Driver's Code
$arr1 = [1, 2, 3, 4];    // Declaring an array of even length
$n1 = count($arr1);
$arr2 = [1, 2, 3, 4, 5]; // Declaring an array of odd length
$n2 = count($arr2);

// Function calls
echo getSum($arr1, $n1) . "\n"; // prints 0 because (n) is even
echo getSum($arr2, $n2) . "\n"; // prints the sum of the array because (n) is odd
?>
Python3
# Python 3 implementation of the approach
def getsum(arr, n):
    if n % 2 == 0:  # if (n) is even
        return 0
    Sum = 0
    for i in range(n):
        Sum += arr[i]
    return Sum  # if (n) is odd

# Driver's Code
if __name__ == '__main__':
    arr1 = [1, 2, 3, 4]     # Declaring an array of even length
    n1 = len(arr1)
    arr2 = [1, 2, 3, 4, 5]  # Declaring an array of odd length
    n2 = len(arr2)

    # Function calls
    print(getsum(arr1, n1))  # prints 0 because (n) is even
    print(getsum(arr2, n2))  # prints the sum of the array because (n) is odd
# This code is contributed by Syed Maruf Ali
Time Complexity Analysis:
- Best Case: The order of growth will be constant, because in the best case we are assuming that (n) is even.
- Average Case: In this case, we assume that even and odd lengths are equally likely, therefore the order of growth will be linear (see the short calculation after this list).
- Worst Case: The order of growth will be linear, because in this case we are assuming that (n) is always odd.
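A quick calculation shows why, using the assumption above that even and odd lengths are equally likely:
\text{Average Case Time} = \frac{1}{2} \cdot \Theta(1) + \frac{1}{2} \cdot \Theta(n) = \Theta(n)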
For more details, please refer: Design and Analysis of Algorithms.
Worst, Average, and Best Case Analysis of Algorithms is a technique used to analyze the performance of algorithms under different conditions. Here are some advantages, disadvantages, important points, and reference books related to this analysis technique:
Advantages:
- This technique allows developers to understand the performance of algorithms under different scenarios, which can help in making informed decisions about which algorithm to use for a specific task.
- Worst case analysis provides a guarantee on the upper bound of the running time of an algorithm, which can help in designing reliable and efficient algorithms.
- Average case analysis provides a more realistic estimate of the running time of an algorithm, which can be useful in real-world scenarios.
Disadvantages:
- This technique can be time-consuming and requires a good understanding of the algorithm being analyzed.
- Worst case analysis does not provide any information about the typical running time of an algorithm, which can be a disadvantage in real-world scenarios.
- Average case analysis requires knowledge of the probability distribution of input data, which may not always be available.
Important points:
- The worst case analysis of an algorithm provides an upper bound on the running time of the algorithm for any input size.
- The average case analysis of an algorithm provides an estimate of the running time of the algorithm for a random input.
- The best case analysis of an algorithm provides a lower bound on the running time of the algorithm for any input size.
- The big O notation is commonly used to express the worst case running time of an algorithm.
- Different algorithms may have different best, average, and worst case running times.
Reference books:
- “Introduction to Algorithms” by Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein is a comprehensive guide to algorithm analysis, including worst, average, and best case analysis.
- “Algorithm Design” by Jon Kleinberg and Éva Tardos provides a modern approach to algorithm design, with a focus on worst case analysis.
- “The Art of Computer Programming” by Donald Knuth is a classic text on algorithms and programming, which includes a detailed discussion of worst case analysis.
- “Algorithms Unlocked” by Thomas H. Cormen provides a gentle introduction to algorithm analysis, including worst, average, and best case analysis.