Understanding time complexity using Big-O notation and examples

In this article we will discuss complexity and Big-O notation.

Complexity

The idea is to describe how many operations a process requires as a function of the size of the problem. Typically, we are given a collection of data (an array, a list, etc.) consisting of n elements. We refer to n as the size of the problem. An algorithm will perform (at most, depending on the stopping criteria) a number of operations as a function of n. If we call this function f(n), then we say the algorithm is O(f(n)). This is useful when comparing the efficiency of different algorithms. We summarize the common Big-O classes in the following table.

Name              Big-O Notation
Constant Time     O(1)
Logarithmic Time  O(log n)
Linear Time       O(n)
Quasilinear Time  O(n log n)
Quadratic Time    O(n²)

We describe each case below.

Constant Time

A constant time process is independent of the size of the problem. We call such an algorithm O(1).

Example

Accessing an element of an array requires only a single operation, regardless of the size of the array.
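As a minimal sketch (the helper name first is our own), reading the first element touches the array exactly once:

int first(int[] array) {

	// a single array access, regardless of array.length
	return array[0];
}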

Logarithmic Time

A logarithmic time process requires a number of operations on the order of a logarithmic function of the size of the problem.

Example

Binary search of a sorted array:

int search(int[] array, int value) {

	// define the start index
	int start = 0;

	// define the end index
	int end = array.length - 1;

	// loop until the search range is empty
	while (start <= end) {

		// compute the midpoint index between the start and end indices
		// (written this way to avoid integer overflow for large arrays)
		int mid = start + (end - start) / 2;

		// access the element at the midpoint index
		int test = array[mid];

		if (test == value) {

			// we found the element -
			// return the midpoint index
			return mid;

		} else if (test < value) {

			// the test element is smaller than the value -
			// continue the search starting at the index mid + 1
			start = mid + 1;

		} else {

			// the test element is greater than the value -
			// continue the search ending at the index mid - 1
			end = mid - 1;
		}
	}

	// the value was not found -
	// return -1
	return -1;
}
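For example, search(new int[]{1, 3, 5, 7}, 5) returns the index 2, while searching the same array for 4 returns -1.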

Since each iteration halves the size of the remaining problem, the algorithm performs on the order of log(n) operations. We call such an algorithm O(log n).

Linear Time

A linear time process requires a number of operations on the order of a linear function of the size of the problem. We call such an algorithm O(n).

Example

Summing the elements in an array:

int sum(int[] array) {

	// accumulate the total
	int sum = 0;

	// visit each of the n elements exactly once
	for (int i = 0; i < array.length; i++) {
		sum += array[i];
	}

	return sum;
}
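For example, sum(new int[]{1, 2, 3}) returns 6 after visiting each of the three elements once.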

Quasilinear Time

A quasilinear time process requires a number of operations on the order of a linear function n times a logarithmic function log(n) of the size of the problem. We call such an algorithm O(n log n). For example, any algorithm that iterates through the n indices and, at each iteration, performs an O(log n) operation is O(n log n).

Example

Quicksort (average case), sketched below.
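A minimal in-place quicksort sketch (the helper names quicksort and partition are our own; this uses the Lomuto partition scheme), called as quicksort(array, 0, array.length - 1):

// sort the array in place between the indices low and high
void quicksort(int[] array, int low, int high) {

	if (low < high) {

		// place the pivot in its final position
		int p = partition(array, low, high);

		// recursively sort the elements on each side of the pivot
		quicksort(array, low, p - 1);
		quicksort(array, p + 1, high);
	}
}

int partition(int[] array, int low, int high) {

	// choose the last element as the pivot
	int pivot = array[high];
	int i = low - 1;

	// move every element smaller than the pivot to the left side
	for (int j = low; j < high; j++) {
		if (array[j] < pivot) {
			i++;
			int tmp = array[i];
			array[i] = array[j];
			array[j] = tmp;
		}
	}

	// place the pivot between the two sides
	int tmp = array[i + 1];
	array[i + 1] = array[high];
	array[high] = tmp;
	return i + 1;
}

On average each partition splits the array roughly in half, so there are on the order of log(n) levels of recursion, each doing O(n) work, for O(n log n) in total. In the worst case (for example, an already sorted array with this pivot choice) it degrades to O(n²).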

Quadratic Time

A quadratic time process requires a number of operations on the order of a quadratic function of the size of the problem. Any algorithm that involves a doubly nested loop over the n indices is O(n²), since the inner body executes n times for each of the n outer iterations, or n² times in total.

Example

// N is the size of the problem

// loop through the indices
for (int i = 0; i < N; i++) {

	// do something

	// loop through the indices again
	for (int j = 0; j < N; j++) {

		// do something else
	}
}
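As a concrete sketch (the helper name countEqualPairs is our own), counting how many pairs of elements are equal compares every pair and is O(n²):

int countEqualPairs(int[] array) {

	int count = 0;

	// loop through the indices
	for (int i = 0; i < array.length; i++) {

		// compare the element at i with every later element
		for (int j = i + 1; j < array.length; j++) {

			if (array[i] == array[j]) {
				count++;
			}
		}
	}

	return count;
}

Even though the inner loop starts at i + 1, the method performs n(n - 1)/2 comparisons, which is still on the order of n².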

Conclusion

In this article we learned about Big-O notation and complexity in programming, with examples in Java.
