# Quick Sort Algorithm

Quick Sort is a comparison-based, divide-and-conquer, in-place sorting algorithm that partitions its input around a pivot element and then sorts the resulting subsets recursively.

The idea behind quick sort is to place all elements smaller than the pivot to its left and all larger elements to its right. After each partitioning pass, the pivot element ends up at its final sorted position. The algorithm is then applied recursively to the left and right subsets.

The most important part of the algorithm is the partitioning routine, which divides the given input into two subsets around the pivot element and returns the location of the partitioning index.

Assuming we choose the first element as the pivot, the following steps outline the partitioning process:

1. Set $$start = left$$, $$end = right$$, $$pivot = input[left]$$.
2. While $$start \leq end$$, repeat:
   1. Increment start until you find an element greater than the pivot: while input[start] <= pivot; start++
   2. Decrement end until you find an element smaller than or equal to the pivot: while input[end] > pivot; end--
   3. If $$start < end$$, swap input[start] and input[end].
3. If input[left] > input[end], swap these two: every element to the left of the end index is smaller than the pivot, so end is the position meant for the pivot (input[left]).
4. Return $$end$$, the partitioning index.

Consider the following example: [3, 5, 1, 9, 7, 8] with the first element (3) as the pivot.

• Starting from index 0, we increment start until it points to index 1 (value 5), which is greater than 3. Similarly, starting from index 5 (value 8), we decrement end until it points to index 2 (value 1). As start < end, we swap these two: [3, 1, 5, 9, 7, 8]
• After the swap, the loop condition $$start \leq end$$ still holds, and another iteration begins. start is incremented again and now points to index 2 (value 5); end is decremented and now points to index 1 (value 1). As start < end no longer holds, there is no swap: [3, 1, 5, 9, 7, 8]
• After the main while loop terminates ($$start \nleq end$$), every element to the left of the end index is smaller than or equal to the pivot. As the pivot element (3) at input[left] is greater than the element at the end index (value 1), we swap these two and return the end index 1: [1, 3, 5, 9, 7, 8]

As we can see, the element at index 1 is now at its correct position: all elements to its left are smaller, and all elements to its right are larger. The algorithm then recursively performs the same operation on the two subsets.

The Java implementation of the partitioning routine is as follows:

```java
private int partition(int[] input, int start, int end){
    int pivot = input[start];
    // i starts at the first element after the pivot
    int i = start + 1;
    int j = end;

    while(i < j){
        // move toward the right until you find an
        // element greater than the pivot;
        // stop there, as that element needs to be moved
        // to the right of the pivot
        while(input[i] <= pivot && i < end){
            i++;
        }
        // move toward the left until you find an
        // element smaller than or equal to the pivot;
        // stop there, as that element needs to be
        // moved to the left of the pivot
        while(input[j] > pivot && j > start){
            j--;
        }
        // if i < j, swap these
        if(i < j){
            int temp = input[i];
            input[i] = input[j];
            input[j] = temp;
        }
    }
    if(pivot > input[j]){
        // j points to the location meant for the
        // pivot, as all the elements to the left
        // of j are smaller than the pivot and all
        // the elements to the right of j are larger
        int temp = input[j];
        input[j] = input[start];
        input[start] = temp;
    }
    return j;
}
```


And the main method that recursively performs the sorting on the left and right partitions:

```java
public void quickSort(int[] input, int start, int end){
    if(start < end){
        // find the partition location
        int partitionIndex = partition(input, start, end);
        // sort the first half
        quickSort(input, start, partitionIndex - 1);
        // sort the second half
        quickSort(input, partitionIndex + 1, end);
    }
}
```
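Putting the two routines together, here is a minimal self-contained sketch that sorts the example array from above (the class name `QuickSortDemo` and the `main` driver are illustrative, and the methods are made static for convenience):

```java
import java.util.Arrays;

public class QuickSortDemo {

    static int partition(int[] input, int start, int end) {
        int pivot = input[start];
        int i = start + 1;
        int j = end;
        while (i < j) {
            // scan right for an element greater than the pivot
            while (input[i] <= pivot && i < end) i++;
            // scan left for an element not greater than the pivot
            while (input[j] > pivot && j > start) j--;
            if (i < j) {
                int temp = input[i];
                input[i] = input[j];
                input[j] = temp;
            }
        }
        // place the pivot at its final position
        if (pivot > input[j]) {
            input[start] = input[j];
            input[j] = pivot;
        }
        return j;
    }

    static void quickSort(int[] input, int start, int end) {
        if (start < end) {
            int partitionIndex = partition(input, start, end);
            quickSort(input, start, partitionIndex - 1);
            quickSort(input, partitionIndex + 1, end);
        }
    }

    public static void main(String[] args) {
        int[] input = {3, 5, 1, 9, 7, 8};
        quickSort(input, 0, input.length - 1);
        System.out.println(Arrays.toString(input)); // [1, 3, 5, 7, 8, 9]
    }
}
```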


The best case complexity of this algorithm is $$O(n \log_2 n)$$, which occurs when the partition index divides the input into two equal-sized partitions, halving the problem size in every recursive call. This results in a recursion tree of height $$\log_2 n$$ with $$O(n)$$ operations at each level. Unfortunately, this is only possible when the chosen pivot is the median of its subarray, which is generally not the case.
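In the best case, the running time can be described by the recurrence (with $$c$$ a constant covering the partitioning work):

$$T(n) = 2\,T(n/2) + cn$$

Unrolling it gives $$\log_2 n$$ levels with $$cn$$ work per level, i.e., $$T(n) = O(n \log_2 n)$$.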

The worst case complexity of quick sort is $$O(n^2)$$, which occurs, for instance, when we sort an already sorted input: the choice of the first element as the pivot then always leads to two partitions where one has a single element and the other has all the rest. This produces a recursion tree of height $$\approx n$$ where we perform up to $$n$$ comparisons at every level, resulting in $$O(n^2)$$ complexity. The average case, by contrast, is $$O(n \log_2 n)$$, since random inputs tend to yield reasonably balanced partitions.
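In the worst case, each partitioning pass peels off only the pivot, so the recurrence becomes:

$$T(n) = T(n-1) + cn$$

which sums to $$c\,(n + (n-1) + \dots + 1) = O(n^2)$$.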

There are a few variations that reduce the chance of hitting this worst case:

• Choosing the middle element as the pivot (swap the middle element with the first element and reuse the implementation above).
• Choosing a random element as the pivot in each call.
• Shuffling the input before applying the algorithm, so an already sorted input becomes unlikely.
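The random-pivot variation can be sketched as follows: swap a randomly chosen element into the first position, then reuse the first-element partition scheme unchanged. The class and method names here (`RandomPivotQuickSort`, `randomizedPartition`) are illustrative; the partitioning logic itself is the same as above.

```java
import java.util.Random;

public class RandomPivotQuickSort {
    private static final Random RAND = new Random();

    // Swap a random element into position `start`, then
    // partition exactly as before around input[start].
    static int randomizedPartition(int[] input, int start, int end) {
        int pivotIndex = start + RAND.nextInt(end - start + 1);
        int temp = input[start];
        input[start] = input[pivotIndex];
        input[pivotIndex] = temp;
        return partition(input, start, end);
    }

    static int partition(int[] input, int start, int end) {
        int pivot = input[start];
        int i = start + 1, j = end;
        while (i < j) {
            while (input[i] <= pivot && i < end) i++;
            while (input[j] > pivot && j > start) j--;
            if (i < j) {
                int temp = input[i];
                input[i] = input[j];
                input[j] = temp;
            }
        }
        if (pivot > input[j]) {
            input[start] = input[j];
            input[j] = pivot;
        }
        return j;
    }

    static void quickSort(int[] input, int start, int end) {
        if (start < end) {
            int p = randomizedPartition(input, start, end);
            quickSort(input, start, p - 1);
            quickSort(input, p + 1, end);
        }
    }
}
```

With a random pivot, an already sorted input no longer triggers the degenerate one-element partitions, so the expected running time stays $$O(n \log_2 n)$$.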
