SORTING ALGOS

bhanu prakash
8 min read · Mar 22, 2021

What is sorting?

Sorting is a technique for arranging a group of items in a particular order, such as small to large (ascending order) or large to small (descending order), based on their values. Typical examples are sorting students by rank or marks, or sorting names in records alphabetically.

This is done with sorting algorithms. Most of these algorithms are based on comparisons, while some are not. They make day-to-day tasks easier by putting data in order.

There are, however, important aspects to keep in mind while using these sorting algorithms, such as their time and space complexities. Each algorithm has different uses.

The major advantages of sorting are that it reduces the complexity of a problem, saves time, and makes searching easier.

Complexity:

These algorithms can be compared and chosen based on their complexity (the time taken to execute the algorithm and the space required).

These complexities can be divided into

o Best case.

o Average case.

o Worst case.

We are generally interested in algorithms with low worst-case complexity.

There are many sorting algorithms like

v Merge sort

v Insertion Sort

v Quick Sort

v Selection Sort

v Bubble Sort

v Shell Sort

v Counting Sort

v Heap Sort

v Radix Sort

Now I will discuss a few of them: Bubble, Insertion, Selection, Quick, and Merge sort.

Selection Sort:

This algorithm is very simple. It finds the minimum value in the unsorted part of the array and swaps it with the first unsorted element. It is an in-place sorting algorithm and does not need extra space. It divides the array into two parts, sorted and unsorted: it finds the smallest element in the unsorted side (initially the complete array) and places it at the end of the sorted side (initially empty). It is an inefficient algorithm, as it has O(n²) complexity in all cases.

Algo:

o Select the first element of the array (the first position of the unsorted part).

o Compare the selected element with all the other elements in the array.

o During these comparisons, keep track of the position of the smallest element found, and swap it with the selected element at the end of the pass.

o Repeat the same procedure for the element in the next position until the entire array is sorted.

Selection sort:

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

procedure selectionSort(arr, n)
    // arr: array of items
    // n: size of the arr
    for i = 1 to n - 1
        // set current element as minimum
        min = i
        // find the smallest element in the unsorted part
        for j = i + 1 to n
            if arr[j] < arr[min] then
                min = j
            end if
        end for
        // swap the minimum element with the current element
        if min != i then
            swap arr[min] and arr[i]
        end if
    end for
end procedure

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Best case: Ω(n²)

Average case: Θ(n²)

Worst case: O(n²)

Advantage:

Its performance does not depend on the initial arrangement of the data.

Simple implementation.

Disadvantage:

Many comparisons are required to find the minimum in each pass, even if the array is already sorted.
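
To make this concrete, here is a minimal Python sketch of selection sort following the pseudocode above (the name selection_sort is just an illustrative choice):

------------------------------------------------------------

def selection_sort(arr):
    """Sort a list in place using selection sort and return it."""
    n = len(arr)
    for i in range(n - 1):
        # assume the current position holds the minimum
        min_idx = i
        # scan the unsorted part for a smaller element
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # move the minimum into the sorted part
        if min_idx != i:
            arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

print(selection_sort([64, 25, 12, 22, 11]))  # [11, 12, 22, 25, 64]

------------------------------------------------------------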

Bubble sort:

It is the simplest sorting algorithm. It is based on comparisons between adjacent elements: if two adjacent elements are not in order, they are swapped. This is an inefficient algorithm, as its average and worst case complexity is O(n²).

Algo:

o Compare the first two elements and sort them in ascending order.

o Compare the second and third elements to check which one is greater, and swap if necessary.

o Compare the third and fourth elements, and swap if necessary.

o Compare the fourth and fifth elements, and swap if necessary, continuing this way to the end of the array.

o Repeat these passes until no more swaps are required.

Bubble sort:

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

begin BubbleSort(arr)
    n = size of arr
    for i = 0 to n - 2
        swapped = false
        for j = 0 to n - 2 - i
            // compare adjacent elements and swap if out of order
            if arr[j] > arr[j+1]
                swap(arr[j], arr[j+1])
                swapped = true
            end if
        end for
        // stop early if no swaps were made in this pass
        if not swapped then break
    end for
    return arr
end BubbleSort

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Best case: Ω(n)

Average case: Θ(n²)

Worst case: O(n²)
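
Here is a minimal Python sketch of bubble sort following the pseudocode above; the early-exit flag is what gives the Ω(n) best case on an already sorted array (the name bubble_sort is just illustrative):

------------------------------------------------------------

def bubble_sort(arr):
    """Sort a list in place using bubble sort and return it."""
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        # push the largest remaining element towards the end
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        # no swaps means the array is already sorted
        if not swapped:
            break
    return arr

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]

------------------------------------------------------------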

Insertion sort:

It is also a simple algorithm. In simple terms, it works the way we sort playing cards in our hands. It is an efficient algorithm for small and almost-sorted arrays. Here, as in selection sort, the array is divided into two parts (sorted and unsorted). The unsorted part is scanned sequentially and each element is moved and inserted into its correct place in the sorted part, just like we do with playing cards.

Algo:

o Assume the first element of the list is in the sorted portion and all the remaining elements are in the unsorted portion.

o Take the first element from the unsorted portion and insert it into the sorted portion at its correct position.

o Repeat the above process until all the elements from the unsorted portion have been moved into the sorted portion.

Insertion sort:

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Insertion-sort(A)
    // A is indexed from 0 to n - 1
    for i = 1 to n - 1
        key ← A[i]
        j ← i - 1
        // shift larger elements of the sorted part one position right
        while j >= 0 and A[j] > key
            A[j+1] ← A[j]
            j ← j - 1
        end while
        // insert the key at its correct position
        A[j+1] ← key
    end for

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Best case: Ω(n)

Average case: θ(n2)

Worst case: O(n2)

Advantage

Efficient for small or almost-sorted arrays.

Simple implementation.

Disadvantage

It is less efficient for an array/list containing a large number of elements.

As the number of elements increases, performance becomes slow.
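
Here is a minimal Python sketch of insertion sort following the pseudocode above (the name insertion_sort is illustrative):

------------------------------------------------------------

def insertion_sort(arr):
    """Sort a list in place using insertion sort and return it."""
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # shift larger elements of the sorted part to the right
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        # insert the key at its correct position
        arr[j + 1] = key
    return arr

print(insertion_sort([12, 11, 13, 5, 6]))  # [5, 6, 11, 12, 13]

------------------------------------------------------------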

Quick sort:

It is one of the most widely used algorithms in the real world. It is based on the divide-and-conquer method. Its key step is partitioning the array around a chosen element called the pivot: all elements smaller than the pivot are moved to its low side and all greater elements to its high side, and this is repeated recursively in each low and high sub-array. It is an in-place algorithm and does not need extra space. It is an efficient algorithm, with best and average case complexity of O(n log(n)).

Algo:

o Take the last element as the pivot (not compulsory, we can choose any element).

o Create two variables, low and high, to point to locations in the array.

o The lower index is pointed to by low.

o The higher index is pointed to by high.

o Move low forward while the value at low is smaller than the pivot element.

o Move high backward while the value at high is greater than the pivot element.

o If the two steps above stop and low and high have not crossed, swap the elements at low and high.

o When high becomes less than or equal to low, the point where they meet is the final position of the pivot.

Quick sort:

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

quickSort(array, low, high)
    if low < high then
        // place the pivot in its final position
        pivot <- partition(array, low, high)
        quickSort(array, low, pivot - 1)
        quickSort(array, pivot + 1, high)
    end if

// Partition function (pivot = last element)
partition(array, low, high)
    pivotElement <- array[high]
    store <- low - 1
    for i <- low to high - 1
        if array[i] < pivotElement then
            store <- store + 1
            swap array[i] and array[store]
        end if
    end for
    // move the pivot just after the last smaller element
    swap array[store + 1] and array[high]
    return store + 1

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Best case: Ω(n log(n))

Average case: Θ(n log(n))

Worst case: O(n²)
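
A minimal Python sketch of quicksort using the last element as the pivot, following the pseudocode above (the names quick_sort and partition are illustrative):

------------------------------------------------------------

def quick_sort(arr, low=0, high=None):
    """Sort a list in place using quicksort with a last-element pivot."""
    if high is None:
        high = len(arr) - 1
    if low < high:
        # place the pivot in its final position, then recurse on both sides
        p = partition(arr, low, high)
        quick_sort(arr, low, p - 1)
        quick_sort(arr, p + 1, high)
    return arr

def partition(arr, low, high):
    """Move smaller elements to the left of the pivot (last element)."""
    pivot = arr[high]
    store = low - 1
    for i in range(low, high):
        if arr[i] < pivot:
            store += 1
            arr[store], arr[i] = arr[i], arr[store]
    # put the pivot just after the last smaller element
    arr[store + 1], arr[high] = arr[high], arr[store + 1]
    return store + 1

print(quick_sort([10, 7, 8, 9, 1, 5]))  # [1, 5, 7, 8, 9, 10]

------------------------------------------------------------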

Merge Sort:

It is also one of the most popular algorithms and is likewise based on the divide-and-conquer technique. It is a stable algorithm (it preserves the original order of equal elements) and an efficient one, with O(n log(n)) complexity in all cases, but it needs O(n) auxiliary space (it is not an in-place algorithm). The basic idea is to keep dividing the array into sub-arrays until each contains a single element and then merge them back, sorting the elements at the same time.

Algo:

o If the given array contains a single element, it is already sorted; simply return.

o Otherwise, split the array into two halves and call them recursively until each sub-array contains a single element.

o The merge function then performs the sorting and merges the halves back into the original array.

o Remember that while merging, the merge function uses an extra array to store the values and copies the sorted values back to the original array.

Merge sort:

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

MergeSort(arr, left, right):
    // stop when the sub-array has one element or none
    if left >= right
        return
    mid = (left + right) / 2
    MergeSort(arr, left, mid)
    MergeSort(arr, mid + 1, right)
    // merge the two sorted halves
    merge(arr, left, mid, right)
end

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Best case: Ω(n log(n))

Average case: Θ(n log(n))

Worst case: O(n log(n))

Advantage

It sorts any input in O(n log(n)) time, which is the best worst-case complexity among comparison-based sorting algorithms.

Disadvantage

But this process needs O(n) extra space to perform the merge procedure, which causes problems for large arrays.
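
Since the pseudocode above only calls merge(), here is a minimal Python sketch that also includes one possible merge step using a temporary list (the names merge_sort and merge are illustrative):

------------------------------------------------------------

def merge_sort(arr, left=0, right=None):
    """Sort a list in place using merge sort and return it."""
    if right is None:
        right = len(arr) - 1
    # stop when the sub-array has one element or none
    if left >= right:
        return arr
    mid = (left + right) // 2
    merge_sort(arr, left, mid)
    merge_sort(arr, mid + 1, right)
    merge(arr, left, mid, right)
    return arr

def merge(arr, left, mid, right):
    """Merge two sorted halves arr[left..mid] and arr[mid+1..right]."""
    tmp = []
    i, j = left, mid + 1
    while i <= mid and j <= right:
        if arr[i] <= arr[j]:  # <= keeps equal elements in order (stable)
            tmp.append(arr[i])
            i += 1
        else:
            tmp.append(arr[j])
            j += 1
    tmp.extend(arr[i:mid + 1])
    tmp.extend(arr[j:right + 1])
    # copy the sorted values back to the original array
    arr[left:right + 1] = tmp

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]

------------------------------------------------------------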
