order. So far, I've learned selection sort, insertion sort, quick sort, merge sort, counting sort, Timsort, bubble sort and radix sort from lecture and reading materials.
The efficiency of a sorting algorithm depends on the number of input items being processed. In practice, bubble sort is rarely used to sort large, unsorted data because of its O(n²) complexity. When the input is small, we usually use selection sort or insertion sort, and we prefer insertion sort on almost-sorted data: both algorithms are O(n²) in the worst case, but only insertion sort has an O(n) best case.
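A minimal sketch of insertion sort illustrates why almost-sorted input is its best case: when each element is already near its place, the inner shifting loop barely runs, giving O(n) overall.

```python
def insertion_sort(items):
    """Sort a list in place. O(n^2) worst case, O(n) on already-sorted input."""
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        # Shift larger elements one slot right until key's position is found.
        # On nearly sorted input this loop exits almost immediately.
        while j >= 0 and items[j] > key:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = key
    return items
```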
The practical algorithms are quick sort and merge sort, both with average complexity O(n log n). The idea of quick sort is to choose a pivot, move everything less than the pivot into a list Left and everything greater than the pivot into a list Right, quicksort Left and Right recursively, and finally return Left + pivot + Right. For this algorithm, the crucial part is choosing a good pivot: the median would be ideal, while a poor pivot can degrade the algorithm to its O(n²) worst case. The idea of merge sort is to divide the list in half (Left and Right), merge-sort the two halves recursively, and finally use the merge function to combine the sorted halves.
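The two descriptions above can be sketched directly in Python; this is the simple list-building version (not the in-place partition used in practice), with a naive first-element pivot to keep the code short:

```python
def quick_sort(items):
    # Base case: lists of length 0 or 1 are already sorted.
    if len(items) <= 1:
        return items
    pivot = items[0]  # naive pivot; a median-of-three pick avoids the O(n^2) worst case on sorted input
    left = [x for x in items[1:] if x <= pivot]
    right = [x for x in items[1:] if x > pivot]
    return quick_sort(left) + [pivot] + quick_sort(right)

def merge(left, right):
    # Merge two sorted lists into one sorted list in linear time.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

def merge_sort(items):
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    # Sort each half recursively, then merge the two sorted halves.
    return merge(merge_sort(items[:mid]), merge_sort(items[mid:]))
```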
Timsort is derived from insertion sort and merge sort, and it is Python's built-in sorting algorithm. The idea of Timsort is to split the input array into sub-arrays (ordered runs), sort each run with insertion sort, and finally merge the sorted runs to produce the result. Timsort's best case is O(n) and its worst case is O(n log n).
Sorting is an important area of study in computer science. To choose the most efficient algorithm for a particular list of data, we need to compare the different sorting methods with respect to their running time.