While sorting algorithms are very important from a programming point of view, they also frequently occur in our daily practical lives. For example, each week several students in our lab distribute notes that they will present at the weekly seminar. Since the students distribute the notes independently, the notes invariably end up out of order. How do you put them back in order? It's a sorting problem.
One way would be to put the notes in a new pile, represented by a linked list (a simple array would also work). You'd insert the notes into the pile one by one, making sure each note goes in the appropriate place and does not upset the pre-existing order. What's the overhead of doing things this way? Well, for each note you have to search the pile and insert the note, which takes $O(n)$ if we don't do anything clever. Since you'll be doing that for all $n$ items, the total cost of sorting the notes is $O(n^2)$, which isn't great. In the world of programming, this is known as an insertion sort.
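To make the idea concrete, here is a minimal sketch of that insertion sort, using a plain Python list for the pile (the names are mine, not from the original write-up):

```python
def insertion_sort(items):
    """Sort by inserting each item into its place in a growing sorted pile."""
    pile = []
    for item in items:
        # Linear search for the insertion point: O(n) per item.
        i = 0
        while i < len(pile) and pile[i] <= item:
            i += 1
        pile.insert(i, item)  # the shift during insertion is also O(n)
    return pile
```

Both the search and the insertion are linear, so the whole thing is $O(n^2)$ regardless of which of the two dominates.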
You can improve each insertion by using a binary search tree to represent the pile. This is known as a tree sort, and on average it decreases each insertion to $O(\log(n))$ and the total sort time to $O(n \log(n))$. A self-balancing BST works best, since it guarantees $O(n \log(n))$ even in the worst case; a plain BST degrades to $O(n^2)$ when the input is already sorted, because the tree turns into a linked list.
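A tree sort with a plain (unbalanced) BST might look like this, as a rough sketch; inserting all items and then reading them back with an in-order traversal yields them in sorted order:

```python
class Node:
    """A node of an unbalanced binary search tree."""
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def bst_insert(root, value):
    """Insert value into the BST rooted at root; returns the (new) root."""
    if root is None:
        return Node(value)
    if value < root.value:
        root.left = bst_insert(root.left, value)
    else:
        root.right = bst_insert(root.right, value)
    return root

def inorder(node):
    """In-order traversal visits values in ascending order."""
    if node is None:
        return []
    return inorder(node.left) + [node.value] + inorder(node.right)

def tree_sort(items):
    root = None
    for item in items:
        root = bst_insert(root, item)
    return inorder(root)
```

With already-sorted input every `bst_insert` walks the whole right spine, which is exactly the $O(n^2)$ worst case a self-balancing tree avoids.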
If you want $O(n \log(n))$ sort performance, but don't feel like mentally implementing an auto-balancing BST to sort a couple of notes, there are some alternatives. My personal favorite is merge sort, which, incidentally, is this week's topic. Merge sort is a divide-and-conquer algorithm that is best defined recursively: divide the items into two equally-sized piles, recursively sort each pile, then merge the sorted piles back together. The recursion terminates when a pile cannot be divided any further (i.e. contains fewer than two elements). What's the computational complexity of merge sort? Since you halve the input at each recursion depth, the total depth of the recursion is $\lceil \log_2(n) \rceil$. At each recursion depth, you also have to merge pairs of sorted piles, which can be done in linear time across the whole level. The overall sort thus takes $O(n \log(n))$ (since we don't care about the base of the logarithm).
This week's problem was to implement the merge step of the merge sort. To make things more interesting, we're merging an arbitrary number of piles of arbitrary size, instead of two piles of roughly the same size. Here it is, in Python (for a change):
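The original solution isn't reproduced here, but a sketch of such a $k$-way merge might use a heap holding the current head of each pile, so each of the $n$ total items is pushed and popped once, for $O(n \log k)$ overall (the function name and tuple layout are my own assumptions):

```python
import heapq

def merge_piles(piles):
    """Merge any number of already-sorted piles into one sorted list."""
    merged = []
    # Seed the heap with the head of each non-empty pile. The tuple is
    # (value, pile index, index within pile) so ties break deterministically.
    heap = [(pile[0], i, 0) for i, pile in enumerate(piles) if pile]
    heapq.heapify(heap)
    while heap:
        value, pile_idx, item_idx = heapq.heappop(heap)
        merged.append(value)
        # Replace the popped item with the next one from the same pile.
        if item_idx + 1 < len(piles[pile_idx]):
            heapq.heappush(heap, (piles[pile_idx][item_idx + 1],
                                  pile_idx, item_idx + 1))
    return merged
```

For just two piles this reduces to the familiar two-finger merge; the heap only starts paying off as the number of piles grows.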
Since I just couldn't help myself, I went ahead and implemented the entire merge sort in Python. It doesn't share any code with the merge step mentioned above, but the principles are the same.
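That full merge sort isn't shown here either; one straightforward way to write it, with a two-pile merge that allocates a temporary list, is:

```python
def merge_sort(items):
    """Recursive merge sort: split, sort each half, merge in linear time."""
    # Base case: a pile with fewer than two elements is already sorted.
    if len(items) < 2:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge the two sorted halves into a temporary list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # at most one of these
    merged.extend(right[j:])  # two extends does anything
    return merged
```

Note the `merged` list here is exactly the temporary storage discussed next.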
I'm using a temporary list to perform the merging. It's possible to perform an in-place merge sort that doesn't require such temporary storage, but I haven't gotten around to it yet.