## 1. Selection Sort

### Why do we learn about sorting

#### What are insertion and selection sort?

We learn many algorithms without even realizing they are algorithms. For example, "long multiplication" is often taught in early grade school as an algorithm for multiplying large numbers. Typically, you memorize a small multiplication table for single-digit numbers, and then apply the long multiplication algorithm to multiply larger numbers. Similarly, anyone who has sorted a stack of playing cards has probably used a sorting algorithm. Most people use what is called **insertion sort**: they maintain a stack of sorted cards, initially just one card. Then they take the next (2nd) unsorted card and put it in front of or behind the 1st card, depending on order. Then they take the 3rd unsorted card and put it into the proper position among the first 2. They continue this until all 52 cards are sorted.
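The card-sorting procedure above can be sketched in code. Below is a minimal insertion sort on an array of ints; the class and variable names are illustrative, not part of any library:

```java
public class InsertionSortDemo {
    // Sorts x in place by growing a sorted prefix one element at a
    // time, like inserting each new card into an already-sorted hand.
    static int[] insertionSort(int[] x) {
        for (int i = 1; i < x.length; i++) {
            int card = x[i];
            int j = i - 1;
            // Shift larger elements right to open a slot for 'card'.
            while (j >= 0 && x[j] > card) {
                x[j + 1] = x[j];
                j--;
            }
            x[j + 1] = card;
        }
        return x;
    }

    public static void main(String[] args) {
        int[] x = {7, 2, 9, 4, 1};
        for (int v : insertionSort(x)) {
            System.out.println(v);
        }
    }
}
```

Notice that, unlike selection sort, insertion sort never looks ahead at the unsorted cards; it only decides where the current card belongs among the ones already sorted.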

Some people might sort cards a different way, using what is called a **selection sort**. First, we give a number to each card. The Ace of Clubs will be 1, the 2 of Clubs will be 2, and the King of Clubs will be 13. The Ace of Diamonds is 14 and the King of Diamonds is 26. The Ace of Hearts is 27 and the King of Hearts is 39. The Ace of Spades is 40, and finally the King of Spades is 52. Another way to describe this numbering/ordering is "sorted first alphabetically by suit: clubs, diamonds, hearts, spades, and then numerically within each suit with Ace low and King high". Once we mentally know the order in which we want these cards sorted, we keep a "sorted pile". We go through our unsorted pile to find the King of Spades, and once it is found we move it to the sorted pile. Then, we find the Queen of Spades. We continue this until all cards have been found and moved.
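This numbering has a simple formula: number the suits 0 through 3 in alphabetical order and the ranks 1 (Ace) through 13 (King), and each card's position is `suit * 13 + rank`. A small sketch (class and method names are ours, chosen for illustration):

```java
public class CardOrder {
    // Suits in alphabetical order, matching the text.
    static final String[] SUITS = {"Clubs", "Diamonds", "Hearts", "Spades"};

    // suit: 0..3 (index into SUITS); rank: 1 (Ace) through 13 (King).
    // Returns the card's position from 1 to 52 in the full ordering.
    static int cardNumber(int suit, int rank) {
        return suit * 13 + rank;
    }

    public static void main(String[] args) {
        System.out.println("Ace of Clubs    -> " + cardNumber(0, 1));   // 1
        System.out.println("King of Clubs   -> " + cardNumber(0, 13));  // 13
        System.out.println("Ace of Diamonds -> " + cardNumber(1, 1));   // 14
        System.out.println("King of Spades  -> " + cardNumber(3, 13));  // 52
    }
}
```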

#### Why is sorting important?

Sorting is an extremely common task in computing. For example, it is often a prerequisite for efficient searching. Think of a library of books or a search engine. First, an index of the titles or descriptions of the books or web pages is sorted. Prior to computers, this was sometimes accomplished through a library card catalogue, where thousands of index cards represented the books. Then, if a visitor searches for the book "Flatland", they need only scan to the index cards beginning with F or Fla to find the card describing where the book is shelved. On the internet, large databases index websites by keywords and more complex search queries. That allows a search engine to instantly find the set of pages matching a query without scanning millions or billions of pages per search.
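The "scan to F or Fla" trick is what binary search does on a sorted array: it repeatedly halves the search range rather than examining every entry. A small sketch using the standard library's `Arrays.sort` and `Arrays.binarySearch` (the title list here is made up for illustration):

```java
import java.util.Arrays;

public class TitleSearch {
    public static void main(String[] args) {
        // A tiny "card catalogue": titles must be sorted before searching.
        String[] titles = {"Dracula", "Emma", "Flatland", "Ivanhoe", "Walden"};
        Arrays.sort(titles);

        // Binary search halves the range each step, like flipping straight
        // to the "F" section instead of scanning every card.
        int pos = Arrays.binarySearch(titles, "Flatland");
        System.out.println("Found at index " + pos);
    }
}
```

Note that `Arrays.binarySearch` only gives correct results on a sorted array, which is exactly why sorting comes first.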

Because sorting is so useful and so prevalent, dozens of different sorting algorithms have been created over the past century. Some are faster, some are slower, and many are tailored to specific needs. For example, a person sorting a deck of cards may be able to lay all the cards out on a table, taking up a lot of space. This may speed up their sorting, but it will not work in a tight spot without a table or floor space. A different algorithm might be necessary if a person must hold all 52 cards in their hand while sorting. Similarly, some algorithms, such as the insertion and selection sorts mentioned above, might be very fast for sorting ten or twenty cards, yet perform very poorly when sorting thousands or millions of items. Specialized algorithms are needed to handle those cases. Later on, you will find that the "speed" of an algorithm relative to the size of its input (e.g. the number of cards) is called the algorithm's computational complexity, and the amount of memory or space required to perform the algorithm is often called its space complexity. These topics will come up as you study **Big-O notation** and **Complexity Theory**.

### Selection Sort

#### Min + Swap

First step: **selection sort** finds the smallest number in an array and swaps it into the first position. Thus, it is a minimization followed by a swap:

```java
public class MyProgram {
    public static void main(String[] args) {
        int[] x = {42, 1303, 1323, 1171, 2, 1031, 1131, 2430, 683};

        // Find the smallest:
        int minIndex = 0; // This is an index here
        for (int i = 1; i < x.length; i++) {
            if (x[i] < x[minIndex]) {
                minIndex = i;
            }
        }

        // Swap x[0] with x[minIndex]:
        int tmp = x[0];
        x[0] = x[minIndex];
        x[minIndex] = tmp;

        // Now the array has 2 in the first position:
        for (int i = 0; i < x.length; i++) {
            System.out.println(x[i]);
        }
    }
}
```

#### Completing the sort

The overall algorithm works like this:

```java
// Swap the smallest element from index 0 to length-1 into position 0
// Swap the 2nd smallest element from index 1 to length-1 into position 1
// Continue until the array is sorted
```

Thus, we need to perform the previous algorithm within a loop:

```java
public class MyProgram {
    public static void main(String[] args) {
        int[] x = {42, 1303, 1323, 1171, 2, 1031, 1131, 2430, 683};

        for (int j = 0; j < x.length; j++) {
            // Find the jth smallest:
            int minIndex = j;
            for (int i = j + 1; i < x.length; i++) {
                if (x[i] < x[minIndex]) {
                    minIndex = i;
                }
            }

            // Swap x[j] with x[minIndex]:
            int tmp = x[j];
            x[j] = x[minIndex];
            x[minIndex] = tmp;
        }

        // Now the array is completely sorted:
        for (int i = 0; i < x.length; i++) {
            System.out.println(x[i]);
        }
    }
}
```

### Analyzing the complexity

We can see that if the array has 10 elements, then our first pass will perform 9 comparisons, followed by 8, then 7, then 6, then 5, and so on. This adds up to 45 comparisons for an array of length 10. It turns out that for n elements, this is n*(n-1)/2 comparisons (proof left to the reader). In computer science, we call this kind of algorithm an "O of n squared" algorithm, which can be written as O(n^2).
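We can check the n*(n-1)/2 claim empirically by instrumenting the sort with a counter. This sketch (names are ours) counts every `x[i] < x[minIndex]` comparison; note that selection sort makes the same number of comparisons regardless of the input order:

```java
public class SelectionSortCount {
    // Sorts x in place and returns the number of element comparisons made.
    static long sortAndCount(int[] x) {
        long comparisons = 0;
        for (int j = 0; j < x.length; j++) {
            int minIndex = j;
            for (int i = j + 1; i < x.length; i++) {
                comparisons++; // one comparison per inner-loop iteration
                if (x[i] < x[minIndex]) {
                    minIndex = i;
                }
            }
            int tmp = x[j];
            x[j] = x[minIndex];
            x[minIndex] = tmp;
        }
        return comparisons;
    }

    public static void main(String[] args) {
        int n = 10;
        int[] x = new int[n];
        for (int i = 0; i < n; i++) {
            x[i] = n - i; // reverse order; the count is the same for any input
        }
        // Prints 45, which is 10 * 9 / 2.
        System.out.println(sortAndCount(x));
    }
}
```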