**The Coder's Handbook**

**Sorts and Searches**

**BIG O**

Algorithms

An **algorithm** is an unambiguous, executable, and terminating specification of a way to solve a problem. Every algorithm you write has a run-time: it can be fast, slow, or somewhere in between. What programmers really care about is *how well your program scales* as it deals with larger amounts of data.

Big O Notation

Let's imagine you are writing a program that needs to traverse a list of numbers. We'll use the letter n to designate how many numbers are in the list.

How does your program's runtime relate to n? To describe this behavior, we give it a **Big O** value, which represents how its worst-case run-time grows as n grows.

Let's consider some examples:

- **O(n)** - As n gets bigger, your program's runtime increases linearly.
- **O(n²)** - As n gets bigger, your program's runtime increases quadratically! Your code probably has a nested loop.
- **O(log n)** - As n gets bigger, your program only takes a little bit longer. You're doing something very efficient that likely cuts the problem in half each time. *Note: in computer science, we are always talking about log base 2.*
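To make these tiers concrete, here is a minimal Python sketch with one toy function per growth rate. The function names are made up for illustration; they aren't from any particular library.

```python
# Toy functions illustrating each tier above (illustrative sketch only).

def sum_list(nums):
    """O(n): one pass through the list."""
    total = 0
    for x in nums:
        total += x
    return total

def has_duplicate(nums):
    """O(n²): a nested loop compares every pair."""
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] == nums[j]:
                return True
    return False

def halving_steps(n):
    """O(log n): the problem is cut in half each step."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps
```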

As The Limit Approaches Infinity...

Think of Big O as describing the behavior of a mathematical expression as n approaches infinity.

For example, we could say that the following expression is O(n²):

3n² + 78n + 1542

This is because as n gets bigger and bigger, the other terms become less significant.

3n² + 78n + 1542

≅ 3n² + 78n

≅ 3n²

≅ n²
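If you want to see this for yourself, the short Python snippet below (our own illustration, using only the expression above) prints what fraction of the total comes from the 3n² term as n grows:

```python
# The 3n² term accounts for nearly all of 3n² + 78n + 1542 once n is large.
for n in (10, 1000, 100000):
    total = 3 * n**2 + 78 * n + 1542
    print(n, total, 3 * n**2 / total)  # this ratio approaches 1
```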

Why use Big O?

Big O is a useful shorthand for roughly how well or poorly an algorithm scales. Think of it as describing what sort of "tier" of efficiency an algorithm is on.

Keep in mind that several different algorithms can share the same Big O yet differ in their average performance within that tier.

Ex: A Bubble Sort and an Insertion Sort are both O(n²), but the latter is usually much faster.

On the flip side, it's possible for an algorithm with an inefficient Big O to beat a more efficient one when the data happens to be arranged just right, or when n is small.

Ex: A bubble sort performs very well when the elements are only slightly out of order.

Ex: A merge sort O(n log n) is often slower than an insertion sort O(n²) when n is very small, because merge sort has some overhead that only pays off with large values of n.

**SEARCHING**

Linear Search - O(n)

Go through the elements in order until you find the item. Also known as Sequential Search.
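A minimal Python sketch of the idea (the function name and return convention are our own choices):

```python
def linear_search(items, target):
    """Check each element in order; return its index, or -1 if absent. O(n)."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1
```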

Binary Search - O(log n)

Requires a sorted list. Start in the middle and keep guessing based on whether your last guess was too high or too low.
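Here is one way to sketch it in Python, assuming the input list is already sorted in ascending order:

```python
def binary_search(sorted_items, target):
    """Return the index of target in a sorted list, or -1 if absent. O(log n)."""
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:  # guess was too low: search the right half
            low = mid + 1
        else:                             # guess was too high: search the left half
            high = mid - 1
    return -1
```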

**SORTING**

Bubble Sort - O(n²)

Keep making passes through the data, swapping adjacent elements, until the list is sorted.
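A possible in-place Python version (details vary between textbooks):

```python
def bubble_sort(items):
    """Keep passing through the list, swapping adjacent out-of-order pairs. O(n²)."""
    swapped = True
    while swapped:
        swapped = False
        for i in range(len(items) - 1):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
```

Note that this version stops as soon as a full pass makes no swaps, which is why bubble sort does so well on nearly-sorted data.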

Selection Sort - O(n²)

Look for the lowest element and put it in the first spot, swapping with what is there. Keep going.
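A minimal Python sketch, sorting in place:

```python
def selection_sort(items):
    """Find the lowest remaining element and swap it into the next spot. O(n²)."""
    for i in range(len(items)):
        lowest = i
        for j in range(i + 1, len(items)):
            if items[j] < items[lowest]:
                lowest = j
        items[i], items[lowest] = items[lowest], items[i]
```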

Insertion Sort - O(n²)

Take each element and move it forward to its correct spot, bumping the other data back.
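One way this might look in Python:

```python
def insertion_sort(items):
    """Move each element forward to its correct spot, bumping larger data back. O(n²)."""
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        while j >= 0 and items[j] > current:  # bump larger elements back one spot
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = current
```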

Merge Sort - O(n log n)

Divide the data in half repeatedly then put it back together.
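A simple recursive Python sketch that returns a new sorted list rather than sorting in place:

```python
def merge_sort(items):
    """Split the list in half, sort each half, then merge the halves. O(n log n)."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])

    # Merge the two sorted halves back together.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

The repeated halving is where the log n factor comes from, and each level of merging does O(n) work.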

**RESOURCES**

Tom Scott explains Big O, Bubble Sort, Insertion Sort, Quick Sort, Bogo Sort, and the real lesson of his internship.

Obama's take on sorting algorithms