# Algorithms

## Reference
## Sort

### Easy ones
- Insertion sort
- Bubble sort
- Selection sort
### Merge sort
- Divide an array of elements into sub-arrays of elements [Divide]
- Sort each of the sub-arrays with Merge Sort [Solve]
- Merge the sorted sub-arrays [Conquer]
**Efficient merge**: Given sub-arrays of $n_1$ and $n_2$ elements, it merges them in $O(n_1 + n_2)$: compare a pair of elements from the beginning of the sub-arrays, put down the smaller one, and move on to the next. This is a correct way to merge only because these sub-arrays are already sorted.
**Time Complexity**: Dividing an array of $n$ elements takes $O(\log n)$ steps to reach the leaves. Merging takes $O(n)$ time per level, thus the time complexity is $O(n \log n)$.
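The three steps can be sketched as follows (a minimal sketch of my own; the slicing-based divide is for clarity, not in-place, and the merge follows the pair-comparison scheme described above):

```python
def merge_sort(a):
    """Sort list a, returning a new sorted list."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # Divide, then Solve each half
    right = merge_sort(a[mid:])
    # Conquer: merge two sorted sub-arrays in O(len(left) + len(right))
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:   # put down the smaller one, move on
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])       # one side may have leftovers
    merged.extend(right[j:])
    return merged
```

For example, `merge_sort([5, 2, 4, 7, 1, 3])` returns `[1, 2, 3, 4, 5, 7]`.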
### Partition (prep for Quick sort)
partition($A$, $p$, $r$) divides array $A[p \ldots r]$ into two sub-arrays $A[p \ldots q-1]$ and $A[q+1 \ldots r]$, and returns $q$. All the elements in the first subarray are smaller than or equal to the pivot $A[q]$, and those in the second subarray are larger than it.
![partition](/media/posts/algorithms/partition.jpg =400x)
At every step, $j$ always increments by one, and we decide which group $A[j]$ should go into.

:::message
How do we move $A[j]$ to the correct group?

- If $A[j] > x$, just increment $j$
- If $A[j] \le x$, increment $i$, and then exchange $A[i]$ and $A[j]$
:::
Time complexity is $O(n)$, as $j$ goes from $p$ to $r-1$.
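A sketch of this scan in Python (Lomuto-style partition with the pivot taken as $x = A[r]$; names $x$, $i$, $j$ match the description above):

```python
def partition(A, p, r):
    """Rearrange A[p..r] around the pivot x = A[r]; return its final index q."""
    x = A[r]
    i = p - 1                        # A[p..i] holds the elements <= x
    for j in range(p, r):            # j always increments by one
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]  # move A[j] into the first group
    A[i + 1], A[r] = A[r], A[i + 1]  # put the pivot between the two groups
    return i + 1

A = [3, 8, 1, 5, 4]
q = partition(A, 0, 4)
print(q, A)  # → 2 [3, 1, 4, 5, 8]
```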
### Quick sort
- Divide an array into two sub-arrays with Partition algorithm [Divide]
- Sort each of the sub-arrays with Quick Sort [Solve]
The average time complexity of Quick sort is $O(n \log n)$, and it's the fastest sorting algorithm in most cases. However, if you pick the pivot $x$ (in the figure above) in the same way all the time, it becomes $O(n^2)$ in the worst case.
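Putting the two steps together (the partition routine is repeated so the sketch runs standalone; the random pivot choice is my own addition to dodge the fixed-pivot worst case):

```python
import random

def partition(A, p, r):
    """Lomuto partition of A[p..r] around pivot x = A[r]; returns pivot index q."""
    x = A[r]
    i = p - 1
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    return i + 1

def quick_sort(A, p, r):
    """Sort A[p..r] in place."""
    if p < r:
        # Randomizing the pivot avoids the O(n^2) worst case that a
        # fixed pivot choice hits on already-sorted input.
        k = random.randint(p, r)
        A[k], A[r] = A[r], A[k]
        q = partition(A, p, r)   # Divide
        quick_sort(A, p, q - 1)  # Solve each sub-array
        quick_sort(A, q + 1, r)
```

Usage: `quick_sort(A, 0, len(A) - 1)` sorts `A` in place.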
## Tree

### Tree walk
- Preorder Tree walk: [root] -> [left subtree] -> [right subtree]
- Inorder Tree walk: [left subtree] -> [root] -> [right subtree]
- Postorder Tree walk: [left subtree] -> [right subtree] -> [root]
```python
# T maps a node id to a node with .left / .right child ids (None if absent)
def preorder(node):
    if node is None:
        return
    print(node)
    preorder(T[node].left)
    preorder(T[node].right)

def inorder(node):
    if node is None:
        return
    inorder(T[node].left)
    print(node)
    inorder(T[node].right)

def postorder(node):
    if node is None:
        return
    postorder(T[node].left)
    postorder(T[node].right)
    print(node)
```
Time complexity is $O(n)$, as it visits each node exactly once.
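As a quick sanity check, here is a tiny concrete `T` (the namedtuple representation is my own assumption about how `T` stores `.left`/`.right`):

```python
from collections import namedtuple

Node = namedtuple("Node", ["left", "right"])

# A three-node tree, keyed by node id:   1
#                                       / \
#                                      2   3
T = {1: Node(2, 3), 2: Node(None, None), 3: Node(None, None)}

def inorder(node):
    if node is None:
        return
    inorder(T[node].left)
    print(node)
    inorder(T[node].right)

inorder(1)  # prints 2, 1, 3 (one per line): left subtree, root, right subtree
```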
### Binary Heap
A binary heap is a complete binary tree that satisfies:

- max-heap (min-heap) property: the value of a node is smaller (larger) than or equal to its parent's

There are no ordering constraints between sibling nodes. A binary heap is represented as an array with 1-based indexing.
It has a very nice property that, given a node index $i$:

- parent: $\lfloor i/2 \rfloor$
- left child: $2i$
- right child: $2i + 1$

![binary heap](/media/posts/algorithms/binary_heap.jpg =150x)
:::message
Complete binary tree: all the leaves are filled from the left side, and the maximum difference of the leaves' depths is one or zero.
:::
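The index arithmetic can be sketched as follows (1-based, so index 0 of the Python list is left unused; the example array is my own):

```python
def parent(i): return i // 2
def left(i): return 2 * i
def right(i): return 2 * i + 1

# A max-heap stored from index 1 (index 0 unused)
heap = [None, 9, 7, 8, 3, 5, 6]
for i in range(2, len(heap)):
    assert heap[parent(i)] >= heap[i]  # every node is <= its parent
```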
## Dynamic Programming

### Longest Common Subsequence (LCS)
Given a pair of sequences $X$ and $Y$, LCS asks to return a longest common subsequence of them. NOTE: a subsequence need not be consecutive in the original sequence!!
Let's define the prefixes $X_i = \langle x_1, \ldots, x_i \rangle$ and $Y_j = \langle y_1, \ldots, y_j \rangle$, and think about the LCS of $X_i$ and $Y_j$; write $c[i][j]$ for its length. If $x_i = y_j$,

$$c[i][j] = c[i-1][j-1] + 1$$

If $x_i \neq y_j$,

$$c[i][j] = \max(c[i-1][j],\ c[i][j-1])$$
Time complexity is $O(mn)$, where $m$ and $n$ are the lengths of $X$ and $Y$.
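The recurrence translates directly into a table-filling sketch (0-based Python indices shift the 1-based $x_i$, $y_j$ by one):

```python
def lcs_length(X, Y):
    """Length of a longest common subsequence of X and Y, in O(mn) time."""
    m, n = len(X), len(Y)
    # c[i][j] = LCS length of the prefixes X[:i] and Y[:j]
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if X[i - 1] == Y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    return c[m][n]

print(lcs_length("ABCBDAB", "BDCABA"))  # → 4 (e.g., "BCBA")
```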
## Keywords

### Cracking the Coding Interview
#### Trees (+ Graphs)

- **Binary Tree vs. Binary Search Tree**
  - In a Binary Search Tree, every node (and its descendants) satisfies a specific ordering: all left descendants ≤ node < all right descendants
- **Balanced vs. Unbalanced**
  - Complete Binary Tree: every level of the tree is fully filled, except for perhaps the last level (all nodes are as far left as possible)
  - Full Binary Tree: every node has either zero or two children
  - Perfect Binary Tree: both full and complete
- **Binary Tree Traversal**
  - In-order traversal: left -> root -> right
  - Pre-order traversal: root -> left -> right
  - Post-order traversal: left -> right -> root
- **Binary Heaps (Min-Heaps and Max-Heaps)**
  - A binary heap is a complete binary tree
  - Min-Heap: every node is smaller than its children (thus, the root is the minimum)
  - Two key operations: insert and extract-min
    - Insert: append the new element at the bottom level, then swap it with its parent until it's in the correct order ("bubble up")
    - Extract-min: finding the min element is easy (it's always at the root); to remove it:
      - Replace the minimum element (i.e., the root) with the last element (bottommost, rightmost)
      - "Bubble down" the new root by swapping it with the smaller of its children until the heap property is restored
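The two operations above can be sketched with a 0-based array heap (a minimal min-heap of my own; Python's standard `heapq` module provides the same operations on a plain list):

```python
class MinHeap:
    def __init__(self):
        self.a = []

    def insert(self, x):
        """Append at the bottom level, then bubble up."""
        self.a.append(x)
        i = len(self.a) - 1
        while i > 0 and self.a[(i - 1) // 2] > self.a[i]:
            self.a[i], self.a[(i - 1) // 2] = self.a[(i - 1) // 2], self.a[i]
            i = (i - 1) // 2

    def extract_min(self):
        """Replace the root with the last element, then bubble down."""
        a = self.a
        root = a[0]
        last = a.pop()       # bottommost, rightmost element
        if a:
            a[0] = last
            i = 0
            while True:
                l, r = 2 * i + 1, 2 * i + 2
                smallest = i
                if l < len(a) and a[l] < a[smallest]:
                    smallest = l
                if r < len(a) and a[r] < a[smallest]:
                    smallest = r
                if smallest == i:  # heap property restored
                    break
                a[i], a[smallest] = a[smallest], a[i]
                i = smallest
        return root
```

Both operations walk one root-to-leaf path, so each is $O(\log n)$.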
- **Depth-First Search (DFS) and Breadth-First Search (BFS)**
  - Implementation:
    - DFS can be simply implemented with recursion
    - BFS is a bit trickier: use a queue
  - Pros and cons:
    - DFS: simpler to implement, uses less memory, but might not find the shortest path (even in an unweighted graph; why?)
    - BFS: preferred for finding a (shortest) path, but uses more memory (the queue)
  - To find the shortest path from `s` to `t` (assume running DFS / BFS from `s`), it's a good idea to stay close to `s` (BFS) rather than going far away (DFS)
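Both can be sketched on an adjacency-list graph (a dict of lists; the representation and function names are my own):

```python
from collections import deque

def dfs(graph, node, visited=None):
    """Recursive DFS; returns the visit order.
    (A set would make the membership test O(1); a list keeps the order.)"""
    if visited is None:
        visited = []
    visited.append(node)
    for nxt in graph[node]:
        if nxt not in visited:
            dfs(graph, nxt, visited)
    return visited

def bfs_shortest_path(graph, s, t):
    """BFS from s; returns a shortest path s -> t (unweighted), or None."""
    parent = {s: None}
    q = deque([s])
    while q:
        u = q.popleft()
        if u == t:                 # reconstruct the path via parent pointers
            path = []
            while u is not None:
                path.append(u)
                u = parent[u]
            return path[::-1]
        for v in graph[u]:
            if v not in parent:    # first visit is via a shortest route
                parent[v] = u
                q.append(v)
    return None

graph = {"s": ["a", "b"], "a": ["t"], "b": ["t"], "t": []}
print(bfs_shortest_path(graph, "s", "t"))  # → ['s', 'a', 't']
```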
- **Bidirectional Search**
  - To find the shortest path between `s` and `t`: run two simultaneous BFSs, one from `s` and one from `t`; when their frontiers collide, a shortest path has been found
  - Much more efficient than a single BFS for this purpose
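A sketch of the frontier-meeting idea (reachability only; keeping parent pointers on both sides would recover the actual path; the undirected adjacency-list representation is my own):

```python
from collections import deque

def bidirectional_search(graph, s, t):
    """Alternate BFS layers from s and t; return True when the frontiers meet."""
    if s == t:
        return True
    seen_s, seen_t = {s}, {t}
    q_s, q_t = deque([s]), deque([t])
    while q_s and q_t:
        # Expand the smaller frontier first
        if len(q_s) <= len(q_t):
            q, seen, other = q_s, seen_s, seen_t
        else:
            q, seen, other = q_t, seen_t, seen_s
        for _ in range(len(q)):        # expand exactly one BFS layer
            u = q.popleft()
            for v in graph[u]:
                if v in other:         # the two searches collided
                    return True
                if v not in seen:
                    seen.add(v)
                    q.append(v)
    return False
```

Each search only explores about half the path's depth, which is why this beats a single BFS from `s`.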