For functions f(n) and g(n), we say that "f(n) is Big-O of g(n)" if
there exist constants c > 0 and n0 such that
for all n ≥ n0, 0 ≤ f(n) ≤ c * g(n).
We write f(n) = O(g(n)), but the "=" does not have its usual meaning; it really says that f(n) belongs to the set of functions O(g(n)).
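For example, f(n) = 3n + 10 is O(n): take c = 4 and n0 = 10; then for all n ≥ 10, 3n + 10 ≤ 3n + n = 4n. (These particular constants are our choice for illustration; any c and n0 satisfying the definition will do.)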
The intention is to allow us to ignore constant factors when describing running times.
Technically, 10^1,000,000 * n is O(n), but programs with that
running time are still very slow.
Asymptotic running times for n items:

| Data structure | Insert | Search | Delete |
|---|---|---|---|
| Sorted array | O(n) | O(log n) | O(n) |
| Sorted linked list | O(n) | O(n) | O(n)/O(1)* |

*Running time for deletion from a linked list depends on whether you
already have a pointer to the node to be deleted.
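The O(log n) search on a sorted array is binary search. A minimal Python sketch (the function name and details are our own, not from the notes):

```python
def binary_search(a, key):
    """Search sorted list a for key; return its index or -1. O(log n) time."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2       # middle of the remaining range
        if a[mid] == key:
            return mid
        elif a[mid] < key:
            lo = mid + 1           # key can only be in the right half
        else:
            hi = mid - 1           # key can only be in the left half
    return -1                      # not present
```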
Suppose we insert n items and do n searches:
If we have all n items beforehand, we can sort them once in O(n log n) time, then do n binary searches at O(log n) each, for O(n log n) time total (sketched below).
If not, we need binary search trees.
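A sketch of the "all items beforehand" case, reusing binary_search from above (the function name is ours):

```python
def offline_membership(items, queries):
    """Sort once in O(n log n), then answer each of the n queries
    in O(log n) by binary search, for O(n log n) total."""
    a = sorted(items)
    return [binary_search(a, q) >= 0 for q in queries]
```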
For each node in a binary search tree, every node in its left subtree (if any) holds
a smaller number and every node in its right subtree (if any) holds a larger number.
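A minimal unbalanced BST sketch in Python (our illustration; search and insert simply follow the ordering property above, so each takes O(height) time):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def bst_search(root, key):
    """Follow the ordering property down one root-to-leaf path: O(height)."""
    while root is not None:
        if key == root.key:
            return root
        root = root.left if key < root.key else root.right
    return None

def bst_insert(root, key):
    """Insert key and return the (possibly new) root: O(height)."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    elif key > root.key:
        root.right = bst_insert(root.right, key)
    return root                    # duplicates are ignored in this sketch
```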
There are several schemes for maintaining an O(log n) height for a binary search tree with n nodes: AVL trees, Red-Black trees, B-trees, 2-3 trees, ...
For such trees, insert, search, and delete each take O(log n) time.
Using a "good" hash function:
In the worst case, all items have the same hash index and a closed hash table degenerates into an unsorted array.
There are provably good hash functions, e.g., universal hash families, which guarantee O(1) expected time per operation.
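A minimal closed-hashing sketch (open addressing with linear probing); the class and method names are ours, and deletion and resizing are omitted:

```python
class ClosedHashTable:
    """Closed hashing: all keys live in one array; collisions probe forward.
    Expected O(1) per operation with a good hash function; O(n) worst case
    when every key lands in the same run of slots, as described above."""
    def __init__(self, capacity=101):
        self.slots = [None] * capacity  # sketch assumes far fewer keys than slots

    def _start(self, key):
        return hash(key) % len(self.slots)

    def insert(self, key):
        i = self._start(key)
        while self.slots[i] is not None and self.slots[i] != key:
            i = (i + 1) % len(self.slots)   # linear probing to the next slot
        self.slots[i] = key

    def search(self, key):
        i = self._start(key)
        while self.slots[i] is not None:    # stop at the first empty slot
            if self.slots[i] == key:
                return True
            i = (i + 1) % len(self.slots)
        return False
```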