Big O Notation (Algorithm)

Big O Notation[1]

This article focuses on its use in describing algorithms in computer science.


Big O Notation is a mathematical notation that describes the limiting behavior of a function as its argument tends towards a particular value or infinity. It belongs to the family of Bachmann–Landau notations[2], also known as asymptotic notations.


The letter O is used because the growth rate of a function is also referred to as the order of the function.
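
For example, the polynomial \(f(n) = 3n^2 + 5n + 2\) satisfies \(f(n) = O(n^2)\), because for all \(n \geq 1\)

$$3n^2 + 5n + 2 \;\leq\; 3n^2 + 5n^2 + 2n^2 \;=\; 10n^2,$$

so the formal definition of Big O given below is met with the constants \(k = 10\) and \(n_0 = 1\).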

Bachmann–Landau Notation Family

Small O (Small Oh): \(f(n) = o(g(n))\)
    Description: \(f\) is dominated by \(g\) asymptotically
    Formal definition: \(\forall k>0 \; \exists n_0 \; \forall n>n_0 :\; |f(n)| < k\cdot g(n)\)
    Limit definition: \(\lim_{n \to \infty} \frac{f(n)}{g(n)} = 0\)

Big O (Big Oh, Big Omicron): \(f(n) = O(g(n))\)
    Description: \(|f|\) is bounded above by \(g\) (up to a constant factor) asymptotically
    Formal definition: \(\exists k>0 \; \exists n_0 \; \forall n>n_0 :\; |f(n)| \leq k\cdot g(n)\)
    Limit definition: \(\limsup_{n \to \infty} \frac{|f(n)|}{g(n)} < \infty\)

Big Theta: \(f(n) = \Theta(g(n))\)
    Description: \(f\) is bounded both above and below by \(g\) asymptotically
    Formal definition: \(\exists k_1>0 \; \exists k_2>0 \; \exists n_0 \; \forall n>n_0 :\; k_1\cdot g(n) \leq f(n) \leq k_2\cdot g(n)\)
    Limit definition: \(f(n) = O(g(n))\) and \(f(n) = \Omega(g(n))\) (Knuth version)

On the order of: \(f(n) \sim g(n)\)
    Description: \(f\) is equal to \(g\) asymptotically
    Formal definition: \(\forall \varepsilon>0 \; \exists n_0 \; \forall n>n_0 :\; \left|\frac{f(n)}{g(n)} - 1\right| < \varepsilon\)
    Limit definition: \(\lim_{n \to \infty} \frac{f(n)}{g(n)} = 1\)

Big Omega in number theory (Hardy–Littlewood): \(f(n) = \Omega(g(n))\)
    Description: \(|f|\) is not dominated by \(g\) asymptotically
    Formal definition: \(\exists k>0 \; \forall n_0 \; \exists n>n_0 :\; |f(n)| \geq k\cdot g(n)\)
    Limit definition: \(\limsup_{n \to \infty} \left|\frac{f(n)}{g(n)}\right| > 0\)

Big Omega in complexity theory (Knuth): \(f(n) = \Omega(g(n))\)
    Description: \(f\) is bounded below by \(g\) asymptotically
    Formal definition: \(\exists k>0 \; \exists n_0 \; \forall n>n_0 :\; f(n) \geq k\cdot g(n)\)
    Limit definition: \(\liminf_{n \to \infty} \frac{f(n)}{g(n)} > 0\)

Small Omega: \(f(n) = \omega(g(n))\)
    Description: \(f\) dominates \(g\) asymptotically
    Formal definition: \(\forall k>0 \; \exists n_0 \; \forall n>n_0 :\; |f(n)| > k\cdot |g(n)|\)
    Limit definition: \(\lim_{n \to \infty} \left|\frac{f(n)}{g(n)}\right| = \infty\)

The limit definitions assume \(g(n) \neq 0\) for sufficiently large \(n\). The notations above are (partly) sorted from smallest to largest, in the sense that o, O, \(\Theta\), \(\sim\), (Knuth's version of) \(\Omega\), and \(\omega\) on functions correspond to <, ≤, ≈, =, ≥, and > on the real line. Computer science uses the Big O, Big Theta \(\Theta\), Small o, Small omega \(\omega\), and Knuth's Big Omega \(\Omega\) notations.
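
This correspondence between the notations and an ordering of growth rates can be illustrated numerically. The following Python sketch is a rough empirical illustration, not a proof; the sample point \(n = 10^6\), the cutoff ratios, and the function name suggest_classification are arbitrary choices for this example. It samples the ratio \(f(n)/g(n)\) at a large \(n\) and uses it to suggest whether \(f\) behaves like \(o(g)\), \(\Theta(g)\), or \(\omega(g)\):

import math

def suggest_classification(f, g, n=10**6):
    """Heuristically compare the growth of f against g by sampling the
    ratio f(n)/g(n) at a single large n.  A ratio near zero suggests
    f = o(g), a very large ratio suggests f = omega(g), and a moderate
    positive ratio suggests f = Theta(g).  Empirical hint only, not a proof."""
    ratio = f(n) / g(n)
    if ratio < 1e-3:
        return "f = o(g): ratio appears to tend to 0"
    if ratio > 1e3:
        return "f = omega(g): ratio appears to tend to infinity"
    return "f = Theta(g): ratio appears bounded away from 0 and infinity"

# 3n^2 + 5n + 2 versus n^2 -> Theta (the worked example above)
print(suggest_classification(lambda n: 3 * n**2 + 5 * n + 2, lambda n: n**2))

# n log n versus n^2 -> o
print(suggest_classification(lambda n: n * math.log(n), lambda n: n**2))

# n^3 versus n^2 -> omega
print(suggest_classification(lambda n: n**3, lambda n: n**2))

Because a single sample can be misleading (for example, for oscillating functions), a more careful check would examine the ratio over a range of values of \(n\).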


Figure 1. Big O Notation Poster

Internal Links

Parent Article: Computer Science Notes