Asymptotic Analysis

    We use asymptotic notation to express the rate of growth of an algorithm’s running time as a function of the input size n.

    For the sake of simplicity, we drop the insignificant (lower-order) terms and the constant coefficients.

    • e.g. $$n^2$$ instead of $$n^2 + n + 100$$
    • Drop constant factors too, e.g. $$n^2$$ instead of $$6n^2$$ (see the worked bound below)
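
    As a minimal worked example of why this is justified (the constants 6 and 7 and the threshold n ≥ 11 are one valid choice, not the only one):

    $$
    6n^2 \le 6n^2 + n + 100 \le 7n^2 \quad \text{for all } n \ge 11,
    \qquad \text{so} \qquad 6n^2 + n + 100 = \Theta(n^2).
    $$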

    There are several forms of this notation:

    • Θ (Big-Theta) notation,
    • O (Big-O) notation,
    • and Ω (Big-Omega) notation.

    Θ($$n^0$$) = Θ(1), which means that the algorithm takes the same amount of time regardless of the input size. For example, finding the smallest element in an array sorted in ascending order takes constant time, since the smallest element must be at index 0.
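
    A minimal sketch of that constant-time case (the function name min_of_sorted_asc and the sample list are illustrative, not from the original text):

    ```python
    def min_of_sorted_asc(arr):
        """Return the smallest element of a non-empty array sorted in ascending order.

        Runs in Θ(1): no matter how large the array is, only index 0 is read.
        """
        return arr[0]


    if __name__ == "__main__":
        print(min_of_sorted_asc([2, 5, 9, 40]))  # -> 2, without scanning the array
    ```

    Common orders of growth, from slowest- to fastest-growing, are listed next (two of them are illustrated in code after the list):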

    • Θ(1)
    • Θ(lg n)
    • Θ(n·lg n)
    • Θ($$n^2$$)
    • Θ($$n^2$$lg n)
    • Θ($$n^3$$)
    • Θ($$2^n$$)
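
    To make two of these classes concrete, here is a hedged sketch (the names linear_search and binary_search and the sample data are illustrative) contrasting a Θ(n) scan with a Θ(lg n) search over a sorted list:

    ```python
    def linear_search(arr, target):
        """Θ(n) in the worst case: may inspect every element."""
        for i, value in enumerate(arr):
            if value == target:
                return i
        return -1


    def binary_search(arr, target):
        """Θ(lg n) in the worst case: halves the search range at each step.

        Requires arr to be sorted in ascending order.
        """
        lo, hi = 0, len(arr) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if arr[mid] == target:
                return mid
            if arr[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1


    if __name__ == "__main__":
        data = list(range(0, 100, 2))   # sorted ascending: 0, 2, ..., 98
        print(linear_search(data, 42))  # scans up to n elements -> 21
        print(binary_search(data, 42))  # needs only about lg n comparisons -> 21
    ```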

    Big-Theta expresses the growth of a function f(n) by a simplified function that bounds it from both above and below; in other words, it is a tight bound.
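
    Formally (this is the standard textbook definition, stated here for reference rather than taken from the text above):

    $$
    f(n) = \Theta(g(n)) \iff \exists\, c_1, c_2 > 0,\ n_0 > 0 :\quad
    0 \le c_1\, g(n) \le f(n) \le c_2\, g(n) \ \text{ for all } n \ge n_0 .
    $$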

    In contrast, Big-O gives only an upper bound (the worst-case scenario). It might be less precise than Big-Theta, but it is useful for reasoning about large values of n.
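
    The corresponding standard definition of Big-O keeps only the upper half of that bound:

    $$
    f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 > 0 :\quad
    0 \le f(n) \le c\, g(n) \ \text{ for all } n \ge n_0 .
    $$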

    OK, so what is the difference between these notations?