# Big-Ω (Big-Omega) notation

Sometimes, we want to say that an algorithm takes *at least* a certain amount of time, without providing an upper bound. We use big-Ω notation; that's the Greek letter "omega." If a running time is $\Omega(f(n))$, then for large enough $n$, the running time is at least $k \cdot f(n)$ for some constant $k$.
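The definition can be checked numerically. This sketch uses a hypothetical running-time function $T(n) = 3n^2 + 2n$ (not from any particular algorithm) and tests whether $T(n) \geq k \cdot f(n)$ holds for every $n$ at or beyond some threshold $n_0$; the helper `is_lower_bounded` is an illustration, not a standard library function:

```python
def T(n):
    """Hypothetical running time: T(n) = 3n^2 + 2n."""
    return 3 * n * n + 2 * n

def is_lower_bounded(T, f, k, n0, n_max=10_000):
    """Check the big-Omega condition T(n) >= k * f(n) for n0 <= n <= n_max."""
    return all(T(n) >= k * f(n) for n in range(n0, n_max + 1))

# T(n) is Omega(n^2): with k = 3 and n0 = 1, T(n) >= 3 * n^2 always holds.
print(is_lower_bounded(T, lambda n: n * n, k=3, n0=1))   # → True

# But T(n) is not Omega(n^3): n^3 eventually outgrows 3n^2 + 2n.
print(is_lower_bounded(T, lambda n: n ** 3, k=1, n0=1))  # → False
```

A finite check like this can't prove the bound for all $n$, but it is a quick way to build intuition about which $f(n)$ can serve as a lower bound.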

We say that the running time is "big-Ω of $f(n)$." We use big-Ω notation for **asymptotic lower bounds**, since it bounds the growth of the running time from below for large enough input sizes.

Just as $\Theta(f(n))$ automatically implies $O(f(n))$, it also automatically implies $\Omega(f(n))$. So we can say that the worst-case running time of binary search is $\Omega(\lg n)$.

We can also make correct, but imprecise, statements using big-Ω notation. For example, just as if you really do have a million dollars in your pocket, you can truthfully say "I have an amount of money in my pocket, and it's at least 10 dollars," you can also say that the worst-case running time of binary search is $\Omega(1)$, because it takes at least constant time.
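To see the binary search bound concretely, here is a minimal binary search instrumented with a step counter (the counter is added for illustration; the search itself is the standard algorithm). Searching for a value smaller than everything in the array forces the worst case, and the step count grows like $\lg n$:

```python
def binary_search(arr, target):
    """Standard binary search over a sorted list.
    Returns (index or -1, number of loop iterations)."""
    lo, hi = 0, len(arr) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

# Worst case (target absent): the search range halves each iteration,
# so the step count grows like lg n.
for n in (16, 256, 4096):
    _, steps = binary_search(list(range(n)), -1)
    print(n, steps)  # → 16 4, then 256 8, then 4096 12
```

Each halving of the search range costs one iteration, which is exactly why $\lg n$ is both an upper and a lower bound on the worst case, making the worst-case running time $\Theta(\lg n)$.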

This content is a collaboration of Dartmouth Computer Science professors Thomas Cormen and Devin Balkcom, plus the Khan Academy computing curriculum team. The content is licensed CC-BY-NC-SA.