Sometimes, we want to say that an algorithm takes at least a certain amount of time, without providing an upper bound. We use big-Ω notation; that's the Greek letter "omega."
If a running time is Ω(f(n)), then for large enough n, the running time is at least k⋅f(n) for some constant k. Here's how to think of a running time that is Ω(f(n)):
[Graph: 6n² vs 100n + 300]
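To make the definition concrete, here is a small numeric spot-check of the big-Ω condition. The running time T(n) = 6n² − 100n + 300 and the particular constants k and n₀ are illustrative choices for this sketch, not values given in the text:

```python
def is_at_least(T, f, k, n0, n_max=10_000):
    """Spot-check T(n) >= k * f(n) for every n in [n0, n_max].

    A finite check like this is evidence, not a proof; big-Omega
    requires the inequality to hold for *all* n >= n0."""
    return all(T(n) >= k * f(n) for n in range(n0, n_max + 1))

# Illustrative running time: T(n) = 6n^2 - 100n + 300 is Omega(n^2),
# witnessed here by the constants k = 1 and n0 = 20.
T = lambda n: 6 * n**2 - 100 * n + 300
f = lambda n: n**2

print(is_at_least(T, f, k=1, n0=20))  # True
```

Different witnesses work too (any smaller k with a suitable n₀); the definition only asks that some pair of constants exists.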
We say that the running time is "big-Ω of f(n)." We use big-Ω notation for asymptotic lower bounds, since it bounds the growth of the running time from below for large enough input sizes.
Just as Θ(f(n)) automatically implies O(f(n)), it also automatically implies Ω(f(n)). So we can say that the worst-case running time of binary search is Ω(lg n). We can also make correct, but imprecise, statements using big-Ω notation. For example, just as if you really do have a million dollars in your pocket, you can truthfully say "I have an amount of money in my pocket, and it's at least 10 dollars," you can also say that the worst-case running time of binary search is Ω(1), because it takes at least constant time.
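You can see the Ω(lg n) lower bound on binary search's worst case empirically by counting loop iterations. This is a sketch with my own instrumentation (the helper name and the choice of an absent target are assumptions for illustration, not from the text):

```python
import math

def binary_search_steps(arr, target):
    """Standard iterative binary search on a sorted list, instrumented
    to count loop iterations. Returns (index of target or -1, count)."""
    lo, hi = 0, len(arr) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

# Worst case: the target is absent, so the loop keeps halving the range
# until it is empty -- about lg n iterations, hence Omega(lg n).
n = 1024
_, steps = binary_search_steps(list(range(n)), n)  # target larger than every element
print(steps)                         # 11
print(math.floor(math.log2(n)) + 1)  # 11
```

For n = 1024 the worst case takes ⌊lg 1024⌋ + 1 = 11 iterations, which grows like lg n, so it is bounded below (and above) by a constant times lg n.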