Yes, $\log(n)$ is between $1$ and $n$, but it is much closer to $1$ than to $n$. What is $\log(n)$? The log function is the inverse of exponentiation. Let me start with exponentiation, and you should get a better idea of what a logarithm is.

Consider two numbers, $100$ and $2^{100}$. $2^{100}$ is the product of a hundred $2$s. With some effort you can count to $100$, but can you count to $2^{100}$? I bet you can't. Why? $2^{100}$ is roughly $1.27 \times 10^{30}$, which is greater than the number of atoms in your entire body (roughly $7 \times 10^{27}$). Reflect on that for a moment: it is such a huge number that it would let you give every one of those atoms its own name (number). And the number of atoms in your fingernail alone is on the order of $10^{22}$. $2^{100}$ ought to be enough for anyone (pun intended :)).
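If you are curious what $2^{100}$ actually looks like, you can print it in full; Python's integers have arbitrary precision, so this is a quick, exact sanity check of the scale involved:

```python
# 2**100 written out in full; Python integers are arbitrary-precision,
# so this is the exact value, not a floating-point approximation.
big = 2 ** 100
print(big)            # 1267650600228229401496703205376
print(len(str(big)))  # 31 -- a 31-digit number
```

Thirty-one digits: easy to print, hopeless to count to.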

Now, between the two numbers $100$ and $2^{100}$, $100$ is the logarithm of $2^{100}$ (in base $2$). $100$ is such a small number compared to $2^{100}$. Anybody could have $100$ different items in their home, but $2^{100}$ is enough for the universe. Think home vs. universe when thinking of $\log(n)$ and $n$.

Where do exponentiation and logarithms come from? Why are they of so much interest in computer science? You may not notice it, but exponentiation is everywhere. Did you ever pay interest on a credit card? You just paid a universe for your home (not quite, but the curve fits). I like to think that exponentiation comes from the product rule of counting, but others are welcome to give more examples. What's the product rule, you may ask? And I shall answer.

Say you have two cities $A$ and $B$, and there are two ways to go between them. What is the number of paths between them? Two; that is trivial. Now say there is another city $C$, and you can go from $B$ to $C$ in three ways. How many paths are there between $A$ and $C$ now? Six, right? How did you get that? Did you count them, or did you multiply? Either way, it is easy to see that both approaches give the same result.
Now if you add a city $D$ which can be reached from $C$ in four ways, how many ways are there to get from $A$ to $D$? Count them if you don't trust me, but the answer is $2 \cdot 3 \cdot 4 = 24$. Now chain eleven cities along a straight line, with two paths between each consecutive pair (ten pairs in all). How many paths are there from start to end? Multiply them out if you don't trust me: there are $2^{10} = 1024$. Note that $2^{10}$ is $2$ raised to the power $10$, and $10$ is the logarithm of $2^{10}$ (in base $2$). $10$ is a tiny number compared to $1024$.
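The product rule above is a one-liner to check. The route counts here are just the made-up numbers from the example:

```python
from math import prod

# Routes between consecutive cities: A->B (2 ways), B->C (3 ways), C->D (4 ways).
segment_choices = [2, 3, 4]
print(prod(segment_choices))  # 24 paths from A to D

# Ten consecutive pairs of cities, two roads between each pair:
print(prod([2] * 10))  # 1024, i.e. 2**10
```

Every extra pair of cities multiplies the path count by two; that multiplication is exactly where the exponential comes from.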

The logarithm function $\log_2(n)$ is to $n$ what $n$ is to $2^n$ (note that $2$ is the logarithm's base). If you multiply $b$ by itself $\log_b(n)$ times (note that $b$ is the logarithm's base), you get $n$. $\log(n)$ is so tiny, so small compared with $n$, that it is the size of your home where $n$ is the size of the universe.
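One way to make this definition concrete is to count divisions instead of multiplications. Here is a minimal sketch (the function name `int_log` is mine, and it assumes $n$ is an exact power of the base):

```python
def int_log(n, base=2):
    """Count how many times n can be divided by base to reach 1.

    Assumes n is an exact power of base; the count is then log_base(n).
    """
    count = 0
    while n > 1:
        n //= base
        count += 1
    return count

print(int_log(1024))      # 10
print(int_log(2 ** 100))  # 100
```

The loop body runs $\log_b(n)$ times, which is also why halving-style algorithms finish in logarithmic time.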

On a practical note, $\log(n)$ functions behave much like constant functions: they do grow with $n$, but very slowly. If you optimize to logarithmic time a program that used to take a day, it will probably finish in the order of minutes. Check for yourself with problems on Project Euler.
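To see just how slowly $\log(n)$ grows, compare a few illustrative input sizes against the number of halving steps a logarithmic algorithm would need:

```python
import math

# n grows by factors of a thousand; ceil(log2(n)) barely moves.
for n in (10 ** 3, 10 ** 6, 10 ** 9, 10 ** 12):
    print(f"n = {n:>13}  ->  about {math.ceil(math.log2(n))} halving steps")
```

A trillion-element input needs only about forty halvings; that is the home-vs-universe gap in action.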

$\log n$ is for "searching": think binary search. – Suresh – 2012-03-21T06:07:50.300

Using $O$ to ask this question is incorrect, as it only denotes an upper bound. For instance, constant time is $\mathcal{O}(\log n)$. $\Theta$ would be more appropriate. See meta question: http://meta.cs.stackexchange.com/questions/182/editing-a-question-and-almost-all-the-answers – Aryabhata – 2012-03-22T21:47:33.663

More information on SO: what does $O(\log n)$ mean exactly? – Ran G. – 2012-04-02T04:07:08.973

A little note: in the classical Turing machine setting, all algorithms are $\Omega(n)$, since they need to read each symbol of the input at least once. Binary search can be done in $O(\log n)$ because we have the promise that the list is sorted, for example. – chazisop – 2012-06-19T13:38:30.703

A late contribution: by definition, the base-$b$ logarithm of a number $n$ is just the number of times you multiply $b$ by itself to get $n$: $b^l = n \iff l = \log_b(n)$. For example, $2^3 = 8 \iff \log_2(8) = 3$. So if you have a number $n$ and you want to find out what $\log_b(n)$ is, just keep dividing it by $b$ until you reach $1$ (assuming $n$ is a power of $b$ for simplicity). The number of divisions is equal to $\log_b(n)$. Algorithms that exhibit this division behavior have running times in $O(\log n)$. – saadtaame – 2012-08-20T03:13:44.657