There are many interesting calculations involving logarithms that you can do in your head if you already know a few logarithms.
For instance, suppose you are reading a 100-page book by randomly flipping to one page each day and reading it. What is the expected number of days it will take for you to read every page in the book? (This is a variant of the well-known Coupon Collector's Problem.)
Because you haven't seen any page before, any page will do, so you always see the first of the hundred pages on the first day. Hence, the expected number of days to see the first page is 1.
For the second page, you have a 99/100 probability of seeing any page except the one you already saw on day 1. To convert this probability into an expected number of days, we can use the fact that if you flip a biased coin that comes up heads with probability h, the expected number of flips until you see heads (including the final heads) is 1/h. Therefore, it will take 1/(99/100) = 100/99 days on average until you see the second page.
Continuing in this fashion, the expected number of days to see every page in the book is:
100/100 + 100/99 + 100/98 + ... + 100/1 = 100 * (1/1 + 1/2 + ... + 1/100) = 100 * H_100,

where H_100 is the 100th harmonic number. The nth harmonic number, written H_n, is the "sum of the reciprocals of the first n natural numbers" (Wikipedia). Taking the first few terms of the asymptotic expansion for H_n, we see that H_n is approximately equal to ln n + γ + 1/(2n), where γ is Euler's constant, approximately 0.5772. So we have:

H_100 ≈ ln 100 + 0.5772 + 1/200 ≈ 4.605 + 0.577 + 0.005 ≈ 5.19.

Therefore, the expected number of days to see every page in the book is approximately 100 * H_100 ≈ 100 * 5.2 = 520. You can check this result via simulation at https://demonstrations.wolfram.com/CouponCollectorProblem/ .
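If you'd rather check the ~520-day figure yourself than via the linked demonstration, here is a minimal simulation sketch in Python (the function name, page count, and trial count are my own choices):

```python
import random

def days_to_read_all(pages=100):
    """Flip to a uniformly random page once per day until every
    page has been seen at least once; return the number of days."""
    seen = set()
    days = 0
    while len(seen) < pages:
        seen.add(random.randrange(pages))
        days += 1
    return days

# Average over many trials; the mean should land near 100 * H_100 ~ 519.
trials = 10_000
mean = sum(days_to_read_all() for _ in range(trials)) / trials
print(f"simulated: {mean:.1f} days, predicted: ~519")
```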
I've spent a lot of time over the years using bit shifting to multiply/divide by powers of 2 in low-level code. At some point I grokked the connection with (base 2) logarithms and found myself using them as a mental estimation tool with surprising effectiveness.
Many of us programmers have powers of 2 memorized through at least 2^20 (and 2^10 ~= 1000 is close enough for decomposing and converting larger values), which makes rough conversion to/from the log domain trivial. The trick in estimation is to round to the nearest whole number when converting to the log domain, and try to alternate rounding direction for inputs that are not close to the upper/lower extrema. Keep a running total of the log-domain value as a whole value, and take the antilog as the final step. Given these simple rules, most estimates come within a factor of 2 of the intended value, and with minimal cognitive load. All you need to remember is the current sum and which direction to round the next input.
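As a rough sketch of that procedure in Python (the helper name and example values are illustrative, and plain nearest-integer rounding stands in for the alternate-the-rounding-direction heuristic):

```python
import math

def estimate_product(xs):
    """Mental estimation sketch: convert each factor to log2, round
    to a whole number, keep a running integer total, and take the
    antilog (2**total) as the final step."""
    total = sum(round(math.log2(x)) for x in xs)
    return 2 ** total

# 33 * 130 * 9 = 38,610; the estimate is 2**(5 + 7 + 3) = 32,768,
# within the promised factor of 2.
print(estimate_product([33, 130, 9]))
```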
You can get away with remembering less than the article suggests.
Logs of 1 and 10 are easy. Then, if you have the logs of 2, 3 and 7, you can easily calculate the logs of 4, 5, 8 and 9. From there you can go a long way by factorization and approximation (a quick numeric check is sketched after the list below).
- log 2 ~ 3/10 (which should be familiar to this audience)
- log 3 ~ (19/12) log 2 ~ 19/40 (in musical terms this is the fact that seven semitones make a perfect fifth) - although arguably you don't even need this, since you can derive the same approximation by noting that log 8 ~ 9/10, interpolating between log 8 and log 10 to get log 9 ~ 0.95, and halving to get log 3 ~ 0.475 = 19/40.
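A quick numeric check of these derivations, using Python's math.log10 for the reference values:

```python
import math

# The memorized anchors from the list above.
log2 = 3 / 10           # log 2 ~ 3/10
log3 = 19 / 12 * log2   # log 3 ~ (19/12) log 2 = 19/40

# Values derived by factorization (log 1 = 0 and log 10 = 1 are free).
derived = {
    2: log2,
    3: log3,
    4: 2 * log2,      # 4 = 2^2
    5: 1 - log2,      # 5 = 10/2
    8: 3 * log2,      # 8 = 2^3
    9: 2 * log3,      # 9 = 3^2
}

for n, approx in derived.items():
    print(f"log {n}: {approx:.4f}  (exact: {math.log10(n):.4f})")
```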
Electrical/optical engineers (should) know logs, or more precisely dB (10*log10), i.e. a logarithmic measure of a ratio. That is because electrical and optical power is often expressed in dBm (power relative to 1 mW), and loss or gain in dB or dB/m (loss/gain per unit length).
However, it is not necessary to memorize so many. It's easier to explain in dB: 0 dB = 1 (that's true by definition), 3 dB = 2 and 5 dB = 3 (a bit less precise), and maybe we can add 7 dB = 5.
From those we can deduce everything else: 4 = 2*2 = 3 dB + 3 dB = 6 dB, 10 = 2*5 = 3 dB + 7 dB = 10 dB, and similarly for negative dB, i.e. 1/3 = 0 dB - 5 dB = -5 dB...
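To make the bookkeeping concrete, here is a small sketch (only the three anchor values come from the comment above; the table layout is illustrative):

```python
import math

# Anchor points: 3 dB ~ 2x, 5 dB ~ 3x, 7 dB ~ 5x (0 dB = 1x by definition).
# Ratios multiply; their dB values add.
db = {2: 3.0, 3: 5.0, 5: 7.0}

# Everything else follows by factoring the ratio.
derived = {
    4: db[2] + db[2],     # 4 = 2*2   -> 6 dB
    6: db[2] + db[3],     # 6 = 2*3   -> 8 dB
    10: db[2] + db[5],    # 10 = 2*5  -> 10 dB
    1 / 3: -db[3],        # reciprocal: negate -> -5 dB
}

for ratio, approx in derived.items():
    print(f"{ratio:g}: {approx:+.0f} dB  (exact: {10 * math.log10(ratio):+.2f} dB)")
```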
The "incorrect by x%" metrics are misleading since they are that percentage incorrect in log space but significantly more incorrect when taking the antilog.
A trick I learned -- I think from Colin Wright -- involves powers of 2.
* Write the powers of 2 from 2^1 through 2^9 in lexicographic order and pop a decimal point after the first digit of each:
* 1.28
* 1.6
* 2
* 2.56
* 3.2
* 4
* 5.12
* 6.4
* 8
The logs of these are approximately 0.1, 0.2, 0.3, ..., 0.9.
This is to do with 2^10 being about 10^3; you can do something very similar with powers of 5, but I find those harder to remember.
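The whole table can be verified in a few lines of Python:

```python
import math

# The nine values from the list above.
mantissas = [1.28, 1.6, 2, 2.56, 3.2, 4, 5.12, 6.4, 8]

# Each is some 2^k scaled into [1, 10), so with log 2 ~ 0.30103 the
# base-10 logs land almost exactly on 0.1, 0.2, ..., 0.9.
for i, m in enumerate(mantissas, start=1):
    print(f"log {m}: {math.log10(m):.3f}  ~ 0.{i}")
```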
Kind of weird to pick sqrt(512) as an example to show the practical use. I mean, you'd be better off with a base 2 logarithm, as you're basically calculating 2^4.5 = sqrt(2) * 2^4 = 1.4... * 16 = 22.4(ish).
I'm very fond of looking at problems in a different light to come up with solutions or approximations that can be computed with ease mentally. Do any folks here have pointers to related techniques or resources?
Check out "Street-fighting mathematics" and excellent (free) resource. It's so good I paid for a copy (and his other one) after reading the free version online.