Hi Bella,
This is an interesting question, since logarithms have been around for much longer than calculators!
Before calculators were around, people used tables to calculate anything that went beyond the basic arithmetic operations of addition, subtraction, multiplication, and division. Square roots and cube roots have been known since ancient times, and there were certainly tables of square roots, cube roots, fourth roots, and so on, but they covered only a limited number of values.
Then, in the early 17th century, a Scottish mathematician named John Napier introduced logarithms. As you may know, logarithms have one amazing property: they turn products into sums, as in log(a*b) = log(a) + log(b). That means exponents become factors, as in log(a²) = 2 log(a) and log(√a) = (1/2) log(a) (remember that taking a square root is the same as raising the number to the power 1/2). Napier tabulated thousands of logarithms, and soon afterwards a simple device based on them, the slide rule, was developed; it lets you multiply and divide to roughly 3 significant figures. Mathematicians after Napier refined and extended the tables until they filled entire books.
So if you wanted to compute, say, √2, you would use the fact that log(√2) = (1/2) log(2). You'd look up log(2) in the table and get 0.3010 (the tables usually listed the common, or base-10, logarithms), divide it by 2 to get 0.1505, then look up which number has a logarithm of 0.1505 (an "anti-log" lookup), and find 1.414, which is √2 rounded to 4 significant digits. It's really neat!
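If you'd like to see those numbers fall out of a quick calculation, here is a small Python sketch that imitates the table procedure. Rounding to 4 decimal places stands in for the printed table, and the helper name table_log10 is just made up for this example:

    import math

    def table_log10(x):
        # Stand-in for looking up log10(x) in a printed table,
        # which typically listed 4 decimal places.
        return round(math.log10(x), 4)

    log_2 = table_log10(2)        # the table gives 0.3010
    log_sqrt_2 = log_2 / 2        # halve it: 0.1505
    sqrt_2 = 10 ** log_sqrt_2     # "anti-log" step: about 1.414

    print(log_2, log_sqrt_2, round(sqrt_2, 3))   # 0.301 0.1505 1.414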
Edward B.
11/27/13

Andre W., tutor
11/27/13
Thanks, Edward! My classroom actually still had a huge wooden model of a slide rule hanging on the wall -- I found that fascinating!