Asked • 06/02/19

What is the name of the paradox about the most common first digits in numbers?

I remember hearing about a paradox (not a real paradox, more of a surprising oddity) about the frequency of the first digit of numbers: the leading digit is most likely to be 1, second most likely to be 2, and so on. This applied to measurements of seemingly random real-world quantities, but it didn't hold for uniformly generated pseudorandom numbers. I also seem to recall a case in history where some sort of banking fraud was detected because the fudged data didn't adhere to this law. It also generalises to any base: whatever number system you use, the measurements are distributed in an analogous way. I've googled various things trying to identify it, but I just can't find it because I don't know the right name to search for. I'd like to read more about this subject, so if anyone can tell me the magic terms to search for, I'd be grateful. Thanks!

Al P.

I also googled and found this about Benford's law, which says that many real-world numbers tend to start with small digits: https://en.wikipedia.org/wiki/Benford%27s_law
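For anyone who wants to see the law in action, here is a minimal Python sketch (my own illustration, not from the thread; the helper names are made up): it computes the Benford probabilities log10(1 + 1/d) and compares them against the observed leading digits of powers of 2, a classic sequence that follows the law.

```python
# Sketch of Benford's law: the leading digit d of many real-world numbers
# appears with probability log(1 + 1/d) in the given base, so 1 is the most
# common digit (~30.1% in base 10), then 2 (~17.6%), and so on.
import math
from collections import Counter

def benford_prob(d, base=10):
    """Expected leading-digit probability under Benford's law (any base)."""
    return math.log(1 + 1 / d, base)

def leading_digit(x, base=10):
    """First significant digit of a positive number in the given base."""
    while x >= base:
        x /= base
    while x < 1:
        x *= base
    return int(x)

# Quick empirical check with powers of 2.
data = [2 ** n for n in range(1, 1000)]
counts = Counter(leading_digit(x) for x in data)

for d in range(1, 10):
    observed = counts[d] / len(data)
    expected = benford_prob(d)
    print(f"digit {d}: observed {observed:.3f}, Benford predicts {expected:.3f}")
```

The same comparison is the rough idea behind Benford-based fraud screening: tally the leading digits of the reported figures and flag data sets whose distribution departs strongly from the predicted one.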

06/02/19

