Asked • 03/19/19

Average number of guesses to guess a number between 1 and 1000?

There is a game where you guess a number between 1 and 1000 and the system tells you whether you guessed too high or too low. Using a binary search you can find the answer in a maximum of ten guesses (2^10 = 1024), but how would I go about calculating the average number of guesses required to find the given number, assuming the user uses this binary chop method?
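To make the setup concrete, here is a rough Python sketch of the binary chop strategy I mean (the function name and the example secret number 742 are just for illustration): each guess is the midpoint of whatever range is still possible.

```python
def chop_guesses(secret, low=1, high=1000):
    """Yield the sequence of guesses the binary chop makes until it hits `secret`."""
    while low <= high:
        mid = (low + high) // 2   # always guess the middle of the remaining range
        yield mid
        if mid == secret:
            return
        if mid < secret:          # "too low" -> discard the lower half
            low = mid + 1
        else:                     # "too high" -> discard the upper half
            high = mid - 1

print(list(chop_guesses(742)))
# [500, 750, 625, 687, 718, 734, 742] -- seven guesses; no number takes more than ten
```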

Damian B.

I would like to have a stats person check the logic of this answer. Here is my reasoning.

We know a binary search takes a maximum of log base 2 of N guesses (rounded up) to determine any number. The average has to take into account that you can get lucky. If the algorithm divides and conquers, then one time in N you will guess the number on the first try, two times in N on the second, four times in N on the third, and so forth. By the ninth guess on a number from 1 to 1000 you would have uniquely identified 511 numbers (1 + 2 + 4 + 8 + ... + 256, or 2^9 - 1). That leaves 489 numbers that need the tenth and final guess.

To get the average, multiply each guess count by the chance that it is the one needed and add up the products: the first guess is right 1 time in N, the second 2 times in N, and so on. For N = 1000 that gives (1×1 + 2×2 + 3×4 + 4×8 + 5×16 + 6×32 + 7×64 + 8×128 + 9×256 + 10×489) / 1000 = 8987 / 1000 = 8.987 guesses on average. So the average is less than the maximum of ten, but only by about a single guess.
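For anyone who wants to double-check that figure, a brute-force count agrees: the short Python sketch below (the function name is just illustrative) runs the same midpoint strategy against every possible number from 1 to 1000 and averages the guess counts.

```python
def guesses_needed(secret, low=1, high=1000):
    """Number of guesses the midpoint (binary chop) strategy needs to find `secret`."""
    count = 0
    while low <= high:
        count += 1
        mid = (low + high) // 2
        if mid == secret:
            return count
        if mid < secret:
            low = mid + 1
        else:
            high = mid - 1

counts = [guesses_needed(n) for n in range(1, 1001)]
print(max(counts))                # 10    -- the worst case
print(sum(counts) / len(counts))  # 8.987 -- the average, matching the figure above
```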


1 Expert Answer


Patrick B. answered • 03/23/19

