Asked • 06/15/19

How to quantify the difference between 2/4 and 20/40?

Assume I have two methods for making predictions. The first method makes 4 predictions, of which 2 are correct. The second method makes 40 predictions, of which 20 are correct. The precision of both methods is the same: 2/4 = 20/40 = 0.5. But I think the second method is better than the first, because it makes more correct predictions. Is there a measure that quantifies this? Any suggestion would help :) Thanks in advance.
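One standard way to quantify the extra evidence behind 20/40 is a binomial confidence interval for the proportion: both methods have the same point estimate (0.5), but the interval for 20 out of 40 is much narrower than for 2 out of 4. The sketch below (in Python, using the Wilson score interval; the function name and the 95% level are my own choices, not anything from the original post) illustrates the idea.

```python
import math

def wilson_interval(successes: int, trials: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% Wilson score confidence interval for a binomial proportion."""
    p = successes / trials
    denom = 1 + z**2 / trials
    center = (p + z**2 / (2 * trials)) / denom
    halfwidth = (z / denom) * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2))
    return center - halfwidth, center + halfwidth

# Same precision of 0.5, but the interval for 20/40 is much narrower,
# reflecting the greater evidence behind the same point estimate.
print(wilson_interval(2, 4))    # roughly (0.15, 0.85)
print(wilson_interval(20, 40))  # roughly (0.35, 0.65)
```

Under this view, "better" can be read as "the same estimated precision, but with less uncertainty," and the interval width (or its inverse) is one way to put a number on it.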

1 Expert Answer

Raymond B. answered • 06/15/19
Tutor | Math, microeconomics or criminal justice
