Hello!
This is actually an issue I've been concerned about, so I'm glad that I have a chance to provide an answer!
It will be helpful to imagine, for a moment, that when you request a random number, you get back a value between 0 and 7. (I'm using such a small range to make the issue easier to demonstrate; the same reasoning applies to common random number functions.) Each possible value (0, 1, 2, ... 7) is equally likely to be returned by the random number generator, as expected.
Now, if you want a number between 0 and something *smaller* than the maximum (in this case, 7), you modulo (take the remainder from dividing) the returned value by one more than the maximum value you want.
Let's say that you want a value between 0 and 3; you modulo the returned value by 4. The following shows what you'll get:
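```
random value:    0  1  2  3  4  5  6  7
value modulo 4:  0  1  2  3  0  1  2  3
```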
Everything looks fine. Each possible result (0, 1, 2, 3) is returned with equal probability.
However, what if the number we're dividing by (say, 3) doesn't divide evenly into the number of possible random values (the maximum plus 1, or 8, here)? Things aren't quite right:
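```
random value:    0  1  2  3  4  5  6  7
value modulo 3:  0  1  2  0  1  2  0  1
```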
As you can see, we'll get 0 3/8ths of the time, and 1 3/8ths of the time, but 2 only 2/8ths of the time. This is like playing with a loaded die, where one of the possibilities (and, because of modulo, the highest-valued possibility) will not come up with the desired probability. This is the 'modulo bias'. Preventing this bias is straightforward, if wasteful: modulo by a value that *does* divide the generator's range evenly (here, modulo 4 instead of modulo 3), and *throw away* (re-roll) any 3s you get.
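Here's a rough sketch of that throw-away (rejection) approach in C; the helper name `unbiased_rand` and the exact cutoff calculation are just illustrative, not taken from any particular library:

```c
#include <stdlib.h>

/* Illustrative sketch: return a value in [0, n) without modulo bias by
 * re-rolling whenever rand() lands in the leftover region at the top of
 * its range. The cutoff is kept slightly conservative so we never have
 * to compute RAND_MAX + 1, which could overflow. */
int unbiased_rand(int n)
{
    int limit = RAND_MAX - (RAND_MAX % n);  /* a multiple of n */
    int r;
    do {
        r = rand();
    } while (r >= limit);  /* throw away (re-roll) biased values */
    return r % n;
}
```

With the toy generator above (values 0 through 7) and n = 3, this would reject 6 and 7 and keep the remaining six values, which split evenly among 0, 1, and 2.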
The reason this is a problem with libc's rand() implementation, in particular, is that the maximum value returned by rand() has been (and might still be, on some systems) as low as 32767, which lets modulo bias show up more often, and with a larger effect.
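For a concrete (hypothetical) illustration of how pronounced this can get, suppose RAND_MAX is 32767 and you compute `rand() % 10000`; counting over all 32768 possible raw values shows the skew:

```c
#include <stdio.h>

/* Count how often each result of (raw % 10000) can occur across all
 * 32768 possible outputs of a 15-bit rand() (RAND_MAX == 32767). */
int main(void)
{
    static int counts[10000];
    for (int raw = 0; raw <= 32767; raw++)
        counts[raw % 10000]++;

    /* 32768 = 3 * 10000 + 2768, so results 0..2767 occur 4 times each
     * while 2768..9999 occur only 3 times each. */
    printf("count for 0: %d, count for 9999: %d\n", counts[0], counts[9999]);
    return 0;
}
```

So the low results come up roughly 33% more often than the high ones, purely because of the modulo.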