Recall that, in general,
mean(a*X + b) = a*mean(X) + b
std.deviation(a*X + b) = |a|*std.deviation(X)
This should also be intuitive: multiplying a random variable X by a stretches its values away from the mean (if |a| > 1) or shrinks them toward it (if |a| < 1), so the standard deviation is scaled by a factor of |a|. Adding a constant b only shifts every possible value of X by the same amount; it does not change their position relative to the mean, so the constant term b is irrelevant when computing the variance or standard deviation.
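These two identities are easy to check empirically. The sketch below (the distribution, sample size, and the values of a and b are arbitrary choices for illustration) draws a sample, applies the transformation a*X + b with a negative a, and confirms that the sample mean and standard deviation behave as stated:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(10.0, 2.0, size=100_000)  # arbitrary sample for illustration

a, b = -3.0, 5.0          # negative a, to show why |a| is needed
Y = a * X + b

# mean(a*X + b) = a*mean(X) + b
print(np.isclose(Y.mean(), a * X.mean() + b))   # True

# std(a*X + b) = |a|*std(X)  (note |a|, not a: std is never negative)
print(np.isclose(Y.std(), abs(a) * X.std()))    # True
```

Using a negative a makes the absolute value visible: a*std(X) would be negative, while the actual standard deviation of Y is |a|*std(X).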