Er, not to sound simple or dumb, but you could take a random number between one and one million, take its square root, subtract it from 1000, and then add or subtract that amount from the current value.
Or in mathematical/programming form:
1000 - sqrt(rand() * 1000000), where rand() returns a uniform value in [0, 1)
That'd certainly make it favor smaller values. In fact, the chance of it changing by a mere 1 is about 1/500, whereas increasing/decreasing by the full 1000 is only on the order of 1/1000000.
(And of course there'd be a safety check to make sure it doesn't get higher than 1000. And also, of course, the result would be rounded.)
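A minimal sketch of the idea in Python, assuming a uniform random source in place of rand() (the function name biased_delta and the max_change parameter are my own, not from the post); the square root of a uniform draw skews toward large values, so subtracting it from 1000 skews the result toward small ones:

```python
import random

def biased_delta(max_change=1000):
    """Return a delta in [0, max_change], biased toward small values."""
    # Uniform draw from [0, max_change^2), i.e. [0, 1000000) by default.
    r = random.uniform(0, max_change ** 2)
    # sqrt(r) is biased toward large values, so max_change - sqrt(r)
    # is biased toward small ones. Round, as the post suggests.
    delta = round(max_change - r ** 0.5)
    # Safety check so it never exceeds max_change.
    return min(delta, max_change)

# Apply it with a random sign, per the "add or subtract" step:
value = 0
value += random.choice([-1, 1]) * biased_delta()
```

With this shape, roughly 19% of draws land at 100 or below, while only about 1% land at 900 or above, matching the small-value bias described above.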