It's just hard for me to write tests in code. Usually my brain is the test; otherwise it seems like a waste of time to me. For example:
a + b = c
Do I really have to test?
If you were writing the add function for a processor in machine microcode you'd absolutely want to test this. Does your add function work on positive and negative numbers? Does it handle 0 correctly? Does it handle 1 correctly? If you're using floating point numbers, does it handle infinities and NaNs correctly? If you're literally just typing a + b you probably don't need to test that. But then it's probably part of some larger calculation you should be testing.
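A minimal sketch of what those edge-case tests might look like in Python, using a hypothetical add(a, b) that just wraps + (in a real microcode scenario you'd be testing the actual implementation, not this stand-in):

```python
import math

def add(a, b):
    # Hypothetical stand-in for the add function under test.
    return a + b

def test_add_edge_cases():
    assert add(2, 3) == 5                 # positive numbers
    assert add(-2, -3) == -5              # negative numbers
    assert add(0, 7) == 7                 # zero as an operand
    assert add(1, -1) == 0                # cancellation through zero
    assert add(1, 0.5) == 1.5             # mixed int/float
    assert math.isinf(add(math.inf, 1))   # infinity propagates
    assert math.isnan(add(math.nan, 1))   # NaN propagates
```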
Basically, if you're writing a closed algorithm, something that takes data in, does stuff, then spits out a result, you should be unit testing it 100% of the time. If you're writing UI, networking, or graphics code, it's a bit more of a gray area, because it's difficult to isolate your code from random hardware issues and constantly iterating designs. For graphics, for instance, do you do a per-pixel check? What if one of the pixels is a slightly different shade of blue? Is that a failure? But even then you should definitely be unit testing the smaller pieces that make up the larger whole.
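One common way to soften the "slightly different shade of blue" problem is a tolerance-based comparison instead of an exact per-pixel check. A rough sketch, with images represented as plain lists of RGB tuples (real code would work on an image buffer):

```python
def images_match(expected, actual, tolerance=2):
    # Allow each channel to differ by up to `tolerance`, so a pixel
    # that's a marginally different shade doesn't fail the test.
    if len(expected) != len(actual):
        return False
    for (r1, g1, b1), (r2, g2, b2) in zip(expected, actual):
        if (abs(r1 - r2) > tolerance or
                abs(g1 - g2) > tolerance or
                abs(b1 - b2) > tolerance):
            return False
    return True

def test_render_allows_small_color_drift():
    expected = [(0, 0, 200), (0, 0, 200)]
    actual   = [(0, 0, 201), (0, 0, 199)]  # slightly different blues
    assert images_match(expected, actual)
```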
...
For the example you have, you'd probably want to break the randomness away from the function and have the function take the 1 or 2 random values as input. Then you can test to see what happens when you pass in negative values. Basically, non-deterministic factors are the enemy of unit testing. You usually want to isolate them from as much of the code as possible to allow for deterministic testing. Unit testing isn't just a matter of writing the tests; it often requires rethinking how to approach a problem in a way that's amenable to testing. That's usually a good thing, as it forces you to write code in a more modular way and really think through dependencies.
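A sketch of that refactor in Python, using a made-up damage_from_roll calculation: the random draw is pushed out to a thin wrapper, so the core logic takes the value as a plain argument and can be tested deterministically.

```python
import random

# The calculation takes the random value as input instead of
# generating it internally, so it's fully deterministic...
def damage_from_roll(roll):
    # Hypothetical formula; the point is it's now testable.
    return roll * 2 - 1

# ...and the non-deterministic part lives in a wrapper the tests
# never need to exercise.
def roll_damage(rng=random):
    return damage_from_roll(rng.randint(1, 6))

def test_damage_from_roll():
    assert damage_from_roll(1) == 1
    assert damage_from_roll(6) == 11
    assert damage_from_roll(-3) == -7  # now easy to probe negative inputs
```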
It does take more upfront effort and thought, though.