The most effective method is to introduce very subtle costs once a replicator gets going. Your initial population will either evolve to survive them or die off (in which case you'd need to restore from an early save and try again). Then you keep ratcheting the costs up, being careful not to drive the bots to extinction (dynamic costs are good for this, though you run the risk of the bots learning to game the system).
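To make the "dynamic costs" idea concrete, here's a minimal sketch of a feedback controller you could run each generation. Everything in it (the `TARGET_POP`, `MIN_POP`, and `COST_STEP` values, and the idea of a flat per-tick energy cost) is an illustrative assumption, not the API of any particular simulator:

```python
# Hypothetical dynamic cost controller: slowly ratchet the per-tick
# energy cost up while the population is healthy, and back off quickly
# when the bot count falls toward extinction. All constants here are
# made-up tuning knobs for illustration.

TARGET_POP = 200   # population size above which we tighten costs
MIN_POP = 50       # below this, relax costs to avoid extinction
COST_STEP = 0.001  # how fast the cost ratchets up per generation

def adjust_cost(current_cost, population_size):
    """Return the per-tick energy cost to use for the next generation."""
    if population_size < MIN_POP:
        # Emergency: costs are killing too many bots, so relax them
        # faster than we ever raise them.
        return max(0.0, current_cost - 10 * COST_STEP)
    if population_size > TARGET_POP:
        # Population is thriving; tighten the screws a little.
        return current_cost + COST_STEP
    return current_cost  # hold steady in the comfortable band
```

The asymmetry (back off ten times faster than you ramp up) is the point: extinction is unrecoverable, while a temporarily easy environment just slows selection down.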
Now, you can't really control how the bots gain energy. The idea is just to encourage them to find some way of surviving by killing off the ones that don't. Surprisingly (or not, depending on your background), bots can learn to do some rather neat, if still primitive, things, such as turning, moving, and firing at the same time. I don't think anyone's managed to develop a population that does more than that yet, though. No truly intelligent behavior so far (like hunting down something it sees).
Interestingly, it's more common to see viruses develop and act as the primary vector for evolution. The simplest viruses methodically twiddle different memory locations, causing all sorts of random behavior: reproduction, creating and firing viruses and shots, etc. I think you're going to find this to be the primary route your simulation takes to adapt to your changes, because it allows for rapid adaptation.
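The memory-twiddling idea can be sketched in a few lines. Here a bot's genome is modeled as a flat list of integers, and a "virus" overwrites a few randomly chosen cells; the genome layout, the number of sites hit, and the value range are all assumptions for the example, not how any real simulator stores bots:

```python
import random

# Illustrative virus-style mutation: pick a few random "memory
# locations" in a genome (modeled as a flat list of ints) and
# overwrite them with random values, producing random new behavior.

def twiddle(genome, n_sites=3, value_range=(-100, 100), rng=random):
    """Return a copy of genome with up to n_sites cells overwritten."""
    mutated = list(genome)  # leave the original bot untouched
    for _ in range(n_sites):
        site = rng.randrange(len(mutated))         # pick a memory location
        mutated[site] = rng.randint(*value_range)  # twiddle its contents
    return mutated
```

Because each infection rewrites whole cells rather than flipping single bits, the behavioral jumps are large, which is why virus-driven change tends to outpace ordinary point mutation in these sims.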