So let me push back a bit. I really do think it should be "range where the program does not modify CostX". The reason is that as sims run, bots will evolve, become more efficient, etc. A CostX of 1 may keep the population in the range at the beginning of time, but hopefully, millions of cycles later, CostX will have had to creep up over time to keep the population in the range. That would be evidence of evolved fitness. Additionally, my original vision here was that when setting up a sim, you don't have to get the absolute cost values right, just the relative values. You choose what you want to be more expensive and less expensive, and the cost values have meaning as relative numbers; the absolute values aren't as important. In other words, getting the cost/energy/population balance exactly right up front isn't something you have to worry about, because the program will adjust CostX for you. If you guess low, CostX will end up above 1; if you guess high, below 1. In any given sim, the population is unlikely to stabilize in the range at a CostX of exactly 1, unless the range is wide or you happened to guess very well.
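Roughly the feedback loop I have in mind, as a minimal sketch (the function name, step size, and clamping are my assumptions, not the actual implementation):

```python
def adjust_cost_multiplier(cost_x, population, range_lo, range_hi, step=0.01):
    """Nudge CostX toward keeping the population inside [range_lo, range_hi].

    Hypothetical sketch: if the population is above the range, raise costs;
    if below, lower them; inside the range, leave CostX alone.
    """
    if population > range_hi:
        cost_x += step                    # costs rise, population should fall
    elif population < range_lo:
        cost_x = max(0.0, cost_x - step)  # costs fall, population should recover
    return cost_x
```

The point being that CostX drifts to whatever absolute level the guessed relative costs require, so only the relative values need to be right up front.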
That said, I agree that we still have a serious CostX adjustment overshoot/undershoot problem, especially when the population falls far enough to hit the zero-cost threshold.
How's this: I'll add a "cost reinstatement threshold" (Y) designed to work with the zero-cost threshold (X). The idea is that if the population falls below X, CostX goes to 0 as it does today, but when the population comes back above Y, CostX is restored to the value it had when it crossed X. In this way, X and Y can be used with or independently of the auto-cost feature. When they're used together, if Y is below the lower limit of the auto-cost adjustment range, then after CostX is reinstated it will still be adjusted downward over subsequent cycles while the population remains below the range. If Y is in or above the range, reinstatement can still serve to quickly restore a high CostX and prevent population explosions.
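To make the X/Y interaction concrete, here's a hypothetical sketch of the hysteresis behavior (class and attribute names are my assumptions):

```python
class CostController:
    """Sketch of the zero-cost threshold (X) plus reinstatement threshold (Y).

    When the population drops below X, CostX is zeroed and the old value is
    remembered; when the population climbs back above Y, the remembered value
    is reinstated. Between X and Y, CostX stays at zero (hysteresis).
    """

    def __init__(self, zero_threshold_x, reinstate_threshold_y, cost_x=1.0):
        self.x = zero_threshold_x
        self.y = reinstate_threshold_y
        self.cost_x = cost_x
        self.saved_cost = None  # CostX value captured when the population crossed X

    def update(self, population):
        if self.saved_cost is None and population < self.x:
            # Population collapsed below X: zero the cost, remember the old value.
            self.saved_cost = self.cost_x
            self.cost_x = 0.0
        elif self.saved_cost is not None and population > self.y:
            # Population recovered above Y: reinstate the pre-collapse CostX.
            self.cost_x = self.saved_cost
            self.saved_cost = None
        return self.cost_x
```

After reinstatement, the normal auto-cost adjustment would take over again, so a Y below the auto-cost range just means the restored CostX keeps easing downward until the population re-enters the range.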
Comments?