I think I have found the right place for my suggestions, at least I hope so. Please bear with me if the idea isn't very clear; I'll be happy to try and clarify any points.
As I understand things, the envgrid breaks the area down into lots of smaller areas so that environmental conditions can vary across the region. While modeling chemicals and the enzymes that use those chemicals would be most accurate, it is also by its nature the most complex, so I suggest just three conditions to be defined in the envgrid. Firstly, the amount of energy available to a browsing bot (or veggie). Secondly, environmental damage; this could represent many things (the simplest implementation would be a per-cycle reduction in energy for bots in the affected area). Thirdly, temperature, which I envision as affecting the energy cost of actions: my limited understanding suggests that in a cold environment things slow down, which could be represented by increasing energy costs for actions, and conversely by decreasing energy costs as temperature rises above the norm.
As I understand things, high temperature also causes damage to cell structures, which is one area where the environmental damage value could be used to help find balance.
A fourth variable, whose practicality I'm not so sure of but which I consider worth a mention, is environmental mutagenic factors, which either increase or decrease the chance of mutation when a cell splits in the affected area.
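To make the idea concrete, here is a minimal sketch (not actual DarwinBots code) of how the three conditions plus the mutagenic factor could sit in one grid cell, with temperature scaling action costs and damage applied as a flat per-cycle drain. All the names and the tuning constant are my own placeholders.

```python
from dataclasses import dataclass

@dataclass
class EnvCell:
    """One square of the envgrid (field names are placeholders of mine)."""
    energy: float       # energy available to a browsing bot this cycle
    damage: float       # flat per-cycle energy drain on bots in this square
    temperature: float  # deviation from the norm; 0.0 = normal, negative = cold
    mutagen: float      # multiplier on mutation chance when a bot splits here

NORMAL_ACTION_COST = 1.0
TEMP_SENSITIVITY = 0.1  # guessed tuning constant

def action_cost(cell: EnvCell) -> float:
    """Cold (temperature below the norm) raises action costs; heat lowers them."""
    return NORMAL_ACTION_COST * (1.0 - TEMP_SENSITIVITY * cell.temperature)

def apply_environment(bot_energy: float, cell: EnvCell) -> float:
    """Environmental damage as a simple per-cycle energy reduction."""
    return bot_energy - cell.damage
```

So a bot in a cold square (`temperature = -2.0`) pays 1.2 per action instead of 1.0, and a bot in a square with `damage = 3.0` loses 3 energy each cycle regardless of what it does.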
As a side note, this simple modeling of the environment would, I think, allow an attempt at modeling black smokers... but I don't know enough on the topic to do more than guess that it would be possible.
The other consideration of great importance for this envgrid is the size of the squares (I presume squares; any other tessellating shape could work in theory, but squares are simple). If the squares are much smaller than the bots, then each bot is going to be on multiple squares, and I think for simplicity the effects of each should accumulate. This would have to be taken into account when setting the strength of the environment's effect on the bot, and it would also make shared resources rarer.

However, I think more interesting things will happen if the squares are closer to the size of the bots (as long as the sides of the squares are less than the diameter of a bot, the least number of squares a bot can be touching is four, and I do not recommend squares with sides larger than the diameter of a bot). Larger squares allow multiple bots to be affected by the same part of the environment, and an energy source then has to be split between browsers. In this model I would suggest, for greater variability, using the average of the environmental effects, with some added formulae to handle the shared resources. For example, take two browsers next to each other occupying a total of six envgrid squares (a simple 3*2 grid), each bot touching four of the squares, with the middle two occupied by both bots.
For temperature and environmental damage, take the average of the values for each square occupied. For browsing (energy gain from the environment), the middle two squares, being occupied by both bots, have their values halved. So if the energy available in each square is E (assuming equality for simplicity in this first case), each browser can gain (E + E + 0.5E + 0.5E)/4, or 3/4 E, whereas if they separate they can both gain E.
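The sharing rule above can be sketched in a few lines: each square's energy is split evenly between its occupants, and the bot's gain is the average over the squares it touches. The function name and inputs are my own invention, just to reproduce the 3/4 E arithmetic.

```python
def browse_gain(cell_energies, occupancy):
    """Average energy gain per square for one bot.

    cell_energies: energy E available in each square the bot touches
    occupancy: number of bots occupying each of those squares
    Shared squares are split evenly between their occupants.
    """
    total = sum(e / n for e, n in zip(cell_energies, occupancy))
    return total / len(cell_energies)

# Two browsers side by side on a 3*2 grid: each touches four squares,
# and the middle two squares are occupied by both bots.
E = 4.0
gain_together = browse_gain([E, E, E, E], [1, 1, 2, 2])  # (E + E + E/2 + E/2)/4 = 3/4 E
gain_apart = browse_gain([E, E, E, E], [1, 1, 1, 1])     # E
```

With E = 4 this gives 3.0 together versus 4.0 apart, so there is a built-in incentive to spread out unless a square is unusually rich.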
If the distribution of energy is uneven then things become more complex, and if browsers have the ability to test possible positions around them to look for richer feeding grounds, interesting patterns could emerge: flocking to rich sources, and solitary browsing elsewhere.
That the damage caused by the environment is not divided is intentional on my part: if an area is dangerously acidic/hot/whatever, that danger is not reduced by the fact that it is affecting multiple bots.
I think that the ability to detect the energy available from the environment and the ability to feed on it should be separate functions, but that the ability to browse should carry some inherent disadvantage; I suggest an increased cost for non-browsing actions. This would allow predators to monitor the available energy and wait for a browser to get close enough, effectively ambushing it. If the amount of the increased cost is a user-set variable, the user can define the likelihood of omnivorous bots developing through evolution, which seems to me an interesting thought; yet for leagues it can be set so high as to make omnivores impractical, which I believe answers one of the arguments against removing the distinction between bots and veggies.
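A tiny sketch of that user-set penalty, assuming the simplest possible form (a multiplier on every non-browsing action for any bot that has the browsing ability); the constant and function name are placeholders of mine.

```python
BROWSE_PENALTY = 0.5  # user-set: high values make omnivores impractical (e.g. for leagues)

def non_browse_action_cost(base_cost: float, can_browse: bool) -> float:
    """Bots with the browsing ability pay extra for every non-browsing action."""
    if can_browse:
        return base_cost * (1.0 + BROWSE_PENALTY)
    return base_cost
```

At 0.5 a browser pays 3.0 where a pure predator pays 2.0; pushing the penalty much higher prices omnivores out of contention entirely.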
I hope that's comprehensible to everyone. I know some of the parts I mention are more relevant to other threads, but as I wrote I got into a flow of thought, and hacking this apart to post it in multiple places seems inappropriate as I think it is all linked.
Thanks for reading, I hope you found it worth the time.
Math