This is an interesting project I ran across a while ago: SlimGen. It's made by the same people who made SlimDX (a C# DirectX wrapper). What it lets you do is use assembly-level math operations inside a .NET CLR project.
Yes, I recently found that as well (was talking with Washu on GameDev and found his blog post about SlimGen). Not something I'll play with any time soon, but I'll probably dig into something along those lines to take advantage of vectorization (SSE, SIMD, etc.) in the future.
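The appeal of vectorization is doing the same arithmetic across a whole batch of values in one operation instead of one element at a time. A minimal sketch of the idea, using numpy's array operations as a stand-in for what SSE/SIMD instructions would do natively (the function names here are hypothetical, not from SlimGen):

```python
import numpy as np

# Scalar version: one multiply-add per loop iteration.
def scale_add_scalar(xs, ys, k):
    return [k * x + y for x, y in zip(xs, ys)]

# "Vectorized" version: one whole-array expression; numpy's
# compiled loops can use SIMD instructions under the hood.
def scale_add_vectorized(xs, ys, k):
    return k * np.asarray(xs) + np.asarray(ys)
```

Both compute the same result; the win is that the vectorized form processes multiple elements per instruction, which is exactly what hand-written SSE via something like SlimGen would buy you in the CLR.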
Worst case, the memory would grow with the square of the number of bots, so I don't think it would be an issue until you start getting into multiple thousands of bots. And my ultimate goal is to have performance and memory scale linearly with the number of bots, at which point this would be moot anyway.
How does this relate to DBII where I have pop explosions easily pushing numbers up to 5000?
DBII should have a memory footprint that is linear in the number of bots, but its performance generally degrades with (worst case) the square of the number of bots. Not to mention what happens if it ends up eating all your memory and spills into virtual memory.
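The quadratic worst case comes from pairwise interactions: each of n bots can potentially interact with every other, giving n(n-1)/2 pairs. A tiny illustration of why 5000 bots is so much worse than 500:

```python
# Worst-case pairwise interaction count: every bot against every other.
def pair_count(n):
    return n * (n - 1) // 2

# Growth is quadratic: 10x the bots means ~100x the pairs.
for n in (500, 5000):
    print(n, pair_count(n))
```

This is why spatial partitioning (only checking nearby bots) is the usual route to the linear-ish scaling mentioned above.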
How can we get it to run on a cluster? Also, can this work on early versions of Windows? I'd expect significant performance increases if it runs on Win2k. I want to set up a dedicated computer, so having the latest OS isn't important.
I'm not sure how far back .NET is supported on Windows. But with Mono I'd imagine you could run on any 32/64-bit OS, probably even a really barren Linux environment. I've run some Mono compatibility analysis recently and no problems were reported, though I haven't actually tested running under Mono. And I don't know how good the Mono JIT is, so I don't know if there would be a performance penalty or anything like that.
As far as clusters go... it's entirely self-contained .NET code, so any platform that can run .NET could run Darwinbots. I haven't looked, but I imagine there either are solutions for running .NET in distributed environments or there will be soon. The hard part is then threading the code enough to properly handle potentially hundreds of different computers. For a proper game that wouldn't really work (real-time requirements would squash it), but Darwinbots doesn't have to run in real time, so there's absolutely no reason you couldn't send, say, the DNA to another thread/computer to execute and get the output back 100ms later or whatever. With the way the physics will work, it's not even entirely true that you'd have to completely finish one frame before moving on to the next.
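The farm-out-and-collect-later idea can be sketched with an ordinary thread pool: submit each bot's DNA program as a task, keep doing other work, and gather the results whenever they arrive. This is a minimal sketch, not Darwinbots code; `run_dna` is a hypothetical placeholder for the DNA interpreter:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for executing one bot's DNA program.
def run_dna(dna):
    return sum(dna)  # placeholder for real interpretation

# Submit every bot's DNA, then collect outputs; because the sim
# doesn't run in real time, it's fine if results come back late.
def step(bots, pool):
    futures = [pool.submit(run_dna, b) for b in bots]
    return [f.result() for f in futures]

with ThreadPoolExecutor(max_workers=4) as pool:
    outputs = step([[1, 2], [3, 4]], pool)
```

Swapping the local thread pool for a queue to remote machines is the same pattern, just with network latency in place of thread scheduling.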
I think the most important performance optimization right now is getting something running. Others should probably wait for that.
First version will be reaaaaaaallly slow for that very reason.