Darwinbots Forum
Code center => Suggestions => Topic started by: happyhamsterchan on May 04, 2010, 06:53:24 PM
-
OK, So I was thinking...
A good sim is a big sim, and a big sim needs a lot of RAM. However, a 32-bit process can only address 4 GB. Should we upgrade DB to 64-bit, perhaps, so we can use a bajillion bytes of RAM? That way we can make a cool sim. So ya.
I was thinking that, since most computers already support x64, and since RAM gets cheaper every year, perhaps this might be important enough to do for DB2?
-
Sims aren't pushing the 4 GB limit at present, so probably not necessary. I think you'll blow your CPU budget long before you run out of RAM.
Logistically, Visual Basic 6 cannot be compiled for 64-bit. .NET can, sort of, but XNA and some other dependencies bake in 32-bit assumptions, so going 64-bit isn't strictly easy.
-
Sims aren't pushing the 4 GB limit at present, so probably not necessary. I think you'll blow your CPU budget long before you run out of RAM.
In that case, store more and calculate less. Collision detection, for instance, can be sped up with octree searches.
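The "store more, calculate less" idea can be sketched as a simple lookup table: precompute an expensive function once and index into it in the inner loop. This is an illustrative example only, not DB2's actual code; the table size and function names are assumptions.

```python
import math

# Hypothetical "store more, calculate less" sketch: precompute a sine
# table once, then replace math.sin() calls in the hot loop with a
# cheap table lookup. TABLE_SIZE and fast_sin are illustrative names.
TABLE_SIZE = 3600  # 0.1-degree resolution

SIN_TABLE = [math.sin(2 * math.pi * i / TABLE_SIZE) for i in range(TABLE_SIZE)]

def fast_sin(angle_rad):
    """Approximate sin(angle_rad) from the precomputed table."""
    i = int(angle_rad / (2 * math.pi) * TABLE_SIZE) % TABLE_SIZE
    return SIN_TABLE[i]
```

The trade-off is exactly the one discussed later in the thread: the table costs memory and, past a point, starts competing with other data for CPU cache.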
-
Do we already do that nums?
-
That's called a broadphase in physics, for the record. Yes, I think Eric implemented a simple grid for a broadphase.
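A grid broadphase like the one mentioned here can be sketched in a few lines: hash each bot into a cell, then only test pairs in the same or adjacent cells instead of all O(n²) pairs. This is a minimal illustration, not the actual DB implementation; the cell size and function names are assumptions.

```python
from collections import defaultdict
from itertools import combinations

# Minimal sketch of a uniform-grid broadphase. CELL should be at least
# the largest bot diameter so touching bots always share or neighbour
# a cell. All names here are illustrative.
CELL = 50.0

def broadphase_pairs(bots):
    """bots: list of (x, y) centers. Returns candidate collision pairs
    as a set of (i, j) index tuples with i < j."""
    grid = defaultdict(list)
    for i, (x, y) in enumerate(bots):
        grid[(int(x // CELL), int(y // CELL))].append(i)
    pairs = set()
    for (cx, cy), members in grid.items():
        # pairs within the same cell
        for a, b in combinations(members, 2):
            pairs.add((min(a, b), max(a, b)))
        # pairs with four "forward" neighbour cells (the other four
        # directions are covered when those cells take their turn)
        for dx, dy in ((1, 0), (0, 1), (1, 1), (1, -1)):
            for a in members:
                for b in grid.get((cx + dx, cy + dy), ()):
                    pairs.add((min(a, b), max(a, b)))
    return pairs
```

Only the candidate pairs then go through the expensive narrowphase check, which is where the CPU savings come from.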
-
So have you tried coding up memory-heavy code instead of CPU-intensive stuff?
-
Yeah. Again, there's only so far you can take it from a comp-sci standpoint. Plus, because of CPU caches, there is definitely a point of diminishing returns (cache thrashing).
-
awh... oh well, worth a shot.
-
Instead of one large computer, you could design a grid that spans multiple computers.
Such designs scale well and are even used for distributed virtual chemistry (which requires a lot of CPU power).
Simply extend the screen borders onto other machines if you'd like to build something like that; DB2's current internet mode already has a teleporter that does something similar.
If you're thinking about local speed, you could also look at CUDA (let the GPU do some of the processing),
but it will not work on all graphics processors, so you'd need fallback routines too.
-
We need to run Darwinbots on a supercomputer someday. I can't wait to see a zerobot after 10 million generations. Is there any way we can do this, ever? I suppose you could sign up with one of those shady web services that promise you billions of CPU cycles from an unnamed source.
-
We need to run Darwinbots on a supercomputer someday. I can't wait to see a zerobot after 10 million generations. Is there any way we can do this, ever? I suppose you could sign up with one of those shady web services that promise you billions of CPU cycles from an unnamed source.
Supercomputers work by having hundreds of processors, and DB can't even run dual core! I think that should be the next step, by the way. It would mean that, on dual-core machines, it could run roughly twice as fast.
-
Well, you can run DB to take advantage of two or more cores, but afaik it's somewhat inefficient: you just run a sim on each processor and network them all together.
BTW, a PC in two years will be a supercomputer anyway; we can just wait for octo-core i9s to come out.
-
We need to run Darwinbots on a super computer someday. I can't wait to see a zerobot after 10 million generations. Is there any way we can do this, ever?
Been there done that (once)
-
We need to run Darwinbots on a super computer someday. I can't wait to see a zerobot after 10 million generations. Is there any way we can do this, ever?
Been there done that (once)
What happened? They made a good point about the lack of parallel processing, which is ironic, since it seems like a lot of DB could be parallelized, even if a bit inefficiently.
-
Let me see if I can dig up an old save. I still intend to continue. Basically I upped the costx by hand (it works faster that way), but it made the whole thing labour-intensive.