Code center > Darwinbots3
Vision
Numsgil:
--- Quote from: Arzgarb ---"Infinitely far away" sounds quite scary when the performance goal is 10 cycles/sec with 1000 bots, since in the current version vision calculation takes the most processing power in a sim. Especially with 4 eyes and much better resolution. But if you can figure out a clever algorithm for the job, this would be really great.
--- End quote ---
I have a few ideas for clever algorithms. I can explain what I'm thinking if anyone is interested, but it would probably be pretty technical. And failing that, infinite distance is the goal, but I'll settle for less as necessary. Plus, the problem bears a lot of similarity to ray casting, which has some clever algorithms figured out.
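Since the thread doesn't spell out the algorithm, here's a minimal sketch of the ray-casting analogy under the assumption that bots are circles and each eye is a ray: for each ray, find the nearest bot it intersects. The function names (`ray_circle_hit`, `nearest_hit`) are hypothetical, not anything from the DB3 codebase.

```python
import math

def ray_circle_hit(ox, oy, dx, dy, cx, cy, r):
    """Distance along the ray (ox,oy) + t*(dx,dy) to the circle at
    (cx,cy) with radius r, or None if the ray misses. Assumes (dx,dy)
    is a unit vector and the ray origin is outside the circle."""
    fx, fy = cx - ox, cy - oy
    t = fx * dx + fy * dy              # projection of centre onto ray
    if t < 0:
        return None                    # circle is behind the origin
    d2 = fx * fx + fy * fy - t * t     # squared perpendicular distance
    if d2 > r * r:
        return None
    return t - math.sqrt(r * r - d2)   # distance to the near surface

def nearest_hit(origin, angle, bots):
    """Cast one eye ray and return (distance, bot) for the closest bot
    hit, or (None, None). Brute force O(n) per ray; a spatial grid or
    BSP-style structure would be needed to meet the cycles/sec goal."""
    dx, dy = math.cos(angle), math.sin(angle)
    best = (None, None)
    for bot in bots:
        t = ray_circle_hit(origin[0], origin[1], dx, dy, *bot)
        if t is not None and (best[0] is None or t < best[0]):
            best = (t, bot)
    return best
```

Note this gives "infinitely far away" vision for free, since a ray has no range cutoff; the cost question is purely how many bot tests each ray performs.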
--- Quote ---Also, the normally random signals that the bot has to process itself sound nice, but it'll need to be robust. Like a system codule that takes input (some rod/cone values in the int stack) and produces output (last value of boolean stack?) that is then used to decide whether or not the signal will then be processed by the actual eye codules. But this needs thinking, because a bot can't know how many rods will be activated, and so how many input values it needs to process...
--- End quote ---
I have a book on signal processing I'll look through. I'll have a better idea then of whether the processing should actually be done by the bot or if it's just something we simulate behind the scenes. My guess is that in the end there's an "optimal" algorithm for it that would be too complex for bots (and run time), so we'll just simulate it using a few sysvars to tweak parameters.
--- Quote from: Testlund ---I think this is moving too much away from the unicellular concept. I would prefer touch senses instead and eyes that can only sense darkness and light and maybe just recognize movement at close range.
--- End quote ---
Believe it or not, I'm keeping that argument in mind, since I know it's one you hold. I hope to expand out to other senses, like touch, hearing (basically detecting vibrations), and smell, so you can run blind sims if you want.
Also, since veggies will need the sim to have light levels (sort of like pond mode), we could tie vision into that system. So at very low light levels bots have to rely on other senses. That way it's possible to create an environment where blind bots and vision bots might coexist with different niches.
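One way to tie vision into the light-level system described above is to scale effective eye range by ambient light. This is a hypothetical sketch (the parameter names and the cutoff are my own assumptions, not a DB3 spec):

```python
def effective_range(base_range, light_level, floor=0.05):
    """Hypothetical scaling: eye range shrinks with ambient light
    (0.0 = pitch dark, 1.0 = full daylight). Below `floor`, vision
    returns nothing and a bot must rely on touch, hearing, or smell."""
    if light_level < floor:
        return 0.0
    return base_range * light_level
```

A scheme like this is what would let blind bots and vision bots coexist in different niches: in dim regions the vision bots pay eye upkeep for no information.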
--- Quote from: Prsn828 ---I think Testlund is right.
Here comes a new idea:
Keep the concept of rods and cones. Scrap the idea of multiple eyes.
I say each bot should get only one eye; after all, in real life an organism doesn't even get that, so if DB3 decides to evolve usable vision, it will have to evolve eye cells.
Even touch is a sense reserved for multicellular organisms, but due to computing limits, I think it is permissible in DB.
If we do decide on bots with individual eyes, we will need to be able to guarantee that ties will work precisely as they are supposed to, and can be controlled very delicately.
Finally, if we need to, I think each bot could specify a unique eye codule in its DNA that would have input values given by the eye, and would process those values.
These input values would be like sysvars, but they would be accessible only to the eye codule, and perhaps only one store value could be written to by the eye codule to prevent overuse of that codule.
--- End quote ---
In real life, individual cells just have eye spots. They let the cell detect brightness and maybe color. Building a system of vision from this for Darwinbots would be possible, but it would really be an amazing feat of bot engineering. Even a primitive eye involves hundreds of cells, and some structural engineering (invaginating the schmear of vision cells so the organism can sense direction). That doesn't include the engineering (and behind-the-scenes physics) necessary to create lenses.
On top of that, detailed vision and simple eyespot vision are roughly computationally equivalent, since the hard part is working out "what can see what".
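To make the "what can see what" cost concrete, here is one standard way to frame it (my sketch, not the actual DB3 plan): process bots nearest-first and treat each as an angular arc from the viewer's position; a bot is visible only if nearer arcs don't already cover it. The occlusion test below is deliberately crude (it only checks coverage by a single nearer arc and ignores wrap-around at ±pi); a real implementation would merge overlapping intervals.

```python
import math

def visible_bots(viewer, bots):
    """Angular-occlusion sketch: bots are (x, y, radius) circles.
    Returns the bots at least partially visible from `viewer`."""
    vx, vy = viewer
    entries = []
    for bot in bots:
        x, y, r = bot
        d = math.hypot(x - vx, y - vy)
        mid = math.atan2(y - vy, x - vx)
        half = math.asin(min(1.0, r / d))   # half-angle subtended
        entries.append((d, mid - half, mid + half, bot))
    entries.sort(key=lambda e: e[0])        # nearest first
    covered, seen = [], []
    for d, lo, hi, bot in entries:
        # crude test: occluded only if one nearer arc swallows it whole
        if not any(c_lo <= lo and hi <= c_hi for c_lo, c_hi in covered):
            seen.append(bot)                # part of its arc peeks out
        covered.append((lo, hi))
    return seen
```

Note the work here is identical whether the eye ultimately reports a single brightness value or a detailed image, which is the point being made above.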
So basically there's a whole spectrum of complexity available. At one end is something more realistic for cells, and at the other is something more like octopus eyes. While eye spots are more realistic, I think octopus eyes create a more interesting and identifiable sim for the average user. The end result is probably something like that first stage in Spore, or fl0w.