Eyes

Jez:
Do you think that bots would benefit from having their eyes changed a bit? The following quotes come from a previous discussion here:

--- Quote ---What I can't do is mimic the behaviour of simple things like fish, because they have two eyes and bots don't. I can't mimic shoal behaviour, for instance, only mass-following behaviour.
--- End quote ---

--- Quote ---How about giving bots greater control over eyes, so that, for example, each refvar gets split into 1-9?
So *.refeye9 would read back the eye reference of a bot in your eye9.
--- End quote ---
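
Just to make that concrete, here is a minimal DNA sketch of what a per-eye refvar might let you write; the .refeye9 sysvar is hypothetical and doesn't exist in the current version:

--- Code: ---
' Hypothetical gene: react to a conspecific seen in eye9.
' .refeye9 is the proposed per-eye refvar and does not exist today.
cond
*.eye9 0 >
*.refeye9 *.myeye =
start
200 .aimdx store  ' illustrative reaction: turn (direction and amount made up)
stop
--- End code ---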

--- Quote ---What I was thinking when I wrote that was more along the lines of two eyes, plus being able to choose where they are located: the difference between having eyes at the side of the head or on the front of the head.
--- End quote ---

--- Quote ---Maybe something similar to what I'm doing with ties. Have a command (switcheye or something) that changes which eye has the "focus", and as such changes the info read from the refvars during DNA execution. You could change your "focus" from eye5 to eye3 and back again in a single cycle.
The only possible issue is that while this increases potential complexity (which is good), it decreases the reaction time that bots have. Already bots can accomplish a lot in a single cycle. The few cycles it takes to turn and "lock" onto an opponent are very important to the fitness landscape, and we should be careful before we change it.
--- End quote ---
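
As a rough sketch of how that could read in DNA, assuming a hypothetical .focuseye sysvar implements the "switcheye" idea (none of this exists yet, and the -2 semantics are made up):

--- Code: ---
' Hypothetical gene: if eye3 sees something closer than eye5, shift the
' refvar "focus" there, react, then shift it back -- all in one cycle.
' .focuseye is the proposed "switcheye" command and does not exist yet.
cond
*.eye3 *.eye5 >
start
-2 .focuseye store    ' assumed semantics: shift focus from eye5 to eye3
*.refveldx .dx store  ' refvars would now describe the object in eye3
0 .focuseye store     ' return focus to eye5 before the cycle ends
stop
--- End code ---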

--- Quote ---Where the eyes overlap, the animal has depth perception; where it can see with only one eye, it has no depth perception but can ID objects.
That's what the bots are doing wrong, isn't it! They have depth perception through the side eyes but can't ID.
--- End quote ---

--- Quote ---If we remove the distance part of eye5 and require bots to triangulate the distance to other bots, these are the consequences as I see them.
1. We would need to add some trig functions. Perhaps map the usual 0-360 degrees and -1 to 1 of regular sin and cosine to 0-1256 and -1000 to 1000.
2. Things get more complex for the bots. Specifically, finding the distance to the target becomes rather difficult, and finding the velocity even more so.
3. It may not be possible to tell the difference between a really big bot and a very close bot unless you form a multibot and use stereoscopic vision.
4. A lot of old bots simply wouldn't work.
I would be willing to make this fundamental change as part of the feature set for 3.0, since upgrading the numbers gives us a legitimate reason not to support older bots, but it would be a huge change.
--- End quote ---
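
Purely as an illustration of point 1, here is what the proposed integer trig might look like in DNA; the sin and cos commands and their scaling are hypothetical, taken straight from the quote:

--- Code: ---
' Hypothetical: integer trig under the proposed scaling
' (full circle = 1256, sin/cos return -1000..1000 instead of -1..1).
' Decompose a desired speed of 20 along a bearing of 157 (an eighth of a circle):
cond
*.eye5 0 >
start
157 cos 20 mult 1000 div .up store  ' forward component, about 14
157 sin 20 mult 1000 div .dx store  ' sideways component, about 14
stop
--- End code ---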

Numsgil:
If we do this we need to do it very carefully. I've already put a lot of input into this topic, so I think others should start speaking up as well.

Exactly what would you change about the way that eyes work that would make them more realistic, or whatever your measure of "better" is?

Ramiro:
I think that eyes all around the bot could be interesting. Sometimes it would be useful and sometimes not, but it would be quite challenging to make a bot take the right course of action when it sees many enemies around.

Binocular perception could be used to assess distance, which would let us eliminate the variables that return the EXACT distance, something that hardly occurs in nature.

EricL:
I have been doing some thinking in this area as I am working on making shapes visible to bots. Today they are not visible. They block what is behind them, but instead of seeing a shape and being able to know its distance, that it is a shape, etc., it appears to the bot that there is simply nothing in that direction, even if there is a bot on the other side of the shape that would be within viewing range were the shape not there. So, today, bots can hide behind shapes, but they have no way of knowing that they are doing so or of evolving behaviour to use or interact with shapes.

Today, the only things bots can see are other bots. They can't see shots, field borders, ties, teleporters or shapes. They can assume that if they see something, anything, it's a bot. This will and should change. My vision (pun intended) is that the world gets more complex over time, and the variety of objects in a sim, and thus the variety of things bots can see and hence interact with, increases. I want to put aside for a moment questions about how many eyes there should be, whether refvars should work for eyes other than eye5 and so on, and focus (pun intended) on the question of how bots should differentiate between the different kinds of objects that come into view, because this is likely to be the next area of vision I work on.

In DB, for reasons of practicality, we take shortcuts over the route nature took. We could implement photons, for example, make vision based upon reflected light, and require that bots evolve recognition logic to sense and distinguish the different photon reflection patterns that constitute a bot vs. a shape (not to mention a moving bot vs. a fixed bot vs. a far-away bot, etc.). Needless to say, we would be at it a very long time.

So instead, we use eye distance numbers and refvars to bootstrap much of the underlying machinery nature had to evolve, so that our bots can focus (hee hee) on evolving behaviour that utilizes these built-in mechanisms rather than on evolving the mechanisms themselves.

So, I intend to simply add on to the existing refvar paradigm for object type recognition by adding a .reftype sysvar. A reftype of, say, 0 would indicate that the closest thing visible in eye5 is a bot, any type of bot. Or, we could go finer-grained than that and reserve different type values for different types of bots, e.g. autotroph, non-autotroph, corpse. But my main point is that there would be other values indicating that what the bot was looking at was something other than a bot. The first one I would implement would be for looking at a shape. We could extend this to include a type for the field border if we wanted bots to see the edges of the world, as well as any new kinds of objects we add in the future.
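
For illustration, a chase gene gated on the new sysvar might look something like the sketch below; .reftype itself and the value 0 for "bot" are placeholders for whatever I actually end up implementing:

--- Code: ---
' Hypothetical gene: only chase the thing in eye5 if it is a bot.
' .reftype is the proposed sysvar; 0 = bot is an example value only.
cond
*.eye5 0 >
*.reftype 0 =
start
*.refveldx .dx store        ' match the target's sideways velocity
*.refvelup 30 add .up store ' close the distance
stop
--- End code ---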

It does mean that existing bots will need to add logic to avoid trying to chase/attack/tie to/run from/swarm with shapes, but I don't see a way around that without inventing a new kind of sense. For example, we could add brand new sysvars for echolocation or infrared vision or something as a new sense with its own set of refvar-like mechanisms and not enhance eyes, but that has many new issues, not the least of which is that it is a lot of work and the sysvar space is limited. Besides, going the refvar route will only confuse old bots in sims where there are shapes or other future non-bot things that can be seen, which I think is acceptable.

Comments?

Numsgil:
Why don't we delve a bit into bot psychology for this? Right now, for a bot, everything it sees is a bot. If we add things that the bot can see, then to the bot these would just be new kinds of bots.

So instead of telling bots that this is a wall, this is a teleporter, etc., we should tell bots that this has so much energy, this has so much body, etc.

Motion is important for real animals when they decide if something is alive (another animal) or not. There are two types of motion: translational and deformational. A car moving down a highway has very little deformational motion but a lot of translational motion. It seems to just glide.

Something like a man on a treadmill has a lot of deformational motion but little translational motion. The man wiggles and squirms but doesn't go anywhere.

So I propose giving bots two sorts of senses. One detects translational motion (the refvels); the other detects "spinning" motion (a bot spinning looking for food, an outbound teleportation vortex, etc.).

Using these senses, smart bots can distinguish between all the sorts of "bots" in their environment.

Other living bots - They move (translating if hunting or spinning if idle) and have nrg > 0.

Dead bots - They don't spin, and have body > 0 and nrg = 0.

Algae - Default algae has refeye = 0; I'm not sure what a good rule would be for any old plant.

Walls - No motion whatsoever, body = 0, nrg = 0.

Teleporters - body = 0, nrg = 0, lots of spinning motion.

This way, to the bots, everything in the world is a bot; some just have different properties.
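
As a rough sketch of how a bot could use this (today's refvars plus a hypothetical .refspin sysvar for the spinning sense; none of the names or values are final), the wall and teleporter cases might read:

--- Code: ---
' Hypothetical classification genes using the categories above.
' .refspin (spinning motion of the thing in view) is proposed, not real.

' Wall: no spin, body = 0, nrg = 0 -- stop pushing against it
cond
*.eye5 0 >
*.refbody 0 =
*.refnrg 0 =
*.refspin 0 =
start
0 .up store
stop

' Teleporter: body = 0, nrg = 0, but lots of spin -- swim into it
cond
*.eye5 0 >
*.refbody 0 =
*.refnrg 0 =
*.refspin 0 >
start
40 .up store
stop
--- End code ---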
