Author Topic: vision - distance discrimination  (Read 6374 times)

Offline EricL

  • Administrator
  • Bot God
  • *****
  • Posts: 2266
    • View Profile
vision - distance discrimination
« on: December 19, 2007, 06:11:50 PM »
I've got the rest of shape visibility implemented now for 2.43w.  There were two cases missing in pre 2.43w drops:

1) The case where a bot is off the corner of a shape and the eye spans the shape corner.  In this case, the bot's eye value registered the shorter of the distances to the intersection points of the two eye edges with the shape sides, not the distance to the corner as it should.  In cases of wide eyes where the eye edges don't even intersect the shape, the eye didn't see the shape at all!   Now the eye value correctly registers the corner as the closest part of the shape in view and it works for any and all eye widths.

2) The case where the bot is off the side of the shape and the eye spans the closest point of the shape (the point of intersection of the shape edge and a line drawn perpendicular to the shape edge through the bot's center).  Similar to the above, in cases of wide eyes, the bot may not even see the shape in current drops.  Now the closest point is registered and everything works for all eye widths.  (See the sketch after this list.)
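For anyone curious, here's a minimal sketch (Python, not the actual sim code, and with illustrative names only) of the closest-point calculation that both cases boil down to:

[code]
import math

def closest_point_on_shape_side(bot_center, side_a, side_b):
    # Closest point on one shape side (segment side_a-side_b) to the bot center.
    # If the perpendicular foot lands inside the segment, that's case 2 above;
    # if it lands past an endpoint, clamping gives the corner of case 1.
    ax, ay = side_a
    bx, by = side_b
    px, py = bot_center
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))              # clamp to the segment ends (corners)
    cx, cy = ax + t * dx, ay + t * dy
    return (cx, cy), math.hypot(px - cx, py - cy)
[/code]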

I have several issues for comment.  Let me try to be brief and precise.

Q1) Are bot eyes located at the bot center or at the edge of the bot (i.e. at its radius)?  Particularly for bots of large size, this has a bearing on absolute eye values, distance discrimination, and whether bots can and should see things inside themselves (such as when a vastly smaller bot penetrates them).  Currently, the eye value equation factors in the viewing bot's radius, but in sort of a strange way.   Imagine two bots, A and B, each exactly the same distance away from a third bot C.   If A is considerably larger than B, its eye value when looking at C will be somewhat higher than when B looks at C, solely because it is larger.  The purpose of this (I assume) is to sort of locate eyes on the bot's edge.  But A can still see things within itself, and the ultimate highest possible value of one of A's eyes (which occurs when it is centered right on the thing it is viewing) will be larger than B's ultimate highest eye value.   Basically, being larger conveys slightly better distance discrimination capabilities.

Q2) Should we increase the range of eye values for better eyesight distance discrimination?  This subject has been discussed before.  Currently we are using only a fraction of the range of values for eyesight, which seems to me like a waste.  If we want to allow for precision burrowing in shapes, detection of shape imperfections due to previous burrowing, the identification of a small hidden bot moored against the side of a shape using distance alone, precision ambush tactics such as hiding around shape corners, or precise dodging of visible shots, we are going to need better eyesight resolution.  I can still make it non-linear, as it is today, but doing so over a larger range of values would, I think, open the door to more precise behaviour and interaction with the environment.  I would implement this via a toggle of some sort for backward compatibility, either a per-bot settable sysvar or a DNA file flag.   Bots could choose to use the higher range or not.
Many beers....

Offline Numsgil

  • Administrator
  • Bot God
  • *****
  • Posts: 7742
    • View Profile
vision - distance discrimination
« Reply #1 on: December 19, 2007, 08:46:24 PM »
Q1: This is an artifact from way before my time.  Originally bot sizes were fixed, and the eye values were presumably decided arbitrarily by Carlo.  When varying bot radii were introduced (this existed before my time too, but I probably exacerbated the results when I added realistic volume calculations) the eye values were changed to make bots backwards compatible.

When I first started 2.4, I started messing with eye values to try and make them more self consistent by placing the eye on the edge of the bot.  But it broke most existing bots and was a real sore spot so I changed it back.

Another peculiarity of eyes and large bots: a bot's vision radius doesn't change as it gets larger.  Large bots can't see any further than small bots, but there's less room between them and the edge of their vision radius.

Q2: Don't mess with existing eye return values.  It's just a huge headache for everyone involved, at least in my experience.  An alternative might be to introduce another set of eyes that are more self consistent.

Offline shvarz

  • Bot God
  • *****
  • Posts: 1341
    • View Profile
vision - distance discrimination
« Reply #2 on: December 19, 2007, 11:59:39 PM »
I say please mess with whatever you want, as long as things improve

In my opinion:
- eyes should be located on the periphery of the bot
- eyes should not be able to see inside the bot (back towards center)
- eyes can return either the distance to the center or to the periphery; it's not that important, because bots can estimate their own radius and adjust accordingly. I think it makes more sense to measure the distance from the eye, not from the center.

Also, if you want to give bots control over how far they can see, then I'd suggest balancing it out by introducing a preset "depth of field". Say DOF = 50%: then if a bot sets its eye power to 100, it gets accurate numbers only within the 50-150 interval. Anything beyond 150 it does not see at all; anything closer than 50 it can't tell how close it is, it just gets the info that something is blocking the eye.
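Just to make the idea concrete, a rough sketch of such a depth-of-field rule (Python; the function name, the 50% ratio, and the return conventions are all made up, nothing in the sim works this way today):

[code]
def dof_eye_reading(true_distance, eye_power, dof_ratio=0.5):
    # Hypothetical depth-of-field rule: accurate distance readings only inside
    # [eye_power * (1 - dof_ratio), eye_power * (1 + dof_ratio)].
    near = eye_power * (1.0 - dof_ratio)
    far = eye_power * (1.0 + dof_ratio)
    if true_distance > far:
        return None          # beyond the far limit: nothing is seen at all
    if true_distance < near:
        return "blocked"     # too close to resolve: only know the eye is blocked
    return true_distance     # inside the band: accurate distance

# With eye_power = 100 and dof_ratio = 0.5, only distances in 50-150 are resolved.
[/code]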
"Never underestimate the power of stupid things in big numbers" - Serious Sam

Offline EricL

  • Administrator
  • Bot God
  • *****
  • Posts: 2266
    • View Profile
vision - distance discrimination
« Reply #3 on: December 21, 2007, 12:44:02 PM »
I'm noodling on the comments and will respond to them soon.

On a related issue, 2.43w makes a slight change in .refxpos and .refypos.  These now return the position of the nearest point on the viewed object that is in view of the focus eye.  For a viewed bot, this isn't a huge change, especially for small-radius bots.  The refvar position will now be the closest edge of the viewed bot that is in view, as opposed to the bot's center.   I don't think this will make much difference to omnieye bots or other refpos uses.

For shapes, obviously the change is much larger.  It actually makes these refvars worth something for shapes and allows a bot, for example, to chart and remember the dimensions of a shape or a specific location on a shape or portion of a shape, circle it at a fixed distance, return to or defend a specific den between two shapes, etc.
Many beers....

Offline Numsgil

  • Administrator
  • Bot God
  • *****
  • Posts: 7742
    • View Profile
vision - distance discrimination
« Reply #4 on: December 21, 2007, 12:58:43 PM »
Quote from: EricL
On a related issue, 2.43w makes a slight change in .refxpos and .refypos.  These now return the position of the nearest point on the viewed object that is in view of the focus eye.  For a viewed bot, this isn't a huge change, especially for small-radius bots.  The refvar position will now be the closest edge of the viewed bot that is in view, as opposed to the bot's center.   I don't think this will make much difference to omnieye bots or other refpos uses.

This might cause some issues.  Have you played around with any bots that use refpos for navigation to see how they behave?  I can maybe see some problems with bots orbiting each other.  Of course, as long as the bots are round the closest point will lie on the segment connecting the bot's center, eye, and target bot's center, so it might just change eye values a little.

Offline EricL

  • Administrator
  • Bot God
  • *****
  • Posts: 2266
    • View Profile
vision - distance discrimination
« Reply #5 on: December 21, 2007, 01:00:31 PM »
To be clear, the refxpos/refypos change does not change eye values.
Many beers....

Offline EricL

  • Administrator
  • Bot God
  • *****
  • Posts: 2266
    • View Profile
vision - distance discrimination
« Reply #6 on: December 21, 2007, 03:01:28 PM »
Quote from: Numsgil
Another peculiarity of eyes and large bots: a bot's vision radius doesn't change as it gets larger.  Large bots can't see any further than small bots, but there's less room between them and the edge of their vision radius.
Actually, this is incorrect.  Even in current versions, larger bots can see farther (from their center).  A bot's radius is added to the standard sight range to determine its actual sight distance.  So all bots can see the same distance beyond their radius.
Many beers....

Offline Numsgil

  • Administrator
  • Bot God
  • *****
  • Posts: 7742
    • View Profile
vision - distance discrimination
« Reply #7 on: December 21, 2007, 07:30:52 PM »
Quote from: EricL
Quote from: Numsgil
Another peculiarity of eyes and large bots: a bot's vision radius doesn't change as it gets larger.  Large bots can't see any further than small bots, but there's less room between them and the edge of their vision radius.
Actually, this is incorrect.  Even in current versions, larger bots can see farther (from their center).  A bot's radius is added to the standard sight range to determine its actual sight distance.  So all bots can see the same distance beyond their radius.

Is that new?  Back in my day the radius was locked at 1440

Offline EricL

  • Administrator
  • Bot God
  • *****
  • Posts: 2266
    • View Profile
vision - distance discrimination
« Reply #8 on: December 21, 2007, 07:46:37 PM »
Quote from: Numsgil
Is that new?  Back in my day the radius was locked at 1440
Actually, my mistake.  I added it for viewing shapes but not for other bots.  So, larger bots can see shapes further away than smaller ones, but it has no bearing on viewing other bots.
« Last Edit: December 21, 2007, 08:03:47 PM by EricL »
Many beers....

Offline EricL

  • Administrator
  • Bot God
  • *****
  • Posts: 2266
    • View Profile
vision - distance discrimination
« Reply #9 on: December 28, 2007, 09:27:22 PM »
Folks, I'm sorry, but I just had to do something about the eye value formula for 2.43y.  The current formula dates back to before bots had variable size, and the recent change to make bot sizes vary over an even larger range was exposing some problems.  In particular, eye values can currently go negative when a very large bot looks at something close up.

I think what I've done is fairly backward compatible, in that bots that trigger at eye values >0, >20, or >50 should still behave pretty much exactly the same, since those thresholds still mean about the same thing in terms of actual distance.  But one thing this change does do is add tremendous resolution when viewing things close up.  It also increases the effective distance at which larger bots become visible.

Okay, first let's talk about how far bots can see.   Assume two bots A and B.  A has radius 'a', B has radius 'b', and A is looking at B.  B will show up in A's eye (A's eye value will be some positive value) if the distance between their centers is less than a+b+1440.  It used to be just a+1440.  What this means is that instead of "seeing" a bot's center, bots now effectively "see" the other bot's edge.  This means that bots with large radii will be visible when their centers are farther away than bots with small radii; it's the closeness of the bot's edge that counts.   FYI, the largest possible bot, with a body of 32000, can with today's code have a radius somewhere around 450 I think, so this change can be significant when it comes to seeing very fat bots far off.
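As a minimal sketch (Python, not the actual code), the visibility test is just:

[code]
MAX_SIGHT = 1440  # standard sight range beyond the bots' edges

def can_see(center_distance, radius_a, radius_b):
    # B shows up in A's eye if the gap between their edges is under MAX_SIGHT,
    # i.e. the center-to-center distance is less than a + b + 1440.
    return center_distance < radius_a + radius_b + MAX_SIGHT
[/code]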

So, bots are visible to each other if the distance between their edges is less than 1440.  How do eye values vary with the distance D between edges?  Basically, it's an inverse-square relationship.  The value V of a bot's eye is given by:

V = 1/(D/1440)^2

That is, a bot's eye value is the inverse square of the distance expressed as a fraction of the maximum sight distance (currently fixed at 1440 for all bots and all eye widths).  This is pretty close to the previous formula, but cleaner and without some junk that refers to the old fixed radius size.  In practice, it means eye values are pretty much the same as they've always been unless the thing you're looking at is very close, in which case eye values can be significantly higher.  At close range, a bot's eyes become very sensitive to distance changes and can have values up to the maximum value for V of 32000.

Well, okay, not quite.  The real formula is:

V = 1/((D+10)/1440)^2 for D>=0

What is that D+10 all about? Well, if you do the math, the first formula gives eye values of over 32000 when bots are still separated by a little space (distances of about 8 or less).  I wanted to do two things.  First, I wanted eye values to change up close, so bots can tell the difference between being distance 2 away and distance 6 away, for example.  Second, I wanted bots to be able to tell the difference between being really, really close to something and touching, intersecting, or being inside it.  The D+10 does both.

In cases where bots are touching or intersecting, when the viewing bot is inside another bot (or inside a shape), or when the bot being viewed is inside it, V = 32000.  The above formula will not give values above 20736 for bots that don't touch each other.    This means bots do not get positional information for things inside themselves or when they in turn are inside something else; their eye values get "maxed out".   They will still get directional information about things inside them, i.e. the eyes that see something will read 32000 and those that don't will read 0.  I'd be happy to entertain the suggestion that bots should not even get directional information about things inside them - that eyes should read 0 in such cases - but, well, the above is how it is for 2.43y.
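Putting the pieces together, here's a minimal sketch of the eye value calculation as described above (Python, simplified; the actual touch/intersection test in the sim is more involved):

[code]
def eye_value(edge_distance, touching=False):
    # edge_distance is D, the distance between the two edges (>= 0).
    # touching covers touch, intersection, or one object inside the other.
    if touching:
        return 32000                         # maxed out: no positional info
    v = (1440.0 / (edge_distance + 10.0)) ** 2
    return min(int(round(v)), 32000)         # tops out at 20736 when D = 0

# Sample values: D = 1440 -> 1, D = 100 -> 171, D = 10 -> 5184, D = 0 -> 20736
[/code]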

I should point out that I don't currently take the curve of the viewed bot's edge into account.  That is, imagine a large bot being viewed close up such that multiple eyes can see it.  Today, all eyes will register the same value, as if the bot were presenting a straight edge instead of a curved one.  This is a temporary shortcoming that I plan to address soon.   I have already done so for shapes.  When a bot's eye sees a shape, the eye value of each eye is a function of the closest part of the shape visible in that eye, typically one side of the eye or the other unless viewing a corner.  This lets bots make out things such as shape corners in great detail, particularly up close.

I should point out that it also has the nice side effect of making small bots huddled against shape walls pretty hard to see.  Imagine a bot's eye hitting a small bot hiding against a shape wall.  Unless the eye is straight on to the wall, it is likely that one side of the eye or the other hits the shape and that that distance is less than the distance to the viewed bot's edge.  The eye will see the shape, not the bot.  Cool, huh?
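In sketch form, the rule each eye follows is just "report the closest thing in view" (again Python, illustrative names only):

[code]
def eye_report(objects_in_eye):
    # objects_in_eye: list of (name, edge_distance) pairs for everything
    # intersecting this eye's view cone.  The eye reports only the closest.
    if not objects_in_eye:
        return None, 0
    name, dist = min(objects_in_eye, key=lambda pair: pair[1])
    return name, eye_value(dist)   # eye_value from the sketch above

# A small bot hugging a wall: the shape is usually closer along the eye edges,
# so the eye reports the shape, not the hidden bot.
# eye_report([("shape wall", 40), ("hiding bot", 55)]) -> ("shape wall", 829)
[/code]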

And while we are on the subject, diving into a shape is a good way to hide.  Shapes can be penetrated to some degree, particularly if the bot continually thrusts into them.  Doing this is a great way to evade a pursuer (or jump out and surprise prey).

Anyway, before you condemn this change, let's try it out.  If I hadn't said anything, you probably wouldn't have noticed anyway!   So, give 2.43y a whirl and let's see what there is to see (pun intended).  Should be out later this weekend.
« Last Edit: December 28, 2007, 09:28:10 PM by EricL »
Many beers....

Offline Jez

  • Bot Overlord
  • ****
  • Posts: 788
    • View Profile
vision - distance discrimination
« Reply #10 on: December 30, 2007, 08:21:22 PM »
Just my thoughts: a bot's eyes, and therefore the vision distance, should be measured from the outside of the bot, and the precision of a bot's eyesight should be related to the distance between the eyes, as in trigonometric parallax (or, to analogise, 'Leela' from a favoured cartoon of mine!). This might screw 'single eye' bots, but I guess that is fair; everything with an upside should have a downside...
If you try and take a cat apart to see how it works, the first thing you have in your hands is a non-working cat.
Douglas Adams

Offline Jez

  • Bot Overlord
  • ****
  • Posts: 788
    • View Profile
vision - distance discrimination
« Reply #11 on: December 30, 2007, 08:34:06 PM »
In fact, to add to that: I don't see why a wide-vision eye should be able to see as far as a narrow-vision eye. Wide (vision) looks short, long (vision) looks long, is how I always imagined it, without getting into the whole trig argument.

Also, considering some of the games I play, eyes should cost for both introduction and use; 9 eyes is a bit of a utopia...

Someone said "there is no such thing as a free lunch" and I think that epitaph should be applied more vigorously WRT (with respect to) eyes.
If you try and take a cat apart to see how it works, the first thing you have in your hands is a non-working cat.
Douglas Adams

Offline Numsgil

  • Administrator
  • Bot God
  • *****
  • Posts: 7742
    • View Profile
vision - distance discrimination
« Reply #12 on: December 30, 2007, 11:16:45 PM »
To make it fair, you could balance it so that the viewable area for bots is held constant.  Something like pi * 1440^2 / 4.  As you increase the view angle, the shape of that pie slice changes, but the amount of area you're actually looking at stays balanced and constant.
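A minimal sketch of that trade-off (Python; the constant is just my pi * 1440^2 / 4 example, the rest is plain sector geometry):

[code]
import math

CONSTANT_AREA = math.pi * 1440.0 ** 2 / 4.0   # example figure from above

def sight_range_for_angle(view_angle_radians):
    # Sector area = (theta / 2) * r^2, so r = sqrt(2 * A / theta):
    # wider eyes see a shorter distance, narrower eyes see farther.
    return math.sqrt(2.0 * CONSTANT_AREA / view_angle_radians)

# A quarter-circle field of view gives exactly range 1440; doubling the
# angle cuts the range by a factor of sqrt(2).
# sight_range_for_angle(math.pi / 2) -> 1440.0
[/code]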

Offline EricL

  • Administrator
  • Bot God
  • *****
  • Posts: 2266
    • View Profile
vision - distance discrimination
« Reply #13 on: December 31, 2007, 12:12:09 PM »
On the subject of varying eye sight distance as a function of width, please see this topic.
Many beers....

Offline MacadamiaNuts

  • Bot Destroyer
  • ***
  • Posts: 273
    • View Profile
vision - distance discrimination
« Reply #14 on: December 31, 2007, 01:36:54 PM »
PS. The radii adjustment to the distance was very much needed. It was annoying when bots had to bump the fat food to trigger a > 70 gene while they kept spraying and praying at tiny bots.
Sometimes you win, and sometimes you lose...