Code center > Suggestions

Implementing Sound


Numsgil:
Sounds are vibrations.  Just at different Hertz.  And certainly unicellular critters have other means of gaining information about the ones around them.  They're not some sort of stupid animal mindlessly bumbling through the water.

Testlund:
I meant the different kind of hertz that creatures can feel, not the kind you need specialised senses to pick up. As I understand it, single-celled organisms can react to things they touch or smell, but they can't hear or see, right? ...other than light or darkness in some species. Otherwise they appear pretty mindless to me when I've looked into a microscope or watched TV programmes about such organisms.

Numsgil:
I haven't been able to find much discussion about what single celled organisms can and can't sense, but when biology is concerned I tend to err on the side of giving them credit for abilities.  I am constantly astounded by the sophistication of even simple organisms.

Testlund:
I agree there is a lot to be surprised about in biology, and as a matter of fact I decided to do a search on Google to see if anything had been written about this. I found this about bacteria and sound:

http://www.jstage.jst.go.jp/article/jgam/44/1/44_49/_article

...and this article with a link to a pdf document with remarkable information about complex behavior in bacteria.

http://mnemosynosis.livejournal.com/10810.html

So if you want to implement the ability for bots to perceive and transmit sound, I guess it might not be too crazy an idea after all.

EricL:

--- Quote from: rsucoop ---I assume this is impossible Eric?
--- End quote ---
Nothing is impossible, but my time to work on DB is finite and the list of things to implement is long.  Realistically, I would not get to this in the next 6 months even if there were no open questions.  People are always welcome to create their own forks of the code if they don't want to wait for me - others have done this in the past and I would be happy to assist such efforts, as well as integrate working code back into the main line if and when appropriate.

I'm all for adding some functionality in this general area, but there are a bunch of implementation issues with what you propose.  I'll just point out one: the computational hit of anything that has an N^2 relationship between bots, such as vision or hearing, is proportional to the distance over which it acts.  The farther sound propagates, the more bots might be emitting a sound a given bot can hear, the more sound interactions have to be integrated for each bot, and so on.  As such, I'm pretty opposed to implementing anything (with the possible exception of the suggestion below) that operates over a range longer than the bucket dimension (4000), which is just a tiny bit farther than how far a bot with the largest possible radius can see with the narrowest possible eye.
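The scaling concern EricL raises can be illustrated with a small sketch (hypothetical code, not the actual DarwinBots implementation): if bots are hashed into fixed-size spatial buckets, a query for everything within "hearing range" must scan every bucket that range overlaps, so the number of candidates examined grows roughly with the square of the range.

```python
from collections import defaultdict
import random

BUCKET = 4000  # the bucket dimension mentioned in the post

def build_grid(bots):
    """Hash each bot's position into a fixed-size bucket."""
    grid = defaultdict(list)
    for x, y in bots:
        grid[(x // BUCKET, y // BUCKET)].append((x, y))
    return grid

def candidates_within(grid, x, y, radius):
    """Collect bots from every bucket the hearing radius overlaps.
    The number of buckets scanned grows as roughly (radius / BUCKET)^2,
    which is why long-range senses get expensive fast."""
    r = radius // BUCKET + 1
    bx, by = x // BUCKET, y // BUCKET
    found = []
    for i in range(bx - r, bx + r + 1):
        for j in range(by - r, by + r + 1):
            found.extend(grid[(i, j)])
    return [(px, py) for px, py in found
            if (px - x) ** 2 + (py - y) ** 2 <= radius ** 2]

random.seed(1)
bots = [(random.randrange(40000), random.randrange(40000)) for _ in range(2000)]
grid = build_grid(bots)
# A sense limited to one bucket dimension stays cheap; quadrupling the
# range means roughly sixteen times the area (and candidates) to check:
near = candidates_within(grid, 20000, 20000, BUCKET)
far = candidates_within(grid, 20000, 20000, 4 * BUCKET)
```

This is only an illustration of why capping the range at one bucket dimension keeps the per-bot cost bounded; the real simulation would integrate sound contributions rather than just collect neighbors.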

Another pet peeve of mine, which is not specific to this suggestion: I hold the opinion that not only is it unnecessary to implement real-world physics in all cases for our digital organisms, but that doing so can be counterproductive to evolving complexity.  I see no reason, other than appealing to human intuition and creating environmental richness, that the DB world must emulate the real world, and environmental richness, I would argue, can often be achieved more efficiently through the implementation of more "native" digital means for bots to interact with each other and the environment, and NOT by emulating real-world physics.  Others disagree with me on this and I won't belabor it here, but it is another reason why I am not keen to spend a bunch of time myself writing code whose main purpose is to simulate a real-world phenomenon.  As I say above, I would much prefer to focus on what interactions we are trying to achieve - what bot-to-bot or bot-to-environment scenarios we are trying to allow - and let the features and physics follow from that.

Do we want a "hey you, I'm over here!" mechanism for one bot to shout at another when one or both are not looking in the right direction, or for a bot to "hear" the bot nearest to it, or are we after a more general-purpose means for bots to gather more data about their surroundings than vision alone currently allows?  If the former, I might offer up a simple suggestion (this is off the top of my head) such as populating a bot's refvars and/or in/out pairs with data from the nearest bot (no matter what the range) when there is nothing in the focus eye, essentially allowing him to "hear" the nearest bot and gather data about it without seeing it.  He could turn in that direction, move towards it, run away from it, communicate with it, etc.  We might choose to disallow writing to a bot's memory until he can see it, to prevent long-range memory attacks.
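The fallback EricL sketches might look something like this (all names and structures here are hypothetical, not actual DarwinBots code): when the focus eye sees nothing, populate the refvars from the nearest bot at any range, but flag the target as non-writable until it is actually seen.

```python
import math
from dataclasses import dataclass, field

@dataclass
class Bot:
    x: float
    y: float
    energy: float
    memory: dict = field(default_factory=dict)

def nearest_other(bot, bots):
    """Return the closest bot other than `bot` itself."""
    others = [b for b in bots if b is not bot]
    return min(others, key=lambda b: math.hypot(b.x - bot.x, b.y - bot.y))

def sense(bot, bots, seen_in_focus_eye):
    """Populate hypothetical refvars.  If the focus eye sees a bot,
    use it as the target; otherwise 'hear' the nearest bot at any
    range.  Writing to the target's memory is only allowed when it is
    actually visible, blocking long-range memory attacks."""
    target = seen_in_focus_eye or nearest_other(bot, bots)
    refvars = {
        "refenergy": target.energy,
        "refdir": math.atan2(target.y - bot.y, target.x - bot.x),
        "canwrite": seen_in_focus_eye is not None,
    }
    return refvars, target

bots = [Bot(0, 0, 100), Bot(300, 400, 50)]
refvars, heard = sense(bots[0], bots, seen_in_focus_eye=None)
# With nothing in the focus eye, the bot 'hears' its nearest neighbor:
# refvars["refdir"] gives a bearing it can turn toward or flee from,
# while refvars["canwrite"] stays False because the target is unseen.
```

The design point this sketch captures is the asymmetry in the suggestion: hearing grants read-only awareness at unlimited range, while the existing vision rules still gate any interaction that could modify another bot.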

If the latter - providing bots better means to gather data about their surroundings - then my inclination is to build upon vision, since that seems to me a natural place to do it.
