Neural nets and mutations
Moonfisher:
Yeah I kinda figured it was something like that...
I could just use more control, though. Then I could completely lock mutations of store commands, or at least set them to have far lower odds than mutations of values...
The two kinds of mutations don't really seem that related to me; they have very different impacts on the bot they act on. Changing a value usually just shifts the range of some condition, or the strength of shots, or repro size and things like that, whereas changing a store command will usually have a drastic and often very negative effect, like suddenly reproducing with 1% of your nrg every cycle, or eliminating a condition, or breaking every condition in a gene, or disabling shot boosts...
It just doesn't seem well balanced; it feels like the kind of change that should take a lot of mutations to happen in nature. I don't know exactly how this works in nature, but nature isn't code, so I don't think it breaks down this easily, and I don't think single point mutations would have such destructive effects...
The way I imagine it, a mutation of an operator would only move it slightly in one direction, getting closer to another operator but still staying the same for now...
So an add would have to mutate maybe 20 times in the same direction to become a sub, and 200 times in a particular direction to become a mult...
Basically, operators would have a whole range of possible values; you would place related ones near each other and leave wider gaps around fragile operators like store, stop, start and cond...
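Roughly what I have in mind, sketched in Python (the band layout and step size are made up just to show the idea, nothing DB actually implements):
Code:
import random

# Hypothetical layout: each operator owns a band on a 0-999 axis.
# Related operators sit close together; fragile ones get wide buffer gaps.
OPERATOR_BANDS = [
    ("add",   0,   99),
    ("sub",   100, 199),
    ("mult",  200, 299),
    ("div",   300, 399),
    ("cond",  500, 549),
    ("start", 650, 699),
    ("stop",  800, 849),
    ("store", 950, 999),
]

def operator_at(position):
    # Return the operator whose band contains this position, if any.
    for name, lo, hi in OPERATOR_BANDS:
        if lo <= position <= hi:
            return name
    return None   # a position in a gap would keep its old meaning

def mutate_position(position, step=5):
    # A single point mutation only nudges the position a few units.
    return max(0, min(999, position + random.randint(-step, step)))

# An add sitting at 50 needs at least 10 mutations in the same direction
# before it drifts into the sub band, and far more to ever reach store.
pos = 50
for _ in range(30):
    pos = mutate_position(pos)
print(pos, operator_at(pos))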
That, or just the ability to turn off operator mutations, would be very useful for my purposes. So if EricL is messing with mutations anyway, a checkbox to disable mutations of operators would be a sweet addition.
EricL:
I believe it does indeed work this way in nature at one level, i.e. small changes in gene coding can have large impacts on functionality and expression. A single base pair change can create a stop codon in the middle of a gene, for example, which is a fine way to disable the gene. I'm no geneticist, but I do not think the adjacency space for coding DNA maps smoothly to the functional protein space. Said another way, I do not think one can make strong predictions along the lines that small changes in coding DNA result in small changes in protein structure and that large changes result in large changes. Sometimes this is the case, I imagine, but mostly only where there is correction logic at the transcription level that ignores long run repeats and such. If a mutation really changes a codon so that it codes differently, the resulting change in the protein can be large or small.
Biological organisms appear more resilient to single nucleotide polymorphisms because they have evolved higher-level redundancy above the coding level. Often multiple genes or even whole gene families work together and overlap in what they do, so the loss of one gene may not be serious. Additionally, coding DNA requires context. A gene that suddenly codes for a new protein may not work well with the promoters and other triggers for its production, and thus its expression may be naturally suppressed.
One can argue, I suppose, that the DB adjacency space is smooth relative to the resulting functionality in cases where the base pair in question represents the value being stored to a memory location, i.e. that a small change in the value of a sysvar results in a small change in functionality for most memory locations. This is not the case, however, for many other base pairs. An obvious example is a change to the location to which the store is being performed. So simply restricting or greatly reducing the probability of mutating base pair types may not achieve what you are after. Small changes in base pair value can still result in huge changes in the resulting functionality.
I understand what you are trying to do with the neural net, but I'm not really inclined to add a lot of special-case functionality to the simulator for things that are somewhat outside the mainline focus of DB, particularly when I think they should be done differently. If you're trying to build plasticity into a single-celled organism, I don't think you need a net for that. I think the right way to do it is with inherited values that determine high-level functionality in a non-mutating environment. LionFish, for example, has around 20 different private values which govern its high-level behaviour. It mutates them itself from generation to generation and lets selection work at that level. It evolves and adapts to changing environments and competitors, within its design limits, by changing these values generation to generation without mutations being enabled. Its behaviour is plastic.
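The pattern looks roughly like this in Python (a made-up sketch of the approach, not LionFish's actual DNA):
Code:
import random

# Each bot carries a handful of heritable behaviour parameters and perturbs
# them itself at reproduction time, so selection can act on them even with
# the simulator's mutations switched off.
class Bot:
    def __init__(self, params):
        self.params = params   # e.g. repro threshold, shot power, flee distance

    def reproduce(self):
        # The lineage applies its own small, bounded "mutations" to the copy.
        child_params = {name: value * (1 + random.uniform(-0.05, 0.05))
                        for name, value in self.params.items()}
        return Bot(child_params)

ancestor = Bot({"repro_threshold": 20000, "shot_power": 16, "flee_distance": 40})
child = ancestor.reproduce()
print(child.params)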
If you really want to build a net, then I think the nodes in that net should be whole bots. I'd be very interested in adding capabilities to improve bot-bot communication...
Bottom line, I don't think a system that uses the simulator mutation code to change edge weights will work even if we made the changes and restricted the mutation space as you suggest. I think you should manage your own network weighting in your DNA with mutations turned off completely, either within a single bot a la LionFish or with multiple cooperating bots acting as nodes in a much larger net.
That said, if you still really want the mutation space restriction options, I will put them on the list, if for no other reason than that it is a requested feature that would be simple to implement.
Moonfisher:
You're probably right; a neural net in nature consists of many cells working together, and it's not how single cells work.
But a multicellular network would not work in DB, since it's probably something that evolved from the sensory network in larger organisms: from having simple reactions to pain receptors, to forming more complex reactions from more inputs, and finally to actual thoughts. (I know a brain is more than just a neural network, but octopuses keep most of their neurons outside a central brain and still seem capable of fairly "complex" thoughts.)
I can also understand that a high amount of redundant code and the cooperation of several genes will limit the damage caused by mutations.
I did try to split up the network into a lot of genes, but that just destroyed the network more often.
The problem is that for every weight I have an input, an add, a mult and an output... and only value mutations have even a slight chance of doing something useful with the network, whereas operator mutations have a high chance of breaking the structure.
So there's only a 1 in 5 chance of getting a mutation that even has a small chance of being useful, and it would take several useful mutations to actually get anywhere.
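Back-of-the-envelope, that's where the 1 in 5 comes from (token names and the "useful" odds are just illustrative):
Code:
# Per weight the DNA holds roughly these five tokens, only one of which
# is the weight value itself.
node_tokens = ["*.input", "weight", "add", "mult", ".output"]
p_hit_weight = node_tokens.count("weight") / len(node_tokens)   # 0.2
p_useful_given_hit = 0.1    # assumed small chance the new value actually helps
print(p_hit_weight * p_useful_given_hit)   # ~0.02 useful mutations per point mutation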
Locking mutations would limit the network to just 20 weights... so I would face the choice of severely limiting the size of the network, or eliminating certain connections that seem to have no effect, or training the network first and then only exposing the key weights with the largest impact to change.
I could also set a lower cap on the weights, letting me use the binary operators to store 2 different weights in one location (at the cost of a fairly low cap on weight values).
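Something like this packing scheme, assuming each weight is capped to 100 possible values (the numbers are just for illustration):
Code:
# Two small weights share one integer memory location.
# Assumes each weight is capped to the range -50..49 (100 possible values),
# so the packed number stays well inside a DB memory location.
CAP = 100
OFFSET = 50

def pack(w1, w2):
    return (w1 + OFFSET) * CAP + (w2 + OFFSET)

def unpack(stored):
    return stored // CAP - OFFSET, stored % CAP - OFFSET

assert unpack(pack(-12, 37)) == (-12, 37)
# A point mutation on the stored number mostly jiggles w2; bigger changes
# bleed into w1, which is the price of sharing the location.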
Or I could keep a small offset together with a weight-modifying value in one location, making it possible to change any weight in the network while still being limited to changing only 20 different weights at a time (the only advantage being that the network could prioritize and sacrifice certain weight changes for others).
That last idea is probably the one I like best, but it's still not really as flexible as I had hoped.
The best I can think of is building each weight from many genes, 20 genes adding 5 and 20 genes subtracting 5, for instance.
This way the odds of changing the weight would be a lot better.
The downside is that I have a LOT of weights, so multiplying that amount by 40 would make the bot very long... very very long...
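In Python the redundancy idea would look roughly like this (hypothetical numbers):
Code:
import random

# A weight built as the sum of many small contributions: 20 genes add 5
# and 20 genes subtract 5, so knocking out any single gene only moves the
# effective weight by 5 instead of destroying it.
genes = [+5] * 20 + [-5] * 20     # nominal weight = 0

def effective_weight(gene_list):
    return sum(gene_list)

mutated = genes[:]
mutated[random.randrange(len(mutated))] = 0   # one gene gets disabled
print(effective_weight(mutated))              # -5 or +5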
And the point was to get faster evo results, so I'd rather not slow down the sim too much...
And you're also right that locking operator mutations wouldn't prevent the network from breaking down; however, the mutations to the structure, even when harmful, would still have a small chance of doing something useful instead of completely destroying it.
Also, the absence of inc and dec would reduce the odds of shell starting to form from a mutated store rather than from something in the network, creating the possibility for the shell amount to be regulated later on.
But breaking down weights into several genes would probably be more efficient than locking operator mutations, although in the long run it might get harder for the weights to adjust as more of the contributing genes get broken...
I think the best setup for my purpose (and I realize it doesn't quite fit single-cell evolution) would be to do both: break each weight down into 4 or more values to increase the odds of the weights changing versus the odds of the mem locations changing, and... lock mutations of operators.
Even broken nodes can still be useful, since they are still modified by some weights before reaching an output.
I know it's not the right way to do it, it's kind of cheating at evolution... but then Darwin will just have to sue me, I want the feature!
(If it's not too much trouble)
I'm also going to try to tweak league bots with it, but it's hard to improve league bots this way, since you need a steady flow of enemies joining the sim.
The best way I've found so far is adding another league bot as the alga, with a very high veggie cap and no energy gain, and giving Alga Minimalis some more starting energy. The problem is that this only lets the bot evolve against weaker bots... but it might still be enough to tweak certain values.
Also, the whole redundant-code thing you described isn't far from what a NN is doing: random values and inputs getting scrambled together to create some sort of output... I'm just trying to nurture the structure.
Maybe I'm overprotective, but I'm very impatient and I want good evo results NOW!
Numsgil:
Set up a birthing process, where the mother bot "teaches" the baby bot using the in/out of several cycles. It wouldn't be foolproof or automatic, but it would let you use more weights.
Moonfisher:
Hmm, I think that would be enough; it takes 15 cycles to transfer epigenetic memory anyway, so I could transfer at least 150 extra weights.
I wouldn't need an offset and cap for the weights; I could just send them in some fixed order and have an end value to indicate when we're done, or just break the tie...
I think birth ties break automatically after 15 cycles, even if you try to stiffen them. But 170 weights is a lot more to play around with... I'm going to have to figure out how many inputs, outputs and neurons that would allow, once I'm fully awake...
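As a toy model of what the hand-off would look like (assuming 10 tie channels and about 15 cycles before the birth tie breaks, as above):
Code:
# Toy model of the hand-off: while the birth tie lasts (about 15 cycles),
# the mother writes 10 values per cycle to tout1-10 and the baby copies them
# from tin1-10 into its own weight table, in a fixed order.
CHANNELS = 10      # tout1..tout10 / tin1..tin10
TIE_CYCLES = 15    # the birth tie breaks after roughly this many cycles

def teach(mother_weights):
    baby_weights = []
    for cycle in range(TIE_CYCLES):
        chunk = mother_weights[cycle * CHANNELS:(cycle + 1) * CHANNELS]
        if not chunk:            # nothing left to send: break the tie early
            break
        baby_weights.extend(chunk)   # what the baby reads off tin1-10 this cycle
    return baby_weights

weights = list(range(137))           # anything up to 150 values fits
assert teach(weights) == weights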
It would definitely be a good tool for a plastic network, one using fewer raw inputs and outputs and transforming everything into a certain range of values... so the structure could be more complex.
The flexibility and possibilities would still be predictable and fairly limited though... since inputs and outputs can't change...
But it's definitely a good point; it didn't even occur to me to use tout1-10 during birth. The worst that can happen is that the tie breaks and a baby is let go prematurely...
What bugs me, though, is that canis often eat their young, and the young always spawn at the front, like a snack table... so newborns need to get clear of their parents fast, and the birth tie would make that hard. I'm hoping it would still be possible, but even with bots that break birth ties right away you'll often see them eat 2-3 of their own young before successfully reproducing.
I guess the base network will need a conspec check, but if the network is built this way I should probably transform the inputs and outputs, and have a lot more binary outputs, so the whole thing would be less "natural" anyway...
Still not a huge number of weights, though... it wouldn't be enough for 10 inputs, 10 hidden neurons and 10 outputs... and the networks I'm fiddling with now are already bigger (and not big enough).
But it's definitely worth thinking about some more; if the network transforms values, it can have a lower weight range and allow 2 weights per mem loc, bringing it up to 340 weights.
Although capping the weights would also create more limitations, if it's even possible; I've seen weights go way off the scale in a network...
I'll think about this when I have some time; work calls.