gimmick for zerobot evolution
The_Duck:
--- Quote from: EricL ---I suspect that someday, an analysis of evolved sequences will show things such as a much higher probability that a point mutation will result in something functional, for example, than does a similar hand-authored sequence that achieves the same functionality.
--- End quote ---
I recall reading an article in Scientific American a few years ago that was about something to this effect in real DNA; here is an abstract that talks about it: http://www.ncbi.nlm.nih.gov/pubmed/14604186?dopt=Abstract
Moonfisher:
I definitely agree that surviving is the best possible fitness function you can hope for; everything else is just qualified guesswork...
You never know what could happen. Even if you observe a species or certain mutations or behaviors, you never really know whether they were bad or good, since it all depended on a large number of factors at that time. So any assumption you try to make will likely be wrong; you can hope to kill off more bad bots than good ones, but you also risk inhibiting evolution from going in certain directions and generally killing off some of the potentially best bots in your sim.
Also, if you really want to find the best bot in your sim, you need to disable mutations and wait for one species to remain.
I can't think of any other way; if you mess with costs or the environment, then you won't get the bot that was most fit for the original setup.
Numsgil:
As a follow-up to my previous post: real organisms tend to have mutation rates on the order of 1 mutation per million to billion base pairs, which in DB terms would equate to 1 mutation per 1,000 to 1,000,000 generations. I think DB is a little more forgiving than real life in this regard, since a complex DB DNA might be a couple thousand bps while the human genome has several billion bps, which bumps us back to something between 1 mutation per 1 to 1,000 generations.
I would say a "default" mutation rate to aim for might be 1 mutation per 7 or 8 generations.
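A rough back-of-the-envelope sketch of that arithmetic, written here in Python. The function names and the specific rates and genome sizes are only illustrative figures taken loosely from the post above, not values measured from DB or from biology:

# Translate a per-base-pair copy error rate into mutations per generation.
# All numbers below are illustrative assumptions from the discussion above.

def mutations_per_generation(per_bp_rate, genome_length_bp):
    # Expected number of mutations each time the whole genome is copied.
    return per_bp_rate * genome_length_bp

def generations_per_mutation(per_bp_rate, genome_length_bp):
    # Average number of generations between single mutations.
    return 1.0 / mutations_per_generation(per_bp_rate, genome_length_bp)

db_genome = 1_000             # a complex DB genome: a couple thousand bps at most
human_genome = 3_000_000_000  # "several billion" bps

for rate in (1e-6, 1e-9):     # the 1-per-million to 1-per-billion range above
    print(f"rate {rate:g}: DB bot -> 1 mutation per "
          f"{generations_per_mutation(rate, db_genome):,.0f} generations")
    print(f"rate {rate:g}: human  -> "
          f"{mutations_per_generation(rate, human_genome):,.0f} mutations per generation")

With a 1,000-bp genome this reproduces the 1-per-1,000 to 1-per-1,000,000-generation range quoted above.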
EricL:
--- Quote from: The_Duck ---I recall reading an article in Scientific American a few years ago that was about something to this effect in real DNA; here is an abstract that talks about it: http://www.ncbi.nlm.nih.gov/pubmed/14604186?dopt=Abstract
--- End quote ---
I read this article also. One thing I will call out that I think they missed is that genomes aren't after a 0% error rate when it comes to copying fidelity. A genome that doesn't mutate, and therefore cannot adapt to a changing environment, is as bad as one that mutates too much because its copy error rate is too high. Therefore, I think it is simplistic to state that reducing copy errors has been the only selection driver for how DNA came to be coded the way it is.
A theory I happen to subscribe to is that codon synonyms, codon bias and much of the DNA copying and replication machinery itself are largely the result of selection for a coding language and machinery that allow a genome to code for very different copy fidelity rates in different places, and to have this mutation probability coded in an out-of-band way that does not directly impact the resultant proteins. Said another way, I might claim that evolution could have produced an information coding method and copy machinery whose error rate was zero for all intents and purposes, but that this has been selected against, since a genome that always makes perfect copies of itself cannot adapt. It's not at all hard to imagine that, given two genomes which code for exactly the same proteins, selection would favor the one which happened to use the codon synonyms that coded for higher mutation rates where higher mutation rates were advantageous (e.g. production of a specific poison when co-evolving with predators) and lower mutation rates where lower mutation rates were advantageous.
shvarz:
--- Quote ---As a follow-up to my previous post: real organisms tend to have mutation rates on the order of 1 mutation per million to billion base pairs, which in DB terms would equate to 1 mutation per 1,000 to 1,000,000 generations. I think DB is a little more forgiving than real life in this regard, since a complex DB DNA might be a couple thousand bps while the human genome has several billion bps, which bumps us back to something between 1 mutation per 1 to 1,000 generations.
I would say a "default" mutation rate to aim for might be 1 mutation per 7 or 8 generations.
--- End quote ---
Just to resolve some confusion of terms:
Mutation rates can be counted "per nucleotide per replication", i.e. the frequency of errors at each individual step of DNA replication. These vary widely, from 10^-9 to 10^-4, and reflect the accuracy of an organism's copying machinery.
But you can also count the mutation rate "per genome per generation", i.e. how many mutations a genome picks up each generation. That can be obtained (as a rough but fairly accurate estimate) by multiplying the per-nucleotide rate by the length of the genome, and it varies a lot less. For humans it is about 175 mutations per replication cycle, but most of those fall in junk DNA (or are silent mutations) and thus have no effect on phenotype.
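A minimal Python sketch of that multiplication. The per-nucleotide rate below is an assumed, illustrative value chosen only so the product lands near the ~175 figure quoted above; it is not a measured number:

# Rough "per genome per generation" estimate: per-nucleotide rate x genome length.
per_nt_rate = 5.5e-8      # assumed illustrative per-nucleotide, per-replication rate
genome_length = 3.2e9     # roughly the size of the human genome, in base pairs

per_genome = per_nt_rate * genome_length
print(per_genome)         # ~176 mutations per genome per replication cycle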
Another important measure of mutation rates is how many mutations per generation actually have an effect on phenotype. That is very difficult to estimate for humans, but RNA viruses give us a clue as to the maximum possible. We know it is the maximum because when we try to push the mutation rate any higher, the viruses go extinct - they enter error catastrophe. For some viruses every third offspring carries a mutation, and we can estimate that about half of those mutations are silent. Thus, a single phenotype-affecting mutation in roughly every 6 offspring is about as much as an organism can handle.
These rates are difficult to translate to DB, because they depend on genome complexity and genome organization.