General > Bot Challenges
Challenge #3: Neural Network
Moonfisher:
I'm trying to resist, but the dark side is too tempting
Really afraid of getting too caught up in this... it's OK to fiddle with it a little in my spare time... but currently it's closer to all my spare time, and I just can't find a way to make myself dedicate more of it to the thesis...
Also just noticed I wrote the value for 2^36 for some reason... it was of course supposed to be 2^32...
Anyway, I'm going to make a short test bot to check whether I can use large values on the stack, and if that works I'll make a micro NN to check whether I can actually make all of this work. If it does, I'll probably run a sim with it to see how well it evolves... although I have doubts about how well a small hand-made network would evolve... I might try making it completely binary and using random values for the weights (provided I can set up the mutations to work properly with that method; I haven't explored that properly yet).
I'll post results in here if I get anywhere with it...
I really hope this can work. I keep thinking of more ideas for it, like training 100 networks from the same bot that do exactly the same thing but started with different random weights from different seeds before getting trained, creating different evolutionary potential...
Moonfisher:
Ok, the stack allows the use of values above 32000, which is going to make things a lot easier.
I tried making a few hand-tailored networks:
They're just for testing and don't represent exactly how this will work; it was mostly to see if the scaling of weights worked... Inputs for these are binary... that made it easier to set the weights manually...
I'm thinking of keeping as much as possible on the stack for as long as possible between genes.
They run properly with thin liquids and no wrap... lots of veggies...
Also the outputs aren't transformed properly, and generally it's just a quick test to see if the general concept works...
This one uses one hidden neuron.
CODE:
'i1 : *.eye5
'i2 : *.refshoot *.myshoot !=
'o1 : .shoot
'o2 : .up
'Normally the inputs would have no definitions (for several reasons), this is just for testing
def i1 50
def i2 51
def h1 52
cond
*.robage 0 =
start
0 .shoot store
stop
cond
start
*.eye5 sgn .i1 store
stop
cond
*.refshoot *.myshoot !=
start
1 .i2 store
stop
cond
start
*.i1 100 mult -50 mult 100 div .h1 store
stop
cond
start
*.i2 100 mult -50 mult 100 div *.h1 add .h1 store
stop
cond
start
*.h1 100 mult 100 div 100 div .shoot store
stop
cond
start
*.h1 -300 mult 100 div 100 div .up store
stop
cond
start
0 .i2 store
stop
end
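For reference, the arithmetic those genes perform can be modeled outside the DNA. This is a hypothetical Python sketch of the one-hidden-neuron bot above, using the same weights-scaled-by-100 fixed-point convention (the weights -50, -50, 100 and -300 come straight from the genes) and assuming the DNA's `div` truncates toward zero:

```python
def tdiv(a, b):
    # Integer division truncating toward zero (assumed to match DNA's `div`)
    return int(a / b)

def one_hidden_neuron(i1, i2):
    """Model of the hand-tailored network above.
    i1 = sgn(*.eye5) (something in view), i2 = 1 if refshoot != myshoot.
    Weights are stored scaled by 100, as in the DNA genes."""
    # h1 accumulates both weighted inputs: weight -50 for i1, -50 for i2
    h1 = tdiv(i1 * 100 * -50, 100)
    h1 += tdiv(i2 * 100 * -50, 100)
    # Output genes rescale h1 back down with two divisions by 100
    shoot = tdiv(tdiv(h1 * 100, 100), 100)
    up = tdiv(tdiv(h1 * -300, 100), 100)
    return shoot, up
```

With an enemy in sight (i1 = 1, i2 = 1) this gives shoot = -1 and up = 3, i.e. fire and accelerate; with a conspec in view (i1 = 1, i2 = 0) it just creeps forward without shooting; with nothing in view both outputs are 0.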
And this one uses 2... the second one does virtually nothing though, but the idea with this one is to set some random weights and see if something useful can evolve. It doesn't reproduce, so I'm just throwing in lots of bots with frequent point mutation to see what happens...
If you use the current values it's more likely to devolve, since the best mutations are just some increase in speed once it gets big and slow...
CODE:
'i1 : *.eye5 sgn
'i2 : *.refshoot *.myshoot !=
'o1 : .shoot
'o2 : .up
'Normally the inputs would have no definitions (for several reasons), this is just for testing
def i1 50
def i2 51
def h1 52
def h2 53
cond
*.robage 0 =
start
0 .shoot store
stop
cond
start
'*.eye5 100 ceil 100 mult 100 div .i1 store
*.eye5 sgn .i1 store
stop
cond
*.refshoot *.myshoot !=
start
1 .i2 store
stop
'Inputs
'h1
cond
start
*.i1 100 mult -50 mult 100 div .h1 store
stop
cond
start
*.i2 100 mult -50 mult 100 div *.h1 add .h1 store
stop
'h2
cond
start
*.i1 100 mult 50 mult 100 div .h2 store
stop
cond
start
*.i2 100 mult 50 mult 100 div *.h2 add .h2 store
stop
'Outputs
'o1
cond
start
*.h1 100 mult *.h2 0 mult add 100 div 100 div .shoot store
stop
'o2
cond
start
*.h1 -300 mult *.h2 100 mult add 100 div 100 div .up store
stop
cond
start
0 .i2 store
stop
end
So far I've only run a small quick test. Funnily enough, 2 of the bots broke the conspec, but one had broken the gene generating the input, and the other had broken the input gene for h1, making it see an enemy all the time... as far as weight tweaking goes, nothing very interesting had time to happen, and I noticed the actual weights were rarely the ones to get mutated. This is also why I'll try to keep a lot on the stack and have as little code as possible that isn't related to the network and its weights... and also split everything up into more genes, both for sexrepro and to increase the odds that a broken gene will just act as a dead dendrite or neuron...
If I manage to evolve something interesting from the one with random values I'll post it... but the odds are low, since it doesn't reproduce and I'm generally impatient and don't have much faith in the potential of a bot that size... it's just too small and fragile. It needs a big network so it can really mess it up, and generally so most of the mutations would affect the actual network.
I'm thinking it might be a good idea to make the program able to clean up an evolved bot, restoring broken genes outside the network and replacing broken dendrite genes with fresh ones with 0 as a weight. But I'm also thinking that may not be possible at all, since the mutations breaking certain genes can have too many different effects; I may have to fix it by hand, if that's even possible...
My concern is mostly that the changes should really be focused around the weights, while mutations to other genes and killed dendrites would usually have more radical effects and therefore become rather frequent... and dendrites and broken genes won't form naturally (or at least it's very unlikely)... so it may end up locking the bot in a certain direction... I think if something evolves to be worthwhile I could try to fix it up a little by hand... but it's best to try not to have too much code outside the network...
Also I ran one test on the random bot. One bot was alive when I got back; it had broken the output gene for shoot, just replaced store with dec... it was shooting all the time and moved towards enemies... it was lucky to have had a few algae in sight and survived a little longer... but not quite what I was hoping for.
Not sure why it was moving forward. I'm guessing the values left on the stack by the first broken output gene affected the second output gene...
This does support the theory that keeping values on the stack and splitting everything into many genes may be beneficial... still trying to figure out if there's a really clever way to do it... possibly one allowing new dendrites to form or merge (or at least increasing the odds).
I can see I definitely need to test more ideas before I start on the C code... also I think using C may be a stupid way of doing it... but I don't know Python or Perl, and generally I've never really used those syntax-based string-manipulation things... but maybe learning one will be easier than trying to build it in C.
I think I'll just try to find a C library with those functions... if it looks like they would be very useful...
Numsgil:
Instead of mutating the whole bot, I would do a bit of random oscillation on the weights every time a bot is born. Just check if its robage is 0, and if so, add or subtract a little bit from the weights at random. That way you can preserve the structure (which is extremely complex) and just mutate the weights.
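That birth-time scheme could be prototyped outside the sim first. A rough Python sketch of the idea, where the function name and `step` size are invented for illustration:

```python
import random

def jitter_weights(weights, step=5, rng=random):
    """Numsgil's suggestion as pseudocode: at robage 0, add or subtract
    a small random amount from each inherited weight, leaving the
    network structure itself untouched."""
    return [w + rng.randint(-step, step) for w in weights]
```

Each child then differs from its parent only in weight values, so selection acts on the weights while the hand-built topology survives intact.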
abyaly:
What is your thesis on?
Moonfisher:
Yeah, handling mutations manually would be more stable, but it would also set a strict limit on the size of the network (since I can only inherit 15 weights).
So right now I'm toying with the idea of keeping as much as possible on the stack at all times, like 15 hidden nodes at a time, then emptying them into vars to make room for new values...
I'm hoping this would cause broken dendrites to often affect new neurons... but not sure if that would be better or worse.
Anyway, if the structure completely falls apart I think it should be OK; if I want to restore some potential in an evolved network I'll just have to add a second network to handle whatever has gotten lost.
Anyway, it's unnatural for a neural network to have dendrites going from every input to every neuron and so forth... so if something breaks or merges, it would only be natural...
Anyway, I made a version using some transformation for the input and output values, but I think I scaled everything wrong, lost track along the way.
I also think splitting it up may have a harmful effect; I've got to find a balance between a good setup for sexrepro and the least harmful mutations that disable genes.
And as far as I can see, a large part of the network will always be the part transforming input and output values, so those would often get mutated... but then again I think that would correspond to just another broken or merged dendrite... I'm mostly concerned that on average, for one mutation of a weight, 1 or 2 dendrites would get broken, so a bot needs to be rather lucky to achieve some weight tweaking without also having lost some dendrites. I'm thinking of maybe using the epigenetic locations to save some weights that have a more dramatic effect, to cause more frequent tweaking of those values...
CODE:
def h1 52
def h2 53
cond
*.robage 0 =
start
0 .shoot store
stop
'Inputs
'h1
start
*.eye5
stop
start
40 mult
stop
start
.h1 store
stop
cond
start
100
stop
cond
*.refshoot *.myshoot =
start
0 mult
stop
start
40 mult
stop
start
*.h1 add
stop
start
.h1 store
stop
'h2
start
*.eye5
stop
start
-10 mult
stop
start
.h2 store
stop
cond
start
100
stop
cond
*.refshoot *.myshoot =
start
0 mult
stop
start
100 mult
stop
start
*.h2 add
stop
start
.h2 store
stop
'------------------------ Outputs
'--- o1
start
*.h1 100 mult
stop
start
*.h2 0 mult
stop
start
add
stop
start
100 div
stop
start
100 div
stop
start
sgn -
stop
start
.shoot store
stop
'--- o2
start
*.h1 1 mult
stop
start
*.h2 1 mult
stop
start
add
stop
start
100 div
stop
start
*.maxvel mult
stop
start
100 div
stop
start
.up store
stop
end
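The split-gene style above relies on partial results surviving on the stack between genes, so that knocking out one gene drops only one term instead of the whole computation. A hypothetical Python stack-machine sketch of that idea (the gene list and names are invented for illustration, modeling the h1 pipeline from the bot above):

```python
def run(genes, inputs):
    """Each 'gene' is a tiny function taking (stack, memory); a broken
    gene can be modeled by simply leaving it out of the list."""
    stack, mem = [], dict(inputs)
    for gene in genes:
        gene(stack, mem)
    return mem

# The h1 pipeline split into one gene per step, as in the DNA above
genes = [
    lambda s, m: s.append(m["eye5"]),     # push *.eye5
    lambda s, m: s.append(s.pop() * 40),  # 40 mult
    lambda s, m: m.update(h1=s.pop()),    # .h1 store
]
```

Deleting the middle gene here leaves eye5 stored unscaled into h1, a "broken dendrite" rather than a dead bot, which is the robustness the split is after.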
Also, about the thesis: it was initially supposed to be an experiment using neural networks in a game AI, but I realized I didn't have the inputs I needed, so now it's a system for mining and baking data for the AI. I'll have to work on the AI afterwards.
(Also I wasn't sure if it was really worth putting that much effort into it; if you spend that much time on a feature you want to be sure it's something people will notice and appreciate... Damn customers... the data mining feature can also be used for a lot of other stuff.)
And while I'm typing anyway: I think I hit a strange bug with this thing. I was running veggies that gained like 40 nrg per kilo, so they got really big and gained a lot of energy all the time. It then seemed like once killed they didn't always disappear (like they never got removed from the bucket), and since the bot only moves forward it pretty much got stuck every time it was looking at a non-existent veggie. I'll make a proper bug report later today if I have the time.