Darwinbots Forum
General => Off Topic => Topic started by: Houshalter on May 05, 2010, 06:56:42 AM
-
I'm not particularly fond of neural networks since they're not Turing complete, but in the real world they do have their uses. I was wondering if it would be possible to make a neural network out of paper. You might laugh, but I've seen some crazy things built out of paper, including clocks and automata, and I've seen old mechanical adding machines too. My problem is finding a simple way to do the addition and multiplication for the weights and thresholds. Any ideas? Am I crazy?
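For concreteness, here's the arithmetic a single neuron has to perform, as a minimal Python sketch (the weights and threshold are made-up illustrative values): multiply each input by its weight, sum the products, and compare the sum against the threshold. A paper machine would need a mechanical equivalent of each of those three steps.

# Minimal sketch of one neuron's arithmetic; all numbers are illustrative.
def neuron(inputs, weights, threshold):
    total = sum(x * w for x, w in zip(inputs, weights))  # multiply and add
    return 1 if total >= threshold else 0                # threshold compare

print(neuron([1, 0], [0.6, 0.6], 0.5))  # 0.6 >= 0.5, so it fires: 1
print(neuron([0, 0], [0.6, 0.6], 0.5))  # 0.0 <  0.5, stays off: 0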
-
I thought ANNs WERE Turing complete... Where'd you find that out?
-
Well, what if you have 0 as an input? No matter how many times you multiply it by weights, you will never get 1 as an output. You could make the thresholds dynamic, but that makes it even more complicated, and I didn't think most ANNs did that anyway. They also supposedly can't do XOR, but I haven't tried that.
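To illustrate the zero-input point: with no bias term, an all-zero input gives a weighted sum of exactly 0 for any choice of weights, so training can never change the network's response to it. A quick Python check (the weights are arbitrary made-up numbers):

# The weighted sum of an all-zero input is 0 no matter what the weights are.
weights = [3.7, -1.2, 0.05]   # arbitrary illustrative weights
inputs = [0, 0, 0]
print(sum(x * w for x, w in zip(inputs, weights)))  # always 0.0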
-
Maybe you could have a vertical mechanical NN that's like an abacus: you drop a bead down a tube, it works through see-saws and that kind of thing, and it can give outputs that way, like binary.
-
Clicky (http://www.cs.unm.edu/~saia/computability.html). It says there are proofs that NNs are Turing complete. It doesn't elaborate, though. I would expect neural networks to be Turing complete, just not terribly efficient at it.
-
Maybe you could have a vertical mechanical NN that's like an abacus: you drop a bead down a tube, it works through see-saws and that kind of thing, and it can give outputs that way, like binary.
Maybe, but how do you do addition or multiplication on see-saws? Also, I wanted it to be more analog and avoid binary/digital.
Clicky (http://www.cs.unm.edu/~saia/computability.html). It says there are proofs that NNs are Turing complete. It doesn't elaborate, though. I would expect neural networks to be Turing complete, just not terribly efficient at it.
Ya, I think some are, but those are the more complicated models. I skimmed the article you brought up and it goes into the halting problem for some reason.
-
Clicky (http://www.cs.unm.edu/~saia/computability.html). It says there are proofs that NNs are Turing complete. It doesn't elaborate, though. I would expect neural networks to be Turing complete, just not terribly efficient at it.
Ya, I think some are, but those are the more complicated models. I skimmed the article you brought up and it goes into the halting problem for some reason.
Yeah, I'm not sure what the point of the article is. It's just the first good reference I could find on Turing completeness and NNs.
-
Technically, can't anything be Turing complete so long as it supports the NAND operation? Also, are you sure ANNs can't do XOR, or just can't do it efficiently? I'd be surprised if something that's Turing complete can't do the XOR operation.
-
I'm not good with boolean logic, so don't ask me. I once did a bunch of research on Turing machines, and although they're kind of complicated, they're easy to make. You have to have a tape, a state, and a program. The program is written like this: if the state is this and the symbol on the tape is this, then write this symbol, move the head, and change to this state.
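A minimal sketch of that scheme in Python (the rule table is made up for illustration; this machine just inverts a string of bits and halts on a blank):

# Rule table: (state, symbol) -> (symbol to write, head move, next state)
rules = {
    ('run', '0'): ('1', +1, 'run'),
    ('run', '1'): ('0', +1, 'run'),
    ('run', '_'): ('_', 0, 'halt'),   # '_' is the blank symbol
}

def run_machine(tape, state='run', pos=0):
    tape = list(tape)
    while state != 'halt':
        symbol = tape[pos] if 0 <= pos < len(tape) else '_'
        write, move, state = rules[(state, symbol)]
        if 0 <= pos < len(tape):
            tape[pos] = write
        pos += move
    return ''.join(tape)

print(run_machine('0110'))  # prints '1001'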
-
Well, basically, any digital logic circuit can be created using NAND gates... so using NAND gates is an alternative way to make Turing-complete devices, just like ANNs and Tinkertoys and plain ol' tape-reading midget monkeys in a box with sharpies.
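To make the NAND point concrete, here's XOR built from nothing but NAND gates (the standard four-gate construction), sketched in Python:

def nand(a, b):
    return 1 - (a & b)

def xor(a, b):
    c = nand(a, b)
    return nand(nand(a, c), nand(b, c))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, '->', xor(a, b))  # truth table: 0, 1, 1, 0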
-
I hope I can help here. The most common type of neural network is feedforward, which means input passes through the network and always produces an output. There is no looping and no halting problem; it is not Turing complete, but it is a universal approximator in the sense that it can approximate any mapping from a finite number of inputs to outputs. 0 can become 1 because there is a bias node in the network, a dummy node that always outputs 1, and it can do XOR easily. Recurrent neural networks feed their output back into their input, but I think they're still not Turing complete unless their nodes' outputs are analog rather than the more common binary threshold outputs; with binary outputs and a finite number of nodes, they are just finite state machines.
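To back that up, here's a tiny feedforward net that computes XOR, sketched in Python with hand-picked weights (made-up values, not trained). Note the constant bias inputs doing exactly the job described above:

def step(x):
    return 1 if x >= 0 else 0

# 2 inputs -> 2 hidden threshold units -> 1 output; each unit has a bias.
def xor_net(a, b):
    h1 = step(1.0 * a + 1.0 * b - 0.5)      # fires if a OR b
    h2 = step(1.0 * a + 1.0 * b - 1.5)      # fires if a AND b
    return step(1.0 * h1 - 2.0 * h2 - 0.5)  # h1 AND NOT h2

for a in (0, 1):
    for b in (0, 1):
        print(a, b, '->', xor_net(a, b))  # prints 0, 1, 1, 0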
A finite circuit of NAND gates is also a universal approximator.