I assume you mean real arbitrary neural nets, and not just "neural nets" (3-layered backpropagated toys). Would you even include backpropagation? I don't think it would be necessary for version 1.
Homunq 17:05, 29 March 2008 (EDT)
Absolutely! Although there are endless applications for 'hidden layer + backpropagation' type neural networks, I think it would be more helpful to teach children the basics of logic and how many simple units working together can generate complex patterns. I agree that backprop is not necessary for the first version. At some point, adding different learning rules would be useful, so that some of the advanced labs could include networks that adapt to changing conditions or something along those lines.
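To illustrate what I mean, here is a minimal sketch (hypothetical, not from any existing activity) of the "simple units" idea: a single threshold unit with hand-set weights computes AND or OR with no training at all, and XOR falls out of wiring three such units together. The function names and weights here are just illustrative choices.

```python
def unit(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of inputs reaches the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

def AND(a, b):
    # Both inputs must be on to reach the threshold of 2.
    return unit([a, b], [1, 1], threshold=2)

def OR(a, b):
    # Either input alone reaches the threshold of 1.
    return unit([a, b], [1, 1], threshold=1)

def XOR(a, b):
    # XOR is impossible for one unit, but three units manage it:
    # fire when OR does but the NAND-like unit (negative weights) also does.
    nand = unit([a, b], [-1, -1], threshold=-1)
    return AND(OR(a, b), nand)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", AND(a, b), OR(a, b), XOR(a, b))
```

No backpropagation, no hidden-layer training; a child could set these weights by hand and see the logic emerge, which is exactly the kind of thing version 1 could cover.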
It's good to know there is some interest!
Braingram 16:11, 30 March 2008 (EDT)