
Seconded. I was recently trying to find a good resource to simply and concisely explain (or at least give a 'feel' for) artificial neural networks to someone and came up blank. Everything I found either assumed too much domain-specific knowledge or was too long.


Is there a "feel" to neural networks? You define a bunch of hidden/latent variables connected in an array (or multiple arrays), then perform a heuristic search of the weight state space to minimize some error/energy function. Sometimes it works, often not.


I love how you're getting downvoted when the truth is NNs work well _only_ when you have a team of specialists to select their hyperparameters. There's a rich vein of research for AutoML to automatically learn the hyps.
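For what it's worth, the simplest version of what that AutoML research automates is just random search over the hyperparameter space. A hedged sketch (the space, trial count, and `evaluate` stand-in are all mine; a real loop would train and validate a model inside `evaluate`):

```python
import random

# Minimal random hyperparameter search: sample settings from a space,
# keep whichever scores best. `evaluate` is a placeholder for training
# a network with those hyperparameters and returning validation score.
random.seed(0)

space = {
    "learning_rate": [0.001, 0.01, 0.1, 1.0],
    "hidden_units": [2, 8, 32, 128],
}

def evaluate(hyps):
    # Stand-in objective for illustration only; peaks at
    # learning_rate=0.01, hidden_units=32.
    return 1.0 / (1.0 + abs(hyps["learning_rate"] - 0.01)
                  + abs(hyps["hidden_units"] - 32) / 32)

best, best_score = None, float("-inf")
for _ in range(20):
    hyps = {k: random.choice(v) for k, v in space.items()}
    score = evaluate(hyps)
    if score > best_score:
        best, best_score = hyps, score
```

The interesting AutoML work replaces the random sampling with something smarter (Bayesian optimization, bandits, etc.), but the shape of the loop is the same.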


I don't know if this is the type of thing you're looking for, BUT... I made this little web app for visualizing the prediction gradient for a binary classification problem with 2-dimensional inputs [0]. You basically add members of each class by clicking in the plane. It has some bells and whistles for viewing the decision gradient (the probability of any given point in space being of one class or the other) and how this changes after each training. You can also do things like change the learning rate, edit weight values, and add noise to weights.

In this toy problem, you can get more of a "feel" for how much complexity a network can capture (by trying different clusterings of classes, etc.). Unfortunately, I hard-coded it to have 2 hidden units. In retrospect, it would have been better to make the number of hidden units tunable as well, so that one could visualize how a network with more non-linearities draws increasingly complex decision boundaries.

[0] http://www.math.fsu.edu/~mhancock/#!/software/web-apps/neura...
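In case it helps anyone, the "decision gradient" the app draws boils down to evaluating the trained network at every point of a grid over the plane. A rough sketch in numpy; the weights here are arbitrary stand-ins, not values from the app:

```python
import numpy as np

# Evaluate a small 2-input, 2-hidden-unit network over a grid so each
# cell holds P(class 1) at that point: the heat map / decision gradient.
# The weight values are made up for illustration, not a trained net.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = np.array([[3.0, -2.0], [-2.0, 3.0]])  # input -> 2 hidden units
b1 = np.array([0.5, 0.5])
W2 = np.array([[2.5], [2.5]])              # hidden -> output
b2 = np.array([-2.0])

xs = np.linspace(-1, 1, 50)
grid = np.array([[x, y] for y in xs for x in xs])   # 2500 points
probs = sigmoid(sigmoid(grid @ W1 + b1) @ W2 + b2)  # P(class 1) each point

heatmap = probs.reshape(50, 50)  # the surface the app visualizes;
                                 # retraining or editing weights reshapes it
```

With only 2 hidden units the surface stays fairly simple, which is the limitation the parent comment mentions; more hidden units would let this grid show wigglier boundaries.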



