avereveard on Sept 20, 2015 | on: Why Are Eight Bits Enough for Deep Neural Networks...
Don't know about him, but I was working with a [-8, 8] range for inputs and [-4, 4] for weights. Using the atan function as the transfer function maps quite well, and there is no need to oversaturate the next layer.
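A minimal sketch of the idea, assuming plain uniform 8-bit quantization (the `quantize`/`dequantize` helpers and specific ranges are illustrative, not from the comment): atan outputs stay in (-π/2, π/2), comfortably inside a [-8, 8] input range for the next layer, so the activation never pushes values off the quantization grid.

```python
import math

def quantize(x, lo, hi, bits=8):
    """Map x in [lo, hi] to an integer code of the given bit width."""
    levels = (1 << bits) - 1
    x = max(lo, min(hi, x))                  # clip to the representable range
    return round((x - lo) / (hi - lo) * levels)

def dequantize(code, lo, hi, bits=8):
    """Map an integer code back to a real value in [lo, hi]."""
    levels = (1 << bits) - 1
    return lo + code / levels * (hi - lo)

# Weights use [-4, 4] and inputs use [-8, 8], as in the comment.
w = dequantize(quantize(3.7, -4.0, 4.0), -4.0, 4.0)
x = dequantize(quantize(5.2, -8.0, 8.0), -8.0, 8.0)

# atan as the transfer function: the output is bounded by pi/2 ~ 1.57,
# far inside the next layer's [-8, 8] input range, so no saturation.
y = math.atan(w * x)
assert -8.0 <= y <= 8.0
```

The round-trip error of each helper is at most one quantization step (range width / 255), which is why a modest, well-chosen range per tensor is enough.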