Posts tagged as neural_networks
So you want to Sheep
I’ve been doing this Sheep Learning for a while, and I often have to prescribe references; I’m writing this in the hope that it may be useful to others. The topics are best followed in order, but one can explore them in any sequence. A lot of people jump into the area without the minimal mathematical prerequisites, even though these are primarily statistical models with a heavy reliance on Linear Algebra. For undergraduate students I recommend (Hogg et al. 2019) as the bare minimum on the statistics side...
Posted on: 2021-07-28, in Category: research
Dropout and Regularization
The first mention of dropout is in (Hinton et al. 2012), which frames it as a way of preventing co-adaptation of feature detectors in neural networks. Dropout was applied successfully in (Krizhevsky et al. 2012), after which it gained widespread popularity, and it was first shown to be effective in Recurrent Neural Networks in (Zaremba et al. 2014). Historically, neural network pruning was an effective way to prevent overfitting (Hassibi and Stork 1993; LeCun et al. 1990). These methods used ideas from perturbation theory: a weight is removed when doing so barely changes the loss, as estimated from second-order derivatives (the Hessian); Optimal Brain Damage (LeCun et al. 1990), for instance, ranks each weight w_i by the saliency H_ii * w_i^2 / 2 and prunes the least salient ones...
Posted on: 2018-07-04, in Category: research
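Since the excerpt stops at the history, here is a minimal sketch of what dropout actually computes: inverted dropout in NumPy, where each unit is zeroed with probability p during training and the survivors are scaled by 1/(1-p). The function name and parameters are illustrative, not taken from any of the cited papers.

    import numpy as np

    def dropout(x, p=0.5, training=True, rng=None):
        # Inverted dropout: zero each unit with probability p while training,
        # scaling survivors by 1/(1-p) so expected activations are unchanged.
        if not training or p == 0.0:
            return x
        rng = np.random.default_rng() if rng is None else rng
        mask = rng.random(x.shape) >= p       # keep a unit with probability 1-p
        return x * mask / (1.0 - p)

    h = np.ones((2, 4))                       # activations of a hidden layer
    print(dropout(h, p=0.5))                  # about half zeroed, the rest doubled
    print(dropout(h, p=0.5, training=False))  # identity at test time

The scaling by 1/(1-p) keeps the expected activation the same with and without the mask, which is why no extra rescaling is needed at inference.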