Alessandro Betti (DIISM, Universities of Florence and Siena)
Oct 11, 2018 – 9:30 AM
DIISM, Artificial Intelligence laboratory (room 201), Siena SI
By and large, Backpropagation (BP) is regarded as one of the most important neural computation algorithms underlying the impressive progress in machine learning, including the recent advances in deep learning. Interestingly, its computational structure has been the source of many debates on its arguable biological plausibility. While BP is an optimal algorithm for gradient computation (nothing asymptotically more efficient can be found for this task), it does not address the related architectural issue of actually building the neural network and, consequently, of its generalization capabilities. Based on early remarks on the derivation of BP, we propose a new algorithm, referred to as Perfect Building (PB), whose purpose is to build the neurons gradually through the learning process.
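To make the claim about BP's optimality concrete, here is a minimal sketch (not the PB algorithm of the talk, just textbook reverse-mode differentiation) of backpropagation on a tiny two-weight network; the network, weights, and target value are illustrative assumptions. One forward pass caches intermediates, and one backward pass reuses them, so the gradient costs only a constant factor more than evaluating the loss itself.

```python
import math

def forward(x, w1, w2):
    # Tiny two-layer scalar network: hidden tanh unit, linear output,
    # squared loss against a fixed (assumed) target t = 1.0.
    z = w1 * x
    h = math.tanh(z)
    y = w2 * h
    loss = 0.5 * (y - 1.0) ** 2
    return z, h, y, loss

def backward(x, w1, w2):
    # One forward pass to cache intermediates, then one backward
    # sweep reusing them: total cost proportional to the forward cost.
    z, h, y, loss = forward(x, w1, w2)
    dy = y - 1.0                 # dL/dy
    dw2 = dy * h                 # dL/dw2
    dh = dy * w2                 # dL/dh
    dz = dh * (1.0 - h * h)      # tanh'(z) = 1 - tanh(z)^2
    dw1 = dz * x                 # dL/dw1
    return loss, dw1, dw2

# Sanity check: compare against central finite differences.
x, w1, w2 = 0.5, 0.3, -0.7
loss, dw1, dw2 = backward(x, w1, w2)
eps = 1e-6
num_dw1 = (forward(x, w1 + eps, w2)[3] - forward(x, w1 - eps, w2)[3]) / (2 * eps)
num_dw2 = (forward(x, w1, w2 + eps)[3] - forward(x, w1, w2 - eps)[3]) / (2 * eps)
print(abs(dw1 - num_dw1) < 1e-8, abs(dw2 - num_dw2) < 1e-8)
```

Note that the backward pass touches each weight exactly once, which is the sense in which no gradient method can be asymptotically cheaper.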