For more information about this meeting, contact Jason Morton.
|Title:||Approximation Errors of Deep Belief Networks|
|Seminar:||Applied Algebra and Network Theory Seminar|
|Speaker:||Guido Montufar, Penn State|
|The Deep Belief Network (DBN) is an artificial learning system with an efficient learning algorithm, introduced by Hinton in 2006, which has revolutionized the machine learning research field of Deep Learning.
The DBN has a graphical representation comprising several layers of hidden binary variables, with directed pairwise connections between subsequent layers and undirected pairwise connections between the last two hidden layers. The representational power of these models is far from being completely understood.
In particular, the smallest number of hidden variables for which the DBN model is able to represent any probability distribution as its marginal visible distribution is still unknown.
In this talk, I discuss submodels of DBNs and use them to bound, for the first time, the maximal Kullback-Leibler approximation errors of the DBNs depending on the number of hidden layers. These results yield, in particular, bounds for the minimal number of hidden layers of a DBN universal approximator.|
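The quantities in the abstract can be illustrated on a toy example. The sketch below (a hypothetical illustration, not material from the talk: all weights, biases, and the target distribution are made up) builds a single hidden layer of binary variables, computes the visible marginal distribution by summing out the hidden units, and evaluates the Kullback-Leibler divergence from a target distribution to that marginal, the kind of approximation error the talk bounds for full DBNs.

```python
import itertools
import math

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) over the same finite set."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Toy model with one hidden layer of binary units and illustrative
# fixed parameters (all values below are invented for the example).
n_v, n_h = 2, 2
W = [[0.5, -0.3], [0.2, 0.8]]   # W[i][j]: visible i <-> hidden j
b = [0.1, -0.2]                 # visible biases
c = [0.0, 0.3]                  # hidden biases

def energy(v, h):
    return -(sum(b[i] * v[i] for i in range(n_v))
             + sum(c[j] * h[j] for j in range(n_h))
             + sum(W[i][j] * v[i] * h[j]
                   for i in range(n_v) for j in range(n_h)))

states_v = list(itertools.product([0, 1], repeat=n_v))
states_h = list(itertools.product([0, 1], repeat=n_h))

# Partition function and visible marginal p(v) = sum_h p(v, h).
Z = sum(math.exp(-energy(v, h)) for v in states_v for h in states_h)
p_model = [sum(math.exp(-energy(v, h)) for h in states_h) / Z
           for v in states_v]

# Approximation error of the model with respect to a target distribution
# (here uniform, chosen only for illustration).
p_target = [1.0 / len(states_v)] * len(states_v)
error = kl(p_target, p_model)
```

For a DBN the marginal is obtained by summing out all hidden layers, and the talk's results bound the worst case of this divergence over all target distributions as a function of the number of hidden layers.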
Room Reservation Information
|Date:||02 / 08 / 2012|
|Time:||02:30pm - 03:20pm|