Displaying 1-7 of 7 results
Guest Editors' Introduction: Special Section on Learning Deep Architectures
Found in:
IEEE Transactions on Pattern Analysis and Machine Intelligence
By Samy Bengio, Li Deng, Hugo Larochelle, Honglak Lee, Ruslan Salakhutdinov
Issue Date: August 2013
pp. 1795-1797
There has been a resurgence of research in the design of deep architecture models and learning algorithms, i.e., methods that rely on the extraction of a multilayer representation of the data. Often referred to as deep learning, this topic of research has ...
Semi-Supervised Mixture-of-Experts Classification
Found in:
Data Mining, IEEE International Conference on
By Grigoris Karakoulas, Ruslan Salakhutdinov
Issue Date: November 2004
pp. 138-145
We introduce a mixture-of-experts technique that is a generalization of mixture modeling techniques previously suggested for semi-supervised learning. We apply the bias-variance decomposition to semi-supervised classification and use the decomposition to s...
Workshop summary: Workshop on learning feature hierarchies
Found in: Proceedings of the 26th Annual International Conference on Machine Learning (ICML '09)
By Geoff Hinton, Kai Yu, Ruslan Salakhutdinov, Yann LeCun, Yoshua Bengio
Issue Date: June 2009
pp. 1-1
Evaluation methods for topic models
Found in: Proceedings of the 26th Annual International Conference on Machine Learning (ICML '09)
By David Mimno, Hanna M. Wallach, Iain Murray, Ruslan Salakhutdinov
Issue Date: June 2009
pp. 1-8
A natural evaluation metric for statistical topic models is the probability of held-out documents given a trained model. While exact computation of this probability is intractable, several estimators for this probability have been used in the topic modelin...
Learning nonlinear dynamic models
Found in: Proceedings of the 26th Annual International Conference on Machine Learning (ICML '09)
By John Langford, Ruslan Salakhutdinov, Tong Zhang
Issue Date: June 2009
pp. 1-8
We present a novel approach for learning nonlinear dynamic models, which leads to a new set of tools capable of solving problems that are otherwise difficult. We provide theory showing this new approach is consistent for models with long range structure, a...
Bayesian probabilistic matrix factorization using Markov chain Monte Carlo
Found in: Proceedings of the 25th international conference on Machine learning (ICML '08)
By Andriy Mnih, Ruslan Salakhutdinov
Issue Date: July 2008
pp. 880-887
Low-rank matrix approximation methods provide one of the simplest and most effective approaches to collaborative filtering. Such models are usually fitted to data by finding a MAP estimate of the model parameters, a procedure that can be performed efficien...
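The MAP-fitting procedure mentioned in this abstract can be illustrated with a minimal sketch. This is not the paper's code: it assumes zero-mean Gaussian priors on the factor matrices (equivalent to L2 regularization) and fits a toy rating matrix by gradient ascent on the log-posterior; all names, dimensions, and hyperparameters below are illustrative choices.

```python
# Hypothetical sketch of MAP low-rank matrix factorization for
# collaborative filtering: R ~ U V^T on observed entries, with
# Gaussian priors on U and V (i.e., L2 penalties).
import numpy as np

rng = np.random.default_rng(0)

# Toy ratings matrix; 0 marks a missing (unobserved) entry.
R = np.array([[5, 3, 0],
              [4, 0, 1],
              [0, 2, 5]], dtype=float)
mask = R > 0

k, lam, lr = 2, 0.1, 0.01           # rank, prior strength, step size
U = 0.1 * rng.standard_normal((R.shape[0], k))
V = 0.1 * rng.standard_normal((R.shape[1], k))

for _ in range(2000):
    E = mask * (R - U @ V.T)        # residual on observed entries only
    U += lr * (E @ V - lam * U)     # gradient ascent on log-posterior
    V += lr * (E.T @ U - lam * V)

print(np.round(U @ V.T, 1))         # reconstructed rating matrix
```

The paper's contribution is to go beyond such a point estimate: the Bayesian treatment samples U and V with Markov chain Monte Carlo instead of committing to a single MAP solution.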
On the quantitative analysis of deep belief networks
Found in: Proceedings of the 25th international conference on Machine learning (ICML '08)
By Iain Murray, Ruslan Salakhutdinov
Issue Date: July 2008
pp. 872-879
Deep Belief Networks (DBNs) are generative models that contain many layers of hidden variables. Efficient greedy algorithms for learning and approximate inference have allowed these models to be applied successfully in many application domains. The main b...