Learning Sparsely Whole Network Structure using Bilateral Filtering – We propose a deep neural network framework for multivariate graph inference that exploits both multivariate structure and graph regularity. The main objective is to learn the structure of a graph with a large number of components. This structure is learned within a matrix factorization framework; the factorization is then used to automatically estimate the edge weights of the graph from their derivatives, i.e., the probability that a given node is selected. The learned structure is then evaluated to determine whether it is optimal. We demonstrate how matrix factorization can be used to learn the structures of different graphs, and we give theoretical evidence for why the weights (i.e., the sum of the derivatives) can be used to optimize the structure learning algorithm.
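The abstract does not give the algorithmic details, but its pipeline (factorize, reconstruct weights, select edges) could be sketched roughly as follows. All specifics here are assumptions for illustration: a noisily observed adjacency matrix, a plain gradient-descent fit of a low-rank factorization, and a fixed threshold for edge selection.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: a sparse ground-truth graph observed through a noisy
# adjacency matrix A (not specified in the abstract).
n, rank = 20, 4
true_edges = rng.random((n, n)) < 0.1
A = true_edges.astype(float) + 0.05 * rng.standard_normal((n, n))

# Low-rank matrix factorization A ~= W @ H, fitted by gradient descent
# on the squared Frobenius reconstruction error.
W = 0.1 * rng.standard_normal((n, rank))
H = 0.1 * rng.standard_normal((rank, n))
lr = 0.01
for _ in range(500):
    R = W @ H - A        # residual
    gW = R @ H.T         # dL/dW for L = 0.5 * ||W @ H - A||_F^2
    gH = W.T @ R         # dL/dH
    W -= lr * gW
    H -= lr * gH

# Edge selection: threshold the reconstructed weights to recover a
# sparse graph structure (threshold value is an assumption).
A_hat = W @ H
edges = A_hat > 0.5
```

The thresholding step stands in for the abstract's derivative-based node selection, which the abstract leaves unspecified.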

In this paper, we propose a new dynamic constraint solver for parameter estimation, based on a learning method. Our approach rests on constraint optimisation using an ensemble of stochastic approximation algorithms, e.g., Monte-Carlo search and maximum-likelihood estimation, two recent and successful search methods widely used in parameter estimation. The proposed algorithm is flexible enough to handle complex optimization problems of any order and is applicable as a general parameter estimation solver. Experimental evaluation shows that it achieves state-of-the-art performance on the MNIST, CIFAR-10 and COCO datasets.
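The ensemble idea, combining a Monte-Carlo search with a maximum-likelihood estimate, could be sketched on a toy problem. Everything concrete here is assumed for illustration: the Gaussian model, the uniform candidate range, and the simple averaging used to combine the two estimators.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy task: estimate the mean of a unit-variance Gaussian.
true_mu = 3.0
data = rng.normal(true_mu, 1.0, size=1000)

# Maximum-likelihood estimate: closed form for the Gaussian mean.
mu_mle = data.mean()

# Monte-Carlo search: sample candidate parameters and keep the one
# with the highest log-likelihood under the assumed model.
candidates = rng.uniform(-10.0, 10.0, size=5000)
loglik = -0.5 * ((data[None, :] - candidates[:, None]) ** 2).sum(axis=1)
mu_mc = candidates[np.argmax(loglik)]

# Naive ensemble: average the two estimators (the paper's actual
# combination rule is not given in the abstract).
mu_ensemble = 0.5 * (mu_mle + mu_mc)
```

For this quadratic log-likelihood the Monte-Carlo search simply picks the candidate closest to the sample mean, so the two estimators, and hence the ensemble, agree closely.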

Affective surveillance systems: An affective feature approach

Probabilistic Latent Variable Models


Dependent Component Analysis: Estimating the sum of its components

Linear Tabu Search For Efficient Policy Gradient Estimation