A Neural Network-based Approach to Key Fob Selection – In this paper, we develop a recurrent non-volatile memory encoding (R-RAM) architecture of a hierarchical neural network (HNN) to encode information. The architecture is based on an unsupervised memory encoding scheme in which a recurrent non-volatile memory decodes the contents of the model. We test the architecture on a dataset of 40 people; in three cases it is used to encode real-time data, whose state is represented by a neural network, and to encode the final output. We show that the architecture can encode many distinct aspects of key-fob-like sequences. Beyond real-time data, the architecture also admits natural language processing as a possible future capability through its retrieval abilities. It achieves a significant improvement over state-of-the-art recurrent memory encoding architectures, at a relatively reduced computational cost.
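As a rough illustration of the core idea — folding a key-fob-like event sequence into a fixed-size recurrent state — the following sketch uses a minimal Elman-style recurrent encoder. The abstract does not specify the R-RAM design, so every name, dimension, and weight-initialization choice here is an assumption, not the paper's architecture:

```python
import numpy as np

def encode_sequence(events, n_symbols=4, hidden=8, seed=0):
    """Encode a sequence of discrete events into a fixed-size state vector.

    A generic Elman-style recurrence; dimensions are illustrative only.
    """
    rng = np.random.default_rng(seed)
    W_in = rng.normal(scale=0.1, size=(hidden, n_symbols))  # input-to-hidden weights
    W_h = rng.normal(scale=0.1, size=(hidden, hidden))      # recurrent weights
    h = np.zeros(hidden)
    for e in events:
        x = np.eye(n_symbols)[e]          # one-hot encode the event (e.g. lock/unlock)
        h = np.tanh(W_in @ x + W_h @ h)   # recurrent state update
    return h                              # fixed-size encoding of the whole sequence

# Usage: encode a short button-press sequence.
code = encode_sequence([0, 2, 1, 3])
```

The point of the sketch is only that the final state `code` summarizes the entire sequence, which is the property any recurrent memory-encoding scheme relies on.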
We propose a new framework that can learn sparse linear nonnegative matrix factorizations (LNB) from non-Gaussian random matrix data. Our framework does not require a sparse LNB prior, which is otherwise a common limitation given the large number of variables in the data. We then propose a nonlinear nonparametric framework to extract the non-Gaussian information from random matrices for LNB, making the framework a powerful one for learning linear nonnegative matrices. The LNB is computed by taking the covariance matrix and a sparse matrix as inputs in a nonlinear manner, allowing a fast learning algorithm for this non-Gaussian matrix model. With the proposed method, comparing the observed matrix to the target matrix on a training dataset becomes straightforward, and analysis and estimation are much faster. The framework is evaluated on several real-world datasets and is shown to yield competitive results even on a small number of datasets.
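The factorization step — producing nonnegative factors with a sparsity constraint — can be sketched with standard multiplicative-update NMF plus an L1 penalty on the coefficient matrix. This is a generic stand-in for illustration, not the paper's LNB algorithm, and the function name and parameters are assumptions:

```python
import numpy as np

def sparse_nmf(V, rank, sparsity=0.1, n_iter=200, seed=0):
    """Factor a nonnegative matrix V ≈ W @ H with an L1 sparsity penalty on H.

    Standard Lee–Seung multiplicative updates; illustrative only.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    eps = 1e-10  # avoid division by zero
    for _ in range(n_iter):
        # L1 sparsity term added to the denominator shrinks H toward zero.
        H *= (W.T @ V) / (W.T @ W @ H + sparsity + eps)
        # Standard multiplicative update for W.
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: factor a random nonnegative matrix and check relative reconstruction error.
V = np.abs(np.random.default_rng(1).normal(size=(20, 30)))
W, H = sparse_nmf(V, rank=5)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Multiplicative updates preserve nonnegativity automatically, which is why they are the usual starting point for sparse nonnegative factorization sketches like this one.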
Feature Extraction in the Presence of Error Models (Extended Version)
Feature-Augmented Visuomotor Learning for Accurate Identification of Manipulating Objects
Stochastic Nonparametric Learning via Sparse Coding