Latent Feature Lasso.
2017
Abstract
The latent feature model (LFM), proposed in (Griffiths & Ghahramani, 2005), but possibly with earlier origins, is a generalization of a mixture model in which each instance is generated not from a single latent class but from a combination of latent features. Each instance thus has an associated latent binary feature incidence vector indicating the presence or absence of each feature. Due to its combinatorial nature, inference in LFMs is considerably harder than in mixture models, and accordingly most attention has focused on nonparametric LFMs, with priors such as the Indian Buffet Process (IBP) on infinite binary matrices. Recent efforts to tackle this complexity either still have computational complexity that is exponential, or sample complexity that is high-order polynomial, w.r.t. the number of latent features. In this paper, we address this outstanding problem of tractable estimation of LFMs via a novel atomic-norm regularization, which gives an algorithm with polynomial run-time and sample complexity without impractical assumptions on the data distribution.
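The generative process the abstract describes can be sketched in a few lines of NumPy: each row of a binary incidence matrix Z selects which latent features combine to produce an observed instance. This is a toy illustration only; the matrix sizes, the Gaussian feature dictionary, and the additive noise model are illustrative assumptions, and the paper's atomic-norm estimator is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, k = 6, 5, 3  # instances, observed dimensions, latent features (toy sizes)

# Binary feature incidence matrix: Z[i, j] = 1 iff instance i has feature j.
Z = rng.integers(0, 2, size=(n, k))

# Latent feature dictionary: one d-dimensional pattern per feature (assumed Gaussian here).
W = rng.normal(size=(k, d))

# Each instance is a combination of the features its row of Z selects, plus noise.
X = Z @ W + 0.1 * rng.normal(size=(n, d))

print(Z.shape, X.shape)
```

Inference is combinatorial precisely because recovering Z from X means searching over the 2^(n*k) possible binary matrices; the nonparametric variants mentioned above additionally let k grow unboundedly.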
| Field | Value |
|---|---|
| Reference Key | yen2017latentproceedings |
| Authors | Yen, Ian E. H.; Lee, Wei-Cheng; Chang, Sung-En; Suggala, Arun S.; Lin, Shou-De; Ravikumar, Pradeep |
| Journal | Proceedings of Machine Learning Research |
| Year | 2017 |
| DOI | DOI not found |
| URL | URL not found |
| Keywords | Keywords not found |

Use the reference key to autocite this article in SciMatic Manuscript Manager or Thesis Manager.