Robust Structure and Motion Factorization of Non-Rigid Objects
2015
Abstract
Structure from motion is an important topic in computer vision. Although great progress has been made in both theory and applications, most algorithms only work for static scenes and rigid objects. In recent years, structure and motion recovery of non-rigid objects and dynamic scenes has received a lot of attention. In this paper, the state-of-the-art techniques for structure and motion factorization of non-rigid objects are reviewed and discussed. First, an introduction to the structure from motion problem is presented, followed by a general formulation of non-rigid structure from motion. Second, an augmented affine factorization framework, based on a homogeneous representation, is presented to solve the registration issue in the presence of outlying and missing data. Third, based on the observation that the reprojection residuals of outliers are significantly larger than those of inliers, a robust factorization strategy with outlier rejection is proposed by means of the reprojection residuals, followed by comparative experimental evaluations. Finally, some future research topics in non-rigid structure from motion are discussed.
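The residual-based outlier rejection described in the abstract can be sketched as follows. This is an illustrative rank-constrained (Tomasi–Kanade-style) factorization of a measurement matrix, not the paper's exact augmented affine algorithm; the function name, the median-based threshold, and the rejection rule are assumptions made for illustration.

```python
import numpy as np

def factorize_and_reject(W, rank=3, thresh=3.0):
    """Low-rank factorization of a measurement matrix with
    reprojection-residual-based outlier rejection (illustrative sketch).

    W : (2F, P) matrix of centered 2D feature tracks
        (F frames, P points; rows are stacked x/y coordinates).
    """
    # Best rank-r factorization via SVD: W ~ M @ S
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    M = U[:, :rank] * np.sqrt(s[:rank])          # motion factor,    (2F, rank)
    S = np.sqrt(s[:rank])[:, None] * Vt[:rank]   # structure factor, (rank, P)

    # Per-point reprojection residual: RMS error of each track
    # over all frames under the low-rank model.
    R = W - M @ S
    residual = np.sqrt((R ** 2).mean(axis=0))

    # Outliers produce residuals significantly larger than inliers,
    # so reject tracks whose residual exceeds a multiple of the median.
    inliers = residual < thresh * np.median(residual)
    return M, S, inliers
```

In practice such a step would be iterated: refit the factorization on the surviving inlier tracks until the inlier set stabilizes.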
| Reference Key | ewang2015frontiersrobust |
|---|---|
| Authors | Guanghui Wang |
| Journal | Frontiers in Robotics and AI |
| Year | 2015 |
| DOI | 10.3389/frobt.2015.00030 |
| URL | |
| Keywords | |

Use this key to autocite in the manuscript while using SciMatic Manuscript Manager or Thesis Manager.