Distributed Compressed Hyperspectral Sensing Imaging Based on Spectral Unmixing
Abstract
The huge volume of hyperspectral imagery demands enormous computational resources, storage, and bandwidth between the sensor and the ground stations. Compressed sensing theory has great potential to reduce this cost by collecting only a few compressed measurements in the onboard imaging system. Inspired by distributed source coding, this paper proposes a distributed compressed sensing framework for hyperspectral imagery. As in distributed compressed video sensing, the spatial-spectral hyperspectral imagery is separated during data collection into a key band and compressed-sensing bands sampled at different rates. However, unlike distributed compressed video sensing, which relies on side information for reconstruction, the widely used spectral unmixing method is employed to recover the hyperspectral imagery. First, endmembers are extracted from the compressed-sensing bands. Then, the endmembers of the key band are predicted by interpolation, and abundance estimation is performed under a sparsity penalty. Finally, the original hyperspectral imagery is recovered through the linear mixing model. Extensive experiments on multiple real hyperspectral datasets demonstrate that the proposed method effectively recovers the original data, and its reconstruction peak signal-to-noise ratio surpasses that of other state-of-the-art methods.
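The recovery stages named in the abstract (endmember extraction, key-band endmember prediction by interpolation, sparse abundance estimation, linear-mixing-model reconstruction) can be pictured with a short NumPy sketch. This is a minimal illustration under the standard linear mixing model X ≈ E A, assuming abundances are estimated from the densely sampled key band; the ISTA solver, the `np.interp`-based endmember prediction, and all function and variable names are assumptions for illustration, not the exact algorithm or code of the paper.

```python
# Minimal sketch of the recovery stage, assuming the standard linear mixing
# model X ≈ E @ A (E: bands x endmembers, A: endmembers x pixels).
# The ISTA solver, np.interp-based endmember prediction, and all names
# (E_cs, Y_key, ...) are illustrative assumptions, not the paper's code.
import numpy as np

def predict_key_band_endmembers(E_cs, cs_band_idx, key_band_idx):
    """Predict endmember signatures at the key-band positions by interpolating
    the signatures extracted from the compressed-sensing bands."""
    P = E_cs.shape[1]
    E_key = np.empty((len(key_band_idx), P))
    for p in range(P):
        E_key[:, p] = np.interp(key_band_idx, cs_band_idx, E_cs[:, p])
    return E_key

def estimate_abundances(Y_key, E_key, lam=1e-3, n_iter=300):
    """Sparse abundance estimation:
    min_A 0.5 * ||Y_key - E_key @ A||_F^2 + lam * ||A||_1,
    solved with plain ISTA (gradient step + soft thresholding)."""
    step = 1.0 / (np.linalg.norm(E_key, 2) ** 2 + 1e-12)  # 1 / Lipschitz constant
    A = np.zeros((E_key.shape[1], Y_key.shape[1]))
    for _ in range(n_iter):
        grad = E_key.T @ (E_key @ A - Y_key)
        A = A - step * grad
        A = np.sign(A) * np.maximum(np.abs(A) - step * lam, 0.0)  # soft threshold
    return A

def reconstruct_cube(E_all, A):
    """Recover every band through the linear mixing model X_hat = E_all @ A."""
    return E_all @ A
```

In practice, E_cs would come from an endmember-extraction step applied to the compressed-sensing bands, and the measurement operators and solver would follow the paper; the sketch only shows how a shared abundance matrix lets the linear mixing model fill in every band once the abundances have been estimated.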
| Reference Key | wang2020sensorsdistributed |
|---|---|
| Authors | Zhongliang Wang; Hua Xiao |
| Journal | Sensors |
| Year | 2020 |
| DOI | 10.3390/s20082305 |
| URL | |
| Keywords | |

Use this key to autocite in the manuscript when using SciMatic Manuscript Manager or Thesis Manager.