CropDeep: The Crop Vision Dataset for Deep-Learning-Based Classification and Detection in Precision Agriculture
2019
Abstract
Intelligence has been considered the major challenge in promoting the economic potential and production efficiency of precision agriculture. To apply advanced deep-learning technology to various agricultural tasks both online and offline, a large number of crop vision datasets with domain-specific annotations are urgently needed. To encourage further progress under challenging, realistic agricultural conditions, we present the CropDeep species classification and detection dataset, consisting of 31,147 images with over 49,000 annotated instances across 31 classes. In contrast to existing vision datasets, the images were collected with different cameras and equipment in greenhouses and captured in a wide variety of situations. The dataset features visually similar species and periodic changes, with representative annotations that support a stronger benchmark for deep-learning-based classification and detection. To further verify the application prospects, we provide extensive baseline experiments using state-of-the-art deep-learning classification and detection models. Results show that current deep-learning-based methods perform well in classification, with accuracy over 99%, but achieve only 92% detection accuracy, illustrating the difficulty of the dataset and the room for improvement left in state-of-the-art models when applied to crop production and management. Specifically, we suggest that the YOLOv3 network has good potential for agricultural detection tasks.
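As an aside on how detection accuracy figures like the 92% above are typically obtained: predicted bounding boxes are matched against the dataset's ground-truth annotations using intersection-over-union (IoU). The sketch below is illustrative only and not from the paper; the helper names `iou` and `match_detections`, the `(x_min, y_min, x_max, y_max)` box format, and the 0.5 IoU threshold are assumptions of this example.

```python
# Illustrative sketch: IoU-based matching of predicted boxes to
# ground-truth annotations, the core step in scoring a detector
# on a dataset such as CropDeep. Boxes are (x_min, y_min, x_max, y_max).

def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Corners of the overlap rectangle.
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    if inter == 0.0:
        return 0.0
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    return inter / (area_a + area_b - inter)

def match_detections(preds, truths, iou_threshold=0.5):
    """Greedily count predictions that match an unused ground-truth box."""
    matched = set()
    true_positives = 0
    for p in preds:
        best_score, best_j = 0.0, -1
        for j, t in enumerate(truths):
            if j in matched:
                continue
            score = iou(p, t)
            if score > best_score:
                best_score, best_j = score, j
        if best_score >= iou_threshold:
            matched.add(best_j)
            true_positives += 1
    return true_positives
```

For example, a prediction overlapping a ground-truth box with IoU 1/7 would not count as a true positive at the usual 0.5 threshold, while an exact match would.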
| Field | Value |
|---|---|
| Reference Key | zheng2019sensorscropdeep |
| Authors | Yang-Yang Zheng; Jian-Lei Kong; Xue-Bo Jin; Xiao-Yi Wang; Ting-Li Su; Min Zuo |
| Journal | Sensors |
| Year | 2019 |
| DOI | 10.3390/s19051058 |
| URL | |
| Keywords | |

Use the reference key to autocite this article in SciMatic Manuscript Manager or Thesis Manager.