DeepFly3D, a deep learning-based approach for 3D limb and appendage tracking in tethered, adult Drosophila melanogaster
Abstract
Studying how neural circuits orchestrate limbed behaviors requires the precise measurement of the positions of each appendage in three-dimensional (3D) space. Deep neural networks can estimate two-dimensional (2D) pose in freely behaving and tethered animals. However, the unique challenges associated with transforming these 2D measurements into reliable and precise 3D poses have not been addressed for small animals, including the fly, Drosophila melanogaster. Here, we present DeepFly3D, a software that infers the 3D pose of tethered, adult Drosophila melanogaster using multiple camera images. DeepFly3D does not require manual calibration, uses pictorial structures to automatically detect and correct pose estimation errors, and uses active learning to iteratively improve performance. We demonstrate more accurate unsupervised behavioral embedding using 3D joint angles rather than commonly used 2D pose data. Thus, DeepFly3D enables the automated acquisition of behavioral measurements at an unprecedented level of detail for a variety of biological applications.
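The core geometric step the abstract describes, lifting 2D keypoint detections from multiple calibrated cameras into a single 3D pose, is classically done per joint by linear (DLT) triangulation. The sketch below is a minimal, generic illustration of that step, not DeepFly3D's actual pipeline (which additionally uses pictorial structures and learned calibration); the camera matrices and point are toy values chosen for the example.

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: 2D pixel coordinates (u, v) of the same joint in each view.
    Returns the 3D point in world coordinates.
    """
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The least-squares solution is the right singular vector of A with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Two toy cameras: one at the origin, one shifted along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 4.0, 1.0])          # ground-truth homogeneous point
x1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]        # projected 2D detections
x2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate_point(P1, P2, x1, x2))         # recovers [0.5, 0.2, 4.0]
```

With noisy real detections, the same SVD solution gives a least-squares estimate, and outlier 2D detections (the errors DeepFly3D catches with pictorial structures) show up as large reprojection residuals.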
| Reference Key | gunel2019deepfly3delife |
|---|---|
| Authors | Günel, Semih; Rhodin, Helge; Morales, Daniel; Campagnolo, João; Ramdya, Pavan; Fua, Pascal |
| Journal | eLife |
| Year | 2019 |
| DOI | 10.7554/eLife.48571 |
| URL | |
| Keywords | |

Use this key to autocite the article in SciMatic Manuscript Manager or Thesis Manager.