Relative Vessel Motion Tracking Using Sensor Fusion, Aruco Markers, and MRU Sensors
Abstract
This paper presents a novel approach for estimating the relative motion between two moving offshore vessels. The method is based on a sensor fusion algorithm combining a vision system and two motion reference units (MRUs). The vision system makes use of the open-source computer vision library OpenCV and a cube with Aruco markers placed on each of its sides. The Extended Quaternion Kalman Filter (EQKF) is used for bad pose rejection in the vision system. The presented sensor fusion algorithm is based on the Indirect Feedforward Kalman Filter for error estimation. The system is self-calibrating in the sense that the Aruco cube can be placed at an arbitrary location on the secondary vessel. Experimental 6-DOF results demonstrate the accuracy and efficiency of the proposed sensor fusion method compared with the internal joint sensors of two Stewart platforms and an industrial robot. The standard deviation of the error was found to be 31 mm or better when the Aruco cube was placed at three different locations.
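The paper itself provides no source code; the following is a minimal sketch of the vision-system component only, assuming the legacy opencv-contrib-python Aruco API (cv2.aruco.detectMarkers and cv2.aruco.estimatePoseSingleMarkers; newer OpenCV releases expose the same functionality through the ArucoDetector class). The camera intrinsics, distortion coefficients, marker dictionary, and marker size below are placeholder assumptions, and the EQKF bad-pose rejection and MRU fusion stages described in the abstract are not shown.

```python
import cv2
import numpy as np

# Placeholder camera intrinsics and distortion coefficients; in practice these
# would come from a camera calibration, not from hard-coded values.
camera_matrix = np.array([[800.0,   0.0, 320.0],
                          [  0.0, 800.0, 240.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)
MARKER_LENGTH_M = 0.10  # assumed side length of each Aruco marker in metres

# Assumed marker dictionary; the paper does not state which dictionary was used.
aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)


def detect_cube_poses(frame):
    """Detect Aruco markers in a camera frame and return one pose per marker.

    Returns a list of (marker_id, rvec, tvec) tuples, one per visible cube face,
    expressed in the camera frame.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = cv2.aruco.detectMarkers(gray, aruco_dict)

    poses = []
    if ids is not None:
        # One rotation vector (Rodrigues) and translation vector per marker.
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, MARKER_LENGTH_M, camera_matrix, dist_coeffs)
        for marker_id, rvec, tvec in zip(ids.flatten(), rvecs, tvecs):
            poses.append((int(marker_id), rvec.reshape(3), tvec.reshape(3)))
    return poses
```

In the method described above, each detected face pose would then be screened by the EQKF-based bad-pose rejection before being fused with the MRU measurements in the indirect feedforward Kalman filter.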
Reference Key | tordal2017modeling,relative (use this key to autocite in a manuscript when using the SciMatic Manuscript Manager or Thesis Manager)
---|---
Authors | Sondre Sanden Tordal; Geir Hovland
Journal | Modeling, Identification and Control
Year | 2017
DOI | 10.4173/mic.2017.2.3
URL |
Keywords |