Hurst entropy: A method to determine predictability in a binary series based on a fractal-related process.
2019
Abstract
Shannon's concept of information is related to predictability. In a binary series, the information content depends on the frequency of 0's and 1's, i.e., on how often each symbol is expected to occur. However, information entropy does not account for the bias in randomness related to autocorrelation. In fact, a binary temporal series can carry both short- and long-term memory in the sequential distribution of 0's and 1's. Although the Hurst exponent measures the range of autocorrelation, there has been no mathematical connection between information entropy and the autocorrelation present in the series. To fill this important gap, we combined numerical simulations and an analytical approach to determine how information entropy changes with the frequency of 0's and 1's and with the Hurst exponent. Indeed, we were able to determine how predictability depends on both parameters. Our findings are useful to several fields in which binary time series arise, from neuroscience to econophysics.
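The two quantities the abstract connects can both be estimated from a binary sequence. As a rough illustration (this is not the authors' method, only a standard-textbook sketch), the snippet below computes the frequency-based Shannon entropy and a crude rescaled-range (R/S) estimate of the Hurst exponent; the function names and the chunk-doubling scheme are choices made here for illustration.

```python
import math
import random

def shannon_entropy(bits):
    """Shannon entropy (bits/symbol) of a binary sequence, from symbol frequency alone."""
    p = sum(bits) / len(bits)
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def hurst_rs(series, min_chunk=8):
    """Crude rescaled-range (R/S) estimate of the Hurst exponent."""
    n = len(series)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs_chunk = []
        for start in range(0, n - size + 1, size):
            chunk = series[start:start + size]
            mean = sum(chunk) / size
            cum, cmax, cmin, dev = 0.0, 0.0, 0.0, 0.0
            for x in chunk:
                cum += x - mean          # cumulative deviation from the mean
                cmax = max(cmax, cum)
                cmin = min(cmin, cum)
                dev += (x - mean) ** 2
            std = math.sqrt(dev / size)
            if std > 0:                  # skip constant chunks
                rs_chunk.append((cmax - cmin) / std)
        if rs_chunk:
            sizes.append(size)
            rs_vals.append(sum(rs_chunk) / len(rs_chunk))
        size *= 2
    # least-squares slope of log(R/S) vs. log(size) is the Hurst estimate
    xs = [math.log(s) for s in sizes]
    ys = [math.log(r) for r in rs_vals]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

random.seed(0)
bits = [random.randint(0, 1) for _ in range(4096)]
print(shannon_entropy(bits))  # near 1.0 for a fair coin
print(hurst_rs(bits))         # roughly 0.5 for an uncorrelated series (small-sample bias pushes it a bit higher)
```

The point of the paper is precisely that these two numbers are not independent: an autocorrelated series (Hurst exponent away from 0.5) is more predictable than its symbol frequencies alone suggest.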
| Field | Value |
|---|---|
| Reference Key | ferraz2019hurstphysical |
| Authors | Ferraz, Mariana Sacrini Ayres; Kihara, Alexandre Hiroaki |
| Journal | Physical Review E |
| Year | 2019 |
| DOI | 10.1103/PhysRevE.99.062115 |

Use the reference key to autocite this article in SciMatic Manuscript Manager or Thesis Manager.