
Publication

Hierarchical temporal memory and recurrent neural networks for time series prediction

Journal contribution - Journal article

Subtitle: an empirical validation and reduction to multilayer perceptrons
Recurrent Neural Networks such as Long Short-Term Memory (LSTM) and Gated Recurrent Units (GRUs) are often deployed as neural network-based predictors for time series data. Recently, Hierarchical Temporal Memory (HTM), a machine learning technology attempting to simulate the human brain's neocortex, has been proposed as another approach to time series prediction. While HTM has gained a lot of attention, little is known about its actual performance compared to the more common RNNs. The only performance comparison between the two, performed at the company behind HTM, shows they perform similarly. In this article, we present a more in-depth performance comparison, involving more extensive hyperparameter tuning and evaluation on more scenarios. Surprisingly, our results show that both LSTM and GRUs can outperform HTM by over 30% at lower runtime. Furthermore, we show that HTM requires explicitly timestamped data to recognize daily and weekly patterns, whereas LSTM needs only the raw sequential data to predict such time series accurately. Finally, our experiments indicate that the temporally aware components of all considered predictors contribute nothing to the prediction accuracy. We further strengthen this claim by presenting Multilayer Perceptrons that perform equally well or better and that are conceptually similar to HTM and LSTM but disregard their temporal aspects. (C) 2019 Elsevier B.V. All rights reserved.
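
The reduction the abstract describes, a feed-forward Multilayer Perceptron trained on a sliding window of the series standing in for a recurrent predictor, can be illustrated with a minimal sketch. The code below is not the authors' experimental setup: the use of PyTorch, the synthetic sine series, the window length, and the layer widths are all illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's code): train an MLP and an
# LSTM on the same sliding-window view of a univariate time series and
# compare their final mean squared errors.
import numpy as np
import torch
import torch.nn as nn

WINDOW = 24  # assumed look-back window length

# Synthetic series with a daily-like periodic pattern (illustrative only).
t = np.arange(2000, dtype=np.float32)
series = np.sin(2 * np.pi * t / 24) + 0.1 * np.random.randn(len(t)).astype(np.float32)

# Sliding windows: X[i] = series[i:i+WINDOW], target y[i] = series[i+WINDOW].
X = np.stack([series[i:i + WINDOW] for i in range(len(series) - WINDOW)])
y = series[WINDOW:]
X_t = torch.from_numpy(X)               # shape (N, WINDOW)
y_t = torch.from_numpy(y).unsqueeze(1)  # shape (N, 1)

class MLP(nn.Module):
    """Feed-forward predictor: treats the window as a flat feature vector."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(WINDOW, 64), nn.ReLU(), nn.Linear(64, 1))
    def forward(self, x):
        return self.net(x)

class LSTMPredictor(nn.Module):
    """Recurrent predictor: consumes the window one step at a time."""
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, 1)
    def forward(self, x):
        out, _ = self.lstm(x.unsqueeze(-1))  # (N, WINDOW, 64)
        return self.head(out[:, -1, :])      # predict from the last time step

for name, model in [("MLP", MLP()), ("LSTM", LSTMPredictor())]:
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(200):  # short full-batch training loop, enough to compare
        opt.zero_grad()
        loss = loss_fn(model(X_t), y_t)
        loss.backward()
        opt.step()
    print(f"{name} final MSE: {loss.item():.4f}")
```

On a strongly periodic series like this one, both models can typically be tuned to similar errors, which is the kind of observation the abstract reports for its far more extensive benchmarks.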
Journal: Neurocomputing: an international journal
ISSN: 0925-2312
Volume: 396
Pages: 291-301
Year of publication: 2020
Keywords: A1 Journal article
BOF-keylabel: yes
BOF-publication weight: 2
CSS-citation score: 2
Authors from: Higher Education
Accessibility: Open