Abstract
Nowadays, huge volumes of data are produced in the form of fast streams, which are further affected by non-stationary phenomena. The resulting lack of stationarity in the distribution of the produced data calls for efficient and scalable algorithms for online analysis capable of adapting to such changes (concept drift). The online learning field has lately turned its focus to this challenging scenario, designing incremental learning algorithms that avoid becoming obsolete after a concept drift occurs. Despite the notable activity in the literature, the need for new efficient and scalable algorithms that adapt to drift remains a research topic deserving further effort. Surprisingly, Spiking Neural Networks, one of the major exponents of the third generation of artificial neural networks, have not been thoroughly studied as an online learning approach, even though they are naturally suited to adapting easily and quickly to changing environments. This work addresses this research gap by adapting Spiking Neural Networks to meet the processing requirements that online learning scenarios impose. In particular, the work focuses on limiting the size of the neuron repository and making the most of this limited size by resorting to data reduction techniques. Experiments with synthetic and real data sets are discussed, leading to the empirically validated assertion that, by virtue of a tailored exploitation of the neuron repository, Spiking Neural Networks adapt better to drifts, obtaining higher accuracy scores than naive versions of Spiking Neural Networks for online learning environments.
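The abstract describes limiting the size of the neuron repository and applying data reduction so that the limited repository keeps tracking the current concept. The sketch below is not the paper's algorithm; it is a minimal, hypothetical Python illustration (the class name, the distance-based merge rule, and the FIFO eviction policy are all assumptions) of how a size-capped prototype repository with a simple merge-based data-reduction step can keep adapting to a drifting stream.

```python
import numpy as np

class BoundedPrototypeClassifier:
    """Illustrative online classifier with a size-capped prototype ("neuron") repository.

    Each training sample either merges into a sufficiently similar prototype of the
    same class (a simple data-reduction step) or is stored as a new prototype; when
    the capacity cap is reached, the oldest prototype is evicted so the repository
    can keep tracking the current concept. This is a hypothetical sketch, not the
    method proposed in the paper.
    """

    def __init__(self, capacity=50, merge_threshold=0.5):
        self.capacity = capacity                 # hard limit on repository size
        self.merge_threshold = merge_threshold   # max distance for merging samples
        self.prototypes = []                     # list of [vector, label, weight]

    def _nearest(self, x, label=None):
        """Return index and distance of the closest prototype (optionally same-class)."""
        best_i, best_d = None, np.inf
        for i, (p, y, _) in enumerate(self.prototypes):
            if label is not None and y != label:
                continue
            d = float(np.linalg.norm(x - p))
            if d < best_d:
                best_i, best_d = i, d
        return best_i, best_d

    def predict(self, x):
        i, _ = self._nearest(x)
        return self.prototypes[i][1] if i is not None else None

    def learn(self, x, y):
        i, d = self._nearest(x, label=y)
        if i is not None and d < self.merge_threshold:
            # Merge: pull the stored prototype toward the new sample (data reduction).
            p, _, w = self.prototypes[i]
            self.prototypes[i] = [(p * w + x) / (w + 1), y, w + 1]
        else:
            if len(self.prototypes) >= self.capacity:
                self.prototypes.pop(0)           # evict the oldest prototype
            self.prototypes.append([x.copy(), y, 1])


# Prequential (test-then-train) loop on a toy stream with an abrupt drift:
# after t = 1000 the class means are swapped, so old prototypes become obsolete.
rng = np.random.default_rng(0)
model = BoundedPrototypeClassifier(capacity=50, merge_threshold=1.0)
correct = 0
for t in range(2000):
    y = int(rng.integers(2))
    mean = (y if t < 1000 else 1 - y) * 3.0
    x = rng.normal(mean, 1.0, size=2)
    if model.predict(x) == y:
        correct += 1
    model.learn(x, y)
print(f"prequential accuracy: {correct / 2000:.3f}")
```

Because stale prototypes are either merged or eventually evicted under the capacity cap, the repository in this toy example recovers after the simulated abrupt drift; the paper's contribution lies in a more principled exploitation of the limited neuron repository within Spiking Neural Networks.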
Original language | English |
---|---|
Pages (from-to) | 1-19 |
Number of pages | 19 |
Journal | Neural Networks |
Volume | 108 |
DOI | |
Status | Published - Dec 2018 |
Keywords
- Spiking Neural Networks
- Data reduction
- Online learning
- Concept drift
Project and Funding Information
- Funding Info
- This work was supported by the EU project Pacific Atlantic Network for Technical Higher Education and Research (PANTHER), grant number 2013-5659/004-001 EMA2.