Lightweight Alternatives for Hyper-parameter Tuning in Drifting Data Streams

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Citations (Scopus)

Abstract

Scenarios dealing with data streams often undergo changes in data distribution, which ultimately lead to a performance degradation of algorithms learning from such data flows (concept drift). This phenomenon calls for the adoption of adaptive learning strategies so that algorithms perform resiliently after a change occurs. A multiplicity of approaches has so far addressed this issue by assorted means, e.g. instance weighting, ensembling, instance selection, or parameter tuning, among others. The latter strategy is often neglected, as it requires a hyper-parameter tuning process that stream learning scenarios cannot computationally afford in most practical settings. Processing times and memory space are usually severely constrained, making the tuning phase unfeasible. Consequently, the research community has largely opted for other adaptive strategies with lower computational demands. This work outlines a new perspective to alleviate the hyper-parameter tuning process in the context of concept drift adaptation. We propose two simple and lightweight mechanisms capable of discovering competitive configurations of learning algorithms used for data stream classification. We compare their performance to that of a modern hyper-parameter search method (Successive Halving) over extensive experiments with synthetic and real datasets. We conclude that our proposed methods perform competitively while consuming less processing time and memory.
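The baseline mentioned in the abstract, Successive Halving, spreads a small evaluation budget over many candidate configurations and repeatedly discards the worst-performing fraction before re-evaluating the survivors on a larger budget. The sketch below is a minimal, generic illustration of that idea in Python; the evaluate callback, the budget schedule, and the Hoeffding-tree-style parameter names (grace_period, split_confidence) are illustrative assumptions, not the paper's actual implementation.

import random

def successive_halving(configs, evaluate, budget=1000, eta=2):
    """Generic Successive Halving sketch.

    Evaluates every candidate configuration on a small budget of stream
    instances, keeps the best 1/eta fraction, and repeats with an
    eta-times larger budget until a single configuration remains.
    `evaluate(config, n_instances)` is assumed to return a prequential
    accuracy estimate obtained on the next n_instances examples.
    """
    candidates = list(configs)
    n_instances = budget
    while len(candidates) > 1:
        scored = [(evaluate(cfg, n_instances), cfg) for cfg in candidates]
        scored.sort(key=lambda pair: pair[0], reverse=True)
        keep = max(1, len(candidates) // eta)  # retain the best 1/eta fraction
        candidates = [cfg for _, cfg in scored[:keep]]
        n_instances *= eta                     # grow the budget each round
    return candidates[0]

# Illustrative usage with a dummy evaluation function (random scores stand in
# for the prequential accuracy a real stream learner would provide).
if __name__ == "__main__":
    grid = [{"grace_period": g, "split_confidence": c}
            for g in (50, 200, 500) for c in (1e-7, 1e-4, 1e-2)]
    best = successive_halving(grid, lambda cfg, n: random.random())
    print("selected configuration:", best)

In a streaming setting, the evaluate callback would typically run a test-then-train (prequential) pass over the next batch of instances, and it is precisely this repeated evaluation cost that the proposed lightweight mechanisms aim to avoid.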

Original language: English
Title of host publication: Proceedings - 21st IEEE International Conference on Data Mining Workshops, ICDMW 2021
Editors: Bing Xue, Mykola Pechenizkiy, Yun Sing Koh
Publisher: IEEE Computer Society
Pages: 304-311
Number of pages: 8
ISBN (electronic): 9781665424271
DOI
Status: Published - 2021
Event: 21st IEEE International Conference on Data Mining Workshops, ICDMW 2021 - Virtual, Online, New Zealand
Duration: 7 Dec 2021 - 10 Dec 2021

Publication series

Name: IEEE International Conference on Data Mining Workshops, ICDMW
Volume: 2021-December
ISSN (print): 2375-9232
ISSN (electronic): 2375-9259

Conference

Conference: 21st IEEE International Conference on Data Mining Workshops, ICDMW 2021
Country/Territory: New Zealand
City: Virtual, Online
Period: 7/12/21 - 10/12/21

Funding

Funders / Funder number:
Horizon 2020 Framework Programme
Horizon 2020: 101000162
Electronic Components and Systems for European Leadership: 783163
