Learning Dynamic Processes with Reservoir Computing

Time: Tuesday, 4 June 2024

Location: G 309

Speaker: Lyudmila Grigoryeva

Many dynamic problems in engineering, control theory, signal processing, time series analysis,
and forecasting can be described using input/output (IO) systems. Whenever a true functional IO relation
cannot be derived from first principles, parsimonious and computationally efficient state-space systems can
be used as universal approximants. We have shown that Reservoir Computing (RC) state-space systems
with simple, easy-to-implement architectures enjoy universal approximation properties that have been proved in several
setups. The defining feature of RC systems is that some components (usually the state map) are randomly
generated, and the observation equation is of a tractable form. From the machine learning perspective, RC
systems can be seen as recurrent neural networks with random weights and a simple-to-train readout layer
(often a linear map). RC systems serve as efficient, randomized, online computational tools for learning
dynamic processes and enjoy generalization properties that can be derived explicitly. We will give a general
introduction to recent theoretical developments, discuss connections with research contributions in
other fields, and address practical aspects of applying RC systems.
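
To make the architecture described above concrete, the following is a minimal sketch of an echo-state-network-style RC system: the state map (reservoir and input weights) is randomly generated and left untrained, and only a linear readout is fitted by ridge regression. The reservoir size, spectral-radius scaling, washout length, ridge parameter, and the toy one-step-ahead prediction task are illustrative assumptions, not details taken from the talk.

    import numpy as np

    rng = np.random.default_rng(0)

    # Dimensions (illustrative choices).
    n_res, n_in, T = 200, 1, 1000

    # Randomly generated state map: reservoir and input weights are fixed, never trained.
    W = rng.normal(size=(n_res, n_res))
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))        # rescale spectral radius below 1
    W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))

    # Toy task (assumed): one-step-ahead prediction of a scalar signal.
    u = np.sin(0.1 * np.arange(T + 1))[:, None]
    inputs, targets = u[:-1], u[1:, 0]

    # Run the reservoir: x_{t+1} = tanh(W x_t + W_in u_t).
    states = np.zeros((T, n_res))
    x = np.zeros(n_res)
    for t in range(T):
        x = np.tanh(W @ x + W_in @ inputs[t])
        states[t] = x

    # Linear readout trained by ridge regression (the only trained component).
    washout, ridge = 100, 1e-6
    X, y = states[washout:], targets[washout:]
    W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

    pred = states @ W_out
    print("train MSE:", np.mean((pred[washout:] - y) ** 2))

The design choice this sketch highlights is the one emphasized in the abstract: because only the readout is trained, learning reduces to a linear least-squares problem, which is what makes RC systems efficient, randomized, and suitable for online computation.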