Conference papers

Layered Memory Automata: Recognizers for Quasi-Regular Languages with Unbounded Memory

Abstract: This paper presents the model of Layered Memory Automata (LaMA), designed to deal with languages over infinite alphabets, with practical applications in the analysis of data streams and in the modeling of complex resource usage in concurrent systems. LaMA can be seen as an extension of Finite Memory Automata (FMA) with memory layers and the capacity to handle an unbounded amount of memory. Despite the increased expressiveness, LaMA preserve most of the "good" properties of FMA, in particular closure under the so-called quasi-regular constructions. Moreover, the layering of the memory enables particularly economical constructions, which is an important focus of our study. The capacity to handle an unbounded amount of memory brings LaMA closer to more powerful automata models such as History Register Automata (HRA), giving them an interesting position at the crossroads between the operational and the more abstract points of view on data languages.
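To give a concrete flavor of the base model that LaMA extend, the following is a minimal sketch of a one-register Finite Memory Automaton over an infinite alphabet. It is an illustrative assumption, not code from the paper: it recognizes the classic FMA example language of words whose last letter equals the first, by storing one data value in a register and comparing each input symbol against it.

```python
class OneRegisterFMA:
    """Illustrative one-register FMA sketch (not from the paper).

    Recognizes words over an infinite alphabet (here: arbitrary Python
    values) whose last letter equals the first letter.
    States: 0 = nothing read yet; 1 = last symbol differs from the
    register; 2 = last symbol equals the register (accepting).
    """

    def accepts(self, word):
        state, register = 0, None
        for symbol in word:
            if state == 0:
                # Store the first data value; a one-letter word is accepted.
                register, state = symbol, 2
            elif symbol == register:
                state = 2  # current symbol matches the stored value
            else:
                state = 1
        return state == 2


fma = OneRegisterFMA()
print(fma.accepts([7, 3, 9, 7]))  # first == last
print(fma.accepts([7, 3, 9]))     # first != last
```

An FMA is limited to a fixed number of such registers; the LaMA of the paper add memory layers and lift the bound on the amount of memory while keeping the quasi-regular closure properties.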

https://hal-univ-evry.archives-ouvertes.fr/hal-03713609
Contributor: Frédéric Davesne
Submitted on : Monday, July 4, 2022 - 9:46:55 PM
Last modification on : Wednesday, August 3, 2022 - 3:58:02 AM

Citation

Clément Bertrand, Hanna Klaudel, Frédéric Peschanski. Layered Memory Automata: Recognizers for Quasi-Regular Languages with Unbounded Memory. 43rd International Conference on Application and Theory of Petri Nets and Concurrency (PETRI NETS 2022), Jun 2022, Bergen, Norway. pp.43--63, ⟨10.1007/978-3-031-06653-5_3⟩. ⟨hal-03713609⟩
