Layered Memory Automata: Recognizers for Quasi-Regular Languages with Unbounded Memory
Conference Paper, 2022


Clément Bertrand, Hanna Klaudel, Frédéric Peschanski

Abstract

This paper presents the model of Layered Memory Automata (LaMA) to deal with languages over infinite alphabets, with practical applications in the analysis of data streams and the modeling of complex resource usage in concurrent systems. The LaMA can be seen as an extension of the Finite Memory Automata (FMA) with memory layers and the capacity to handle an unbounded amount of memory. Despite the increased expressiveness, the LaMA preserve most of the “good” properties of the FMA, in particular closure under the so-called quasi-regular constructions. Moreover, the layering of the memory enables particularly economical constructions, which is an important focus of our study. The capacity to handle an unbounded amount of memory brings the LaMA closer to more powerful automata models such as the history register automata (HRA), so they occupy an interesting position at the crossroads between the operational and the more abstract points of view on data languages.
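To make the underlying idea of register-based recognition over infinite alphabets concrete, the following is a minimal illustrative sketch (not the paper's LaMA construction) of a one-register, FMA-style recognizer for the classic data language "some value occurs at least twice". The class name and interface are hypothetical; the paper's LaMA extends this kind of model with memory layers and unboundedly many stored values.

```python
class OneRegisterAutomaton:
    """Toy FMA-style recognizer with a single register (illustrative only).

    Accepts exactly the data words, over an arbitrary (possibly infinite)
    alphabet of hashable values, in which some value appears twice.
    """

    def accepts(self, word):
        # Simulate the non-deterministic guess of an FMA: for each position,
        # store the value read there in the register, then compare the
        # register against every later input symbol.
        for i, stored in enumerate(word):
            if stored in word[i + 1:]:  # equality test against the register
                return True
        return False
```

For example, `OneRegisterAutomaton().accepts([1, 2, 1])` holds because the value `1` repeats, while `[1, 2, 3]` is rejected; note that the alphabet of values is never enumerated, which is the essential feature shared with FMA, LaMA, and HRA.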

Dates and versions

hal-03713609 , version 1 (04-07-2022)

Cite

Clément Bertrand, Hanna Klaudel, Frédéric Peschanski. Layered Memory Automata: Recognizers for Quasi-Regular Languages with Unbounded Memory. 43rd International Conference on Application and Theory of Petri Nets and Concurrency (PETRI NETS 2022), Jun 2022, Bergen, Norway. pp.43--63, ⟨10.1007/978-3-031-06653-5_3⟩. ⟨hal-03713609⟩