Cyclostationary Random Number Sequences for the Tsetlin Machine
Peer reviewed, Journal article
Accepted version
Permanent link
https://hdl.handle.net/11250/3067645
Publication date
2022
Original version
Tunheim, Svein Anders; Yadav, Rohan Kumar; Jiao, Lei; Shafik, Rishad; Granmo, Ole-Christoffer (2022). Cyclostationary Random Number Sequences for the Tsetlin Machine. Lecture Notes in Computer Science, 13343, 844-856. https://doi.org/10.1007/978-3-031-08530-7_71
Abstract
The Tsetlin Machine (TM) is an emerging machine learning algorithm that has shown competitive performance on several benchmarks. The underlying concept of the TM is propositional logic determined by a group of finite state machines that learn patterns. TM-based systems therefore naturally lend themselves to low-power operation when implemented in hardware for micro-edge Internet-of-Things applications. An important aspect of the learning phase of TMs is stochasticity. For low-power integrated circuit implementations, the random number generation must be carried out efficiently. In this paper, we explore the application of pre-generated cyclostationary random number sequences for TMs. Through experiments on two machine learning problems, namely Binary Iris and Noisy XOR, we demonstrate that the accuracy is on par with that of the standard TM. We show that the sequence length required to meet the conflicting trade-offs can be identified through exploratory simulations. Furthermore, the TMs achieve robust performance against reduced resolution of the random numbers. Finally, we show that maximum-length sequences implemented by linear feedback shift registers are suitable for generating the required random numbers.
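The paper's hardware implementation is not reproduced here, but the final point of the abstract can be sketched in software: a maximal-length linear feedback shift register cycles through all 2^n - 1 non-zero states, yielding a cyclostationary pseudo-random sequence, and truncating the state to its top bits models the reduced-resolution random numbers the abstract mentions. The 16-bit width, the Galois tap set (16, 14, 13, 11), and the 8-bit truncation below are illustrative assumptions, not the authors' configuration.

```python
def lfsr16_step(state: int) -> int:
    """One step of a 16-bit Galois LFSR.

    Taps at bits 16, 14, 13, 11 (toggle mask 0xB400) give a
    maximal-length sequence with period 2**16 - 1 = 65535.
    """
    lsb = state & 1
    state >>= 1
    if lsb:
        state ^= 0xB400
    return state


def random_sequence(seed: int, length: int, bits: int = 8) -> list:
    """Draw `length` pseudo-random numbers of `bits` resolution.

    Keeping only the top `bits` bits of each state models feeding the
    TM with reduced-resolution random numbers (an assumption for
    illustration; the paper's resolution choices may differ).
    """
    state = seed
    out = []
    for _ in range(length):
        state = lfsr16_step(state)
        out.append(state >> (16 - bits))
    return out
```

Because the sequence is periodic, replaying it from a fixed seed makes the stochastic learning phase reproducible in hardware at the cost of a single shift register.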
Description
Author's accepted manuscript