Assessment of sequential Boltzmann machines on a lexical processing task
Testolin, Alberto; Sperduti, Alessandro; Stoianov, Ivilin Peev; Zorzi, Marco
2012
Abstract
The Recurrent Temporal Restricted Boltzmann Machine (RTRBM) is a promising probabilistic model for processing temporal data. It has been shown to learn physical dynamics from videos (e.g., bouncing balls), but its ability to process sequential data has not been tested on symbolic tasks. Here we assess its capability to learn sequences of letters corresponding to English words. The model proved able to extract local transition rules between successive items of a sequence (i.e., English graphotactic rules), but it does not appear suited to encoding a whole word.
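For readers unfamiliar with the model, the sketch below illustrates the deterministic hidden-state recurrence that gives the RTRBM (Sutskever, Hinton and Taylor, 2008) its memory of previous items, applied here to a one-hot letter encoding of a word. The encoding scheme, layer sizes, and random parameters are illustrative assumptions only, not the setup reported in the paper.

```python
import numpy as np

def one_hot_word(word, alphabet="abcdefghijklmnopqrstuvwxyz"):
    """Encode a word as a sequence of one-hot letter vectors.
    (Hypothetical coding; the paper's actual input representation is not given here.)"""
    idx = {c: i for i, c in enumerate(alphabet)}
    seq = np.zeros((len(word), len(alphabet)))
    for t, ch in enumerate(word):
        seq[t, idx[ch]] = 1.0
    return seq

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rtrbm_hidden_states(v_seq, W, W_rec, b_h, h_init):
    """Deterministic RTRBM hidden-state recurrence:
    h_t = sigmoid(W v_t + W_rec h_{t-1} + b_h)."""
    h_prev = h_init
    states = []
    for v_t in v_seq:
        h_t = sigmoid(W @ v_t + W_rec @ h_prev + b_h)
        states.append(h_t)
        h_prev = h_t
    return np.stack(states)

# Toy usage with random (untrained) parameters.
rng = np.random.default_rng(0)
n_vis, n_hid = 26, 50
v_seq = one_hot_word("ball")
W = rng.normal(scale=0.01, size=(n_hid, n_vis))
W_rec = rng.normal(scale=0.01, size=(n_hid, n_hid))
b_h = np.zeros(n_hid)
h0 = np.zeros(n_hid)
H = rtrbm_hidden_states(v_seq, W, W_rec, b_h, h0)
print(H.shape)  # (4, 50): one hidden state per letter
```

The hidden state at each step conditions the biases of the next RBM in the chain, which is why the model can capture local letter-to-letter transitions while retaining only a limited memory of the whole word.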