Sequence learning from scratch

I would like to apply a language-model approach to sequences that are not words, as in NLP. The sequences are actions taken by users visiting a website. Say a user clicks, searches, browses, searches, clicks, and leaves: that is one sequence, analogous to a document (say, a Wikipedia entry) in the NLP world. The hypothesis is that such sequences of actions may stem from a data-generating process similar to the one in natural language, which we would model with a language model.

There is no transfer learning that I am aware of in this case, since it is a new application. Has anyone ever hacked the fastai text library to fit the more general case of a sequence of symbols? I'd envisage it being done by training a text model from scratch, skipping a few steps (tokenization, for example, would not apply, since it is an NLP-specific concept).
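To make the analogy concrete, here is a minimal sketch (pure Python, no fastai) of what I have in mind: sessions of action symbols play the role of documents, a vocabulary is built directly over the action symbols (no tokenizer needed, since actions are already discrete), and a toy bigram model predicts the next action. The session data and the `predict_next` helper are made up for illustration; a real setup would swap the bigram counts for a neural language model trained from scratch.

```python
from collections import Counter, defaultdict

# Hypothetical example sessions: each user session is a sequence of
# action symbols, analogous to a document of tokens in NLP.
sessions = [
    ["click", "search", "browse", "search", "click", "leave"],
    ["search", "click", "browse", "leave"],
    ["click", "click", "search", "leave"],
]

# Build a vocabulary over action symbols -- the analogue of a token
# vocabulary, except no tokenization step is required.
vocab = sorted({a for s in sessions for a in s})
stoi = {a: i for i, a in enumerate(vocab)}  # symbol -> integer id

# A minimal bigram "language model": count next-action frequencies.
counts = defaultdict(Counter)
for s in sessions:
    for prev, nxt in zip(s, s[1:]):
        counts[prev][nxt] += 1

def predict_next(action):
    """Return the most frequent follower of `action` in the sessions."""
    return counts[action].most_common(1)[0][0]
```

The integer ids in `stoi` are exactly what you would feed into an embedding layer when replacing the bigram counts with a real model, which is why I suspect most of the fastai text pipeline after tokenization would carry over unchanged.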