Deep learning on a stream of event data to make predictions

Hi, I’m trying to wrap my head around using the fastai library to make predictions from event data.

Basically, I want to ask “For a given user and their past N actions, what is the probability they will make a purchase today?” where purchase is a specific action they can take.

I’ve thought about a couple approaches but I’m not sure if I’m tackling this the right way.

First I thought about taking each user’s event data, splitting it by orders, and collapsing each split into an image. By that I mean all the events between two orders (exclusive of the starting order and inclusive of the next order, if it exists) would form one set, or batch, that I would translate into an image. My plan is then to train a resnet model to predict whether the image contains an order.
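To make that first idea a bit more concrete, here’s a rough sketch of what I mean by rendering one inter-order window as an image. The event names and the binary event-type × time layout are just placeholder assumptions about the real schema, not a real encoding scheme:

```python
import numpy as np

# Hypothetical event vocabulary -- stand-ins for the real event types.
EVENT_TYPES = ["view", "search", "add_to_cart", "purchase"]
EVENT_INDEX = {name: i for i, name in enumerate(EVENT_TYPES)}

def window_to_image(events, width=64):
    """Render one inter-order window of events as a 2D array.

    Each row corresponds to an event type and each column to a time
    step, so the "image" is a binary event-type x time grid that a
    resnet-style model could consume after stacking it to 3 channels.
    """
    img = np.zeros((len(EVENT_TYPES), width), dtype=np.float32)
    for t, event in enumerate(events[:width]):
        img[EVENT_INDEX[event], t] = 1.0
    return img

# One window: everything after the previous order, up to and
# including the next order (if any).
window = ["view", "search", "view", "add_to_cart", "purchase"]
image = window_to_image(window)
label = "purchase" in window  # target: did this window contain an order?
```

Windows with no purchase would get `label = False`, giving the resnet a binary classification target.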

Another idea is to take the event data split by orders and somehow collapse each split down to a single row, where the dependent variable is whether or not that window resulted in an order. This row-per-window dataset would then be used to train a tabular model to predict the likelihood of an order being placed for a given row.
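For the second idea, the “collapse to a single row” step might look something like this with pandas. The column names (`user_id`, `window_id`, `event`) and the specific count features are just guesses at a plausible schema:

```python
import pandas as pd

# Toy event log; the real data would have many more users and event types.
events = pd.DataFrame({
    "user_id":   [1, 1, 1, 2, 2],
    "window_id": [0, 0, 0, 0, 0],
    "event":     ["view", "add_to_cart", "purchase", "view", "search"],
})

# Collapse each (user, window) group of events into one feature row,
# with `ordered` as the dependent variable for a tabular model.
rows = events.groupby(["user_id", "window_id"]).agg(
    n_events=("event", "size"),
    n_views=("event", lambda s: (s == "view").sum()),
    n_cart_adds=("event", lambda s: (s == "add_to_cart").sum()),
    ordered=("event", lambda s: int((s == "purchase").any())),
).reset_index()
```

The resulting `rows` frame could then be fed to something like fastai’s tabular learner, with `ordered` as the target column.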

Apart from these ideas, are there other ways of slicing this event data to make purchase predictions?

For context, I come from a software engineering background and am relatively new to the world of data science/statistics/machine learning.