Been reading through this notebook: https://github.com/fastai/fastai/blob/master/courses/dl1/lesson4-imdb.ipynb
and found a couple of statements involving Python’s “partial” API. They are as follows:
learner.reg_fn = partial(seq2seq_reg, alpha=2, beta=1)
Now, as if the above wasn’t confounding enough, there’s a partial used for a class as well!
opt_fn = partial(optim.Adam, betas=(0.7, 0.99))
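For reference, here’s a minimal, self-contained sketch of the pattern as I understand it (the `seq2seq_reg_demo` and `Adam` stand-ins below are my own simplified mock-ups, not fastai’s or PyTorch’s actual implementations):

```python
from functools import partial

def seq2seq_reg_demo(output, alpha=0, beta=0):
    # Stand-in for fastai's seq2seq_reg; the body is illustrative only.
    return output + alpha + beta

# partial freezes the keyword arguments and returns a new callable;
# nothing is executed yet.
reg_fn = partial(seq2seq_reg_demo, alpha=2, beta=1)
print(reg_fn(10))  # 13

# Classes are callables too (calling one invokes __init__ via __call__),
# so partial works on them exactly the same way.
class Adam:
    # Stand-in for torch.optim.Adam's constructor signature.
    def __init__(self, params, betas=(0.9, 0.999)):
        self.params = params
        self.betas = betas

opt_fn = partial(Adam, betas=(0.7, 0.99))
opt = opt_fn([1, 2, 3])  # equivalent to Adam([1, 2, 3], betas=(0.7, 0.99))
print(opt.betas)  # (0.7, 0.99)
```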
Just wondering why the “partial” API was explicitly used. I replaced it with a plain call, i.e.
learner.reg_fn = seq2seq_reg()
and got the code to run and train just fine, except that the alpha and beta values default to zero. But setting the desired alpha and beta values would surely take only a small code modification.
Anyways, the crux of the question: other than providing syntactic sugar, is “partial” critical in any other way? The Python docs for “partial” describe its usage in terms of functions, so applying partial to the “optim.Adam” class is even more beguiling to me. I wonder how that works.
Appreciate any feedback… I’m just a plain old Java guy who’s having a heck of a time with the Pythonic way of doing things.