Do I sense a competition with Jeremy again?
How does an Autoencoder differ from a Variational-Autoencoder?
Would you have any advice for modifying the library on the fly?
I was reading this earlier today - it gives a quick summary:
When are we expecting part 2?
I would copy what you want to modify to a Jupyter notebook and you can modify it there.
There is no date yet but it should be around March 2018.
Unfortunately, I had to dive deep into multiple layers of the library and it quickly became a real rabbit hole.
Good. That gives buffer time for ML course.
What are you trying to do?
I’ve seen TensorFlow using Xavier initialization. How is it different from the He initialization used by PyTorch/fastai?
What does the kaiming_normal function do?
It initializes the weights.
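To make the difference concrete, here is a minimal sketch using PyTorch's built-in initializers. The key idea: Xavier scales the weight variance by both fan-in and fan-out, while He/kaiming scales by fan-in only, with an extra factor of 2 to compensate for ReLU zeroing out half the activations. The layer sizes below are just illustrative.

```python
import math
import torch
import torch.nn as nn

layer_xavier = nn.Linear(512, 256)
nn.init.xavier_normal_(layer_xavier.weight)   # std = sqrt(2 / (fan_in + fan_out))

layer_he = nn.Linear(512, 256)
nn.init.kaiming_normal_(layer_he.weight)      # std = sqrt(2 / fan_in), assumes ReLU-like nonlinearity

# The empirical stds should land near the theoretical values:
print(layer_xavier.weight.std().item(), math.sqrt(2 / (512 + 256)))  # roughly 0.051
print(layer_he.weight.std().item(), math.sqrt(2 / 512))              # roughly 0.0625
```

With ReLU nets, He initialization keeps the activation variance roughly constant across layers, which is why PyTorch/fastai default to it.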
One thing that would make the notebooks easier to read is using keyword names for the parameters you pass to a class and/or function.
Right now it is a bit confusing what the values you pass stand for…
I tried to debug why my GPU is starving and to implement another loss function. To do that, I had to modify StructuredLearner to take my new loss and start digging into what was underneath.
I haven’t managed to make it work so far.
Basically, VAEs try to recreate something ‘similar’ to the input, as opposed to the ‘exact thing’ the input was.
Ask in the forums. To change your loss function:
Instead of this
```python
class StructuredLearner(Learner):
    def __init__(self, data, models, **kwargs):
        super().__init__(data, models, **kwargs)
        self.crit = F.mse_loss
```
Use something like this
```python
class StructuredLearner2(Learner):
    def __init__(self, data, models, **kwargs):
        super().__init__(data, models, **kwargs)
        self.crit = ...  # write your loss here
```
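The subclassing pattern can be sketched without fastai installed, using a stand-in `Learner` base class (hypothetical; the real one lives in fastai) and L1 loss as an example criterion:

```python
import torch
import torch.nn.functional as F

class Learner:
    # Stand-in for fastai's Learner, just to make the sketch runnable.
    def __init__(self, data, models, **kwargs):
        self.data, self.models = data, models
        self.crit = None

class StructuredLearner2(Learner):
    # Override only the criterion; everything else is inherited,
    # so the library source stays untouched.
    def __init__(self, data, models, **kwargs):
        super().__init__(data, models, **kwargs)
        self.crit = F.l1_loss  # example custom loss: swap in your own

learn = StructuredLearner2(data=None, models=None)
print(learn.crit(torch.zeros(3), torch.ones(3)))  # tensor(1.)
```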
That’s what I did. But to use StructuredLearner2, I also needed to modify ColumnarData, and it started to look messy in my notebook.
I’m looking for a more efficient way to do it - some kind of best-practice approach to those modifications.