I’m not on twitter. Thanks
How are transforms applied lazily? If I have a bunch of functions (similar to transforms) and I want to run all of them on say a Python list, how do I code them so that they are applied lazily?
You can use functions directly as transforms, and use a TfmdList.
http://dev.fast.ai/data.core#show_at
Otherwise look at the source to TfmdList.
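If you just want the lazy behavior on a plain Python list without the library, a generator gives you the same effect: nothing runs until you consume it. This is a minimal sketch, not fastai code; `lazy_pipeline`, `double`, and `inc` are hypothetical names for illustration.

```python
from functools import reduce

def lazy_pipeline(items, funcs):
    """Yield each item with every function in `funcs` applied in order.
    No function runs until the generator is actually consumed."""
    for item in items:
        yield reduce(lambda x, f: f(x), funcs, item)

# hypothetical transforms
double = lambda x: x * 2
inc = lambda x: x + 1

out = lazy_pipeline([1, 2, 3], [double, inc])  # nothing has run yet
result = list(out)                             # work happens here
```

Consuming the generator with `list`, a `for` loop, or `next` is what triggers the transforms, one item at a time.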
How can I init a dataset from the siamese example? What am I getting wrong here?
tfms = [[sp, OpenAndResize], [labeller, Categorize]]
dsets = Datasets(items, tfms, verbose=True)
t = dsets[0]
print(type(t[0]),type(t[1]))
x,y = dsets.decode(t)
print(x.shape,y)
dsets.show(t);
@jeremy Sorry if the tagging seems unnecessary, but I’m really stuck with handling siamese dataset initialization. I’ve been stuck on this for quite some time, and since it is an example presented in the notebooks I thought it would be straightforward.
And yet you still did it…
I would appreciate your response; I’m not looking to bother you. It seems like basic functionality the library should support, since this is an example from your notebooks. It’s very easy to claim fastai2 supports the siamese flow with half-baked examples.
At the time it was; there have been updates to the library since then. Jeremy and Sylvain are working tirelessly on their new library, and they’re doing their best to update the documentation and examples when they can. If you can wait a week or two, I will show an example in my study group @zlapp
That would be great. Thanks @muellerzr
Appreciate all the work being done on the library, and I’m happy to help solve this issue with a PR to fastai2 @jeremy @muellerzr. The reason I’m venturing into the siamese flow is to begin developing contrastive self-supervised learning.
I didn’t fully understand your question. What do you mean by initialization? Have a look at the output of your items. If you are on Windows, have a look at the RegexLabeller source code (you need to change it a little bit). Give it a try; if it doesn’t work, we can go from the output of your path. Before you start working, try a git pull so you know you’re on the latest version. I worked on it last weekend and everything was fine in the notebook, so I suspect the problem is somewhere local, not in fastai.
This is totally unacceptable. Please stay off these forums until you’re ready to contribute more appropriately.
Can you explain what’s happening here? I don’t quite get it.
@dipam7 (I’ll attempt to answer.) We have two functions, mul and add, each of which accepts specific types (A and B). With type dispatching, the dispatcher looks at the input’s type, compares it against the functions we registered, and calls whichever function matches that type. Does this help?
So we use it as a function that behaves differently based on the data type passed to it. What’s the practical use case for it? Suppose we have a function with a lot of isinstance checks to begin with, will it serve as a replacement for that?
Best example is on transforms. Look at flip_lr here:
We want different augmentations based on what type the input is, so we can define what we expect x (our input) to be. (That’s also an example of it being applied.)
You could use isinstance, but as you can see above, that would get really messy really quickly.
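The same idea exists in the standard library as `functools.singledispatch`, which is a reasonable way to see the mechanism without fastcore: register one implementation per input type and the dispatcher picks the right one. This is an analogue of what TypeDispatch does, not fastai’s implementation; `flip` here is a made-up example function.

```python
from functools import singledispatch

@singledispatch
def flip(x):
    # fallback when no registered type matches
    raise TypeError(f"no flip implementation for {type(x).__name__}")

@flip.register
def _(x: str):
    # strings get reversed character by character
    return x[::-1]

@flip.register
def _(x: list):
    # lists get reversed element by element
    return list(reversed(x))
```

Calling `flip("abc")` and `flip([1, 2, 3])` runs different bodies purely based on the argument’s type, which is exactly the pattern that replaces a chain of isinstance checks.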
Thanks, that’s really helpful. Can you also tell me how to determine the data type of the parameter? I was trying it with int, float, and str, but it gives me an error.
Code:
@patch
def resize(a:int, size: int): return [a] * size
@patch
def resize(a:str, size: int): return a * size
@patch
def resize(a:float, size: int): return round(a, size)
And the error message
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-10-036c812eca6a> in <module>()
1 @patch
----> 2 def resize(a:int, size: int): return [a] * size
3
4 @patch
5 def resize(a:str, size: int): return a * size
1 frames
/usr/local/lib/python3.6/dist-packages/fastcore/foundation.py in _inner(f)
71 for o in functools.WRAPPER_ASSIGNMENTS: setattr(nf, o, getattr(f,o))
72 nf.__qualname__ = f"{c_.__name__}.{f.__name__}"
---> 73 setattr(c_, f.__name__, property(nf) if as_prop else nf)
74 return f
75 return _inner
TypeError: can't set attributes of built-in/extension type 'int'
If you notice, each of those is a higher-level class, not a Python built-in type (that’s just my observation).
Thanks. I just figured out you can use Python data types as well with an argument of bases=t in TypeDispatch.
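For anyone hitting the same TypeError: the root cause is that CPython forbids setting attributes on built-in types like int, which is why patching them directly fails. A plain-Python way around it is to subclass the built-in, which is roughly what those higher-level wrapper classes do. This is an illustration of the Python behavior only; `Int` and `resize` here are made-up names, not fastcore APIs.

```python
class Int(int):
    """A thin wrapper around int that, unlike the built-in, accepts new attributes."""
    pass

def resize(a, size):
    # hypothetical method: repeat the value `size` times
    return [a] * size

# attaching to the built-in type raises TypeError...
try:
    int.resize = resize
    patched_builtin = True
except TypeError:
    patched_builtin = False

# ...but attaching to the subclass works fine
Int.resize = resize
```

Instances of the subclass still behave like ints (`Int(3) + 1 == 4`), so wrapping inputs in it is usually painless.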