Hi all. I’m opening this topic to discuss meanshift: the state of the art, reproducibility, and possible improvements.
I did a quick search online and found some interesting new approaches:
I personally find meanshift++ a very interesting direction since the runtime grows linearly with dataset size!
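I haven't reproduced the paper's code, but the grid idea behind MeanShift++ (as I understand it) can be sketched in a few lines: bin points into cells of side `h` and shift each point to the mean of the points in its 3×3 cell neighborhood, so each iteration costs roughly O(n·3^d) instead of the O(n²) of all-pairs mean shift. Everything below (the data, `h`, the restriction to 2-D) is my own toy setup, not the paper's implementation:

```python
import numpy as np
from collections import defaultdict

def grid_shift_step(X, h):
    """One grid-based shift in the spirit of MeanShift++ (2-D only):
    each point moves to the mean of the points in its 3x3 grid-cell
    neighborhood, avoiding all-pairs distance computations."""
    cells = np.floor(X / h).astype(int)
    # Accumulate per-cell sums and counts in one linear pass
    sums = defaultdict(lambda: np.zeros(2))
    counts = defaultdict(int)
    for c, x in zip(map(tuple, cells), X):
        sums[c] += x
        counts[c] += 1
    # Shift each point using its 3x3 cell neighborhood
    out = np.empty_like(X)
    for i, (cx, cy) in enumerate(map(tuple, cells)):
        s, n = np.zeros(2), 0
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                nb = (cx + dx, cy + dy)
                if nb in counts:
                    s += sums[nb]
                    n += counts[nb]
        out[i] = s / n
    return out

# Toy data: two well-separated 2-D blobs
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0, 1, (50, 2)),
                    rng.normal(20, 1, (50, 2))])
for _ in range(10):
    X = grid_shift_step(X, h=1.0)
# Each blob collapses toward its own mode.
```

The per-iteration cost is linear in the number of points (plus the number of occupied cells), which is where the linear scaling with dataset size comes from.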
November 2, 2022, 12:15am
I recommend having a go at implementing some ideas before reading these papers, BTW.
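For anyone wanting a starting point, the plain all-pairs version is only a few lines; the Gaussian kernel, bandwidth, and toy data here are my own choices:

```python
import numpy as np

def meanshift_step(X, bandwidth):
    """One mean shift iteration: every point moves to the
    Gaussian-kernel-weighted mean of all points (O(n^2))."""
    # Pairwise squared distances, shape (n, n)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # Gaussian kernel weights
    w = np.exp(-0.5 * d2 / bandwidth**2)
    # Weighted mean of all points, for each point
    return (w @ X) / w.sum(1, keepdims=True)

# Toy data: two well-separated 2-D blobs
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0, 1, (50, 2)),
                    rng.normal(10, 1, (50, 2))])
for _ in range(10):
    X = meanshift_step(X, bandwidth=1.0)
# Points collapse toward the two cluster modes.
```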
January 7, 2023, 12:41am
I implemented the miniai meanshift using JAX (just to learn), but it is slower than what we had:
PyTorch: 3.93 ms ± 185 µs per loop (mean ± std. dev. of 7 runs, 10 loops each)
JAX: 62.3 ms ± 1.74 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)
Matrix multiplication at first looked 44 times faster in PyTorch than in JAX, but after fixing my benchmark, PyTorch was only 1.6 times faster than JAX for matrix multiplication.
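One pitfall worth ruling out when the numbers swing this much (44× vs 1.6×): JAX dispatches work asynchronously, so a naive timing loop can measure how fast operations are *queued* rather than how fast they run. A sketch of a fairer matmul timing, with the matrix sizes being my own assumption:

```python
import time
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
A = jax.random.normal(key, (512, 512))
B = jax.random.normal(key, (512, 512))

def time_matmul(n_iters=10):
    # Warm up: the first call pays compilation/dispatch overhead.
    (A @ B).block_until_ready()
    t0 = time.perf_counter()
    for _ in range(n_iters):
        C = A @ B
    # JAX dispatches asynchronously; without this call, the loop
    # above only measures dispatch time, not compute time.
    C.block_until_ready()
    return (time.perf_counter() - t0) / n_iters

per_iter = time_matmul()
```

`%timeit` in a notebook has the same issue unless the timed expression itself calls `block_until_ready()`.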
JAX meanshift is 2.24 times faster than the batched version of PyTorch meanshift:
PyTorch: 3.8 ms ± 133 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
JAX: 1.69 ms ± 49.1 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
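The batched JAX version I'd expect to behave like this is roughly a `jit`-compiled `vmap` over points; the function names, bandwidth, and data below are my assumptions, not the notebook's actual code:

```python
from functools import partial
import jax
import jax.numpy as jnp

@partial(jax.jit, static_argnames="n_iters")
def meanshift(X, bandwidth=1.0, n_iters=5):
    """Jitted mean shift: vmap computes the shifted position of every
    point in parallel, and jit fuses the whole loop into one compiled
    program, which is typically where JAX's speedup comes from."""
    def shift_one(x, P):
        # Gaussian weights of all points P relative to a single point x
        w = jnp.exp(-0.5 * ((P - x) ** 2).sum(-1) / bandwidth**2)
        return (w[:, None] * P).sum(0) / w.sum()
    for _ in range(n_iters):
        X = jax.vmap(shift_one, in_axes=(0, None))(X, X)
    return X

# Toy data: two well-separated 2-D blobs
key = jax.random.PRNGKey(0)
pts = jnp.concatenate([jax.random.normal(key, (64, 2)),
                       jax.random.normal(key, (64, 2)) + 8.0])
modes = meanshift(pts)
```

Note `n_iters` is a static argument so the Python loop unrolls at trace time; re-calling with a different `n_iters` triggers a recompile.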
January 9, 2023, 6:51pm
The timings above are on GPU. I will share the notebook, but today was a crazy day where I live (Brasília, Brazil’s capital).
January 9, 2023, 9:27pm
Oh… interesting times for you, I expect.
January 10, 2023, 10:11pm
There was an error in the implementation. Here are the notebooks for the results above:
January 10, 2023, 10:20pm
A day of infamy, I would say. It is so sad to see what these 21st-century fascists believe and all the damage they can do to our society (social-network AI algorithms are much to blame, in my opinion, but not the only culprits, of course).
The positive side of this story is that they did not accomplish what they wanted, and despite the damage to buildings and artworks, nobody was killed.