Jeremy's TED talk: GUI-based trainer?

@jeremy, did the demo shown in the talk involve zero code, i.e. a sort of GUI-based trainer?

The exact context is the cars example at: https://youtu.be/t4kyRyKyOpo?t=789

Hi @vikbehal, I don’t think so. The iterative process Jeremy shows in his video is post-training: what he seeks to understand is why the classifier makes the errors it does, and obviously how to improve it, but it builds on a preliminary classification process. Greetings.

I’m not sure if I heard it correctly in the video, but Jeremy says ‘without DL’?

Hi @vikbehal. What happens is that this example illustrates the concept of a human and the machine working as a team to improve the algorithm’s performance. At this stage the human corrects certain errors in the decisions the algorithm made, then another training process begins that corrects the errors that were pointed out. This considerably improves the performance of the algorithm, and I can see it being perfectly plausible in the medical field in the future. Greetings.
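Roughly, the loop looks like this. This is only a minimal PyTorch sketch of the general human-in-the-loop idea, not Jeremy’s actual code: the model, the data, and the `correct_label` callback (which stands in for a GUI asking a person) are all placeholders I made up for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def fine_tune(model, xs, ys, epochs=3, lr=1e-3):
    """One round of training on the (possibly corrected) labels."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.cross_entropy(model(xs), ys)
        loss.backward()
        opt.step()
    return model

def human_in_the_loop(model, xs, ys, correct_label, rounds=5, k=10):
    """Show the k least-confident examples to a human, replace their labels
    with the human's answers, and retrain; repeat for a few rounds."""
    for _ in range(rounds):
        model = fine_tune(model, xs, ys)
        with torch.no_grad():
            confidence, _ = F.softmax(model(xs), dim=1).max(dim=1)
        worst = confidence.argsort()[:k]   # the k least-confident predictions
        ys[worst] = correct_label(worst)   # stand-in for a GUI asking a human
    return model

# Toy usage: random "features" with noisy labels; the "human" here is an oracle.
xs = torch.randn(200, 32)
true_ys = torch.randint(0, 5, (200,))
noisy_ys = true_ys.clone()
noisy_ys[:40] = torch.randint(0, 5, (40,))   # simulate labelling mistakes
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 5))
model = human_in_the_loop(model, xs, noisy_ys, lambda idx: true_ys[idx])
```

In a real tool the `correct_label` step would be an interface showing images to a person, but the structure (train, surface uncertain cases, fold the corrections back in, retrain) is the same.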

Correct. So every time we provide feedback, the algorithm learns.

I recently started with PyTorch, and it backpropagates based on the input. Is that what’s meant by dynamic neural networks? Another question I have is how to design this sort of GUI. Do we have APIs for it in Python and R?
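On the “dynamic” part: PyTorch builds the autograd graph on every forward pass, so ordinary Python control flow that depends on the input still backpropagates. A tiny illustrative sketch (not from the talk, just to show define-by-run):

```python
import torch
import torch.nn as nn

class DynamicNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(8, 8)
        self.head = nn.Linear(8, 1)

    def forward(self, x):
        # Apply the shared layer a data-dependent number of times (1 to 3).
        n_steps = int(x.abs().mean() * 10) % 3 + 1
        for _ in range(n_steps):
            x = torch.relu(self.layer(x))
        return self.head(x)

net = DynamicNet()
x = torch.randn(4, 8)
loss = net(x).sum()
loss.backward()   # autograd traces whichever path was actually taken
print([p.grad.shape for p in net.parameters()])
```

I can’t say what Jeremy’s GUI is built with, so any answer on the GUI side would be a guess on my part.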

The best answer would come from Mr. @jeremy himself: how did you do that? Thank you.

It’s not published work as yet. But we’re commercializing it at http://platform.ai


Wow. Is any preview available to fast.ai students? Platform.ai is private as of now!
