About the Part 2 (2019) category

Well, any function needs (), and I'd like to omit them in the example I showed; they look too ugly for what I want to express. But I agree with you that Unicode may hinder some folks. Then again, I think adjusting to functional programming may be an even bigger hurdle.

I think if you like math as it is written in papers and textbooks, you will likely love that representation in code as well. This sort of syntactic sugar is best added on top of the full API, so you can always fall back on English verbosity if desired. I think S4TF is leaving all this out, knowing it is highly customizable and can easily be done in “user space”.
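For example, Python already allows a bit of this kind of "user space" sugar, since Unicode letters are legal identifiers. A tiny sketch (the alias `σ` here is just my own naming, nothing from any library):

```python
import math

def sigmoid(x):
    """Plain-English name: the full, verbose API."""
    return 1 / (1 + math.exp(-x))

# Math-flavoured alias layered on top in user space —
# anyone who dislikes the symbol can keep using `sigmoid`.
σ = sigmoid

print(σ(0))  # 0.5
```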

In Julia entering math symbols can be done by typing the LaTeX name for the symbol. I like that because it feels very natural.

Just wanted to confirm: does the class start tomorrow?

6:30–9:00 p.m. Pacific time

Check the link for a Merged Calendar Invite

Guys, I am not able to join via the invite. The .ics file won’t even upload to my calendar

It’s just a placeholder for the time/date. Same as what Jeremy listed at the top.

Hello Everyone,
Nice to be in the new course; great material. Thanks, Jeremy, for all the nice work.
I wish Jeremy would add a lesson about mixed-input datasets,
I mean, how to join data from text and images (and maybe tabular data) to make predictions.


Once the inputs are translated to tensors, you should be able to concat them and propagate through the subsequent layers, right?
Are there further complexities to be accounted for?
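Something like this, I mean (a rough sketch, assuming PyTorch; the feature sizes are made up for illustration):

```python
import torch

# Per-modality features after each encoder, batch of 8
img_feats = torch.randn(8, 512)   # e.g. pooled CNN features
txt_feats = torch.randn(8, 400)   # e.g. text encoder output
tab_feats = torch.randn(8, 32)    # e.g. embedded tabular columns

# Join along the feature dimension before the subsequent layers
combined = torch.cat([img_feats, txt_feats, tab_feats], dim=1)
print(combined.shape)  # torch.Size([8, 944])
```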

Not really.
I have around 12 images of different contexts, so I can have each context in an ImageList and then join the outputs; that is fine.
How would I use something like resnet as a model for each group of layers?

You can write a custom module which passes the image input on to the resnet. Instead of directly connecting the convolutional features to the fully connected (dense) layers, you could concat them with the structured data.
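Roughly like this (a hypothetical sketch in PyTorch; the toy encoder and layer sizes are placeholders, and you would swap in a real resnet body for the encoder):

```python
import torch
import torch.nn as nn

class MixedInputModel(nn.Module):
    """Encode the image, then concat with structured data before the head."""

    def __init__(self, image_encoder, img_feat_dim, tab_dim, n_out):
        super().__init__()
        self.encoder = image_encoder           # e.g. a resnet body
        self.head = nn.Sequential(
            nn.Linear(img_feat_dim + tab_dim, 256),
            nn.ReLU(),
            nn.Linear(256, n_out),
        )

    def forward(self, img, tab):
        feats = self.encoder(img)              # (batch, img_feat_dim)
        x = torch.cat([feats, tab], dim=1)     # join with structured data
        return self.head(x)

# Stand-in encoder so the sketch runs without downloading resnet weights
toy_encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 8 * 8, 64))
model = MixedInputModel(toy_encoder, img_feat_dim=64, tab_dim=10, n_out=2)

out = model(torch.randn(4, 3, 8, 8), torch.randn(4, 10))
print(out.shape)  # torch.Size([4, 2])
```

With several image contexts you could run each through the (possibly shared) encoder and concat all the results the same way.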

Hi PoonamV, could you point me to the new videos (if indeed they are uploaded)? Thanks!

EDIT: never mind I found the thread: https://forums.fast.ai/t/official-updates-thread/41429

You can also find them in the thread titled
Lesson # Discussion & Wiki (2019)

Not sure if there is a separate thread for this, but a topic I'd like to see is seq2seq networks (all variations) in more depth.


Please look up my post on the chit-chat thread.

Hi all, nice to see this course still keeping it real with v3! I just got all my Win 10 installs up to date (not only fastai but also keras-tensorflow and keras-rl gym for reinforcement learning… so smooth these days compared to just two years ago, hehe), so I am ready to go again. I'm deep into materials informatics now, but still curious enough and happy to learn the 2019 state of the art. Thanks again, Jeremy!

Do you know of any good resources for materials informatics datasets?

It really depends on the class of materials and the industry you are interested in (pharma, health, energy, engineering), but you can start with recent literature from https://www.nature.com/npjcompumats/articles and https://arxiv.org/list/cond-mat.mtrl-sci/recent for the state of the art in any applied field.

Wanted to say thank you for an amazing course! I really enjoyed it and can't wait to see the next one in June!

Has it been announced when these videos will be open to the public? I remember Jeremy said June, so I assume it is happening soon.