Lesson 13 wiki

(Rachel Thomas) #1

Lesson resources

Links from class


Auto-generating a .py when saving a .ipynb

This can make version control easier.

To do this, append the following script to your ~/.jupyter/jupyter_notebook_config.py:

import io
import os
from notebook.utils import to_api_path

_script_exporter = None

def script_post_save(model, os_path, contents_manager, **kwargs):
    """convert notebooks to Python script after save with nbconvert
    replaces `ipython notebook --script`
    from nbconvert.exporters.script import ScriptExporter

    if model['type'] != 'notebook':
        return

    global _script_exporter
    if _script_exporter is None:
        _script_exporter = ScriptExporter(parent=contents_manager)
    log = contents_manager.log

    # save .py file
    base, ext = os.path.splitext(os_path)
    script, resources = _script_exporter.from_filename(os_path)
    script_fname = base + resources.get('output_extension', '.txt')
    log.info("Saving script /%s", to_api_path(script_fname, contents_manager.root_dir))
    with io.open(script_fname, 'w', encoding='utf-8') as f:
        f.write(script)

c.FileContentsManager.post_save_hook = script_post_save

Append this script instead if you also want an HTML file generated.
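The originally linked HTML variant isn't reproduced here; as a rough sketch of a combined hook, assuming nbconvert's HTMLExporter (the names `post_save` and the loop structure are illustrative, not the original script):

```python
# Sketch of a combined post-save hook for ~/.jupyter/jupyter_notebook_config.py.
# Assumes nbconvert provides ScriptExporter and HTMLExporter; `c` is the
# Jupyter config object available in that file.
import os

def post_save(model, os_path, contents_manager, **kwargs):
    """Convert the notebook to both a .py script and an .html page after save."""
    if model['type'] != 'notebook':
        return
    from nbconvert.exporters.script import ScriptExporter
    from nbconvert.exporters.html import HTMLExporter

    base, _ = os.path.splitext(os_path)
    for exporter, default_ext in ((ScriptExporter(), '.py'),
                                  (HTMLExporter(), '.html')):
        output, resources = exporter.from_filename(os_path)
        out_fname = base + resources.get('output_extension', default_ext)
        with open(out_fname, 'w', encoding='utf-8') as f:
            f.write(output)

c.FileContentsManager.post_save_hook = post_save
```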


Part 2 early release videos now available!
Pre-release part 2 videos
(bckenstler) #2

Cyclical Learning Rate callback for Keras:
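The linked callback isn't reproduced here; as a minimal sketch of the schedule it implements (the "triangular" policy from Smith's Cyclical Learning Rates paper), with a hypothetical function name and illustrative parameter defaults:

```python
import math

def triangular_clr(iteration, step_size=2000, base_lr=0.001, max_lr=0.006):
    """Triangular cyclical learning rate.

    The LR ramps linearly from base_lr to max_lr over step_size iterations,
    then back down over the next step_size iterations, and repeats.
    """
    cycle = math.floor(1 + iteration / (2 * step_size))
    x = abs(iteration / step_size - 2 * cycle + 1)
    return base_lr + (max_lr - base_lr) * max(0.0, 1 - x)
```

The Keras callback wraps a schedule like this in a `keras.callbacks.Callback` that updates the optimizer's learning rate each batch.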

(Jeremy Howard) #3

Video / notebooks / ppt now posted on wiki, FYI.

(layla.tadjpour) #4

@jeremy, @rachel can you please post a link to the English-French parallel corpus? Thanks.

(Jeremy Howard) #5

Thanks for the reminder - done now.

(lin.crampton) #6

The transcript for this video is in:

Let me know if there are any changes or corrections to be made.

(Simon Espigolé) #7

Hi everybody,

Unfortunately, I can’t find the file “attention_wrapper.py” from Lesson 13 at http://files.fast.ai/part2/lesson13/
Can someone send me a link to this file?

Thank you very much for this brilliant course!
Have a nice day!



What would it actually take to create a full fr <-> en dictionary? Could one just batch the model generation, or is something more sophisticated necessary? Thank you!

(Eric Perbos-Brinck) #9

Note: the complete collection of Part 2 video timelines is available in a single thread for keyword search.
Part 2: complete collection of video timelines

Lesson 13 video timeline:

00:00:10 Fast.ai student accepted into Google Brain Residency program

00:06:30 Cyclical Learning Rates for Training Neural Networks (another student’s paper)
& updates on Style Transfer, GAN, and Mean Shift Clustering research papers

00:13:45 Tiramisu: combining Mean Shift Clustering and Approximate Nearest Neighbors

00:22:15 Facebook AI Similarity Search (FAISS)

00:28:15 The BiLSTM Hegemony

00:35:00 Implementing the BiLSTM, and Grammar as a Foreign Language (research)

00:45:30 Reminder on how RNNs work, from Lesson 5 (Part 1)

00:47:20 Why Attentional Models use “such” a simple architecture
& “Tacotron: a Fully End-To-End Text-To-Speech Synthesis Model” (research)

00:50:15 Continuing on Spelling_bee_RNN notebook (Attention Model), from Lesson 12

00:58:40 Building the Attention Layer and the ‘attention_wrapper.py’ walk-through

01:15:40 Impressive student’s experiment with a different mathematical technique on Style Transfer

01:18:00 Translate English into French, with PyTorch

01:31:20 Translate English into French: using Keras to prepare the data
Note: the latest PyTorch version now supports broadcasting

01:38:50 Writing and running the ‘Train & Test’ code with PyTorch

01:44:00 NLP Programming Tutorial, by Graham Neubig (NAIST)

01:48:25 Question: “Could we translate Chinese to English with that technique?”
& new technique: Neural Machine Translation of Rare Words with Subword Units (Research)

01:54:45 Leaving Translation aside and moving to Image Segmentation,
with the “The 100 layers Tiramisu: Fully Convolutional DenseNets” (research)
and “Densely Connected Convolutional Networks” (research)

(Kevin Dewalt) #10