Import * why?

I have finished a couple of chapters of the book and I am looking forward to finishing it. The one thing I am struggling with is the `import *` style of coding used throughout this book/course. It hides the structural details of the library, which can be a barrier to entry for beginners. Since the library also implements classes with the same names as classes in other Python packages, it becomes very confusing. For example, the library has an `Image` class in the `vision.core.image` package, but since `import *` is used it's difficult to figure out whether a given `Image` is `PIL.Image` or the library's `Image`.

I understand that `import *` will not impact performance, because the library has been designed to handle it cleverly, but the question remains: what advantage do we get from `import *` over explicit imports (the usual Python way)? It makes the learning curve a bit steeper, and it's definitely not helping. Maybe it's just me, but it's something I wanted to point out. Let me know what others think.
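To see why this bites, here's a stdlib stand-in for the same shadowing problem (nothing fastai-specific, just how Python's `import *` works): two modules that export the same name silently shadow one another, while explicit imports keep both usable and unambiguous.

```python
from math import *    # brings in math.sqrt, which returns a float
from cmath import *   # also exports sqrt, silently shadowing math's version

print(sqrt(4))        # (2+0j) -- the complex version, maybe not what you expected

# Explicit imports keep both available and make the origin obvious:
import math
import cmath

print(math.sqrt(4))   # 2.0
print(cmath.sqrt(4))  # (2+0j)
```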

@sssingh the library has a TON of dependencies. Rather than having to go back up and import each module/package one by one as you need it, you can access all of the dependencies in one go. This is especially convenient when playing around with data/models and prototyping.

There is nothing stopping you from importing each module one by one if you would prefer to take that approach :slight_smile:


I think we should remember this is a course on Deep Learning, not Software Engineering. Deep Learning is useful in many areas, and we should encourage people who are trying to write practical code without needing heavyweight software-engineering considerations. If you are writing a major web application then software engineering matters, but for a throwaway model I think we can allow the odd bad practice.
Interestingly, Jeremy has spoken about SE and DL in

It’s not necessarily “bad practice” if it’s used appropriately. The v1 docs are very clear as to why this is a thing:


fastai is designed to support both interactive computing as well as traditional software development. For interactive computing, where convenience and speed of experimentation is a priority, data scientists often prefer to grab all the symbols they need, with import *. Therefore, fastai is designed to support this approach, without compromising on maintainability and understanding.

In order to do so, the module dependencies are carefully managed (see next section), with each exporting a carefully chosen set of symbols when using `import *`. In general, for interactive computing, to just play around with the core modules and the training loop you can do

from fastai.basics import *

If you want to experiment with one of the applications, such as vision, then you can do

from fastai.vision import *

That will give you all the standard external modules you’ll need, in their customary namespaces (e.g. pandas as pd, numpy as np, matplotlib.pyplot as plt), plus the core fastai libraries. In addition, the main classes and functions for your application (vision, in this case), e.g. creating a DataBunch from an image folder and training a convolutional neural network (with cnn_learner), are also imported. If you don’t wish to import any application, but want all the main functionality from fastai, use from fastai.basics import *. Of course, you can also just import the specific symbols that you require, without using import *.
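For what it's worth, the "carefully chosen set of symbols" bit works through Python's standard `__all__` mechanism: a module that defines `__all__` controls exactly which names `from module import *` exposes. A minimal sketch with a throwaway in-memory module (the `toy` name is made up for illustration):

```python
import sys
import types

# Build a tiny module that defines __all__, then star-import from it.
toy = types.ModuleType("toy")
exec(
    "__all__ = ['exported_fn']\n"
    "def exported_fn():\n"
    "    return 'visible'\n"
    "def not_exported():\n"
    "    return 'invisible'\n",
    toy.__dict__,
)
sys.modules["toy"] = toy

ns = {}
exec("from toy import *", ns)
assert "exported_fn" in ns       # listed in __all__, so the star import brings it in
assert "not_exported" not in ns  # public, but excluded because it's not in __all__
```

This is why a fastai `import *` doesn't flood your namespace with every internal helper: each module opts names in explicitly.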

If you wish to see where a symbol is imported from, either just type the symbol name (in a REPL such as Jupyter Notebook or IPython), or (in most editors) wave your mouse over the symbol to see the definition. For instance:
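A stdlib stand-in (assuming a plain Python session rather than fastai, since the trick is the same for any symbol):

```python
import inspect
from collections import OrderedDict

# In a REPL, typing the bare name prints a repr that reveals its origin:
OrderedDict   # <class 'collections.OrderedDict'>

# Or ask programmatically where a symbol was defined:
print(inspect.getmodule(OrderedDict).__name__)   # prints: collections
```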