For those who run their own AI box, or want to

No checkboxes when I install like… wsl --install -d Ubuntu

No, this was through the Microsoft Store install process, as I recall. If I can get on a clean box I’ll try to find it again. I just added 20.04 alongside my existing 22.04 and didn’t see the option.

When first starting my local Jupyter install, it was disconcerting that all extensions were disabled OOTB due to compatibility.

I see in the Lesson 2 video [8:55] Jeremy has cleared that flag, so I presume it’s generally safe to do so…

But I’m curious about a bit more context on the state of play of nbextensions stability.
A bit of hunting did turn up a comment (2021-09-01) that…

Until now, the nbextensions compatibility test has only covered Jupyter Notebook 4.x and 5.x; compatibility testing for 6.x has not been implemented yet. If this option is selected under version 6.x, all functions will be disabled.

And the issue seems to have been outstanding for 2.5 years without much concern on that thread…

So what are your experiences?

Cheers, ben

p.s. here are my Jupyter versions. Would I be correct to say I’m on Jupyter 7.3.1?

$ jupyter --version
IPython          : 8.3.0
ipykernel        : 6.13.0
ipywidgets       : not installed
jupyter_client   : 7.3.1
jupyter_core     : 4.9.2
jupyter_server   : not installed
jupyterlab       : not installed
nbclient         : 0.6.2
nbconvert        : 6.5.0
nbformat         : 5.4.0
notebook         : 6.4.11
qtconsole        : not installed
traitlets        : 5.1.1
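For reference, `jupyter --version` lists several separate packages: the 7.3.1 there belongs to `jupyter_client` (the kernel messaging library), while the classic Notebook UI version is the `notebook` line (6.4.11 above). A quick stdlib-only sketch (Python 3.8+) for checking any one package's version programmatically:

```python
from importlib.metadata import version, PackageNotFoundError

def pkg_version(name):
    """Return the installed version of a distribution, or None if it isn't installed."""
    try:
        return version(name)
    except PackageNotFoundError:
        return None

# The classic Notebook server/UI is the "notebook" distribution,
# not "jupyter_client" (kernel messaging) or "jupyter_core" (config/paths).
print(pkg_version("notebook"))
print(pkg_version("jupyter_client"))
```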

p.s.2. Which python should I use for new notebooks?..

I wanted to test the doc() function in my new install.
Having done $ mamba install -c fastai nbdev
I thought I would only need to do…

I have reviewed nbdev tutorial | nbdev, but nothing stood out.

I can do doc(function_name) and get results without having to install or import nbdev (maybe in my case it comes through the fastbook/fastai imports? Not sure). It works OOTB for me (fastai 2.6.3).

That’s true - it does work without nbdev. It just doesn’t look quite as cool :slight_smile:

1 Like

You need to import it. from fastai.vision.all import * for instance would grab it.

That might make a good FAQ. From the videos the strongest association I got for doc() was with nbdev. So now on Colab I got doc() working for the first time, with:

#once !pip install -U fastai nbdev
import fastai, nbdev
from fastai.vision.all import *
doc(print)
fastai.__version__ , nbdev.__version__ 
____________________
print[source]
print()
print(value, ..., sep=' ', end='\n', file=sys.stdout, flush=False)
Prints the values to a stream, or to sys.stdout by default. Optional keyword arguments: file: a file-like object (stream); defaults to the current sys.stdout. sep: string inserted between values, default a space. end: string appended after the last value, default a newline. flush: whether to forcibly flush the stream.
('2.6.3', '1.2.8')

But that same code doesn’t work on my local install. Below I’ve included my environment check and startup procedure.

$ mamba install -c fastai nbdev
Looking for: ['nbdev']
conda-forge/linux-64                                        Using cache
conda-forge/noarch                                          Using cache
fastai/noarch                                                 No change
fastai/linux-64                                               No change
Pinned packages:
  - python 3.9.*
Transaction
  Prefix: /home/ben/mambaforge
  All requested packages already installed

$ jupyter notebook
[I 08:01:53.138 NotebookApp] [nb_conda_kernels] enabled, 1 kernels found
[I 08:01:53.286 NotebookApp] [jupyter_nbextensions_configurator] enabled 0.4.1
[I 08:01:53.287 NotebookApp] Serving notebooks from local directory: /home/ben/jupyter
[I 08:01:53.288 NotebookApp] Jupyter Notebook 6.4.11 is running at:
[I 08:01:53.288 NotebookApp] http://localhost:8888/?token=924b4706d9f1d5689b053235903c80f1dc47833e55da1a9e
[I 08:01:53.288 NotebookApp]  or http://127.0.0.1:8888/?token=924b4706d9f1d5689b053235903c80f1dc47833e55da1a9e
[I 08:01:53.288 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).

AFAIK, the conda install reports that fastai is installed correctly.
[Edit:] Whoops, that’s wrong! I had misunderstood the “-c” flag, which actually
means “install ‘nbdev’ from the ‘fastai’ channel” and not “install ‘fastai’ itself”.

But in the notebook it complains… No module named 'fastai’

[Edit:] Fixed by doing…

$ mamba install -c fastchan fastai

where fastchan is described here.

Thanks @balnazzar and @Interogativ. I tried your ideas, but they didn’t solve the multiple problems. After a lot of frustration, I ended up starting fresh: completely uninstalling and reinstalling Anaconda, then installing fastai and nbdev into the base environment, following Jeremy’s suggestion. It all seems to be working now, for the new course at least, and it made for a good house cleaning.

Whew!

I am guessing that conda was somehow corrupted, but who can tell at this point. I may need to reinstall a few more packages for existing side projects, and those ought to be obvious. :woozy_face:

2 Likes

After learning a bit about conda channels, what is the preference between the fastai and fastchan channels? Comparing the two I see some overlap and some differences…

$ conda search --override-channels -c fastai \
      | cut -d\  -f1 | sort -u > /tmp/fastai
$ conda search --override-channels -c fastchan \
      | cut -d\  -f1 | sort -u > /tmp/fastchan
$ diff -y /tmp/fastchan /tmp/fastai
#                          #
Loading                    Loading
abseil-cpp               <
accelerate                 accelerate
albumentations             albumentations
arrow-cpp                <
aws-c-cal                <
aws-c-event-stream       <
aws-c-io                 <
aws-checksums            <
aws-sdk-cpp              <
boa                      <
catalogue                <
cudatoolkit              <
cython-blis              <
datasets                 <
fastai                     fastai
                         > fastai2
                         > fastbook
fastcgi                    fastcgi
fastcore                   fastcore
                         > fastdoc
                         > fastdot
fastdownload               fastdownload
                         > fastgpu
fastprogress               fastprogress
fastrelease                fastrelease
ffmpeg                   | fastscript
                         > gh
ghapi                      ghapi
glog                     | imgaug
grpc-cpp                 | jlang
huggingface_hub          <
inotify_simple           <
keyutils                 <
krb5                     <
langcodes                <
lerc                     <
libarchive               <
libblas                  <
libbrotlicommon          <
libbrotlidec             <
libbrotlienc             <
libcblas                 <
libclang                 <
libcrc32c                <
libcurl                  <
libcxx                   <
libdeflate               <
libevent                 <
libgcc-ng                <
libgfortran-ng           <
libgfortran5             <
libgomp                  <
libgoogle-cloud          <
libiconv                 <
liblapack                <
libmamba                 <
libmambapy               <
libnghttp2               <
libpq                    <
libprotobuf              <
libsolv                  <
libssh2                  <
libstdcxx-ng             <
libthrift                <
libtiff                  <
libutf8proc              <
libxml2                  <
libzlib                  <
llvm-openmp              <
mamba                    <
mysql-common             <
mysql-libs               <
nbdev                      nbdev
numexpr                  | nbdev-django
numpy                    | nbdev-numpy
                         > nbdev-pandas
                         > nbdev-pytorch
                         > nbdev-scipy
                         > nbdev-sphinx
                         > nbdev-stdlib
                         > nvidia-ml-py3
opencv-python-headless     opencv-python-headless
orc                      <
pillow                   <
protobuf                 <
pyarrow                  <
pybind11-abi             <
pynvml                     pynvml
pyqt                     <
pyqt-impl                <
pyqt5-sip                <
pyqtchart                <
pyqtwebengine            <
python_abi               <
pytorch                  <
pytorch-cpu              <
qt                       <
re2                      <
reproc                   <
reproc-cpp               <
rich                     <
s2n                      <
sacremoses               <
scipy                    <
sentencepiece              sentencepiece
spacy                    | setuptools-conda
spacy-legacy             | showdoc
spacy-loggers            <
sqlite                   <
thinc                    <
timm                       timm
tokenizers               | tinykernel
torchaudio               <
torchvision              <
transformers             <
typer                    <
watchgod                 <
yaml-cpp                 <
zlib                     <
zstd                     <

fastai is where I publish fast.ai-developed libs. fastchan contains those libs, plus lots of libs useful to the fast.ai community that other folks have written, but aren’t available in the default anaconda channels.

2 Likes

This is the repo responsible for creating the fastchan channel FYI:

3 Likes

Well! I was minding my own business when I caught the scent of another lead, so I fell down this rabbit hole again to follow it. Other commitments mean I’ve run out of time this session, so I’m summarizing the interesting bits in case it helps others…

The behaviour I observed, of /etc/resolv.conf being overwritten, assumes a DNS resolver will be listening on localhost port 53, which then forwards requests to the real DNS server at the ISP. A reasonable candidate to fulfil this is the Internet Connection Service configured by Host Networking Services, from what I see in my firewall rules…

Background:

I finished the post about the Apptainer:
Here is the Medium link: Gentle introduction to Apptainer
And a copy of the same post on Github: GitHub - tensoralex/apptainer-fastai-container: A short tutorial on building an Apptainer image for fast.ai course/book

I created the definition file and script to build a ready-to-use container image for the course, which is explained in the post, and the scripts are in the Github repo.

Tested at least on Ubuntu 20.04 and 22.04.
Hope those will be helpful to someone :slight_smile:

3 Likes

Thanks for sharing! If you’ve got a tweet about this, please share so I can retweet.

1 Like

Thank you, @jeremy ! Here is the tweet: https://twitter.com/tensoralex/status/1536508768070025218

2 Likes

This is amazing! I’m going to try this and build a version of the paperspace container I use locally. Thanks for sharing!

1 Like

I’m thinking about one of the 3090 Alienwares with liquid cooling. In terms of price/performance, looks promising. I guess 24 GB GPU with fp16 enabled should be a feasible option for training larger models. Didn’t find any better out-of-the-box single-GPU option on the market at the time of writing this.

PyTorch v 1.12 has just been released!

Why is it important? The MPS device is now available, although still as a prototype (link), which means native PyTorch on the Mac M1 using torch.device('mps') instead of torch.device('cuda').
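A minimal sketch of that device selection (the fallback chain and the `getattr` guard are my additions, so older PyTorch builds without a `torch.backends.mps` attribute still fall through to CPU):

```python
import torch

def pick_device():
    """Prefer Apple's MPS backend when available, else CUDA, else CPU."""
    # torch.backends.mps only exists from PyTorch 1.12 onwards.
    if getattr(torch.backends, "mps", None) is not None and torch.backends.mps.is_available():
        return torch.device("mps")
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")

device = pick_device()
# Plain matrix multiplications are where the MPS speedup shows up.
x = torch.rand(1024, 1024, device=device)
y = x @ x
print(device, y.shape)
```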

As of two weeks ago (when I last tried it in the nightly version), there were some fundamental pieces missing, so, for example, I could not train an RNN on it. But for plain matrix multiplications it is between 2 and 1000 times faster than using the CPU (I think the wide spread depends on the particular architecture of the M1).

Sadly fastai is not compatible (yet) with PyTorch 1.12, as they changed several internals that fastai relies upon (I got all sorts of errors when I tried creating a dataloader, for example), but when it is, I look forward to trying to deep learn on my work laptop :slight_smile:

PS: The M1 is amazingly fast as a CPU, by the way; it is significantly faster than the mid-to-high-range AMD Ryzen I have in my AI box.

2 Likes

Thanks for sharing! My Studio M1 Ultra wasn’t specifically bought for AI stuff - but keen to see how this comes along :wink:

1 Like