Geospatial Deep Learning resources & study group

Hi Janne, thanks for the great paper, and congrats!

  • possibly you could try using high-resolution imagery from Google Earth; some research papers use it, so I guess it should be possible to get a license
  • FYI, there is also the Treetect project, which maintains a database of trees in urban areas and tracks their health status https://www.boston.gov/departments/new-urban-mechanics/treetect

Tom

Hi Tom, thanks for the suggestions, especially Treetect. I’ll have to check what they did and see if we can utilize their work in non-urban areas as well. For high-resolution (~5 cm) UAV imagery, I think that will work just fine.

1 Like

Nice use of FastAI for deforestation classification

https://towardsdatascience.com/detecting-deforestation-from-satellite-images-7aa6dfbd9f61

1 Like

Does anybody have a good recommendation for getting satellite images for commercial use? I feel like every time I’ve seen satellite images it’s already part of a dataset, but I have a use-case where I would like to specify locations that I want satellite images from and then add processing on my side. I’m picturing it like this: I pass a lat/long and the size of the area I want imaged, and the most up-to-date satellite image for that location is returned to me.

The more I’ve dug into the popular companies that do this, the more I’m thinking the industry might just not be at that level of ease yet. All of the ones I looked at (Maxar, Nearmap, and Vexcel Data) seem pretty tough to actually get data from.

I believe for commercial use you would end up paying somewhere. One avenue is Google Earth Engine: https://towardsdatascience.com/how-to-download-high-resolution-satellite-data-for-anywhere-on-earth-5e6dddee2803

depending on your use case, you can also see whether the data at http://registry.mlhub.earth/ (and their related ML library) works for you (I think it’s not free beyond a certain amount, though)

If you want VHR images, you will end up paying for sure.
But if you want to work with lower-resolution images, you can definitely use ESA products with Sentinel images https://scihub.copernicus.eu/dhus/#/home and, to automate downloads of Sentinel images, you can check the Python API — Sentinelsat 0.14 documentation. It worked quite well for me.
You can also check https://earthexplorer.usgs.gov/ to get images of specific places; there is probably a Python API for automatic download too, but I never actually tried it.
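To illustrate the “pass a lat/long and an area size” workflow asked about above, here’s a minimal sketch using the Sentinelsat Python API against the Copernicus Open Access Hub. It assumes you have hub credentials; the date window, cloud-cover threshold, and the simple degrees-per-km conversion are illustrative choices, not the only way to do it.

```python
import math

def bbox_wkt(lat: float, lon: float, size_km: float) -> str:
    """Build a WKT polygon for a square of roughly size_km centred on (lat, lon)."""
    half_lat = (size_km / 2) / 111.32  # ~111.32 km per degree of latitude
    half_lon = (size_km / 2) / (111.32 * math.cos(math.radians(lat)))
    w, e = lon - half_lon, lon + half_lon
    s, n = lat - half_lat, lat + half_lat
    return f"POLYGON(({w} {s}, {e} {s}, {e} {n}, {w} {n}, {w} {s}))"

def download_latest_s2(lat, lon, size_km, user, password):
    """Query for the most recent low-cloud Sentinel-2 scene covering the area
    and download it."""
    from sentinelsat import SentinelAPI  # pip install sentinelsat
    api = SentinelAPI(user, password, "https://scihub.copernicus.eu/dhus")
    products = api.query(
        bbox_wkt(lat, lon, size_km),
        date=("NOW-30DAYS", "NOW"),
        platformname="Sentinel-2",
        cloudcoverpercentage=(0, 30),
    )
    if products:
        # pick the most recently acquired scene
        newest_id, _ = sorted(products.items(),
                              key=lambda kv: kv[1]["beginposition"])[-1]
        api.download(newest_id)
```

EarthExplorer has similar automation via the USGS M2M API, but the sketch above is the path I’d try first for Sentinel data.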

To be honest, ESA products are quite complete for many purposes, but of course if you want to build a model for automatic detection of cars, you will have to pay for VHR images in the end.

Depending on the spatial resolution you want, you can get moderate-to-high resolution data from ESA’s Sentinel-1 (SAR, ~10 m pixel size) and Sentinel-2 (multispectral, ~20 m pixel size, 10 m for RGB), which are free and publicly accessible. You can also check out the Landsat missions, which provide multispectral data at ~30 m.

I know Sentinel-2 is available as Cloud-Optimized GeoTIFF, so depending on the use case you may even avoid downloading big chunks of data.
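As a concrete sketch of why COGs help: you only need to translate your bounding box into a pixel window, and an HTTP range request then fetches just the tiles covering that window. The helper below does the window arithmetic for a simple north-up raster with square pixels; the rasterio calls in the trailing comment, and the URL there, are illustrative assumptions.

```python
def window_from_bounds(left, bottom, right, top, origin_x, origin_y, pixel_size):
    """Convert a geographic bounding box into a (col_off, row_off, width, height)
    pixel window for a north-up raster whose upper-left corner sits at
    (origin_x, origin_y) and whose pixels are square with side pixel_size."""
    col_off = int((left - origin_x) / pixel_size)
    row_off = int((origin_y - top) / pixel_size)  # rows count downwards from the top
    width = max(1, round((right - left) / pixel_size))
    height = max(1, round((top - bottom) / pixel_size))
    return col_off, row_off, width, height

# With rasterio against a remote COG this becomes (URL illustrative):
#   import rasterio
#   from rasterio.windows import from_bounds
#   with rasterio.open("https://sentinel-cogs.s3.us-west-2.amazonaws.com/.../B04.tif") as src:
#       win = from_bounds(left, bottom, right, top, src.transform)
#       chip = src.read(1, window=win)  # fetches only the bytes covering the window
```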

Then there’s some other interesting free satellite data, such as the GEDI mission, which is lidar and great for measuring height.

Higher resolution is a bit difficult to get for free; I know Planet released some mosaics at 3–5 m, funded by the Norwegian government.

S2 and S1:

https://scihub.copernicus.eu/
Sentinel-2 Cloud-Optimized GeoTIFFs - Registry of Open Data on AWS
Sentinelsat — Sentinelsat 1.2.1 documentation

Landsat-8:

https://www.usgs.gov/core-science-systems/nli/landsat/landsat-data-access?qt-science_support_page_related_con=0#qt-science_support_page_related_con

Planet open data:

NICFI Program - Satellite Imagery and Monitoring | Planet

I usually like to go with S1/S2, but depending on the use case it may be too coarse.

Cheers,
K.

1 Like

Spotted fast.ai in use in this paper: Super-Resolution of Sentinel-2 Images Using Convolutional Neural Networks and Real Ground Truth Data. No published code, unfortunately.

2 Likes

Anyone up for trying to reproduce the paper?

We are a small startup working in climate risk assessment, looking for someone with good GIS analytics skills (ideally with ML/DL skills on top) to work on a small contract project (two months, plus possible extensions, possibly leading to PT/FT).

The main task involves handling remote sensing data (MODIS etc) to produce a risk estimate of flooding and drought over large areas.

Don’t hesitate to reach out through DM if you want to hear more details or might be interested!

2 Likes

@daveluo thanks a ton for your work!
I was trying to run your Colab notebook and ran into the following errors:

(I have tried 4 different approaches and their outputs are also mentioned)

Approach 1: Use the existing Solaris 0.4.0 installation mentioned in the notebook

!add-apt-repository ppa:ubuntugis/ubuntugis-unstable -y
!apt-get update
!apt-get install python-numpy gdal-bin libgdal-dev python3-rtree

!pip install rasterio
!pip install geopandas
!pip install descartes
!pip install solaris
!pip install rio-tiler

Some of the errors in the output:
ERROR: Failed building wheel for gdal
ERROR: Command errored out with exit status 1: /usr/bin/python3 -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-_bi6kor0/gdal_5793e836967c45a29ee13774fa79d314/setup.py'"'"'; __file__='"'"'/tmp/pip-install-_bi6kor0/gdal_5793e836967c45a29ee13774fa79d314/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-hgs6fynk/install-record.txt --single-version-externally-managed --compile --install-headers /usr/local/include/python3.7/gdal Check the logs for full command output.

Please see this file link for full output which shows the exact trail of the above errors

Approach 2: Try Solaris 0.2.0

Following your suggestion on the DrivenData forum, I tried to install the older solaris==0.2.0 after adding the relevant ‘ppa:ubuntugis/ppa -y’

But this approach also fails to install solaris, with similar errors

Approach 3: Install from the Solaris GitHub repo

Tried to install using: !pip install git+https://github.com/CosmiQ/solaris/@dev
which throws up an error:


ERROR: Could not install packages due to an OSError: [Errno 2] No such file or directory: '/usr/local/lib/python3.7/dist-packages/Shapely-1.7.1.dist-info/METADATA'

To address this, I tried `pip install Shapely==1.7.1`, expecting that the necessary files (Shapely-1.7.1.dist-info/METADATA) would then be found, but unfortunately the Colab files don’t update with the fresh Shapely 1.7.1 installation, and it still throws the error:

OSError: [Errno 2] No such file or directory


Approach 4: Non-Colab Solaris setup

Tried to create a conda environment (conda env create -n solaris -f environment-gpu.yml) as suggested on the Solaris GitHub page, but ultimately rasterio and rio-tiler version compatibility issues come up and throw more errors.

I’d appreciate your help in resolving these solaris installation errors; I feel your insights will make them much easier to resolve.

Thanks in advance
Tilak

4 Likes

Fast.ai used in this entry to a recent AWS hackathon: a U-Net for building segmentation, followed by a ResNet for damage classification

Lots of great details, including customizing fastai to handle GeoTIFFs

1 Like

Hello @daveluo
Many thanks for this great notebook! I really enjoyed going through it! Unfortunately, it seems that there is another issue with importing solaris. Since I have a project in this field (detecting man-made terrain changes from satellite imagery), I would also like to make use of solaris, but over the last few days I haven’t figured out how to make it work. Do you have any clue how to install solaris nowadays?
Many thanks for any help!
Best
Michael

Hello @MichaelScofield, were you able to install solaris on Colab? I did exactly what @daveluo advised and followed every discussion on Google about this problem, but I still cannot get solaris installed on Google Colab. If you have any solution to this problem, please let me know, thanks a lot.

Hello @tilak, were you able to install solaris on Colab? I did exactly what @daveluo advised and followed every discussion on Google about this problem, but I still cannot get solaris installed on Google Colab. If you have any solution to this problem, please let me know, thanks a lot.

Hello @MiLap, were you able to install solaris on Colab? I did exactly what @daveluo advised and followed every discussion on Google about this problem, but I still cannot get solaris installed on Google Colab. If you have any solution to this problem, please let me know, thanks a lot.

Hi all,

I’m working on a library, ‘fastgs’, to add multispectral data visualization to the fastai pipeline.

My approach is to display multiple “RGB” or “monochrome” images for each multispectral image. These are currently laid out in a row (although I’m also experimenting with animating them), and show_batch, show_results, etc. have been specialized to deal with them.
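To make the idea concrete, here is an illustrative sketch of the band-grouping logic behind that approach: mapping named composites onto channel indices of the multispectral stack. This is not the actual fastgs API, and the band names are hypothetical.

```python
def band_composites(band_order, composites):
    """Map named display composites to channel indices in a multispectral stack.

    band_order  - band names in the order they are stacked in the image tensor
    composites  - dict of panel name -> tuple of band names to show
                  (3 names -> an "RGB" panel, 1 name -> a monochrome panel)
    """
    index = {name: i for i, name in enumerate(band_order)}
    return {panel: tuple(index[b] for b in bands)
            for panel, bands in composites.items()}

# e.g. a Sentinel-2-style stack shown as a natural-colour panel, a
# false-colour panel, and a monochrome NIR panel:
panels = band_composites(
    ["B02", "B03", "B04", "B08", "B11"],
    {"natural": ("B04", "B03", "B02"),
     "false_colour": ("B08", "B04", "B03"),
     "nir": ("B08",)},
)
```

Each panel’s index tuple can then be used to slice the stack and render one image per composite, laid out in a row.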

I’ve also integrated augmentations via albumentations, and some patches to deal with pretrained weights.

Big shoutout to @cordmaur and @Nickelberry for their excellent public notebooks on the topic. I learnt a lot from those.

There’s a sample notebook on a public landsat dataset at Cloud95 - fastai with fastgs multispectral support | Kaggle. If there are other public datasets that people are interested in, I could probably provide sample notebooks for those as well.

I’m relatively new to Python, so I may not be doing things totally idiomatically. I’d be happy to receive feedback on functionality, code, and docs, as well as suggestions/PRs for missing or additional features and functionality.

The repo is at GitHub - restlessronin/fastgs: Geospatial (Sentinel2 Multi-Spectral) support for fastai, the docs are at fastgs - Welcome to fastgs, and thanks to the power of nbdev it’s available on both PyPI and Anaconda

Thanks

2 Likes

Hi @restlessronin, great initiative! I’m looking forward to taking a look and contributing to it! Thanks for sharing.