I think they added TPU support from v1.3 onwards
Hello! I'm following the instructions in the ppt document from Saturday, but I get the following import error when I try to open the notebooks. Do you know why?
(fastai) C:\Users\Silvia\fastai\course-nlp>jupyter notebook
Traceback (most recent call last):
File "c:\users\silvia\appdata\local\programs\python\python36\lib\runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "c:\users\silvia\appdata\local\programs\python\python36\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "C:\Users\Silvia\AppData\Local\Programs\Python\Python36\Scripts\jupyter-notebook.EXE\__main__.py", line 4, in <module>
File "c:\users\silvia\appdata\local\programs\python\python36\lib\site-packages\notebook\notebookapp.py", line 81, in <module>
from .services.kernels.kernelmanager import MappingKernelManager
File "c:\users\silvia\appdata\local\programs\python\python36\lib\site-packages\notebook\services\kernels\kernelmanager.py", line 19, in <module>
from jupyter_client.session import Session
File "c:\users\silvia\appdata\local\programs\python\python36\lib\site-packages\jupyter_client\__init__.py", line 4, in <module>
from .connect import *
File "c:\users\silvia\appdata\local\programs\python\python36\lib\site-packages\jupyter_client\connect.py", line 35, in <module>
from jupyter_core.paths import jupyter_data_dir, jupyter_runtime_dir, secure_write
ImportError: cannot import name 'secure_write'
You'll save yourself some environment work if you give Google Colab a shot (I switch back and forth all the time). As for that error, it's something to do with your pip or conda env; you might want to review the env setup instructions.
I found this:
The recommended fix is:
pip install --upgrade jupyter_client
Try this and see if it works?
I tried it but the problem persists… :S I tried some other options but nothing seems to work, so I'm launching the notebooks from Anaconda Navigator.
Great! If you can run the notebooks, you're all set.
Thanks for this.
The Norvig-Chomsky discussion referred to in the first notebook parallels Richard Feynman's much earlier comments on Greek vs. Babylonian mathematics (in this 10-minute clip from one of his Messenger Lectures, given at Cornell University in 1964). Here, Greek mathematics represents the axiomatic approach, where all knowledge must be systematically derived and proven from a set of postulates. Babylonian mathematics represents the approach of learning about nature by looking for patterns: not necessarily understanding why the world is that way, but being able to use that knowledge to predict real things about nature (such as lunar eclipses!). Feynman concluded that physics needs both approaches.
This week's assignment (complete before Saturday 12/21/2019 class)
- Watch Video 2 Topic Modeling with SVD and NMF
- Work through notebook 2-svd-nmf-topic-modeling_jcat.ipynb
- Work through the Excel spreadsheet britlit.xlsx discussed in the video.
- If you don't have Excel, get LibreOffice
- Post questions here if you run into problems!
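If you want to peek ahead at the mechanics before Saturday, here is a minimal numpy sketch of SVD-based topic modeling on a toy term-document matrix. The vocabulary and counts below are made up purely for illustration; the actual notebook works through this on real data.

```python
import numpy as np

# Toy document-term count matrix: 4 documents, 6 vocabulary words.
# (Made-up data, purely to illustrate the factorization.)
vocab = ["ship", "ocean", "boat", "wine", "grape", "cork"]
A = np.array([
    [3, 2, 3, 0, 0, 0],   # a document about sailing
    [1, 2, 1, 0, 0, 0],   # another document about sailing
    [0, 0, 0, 3, 1, 2],   # a document about wine
    [0, 0, 0, 1, 2, 1],   # another document about wine
], dtype=float)

# SVD factors A exactly into U @ diag(s) @ Vt.
# Rows of Vt act as "topics": weighted combinations of vocabulary words.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

def top_words(topic, n=3):
    """Return the n words with the largest (absolute) weight in a topic."""
    return [vocab[i] for i in np.argsort(-np.abs(topic))[:n]]

print("topic 1:", top_words(Vt[0]))
print("topic 2:", top_words(Vt[1]))
```

Because the two groups of documents share no words, the first two topics cleanly separate the "sailing" words from the "wine" words; real corpora are messier, which is where NMF's non-negativity constraint helps interpretability.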
BTW the top post is a wiki post, so feel free to put the most up-to-date info directly in there, if it's helpful to people. (I normally both post new info in a reply and also update the top post, FYI.)
Great! I had not noticed that. Good to know!
San Jose. (Moved from Santa Monica years ago)
Ciudad de México
The Fastai NLP Study Group
will meet Saturday Dec 21st, at 8 AM PST, 11 AM EST, 5 PM CET, 9:30 PM IST
Topic: Topic Modeling with Non-negative Matrix Factorization (NMF) and Singular Value Decomposition (SVD)
Suggested homework / preparation:
- Watch NLP video #2
Video playlist is here
- Read and work through the notebook Topic Modeling with NMF and SVD
Course notebooks are in the github repo
Join the Zoom Meeting when it's time!
To join via phone
Dial US: +1 669 900 6833
or +1 646 876 9923
Meeting ID: 832 034 584
Sign up here to receive meetup announcements via email.
I was looking for a little more context on SVD. Came across this https://www.youtube.com/watch?v=P5mlg91as1c which I found useful.
Great meeting today! Thanks to all who contributed questions and comments. Here is the chat text.
08:00:43 From kodzaks : Yes, thank you very much for doing it!
08:01:16 From Mutum : Agreed, yes, much appreciated
08:14:04 From Víctor Peinado : Let me share this website with awesome animations of matrix multiplications: http://matrixmultiplication.xyz/
08:14:47 From csaroff : Thanks!
08:15:03 From Hans Paul P : nice
08:35:00 From Surya : For R, dig into tm-related libraries, also word2vec
08:36:27 From Chris Palmer : These are just historical - for instance removing stop words and lemmatization is still needed for topic modelling - no matter what technique you use
08:40:07 From Chris Palmer : Should have typed NOT just historical…
08:41:35 From Mash Zahid : Thanks, Surya.
08:42:30 From Víctor Peinado : Regarding stemming/lemmatization, in general terms, traditional ML algorithms work better when you reduce the vocabulary by stemming or lemmatizing, while modern DL approaches are powerful enough to leverage these variations
08:43:22 From Víctor Peinado : Also, lemmatizing is more convenient for languages with rich morphology (e.g. Romance languages), while stemming is OK for a language like English
08:44:43 From mattc : That multiplication R X Q is not a cross product but normal matrix multiplication right?
08:44:51 From Abhinav Verma : yes
08:45:15 From mattc : cool. I always get that confused.
08:45:31 From Abhinav Verma : hehe, no worries
08:47:40 From csaroff : Isn't the cross product the same as "normal matrix multiplication"?
08:48:19 From Abhinav Verma : yes , in a way. you can say that
08:48:52 From Abhinav Verma : I would have to look up the exact difference between the two though
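To settle the cross-product question from the chat: they are different operations. `np.cross` is only defined for 3-component vectors and returns a vector perpendicular to both inputs, while `@` (ordinary matrix multiplication) composes linear maps, which is what a factorization like R × Q means. A quick sketch with toy values (nothing from the slides):

```python
import numpy as np

# Cross product: defined for 3-vectors, returns a perpendicular 3-vector.
a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
print(np.cross(a, b))   # the perpendicular vector [0. 0. 1.]

# Matrix multiplication: composes linear maps; this is the operation
# used in a factorization like A ~ R @ Q.
R = np.array([[1.0, 2.0],
              [3.0, 4.0]])
Q = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(R @ Q)            # multiplying by this Q swaps the columns of R
```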
08:49:04 From Mash Zahid : This may help the intuition between factorization and multiplication: http://nicolas-hug.com/blog/matrix_facto_2
08:49:45 From mattc : I bookmarked that Mash, thanks.
08:49:55 From Mash Zahid : But linear algebra is a major undergrad course, not easy to distill into just a few slides as we do here.
08:54:27 From jekaterina : Another great linear algebra course is here: http://immersivemath.com/ila/learnmore.html
08:54:54 From jekaterina : it has all the interactive graphs and diagrams that help visualize main concepts
08:56:23 From Víctor Peinado : Another resource: Essence of Linear Algebra, by 3Blue1Brown https://www.youtube.com/watch?v=fNk_zzaMoSs&list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab
08:56:29 From Mash Zahid : I like "Coding the Matrix" by Philip Klein of Brown Univ, delivered via Coursera. It's a deep-content intro to linear algebra (and more), with a focus on applications in computer science. The course is accompanied by a textbook written by Klein, which makes the course material better organized and more in-depth than slides and videos alone would allow.
http://codingthematrix.com/
08:58:48 From Mehul's iPhone : Thanks everyone for the links. Has anyone done the undergraduate linear algebra course at MIT?
08:58:50 From Mehul's iPhone : https://ocw.mit.edu/courses/mathematics/18-06sc-linear-algebra-fall-2011/
08:59:15 From Mehul's iPhone : Any thoughts about this one? It is by Gilbert Strang
09:00:17 From Mash Zahid : Gilbert Strang's linear algebra course at MIT is legendary!
09:00:32 From Srinivas : The Gilbert Strang course is the very best. Another that is good, although at a more basic level than Strang's, is LAFF from UT Austin at edX https://www.edx.org/course/linear-algebra-foundations-to-frontiers
09:01:13 From Mash Zahid : Mastering his course will give you a special superpower
09:02:24 From Mash Zahid : In my not-so-humble opinion, #1 Strang at MIT, #2 Klein at Brown because his teaching style is a bit more basic/simpler.
09:02:46 From mattc : I want superpowers. Is it the edx class you're referring to?
09:02:48 From Mash Zahid : (I don't know the UT Austin prof.)
09:04:02 From Mash Zahid : MIT Strang course is link above from MIT OCW. Kleinâs materials are on website & YouTube. All free, so everyone can have superpowers if they work for it!!!
09:05:28 From mattc : great. Maybe I'll ruin my holiday break and work through some of that.
09:08:13 From Chris Palmer : Looks like Coding the Matrix on Coursera has been deprecated. I found a link to lectures and class content here: https://www.reddit.com/r/Python/comments/5j042r/coding_the_matrix_this_course_for_anyone_need/
09:15:30 From kodzaks : GitHub for that Coursera course https://github.com/pauldevos/Coding-the-Matrix
09:16:23 From Hans Paul P : can you please jot down the mathematical understanding of it pls?
09:16:56 From Hans Paul P : the more groups the more we understand?
09:17:09 From Mash Zahid : Yes
09:17:19 From Mash Zahid : Or "topics"
09:17:37 From Mash Zahid : Yes, clusters
09:17:54 From Hans Paul P : ok thanks
09:18:32 From Surya : Yes, but the trade-off is how minimal you keep your clusters and yet retain most of the information…
09:19:51 From csaroff : Is this the same type of problem that variational auto encoders aim to solve?
09:20:56 From Mutum : so just like giving k values in K-means
09:27:25 From Alvaro G : Mashad, can you please post the link to such documentary?
09:27:48 From Víctor Peinado : is this the documentary? https://www.youtube.com/watch?v=5dZ_lvDgevk
09:28:55 From Chris Palmer : That youtube link is not available to me here in Australia…
09:30:21 From csaroff : Vpn?
09:31:53 From Yuri S : Want to clarify that Google is not working on identifying Uyghurs in China.
09:32:08 From Chris Palmer : I guess so - keep meaning to get around to using a VPN…
09:36:29 From Pazhanivel : is it not that both autoencoders and PCA/SVD are used to reduce the number of features?
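On the question above: yes, both are dimensionality-reduction tools. PCA (computed via SVD) finds the best linear low-rank compression, while an autoencoder learns a possibly nonlinear one. A small numpy sketch of the linear case, using synthetic data of my own (not from the course materials):

```python
import numpy as np

rng = np.random.default_rng(0)
# 100 samples in 5 dimensions, but generated from only 2 latent factors,
# so the data matrix has (numerical) rank 2.
X = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 5))

# PCA via SVD: center the data, then project onto the top-k right
# singular vectors. This is the optimal rank-k linear compression.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
codes = Xc @ Vt[:k].T        # 100 x 2 compressed representation
X_approx = codes @ Vt[:k]    # decode back to 5 dimensions

# Because the true rank is 2, the reconstruction is (numerically) exact.
print(np.allclose(X_approx, Xc))
```

An autoencoder with a 2-unit bottleneck and linear activations learns essentially this same projection; the interesting cases are the nonlinear ones.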
09:38:05 From Mash Zahid : PBS Frontline âIn the Age of AIâ https://www.pbs.org/wgbh/frontline/film/in-the-age-of-ai/
09:50:51 From mattc : Thank you, Mash, that makes sense
09:51:27 From Rey Farhan : Does anyone have a link to a good intro to autoencoders for this kind of task?
09:51:31 From Mash Zahid : You are most welcome and kind. (My family always remind me that I know nothing!)
09:52:43 From Mash Zahid : For autoencoders, refer to Aurélien Géron's book and code https://github.com/ageron/handson-ml2
09:52:51 From Rey Farhan : thanks
09:53:06 From Pazhanivel : Thank you
09:53:37 From Team TWiML : Thanks everyone!
Recent Forbes article apropos of our brief class discussion on social responsibility: "Tech firms in the United States are lending expertise, reputational credence, and even technology to Chinese surveillance companies, wittingly or otherwise."
The Fastai NLP Study Group will meet
Saturday January 04, at 8 AM PST, 11 AM EST, 5 PM CET, 9:30 PM IST
Join the Zoom Meeting when it's time!
Topic Modeling with Non-negative Matrix Factorization (NMF) and Singular Value Decomposition (SVD), revisited
Suggested homework / preparation:
- Watch NLP video #3
Video playlist is here
- Read and work through the notebook 2b-odds-and-ends
- Review the notebook we covered last meetup, Topic Modeling with NMF and SVD
Course notebooks are available on github
To join via Zoom phone
Dial US: +1 669 900 6833
or +1 646 876 9923
Meeting ID: 832 034 584
The current meetup schedule is here
Sign up here to receive meetup announcements via email.