Chit Chat Thread

Yeah, I remembered my university’s tensor computations course for the first time in many years :smile:

Good to see fellow former Physics folks :blush:


I’m @WNixalo

Is this to be the same thread for all the future classes?


Thanks for reading the series and for the shoutout @neuradai :slight_smile:

Context on what Hackernoon is

For more context: I write on Hackernoon, a publication that is part of Medium. I’ve been asked whether I get paid to write there or to promote them; no, I just started blogging there and they were kind enough to accept my first few blog posts, which were really bad. Not sure if I’ve improved, but I decided to stick with Hackernoon since they were so nice to me :slight_smile: Now, due to a dispute between Medium and HN, they are moving off Medium.

I think I will keep writing at Hackernoon, but since I have an okay audience on Medium, I will re-share there with links to the original posts. :slight_smile:


Damn, am I the only one without Twitter?


Not quite.


My twitter handle is @PierreOuannes :slight_smile:

I’m @Data_sigh . Following today’s discussions from an airport hotel after a missed flight; in timezone limbo.

No, you’re not alone. I am without any “conventional” social media accounts at all, except narrow-focused and “professional” ones :smile:

You can think of Twitter as ‘narrow’ or ‘professional’ if you follow the right people. There is a huge AI/ML/DL community tweeting interesting stuff. A good starting point is to follow Jeremy, then follow some of the DL researchers Jeremy follows :slight_smile:
DL is the reason I am on Twitter.


What non-conventional social media are you using? I wish all DL researchers would move to Mastodon!

I’m @shar1pius on twitter.

Hi :wave:t3:,

Mine is @gavinrbauer, I’ve started Deep Learning with Part-1 v3 and am excited to follow Part-2 with you guys :smile:

Hi there! Mine is @kk1694.

I’m staying in London; if anyone is interested in meeting up (maybe organize a study/reading group), shoot me a message.

For any lazy person on here, I added all of our peers along with a few other practitioners/researchers/kagglers that I follow to a Deep Learning Twitter List


I’m @apolmig on twitter :slight_smile:

Hello everyone, I’m Karthik (Twitter: @karthikshyam). I work in the South Bay and live in Fremont. Are there any meetups or study groups in the South Bay/Fremont area? I’m also dabbling with Kaggle and would love to team up for new competitions. I used Kaggle for going through Part 1. Thanks to the folks who put it up there.

I’m interested in NLP, and in transfer learning in particular; for example, in the medical field we often only have access to a small set of clinical notes, which makes it hard to train a good model from scratch. Other areas of interest are sentiment classification (beyond just positive vs. negative), language models, and the ability to generate text.


Best practices / opinion question:

Sometimes I find that I want to add a new command-line parameter to my code that only affects some subroutine deep inside. That means I then have to go through every routine in the call chain and pass it along as a new parameter.

Is there a better way to do this? I can think of two options:

  1. Add a `**kwargs` parameter to all my routines, add what I need at the top level, and “extract” it where I need it. I like this option. (This is perhaps precisely what `**kwargs` was designed for, but I rarely use it.)
  2. Make some sort of “config.py” that gets imported by all the files and contains what are essentially global variables that can be set anywhere. This seems sloppy.

What do you think of these, or are there others?
(Not asking this on StackExchange because it would get modded down as “asking for an opinion”)
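For what it’s worth, option 1 can be sketched like this (the function names `entry_point`, `middle`, and `innermost` are made up for illustration): intermediate routines forward `**kwargs` untouched, and only the innermost routine pulls out the option it cares about.

```python
def innermost(x, **kwargs):
    """The deeply nested routine that actually uses the new option."""
    log_every = kwargs.get("log_every", 100)  # default when not passed
    if x % log_every == 0:
        print(f"processing step {x}")
    return x * 2

def middle(x, **kwargs):
    # Intermediate layers just forward **kwargs without knowing what's inside.
    return innermost(x, **kwargs)

def entry_point(x, **kwargs):
    # The new command-line flag is injected once, at the top level.
    return middle(x, **kwargs)

entry_point(10, log_every=5)  # the new option reaches innermost untouched
```

One caveat: a typo in a keyword name passes silently through a `**kwargs` chain, which is one argument people make for instead passing an explicit config object (e.g. a small dataclass) down the call chain rather than module-level globals.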