Lesson 8 in-class


(Sourav Dey) #141

One more thing to add some color to this random vs. actual initialization issue: the MSE loss is NOT a good proxy for how “good” the result looks to the human eye. Here’s an interesting test: I tried to paint my dog image in a Dr. Seuss style.

The two experiments:

  1. Run for 30 iterations using random start.
  2. Run for 30 iterations using actual dog image as start.
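
For reference, the two starting points can be sketched like this (a minimal sketch assuming images are plain float arrays; the function name, noise range, and image size are my own illustration, not from the thread):

```python
import numpy as np

def init_canvas(shape, content=None, seed=0):
    """Starting image for iterative style-transfer optimization.

    If `content` is given, start from a copy of the content image
    (experiment 2); otherwise start from uniform random noise in a
    similar pixel range (experiment 1).
    """
    if content is not None:
        return content.astype(np.float32).copy()
    rng = np.random.default_rng(seed)
    return rng.uniform(0.0, 255.0, size=shape).astype(np.float32)

# Experiment 1: random start
x_rand = init_canvas((224, 224, 3))

# Experiment 2: start from the actual dog image (a stand-in array here)
dog = np.full((224, 224, 3), 127.0)
x_content = init_canvas(dog.shape, content=dog)
```

Everything after this point (the 30 optimizer iterations, the loss) is the same; only the starting canvas differs.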

Here’s what I see as loss evolution for random start:

Current loss value: 128737.335938
Current loss value: 20609.1621094
Current loss value: 10165.6757812
Current loss value: 6672.15917969
Current loss value: 5155.08496094
Current loss value: 4537.89404297
Current loss value: 4114.32568359
Current loss value: 3829.49755859
Current loss value: 3597.79443359
Current loss value: 3406.74853516
Current loss value: 3252.09716797
Current loss value: 3121.66894531
Current loss value: 2983.66870117
Current loss value: 2860.04321289
Current loss value: 2761.69287109
Current loss value: 2658.37353516
Current loss value: 2563.57617188
Current loss value: 2472.91357422
Current loss value: 2382.16723633
Current loss value: 2294.84765625
Current loss value: 2235.91455078
Current loss value: 2187.07885742
Current loss value: 2137.97900391
Current loss value: 2105.34887695
Current loss value: 2058.42382812
Current loss value: 2028.6328125
Current loss value: 1992.83178711
Current loss value: 1961.20556641
Current loss value: 1934.49316406
Current loss value: 1904.23754883

Here’s what I see as loss evolution for actual (content) image start:

Current loss value: 24636.9492188
Current loss value: 10866.2763672
Current loss value: 6373.98486328
Current loss value: 4799.92773438
Current loss value: 3871.01269531
Current loss value: 3332.984375
Current loss value: 3063.80957031
Current loss value: 2794.05078125
Current loss value: 2597.07958984
Current loss value: 2432.38134766
Current loss value: 2287.67407227
Current loss value: 2176.72216797
Current loss value: 2069.74511719
Current loss value: 1987.13232422
Current loss value: 1916.72436523
Current loss value: 1857.44934082
Current loss value: 1796.76477051
Current loss value: 1758.79333496
Current loss value: 1719.85595703
Current loss value: 1686.12841797
Current loss value: 1657.99719238
Current loss value: 1630.62524414
Current loss value: 1607.50720215
Current loss value: 1584.3046875
Current loss value: 1564.27990723
Current loss value: 1545.76745605
Current loss value: 1531.41711426
Current loss value: 1506.92480469
Current loss value: 1494.37683105
Current loss value: 1475.97167969

So the content-image start reaches a lower loss. But now let’s compare the two runs at roughly the same MSE value. Here’s what the painted image starting from random initialization looks like after 30 iterations, at loss = 1904.

Not great.

Here’s what the run with the actual-image initialization looks like after only 13 iterations. Note that the loss here is 2069… technically worse than the random one after 30 iterations.

I think this is a better painted image… it seems the loss is not capturing what I deem to be a good painting. Pretty cool.
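
The gap between MSE and perceived quality is easy to demonstrate: two images can sit at exactly the same MSE from a target while looking completely different. A toy illustration (my own construction, not from the experiment above):

```python
import numpy as np

def mse(a, b):
    # Pixel-wise mean squared error between two equal-shape images.
    return float(np.mean((np.asarray(a, dtype=np.float64)
                          - np.asarray(b, dtype=np.float64)) ** 2))

target = np.zeros((64, 64))           # the reference image
smooth = np.full((64, 64), 10.0)      # uniformly off by 10: looks flat
rng = np.random.default_rng(0)
noisy = rng.choice([-10.0, 10.0], size=(64, 64))  # salt-and-pepper noise

# Both sit at exactly the same MSE from the target, yet one is a smooth
# gray field and the other is pure noise.
print(mse(target, smooth), mse(target, noisy))  # → 100.0 100.0
```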


(anamariapopescug) #142

Thx for the pointer! (cc @sravya8 ). There’s also this paper which the original authors reference, which has more details : https://arxiv.org/abs/1505.07376


(Constantin) #143

@rachel, great! Please do. It has been a game changer for me. Perhaps, if it fits in somewhere, it would also be interesting to cover ctables; so far, though, I have only used carrays.
For the time being, the most useful (IMHO) link to the bcolz documentation is this tutorial


(David Woo) #144

If anyone had issues installing xgboost, this link was helpful:


(Rachel Thomas) #145

@iNLyze We haven’t found ctables useful for deep learning so far.


(Samuel Ekpe) #146

use wget -r -nH -nd -np -R "index.html*" http://www.platform.ai/part2/lesson1/ (the reject pattern is quoted so the shell doesn’t expand it)
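
For anyone unfamiliar with the flags, here is the same command annotated (flag meanings per the wget manual):

```shell
# -r   recursive download
# -nH  no host-prefixed directory (no www.platform.ai/ folder)
# -nd  no directory hierarchy; save every file into the current dir
# -np  never ascend to the parent directory
# -R   reject files matching the pattern (quoted so the shell
#      doesn't glob-expand it before wget sees it)
wget -r -nH -nd -np -R "index.html*" http://www.platform.ai/part2/lesson1/
```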


#147

Building a new box and getting an xgboost import error on Ubuntu 16.10 (with anaconda3/python3.6). Any suggestions?

import xgboost
/home/gaurav/anaconda3/lib/python3.6/site-packages/sklearn/cross_validation.py:44: DeprecationWarning: This module was deprecated in version 0.18 in favor of the model_selection module into which all the refactored classes and functions are moved. Also note that the interface of the new CV iterators are different from that of this module. This module will be removed in 0.20.
  "This module will be removed in 0.20.", DeprecationWarning)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/gaurav/anaconda3/lib/python3.6/site-packages/xgboost/__init__.py", line 11, in <module>
    from .core import DMatrix, Booster
  File "/home/gaurav/anaconda3/lib/python3.6/site-packages/xgboost/core.py", line 112, in <module>
    _LIB = _load_lib()
  File "/home/gaurav/anaconda3/lib/python3.6/site-packages/xgboost/core.py", line 106, in _load_lib
    lib = ctypes.cdll.LoadLibrary(lib_path[0])
  File "/home/gaurav/anaconda3/lib/python3.6/ctypes/__init__.py", line 422, in LoadLibrary
    return self._dlltype(name)
  File "/home/gaurav/anaconda3/lib/python3.6/ctypes/__init__.py", line 344, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: /home/gaurav/anaconda3/lib/python3.6/site-packages/scipy/sparse/…/…/…/…/libstdc++.so.6: version `GLIBCXX_3.4.22' not found (required by /home/gaurav/anaconda3/lib/python3.6/site-packages/xgboost/./lib/libxgboost.so)


(Igor Barinov) #148

looks like a lib is missing?
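
More precisely, the OSError says Anaconda’s bundled libstdc++ predates GLIBCXX_3.4.22, which the compiled libxgboost.so requires. A common fix from that era (an assumption on my part, not confirmed in this thread) was to pull a newer runtime into the conda environment:

```shell
# Update the libstdc++/libgcc runtime shipped inside Anaconda so it
# provides the GLIBCXX_3.4.22 symbols libxgboost.so was built against
conda install libgcc
```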


#149

Thanks @ibarinov


(Apoorv Jagtap) #151

In addition to the above-mentioned packages, I had to install:

pip install image (to resolve the “no module named PIL” error)

This will install django, olefile, pillow, image