Lesson 8 - solve_image(evaluator, iterations, x) loss is not changing

Running x = solve_image(evaluator, iterations, x) doesn’t change the loss.

### Function:

```python
def solve_image(eval_obj, niter, x):
    for i in range(niter):
        x, min_val, info = fmin_l_bfgs_b(eval_obj.loss, x.flatten(),
                                         fprime=eval_obj.grads, maxfun=20)
        x = np.clip(x, -127, 127)
        print('Current loss value:', min_val)
        imsave(f'{curr}/{path}/results/res_at_iteration_{i}.png',
               deproc(x.copy(), shp)[0])
    return x
```
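For reference, `scipy.optimize.fmin_l_bfgs_b` expects the objective to return a single float and `fprime` to return a 1-D gradient array the same length as `x`. A minimal sketch with a made-up quadratic (not the style-transfer loss):

```python
import numpy as np
from scipy.optimize import fmin_l_bfgs_b

# Hypothetical scalar objective: f(x) = sum((x - 3)^2)
def loss(x):
    return float(np.sum((x - 3.0) ** 2))  # must be a single float

def grads(x):
    return 2.0 * (x - 3.0)  # 1-D array, same shape as x

x0 = np.zeros(5)
x, min_val, info = fmin_l_bfgs_b(loss, x0, fprime=grads, maxfun=20)
print(min_val)  # a scalar; near 0, since the optimum is x = 3
```

If the objective returns an array instead of a scalar, the optimizer cannot compare candidate points properly, which matches the symptom above.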

### Output produced:

```
Current loss value: [[[ 33.2060318   20.08106232  16.17399216  12.16727257   8.56483841
    7.62770414   7.33554125   7.80692196   9.49361134   7.62756014
    6.00673389   6.87932873  12.56398296  11.63674164  18.35542297
   ...
   67.73228455  78.1083374   74.33560944  68.20272827  63.6031723
   61.70191956  52.33744812  61.27451324  65.0149231   70.21136475
   83.21934509]]]
Current loss value: [[[ 33.2060318   20.08106232  16.17399216  12.16727257   8.56483841
    7.62770367   7.33554077   7.80692196   9.49361134   7.62756014
    6.00673389   6.87932873  12.56398201  11.63674068  18.35542297
   19.38352966  24.62762833  24.09149361  20.39878464  19.10217285
   13.69785786  14.9651947   15.17493153  10.78668976   9.50629044
    8.43445015   7.72730112   8.06190777  11.87230492  12.63180637
   13.64288807]
```

(31×31 array output truncated here; the second iteration prints values nearly identical to the first.)

Please help!

Make sure your loss function is defined correctly. `min_val` should be a scalar, but your output shows a full array, which means the loss was never reduced to a single value before being handed to `fmin_l_bfgs_b`:

```python
loss = K.mean(metrics.mse(layer, targ))
```
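`K.mean` is the step that collapses the per-element squared errors into one scalar, which is what the optimizer needs. The same reduction in plain NumPy (shapes below are illustrative, not taken from the notebook):

```python
import numpy as np

layer = np.random.rand(1, 31, 31, 64)  # illustrative layer activations
targ = np.random.rand(1, 31, 31, 64)   # illustrative target activations

per_elem = (layer - targ) ** 2  # element-wise squared error: still an array
loss = np.mean(per_elem)        # single float, suitable for fmin_l_bfgs_b
print(np.ndim(loss))            # 0, i.e. a scalar
```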
