Can someone please explain the “binary_loss” in the last couple of cells of lesson 1?
def binary_loss(y, p):
    return np.mean(-(y * np.log(p) + (1 - y) * np.log(1 - p)))
 Why is it the “binary” loss? Does that just mean that it is normalized between 0 and 1?
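For reference, here is a tiny sanity check of how I understand the function (I added the import and the example arrays myself, and restored the minus sign that makes the result a positive loss, since my paste above may have mangled it):

```python
import numpy as np

def binary_loss(y, p):
    # mean binary cross-entropy over the batch
    return np.mean(-(y * np.log(p) + (1 - y) * np.log(1 - p)))

# made-up labels and predicted probabilities, not from the notebook
y = np.array([1.0, 0.0, 1.0, 0.0])
p = np.array([0.9, 0.1, 0.8, 0.2])
print(binary_loss(y, p))  # a small positive number, ~0.164
```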

What is the “y” here? It is not the “y” from “log_preds, y = learn.TTA()”, because it crashes if I use that “y”.
I see that when a list is made to call this function, the variable is called “acts”. What does this stand for? I realise that it is “y”, but I don’t understand.
How would I get y from the confusion matrix? I got inf when I tried using “probs”, so what is “probs” then…
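One thing I did notice while poking at the inf: “np.log” of a probability that is exactly 0 (or “np.log(1 - p)” when p is exactly 1) returns -inf, so I suspect that is where mine came from. A tiny demo (the arrays are made up by me, not from the notebook):

```python
import numpy as np

p = np.array([0.0, 0.5, 1.0])       # probabilities at the extremes
with np.errstate(divide="ignore"):  # silence the divide-by-zero warning
    print(np.log(p))                # first entry is -inf

# clipping away from exactly 0 and 1 avoids the infinities
eps = 1e-7
print(np.log(np.clip(p, eps, 1 - eps)))  # all finite
```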
Also, one really dumb question, but I just want to verify I understand: what does “precompute” actually precompute when set to True?
Thanks!