Why is there training loss?

Dear @jeremy
I searched online: Google, Quora, Stack Overflow, Medium, and forum.fast.ai, but could not find the answer.
You are the smartest person I know in deep learning, so I hope you can share your insights.

A loss function represents the price paid for inaccurate predictions.

When training a neural net:

  1. Why is there a training loss?
  2. How is the training loss calculated, since the labels are hard: data1 = cat 100%, data2 = dog 100%, etc.?

This is not a trick question. My understanding of training a NN is that data1 = cat, data2 = dog, etc.,
not that data1 = 80% cat, data2 = 60% dog and 20% cat. So why is there a training loss?
What difference is there to measure, since each image is classified as just a cat or a dog,
and how is the training loss derived, since the data is supposed to be a cat or a dog only?
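To make my confusion concrete, here is a minimal sketch (my own, not from any fast.ai code) of how I understand cross-entropy loss could be computed. Even though the label is hard (100% cat), the model outputs soft probabilities, and the loss measures how far those probabilities are from 1.0 on the true class:

```python
import math

def cross_entropy(predicted_probs, true_index):
    # Loss is -log of the probability the model assigned to the true class.
    # It is zero only when the model predicts the true class with probability 1,
    # so an imperfect model always has a nonzero loss, even with hard labels.
    return -math.log(predicted_probs[true_index])

# Label: the image is 100% cat (class index 0),
# but the model outputs soft probabilities after softmax.
model_output = [0.8, 0.2]  # 80% cat, 20% dog
loss = cross_entropy(model_output, 0)
print(loss)  # -log(0.8) ≈ 0.2231
```

If this sketch is right, the "loss" is the gap between the model's confidence and the hard label, not a property of the label itself. Please correct me if I have misunderstood.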

Side note:
For readers, this is my second blog post on fast.ai, and I welcome your thoughts on Medium.