Classifying Monetary Notes for the Visually Impaired

Hello Everyone,
I am creating this project to classify Nepalese monetary notes using a smartphone camera.

In Nepal, monetary notes have no special tactile markings, which makes day-to-day monetary transactions difficult for visually impaired individuals. Some, after years of experience, have finally learned to tell the notes apart, but many still have to ask others for help.

The project I am working on will try to solve this problem. Many blind individuals use smartphones these days with the built-in screen reader feature. Basically, I am creating an app: when the person takes a picture using the app, the picture is classified by a deep learning model behind the scenes, and audio is played to notify the user of the predicted value of the note.

For the first prototype I used the VGG19 model and worked on only 2 categories: Rs.10 notes and Rs.20 notes. The training and validation accuracy is around 99%, but when tested on real-world data it drops to around 90%. I trained on 2,000 images per class, with 350 images per class in the validation set.
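For anyone curious what the model setup might look like, here is a minimal transfer-learning sketch in Keras: VGG19 as a frozen feature extractor with a small classification head for the two note classes. This is an illustrative assumption, not the project's actual code; the head size, dropout rate, and input resolution are placeholders, and in practice you would load `weights="imagenet"` (set to `None` here only so the snippet runs without downloading pretrained weights).

```python
# Sketch of a VGG19 transfer-learning classifier for two note classes
# (Rs.10 vs Rs.20). Layer sizes and hyperparameters are illustrative
# assumptions, not the project's exact configuration.
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG19

NUM_CLASSES = 2  # Rs.10 and Rs.20

# In a real run, weights="imagenet" gives the pretrained features;
# weights=None is used here only to keep the sketch offline-friendly.
base = VGG19(weights=None, include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze the convolutional feature extractor

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),  # helps narrow the train/real-world accuracy gap
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

The gap between ~99% validation accuracy and ~90% on real data often points to the validation set being too similar to the training set; heavier augmentation (lighting, blur, rotation, background variation) during training usually helps a camera-based app like this.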

I have released the first prototype, along with its source code and dataset, as an open source project, and I have also released the app that uses it as open source.

I have shared some of my experience developing this project in the article below.

I look forward to your valuable advice and suggestions.