Batchnorm in input layer

I was in the process of building a linear model when I noticed that TabularModel applies BatchNorm1d even to the input! Just wondering if this was intentional, and if so, what is the intuition behind it? As far as I know, batch norm is usually applied from the second layer onwards. The link to the code can be seen here.
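For context, here is a minimal sketch (my own illustration, not fastai's actual code) of what BatchNorm1d on the input effectively does at train time, ignoring the learnable scale/shift and running statistics: each continuous feature gets standardized per mini-batch, much like input normalization.

```python
import numpy as np

def batchnorm_input(x, eps=1e-5):
    """Per-feature standardization over the batch dimension,
    as BatchNorm1d would do to raw continuous inputs at train time
    (gamma/beta and running stats omitted for simplicity)."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return (x - mean) / np.sqrt(var + eps)

# Two continuous features on very different scales
batch = np.array([[1.0, 100.0],
                  [2.0, 200.0],
                  [3.0, 300.0]])
normed = batchnorm_input(batch)
print(normed.mean(axis=0))  # ~0 for each feature
print(normed.std(axis=0))   # ~1 for each feature
```

So applying it on the input acts like automatic feature standardization, which may be why it is there, but I would still like to hear the reasoning.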