How to pipeline input data in Keras or TensorFlow?

In some situations I cannot fit all my training data into main memory. Are there any tutorials about pipelining input data in Keras or TensorFlow? The data may be tensors of string/int/float…

You can use `fit_generator`, which does not expect all the data to be in memory. If you are interested in using Bcolz, take a look at lesson 10, where Jeremy shows how to handle the ImageNet dataset, which is about 140 GB.
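Here is a minimal sketch of the `fit_generator` approach. The file names (`train_x.npy`, `train_y.npy`), shapes, and model are hypothetical; the point is that the generator yields one batch at a time, so the full dataset never has to sit in RAM.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

def batch_generator(x_path, y_path, batch_size=32):
    # Memory-map the arrays instead of loading them fully into RAM.
    x = np.load(x_path, mmap_mode='r')
    y = np.load(y_path, mmap_mode='r')
    n = len(x)
    while True:  # Keras expects the generator to loop forever
        for start in range(0, n, batch_size):
            end = min(start + batch_size, n)
            # Copy only the current batch into memory.
            yield np.array(x[start:end]), np.array(y[start:end])

# Hypothetical toy model for 784-dimensional inputs and 10 classes.
model = Sequential([Dense(10, activation='softmax', input_shape=(784,))])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')

batch_size = 32
n_train = len(np.load('train_x.npy', mmap_mode='r'))
train_gen = batch_generator('train_x.npy', 'train_y.npy', batch_size)

model.fit_generator(train_gen,
                    steps_per_epoch=int(np.ceil(n_train / batch_size)),
                    epochs=5)
```

Note that in recent versions of tf.keras, `fit_generator` is deprecated and `model.fit` accepts generators (or a `tf.data.Dataset`) directly with the same `steps_per_epoch` argument.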

Thanks for your reply.