Universal Sentence Encoders and CNNs for Text

I was wondering if anyone has played around with Google’s Universal Sentence Encoder (https://www.aclweb.org/anthology/D18-2029)? The paper seems to mention that they used it as input to a CNN (Yoon Kim’s 1D CNN for text), but I don’t understand how a 1-dimensional vector of length 512 (which the embedding outputs) can be put through a 1D convolution.
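
To make the shape mismatch I mean concrete, here is a rough PyTorch sketch (the batch size, channel counts, and the reshape at the end are just my guesses to illustrate the question, not anything taken from the paper):

```python
import torch
import torch.nn as nn

# Kim-style text CNN: the input is a *sequence* of word embeddings,
# shape (batch, embed_dim, seq_len), and the kernel slides over seq_len.
kim_input = torch.randn(8, 300, 50)           # 8 sentences, 300-d word vectors, 50 tokens
conv = nn.Conv1d(in_channels=300, out_channels=100, kernel_size=3)
print(conv(kim_input).shape)                  # torch.Size([8, 100, 48])

# Universal Sentence Encoder: one fixed 512-d vector per sentence,
# shape (batch, 512) -- there is no sequence axis left to convolve over,
# unless you treat the 512 values themselves as a length-512 "sequence"
# with a single channel:
use_output = torch.randn(8, 512)
as_sequence = use_output.unsqueeze(1)         # (8, 1, 512)
conv_1ch = nn.Conv1d(in_channels=1, out_channels=100, kernel_size=3)
print(conv_1ch(as_sequence).shape)            # torch.Size([8, 100, 510])
```

Is that second reshape what people actually do, or am I misreading how the encoder output feeds into the CNN?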