I am currently an amateur in deep learning and was reading about Word2Vec on this site: https://www.kaggle.com/c/word2vec-nlp-tutorial/details/part-3-more-fun-with-word-ve
Actually, the word vector dimension does not reflect the vocabulary size. What Word2Vec does is map words to representations in a vector space, and you can make this space any dimension you want. Each word is represented by a point in this space, and the components of its word vector are the coordinates of that point. Words that tend to appear in the same contexts end up close to each other in this space.
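To make this concrete, here is a toy sketch (the vectors below are made up by hand, not trained) showing that the embedding dimension is a free choice independent of vocabulary size, and that similarity is measured geometrically, e.g. with cosine similarity:

```python
import numpy as np

# Hypothetical, hand-picked vectors (NOT trained by Word2Vec):
# a 5-word vocabulary embedded in a 3-dimensional space.
# The dimension (3) is a free choice, independent of the vocabulary size (5).
embeddings = {
    "king":  np.array([0.90, 0.80, 0.10]),
    "queen": np.array([0.85, 0.82, 0.15]),
    "man":   np.array([0.70, 0.10, 0.20]),
    "woman": np.array([0.68, 0.15, 0.25]),
    "apple": np.array([0.05, 0.10, 0.90]),
}

def cosine(u, v):
    """Cosine similarity: 1.0 means same direction, near 0 means unrelated."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Words that appear in similar contexts sit close together in the space,
# so their cosine similarity is high; unrelated words score low.
print(cosine(embeddings["king"], embeddings["queen"]))  # high
print(cosine(embeddings["king"], embeddings["apple"]))  # low
```

With a real library such as gensim, the same choice shows up as the `vector_size` parameter you pass when training, which you can set to 50, 100, 300, etc. regardless of how many words are in your corpus.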
Hope this helps.