Relation between input and ciphertext length in AES
Having recently started using cryptography in my application, I find myself puzzled by the relationship between the length of the input text and the ciphertext it produces. Before applying crypto, it was easy to determine the database column size. Now, however, the column size varies slightly.

Two questions:

1. Am I correct in assuming this is due to the padding of my input, so that it fits the cipher's requirements?
2. Is there a way to accurately predict the maximum length of the ciphertext based on the maximum length of the input?

And for bonus points: should I be storing the ciphertext Base64-encoded?
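
For reference, this is the rough calculation I have been assuming so far. It is only a sketch of my guess, assuming AES with a 16-byte block, PKCS#7 padding, a 16-byte IV prepended to the ciphertext, and Base64 encoding for storage; please correct me if any of those assumptions are off:

```python
# Sketch of the length calculation I am assuming (not verified):
# AES block size = 16 bytes, PKCS#7 padding, 16-byte IV stored with the
# ciphertext, then Base64 for the database column.

AES_BLOCK = 16  # bytes
IV_LEN = 16     # bytes, if the IV is prepended to the ciphertext

def padded_len(plaintext_len: int) -> int:
    # PKCS#7 always adds at least one byte, so the result is the next
    # multiple of the block size strictly greater than plaintext_len.
    return (plaintext_len // AES_BLOCK + 1) * AES_BLOCK

def base64_len(raw_len: int) -> int:
    # Base64 turns every 3 bytes into 4 characters, rounded up with '=' padding.
    return 4 * ((raw_len + 2) // 3)

max_input = 255  # example: maximum column size before encryption
raw = IV_LEN + padded_len(max_input)
print(raw, base64_len(raw))  # 272 raw bytes -> 364 Base64 characters
```

If that reasoning is sound, I could size the column from the maximum plaintext length alone, but I am not sure it holds for every mode and padding scheme.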