Does it make sense to use Hungarian notation prefixes in interpreted languages? [closed]
First of all, I have looked at the following posts to avoid asking a duplicate question:

- https://stackoverflow.com/questions/1184717/hungarian-notation
- Why shouldn't I use "Hungarian Notation"?
- Are variable prefixes ("Hungarian notation") really necessary anymore?
- Do people use the Hungarian Naming Conventions in the real world?

All of these posts concern C#, C++, and Java: statically typed languages. I understand that there is no need for the prefixes when the type is known at compile time. Nevertheless, my question is: is it worthwhile to use the prefixes in interpreter-based languages?
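For illustration (not part of the original question), here is a minimal Python sketch contrasting Hungarian-style prefixes with the optional type hints that a dynamic language like Python offers; all variable names are hypothetical:

```python
# Hungarian-style prefixes encode the type in the name itself.
strUserName = "alice"      # "str" prefix marks a string
intRetryCount = 3          # "int" prefix marks an integer

# The alternative in modern Python: optional type hints, which
# static checkers such as mypy can verify without cluttering names.
user_name: str = "alice"
retry_count: int = 3

# In a dynamic language, nothing stops a prefixed name from
# drifting out of sync with the value it holds:
strUserName = 42           # still "str"-prefixed, but now an int
print(type(strUserName).__name__)  # the runtime type is what counts
```

The last lines show the usual argument against the convention in dynamic languages: the prefix is a comment the interpreter never checks, whereas a type hint can be machine-verified.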