In Python 3, `int(50) < '2'` raises a `TypeError`, and well it should. In Python 2.x, however, `int(50) < '2'` returns `True`:
```
# Python 3
>>> int(50) < '2'
TypeError: unorderable types: int() < str()

# Python 2.x
>>> int(50) < '2'
True
```
(And who thought it was a good idea to allow this to begin with???)
I can imagine that the reason might be to allow objects of different types to be stored in tree-like structures, which use comparisons internally (see the sketch below).
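For illustration, here is a minimal Python 3 sketch of that idea: a sort key that roughly emulates CPython 2's cross-type ordering (numbers before everything else, then grouping by type name), so a mixed-type list can still be sorted the way Python 2's comparisons allowed. The `py2_key` helper is hypothetical, not anything from the standard library:

```python
def py2_key(obj):
    # Hypothetical helper emulating CPython 2's cross-type rule:
    # numbers sort before everything else; all other objects are
    # grouped by type name. Within a group we fall back to repr()
    # so the key is always orderable (CPython 2 actually fell back
    # to memory addresses, which this sketch does not reproduce).
    if isinstance(obj, (int, float)):
        return (0, '', obj)
    return (1, type(obj).__name__, repr(obj))

mixed = ['2', 50, None, 3.5, 'apple']
print(sorted(mixed, key=py2_key))
# [3.5, 50, None, '2', 'apple']
```

With a consistent (if arbitrary) total ordering like this, any comparison-based container, such as a binary search tree, can hold mixed types without special-casing them, which is presumably the convenience Python 2's behavior bought.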