Does anyone have a good definition of what a binary protocol is? And what is a text protocol, actually? How do the two compare in terms of bits sent on the wire?
The two use different character sets: the text protocol uses a reduced charset, while the binary one uses everything it can, not only "letters" and "numbers" (that's why the Wikipedia definition mentions a "human being" being able to read it).
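As a rough illustration of that charset difference (assuming printable ASCII as the "text" subset), a quick Python sketch:

    # A binary protocol can use any of the 256 byte values.
    binary_bytes = bytes(range(256))

    # A text protocol sticks to a reduced, printable subset:
    # here, printable ASCII (values 32-126) plus CR and LF.
    text_safe = bytes(list(range(32, 127)) + [13, 10])

    print(len(binary_bytes))  # 256 possible symbols
    print(len(text_safe))     # 97 "safe" symbols

So a binary protocol has the full 256 byte values available per octet, while a text protocol deliberately limits itself to the subset everyone can display and agree on.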
To be more clear: if I have a JPG file, how would it be sent through a binary protocol, and how through a text one? In terms of bits/bytes sent on the wire, of course.
You should read about Base64; that's the usual way binary data like a JPG gets carried over a text protocol.
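To make that concrete, here is a minimal Python sketch ("photo.jpg" is just a placeholder file name) comparing the raw bytes a binary protocol would send against the Base64 form a text protocol would need:

    import base64

    # Read the raw JPG bytes. "photo.jpg" is a placeholder name.
    with open("photo.jpg", "rb") as f:
        raw = f.read()

    # Binary protocol: the JPG bytes go on the wire as-is.
    print(len(raw))          # e.g. 300000 bytes

    # Text protocol: the bytes must first be mapped into the safe
    # character subset. Base64 uses A-Z, a-z, 0-9, '+' and '/',
    # encoding every 3 input bytes as 4 output characters.
    encoded = base64.b64encode(raw)
    print(len(encoded))      # e.g. 400000 bytes, roughly 33% larger

So in terms of bits on the wire: a binary protocol can ship the JPG unchanged, while a text protocol pays roughly a 33% size overhead (4 bytes out for every 3 bytes in) as the price of staying inside the reduced character set.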
Any comments are appreciated; I am trying to get to the essence of things here.
I think the essence of narrowing the charset is narrowing complexity and gaining portability and compatibility. It's harder to get many parties to arrange and agree on a wide charset (or a wide anything). The Latin/Roman alphabet and the Arabic numerals are known worldwide. (There are of course other reasons to reduce the code, but that's a main one.)
Let's say that in binary protocols the "contract" between the parties is about bits: the first bit means this, the second means that, and so on, or even about whole bytes (but with the freedom to use any charset without thinking about portability), for example in private closed systems or near-hardware standards. However, if you design an open system, you have to take into account how your codes will be represented in a wide range of situations: for example, how will they be represented on a machine on the other side of the world? That is where text protocols come in, where the contract is made as standard as possible. I have designed both, and those were the reasons: binary for very custom solutions, text for open and/or portable systems.
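To illustrate that "contract" idea, here is a small Python sketch (the message fields and layout are hypothetical, invented just for the example) showing the same message under a fixed binary layout versus a text encoding:

    import struct

    # Hypothetical message: a 16-bit message type and a 32-bit length.
    msg_type, length = 7, 1024

    # Binary contract: ">HI" fixes the layout (big-endian byte order,
    # 2 bytes for the type, 4 bytes for the length). Exactly 6 bytes,
    # but both sides must agree on offsets, sizes and endianness.
    binary_msg = struct.pack(">HI", msg_type, length)
    print(len(binary_msg))   # 6 bytes

    # Text contract: the same fields spelled out in ASCII. Bigger on
    # the wire, but readable anywhere, with no endianness to agree on.
    text_msg = "TYPE=7 LENGTH=1024\r\n".encode("ascii")
    print(len(text_msg))     # 20 bytes

This shows the trade-off described above: the binary version is only 6 bytes, but the contract lives in the exact bit/byte layout, while the text version costs 20 bytes and the contract is just "ASCII key=value lines", which any machine anywhere can parse.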