80 seems to be the default in many different environments, and I'm looking for a technical or historical reason. It is common knowledge that lines of code shouldn't exceed 80 characters.
As per Wikipedia:
The 80-characters-per-line limit is historically descended from punched cards and was later broadly used in monitor text mode.
source: http://en.wikipedia.org/wiki/Characters_per_line
Shall I still use 80 CPL?
Many developers argue for sticking to 80 CPL even if you could use more. Quoting from http://richarddingwall.name/2008/05/31/is-the-80-character-line-limit-still-relevant/:
Long lines that span too far across the monitor are hard to read. This is typography 101. The shorter your line lengths, the less your eye has to travel to see it.
If your code is narrow enough, you can fit two files on screen, side by side, at the same time. This can be very useful if you’re comparing files, or watching your application run side-by-side with a debugger in real time.
Plus, if you write code 80 columns wide, you can relax knowing that your code will be readable and maintainable on more-or-less any computer in the world.
Another nice side effect is that snippets of narrow code are much easier to embed into documents or blog posts.
As a Vim user, I keep colorcolumn=80 set in my ~/.vimrc (see the snippet below). If I remember correctly, Eclipse's autoformat (Ctrl+Shift+F) breaks lines at 80 characters by default.
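For reference, here is roughly what that looks like in a ~/.vimrc; the colorcolumn option requires Vim 7.3 or later, and the textwidth line is an optional extra rather than part of the setting itself:

    " draw a highlighted guide at column 80 (Vim 7.3+)
    set colorcolumn=80
    " optional: hard-wrap lines as you type, one column short of the guide
    set textwidth=79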
