Why is array indexing done with 0 and not 1 in programming languages like Java? I am totally new to Java; any explanation is welcome.
To summarize Dijkstra's argument:
When working with sub-sequences of natural numbers, two conditions are desirable:

(1) The difference between the upper bound and the lower bound should be the length of the sub-sequence.

(2) The lower bound should be inclusive and the upper bound exclusive; in other words, the lower bound should be the first index of the array. Otherwise, we risk needing a lower bound outside the natural numbers for some sub-sequences.

The indices of an array can be thought of as a special kind of such a sub-sequence. If we want to maintain conditions (1) and (2), then we effectively have two choices for the bounds: 1 <= i < N+1 or 0 <= i < N. Clearly, putting N+1 in the range is ugly, so we should prefer indexing starting from 0.
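To make the half-open convention concrete, here is a minimal sketch (written in C, though the same loop shape applies in Java): with the range 0 <= i < N, the loop bound is simply the array's length, and each element's index equals the number of elements preceding it.

    #include <stdio.h>

    int main(void) {
        int a[] = {10, 20, 30, 40};
        int n = sizeof a / sizeof a[0];   /* length of the array */

        /* Half-open range 0 <= i < n: the bound is the length itself,
           and each index equals the number of elements preceding it. */
        for (int i = 0; i < n; i++) {
            printf("a[%d] = %d\n", i, a[i]);
        }
        return 0;
    }

With 1-based indexing the same loop would have to run over 1 <= i < n+1 (or 1 <= i <= n), and the bound would no longer read as the length.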
Java uses zero-based indexing because C uses zero-based indexing. C uses zero-based indexing because an array index is nothing more than a memory offset: the first element of an array is at the memory address the array already points to, *(array + 0).
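A tiny sketch of that equivalence, assuming nothing beyond standard C: a[i] is defined as *(a + i), so the first element needs no displacement at all.

    #include <stdio.h>

    int main(void) {
        int a[] = {10, 20, 30};

        /* Indexing is just pointer arithmetic: a[i] means *(a + i). */
        printf("%d %d\n", a[0], *(a + 0));   /* both print 10: zero offset      */
        printf("%d %d\n", a[2], *(a + 2));   /* both print 30: two elements in  */
        return 0;
    }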
It's all legacy from the times when programming languages like C were merely high-level assemblers. Programming mavericks spent their wonderful lives doing pointer arithmetic, so counting from zero became their second nature. Now they are passing this legacy on to many modern languages. You can even read statements like "Zero is the most natural number." Zero is not a natural number. People don't count from zero in real life; mathematicians don't, physicists don't, statisticians don't count from zero... it's only in computer science.
Further, you don't say "I have zero apples" to express the fact that you don't have any apples; otherwise, following the same logic, you would say "I don't have minus one apples" to express the fact that you have one apple. :P
To expand upon @Kevin's answer, I take this quote from an answer on Programmers.SE:
The index in an array is not really an index. It is simply an offset that is the distance from the start of the array. The first element is at the start of the array so there is no distance. Therefore the offset is 0.
Further, if you want to learn more about how different languages do their array indexing, then look at this exhaustive list on Wikipedia.
A quote by Dijkstra, from Why numbering should start at zero (1982):
When dealing with a sequence of length N, the elements of which we wish to distinguish by subscript, the next vexing question is what subscript value to assign to its starting element. Adhering to convention a) yields, when starting with subscript 1, the subscript range 1 ≤ i < N+1; starting with 0, however, gives the nicer range 0 ≤ i < N. So let us let our ordinals start at zero: an element's ordinal (subscript) equals the number of elements preceding it in the sequence. And the moral of the story is that we had better regard —after all those centuries!— zero as a most natural number.
A discussion of this article can be found on Lambda the Ultimate: Why numbering should start at 0.