Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

The Question

SuperUser reader DragonLord is curious about why most operating systems and programming languages count from zero. He writes:

What historical reasons exist for this, and what practical advantages does counting from zero have over counting from one?

Why indeed? As widespread as the practice is, surely there are practical reasons for its implementation.

The Answer

SuperUser contributor Matteo offers the following insights:

If an array is stored at a given position in memory (called its address), the position of each element can be computed as

element(n) = address + n * element_size

If you consider the first element to be element number 1, the computation becomes

element(n) = address + (n - 1) * element_size

It's not a huge difference, but it adds an unnecessary subtraction for each access.
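
To make that arithmetic concrete, here is a minimal C sketch (our own illustration, not part of Matteo's answer) showing how a zero-based index maps directly onto pointer arithmetic, with no correction term:

```c
#include <stdio.h>

int main(void) {
    int arr[5] = {10, 20, 30, 40, 50};

    /* With zero-based indexing, element n lives at
       base_address + n * sizeof(element) -- no subtraction needed. */
    for (int n = 0; n < 5; n++) {
        int *addr = arr + n; /* same as &arr[n] */
        printf("element(%d) at %p holds %d\n", n, (void *)addr, *addr);
    }
    return 0;
}
```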

Edited to add:

The usage of the array index as an offset is not a requirement, just a habit: the offset of the first element could be hidden by the system and taken into account when allocating and referencing elements. Dijkstra published a paper, "Why numbering should start at zero" (pdf), in which he explains why starting at 0 is the better choice: starting at zero allows a better representation of ranges.
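
Dijkstra's range argument is easy to see in code. In this short C sketch (again our own illustration, assuming the conventional half-open notation), a range written as 0 <= i < N has bounds whose difference equals the element count, and two adjacent ranges meet without a gap or an overlap:

```c
#include <stdio.h>

#define N 5

int main(void) {
    int arr[N] = {10, 20, 30, 40, 50};

    /* Half-open range [0, N): the difference of the bounds (N - 0)
       is exactly the number of elements. */
    for (int i = 0; i < N; i++)
        printf("%d ", arr[i]);
    printf("\n");

    /* Splitting at any k: [0, k) and [k, N) cover the array exactly
       once, with no overlap and no off-by-one adjustment. */
    int k = 2;
    for (int i = 0; i < k; i++) printf("%d ", arr[i]); /* first part  */
    for (int i = k; i < N; i++) printf("%d ", arr[i]); /* second part */
    printf("\n");
    return 0;
}
```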

If you're looking to delve deeper into the answer, the Dijkstra paper linked above is an informative read.

Have something to add to the explanation? Sound off in the comments. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.