Let a, b, and c be positive integers small enough that b * c does not overflow. Does a/b/c always equal a/(b * c) in C# integer arithmetic? For me, in C# it looks like they do:
    int a = 5126, b = 76, c = 14;                  // value of c picked for illustration
    Console.WriteLine(a / b / c == a / (b * c));   // True
Avoiding the overflow errors noticed by others, they always match.
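One way the overflow mismatch can show up (values picked for illustration; in C#'s default unchecked context, b * c simply wraps around):

    int a = int.MaxValue, b = 100000, c = 100000;
    Console.WriteLine(a / b / c);     // 0
    Console.WriteLine(a / (b * c));   // 1, because b * c wrapped to 1410065408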
Let's suppose that a/b = q1, which means that a = b*q1 + r1, where 0 <= r1 < b. Now suppose that a/b/c = q2, which means that q1 = c*q2 + r2, where 0 <= r2 < c. This means that a = b*(c*q2 + r2) + r1 = b*c*q2 + b*r2 + r1.
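For example, taking the question's values with c = 14 as in the snippet above: q1 = 5126/76 = 67 with r1 = 34, and q2 = 67/14 = 4 with r2 = 11; indeed 76*14*4 + 76*11 + 34 = 4256 + 836 + 34 = 5126.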
In order for a/(b*c) = a/b/c = q2, we need to have 0 <= b*r2 + r1 < b*c. But b*r2 + r1 <= b*(c-1) + (b-1) = b*c - 1 < b*c, as required, and the two operations match.
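As a sanity check, here is a minimal brute-force sweep of the identity (the ranges are arbitrary, kept small enough that b * c stays far from overflow):

    // Exhaustively compare a/b/c with a/(b*c) for small positive operands.
    for (int a = 1; a <= 1000; a++)
        for (int b = 1; b <= 50; b++)
            for (int c = 1; c <= 50; c++)
                if (a / b / c != a / (b * c))
                    Console.WriteLine($"Mismatch: a={a}, b={b}, c={c}");
    // Prints nothing: per the argument above, no mismatch exists in this range.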
This doesn't work if b or c is negative, but I don't know how integer division works in that case either.
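For anyone who wants to probe the negative case empirically, the same sweep can be extended over signed ranges (a sketch, not a proof; zero divisors are skipped):

    // Exploratory sweep including negative operands. C# integer division
    // truncates toward zero, so the derivation above does not directly apply.
    for (int a = -100; a <= 100; a++)
        for (int b = -20; b <= 20; b++)
            for (int c = -20; c <= 20; c++)
                if (b != 0 && c != 0 && a / b / c != a / (b * c))
                    Console.WriteLine($"Mismatch: a={a}, b={b}, c={c}");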