Suppose I have an IF condition:
if (A || B)
    ∧
    |
    |
   left operand
{
    // do something
}
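To make the evaluation order visible, here is a minimal runnable sketch (testA and testB are hypothetical helpers added only for illustration; the printout shows that when the left operand returns true, the right one is never called):

#include <stdio.h>
#include <stdbool.h>

/* Hypothetical helpers that log when they run, so the
   short-circuit behaviour of || is visible at runtime. */
bool testA(void) { printf("testA was evaluated\n"); return true; }
bool testB(void) { printf("testB was evaluated\n"); return false; }

int main(void) {
    /* testA() returns true, so || short-circuits and testB() never runs. */
    if (testA() || testB()) {
        printf("condition passed\n");
    }
    return 0;
}

Running this prints "testA was evaluated" followed by "condition passed"; "testB was evaluated" never appears.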
Now suppose that A is more likely to be true than B.
At runtime, if (a || b) evaluates a first; if a is true, the || operator short-circuits and b is never evaluated, saving one test. So if a is more likely to be true than b, putting a first skips the second test most of the time. The saving is tiny for a single statement, but it becomes huge when the statement is nested in a loop of some sort (for, while, recursion, or database-related queries).

For example, say we have 1 million records to test in a database at 1 minute per record (30 seconds for condition A and 30 seconds for condition B). Let A have an 80% chance of being true and B a 20% chance. If A is tested first: (0.8 × 1,000,000) × 0.5 min + (0.2 × 1,000,000) × 1 min = 600,000 minutes (10,000 hours). If B is tested first: (0.2 × 1,000,000) × 0.5 min + (0.8 × 1,000,000) × 1 min = 900,000 minutes (15,000 hours). However, you will notice that the difference becomes less significant as the probability of A being true gets closer to that of B.
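As a sanity check on that arithmetic, here is a small sketch that computes the expected total time for both orderings; the record count, per-test costs, and probabilities are taken from the example above, and the model simply assumes each test always costs its fixed 30 seconds:

#include <stdio.h>

int main(void) {
    const double records = 1000000.0; /* records to test                 */
    const double costA   = 0.5;       /* minutes to evaluate condition A */
    const double costB   = 0.5;       /* minutes to evaluate condition B */
    const double pA      = 0.8;       /* probability that A is true     */
    const double pB      = 0.2;       /* probability that B is true     */

    /* A first: pA of the records stop after A, the rest pay for both tests. */
    double aFirst = records * (pA * costA + (1.0 - pA) * (costA + costB));
    /* B first: pB of the records stop after B, the rest pay for both tests. */
    double bFirst = records * (pB * costB + (1.0 - pB) * (costA + costB));

    printf("A first: %.0f minutes (%.0f hours)\n", aFirst, aFirst / 60.0);
    printf("B first: %.0f minutes (%.0f hours)\n", bFirst, bFirst / 60.0);
    return 0;
}

This prints 600,000 minutes (10,000 hours) for A first versus 900,000 minutes (15,000 hours) for B first, matching the figures above.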