I have a small problem in my code for finding the minimum value from a series of numbers. When I initialize `min = 0`, the minimum value turns out as 0. But when I don't initialize `min`, the answer is correct!
> When I initialize `min = 0`, the minimum value turns out as 0.
That happens because, in that case, `if (min > a[i])` is always false. Remember, every `a[i]` is >= 100.
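For concreteness, here is a minimal sketch of the pattern that goes wrong. The array name `a`, its size, and the sample values are assumptions, since the original code isn't shown:

```c
#include <stdio.h>

int main(void) {
    int a[] = {100, 250, 175};      /* assumed sample data, all >= 100 */
    int n = sizeof a / sizeof a[0];

    int min = 0;                    /* bug: 0 is already below every element */
    for (int i = 0; i < n; i++) {
        if (min > a[i])             /* never true, since a[i] >= 100 > 0 */
            min = a[i];
    }
    printf("%d\n", min);            /* prints 0, not 100 */
    return 0;
}
```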
> But when I don't initialize `min`, the answer is correct!
If you don't initialize a local variable, its value is indeterminate and contains some garbage value (possibly a large one). So, seemingly, your logic works and you're apparently getting the right answer. However, using the value of an uninitialized local variable invokes undefined behaviour.
Solution: Initialize `min` with the largest possible value, `INT_MAX`, which is defined in the `<limits.h>` header (`<climits>` in C++).
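A corrected sketch, using the same assumed array as above:

```c
#include <limits.h>                 /* INT_MAX */
#include <stdio.h>

int main(void) {
    int a[] = {100, 250, 175};      /* assumed sample data */
    int n = sizeof a / sizeof a[0];

    int min = INT_MAX;              /* every element compares below this */
    for (int i = 0; i < n; i++) {
        if (min > a[i])
            min = a[i];
    }
    printf("%d\n", min);            /* prints 100 */
    return 0;
}
```

Alternatively, you can initialize `min = a[0]` and start the loop at `i = 1`; that avoids relying on a sentinel value entirely and works for any range of inputs.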