C: avoiding overflows when working with big numbers
I've implemented some sorting algorithms (for sorting integers) in C, carefully using uint64_t for anything related to the data size (so counters, indices, and the like), since the algorithms must also be tested with data sets of several billion integers. The algorithms themselves should be fine, and there should be no problem with the amount of data allocated: the data is stored in files and we only load small chunks at a time, and everything works fine even when we shrink the in-memory buffers to any size. Tests with data sets of up to 4 billion ints (thus 16 GB of data) work fine (sorting 4Gint took 2228
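To illustrate the pattern I mean, here's a minimal sketch of such a chunked pass with 64-bit counters (the chunk size and file handling here are placeholders, not my actual code):

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Placeholder chunk size: 1M ints per read (assumption, not my real value). */
#define CHUNK_ELEMS ((uint64_t)1 << 20)

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <datafile>\n", argv[0]);
        return EXIT_FAILURE;
    }

    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return EXIT_FAILURE; }

    int32_t *buf = malloc(CHUNK_ELEMS * sizeof *buf);
    if (!buf) { perror("malloc"); fclose(f); return EXIT_FAILURE; }

    /* The running element count is uint64_t, so data sets larger than
     * 4G elements cannot overflow it the way a 32-bit counter would. */
    uint64_t total = 0;
    size_t got;
    while ((got = fread(buf, sizeof *buf, CHUNK_ELEMS, f)) > 0) {
        /* ... sort/process this chunk here ... */
        total += (uint64_t)got;
    }

    printf("read %llu ints\n", (unsigned long long)total);
    free(buf);
    fclose(f);
    return EXIT_SUCCESS;
}
```

The same rule applies to byte offsets: any file position should be computed as `(uint64_t)index * sizeof(int32_t)` rather than letting the multiplication happen in 32-bit arithmetic first.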