A few weeks back I was using std::ifstream to read in some files, and it was failing immediately on open because the file was larger than 4 GB. At the time I couldn't find a definitive answer as to why.
From the Standard's point of view, nothing prevents this. In practice, however, most 32-bit implementations use a 32-bit std::size_t, and the C++ standard mandates that the default allocator in the standard library uses std::size_t as its size type. That limits containers, strings and the like to 2^32 bytes of storage. The situation may be different for the stream offset type (off_t, or std::streamoff); I don't know exactly what is going on there.
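To see how the two limits can differ on your own implementation, here is a minimal sketch that just prints the widths of the relevant types; on a typical 32-bit build std::size_t is 32 bits, while std::streamoff is often still 64 bits:

```cpp
// Quick check of the relevant type widths on a given implementation.
#include <cstddef>   // std::size_t
#include <ios>       // std::streamoff, std::streamsize
#include <iostream>
#include <limits>

int main() {
    std::cout << "size_t:     " << 8 * sizeof(std::size_t) << " bits, max "
              << std::numeric_limits<std::size_t>::max() << '\n';
    std::cout << "streamoff:  " << 8 * sizeof(std::streamoff) << " bits, max "
              << std::numeric_limits<std::streamoff>::max() << '\n';
    std::cout << "streamsize: " << 8 * sizeof(std::streamsize) << " bits\n";
}
```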
To handle files that large reliably, you have to use the OS's native API directly, or some library that wraps it, rather than relying on the standard library, whose behaviour here is largely implementation-dependent.
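As one example of going to the native API, here is a minimal sketch for a POSIX system (the file name and offset are just placeholders). Defining `_FILE_OFFSET_BITS=64` before any include makes `off_t` 64-bit even in a 32-bit build, so offsets past 4 GB are representable:

```cpp
// Reading past the 4 GB mark with the POSIX API instead of std::ifstream.
#define _FILE_OFFSET_BITS 64   // must come before any system include

#include <cstdio>
#include <fcntl.h>
#include <sys/stat.h>
#include <unistd.h>

int main() {
    const char* path = "huge_file.bin";          // hypothetical file > 4 GB

    int fd = open(path, O_RDONLY);
    if (fd == -1) {
        std::perror("open");
        return 1;
    }

    struct stat st;
    if (fstat(fd, &st) == 0)
        std::printf("size: %lld bytes\n",
                    static_cast<long long>(st.st_size));

    // Seek beyond 4 GB: off_t is 64-bit here, so this offset fits.
    off_t offset = 5LL * 1024 * 1024 * 1024;     // 5 GB
    char buf[4096];
    ssize_t n = pread(fd, buf, sizeof buf, offset);
    if (n >= 0)
        std::printf("read %zd bytes at offset %lld\n",
                    n, static_cast<long long>(offset));

    close(fd);
    return 0;
}
```

On Windows the equivalent route is the Win32 file API (CreateFile / ReadFile), which takes 64-bit offsets regardless of the process's bitness.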