I have the following code, where I used a HashMap (built on two parallel arrays) for storing key-value pairs (a key can have multiple values). Now, I have to store and load it for future use.
Keep the database on disk, not in memory. Rewrite your operations so that they don't operate on arrays but on buffers. Then you can open a sufficiently large file and have each operation access the portion it needs through a mapped buffer. Try whether your application performs better when you implement a cache of the few most recently mapped memory regions, so you won't have to map and unmap common regions over and over, but can instead keep them mapped in.
This should give you the best of both worlds, disk and RAM:
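A minimal sketch of that idea, assuming fixed-width records of one int key followed by one long value (the class name, file handling, and record layout are illustrative, not taken from the question):

```java
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

// One mapped region holding fixed-width records: 4-byte int key + 8-byte long value.
public class MappedStore implements AutoCloseable {
    private static final int RECORD_SIZE = Integer.BYTES + Long.BYTES; // 12 bytes

    private final FileChannel channel;
    private final MappedByteBuffer buffer;

    public MappedStore(Path file, int records) throws IOException {
        channel = FileChannel.open(file, StandardOpenOption.CREATE,
                StandardOpenOption.READ, StandardOpenOption.WRITE);
        // The OS pages this region in and out on demand; no explicit reads or writes.
        buffer = channel.map(FileChannel.MapMode.READ_WRITE,
                0, (long) records * RECORD_SIZE);
    }

    public void put(int index, int key, long value) {
        int pos = index * RECORD_SIZE;
        buffer.putInt(pos, key);                     // absolute puts: no position bookkeeping
        buffer.putLong(pos + Integer.BYTES, value);
    }

    public int keyAt(int index)    { return buffer.getInt(index * RECORD_SIZE); }
    public long valueAt(int index) { return buffer.getLong(index * RECORD_SIZE + Integer.BYTES); }

    @Override
    public void close() throws IOException {
        buffer.force();  // flush dirty pages to disk
        channel.close();
    }
}
```

Note that a single MappedByteBuffer is capped at Integer.MAX_VALUE bytes (2 GB), so at 600 million 12-byte records you would split the file into several regions and map them on demand; that is exactly where the cache of recently mapped regions mentioned above pays off.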
As you can see, this depends a lot on locality: if some keys are more common than others, things will perform well, whereas nicely distributed keys will cause a new disk operation for each access. So while nice distributions are desirable for most in-memory hash maps, other structures which map often-used keys to similar locations will perform better here. Those will interfere with collision handling, though.
It would be better to use an embedded database like SQLite, which will give good results.
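A sketch of that route, assuming the xerial sqlite-jdbc driver is on the classpath (the database file, table, and column names are made up for illustration):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class SqliteStore {
    public static void main(String[] args) throws SQLException {
        // On-disk database file, so the data does not have to fit in the heap.
        try (Connection conn = DriverManager.getConnection("jdbc:sqlite:pairs.db")) {
            try (Statement stmt = conn.createStatement()) {
                stmt.execute("CREATE TABLE IF NOT EXISTS pairs (k INTEGER, v INTEGER)");
                stmt.execute("CREATE INDEX IF NOT EXISTS idx_k ON pairs(k)");
            }

            conn.setAutoCommit(false); // batch all inserts inside one transaction
            try (PreparedStatement ins =
                     conn.prepareStatement("INSERT INTO pairs (k, v) VALUES (?, ?)")) {
                ins.setInt(1, 42);
                ins.setLong(2, 123456789L);
                ins.addBatch();
                ins.executeBatch();
            }
            conn.commit();

            // A key can map to multiple values, so iterate over all matching rows.
            try (PreparedStatement q =
                     conn.prepareStatement("SELECT v FROM pairs WHERE k = ?")) {
                q.setInt(1, 42);
                try (ResultSet rs = q.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getLong(1));
                    }
                }
            }
        }
    }
}
```

Wrapping the bulk inserts in a single transaction matters: in auto-commit mode SQLite commits each statement individually, which is dramatically slower when loading millions of rows.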
I doubt this is possible, given the datatypes you have declared. Just multiply the sizes of the primitive types.
Each row requires 4 bytes to store an int and 8 bytes to store a long. 600 million rows × 12 bytes per row = 7,200,000,000 bytes ≈ 7.2 GB (about 6.7 GiB). You say you can allocate 5 GB to the JVM, so even if all of that were heap holding only this custom HashMap, it would not fit. Consider shrinking the size of the datatypes involved or storing the data somewhere other than RAM.
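As a quick sanity check, here is the same arithmetic in code (only the 600-million row count comes from the question):

```java
public class FootprintEstimate {
    public static void main(String[] args) {
        long rows = 600_000_000L;
        long bytesPerRow = Integer.BYTES + Long.BYTES;   // 4 + 8 = 12
        long total = rows * bytesPerRow;                 // 7,200,000,000 bytes
        System.out.printf("%,d bytes = %.1f GB = %.2f GiB%n",
                total, total / 1e9, total / (double) (1L << 30));
        // Prints: 7,200,000,000 bytes = 7.2 GB = 6.71 GiB
    }
}
```

Either way, the raw data alone exceeds the 5 GB budget, before counting any array or object overhead.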