Handling large dense matrices in python
Basically, what is the best way to go about storing and using dense matrices in Python?

I have a project that generates similarity metrics between every pair of items in an array. Each item is a custom class, and it stores a pointer to the other class and a number representing its "closeness" to that class. Right now it works brilliantly up to about ~8000 items, after which it fails with an out-of-memory error.

Basically, if you assume that each comparison uses ~30 bytes to store the similarity (which seems accurate based on testing), that means the total required memory is:

    numItems^2 * itemSize = Memory
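For concreteness, here is that back-of-envelope arithmetic as a quick Python sanity check. The 8000-item count and the ~30 bytes per comparison are just the figures from my own testing, and the NumPy float32 matrix at the end is only there for scale (a hypothetical alternative, not what my code currently does):

```python
import numpy as np

numItems = 8000   # roughly where I start hitting out-of-memory errors
itemSize = 30     # approx. bytes per stored comparison, from testing

# Memory = numItems^2 * itemSize
totalBytes = numItems ** 2 * itemSize
print(f"custom objects: {totalBytes / 1024 ** 3:.2f} GiB")   # ~1.79 GiB

# For comparison, a plain dense float32 NumPy matrix of the same shape
# costs 4 bytes per similarity instead of ~30.
sims = np.zeros((numItems, numItems), dtype=np.float32)
print(f"float32 matrix: {sims.nbytes / 1024 ** 3:.2f} GiB")  # ~0.24 GiB
```

So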