high-speed-computing

Higher speed options for executing very large (20 GB) .sql file in MySQL

余生长醉 submitted on 2019-12-10 22:16:54
Question: My firm was delivered a 20+ GB .sql file in response to a request for data from the gov't. I don't have many options for getting the data in a different format, so I need options for importing it in a reasonable amount of time. I'm running it on a high-end server (Win 2008 64-bit, MySQL 5.1) using Navicat's batch execution tool. It has been running for 14 hours and shows no signs of being near completion. Does anyone know of any higher-speed options for such a transaction? Or is this what I
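One direction often suggested for loads like this (offered here as a sketch, not taken from the original thread; the credentials, database name, and file name are placeholders) is to stream the dump straight into the mysql command-line client with autocommit and integrity checks temporarily disabled, rather than executing it statement by statement from a GUI tool. A minimal sketch in Python:

import subprocess

# Statements that commonly speed up bulk InnoDB loads; they are written
# to the client's stdin before and after the dump itself.
prelude = b"SET autocommit=0; SET unique_checks=0; SET foreign_key_checks=0;\n"
epilogue = b"\nCOMMIT; SET unique_checks=1; SET foreign_key_checks=1;\n"

proc = subprocess.Popen(
    ["mysql", "--user=root", "--password=secret", "target_db"],  # placeholder credentials
    stdin=subprocess.PIPE,
)
proc.stdin.write(prelude)
with open("dump.sql", "rb") as dump:                      # placeholder file name
    for chunk in iter(lambda: dump.read(1 << 20), b""):   # stream in 1 MiB chunks
        proc.stdin.write(chunk)
proc.stdin.write(epilogue)
proc.stdin.close()
proc.wait()

Running the client locally on the server and committing once at the end avoids the per-statement round trips and per-row commits that tend to dominate when a dump this large is replayed through a batch tool.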

fast, large-width, non-cryptographic string hashing in python

亡梦爱人 submitted on 2019-11-28 04:40:16
I have a need for a high-performance string hashing function in Python that produces integers with at least 34 bits of output (64 bits would make sense, but 32 is too few). There are several other questions like this one on Stack Overflow, but of those, every accepted/upvoted answer I could find fell into one of a few categories, which don't apply (for the given reasons). Use the built-in hash() function. This function, at least on the machine I'm developing for (with Python 2.7 and a 64-bit CPU), produces an integer that fits within 32 bits - not large enough for my purposes. Use hashlib.
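One common non-cryptographic option with a full 64-bit range (a sketch only, not the answer the asker accepted) is a pure-Python FNV-1a hash; the constants below are the standard 64-bit FNV offset basis and prime:

# 64-bit FNV-1a, works on Python 2.7 and 3.x
FNV64_OFFSET = 14695981039346656037
FNV64_PRIME = 1099511628211

def fnv1a_64(data):
    """Return a 64-bit FNV-1a hash of a byte string."""
    h = FNV64_OFFSET
    for byte in bytearray(data):                       # yields ints on both Python 2 and 3
        h ^= byte
        h = (h * FNV64_PRIME) & 0xFFFFFFFFFFFFFFFF     # keep the value within 64 bits
    return h

print(fnv1a_64(b"example string"))                     # full-width 64-bit integer

In pure Python the per-byte loop is the bottleneck, so for genuinely high throughput a C-backed implementation of a 64-bit hash would typically be preferred over this loop.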
