Optimizing worst-case time complexity to O(1) for Python dicts [closed]
I have to store 500M two-digit Unicode characters in memory (RAM). The data structure I use should have:

- Worst-case space complexity: O(n)
- Worst-case time complexity: O(1) for insertion, read, update, and deletion

I was thinking of choosing dict, which is Python's hash-table implementation, but the problem is that it guarantees O(1) time complexity for the required operations only in the average case, not the worst case. I heard that if the number of entries is known in advance, O(1) worst-case time complexity can be achieved. How do I do that? In case that's not possible in Python, can I access memory addresses and
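One standard way to get worst-case O(1) when the key universe is known and bounded is a direct-address table rather than a hash table. Here is a minimal sketch, assuming each key is a single character whose code point lies in a known range (the Basic Multilingual Plane, 0..0xFFFF, is assumed here for illustration); every operation is a single list index, so no hashing or collision resolution is involved:

```python
class DirectAddressTable:
    """Direct-address table: O(1) worst-case insert/read/update/delete,
    at the cost of preallocating one slot per possible key."""

    _EMPTY = object()  # sentinel so a stored None is distinguishable from "absent"

    def __init__(self, size=0x10000):
        # O(size) space, allocated up front.
        self._slots = [self._EMPTY] * size

    def insert(self, char, value):
        self._slots[ord(char)] = value        # single index: O(1) worst case

    def read(self, char):
        v = self._slots[ord(char)]
        if v is self._EMPTY:
            raise KeyError(char)
        return v                              # O(1) worst case

    def update(self, char, value):
        self._slots[ord(char)] = value        # O(1) worst case

    def delete(self, char):
        self._slots[ord(char)] = self._EMPTY  # O(1) worst case
```

The trade-off is space: the table is sized for the whole key universe, not the number of live entries, so this only meets the O(n) space bound when the universe is comparable to n (as it is for a bounded character range).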