hashtable

unordered_map<float, short>: why does this work?

三世轮回 submitted on 2019-12-11 04:28:12
Question: I am using an unordered_map<float, unsigned short> to implement a hash table in C++. I know that using floats as keys to a hash table is a BAD idea under most circumstances, because comparing them is error prone. Under these circumstances, however, I am reading the floats in from large files and their precision is known and constant. I would like to know the details of how unordered_map is hashing my floats in order to estimate collision frequency. I am not overriding the default hash
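The exact algorithm behind C++'s std::hash<float> is implementation-defined (libstdc++, for instance, hashes the value's byte representation). The same bit-pattern idea can be sketched in Java, where Float.hashCode() is specified to return floatToIntBits(); class and variable names below are illustrative only:

```java
public class FloatHashDemo {
    public static void main(String[] args) {
        // Float.hashCode() returns floatToIntBits(): the raw IEEE-754
        // bit pattern of the value, used directly as the hash.
        float a = 1.5f;
        System.out.println(Float.hashCode(a) == Float.floatToIntBits(a)); // true

        // Bit-identical floats always hash identically, so keys read
        // from a file with known, constant precision collide only when
        // two distinct bit patterns land in the same bucket.
        System.out.println(Float.hashCode(1.5f) == Float.hashCode(1.5f)); // true

        // Caveat of bit-pattern hashing: 0.0f == -0.0f as numbers, yet
        // their bit patterns (and hence their hashes here) differ.
        System.out.println(Float.floatToIntBits(0.0f) == Float.floatToIntBits(-0.0f)); // false
    }
}
```

Since equal bit patterns guarantee equal hashes, collision frequency for file-sourced keys depends only on how the table folds 32-bit patterns into its bucket count, not on float comparison quirks.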

Expected constructor, destructor, or type conversion before ‘<’ token

懵懂的女人 submitted on 2019-12-11 03:28:17
Question: I'm new to C++, and I just can't seem to figure out what's causing these errors. The following is my header file: #ifndef TABLE #define TABLE #include <iostream> #include <cstdlib> #include <vector> typedef struct { double successful , unsuccessful[2] ; } Perform ; using namespace std ; template <class DATA> class Table { private : vector<DATA>* slot; vector<bool>* passBits; vector<bool>* full; int tableSize; public : explicit Table ( unsigned size = 5 ) ; ~Table( ) ; //destructor void empty

Java Hashtable put method slows down my application

試著忘記壹切 submitted on 2019-12-11 03:13:05
Question: I need to do: Dictionary cache; cache = new Hashtable(); this.getDocument().putProperty("imageCache", cache); Then I have a method that does: cache.put(url, picture); where picture is an Image object I create this way: public Image getSmiley(String smileyName) { BufferedImage img = new BufferedImage(16, 16, BufferedImage.TYPE_INT_ARGB); Graphics g = img.getGraphics(); ImageIcon myicon = new ImageIcon(getClass().getResource("/ola/smileys/" + smileyName + ".png")); myicon.paintIcon(null, g,
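A sketch (not from the question) of the usual first suspects when Hashtable.put shows up as slow: java.util.Hashtable synchronizes every call, and an under-sized table rehashes repeatedly as it grows. Where the consuming API does not specifically require a java.util.Dictionary (Swing's "imageCache" document property is one that does), a pre-sized HashMap avoids both costs; names below are illustrative:

```java
import java.util.HashMap;
import java.util.Map;

public class ImageCacheDemo {
    public static void main(String[] args) {
        // Pre-size for the expected number of smileys so the map never
        // rehashes; HashMap also skips Hashtable's per-call locking.
        Map<String, Object> cache = new HashMap<>(64);
        cache.put("smile", new Object()); // stand-in for a java.awt.Image
        cache.put("wink", new Object());
        System.out.println(cache.size()); // 2
    }
}
```

If the cache must stay a Hashtable for the Swing property, pre-sizing it with new Hashtable(expectedSize * 2) still removes the rehashing cost; the more likely culprit is then the image loading inside getSmiley, not the put itself.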

Memory issue when reading a huge CSV file, storing rows as Person objects, and writing into multiple cleaner/smaller CSV files

有些话、适合烂在心里 submitted on 2019-12-11 03:04:53
Question: I have two text files with comma-delimited values. One is 150 MB and the other is 370 MB, so between them they hold three million+ rows of data. One document holds information about, let's say, soft drink preferences, and the next might have information about, let's say, hair colors. Example soft drinks data file (though in the real file the UniqueNames are NOT in order, nor are the dates): "UniqueName","softDrinkBrand","year" "001","diet pepsi","2004" "001","diet coke","2006" "001","diet pepsi","2004"
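A streaming sketch (an assumed approach, not from the post): reading one line at a time with BufferedReader keeps memory use flat regardless of file size, as long as each row is processed or routed to its output file before the next is read. The three-column layout follows the example above; file names are placeholders:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class CsvStreamDemo {
    public static void main(String[] args) throws IOException {
        // Write a tiny stand-in for the 150 MB soft-drinks file.
        Path in = Files.createTempFile("softdrinks", ".csv");
        Files.write(in, List.of(
                "\"UniqueName\",\"softDrinkBrand\",\"year\"",
                "\"001\",\"diet pepsi\",\"2004\"",
                "\"001\",\"diet coke\",\"2006\""));

        // One line in memory at a time: parse, write out, move on,
        // instead of materializing 3M+ Person objects at once.
        int rows = 0;
        try (BufferedReader r = Files.newBufferedReader(in)) {
            r.readLine(); // skip header
            String line;
            while ((line = r.readLine()) != null) {
                String[] f = line.split(",");
                rows++; // here: map f[0..2] to a Person and write it out
            }
        }
        System.out.println(rows); // 2
    }
}
```

Note that split(",") is only safe for the simple layout shown; fields containing quoted commas need a real CSV parser.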

OCaml hash from MySQL

让人想犯罪 __ submitted on 2019-12-11 02:56:11
Question: I have a large database (approx. 150,000 records) that I want to embed into an OCaml source file named compute.ml. I am trying (without success) to transform these tables into hashtables and to embed these hashtables into a function compute, so as to have the binary program run quickly without having to query an external SQL database. I have 2 questions: Is there a way to export, once and for all, a MySQL table into an associative array (Hashtbl) that can be accessed by (or even embedded into)

Load factor of hash tables with tombstones

不羁的心 submitted on 2019-12-11 02:23:33
Question: So the question came up of whether tombstones should be included when calculating the load factor of a hash table. I thought that, given that the load factor is used to determine when to expand capacity, tombstones should not be included. An obvious example: you almost fill a hash table, then remove every value. Insertions are now super easy (no collisions), so I believe the load factor shouldn't include them. But you could look at this and think that, with all the tombstones
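A numeric sketch of the trade-off, with illustrative values not taken from the question: counting tombstones in the load factor triggers an earlier resize, and the rehash is precisely what clears the tombstones; counting only live entries keeps the table small but leaves probe chains long for lookups.

```java
public class LoadFactorDemo {
    public static void main(String[] args) {
        int capacity = 16;
        int live = 2;        // entries still present
        int tombstones = 10; // deleted slots still marking probe chains

        double liveOnly = (double) live / capacity;
        double withTombstones = (double) (live + tombstones) / capacity;

        // Insertions only need an empty-or-tombstone slot, so 0.125
        // reflects how easy inserting is; but unsuccessful lookups must
        // probe past every tombstone, so 0.75 reflects probing cost.
        System.out.println(liveOnly);       // 0.125
        System.out.println(withTombstones); // 0.75
    }
}
```

A common compromise is to track both counts: resize when live/capacity crosses the threshold, but rehash in place (same capacity) when (live + tombstones)/capacity does.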

Android Hashtable Serialization

夙愿已清 submitted on 2019-12-11 02:07:59
Question: I am having a weird issue with serialization of a Hashtable. I have made a server/client app, where the server (PC/Mac) serializes a Hashtable and sends it to the client (Android) through UDP. The data is sent/read correctly, but I get a bunch of these messages on LogCat: 04-12 11:19:43.059: DEBUG/dalvikvm(407): GetFieldID: unable to find field Ljava/util/Hashtable;.loadFactor:F Occasionally, I would also see these: 04-12 11:21:19.150: DEBUG/dalvikvm(407): GC freed 10814 objects / 447184 bytes in
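For context, a minimal round-trip of a serialized Hashtable through a byte array, as the UDP sender/receiver presumably does (stand-in code, not the asker's). The LogCat line is reportedly Dalvik noting that its own Hashtable implementation has no loadFactor field to restore into, since it differs internally from the desktop JDK's; the stream is still read correctly, matching what the question observes:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.Hashtable;

public class SerializeDemo {
    public static void main(String[] args) throws Exception {
        Hashtable<String, Integer> table = new Hashtable<>();
        table.put("score", 42);

        // Serialize to bytes, as the server would before the UDP send.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(table);
        }

        // Deserialize, as the Android client would after receiving.
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            @SuppressWarnings("unchecked")
            Hashtable<String, Integer> copy =
                    (Hashtable<String, Integer>) ois.readObject();
            System.out.println(copy.get("score")); // 42
        }
    }
}
```

Because Java serialization transfers fields by name, mismatched private fields between VM implementations produce debug noise rather than corruption; switching to an explicit wire format (e.g. writing key/value pairs yourself) silences it entirely.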

Java Hashtable many accesses problem

∥☆過路亽.° submitted on 2019-12-11 01:54:03
Question: I'm developing a tool which takes Java code and generates code for estimating the execution time of basic blocks, loops, and methods. After a block is executed, we record its time in our tool. The program model is stored in the following representation: static Hashtable<String, Hashtable<Integer, Hashtable<String, pair>>> method2Data = new Hashtable<String, Hashtable<Integer, Hashtable<String, pair>>>(); static Hashtable<String, Vector<String>> class2Method = new Hashtable<String, Vector<String>
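A possible restructuring (my sketch, not the asker's code): computeIfAbsent builds each missing intermediate map on demand in a single traversal, and an unsynchronized HashMap avoids Hashtable's per-access locking if the tool runs single-threaded. A long[] stands in for the asker's pair type:

```java
import java.util.HashMap;
import java.util.Map;

public class NestedMapDemo {
    public static void main(String[] args) {
        Map<String, Map<Integer, Map<String, long[]>>> method2Data = new HashMap<>();

        // One traversal creates any missing intermediate maps and
        // stores the (time, count) pair for methodA, line 3, block "b1".
        method2Data
                .computeIfAbsent("methodA", k -> new HashMap<>())
                .computeIfAbsent(3, k -> new HashMap<>())
                .put("b1", new long[] {10, 2}); // stand-in for 'pair'

        System.out.println(method2Data.get("methodA").get(3).get("b1")[0]); // 10
    }
}
```

If the nested lookups themselves dominate the profile, a single map keyed by a composite key (e.g. "class#method#line#block") trades the three hash lookups for one.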

Hashtable getting null for existing key

安稳与你 submitted on 2019-12-11 00:07:15
Question: I have a very strange error in this code from Vaadin's ContainerHierarchicalWrapper: for (Object object : children.keySet()) { LinkedList<Object> object2 = children.get(object); } The debugger shows this state: How is that even possible? How can object2 be null? This is my actual code, which causes the NPE (class EstablishRelationWindow): childrenContainer = new BeanItemContainer<>(PlaylistDTO.class); childrenContainerHierarchy = new ContainerHierarchicalWrapper(childrenContainer);
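One classic way to reach this state (a hypothesis, since the wrapper's internals aren't shown here): if a key object's hashCode changes after insertion, keySet() still iterates over it, but get() probes the wrong bucket and returns null. A minimal reproduction with a mutable key:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class MutableKeyDemo {
    public static void main(String[] args) {
        Map<List<String>, String> children = new HashMap<>();

        List<String> key = new ArrayList<>();
        key.add("playlist");
        children.put(key, "child-list");

        key.add("mutated"); // the key's hashCode changes here

        // keySet() walks the buckets and still finds the entry, but
        // get() recomputes the hash and looks in the wrong bucket.
        System.out.println(children.keySet().size()); // 1
        System.out.println(children.get(key));        // null
        System.out.println(children.containsKey(key)); // false
    }
}
```

Hashtable-based maps generally require keys whose hashCode and equals are stable while the key is in the map; a mutable DTO used as a key (or as part of one) breaks that contract exactly this way.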

Tuples/ArrayList of pairs

依然范特西╮ submitted on 2019-12-10 23:00:03
Question: I'm essentially trying to create a list of pairs, which is proving frustratingly difficult. Note, before anyone mentions Hashtables, that there will be duplicates, which I don't care about. For example, if I do $b = @{"dog" = "cat"} I get Name Value ---- ----- dog cat which is good. However, I'm then unable to add the likes of $b += @{"dog" = "horse"} Item has already been added. Key in dictionary: 'dog' Key being added: 'dog' I'm just trying to create a table of data which I can add to using
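The question is about PowerShell, but the structure the asker wants, a sequence of pairs that tolerates duplicate keys, is simply a list of entries rather than a dictionary. Sketched here in Java (the language of most entries on this page), with illustrative names:

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class PairListDemo {
    public static void main(String[] args) {
        // A list of entries, unlike a hashtable, happily stores two
        // pairs sharing the same key.
        List<Map.Entry<String, String>> pairs = new ArrayList<>();
        pairs.add(new SimpleEntry<>("dog", "cat"));
        pairs.add(new SimpleEntry<>("dog", "horse")); // no duplicate-key error

        System.out.println(pairs.size());            // 2
        System.out.println(pairs.get(1).getValue()); // horse
    }
}
```

In PowerShell itself, the usual equivalents are an array of [pscustomobject]@{Name='dog'; Value='cat'} rows, appended to with +=, since custom objects carry no key-uniqueness constraint.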