time-complexity

Why do we use linear probing in hash tables when there is separate chaining with linked lists?

穿精又带淫゛_ submitted on 2019-12-29 11:35:41
Question: I recently learned about different methods for handling collisions in hash tables, and saw that separate chaining with linked lists is always more time efficient. As for space efficiency, linear probing allocates a predefined block of memory up front that might never be fully used, whereas separate chaining allocates memory dynamically. So isn't separate chaining with linked lists more efficient than linear probing? And if so, why do we use linear probing at all? Answer 1: I'm surprised that you saw chained
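
The answer is cut off above; the trade-off it was heading toward can be illustrated with a minimal sketch of the two collision strategies. This is Python written for this listing, not code from the thread; the class and method names are illustrative, and resizing and deletion are omitted.

    class ChainedHashTable:
        # Separate chaining: each bucket holds a small list of (key, value) pairs.
        def __init__(self, capacity=8):
            self.buckets = [[] for _ in range(capacity)]

        def put(self, key, value):
            bucket = self.buckets[hash(key) % len(self.buckets)]
            for i, (k, _) in enumerate(bucket):
                if k == key:
                    bucket[i] = (key, value)      # key already present: overwrite
                    return
            bucket.append((key, value))           # collision: the chain grows dynamically

    class LinearProbingHashTable:
        # Open addressing: on a collision, scan forward for the next free slot.
        def __init__(self, capacity=8):
            self.slots = [None] * capacity        # fixed array allocated up front

        def put(self, key, value):
            i = hash(key) % len(self.slots)
            while self.slots[i] is not None and self.slots[i][0] != key:
                i = (i + 1) % len(self.slots)     # probe the next slot (table must not be full)
            self.slots[i] = (key, value)

The usual argument for probing despite the preallocated slot array is locality: every entry lives in one contiguous array, so successive probes tend to hit cache lines that are already loaded, and there is no per-node pointer or allocation overhead as there is with linked-list chains.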

Does List.Insert have any performance penalty?

混江龙づ霸主 submitted on 2019-12-29 06:39:26
Question: Given a list: List<object> SomeList = new List<object>(); does calling SomeList.Insert(i, val); have any performance penalty compared with SomeList.Add(val);? If it does, how does the penalty depend on: - i, the insertion index - SomeList.Count, the size of the list Answer 1: The List class is the generic equivalent of the ArrayList class. It implements the IList generic interface using an array whose size is dynamically increased as required. (source) Meaning that the internal data is stored as an array, and so it is
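
The point the truncated answer is making is that the backing store is a plain array, so Insert has to shift everything after the insertion point. A rough sketch of that shift, written in Python rather than C# since the mechanism is the same for any array-backed list; the function and variable names are illustrative:

    def array_insert(backing, count, i, val):
        # Shift the count - i elements at and after index i one slot to the right,
        # then drop val into the hole. Assumes the backing array has spare capacity.
        for j in range(count, i, -1):   # walk backwards so nothing is overwritten
            backing[j] = backing[j - 1]
        backing[i] = val
        return count + 1                # new logical size

So Add(val) writes straight to index Count and is amortized O(1) (occasionally O(n) when the array has to grow), while Insert(i, val) additionally costs O(Count - i) for the shift: inserting at the front is the worst case, and inserting at index Count is equivalent to Add.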

String concatenation complexity in C++ and Java [duplicate]

╄→гoц情女王★ submitted on 2019-12-29 03:27:06
Question: This question already has answers here: C++ equivalent of StringBuffer/StringBuilder? (10 answers) Closed 6 years ago. Consider this piece of code: public String joinWords(String[] words) { String sentence = ""; for(String w : words) { sentence = sentence + w; } return sentence; } On each concatenation a new copy of the string is created, so the overall complexity is O(n^2). Fortunately in Java we could solve this with a StringBuffer, which has O(1) complexity for each append, then
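
The contrast the question is about can be shown compactly. The sketch below uses Python for concreteness since the pattern is language independent; a list of parts plus one final join plays the role of Java's StringBuffer/StringBuilder (or, per the linked duplicate, a std::string built with += or an std::ostringstream in C++). It is not code from the thread.

    def join_words_quadratic(words):
        sentence = ""
        for w in words:
            sentence = sentence + w     # copies the whole prefix each time: O(n^2) total work
        return sentence

    def join_words_linear(words):
        parts = []                      # acts as the "builder": amortized O(1) per append
        for w in words:
            parts.append(w)
        return "".join(parts)           # one final O(n) pass over all characters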

Why does heap sort have a space complexity of O(1)?

試著忘記壹切 submitted on 2019-12-28 09:13:10
Question: I understand that both quick sort and merge sort need O(n) auxiliary space for the temporary sub-arrays that are constructed, and that in-place quick sort requires O(log n) auxiliary space for the recursive stack frames. But heap sort also seems to have a worst case of O(n) auxiliary space to build the temporary heap, even if the nodes are just pointers to the actual elements. I came across this explanation: Only O(1) additional space is required because the heap is built inside the
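
The key point in the quoted explanation is that no separate heap structure is ever allocated: the input array itself is reinterpreted as a binary heap (the children of index i sit at 2i+1 and 2i+2), so the only extra storage is a handful of index variables. A minimal in-place heap sort sketch in Python, written for this listing rather than taken from the thread:

    def heap_sort(a):
        # Sorts the list a in place; the heap lives inside a itself.
        n = len(a)

        def sift_down(root, end):
            # Restore the max-heap property for the subtree rooted at `root`,
            # considering only indices up to `end`.
            while 2 * root + 1 <= end:
                child = 2 * root + 1
                if child + 1 <= end and a[child] < a[child + 1]:
                    child += 1                      # pick the larger child
                if a[root] < a[child]:
                    a[root], a[child] = a[child], a[root]
                    root = child
                else:
                    return

        for start in range(n // 2 - 1, -1, -1):     # build a max-heap, bottom up
            sift_down(start, n - 1)
        for end in range(n - 1, 0, -1):             # repeatedly move the maximum to the back
            a[0], a[end] = a[end], a[0]
            sift_down(0, end - 1)

Because sift_down is written as a loop rather than a recursion, even the O(log n) stack space of recursive formulations is avoided.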

Finding the most frequent character in a string

北战南征 submitted on 2019-12-28 06:46:37
Question: I found this programming problem while looking at a job posting on SO. I thought it was pretty interesting, and as a beginner Python programmer I attempted to tackle it. However, I feel my solution is quite... messy... can anyone make suggestions to optimize it or make it cleaner? I know it's pretty trivial, but I had fun writing it. Note: Python 2.6 The problem: Write pseudo-code (or actual code) for a function that takes in a string and returns the letter that appears the most in that
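
The question body is cut off above, but one compact way to approach the stated problem looks like the sketch below (written for this listing, not the asker's original attempt). It sticks to a plain dict so it also runs on Python 2.6, which the question targets; counting is O(n) over the string and the final max is O(number of distinct letters):

    def most_frequent_letter(s):
        # Count each alphabetic character (case-insensitive), then pick the max.
        counts = {}
        for c in s.lower():
            if c.isalpha():
                counts[c] = counts.get(c, 0) + 1
        if not counts:
            return None                       # empty or letter-free input
        return max(counts, key=counts.get)    # ties resolve arbitrarily

    # most_frequent_letter("Hello world")  ->  'l'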

T(n) = c1 T(n/a) + c2 T(n/b) + f(n)

会有一股神秘感。 submitted on 2019-12-28 04:37:06
Question: Example: T(n) = T(n/3) + T(n/4) + 3n. Is this solvable with the iterative master theorem or a recursion tree? Can someone solve it analytically to show how it's done? Answer 1: We can expand T(n) with a binomial summation (after some steps; this can be proven by induction) for some depth of expansion/recursion k. Where do we terminate? When the parameters to all instances of f(n) reach a certain threshold C. Thus the maximum depth of expansion: we choose the smaller of a and b, because the parameter with
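
The expansion in the answer is truncated, but the standard guess-and-verify (substitution) argument reaches a closed answer in a few lines: since 1/3 + 1/4 = 7/12 < 1, the work shrinks geometrically from level to level and the 3n term at the root dominates, giving T(n) = Θ(n). A sketch of the inductive step in LaTeX (this is the usual textbook argument, not the exact derivation from the truncated answer):

    % Guess T(n) <= c n for a constant c and substitute into the recurrence:
    T(n) = T(n/3) + T(n/4) + 3n
         \le \frac{cn}{3} + \frac{cn}{4} + 3n
         = \left(\frac{7}{12}\,c + 3\right) n
         \le c\,n \qquad \text{for any } c \ge \frac{36}{5}.

Together with the trivial lower bound T(n) >= 3n, this gives T(n) = Θ(n). The same answer falls out of Akra–Bazzi: the critical exponent p solving (1/3)^p + (1/4)^p = 1 is strictly less than 1, so the f(n) = 3n term dominates.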

Python collections.Counter: most_common complexity

孤街浪徒 submitted on 2019-12-28 02:51:04
Question: What is the complexity of the most_common function provided by the collections.Counter object in Python? More specifically, is Counter keeping some kind of sorted list while it's counting, allowing it to perform the most_common operation faster than O(n), where n is the number of (unique) items added to the counter? For your information, I am processing a large amount of text data trying to find the n-th most frequent tokens. I checked the official documentation and the TimeComplexity article
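
The truncated question is effectively settled by how CPython implements most_common: Counter keeps nothing sorted while counting (it is a plain dict subclass), most_common() with no argument sorts all items (roughly O(m log m) for m distinct keys), and most_common(k) goes through heapq.nlargest (roughly O(m log k)). For the "n-th most frequent tokens" use case that suggests something like the sketch below; tokens and k are assumed inputs, not names from the thread:

    from collections import Counter

    def top_k_tokens(tokens, k):
        counts = Counter(tokens)        # O(total tokens): plain dict increments, nothing kept sorted
        return counts.most_common(k)    # heap-based selection of the k largest counts

    # top_k_tokens("the cat sat on the mat the end".split(), 2)
    # -> [('the', 3), ('cat', 1)]   (entries after the top one break ties arbitrarily)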

What is the complexity of this simple piece of code?

ⅰ亾dé卋堺 submitted on 2019-12-27 22:13:05
Question: I'm pasting this text from an ebook I have. It says the complexity is O(n^2) and also gives an explanation for it, but I fail to see how. Question: What is the running time of this code? public String makeSentence(String[] words) { StringBuffer sentence = new StringBuffer(); for (String w : words) sentence.append(w); return sentence.toString(); } The answer the book gave: O(n^2), where n is the number of letters in sentence. Here's why: each time you append a string to sentence, you create
