time-complexity

Time and Space Complexity of list to str conversion in Python

落花浮王杯 submitted on 2019-12-23 23:18:29
Question: Trying to find out the time complexity of converting a list to a string, str([1,2,6,...,3,6]). Pretty sure it's O(1), but not sure how to verify it. Edit: about space complexity, that should not be linear in the list size; thinking O(1) because a string has a max size.

Answer 1: It's linear, because bigger lists need more time and memory to convert. Graph generated using perfplot. Code, for reference:

    import numpy as np
    import perfplot
    perfplot.show(
        setup=lambda n: np.random.choice(100, n).tolist(),
        kernels=[
            lambda …
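The snippet above is cut off in this excerpt; a minimal, runnable perfplot sketch along the same lines (the n_range, label, and equality_check choices here are my own assumptions, not the original answer's exact values) could look like this:

    import numpy as np
    import perfplot

    perfplot.show(
        setup=lambda n: np.random.choice(100, n).tolist(),  # random list of length n
        kernels=[str],                                      # the conversion being timed
        labels=["str(list)"],
        n_range=[2 ** k for k in range(18)],
        xlabel="len(list)",
        equality_check=None,  # outputs are strings; no need to compare them
    )

The resulting curve grows roughly linearly in len(list), which matches the answer: every element has to be formatted and copied into the output string, so both time and space are O(n).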

Best case time complexity for selection sort

假如想象 submitted on 2019-12-23 22:27:15
Question: Why is the best-case time complexity of selection sort O(n^2) when it is O(n) for insertion sort and bubble sort? Their average-case times are the same. I don't understand why the best-case times are different. Would appreciate some help.

Answer 1: For selection sort you have to search for the minimum and put it in the first place in the first iteration. In the second iteration you have to search for the minimum in the non-sorted part of the array and put it in the second place, and so on... You only know which …
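A small sketch that makes the difference concrete by counting comparisons on an already-sorted input (the function names are my own, for illustration):

    def selection_sort_comparisons(a):
        # Always scans the whole unsorted suffix to find the minimum,
        # regardless of whether the input is already sorted.
        a, comps, n = list(a), 0, len(a)
        for i in range(n - 1):
            m = i
            for j in range(i + 1, n):
                comps += 1
                if a[j] < a[m]:
                    m = j
            a[i], a[m] = a[m], a[i]
        return comps

    def insertion_sort_comparisons(a):
        # On sorted input the inner loop stops after a single comparison.
        a, comps = list(a), 0
        for i in range(1, len(a)):
            key, j = a[i], i - 1
            while j >= 0:
                comps += 1
                if a[j] > key:
                    a[j + 1] = a[j]
                    j -= 1
                else:
                    break
            a[j + 1] = key
        return comps

    sorted_input = list(range(100))
    print(selection_sort_comparisons(sorted_input))  # 4950 = n(n-1)/2, even when sorted
    print(insertion_sort_comparisons(sorted_input))  # 99   = n-1 on sorted input

Selection sort cannot tell that the remainder is already sorted without scanning it, so its comparison count is Θ(n^2) in every case; insertion sort's inner loop exits immediately on sorted input, giving its O(n) best case.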

Regex crossword solver

回眸只為那壹抹淺笑 submitted on 2019-12-23 22:04:10
Question: Following "Find string to regular expression programmatically?", we assume that it takes linear time to find a string that matches a regex. My intuition says that we can solve a regex crossword programmatically too, right? If yes, what will be the time complexity of solving an NxM regex crossword? Example:

Answer 1: It's NP-hard, even if you disallow backreferences. There's a simple mapping from the exact set cover problem to this problem. If you have sets S[1], S[2], ..., S[n] (with union S), and …
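For intuition on the search space (this is my own illustration, not the answer's set-cover reduction): a naive solver has to consider every possible grid over the alphabet, which is exponential in the number of cells.

    import itertools
    import re

    def solve_regex_crossword(row_patterns, col_patterns, alphabet="ABC"):
        # Brute force: try every grid and return the first one whose rows and
        # columns all match. The loop runs |alphabet|**(rows*cols) times.
        rows, cols = len(row_patterns), len(col_patterns)
        row_res = [re.compile(p) for p in row_patterns]
        col_res = [re.compile(p) for p in col_patterns]
        for cells in itertools.product(alphabet, repeat=rows * cols):
            grid = [cells[r * cols:(r + 1) * cols] for r in range(rows)]
            rows_ok = all(row_res[r].fullmatch("".join(grid[r])) for r in range(rows))
            cols_ok = all(col_res[c].fullmatch("".join(grid[r][c] for r in range(rows)))
                          for c in range(cols))
            if rows_ok and cols_ok:
                return ["".join(row) for row in grid]
        return None

    # A tiny 2x3 example (patterns invented for illustration): prints ['AAB', 'BAC'].
    print(solve_regex_crossword(["A+B", "B[AC]C"], ["AB", "[AB]+", "BC"]))

The NP-hardness result says that, in general, no algorithm is known to do fundamentally better than this kind of exponential search.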

Running time of minimum spanning tree? (Prim method)

大憨熊 submitted on 2019-12-23 20:13:01
Question: I have written code that solves MST using Prim's method. I read that this kind of implementation (using a priority queue) should have O(E + V log V) = O(V log V), where E is the number of edges and V the number of vertices, but when I look at my code it simply doesn't look that way. I would appreciate it if someone could clear this up for me. To me it seems the running time is this: the while loop takes O(E) time (until we go through all the edges); inside that loop we extract an element from the Q, which takes …
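The questioner's code is not shown in this excerpt; for reference, here is a lazy-deletion Prim's sketch with Python's heapq (names and the example graph are my own). With a plain binary heap the bound is O(E log V); the tighter O(E + V log V) bound requires a priority queue with decrease-key, such as a Fibonacci heap.

    import heapq

    def prim_mst_weight(adj, start=0):
        # adj: dict mapping vertex -> list of (weight, neighbour) pairs.
        # Each edge can be pushed once per endpoint, so the heap holds O(E)
        # entries and every push/pop costs O(log E) = O(log V),
        # giving O(E log V) overall.
        visited, total = set(), 0
        heap = [(0, start)]
        while heap and len(visited) < len(adj):
            w, u = heapq.heappop(heap)   # O(log E) extraction
            if u in visited:
                continue                 # stale (lazy-deleted) entry, skip it
            visited.add(u)
            total += w
            for weight, v in adj[u]:
                if v not in visited:
                    heapq.heappush(heap, (weight, v))
        return total

    graph = {
        0: [(1, 1), (4, 2)],
        1: [(1, 0), (2, 2), (6, 3)],
        2: [(4, 0), (2, 1), (3, 3)],
        3: [(6, 1), (3, 2)],
    }
    print(prim_mst_weight(graph))  # 1 + 2 + 3 = 6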

Is JavaScript switch statement linear or constant time?

这一生的挚爱 submitted on 2019-12-23 19:05:08
Question: I have the following JavaScript on my site so that when certain specific searches are performed, the answer is hardcoded to a specific page:

    function redirect() {
        var input = document.getElementById('searchBox').value.toLowerCase();
        switch (input) {
            case 'rectangular':
                window.location.replace('http://www.Example.com/Rectangular/');
                break;
            case 'elephant':
                window.location.replace('http://www.Example.com/Elephants/');
                break;
            case 'coils':
                window.location.replace('http://www.Example.com/Parts/') …
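Whether a switch is linear or constant time depends on how the engine compiles it; the lookup-table alternative is constant time in expectation in any language. A Python sketch of the same redirect logic as a table (Python for consistency with the other sketches here; the JavaScript analogue would be a plain object or Map, and the URLs are just the ones from the question):

    # Hypothetical lookup-table version of the redirect logic. A hash-based
    # dict lookup is expected O(1) on average, whereas a chain of cases that
    # is compiled to sequential comparisons is O(n) in the number of cases.
    REDIRECTS = {
        "rectangular": "http://www.Example.com/Rectangular/",
        "elephant": "http://www.Example.com/Elephants/",
        "coils": "http://www.Example.com/Parts/",
    }

    def redirect(search_term):
        return REDIRECTS.get(search_term.lower())

    print(redirect("Elephant"))  # http://www.Example.com/Elephants/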

Time complexity for Java HashMap resizing

痴心易碎 submitted on 2019-12-23 17:32:31
Question: I am wondering what the time complexity of Java HashMap resizing would be when the load factor exceeds the threshold. As far as I understand, for HashMap the table size is always a power of 2 (an even number), so whenever we resize the table we don't necessarily need to rehash all the keys (correct me if I am wrong); all we need to do is allocate additional space without rehashing and copy over all the entries from the old table (I am not quite sure how the JVM deals with that internally), correct? …
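A small sketch of the power-of-two property the question is alluding to (the hash values below are made up for illustration): when the capacity doubles, each entry either stays at its old bucket index or moves to old index + old capacity, decided by a single extra bit of the hash, so the hash function never needs to be recomputed. Every entry is still visited, though, so one resize is O(n); amortised over n insertions it contributes O(1) per insertion.

    old_cap = 16
    new_cap = old_cap * 2

    for h in [7, 23, 39, 55, 100]:           # example hash codes (assumed values)
        old_index = h & (old_cap - 1)        # index before resizing
        new_index = h & (new_cap - 1)        # index after doubling
        moved = bool(h & old_cap)            # the one extra bit that decides
        assert new_index == old_index + (old_cap if moved else 0)
        print(h, old_index, "->", new_index)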

Analyzing time complexity of a function written in C

家住魔仙堡 submitted on 2019-12-23 17:16:08
Question: I was implementing the Longest Common Subsequence problem in C. I wish to compare the time taken to execute the recursive version of the solution and the dynamic programming version. How can I find the time taken for running the LCS function in both versions for various inputs? Also, can I use SciPy to plot these values on a graph and infer the time complexity? Thanks in advance, Razor

Answer 1: For the second part of your question: the short answer is yes, you can. You need to get the two data sets (one …
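A sketch of the plotting and curve-fitting side with NumPy/Matplotlib (the timing arrays below are synthetic stand-ins; in practice you would fill them with measurements taken around the C calls, e.g. with clock()):

    import numpy as np
    import matplotlib.pyplot as plt

    # Synthetic stand-ins for measured (n, seconds) pairs; replace these with
    # the timings collected from the two C programs.
    n = np.array([10, 12, 14, 16, 18, 20])
    t_dp = 1e-7 * n.astype(float) ** 2     # pretend the DP version scales ~ n^2
    t_recursive = 1e-8 * 2.0 ** n          # pretend the recursive version scales ~ 2^n

    # Fit t ~ c * n^k for the DP version: a straight line in log-log space.
    k, log_c = np.polyfit(np.log(n), np.log(t_dp), 1)
    print(f"DP timings grow roughly like n^{k:.2f}")

    plt.plot(n, t_recursive, "o-", label="recursive")
    plt.plot(n, t_dp, "s-", label="dynamic programming")
    plt.xlabel("input size n")
    plt.ylabel("time (s)")
    plt.yscale("log")
    plt.legend()
    plt.show()

scipy.optimize.curve_fit works equally well if you want to fit a more specific model than a power law.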

Which Big-O grows faster asymptotically

懵懂的女人 submitted on 2019-12-23 14:54:09
Question: I have gotten into an argument/debate recently and I am trying to get a clear verdict on the correct solution. It is well known that n! grows very quickly, but exactly how quickly? Quickly enough to "hide" all additional constants that might be added to it? Let's assume I have this silly and simple program (no particular language):

    for i from 0 to n! do:
        ;  // nothing

Given that the input is n, the complexity of this is obviously O(n!) (or even Θ(n!), but that isn't relevant here). Now let's …
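The excerpt stops before the actual comparison, but the general fact such debates usually hinge on can be checked numerically (a quick illustration of my own, not the accepted answer): a constant factor is absorbed, since c * n! / n! = c for all n, while (n+1)! = (n+1) * n! grows strictly faster than n! because the ratio is unbounded.

    import math

    # 5 * n! / n! is always 5, so 5 * n! is Θ(n!): the constant is absorbed.
    # (n + 1)! / n! = n + 1 grows without bound, so (n + 1)! is NOT O(n!).
    for n in [5, 10, 15, 20]:
        print(n,
              5 * math.factorial(n) // math.factorial(n),
              math.factorial(n + 1) // math.factorial(n))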

Median of medians algorithm: why divide the array into blocks of size 5

丶灬走出姿态 submitted on 2019-12-23 09:39:19
Question: In the median-of-medians algorithm, we need to divide the array into chunks of size 5. I am wondering how the inventors of the algorithm came up with the magic number 5 and not, say, 7, or 9, or something else.

Answer 1: I think that if you check the "Proof of O(n) running time" section of the wiki page for the median-of-medians algorithm: the median-calculating recursive call does not exceed worst-case linear behavior because the list of medians is 20% of the size of the list, while the other …
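A quick numerical check of the recurrence behind that 20% figure (my own sketch): with groups of 5, at least about 3n/10 elements are guaranteed to fall on each side of the chosen pivot, so the recursion satisfies T(n) <= T(n/5) + T(7n/10) + c*n. Since 1/5 + 7/10 = 9/10 < 1, the work shrinks geometrically and T(n) = O(n). With groups of 3 the corresponding fractions sum to 1, and the bound degrades to O(n log n), which is why 5 is the smallest odd group size that works.

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def t5(n):
        # T(n) = T(n/5) + T(7n/10) + n, the groups-of-5 recurrence
        if n <= 1:
            return 1
        return t5(n // 5) + t5(7 * n // 10) + n

    for n in [10**3, 10**4, 10**5, 10**6]:
        print(n, round(t5(n) / n, 2))   # ratio stays bounded (around 10), i.e. Θ(n)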

What is the time complexity of adding n numbers

Deadly submitted on 2019-12-23 06:18:09
Question: If I have to add arbitrary numbers like 1, 12, 14, 71, 83, 21, ..., then what would be the time complexity of this operation? I know that adding two numbers is O(1), but what about a list of n numbers? Assume I'm using the best possible data structure to store them for this purpose, if that can have any impact on the process at all. Thanks in advance!

Answer 1: Adding 2 numbers is O(1) since the operation itself is constant time and the input is fixed. Regardless of the input, the operation …
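The rest of the answer is cut off here, but the standard argument is easy to sketch (assuming fixed-width, machine-word numbers; with arbitrary-precision integers each addition instead costs time proportional to the number of digits):

    def add_all(nums):
        # n - 1 constant-time additions and O(1) extra space -> O(n) time overall.
        total = 0
        for x in nums:
            total += x
        return total

    print(add_all([1, 12, 14, 71, 83, 21]))  # 202

No choice of data structure changes this: every number has to be read at least once, so Ω(n) is also a lower bound.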