asymptotic-complexity

Asymptotic comparison of functions

Submitted by 限于喜欢 on 2020-01-06 08:33:52

Question: I want to compare the following functions asymptotically and then arrange them in ascending order; a proper explanation is also requested: lg((√n)!), lg(√(n!)), √(lg(n!)), (lg(√n))!, (√(lg n))!, √(lg n)!

Answer 1: If you wonder about a "general solution" and you do a lot of asymptotic function comparisons, here is what I recommend: use the limit definition of Big-O notation. Once you know that f(n) = O(g(n)) iff the limit of f(n)/g(n) as n approaches +∞ exists and is not …
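As a worked sketch of my own (not part of the original answer), the limit test combines with Stirling's approximation, lg(m!) = Θ(m lg m), to rank the three non-factorial terms directly:

```latex
\lg\big((\sqrt{n})!\big) = \Theta\big(\sqrt{n}\,\lg\sqrt{n}\big) = \Theta\big(\sqrt{n}\,\lg n\big),
\qquad
\lg\big(\sqrt{n!}\big) = \tfrac{1}{2}\lg(n!) = \Theta(n \lg n),
\qquad
\sqrt{\lg(n!)} = \Theta\big(\sqrt{n \lg n}\big).

\text{Limit test: }
\lim_{n\to\infty} \frac{\sqrt{n \lg n}}{\sqrt{n}\,\lg n}
= \lim_{n\to\infty} \frac{1}{\sqrt{\lg n}} = 0,
\quad\text{so}\quad
\sqrt{\lg(n!)} = o\Big(\lg\big((\sqrt{n})!\big)\Big)
\quad\text{and likewise}\quad
\lg\big((\sqrt{n})!\big) = o\Big(\lg\big(\sqrt{n!}\big)\Big).
```

The factorial-of-a-logarithm terms grow super-polynomially and need a separate comparison by the same limit technique.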

HashMap vs. ArrayList insertion performance confusion

Submitted by 只谈情不闲聊 on 2020-01-04 09:14:11

Question: From my understanding, a HashMap insertion is O(1) and an ArrayList insertion is O(n), since the HashMap's hash function computes the hash code and index and inserts the entry, while an ArrayList does a comparison every time it inserts a new element.

Answer 1: Firstly, an operation of complexity O(1) does not always take less time than an operation of complexity O(n). O(1) only means that the operation takes constant time (which could be any value), regardless of the size of the input. …
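A small counting sketch (class and method names are my own) of the premise in the question: appending to an ArrayList is itself amortized O(1); the O(n) cost only appears when each insert is guarded by a linear duplicate scan, which a hash-based container avoids.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class InsertionDemo {

    // Insert 0..n-1 into an ArrayList, scanning for duplicates before each
    // append, and return the total number of comparisons performed.
    // The scan is what costs O(n) per insert; list.add() itself is
    // amortized O(1).
    static long fillAndCount(int n) {
        List<Integer> list = new ArrayList<>();
        long comparisons = 0;
        for (int v = 0; v < n; v++) {
            boolean present = false;
            for (int existing : list) {   // linear duplicate scan
                comparisons++;
                if (existing == v) { present = true; break; }
            }
            if (!present) list.add(v);
        }
        return comparisons;               // n(n-1)/2 for distinct inputs
    }

    public static void main(String[] args) {
        // A HashSet answers the same membership question in expected O(1)
        // per insert via the hash code, with no scan at all.
        Set<Integer> set = new HashSet<>();
        for (int v = 0; v < 1000; v++) set.add(v);

        System.out.println(fillAndCount(1000)); // 499500 comparisons
        System.out.println(set.size());         // 1000
    }
}
```

With distinct inputs the scans sum to 0 + 1 + … + (n−1) = n(n−1)/2, which is the quadratic total behind the "O(n) per insert" intuition.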

Tricky Big-O complexity

Submitted by 天大地大妈咪最大 on 2020-01-03 10:02:15

Question:

    public void foo(int n, int m) {
        int i = m;
        while (i > 100) {
            i = i / 3;
        }
        for (int k = i; k >= 0; k--) {
            for (int j = 1; j < n; j *= 2) {
                System.out.print(k + "\t" + j);
            }
            System.out.println();
        }
    }

I figured the complexity would be O(log n): that is the contribution of the inner loop, and the outer loop will never execute more than 100 times, so it can be omitted. What I'm not sure about is the while clause: should it be incorporated into the Big-O complexity? For very large i values it could …
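For what it's worth, a worked bound of my own that does account for the while clause: the division loop runs Θ(log₃ m) times before i drops to at most 100, so it contributes its own term even though the loops after it are bounded by a constant times the inner loop's cost:

```latex
\underbrace{\Theta(\log_3 m)}_{\text{while loop}}
\;+\;
\underbrace{(i+1)}_{\le\,101}\cdot\underbrace{\Theta(\log_2 n)}_{\text{inner loop}}
\;=\; \Theta(\log m + \log n).
```

So O(log n) is correct only if m is treated as a constant; as a function of both parameters the answer is O(log m + log n).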

Time complexity of the program using recurrence equation

Submitted by 蓝咒 on 2020-01-03 07:01:07

Question: I want to find the time complexity of this program using recurrence equations. That is:

    int f(int x) {
        if (x < 1) return 1;
        else return f(x - 1) + g(x);
    }

    int g(int x) {
        if (x < 2) return 1;
        else return f(x - 1) + g(x / 2);
    }

I wrote its recurrence equation and tried to solve it, but it keeps getting more complex:

    T(n) = T(n-1) + g(n) + c
         = T(n-2) + g(n-1) + g(n) + 2c
         = T(n-3) + g(n-2) + g(n-1) + g(n) + 3c
         = T(n-4) + g(n-3) + g(n-2) + g(n-1) + g(n) + 4c
         ...
    after the kth step:
         = kc + g(n) + g(n-1) + ... + g(n-k+1) + T(n-k)
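One way to see where this is heading (a sketch of my own, not part of the original question): substituting g's recurrence into T(n) = T(n−1) + G(n) + c gives G(n) ≥ T(n−1), hence T(n) ≥ 2T(n−1), i.e. exponential growth. A call counter makes that visible for small inputs:

```java
public class MutualRecursion {
    static long calls;

    // The two mutually recursive functions from the question,
    // instrumented with a call counter.
    static long f(long x) {
        calls++;
        if (x < 1) return 1;
        return f(x - 1) + g(x);
    }

    static long g(long x) {
        calls++;
        if (x < 2) return 1;
        return f(x - 1) + g(x / 2);
    }

    // Total number of calls triggered by f(n).
    static long callsFor(long n) {
        calls = 0;
        f(n);
        return calls;
    }

    public static void main(String[] args) {
        // Each increment of n more than doubles the work: exponential.
        for (int n = 1; n <= 15; n++) {
            System.out.println(n + " -> " + callsFor(n));
        }
    }
}
```

The counts more than double at every step, consistent with the T(n) ≥ 2T(n−1) lower bound, so no polynomial closed form should be expected from the unrolling.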

Big O of clojure library functions

Submitted by 时间秒杀一切 on 2020-01-01 10:01:42

Question: Can anyone point me to a resource that lists the Big-O complexity of basic Clojure library functions such as conj, cons, etc.? I know that Big-O will vary depending on the type of the input, but still, is such a resource available? I feel uncomfortable coding something without having a rough idea of how quickly it will run.

Answer 1: Here is a table composed by John Jacobsen and taken from this discussion: [table not reproduced in this excerpt]

Answer 2: Late to the party here, but I found the link in the comments of the first answer to be …

Algorithms with O(n/log(n)) complexity

Submitted by 点点圈 on 2019-12-25 14:29:14

Question: Are there any famous algorithms with this complexity? I was thinking of a skip list where the levels of the nodes are determined not by the number of tails in a sequence of coin tosses, but by a number generated uniformly at random from the interval (1, log(n)). Such a data structure would have a find(x) operation with complexity O(n/log(n)) (I think, at least). I was curious whether there was anything else.

Answer 1: It's common to see algorithms …

Difference between solving T(n) = 2T(n/2) + n/log n and T(n) = 4T(n/2) + n/log n using Master Method

Submitted by 蹲街弑〆低调 on 2019-12-24 02:23:26

Question: I recently stumbled upon a resource where recurrences of the 2T(n/2) + n/log n type were declared unsolvable by the Master Method. I accepted this as a lemma, until today, when another resource proved to be a contradiction (in some sense). As per the resource (link below), Q7 and Q18 in it are recurrences 1 and 2 of this question, respectively. The answer to Q7 says it can't be solved, giving the reason "polynomial difference b/w f(n) and n^(log_b a)". On the contrary, answer 18 solves the second …
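For concreteness, here is how the two recurrences separate (my own summary of the standard treatment): with a = 4, b = 2 the driving function is polynomially smaller than n^(log_b a) = n², so Case 1 applies; with a = 2 it differs from n only by a log factor, so no case of the basic theorem applies, but direct unfolding still yields a closed form:

```latex
T(n) = 4T(n/2) + \frac{n}{\lg n}:\quad
n^{\log_2 4} = n^2,\;\;
\frac{n}{\lg n} = O\big(n^{2-\varepsilon}\big)\ (\varepsilon = 1)
\;\Longrightarrow\; T(n) = \Theta(n^2).

T(n) = 2T(n/2) + \frac{n}{\lg n}:\quad
\frac{n^{\log_2 2}}{n/\lg n} = \lg n \neq \Omega(n^{\varepsilon}),
\text{ so the Master Method is silent; unfolding,}

T(n) = \sum_{i=0}^{\lg n - 1} 2^i \cdot \frac{n/2^i}{\lg(n/2^i)}
     = n \sum_{k=1}^{\lg n} \frac{1}{k}
     = \Theta(n \lg\lg n).
```

The harmonic sum up to lg n is Θ(lg lg n), which is why the first recurrence sits strictly between the Θ(n) and Θ(n lg n) answers the basic cases would offer.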

Recurrence Relation T(n) = T(n^(1/2)) + T(n-n^(1/2)) + n

Submitted by 半城伤御伤魂 on 2019-12-24 00:54:48

Question: My friend and I found this problem and cannot figure out how to solve it. It is not trivial, and the standard substitution method does not really work (or we cannot apply it correctly). This should be the quicksort-with-pivot-at-a-given-rank problem. Here is the recurrence:

    T(n) = T(n^(1/2)) + T(n - n^(1/2)) + n

Any help would be much appreciated. Thanks!

Answer 1: First take the easy part: T(n) = T(n - n^(1/2)) + n. The number of iterations is on the order of n^(1/2); in each iteration you'll have n - k·sqrt(n) time complexity, so the total …
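As a sanity check of my own (the base case T(1) = 1 and the floor applied to n^(1/2) are assumptions, since the question does not fix them), the recurrence can be evaluated numerically with memoization; the values grow consistently with the known Θ(n log log n) answer for recurrences of this shape:

```java
import java.util.HashMap;
import java.util.Map;

public class RankPivotRecurrence {
    static final Map<Long, Long> memo = new HashMap<>();

    // T(n) = T(floor(sqrt(n))) + T(n - floor(sqrt(n))) + n, with T(1) = 1.
    // Memoization keeps the evaluation near-linear in practice.
    static long t(long n) {
        if (n <= 1) return 1;
        Long cached = memo.get(n);
        if (cached != null) return cached;
        long r = (long) Math.sqrt(n);
        long value = t(r) + t(n - r) + n;
        memo.put(n, value);
        return value;
    }

    public static void main(String[] args) {
        // If T(n) = Theta(n log log n), this ratio should stay bounded.
        for (long n = 1L << 10; n <= 1L << 20; n <<= 2) {
            double ratio = t(n) / (n * Math.log(Math.log(n)));
            System.out.println(n + "  T(n)/(n ln ln n) = " + ratio);
        }
    }
}
```

Small values are easy to verify by hand: T(2) = T(1) + T(1) + 2 = 4, T(3) = T(1) + T(2) + 3 = 8, T(4) = T(2) + T(2) + 4 = 12.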