complexity-theory

Analyzing worst-case order of growth

你。 posted on 2019-12-24 13:33:12

Question: I'm trying to analyze the worst-case order of growth as a function of N for this algorithm:

for (int i = N*N; i > 1; i = i/2)
    for (int j = 0; j < i; j++) { total++; }

What I'm trying to do is analyze how many times the line total++ will run by looking at the inner and outer loops. The inner loop should run (N^2)/2 times. The outer loop I don't know. Could anyone point me in the right direction?

Answer 1: The statement total++; runs the following number of times:

N^2 + N^2/2 + N^2/4 + ... + N^2/2^k

This geometric series sums to less than 2*N^2, so the total is O(N^2).
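The series in the answer can be checked empirically with a short sketch (in Python, for brevity) that counts how many times total++ executes and compares against the 2*N^2 bound:

```python
# Count executions of total++ in:
#   for (i = N*N; i > 1; i = i/2) for (j = 0; j < i; j++) total++;
def count_total(N):
    total = 0
    i = N * N
    while i > 1:
        for j in range(i):
            total += 1
        i //= 2
    return total

for N in (4, 16, 64):
    t = count_total(N)
    print(N, t, t / (N * N))  # the ratio approaches 2, confirming O(N^2)
```

For N = 16, the count is 256 + 128 + ... + 2 = 510, just under 2 * 256.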

Does binary search have logarithmic performance on the C++ deque data structure?

隐身守侯 posted on 2019-12-24 11:06:38

Question: The standard says that std::binary_search(...) and the two related functions std::lower_bound(...) and std::upper_bound(...) are O(log n) if the data structure has random access. So, given that, I presume these algorithms have O(log n) performance on std::deque (assuming its contents are kept sorted by the user). However, the internal representation of std::deque is tricky (it's broken into chunks), so I was wondering: does the O(log n) requirement hold for std::deque?
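As a cross-language illustration (Python rather than C++): binary search needs only O(log n) element accesses on any random-access sequence, no matter how the storage is chunked internally. This sketch wraps a list to count accesses:

```python
import bisect

# Wrapper that counts element accesses, standing in for any random-access
# container (chunked or not) -- binary search only cares about O(1) indexing.
class CountingList:
    def __init__(self, data):
        self.data = data
        self.accesses = 0

    def __getitem__(self, idx):
        self.accesses += 1
        return self.data[idx]

    def __len__(self):
        return len(self.data)

seq = CountingList(list(range(1_000_000)))
pos = bisect.bisect_left(seq, 123456)
print(pos, seq.accesses)  # roughly 20 accesses for a million elements
```

About log2(1,000,000) ≈ 20 accesses suffice, regardless of the container's internal layout.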

Optimize python file comparison script

久未见 posted on 2019-12-24 08:24:38

Question: I have written a script which works, but I'm guessing it isn't the most efficient. What I need to do is the following:

Compare two CSV files that contain user information. It's essentially a member list where one file is a more updated version of the other. The files contain data such as ID, name, status, etc.

Write to a third CSV file ONLY the records in the new file that either don't exist in the older file or contain updated information. For each record, there is a unique ID that allows …
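A common way to make this O(n) is to index the old file by its unique ID in a dict, then make a single pass over the new file. A minimal sketch, with hypothetical column names (the real files' columns aren't shown in the excerpt):

```python
import csv
import io

# Hypothetical sample data standing in for the two member-list files.
old_csv = "id,name,status\n1,Alice,active\n2,Bob,inactive\n"
new_csv = "id,name,status\n1,Alice,active\n2,Bob,active\n3,Carol,active\n"

# Index the old file by ID once, then one pass over the new file finds
# records that are new or changed -- no nested scans.
old_rows = {row["id"]: row for row in csv.DictReader(io.StringIO(old_csv))}
changed = [row for row in csv.DictReader(io.StringIO(new_csv))
           if old_rows.get(row["id"]) != row]
print(changed)  # the row for id 2 (status changed) and id 3 (new record)
```

The same structure works with open files in place of the io.StringIO stand-ins, writing `changed` out with csv.DictWriter.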

Why is the time complexity O(n^2) in this code?

蹲街弑〆低调 posted on 2019-12-24 08:18:04

Question: I just don't get it: why is the time complexity O(n^2) instead of O(n*logn)? The second loop increments by 2 each time, so isn't it O(logn)?

void f3(int n){
    int i, j, s = 100;
    int* ar = (int*)malloc(s * sizeof(int));
    for (i = 0; i < n; i++){
        s = 0;
        for (j = 0; j < n; j += 2){
            s += j;
            printf("%d\n", s);
        }
    }
    free(ar);
}

Answer 1: By incrementing by two rather than one, you're doing N*N*(1/2) iterations. With big-O notation, you don't care about the constant, so it's still N*N. This is because big-O notation ignores constant factors; the inner loop would only be O(log n) if j were multiplied each iteration (e.g. j *= 2) rather than incremented by 2.
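The difference the answer describes is easy to see by counting iterations for the additive step actually in the code versus a hypothetical multiplicative variant:

```python
# j += 2 covers 0..n in n/2 steps: still linear, just a smaller constant.
def additive(n):
    count, j = 0, 0
    while j < n:
        count += 1
        j += 2
    return count

# j *= 2 doubles each time: this is what actually produces O(log n).
def multiplicative(n):
    count, j = 0, 1
    while j < n:
        count += 1
        j *= 2
    return count

print(additive(1000), multiplicative(1000))  # 500 vs 10
```

With the additive inner loop, the nested loops do n * n/2 iterations total, hence O(n^2).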

How to determine the complexity of an algorithm function?

妖精的绣舞 posted on 2019-12-24 05:33:36

Question: How do you know if an algorithm takes linear/constant/logarithmic time for a specific operation? Does it depend on CPU cycles?

Answer 1: There are (at least) three ways you can do it:

Look up the algorithm on the net and see what it says about its time complexity.

Examine the algorithm yourself to look at things like nested loops and recursion conditions, and how often each loop runs or each recursion is done, based on the input size. An extension of this is a rigorous mathematical …
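A practical complement to the approaches in the answer (not quoted from it): count the algorithm's basic operations at doubling input sizes and look at the growth ratio. Roughly, a ratio of ~2 suggests linear, ~4 suggests quadratic, and a ratio barely above 2 hints at n log n:

```python
# A deliberately quadratic function, instrumented by counting operations
# rather than timing (counts are deterministic; wall-clock timings are noisy).
def count_ops(n):
    ops = 0
    for i in range(n):
        for j in range(n):
            ops += 1
    return ops

for n in (100, 200, 400):
    ratio = count_ops(2 * n) / count_ops(n)
    print(n, count_ops(n), ratio)  # ratio is 4.0 at every size => O(n^2)
```

This doesn't depend on CPU cycles at all: the count reflects the algorithm's growth, independent of hardware speed.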

Does Sonar have computational complexity measuring capabilities?

点点圈 posted on 2019-12-24 03:23:41

Question: There is a question asking how to measure the computational complexity of Java code, similar to what trend-prof does for C/C++. The accepted answer to that question says it is commonly done using Sonar. I know Sonar has good cyclomatic-complexity capabilities built in, but that is not what the question was asking for. Does Sonar in fact have computational-complexity measuring capabilities? If so, any pointers to details on setting it up would be great.

Answer 1: No, there aren't any similar tools …

What is the running time of the translation of infix to postfix using queue and stack?

有些话、适合烂在心里 posted on 2019-12-24 00:57:46

Question: In C++ ... I know the time complexities of the individual queue and stack operations, but I don't know what the time complexity of an infixToPostfix function using both a queue and a stack would be. I am a beginner programmer, of course, and I am very confused.

Answer 1: Converting from infix to postfix using a stack and a queue is, I assume, Dijkstra's shunting-yard algorithm. One way to measure the complexity is to think about how many pushes and pops are done: every number is pushed exactly once …
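The push/pop counting argument can be sketched in Python (a cut-down, hypothetical version handling single-digit operands, + - * /, and parentheses). Each token is pushed to and popped from the operator stack at most once, so the total work is O(n):

```python
# Minimal shunting-yard sketch that also counts operator-stack pushes,
# illustrating why the conversion is linear in the number of tokens.
def infix_to_postfix(expr):
    prec = {'+': 1, '-': 1, '*': 2, '/': 2}
    out, stack, pushes = [], [], 0
    for tok in expr:
        if tok.isdigit():
            out.append(tok)            # operands go straight to the output
        elif tok == '(':
            stack.append(tok); pushes += 1
        elif tok == ')':
            while stack[-1] != '(':
                out.append(stack.pop())
            stack.pop()                # discard the '('
        else:
            # pop operators of equal or higher precedence, then push tok
            while stack and stack[-1] != '(' and prec[stack[-1]] >= prec[tok]:
                out.append(stack.pop())
            stack.append(tok); pushes += 1
    while stack:
        out.append(stack.pop())
    return ''.join(out), pushes

print(infix_to_postfix("3+4*2"))  # ('342*+', 2)
```

Since pushes (and therefore pops) are bounded by the number of tokens, the whole conversion runs in O(n) for n tokens.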

Trouble with nested for-loop running time

旧巷老猫 posted on 2019-12-24 00:54:41

Question: I have been thinking about this problem for a few days now and am hung up on calculating the number of times the second nested for-loop will run. I believe I have the correct formula for determining the running time of the other two for-loops, but this third one has me stuck. The first loop runs n-1 times. The formula for the number of times loop #2 runs is the summation of 1 to n-1. If anyone could help me understand how to find the number of times loop #3 runs, it …
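The loops themselves aren't shown in the excerpt, but the counting technique for loop #2 can be checked with an instrumented sketch: the summation 1 + 2 + ... + (n-1) equals n(n-1)/2, and a counter variable confirms it.

```python
# Hypothetical loop structure matching the description: the outer loop runs
# n-1 times, and on pass i the inner loop runs i times.
def inner_runs(n):
    count = 0
    for i in range(1, n):      # outer loop: n-1 iterations
        for j in range(i):     # inner loop: i iterations on pass i
            count += 1
    return count

n = 10
print(inner_runs(n), n * (n - 1) // 2)  # both print 45
```

The same instrumentation trick (increment a counter in the innermost body, compare against a candidate formula) works for loop #3 once its bounds are written down.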

Time complexity of a recursive function with two calls

China☆狼群 posted on 2019-12-24 00:42:13

Question: Consider this code:

def count_7(lst):
    if len(lst) == 1:
        if lst[0] == 7:
            return 1
        else:
            return 0
    return count_7(lst[:len(lst)//2]) + count_7(lst[len(lst)//2:])

Note: the slicing operations are considered O(1). My intuition tells me it's O(n*logn), but I'm struggling to prove it rigorously. I'd be glad for help!

Answer 1: OK, mathematically (sort of ;) I get something like this:

T(n) = 2T(n/2) + c
T(1) = 1

Generalizing the equation:

T(n) = 2^k * T(n/2^k) + (2^k - 1) * c

Setting n/2^k = 1 gives 2^k = n, so T(n) = n * T(1) + (n - 1) * c, which is O(n).
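The recurrence's conclusion can be checked directly by counting calls: for n a power of two there are 2n - 1 calls, each doing constant work (under the stated assumption that slicing is O(1)), so the total is O(n), not O(n log n):

```python
# Instrumented version of the question's function: count recursive calls.
calls = 0

def count_7(lst):
    global calls
    calls += 1
    if len(lst) == 1:
        return 1 if lst[0] == 7 else 0
    mid = len(lst) // 2
    return count_7(lst[:mid]) + count_7(lst[mid:])

result = count_7([7, 1, 7, 7, 2, 7, 3, 4])
print(result, calls)  # 4 sevens found, in 15 calls (2*8 - 1)
```

The intuition trap: there are log n levels, but the per-level work is constant per node, and the node count across all levels is 2n - 1, which is linear.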

complexity compression string

ぐ巨炮叔叔 posted on 2019-12-23 18:33:53

Question: I have a large theoretical string database generation program (strings 104 characters long) whose results measure in petabytes. I don't have that much computing power, so I would like to filter the low-complexity strings out of the database. My grammar is a modified form of the English alphabet with no numerical characters. I read about Kolmogorov complexity and how it is theoretically impossible to calculate, but I just need something basic in C# using compression. Using these two links: How to …
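The compression idea the question points at can be sketched quickly (in Python with zlib here, though the same applies to C#'s DeflateStream): use the compressed size relative to the original as a rough stand-in for Kolmogorov complexity. Repetitive, low-complexity strings compress far better than high-entropy ones, so a ratio threshold can act as the filter. The 0.5 cutoff below is an arbitrary illustration, not a recommendation.

```python
import random
import string
import zlib

# Compressed-size ratio as a crude complexity estimate: lower means the
# string is more compressible, i.e. lower complexity.
def compression_ratio(s):
    raw = s.encode("ascii")
    return len(zlib.compress(raw, 9)) / len(raw)

low = "ab" * 52  # 104 characters, highly repetitive

random.seed(42)  # fixed seed so the example is reproducible
high = "".join(random.choices(string.ascii_lowercase, k=104))  # high entropy

print(round(compression_ratio(low), 2), round(compression_ratio(high), 2))
keep = [s for s in (low, high) if compression_ratio(s) >= 0.5]
```

Note that for strings this short, the compressor's fixed header overhead inflates every ratio a little, so thresholds should be calibrated on real samples.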