complexity-theory

What guarantees are there on the run-time complexity (Big-O) of LINQ methods?

Submitted by 早过忘川 on 2019-11-26 01:44:38
Question: I've recently started using LINQ quite a bit, and I haven't really seen any mention of run-time complexity for any of the LINQ methods. Obviously, there are many factors at play here, so let's restrict the discussion to the plain IEnumerable LINQ-to-Objects provider. Further, let's assume that any Func passed in as a selector / mutator / etc. is a cheap O(1) operation. It seems obvious that all the single-pass operations (Select, Where, Count, Take/Skip, Any/All, etc.) will be O(n)…
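LINQ itself is C#, but the single-pass lazy behavior the question describes can be sketched in Python with generators. The function names here (`select`, `where`) are illustrative stand-ins for the LINQ operators, not a real API:

```python
def select(source, f):
    # Lazy projection: yields f(x) for each element; one pass, O(n) total.
    for x in source:
        yield f(x)

def where(source, pred):
    # Lazy filter: also a single pass, O(n) total.
    for x in source:
        if pred(x):
            yield x

# Chaining lazy stages still makes only ONE pass over the source:
data = range(10)
result = list(select(where(data, lambda x: x % 2 == 0), lambda x: x * x))
print(result)  # squares of the even numbers 0..8
```

Because each stage pulls one element at a time from the previous one, a pipeline of k cheap operators over n elements costs O(k * n), not O(k) separate passes that each materialize an intermediate list.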

Cost of len() function

Submitted by 白昼怎懂夜的黑 on 2019-11-26 00:46:37
Question: What is the cost of the len() function for Python built-ins (list/tuple/string/dictionary)? Answer 1: It's O(1) (constant time, not depending on the actual length of the element - very fast) on every type you've mentioned, plus set and others such as array.array. Answer 2: Calling len() on those data types is O(1) in CPython, the most common implementation of the Python language. Here's a link to a table that provides the algorithmic complexity of many different functions in CPython: TimeComplexity…
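The reason len() is O(1) is that CPython containers store their element count in a field, so len() just reads it rather than walking the container. A quick check across the types mentioned:

```python
# len() reads a stored size field, so each call below is constant time
# regardless of how large the container is.
samples = [
    [1, 2, 3],          # list
    (1, 2, 3),          # tuple
    "abc",              # str
    {"a": 1, "b": 2},   # dict
    {1, 2, 3},          # set
]
sizes = [len(s) for s in samples]
print(sizes)
```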

Time complexity of nested for-loop

Submitted by 我与影子孤独终老i on 2019-11-25 23:44:42
Question: I need to calculate the time complexity of the following code: for (i = 1; i <= n; i++) { for (j = 1; j <= i; j++) { // Some code } } Is it O(n^2)? Answer 1: Yes, nested loops are one way to quickly get a big O notation. Typically (but not always) one loop nested in another will cause O(n²). Think about it: the inner loop is executed i times, for each value of i. The outer loop is executed n times. Thus you see a pattern of execution like this: 1 + 2 + 3 + 4 + ... + n. Therefore, we can…
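The sum 1 + 2 + ... + n equals n(n+1)/2, which is O(n²). A sketch that counts the inner-loop executions and checks them against that closed form:

```python
def inner_iterations(n):
    # Count how many times the innermost body of the nested loop runs.
    count = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            count += 1
    return count

n = 100
total = inner_iterations(n)
# 1 + 2 + ... + n = n(n+1)/2, a quadratic function of n -> O(n^2).
assert total == n * (n + 1) // 2
print(total)  # 5050 for n = 100
```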

HashMap get/put complexity

Submitted by 心不动则不痛 on 2019-11-25 23:28:28
Question: We are used to saying that HashMap get/put operations are O(1). However, it depends on the hash implementation. The default object hash is actually the internal address in the JVM heap. Are we sure it is good enough to claim that get/put are O(1)? Available memory is another issue. As I understand from the javadocs, the HashMap load factor should be 0.75. What if we do not have enough memory in the JVM and the load factor exceeds the limit? So, it looks like O(1) is not guaranteed…
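The O(1) claim is an average-case statement that assumes the hash function spreads keys evenly across buckets and the table resizes before chains grow long. The bucketing-plus-resize idea can be sketched as a toy Python class (an illustration of the concept, not the JDK implementation):

```python
class ToyHashMap:
    """Minimal chained hash table; resizes when load factor exceeds 0.75."""

    def __init__(self):
        self._buckets = [[] for _ in range(8)]
        self._size = 0

    def put(self, key, value):
        if (self._size + 1) / len(self._buckets) > 0.75:
            self._resize()
        bucket = self._buckets[hash(key) % len(self._buckets)]
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # overwrite existing key
                return
        bucket.append((key, value))
        self._size += 1

    def get(self, key):
        # Average O(1): hash to one bucket, then scan its (short) chain.
        # A bad hash that maps everything to one bucket degrades this to O(n).
        bucket = self._buckets[hash(key) % len(self._buckets)]
        for k, v in bucket:
            if k == key:
                return v
        raise KeyError(key)

    def _resize(self):
        # Doubling the bucket count keeps chains short on average,
        # provided the hash distributes keys well.
        old = self._buckets
        self._buckets = [[] for _ in range(2 * len(old))]
        for bucket in old:
            for k, v in bucket:
                self._buckets[hash(k) % len(self._buckets)].append((k, v))

m = ToyHashMap()
for i in range(100):
    m.put(i, i * i)
print(m.get(7))
```

The sketch makes the question's point concrete: O(1) holds on average under a good hash; with a degenerate hash every key lands in one chain and lookups become linear.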

Constant Amortized Time

Submitted by 时光怂恿深爱的人放手 on 2019-11-25 22:58:39
Question: What is meant by "Constant Amortized Time" when talking about the time complexity of an algorithm? Answer 1: Amortised time explained in simple terms: if you do an operation, say, a million times, you don't really care about the worst case or the best case of that operation; what you care about is how much time is taken in total when you repeat the operation a million times. So it doesn't matter if the operation is very slow once in a while, as long as "once in a while" is rare enough…
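The classic example is appending to a dynamic array that doubles its capacity when full: an individual append is occasionally O(n) (when a resize copies everything), but the total copying over n appends stays below 2n, so each append is O(1) amortized. A sketch that simulates the doubling strategy and counts the copies:

```python
def appends_with_copy_count(n):
    # Simulate a dynamic array that doubles capacity when full,
    # counting every element copy caused by resizes.
    capacity, size, copies = 1, 0, 0
    for _ in range(n):
        if size == capacity:
            copies += size   # a resize copies every existing element
            capacity *= 2
        size += 1
    return copies

n = 1_000_000
copies = appends_with_copy_count(n)
# Total resize work 1 + 2 + 4 + ... < 2n, so the average (amortized)
# cost per append is constant even though a single append can be O(n).
assert copies < 2 * n
print(copies)
```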

Computational complexity of Fibonacci Sequence

Submitted by 自闭症网瘾萝莉.ら on 2019-11-25 22:56:32
Question: I understand Big-O notation, but I don't know how to calculate it for many functions. In particular, I've been trying to figure out the computational complexity of the naive version of the Fibonacci sequence: int Fibonacci(int n) { if (n <= 1) return n; else return Fibonacci(n - 1) + Fibonacci(n - 2); } What is the computational complexity of the Fibonacci sequence and how is it calculated? Answer 1: You model the time function to calculate Fib(n) as the sum of the time to calculate Fib(n-1) plus…
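That recurrence, T(n) = T(n-1) + T(n-2) + O(1), grows exponentially; in fact the number of calls the naive version makes satisfies calls(n) = 2·Fib(n+1) − 1. A Python port of the C function with a call counter makes this measurable:

```python
def fib(n, calls):
    # Naive recursion, same shape as the C version in the question;
    # calls[0] tracks the total number of invocations.
    calls[0] += 1
    if n <= 1:
        return n
    return fib(n - 1, calls) + fib(n - 2, calls)

c = [0]
result = fib(20, c)
print(result, c[0])  # Fib(20) and the call count
# The call count satisfies calls(n) = 2 * Fib(n+1) - 1, which grows
# like phi**n (phi ~ 1.618), i.e. exponential time.
assert c[0] == 2 * fib(21, [0]) - 1
```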

How to find time complexity of an algorithm

Submitted by …衆ロ難τιáo~ on 2019-11-25 22:53:40
Question: How do I find the time complexity of an algorithm? What have I done before posting a question on SO? I have gone through this, this, and many other links, but nowhere was I able to find a clear and straightforward explanation of how to calculate time complexity. What do I know? Say, for code as simple as the one below: char h = 'y'; // This will be executed 1 time int abc = 0; // This will be executed 1 time Say, for a loop like the one below: for (int i = 0; i < N; i++) { Console…
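The hand-counting approach the question starts from can be automated: count every executed statement as a function of the input size, then keep only the fastest-growing term. A sketch mirroring the question's two snippets:

```python
def count_statements(n):
    # Mirror the hand-counting from the question: increment the
    # counter once for each statement that executes.
    ops = 0
    h = 'y'; ops += 1        # executed 1 time  -> contributes O(1)
    abc = 0; ops += 1        # executed 1 time  -> contributes O(1)
    for i in range(n):       # loop body executed n times -> contributes O(n)
        ops += 1
    return ops

# Total is 2 + n; dropping the constant leaves O(n).
print(count_statements(1000))
```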

How can building a heap be O(n) time complexity?

Submitted by 丶灬走出姿态 on 2019-11-25 22:47:47
Question: Can someone help explain how building a heap can be O(n) complexity? Inserting an item into a heap is O(log n), and the insert is repeated n/2 times (the remainder are leaves, and can't violate the heap property). So, this means the complexity should be O(n log n), I would think. In other words, for each item we "heapify", it has the potential to have to filter down once for each level of the heap so far (which is log n levels). What am I missing? Answer 1: I think there are several…
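The resolution is that bottom-up heap construction (Floyd's method) sifts each node down at most its *height*, and the sum of node heights over the whole tree is less than n, so the total work is O(n) even though a single sift can cost O(log n). A sketch that builds a min-heap this way and counts the sift-down swaps:

```python
import random

def build_heap(a):
    # Bottom-up heapify (Floyd's method): sift down each internal node,
    # starting from the last one. Returns the total number of swaps.
    n = len(a)
    swaps = 0
    for i in range(n // 2 - 1, -1, -1):
        j = i
        while True:
            smallest = j
            left, right = 2 * j + 1, 2 * j + 2
            if left < n and a[left] < a[smallest]:
                smallest = left
            if right < n and a[right] < a[smallest]:
                smallest = right
            if smallest == j:
                break
            a[j], a[smallest] = a[smallest], a[j]
            swaps += 1
            j = smallest
    return swaps

data = list(range(10_000))
random.shuffle(data)
swaps = build_heap(data)
# Each swap moves a node down one level; the total is bounded by the
# sum of all node heights, which is < n -- hence O(n), not O(n log n).
assert swaps < len(data)
print(swaps)
```

The intuition: roughly half the nodes are leaves (height 0, no work), a quarter have height 1, and only one node has the full log n height, so the expensive sifts are rare.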

What is a plain English explanation of “Big O” notation?

Submitted by 落花浮王杯 on 2019-11-25 21:34:02
Question: I'd prefer as little formal definition as possible and simple mathematics. Answer 1: Quick note: this is almost certainly confusing Big O notation (which is an upper bound) with Theta notation "Θ" (which is a two-sided bound). In my experience, this is actually typical of discussions in non-academic settings. Apologies for any confusion caused. Big O complexity can be visualized as a graph of growth-rate curves. The simplest definition I can give for Big-O notation is this: Big-O notation is a relative…
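That "relative" framing is easiest to see numerically: Big-O describes how the work *scales* when the input grows, not how long any one run takes. A short sketch comparing three common classes as n doubles:

```python
import math

# How does the work scale when n doubles?
for n in [1_000, 2_000, 4_000]:
    linear = n                      # O(n)
    linearithmic = n * math.log2(n)  # O(n log n)
    quadratic = n * n               # O(n^2)
    print(n, linear, round(linearithmic), quadratic)

# Doubling n doubles O(n) work, slightly more than doubles O(n log n),
# and quadruples O(n^2) work -- the relative scaling Big-O captures.
```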