Simplifying the Big-O Complexity of this Exponential Algorithm

Submitted by 自闭症网瘾萝莉.ら on 2020-05-11 06:50:20

Question


I have a counting algorithm for which I am trying to get a general big-O description. It is horribly nested and horribly exponential. Here it is:

 1. For each T_i in T
 2. For k = 1 to max_k
 3. For each of 2^k*(n choose k) items
 4. For each t in T_i
 5. check if the item is in t...etc.
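
For concreteness, here is a minimal Python sketch of this loop nest. It is only an illustrative reading: the question does not say what the 2^k * (n choose k) items in line 3 actually are, so they are modeled here as k-subsets of an n-element universe with one of two labels per chosen element, and the line-5 check is left as a caller-supplied predicate. The names count_hits, universe and contains are hypothetical, not from the question.

```python
from itertools import combinations, product

def count_hits(T, universe, max_k, contains):
    """Hypothetical reading of the pseudocode above.

    T        -- a partition: a list of parts T_i, each a collection of objects t
    universe -- the n underlying elements
    max_k    -- small bound on k (around 4 or 5 in the question)
    contains -- placeholder predicate for line 5 ("check if the item is in t")
    """
    count = 0
    for T_i in T:                                        # line 1: each part of the partition
        for k in range(1, max_k + 1):                    # line 2: k = 1 .. max_k
            # line 3: the 2^k * (n choose k) "items": a k-subset of the
            # universe plus a binary label per chosen element
            for subset in combinations(universe, k):
                for labels in product((0, 1), repeat=k):
                    item = tuple(zip(subset, labels))
                    for t in T_i:                        # line 4: each t in T_i
                        if contains(item, t):            # line 5: constant-time check
                            count += 1
    return count
```

In this reading the (n choose k) factor comes from the subsets and the 2^k factor from the labelings, which is where the line-3 count in the analysis below comes from.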

Here is a line-by-line idea of the running time:

  1. This is a simple partitioning and I'm going to just give it a constant c1.
  2. max_k is a small number, always less than n, perhaps around 4 or 5. I will use k below.
  3. This loop always runs 2^k*(n choose k) times
  4. Treating the partitioning in line 1 as constant, we can generalize this line: it will never fire more than 2^n times in total in the worst case, and generally runs only a fraction of that, so we will call this one (2^n)/c2
  5. This is the simple if-statement operation inside all these loops, so c3.

Multiplying all these together gives:

c1 * k * 2^k * (n choose k) * (2^n)/c2 * c3

Since I want a big-O representation, ignoring constants gives:

k * 2^k * (n choose k) * (2^n)

It is known that (n choose k) is bounded above by (n * e / k)^k, so:

O(k * 2^k * (n * e / k)^k * (2^n))
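
For reference, that upper bound follows from the elementary estimate k! >= (k/e)^k; a short derivation (standard, not part of the question):

```latex
\binom{n}{k} \;=\; \frac{n(n-1)\cdots(n-k+1)}{k!} \;\le\; \frac{n^k}{k!}
\;\le\; \frac{n^k}{(k/e)^k} \;=\; \left(\frac{ne}{k}\right)^{k},
\qquad\text{since}\quad
e^{k} \;=\; \sum_{i \ge 0} \frac{k^{i}}{i!} \;\ge\; \frac{k^{k}}{k!}
\;\Longrightarrow\; k! \;\ge\; \left(\frac{k}{e}\right)^{k}.
```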

My question is, what can I ignore here... 2^n is certainly the dominating term since n is always larger than k, and typically much more so. Can this be simplified to O(2^n)? Or O(2^terrible)? Or should I leave in the 2^k, as in O(2^k * 2^n)? (or leave all the terms in?)

My understanding is that if k or max_k can compete or surpass n, then they are vital. But since they are always dominated, can they be discarded like lower order terms of polynomial running times? I suppose all the exponential running time mess is confusing me. Any advice is greatly appreciated.


Answer 1:


My understanding is that if k or max_k can compete or surpass n, then they are vital

True, but the other way around does not hold - meaning a factor cannot be ignored in big-O notation just because it does not compete with n. It can be ignored only if max_k is bounded by a constant (there is a constant c such that k <= c). For example, O(n * log k) algorithms are not O(n), since the k factor is not bounded, and thus for every constant c there exists a k such that n * log k > c*n.
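
To make that concrete (with log taken base 2, an assumption for the example): for any proposed constant c, choosing k = 2^(c+1) gives

```latex
n \log k \;=\; n \log 2^{\,c+1} \;=\; (c+1)\,n \;>\; c\,n \qquad \text{for every } n \ge 1,
```

so no single constant can witness n * log k <= c*n once k is allowed to grow.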

Since the expression you got is a product, all you can ignore are constants, which in your case is only e, leaving you with O(k * 2^k * (n/k)^k * 2^n).

If k is bounded, then you can remove it from the expression as well in big-O notation, and you will get O(n^k * 2^n). Note that even in this case, although n^k << 2^n, the n^k factor still cannot be ignored, because for every constant c there exists some n such that c*2^n < n^k * 2^n, so the algorithm is not an O(2^n) one.
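
Spelled out: for any fixed k >= 1 and any candidate constant c,

```latex
n^{k}\,2^{n} \;>\; c\,2^{n} \quad\Longleftrightarrow\quad n^{k} \;>\; c,
\qquad\text{which holds for every } n > c^{1/k},
```

so no constant c makes n^k * 2^n <= c * 2^n hold for all sufficiently large n.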

Smaller terms can be ignored when it comes to addition. If k < n then O(n + k) = O(n), because there are constants c, N such that for all n > N: n + k <= c*n (c = 2 works here), but this of course does not carry over to products.
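
The same pair of bounds, side by side (assuming k <= n):

```latex
n + k \;\le\; 2n \;\Longrightarrow\; n + k = O(n),
\qquad\text{whereas}\qquad
n \cdot k \;\le\; c\,n \iff k \le c,
```

so a factor only drops out of a product when it is itself bounded by a constant.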



Source: https://stackoverflow.com/questions/11314472/simplifying-the-big-o-complexity-of-this-exponential-algorithm
