Question
I'm really confused on simplifying this recurrence relation: c(n) = c(n/2) + n^2.
So I first got:
c(n/2) = c(n/4) + n^2, so c(n) = c(n/4) + n^2 + n^2, i.e. c(n) = c(n/4) + 2n^2
c(n/4) = c(n/8) + n^2, so c(n) = c(n/8) + 3n^2
I do sort of notice a pattern though:
2 raised to the power of the coefficient in front of n^2 gives the denominator under n.
I'm not sure if that would help.
I just don't understand how I would simplify this recurrence relation and then find the theta notation of it.
EDIT: Actually I just worked it out again and I got c(n) = c(n/n) + n^2 * lg n.
I think that is correct, but I'm not sure. Also, how would I find the theta notation of that? Is it just Theta(n^2 lg n)?
Answer 1:
Firstly, make sure to substitute n/2 everywhere n appears in the original recurrence relation when placing c(n/2) on the left-hand side, i.e.

c(n/2) = c(n/4) + (n/2)^2
Your intuition is correct, in that it is a very important part of the problem. How many times can you divide n by 2 before you reach 1? Let's take 8 as an example:

8/2 = 4
4/2 = 2
2/2 = 1

You see it's 3 times, which, as it turns out, is log2(8).
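As a quick check, here is a tiny Python sketch (my own, not from the original answer) that counts the halvings and compares them against log2:

import math

def halvings(n):
    # Count how many times n can be divided by 2 before reaching 1.
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count

print(halvings(8), math.log2(8))        # 3 3.0
print(halvings(1024), math.log2(1024))  # 10 10.0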
In order to prove the theta notation, it might be helpful to check out the master theorem. This is a very useful tool for proving the complexity of a recurrence relation.

Using the master theorem, case 3, we can see:

a = 1
b = 2
logb(a) = 0
c = 2, and n^2 = Omega(n^c) with c > logb(a)
regularity condition: with k = 9/10, (n/2)^2 = n^2/4 <= k*n^2

therefore c(n) = Theta(n^2).
The intuition as to why the answer is Theta(n^2) is that you have n^2 + n^2/4 + n^2/16 + ... + n^2/4^(lg n), which won't give us lg n copies of n^2, but instead increasingly smaller fractions of n^2 whose sum stays within a constant factor of n^2.
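To make that intuition concrete, here is a small numeric sketch (my own, assuming we simply sum the per-level work n^2 + (n/2)^2 + (n/4)^2 + ... down to the base case and ignore the base-case constant):

import math

def unrolled_sum(n):
    # Sum the work done at each level of c(n) = c(n/2) + n^2.
    total = 0
    while n >= 1:
        total += n * n
        n //= 2
    return total

for n in [2**10, 2**15, 2**20]:
    s = unrolled_sum(n)
    # Ratio to n^2 settles near 4/3; ratio to n^2 * lg(n) keeps shrinking.
    print(n, s / (n * n), s / (n * n * math.log2(n)))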
Answer 2:
Let's answer a more generic question for recurrences of the form r(n) = r(d(n)) + f(n). There are some restrictions on the functions that need further discussion, e.g. if x is a fixed point of d, then f(x) should be 0, otherwise there isn't any solution. In your specific case this condition is satisfied.
Rearranging the equation we get r(n) - r(d(n)) = f(n), which gives the intuition that r(n) and r(d(n)) are both sums of some terms, but r(n) has one more term than r(d(n)); that is why f(n) appears as the difference. On the other hand, r(n) and r(d(n)) have to have the same 'form', so the number of terms in the previously mentioned sum has to be infinite. Thus we are looking for a telescoping sum, in which the terms of r(d(n)) cancel all but one of the terms of r(n):
r(n) = f(n) + a_0(n) + a_1(n) + ...
- r(d(n)) = - a_0(n) - a_1(n) - ...
This latter means that

r(d(n)) = a_0(n) + a_1(n) + ...

And just by substituting d(n) into the place of n in the equation for r(n), we get:

r(d(n)) = f(d(n)) + a_0(d(n)) + a_1(d(n)) + ...

So by choosing a_0(n) = f(d(n)), a_1(n) = a_0(d(n)) = f(d(d(n))), and so on, a_k(n) = f(d(d(...d(n)...))) (with k+1 copies of d nested inside each other), we get a correct solution.
Thus in general, the solution is of the form r(n) = sum_{i=0..infinity} f(d[i](n)), where d[i](n) denotes the function d(d(...d(n)...)) with i iterations of the d function.
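As a sanity check, here is a small sketch of this general formula (my own code, with a hypothetical helper name solve_recurrence; it truncates the infinite sum once the terms become negligible, which assumes f(d[i](n)) shrinks towards 0):

def solve_recurrence(n, d, f, eps=1e-12, max_terms=10_000):
    # Approximate r(n) = sum_{i>=0} f(d[i](n)) by truncating once terms get tiny.
    total = 0.0
    x = float(n)
    for _ in range(max_terms):
        term = f(x)
        total += term
        if abs(term) < eps:
            break
        x = d(x)
    return total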
For your case, d(n) = n/2 and f(n) = n^2, hence you can get the solution in closed form by using the identity for geometric series. The final result is r(n) = (4/3)*n^2.
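For example, continuing the sketch above with the functions from the question:

# d(n) = n/2 and f(n) = n^2, as in the question
r = lambda n: solve_recurrence(n, d=lambda x: x / 2, f=lambda x: x * x)

for n in [100, 1000, 10000]:
    print(n, r(n) / (n * n))   # each ratio comes out as ~1.3333, i.e. 4/3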
Answer 3:
Go for the advanced (extended) master theorem.

T(n) = a*T(n/b) + n^k * log^p(n), where a > 0, b > 1, k >= 0 and p is a real number.

Case 1: a > b^k
    T(n) = Theta(n^(logb(a))), where b is the base of the log.
Case 2: a = b^k
    1. p > -1: T(n) = Theta(n^(logb(a)) * log^(p+1)(n))
    2. p = -1: T(n) = Theta(n^(logb(a)) * log(log(n)))
    3. p < -1: T(n) = Theta(n^(logb(a)))
Case 3: a < b^k
    1. p >= 0: T(n) = Theta(n^k * log^p(n))
    2. p < 0: T(n) = O(n^k)

Constants are ignored because they don't change the time complexity (and they vary from processor to processor), i.e. n/2 == n*(1/2) behaves like n.
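Applying this to the question's recurrence c(n) = c(n/2) + n^2, we have a = 1, b = 2, k = 2, p = 0, so a < b^k and p >= 0 (case 3.1), giving Theta(n^2). Here is a minimal sketch of that case analysis in Python (my own helper, extended_master_theorem, not a library function):

import math

def extended_master_theorem(a, b, k, p):
    # Return the asymptotic bound for T(n) = a*T(n/b) + n^k * log^p(n) as a string.
    log_b_a = math.log(a, b)
    if a > b ** k:
        return f"Theta(n^{log_b_a:g})"
    if a == b ** k:
        if p > -1:
            return f"Theta(n^{log_b_a:g} * log^{p + 1:g}(n))"
        if p == -1:
            return f"Theta(n^{log_b_a:g} * log(log(n)))"
        return f"Theta(n^{log_b_a:g})"
    # a < b^k
    if p > 0:
        return f"Theta(n^{k:g} * log^{p:g}(n))"
    if p == 0:
        return f"Theta(n^{k:g})"
    return f"O(n^{k:g})"

print(extended_master_theorem(a=1, b=2, k=2, p=0))  # Theta(n^2)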
Source: https://stackoverflow.com/questions/22674053/simplifying-recurrence-relation-cn-cn-2-n2