YA Quick homework question

This question is so simple I'm actually a little embarrassed to be asking it, but oh well. I'm going through some previous years' exams to study for my upcoming algos exam and this was one of the questions.

The question shows a graph of computing time for two algorithms, for which the complexity functions are
T(n) = 5n^2 + 30n + 25
and
T(n) = 0.01n^3 - 500

The graph makes it look like the n^3 algo has smaller computation time, but obviously with large enough n the asymptotic behavior is worse. Part 3 of the question asks
(iii) Write the asymptotic time complexity of both algorithms in Big-Oh notation. Use the definition of O(n) to prove your answer for any one case.

I'm just not quite sure how to do that last part. Obviously the algos are O(n^2) and O(n^3) respectively, but how do I use the definition of O(n) to prove this?

I'm sure the answer is face-palmingly simple :) TIA.
 

a123456

Haven't done this in forever but I'm guessing they're looking for the formal definition of big oh.

f(n) = O(g(n)) iff there exist positive constants c and k such that 0 <= f(n) <= cg(n) for all n >= k

First case is easy to prove.

T(n) = 5n^2 + 30n + 25

T(n) <= 6n^2 for all n >= 31, since 30n + 25 <= n^2 once n >= 31.

So it works with constants c = 6 and k = 31, and therefore T(n) = O(n^2).
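
If you want to sanity-check those constants rather than take the algebra on faith, here's a quick Python sketch (it only spot-checks a finite range of n, so it's an illustration rather than a proof) using the c = 6 and k = 31 from above:

# Spot-check that T(n) = 5n^2 + 30n + 25 <= 6n^2 for all n >= 31.
def T(n):
    return 5 * n**2 + 30 * n + 25

c, k = 6, 31
assert all(T(n) <= c * n**2 for n in range(k, 10_000))

print(T(30), c * 30**2)   # 5425 vs 5400: the bound still fails at n = 30
print(T(31), c * 31**2)   # 5760 vs 5766: it holds from n = 31 onward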
 
Thanks a123456, I think that's what I was after. DrP, constant coefficients don't matter for asymptotic complexity; all we care about is the growth rate for large n, where the highest-order term dominates regardless of its coefficient. You might be thinking of exponential complexity, where the base of the exponent does matter.
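
Just to illustrate with the two functions from the question, here's a rough Python sketch (purely illustrative, the function names are mine) that finds where the 0.01n^3 algorithm stops being the faster one:

def t_quad(n):
    return 5 * n**2 + 30 * n + 25   # the O(n^2) algorithm

def t_cube(n):
    return 0.01 * n**3 - 500        # the O(n^3) algorithm

# Walk forward until the cubic cost overtakes the quadratic cost.
n = 1
while t_cube(n) <= t_quad(n):
    n += 1
print(n)   # smallest n at which the O(n^3) algorithm is slower

Past that point the small 0.01 coefficient never saves it; the n^3 growth wins no matter how small the constant factor.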
 