Talk:Run-time analysis

Is this page necessary?

Thank you, Groupthink, for writing this material. But I wonder if it should not be part of the Computational complexity page. See my comment on that discussion page. If you are keen, that page could benefit from your efforts. Scottcraig 17:22, 30 June 2007 (UTC)

Merger with analysis of algorithms

I strongly support this merger. I should have caught that that page existed and punched up that page instead of creating this page. Ah well. Ironically, whoever created the other page cited the exact same textbook I used for my reference. :-) At any rate, I'd push for the content of analysis of algorithms to be merged into run-time analysis and not vice-versa. Groupthink 23:38, 8 July 2007 (UTC)

Content-wise this article is much better; however, I think it may be a good idea to treat memory-space analysis in the same article. —Ruud 00:05, 9 July 2007 (UTC)
Thanks. I did make a brief mention of run-space analysis at the end. Groupthink 00:54, 9 July 2007 (UTC)
Agree with merging analysis of algorithms into this article. Iknowyourider (t c) 16:00, 18 July 2007 (UTC)
Agree with merging analysis of algorithms into this article. laufinjackass

I'm going on a University WikiBreak starting on the 21st. If I can't get the merger done by then, then any editor who'd care to take over has my blessing. Groupthink 21:40, 15 August 2007 (UTC)

Should the merger not be in the other direction? Run time analysis being a section in Analysis of Algorithms. Skippydo 08:52, 16 August 2007 (UTC)
I was under the impression that the consensus was AA should be merged with RTA but retitled "Analysis of Algorithms"... but I leave it in the hands of the vox editori. Groupthink 15:31, 16 August 2007 (UTC)
I think RTA should be a subsection of AA, since it is one type of AA. Ozgurozkan 20:13, 3 October 2007 (UTC)
I agree - most techniques used for RTA are also used for space analysis. This article is much more complete, while the other is, in practice, a stub, but it may be used for the introduction. Btw, the merge is still to be performed after a year... --Blaisorblade (talk) 15:13, 16 June 2008 (UTC)

Orders of growth section

Quoting: Note that Big O notation can only be applied to run-times that are monotonically increasing. The following algorithm, expressed as pseudocode, cannot be said to have a Big O classification:

1    if n is prime
2        return n
3    else if n is even
4        while n > 1
5            n = n / 2
6    else
7        while n > 0
8            n = n - 50

This runs in O(n) time, no? Skippydo 02:05, 17 August 2007 (UTC)

No. It runs O(1) for primes, O(lg n) for even integers > 2, and O(n) for non-prime odd integers. Now you could argue that the O(n) branch "dominates", except that Big-O is defined as monotonically increasing; for n > n₀, run-time isn't allowed to ever decrease as n increases – it has to either stay constant or increase. Groupthink 02:54, 17 August 2007 (UTC)
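
For what it's worth, the branch behavior can be checked empirically. Below is a minimal Python sketch of the pseudocode above (assuming integer division on line 5; the step counting is illustrative, not part of the algorithm):

    def is_prime(n):
        """Trial division; sufficient for a small demo."""
        if n < 2:
            return False
        return all(n % d for d in range(2, int(n ** 0.5) + 1))

    def steps(n):
        """Count loop iterations of the pseudocode for input n."""
        count = 0
        if is_prime(n):
            return count        # prime branch: no loop at all, O(1)
        if n % 2 == 0:
            while n > 1:        # even branch: ~log2(n) halvings
                n //= 2         # integer division (floor)
                count += 1
        else:
            while n > 0:        # odd composite branch: ~n/50 subtractions
                n -= 50
                count += 1
        return count

    print([(n, steps(n)) for n in (7, 9, 1024, 1025)])
    # [(7, 0), (9, 1), (1024, 10), (1025, 21)]: not monotone in n, yet each count <= n
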
Oh crap, you did make me catch something else that needs fixing, however. Your typical programmer would assume that integer division is taking place on line 5 (i.e. the fractional part is being truncated), but for an article like this that needs to be clarified. Groupthink 03:02, 17 August 2007 (UTC)
Big-O is defined as monotonically increasing...run-time isn't allowed to ever decrease as n increases

Who says? Skippydo 05:00, 17 August 2007 (UTC)

Donald Knuth and many others. 8-D Groupthink 05:46, 17 August 2007 (UTC)
Wait, hang on, hold the phone, I may have screwed this one up. I think you might be right; it appears that even though Big-O is typically applied to monotonically increasing functions, it isn't limited to them, which would mean that you're right: the algorithm above runs in O(n) time. I'll move that material to here for now, because even if I'm wrong and you're right, that pseudocode would be a good example of a run-time for which Θ-notation is undefined. Groupthink 06:05, 17 August 2007 (UTC)
Yeah, I'm pretty sure I confused some concepts here, sorry about that. I think for f(n) = O(g(n)), g(n) must be monotonically increasing, but not so f(n). Same goes for f(n) = Ω(g(n)). Groupthink 06:25, 17 August 2007 (UTC)
Bounding an algorithm's runtime by something other than a monotonically increasing function would make little sense. However, to my knowledge, it's not restricted by the definition: f(n) = O(g(n)) if there exist constants c > 0 and n₀ such that f(n) ≤ c·g(n) for all n > n₀.
Skippydo 16:14, 17 August 2007 (UTC)
I think I need to concede your points on the theory, but looking at things practically: other than the trivial case of O(1), are there non-trivial algorithms whose run-time f(n) is monotonically decreasing? I've never run across any, but then I haven't earned my doctorate yet... Groupthink 20:17, 17 August 2007 (UTC)
It's difficult to imagine a problem that gets easier when there is more n to deal with. We do, however, speak of the error of a probabilistic algorithm as being O(1/n) or some such. Skippydo 06:15, 18 August 2007 (UTC)
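
To make that last remark concrete, here is a minimal sketch of a randomized check whose error, rather than its run-time, shrinks as O(1/n); the polynomial-equality setup is a standard example (Schwartz–Zippel), not something from the article:

    import random

    def probably_equal(p, q, n):
        """Randomized test of whether polynomials p and q (coefficient
        lists) are equal. Two distinct degree-d polynomials agree on at
        most d points, so a random evaluation point drawn from
        {0, ..., n-1} gives a false "equal" with probability at most
        d/n, i.e. error O(1/n)."""
        x = random.randrange(n)
        evaluate = lambda coeffs: sum(c * x ** i for i, c in enumerate(coeffs))
        return evaluate(p) == evaluate(q)

    # Equal inputs are always reported equal; unequal inputs are mistaken
    # for equal with probability at most deg/n, which vanishes as n grows.
    print(probably_equal([1, -2, 1], [1, -2, 1], 1000))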

Removed material from orders of growth

Note that Big O notation can only be applied to run-times that are monotonically increasing. The following algorithm, expressed as pseudocode, cannot be said to have a Big O classification:

1    if n is prime
2        return n
3    else if n is even
4        while n > 1
5            n = floor(n / 2)
6    else
7        while n > 0
8            n = n - 50

The above example definitely cannot be expressed using Θ-notation, but I'm unsure if Big-O and Ω are formally restricted to monotonically increasing functions or not. Groupthink 06:12, 17 August 2007 (UTC)

Quick-sort comes to mind. It's already mentioned in this section. Skippydo 16:14, 17 August 2007 (UTC)

Evaluating run-time complexity section

Quoting: The specific amount of time to carry out a given instruction ... can be considered to be an abstract unit of time. This very powerful statement is made but not, in my opinion, utilized. Since we are dealing with abstract units of time, the values T1, ..., T7 can be said to be bounded by 1 unit. Therefore our running time is at most

(n+1) + ½(n² + n) + ½(n² + 3n) + 4 = n² + 3n + 5 = O(n²)

Thoughts? Skippydo 16:21, 17 August 2007 (UTC)

I think that's a great way to sum up that section or illustrate by alternate example. What you did there is of course the quick way to determine the runtime order. :-) I intentionally took a more roundabout and formal route to fully elucidate that even though the inner loop iterates from 1 to i and not 1 to n, the nested loops still run in O(n²) time. I like your use of summation notation though, and I'll be bold and incorporate that right now. Groupthink 18:18, 17 August 2007 (UTC)
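
To make the incorporated summation concrete, here is a small Python sketch of the nested-loop pattern that section analyzes (loop bounds as in the article's example; the counting harness is illustrative):

    def inner_iterations(n):
        """Count executions of the innermost statement of the nested
        loops: for i in 1..n, for j in 1..i."""
        count = 0
        for i in range(1, n + 1):
            for j in range(1, i + 1):
                count += 1
        return count

    for n in (1, 10, 100):
        exact = n * (n + 1) // 2      # closed form of the summation
        assert inner_iterations(n) == exact
        assert exact <= n * n         # hence O(n^2), despite the 1..i inner bound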

Growth rate of other resources

Note:
The analysis given above for the space consumption is possibly wrong, because the memory space reserved is not growing at a rate of 2ⁿ in the input.

Let us validate the above statement with a simple proof.

Assume that the file size is 1 and the memory reserved is also 1 byte:
for N = 1, space allocated is 1 byte

Suppose the size is increased to 2:
for N = 2, space allocated will be doubled to 2*1 bytes (2), as per the definition in line #2
for N = 3, space allocated will be doubled to 2*2 bytes (4), as per the definition in line #2
for N = 4, space allocated remains the same, because we already have enough space

In summary, we can say

S(1) = 1 (Initial condition)
S(2) = 2*1 = 2 (Doubled as per Line #2)
S(3) = 2*2 = 4 (Doubled as per Line #2)
S(4) = 4 (No change)
S(5) = 4*2 = 8 (Doubled as per Line #2)
S(6) = 8 (No change)
...
S(8) = 8 (No change)
S(9) = 8*2 = 16 (Doubled as per Line #2)
S(10) = 16 (No change)
...

For an input file size of 4, the reserved space is only 4 bytes and not 16 bytes (2⁴). So at any time, the reserved memory space is less than double the input file size. So we can say the space complexity for the above problem is O(2n); since constants don't mean anything in Big-O notation, we can say S(N) = O(n).
Linear growth rate.

—Preceding unsigned comment added by 125.17.107.170 (talkcontribs)

Thanks for contributing to the conversation about this article, and for fact-checking. You're absolutely right, and I've corrected accordingly; O(2ⁿ) growth would involve doubling with each constant increase in n, not with each scaling of n. Groupthink 16:26, 10 September 2007 (UTC)


NOTE:

The above analysis is possibly wrong and the growth rate is not exponential, regardless of the size of the file.

There are two contradictions with the above algorithm.

1) Why do we need to double the amount of memory reserved for every 1 GB increase in file size if we already have enough memory allocated for the file?
E.g.,

Suppose the file size is 1 GB and the reserved space is 2 GB.
Now, as the above algorithm says, when the file grows by 1 GB, we need not double the reserved space to 4 GB, because we already have 2 GB reserved and the file size is also 2 GB. So the above algorithm cannot hold good.


2) If the growth rate is exponential as explained above,

for 1 GB of file size the reserved space should be 2¹ GB
for 2 GB of file size the reserved space should be 2² GB

But this is practically impossible and not the case. So we cannot say the algorithm grows at an exponential rate.


So we can modify the above algorithm to:

1    While file is still open
2        Let n = file size
3        If n > memory reserved
4            Double the amount of memory reserved


For the above algorithm, assume the file size is 1 byte and reserved memory space is also 1 byte

S(1) = 1
S(2) = 1*2 = 2 (Doubled as per Line #4)
S(3) = 2*2 = 4 (Doubled as per Line #4)
S(4) = 4 (No change)
S(5) = 4*2 = 8 (Doubled as per Line #4)
S(6) = 8 (No change)
...
S(9) = 8*2 = 16 (Doubled as per Line #4)
S(10) = 16 (No change)
...
At any point in time, the reserved space is less than double the input file size. It is clear that the space complexity is less than 2N for an input file of size N. So the space complexity of the above algorithm is O(2N); since constants don't mean anything in Big-O notation, we can simplify it to O(N).
Linear growth rate.
—Preceding unsigned comment added by 125.17.107.170 (talkcontribs)
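
A quick simulation of the modified algorithm reproduces the S(N) values above (a Python sketch; the one-byte units are the abstraction used in the comment):

    def reserved_after(n):
        """Grow a file to n bytes one byte at a time, doubling the
        reservation only when the file outgrows it (line #4 above)."""
        reserved = 1                  # initial condition: S(1) = 1
        for size in range(1, n + 1):
            if size > reserved:
                reserved *= 2
        return reserved

    print([reserved_after(n) for n in range(1, 11)])
    # [1, 2, 4, 4, 8, 8, 8, 8, 16, 16]: always less than 2n, hence O(n) space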

OK, now you're just being obnoxious. First of all, comments like these need to go on the discussion page, not the article itself. Second of all, I have no idea what you mean by "contradictions". I can see how you'd have problems with the practicality, but that's why there's a specific footnote indicating that this is a theoretical example and not at all practical. That doesn't make it a "contradiction" however.
As for your growth rate criticisms, they're wrong. As you yourself pointed out, constant factors can be neglected in Big-O notation. O(2ⁿ) growth does not mean that run-time (or space) needs to literally be 2ⁿ. Take a look at the example again. The pseudocode says: "For every 100 MB increase in file size, double reserved memory." Now I will grant that in real life, your memory usage will probably increase linearly, but again, this is a theoretical example! Plus, for a sufficiently complex application, it's not inconceivable that memory consumption would grow faster than O(n).
OK, so we don't know how much memory has been reserved to manage a file, but let's say we start off at 10 MB per 100 MB of file size.
File Size     Memory Reserved
=============================
100 MB        10 MB (approx. 2³ MB)
200 MB        20 MB (approx. 2⁴ MB)
300 MB        40 MB (approx. 2⁵ MB)
...           ...
1000 MB       5120 MB (approx. 2¹² MB)
...           ...
n MB          1.25 * 2ⁿ MB
That's O(2ⁿ) growth.
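
To see the two policies side by side, here is a hypothetical Python sketch (the 10 MB starting figure comes from the table above; the function names are illustrative): doubling for every fixed 100 MB increment grows exponentially, while doubling only when the file outgrows the reservation grows linearly.

    def double_per_increment(file_mb):
        """The article's example: memory doubles for every 100 MB of
        file size, starting from 10 MB reserved at 100 MB."""
        steps = file_mb // 100        # assumes file_mb >= 100
        return 10.0 * 2 ** (steps - 1)

    def double_on_demand(file_mb):
        """Double the reservation only when the file outgrows it."""
        reserved = 10.0
        while reserved < file_mb:
            reserved *= 2
        return reserved               # stays within 2x the file size: O(n)

    for mb in (100, 200, 300, 1000):
        print(mb, double_per_increment(mb), double_on_demand(mb))
    # double_per_increment matches the table: 10, 20, 40, ..., 5120 -> O(2^n)
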
If you can come up with a better example, feel free to add it to the article. But it'd be nice if you expressed your concerns here on the talk page first before spamming the article. Groupthink 07:35, 11 September 2007 (UTC)

Run-time analysis is a bad name

The article heading "run-time analysis" suggests an analysis performed at run time - as in "Performance analysis". (I have now added a 'do not confuse with' sentence at the start of the existing article to help with this.)

The wording in the article did not make it sufficiently clear that this is an estimation technique, not a measure of an algorithm's actual performance. If merging is to take place, I suggest it is NOT with this article's current name!

There were, until recently, almost no XREFs between important related topics:

etc

I have added some XREFs, but there may be many more that also relate to performance generally that I have not discovered yet. ken (talk) 06:29, 21 July 2008 (UTC)