Big O for data structures...

DaveSimmons

Elite Member
Aug 12, 2001
For large data sets, an efficient algorithm can mean the difference between finishing in seconds and finishing in hours.

In sorting, for example, the efficient algorithms are O(n log n) while a bubble sort is O(n^2).

So for indexing a database of 100,000 records, a merge sort might do around 1,700,000 comparisons (n log2 n) while the bubble sort will do more like 10,000,000,000 (n^2).

Analysis of O() for different algorithms is essential for completing big jobs in the smallest amount of time.
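
Here's a rough sketch (my own example, not from the figures above) in Python that counts comparisons for bubble sort vs. merge sort on the same random data. I used 10,000 items instead of 100,000 so the bubble sort finishes in reasonable time, but the n^2 vs. n log n gap is already obvious:

[code]
import random

def bubble_sort_comparisons(items):
    """Bubble sort, returning the number of comparisons made (~n^2 / 2)."""
    a = list(items)
    count = 0
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            count += 1                      # one comparison
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return count

def merge_sort_comparisons(items):
    """Merge sort, returning the number of comparisons made (~n log2 n)."""
    count = 0
    def sort(a):
        nonlocal count
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        left, right = sort(a[:mid]), sort(a[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            count += 1                      # one comparison per merge step
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged
    sort(list(items))
    return count

data = [random.random() for _ in range(10_000)]
print("merge sort: ", merge_sort_comparisons(data))   # on the order of 130,000
print("bubble sort:", bubble_sort_comparisons(data))  # on the order of 50,000,000
[/code]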
 

Barnaby W. Füi

Elite Member
Aug 14, 2001
It's a measure of how efficient an algorithm is. Best case is constant time, which means that the algorithm will work just as fast on a million items as it will on 5. A linearly scaling algorithm will slow down in proportion to the number of items it's working on, and a logarithmically scaling one also gets less efficient as the number of items increases, just more gradually. As for the big-oh notation, I dunno. I'm one of those backwoods programmers who's never been to one of them there fancy school things. :p I *believe* O(1) and O(c) are constant time, O(n) is linear, and O(log n), O(n log n), and O(n^2) are all logarithmically scaling. I really should probably read up more on this...

edit: hm, i just realized that ^ is exponential and is quite different from log ;)
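
To make the constant-vs-linear part concrete, here's a small Python sketch I threw together (my own example, so take the exact timings with a grain of salt): membership tests on a set are roughly constant time, while scanning a list is linear, so only the list lookup slows down as the collection grows.

[code]
import timeit

for n in (1_000, 100_000):
    items = list(range(n))
    as_set = set(items)
    target = n - 1                                   # worst case for the list scan
    list_time = timeit.timeit(lambda: target in items, number=1_000)
    set_time = timeit.timeit(lambda: target in as_set, number=1_000)
    print(f"n={n:>7}: list scan {list_time:.4f}s, set lookup {set_time:.4f}s")
[/code]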
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
^ good general explanation except this part:

O(log n), O(n log n), and O(n^2) are all logarithmically scaling. I really should probably read up more on this...

No. Assuming c is some constant and n is your number of elements, O(c log n) and O(log n) are equivalent (constant factors get dropped), but O(n log n) and O(c log n) certainly are not. For example, O(log n) describes a good search algorithm like binary search, while an O(n log n) search would be truly awful since it's even slower than linear search, which is O(n) -- O(n log n) means your search is examining every element once, plus some elements more than once.

O(1) < O(log n) < O(n) < O(n log n) < O(n^2) ... < O(exp(n))
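
To put some numbers on the binary-vs-linear search point, here's a quick Python sketch (my own, not from anyone's post above): both searches count one comparison per step, and on 100,000 sorted items binary search needs about 17 comparisons where a worst-case linear scan needs about 100,000.

[code]
def linear_search(items, target):
    """O(n): look at every element until we hit the target."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

def binary_search(sorted_items, target):
    """O(log n): halve the search range on every step (requires sorted input)."""
    comparisons = 0
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1                      # count one comparison per step
        if sorted_items[mid] == target:
            return mid, comparisons
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

data = list(range(100_000))                   # already sorted
print(linear_search(data, 99_999)[1])         # ~100,000 comparisons (worst case)
print(binary_search(data, 99_999)[1])         # ~17 comparisons (log2 of 100,000)
[/code]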