Programming: This forum is for all programming questions.
The question does not have to be directly related to Linux, and any language is fair game.

10-19-2003, 09:20 PM

#1

Member
Registered: Feb 2001
Location: Boston, USA
Distribution: Slackware
Posts: 435
Rep:

Big O, Big Omega, and Big Theta
I'm confused about Big O, Big Omega, and Big Theta notations. I understand that Big O gives you an upper bound, which gives you the worst-case growth rate of the function; Big Omega gives you a lower bound, or best-case growth rate; and a function that has an upper and lower bound of the same order is Big Theta. Does this mean that Big O is the worst case, Big Omega is the best case, and Big Theta is the average?
That's the way I believed it to be, but then this confused me: Insertion Sort has a worst and average case of Big O(n^2) and Big Theta(n^2) respectively, and a best case of Big Omega(n). Those are two different functions: one is quadratic (n^2) and the other is linear (n). Wouldn't you need to find separate lower and upper bounds for each of them?
Any help would be appreciated. This has been confusing me for a while.



10-20-2003, 02:47 AM

#2

Member
Registered: Aug 2003
Location: Planet Earth
Distribution: Linux Mint Debian Edition (LMDE)
Posts: 215
Rep:

Big Theta is not the average runtime. It only describes the case where the upper and lower bounds of a function are on the same order of magnitude. As far as I know there is no asymptotic notation to describe the average runtime of a function.
You are correct that Big O describes the upper (worst case) bound, and Big Omega describes the lower (best case) bound. Insertion Sort has an upper bound of O(n^2) and a lower bound of Omega(n). Big Theta notation does not apply to Insertion Sort, even though Insertion Sort has an average runtime.
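To make the two bounds concrete, here is a small illustrative sketch (added for clarity, not from the original post) that counts key comparisons: on already-sorted input insertion sort performs n-1 comparisons (the Omega(n) best case), and on reverse-sorted input n(n-1)/2 (the O(n^2) worst case).

```python
def insertion_sort(a):
    """Sort the list a in place; return the number of key comparisons made."""
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1        # one comparison of key against a[j]
            if a[j] <= key:
                break               # key is already in place
            a[j + 1] = a[j]         # shift the larger element up
            j -= 1
        a[j + 1] = key
    return comparisons

best = insertion_sort(list(range(10)))          # sorted input:   n-1 = 9
worst = insertion_sort(list(range(9, -1, -1)))  # reversed input: n(n-1)/2 = 45
```

With n = 10, the sorted input costs 9 comparisons and the reversed input costs 45, matching the two bounds above.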


1 members found this post helpful.

10-20-2003, 09:29 AM

#3

Member
Registered: Feb 2001
Location: Boston, USA
Distribution: Slackware
Posts: 435
Original Poster
Rep:

Eric,
Thanks for your reply. My book says that Insertion Sort is Big Theta of n^2, right after saying that it has a best-case running time of n-1 comparisons, but it doesn't go on to explain why it's Big Theta of n^2. Is this simply because n(n-1)/2 can be bounded from below as well?
Thanks for your help.
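For reference, that is exactly the reason: the worst-case count n(n-1)/2 is itself pinned between two quadratics (a quick sketch, not from the book):

```latex
\frac{n^2}{4} \;\le\; \frac{n(n-1)}{2} \;\le\; \frac{n^2}{2} \qquad \text{for all } n \ge 2,
```

so the worst-case running time is both O(n^2) and Omega(n^2), and hence Theta(n^2). The best case, Omega(n), describes a different input and is a separate statement.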



08-05-2008, 11:20 AM

#4

LQ Newbie
Registered: Aug 2008
Posts: 1
Rep:

Big Omega is not Best Case
Quote:
Originally Posted by eric.r.turner
Big Theta is not the average runtime. It only describes the case where the upper and lower bounds of a function are on the same order of magnitude. As far as I know there is no asymptotic notation to describe the average runtime of a function.
You are correct that Big O describes the upper (worst case) bound, and Big Omega describes the lower (best case) bound. Insertion Sort has an upper bound of O(n^2) and a lower bound of Omega(n). Big Theta notation does not apply to Insertion Sort, even though Insertion Sort has an average runtime.

I believe Big Omega is not the best-case scenario. It is only a notation for a lower bound on the running time: f(n) = Omega(g(n)) states that f(n) dominates g(n), up to a constant factor, for all sufficiently large n. Only if we apply that bound to the best-case running time can we call it the best case.



05-25-2010, 03:21 PM

#5

LQ Newbie
Registered: May 2010
Posts: 2
Rep:

Quote:
Originally Posted by eric.r.turner
Big Theta is not the average runtime. It only describes the case where the upper and lower bounds of a function are on the same order of magnitude. As far as I know there is no asymptotic notation to describe the average runtime of a function.
You are correct that Big O describes the upper (worst case) bound, and Big Omega describes the lower (best case) bound. Insertion Sort has an upper bound of O(n^2) and a lower bound of Omega(n). Big Theta notation does not apply to Insertion Sort, even though Insertion Sort has an average runtime.

Be careful not to confuse best/average/worst case with asymptotic bounds. Big O describes an asymptotic upper bound, but it can do so for any of the three cases; likewise, Big Omega can describe an asymptotic lower bound in all three cases. Some of these bounds make less sense than others:
In the case of insertion sort: its best-case running time is O(n) and Omega(n), and therefore Theta(n). It will neither go faster nor slower than the time it takes to process all n elements. Its average-case running time is O(n^2): when sifting element A[e] down array A, it will on average be compared against half of the e-1 elements of A[1..e-1], and summing roughly e/2 comparisons over all n elements gives a total proportional to n^2, so insertion sort has an average-case running time of O(n^2). A lower bound for its average-case running time can be defined in a number of more complicated ways.
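The average-case behavior can be checked exactly for small inputs. This sketch (added for illustration, not from the original post) averages the comparison count of insertion sort over every permutation of six distinct keys; the average lands strictly between the best-case and worst-case counts and grows quadratically with n.

```python
from itertools import permutations

def comparisons(seq):
    """Insertion-sort a copy of seq; return the number of key comparisons."""
    a = list(seq)
    count = 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0:
            count += 1              # one comparison of key against a[j]
            if a[j] <= key:
                break
            a[j + 1] = a[j]         # shift the larger element up
            j -= 1
        a[j + 1] = key
    return count

# Exact average over all 6! = 720 orderings of six distinct keys; it falls
# between the best case (n-1 = 5) and the worst case (n(n-1)/2 = 15).
avg = sum(comparisons(p) for p in permutations(range(6))) / 720
```

Running the same experiment for larger n shows the average tracking roughly n^2/4, which is why the average case is quadratic even though the best case is linear.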


1 members found this post helpful.

05-25-2010, 03:45 PM

#6

LQ 5k Club
Registered: Sep 2009
Posts: 6,443

This thread is 7 years old!



05-25-2010, 03:48 PM

#7

Senior Member
Registered: Jul 2005
Distribution: Slackware
Posts: 2,014
Rep:

Also note that something that is (for example) O(n) is O(n^2) as well, but O(n) is more useful and descriptive.
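That containment follows directly from the definition. If f(n) is O(n), then there are constants c and n_0 such that (a quick sketch):

```latex
f(n) \;\le\; c\,n \;\le\; c\,n^2 \qquad \text{for all } n \ge \max(n_0, 1),
```

so the same constants witness f(n) = O(n^2); the quadratic bound is simply looser.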



05-26-2010, 08:18 AM

#8

LQ Newbie
Registered: May 2010
Posts: 2
Rep:

Quote:
Originally Posted by tuxdev
Also note that something that is (for example) O(n) is O(n^2) as well, but O(n) is more useful and descriptive.

Yes, O(n) would be a tighter bound and would better describe the runtime complexity of the algorithm. This also means that the algorithm is o(n^2) (little-o of n squared). Just as little-o is the non-tight counterpart of Big O, little-omega (ω) is the non-tight counterpart of Big Omega (Ω).
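For reference, one standard way to state the strict variants is via limits (assuming the limit exists; this formulation is added here, not quoted from any post):

```latex
f(n) = o(g(n)) \iff \lim_{n\to\infty}\frac{f(n)}{g(n)} = 0,
\qquad
f(n) = \omega(g(n)) \iff \lim_{n\to\infty}\frac{f(n)}{g(n)} = \infty.
```

So, for example, insertion sort's worst-case running time is o(n^3) but not o(n^2).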
Quote:
Originally Posted by MTK358
This thread is 7 years old!

The reason I replied is that this thread has a high search engine ranking and asks a good question, but the answers lacked some details. Hopefully people who find this thread with the same question will learn more from it. I hope this has not offended anyone.


