Old 12-30-2009, 04:53 AM   #1
GTrax
Member
 
Registered: Oct 2005
Location: UK
Distribution: Mint
Posts: 258

Rep: Reputation: 37
So - which apps max out your CPU?


Some tasks can thrash a CPU without seeming to do much.

Before the upgrade, the answer for my old kit was "nearly all".
Now I have an Intel i7 CPU and 4 GB of RAM. On a GNOME desktop (Ubuntu), running "System Monitor" set to show the resource graphs, it displays 8 CPUs (two threads per core). It's easy to see when it's working hard.

For example:
Clicking the "Help" tab on a GNU app to bring up the manual, and also, then clicking on any of the subjects in the Contents page. In my case it was "Gnumeric" spreadsheet. One CPU thread will run to 100% for a full 35 seconds, even though the page appears after 10 seconds! For a CPU of this power, simply to display text, that is an extraordinary amount of computing! I can hear the cooling fan rev up to maximum while it does it.

I also sometimes see more memory in use than the entire distro's pile of software even adds up to. I would expect the hard drive to be able to spin down and stay down.

So what is it that does this? Are some apps just greedy, stuck in useless loops waiting for "something"?
 
Old 12-30-2009, 07:06 AM   #2
MTK358
LQ 5k Club
 
Registered: Sep 2009
Posts: 6,443
Blog Entries: 3

Rep: Reputation: 723
My computer also has an Intel Core i7 (and 6GB of RAM), and I never had such problems. Typically memory usage is under 1GB with a few programs running. Maybe it's an Ubuntu thing?

The only time one of the "8 CPUs" gets to 100% is when a program is doing a very long mathematical computation, compiling a large package, or has hung.

I found that if a program goes without sleeping for even about a second, the CPU will show 100%. Most programs sleep while waiting for user input.
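
In C terms the difference is roughly this - a minimal sketch, where work_is_ready() is a made-up stand-in for whatever condition the program is waiting on. The first loop spins and pegs a core; the second asks the kernel to put the process to sleep between checks, so it barely shows up in System Monitor.

Code:
#define _POSIX_C_SOURCE 199309L
#include <stdbool.h>
#include <stdio.h>
#include <time.h>

/* Made-up condition we are waiting for (here: 3 seconds have passed). */
static time_t deadline;
static bool work_is_ready(void) { return time(NULL) >= deadline; }

/* Busy-wait: the process never sleeps, so one core sits at 100%. */
static void wait_busy(void)
{
    while (!work_is_ready())
        ;                            /* spin, burning CPU for nothing */
}

/* Polite wait: sleep 100 ms between checks; CPU usage is near zero. */
static void wait_politely(void)
{
    struct timespec ts = { 0, 100 * 1000 * 1000 };
    while (!work_is_ready())
        nanosleep(&ts, NULL);        /* kernel deschedules us until the timeout */
}

int main(void)
{
    deadline = time(NULL) + 3;
    wait_busy();                     /* watch one core hit 100% in the monitor */

    deadline = time(NULL) + 3;
    wait_politely();                 /* same 3-second wait, barely visible */

    puts("done");
    return 0;
}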

Last edited by MTK358; 12-30-2009 at 07:07 AM.
 
Old 12-30-2009, 10:06 AM   #3
MrCode
Member
 
Registered: Aug 2009
Location: Oregon, USA
Distribution: Arch
Posts: 864
Blog Entries: 31

Rep: Reputation: 148
Quote:
So - which apps max out your CPU?
Firefox. Scrolling. That is all.

Actually, pretty much anything that involves smooth unaccelerated graphics animation tends to make the CPU go full throttle. I'm using Ubuntu/GNOME and therefore most (graphical) applications use GTK+...the whole CPU usage thing has always bugged me a bit. Qt apps don't tend to go full throttle as much, but it still seems as though it's more processing than should be necessary. Is it just the cross-platform nature of these toolkits, or what? Or maybe it's that GTK is mostly callback-driven. Not sure about Qt.

Also, things like emulators (e.g. DOSBox) and virtual machines (e.g. VirtualBox VMs) tend to use 100% CPU when doing even somewhat minor tasks, for (what should be) obvious reasons.

Last edited by MrCode; 12-30-2009 at 10:11 AM.
 
Old 12-31-2009, 05:50 AM   #4
GTrax
Member
 
Registered: Oct 2005
Location: UK
Distribution: Mint
Posts: 258

Original Poster
Rep: Reputation: 37
Quote:
Originally Posted by MTK358 View Post
My computer also has an Intel Core i7 (and 6GB of RAM), and I never had such problems. Typically memory usage is under 1GB with a few programs running. Maybe it's an Ubuntu thing?
MTK358 - Yes, I used Ubuntu because, I confess, lazy convenience. Have you maybe tried Ubuntu to compare to Fedora? I suppose I am fishing for "did you think Fedora faster"?

Quote:
Originally Posted by MrCode
I'm using Ubuntu/GNOME and therefore most (graphical) applications use GTK+...the whole CPU usage thing has always bugged me a bit.
Though my work requires intensive mathematical computation, and I now use VirtualBox as well, I have always been able to let that run while also using the desktop for more mundane stuff. Now I am starting to consider a "speed" partition, with a distro that does not have so much overhead. :|

I am also thinking that in the past, computers that were relatively feeble compared to what most of us use now seemed to be able to do much more with what little they had, and do it credibly.
 
Old 12-31-2009, 06:03 AM   #5
jens
Senior Member
 
Registered: May 2004
Location: Belgium
Distribution: Debian, Slackware, Fedora
Posts: 1,463

Rep: Reputation: 299
Quote:
Originally Posted by GTrax View Post
So what is it that does this? Are some apps just greedy, stuck in useless loops waiting for "something"?
Some apps will indeed max out for maximum speed/performance (mostly 3D games, compilers, video editors ...).
Many of them have an option to stop/change this (if that's what you want; the CPU is there to be used).
Others are just badly written and not yet optimized for fast systems like yours.

Last edited by jens; 12-31-2009 at 06:19 AM.
 
Old 12-31-2009, 10:11 AM   #6
worm5252
Member
 
Registered: Oct 2004
Location: Atlanta
Distribution: CentOS, RHEL, HP-UX, OS X
Posts: 567

Rep: Reputation: 57
I have an AMD Phenom (quad core) and 4GB RAM, running Debian AMD64. I keep one CPU maxed all the time, and a large chunk of memory used as well, because I leave Firefox open with 11 or 12 tabs. Other than that, I do not have any issues with maxing my CPU or with large memory consumption.
 
Old 12-31-2009, 11:20 AM   #7
MTK358
LQ 5k Club
 
Registered: Sep 2009
Posts: 6,443
Blog Entries: 3

Rep: Reputation: 723
Quote:
Originally Posted by GTrax View Post
MTK358 - Yes, I used Ubuntu because, I confess, lazy convenience. Have you maybe tried Ubuntu to compare to Fedora? I suppose I am fishing for "did you think Fedora faster"?
I haven't ever really used Ubuntu; I tried it a few times but it felt too "dumbed-down" for me.

About this GTK vs Qt thing, I know that GTK uses the cairo vector graphics library to draw its widgets (I sometimes play around with cairo, and I really like it). I don't know how Qt does it.

Also, I know Qt uses some weird system called "signals and slots". Everything that is controlled by a widget has to be a subclass of QObject for this to work, and it also has to cheat C++ by using a tool called moc to do it. I don't know if it is more or less efficient than GTK's simple callbacks, but GTK's system feels much more elegant to program with; I tried Qt programming and really didn't like it.
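
For reference, this is roughly what the GTK callback style looks like in C (GTK+ 2 vintage; the button label and the on_clicked handler are just made up for the sketch). The part relevant to this thread is that gtk_main() blocks inside the toolkit's event loop, so an idle GTK app should use essentially no CPU.

Code:
/* Build with the GTK+ 2 development files, e.g.:
 *   gcc demo.c $(pkg-config --cflags --libs gtk+-2.0) -o demo
 */
#include <gtk/gtk.h>

/* Plain C callback: runs only when the "clicked" signal is emitted. */
static void on_clicked(GtkWidget *widget, gpointer data)
{
    g_print("Button pressed\n");
}

int main(int argc, char *argv[])
{
    GtkWidget *window, *button;

    gtk_init(&argc, &argv);

    window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    button = gtk_button_new_with_label("Press me");

    /* Hook up the callbacks: no moc, no QObject subclassing. */
    g_signal_connect(button, "clicked", G_CALLBACK(on_clicked), NULL);
    g_signal_connect(window, "destroy", G_CALLBACK(gtk_main_quit), NULL);

    gtk_container_add(GTK_CONTAINER(window), button);
    gtk_widget_show_all(window);

    gtk_main();   /* sleeps in the event loop until something happens */
    return 0;
}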
 
Old 12-31-2009, 11:47 AM   #8
MrCode
Member
 
Registered: Aug 2009
Location: Oregon, USA
Distribution: Arch
Posts: 864
Blog Entries: 31

Rep: Reputation: 148
Quote:
About this GTK vs Qt thing, I know that GTK uses the cairo vector graphics library to draw its widgets (I sometimes play around with cairo, and I really like it). I don't know how Qt does it.
And I suppose the lack of acceleration in this regard would be the reason for the slowdown, then? Cairo's site doesn't mention anything about it...

It seems like somewhat of a waste to have the option of a H/W accelerated window manager (Compiz Fusion), but not H/W acceleration for the main widget library. Maybe that's just me, though.

I think part of the whole "bloat" problem is the attitude that "the resources are there to be used". Yes, this may be true, but that doesn't mean you should be lazy about it. They're there to be used, but they're to be used wisely (I'm one to talk... I don't do that much graphics programming anymore).

I suppose something to keep in mind is that your program is sharing memory/CPU time with others!

Last edited by MrCode; 12-31-2009 at 12:01 PM.
 
Old 12-31-2009, 01:22 PM   #9
MBybee
Member
 
Registered: Jan 2009
Location: wherever I can make a living
Distribution: OpenBSD / Debian / Ubuntu / Win7 / OpenVMS
Posts: 440

Rep: Reputation: 57
Quote:
Originally Posted by MrCode View Post
I suppose something to keep in mind is that your program is sharing memory/CPU time with others!
May that be tattooed on every dev's hand.

There was a time (I remember it, but many don't) when you were charged on a quantum basis for computing resources. CPU time, storage, network usage, whatever.

People now have machines that are absurdly powerful, yet something as stupid as Flash will use 90% of my quad-processor box. That's completely unreasonable (and why I tend to leave it disabled).

There is something to be said about GUI bloat - this same laptop running KDE is 96.1% idle when everything but the terminal is closed... but 98.4% idle running GNOME in the same mode. With X down entirely? 99.4%. The kdeinit4 proc tends to sit at 4% CPU no matter what is going on.

To wander back on topic - yes, some apps just suck. Some are polling, some are caching everything into RAM (just in case), and some are just horribly inefficient.
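
On the polling point, the polite alternative is usually just to let the kernel do the waiting. A minimal sketch in C (stdin here is only a stand-in for whatever descriptor the app actually cares about): poll() with a timeout of -1 puts the process to sleep until data arrives, instead of the app spinning and re-checking on its own.

Code:
#include <poll.h>
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    struct pollfd pfd = { .fd = STDIN_FILENO, .events = POLLIN };
    char buf[256];

    for (;;) {
        /* Timeout of -1: block indefinitely; no CPU is used while idle. */
        if (poll(&pfd, 1, -1) < 0) {
            perror("poll");
            return 1;
        }
        if (pfd.revents & POLLIN) {
            ssize_t n = read(STDIN_FILENO, buf, sizeof buf);
            if (n <= 0)
                break;               /* EOF or error: stop */
            printf("got %zd bytes\n", n);
        }
    }
    return 0;
}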

Last edited by MBybee; 12-31-2009 at 01:30 PM. Reason: Typo
 
Old 01-01-2010, 04:16 AM   #10
GTrax
Member
 
Registered: Oct 2005
Location: UK
Distribution: Mint
Posts: 258

Original Poster
Rep: Reputation: 37
Quote:
Originally Posted by MBybee View Post
May that be tattooed on every dev's hand.

There was a time (I remember it, but many don't) when you were charged on a quantum basis for computing resources. CPU time, storage, network usage, whatever.
I don't mind letting a CPU max out when there is a point to it. I object if the CPU is polling at max for its own sake - just "in case". On a 1-CPU machine it competes with other apps, and for some calculations it would make the PC unresponsive if the scheduling were poor.

Quote:
Originally Posted by MBybee
To wander back on topic - yes, some apps just suck. Some are polling, some are caching everything into RAM (just in case), and some are just horribly inefficient.
When I was still a student, the class was taken to a demo of what happens when intensive computing tasks hog the processor time. The whole room nearly came to a halt. Printers chattered briefly and occasionally, and the time between hitting <return> and getting a response stretched to seconds.

When the system was reworked to give the highest priorities to the slowest peripherals, the throughput was suddenly massive. It seems a fast calculation can be interrupted by something slow like a keyboard, the key-press processed, and control returned to the calculation, ages before the user's finger lifts from the key and the contacts stop bouncing.

No matter how clever the schedulers, they can't do much about greedy apps!
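
That said, a greedy-but-legitimate job can at least declare itself low priority and give the scheduler something to work with. A minimal sketch in C, with crunch() made up to stand in for the real work: nice(19) drops the process to the lowest normal priority, so on a busy single-CPU box the interactive programs get the cycles first.

Code:
#include <errno.h>
#include <stdio.h>
#include <unistd.h>

/* Placeholder for the real CPU-bound work (invented for illustration). */
static double crunch(void)
{
    double x = 0.0;
    for (long i = 1; i < 200000000L; i++)
        x += 1.0 / (double)i;
    return x;
}

int main(void)
{
    /* Lowest normal priority: we still use 100% of an otherwise idle CPU,
     * but interactive programs win whenever they want the processor. */
    errno = 0;
    if (nice(19) == -1 && errno != 0)
        perror("nice");

    printf("result: %f\n", crunch());
    return 0;
}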
 
Old 01-01-2010, 09:33 AM   #11
MrCode
Member
 
Registered: Aug 2009
Location: Oregon, USA
Distribution: Arch
Posts: 864
Blog Entries: 31

Rep: Reputation: 148
Quote:
Originally Posted by GTrax View Post
I don't mind letting a CPU max out when there is a point to it. I object if the CPU is polling at max for its own sake - just "in case". On a 1-CPU machine it competes with other apps, and for some calculations it would make the PC unresponsive if the scheduling were poor.
Totally agreed. It makes sense for the CPU usage to max out if it's doing something truly intensive, such as re-encoding video, rendering a Mandelbrot set fractal (although that could probably be accelerated *a little* by a GPU... right? Or is the precision required too high?), performing large/numerous file write/copy operations (lots of I/O), etc.
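
As an aside, the escape-time loop at the heart of a Mandelbrot render really is nothing but arithmetic, which is why it pins a core. A stripped-down version in C, with the sample point and iteration cap picked arbitrarily; it works in plain doubles, which is where the precision question comes in.

Code:
#include <stdio.h>

/* Escape-time count for one point c = cr + i*ci: iterate z = z*z + c
 * until |z| > 2 or the cap is hit. Pure floating-point arithmetic, so a
 * full-frame render keeps a CPU core completely busy. */
static int mandel_iters(double cr, double ci, int max_iter)
{
    double zr = 0.0, zi = 0.0;
    int n = 0;
    while (zr * zr + zi * zi <= 4.0 && n < max_iter) {
        double tmp = zr * zr - zi * zi + cr;
        zi = 2.0 * zr * zi + ci;
        zr = tmp;
        n++;
    }
    return n;               /* n == max_iter: the point is (probably) in the set */
}

int main(void)
{
    /* One sample point near the boundary; a real renderer does this per pixel. */
    printf("iterations: %d\n", mandel_iters(-0.75, 0.1, 1000));
    return 0;
}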

But if it's just doing something stupid like waiting for input (or "polling", as you put it), and it's taking 100% CPU time in the process, then that's just ridiculous, and it will slow things down on a lower-end system.
 
Old 01-01-2010, 12:50 PM   #12
MTK358
LQ 5k Club
 
Registered: Sep 2009
Posts: 6,443
Blog Entries: 3

Rep: Reputation: 723
Quote:
Originally Posted by MrCode View Post
the attitude that "the resources are there to be used"
I see that quite often and really don't like it.
 
Old 01-01-2010, 03:44 PM   #13
jens
Senior Member
 
Registered: May 2004
Location: Belgium
Distribution: Debian, Slackware, Fedora
Posts: 1,463

Rep: Reputation: 299
Quote:
Originally Posted by MTK358 View Post
I see that quite often and really don't like it.
Why?

Sure, some programs are terrible, but maxing out, when done correctly, is also a very modern way of sharing resources (using whatever is available without harming performance).

Don't buy ultra-powerful CPUs if you don't need them.
Never using all your CPU is a waste of money.

Last edited by jens; 01-01-2010 at 03:59 PM.
 
Old 01-01-2010, 04:55 PM   #14
jens
Senior Member
 
Registered: May 2004
Location: Belgium
Distribution: Debian, Slackware, Fedora
Posts: 1,463

Rep: Reputation: 299
Quote:
Originally Posted by GTrax View Post
just "in case" On a 1-CPU machine, it competes with other apps
Just like it should.
With single-core systems, the CPU is often your bottleneck.
Not using all of it would always be a performance killer.

That "just in case" part (loading into RAM) could also save you critical data.
 
  

