Linux - General
This Linux forum is for general Linux questions and discussion. If it is Linux related and doesn't seem to fit in any other forum, then this is the place.
Some tasks can thrash a CPU without seeming to do much.
Before the upgrade, the answer for my old kit was "nearly all".
Now I have an Intel i7 CPU and 4 GB of RAM. On a GNOME desktop (Ubuntu), running "System Monitor" set to show the resources graphs, it displays 8 CPUs (two threads per core). It's easy to see when it's working hard.
For example:
Clicking the "Help" tab on a GNU app to bring up the manual, and also, then clicking on any of the subjects in the Contents page. In my case it was "Gnumeric" spreadsheet. One CPU thread will run to 100% for a full 35 seconds, even though the page appears after 10 seconds! For a CPU of this power, simply to display text, that is an extraordinary amount of computing! I can hear the cooling fan rev up to maximum while it does it.
I also sometimes see more memory in use than there is in the entire distro's pile of software. I would expect the hard drive to be able to spin down and stay down.
So what is it that does this? Are some apps just greedy, stuck in useless loops waiting for "something"?
My computer also has an Intel Core i7 (and 6GB of RAM), and I never had such problems. Typically memory usage is under 1GB with a few programs running. Maybe it's an Ubuntu thing?
The only time where one of the "8 CPUs" gets to 100% is when a program is doing a very long mathematical computation, compiling large packages, or hangs up.
I found that if a program does not sleep for even about a second, the CPU will go to 100%. Most programs sleep while waiting for user input.
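That difference between spinning and sleeping is easy to demonstrate. The sketch below (illustrative code, not from any real application) contrasts a busy-wait loop, which burns a full core of CPU time, with a loop that sleeps between checks and so accumulates almost no CPU time even though wall-clock time passes:

```c
#include <time.h>
#include <unistd.h>

/* Hypothetical flag that some other thread or event handler would set. */
static volatile int ready = 0;

/* Busy-wait: spins re-checking the flag. This is the pattern that
 * pins one of the "8 CPUs" at 100% and revs the cooling fan. */
static double busy_wait_cpu_seconds(long iterations)
{
    clock_t start = clock();            /* CPU time, not wall time */
    for (long i = 0; i < iterations && !ready; i++)
        ;                               /* spin: the CPU never rests */
    return (double)(clock() - start) / CLOCKS_PER_SEC;
}

/* Sleeping wait: blocks between checks, so the scheduler can run
 * other processes and our CPU time stays near zero. */
static double sleep_wait_cpu_seconds(int checks)
{
    clock_t start = clock();
    struct timespec ts = { 0, 10 * 1000 * 1000 };   /* 10 ms per nap */
    for (int i = 0; i < checks && !ready; i++)
        nanosleep(&ts, NULL);           /* blocked, not spinning */
    return (double)(clock() - start) / CLOCKS_PER_SEC;
}
```

Run both and compare `clock()` readings: the spinning version consumes orders of magnitude more CPU time for the same wait.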
Actually, pretty much anything that involves smooth unaccelerated graphics animation tends to make the CPU go full throttle. I'm using Ubuntu/GNOME, and therefore most (graphical) applications use GTK+... the whole CPU usage thing has always bugged me a bit. Qt apps don't tend to go full throttle as much, but it still seems like more processing than should be necessary. Is it just the cross-platform nature of these toolkits, or what? Or maybe it's that GTK is mostly callback-driven. Not sure about Qt.
Also, things like emulators (e.g. DOSBox) and virtual machines (e.g. VirtualBox VMs) tend to use 100% CPU when doing even somewhat minor tasks, for (what should be) obvious reasons.
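The "obvious reasons" are worth spelling out: an emulator pays a fetch-and-decode overhead on every guest instruction, so even light guest workloads keep a host core busy. A toy bytecode interpreter (invented opcodes, purely for illustration) shows the shape of that inner loop:

```c
#include <stddef.h>

/* Toy bytecode interpreter. Every guest "instruction" costs a fetch,
 * a decode (the switch), and then the actual work -- several host
 * instructions per guest instruction, which is why emulators like
 * DOSBox keep a core busy for modest guest workloads. */
enum { OP_HALT, OP_INC, OP_ADD };

static int run(const unsigned char *code)
{
    int acc = 0;                        /* the guest's accumulator */
    for (size_t pc = 0; ; pc++) {       /* fetch next opcode */
        switch (code[pc]) {             /* decode */
        case OP_INC:                    /* execute: acc += 1 */
            acc += 1;
            break;
        case OP_ADD:                    /* execute: acc += operand */
            acc += code[++pc];
            break;
        case OP_HALT:
            return acc;
        }
    }
}
```

Full-system emulators add device models, timers, and memory translation on top of this loop, multiplying the per-instruction cost further.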
Quote:
Originally Posted by MTK358
My computer also has an Intel Core i7 (and 6GB of RAM), and I never had such problems. Typically memory usage is under 1GB with a few programs running. Maybe it's an Ubuntu thing?
MTK358 - Yes, I used Ubuntu because, I confess, lazy convenience. Have you maybe tried Ubuntu to compare to Fedora? I suppose I am fishing for "did you think Fedora faster"?
Quote:
Originally Posted by MrCode
I'm using Ubuntu/GNOME and therefore most (graphical) applications use GTK+...the whole CPU usage thing has always bugged me a bit.
Though my work requires intensive mathematical computation, and I now use VirtualBox as well, I have always been able to let that run while also using the desktop for more mundane stuff. Now I am starting to consider a "speed" partition, with a distro that does not have so much overhead. :|
I am also thinking that in the past, computers that were relatively feeble compared to what most of us use now seemed able to do much more with what little they had, and credibly so.
Quote:
So what is it that does this? Are some apps just greedy, stuck in useless loops waiting for "something"?
Some apps will indeed max out for maximum speed/performance (mostly 3D games, compilers, video editors ...)
Many of them have an option to stop/change this (if that's what you want, CPU is there to be used).
Others are just badly written and not yet optimized for fast systems like yours.
I have an AMD Phenom (quad core) and 4GB RAM, running Debian AMD64. I keep one CPU maxed all the time, and a large chunk of memory used as well, because I leave Firefox open with 11 or 12 tabs. Other than that, I do not have any issues with a maxed CPU or large memory consumption.
Quote:
MTK358 - Yes, I used Ubuntu because, I confess, lazy convenience. Have you maybe tried Ubuntu to compare to Fedora? I suppose I am fishing for "did you think Fedora faster"?
I haven't ever really used Ubuntu; I tried it a few times, but it felt too "dumbed-down" for me.
About this GTK vs Qt thing: I know that GTK uses the cairo vector graphics library to draw its widgets (I sometimes play around with cairo, and I really like it). I don't know how Qt does it.
Also, I know Qt uses a system called "signals and slots". Everything controlled by a widget has to be a subclass of QObject for this to work, and Qt has to extend C++ with a tool called moc to make it possible. I don't know if it is more or less efficient than GTK's simple callbacks, but GTK's system feels much more elegant to program with; I tried Qt programming and really didn't like it.
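For readers who haven't seen it, a GTK-style callback is essentially a stored function pointer plus a user-data pointer that the main loop invokes when the event fires. This is a schematic in plain C, not the real GTK API (the names here are invented for illustration):

```c
#include <stdio.h>

/* Schematic of callback-driven event handling, GTK-style:
 * the toolkit remembers a function pointer and an opaque
 * user-data pointer, and calls them when the event occurs. */
typedef void (*click_cb)(void *user_data);

struct button {
    click_cb on_click;  /* handler registered by the application */
    void *user_data;    /* passed back untouched to the handler */
};

/* Analogous (loosely) to g_signal_connect in real GTK code. */
static void connect_click(struct button *b, click_cb cb, void *data)
{
    b->on_click = cb;
    b->user_data = data;
}

/* The main loop calls this when the event arrives. */
static void emit_click(struct button *b)
{
    if (b->on_click)
        b->on_click(b->user_data);
}

/* Example handler: counts clicks via the user-data pointer. */
static void count_clicks(void *data)
{
    (*(int *)data)++;
}
```

Qt's signals/slots achieve the same dispatch, but moc generates the bookkeeping (type-safe connection tables on each QObject) instead of the programmer wiring raw function pointers.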
Quote:
Originally Posted by MTK358
About this GTK vs Qt thing: I know that GTK uses the cairo vector graphics library to draw its widgets (I sometimes play around with cairo, and I really like it). I don't know how Qt does it.
And I suppose the lack of acceleration in this regard would be the reason for the slowdown, then? Cairo's site doesn't mention anything about it...
It seems like somewhat of a waste to have the option of a H/W accelerated window manager (Compiz Fusion), but not H/W acceleration for the main widget library. Maybe that's just me, though.
I think part of the whole "bloat" problem is the attitude that "the resources are there to be used". Yes, that may be true, but it doesn't mean you should be lazy about it. They're there to be used, but used wisely (I'm one to talk... I don't do that much graphics programming anymore).
I suppose something to keep in mind is that your program is sharing memory/CPU time with others!
Quote:
I suppose something to keep in mind is that your program is sharing memory/CPU time with others!
May that be tattooed on every dev's hand.
There was a time (I remember it, but many don't) when you were charged on a quantum basis for computing resources. CPU time, storage, network usage, whatever.
People now have machines that are absurdly powerful, yet something as stupid as Flash will use 90% of my quad-processor box. That's completely unreasonable (and why I tend to leave it disabled).
There is something to be said for GUI bloat - my same laptop running KDE is 96.1% idle when everything but the terminal is closed... but 98.4% idle running Gnome in the same mode. With X down entirely? 99.4%. The kdeinit4 proc tends to sit at 4% CPU no matter what is going on.
To wander back on topic - yes, some apps just suck. Some are polling, some are caching everything into RAM (just in case), and some are just horribly inefficient.
Last edited by MBybee; 12-31-2009 at 01:30 PM.
Reason: Typo
Quote:
Originally Posted by MBybee
There was a time (I remember it, but many don't) when you were charged on a quantum basis for computing resources. CPU time, storage, network usage, whatever.
I don't mind letting a CPU max out when there is a point to it. I object if the CPU is polling at max for itself, just "in case". On a 1-CPU machine it competes with other apps, and for some calculations it would make the PC unresponsive if the scheduling was poor.
Quote:
Originally Posted by MBybee
To wander back on topic - yes, some apps just suck. Some are polling, some are caching everything into RAM (just in case), and some are just horribly inefficient.
When I was still in my education phase, the class was taken to a demo of what happens when intensive computing tasks hog the processor time. The whole room nearly came to a halt: printers chattered only briefly and occasionally, and the time between hitting <return> and getting a response stretched to seconds.
When reworked to give the highest priorities to the slowest peripherals, suddenly the throughput was massive. It seems a fast calculation can be interrupted by a slow thing like a keyboard, the key-push processed, and control returned to the fast calculation, long ages before the user's finger lifts from the key and the contacts stop bouncing.
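On Linux, a long-running calculation can volunteer for the same treatment by raising its own nice value, so the scheduler favours interactive processes over it. A minimal sketch using nice(2) and getpriority(2) (the helper name is my own):

```c
#include <errno.h>
#include <sys/resource.h>
#include <unistd.h>

/* Raise our own nice value so interactive work preempts this
 * process. A positive nice value means lower priority; only
 * root may lower the value again afterwards. Returns the new
 * nice value, or -1 on error. */
static int deprioritize(int extra_niceness)
{
    errno = 0;
    /* nice() can legitimately return -1 (the new value), so we
     * must distinguish that from failure by checking errno. */
    if (nice(extra_niceness) == -1 && errno != 0)
        return -1;
    return getpriority(PRIO_PROCESS, 0);    /* our current nice value */
}
```

Calling something like `deprioritize(10)` at the start of a batch computation is a cheap way to keep the desktop responsive while the calculation still soaks up all otherwise-idle CPU time.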
No matter how clever the schedulers, they can't do much about greedy apps!
Quote:
I don't mind letting a CPU max out when there is a point to it. I object if the CPU is polling at max for itself, just "in case". On a 1-CPU machine it competes with other apps, and for some calculations it would make the PC unresponsive if the scheduling was poor.
Totally agreed. It makes sense for CPU usage to max out if it's doing something truly intensive, such as re-encoding video, rendering a Mandelbrot set fractal (although that could probably be accelerated *a little* by a GPU... right? Or is the precision required too high?), performing large/numerous file write/copy operations (lots of I/O), etc.
But if it's just doing something stupid like waiting for input (or "polling", as you put it), and it's taking 100% CPU time in the process, then that's just ridiculous, and it will slow things down on a lower-end system.
Sure, some programs are terrible, but maxing out is normally (if done correctly) a very modern way of sharing resources as well (using only whatever is available without harming performance).
Don't buy ultra-powerful CPUs if you don't need them.
Never using all your CPU is a waste of money.