I've seen GUI-based apps that are written incorrectly and cause 100% usage: instead of truly giving up the processor, they poll with zero-length timeouts to stay responsive or to emulate real-time reporting. The end result is that while this doesn't kill the system or slow it down, it is a loaded situation that can cause stability problems.
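To make the pattern concrete, here's a minimal Python sketch (the function names and 100 ms interval are my own illustration, not from any particular app): the first loop polls with a zero timeout and spins a core at 100%, while the second blocks with a short timeout, yielding the CPU when idle but still waking often enough to feel real-time.

```python
import queue
import threading

events = queue.Queue()

def busy_poll(stop):
    # Broken pattern: a zero-length timeout never blocks,
    # so the loop spins flat out and pegs one core at 100%.
    while not stop.is_set():
        try:
            events.get_nowait()   # returns immediately when empty...
        except queue.Empty:
            pass                  # ...and we loop again instantly

def well_behaved_poll(stop):
    # Fixed pattern: block with a short timeout. The thread sleeps
    # (uses ~0% CPU) while idle, yet still wakes every 100 ms,
    # which is plenty for a monitoring display.
    while not stop.is_set():
        try:
            events.get(timeout=0.1)   # yields the CPU for up to 100 ms
        except queue.Empty:
            pass
```

The same idea applies in any GUI toolkit: wait on the event source with a timeout (or a timer callback) rather than checking it in a tight loop.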
When you run the monitor, does it slow the system so much that you can barely use it, or does the system still behave normally and only the CPU usage statistic look bad?
My read is that if the system operates near normal, the monitor application itself is the problem due to poor coding.