How to compare memory usage across Python scripts?
I have 7 Python scripts averaging 300 lines each. They are crawlers that run indefinitely.
I've read that "top" doesn't report the memory numbers truthfully: http://serverfault.com/questions/736...g-memory-usage How can I see which script is the most resource-intensive, so I can work on optimizing it? |
Many things are said about Linux memory usage - most of them wrong, or hard to interpret correctly.
Go find ps_mem.py and use that. Being Python, you can read it directly - it has some excellent comments for background. |
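As background, ps_mem gets its numbers from the per-process files under /proc rather than from top. A minimal sketch of the same idea (Linux only, and much cruder than what ps_mem actually does - it uses the PSS figures from smaps to apportion shared memory fairly):

```python
import os

def rss_kib(pid):
    """Return a process's resident set size (VmRSS) in KiB,
    read from /proc/<pid>/status. Linux only."""
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])  # field is reported in kB
    return 0

# Sample this process's own resident memory.
print(rss_kib(os.getpid()))
```

Running this for each crawler's pid gives a rough per-process comparison; ps_mem itself is the better tool because plain RSS double-counts memory shared between processes.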
Does the program need to terminate to make that work? As I said, it runs indefinitely.
|
https://docs.python.org/2/library/profile.html
http://stackoverflow.com/questions/5...-python-script You could try starting and stopping the profiler programmatically, so the process itself keeps running indefinitely. |
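A sketch of that start/stop approach with the stdlib's cProfile (note it profiles time spent in functions, not memory). The `crawl_once` function is a hypothetical stand-in for one iteration of the crawler's main loop:

```python
import cProfile
import io
import pstats

profiler = cProfile.Profile()

def crawl_once():
    # Hypothetical placeholder for one iteration of the crawler loop.
    sum(i * i for i in range(10000))

# Enable the profiler for a bounded window of an otherwise
# endless loop, then disable it without terminating the process.
profiler.enable()
for _ in range(5):
    crawl_once()
profiler.disable()

# Dump the collected stats; the process can carry on afterwards.
buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)
print(buf.getvalue())
```

In a real crawler you would trigger enable/disable from a signal handler or a timer rather than a fixed loop count.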
No, it's a snapshot. Once a task terminates, all the allocation data is lost - unless you have auditing and/or a history monitor in place. The latter is the better long-term option - collectl, say.
Run ps_mem at an interval (20 secs, 5 mins, 30 mins, whatever) and send the output to a file. It's then easy to see any increase for a particular task. |
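The interval-logging idea above can also be done from Python if you'd rather not shell out. A minimal sketch, assuming Linux /proc and a hypothetical log file name `mem.log` (short interval and sample count here just for illustration):

```python
import os
import time

def rss_kib(pid):
    """VmRSS in KiB from /proc/<pid>/status. Linux only."""
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])  # reported in kB
    return 0

def log_memory(pids, logfile, interval=20, samples=None):
    """Append one timestamped RSS line per pid each interval;
    run forever when samples is None."""
    taken = 0
    while samples is None or taken < samples:
        stamp = time.strftime("%Y-%m-%d %H:%M:%S")
        with open(logfile, "a") as out:
            for pid in pids:
                out.write(f"{stamp} pid={pid} rss_kib={rss_kib(pid)}\n")
        taken += 1
        if samples is None or taken < samples:
            time.sleep(interval)

# Two quick samples of this process, one second apart.
log_memory([os.getpid()], "mem.log", interval=1, samples=2)
```

Pointing it at the crawlers' pids and graphing the file over a day or two makes a slow leak obvious.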
You can pass in a list of pids, so you don't get a lot of unnecessary entries if you don't want them. The total line then also covers just those tasks.
|