I've noticed there's a command-line tool called "factor", which prints the prime factors of a number.
Does anybody have an idea why this is included as a standard Unix/Linux utility? Is it really that important to do prime factorizations on the command line? Most programming languages don't have a built-in factor command, so why does a very popular OS family ship one?
There must be some interesting history to this I'm missing. The only use I've gotten out of factor is getting a rough comparison of processor speeds on my various boxes by doing
...which takes from 52 seconds to over 2 minutes, depending on the machine.
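For anyone unfamiliar with the tool, here is a minimal sketch of both the basic usage and the rough-benchmark trick described above; the large operand below is an arbitrary stand-in, since the actual number from the timing comparison isn't shown:

```shell
# Basic usage: factor echoes the number, a colon, then its prime factors.
factor 100            # -> 100: 2 2 5 5

# The rough CPU benchmark: time how long factoring a large number takes.
# (Arbitrary example operand -- the original post's number was omitted.)
time factor 4611686018427387847
```

How long the timed call runs depends heavily on the operand and on the factor implementation (GNU coreutils' factor uses fast algorithms, so only quite large numbers take measurable time), which is why it only gives a rough comparison between machines.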
Yours in curiosity,