TB0ne |
01-16-2014 02:30 PM |
Quote:
Originally Posted by bluegospel
(Post 5099308)
Say I had a website that was one page, index.php, widely variable in content by url. Say I had 13,000 generic "hits" since the beginning of October. How many would probably be cyberbots?
|
Your question makes little sense. You have one page...but with "widely variable" content?? Are you rewriting the PHP page constantly? And really, it doesn't even MATTER what's on the page if you're getting hits. This:
Quote:
Originally Posted by dugan
You don't speculate. You check the user-agents in the web server logs.
|
..would be a VERY obvious place to start for anyone who has a web server and is even vaguely interested in traffic management, and these topics are easily found with a brief Google search. There is NO foolproof way to determine this, since real users can mask their user agents, and bots can easily spoof them. Again, a basic Google search would lead you to Google Analytics, which can help break these things down. You're asking for a percentage, and there's no way anyone can answer that.
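If you want a rough first pass, here's a minimal sketch of what checking user agents in the logs might look like. It assumes logs in the standard Apache/Nginx "combined" format; the sample log lines and the list of bot keywords below are made up for illustration, and (as said above) user agents are trivially spoofable, so treat the result as a hint, not a real percentage:

```python
import re
from collections import Counter

# Sample lines in Apache/Nginx "combined" log format (invented for illustration).
SAMPLE_LOG = '''\
1.2.3.4 - - [16/Jan/2014:02:30:00 -0500] "GET /index.php?p=1 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (Windows NT 6.1) Firefox/26.0"
5.6.7.8 - - [16/Jan/2014:02:31:00 -0500] "GET /index.php?p=2 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
9.8.7.6 - - [16/Jan/2014:02:32:00 -0500] "GET /robots.txt HTTP/1.1" 200 64 "-" "Bingbot/2.0"
'''

# The user agent is the last quoted field in the combined format.
UA_RE = re.compile(r'"([^"]*)"\s*$')

# Substrings that *suggest* a crawler; easily spoofed, so a hint only.
BOT_HINTS = ('bot', 'crawl', 'spider', 'slurp')

def classify(log_text):
    """Count hits whose user agent looks bot-like vs. everything else."""
    counts = Counter()
    for line in log_text.splitlines():
        m = UA_RE.search(line)
        if not m:
            continue  # line doesn't match the expected format
        ua = m.group(1).lower()
        kind = 'bot' if any(h in ua for h in BOT_HINTS) else 'human?'
        counts[kind] += 1
    return counts

print(classify(SAMPLE_LOG))  # Counter({'bot': 2, 'human?': 1})
```

Point it at your real access log instead of SAMPLE_LOG and you'll get a ballpark breakdown, nothing more.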
And didn't you say we should "rest assured" that you were leaving? More than once?
|