Hi, I'm trying to set up AWStats but I am getting the error below.
Error: SiteDomain parameter not defined in your config/domain file. You must edit it for using this version of AWStats.
Setup ('/etc/awstats/awstats.conf' file, web server or permissions) may be wrong.
Check config file, permissions and AWStats documentation (in 'docs' directory).
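(For reference, the fix this error is asking for is usually just to define SiteDomain, and while you're at it HostAliases, in the config file; the domain below is a placeholder, so substitute your own:

Code:
# /etc/awstats/awstats.conf
# Main domain/hostname of the site being analyzed (placeholder value).
SiteDomain="www.example.com"
# Other names and IPs the site answers to, so self-hits are classed correctly.
HostAliases="example.com www.example.com localhost 127.0.0.1"
)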
There are 3 IP addresses in there, and they are all part of a reserved range; that would indicate to me that any public external traffic is being passed through a NATting firewall. It's possible that your firewall is setting something like the X-Forwarded-For header in the requests, but your server isn't configured to log it.
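If so, one way to check is to log that header. A minimal sketch, assuming Apache's standard mod_log_config (the "combined_xff" format name is just illustrative):

Code:
# Log X-Forwarded-For alongside the usual combined fields.
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" xff=\"%{X-Forwarded-For}i\"" combined_xff
CustomLog /var/log/apache2/access.log combined_xff

If the xff field stays "-", the proxy isn't sending the header and will need to be configured to add it.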
Even with the "real" IP addresses in the log unless you use analytics tracking code (with cookies) in your application you will have to make guesses and assumptions. For example, without some form of unique browser cookie you won't be able to tell if there are multiple users behind a single IP. Take an example of a company that has 100 users behind a firewall, you'll see only the one entry for the company "exit" IP in your logs so you may only count it as 1.
webalizer does a pretty good job of capturing and graphing domain and web page usage by parsing the web server logs.
By default, it will summarize and report statistics for the last 12 months.
[To clarify: it starts with the logs that are available, typically one month's worth, but it will continue to gather statistics and "roll over" at the end of a year. However, I note I have several sites configured to keep 24 months of stats.]
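As a rough sketch of a per-site setup (the paths and hostname below are placeholders, not a recommendation):

Code:
# /etc/webalizer/example.conf -- one config, and one log, per domain
LogFile     /var/log/apache2/example.com-access.log
OutputDir   /var/www/stats/example.com
HostName    example.com
# Keep state between runs so the same log lines aren't counted twice.
Incremental yes

Then run "webalizer -c /etc/webalizer/example.conf" from cron, typically after each log rotation.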
But, yes, the OP has challenges in that what's being recorded in the logs are all internal addresses. They need to identify why that's happening.
Perhaps there is a proxy in front of Apache2, maybe Varnish or something. If that is the case then maybe look at mod_remoteip, which has replaced mod_rpaf, or something equivalent.
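A minimal sketch, assuming Apache 2.4 and a proxy that sends X-Forwarded-For (10.0.0.5 is a placeholder for your proxy's internal IP):

Code:
# Restore the real client address from the proxy's header.
LoadModule remoteip_module modules/mod_remoteip.so
RemoteIPHeader X-Forwarded-For
# Only trust the header when it arrives from your own proxy.
RemoteIPInternalProxy 10.0.0.5

With that in place, %h/%a in the access log (and therefore what AWStats or webalizer sees) becomes the real client address instead of the proxy's.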
Yes, they are all internal. In between there is a reverse proxy: whenever a customer tries to connect to the web server, it redirects to an internal IP address, so it's hard to track the external IP address.
It doesn’t filter anything. It just collates and graphs usage for a domain.
Edit: More correctly, it collates the traffic recorded in a log file; we just ensure that every domain is using a unique log file.
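In Apache terms, that just means giving each virtual host its own CustomLog; a sketch with placeholder names:

Code:
<VirtualHost *:80>
    ServerName example.com
    # Separate log per domain so the stats reports stay per-site.
    CustomLog /var/log/apache2/example.com-access.log combined
</VirtualHost>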
The User-Agent string should help you; try using it. It's not difficult to work with, and if you have access to the portal and record the User-Agent, everything should be OK.
Why not write your own traffic report, using a simple PHP script included in all the pages you want to track? It's much easier and lower-overhead than third-party analytics, and you have much more control over what information to report on. I did this years ago and quickly learned how to distinguish between robot traffic, my own accesses, one-time visitors, and repeat visitors.
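A minimal sketch of that idea (the log path and field choices are mine, not a standard): include it at the top of every tracked page and it appends one line per hit.

Code:
<?php
// track.php - include from each page you want to count.
// NOTE: the log path is a placeholder; the web server must be able to write to it.
$ip = $_SERVER['HTTP_X_FORWARDED_FOR']   // real client, if your proxy forwards it
    ?? $_SERVER['REMOTE_ADDR']           // otherwise the (possibly internal) peer
    ?? '-';
$line = sprintf("%s\t%s\t%s\t%s\n",
    date('c'),                           // timestamp
    $ip,
    $_SERVER['HTTP_USER_AGENT'] ?? '-',  // helps separate robots from browsers
    $_SERVER['REQUEST_URI'] ?? '-');     // page requested
file_put_contents('/var/log/site/hits.log', $line, FILE_APPEND | LOCK_EX);
?>

Only trust HTTP_X_FORWARDED_FOR if the header is set by your own proxy, since clients can forge it.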
I second writing your own to start with - there's a lot you need to understand as you can see from previous comments, and the best way is to roll your own (initially).
Eventually you may decide to use a pre-written tool, but even the effort of trying to do your own will teach you a lot about how to choose the right tool for you.
As above, you can never be really sure about counting unique visitors, but you can come reasonably close.