Old 08-27-2016, 05:22 PM   #1
wh33t
Member
 
Registered: Oct 2003
Location: Canada
Posts: 922

Rep: Reputation: 61
Realistically, how many concurrent users do you think this server could handle?


Here is the hardware: http://ca.pcpartpicker.com/list/zkvwzM
OS will be Ubuntu Server 16.04 LTS

I'm hoping I can set this up as a webserver and handle around 300 concurrent connections. I'm flexible on the web server; I'm familiar with the default LAMP stack that Ubuntu provides, but nginx seems pretty good also. Apparently Apache2 leaks memory?

The internet connection is 15 Mbit. The website will be a custom scripted forum that makes every attempt to be lean on resources.

I know it's difficult to give an absolute yes or no, but figured it might be worth it to know ahead of time if there is no chance at all it could do this.
 
Old 08-28-2016, 04:25 AM   #2
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
Quote:
Originally Posted by wh33t View Post
The website will be a custom scripted forum that makes every attempt to be lean on resources.
Any reason to not use an actively developed, well-maintained existing product? And did you create it? If so: are you a prolific coder with a keen eye for security as well as performance? If not: 0) what does the developer say wrt performance? And 1) will you have enough support to get at least security problems fixed stat?


Quote:
Originally Posted by wh33t View Post
I know it's difficult to give an absolute yes or no, but figured it might be worth it to know ahead of time if there is no chance at all it could do this.
300 concurrent users "doesn't seem much". But unless you actually test a Real Life setup you won't know, and you won't know what to tune. I'd say install VirtualBox on your current workstation (if you don't mind the extra software layer) to run the headless server setup as you designed it. (Or use a cloud VM somewhere?) Ensure the OS provides performance information and that each component provides enough (debug?) logging. Then use any network stress testing tool from your laptop and pound the heck out of it in a realistic way (meaning use URIs an actual user might use) while keeping the various components' /status-like URIs and top-like terminal windows open to see if bottlenecks turn up during the test. Analyse data, tweak, test the result, document changes. Rinse. Repeat.
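For illustration, here's a minimal sketch of the monitoring half of such a test run; the tools are the stock procps/sysstat ones, and the log path is just a guess at your layout:

Code:
# Run each of these in its own terminal on the server while the load test runs.
vmstat 5                              # CPU, memory, swap and I/O, sampled every 5 seconds
iostat -x 5                           # per-disk utilisation (from the sysstat package)
tail -f /var/log/apache2/error.log    # watch for worker or PHP errors as the load climbs
# Keep top (or htop) open as well to spot runaway Apache/MySQL/PHP processes.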
 
1 members found this post helpful.
Old 08-28-2016, 08:39 AM   #3
24x7servermanagement
Member
 
Registered: Jul 2016
Location: India
Distribution: CentOS, Redhat, Ubuntu and Debian
Posts: 57

Rep: Reputation: Disabled
The question is a fairly common one, and there is no perfect answer until you find one yourself with your own setup.

However, from my experience, with that hardware and nginx you could handle 150-200 connections for sure. So do stress testing as suggested above; I would advise doing it from different IPs.
 
1 members found this post helpful.
Old 08-29-2016, 01:08 AM   #4
wh33t
Member
 
Registered: Oct 2003
Location: Canada
Posts: 922

Original Poster
Rep: Reputation: 61
Quote:
Originally Posted by unSpawn View Post
Any reason to not use an actively developed, well-maintained existing product? And did you create it? If so: are you a prolific coder with a keen eye for security as well as performance? If not: 0) what does the developer say wrt performance? And 1) will you have enough support to get at least security problems fixed stat?



300 concurrent users "doesn't seem much". But unless you actually test a Real Life setup you won't know, and you won't know what to tune. I'd say install VirtualBox on your current workstation (if you don't mind the extra software layer) to run the headless server setup as you designed it. (Or use a cloud VM somewhere?) Ensure the OS provides performance information and that each component provides enough (debug?) logging. Then use any network stress testing tool from your laptop and pound the heck out of it in a realistic way (meaning use URIs an actual user might use) while keeping the various components' /status-like URIs and top-like terminal windows open to see if bottlenecks turn up during the test. Analyse data, tweak, test the result, document changes. Rinse. Repeat.
I am the coder; I've written content management systems in PHP for about 10 years. I dunno if that instills a hope of faith in me but there it is. When I script I always try to think about security, and I have had issues in the past where hackers got into my sites, so I have put some effort and focus into security. But security is a journey, right? Not a destination. It's something that you have to continually keep an eye on by checking logs, doing regular audits, etc. Feel free to suggest to me any other measures you think I should take. I know that I will do as much server hardening as I can before I even take it live.

As for WRT, not sure what that is. I've considered trying to put a stress test on a server and a connection, but how would I do that without having a bot farm? I considered perhaps writing a script and launching it from a VPS with a 100 Mbit connection that would open X number of scripts every few seconds, and then watching my htop output from a shell. Do you think that's a good idea?
 
Old 08-29-2016, 01:09 AM   #5
wh33t
Member
 
Registered: Oct 2003
Location: Canada
Posts: 922

Original Poster
Rep: Reputation: 61
Quote:
Originally Posted by 24x7servermanagement View Post
The question is a fairly common one, and there is no perfect answer until you find one yourself with your own setup.

However, from my experience, with that hardware and nginx you could handle 150-200 connections for sure. So do stress testing as suggested above; I would advise doing it from different IPs.
Any tips for stress testing? Are you familiar with Apache? Do you think it would be wise to switch to Nginx right away, or deal with that later if the machine starts getting bogged down?
 
Old 08-29-2016, 05:31 AM   #6
TenTenths
Senior Member
 
Registered: Aug 2011
Location: Dublin
Distribution: Centos 5 / 6 / 7
Posts: 3,475

Rep: Reputation: 1553
Quote:
Originally Posted by wh33t View Post
I'm hoping I can set this up as a webserver and handle around 300 concurrent connections.
Do you really mean concurrent connections or concurrent users?

There's a big difference.

300 "concurrent users" will not necessarily be holding TCP connections open during their sessions (also depends on what you class as a concurrent user).
 
1 members found this post helpful.
Old 08-29-2016, 12:06 PM   #7
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
Quote:
Originally Posted by wh33t View Post
I dunno if that instills a hope of faith in me but there it is.
It does. (Especially if you made mistakes and learned from them but then again the rest of your reply kind of implies that already...)


Quote:
Originally Posted by wh33t View Post
Feel free to suggest to me any other measures you think I should take.
Can anyone download the code for free and audit it?


Quote:
Originally Posted by wh33t View Post
As for WRT, not sure what that is.
Lower case as in "with respect to".


Quote:
Originally Posted by wh33t View Post
I've considered trying to put a stress test on a server and a connection, but how would I do that without having a bot farm? I considered perhaps writing a script and launching it from a VPS with a 100 Mbit connection that would open X number of scripts every few seconds, and then watching my htop output from a shell. Do you think that's a good idea?
Assuming you know your own products inside out, you know what you have to optimize database / PHP / caching-wise and what the potential bottlenecks of the product are, right? So maybe a bit of an explanation of what you're looking for could help? Personally I like siege because I can take just any web server access log, awk '{print}' the request field and use that. Of course there's tools and more tools...
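To make that concrete, a minimal sketch of that workflow; the log path and the field number assume Apache's default combined log format, and the hostname is a placeholder:

Code:
# Pull the request paths out of an existing access log ($7 is the path in the
# combined log format), de-duplicate them, and turn them into full URLs...
awk '{print $7}' /var/log/apache2/access.log | sort -u > urls.txt
sed -i 's|^|http://test-server.example|' urls.txt
# ...then replay them with 300 simulated users for five minutes.
siege -c 300 -t 5M -f urls.txt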
 
1 members found this post helpful.
Old 08-29-2016, 01:06 PM   #8
Rinndalir
Member
 
Registered: Sep 2015
Posts: 733

Rep: Reputation: Disabled
300 users hitting PHP hard can use a lot of resources. The best thing to do is some testing of the box on your LAN first to see how it scales.

But the question's very broad, so it's hard to give specific answers with details.
 
1 members found this post helpful.
Old 08-31-2016, 02:56 PM   #9
24x7servermanagement
Member
 
Registered: Jul 2016
Location: India
Distribution: CentOS, Redhat, Ubuntu and Debian
Posts: 57

Rep: Reputation: Disabled
Quote:
Originally Posted by wh33t View Post
Any tips for stress testing? Are you familiar with Apache? Do you think it would be wise to switch to Nginx right away, or deal with that later if the machine starts getting bogged down?
Oh dear, one google search will give you what you want.

I prefer Kali Linux tools

Look at stress testing tools

Also, if you are really looking at the security of your CMS, then don't just try to secure it. Try to break your CMS.
 
1 members found this post helpful.
Old 08-31-2016, 05:29 PM   #10
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
Quote:
Originally Posted by 24x7servermanagement View Post
Oh dear, one google search will give you what you want.
Members ask questions relying on theoretical and practical knowledge of fellow LQ members. So please don't do that: either answer the question if you can or feel free to skip it if you can't.


Quote:
Originally Posted by 24x7servermanagement View Post
I prefer Kali Linux tools

Look at stress testing tools
And which of those tools have you successfully used before? Which ones would you recommend?
 
1 members found this post helpful.
Old 09-01-2016, 04:19 AM   #11
24x7servermanagement
Member
 
Registered: Jul 2016
Location: India
Distribution: CentOS, Redhat, Ubuntu and Debian
Posts: 57

Rep: Reputation: Disabled
Quote:
Originally Posted by unSpawn View Post
Members ask questions relying on theoretical and practical knowledge of fellow LQ members. So please don't do that: either answer the question if you can or feel free to skip it if you can't.



And which of those tools have you successfully used before? Which ones would you recommend?
Hmm

Used t50, inviteflood, iaxflood and slowhttptest successfully for testing stress, server load, firewalls, WAF applications and so on.

slowhttptest will work fine here.
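A rough example of what a run against the box might look like; the numbers are placeholders to tune against your link, and the hostname is made up:

Code:
# Slowloris-style test: 300 slow connections, opened at 50 per second,
# each dribbling out a partial header every 10 seconds.
slowhttptest -H -c 300 -r 50 -i 10 -p 3 -u http://test-server.example/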
 
1 members found this post helpful.
Old 09-01-2016, 08:59 AM   #12
sundialsvcs
LQ Guru
 
Registered: Feb 2004
Location: SE Tennessee, USA
Distribution: Gentoo, LFS
Posts: 10,659
Blog Entries: 4

Rep: Reputation: 3940
"300 concurrent users" is not particularly much, especially not for HTTP, which is a "stateless" protocol anyway.

It is relatively unlikely that 300 requests would arrive at the same instant, and, even if they did, Apache would process the requests "as fast as it was able," using the pool of worker-processes that it had allotted to the task. This pool is of variable size and has an upper limit.

If you use "ordinary CGI" (which actually works very well on modern hardware ...), the workers are constantly recycling themselves. If you use FastCGI (which I also very much like), you can arrange for the workers to "commit hari-kiri" after a certain number of requests to avoid problems with memory leaks.

Of most concern to you will be precisely what the various requests are doing ... in particular, what shared resources they use, and how many milliseconds a request takes to complete under unobstructed conditions. Then, what sorts of obstructions might slow the requests down. (Anyone can sail through the streets of a city at 2 in the morning ... much faster than they can at rush hour.)
 
1 members found this post helpful.
Old 09-01-2016, 06:42 PM   #13
Rinndalir
Member
 
Registered: Sep 2015
Posts: 733

Rep: Reputation: Disabled
Apache HTTP Server used to come with apachebench, or was it called abench? Can't remember now, and I don't know if it is still part of the Apache httpd package. Check your distro. It is a blunt instrument, but it can be useful for generating a known quantity of traffic. It is not full-featured.
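A minimal sketch with ab (ApacheBench), which Debian and Ubuntu package separately as apache2-utils; the hostname and path are placeholders:

Code:
# Install the benchmark tool on the test client (Debian/Ubuntu).
sudo apt-get install apache2-utils
# Fire 10,000 requests at a forum page, 300 at a time, and report latency figures.
ab -n 10000 -c 300 http://test-server.example/forum/index.php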
 
1 members found this post helpful.
Old 09-01-2016, 07:05 PM   #14
wh33t
Member
 
Registered: Oct 2003
Location: Canada
Posts: 922

Original Poster
Rep: Reputation: 61
I have the server in my possession now. I'm currently in the process of getting it configured. So far it's been challenging. I feel very weak when it comes to general Linux system administration and would love any links that point me to an "every Linux admin needs to know these essential facts/techniques" resource, because I just feel lost, but I am absolutely determined to get this working.

And for clarification, I live in a very small city; it only has a population of ~5k. I do think it likely that up to 300 people may be actively logged in on the system at the same time, with persistent MySQL connections. That would qualify as 300 concurrent connections, correct?
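If those sessions really do hold persistent MySQL connections, I'll need to check that MySQL and Apache even allow that many; something like this, I assume (default Ubuntu paths):

Code:
# MySQL's ceiling on simultaneous client connections (the default is only 151).
mysql -u root -p -e "SHOW VARIABLES LIKE 'max_connections'; SHOW STATUS LIKE 'Threads_connected';"
# Apache prefork's ceiling on simultaneous workers.
grep -n 'MaxRequestWorkers' /etc/apache2/mods-available/mpm_prefork.conf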

I will look into adjusting its CGI mode as well, thank you.

And of course, once I do get it up and running, I will try to break it. I look forward to it. Thank you all for the links and suggestions.
 
Old 09-01-2016, 07:22 PM   #15
Rinndalir
Member
 
Registered: Sep 2015
Posts: 733

Rep: Reputation: Disabled
The first thing to know is when to pay someone else to host your hardware or give you disk space and bandwidth.

Don't mothball your box, but if you're really just starting out, the learning curve to put a box on the internet and keep it secure is steep.

It might be better to pay for hosting, learn from what they do (fail2ban, etc.), and all the while work on your box, keeping the goal of administering it yourself some day.
 
1 members found this post helpful.
  

