Does it make sense to put a good video card in a remote server?
Linux - Hardware: This forum is for Hardware issues.
Hi all,
I just got a ProLiant server and loaded it with 4 CPUs and 12 GB of RAM. I use it only remotely, and I am interested in running graphics software such as FEM pre/post-processing and video generation. I am not sure whether it makes sense to buy a more powerful video card, as there will never be a monitor plugged into it.
The question is:
Who does the video processing when I render remotely through ssh -X or VNC? Is the server or the client responsible for that work?
Thanks.
I agree that the client is responsible for showing the image. However, if the image needs processing, like rendering or the other sorts of calculations that video cards accelerate, who does them?
I think that, for example, with VNC, opening a client connection is like pointing a camera at the server. All calculations are made on the server side, but the result is sent to the client as a plain screenshot. So if your connection is slow or your client's video card is bad, you will not see good graphics (it is like pointing a very bad camera at the server), even though the server itself is rendering them properly.
For the most common Linux methods of running an application on one system with the graphics on another, the graphics card on the system running the application is not involved at all.
When discussing this, the terms "client" and "server" can get very confusing because of X Windows terminology.
In X Windows, the display is the server and the application program is the client. This is the reverse of the usual terminology in which the UI is the client and the application program is the server.
Linux almost always uses X Windows for graphics. Two of the most common ways to split the graphics from the application are:
1) Directly use X Windows, so an X Windows "server" runs on the UI system displaying the results of the "client" application running on the application system.
2) Using X Windows through VNC. A program runs on the application system that acts as both an X Windows "server" and a VNC "server". The X Windows "client" application on the application system talks to that X Windows server, while the VNC "client" application on the UI system talks to the VNC server.
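As a concrete sketch of the two methods (the hostname and username are placeholders):

```shell
# Method 1: plain X forwarding over ssh. The X "server" runs on your
# local machine; the application on the remote box is the X "client".
#   ssh -X user@appserver
#   xeyes &        # run on the server; the window appears locally

# Method 2: VNC. On the application server, start a virtual X server:
#   vncserver :1
# Then connect from the UI machine:
#   vncviewer appserver:1
# VNC display numbers map to TCP ports as 5900 + display number:
display=1
port=$((5900 + display))
echo "$port"    # prints 5901
```

In both cases the remote machine's video card is idle: method 1 renders on the local X server, and the usual vncserver setup renders into a software framebuffer.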
In both the above situations, the application side display system is not involved at all.
It is also possible to set up VNC or similar services to run in a mode like remote help, where the application system display is involved and the VNC server copies the graphics from the local display. I think that fits what you expected for remote graphics, but it wouldn't be the normal way to set things up unless you specifically want the graphics visible in both places.
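That "remote help" mode can be set up with x11vnc, for example (a sketch; the display number :0 is an assumption about the server's setup):

```shell
# Mirror the X display already running on the server's own video
# card, instead of creating a virtual one:
display=:0
echo "x11vnc -display $display"
# x11vnc attaches to the X server driving the real hardware, so the
# server's GPU does the rendering and the same picture is visible
# both on the server's monitor (if any) and over VNC.
# By contrast, the usual "vncserver :1" creates a virtual X display
# rendered in software, with no GPU involved.
```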
If the remote server machine is not being used to generate a display, then a sophisticated display adapter is not required. The one exception is if you plan to use the processing capability of a sophisticated display adapter installed in the remote server as a GPU for computation purposes (as opposed to graphic display purposes).
johnsfine's terminology explanation is both articulate and correct.
Thanks all,
I found this thread on an NVIDIA forum.
Apparently ssh sends all GPU work to the local machine, but VNC by default does all GPU work on the remote server. Therefore, VNC allows a good video card in a remote server to be utilized.
Quote:
but vnc by default does all GPU processes in the remote server. Therefore, vnc allows utilizing a good video card in a remote server.
I expect the phrase you read there which gave you that idea was
Quote:
"VNC": the VNC server can be run in several modes. The most common one is that the remote machine does a screenshot of the changed screen content and sends it to you. With this setting, it uses all resources on the remote machine. No need to have any OpenGL locally.
I don't know how anyone would know the most common mode in which other people run VNC. No one collects statistics on that.
But the important thing is what mode should you choose for your use of VNC (if you want VNC at all).
VNC allows utilization of the video card in the application server. But that doesn't make it a good idea. Copying the generated graphics and compressing them for transmission may be much more costly than transmitting the original graphics requests and doing all the rendering on the UI side. VNC can run in that mode. Pure X windows without VNC only runs in that mode.
Do the programs you want to run display via Open GL? Or via X Windows? I'm not sure of the full interaction between VNC, X Windows and Open GL. Possibly, there is justification for rendering on the application side to deal with some OpenGL issues. I wouldn't guess that is true, but I'm not certain it isn't.
Quote:
painting the screen in another machine and sending something like 32MB of data 25 times a second over the network! (or trying to)
No, really:
Based on my experience, VNC is smart enough to decide how much data to send based on the connection.
For instance, if I am on my local network and VNC to the server, I get beautiful colors and a very good screen refresh rate. However, if I VNC to my server over the internet, the color map suffers and the refresh rate drops.
It is similar to the Microsoft Remote Desktop experience: VNC keeps the speed up by reducing desktop quality.
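That quality/speed trade-off can also be set explicitly on the viewer side. A sketch, using TigerVNC's vncviewer flag names (other viewers use similar options such as -quality / -compresslevel; the hostname is a placeholder):

```shell
# Build the viewer invocation for a slow link: low image quality,
# higher compression effort.
host=appserver
cmd="vncviewer -QualityLevel 3 -CompressLevel 6 $host:1"
echo "$cmd"
# On a fast LAN you would raise -QualityLevel toward 9 instead,
# favoring image fidelity over bandwidth.
```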
Thanks
PS: I really appreciate all the comments here, but what is your opinion: does it make sense to upgrade a remote server with a video card? (yes/no)