
armonmar 07-02-2009 05:09 PM

Does a good video card make sense in a remote server?
 
Hi all,
I just got a ProLiant server and loaded it with 4 CPUs and 12 GB of RAM. I use it only remotely, and I am interested in running graphics software such as FEM pre-/post-processing and video generation. I am not sure whether it makes sense to buy a more powerful video card when there will never be a monitor plugged into it.
The question is:
Who does the video processing when I render remotely through ssh -X or VNC? Is the server or the client responsible for that work?
Thanks.

bigearsbilly 07-02-2009 05:19 PM

It's the client. Think about it: if you had a black-and-white card on the client, colour pictures from the server would not work.

armonmar 07-02-2009 06:22 PM

I agree that the client is responsible for displaying the image. However, if the image needs processing (rendering, or any of the other calculations that video cards accelerate), who does that?

I think that with VNC, for example, opening a client connection is like pointing a camera at the server: all the calculations happen on the server side, and the result is sent to the client as a plain screenshot. So if your connection is slow or your client's video card is poor, you will not see good graphics (it is like pointing a very bad camera at the server), but the server itself will still be rendering them properly.

With ssh -X, I'm not so sure that is the case.

Thanks.

johnsfine 07-02-2009 07:10 PM

For the most common Linux methods of running an application on one system with the graphics on another, the graphics card on the system running the application is not involved at all.

When discussing this, the terms "client" and "server" can get very confusing because of X Windows terminology.

In X Windows, the display is the server and the application program is the client. This is the reverse of the usual terminology in which the UI is the client and the application program is the server.

Linux almost always uses X Windows for graphics. Two of the most common ways to split the graphics from the application are:

1) Directly use X Windows, so an X Windows "server" runs on the UI system displaying the results of the "client" application running on the application system.

2) Use X Windows through VNC. A program runs on the application system that acts as both an X Windows "server" and a VNC "server". The X Windows "client" application on the application system talks to that X Windows server, while the VNC "client" application on the UI system talks to the VNC server.

In both of the above situations, the application-side display system is not involved at all.

It is also possible to set up VNC or similar services to run in a mode like remote help, where the application system's display is involved and the VNC server copies the graphics from the local display. I think that fits what you expected for remote graphics, but it wouldn't be the normal way to set things up unless you specifically want the graphics visible in both places.
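As a rough sketch of the three setups (hostnames, display numbers and the application name below are placeholders, and exact commands vary with the VNC implementation):

Code:

# 1) Plain X forwarding: run from the UI system. The application runs on the
#    application system, but rendering is done by the UI system's X server.
ssh -X user@appserver
some-fem-tool &        # hypothetical application name

# 2) Headless VNC: run on the application system. Xvnc acts as a virtual
#    X server (no graphics card needed); VNC viewers connect to it.
vncserver :1
# ...then, on the UI system:
vncviewer appserver:1

# 3) Mirror the application system's real display (the "remote help" style),
#    for example with x11vnc. Only this variant touches the server's own card.
x11vnc -display :0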

cgtueno 07-08-2009 09:14 AM

Hi

If the remote server machine is not being used to generate a display, then a sophisticated display adapter is not required. The one exception is if you plan to use the processing capabilities of such an adapter installed in the remote server machine as a GPU for computation purposes (as opposed to graphic display purposes).
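A quick sanity check that the card is reachable for compute on a headless box (a sketch, assuming an NVIDIA card with the vendor driver and tools installed; no X session is needed for this):

Code:

lspci | grep -i vga    # confirm the card is visible on the PCI bus
nvidia-smi             # vendor tool; reports the GPU even with no monitor attached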

johnsfine's terminology explanation is both articulate and correct.

Regards

Chris

armonmar 07-09-2009 01:08 AM

Thanks all,
I found this thread on an NVIDIA forum.
Apparently ssh sends all GPU processing to the local machine, but VNC by default does all GPU processing on the remote server. Therefore, VNC allows utilizing a good video card in a remote server.

http://forums.nvidia.com/lofiversion...hp?t35425.html

johnsfine 07-09-2009 09:07 AM

Quote:

Originally Posted by armonmar (Post 3601792)
but VNC by default does all GPU processing on the remote server. Therefore, VNC allows utilizing a good video card in a remote server.

I expect the phrase you read there which gave you that idea was
Quote:

"VNC": the VNC server can be run in several modes. The most common one is that the remote machine does a screenshot of the changed screen content and sends it to you. With this setting, it uses all resources on the remote machine. No need to have any OpenGL locally.
I don't know how anyone would know the most common mode in which other people run VNC. No one collects statistics on that.

But the important thing is which mode you should choose for your own use of VNC (if you want VNC at all).

VNC allows the video card in the application server to be used. But that doesn't make it a good idea: copying the generated graphics and compressing them for transmission may be much more costly than transmitting the original graphics requests and doing all the rendering on the UI side. VNC can run in that mode; pure X Windows without VNC only runs in that mode.

Do the programs you want to run display via OpenGL, or via plain X Windows? I'm not sure of the full interaction between VNC, X Windows, and OpenGL. Possibly there is justification for rendering on the application side to deal with some OpenGL issues; I wouldn't guess that is true, but I'm not certain it isn't.
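One way to see which side would actually do the OpenGL rendering is to ask GLX for its renderer string in each kind of session (a sketch, assuming glxinfo from the mesa-utils package is installed):

Code:

# Inside an "ssh -X" session this queries the X server on the UI machine,
# so the renderer reported belongs to the UI side (possibly indirect/software).
glxinfo | grep -i "opengl renderer"

# Inside a headless Xvnc session the same command typically reports a
# software (Mesa) renderer, since Xvnc has no real graphics card behind it.
glxinfo | grep -i "opengl renderer"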

bigearsbilly 07-21-2009 05:21 AM

Sounds a bit Microsoft ;)

Painting the screen on another machine and sending something like 32 MB of data 25 times a second over the network! (Or trying to.)

armonmar 07-22-2009 12:23 AM

Quote:

Originally Posted by bigearsbilly (Post 3614965)
Sounds a bit Microsoft ;)

Painting the screen on another machine and sending something like 32 MB of data 25 times a second over the network! (Or trying to.)

Not really.
In my experience, VNC is smart enough to decide how much data to send based on the connection. For instance, if I am on my local network and VNC to the server, I get beautiful colors and very good screen refresh. However, if I VNC to my server over the internet, the color map suffers and the refresh rate drops.
I understand the comparison with Microsoft Remote Desktop; the point is that VNC keeps the speed up by reducing desktop quality.
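If you want to tune that trade-off by hand, most VNC implementations expose it as options (a sketch using TightVNC-style option names; the exact flags vary between implementations):

Code:

# on the application server: a smaller colour depth means less data per update
vncserver :1 -depth 16 -geometry 1280x1024

# on the viewing side: more compression and lower JPEG quality for slow links
vncviewer -compresslevel 9 -quality 3 appserver:1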
Thanks

PS: I really appreciate all the comments here, but what is your opinion: does it make sense to upgrade a remote server with a video card? (yes/no)

