LTSP: Using server's GPU for rendering clients' screens?
I wasn't sure about what sub-forum to put this in; I think I got it right, though...
I want to set up an LTSP server, with either Ubuntu or Debian, and have the GPU (or maybe GPUs) in the server do the rendering of the clients' screens.
Is this possible? If so, what would I need to do to make it work like this?
I've looked around for a while, but haven't been able to find an answer. I know that if the client has a GPU, it can be used to help with rendering, but I want to know whether the server can use its own GPU to help with rendering, at least for clients that are limited to integrated graphics.
I know some people are probably thinking bandwidth will be an issue; I've got that part covered. (I currently have gigabit, and might move to fibre channel.)
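For reference, here's roughly the thin-client lts.conf I had in mind. As far as I can tell, the `LTSP_FATCLIENT` and `LDM_DIRECTX` options only control *where* applications run and how the X traffic is carried, not which machine's GPU does the actual drawing, which is exactly what I'm unsure about (the file path also varies by LTSP version and architecture):

```ini
# e.g. /var/lib/tftpboot/ltsp/i386/lts.conf (path varies by setup)
[Default]
    # Force thin-client mode: applications run on the server,
    # not locally on the client.
    LTSP_FATCLIENT = False

    # Send X traffic unencrypted instead of tunneling it over SSH,
    # to reduce load on the weak clients.
    LDM_DIRECTX = True
```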