Old 12-10-2021, 08:50 AM   #1
Sol33t303
Member
 
Registered: Jul 2017
Distribution: gentoo
Posts: 115

How should I go about setting up a container with its own X server (not connected to my host's X server), with full GPU acceleration?


So I'm currently in the middle of trying to set up a container that I can use for game streaming. I'd love to set up a Pi in the living room that can remotely stream games from my Nvidia desktop.

Now why a container? Because I want a setup where somebody can get on the Pi in the living room and start streaming while I'm at my PC, without it interfering with what I'm doing at all, or me even noticing that it's happening. This is also my first time really working with containers, so it will be a bit of a learning project too.

Now my problem is I can't seem to get any kind of X server working the way I want. So far I have tried:

- In the guest, I tried installing and using xvfb with
Code:
xvfb-run glxgears
It runs, but pretty suboptimally: a bit shy of 2,000 FPS. Meanwhile my host runs at over 10,000 FPS, so I believe xvfb is software rendered, which isn't good enough (there's a quick way to confirm the renderer, sketched after this list of attempts).

- Tried using Xdummy https://github.com/Xpra-org/xpra/blo...sage/Xdummy.md as I read elsewhere that it could be hardware accelerated. I tested it out on my host with the following config https://pastebin.com/83KdyaeC (based on this sample config with some stuff cut out: https://github.com/Xpra-org/xpra/blo...xpra/xorg.conf), using the following command
Code:
sudo Xorg -noreset +extension GLX +extension RANDR +extension RENDER -logfile ./10.log -config /etc/xpra/xorg-dummy.conf :10
However, it seems that even on my host its glxgears performance is abysmal: 500 FPS, compared to xvfb's 2,000, which was already bad. So I assume running it in a container isn't going to magically make it faster.

- Next I tried simply running a second full Xorg server and then connecting the container to it. I'd much rather the container start the X server itself, so that everything is neatly packed within the container and I can shut the container down whenever I like and have everything clean itself up, without having to kill a second X server by hand, but I figured I'd give it a try to see if it's a viable option. Turns out that option isn't a great one either: at first it performs terribly, like the Xdummy one does, but then it gets full performance once I switch VTs to it and back. Not exactly the unobtrusive experience I'm after, where I hardly even realize it's there. Also, I have a feeling glxgears' output isn't giving the full story, because despite it showing high FPS after I switch away, my GPU utilisation sits at 0%, unlike when I run glxgears on my usual X server at :0. Again, I assume hooking the :1 Xorg server up to the container isn't going to make it any faster, so I haven't tried. For this I am using this config https://pastebin.com/pdTvDHb7 and this command
Code:
sudo Xorg -listen unix -listen local -ac -config /etc/X11/xorg-Retroarch.conf :1
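
For reference, a quick way to confirm whether a given display is actually hardware accelerated is glxinfo (mesa-progs on Gentoo, mesa-utils on most other distros); if the renderer string says llvmpipe or softpipe it's software rendering, while an NVIDIA string means the GPU is really in use:
Code:
DISPLAY=:10 glxinfo | grep -E "direct rendering|OpenGL renderer"
(swap :10 for whichever display number is being tested)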
So by now I'm pretty well out of ideas. Does anybody have any other ideas or suggestions I could look into? Or can anybody spot any errors or misconfigurations in my config files?
 
Old 12-11-2021, 11:16 AM   #2
wpeckham
LQ Guru
 
Registered: Apr 2010
Location: Continental USA
Distribution: Debian, Ubuntu, RedHat, DSL, Puppy, CentOS, Knoppix, Mint-DE, Sparky, VSIDO, tinycore, Q4OS, Manjaro
Posts: 5,852

I do not believe that there is any way to get real GPU acceleration without a real physical GPU connection. I am unsure about WHICH kind of containerization you are using. Containers and virtualization can do a LOT of great things, but they cannot fake the acceleration of real hardware with virtual hardware. Kernel containerization adds a small additional overhead; true virtualization adds MUCH more. Those issues are very real, but most likely not the real bottleneck.

I love the RPi, but it is not really a powerhouse gaming engine. If I understand correctly, you are running a display on the RPi that is fed from a container on your workstation, where the game is running?
(Correct me if I got that wrong)

When you are talking about remote gaming, up to 99% of the GPU acceleration is wasted because the display output ultimately has to travel over the remote streaming connection. All of the speed advantage is local; you cannot see it at the remote display. Access over the network is going to be limited by the network speed of the display traffic and updates, not by GPU processing. This is not a concern when you are playing the game directly rather than over the network.

I think you can get that working, and working well enough to serve a purpose. I do not believe that you can get the performance you seem to want; there are far too many bottlenecks that cannot be eliminated from the design without major hardware and network changes.

Last edited by wpeckham; 12-11-2021 at 11:31 AM.
 
Old 12-11-2021, 12:31 PM   #3
Sol33t303
Member
 
Registered: Jul 2017
Distribution: gentoo
Posts: 115

Original Poster
Quote:
I am unsure about WHICH kind of containerization you are using
I'm using regular containers, not virtualization. In particular, I am managing my containers using LXD, which I probably should have mentioned in my first post, sorry about that.
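
For reference, my understanding of the LXD side of GPU passthrough is roughly the following (the container name is just a placeholder and I haven't verified every option, so treat it as a sketch rather than a known-good recipe):
Code:
# pass the host GPU device nodes through to the container
lxc config device add mycontainer gpu gpu
# for the Nvidia proprietary driver, also expose the driver userspace
# (relies on libnvidia-container being installed on the host)
lxc config set mycontainer nvidia.runtime true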

Quote:
If I understand correctly, you are running a display on the RPi that is fed from your container on your workstation where the game is running in a container?
I haven't actually got the Pi set up yet (still waiting on some controllers and a power supply from Amazon; I lost the power supply that came with the Pi a long time ago). But yes, that is the setup I'm aiming for. I plan on running the software called "sunshine" in the container https://github.com/loki-47-6F-64/sunshine, which hosts and manages the connection to the clients. It uses the Nvidia GameStream protocol https://www.nvidia.com/en-us/shield/...tv/gamestream/ which provides the fast, low-latency connection that gaming needs.
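
Roughly how I imagine the pieces fitting together inside the container, assuming sunshine behaves like an ordinary X client and just honours DISPLAY (I haven't verified its exact flags or configuration yet, so this is only a sketch):
Code:
# inside the container, once its own X server is up on :1
export DISPLAY=:1
sunshine &      # hosts the GameStream endpoint for the Pi (assumed invocation)
retroarch       # example frontend/game rendering to the same display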

Quote:
When you are talking about remotely accessing gaming, up to 99% of the GPU acceleration is wasted because the display is really going over the remote streaming connection.
I'm not sure that's correct; it depends on your LAN bandwidth. If your LAN can handle 1080p video at 60 FPS, which it should, then you have the bandwidth needed to stream games. Nvidia GeForce NOW https://www.nvidia.com/en-us/shield/...rce-now-games/ uses the exact same protocol to stream games to your PC from the cloud, and according to Nvidia, 1080p at 60 FPS should only need about 30 Mbps of WAN bandwidth. WiFi should have bandwidth far exceeding 30 Mbps. Really, I'd imagine WiFi should handle at least 4K at 60 FPS, which is far more than my GPU can output running reasonably demanding games.
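
Some rough back-of-the-envelope numbers (my own arithmetic, not Nvidia's): uncompressed 1080p60 at 24 bits per pixel is about 3 Gbit/s, so a ~30 Mbps stream amounts to roughly 100:1 compression, well within what gigabit Ethernet or decent WiFi can carry:
Code:
# raw 1080p60 at 24 bits per pixel, in Mbit/s
echo $(( 1920 * 1080 * 24 * 60 / 1000000 ))   # prints 2985, i.e. ~3 Gbit/s uncompressed
# Nvidia's recommended 30 Mbit/s for 1080p60 is therefore roughly a 100:1 compression ratio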

I know a lot of people stream their games in a similar fashion using Steam Remote Play and Google Stadia, to name two popular examples, although I believe they both use different protocols.
 
Old 12-12-2021, 10:29 PM   #4
sundialsvcs
LQ Guru
 
Registered: Feb 2004
Location: SE Tennessee, USA
Distribution: Gentoo, LFS
Posts: 10,756
Blog Entries: 4

I doubt that you can accomplish this because containerized processes are in fact running natively on the host. I could of course be wrong.
 
Old 12-12-2021, 11:03 PM   #5
Brains
Senior Member
 
Registered: Apr 2009
Distribution: All OS except Apple
Posts: 1,591

It would probably be easier to set up the RPi to take care of everything and control it from the desktop, but that doesn't help with the container learning experience.

What about setting up the RPi as the graphics device (processor) to get full GPU acceleration, with the rest left up to the container using host processes? Since streaming will likely be done on the same network account, it doesn't matter which computer is downloading it.
 
  


