LinuxQuestions.org > Slackware
Slackware 15: AMD or NVIDIA Video Card? (https://www.linuxquestions.org/questions/slackware-14/slackware-15-amd-or-nvidia-video-card-4175685237/)

Daedra 11-14-2020 11:33 PM

Slackware 15: AMD or NVIDIA Video Card?
 
I am patiently waiting for the new Ryzen 5900X to be back in stock so I can finish up my new PC build. However, I am going to carry over my NVIDIA GTX 970 into the new system until the new AMD cards come out and NVIDIA restocks their new RTX 3000 series cards. I have always had NVIDIA cards in my systems and used the proprietary drivers with no problems. But I am curious about AMD cards in Slackware: are the drivers open source or proprietary, and do you get the same performance as you do in Windows? Essentially I just want your opinions and your experiences with AMD in Slackware so I can make an informed purchase in the coming months. I have always used NVIDIA because of their Linux support but would consider AMD if their drivers are as solid as NVIDIA's.

Thanks.

Timothy Miller 11-15-2020 12:40 AM

Assuming you have a new enough kernel, the amdgpu drivers are fully open source, and work REALLY well (surprisingly so, in fact). As far as performance goes, I don't know for certain, but I BELIEVE it's quoted as being something like 80-85% of the Windows performance. It may be better now; it's been a really long time since I've bothered to look. I've never actually tested it myself to see first-hand, but then, I don't use Windows, so that goes without saying, I guess.

EdGr 11-15-2020 01:20 AM

AMD's open-source drivers work very well.

I have always bought AMD (previously ATI) video cards. My most recent card is a Pitcairn from 2014. It took about a year for Glamor/Mesa to work, but since then AMD has put a lot of effort into the open-source drivers.

The open-source drivers have the significant advantages that they can't get in the way of a kernel upgrade, and support for old video cards won't go away as long as someone is interested in those cards.

I am not a gamer, but I see OpenGL applications rendering as fast as the card seems to be capable of (at 4K).
Ed

Daedra 11-15-2020 01:41 AM

Do you guys ever have any problems with screen tearing on your AMD cards? Nvidia can be somewhat stubborn about this, though enabling vsync and the full composition pipeline fixes it most of the time.
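For reference, on the Nvidia side that fix is usually a one-liner. A sketch, assuming a single auto-detected monitor (multi-head setups need the full MetaMode spelled out):
Code:

nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"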

StalocX 11-15-2020 01:54 AM

I'm a long-term AMD GPU user (R9 290X -> RX 480 -> RX 5700 XT), mostly for gaming. In my experience, since about 2014 I haven't had any problems with the amdgpu kernel driver (which is free) plus the firmware (which is non-free). Yes, tearing on X under amdgpu is a well-known problem, but it's as easy to fix as adding
Code:

Option "TearFree" "true"
to your X config file.
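For reference, a minimal config file for that; the file name is just a convention, and any .conf file under /etc/X11/xorg.conf.d/ will be read:
Code:

# /etc/X11/xorg.conf.d/20-amdgpu.conf
Section "Device"
    Identifier "AMD Graphics"
    Driver     "amdgpu"
    Option     "TearFree" "true"
EndSection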

I also haven't compared Windows and Linux gaming performance for years, but with Proton and Vulkan I'm pretty sure Linux performance is equal or even better.

ZhaoLin1457 11-15-2020 02:05 AM

Slackware 15: AMD or NVIDIA Video Card?

I, for one, will go for a discrete Intel graphics card, if I can. :D

I heard that they will invade the world in the first half of 2021.

That would be faster than any honest expectation for a Slackware 15.0 release...

chrisretusn 11-15-2020 02:30 AM

Interesting. Off the cuff, I would have said go with NVIDIA. This is based on some very bad experiences with AMD in the past, but that was the distant past. I have heard some good things about AMD lately, and the comments above reflect that. I might give AMD another shot with my next graphics card.

RadicalDreamer 11-15-2020 02:44 AM

If I were you I'd wait for the AMD 6000 series, which is about to come out, and check out the benchmarks on Tom's Hardware! The one thing that concerns me about what I've read about the 5000 series is that the GPUs run hot.

StalocX 11-15-2020 02:53 AM

Quote:

Originally Posted by RadicalDreamer (Post 6185383)
If I were you I'd wait for the AMD 6000 series, which is about to come out, and check out the benchmarks on Tom's Hardware! The one thing that concerns me about what I've read about the 5000 series is that the GPUs run hot.

The highest temp I've seen so far is 68°C, in a closed case with soundproofing. The GPU was loaded at 100% for over an hour (a Gigabyte card with three fans).

RadicalDreamer 11-15-2020 03:08 AM

Quote:

Originally Posted by StalocX (Post 6185386)
The highest temp I've seen so far is 68°C, in a closed case with soundproofing. The GPU was loaded at 100% for over an hour (a Gigabyte card with three fans).

KingBeowulf said, "however, the AMD GPU does ramp up 100+ C, and fans on max, on the in game menus and some map areas."
https://steamcommunity.com/groups/sl...7258987752928/

AMD said it was okay. I don't know. That is great that you have low temps with it.

chrisretusn 11-15-2020 03:19 AM

Quote:

Originally Posted by StalocX (Post 6185386)
The highest temp I've seen so far is 68°C, in a closed case with soundproofing. The GPU was loaded at 100% for over an hour (a Gigabyte card with three fans).

Noisy? My NVIDIA is at 52°C right now. Where I live, heat can definitely be an issue.

StalocX 11-15-2020 03:37 AM

Quote:

Originally Posted by chrisretusn (Post 6185401)
Noisy? My NVIDIA is at 52°C right now. Where I live, heat can definitely be an issue.

With headphones I don't hear any noise at all, even when the GPU is loaded at 100%.

cwizardone 11-15-2020 09:00 AM

I bought nothing but ATi/AMD for well over twenty years. About ten years ago I was in need of a new card and the local store only sold Nvidia-based cards.
Best thing, computer-video-wise, that ever happened. It has been Nvidia ever since.
The Nvidia drivers just work.
Ditto the nouveau driver.

tramtrist 11-15-2020 09:09 AM

AMD

Chuck56 11-15-2020 10:07 AM

Let the video holy war begin!

Either will work with Slackware 15. Disclaimer: I'm an AMD fanboy.

Bindestreck 11-15-2020 10:13 AM

What I wonder is, will the AMD open-source driver perform the same as the AMD proprietary one (if that still exists)?

zhl 11-15-2020 10:37 AM

Generally fewer issues with Nvidia, for gamers, crackers...

phalange 11-15-2020 10:41 AM

Quote:

Originally Posted by Daedra (Post 6185354)
your opinions and your experiences

I bought a 5700 XT card about a year ago after getting the impression AMD's open source drivers were exceptional. But the card did not run under Slackware or any Linux I tried (I threw many at it). The issue then was (I think) that the drivers had not yet gained support for the card, even though the card itself had been out about 5 months.

I couldn't run AMD's "pro" drivers since they were packaged only for SUSE and Red Hat. It wasn't clear whether that would have solved the issue anyway, and at that point I was getting frustrated, so I returned the card for an NVidia 2070 Super.

Despite being proprietary, the Linux installer from NV has been very reliable (as was the SBo package), and it supported my new card (which came out at the same time as the AMD).

So my experience is that you should confirm that support for any new AMD card has actually arrived in the presently shipping open source driver. And my opinion is that if your priority is getting the best performance from the latest GPU hotness, then the fact that AMD's drivers are open source should not be the only criterion.

xor_ebx_ebx 11-15-2020 10:43 AM

Quote:

Originally Posted by RadicalDreamer (Post 6185397)
KingBeowulf said, "however, the AMD GPU does ramp up 100+ C, and fans on max, on the in game menus and some map areas."
https://steamcommunity.com/groups/sl...7258987752928/

AMD said it was okay. I don't know. That is great that you have low temps with it.

The reason for this is probably that some of AMD's (and board partners') cards use blower fans while others have good coolers. Cards like my Sapphire Pulse usually don't go above 70°C, but those early blower-style cards will hit 100+ all the time.

upnort 11-15-2020 11:38 AM

At work last week I configured a new Debian workstation with an AMD 3200G APU. From Backports I installed the 5.8 kernel and related packages.

Works really nice. No hiccups, no video issues.

That is only one data point but I hope that helps.

garpu 11-15-2020 11:50 AM

I have Nvidia. Your mileage may vary. While AMD works out of the box, so to speak, I had no end of trouble getting specific games working with it. (I've got a 1050 Ti, and it's pretty quiet and cool unless I'm gaming; then it goes up to about 41°C.) On the old computer it was very noisy. On the new one it's pretty quiet, but I think the new computer and/or kernel supports the power management better.

So should we throw out text editors to complete the holy war? (vim ftw.)

bassmadrigal 11-15-2020 03:48 PM

Quote:

Originally Posted by Bindestreck (Post 6185495)
What I wonder is, will the AMD open-source driver perform the same as the AMD proprietary one (if that still exists)?

AMD proprietary still exists, but it isn't nearly as easy to install as Nvidia's. They have premade packages for a handful of OSes, and Slackware isn't one of them. With a lot of work, you can install the packages and tweak Slackware enough to work with them, but I lost the desire to do that years ago. The performance is quite similar between the two. Sometimes the proprietary will do better and other times the open source will do better.

@OP, Nvidia's closed-source drivers are without equal. They have supported Linux and Slackware for decades. However, they can be a bit slow to add support for newer kernels, which can leave users stuck on older kernels than desired or force a switch to the relatively low-performing nouveau. It also requires building the kernel module for each kernel version you install (which leads to a lot of building when running -current). They will also eventually stop supporting older cards (not an issue if you keep your card within the last few generations). Open source for Nvidia is pretty sad: older cards perform OK with nouveau, but still not great, and newer cards are rarely supported well.

AMD offers proprietary drivers, but they are a pain to install on Slackware. However, their open source driver is actually pretty incredible. If you have a new enough kernel and Mesa, you simply add the card and it just works. AMD has added huge amounts of code to the Linux kernel to better support their devices. Your only problem with AMD is if you install a card that is newer than your kernel or Mesa, as you'll need to either try to get the proprietary driver installed or somehow get your Mesa and/or kernel upgraded.
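For what it's worth, a quick way to sanity-check that the open-source stack actually picked up an AMD card (a sketch using standard tools; glxinfo ships with Mesa on Slackware):
Code:

# which kernel driver claimed the GPU (should say amdgpu)
lspci -k | grep -EA3 "VGA|Display"

# did the amdgpu firmware load cleanly?
dmesg | grep -i amdgpu | grep -iE "firmware|error"

# is Mesa using the hardware? (renderer should not be llvmpipe)
glxinfo -B | grep -iE "renderer|device"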

kingbeowulf 11-15-2020 07:54 PM

Quote:

Originally Posted by RadicalDreamer (Post 6185397)
KingBeowulf said, "however, the AMD GPU does ramp up 100+ C, and fans on max, on the in game menus and some map areas."
https://steamcommunity.com/groups/sl...7258987752928/

AMD said it was okay. I don't know. That is great that you have low temps with it.

Actually, at least for Doom (2016) via Proton, and several Feral Interactive ports (Alien Isolation, Deus Ex Human Revolution, Shadow of Mordor), the high frame rate and temperature I observed was a glitch in how I had freesync set up for amdgpu (Samsung 4K freesync 60Hz LCD). Other than that, amdgpu has been great and performs well, but can be a bit tricky with games optimized for Nvidia. I see no need for AMDGPU-PRO unless you need OpenCL (I have a thread on LQ for a workaround...).
Code:

ection "Device"
        ### Available Driver options are:-
        ### Values: <i>: integer, <f>: float, <bool>: "True"/"False",
        ### <string>: "String", <freq>: "<f> Hz/kHz/MHz",
        ### <percent>: "<f>%"
        ### [arg]: arg optional
        #Option    "Accel"                      # [<bool>]
        #Option    "SWcursor"                  # [<bool>]
        #Option    "EnablePageFlip"            # [<bool>]
        #Option    "SubPixelOrder"              # [<str>]
        #Option    "ZaphodHeads"                # <str>
        #Option    "AccelMethod"                # <str>
        #Option    "DRI3"                      # [<bool>]
        Option    "DRI"                "3"        # <i>
        #Option    "ShadowPrimary"              # [<bool>]
        Option    "TearFree"          "True"  # [<bool>]
        #Option    "DeleteUnusedDP12Displays"        # [<bool>]
        Option    "VariableRefresh"    "True"  # [<bool>]
        Identifier  "Card0"
        Driver      "amdgpu"
#        BusID      "PCI:5:0:0"
#        Screen      0
EndSection

Now the tricky part. Several sites state that you want to enable VSYNC="on" globally in driver/xorg but 'off' in games to have freesync (VRR) active. However, you shouldn't enable it for the desktop since this may cause some issues - depending on the window manager etc. In the above games, I've found that it's better to set VSYNC="on" or "adaptive" if adaptive is available. This keeps the GPU from trying to push too many frames in some scenes and menus. For Freesync or any VRR method to work, you MUST cap the game fps to the LCD monitor refresh rate - that is why we are now seeing high refresh rate LCD gaming monitors. Older linux game engines, Borderlands 2 for example, and maybe some newer ones, are fine with in-game VSYNC="off". Some games that don't recognize VRR (freesync etc) can freak out with VSYNC="off" when amdgpu has VRR enabled.
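For what it's worth, one way to check that VRR is actually active under amdgpu (an assumption about a reasonably recent X stack; the vrr_capable output property is amdgpu's):
Code:

# the output driving the monitor should report vrr_capable: 1
xrandr --props | grep -i vrr_capable

# and Xorg should have accepted the option
grep -i variablerefresh /var/log/Xorg.0.log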

Also, overheating and visual artifacts can occur with the wrong Ambient Occlusion setting (SSAO etc). Set this to "off" or "normal | standard", as HBAO+ is an Nvidia-specific algorithm. Same for adaptive-resolution DLSS.

As to which to choose:
AMDGPU PROS
  • Full open source
  • No reinstall needed on kernel upgrade
  • Better Price/performance than Nvidia
AMDGPU CONS
  • Need to upgrade kernel, mesa, xf86-amdgpu to get newest features

NVIDIA PROS
  • Easy to install/upgrade for new features
  • Many games optimized for Nvidia
NVIDIA CONS
  • Not open source
  • Kernel interface rebuild needed on kernel upgrade
  • Lacks features from MS Windows version.
  • Expensive

After 20 yrs with Nvidia, I switched to AMD (and thus Slackware-current with kernel 5.4.x) and am perfectly happy. YMMV.

Daedra 11-15-2020 11:13 PM

All the replies here pretty much mirror what I have been reading everywhere else. It's pretty much a toss-up on Linux now; both are well supported. I will wait for the 6000 series to come out, compare them to the RTX 3000 series, and buy the best price-to-performance card I can afford.

enorbet 11-16-2020 01:34 AM

Thanks to kingbeowulf for a clear and concise "shootout". I'd like to note some minor disagreement and clarification on nVidia Cons.

Kernel Interface rebuild really just means re-running the installer. It's EZ and takes all of 3 minutes.
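A sketch of that re-run, assuming the .run installer's -K / --kernel-module-only option, which rebuilds just the kernel interface against the running kernel without touching the rest of the installation (the filename is whatever driver version you downloaded):
Code:

# from a console after booting the new kernel (stop X first)
sh NVIDIA-Linux-x86_64-*.run -K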

No important features available in Windows are missing in Linux. Performance is actually better on Linux in most cases. Example: even with benchmarks such as Shadow of the Tomb Raider, which must run through DXVK/Proton, Linux benchmarks are ~14% better than on Windows 7 Steam, at Ultra on both. The subjective feel is far smoother. The older Feral Linux port of Tomb Raider 2013 is even faster.

While AMD is indeed actively working hard at providing better bang-for-buck, so is Nvidia. The latest 3000 series ray-tracing cards (AMD is only just beginning to support ray tracing, an important feature that will only improve and spread) are such an increase in bang for buck that 2000 series owners are unloading them, and 3000 series cards are actually being "scalped" - bought specifically for resale on sites like eBay at inflated prices because of demand vs. supply.

To be clear, I'm super happy that AMD is working so hard to improve both bang-for-buck and Linux compatibility. It forces Nvidia to up their game, but I am very happy with Nvidia and have been for decades, since they have always supported even truly obscure alternative operating systems like BeOS and OS/2. It is my understanding that there can be some confusion about applicable drivers for AMD, while the only problems I routinely see for Nvidia on Linux are the Optimus nvidia/intel shared systems. If you want to know ahead of time whether an Nvidia card will work, the answer is clear and readily available on the Nvidia site. There is zero gamble.

petejc 11-16-2020 02:50 AM

Quote:

Originally Posted by phalange (Post 6185511)
I bought a 5700 XT card about a year ago after getting the impression AMD's open source drivers were exceptional. But the card did not run under Slackware or any Linux I tried (I threw many at it). The issue then was (I think) that the drivers had not yet gained support for the card, even though the card itself had been out about 5 months.

I found that my RX 590 did not work on Slackware 14.2 but did on -current. The issue was that a fresh install did not have the firmware installed in /lib/firmware. So you need a new enough kernel plus firmware that supports your AMD card.

phalange 11-16-2020 10:42 AM

Quote:

Originally Posted by petejc (Post 6185789)
I found that my RX 590 did not work on Slackware 14.2 but did on -current. The issue was that a fresh install did not have the firmware installed in /lib/firmware. So you need a new enough kernel plus firmware that supports your AMD card.

This is a good point. To clarify my post, I run Slackware current.

twy 11-16-2020 11:45 AM

I've used nvidia TNT (1998-2000), GeForce 256 DDR (2000-2004), FX-5200 (2004-2010), and now a GT240 (2010-now). So, these are all old low-end cards. I've used them on Intel-based systems with ECC ram. The systems have always been very stable, with no video problems or crashes (it just never happens). I've always used the nvidia driver directly from the nvidia web site. When upgrading the kernel, I uninstall the nvidia driver just before I reboot, then reinstall it when the system restarts - it is not difficult at all, you just rerun the nvidia installer and tell it not to edit any config files. I have my X config files slightly customized, so I don't want the installer to change them. I have the X compositing disabled, so my X runs in the old way where it does not use the GPU much for regular desktop stuff.

When installing the nvidia driver, I've had a small issue: I have to keep backups of /usr/lib/libEGL.la and /usr/lib64/libEGL.la. After I install the driver, I have to restore these two files, because the installer does something to them. I do not know the cause of this problem, so if anyone knows please tell me.

In /etc/modprobe.d/nvidia.conf I have the lines:
Code:

# Enable MSI interrupts
options nvidia NVreg_EnableMSI=1
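One way to confirm the MSI option took effect (an assumption about the usual procfs layout, not something from the post): the nvidia interrupt should be listed with a PCI-MSI type.
Code:

grep -i nvidia /proc/interrupts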

In NVIDIA X Server Settings application, I have OpenGL settings "Allow Flipping", but I do not check "Sync to VBlank" because that slows things down.

Video playback in mplayer works well with the "-vo vdpau" output mode. This seems to enable some acceleration, and then common video formats run with little CPU usage. libvdpau is in slackware, so this works "out of the box". vdpauinfo is a tool at SBo to tell you what your card supports.
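For what it's worth, a typical invocation; the codec names are MPlayer's VDPAU hardware decoders for MPEG-1/2 and H.264, and the trailing comma lets MPlayer fall back to software decoding for anything else:
Code:

mplayer -vo vdpau -vc ffmpeg12vdpau,ffh264vdpau, video.mkv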

I hope to upgrade my nvidia card in the future. Something like a lower-end GeForce RTX 3050 or 3040 would be fine if these are released in the future! I'm not a big gamer, so a quiet and power-efficient card is preferable. With any luck, the new cards will be perfect for 4K screens at 120Hz refresh and good color etc. Have to wait to see.

pchristy 11-16-2020 12:21 PM

I went off AMD for a while when they stopped support for the chipset in the (then relatively new) laptop that I had. There was a brief hiatus of a couple of years, but then the open source drivers became available that solved my problems.

In the intervening period, I'd switched to NVidia for my desktop, but their drivers always seem to lag some way behind the kernels, and I found myself often having to use older kernels until NVidia caught up.

Caveat: I'm not a gamer! My main priority is video processing! For this, I've found the Intel chipsets generally more than adequate. If I have to install a discrete graphics card, my preference is for AMD, simply because the open source drivers are superb! I've never had much luck with the NVidia nouveau drivers, and, as I said, their proprietary ones always seem to be a bit behind the curve.

No such issues with either AMD or Intel, which just work "out of the box"!

--
Pete

kingbeowulf 11-16-2020 01:44 PM

Quote:

Originally Posted by enorbet (Post 6185765)
Thanks to kingbeowulf for a clear and concise "shootout". I'd like to note some minor disagreement and clarification on nVidia Cons.

Kernel Interface rebuild really just means re-running the installer. It's EZ and takes all of 3 minutes.

This assumes Nvidia updated/supports the new kernel. They tend to lag a bit.

Quote:

No important features available in Windows are missing in Linux. Performance is actually better on Linux in most cases. Example: even with benchmarks such as Shadow of the Tomb Raider, which must run through DXVK/Proton, Linux benchmarks are ~14% better than on Windows 7 Steam, at Ultra on both. The subjective feel is far smoother. The older Feral Linux port of Tomb Raider 2013 is even faster.
In terms of performance, at least at 1080p, I've not been disappointed. The RX 5700 XT can push a few games at 4K with higher eye candy than the GTX 1060 could, and was over $100 less than a similar RTX 2070/2080 (non-Ti, non-Super). As for features, I stand corrected: when I gave up Nvidia on SBo, there were a few items, such as DLSS, that had not yet hit the Linux driver.

Quote:

While AMD is indeed actively working hard at providing better bang-for-buck, so is Nvidia. The latest 3000 series ray-tracing cards (AMD is only just beginning to support ray tracing, an important feature that will only improve and spread) are such an increase in bang for buck that 2000 series owners are unloading them, and 3000 series cards are actually being "scalped" - bought specifically for resale on sites like eBay at inflated prices because of demand vs. supply.
Very few game titles have ray tracing, and those that do, AFAIK, don't have Linux ports. It was probably a good idea on AMD's part to push that to the next GPU generation and allow the kernel and Mesa to catch up. Price-wise, unless you are desperate for 4K gaming with ray tracing (FOMO?), none of the top-end RTX 2000/3000 cards are worth the price, even without scalping.

All that said, there's nothing particularly wrong with Nvidia on Linux. I switched mostly due to price, to get better performance (and drive a 4K Freesync monitor) when Slackware-current hit kernel 5.4 and Mesa 20.1, allowing good support for AMD Navi GPUs. You can see some early GPU benchmark comparisons on my LQ blog from back in Feb 2020. So far I am pleased. Each GPU has its own "tips and tricks" for optimization. Nvidia has been on top of the heap for so long that I cringe when I see the Nvidia logo pop up at game startup. That usually means some of the GPU settings will use specific Nvidia algorithms and not generic OpenGL/Vulkan, and it's not always obvious from the labels in the menu. All part of the fun.

RadicalDreamer 11-16-2020 02:09 PM

Beowulf, I'm glad you figured out what was causing the "high temps" and got it working to your liking, but I'm confused about the "high frame rate" problem. Most games are built to run at 60 FPS, but some people like FPS games with high frame rates because of input lag and to have an edge against the competition in matches. Can you run Doom 2016 without vsync and with uncapped FPS without issue? There are 240 Hz monitors now; would they have high temperatures? Was it limiting the frames that lowered the temperatures, or does it have something to do with the FreeSync technology? I really don't want to get into AMD and NVIDIA wars, but this is an important question that I think needs to be answered.

animeresistance 11-16-2020 02:49 PM

Hi.



I used Nvidia GPUs for some time (GT 520, GT 630 and 750 Ti). I like the way they handle their proprietary drivers; it is true that they were behind in kernel support, but still they are solid (at least for me) and give good performance.

But recently (in June or so) an AMD RX 5500 XT fell into my hands (I must admit, it was at a very nice discount). I installed it in the PC and started Slackware 14.2; it only worked with the generic VGA driver, so I updated to Slackware-current and it surprised me: it works without hiccups, and games run nicely with the open-source drivers, especially when the game uses Vulkan. But I must say that I am a casual gamer, so for me the performance is good. The AMD proprietary driver packages are for certain distros only, so no Slackware support. Like the others in here: if you want a newer GPU, make sure you have the latest kernel and the latest drivers (especially for AMD GPUs).

Cheers.

kingbeowulf 11-16-2020 06:48 PM

Quote:

Originally Posted by RadicalDreamer (Post 6185964)
Beowulf, I'm glad you figured out what was causing the "high temps" and got it working to your liking, but I'm confused about the "high frame rate" problem. Most games are built to run at 60 FPS, but some people like FPS games with high frame rates because of input lag and to have an edge against the competition in matches. Can you run Doom 2016 without vsync and with uncapped FPS without issue? There are 240 Hz monitors now; would they have high temperatures? Was it limiting the frames that lowered the temperatures, or does it have something to do with the FreeSync technology? I really don't want to get into AMD and NVIDIA wars, but this is an important question that I think needs to be answered.

The various VRR (variable refresh rate) methods, whether Freesync or G-Sync, are just a way to get the display and GPU to play nice. In the past, you could only set VSYNC [on | off] to tell the GPU to match (or not) whatever refresh the monitor supported. LCDs used to be fixed at either 30Hz or 60Hz, whereas with CRTs you had multisync: the monitor would match whatever the game spit out as FPS, if it could. My 17" Sony Trinitron multisync CRTs were lovely playing Quake 3 Arena at high fps. The 17" Viewsonic LCD, not so much: 60 Hz or nothing, and Vsync = on was required to remove stuttering and screen tearing. On a pre-VRR LCD, the fps display in the corner means nothing. The high refresh rate monitors were the first attempt to fix that, but they still only had ONE SPEED.

With VRR, LCD monitors can work like those old CRTs. The game's FPS is no longer locked to the LCD refresh; the LCD can now dynamically adjust based on GPU output. For that 240Hz LCD monitor, that does NOT mean the game will run at 240 fps. Rather, with a VRR-enabled monitor, the monitor will adjust (track) its refresh based on the fps produced by the GPU. For example, in-game videos/cutscenes are usually 30 or 60 fps even if the GPU can push fps > 100 in game, not to mention game areas that are "busy" and drop in fps. This removes artifacts and improves input lag.

In my case, when VRR was off, I had issues maintaining stable visuals, since I was trying to push 4K resolution, regardless of the Vsync setting. Turn on Freesync on the LCD, enable VRR and Vsync=adaptive in amdgpu, and set game VSYNC=ON, and all is glorious. There is some glitch (driver, API, game engine??) here: if Freesync was on BUT the game turned VSYNC=OFF, something weird happened where it looked like the monitor told the GPU to crank up the fps to 300+ on menus and some scenes, hence the heat! In Doom 2016 it was the in-game menu; in the Feral Interactive games, it might be menus, or a certain game scene or map.

I have no way of comparing this to Nvidia. The 24" 1080p LCD monitor and GTX1060 did not support VRR. I needed to set the eye candy to keep fps above 60 and then set VSYNC=ON to prevent tearing.

Anyway, that's all I know. It may even be correct in whole or in part.

enorbet 11-16-2020 09:54 PM

BTW, though I have no idea whether AMD GPUs have such features (I can only suppose they must), I am certain the Nvidia drivers have the option to specify "Coolbits" states in xorg.conf, and they can even be enabled or not by startup scripts per user if you have more than one. I have been setting the default fan speed and preferred profile (mine is set to 80% fan speed and performance), but you can do whatever you like.
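A minimal sketch of that setup; the Coolbits bit values are from Nvidia's README (4 enables manual fan control, 8 enables overclocking), and the attribute names in the nvidia-settings call may vary with driver version:
Code:

Section "Device"
    Identifier "Device0"
    Driver     "nvidia"
    Option     "Coolbits" "12"    # 4 (fan control) + 8 (overclocking)
EndSection

# then, per user/session, e.g. from a startup script:
nvidia-settings -a "[gpu:0]/GPUFanControlState=1" -a "[fan:0]/GPUTargetFanSpeed=80"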

slackerDude 11-16-2020 09:58 PM

Once upon a time, the ATI Rage 32 or 64 was *THE* card to have for XFree86 or whatever it was, back around kernel 0.99pl14 - I had one paired with a NEC MultiSync 5D, IIRC the timelines correctly.

Then 3dfx was a thing (and they actually worked really well as X cards :-).. But then, for quite some time, maybe around 2005? (just guessing), nvidia "just worked" and ATI was bothersome and prone to crashing, so the overwhelming advice, unless you needed 100% open source, was to buy nvidia. Or Matrox, if you only needed X and nothing else. IIRC, they had their own X11 driver/module.

Sometime between then and now, I stopped wanting to fiddle with my computer all the time, and I haven't changed my buying habits: nvidia only. Between solid X performance, bad Windows drivers for AMD GPUs (I have a Windows gaming machine; video cards for X are hand-me-downs), and such, I have found no compelling reason to switch, other than a few $$. A few $$, at this stage of life, isn't worth hours of my time.

My recent Ryzen experience - having to patch kernels, fiddle with BIOS settings, kernel parameters, etc just to get a stable Ryzen 1700 system (thankfully, it FINALLY seems to be!) has made me 100% definite that I will not be buying any AMD graphics cards in the near future. Heck - may switch back to Intel just for rock-solid out-of-the-box CPU setups. As I said - $$ isn't a primary motivator anymore.

YMMV.

enorbet 11-16-2020 10:25 PM

Hiya slackerDude. I remember that NEC monitor! Sweet!

Just FTR, coming from OS/2 rather than Windows, I always bought Nvidia even though I was very impressed with Matrox 2D performance (fonts were jaw-droppingly sharp). I had considered upgrading my Matrox Millennium G200 card to a G400 in 1999, but Nvidia's driver for OS/2 changed my mind. That same year I started using Linux, at first Mandrake, but by late 1999 I switched to Slackware. Nvidia has always just worked in Linux for me from that time to the present.

One more just FTR: I inherited an old AGP ATi video card sometime around 2004 that just worked, even with that old kernel driver (I forget which one), surprisingly well; it even rendered most YouTube videos rather well.

I've bought dozens of Nvidia cards since 1997, and each time I have bought one I registered it and commented that I chose Nvidia exactly and only because of their support for non-Windows systems. I still do that. I think they've earned it, but I'm glad to see AMD upping their game. I'm likely to mention in my next purchase that I'm considering switching for the same reason, now that AMD is working more closely with Linux.

RadicalDreamer 11-17-2020 02:22 AM

Quote:

Originally Posted by kingbeowulf (Post 6186026)
The various VRR (variable refresh rate) methods, whether Freesync or G-Sync, are just a way to get the display and GPU to play nice. In the past, you could only set VSYNC [on | off] to tell the GPU to match (or not) whatever refresh the monitor supported. LCDs used to be fixed at either 30Hz or 60Hz, whereas with CRTs you had multisync: the monitor would match whatever the game spit out as FPS, if it could. My 17" Sony Trinitron multisync CRTs were lovely playing Quake 3 Arena at high fps. The 17" Viewsonic LCD, not so much: 60 Hz or nothing, and Vsync = on was required to remove stuttering and screen tearing. On a pre-VRR LCD, the fps display in the corner means nothing. The high refresh rate monitors were the first attempt to fix that, but they still only had ONE SPEED.

With VRR, LCD monitors can work like those old CRTs. The game's FPS is no longer locked to the LCD refresh; the LCD can now dynamically adjust based on GPU output. For that 240Hz LCD monitor, that does NOT mean the game will run at 240 fps. Rather, with a VRR-enabled monitor, the monitor will adjust (track) its refresh based on the fps produced by the GPU. For example, in-game videos/cutscenes are usually 30 or 60 fps even if the GPU can push fps > 100 in game, not to mention game areas that are "busy" and drop in fps. This removes artifacts and improves input lag.

In my case, when VRR was off, I had issues maintaining stable visuals, since I was trying to push 4K resolution, regardless of the Vsync setting. Turn on Freesync on the LCD, enable VRR and Vsync=adaptive in amdgpu, and set game VSYNC=ON, and all is glorious. There is some glitch (driver, API, game engine??) here: if Freesync was on BUT the game turned VSYNC=OFF, something weird happened where it looked like the monitor told the GPU to crank up the fps to 300+ on menus and some scenes, hence the heat! In Doom 2016 it was the in-game menu; in the Feral Interactive games, it might be menus, or a certain game scene or map.

I have no way of comparing this to Nvidia. The 24" 1080p LCD monitor and GTX1060 did not support VRR. I needed to set the eye candy to keep fps above 60 and then set VSYNC=ON to prevent tearing.

Anyway, that's all I know. It may even be correct in whole or in part.

Thanks for the explanation. I have a Freesync monitor but I have an NVIDIA card, so I can't use Freesync. 60 FPS feels really bad to me in games like Doom 2016 and Quake Champions (anything much under 120 FPS feels bad), so I was curious about your experience. It sounds like the new monitor technology makes a world of difference.

Livestradamus 11-17-2020 02:45 AM

A nod to AMD Ryzen
 
Never a fanboy, but I really like where AMD is going. My last purchase, and any forthcoming, will be AMD.
Currently running -current/testing Plasma 5, and it runs splendidly on an ASUS ZenBook 14 UM431.

Daedra 11-18-2020 02:59 PM

Looks like there really is no bad choice this time around.
https://www.phoronix.com/scan.php?pa...00-linux&num=1

Bindestreck 11-18-2020 05:17 PM

Cool, thanks!

gargamel 11-19-2020 07:37 AM

Currently doing some research in preparation for a buying decision, too. While NVIDIA seemed to have the edge for several years in the past, AMD appears to be competitive again recently.
As I tend to use my hardware quite a long time before replacing it, the long-term perspective is important to me, including driver maintenance. Which is where I tend to trust the AMD open-source drivers a bit more. When NVIDIA abandons a product they might stop all development, including bug and security fixes, for the respective drivers. With the AMD OSS drivers there's, at least in theory, a chance that you can use your graphics/video card a bit longer: even if AMD drops the product, the driver may be maintained by the community for a little while, at least.

Which is why I am currently leaning towards AMD a little more than towards NVIDIA. It's obviously got nothing to do with the quality of their products. I had an NVIDIA card in the past. As long as they maintained the driver, all was good. Once they dropped the product, however, the driver wasn't updated anymore and wouldn't work with newer kernels, rendering the hardware obsolete, unfortunately.


Just my 0.02 EUR.

Daedra 11-19-2020 02:29 PM

I am leaning the other way myself. After watching and reading all the reviews yesterday, I think I am going to stick with NVIDIA this time around. Both cards are pretty much neck and neck, but when you start to enable features like ray tracing, the 3000 series pummels the new AMD cards.

petejc 11-19-2020 04:36 PM

Quote:

Originally Posted by slackerDude (Post 6186061)
My recent Ryzen experience - having to patch kernels, fiddle with BIOS settings, kernel parameters, etc just to get a stable Ryzen 1700 system (thankfully, it FINALLY seems to be!) has made me 100% definite that I will not be buying any AMD graphics cards in the near future. Heck - may switch back to Intel just for rock-solid out-of-the-box CPU setups. As I said - $$ isn't a primary motivator anymore.

YMMV.

My mileage was definitely different from yours. I waited until the 2700X came out, then waited another six months. YouTube videos at the time of the 1700 launch stated that there were teething troubles in general with Ryzen, its motherboards, and memory compatibility. I definitely had no patching issues. But then, one's timing for a new PC might be driven by multiple factors, so waiting a bit might be a luxury you cannot have.

petejc 11-19-2020 04:39 PM

Quote:

Originally Posted by Daedra (Post 6186679)
Looks like there really is no bad choice this time around.
https://www.phoronix.com/scan.php?pa...00-linux&num=1

Worth also looking at the Level1 review on YouTube.
https://www.youtube.com/watch?v=ykiU49gTNak

NaboHipersonico 10-25-2022 02:09 PM

Hello. For a few years now, AMD has been the queen on Linux, both with the free drivers and with the proprietary ones.

AMD is part of openSUSE, which is why it has been so well supported on Linux for a few years now.

I have always been an Nvidia user and currently have Nvidia. AMD used to work very badly on Linux, but now, if I had to buy a new computer, it would be all AMD, since I only use Linux and AMD has great support because the free drivers are included in the Linux kernel itself. Let's say you don't have to worry about anything: just update your system and fly.

mrapathy 10-25-2022 08:30 PM

Not all AMD graphics are great. Some Intel NUCs with AMD graphics don't have Vulkan support; Intel and AMD are dropping the ball in that regard. I have a laptop with AMD graphics and a couple of systems with NVIDIA.
Nvidia on the bleeding edge can be easier than AMD; both require a reboot anyway.

It's cool to support and further Linux with AMD's contributions and their furthering of open source software.
Nvidia is less of a headache; you don't have to recompile the kernel.

I've been on Linux since Matrox, ATI, Nvidia and 3dfx were competitors. AMD doesn't spend much on GPUs.
Over 20 years of experience with X and drivers.

NaboHipersonico 10-26-2022 04:51 AM

Quote:

Originally Posted by mrapathy (Post 6388639)
Not all AMD graphics are great. Some Intel NUCs with AMD graphics don't have Vulkan support; Intel and AMD are dropping the ball in that regard. I have a laptop with AMD graphics and a couple of systems with NVIDIA.
Nvidia on the bleeding edge can be easier than AMD; both require a reboot anyway.

It's cool to support and further Linux with AMD's contributions and their furthering of open source software.
Nvidia is less of a headache; you don't have to recompile the kernel.

I've been on Linux since Matrox, ATI, Nvidia and 3dfx were competitors. AMD doesn't spend much on GPUs.
Over 20 years of experience with X and drivers.

Currently yes; years ago, no, because the drivers are now included in the Linux kernel itself. You only have to check before buying that the card is compatible with the Linux distribution you use, but you should do that with Nvidia too. Make sure, and if AMD has free drivers available for your distribution, then go ahead: you're going to be amazed.

dugan 10-28-2022 10:48 AM

Just want to say that I’ve reported the spam a few posts up (from “RoseSpears”) repeatedly and nothing was done. Not even after the spammer edited the post to change the spam payload in its body into an actual link.

EDIT: It's finally gone. Good.

TurboBlaze 03-26-2023 12:12 PM

I have these warnings
Code:

WARNING: radv is not a conformant Vulkan implementation, testing use only.
WARNING: lavapipe is not a conformant vulkan implementation, testing use only.

with Mesa 21.3.5 and my AMD Radeon RX 6600, but with Mesa 23.0.0 I don't have any warnings.
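One way to see which Vulkan implementation is actually being picked up, assuming vulkaninfo (from vulkan-tools) is installed:
Code:

vulkaninfo | grep -iE "deviceName|driverName"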

biker_rat 03-30-2023 10:38 AM

The Linux driver support for the RX 6600 is still a bit in the development phase, so if you want to get the most out of it, use the latest kernel and Mesa.

