05-07-2022, 02:20 PM | #16
business_kid, LQ Guru (Original Poster)
Registered: Jan 2006
Location: Ireland
Distribution: Slackware, Slarm64 & Android
Posts: 17,704
Quote:
Originally Posted by Timothy Miller
The biggest advantage of Zen 3 over Zen 2 is that the IPC is SIGNIFICANTLY better, so single-threaded performance gets a big boost, and cache efficiency is massively better with the redesign of the chiplets. This CAN help in certain cases (such as gaming), but doesn't improve performance in ALL circumstances. In mobile, Zen 3 (Cezanne vs. Renoir, but not Lucienne) also has a fairly large power efficiency improvement, but that's not really relevant for you given you're looking at desktop.
Overall, given you're looking 2nd hand, Zen 2 or 3 would be perfectly adequate. They both (assuming you're getting a CPU, not an APU) have 24 lanes of PCIe (4 lanes reserved for the chipset, so 20 in total usable by anything directly connected), and both support PCIe 4.0 depending upon the chipset they're used on (300 or 400 series = PCIe 3.0 only; X570 makes more 4.0 lanes available than B550, while the A520 has none, like the older chipsets).
I dare say you're overthinking the graphics card too much. Yes, a 128-bit memory bus does limit a card, but generally the cards that use it are cut down enough that they wouldn't gain THAT much of an advantage from a wider bus. Ultimately, how much 3D do you do? If you're not doing a lot, and don't plan on doing a lot, something like the 6600 XT is still going to be FAR more than you need. If you do need 3D, how much? Do you simply play esports titles? Do you do a lot of AAA gaming? Lots of CAD/CAM design? Video postprocessing? Actual use case is very important for deciding what you need. For instance, I play only a few older games and exceedingly light "time waster" games, so the Vega 8 IGP on my 5850U is actually perfectly adequate; I don't really need anything more powerful than that, and even less powerful still works fine for me. As to your point about Nvidia, that I 100% agree on. I'd go with the new Intel Arc, despite all the potential pitfalls of first-gen hardware & drivers, before I'd buy Nvidia myself.
Ah, (CPU) Instructions Per Cycle is the sort of detail I can understand, but never find. In fact Zen 2 outperformed Zen 3 in one crazy test (5800Us with IGP, I think, sweating hard). The reviewer didn't seem to know which CPU was in turbo, or by how much. On paper the internal layout of Zen 3 is a better design, but that won't necessarily translate into higher numbers in all tests. There seems to be a shortage of hardware-savvy reviewers.
I don't play games at all. I have stroke damage.
I don't think I'm overthinking the graphics thing. The first question I asked was "Is 4k going to take over?" It seems not. So the RX 6600 XT is fine for 1080p (it might be slow @ 4k, but I don't have to go there). When I was looking at new boxes it was often a cheap graphics option with more than adequate performance @ 1080p. The Nvidia GTX 1080 Ti, by comparison, has a 352-bit memory bus, but I don't want Nvidia. I'm trying to have this box still relevant in 5 years. I know I should be waiting for AM5-socketed motherboards & CPUs, but you have to draw a line somewhere. And bargains often come here slightly used.
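To put numbers on the bus-width point: peak memory bandwidth is just effective memory clock × bus width / 8. A quick sketch (the clock figures are the cards' nominal specs, so treat them as assumptions to check against the actual card):
Code:
# Peak memory bandwidth (GB/s) = effective clock (GT/s) * bus width (bits) / 8.
# Clock figures below are nominal specs, not measured values.
def bandwidth_gbs(effective_gtps, bus_bits):
    return effective_gtps * bus_bits / 8

print(bandwidth_gbs(16, 128))  # RX 6600 XT: 16 Gbps GDDR6, 128-bit -> 256.0 GB/s
print(bandwidth_gbs(11, 352))  # GTX 1080 Ti: 11 Gbps GDDR5X, 352-bit -> 484.0 GB/s
The 6600 XT also carries 32MB of Infinity Cache to offset that narrow bus, so the raw number understates it somewhat.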
I have an Intel graphics card ATM. I am definitely not getting another - period. Not now, probably never. It will be very tough for Intel to catch up, if they ever do.
05-07-2022, 02:39 PM | #17
Moderator
Registered: Feb 2003
Location: Arizona, USA
Distribution: Debian, EndeavourOS, OpenSUSE, KDE Neon
Posts: 4,031
The 6600 XT can push a 2D 4k display perfectly fine. It can't do 4k gaming very well (or at all), but 2D displays it will drive just fine. One of my old laptops with UHD 600 graphics (the SUPER cut-down IGP on Celerons) was able to drive a 4k display without any issues. It was just essentially useless for anything that required 3D.
05-07-2022, 04:33 PM | #18
business_kid, LQ Guru (Original Poster)
Registered: Jan 2006
Location: Ireland
Distribution: Slackware, Slarm64 & Android
Posts: 17,704
Hmmm... How is it on 1080p 3D (no gaming): FreeCAD, Google Earth, Blender? And the VM equivalents? There are just so many hidden obstacles to trip over. I knew my way around back in the day of Northbridge/Southbridge, but this is a very different landscape.
Oh, BTW, whoever said it: I don't want RS-232. Not now, nor ever in the future. I have suffered with that protocol. If something uses RS-232, I will not buy it. I could suffer a port existing on an otherwise good m/b, but nothing would ever get plugged into it.
05-07-2022, 04:43 PM | #19
Moderator
Registered: Feb 2003
Location: Arizona, USA
Distribution: Debian, EndeavourOS, OpenSUSE, KDE Neon
Posts: 4,031
The 6600 XT was designed around current AAA titles running at max or near-max settings at 1080p, so that's exactly what it's geared towards.
05-07-2022, 10:34 PM | #20
Senior Member
Registered: Aug 2016
Posts: 3,345
Quote:
Originally Posted by business_kid
Hmmm... How is it on 1080p 3D (no gaming): FreeCAD, Google Earth, Blender? And the VM equivalents? There are just so many hidden obstacles to trip over. I knew my way around back in the day of Northbridge/Southbridge, but this is a very different landscape.
I think you are overthinking things. Instead of worrying about "what if", think about what it actually can do. Will it meet your needs today? If your basic needs stay the same in the future, will it handle the load? If the answer is yes, then why worry about changes you don't intend to take advantage of anyway?
4k video -- almost nonexistent on the web.
Gaming -- you said it will not be an issue.
Heavy graphics use -- not really.
VMs -- you said you may use one. 16 GB of RAM is more than enough for one VM, and with a 6-core/12-thread processor you could easily allocate 4 GB of RAM and 2 or even 4 cores (threads) to the VM without hurting anything else you are doing (see the sketch below).
If you want actual 3D then you may need a better GPU, but for 99.9% of what you have said you need, it is quite adequate, if not still a bit of overkill.
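A quick way to sanity-check the host before carving out a VM; a minimal sketch, assuming Linux and Python 3 (the half-for-the-host rule is just a rule of thumb, not gospel):
Code:
import os

# Logical CPUs (threads) and total RAM from /proc/meminfo (MemTotal is line 1, in kB).
threads = os.cpu_count()
with open('/proc/meminfo') as f:
    ram_gib = int(f.readline().split()[1]) / (1024 * 1024)

print(f"Host: {threads} threads, {ram_gib:.1f} GiB RAM")
# Conservative ceiling: leave at least half of each to the host.
print(f"VM ceiling: {threads // 2} vCPUs, {ram_gib / 2:.0f} GiB")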
05-08-2022, 01:14 AM | #21
LQ Addict
Registered: Dec 2013
Posts: 19,872
^ Yep.
Quote:
Originally Posted by business_kid
That question came first because it has a lot of implications - graphics, network, storage, etc.
And I addressed this.
While most people in the world have access to the internet nowadays, most people also do not have access to a lot of bandwidth or top-shelf hardware.
That's why I showed you the YT available-formats example. YT provides low and very low resolutions, and I don't see that changing (significantly) anytime soon.
I can only presume the same goes for Netflix & Co.
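If you want to reproduce that formats check yourself, here's a minimal sketch using yt-dlp's Python API (I'm assuming the documented 'formats' keys; the URL is a placeholder):
Code:
# List the formats YouTube actually offers for one video. Requires: pip install yt-dlp
from yt_dlp import YoutubeDL

def list_formats(url):
    with YoutubeDL({'quiet': True}) as ydl:
        info = ydl.extract_info(url, download=False)
    for f in info['formats']:
        height = f.get('height') or 'audio'  # audio-only entries have no height
        print(f"{f['format_id']:>6} {f['ext']:>5} {height}")

list_formats('https://www.youtube.com/watch?v=...')  # placeholder URL
Run it against almost any video and you'll typically see the ladder bottoming out at 144p, which is the point being made.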
05-08-2022, 04:10 AM | #22
business_kid, LQ Guru (Original Poster)
Registered: Jan 2006
Location: Ireland
Distribution: Slackware, Slarm64 & Android
Posts: 17,704
Maybe I am overthinking. I think it's fairer to say I'm thinking.
I tried specifying a box with the default web settings on one of these 'build your own box' websites, and it kept throwing errors because the defaults were dud parts: I'd be using 2 PCIe slots when the m/b only had 1, that sort of thing. It came home to me that I could nobble any new box before I started by not knowing what I was specifying. I had just been keeping my head down and not reading up on hardware at all. I'm doing the homework now.
05-08-2022, 12:24 PM | #23
business_kid, LQ Guru (Original Poster)
Registered: Jan 2006
Location: Ireland
Distribution: Slackware, Slarm64 & Android
Posts: 17,704
I'm marking this solved, because thanks to you guys I have more or less an idea of how to go about it.
On the Radeon 6600 XT, some other random facts:
- PCIe 4.x matters for this card, because it only uses 8 PCIe lanes!
- Consensus is that it's good for 1080p but poor at 4k.
- 3D is weaker on this and all AMD cards, because Nvidia is ahead in the internal design.
- It seems a good cheap compromise, because its big brother the RX 6700 XT costs about €500 more!
- Power efficiency is a strong point: 160W max, probably thanks to a smaller fab process.
05-08-2022, 03:44 PM | #24
mrmazda, LQ Guru
Registered: Aug 2016
Location: SE USA
Distribution: openSUSE & OS/2 24/7; Debian, Knoppix, Mageia, Fedora, others
Posts: 6,565
I feel sorry for the pocketbooks of people reading threads like this. It's as if the anti-IGP proponents haven't used an IGP in over a decade, and only the most expensive and powerful hardware will do, no matter what the actual use cases are.
All new, and some used, motherboards that I've acquired in the past decade support IGPs, and every newer one is better. For 2k performance they got good enough for me 8 years ago, and they have continued to get better since. 4k I've not meaningfully tested, as my eyes aren't good enough for it to be worth spending extra, and my PCs are tools, not toys. The boards acquired in the past 5 years all have 4 graphics outputs and support 2, if not 3, displays at once.
My newest is almost 6 months old: LGA1200, 6 cores, 12 threads, UHD 730 graphics, NVMe 4.0, plenty fast with 16GB DDR4, at low cost, using little power, 65W TDP, overkill PSU @ 500W. My primary PC, the one I write this with, is 8 years old, using the IGP on an i3-4150T of 35W TDP, with 32GB DDR3, though I have 3 IGP PCs that are much newer and clearly faster.
1 member found this post helpful.
05-08-2022, 04:06 PM | #25
Arnulf, Member
Registered: Jan 2022
Location: Hanover, Germany
Distribution: Slackware
Posts: 321
IGPs have some systemically unsolvable disadvantages against dedicated graphics cards.
05-08-2022, 04:25 PM | #26
mrmazda, LQ Guru
Registered: Aug 2016
Location: SE USA
Distribution: openSUSE & OS/2 24/7; Debian, Knoppix, Mageia, Fedora, others
Posts: 6,565
Quote:
Originally Posted by Arnulf
IGPs have some systemically unsolvable disadvantages against dedicated graphics cards.
Right, nothing but the best will do, no matter how much overkill it is for the actual need. OTOH, physical proximity of the IGP to the CPU provides its own advantage over a GPU located at a great relative distance.
Making decisions nearly always requires tradeoffs. The best decisions require the best available information, which here includes the actual needs. Today's internet is designed for mobile phone users; PC IGP users can easily overkill that need. It's only gamers and other non-average users who need better than what today's IGPs provide at low initial and continuing cost.
1 member found this post helpful.
05-09-2022, 04:41 AM | #27
business_kid, LQ Guru (Original Poster)
Registered: Jan 2006
Location: Ireland
Distribution: Slackware, Slarm64 & Android
Posts: 17,704
Last time I bought, I got a box that was OK at the time, based on what I needed & didn't need. Now it's sadly lacking. But to an extent I agree with mrmazda: we don't need top spec to do ordinary things. I can surf and watch videos over HDMI (1.0) on a Raspberry Pi 4 running @ 1.5GHz. It can drive 2×4k monitors @ ≅30FPS. But jump forward in a video, and it spends seconds catching up and redrawing the screen.
Electronics is never a static landscape. If some new breakthrough allows advances, it will become commonplace, obsoleting what we all have. If we were all buying on Arnulf's budget, this would be simpler.
As for IGPs: after landing two in succession that couldn't reliably keep lips in sync with speech while driving fullscreen images, I was determined to stay away from them. I know it's painful on the pocket, but manufacturers leave nothing in the wattage equation for an IGP. My current IGP is 10-15W; my next GPU will probably be a very power-efficient ≤160W part. How much of that gap is because it's not integrated? Not that much, I imagine.
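FWIW, you can watch what a card actually draws, rather than the number on the box. A sketch, assuming an amdgpu card at card0 and the usual hwmon layout (both assumptions; adjust the path for your card):
Code:
# Read the GPU's average power draw from amdgpu's hwmon interface (microwatts).
import glob

paths = glob.glob('/sys/class/drm/card0/device/hwmon/hwmon*/power1_average')
if paths:
    with open(paths[0]) as f:
        print(f"{int(f.read()) / 1_000_000:.1f} W")
else:
    print("no power sensor exposed (not amdgpu, or an older kernel)")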
You can tell what the manufacturers think. They write drivers for IGPs as they come out, but they don't develop them - what's the point?
05-09-2022, 06:06 AM | #28
business_kid, LQ Guru (Original Poster)
Registered: Jan 2006
Location: Ireland
Distribution: Slackware, Slarm64 & Android
Posts: 17,704
Hey, in the CPU specs (Ryzen 5 5600 & 5600X) it says they have 20 PCIe 4.0 lanes (CPU only).
What's that about? Does it imply some limitation?
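For checking what link each device actually negotiates, sysfs exposes the standard PCI attributes; a minimal sketch (Python 3 assumed):
Code:
# Show negotiated vs. maximum PCIe link for every PCI device.
import glob, os

def read(dev, attr):
    with open(os.path.join(dev, attr)) as f:
        return f.read().strip()

for dev in sorted(glob.glob('/sys/bus/pci/devices/*')):
    try:
        cur = f"x{read(dev, 'current_link_width')} @ {read(dev, 'current_link_speed')}"
        top = f"x{read(dev, 'max_link_width')} @ {read(dev, 'max_link_speed')}"
    except OSError:
        continue  # devices without a PCIe link don't expose these attributes
    print(f"{os.path.basename(dev)}: {cur} (max {top})")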
05-09-2022, 08:43 AM | #29
mrmazda, LQ Guru
Registered: Aug 2016
Location: SE USA
Distribution: openSUSE & OS/2 24/7; Debian, Knoppix, Mageia, Fedora, others
Posts: 6,565
Quote:
Originally Posted by business_kid
I can surf and watch videos over HDMI (1.0) on a Raspberry Pi 4 running @ 1.5GHz. It can drive 2×4k monitors @ ≅30FPS. But jump forward in a video, and it spends seconds catching up and redrawing the screen.
Don't blame the graphics system when the bottleneck is storage I/O or the copper pipe providing all that data. 4k represents a lot of data. NVMe on whatever you buy next can knock that problem down to a size that is reasonable for most folks.
05-09-2022, 09:10 AM | #30
Senior Member
Registered: Sep 2014
Distribution: Slackware
Posts: 1,918
Quote:
Originally Posted by business_kid
FreeCAD, Google Earth, Blender?
You'll regret buying an APU (a CPU with integrated graphics) if you really need CAD or any sort of 3D rendering.
An APU is mostly for users who just watch YouTube and do nothing else with the computer.
Workstations should have a dedicated GPU, so an APU is a waste of money, if you want my advice.