General > Do flat screen monitors wear out?

hazel 06-17-2021 06:39 AM

ISTR our first family TV had a 12" screen. The earliest computer screens I can remember were called visual display units (VDUs) and had a built-in keyboard.

Actually a smaller screen has sharper characters if the one I'm using now is anything to go by.

enorbet 06-17-2021 07:10 AM

Sharper characters are mainly a function of resolution, which is in turn a function of GPU and monitor, but there is a LOT more to it than mere resolution. Way back in my OS/2 days, when I was using a decent ViewSonic CRT but an ultra-cheap S3 GPU, I finally broke down and bought a Matrox MGA GPU which had, as I recall, 4MB of VRAM. Several guys on message boards said I was foolish for wasting my money and showed the math that 1MB was sufficient for 1024x768 resolution. What they apparently didn't grasp is that digital differs from analog in many ways, and that every tiniest detail requires processing power and RAM. The first time I booted up with the MGA installed, the desktop leapt onto the screen (it was many times faster), but what literally dropped my jaw for a full, stunned minute wasn't just the speed: it was how sharply the fonts were rendered. It was absolutely astonishing.
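The message-board math alluded to above is easy to reproduce: a plain framebuffer needs width x height x bytes-per-pixel, which is why 1MB "covers" 1024x768 at 8-bit color. A rough sketch (it counts only the raw frame, not the font caches and acceleration buffers a card like the Matrox MGA also used its VRAM for, which is the poster's point):

```python
# Rough framebuffer-size math of the kind quoted on those message boards.
# This only counts the raw frame; it ignores font caches, acceleration
# buffers, and everything else a graphics card uses its VRAM for.

def framebuffer_bytes(width, height, bits_per_pixel):
    """Bytes needed to hold one frame at the given color depth."""
    return width * height * bits_per_pixel // 8

# 1024x768 at 8-bit (256 colors) fits comfortably in 1 MB...
print(framebuffer_bytes(1024, 768, 8))   # 786432 bytes (~0.75 MB)

# ...but the same mode at 24-bit true color already needs over 2 MB.
print(framebuffer_bytes(1024, 768, 24))  # 2359296 bytes (~2.25 MB)
```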

One needs to consider what one views, and whether 2D or 3D use is common, in order to be thrifty but effective in choosing a graphics solution. Still, most OEM onboard GPUs these days have many times the processing power of that Matrox MGA and can be set in BIOS/UEFI setup to share whatever amount of RAM one is willing to devote to getting proper font rendering and speed. A discrete graphics card has the advantage of not only a dedicated amount of RAM but WAY faster VRAM. The specs of a graphics card I bought for $250 years ago are now matched or exceeded by any off-the-shelf card at 1/10th the price. That is not an off-the-cuff guess but an actual researched observation. In fact, eBay has new, brand-name 1GB video cards for as low as $15. It amazes me to even be able to quote such a figure. I think I paid $100 for the 4MB Matrox just 20 years ago! Oh man, I AM getting old ;)

hazel 06-17-2021 07:28 AM

I don't have a graphics card. I have one of those system-on-a-chip things. Apparently that's the wave of the future.

obobskivich 06-17-2021 10:03 AM


Originally Posted by hazel (Post 6259822)
It finally gave up the ghost. RIP. For weeks, I just didn't turn it off at night; I let the computer switch itself off but left the main switch on to keep the monitor running. And it worked just fine. But yesterday the forecast said there would be violent thunderstorms overnight and I know that can sometimes cause surges in the power supply. I didn't want the computer or its external power unit to run any risk of being damaged, so I switched off at the main for the first time in two months. Now the monitor doesn't work any more.

I'm using a spare 17" screen for the time being. Now that the shops are open again, I can go and buy a new 19" one.

Prior to this post I was going to posit that it may be a bad connector on the display itself. I had a similar Dell a few years ago which would act this way if the DVI connection was used, but any of the other inputs (I think it had 3 or 4) would be fine. I eventually sent it to recycling and replaced it with a similar model I found, on which the DVI port does work (digital video is rapidly becoming the only way to connect some machines, after all). But your failure is likely power-supply related, as others have suggested. As to the broader question of "do flat panels go bad": in my experience yes, but the only ones I've seen fail (yet) are CFL-backlit models. I've yet to have an LED-backlit one give me any issues, though I'm down to probably 2 or 3 surviving CFL models, the oldest of which I think is around 13 years old.

If you like the non-widescreen monitors, I'd keep an eye out for the newer NEC models (like the AS172, which I think has actually been superseded by an even newer offering), as they're "modern" while still being 5:4 displays (modern as in: they offer HDCP, are LED-backlit, don't have massive bezels, don't weigh 40 lbs, etc). Otherwise I agree with enorbet (more or less) on picking something "nice" with the anticipation of it living beyond a single machine/use. As far as "sharpness" goes, pixel pitch (the physical size of the pixels on the display) is a factor, so the resolution/size relationship matters: of two monitors with the same resolution, the smaller one will be "finer". And as enorbet points out, whatever is "driving" the display also makes a difference, although I'm not sure how significant that is within the context of more modern GPUs/IGPs. I know Matrox had a golden reputation for output quality "back in the day", but nowadays most graphics cards/chips don't even provide analog outputs, and the few digital-to-analog converter dongles I've used seem to offer pretty clean outputs too.
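The pixel-pitch relationship described above can be put into numbers: pixels per inch (PPI) is the pixel diagonal divided by the physical diagonal, so the smaller of two same-resolution panels has finer pixels. A quick sketch (the 17" and 19" 1280x1024 panels are illustrative, not specific models from the thread):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: pixel-count diagonal over physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

def pixel_pitch_mm(width_px, height_px, diagonal_in):
    """Physical size of one pixel in millimetres (25.4 mm per inch)."""
    return 25.4 / ppi(width_px, height_px, diagonal_in)

# Two hypothetical 1280x1024 (5:4) panels: same resolution, so the
# 17" has a higher PPI (finer pixels) than the 19".
for size in (17, 19):
    print(f'{size}": {ppi(1280, 1024, size):.1f} PPI, '
          f'pitch {pixel_pitch_mm(1280, 1024, size):.3f} mm')
```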

fatmac 06-17-2021 12:26 PM

If you are considering a new monitor, Novatech are a good firm to deal with.

(I've bought several computers/laptops/monitors/drives, etc. from them.)

enorbet 06-17-2021 02:15 PM


Originally Posted by hazel (Post 6259845)
I don't have a graphics card. I have one of those system-on-a-chip things. Apparently that's the wave of the future.

It is if you don't use your PC for movies, gaming, or CAD work. Since I do all three of those, any integrated graphics I'm forced to own gets used exactly once, just long enough to enter BIOS and turn it off, and the same goes for integrated audio. There is simply no way for an on-chip solution to handle as much power as those chores require to do a proper job. The thermals alone preclude such a thing, at least for the foreseeable future. As Robert Heinlein noted, "TANSTAAFL!" (There Ain't No Such Thing As A Free Lunch)
