Monitor resolution to low with nvidia geforce in Mint 17
Linux - Newbie: This Linux forum is for members that are new to Linux.
Just starting out and have a question?
If it is not in the man pages or the how-to's this is the place!
Install the appropriate nvidia driver!
That is, the proprietary driver, not the open-source nouveau driver.
I think Mint even has a GUI for it (Additional Drivers).
I could be wrong but I think that support has been dropped for that card even in the legacy proprietary driver.
According to this Wikipedia article it came to market in 2002, and that's a looooong time ago as computer development goes.
According to the same article, the 96.43.xx drivers are the last proprietary drivers available, downloadable for 32-bit Linux from here
- of course you can try with nouveau (open source, sort of reverse-engineered nvidia driver) but either way it's going to be painful.
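Before installing anything, it can help to confirm what the kernel currently binds to the card. A quick check with standard tools (output varies by system):

```shell
# Show the VGA adapter and the kernel driver currently in use:
lspci -k | grep -A 3 -i vga
# See whether the nouveau module is loaded:
lsmod | grep nouveau
```

If nouveau shows up as the driver in use, the Mint GUI (Driver Manager / Additional Drivers) is the easiest place to switch, assuming a proprietary driver for the card still exists.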
It works OK on mine; I used the sgfxi script to install the vesa driver because I had problems with the open-source nvidia driver (nouveau) and kernel 3.13.
I have a GeForce MX440 running 1600x1200 on an IBM G96.
Tell xorg.conf what modes it can use and which is the default mode, and it will do that.
Using the 96.43 drivers improved some things like OpenGL.
An xorg.conf Monitor section would not hurt either; it depends on how much your monitor reports to queries.
See >> man xorg.conf
If your monitor does not like the default timings it is possible to use a video mode timing tool to explore the video timing and write a modeline for each mode you like.
(And I cannot remember the name of that tool, but that can be another question if you need it).
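The tool in question may be "xvidtune" (mentioned later in this thread); alternatively, the `cvt` and `gtf` utilities that ship with the X server will print a ready-made modeline without interactive tuning. A sketch:

```shell
# Print a CVT modeline for 1600x1200 at 60 Hz:
cvt 1600 1200 60
# Older CRTs sometimes prefer GTF timings:
gtf 1600 1200 60
```

Either command prints a `Modeline "..."` line that can be pasted into the Monitor section of xorg.conf.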
Last edited by selfprogrammed; 06-24-2014 at 11:08 PM.
How did you set up your xorg.conf to get your GeForce MX440 running at 1600x1200?
I've tried many things without success.
Since people say it's not possible to install nvidia-96 on LM17, I would be happy to find any workaround. My screen needs 1680 x 1050, but the default driver doesn't offer anything above 1280 x 1024.
Thanks.
[Edit:] Eventually I got my nvidia GeForce4 MX 440 working well on Linux Mint 17, using the "Nouveau" driver and modifying the grub file with GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nouveau.noaccel=1" so there are no more freezes.
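For anyone repeating this on Mint/Ubuntu, the change above is applied roughly like this (the file path and `update-grub` are standard on these distros):

```shell
# Edit the default kernel command line:
sudo nano /etc/default/grub
#   ... set: GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nouveau.noaccel=1"
# Regenerate the grub configuration, then reboot:
sudo update-grub
```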
My xorg.conf started in 1998 and has been modified with each update.
My first step with new hardware is to run "xvidtune" and determine the correct
tuning for each video mode. This augments the available VESA modes.
Write down the mode you want and inform xorg.conf as in mine.
Sometimes you can choose to use -hsync or -vsync to indicate to the monitor to use an alternative timing (they were the original method of switching modes).
With a slight variation in mode name, you can test several modelines and choose the best.
This is very complicated because there is much stuff that was used for testing and is left around for future reference. Next time I do not want to reinvent everything, and I want to know what I tried before, so I leave stuff around. The xorg Screen section only uses the Device
that you reference BY NAME.
If VESA does not generate a modeline for your monitor, that mode (1600x1200) will not even appear as an option. I think the VESA modes come from some DPMS query. Any modeline you specify will be accepted unless it violates some sanity checks against the monitor limits, like the clock being too high or the mode exceeding the horizontal frequency limits. Looking at the Xorg log gives you all the messy details of VESA modes and rejected modelines.
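As an aside, a modeline can be tried at runtime with xrandr before committing it to xorg.conf (assuming the X RandR extension is available; the output name `VGA-0` below is a placeholder -- check `xrandr -q` for yours -- and the timing numbers are illustrative CVT values for 1680x1050@60, so generate your own with `cvt`):

```shell
# List outputs and their current modes:
xrandr -q
# Register a new mode from a cvt-generated modeline:
xrandr --newmode "1680x1050_60" 146.25 1680 1784 1960 2240 1050 1053 1059 1089 -hsync +vsync
# Attach it to an output and switch to it:
xrandr --addmode VGA-0 "1680x1050_60"
xrandr --output VGA-0 --mode "1680x1050_60"
```

A mode that works here can then be written into xorg.conf to make it permanent.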
But this is the one I currently use.
Notice that almost every driver option is commented out, which allows the driver to choose the default. You have to read the driver docs to decide on what options may be useful.
These have been working well for me.
There is no problem with this card going to 1600x1200 or higher.
This is where I select what I am using now. The stuff I tested is commented out.
Notice that there are four driver devices, but only one is not commented out.
There are lines commented out that were for testing some odd modes, but the ones I settled on are the only ones that are not commented out. (I Removed some of these for clarity in this listing).
Notice that there is a separate subsection for 8-bit, 16-bit, etc. Only the display depth that you are currently using will have any effect; changing the wrong one does nothing. From the amount of stuff in the 24-bit section you might think I am using 24-bit now, but I would not be surprised if it is actually 32-bit depth for daily use.
Notice the strange names used to distinguish my modelines. The numbers in those names are just part of the name.
Notice that I use my custom modeline for 1600x1200 ("G1600x1200p12"), which was my 12th try at it apparently.
VESA will create a set of default modelines too, which you can use. Those are the simpler names (like 1024x768).
Section "Screen"
# Device "nVidia440"
# Device "VESA Framebuffer"
# If your card can handle it, a higher default color depth (like 24 or 32)
# is highly recommended.
# DefaultDepth 32
# "1024x768" is also a conservative usable default resolution. If you
# have a better monitor, feel free to try resolutions such as
# "1152x864", "1280x1024", "1600x1200", and "1800x1400" (or whatever your
# card/monitor can produce)
Identifier "Screen1"
# Device "nVidia440nv"
Device "nVidia440"
# Device "nv_std"
# Device "VESA Framebuffer"
Monitor "G96"
DefaultDepth 24
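The Screen section above refers to a Monitor named "G96" and a custom modeline "G1600x1200p12"; a matching Monitor section would look something like the sketch below. The sync ranges and modeline numbers here are examples only -- the timing shown is the standard VESA 1600x1200@60 mode, not the poster's actual 12th-attempt modeline -- so take real limits from your monitor's manual and generate timings with cvt/gtf:

```
Section "Monitor"
    Identifier   "G96"
    # Limits from the monitor's manual -- replace with your own:
    HorizSync    30.0 - 96.0
    VertRefresh  50.0 - 160.0
    # Custom modeline; the name is arbitrary, the numbers are not:
    Modeline "G1600x1200p12"  162.0  1600 1664 1856 2160  1200 1201 1204 1250  +hsync +vsync
EndSection
```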