Linux - Hardware
This forum is for Hardware issues. Having trouble installing a piece of hardware? Want to know if that peripheral is compatible with Linux?
manual setting of xorg.conf
hey guys, I need some help here. I got a Samsung P2070 LCD monitor, and when I plug it into the DVI port of my GeForce4 MX420 (running Jaunty Jackalope), the image shrinks and blinks continuously... On WinXP I had to set a "custom timing" of CVT-RB to make it work. On Linux, the nvidia control panel doesn't have that option, so I checked a lot of forums and I believe I have to set this manually in the xorg.conf file. Can anyone help me?
These are my monitor specs:
General
  Model Name: SyncMaster P2070, P2070G

LCD Panel
  Size: 20 inch (49 cm)
  Display area: 442.8 mm (H) x 249.08 mm (V)
  Pixel Pitch: 0.2768 mm (H) x 0.2768 mm (V)

Synchronization
  Horizontal: 30 ~ 81 kHz
  Vertical: 56 ~ 60 Hz

Display Color: 16.7 M

Resolution
  Optimum resolution: 1600x900 @ 60 Hz
  Maximum resolution: 1600x900 @ 60 Hz

Input Signal, Terminated
  DVI (Digital Visual Interface)-I: 0.7 Vp-p ± 5 %
  Separate H/V sync, Composite, SOG: TTL level (V high ≥ 2.0 V, V low ≤ 0.8 V)

Maximum Pixel Clock: 108 MHz (Analog, Digital)

Power Supply: AC 100 - 240 V~ (± 10 %), 50/60 Hz ± 3 Hz

Signal Cable
  29-pin DVI-A to D-sub cable, detachable
  24-pin DVI-D to DVI-D cable, detachable (sold separately)

Dimensions (W x H x D) / Weight (Simple Stand)
  Without stand: 500 x 325 x 47 mm (19.7 x 12.8 x 1.9 inch)
  With stand: 500 x 382 x 190 mm (19.7 x 15.0 x 7.5 inch) / 3.3 kg (7.3 lbs)

Environmental considerations
  Operating: temperature 50˚F ~ 104˚F (10˚C ~ 40˚C), humidity 10 % ~ 80 %, non-condensing
  Storage: temperature -4˚F ~ 113˚F (-20˚C ~ 45˚C), humidity 5 % ~ 95 %, non-condensing

Plug and Play Capability
  This monitor can be installed on any Plug & Play compatible system. The monitor and the computer negotiate the best operating conditions and monitor settings. In most cases installation proceeds automatically, unless the user wishes to select alternate settings.

Dot Acceptable
  The TFT-LCD panel in this product is manufactured with advanced semiconductor technology to a precision of better than 1 ppm (one millionth). Even so, individual red, green, blue or white sub-pixels may occasionally appear bright, and some black pixels may be visible. This is not a quality defect, and the monitor can be used without any problems. For reference, this product contains 4,320,000 TFT-LCD sub-pixels.

Display Mode (VESA, 1600 x 900)
  Horizontal Frequency: 60.000 kHz
  Vertical Frequency: 60.000 Hz
  Pixel Clock: 108.000 MHz
  Sync Polarity (H/V): +/+
This is my xorg.conf file
######
Section "Monitor"
Identifier "Configured Monitor"
EndSection
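For reference, a sketch of what a filled-in Monitor section could look like for this panel, assuming the nvidia driver. The sync ranges come from the spec sheet above, and the modeline is what `cvt -r 1600 900 60` (CVT reduced blanking) produces; this is a guess at your own risk, not a known-good config for this exact card. Reduced blanking matters here because the standard CVT 1600x900@60 timing needs about 118 MHz of pixel clock, above this panel's 108 MHz maximum, while the reduced-blanking timing fits at 97.75 MHz.

```
Section "Monitor"
    Identifier   "Configured Monitor"
    HorizSync    30.0 - 81.0
    VertRefresh  56.0 - 60.0
    # CVT reduced-blanking timing, as printed by `cvt -r 1600 900 60`.
    # Reduced blanking keeps the pixel clock (97.75 MHz) under the
    # panel's 108 MHz maximum; the standard CVT timing needs ~118 MHz.
    Modeline "1600x900R"  97.75  1600 1648 1680 1760  900 903 908 926 +hsync -vsync
EndSection
```

With the proprietary nvidia driver you may also need Option "ExactModeTimingsDVI" "true" in the Device section, since that driver otherwise tends to replace your modeline with EDID-derived timings on DVI.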
I'll have a go, in a lazy sort of way. http://www.mythtv.org/wiki/Modeline_...#VESA_ModePool
0. Dot Clock, in MHz. This tells the card how fast to go and, as a consequence, what your refresh rate is. Simple multiplication gets you close.
The other numbers in a modeline mean (first the horizontal ones):
1. Right hand side visible
2. Right hand side invisible
3. Left side invisible
4. Left side visible. This is the highest number, and there your pixels roll over to zero, and you start again.
(Then the vertical ones)
5. Bottom Visible
6. Bottom Invisible
7. Top Invisible
8. Top Visible.
VESA doesn't _do_ 1600x900, so a vesa driver cannot select that mode. Get a different driver. Stay away from interlaced modes unless you need them; many things don't handle them. The gap between 2 and 3 above is your horizontal flyback time; likewise vertically, the gap between 6 and 7 is your vertical flyback. If you get these wrong, the picture will be off centre.
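Mapped onto an actual modeline, the numbering above looks like this (a hypothetical 1600x900 reduced-blanking line, shown purely for illustration):

```
#                       0      1    2    3    4     5   6   7   8
Modeline "1600x900R"  97.75  1600 1648 1680 1760  900 903 908 926 +hsync -vsync
#  0: dot clock (MHz)    1-4: horizontal pixels    5-8: vertical lines
```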
Try something like (at your own risk, because my hardware won't touch that speed)
Your sum to crosscheck is as follows: with the dot clock at 202.5 MHz, you get 202,500,000 dots written in one second. A full scan of the screen, including blanking (2160 x 940), is 2,030,400 dots, so you get 99.73 of them in a second. X rounds numbers a little, just for spite, to make your sums always be wrong.
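That crosscheck is a single division; here is the same arithmetic as a tiny Python sketch, using the numbers quoted above:

```python
# Refresh rate = dot clock / (horizontal total * vertical total),
# with the totals from the post: 2160 x 940 at a 202.5 MHz dot clock.
dot_clock_hz = 202_500_000
h_total, v_total = 2160, 940

dots_per_frame = h_total * v_total          # dots per full screenful
refresh_hz = dot_clock_hz / dots_per_frame  # screenfuls per second

print(dots_per_frame)        # 2030400
print(round(refresh_hz, 2))  # 99.73
```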
thanks, but I already have a modeline I would like to try; it's the one I got from Windows (where the monitor works when plugged in via DVI)... the problem is that I have modified xorg.conf several times with no success, so now my question is how to try a modeline... I'm on jaunty, nvidia drivers... http://www.linuxquestions.org/questi...jaunty-756505/
Another way might be to use a program to extract the EDID from the monitor, like read-edid, saving the raw values to a file, and then "feed" that file into your xorg.conf like this:
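The example itself didn't survive in the thread; here is a plausible sketch using the read-edid tools and the nvidia driver's CustomEDID option. The "DFP-0" output name and the file path are assumptions, not values from the original post:

```
# First dump the EDID to a file (read-edid package, run as root):
#   get-edid > /etc/X11/p2070-edid.bin
# Then point the nvidia driver at that file in the Device section:
Section "Device"
    Identifier "Configured Video Device"
    Driver     "nvidia"
    Option     "CustomEDID" "DFP-0:/etc/X11/p2070-edid.bin"
EndSection
```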
I had an interesting experience here since we last wrote. I replaced my 1280x1024 monitor with a 1440x900 one. I measured the screen, put in the new screen size, removed the old modes, added 1440x900 and 1200x750 to xorg.conf as modes, and tried it.
X booted. It said to itself (in Xorg.0.log)
1440x900 - that's clearly ridiculous; delete that for starters!
1200x750 - no such thing; delete that also.
Hmmm, we have no validated modes, better cobble up something.
(II) NVIDIA(0): Virtual screen size determined to be 1440 x 900
(--) NVIDIA(0): DPI set to (87, 87); computed from "UseEdidDpi" X config
(--) NVIDIA(0): option
It then rolled up a 1440x900 mode and went away happy. It has now saved that off and accepts 1440x900 as a mode :-). In short, I am suggesting you put UseEdidDpi in with all the other graphics options and _no_ modeline.
If X is going to read your edid, why bother yourself?
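A minimal sketch of that suggestion, assuming the nvidia driver (the identifier is a placeholder; the option name is from the nvidia driver's documentation):

```
Section "Device"
    Identifier "Configured Video Device"
    Driver     "nvidia"
    # Let the driver derive the DPI (and validate modes) from the
    # monitor's EDID; note there is no Modeline and no Modes list.
    Option     "UseEdidDpi" "true"
EndSection
```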