LinuxQuestions.org (/questions/)
-   Programming (https://www.linuxquestions.org/questions/programming-9/)
-   -   OpenGL: glScale(1,1,1) + lighting screws up quad rendering!? (https://www.linuxquestions.org/questions/programming-9/opengl-glscale-1-1-1-lighting-screws-up-quad-rendering-770513/)

ntubski 11-20-2009 04:54 PM

OpenGL: glScale(1,1,1) + lighting screws up quad rendering!?
 
2 Attachment(s)
I'm getting a very strange bug in my OpenGL program:
Code:

#include <GL/gl.h>
#include <GL/glu.h>
#include <GL/glut.h>
#include <stdio.h>
#include <stdlib.h>

void draw(void)
{
    GLenum e;

    glClear(GL_COLOR_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(0, 0, -5);

    /* no problem without call to scale */
    if (1) glScalef(1, 1, 1);
    glColor3f(1, 1, 1);
    glBegin(GL_QUADS);
    glNormal3f(0, 0, 1);
    glVertex2f(-.5, .5);
    glVertex2f(-.5, -.5);
    glVertex2f(.5, -.5);
    glVertex2f(.5, .5);
    glEnd();

    glFinish();
    e = glGetError();
    if (e != GL_NO_ERROR) {
        fprintf(stderr, "OpenGL: %s\n", gluErrorString(e));
        exit(1);
    }
}

#define W 400
#define H 400
void setup_gl(void)
{
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glViewport(0, 0, W, H);
    gluPerspective(40.0, (GLfloat)W/(GLfloat)H, 0.1, 1000.0);

    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);

    glShadeModel(GL_FLAT);
}


/* toolkit specific */

static void display(void)
{
    draw();
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_DEPTH);
    glutInitWindowSize(W, H);
    glutCreateWindow ("gl window");


    glutDisplayFunc(&display);
    setup_gl();

    glutMainLoop();
    return 0;
}

It should draw a square in the middle of the screen (expected-quad.png), and it does when the glScalef(1,1,1) call is omitted or when lighting is not enabled. With both, however, the square comes out rotated and stretched (broken-quad.png). Values other than 1 also trigger the bug; with 1 the call is obviously a no-op.

I don't know where the problem is: my program, the OpenGL implementation, or the graphics card (lspci gives: Intel Corporation 82815 Chipset Graphics Controller)? I've reproduced the same problem using GLX and SDL directly, so I know it's not the toolkit. Any insights are appreciated.
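For what it's worth, the math agrees that glScalef(1,1,1) should be a no-op for lighting too: fixed-function GL transforms normals by the inverse-transpose of the modelview matrix, which for a pure scale (sx,sy,sz) is diag(1/sx,1/sy,1/sz). A quick standalone check (plain C, no GL context needed; the function name is just for illustration):
Code:

#include <assert.h>
#include <math.h>
#include <stdio.h>

/* Length of the normal (0,0,1) after applying the inverse-transpose of a
 * pure scale diag(sx,sy,sz): |(0/sx, 0/sy, 1/sz)| = 1/sz. */
static double scaled_normal_len(double sx, double sy, double sz)
{
    double nx = 0.0 / sx, ny = 0.0 / sy, nz = 1.0 / sz;
    return sqrt(nx*nx + ny*ny + nz*nz);
}

int main(void)
{
    /* glScalef(1,1,1): the normal stays unit-length, so lighting
     * should be identical with or without the call. */
    printf("scale (1,1,1): |n| = %g\n", scaled_normal_len(1, 1, 1));

    /* A non-uniform scale does change the normal's length; that is the
     * case where glEnable(GL_NORMALIZE) is actually needed. */
    printf("scale (1,1,2): |n| = %g\n", scaled_normal_len(1, 1, 2));
    return 0;
}

Since the identity scale leaves the normal untouched, any visual difference from adding glScalef(1,1,1) points at the driver rather than the program.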

ntubski 12-18-2009 02:24 PM

So my updated questions are (info below):
  • Does anyone know the proper method of getting software rendering?
  • Can anyone reproduce this problem?


Over on gamedev.net someone suggested that the problem is likely with the graphics card. If that is the case using software rendering should show no problems.

The thing is, I'm not actually sure how to use software rendering, searching the web tends to give guides about how to enable hardware acceleration or articles about ray tracing type stuff. I tried putting
Code:

Section "Module"
        Disable        "dri"
        Disable        "dri2"
EndSection

in xorg.conf, but it doesn't seem to do the trick. I still get weird output (though now it's a triangle instead of a tilted rectangle). I don't understand the output from glxinfo:
Code:

~$ glxinfo |grep render
direct rendering: Yes
OpenGL renderer string: Software Rasterizer
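
One thing worth trying (assuming a Mesa-based libGL, which the "Software Rasterizer" renderer string suggests): Mesa honors the LIBGL_ALWAYS_SOFTWARE environment variable per-process, so the software path can be forced without touching xorg.conf. A sketch, guarded in case glxinfo isn't installed:
Code:

# Force Mesa's software rasterizer for a single process and confirm
# which renderer is actually in use.
if command -v glxinfo >/dev/null; then
    LIBGL_ALWAYS_SOFTWARE=1 glxinfo | grep "renderer string"
fi
# The test program from the first post could be run the same way:
#   LIBGL_ALWAYS_SOFTWARE=1 ./a.out

If the quad renders correctly this way but not with the default driver, that would confirm the hardware/driver theory.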

