Author Topic: Setting the screen resolution in OpenGL.  (Read 10635 times)


Offline Pixel_Outlaw

  • Pentium
  • *****
  • Posts: 1382
  • Karma: 83
    • View Profile


OK, time to learn something new for the current challenge.

I have the following code and want to set the screen resolution for the comp.

Code: [Select]
Function gl_ini()
	GLGraphics 160, 120
	glMatrixMode(GL_PROJECTION)
	glOrtho(0, 160, 120, 0, -1000, 1000)
	glMatrixMode(GL_MODELVIEW)
	glEnable(GL_BLEND)
EndFunction

Now, I know there are probably no native drivers that support 160x120. How can I force OpenGL to upscale to a full-screen 640x480 window without interpolating and anti-aliasing things?

Offline stormbringer

  • Time moves by fast, no second chance
  • Amiga 1200
  • ****
  • Posts: 453
  • Karma: 73
    • View Profile
    • www.retro-remakes.net
Re: Setting the screen resolution in OpenGL.
« Reply #1 on: July 28, 2008 »
You can't, unless you draw into a texture and then draw that texture on screen as a quad. When doing this, set the texture filtering to GL_NEAREST for both the min and mag filters.
We once had a passion
It all seemed so right
So young and so eager
No end in sight
But now we are prisoners
In our own hearts
Nothing seems real
It's all torn apart

Offline Pixel_Outlaw

  • Pentium
  • *****
  • Posts: 1382
  • Karma: 83
    • View Profile
Re: Setting the screen resolution in OpenGL.
« Reply #2 on: July 28, 2008 »
Ahh, that sounds slightly painful. Do you think it will be fast enough to handle a solid 60 FPS?

Offline stormbringer

  • Time moves by fast, no second chance
  • Amiga 1200
  • ****
  • Posts: 453
  • Karma: 73
    • View Profile
    • www.retro-remakes.net
Re: Setting the screen resolution in OpenGL.
« Reply #3 on: July 28, 2008 »
Depends on what you do and how you do it. If you use extensions to draw with OpenGL into a texture using a framebuffer object (FBO, have a look at: http://oss.sgi.com/projects/ogl-sample/registry/EXT/framebuffer_object.txt), then definitely 60 FPS and even higher (depending on the graphics card; here Nvidias beat ATIs).

But for this challenge, and given the low resolution, CPU rendering might be just as fast. Again, it depends on what you do, but I'd say yes, of course. CPU rendering also avoids FBOs, which could cause problems on some old machines (those that do not support the extension).

Another quick & dirty option is to draw to the screen buffer and copy from there into a texture using the glCopyTexSubImage2D() function. This is the old way of doing off-screen rendering and render-to-texture. However, it can be very slow for large copies, especially on some laptops and old computers.

Offline Shockwave

  • good/evil
  • Founder Member
  • DBF Aficionado
  • ********
  • Posts: 17394
  • Karma: 498
  • evil/good
    • View Profile
    • My Homepage
Re: Setting the screen resolution in OpenGL.
« Reply #4 on: July 28, 2008 »
I am personally using software rendering.
I agree with Stormbringer here..

Unless you render the display from GL quads, coloured according to a screen buffer. That is one way of doing it that will be fine on most configurations.

To make myself clearer: you fill the screen with 160 * 120 rectangles, each coloured according to a framebuffer.
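The geometry for that quad-per-pixel approach is simple arithmetic; a minimal C sketch (names and the Rect type are illustrative, not from the thread):

```c
/* Screen rectangle covered by one "big pixel" when a 160x120 buffer
 * is drawn as scale-by-scale rectangles on the real screen.
 * With scale = 4 the 160x120 grid exactly fills 640x480. */
typedef struct { int x, y, w, h; } Rect;

static Rect big_pixel_rect(int cx, int cy, int scale)
{
    Rect r = { cx * scale, cy * scale, scale, scale };
    return r;
}
```

You would loop cx over 0..159 and cy over 0..119, drawing each rectangle in the colour stored at that cell of the framebuffer.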
Shockwave ^ Codigos

Offline zawran

  • Sponsor
  • Pentium
  • *******
  • Posts: 909
  • Karma: 67
    • View Profile
Re: Setting the screen resolution in OpenGL.
« Reply #5 on: July 28, 2008 »
Unless you are determined to stick with raw OpenGL code, you might want to consider another approach. I was actually thinking of finding time to do something for this compo, since I have vacation the next two weeks, and I have just put together a small piece of code in BMax which seems to work just fine. Feel free to build off the following idea if it suits your plans for the compo.

Code: [Select]
SuperStrict
Graphics 640,480

Local pixmap:TPixmap = CreatePixmap(160,120,PF_RGBA8888) ' pixmap where pixels are manipulated
Local image:TImage = CreateImage(160,120) ' image the pixmap is converted to

Global colorlist:Int[4,3] ' global color list
For Local r:Int = 0 To 3
	For Local c:Int = 0 To 2
		ReadData colorlist[r,c]
	Next
Next
DefData $FF,$FF,$FF
DefData $FF,$00,$00
DefData $00,$FF,$00
DefData $00,$00,$FF

While Not KeyHit(KEY_ESCAPE) ' as long as escape is not hit
	SetScale 4.0,4.0 ' set scale to 4x for a 640x480 image
	DrawImage(image,0,0) ' draw the image
	Flip ' flip the buffer
	messwithpixmap(pixmap) ' do something random to the pixmap
	image = LoadImage(pixmap,0) ' load the pixmap into the image
Wend
End

Function messwithpixmap(pmap:TPixmap)
	Local tmpPtr:Byte Ptr = PixmapPixelPtr(pmap,0,0)
	Local pitch:Int = pmap.width ' pixels per row; the *4 below converts to bytes
	For Local y:Int = 0 To pmap.height-1
		For Local x:Int = 0 To pmap.width-1
			Local color:Int = Rnd(4)
			tmpPtr[x*4+y*4*pitch] = colorlist[color,0]
			tmpPtr[x*4+y*4*pitch+1] = colorlist[color,1]
			tmpPtr[x*4+y*4*pitch+2] = colorlist[color,2]
			tmpPtr[x*4+y*4*pitch+3] = 255
		Next
	Next
End Function

It's basically using a color array and a pixmap: write directly to the pixmap, then convert it into an image. Images in BMax can be scaled directly with the scale commands, so to get 640x480 I only have to set the scale to 4.0 in each direction. It's plenty fast for most applications at 160x120 resolution.
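The byte addressing used in the messwithpixmap loop above works out the same in any language; here is a C sketch of the arithmetic (plot_rgba is a hypothetical helper, not BlitzMax API, and it assumes a tightly packed RGBA8888 buffer with no row padding):

```c
#include <stdint.h>
#include <stddef.h>

/* Write one RGBA pixel into a tightly packed RGBA8888 buffer.
 * The byte offset is y * pitch + x * 4, where pitch is the row
 * stride in bytes (width * 4 when rows are tightly packed). */
static void plot_rgba(uint8_t *pixels, int width, int x, int y,
                      uint8_t r, uint8_t g, uint8_t b, uint8_t a)
{
    size_t pitch = (size_t)width * 4;                /* bytes per row */
    size_t off   = (size_t)y * pitch + (size_t)x * 4;
    pixels[off + 0] = r;
    pixels[off + 1] = g;
    pixels[off + 2] = b;
    pixels[off + 3] = a;
}
```

Note that a real TPixmap carries its own pitch field, which may differ from width * 4; the BlitzMax code above makes the same tight-packing assumption.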

If you want to stick with raw OpenGL, you might want to look into the projection matrix; I seem to remember someone on the BlitzResearch forum mentioning it as a way of having OpenGL scale everything automatically. But it's not something I have used myself, so I do not know how that works.

Offline stormbringer

  • Time moves by fast, no second chance
  • Amiga 1200
  • ****
  • Posts: 453
  • Karma: 73
    • View Profile
    • www.retro-remakes.net
Re: Setting the screen resolution in OpenGL.
« Reply #6 on: July 28, 2008 »
@zawran: scaling comes for "free" with OpenGL, DX, etc. However, I think what Shockwave wants is the "pixelate" effect (aka the "mosaic" effect): basically a lo-res image scaled up with a nearest-neighbour method. OpenGL cannot do that through its matrices, since those only transform vector data. Scaling would be good of course, but the final drawing is done at the viewport resolution (this happens during rasterization; that's why you have to give the viewport dimensions...)

Another method for performing the pixelization thing would be to draw your stuff with OGL into the screen buffer, read it back into a mem buffer, and then either draw a quad for each pixel in your buffer or, as zawran suggested, use the glDrawPixels() function. It's all variations on the same technique: draw into an off-screen buffer, then either draw a quad per pixel or draw a scaled version of your off-screen buffer.

I'd strongly suggest that you stick with drawing into a mem buffer and updating a texture. Then draw your texture (using a quad) with the GL_NEAREST sampling filter for magnification/minification (does such a word exist??)... The reason I say this is that it's really the fastest way. Not my words, but those of the Nvidia/ATI engineers, and here is why:

1) If you use glDrawPixels() (or a variation provided by your framework), there is a chance that your pixels get converted by the driver, in software, from your buffer into something compatible with your drawing context (e.g. the screen). For a resolution as small as the one requested by the comp this may be just fine, but for larger buffers this technique can ruin your FPS.

2) If you use a texture, you can avoid the software conversion by simply creating a texture that has the same properties as your in-memory buffer. If a conversion from your texture data to the screen data has to occur, it will be done in hardware (texel sampling), which is of course much faster and more efficient for large textures. You also benefit for "free" from dozens of possibilities like texture coordinates, clipping, etc. Other operations that come for "free" are blending and shading. Of course, for the challenge here you are only allowed to use 4 colors; however, blending 2 textures with 4 different colors each will still give you only 4 unique colors on screen. It's something you may want to think about regarding this challenge..

Offline hellfire

  • Sponsor
  • Pentium
  • *******
  • Posts: 1292
  • Karma: 466
    • View Profile
    • my stuff
Re: Setting the screen resolution in OpenGL.
« Reply #7 on: July 29, 2008 »
Quote
you fill the screen with 160 * 120 rectangles, coloured according to a framebuffer
but that's what they invented textures for :)

You're probably best off with the method stormbringer mentioned first:
Set your window to 640x480, but use a viewport of only 160x120 (make sure your window can't get smaller than that).
Copy the rendered image into a texture (preferably 256x128 for compatibility, since power-of-two sizes work everywhere) and draw a quad which fills the whole screen (full viewport), with texture coordinates chosen so that only the 160x120 region is used.
Since the copy happens in video memory (and the area is pretty small), it's pretty fast.
« Last Edit: July 29, 2008 by hellfire »

Offline Pixel_Outlaw

  • Pentium
  • *****
  • Posts: 1382
  • Karma: 83
    • View Profile
Re: Setting the screen resolution in OpenGL.
« Reply #8 on: August 01, 2008 »
Thanks guys. Zawran, the problem with your method is that it will not scale vector-based images such as lines, but it does work nicely for sprites, presuming you lock motion to the big pixels.

This is going to be a bit rough yet, since I have not dipped into texture writing with BlitzMax and OpenGL.

Offline rain_storm

  • Here comes the Rain
  • DBF Aficionado
  • ******
  • Posts: 3088
  • Karma: 182
  • Rain never hurt nobody
    • View Profile
    • org_100h
Re: Setting the screen resolution in OpenGL.
« Reply #9 on: August 01, 2008 »
Since you must fake a 160*120 resolution, you can't use vector gfx, because the edges will be drawn at the actual screen resolution, which is higher than 160*120. If you must use vectors and also want to keep your resolution at 160*120, you will have to write your own vector routines. If you look at Shockwave's entry for the compo, you will notice that the cube's edges use the same pixel size as the rest of the scene; you will not get that effect using OpenGL primitives.
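A minimal "own vector routine" of that kind is a software line rasterizer drawing straight into the 160x120 buffer; a sketch in plain C using Bresenham's algorithm (the buffer layout is an assumption for illustration):

```c
#include <stdint.h>
#include <stdlib.h>

#define SW 160
#define SH 120

/* Bresenham line into a 160x120 index buffer. Drawing at the fake
 * resolution keeps line edges the same size as the big pixels. */
static void draw_line(uint8_t buf[SH][SW],
                      int x0, int y0, int x1, int y1, uint8_t c)
{
    int dx = abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;
    for (;;) {
        if (x0 >= 0 && x0 < SW && y0 >= 0 && y0 < SH)
            buf[y0][x0] = c;                 /* clipped plot */
        if (x0 == x1 && y0 == y1) break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }
        if (e2 <= dx) { err += dx; y0 += sy; }
    }
}
```

The buffer is then upscaled to the screen by whichever of the methods above you pick.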


Offline zawran

  • Sponsor
  • Pentium
  • *******
  • Posts: 909
  • Karma: 67
    • View Profile
Re: Setting the screen resolution in OpenGL.
« Reply #10 on: August 01, 2008 »
The code I posted only shows how to write to the pixmap and then scale it by 4x; you would of course have to write your own line and triangle-fill code if you want to use vector graphics.  :)

Offline hellfire

  • Sponsor
  • Pentium
  • *******
  • Posts: 1292
  • Karma: 466
    • View Profile
    • my stuff
Re: Setting the screen resolution in OpenGL.
« Reply #11 on: August 01, 2008 »
Since you are probably going for the 4-colour thing, you might want to read the framebuffer back into memory first and apply a colour-reduction algorithm.

Code: [Select]
// create buffer for reading back the opengl framebuffer
// (use a size of 2^n so we can directly dump the buffer into a texture)
uint32_t *vbuffer = new uint32_t[256*128];

// create a texture to hold the framebuffer
unsigned int mOffscreen;
glGenTextures(1, &mOffscreen);
glBindTexture(GL_TEXTURE_2D, mOffscreen);

// disable filtering for this texture
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

while (running)
{
  // set desired resolution
  glViewport(0, 0, 160, 120);

  glClear(...);

  // do totally amazing opengl magic here

  // read backbuffer (reads more than actually required,
  // so we don't need to convert to texture-size)
  glReadPixels(0, 0, 256, 128, GL_BGRA, GL_UNSIGNED_BYTE, vbuffer);

  // do wicked colour-reduction-things on the buffer
  // remember that you only need to process the first 160 pixels of each scanline

  // write buffer to texture (internal format must be GL_RGBA;
  // GL_BGRA is only valid as the pixel-data format)
  glBindTexture(GL_TEXTURE_2D, mOffscreen);
  glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 256, 128, 0, GL_BGRA, GL_UNSIGNED_BYTE, vbuffer);

  // viewport to full screen size
  glViewport(0, 0, 640, 480);

  // set up a useful projection to make a fullscreen-polygon (glOrtho does well, too)
  glMatrixMode(GL_PROJECTION);
  glLoadIdentity();
  gluPerspective(90.0f, 1.0f, 0.1f, 100.0f);
  glMatrixMode(GL_MODELVIEW);
  glLoadIdentity();

  glDisable(GL_DEPTH_TEST); // don't need to clear again.
  glEnable(GL_TEXTURE_2D);  // make sure texturing is on for the quad

  // draw quad with framebuffer as texture
  glBegin(GL_QUADS);
  float y = 120.0f / 128.0f; // texcoord of 160,120 on a 256x128 texture.
  float x = 160.0f / 256.0f;
  glColor4f(1, 1, 1, 1);
  glTexCoord2f(x, 0); glVertex3f( 1, -1, -1);
  glTexCoord2f(0, 0); glVertex3f(-1, -1, -1);
  glTexCoord2f(0, y); glVertex3f(-1,  1, -1);
  glTexCoord2f(x, y); glVertex3f( 1,  1, -1);
  glEnd();

  SwapBuffers();
}

This is far from optimal, but for the required size it doesn't make much sense to think about optimization.
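The colour-reduction step hinted at in the comments can be sketched as nearest-palette matching, mapping each pixel to the closest of four palette entries by squared RGB distance (the palette values below are just an example, not specified in the thread):

```c
#include <stdint.h>

/* Example 4-colour palette (white, red, green, blue). */
static const uint8_t palette[4][3] = {
    { 0xFF, 0xFF, 0xFF },
    { 0xFF, 0x00, 0x00 },
    { 0x00, 0xFF, 0x00 },
    { 0x00, 0x00, 0xFF },
};

/* Return the index of the palette entry closest to (r,g,b),
 * using squared Euclidean distance in RGB space. */
static int nearest_index(uint8_t r, uint8_t g, uint8_t b)
{
    int best = 0;
    long bestd = -1;
    for (int i = 0; i < 4; ++i) {
        long dr = (long)r - palette[i][0];
        long dg = (long)g - palette[i][1];
        long db = (long)b - palette[i][2];
        long d = dr * dr + dg * dg + db * db;
        if (bestd < 0 || d < bestd) { bestd = d; best = i; }
    }
    return best;
}
```

In the loop above you would run this over the first 160 pixels of each of the first 120 scanlines of vbuffer, replacing each pixel with its snapped palette colour before uploading the texture.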

Offline Pixel_Outlaw

  • Pentium
  • *****
  • Posts: 1382
  • Karma: 83
    • View Profile
Re: Setting the screen resolution in OpenGL.
« Reply #12 on: August 04, 2008 »
Big thanks for the code. I'll of course not use your code directly (as that is against the rules), but I will base my code on yours. Thanks for all of your time, gents.