SDL+OpenGL Texture not Rendering-- HELP!

Whether you're a newbie or an experienced programmer, any questions, help, or just talk of any language will be welcomed here.

Moderator: Coders of Rage

OmenFelix
Chaos Rift Cool Newbie
Chaos Rift Cool Newbie
Posts: 59
Joined: Fri May 04, 2012 1:42 pm
Current Project: Arcadian Descent(C++, SDL, KOS)
Favorite Gaming Platforms: PS2, PC, XBOX, NDS
Programming Language of Choice: C/C++

SDL+OpenGL Texture not Rendering-- HELP!

Post by OmenFelix »

Hey guys, I've been trying to get this damn texture to render, but OpenGL just won't draw it.

Here is the code for lsRender.cpp which has all the functions for rendering:

Code: Select all

#include "lsGlobals.h"
#include "lsRender.h"

ls::Gfx ls::Render::load(std::string address, GLuint *texture, GLenum *textureFormat, GLint *nOfColors){
	address = "media/" + address + ".bmp";
	Gfx gfx = IMG_Load(address.c_str()); //IMG_Load used instead of SDL_LoadBMP because it can load converted images

	if(gfx == NULL){
		ls::Utils::errorReport("Failed to load image: " + address);
		return gfx;
	}

	/*gfx = SDL_DisplayFormat(gfx);
	ls::Render::censor(gfx, ls::Utils::genColorKey(gfx, 255, 0, 255));*/
 
	*nOfColors = gfx->format->BytesPerPixel;
	if (*nOfColors == 4) {
		if (gfx->format->Rmask == 0x000000ff)
			*textureFormat = GL_RGBA;
		else
			*textureFormat = GL_BGRA;
	} else if (*nOfColors == 3) {
		if (gfx->format->Rmask == 0x000000ff)
			*textureFormat = GL_RGB;
		else
			*textureFormat = GL_BGR;
	} else {
		printf("warning: the image is not truecolor.. this will probably break\n");
		// this error should not go unhandled
	}

	glGenTextures(1, texture);
	glBindTexture(GL_TEXTURE_2D, *texture);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
	// Legacy GL accepts a bare component count (3 or 4) as the internal format here.
	glTexImage2D(GL_TEXTURE_2D, 0, *nOfColors, gfx->w, gfx->h, 0,
	             *textureFormat, GL_UNSIGNED_BYTE, gfx->pixels);

	return gfx;
}

void ls::Render::render(Gfx gfx, ls::Frame frame, ls::Frame clip, GLuint texture, GLenum textureFormat, GLint  nOfColors) {
	/*ls::Vec2d a = ls::Utils::genVec2d((clip.x - 1) * (1 / (256 / clip.w)), (clip.y- 1) * (1 / (256 / clip.h)));
	ls::Vec2d b = ls::Utils::genVec2d(clip.x * (1 / (256 / clip.w)), (clip.y - 1) * (1 / (256 / clip.h)));
	ls::Vec2d c = ls::Utils::genVec2d(clip.x * (1 / (256 / clip.w)), clip.y * (1 / (256 / clip.h)));
	ls::Vec2d d = ls::Utils::genVec2d((clip.x - 1) * (1 / (256 / clip.w)), clip.y * (1 / (256 / clip.h)));*/

	//SDL_BlitSurface(gfx, &clip,ls::screen, &frame);
	glBindTexture(GL_TEXTURE_2D, texture);

	glTranslatef(frame.x, frame.y, 0);
	glBegin(GL_QUADS);
		glTexCoord2f((GLfloat) clip.x*clip.w, (GLfloat) clip.y*clip.h);
		glVertex2f(0, 0);

		glTexCoord2f((GLfloat) (clip.x+1)*clip.w, (GLfloat) clip.y*clip.h);
		glVertex2f(clip.w, 0);

		glTexCoord2f((GLfloat) (clip.x+1)*clip.w, (GLfloat) (clip.y+1)*clip.h);
		glVertex2f(clip.w, clip.h);

		glTexCoord2f((GLfloat) clip.x*clip.w, (GLfloat) (clip.y+1)*clip.h);
		glVertex2f(0, clip.h);
	glEnd();

	glLoadIdentity();
}

void ls::Render::censor(ls::Gfx gfx, Uint32 colorKey) {
	 SDL_SetColorKey(gfx, SDL_SRCCOLORKEY, colorKey);
}

void ls::Render::free(Gfx gfx){
	SDL_FreeSurface(gfx);
}

void ls::Render::clear() {
	SDL_FillRect(screen, NULL, SDL_MapRGB(screen->format, 0, 0, 0));
	glClear(GL_COLOR_BUFFER_BIT);
}

void ls::Render::sync() {
	SDL_Flip(ls::screen);
	SDL_GL_SwapBuffers();
}

Here is the code for lsSprite.cpp which wraps over the lsRender functions in a 'Sprite'-way:

Code: Select all

#include "lsSprite.h"
#include <iostream>

ls::Sprite::Sprite(){
}
	
ls::Sprite::Sprite(std::string address, ls::Frame bodyFrame, ls::Frame clipFrame){
	ls::Sprite::load(address);
	body = bodyFrame;
	clip = clipFrame;
	show=false;
}

ls::Sprite::~Sprite(){
	ls::Render::free(gfx);
}
	
void ls::Sprite::load(std::string address){
	gfx = ls::Render::load(SPRITES_FOLDER + address, &texture, &textureFormat, &nOfColors);
}
	
void ls::Sprite::render() {
	ls::Render::render(gfx, body, clip, texture, textureFormat, nOfColors);
}
Here is Game.cpp, which basically invokes the main engine and uses the lsSprite class (which in turn uses lsRender) to draw Crono onto the screen (or should do):

Code: Select all

#include "LunarSanity.h"

ls::Sprite *crono;

void ls::start() {
}

void ls::load() {
	crono = new ls::Sprite("crono", ls::Utils::genFrame(32, 0, 32, 64), ls::Utils::genFrame(1, 1, 32, 64));
}

void ls::gameplay() {
	glColor3f(255, 255, 255);
	crono->render();

	glTranslatef(200.0f, 100.0f, 0.0f);

	glRotatef(135.0f, 0.0f, 0.0f, 1.0f);

	glScalef(2.0f, 2.0f, 0.0f);

	glBegin(GL_QUADS);
		glColor3f(1.0f, 0.0f, 0.0f);
		glVertex3f(0.0f, 0.0f, 0.0f);

		glColor3f(0.0f, 1.0f, 0.0f);
		glVertex3f(32.0f, 0.0f, 0.0f); 

		glColor3f(0.0f, 1.0f, 0.0f);
		glVertex3f(32.0f, 64.0f, 0.0f);

		glColor3f(1.0f, 0.0f, 1.0f);
		glVertex3f(0.0f, 64.0f, 0.0f);
	glEnd();

	glLoadIdentity();
}

void ls::keyboard(short type) {
}

void ls::end() {
}

Can anyone please tell me why it only renders a white quadrilateral instead of Crono?
NOTE: As you can see in the console, there are no load errors for finding the image. The problem lies either in formatting the image correctly (the texture setup after loading the image), somewhere in my GLuint* handling (doubt it), or in the actual rendering of the image itself.

BTW here's the main function's contents:

Code: Select all

SDL_Init(SDL_INIT_EVERYTHING);
	TTF_Init();
	SDL_WM_SetCaption("Lunar Sanity", NULL);
	SDL_ShowCursor(SDL_DISABLE);
	ls::start();
	SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

#ifdef _DEBUG_
	ls::screen = SDL_SetVideoMode(ls::screenWidth, ls::screenHeight, ls::screenBpp, SDL_OPENGL|SDL_HWSURFACE|SDL_DOUBLEBUF);
#else
	ls::screen=SDL_SetVideoMode(ls::screenWidth, ls::screenHeight, ls::screenBpp, SDL_OPENGL|SDL_FULLSCREEN|SDL_HWSURFACE|SDL_DOUBLEBUF);
#endif

	glEnable(GL_TEXTURE_2D);
	glEnable(GL_BLEND);
	glClearColor(0, 0, 0, 0);
	glMatrixMode( GL_PROJECTION ); 
	glLoadIdentity();
	glOrtho(0, ls::screenWidth, ls::screenHeight, 0, -1, 1);
	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();

	ls::load();

#ifndef _DS_
	splash = new ls::Sprite(ls::UI_FOLDER + "logo", 0, 32, 512, 512);
	splash->render();
	ls::sync();
	ls::sleep(2000);
#endif

	while(!ls::isEnd){
		ls::Utils::sleep(100);
		ls::Render::sync();
		ls::Render::clear();
		ls::gameplay();
		ls::Render::sync();

		if(CONTROL_HIT(&ls::event)){
			ls::keyboard(ls::event.type);
		}
	}

	std::cout.flush();
	TTF_Quit();
	SDL_Quit();
	ls::Utils::close();
Thanks in advance. :cheers:
Attachments
shit.jpg
shit.jpg (74.24 KiB) Viewed 2150 times
Why not check out my game-development forum at: http://f1rel0ck.netai.net
Or my 2D RPG at: https://www.facebook.com/pages/Arcadian ... 6873806531
bbguimaraes
Chaos Rift Junior
Chaos Rift Junior
Posts: 294
Joined: Wed Apr 11, 2012 4:34 pm
Programming Language of Choice: c++
Location: Brazil
Contact:

Re: SDL+OpenGL Texture not Rendering-- HELP!

Post by bbguimaraes »

Well, there's no way I'm looking through all that code, but here are some things that have made my textures not render in the past (in order of annoyance/time needed to find):
  • Wrong file path.
  • Not enabling GL_TEXTURE_2D (or using GL_TEXTURE instead). Sometimes I would disable it in some part of the code and forget to enable it again.
  • Loading them before initializing the OGL context or in another context.
Also, check for OGL errors (glGetError(), I think) and try loading other images (like a simple black rectangle bmp).
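For the glGetError() suggestion above: GL queues errors until you read them, so it helps to drain the whole queue after each suspect call and print readable names. A minimal sketch (the helper name is made up; the hex values are the standard GL error codes, so the mapping itself needs no GL context):

```cpp
#include <cstdio>
#include <string>

// Standard OpenGL error codes (values fixed by the GL spec).
static std::string glErrorName(unsigned int err) {
    switch (err) {
        case 0x0000: return "GL_NO_ERROR";
        case 0x0500: return "GL_INVALID_ENUM";
        case 0x0501: return "GL_INVALID_VALUE";
        case 0x0502: return "GL_INVALID_OPERATION";
        case 0x0505: return "GL_OUT_OF_MEMORY";
        default:     return "unknown GL error";
    }
}

// Usage, with a live GL context (glGetError() keeps returning queued
// errors until it reports GL_NO_ERROR):
//
//   for (GLenum e; (e = glGetError()) != GL_NO_ERROR; )
//       std::fprintf(stderr, "after glTexImage2D: %s\n",
//                    glErrorName(e).c_str());
```

Note that a single glGetError() call only pops one error, which is why the loop matters when several calls have gone wrong in a row.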

Re: SDL+OpenGL Texture not Rendering-- HELP!

Post by OmenFelix »

bbguimaraes wrote: Well, there's no way I'm looking through all that code, but here are some things that have made my textures not render in the past (in order of annoyance/time needed to find):
  • Wrong file path.
  • Not enabling GL_TEXTURE_2D (or using GL_TEXTURE instead). Sometimes I would disable it in some part of the code and forget to enable it again.
  • Loading them before initializing the OGL context or in another context.
Also, check for OGL errors (glGetError(), I think) and try loading other images (like a simple black rectangle bmp).
Well, it's using the right file path.
I have enabled GL_TEXTURE_2D.
And I get no errors from OGL.

When using this new code as the ls::Render::render(...):

Code: Select all

	glBindTexture(GL_TEXTURE_2D, texture);
	glScalef(2.0f, 2.0f, 2.0f);
	glTranslatef(frame.x, frame.y, 0);

	glBegin(GL_QUADS);
		glTexCoord2f((GLfloat) clip.x*clip.w, (GLfloat) clip.y*clip.h);
		glVertex2f(0, 0);

		glTexCoord2f((GLfloat) (clip.x+1)*clip.w, (GLfloat) clip.y*clip.h);
		glVertex2f(clip.w, 0);

		glTexCoord2f((GLfloat) (clip.x+1)*clip.w, (GLfloat) (clip.y+1)*clip.h);
		glVertex2f(clip.w, clip.h);

		glTexCoord2f((GLfloat) clip.x*clip.w, (GLfloat) (clip.y+1)*clip.h);
		glVertex2f(0, clip.h);
	glEnd();

	glLoadIdentity();
Whilst trying to draw bawx.bmp, I get a tiled version of the bawx image at a scale of 2 (it won't render unless it's at a scale of 2, for some crazy reason). WTF is wrong, seriously??
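The tiling in the screenshot is actually a clue: GL_REPEAT, the default texture wrap mode, takes the fractional part of any texture coordinate outside [0,1], so a coordinate like clip.x*clip.w lands far past 1.0 and the image repeats across the quad. A rough sketch of what GL_REPEAT does to a coordinate (pure arithmetic, no GL context needed):

```cpp
#include <cmath>

// GL_REPEAT wrapping: a texture coordinate outside [0,1] samples the
// texel at its fractional part, so the image tiles across the quad.
static float repeatWrap(float coord) {
    return coord - std::floor(coord); // e.g. 2.25f wraps to 0.25f
}
```

So coordinates like 32.0 or 64.0 wrap back to 0.0 at every integer boundary, which is exactly the grid of repeated bawxes in the screenshot.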
Attachments
shit2.jpg
shit2.jpg (51.3 KiB) Viewed 2135 times
bawx.jpg
bawx.jpg (2.24 KiB) Viewed 2135 times

Re: SDL+OpenGL Texture not Rendering-- HELP!

Post by bbguimaraes »

Are you aware that glTexCoord2f takes UV values, which lie in the range [0.0f, 1.0f], not world coordinates? You seem to be using the latter, which causes the texture to be rendered (width*height) times, in this tiled fashion. For simple textures (i.e., ones that span the whole tile, with the same dimensions), you should use:

Code: Select all

// Bottom-left.
glTexCoord2f(0.0f, 0.0f);
glVertex2f(0.0f, 0.0f);

// Bottom-right.
glTexCoord2f(1.0f, 0.0f);
glVertex2f(width, 0.0f);

// Upper-right.
glTexCoord2f(1.0f, 1.0f);
glVertex2f(width, height);

// Upper-left.
glTexCoord2f(0.0f, 1.0f);
glVertex2f(0.0f, height);
Or something like that; you get the idea.

Re: SDL+OpenGL Texture not Rendering-- HELP!

Post by OmenFelix »

bbguimaraes wrote: Are you aware that glTexCoord2f takes UV values, which lie in the range [0.0f, 1.0f], not world coordinates? You seem to be using the latter, which causes the texture to be rendered (width*height) times, in this tiled fashion. For simple textures (i.e., ones that span the whole tile, with the same dimensions), you should use:

Code: Select all

// Bottom-left.
glTexCoord2f(0.0f, 0.0f);
glVertex2f(0.0f, 0.0f);

// Bottom-right.
glTexCoord2f(1.0f, 0.0f);
glVertex2f(width, 0.0f);

// Upper-right.
glTexCoord2f(1.0f, 1.0f);
glVertex2f(width, height);

// Upper-left.
glTexCoord2f(0.0f, 1.0f);
glVertex2f(0.0f, height);
Or something like that; you get the idea.
I can usually counteract that with this code (but it just renders a magenta quadrilateral):

Code: Select all

	ls::Vec2d a = ls::Utils::genVec2d((clip.x - 1) * (1 / (256 / clip.w)), (clip.y - 1) * (1 / (256 / clip.h)));
	ls::Vec2d b = ls::Utils::genVec2d(clip.x * (1 / (256 / clip.w)), (clip.y - 1) * (1 / (256 / clip.h)));
	ls::Vec2d c = ls::Utils::genVec2d(clip.x * (1 / (256 / clip.w)), clip.y * (1 / (256 / clip.h)));
	ls::Vec2d d = ls::Utils::genVec2d((clip.x - 1) * (1 / (256 / clip.w)), clip.y * (1 / (256 / clip.h)));

	glBindTexture(GL_TEXTURE_2D, texture);
	glTranslatef(frame.x, frame.y, 0);

	glBegin(GL_QUADS);
		glTexCoord2f(a.x, a.y);
		glVertex2f(0, 0);

		glTexCoord2f(b.x, b.y);
		glVertex2f(clip.w, 0);

		glTexCoord2f(c.x, c.y);
		glVertex2f(clip.w, clip.h);

		glTexCoord2f(d.x, d.y);
		glVertex2f(0, clip.h);
	glEnd();

	glLoadIdentity();

Re: SDL+OpenGL Texture not Rendering-- HELP!

Post by bbguimaraes »

I'm sorry if I'm misunderstanding you, but texture coordinates are specified like the image below (vertices in black, texture in red). Even if the tile is at position (14,15) and has dimensions (92,65), the texture coordinates will still be (0,0) (1,0) (1,1) (0,1). They specify coordinates "relative to your object" rather than "relative to the world". I really can't understand what you're trying to accomplish with all those calculations for the texture coordinates.

If I'm totally wrong and this doesn't help you at all, I suggest you read this chapter from the Red Book. This is where I learned texture rendering: http://www.glprogramming.com/red/chapter09.html. I'm sure all the information you need is there.
Attachments
texture.png
texture.png (4.28 KiB) Viewed 2124 times

Re: SDL+OpenGL Texture not Rendering-- HELP!

Post by OmenFelix »

bbguimaraes wrote: I'm sorry if I'm misunderstanding you, but texture coordinates are specified like the image below (vertices in black, texture in red). Even if the tile is at position (14,15) and has dimensions (92,65), the texture coordinates will still be (0,0) (1,0) (1,1) (0,1). They specify coordinates "relative to your object" rather than "relative to the world". I really can't understand what you're trying to accomplish with all those calculations for the texture coordinates.

If I'm totally wrong and this doesn't help you at all, I suggest you read this chapter from the Red Book. This is where I learned texture rendering: http://www.glprogramming.com/red/chapter09.html. I'm sure all the information you need is there.
I've decided I'll go about rendering a different way. Thanks anyway though. :P