Writing to SDL_Surface Pixels
Moderator: Coders of Rage
- X Abstract X
- Chaos Rift Regular
- Posts: 173
- Joined: Thu Feb 11, 2010 9:46 pm
Writing to SDL_Surface Pixels
I'm trying to write a function to vertically flip an SDL_Surface, but I get access violations every time I try to write to the pixel data. Obviously I'm not doing something right; can anyone help?
Edit: Code removed for now.
Last edited by X Abstract X on Wed Jul 14, 2010 1:02 am, edited 1 time in total.
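For reference, a vertical flip that steps row-by-row using the surface's pitch (SDL's per-row byte stride) avoids the usual cause of these access violations. A minimal sketch, using a hypothetical stand-in struct with the same pixels/pitch/h fields as SDL_Surface so it runs without SDL:

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical stand-in for the SDL_Surface fields the flip needs. */
typedef struct {
    int w, h;
    int pitch;      /* bytes per row, may include padding */
    void *pixels;
} FakeSurface;

/* Swap rows top-to-bottom. Works for any pixel depth because it
   copies whole rows (pitch bytes) rather than individual pixels. */
int flip_vertical(FakeSurface *s) {
    unsigned char *rows = (unsigned char *)s->pixels;
    unsigned char *tmp = malloc((size_t)s->pitch);
    if (!tmp) return -1;
    for (int y = 0; y < s->h / 2; ++y) {
        unsigned char *top = rows + (size_t)y * s->pitch;
        unsigned char *bot = rows + (size_t)(s->h - 1 - y) * s->pitch;
        memcpy(tmp, top, (size_t)s->pitch);
        memcpy(top, bot, (size_t)s->pitch);
        memcpy(bot, tmp, (size_t)s->pitch);
    }
    free(tmp);
    return 0;
}
```

With a real SDL_Surface you would also lock it first (SDL_LockSurface) when SDL_MUSTLOCK says so, and you must step by surface->pitch rather than w * BytesPerPixel, since rows can be padded.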
- Ginto8
- ES Beta Backer
- Posts: 1064
- Joined: Tue Jan 06, 2009 4:12 pm
- Programming Language of Choice: C/C++, Java
Re: Writing to SDL_Surface Pixels
I didn't feel like looking at the code, but you really should take a look at lazyfoo's tutorial on the subject: http://lazyfoo.net/SDL_tutorials/lesson31/index.php
Quit procrastinating and make something awesome.
Ducky wrote:Give a man some wood, he'll be warm for the night. Put him on fire and he'll be warm for the rest of his life.
- X Abstract X
- Chaos Rift Regular
- Posts: 173
- Joined: Thu Feb 11, 2010 9:46 pm
Re: Writing to SDL_Surface Pixels
I wrote mine after reading that article; I've looked it over several times and I still can't figure out my problem.
Re: Writing to SDL_Surface Pixels
Well, nothing pops out from looking at it, but the only thing I can think the problem could be is that integers aren't the same size on your system. So instead of "unsigned int", use "Uint32": if your unsigned ints happen to be larger than Uint32, you'll definitely have an out-of-bounds error.
But you may also want to start debugging a bit... set some breakpoints and find the exact point the error comes from.
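The integer-size point can be checked directly: C's `unsigned int` is only guaranteed to be at least 16 bits, while SDL's Uint32 (like the standard uint32_t) is exactly 32 bits, which is why pixel math should use the fixed-width type. A tiny sketch:

```c
#include <stdint.h>

/* uint32_t is exactly 4 bytes on every platform; unsigned int is only
   guaranteed a minimum size, so pixel offsets computed with it can
   differ between compilers. SDL's Uint32 is the same idea as uint32_t. */
int pixel_word_size(void) {
    return (int)sizeof(uint32_t);
}
```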
- X Abstract X
- Chaos Rift Regular
- Posts: 173
- Joined: Thu Feb 11, 2010 9:46 pm
Re: Writing to SDL_Surface Pixels
I figured something out... the function works on my 32 bit RGBA surface but not my 24 bit RGB surface. What can I do though besides converting the surface to 32 bit?
Re: Writing to SDL_Surface Pixels
X Abstract X wrote:I figured something out... the function works on my 32 bit RGBA surface but not my 24 bit RGB surface. What can I do though besides converting the surface to 32 bit?
well your unsigned int is 32 bits, so you need something like a Uint24... but I don't know if SDL has that, or how you'd go about making one...
- X Abstract X
- Chaos Rift Regular
- Posts: 173
- Joined: Thu Feb 11, 2010 9:46 pm
Re: Writing to SDL_Surface Pixels
XianForce wrote:well your unsigned int is 32 bits, so you need something like a Uint24... but I don't know if SDL has that, or how you'd go about making one...
I must be able to do this by separating the components into unsigned bytes. I'll see what I can do.
P.S. Fuck, I hate working with all these wrapper data types.
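Separating the components does work: on a 24-bit surface each pixel is 3 bytes, so you address it through a byte pointer instead of a Uint32 pointer. A minimal sketch in plain C (no SDL; a raw byte buffer plus pitch stands in for the SDL_Surface fields):

```c
#include <stddef.h>

/* Return a pointer to pixel (x, y): rows are `pitch` bytes apart and
   pixels are `bpp` bytes wide. For a 24-bit RGB surface bpp is 3, so
   reading through a 32-bit pointer would run one byte past the last
   pixel of each row -- the classic access violation. */
unsigned char *pixel_at(unsigned char *pixels, int pitch, int bpp,
                        int x, int y) {
    return pixels + (size_t)y * pitch + (size_t)x * bpp;
}
```

The individual channels are then p[0], p[1], p[2], in whatever order the surface's format specifies (check Rmask/Gmask/Bmask on a real surface rather than assuming RGB order).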
- Ginto8
- ES Beta Backer
- Posts: 1064
- Joined: Tue Jan 06, 2009 4:12 pm
- Programming Language of Choice: C/C++, Java
Re: Writing to SDL_Surface Pixels
X Abstract X wrote:I figured something out... the function works on my 32 bit RGBA surface but not my 24 bit RGB surface. What can I do though besides converting the surface to 32 bit?
okey dokey... you can either blit it to a preformatted surface, or convert it. Unless you're doing padding for, say, OpenGL (which I don't think you are), look at SDL_ConvertSurface (http://sdl.beuc.net/sdl.wiki/SDL_ConvertSurface).
Quit procrastinating and make something awesome.
Ducky wrote:Give a man some wood, he'll be warm for the night. Put him on fire and he'll be warm for the rest of his life.
- X Abstract X
- Chaos Rift Regular
- Posts: 173
- Joined: Thu Feb 11, 2010 9:46 pm
Re: Writing to SDL_Surface Pixels
Ginto8 wrote:okey dokey... you can either blit it to a preformatted surface, or convert it. Unless you're doing padding for say opengl, which I don't think you are, look at SDL_ConvertSurface (http://sdl.beuc.net/sdl.wiki/SDL_ConvertSurface)
I stated that I want to try to avoid converting the surface because I think the performance may be better if I just swap the relevant bytes; I'm going to try both and compare the two methods. On another note, I'm wondering why I get this access violation when ALL SDL_Surfaces, regardless of their depth, store the pixel data in a 32 bit unsigned integer.
- Ginto8
- ES Beta Backer
- Posts: 1064
- Joined: Tue Jan 06, 2009 4:12 pm
- Programming Language of Choice: C/C++, Java
Re: Writing to SDL_Surface Pixels
X Abstract X wrote:I stated that I want to try to avoid converting the surface because I think the performance may be better if I just swap the relevant bytes, I'm going to try and compare both methods. On another note, I'm wondering why I get this access violation when ALL SDL_Surfaces, regardless of their depth, store the pixel data in a 32 bit unsigned integer.
Converting the surface in one way or another is really quite optimal, seeing as otherwise it will have to be automatically converted by SDL every time you blit it. Look at lazyfoo's tutorial where it has "optimized" image loading. That is specifically designed so that the image conversion only happens once, and after that simply doesn't happen.
Quit procrastinating and make something awesome.
Ducky wrote:Give a man some wood, he'll be warm for the night. Put him on fire and he'll be warm for the rest of his life.
- X Abstract X
- Chaos Rift Regular
- Posts: 173
- Joined: Thu Feb 11, 2010 9:46 pm
Re: Writing to SDL_Surface Pixels
Ginto8 wrote:Converting the surface in one way or another is really quite optimal, seeing as otherwise it will have to be automatically converted by SDL every time you blit it. Look at lazyfoo's tutorial where it has "optimized" image loading. That is specifically designed so that the image conversion only happens once, and after that simply doesn't happen.
The surface is being used to build a 24-bit OpenGL texture.
- Ginto8
- ES Beta Backer
- Posts: 1064
- Joined: Tue Jan 06, 2009 4:12 pm
- Programming Language of Choice: C/C++, Java
Re: Writing to SDL_Surface Pixels
X Abstract X wrote:The surface is being used to build a 24bit OpenGL texture.
Are you sure you don't want 32-bit RGBA? For OpenGL it's always better to have transparency.
Quit procrastinating and make something awesome.
Ducky wrote:Give a man some wood, he'll be warm for the night. Put him on fire and he'll be warm for the rest of his life.
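If you do go RGBA, expanding 24-bit RGB pixel data to 32-bit RGBA is just appending an opaque alpha byte per pixel. A hedged sketch in plain C (no SDL; assumes the input rows are tightly packed, i.e. no pitch padding):

```c
#include <stdlib.h>

/* Expand a tightly packed RGB buffer (3 bytes/pixel) into a freshly
   allocated RGBA buffer (4 bytes/pixel) with alpha forced to 255.
   Caller frees the result. Returns NULL on allocation failure. */
unsigned char *rgb_to_rgba(const unsigned char *rgb, size_t npixels) {
    unsigned char *rgba = malloc(npixels * 4);
    if (!rgba) return NULL;
    for (size_t i = 0; i < npixels; ++i) {
        rgba[i * 4 + 0] = rgb[i * 3 + 0];
        rgba[i * 4 + 1] = rgb[i * 3 + 1];
        rgba[i * 4 + 2] = rgb[i * 3 + 2];
        rgba[i * 4 + 3] = 255; /* fully opaque */
    }
    return rgba;
}
```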
- X Abstract X
- Chaos Rift Regular
- Posts: 173
- Joined: Thu Feb 11, 2010 9:46 pm
Re: Writing to SDL_Surface Pixels
Ginto8 wrote:Are you sure you don't want a 32 bit rgba? for OpenGL it's always better to have transparency.
I'm not trying to be a jerk, but can you explain why it's better? Is it faster because it's a power of 2 and the graphics card bus width is usually (always?) a power of 2? If I have a hell of a lot of textures, it sounds like a waste of memory to me. I think memory is probably more precious than processing power for my application.
- Ginto8
- ES Beta Backer
- Posts: 1064
- Joined: Tue Jan 06, 2009 4:12 pm
- Programming Language of Choice: C/C++, Java
Re: Writing to SDL_Surface Pixels
X Abstract X wrote:I'm not trying to be a jerk but, can you explain why it's better? Is it faster because it's a power of 2 and the graphics card bus width is usually (always?) a power of 2? If I have a hell of a lot of textures it sounds like a waste of memory to me. I think memory is probably more precious than processing power for my application.
I don't know about the speed of 24-bit vs 32-bit, but I DO know that the way SDL stores pixels, it's gonna be an array of 32-bit ints anyway, and will just end up ignoring 8 bits. So it will most likely end up messing up on the transfer to OpenGL anyway. Second, a ton of textures? Shouldn't you consolidate them via tile/sprite sheets, thus lowering the number of textures necessary? Third, the extra memory from adding another byte to every pixel is so insanely small in the overall scheme of things that you REALLY shouldn't be worried about it. As Donald Knuth puts it, "Premature optimization is the root of all evil," so you shouldn't optimize until you actually have a speed/memory issue.
Quit procrastinating and make something awesome.
Ducky wrote:Give a man some wood, he'll be warm for the night. Put him on fire and he'll be warm for the rest of his life.
- X Abstract X
- Chaos Rift Regular
- Posts: 173
- Joined: Thu Feb 11, 2010 9:46 pm
Re: Writing to SDL_Surface Pixels
Ginto8 wrote:I don't know about the speed of 24bit vs 32 bit, but I DO know that the way SDL stores pixels, it's gonna be an array of 32 bit ints anyway, and will just end up ignoring 8 bits. So it will most likely end up messing up on the transfer to opengl anyway. Second, a ton of textures? shouldn't you consolidate that via tile/sprite sheets, thus lowering the amount of textures necessary? Third, the amount of extra memory by adding another byte to every pixel is so insanely small in the overall scheme of things that you REALLY shouldn't be worried about it. As Donald Knuth puts it, "Premature optimization is the root of all evil," so you shouldn't optimize until you actually have a speed/memory issue.
I've been using 24-bit textures for months without problems; one of the parameters to glTexImage2D() lets you specify the format your pixel data is in. I've never heard of anyone putting their textures into a sheet. These aren't 32x32 tiles that you would use in a 2D game; they are 256x256, 512x512, or even bigger textures. A 512x512 texture consists of 262,144 pixels. At 4 bytes per pixel, that is an entire megabyte; cutting that down to 3/4 of a megabyte definitely seems worth it.
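The arithmetic in that last post checks out; as a quick sanity check in plain C (the sizes are just the examples from the thread):

```c
#include <stddef.h>

/* Bytes needed for a w x h texture at a given bytes-per-pixel. */
size_t texture_bytes(size_t w, size_t h, size_t bytes_per_pixel) {
    return w * h * bytes_per_pixel;
}
```

A 512x512 texture at 4 bytes per pixel is 1,048,576 bytes (1 MiB); at 3 bytes per pixel it is 786,432 bytes (0.75 MiB), so dropping the alpha channel saves 25% per texture.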