Okay, so I'm at a bit of a crossroads and wanted a more educated opinion.
I've gotten pretty good with OpenGL over the past few years, both fixed-function and modern GL.
I'm also pretty good with SFML, and I end up using it for compatibility.
My problem is that I prefer GL. I like to write everything my way, and I think it ends up cleaner that way.
But fixed-function is too outdated, and I seem to be having trouble getting modern GL to run on a lot of devices.
For example, when I start a new project I often use my girlfriend's laptop to test. It's only about two years old, but it isn't a very nice laptop, and even with full driver updates it only supports GL 2.1, which effectively breaks anything I try to write in modern GL because the shaders aren't supported. Is this something I should just get over? Or is there an in-between version of GL that both uses modern shaders and works on low-end OpenGL drivers?
I know for a fact SFML uses OpenGL as its underlying library and even supports shaders, but I'm not sure what version it uses, and SFML apps run fine on my girlfriend's laptop. How does it do this?
Overall I just feel like I am missing something.
To use GL or not to use GL ( and which GL to use )
Re: To use GL or not to use GL ( and which GL to use )
You pretty much have 2 options.
Option 1: Choose an older version of OpenGL and make sure you don't use any functionality newer than that version. This is a good choice for most indies, because most of us aren't trying to do anything cutting-edge graphics-wise. GL 2.1 is a pretty popular version to target: it lets you write shaders in GLSL 1.20, and it's old enough to be very widely supported. You do have to think about your target audience, though. Do some research into which OpenGL versions are supported on which generations of hardware, then decide what version you want to target.
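For reference, here's a minimal sketch of what a GL 2.1-era shader pair looks like in GLSL 1.20 (attribute/varying instead of in/out, gl_FragColor instead of a user-defined output). The a_/u_/v_ names are just illustrative, not anything standard:

Code:
// vertex shader (GLSL 1.20)
#version 120
attribute vec3 a_position;
attribute vec2 a_texcoord;
uniform mat4 u_mvp;
varying vec2 v_texcoord;
void main() {
    v_texcoord = a_texcoord;
    gl_Position = u_mvp * vec4(a_position, 1.0);
}

// fragment shader (GLSL 1.20)
#version 120
uniform sampler2D u_texture;
varying vec2 v_texcoord;
void main() {
    gl_FragColor = texture2D(u_texture, v_texcoord);
}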
Option 2: Make use of different OpenGL versions based on the best version available on each user's system. This is what modern commercial games tend to do (I'm guessing SFML does this too). The benefit is obvious: you get to support a wide variety of users while making use of the newest functionality on each system. Obviously, though, this makes writing your renderer way more time-consuming and complicated; it's basically like writing a whole bunch of renderers, each in a different version of OpenGL and GLSL.
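A rough sketch of how the version check at startup might look, in C++, assuming you've already created a window and GL context (glGetString needs a current context, and the createRendererGL3/createRendererGL2 functions are hypothetical placeholders for your own render paths):

Code:
#include <GL/gl.h>
#include <cstdio>

void pickRenderer() {
    // Driver reports something like "2.1.0 - Build 8.15.10.2345"
    const char* ver = (const char*)glGetString(GL_VERSION);
    int major = 0, minor = 0;
    std::sscanf(ver, "%d.%d", &major, &minor);

    if (major >= 3) {
        // createRendererGL3();  // modern path: VAOs, GLSL 1.30+
    } else {
        // createRendererGL2();  // fallback path: GLSL 1.20
    }
}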
- dandymcgee
Re: To use GL or not to use GL ( and which GL to use )
Option 3: "System Requirements"
- superLED
Re: To use GL or not to use GL ( and which GL to use )
I would go with Option 3 if you want to make the best game you can make. Then you don't need to focus on this problem and can focus on the game itself.
Re: To use GL or not to use GL ( and which GL to use )
Welcome to the nebulous nature of the modern OpenGL API...
It's really quite crap. I absolutely abhor the API. I've written at great length about it and, as predicted, it's not going to change anytime soon (ever). The best bet is to shoot as high as you can, which will be limited by your development platform. You can either say "you simply can't run this unless you have XXX hardware", or you can provide several pipelines that fork based on hardware caps. So in Quadrion we'd have a pipeline we'd shoot for, but if your device didn't support SM4, we'd fall back to an SM3 pathway. You can conditionally compile shaders so that you CAN fork it one way or another. It's an absolute ton of extra work.
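To illustrate the "conditionally compile shaders" part, here's one common trick, sketched under the assumption that you're using GLEW for caps queries (HAS_SM4 and the helper function are made-up names, not from Quadrion): prepend a #define based on the detected caps, so one GLSL file carries both pathways behind #ifdef HAS_SM4 ... #else ... #endif.

Code:
#include <GL/glew.h>
#include <string>

GLuint compileFragment(const std::string& body) {
    // GL_EXT_gpu_shader4 is roughly the SM4 feature level on GL hardware.
    bool hasSm4 = glewIsSupported("GL_EXT_gpu_shader4") != 0;

    // #version must be the first line; the #define selects the pathway.
    std::string src = std::string("#version 120\n")
                    + (hasSm4 ? "#define HAS_SM4 1\n" : "")
                    + body;

    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    const char* p = src.c_str();
    glShaderSource(shader, 1, &p, NULL);
    glCompileShader(shader);
    // check GL_COMPILE_STATUS and the info log in real code
    return shader;
}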
My best advice is to focus on writing a very good effect framework. FX and CgFX were both pretty good solutions for attacking this problem (CgFX the less good of the two). Neither is supported anymore, because everyone has done what I just said: written a very good effect framework for themselves. If your back-end is OpenGL based, then the shader code will be GLSL, but the effect will encapsulate every piece of render state that is to be set for every pass of every effect. In this sense, the effect deals with the pathways to take. It's less C/C++ code to write, and you can deal with render bugs much more quickly.
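A tiny sketch of what "the effect encapsulates every piece of render state per pass" could look like in C++. The struct layout is illustrative only, not Quadrion's actual design:

Code:
#include <GL/glew.h>
#include <vector>

struct Pass {
    GLuint program;              // linked GLSL program for this pass
    bool   depthTest;            // every bit of render state lives here,
    bool   blend;                // not scattered through the C++ code
    GLenum blendSrc, blendDst;
};

struct Effect {
    std::vector<Pass> passes;
};

void draw(const Effect& fx) {
    for (const Pass& p : fx.passes) {
        glUseProgram(p.program);
        if (p.depthTest) glEnable(GL_DEPTH_TEST);
        else             glDisable(GL_DEPTH_TEST);
        if (p.blend) { glEnable(GL_BLEND); glBlendFunc(p.blendSrc, p.blendDst); }
        else         { glDisable(GL_BLEND); }
        // submit geometry for this pass here
    }
}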
That said, if you're testing on a "shitty" laptop, it's likely you have an Intel integrated graphics chip in there, either the 4XXX or the 2XXX series. Both of them are absolute crap. They shit on the specification, and their performance is three generations behind advertised, at best. I made it a point to discard Intel graphics as a target platform completely. You might be well advised to do the same...