I recently got through upgrading our engine to support ES2 and GLSL shaders. It took about a week to get the game looking exactly the same as it did before, just rendering with shaders instead. I'm sharing some info that might be worthwhile to anyone else trying to update their iPhone renderers. This is not a how-to on using GLSL to achieve different effects; you can find plenty of that elsewhere.
ES2 is not an incremental improvement to ES1; it is a total paradigm shift in how pixels get rendered. You can't just take an ES1 renderer and add a couple of shaders here and there like you can in DirectX. In ES2, you write vertex and pixel/fragment shaders in GLSL, and then pass values to those shaders at runtime.
The vertex shader reads values from VBOs or vertex arrays and outputs whatever values the pixel shader needs. Any values written by the vertex shader are interpolated along the triangle edges and raster lines before being passed to the pixel shader. The pixel/fragment shader has only one job: to output a color value.
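As a minimal sketch of that hand-off (v_color is just an illustrative name, and no transform is applied here), a vertex shader can write a per-vertex color that shows up in the pixel shader already interpolated:
// vertex shader
attribute vec4 position;
attribute vec4 diffuse;
varying lowp vec4 v_color;
void main()
{
    v_color = diffuse;              // written once per vertex...
    gl_Position = position;
}
// pixel shader
varying lowp vec4 v_color;          // ...arrives here interpolated per pixel
void main()
{
    gl_FragColor = v_color;
}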
I've found this site to be a nice reference for GLSL functions, starting at section 8.1: http://www.khronos.org/files/opengl-quick-reference-card.pdf
3GS/iPhone4/iPad or bust:
ES2 is not supported on the 3G, the first-gen iPhone, or the first-gen iPod touch. It's possible to support both ES1 and ES2 in the same codebase, but you will need two entirely different render paths. There's no mixing and matching allowed, so within a run you are either entirely ES1 or entirely ES2.
EAGLContext* eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
if (eaglContext && [EAGLContext setCurrentContext:eaglContext])
{
    // initialize a renderer that uses ES2 imports
}
else
{
    eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
    if (!eaglContext || ![EAGLContext setCurrentContext:eaglContext])
    {
        // total failure!
    }
    // initialize a renderer that uses ES1 imports
}
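One way to keep the two paths manageable (just a sketch; the Renderer names here are made up, not something from the engine) is a common interface chosen once at startup based on which context succeeded:
class Renderer
{
public:
    virtual ~Renderer() {}
    virtual void drawScene() = 0;
};

class RendererES1 : public Renderer
{
public:
    virtual void drawScene() { /* fixed-function ES1 calls */ }
};

class RendererES2 : public Renderer
{
public:
    virtual void drawScene() { /* glUseProgram + shader path */ }
};

// picked once, based on which EAGLContext succeeded
Renderer* createRenderer(bool es2Available)
{
    if (es2Available)
        return new RendererES2();
    return new RendererES1();
}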
Loading and assigning shaders:
The shader compiler deals in character buffers. You will need to either build the GLSL source in code or, more sanely, load a file containing a shader and pass its contents to the compiler.
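Getting the file contents into a buffer is nothing special; something along these lines works (readShaderFile is just an illustrative stdio sketch, not a function from any API):
#include <stdio.h>
#include <stdlib.h>

char* readShaderFile(const char* fileName)
{
    FILE* file = fopen(fileName, "rb");
    if (!file) return NULL;
    fseek(file, 0, SEEK_END);
    long size = ftell(file);
    fseek(file, 0, SEEK_SET);
    char* buffer = (char*)malloc(size + 1);
    fread(buffer, 1, size, file);
    buffer[size] = 0;    // null-terminate so it can go straight to glShaderSource
    fclose(file);
    return buffer;       // caller is responsible for free()
}
From there, the compile step looks like this: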
int loadShader(GLenum type, const char* glslSourceBuf)
{
    int ret = glCreateShader(type);
    if (ret == 0) return ret;
    glShaderSource(ret, 1, (const GLchar**)&glslSourceBuf, NULL);
    glCompileShader(ret);
    int success;
    glGetShaderiv(ret, GL_COMPILE_STATUS, &success);
    if (success == 0)
    {
        char errorMsg[2048];
        glGetShaderInfoLog(ret, sizeof(errorMsg), NULL, errorMsg);
        outputDebugString("shader compile error: %s\n", errorMsg);
        glDeleteShader(ret);
        ret = 0;
    }
    return ret;
}
int loadShaderProgram(const char* vertSource, const char* pixelSource)
{
    // load the two individual shaders
    int vertShader = loadShader(GL_VERTEX_SHADER, vertSource);
    int pixelShader = loadShader(GL_FRAGMENT_SHADER, pixelSource);
    if (vertShader == 0 || pixelShader == 0) return 0;
    // create a "program", which is a vertex/pixel shader pair
    int ret = glCreateProgram();
    if (ret == 0) return ret;
    glAttachShader(ret, vertShader);
    glAttachShader(ret, pixelShader);
    // bind the vertex attributes to the same indices used later
    // in the glVertexAttribPointer calls
    glBindAttribLocation(ret, AP_POS, "position");
    glBindAttribLocation(ret, AP_NORMAL, "normal");
    glBindAttribLocation(ret, AP_DIFFUSE, "diffuse");
    glBindAttribLocation(ret, AP_SPECULAR, "specular");
    glBindAttribLocation(ret, AP_UV1, "uv1");
    glLinkProgram(ret);
    int linked;
    glGetProgramiv(ret, GL_LINK_STATUS, &linked);
    if (linked == 0)
    {
        glDeleteProgram(ret);
        outputDebugString("Failed to link shader program.\n");
        return 0;
    }
    return ret;
}
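Putting it together (assuming a readShaderFile helper like the sketch earlier; the .vsh/.fsh file names are just examples):
int createDefaultProgram(void)
{
    char* vertSource = readShaderFile("default.vsh");
    char* pixelSource = readShaderFile("default.fsh");
    int program = 0;
    if (vertSource && pixelSource)
        program = loadShaderProgram(vertSource, pixelSource);
    free(vertSource);     // the sources aren't needed once the program is linked
    free(pixelSource);
    return program;
}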
void drawSomething(void)
{
    // tell OpenGL which shaders to use for rendering
    glUseProgram(mShaderProgram);
    // set any values on the shader that you want to use,
    // set up the vertex buffer using glVertexAttribPointer
    // calls and the same positions used during the linking,
    // then draw like usual.
    // note: the count argument is the number of indices (three per triangle)
    glDrawElements(GL_TRIANGLES, numTris * 3, GL_UNSIGNED_SHORT, 0);
}
No matrix stack:
All transformations are done in the shader, so anything using glMatrixMode is automatically out. glFrustumf and glOrthof are also gone, so you will need to write replacements. You can find examples of these two functions in the Android codebase at http://www.google.com/codesearch/p?hl=en#uX1GffpyOZk/opengl/libagl/matrix.cpp&q=glfrustumf%20lang:c++&sa=N&cd=1&ct=rc&l=7.
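If you'd rather skip the link, a glFrustumf replacement is only a handful of lines. Here's a sketch that fills a column-major float[16] the same way glFrustumf did (buildFrustum is just a name I'm using for illustration):
void buildFrustum(float* m, float left, float right, float bottom, float top,
                  float zNear, float zFar)
{
    for (int i = 0; i < 16; ++i)
        m[i] = 0.0f;
    m[0]  = (2.0f * zNear) / (right - left);
    m[5]  = (2.0f * zNear) / (top - bottom);
    m[8]  = (right + left) / (right - left);
    m[9]  = (top + bottom) / (top - bottom);
    m[10] = -(zFar + zNear) / (zFar - zNear);
    m[11] = -1.0f;
    m[14] = -(2.0f * zFar * zNear) / (zFar - zNear);
}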
For the transforms used by shaders, I have callbacks to grab values like ModelToView and ViewToProj from a structure that I calculate once per render pass.
In C++:
// glGetUniformLocation returns -1 if the name isn't an active uniform
int transformShaderHandle = glGetUniformLocation(shaderId, "ModelToScreen");
glUniformMatrix4fv(transformShaderHandle, 1, GL_FALSE, (GLfloat*)mModelViewProj);
In the vertex shader:
uniform mat4 ModelToScreen;
attribute vec4 position;
void main()
{
    gl_Position = ModelToScreen * position;
}
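The per-pass structure doesn't have to be anything fancy; a sketch of what it might hold (field names are illustrative, not the ones from the engine):
struct RenderPassTransforms
{
    float ModelToView[16];     // updated per object
    float ViewToProj[16];      // calculated once per pass, from a glFrustumf/glOrthof replacement
    float ModelToScreen[16];   // ViewToProj * ModelToView, what the vertex shader wants
};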
More textures!
You only get two texture channels to use under ES1. ES2 gives you 8. Setting up a texture in ES2 is similar to ES1, but you don't get the various glTexEnvi functions to define how multiple texture channels blend together. You do that part in GLSL instead.
In C++:
int textureShaderHandle = glGetUniformLocation(shaderId, "Texture0");
// tell the shader that Texture0 will be on texture channel 0
glUniform1i(textureShaderHandle, 0);
// then bind the texture like you would in ES1; note that
// glEnable(GL_TEXTURE_2D) no longer exists in ES2, samplers are always active
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, mTextureId);
In the pixel shader:
uniform lowp sampler2D Texture0;
varying mediump vec2 v_uv1;    // interpolated UV written by the vertex shader
void main()
{
    gl_FragColor = texture2D(Texture0, v_uv1);
}
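Since glTexEnvi is gone, blending texture channels together is also just GLSL. For example, the old GL_MODULATE behavior between two textures could look like this (a sketch assuming a second sampler bound to channel 1):
uniform lowp sampler2D Texture0;
uniform lowp sampler2D Texture1;
varying mediump vec2 v_uv1;
void main()
{
    lowp vec4 base   = texture2D(Texture0, v_uv1);
    lowp vec4 detail = texture2D(Texture1, v_uv1);
    gl_FragColor = base * detail;   // what glTexEnvi GL_MODULATE used to do
}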
No such thing as glEnableClientState(GL_NORMAL_ARRAY)
GL_NORMAL_ARRAY, GL_COLOR_ARRAY, etc. have all gone away. Instead you use the unified glVertexAttribPointer interface to push vertex buffer info to the shaders. This is a pretty simple change.
glEnableVertexAttribArray(AP_NORMAL);
glVertexAttribPointer(AP_NORMAL, 3, GL_FLOAT, GL_FALSE, vertDef.getVertSize(), (GLvoid*)(vertDef.getNormalOffset()*4));
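For reference, with an interleaved vertex struct the full set of calls might look something like this (the layout and offsetof usage are just an example, not the vertDef class above):
struct Vertex
{
    float pos[3];
    float normal[3];
    float uv1[2];
};

// offsets are byte offsets into the bound VBO (offsetof lives in <stddef.h>)
glEnableVertexAttribArray(AP_POS);
glVertexAttribPointer(AP_POS, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid*)offsetof(Vertex, pos));
glEnableVertexAttribArray(AP_NORMAL);
glVertexAttribPointer(AP_NORMAL, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid*)offsetof(Vertex, normal));
glEnableVertexAttribArray(AP_UV1);
glVertexAttribPointer(AP_UV1, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid*)offsetof(Vertex, uv1));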