GLSL Programming/Blender/Textured Spheres

The Earth seen from Apollo 17. The shape of the Earth is close to a quite smooth sphere.

This tutorial introduces texture mapping.

It's the first in a series of tutorials about texturing in GLSL shaders in Blender. In this tutorial, we start with a single texture map on a sphere. More specifically, we map an image of the Earth's surface onto a sphere. Based on this, further tutorials cover topics such as lighting of textured surfaces, transparent textures, multitexturing, gloss mapping, etc.

A triangle mesh approximating a sphere.
An image of the Earth's surface. The horizontal coordinate represents the longitude, the vertical coordinate the latitude.

Texture Mapping

The basic idea of “texture mapping” (or “texturing”) is to map an image (i.e. a “texture” or a “texture map”) onto a triangle mesh; in other words, to put a flat image onto the surface of a three-dimensional shape.

To this end, “texture coordinates” are defined, which simply specify the position in the texture (i.e. image). The horizontal coordinate is officially called S and the vertical coordinate T. However, it is very common to refer to them as x and y. In animation and modeling tools, texture coordinates are usually called U and V.

In order to map the texture image to a mesh, every vertex of the mesh is given a pair of texture coordinates. (This process (and the result) is sometimes called “UV mapping” since each vertex is mapped to a point in the UV-space.) Thus, every vertex is mapped to a point in the texture image. The texture coordinates of the vertices can then be interpolated for each point of any triangle between three vertices and thus every point of all triangles of the mesh can have a pair of (interpolated) texture coordinates. These texture coordinates map each point of the mesh to a specific position in the texture map and therefore to the color at this position. Thus, rendering a texture-mapped mesh consists of two steps for all visible points: interpolation of texture coordinates and a look-up of the color of the texture image at the position specified by the interpolated texture coordinates.
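
To illustrate these two steps, here is a minimal sketch in plain Python (independent of Blender; the function names, the barycentric weights, and the simple nearest-texel lookup are only for illustration): it interpolates the texture coordinates of the three vertices of a triangle for one point and then looks up the corresponding texel.

def interpolate_tex_coords(weights, vertex_tex_coords):
    # weights: barycentric weights (w0, w1, w2) of the point within the triangle;
    #    they are non-negative and sum to 1
    # vertex_tex_coords: the (s, t) texture coordinates of the three vertices
    s = sum(w * st[0] for w, st in zip(weights, vertex_tex_coords))
    t = sum(w * st[1] for w, st in zip(weights, vertex_tex_coords))
    return (s, t)

def look_up_color(image, width, height, s, t):
    # map texture coordinates in the range 0 to 1 to a texel of the image;
    # a real GPU would usually also filter between neighbouring texels
    x = min(int(s * width), width - 1)
    y = min(int(t * height), height - 1)
    return image[y][x]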

In OpenGL, any valid floating-point number is a valid texture coordinate. However, when the GPU is asked to look up a pixel (or “texel”) of a texture image (e.g. with the “texture2D” instruction described below), it internally maps the texture coordinates to the range between 0 and 1 in a way that depends on the “wrap mode”. For example, the wrap mode “repeat” basically uses the fractional part of the texture coordinates to determine texture coordinates in the range between 0 and 1, while the wrap mode “clamp” clamps the texture coordinates to this range. These internal texture coordinates in the range between 0 and 1 are then used to determine the position in the texture image: (0, 0) specifies the lower, left corner of the texture image; (1, 0) the lower, right corner; (0, 1) the upper, left corner; etc. OpenGL's wrap mode corresponds to Blender's settings under Properties > Texture tab > Image Mapping. Unfortunately, Blender doesn't appear to set the OpenGL wrap mode; it always appears to be “repeat”.
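
To make the two wrap modes more concrete, here is a small sketch in plain Python (not part of the Blender script; the function names are only illustrative) of how “repeat” and “clamp” map an arbitrary texture coordinate into the range from 0 to 1:

import math

def wrap_repeat(coordinate):
    # wrap mode "repeat": keep only the fractional part,
    # e.g. 1.25 becomes 0.25 and -0.25 becomes 0.75
    return coordinate - math.floor(coordinate)

def wrap_clamp(coordinate):
    # wrap mode "clamp": limit the coordinate to the range 0 to 1,
    # e.g. 1.25 becomes 1.0 and -0.25 becomes 0.0
    return min(max(coordinate, 0.0), 1.0)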

Texturing a Sphere in Blender

To map the image of the Earth's surface to the left onto a sphere in Blender, follow these steps:

  • Download the image to your computer: click the image to the left until you get to a larger version and save it (usually with a right-click); remember where you saved it.
  • Switch to Blender, add a sphere (in an Info window choose Add > Mesh > UV Sphere), and select it in the 3D View (by right-clicking).
  • Activate smooth shading (in the Tool Shelf of the 3D View; press t if the Tool Shelf is not visible).
  • Make sure that Display > Shading: GLSL is set in the Properties of the 3D View (press n if they aren't displayed).
  • Switch the Viewport Shading of the 3D View to Textured (the second icon to the right of the main menu of the 3D View).
  • With the sphere still selected, add a material (in a Properties window > Material tab > New).
  • Add a new texture (in the Properties window > Textures tab > New), select Image or Movie for the Type, and click Image > Open. Select your file in the file browser and click Open Image (or double-click it in the file browser).

The image should now appear in the preview section of the Textures tab, and Blender should put it onto the sphere in the 3D View.

Now you should make sure that the Coordinates in the Properties window > Textures tab > Mapping are set to Generated. This means that our texture coordinates will be set to the coordinates in object space. Specifying or generating texture coordinates (i.e. UVs) in any modeling tool is a whole different topic which is well beyond the scope of this tutorial.

With these settings, Blender will also send texture coordinates to the vertex shader. (Actually, we could also use the object coordinates in gl_Vertex because they are the same in this case.) Thus, we can write a vertex shader that receives the texture coordinates and passes them on to the fragment shader. The fragment shader then does some computation on the four-dimensional texture coordinates to compute the longitude and latitude (scaled to the range from 0 to 1), which are used as texture coordinates here. Usually this step would be unnecessary since the texture coordinates should already correctly specify where to look up the texture image. (In fact, any such processing of texture coordinates in the fragment shader should be avoided for performance reasons; here I'm only using this trick to avoid setting up appropriate UV texture coordinates.) The Python script to set up the shader could be:

import bge
 
cont = bge.logic.getCurrentController()  # the controller that runs this script
 
VertexShader = """
   varying vec4 texCoords; // texture coordinates at this vertex
  
   void main()
   {
      texCoords = gl_MultiTexCoord0; // in this case equal to gl_Vertex
 
      gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
   }
"""
 
FragmentShader = """
   varying vec4 texCoords; 
      // interpolated texture coordinates for this fragment
   uniform sampler2D textureUnit; 
      // a small integer identifying a texture image
 
   void main()
   {
      vec2 longitudeLatitude = vec2(
        (atan(texCoords.y, texCoords.x) / 3.1415926 + 1.0) * 0.5, 
         1.0 - acos(texCoords.z) / 3.1415926);
         // processing of the texture coordinates; 
         // this is unnecessary if correct texture coordinates 
         // are specified within Blender
   
      gl_FragColor = texture2D(textureUnit, longitudeLatitude);
         // look up the color of the texture image specified 
         // by the uniform "textureUnit" at the position 
         // specified by "longitudeLatitude.x" and 
         // "longitudeLatitude.y" and return it in "gl_FragColor"        
   }
"""
 
mesh = cont.owner.meshes[0]  # first mesh of the object that owns this controller
for mat in mesh.materials:
    shader = mat.getShader()  # the GLSL shader of this material
    if shader is not None:
        if not shader.isValid():  # set the shader sources only once
            shader.setSource(VertexShader, FragmentShader, 1)
            shader.setSampler('textureUnit', 0)

Note the last line

shader.setSampler('textureUnit', 0)

in the Python script: it sets the uniform variable textureUnit to 0. This specifies that the texture that is first in the list in the Properties window > Textures tab should be used. A value of 1 would select the second texture in the list, etc. In fact, for each sampler2D variable that you use in a fragment shader, you have to set its value with a call to setSampler in the Python script as shown above. Actually, a sampler2D uniform specifies a texture unit of the GPU. (A texture unit is a part of the hardware that is responsible for the lookup and interpolation of colors in texture images.) The number of texture units of GPUs is available in the built-in constant gl_MaxTextureUnits, which is usually 4 or 8. Thus, the number of different texture images available in a fragment shader is limited to this number.
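
For example, a fragment shader that uses two texture images would declare two sampler2D uniforms, and the Python script would have to set both of them with setSampler (this is only a sketch of the relevant lines; the uniform names are placeholders and the calls would go where the original setSampler call is in the script above):

shader.setSampler('firstTexture', 0)   # use the first texture in the Textures tab
shader.setSampler('secondTexture', 1)  # use the second texture in the Textures tab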

If everything went right, the texture image should now appear correctly mapped onto the sphere when you start the game engine by pressing p. (Otherwise Blender maps it differently onto the sphere.) Congratulations!

How It Works

Since many techniques use texture mapping, it pays off very well to understand what is happening here. Therefore, let's review the shader code:

Each vertex of Blender's sphere object comes with attribute data in gl_MultiTexCoord0, which specifies texture coordinates that, in our particular example, have the same values as the attribute gl_Vertex, which specifies the vertex position in object space.

The vertex shader then writes the texture coordinates of each vertex to the varying variable texCoords. For each fragment of a triangle (i.e. each covered pixel), the values of this varying at the three triangle vertices are interpolated (see the description in “Rasterization”) and the interpolated texture coordinates are given to the fragment shader. In this particular example, the fragment shader computes new texture coordinates in longitudeLatitude. Usually, this wouldn't be necessary because correct texture coordinates should be specified within Blender using UV mapping. The fragment shader then uses the texture coordinates to look up a color in the texture image specified by the uniform textureUnit at the interpolated position in texture space and returns this color in gl_FragColor, which is then written to the framebuffer and displayed on the screen.
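
For illustration, here is the fragment shader's longitude/latitude computation written in plain Python (this is not part of the Blender script; math.atan2 corresponds to GLSL's two-argument atan, and x, y, z are assumed to be the object-space coordinates of a point on a unit sphere):

import math

def longitude_latitude(x, y, z):
    # longitude: angle around the sphere's axis, scaled to the range 0 to 1
    longitude = (math.atan2(y, x) / math.pi + 1.0) * 0.5
    # latitude: 0 at the south pole, 1 at the north pole
    latitude = 1.0 - math.acos(z) / math.pi
    return (longitude, latitude)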

It is crucial that you gain a good idea of these steps in order to understand the more complicated texture mapping techniques presented in other tutorials.

Summary

You have reached the end of one of the most important tutorials. We have looked at:

  • How to set up a Blender object for texturing.
  • How to import a texture image.
  • How a vertex shader and a fragment shader work together to map a texture image onto a mesh.

Further Reading

If you want to know more

  • about the data flow in and out of vertex shaders and fragment shaders (i.e. vertex attributes, varyings, etc.), you should read the description of the “OpenGL ES 2.0 Pipeline”.
  • about the interpolation of varying variables for the fragment shader, you should read the discussion of the “Rasterization”.



Unless stated otherwise, all example source code on this page is granted to the public domain.