/cournia.com/dev/null - anisotropic gaze contingent display
anisotropic gaze contingent display
March 8, 2004

Here's my second try at a Gaze Contingent Display (GCD):

Like my previous approach, I'm relying on shaders to achieve the effect I want. Unlike that method, however, I'm now supplying the shaders with a texture (lovingly called a degradation texture) that contains degradation information for the GCD.

The degradation image is an RGBA image. The most important channel of the texture is the alpha channel, which contains level of detail information. Here's an example of the alpha channel from the degradation texture used in the image above:

ARB_fragment_program makes using the level of detail map easy. The whiter the sample from the alpha channel of the degradation map, the higher the level of detail. Once we know the LOD required for a fragment, we map the LOD value from [0, 1] to [MAX_LOD_LEVEL, 0], where MAX_LOD_LEVEL is the level of the coarsest mipmap. For example, MAX_LOD_LEVEL for a 1024x1024 texture would be 10, since log2(1024) = 10. We use the mapped LOD value to sample the viewing texture with the TXB (texture bias) instruction. The entire process can be described in about 5 lines of a Cg shader!
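Here's a minimal sketch of that shader in Cg (the sampler and parameter names, like viewTex and maxLodLevel, are my own; the original shader isn't shown here):

    float4 main(float2 uv : TEXCOORD0,
                uniform sampler2D viewTex,     // the texture being viewed
                uniform sampler2D degradeMap,  // the degradation texture
                uniform float maxLodLevel) : COLOR
    {
        // whiter alpha = more detail (1 = full detail, 0 = coarsest)
        float detail = tex2D(degradeMap, uv).a;
        // map [0, 1] to [maxLodLevel, 0]
        float bias = (1.0 - detail) * maxLodLevel;
        // tex2Dbias compiles to the TXB instruction under the arbfp1 profile
        return tex2Dbias(viewTex, float4(uv, 0.0, bias));
    }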

The cool thing about the TXB instruction is that the texture's current mipmap filtering mode (the GL_TEXTURE_MIN_FILTER parameter) can drastically change the resultant image. Above, we used GL_LINEAR_MIPMAP_LINEAR. Below are examples of using GL_NEAREST_MIPMAP_LINEAR, GL_LINEAR_MIPMAP_NEAREST, and GL_NEAREST_MIPMAP_NEAREST, respectively.


The red, green, and blue channels of the degradation texture contain channel modulation information, which can be used to simulate color blindness, glaucoma, and various other vision-related conditions. Of course, since most of the logic for the GCD is in shaders, we aren't restricted to any particular meaning for a given channel in the degradation map.
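As a sketch of one possible interpretation (the modulation scheme here is my assumption, not necessarily the one used above), the fragment shader can simply scale the viewed color by the degradation map's RGB channels:

    float4 main(float2 uv : TEXCOORD0,
                uniform sampler2D viewTex,
                uniform sampler2D degradeMap,
                uniform float maxLodLevel) : COLOR
    {
        float4 degrade = tex2D(degradeMap, uv);
        // alpha drives level of detail as before
        float bias = (1.0 - degrade.a) * maxLodLevel;
        float4 color = tex2Dbias(viewTex, float4(uv, 0.0, bias));
        // modulate each channel; e.g. red = 0 in the map removes red
        // from the output, crudely simulating protanopia
        color.rgb *= degrade.rgb;
        return color;
    }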

Below is an example of a slightly more complicated GCD. Here, parts of the image not in the region of interest are degraded and converted to greyscale:
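A sketch of how that variant might look in Cg (again, the exact blend is my guess): the alpha channel both raises the mipmap bias and pulls the color toward its luminance outside the region of interest.

    float4 main(float2 uv : TEXCOORD0,
                uniform sampler2D viewTex,
                uniform sampler2D degradeMap,
                uniform float maxLodLevel) : COLOR
    {
        float detail = tex2D(degradeMap, uv).a;
        float bias = (1.0 - detail) * maxLodLevel;
        float4 color = tex2Dbias(viewTex, float4(uv, 0.0, bias));
        // Rec. 601 luma weights give the greyscale value
        float grey = dot(color.rgb, float3(0.299, 0.587, 0.114));
        // full color in the region of interest, grey outside it
        color.rgb = lerp(float3(grey, grey, grey), color.rgb, detail);
        return color;
    }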

If you're worried about speed, don't be. It takes less than 2 ms to render a frame on a GeForce FX 5900. Of course, if you have a card that doesn't support ARB_fragment_program, then you probably should worry about speed (but you probably already do a lot of that anyway).
