Add shading or lighting to a simple volume renderer. You can choose a fixed transfer function and light, and need not count preprocessing time (if any), but should be able to view the volume from multiple viewpoints.

Some options include local gradient-based diffuse and specular, attenuation from deep shadow maps, single- or multiple-scattering participating media, and fixed or moving lights. For the basic project, you need only have diffuse and specular from a single light source based on the volume gradient. Anything beyond this will net extra credit points, proportional to the difficulty. I'm not going to cap the possible extra credit, as some of the options would probably take me longer than two weeks. Nonetheless, I encourage you to start simple, then expand if you have time. If you are planning **any** of the more complex options, start now!
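For the basic diffuse-plus-specular option, one possible sketch is a central-difference gradient with Blinn-Phong shading, evaluated once per ray step. This is only an illustration under assumptions of my own, not part of the handout shader: the function names, the `lightDir` parameter (a fixed light direction in object space), and the shininess of 32 are all hypothetical.

```glsl
// Sketch: central-difference gradient in voxel/texture space. One voxel
// is 1/voxelSize in texture coordinates, since texcoords 0..1 span the volume.
vec3 volumeGradient(sampler3D tex, vec3 p, vec3 voxelSize) {
    vec3 h = 1.0 / voxelSize;   // one-voxel offset in texture coordinates
    return vec3(
        texture3D(tex, p + vec3(h.x,0.,0.)).x - texture3D(tex, p - vec3(h.x,0.,0.)).x,
        texture3D(tex, p + vec3(0.,h.y,0.)).x - texture3D(tex, p - vec3(0.,h.y,0.)).x,
        texture3D(tex, p + vec3(0.,0.,h.z)).x - texture3D(tex, p - vec3(0.,0.,h.z)).x);
}

// Sketch: Blinn-Phong from the gradient "normal". Two-sided, since a volume
// has no consistent inside/outside orientation.
vec3 shade(vec3 baseColor, vec3 grad, vec3 viewDir, vec3 lightDir) {
    float gLen = length(grad);
    if (gLen < 1e-5) return baseColor;      // no reliable normal in flat regions
    vec3 n = grad / gLen;
    float diff = abs(dot(n, lightDir));
    vec3 hv = normalize(lightDir + viewDir);
    float spec = pow(abs(dot(n, hv)), 32.); // fixed shininess for simplicity
    return baseColor*(0.2 + 0.8*diff) + vec3(spec);
}
```

In the ray caster below, you would call something like `shade(color.rgb, volumeGradient(volumeTexture, pVox, voxelSize), -d, lightDir)` on each step before compositing.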

There are many data sets you can use, but I suggest the buckyball electron density dataset as a good starting point. It is small, and since it has byte data values, you don't need to worry about byte ordering problems or uncertainty about the range of valid values. There's a copy of the raw data at www.umbc.edu/~olano/491/bucky.raw. It is 32x32x32, one byte per voxel: just a raw binary file with no header or anything. The original data set came from AVS, downloaded from the University of Erlangen Volume Library (where you can find many other volumes as well).

You can use the platform and language of your choosing. For your reference, here is a copy of the simple volume rendering shader I gave out in class (though your volume renderer need not be shader-based, and indeed doesn't need to run on a GPU at all if you prefer a CPU implementation):

```glsl
// simple volume rendering ray caster

// maximum ray steps to allow
const int MAXSTEPS=500;

// textures and shader parameters
uniform sampler2D volumeLUT;     // table mapping value to color & opacity
uniform sampler3D volumeTexture; // volume texture data
uniform vec3 volumeSize;         // geometric size (volume is +/-volumeSize/2)
uniform vec3 voxelSize;          // dimensions in voxels in each direction
uniform float raystep;           // minimum ray step to take (in voxels)
uniform float opticalScale;      // scale between geometric size and optical depth

// info passed from vertex shader
varying vec4 Eo; // eye location in object space
varying vec4 Vo; // position in object space

void main() {
    // Ray origin and direction
    vec3 pVol = Vo.xyz/Vo.w;                // origin on proxy surface
    vec3 pVox = .5 + pVol/volumeSize;       // ray origin in voxel space
    vec3 d = normalize(pVol*Eo.w - Eo.xyz); // unit-length ray direction [1]

    // single ray step
    float stepVol = raystep * volumeSize.x/voxelSize.x; // step in volume space
    float stepOptical = stepVol * opticalScale;         // step in optical scale
    vec3 dVox = d * stepVol / volumeSize;               // step vector in voxels

    // find ray exit point from box [2]
    vec3 box = (sign(d)*volumeSize*0.5 - pVol)/d;
    float exitVox = min(box.x,min(box.y,box.z)) * voxelSize.x / volumeSize.x;

    // loop through volume, accumulating integral contribution for each step
    vec4 result = vec4(0.);
    float s=0.;
    for(int i=0; i<MAXSTEPS && s<=exitVox; ++i, s+=raystep, pVox+=dVox) {
        vec4 vol = texture3D(volumeTexture, pVox);
        vec4 color = texture2D(volumeLUT, vol.xx) * stepOptical; // [3]
        result = result + (1.-result.a)*color;
    }
    gl_FragColor = result;
}
```

On April 21st, I'd like to spend just a couple of minutes for each of you to show your results, and tell the class about the methods you used. It seems like switching computers or USB sticks isn't too much of an impediment after all, so you can bring what you'd like to show to class or check it into your assignment directory.

Turn in all necessary source files, along with an informal one to two page write-up in a file named README in your assignment directory. While the write-up may be informal, I will count off for spelling and grammar. Please proofread before you turn it in. The write-up should describe what help you got, if any, what you did, how you did it, how well you think it worked, and what further work you might do. Include what hardware and software you used (it is not necessary to use the gl.umbc systems, though you must submit there), and what files you'd like to use at the show and tell.

Turn files into the new *assn4* directory in your class repository. This new directory is in the repository, but not on your local computer. To get a copy, you'll need to *pull* and *update* (analogous to commit and push when submitting changes). With command-line hg, you'd run these commands in your local copy:

```
hg pull
hg update
```

Once again, I strongly recommend committing your changes often as you work on the assignment. It is your choice whether to push each commit or wait until the end.

A few spots in that shader probably deserve a little more explanation. Here are some extra notes:

[1] Forming the ray direction as `normalize(pVol*Eo.w - Eo.xyz)` presumably handles both projection types: for a perspective eye (`Eo.w` = 1) it is the unit vector from the eye toward the surface point, while for an orthographic eye at infinity (`Eo.w` = 0) it reduces to the fixed view direction `normalize(-Eo.xyz)`.

[2] The box code is simultaneously computing how many steps until the ray exits on each axis. Consider just x. `d` points in the ray direction. `sign(d.x)` is 1, -1, or 0, depending on whether `d` is pointing in the +x or -x direction, or doesn't have any change in x at all. So `sign(d.x)*volumeSize.x*0.5` is therefore the x plane where the ray will exit the cube (or 0, if it's parallel to that plane). `sign(d.x)*volumeSize.x*0.5 - pVol.x` is the distance along x from the point where the ray enters the cube to where it exits the x face. Finally, `(sign(d.x)*volumeSize.x*0.5 - pVol.x)/d.x` is the number of steps until it exits. `box` is this computation for each axis. The smallest of these is the first face that the ray exits, and the number of steps to get there.
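Unrolled for a single axis, the same computation looks like this. This is just a sketch with hypothetical scalar names (`exitPlaneX`, `distX`, `exitVol` are mine, not in the handout shader):

```glsl
// Note [2] for the x axis alone; the shader's vector form does all three at once.
float exitPlaneX = sign(d.x) * volumeSize.x * 0.5;    // x face the ray will exit
float distX = (exitPlaneX - pVol.x) / d.x;            // distance along the ray to that face
// box is effectively vec3(distX, distY, distZ); the nearest face ends the ray:
float exitVol = min(box.x, min(box.y, box.z));        // in volume units
float exitVox = exitVol * voxelSize.x / volumeSize.x; // converted to voxel-sized steps
```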

[3] Multiplying by the step size is a common approximation that is not too bad for mostly transparent stuff, but near opaque it gets pretty wrong. Instead of "`*opticalScale`", the correct scaling should be

```glsl
float alpha = max(color.a, 1e-3);
color.a = 1. - pow(1.-color.a, opticalScale);
color.rgb *= color.a / alpha;
```

This is the standard opacity correction: if the table alpha is the opacity over one unit of optical depth, then the opacity over a step of optical depth t is 1-(1-alpha)^t. The rgb is then rescaled so the color stays premultiplied by the new alpha, with the `max` guarding against a divide by zero.
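One subtlety: the straight multiply in the shader scales by `stepOptical` (the optical depth of one step), so the exponent in the corrected form is presumably meant to be `stepOptical` as well. Under that assumption, a sketch of the corrected loop body, using the same variable names as the shader:

```glsl
// Sketch: the accumulation loop body with the opacity correction applied,
// assuming the exponent should be the per-step optical depth stepOptical.
vec4 vol = texture3D(volumeTexture, pVox);
vec4 color = texture2D(volumeLUT, vol.xx);
float alpha = max(color.a, 1e-3);              // guard the divide below
color.a = 1. - pow(1. - color.a, stepOptical); // exact opacity for one step
color.rgb *= color.a / alpha;                  // rescale the premultiplied color
result = result + (1. - result.a)*color;       // front-to-back compositing
```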