Saturday, May 19, 2007

Forward Progress

As you can see from the image below, I've made some good progress on the ray tracer.



Figure 1. Shiny red balls on a checkered plane
are the quintessential ray-traced image.

There are four new features highlighted above. The first is specular highlights. The second is reflections. The checkerboard pattern is the third, and the plane it is applied to is the fourth. The plane is not a proper plane; its normal always points along the Y axis (except as noted below), which makes it good for creating a ground object to place things on and around, but not useful for much else. You'll also notice some aliasing around the checkerboard pattern. I believe this is caused by rounding errors, but I'll need to investigate to know for sure. This image also uses ambient light, which was not present in the first few images that contained diffuse lighting. In addition, shadows appear to be functioning normally.
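
The checkerboard itself is simple to produce. Here's a minimal sketch (the class and names are illustrative, not my actual code): since the plane lies in XZ, the parity of the floored x and z coordinates picks one of two colors.

```ruby
# Illustrative checkerboard texture for a ground plane in XZ.
# The square that a point falls in is found by flooring its x and z
# coordinates; the parity of their sum alternates the color.
class CheckerTexture
  def initialize(color_a, color_b, scale = 1.0)
    @color_a = color_a   # e.g. [1.0, 0.0, 0.0]
    @color_b = color_b   # e.g. [1.0, 1.0, 1.0]
    @scale   = scale     # width of one square
  end

  # point is [x, y, z]; y is ignored because the plane lies in XZ
  def get_at(point)
    ix = (point[0] / @scale).floor
    iz = (point[2] / @scale).floor
    (ix + iz) % 2 == 0 ? @color_a : @color_b
  end
end
```

Note that the rounding near square boundaries is exactly where the aliasing mentioned above shows up.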

This next image was actually created before the above image, but highlights a few other features that were not used in the above image. This image is very high-res (2000x2000), so be sure to view the full size image.



Figure 2. It's a good thing solid black compresses well.


This image introduces two features not seen above. The blue-green and red sphere on the upper left and the blue and white sphere on the upper right use the concept of noise to create a cloud-like texture and a marble-like texture, respectively (although, admittedly, in this picture the texture I consider marble looks more like clouds due to the particular color scheme chosen).

Noise, for those who aren't familiar, is n-dimensional, interpolated semi-random data. In plain English: noise is used to create patterns like marble that vary widely overall but aren't truly random; close inspection reveals a continuity to them. For more information, google "Perlin noise".
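
The idea fits in a few lines of Ruby. This is a sketch of 1-D value noise, a simpler cousin of Perlin's gradient noise, and not the Noise module my code actually uses: hash the integer lattice points to repeatable pseudo-random values, then smoothly interpolate between them.

```ruby
# 1-D value noise sketch (illustrative): deterministic pseudo-random
# values at integer points, smoothly interpolated in between.
module ValueNoise
  # Hash an integer to a repeatable value in [0, 1)
  def self.lattice(i)
    n = (i << 13) ^ i
    ((n * (n * n * 15_731 + 789_221) + 1_376_312_589) & 0x7fffffff) / 2147483648.0
  end

  def self.noise(x)
    i = x.floor
    f = x - i
    f = f * f * (3 - 2 * f)                       # smoothstep fade curve
    lattice(i) * (1 - f) + lattice(i + 1) * f     # blend the two neighbors
  end
end
```

Summing several octaves of this at doubling frequencies is what gives the cloud and marble looks.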

If you look closely at the blue marble texture, you'll see that the reflection coefficient varies with the color. It was also rendered with a low specular exponent, so the highlight is much larger than on the other objects.

The second feature introduced is bump-mapping, which is used on the red sphere. Bump-mapping is related to textures, but tweaks the way the ray tracer works internally to give the illusion of a non-smooth surface. This works by making small changes to the surface normal of the object, without the need to mathematically model the actual shape of the surface. In this example, sine and cosine are used to give the object a lumpy feel. Notice around the edges of the sphere, however, that its silhouette is still perfectly round. For small to medium alterations of the surface normal, the illusion holds up pretty well.
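
The perturbation step can be sketched like this (a hypothetical helper, not my exact code): add a small sine/cosine wiggle derived from the hit point to the true normal, then re-normalize so lighting calculations still see a unit vector.

```ruby
# Sketch of sine/cosine bump mapping: perturb the true surface normal
# by a small amount derived from the hit point, then re-normalize.
def bump_normal(normal, point, strength = 0.2)
  bump = [Math.sin(point[0] * 10.0) * strength,
          Math.cos(point[1] * 10.0) * strength,
          Math.sin(point[2] * 10.0) * strength]
  perturbed = [normal[0] + bump[0], normal[1] + bump[1], normal[2] + bump[2]]
  len = Math.sqrt(perturbed.inject(0.0) { |s, c| s + c * c })
  perturbed.map { |c| c / len }
end
```

Only the shading changes; the intersection test still sees a perfect sphere, which is why the silhouette stays round.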

Next steps? New features, such as refraction or oversampling, aren't planned to show up for a while. The priority is to clean up the code, improve performance, add more primitives, and fix most of the little issues. My dirty little secret is that these images carefully hide known issues with the current code. Well, OK, that's not my only secret...

Finally, I'll leave you with an image that contains all the new features, followed by a little more source code. Again, don't forget to view the full size image.


Figure 3. Oh yeah.



# All code on blog is (c) 2007 Brandon Inman. Please contact
# me directly for licensing information. I am intending to
# make this OSS but I'm not there yet :)

class MarbleTexture
  def initialize(args = {})
    @material = Material.new
    @material.specular = args[:specular] || 0.0
    @material.refraction = args[:refraction] || 0.0
    @stretch = args[:stretch] || 1000.0
  end

  def get_at(point)
    i = Noise.noise(point[0] / @stretch, point[1] * 2, point[2] * 2)
    @material.color = [i, i, 1]
    @material.reflection = i ** 2
    @material
  end
end


Monday, May 14, 2007

Quick Update

Still haven't fixed the perspective problem. I figure I'll go one of two ways from here:

1) I'll ignore it for now, get some other nice stuff working and fix it later.
2) It will suddenly start working properly and I'll have no idea why.

Either way, the perspective stuff will eventually be fixed.

Oh, and I've got light sources and shadows now. Not a shadow to be seen on some other ray tracing blogs.



Too bad my spheres all look like eggs, which is compounded by where the light source is placed. You'll also notice some odd artifacting in the shadows, which might be a result of the perspective problem, or might be actual bugs.
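
For the curious, the shadow test itself is simple; here's a sketch with hypothetical helper names, not my actual code. Nudge the hit point off the surface along the normal, then ask whether anything blocks the segment to the light (the intersection query is supplied as a block returning the nearest hit distance, or nil). Forgetting the nudge is a classic source of speckled shadow artifacts ("shadow acne"), so that's one suspect for the artifacting above.

```ruby
# Sketch of a shadow test: offset the hit point slightly along the
# surface normal, then cast a ray toward the light. Any hit closer
# than the light means the point is in shadow.
EPSILON = 1e-4

def in_shadow?(point, normal, light_pos)
  origin = point.each_index.map { |i| point[i] + normal[i] * EPSILON }
  to_light = light_pos.each_index.map { |i| light_pos[i] - origin[i] }
  dist = Math.sqrt(to_light.inject(0.0) { |s, c| s + c * c })
  dir = to_light.map { |c| c / dist }
  t = yield(origin, dir)        # nearest hit distance along the shadow ray
  !t.nil? && t < dist           # occluders beyond the light don't count
end
```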

I do have an orthogonal camera implemented. I'll update with images as I have time.

Sunday, May 6, 2007

First post!

One night, being an ub3r g33k, I decided that I, too, would write a ray-tracer. Being much more l33t than everyone else, I began by rubbing two sticks together, in hopes of creating enough heat to start a fire, with which I would smelt enough silicone to form the computer with which I would create the ult1mat3 r@ytr@c3r. I quickly abandoned this approach for two reasons-
  1. It was taking too long.
  2. Computer chips are made from silicon, not silicone.
So, instead, I grabbed my copy of "Advanced Graphics Programming in Turbo Pascal" (Roger T. Stevens and Christopher D. Watkins). This epic work on graphics programming includes many great gems that were invaluable during the creation of the ray tracer, such as:

Because of the large amount of information in the ray traced (image) file... it is probably not possible to fit a high resolution (image) file onto a standard 360K floppy disk.

My first two choices were 1) what language to implement the ray tracer in and 2) what file format to output to. Given the performance requirements of ray-tracing, and the need for an output format that would eventually store large images, I chose Ruby as the language and ASCII-mode PPM as the output format.
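
ASCII PPM is about as simple as an image format gets, which is the appeal. Here's a minimal writer as a sketch (the function name and buffer layout are illustrative, not my actual code): a three-line text header, then one R G B triple per pixel, with float components scaled to 0-255.

```ruby
# Minimal ASCII (P3) PPM writer. image_buffer[x][y] holds [r, g, b]
# with float components in 0.0..1.0.
def write_ppm(path, image_buffer, width, height)
  File.open(path, "w") do |f|
    f.puts "P3"                    # magic number for ASCII PPM
    f.puts "#{width} #{height}"    # image dimensions
    f.puts "255"                   # maximum component value
    height.times do |y|
      width.times do |x|
        r, g, b = image_buffer[x][y].map { |c| (c.clamp(0.0, 1.0) * 255).round }
        f.puts "#{r} #{g} #{b}"
      end
    end
  end
end
```

The downside, of course, is size: every pixel costs up to twelve bytes of text, which is why high-res renders get huge.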

Next, I typed in all the vector and matrix math from the book, carefully converting to Ruby syntax. By the end, I expected this to be the largest effort of the entire project, although this did not end up being the case.

After all of the math was entered, I then added the sphere intersection logic. After verifying that this functioned correctly, I added the main rendering loop, which is almost entirely original code. I adjusted this loop several times, but could not yet get the perspective to come out right.
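
The sphere intersection boils down to solving a quadratic. Here's a sketch of the standard approach (illustrative; the real code follows the book's vector routines): returns the nearest positive hit distance t along the ray, or nil for a miss.

```ruby
# Ray-sphere intersection via the quadratic formula.
# origin, dir, center are [x, y, z] arrays; dir is assumed normalized,
# so the quadratic's leading coefficient a = dir . dir = 1.
def sphere_intersect(origin, dir, center, radius)
  oc = [origin[0] - center[0], origin[1] - center[1], origin[2] - center[2]]
  b = 2.0 * (dir[0] * oc[0] + dir[1] * oc[1] + dir[2] * oc[2])
  c = oc[0]**2 + oc[1]**2 + oc[2]**2 - radius**2
  disc = b * b - 4.0 * c
  return nil if disc < 0                       # ray misses the sphere
  t = (-b - Math.sqrt(disc)) / 2.0             # nearer of the two roots
  t = (-b + Math.sqrt(disc)) / 2.0 if t < 0    # origin inside the sphere
  t >= 0 ? t : nil
end
```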

x /= @viewport_size
y /= @viewport_size

#film_point = vec(x, y, 0)
film_point = vec(x, y, -@viewport_z)
#film_point = vec(x, y, @viewport_z)
#film_point = vec(0, 0, 0)
#film_point = vec(0, 0, -@viewport_z)

# My original way, causes distortion
ray_dir = vec(-x, -y, @viewport_z)

# Book's way, causes way worse distortion
#eye = vec_lin_comb(x, @u_vec, y, @v_vec)
#ray_dir = vec_add(@view_vec, eye)

ray_dir_normal = vec_normalize(ray_dir)

hit = cast_ray(film_point, ray_dir_normal)

image_buffer[ix][iy] = hit[0].get_color unless hit[0].nil?


This code is used to render the image below.



Notice several things about the picture above:
  1. The spheres are all different colors, instead of all being purple-ish.
  2. I've already posted code (yeah, I'm talking to you, brailsmt).
  3. There are no rendered light sources in the image.
Note that there aren't any non-rendered light sources either. This is because I spent most of my time trying to fix the perspective problem mentioned above. Oh well. I'll stop by brailsmt's (http://brailsmt.blogspot.com) cube in the morning and pester him until he admits how he fixed it in his code.