Friday, July 1, 2011

Progressive photon mapping with metropolis sampling

Hi again,

To answer my last entry, I've added a Metropolis sampler to my photon tracing routine.
Well, it was a natural thing to do.

I am aware of that paper by Toshiya et al. - Robust Adaptive Photon Tracing using Photon Path Visibility.
But since I am quite familiar with Kelemen's MLT, and I didn't want to switch to that SPPM approach (I keep it as a traditional photon mapper with a shrinking radius), I implemented it quite differently. Path quality equals how many photons the path contributes to the photon map (and the photon map's bounding box is based on the visible hit points), so it simply favors longer paths. Perhaps I could extend it to how much flux the path carries.
Also, as in standard Kelemen MLT, there are 2 mutation strategies, small and large mutations; the small mutation still needs some further testing regarding the mutation approach and step size.
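To make the "path quality" idea above concrete, here is a minimal sketch of the Kelemen-style acceptance step, where a path's score I is simply the number of photons it contributed to the map. All names and the large-step probability are illustrative, not taken from the actual renderer:

```java
import java.util.Random;

// Sketch of a Metropolis acceptance test over photon paths, where the
// target function is the path's "quality" (photons contributed to the map).
public class MetropolisSketch {
    static final Random rng = new Random(42);
    static final double LARGE_STEP_PROB = 0.3; // assumed value, tuned per scene

    // Accept or reject a mutated path given old/new quality scores.
    static boolean accept(double oldQuality, double newQuality) {
        if (oldQuality <= 0) return true;             // always leave a dead path
        double a = Math.min(1.0, newQuality / oldQuality);
        return rng.nextDouble() < a;
    }

    public static void main(String[] args) {
        // A better path is always accepted (a clamps to 1).
        System.out.println(accept(2.0, 5.0)); // true
        // A worse path is accepted only with probability new/old.
        int accepted = 0;
        for (int i = 0; i < 100000; i++) if (accept(4.0, 1.0)) accepted++;
        System.out.println(accepted); // roughly 25% of the trials
    }
}
```

Because the score is a photon count rather than flux, this chain naturally spends more time on paths that deposit many photons, which is exactly the "favors longer paths" behavior described above.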

So enough talking, pics:

Path tracing:

Photon mapping - random sampling:

and finally, Photon mapping - metropolis sampling:


All images were computed in 2 minutes; the Metropolis code is still fresh and messy.

Monday, June 27, 2011

photon mapping vs path tracing outdoor test


So yeah, photon mapping doesn't handle bigger scenes very well.

Glass is fairly good (good caustics make up for the crap direct illumination; I trace 5M photons every pass here. Perhaps if I lowered that number I would get better direct lighting and antialiasing, but worse indirect lighting).

Photon mapping:


Path tracing:


Lambert material, photon mapping (without direct lighting computed separately):


Photon mapping with direct lighting:


Path tracing:


As expected, path tracing performs better. Maybe some fine tuning (photons per pass and initial radius) would help, but I don't expect miracles.

Sunday, June 26, 2011

Download Red Dot demo

Hi there,

I would like to provide a runnable version of the current incarnation of my raytracer.
I know it's nothing special, but maybe someone would like to run it, just for fun. I know I would.


Unpack the zip archive and there you have it - double click on reddot.jar or type:
java -Xms1g -Xmx1g -jar reddot.jar

(if you care about the performance, it is actually a lot, lot better to launch it from the command line with those options; otherwise the JVM will get too little memory for photons, and the GC will trigger like crazy. Note that the -X options must come before -jar, or they get passed to the program instead of the JVM)

It reads obj files (with their material representation) straight from the Blender exporter.
But since the camera is fixed, I haven't provided a GUI for importing your own scenes yet.

The image is updated every 5 seconds.

Moreover, the rendering behavior depends on 3 parameters:
  • initial radius = (world_max - world_min).length / 50
  • photons_per_pass = 100000
  • alpha = 0.5
The initial radius, just like in regular photon mapping, is a tradeoff between smoothness and detail. If the initial radius were smaller, the caustic would be sharper from the start, but the image would be noisier overall.
Instead of basing it on the world bounding box, I should perhaps base it on the bounding box of the photon map. But in this case it's exactly the same.

Photons per pass means how many photons it shoots between ray tracing passes. It doesn't store photons at the first hit, because direct lighting is computed as in a regular path tracer.
This is also a kind of tradeoff. In scenes where there is no direct lighting (the emitter is not directly visible from the diffuse surfaces), all the direct lighting computations (visibility checks, mostly) are a waste of time. But in the other case, when surfaces are lit directly, those direct lighting computations help quite a lot.

And the alpha parameter controls how fast the initial radius shrinks.
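For reference, here is a sketch of the progressive radius update from Hachisuka's PPM paper, which is what alpha controls: each pass, a hit point that gathered M new photons keeps only a fraction alpha of them and shrinks its radius accordingly. The variable names are illustrative, not from the actual code:

```java
// Progressive photon mapping radius update:
// R' = R * sqrt((N + alpha*M) / (N + M)), with N updated as N' = N + alpha*M.
public class RadiusUpdate {
    static final double ALPHA = 0.5; // the alpha parameter from the list above

    // n = photons accumulated so far, m = photons found this pass.
    static double shrink(double radius, double n, double m) {
        if (m == 0) return radius;                 // nothing found, radius unchanged
        double ratio = (n + ALPHA * m) / (n + m);  // keep only a fraction alpha
        return radius * Math.sqrt(ratio);
    }

    public static void main(String[] args) {
        double r = 1.0, n = 0;
        for (int pass = 0; pass < 100; pass++) {
            double m = 50;                          // photons found this pass (made up)
            r = shrink(r, n, m);
            n += ALPHA * m;                         // accumulate the kept fraction
        }
        System.out.printf("radius after 100 passes: %.4f%n", r);
    }
}
```

With alpha = 0.5 the radius shrinks fairly aggressively at first and then more slowly as N grows, which matches the smoothness-vs-detail behavior described above.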

Here I attach my 45-minute render (mobile i7):

Friday, June 24, 2011

Photon mapping continued.

Quick 5 min render.

It clearly reveals an issue with smooth normals.

Wednesday, June 22, 2011

Photon mapping ownz!

Hi there, it's been a while.

I've implemented progressive photon mapping, first exactly as in the original paper (constructing the hit point map once and shooting photons in the main loop). It was all nice and fast but quite troublesome, especially when it comes to antialiasing and glossy materials.
Then I've implemented the more basic approach to PPM, as in this paper.

But I have some issues as of now:
  • I am not sure about the adjoint BSDF for shading normals. What's obvious is that smooth normals didn't work out of the box for the Lambertian. What I did is keep the original (geometric) normal and scale the contribution by |in*nor_s| / |in*nor_g|, which kinda works, but I'm not sure. What about other materials? For specular and dielectric it seems to work with just the smoothed normal in place of the geometric one.
  • Since I'm still using Java, I have problems with the garbage collector. I build a uniform grid for 100000 - 200000 photons every pass on all 8 threads separately, and that really kills the GC, even if I try to reuse the grid. Any suggestions?
  • Shading in hard corners, for example the 90 degree corners between walls. Right now I use a simple hack that checks the angle between the photon and the surface normal, but that makes the corners black. The blackness does shrink over time but doesn't seem to ever disappear. So I wonder if it wouldn't be a better choice to leave it alone, converging to the right solution in +inf time and 1/inf radius (?)
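The shading-normal correction from the first point can be sketched as follows: the photon's contribution is weighted by |in·n_s| / |in·n_g| (shading vs. geometric normal), in the spirit of the adjoint-BSDF weights from Veach's thesis. The helper names here are made up, and this only covers the Lambertian case as described above:

```java
// Weight a photon's contribution when shading normals differ from
// geometric normals: |in . ns| / |in . ng|.
public class ShadingNormalWeight {
    static double dot(double[] a, double[] b) {
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
    }

    // in = incident photon direction (pointing away from the surface),
    // ns = shading normal, ng = geometric normal; all unit length.
    static double adjointWeight(double[] in, double[] ns, double[] ng) {
        double g = Math.abs(dot(in, ng));
        if (g < 1e-8) return 0.0;            // grazing the geometric plane
        return Math.abs(dot(in, ns)) / g;
    }

    public static void main(String[] args) {
        double[] in = {0, 0, 1};
        double[] ng = {0, 0, 1};
        double s = Math.sqrt(0.5);
        double[] ns = {s, 0, s};             // shading normal tilted 45 degrees
        System.out.println(adjointWeight(in, ns, ng)); // cos(45 deg) ~ 0.707
    }
}
```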
Other than that, it's blazing fast. This image took around one hour to render:


But it was already pretty good within minutes.