Showing posts with label gpu.

Monday, June 27, 2011

Accelerating Path Tracing by Eye Path Reprojection

Just stumbled upon this upcoming paper by Niklas Henrich about interactive GPU path tracing. The abstract sounds very promising:
"Recently, path tracing has gained interest for real-time global illumination since it allows to simulate an unbiased result of the rendering equation. However, path tracing is still too slow for real-time applications and shows noise when displayed at interactive frame rates. The radiance of a pixel is computed by tracing a path, starting from the eye and connecting each point on the path with the light source. While conventional path tracing uses the information of a path for a single pixel only, we demonstrate how to distribute the intermediate results along the path to other pixels in the image. We show that this reprojection of an eye path can be implemented efficiently on graphics hardware with only a small overhead. This results in an overall improvement of the whole image since the number of paths per pixel increases. The method is especially useful for many indirections, which is often circumvented to save computation time. Furthermore, instead of improving the quality of the rendering, our method is able to increase the rendering speed by reusing the reprojected paths instead of tracing new paths while maintaining the same quality."

Hopefully, the results from "Accelerating path tracing by re-using paths" (a 2002 paper by Bekaert et al., who reported a 9x speed-up for CPU path tracing) can be replicated on the GPU.
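The core idea, as far as I can tell from the abstract: every intermediate vertex of an eye path carries a perfectly usable radiance estimate, so instead of throwing it away you reproject the vertex into the image and donate the estimate to the pixel that sees it. Here's a toy sketch of that bookkeeping (my own speculative reading, with fake stand-in geometry; a real implementation would presumably also have to restrict the splatting to diffuse surfaces, since exitant radiance is view-dependent otherwise):

```python
import random
from dataclasses import dataclass

@dataclass
class Vertex:
    pixel_seen_from: int  # pixel whose primary ray hits this vertex (-1: none/occluded)
    brdf_cos: float       # BRDF * cos(theta) throughput at this vertex
    direct: float         # light arriving here straight from the source

NUM_PIXELS = 16

def trace_eye_path(depth=4):
    # Stand-in for a real eye-path tracer: random vertices instead of geometry.
    return [Vertex(random.randrange(-1, NUM_PIXELS), random.uniform(0.2, 0.9),
                   random.uniform(0.0, 1.0)) for _ in range(depth)]

def splat_path(framebuffer, weights, origin_pixel):
    vertices = trace_eye_path()
    radiance = 0.0
    # Walk the path back to front, so `radiance` is the light seen from each vertex.
    for v in reversed(vertices):
        radiance = v.direct + v.brdf_cos * radiance
        # Conventional path tracing credits only `origin_pixel` with the final
        # value; reprojection also splats each partial estimate to the pixel
        # that sees the corresponding vertex directly.
        if v.pixel_seen_from >= 0:
            framebuffer[v.pixel_seen_from] += radiance
            weights[v.pixel_seen_from] += 1.0
    framebuffer[origin_pixel] += radiance
    weights[origin_pixel] += 1.0

fb, wt = [0.0] * NUM_PIXELS, [0.0] * NUM_PIXELS
for px in range(NUM_PIXELS):
    splat_path(fb, wt, px)
image = [f / max(w, 1.0) for f, w in zip(fb, wt)]  # each pixel now averages >1 path
```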

Wednesday, April 6, 2011

Scene from Kajiya's paper 'The rendering equation' can now be path traced in real-time on the GPU! UPDATE: exe available

This is just incredible, another milestone in the history of rendering! Jacco Bikker, the main developer behind the real-time path tracer Brigade, and Jeroen van Schijndel, an IGAD research assistant, have made a simple but superb new path tracer (similar to tokaspt) which can render the classic path tracing scene from the 1986 paper "The rendering equation" by Jim Kajiya in real-time on just one GTX 470! There is very little noise overall; only the glass spheres and the shadowed caustics show some.

Some stats:

- 512x512 rendering resolution
- 8 bounces
- 64 samples per pixel
- 12 frames/second on 1 GTX470
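Those numbers imply a serious sample throughput. A quick back-of-the-envelope calculation (my own, and an upper bound, since paths can terminate early):

```python
width, height, spp, fps, max_bounces = 512, 512, 64, 12, 8

paths_per_sec = width * height * spp * fps  # ~201 million paths/s
rays_per_sec = paths_per_sec * max_bounces  # upper bound; ignores early termination
print(f"{paths_per_sec / 1e6:.0f} M paths/s, up to {rays_per_sec / 1e9:.1f} G rays/s")
```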

The amazing path tracing speed is partly due to the fact that the scene contains no triangle meshes, only geometric primitives (spheres and boxes), which are computationally much cheaper to intersect than triangles. The scene in the video is not 100% identical to the original one (the structure consisting of the revolved parabola with the oblate spheroid, 'mushroom' for short ;-), is still missing), but they're working on it (UPDATE 2: the mushroom is finished, see the link at the end of this post). This is the original scene from the 1986 paper:


The above is an off-screen photograph (published in the paper). This is the direct feed image:


It's really mind-boggling when you realize that Kajiya needed 1221 minutes (20.35 hours) to render this image on an IBM 3081 mainframe in 1986, and that 25 years later it can be computed at the same resolution in 36 milliseconds on a GTX580. A speed-up of 2 million times!! Sounds like a great 25th anniversary :D !
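For the sceptics, the arithmetic behind that factor:

```python
kajiya_ms = 1221 * 60 * 1000  # 1221 minutes expressed in milliseconds
gtx580_ms = 36                # frame time reported for the GTX580
print(f"speed-up: {kajiya_ms / gtx580_ms:,.0f}x")  # ~2,035,000x
```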


(gotta love that '80s font :-)

I would love to see some animation in this scene, for example an animated light casting moving shadows or a collapse of the pile of green spheres, which would greatly accentuate the "real-timeness" of the path tracing.

UPDATE: Executable and source code for this demo are now available at the links in this thread on the ompf forum. It's awesome: I'm getting a frame time of 1900 ms in the 64 spp version on my poor 8600M GT, which is about 49x slower than a GTX580 in this demo (in kajiya-perf, the default view at 64 spp/frame takes 1759 ms/frame on my 8600M GT versus only 36 ms/frame on a GTX580)! Time to upgrade :)

UPDATE 2: As Jacco Bikker promised in the comments, the mushroom-like structure is now done! Visit http://ompf.org/forum/viewtopic.php?f=6&t=3174 for a screenshot of the updated scene and for more info on this project.

Thursday, April 15, 2010

Real-time pathtracing demo shows future of game graphics

Yessss!!! I've been anticipating this for a long time: real-time raytraced high-quality dynamic global illumination that is practical for games. Until now, the image quality of every real-time raytracing demo I've seen in the context of a game was deeply disappointing:

- Quake 3 raytraced (http://www.youtube.com/watch?v=bpNZt3yDXno),
- Quake 4 raytraced (http://www.youtube.com/watch?v=Y5GteH4q47s),
- Quake Wars raytraced (http://www.youtube.com/watch?v=mtHDSG2wNho) (there's a pattern in there somewhere),
- Outbound (http://igad.nhtv.nl/~bikker/projects.htm),
- Let there be light (http://www.youtube.com/watch?v=33yrCV25A14),
- the last Larrabee demo (http://www.youtube.com/watch?v=b5TGA-IE85o), showing an extremely dull Quake Wars scene: a raytraced floating boat in a mountainous landscape with some flying vehicles roaring over; Intel just showed a completely motionless scene, too afraid of revealing the low framerate when navigating,
- the Nvidia demo of the Bugatti at Siggraph 2008 (http://www.youtube.com/watch?v=BAZQlQ86IB4)

All of these demos lack one major feature: real-time dynamic global illumination. They just show Whitted raytracing, which makes the lighting look flat and dull, and which quality-wise cannot seriously compete with rasterization (which uses many tricks to fake GI, such as baked GI, SSAO, SSGI, instant radiosity, precomputed radiance transfer and spherical harmonics, Crytek's light propagation volumes, ...).

The above videos would make you believe that real-time high-quality dynamic GI is still a long way off. But as the following video shows, that time is much closer than you would think: http://www.youtube.com/watch?v=dKZIzcioKYQ

The technology demonstrated in the video was developed by Jacco Bikker (Phantom on ompf.org, who also developed the Arauna game engine built around real-time raytracing) and shows a glimpse of the future of graphics: real-time dynamic global illumination through pathtracing (probably bidirectional), computed on a hybrid architecture (CPU and GPU) achieving ~40 Mrays/sec on a Core i7 + GTX260. There's a dynamic floating object, and each frame accumulates 8 samples/pixel before being displayed. There are caustics from the reflective ring, cube and cylinder, as well as motion blur. The beauty of path tracing is that it inherently provides photorealistic graphics: there's no extra coding effort required to get soft shadows, reflections, refractions and indirect lighting, it all works automagically (it also handles caustics, though not very efficiently). The photorealism is already there; now it's just a matter of speeding it up through code optimization, new algorithms (stochastic progressive photon mapping, Metropolis light transport, ...) and of course better hardware (CPU and GPU).
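To make that "automagically" a bit more concrete, here is a deliberately tiny diffuse-only path tracer in Python (a toy of my own, not Bikker's code, and nowhere near real-time): the one recursive bounce line in radiance() is the entire "GI system", yet the ASCII output shows soft shadows under the balls and light bouncing off the floor without any code written specifically for them.

```python
import math, random

SPHERES = [  # (center, radius, albedo, emission)
    ((0.0, -1000.5, -3.0), 1000.0, 0.75, 0.0),   # huge sphere acting as the floor
    ((-0.6, 0.0, -3.0),    0.5,    0.75, 0.0),   # left ball
    (( 0.6, 0.0, -3.0),    0.5,    0.75, 0.0),   # right ball
    (( 0.0, 2.4, -3.0),    1.0,    0.0,  12.0),  # spherical area light
]

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def norm(a):
    l = math.sqrt(dot(a, a))
    return (a[0] / l, a[1] / l, a[2] / l)

def hit(orig, d):
    """Nearest ray/sphere intersection, or None."""
    best = None
    for c, r, alb, emit in SPHERES:
        oc = sub(orig, c)
        b = dot(oc, d)
        disc = b * b - dot(oc, oc) + r * r
        if disc < 0.0:
            continue
        t = -b - math.sqrt(disc)
        if t > 1e-4 and (best is None or t < best[0]):
            best = (t, c, alb, emit)
    return best

def rand_unit():
    while True:  # rejection-sample a uniform direction on the unit sphere
        v = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        if 0.0 < dot(v, v) <= 1.0:
            return norm(v)

def radiance(orig, d, depth=0):
    h = hit(orig, d)
    if h is None or depth > 4:  # hard cut-off at 5 bounces to keep it short
        return 0.0
    t, c, alb, emit = h
    p = (orig[0] + t * d[0], orig[1] + t * d[1], orig[2] + t * d[2])
    n = norm(sub(p, c))
    # Cosine-weighted diffuse bounce: normalize(normal + random unit vector).
    # This single recursive line gives indirect light and soft shadows "for free".
    u = rand_unit()
    bounce = norm((n[0] + u[0], n[1] + u[1], n[2] + u[2]))
    return emit + alb * radiance(p, bounce, depth + 1)

# Render a tiny ASCII image from a camera at (0, 0.5, 0) looking down -z.
W, H, SPP = 48, 24, 32
for y in range(H):
    row = ""
    for x in range(W):
        v = 0.0
        for _ in range(SPP):
            d = norm(((x + random.random()) / W - 0.5,
                      0.35 - 0.5 * (y + random.random()) / H,
                      -1.0))
            v += radiance((0.0, 0.5, 0.0), d)
        row += " .:-=+*#%@"[min(9, int(v / SPP * 9))]
    print(row)
```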

The video is imo a proof of concept of the feasibility of real-time pathtraced games: despite the low resolution, low framerate, low geometric complexity and the noise, there is an undeniable beauty to the unified global lighting for static and dynamic objects. I like it very, very much. I think a Myst or Outbound-like game would be ideally suited to this technology: it's slow-paced, you often hold still to inspect the scene for clues (so it's very tolerant of low framerates), and it contains only a few dynamic objects. I can't wait to see the kind of games built with this technology. Photorealistic game graphics with dynamic high-quality global illumination for everything have just come a major step closer to becoming reality.

UPDATE: I've found a good mathematical explanation for the motion blur you're seeing in the video, which was achieved by averaging the samples of 4 frames (http://www.reddit.com/r/programming/comments/brsut/realtime_pathtracing_is_here/):
there is too much variance in the lighting of this scene for the number of samples each frame takes to integrate the rendering equation (8; typically 'nice' results start at 100+ samples per pixel). Therefore you get noise which (if they implemented their pathtracer correctly) is unbiased, which in turn means that the amount of noise is proportional to the inverse of the square root of the number of samples. By averaging over 4 frames they halve the noise (a factor of sqrt(4) = 2), as long as the camera is not moving.
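A quick numerical sanity check of that 1/sqrt(N) behaviour (a toy experiment of my own, integrating an arbitrary function rather than the rendering equation):

```python
import random, statistics

def frame_estimate(n_samples):
    # Monte Carlo estimate of the integral of x^2 over [0,1] (true value: 1/3).
    return sum(random.random() ** 2 for _ in range(n_samples)) / n_samples

def noise(n_samples, trials=4000):
    # "Noise" = standard deviation of the per-frame estimate across many trials.
    return statistics.stdev(frame_estimate(n_samples) for _ in range(trials))

# Averaging 4 frames of 8 spp is equivalent to one frame of 32 spp (static
# camera), so the noise should drop by roughly a factor of sqrt(4) = 2.
print(f"8 spp: {noise(8):.4f}   4 x 8 spp: {noise(32):.4f}")
```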


UPDATE 2: Jacco Bikker uploaded a new, even more amazing video to YouTube, showing a rotating light globally illuminating the scene with path tracing in real-time at 14-18 fps (frame time: 55-70 ms)!
http://www.youtube.com/watch?v=Jm6hz2-gxZ0&playnext_from=TL&videos=ZkGZWOIKQV8

The frame averaging trick must have been used here too, because 6 samples per pixel cannot possibly give such good quality.