Thursday, January 26, 2012

Brigade 2 Blowout!

Jacco Bikker, one of the Brigade developers, posted some updates about the Brigade 2 path tracer:

- there is a new downloadable demo showing off the Blinn shader (Fermi only). I haven't been able to test it yet, but the screenshot looks sweet!

- a brand new video of the Reflect game using Brigade 2 running on 2 GTX 470 GPUs, which demonstrates some dramatic lighting effects and very low noise levels (for a real-time path traced interior scene)

All can be seen and downloaded from here: http://igad.nhtv.nl/~bikker/

There should also be a playable Reflect demo up by tomorrow!

GigaVoxels thesis online

Cyril Crassin has just posted his entire PhD thesis on GigaVoxels, the sparse voxel octree raycasting technology that supports efficient depth of field, soft shadows, animated voxel objects and indirect lighting. It can be found at

http://blog.icare3d.org/2012/01/phd-thesis-gigavoxels.html

There are over 200 pages of voxel raytracing goodies :)


Tuesday, January 24, 2012

Real-time path traced Sponza fly-through on 3 GPUs!

I sent the executable of the teapot-in-Sponza scene to a friend, szczyglo74 on Youtube, who has a much more powerful rig than my own (a PC with 3 GPUs: one GTX 590 (a dual-GPU card) + one GTX 460) and who made some very cool real-time videos of the Sponza scene. Many thanks szczyglo!

The maximum path depth in each of these videos is 4 (= 3 bounces max):

32 spp per frame (480x360, 1/4th render resolution, 8 fps): 

http://www.youtube.com/watch?v=fVAl-oKAL9I : awesome video, shows real-time convergence in most parts of the scene

32 spp per frame (640x480, 1/4th render resolution, 4.7 fps): 


8 spp per frame (480x360, 1/4th render resolution, ~21fps): 


4 spp per frame (640x480, 1/4th render resolution, ~18fps):



The above videos clearly show the importance of the number of samples per pixel per frame in indirectly lit areas. Despite the low max path depth of 4, it is still possible to discern some details in the corridors in the first two videos (32 spp per frame) during navigation, while the last two videos (8 and 4 spp per frame) are obviously too dark in these regions, although they clear up very fast once the camera is stationary. Note that these tests were made with a kernel that is not optimized for indirect lighting (no multiple importance sampling is used here).
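As a back-of-envelope illustration of why the 32 spp videos hold up so much better than the 4 spp one (this is a generic Monte Carlo sketch, not Brigade code): the noise in a path traced pixel falls off with the square root of the sample count, so 8x the samples buys you roughly 2.8x less noise.

```python
import random
import statistics

def render_pixel(spp, rng):
    """Toy stand-in for one path traced pixel: average 'spp' random
    radiance samples of a signal whose true value is 0.5."""
    return sum(rng.random() for _ in range(spp)) / spp

def pixel_noise(spp, trials=2000, seed=1):
    """Standard deviation of the pixel estimate across many renders:
    this is the 'grain' you see in the videos."""
    rng = random.Random(seed)
    return statistics.pstdev(render_pixel(spp, rng) for _ in range(trials))
```

Going from 4 to 32 spp (8x the samples) cuts the measured noise by about sqrt(8) ≈ 2.8x, which matches how the corridors stay readable in the 32 spp videos while they drown in noise at 4 spp.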

I'm quite happy with the sense of photorealism in these videos, especially considering that this is just brute force path tracing (no caching, filtering or interpolation yet, nor anything like image reconstruction, adaptive sampling, importance sampling, bidirectional path tracing or eye path reprojection, which are all interesting approaches). A textured version of Sponza will probably increase the realism further, which is something I will try in a future test.

Monday, January 23, 2012

Utah Teapot in Sponza continued

I've managed to get the "teapot in Sponza" scene to render slightly faster by treating all objects in the scene as static.

Youtube videos (rendered on a GeForce GTS 450):



Sunday, January 22, 2012

Optix 2.5 released

A few days ago, Nvidia released OptiX SDK 2.5 RC1, which stands out from prior releases thanks to a number of major improvements:

- out-of-core GPU ray tracing: scenes can now exceed the amount of available GPU RAM (up to 3 times)

- HLBVH2 support (Garanzha and Pantaleoni): replaces the previous LBVH builder and builds the BVH acceleration structure on the GPU in a fraction of the time a CPU would need, which allows for completely dynamic scenes by rebuilding the acceleration structure in real time every frame (e.g. the HLBVH2 paper reports build times of 10.5 ms on a GTX 480 for a model consisting of 1.76M fully dynamic polygons). HLBVH2 traversal speed is said to be comparable to that of a CPU-built BVH

This will greatly benefit real-time GPU path traced games and animations, as it not only reduces BVH build times by several orders of magnitude compared to CPU builders, but also eliminates the costly per-frame CPU-to-GPU transfer of the updated BVH (needed when it is built on the CPU)
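For context on why these GPU builders are so fast: LBVH-style algorithms reduce BVH construction to sorting primitives by Morton code, which parallelizes extremely well. Here is a rough sketch of just the Morton coding step (the real builders run this in CUDA; this Python version is illustrative only):

```python
def expand_bits(v):
    """Spread the lower 10 bits of v so there are two zero bits between
    each original bit (the standard trick for 30-bit 3D Morton codes)."""
    v &= 0x3FF
    v = (v | (v << 16)) & 0x030000FF
    v = (v | (v << 8))  & 0x0300F00F
    v = (v | (v << 4))  & 0x030C30C3
    v = (v | (v << 2))  & 0x09249249
    return v

def morton3d(x, y, z):
    """30-bit Morton code for a point with coordinates in [0, 1)."""
    def quantize(c):
        return min(max(int(c * 1024.0), 0), 1023)
    return (expand_bits(quantize(x)) << 2) | \
           (expand_bits(quantize(y)) << 1) | \
            expand_bits(quantize(z))

# Sorting primitive centroids by Morton code groups spatially nearby
# primitives together; the tree is then emitted over the sorted list.
centroids = [(0.1, 0.2, 0.3), (0.9, 0.1, 0.5), (0.11, 0.21, 0.29)]
order = sorted(range(len(centroids)), key=lambda i: morton3d(*centroids[i]))
```

Because the sort is the dominant cost and GPUs sort tens of millions of keys per second, rebuilding the whole hierarchy every frame becomes feasible.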

- the SDK path tracing sample is enhanced with multiple importance sampling

Friday, January 20, 2012

Real-time browser based path tracer

While searching the net for ways to improve convergence in dynamic, path traced environments, I stumbled upon this neat web-based path tracer created by Edward Porten:


It uses progressive spatial caching and runs pretty fast despite being CPU-based.


Also check out some other browser-based demos from the same author at http://www.openprocessing.org/portal/?userID=6535

The Sphere flake tracer uses a cool looking frameless rendering technique, roughly comparable to frame averaging.

"Hair Ball" and "Terrain Ray Marching" are awesome as well.

Thursday, January 19, 2012

Brigade 2 Teapot in Sponza test

Testing indirect lighting with motion blur in Brigade 2 with the teapot and the Sponza atrium:

Youtube videos:

http://www.youtube.com/watch?v=QCPCAZlS5QI (8 spp per frame)
Some pics:

This is a particularly nasty scene, since all the light comes from the skydome through the narrow roof opening (no portals or importance sampling are used, max path depth is 4). Nevertheless, it still converges rather quickly on a GTS 450 using the motion blur trick.

Wednesday, January 18, 2012

Brigade 2 motion blur test continued

I made some side-by-side comparisons of the Utah teapot scene (http://www.youtube.com/watch?v=KxELvSK3Gl0) to show the difference the motion blur technique makes in areas that are partially lit by indirect lighting. Both sides of the comparison screens use only 4 samples per pixel; the low sample rate is required to achieve playable framerates. Motion blur (frame averaging) is disabled in the left image, while the right image averages the pixels of the current frame with those of the previous 7 frames (equivalent to 32 samples per pixel). The difference in obscured regions is enormous. This is the quality that can be achieved in real time (10+ fps) on one current high end GPU (GTX 570 or better):
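The averaging itself is trivial; Brigade gets it for free from the OpenGL accumulation buffer, but the effect can be sketched on the CPU as a sliding-window mean over the last n frames (the FrameAverager name is my own, for illustration only):

```python
from collections import deque

class FrameAverager:
    """Sliding-window mean over the last 'history' frames, mimicking
    the accumulation-buffer motion blur described above."""
    def __init__(self, history=8):
        self.frames = deque(maxlen=history)  # oldest frame drops out

    def submit(self, frame):
        """Add the current frame (a flat list of pixel values) and
        return the average of all frames currently in the window."""
        self.frames.append(frame)
        n = len(self.frames)
        return [sum(f[i] for f in self.frames) / n for i in range(len(frame))]
```

With history=8, each displayed 4 spp frame effectively carries 8 x 4 = 32 samples per pixel; the price is ghosting when the camera or objects move quickly, which is why a high base framerate matters.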


Edges and shadows become much more clearly defined:


Video (low framerate, rendered at 640x480 on a GTS 450):


It's amazing what such a simple trick does for image quality, without losing any detail. My next test will involve a 3rd person car camera like the one in http://www.youtube.com/watch?v=SOic3eE8wrs which should work really well in combination with the motion blur (no sudden sideways movement).

Tuesday, January 17, 2012

New paper about noise reduction for real-time/interactive path tracing

"Practical noise reduction for progressive stochastic ray tracing with perceptual control"

Check it out here: http://www.karsten-schwenk.de/papers/papers_noisered.html

The "supplemental.zip" folder contains some very nice comparison videos.

Brigade 2 motion blurred Utah teapot


Small update on my previous post about motion blur in Brigade 2. I've made a small Bullet physics demo featuring a holy symbol of computer graphics, the Utah teapot:


The video was rendered on a low end GTS 450. The scene is inspired by and meant to be a real-time remake of an animation rendered with GPU path tracing using SmallLuxGPU (last scene in the video). 

Sunday, January 15, 2012

Brigade 2 motion blur test

Today I've added cheap camera motion blur to the Brigade 2 path tracer using the OpenGL accumulation buffer. The technique blends the pixels of one or more previous frames with the current frame and works very well for real-time path traced dynamic scenes, provided the framerate is sufficiently high (10+ fps). The difference in image quality is huge: path tracing noise is drastically reduced, and since all calculations pertaining to the accumulation buffer are hardware accelerated on the GPU, there is zero impact on rendering performance. Depending on the accumulation value, you essentially get two to ten times the number of samples per pixel for free, at the expense of slight blurring caused by the frame averaging (the blurring is actually not so bad, since it adds a nice cinematic effect).

Below is a comparison from a stress test of an indoor scene without and with motion blur applied (rendered on an 8600M GT). The images were rendered with multiple importance sampling using only 1 sample per pixel to better show the differences (progressive convergence is disabled):
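Brigade's MIS kernel itself is closed source, but the standard weighting such kernels build on is Veach's balance heuristic, which blends light sampling and BRDF sampling so that each strategy dominates where its pdf is high. A minimal sketch (function names are my own):

```python
def balance_heuristic(pdf_this, pdf_other):
    """Veach's balance heuristic: MIS weight for a sample drawn from
    the strategy with pdf_this, given that the other strategy could
    also have generated the same sample."""
    return pdf_this / (pdf_this + pdf_other)

def power_heuristic(pdf_this, pdf_other, beta=2.0):
    """Power heuristic (beta = 2), which usually lowers variance a
    bit further than the balance heuristic."""
    a, b = pdf_this ** beta, pdf_other ** beta
    return a / (a + b)

# The two weights for the same sample always sum to 1, so the combined
# estimator stays unbiased while strongly damping the noisy strategy:
w_light = balance_heuristic(0.8, 0.2)  # light sampling dominates here
w_brdf  = balance_heuristic(0.2, 0.8)
```

This is why MIS makes such a difference in interior scenes: samples from whichever strategy happens to be ill-suited for a given surface are weighted down instead of injecting fireflies.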


Comparison of the scene with the rotating ogre from a previous demo (see http://raytracey.blogspot.com/2011/12/videos-and-executable-demo-of-brigade-2.html), rendered at 1 sample per pixel (on an 8600M GT), with (right) and without (left) motion blur:


The following image has nothing to do with frame averaging; it comes from an earlier test of the MIS indoor scene with a highly detailed Alyx character (40k triangles) with a Blinn shader applied. It rendered very fast:

I'll upload some videos of the above tests soon. 

Monday, January 9, 2012

A new real-time sphere path tracer (CUDA and OpenCL)

Real-time path tracing of spheres on the GPU is hot it seems :-)


It looks very impressive and pretty. The path tracer owes its speed to the fact that it runs completely on the GPU and to a custom precomputed hashing algorithm. More info at http://bertolami.com/projectView.php?content=research_content&project=real-time-path-tracer. There are also downloadable executables further down the page.

Tokaspt (by tbp), Sfera (by LuxRender's Dade), the WebGL path tracer (by Evan Wallace), ... the list of real-time GPU path tracers featuring dynamic scenes with spheres keeps growing. 

Brigade 2 GI test

Just a small test with a Cornell box scene to test diffuse color bleeding in Brigade 2, rendered on my faithful 8600M GT. The female character is a high poly version of Alyx from Half-Life 2 (character model from here), containing almost 40k triangles, and can be moved around the scene in real time; the spheres are made of triangles:

Some more tests with an area light that show indirect lighting and subtle brownish/greenish color bleeding on the left side of the character:

These scenes use Brigade's multiple importance sampling kernel, which drastically reduces the noise in interior scenes compared to the kernel used in previous test scenes (those converged fast because they were open outdoor scenes lit by an HDR skydome). Brigade's MIS algorithm allows this kind of interior scene to converge extremely fast, as can be seen in this video made by Jacco Bikker, which uses two GTX 470s to render at real-time framerates with 80 spp per frame!

Tuesday, January 3, 2012

Brigade 2 website launched with source code + new videos and exe's

Great news today: the website for the Brigade 2 path tracer has been launched and includes the source code so anyone can make their own real-time path traced game. The path tracing kernels are still closed source and precompiled in a library.

The website can be found here: http://brigade.roenyroeny.com/

I have also developed two small demos showing real-time path traced interior scenes with animation. The first one is a simple study-room-like scene, in which the ceiling and back walls are removed to let more light enter the scene (there is a problem with the chair's normals):
 

The second one is a more complex bedroom scene (180k triangles). The scene is a free 3ds Max model from http://www.3dmodelfree.com, which I tweaked a little (I took out the ceiling, two walls and the curtains). The scene came without textures, so I had to set every material manually in the .mtl file. Some screens:
Diffuse(.1,.1,.1) and spec(.9) for bedframe and closet:

Two videos showing real-time material tweaking and animation with physics:


The executable demo can be downloaded at  
(all CUDA architectures supported)

Expect many more demos made with the Brigade 2 path tracer in the coming weeks and months. Brigade is also going to be ported to OpenCL soon. 2012 is going to be a breakthrough year for real-time GPGPU path tracing.

Monday, January 2, 2012

Pepeland inspired Brigade 2 scene

Inspired by a Pepeland animation (one of the first truly photorealistic animations, rendered with Arnold in 1999), I've created a scene with the Brigade 2 path tracer in an attempt to achieve photorealism in real-time. Some pics (rendered with 8600M GT):

The scene contains 110k triangles (the chair and robot are free models found on the net, the ogre is the same model from the previous demo). It renders in real-time at low resolution on a high end GPU and is pretty amazing to see in action.

Videos and executable will follow soon.

Bungie talks about ray tracing and voxels

Gamasutra has an interview up with Bungie's senior graphics engineer. The whole interview can be found at

http://www.gamasutra.com/view/news/39332/The_Big_Graphics_Problems_Bungie_Wants_To_Solve.php

Some interesting fragments:
I've certainly seen how even today people are having a lot of trouble rendering shadows without a lot of blockiness or dithering. 
HC: That's kind of the problem with computer game graphics these days. A lot of things people consider solved problems are actually quite far from being solved, and shadows are one of them. After all these years we don't have a very satisfactory shadow solution. They're improving; every generation of games they're improving, but they're nowhere near the perfect solution that people thought we already have. 
What do you think might be the answer? Your potential megatexture solution, or something else? 
HC: We are still far from seeing perfect shadows. Shadows are a byproduct of lighting. All frequency shadows (shadows that are hard and soft in all the right places) are a byproduct of global illumination, and these things are notoriously hard in real time.
There's just not enough machine power, even in today's generation or the next generation, to be able to capture that kind of fidelity. There are also inherent limitations to the current techniques, such as shadow maps, for example. When the light is near the glancing angle of a shadow receiver, then it is impossible to do the correct thing.
With the current state of the art shadow techniques we can manage the resolution much better, and we can do high quality filtering, but we still have long ways to go to get where we need to be, even if we just talk about hard shadows from direct illumination.
I think megatextures could help, but still fundamentally there are things you cannot solve with our current shadow meshes. And until the performance supports real-time ray tracing and global illumination, we're going to continue seeing hack after hack for rendering shadows.
About the potential of using voxels for faster and more efficient global illumination:
HC: Voxels are very very interesting to us. For example, when we take advantage of voxelization, we basically voxelize our level and then we build these portalizations and clustering of our spaces based on the voxelization. And so voxelization, what it does is hide all the small geometry details. And in the regular data structures, it's very easy to reason out the space when it's voxelized versus dealing with individual polygons.
But besides this ability, there's also the very interesting possibility for us to use voxelization or a voxelized scene to do lighting and global illumination. We have some thoughts in that area that we might research in the future, but in general I think it's a very good direction for us to think about; to use voxelization to hide all the details of the scene geometry and sort of decouple the complexity of the scene from the complexity of lighting and visibility. In that way everything becomes easier in voxelization.

Friday, December 30, 2011

Videos and executable demo of Brigade 2 GUI and animation test


Some videos of the GUI I developed for Brigade 2, showing real-time changing of materials with simultaneous animation and physics simulation. The GUI is still a work in progress, but being able to tweak any material on the fly makes it much easier to get the look right.

480p (320x240 render res, 4 spp, max depth 4)

480p (640x480 render res, 4 spp, max depth 4)

The ogre (model from here) is simply rotating for now. The mesh consists of 51k triangles, and its BVH is dynamically updated every frame (as are the BVHs of the car and the stack of blocks, which are physics driven).

The executable demo is available at 

Further experiments will include: 
- skeletal animation
- first person camera and gun
- cam following the vehicle
- architecture 

Thursday, December 29, 2011

GUI test in Brigade 2

I've been working on an IMGUI (immediate mode GUI) for Brigade 2 based on SDL, which is working pretty well and is quite responsive when the framerate is above 5 fps. There are sliders for specularity, transparency and the RGB values of the material's diffuse color. Every material in the scene can be picked and updates in real time, and you can navigate the scene, control the vehicle and run a physics simulation, all while changing materials. I will upload an executable demo and some videos of the GUI in action tomorrow.

Screenshot of the GUI while tweaking a glossy Ogre (almost 50k triangles, model from Javor Kalojanov's real-time raytraced animation demo):


szczyglo74 (CG artist/architect) also provided me with a modern half-open architectural structure (Altana model from Januszowka), which is great for testing indirect lighting and which I will turn into a demo soon (screen rendered on an 8600M GT):

Paper + video of "Real-time bidirectional path tracing via rasterization"

Available here: http://www.square-enix.com/jp/info/library/  (thanks to selfshadow)

It looks pretty damn good, and the technique seems a very likely candidate for implementation in next-gen games because it fully exploits the rasterization hardware. One of its limitations (as mentioned in the paper) is that it doesn't work well for highly glossy and perfectly specular surfaces (sharp reflections), for which ray tracing is proposed as an alternative.

Monday, December 26, 2011

Real-time bidirectional path tracing via rasterization

This is the title of an upcoming I3D 2012 paper by Yusuke Tokuyoshi and Shinji Ogaki, see

http://graphics.ics.uci.edu/I3D2012/papers.php

The title reminds me of "High-quality global illumination rendering using rasterization" by Toshiya Hachisuka from 2005, which described a technique to obtain photorealistic images on a typical 2005 GPU (like the Radeon 9700) in mere seconds, extremely impressive for that time. Shinji Ogaki is also a co-author on the Progressive Photon Mapping paper by Hachisuka and Jensen, so this new paper is definitely going to be interesting.

If the paper lives up to its title, this could be quite interesting. Both researchers work at Square Enix, and there seems to be a connection with the recently unveiled photorealistic Luminous engine, which uses high quality offline baked lightmaps (see this page for more details). A paper about the rasterization-based lightmap baking in Luminous can be found here, and the real-time bidirectional PT technique probably works very similarly (i.e. ray bundles computed with rasterization via parallel visibility tests):

Quote from the "Fast global illumination baking via ray bundles" paper (describing the tech behind the Luminous engine):
7 high-quality light maps are rendered in 181 seconds with NVIDIA GeForce GTX 580. The resolution of ray-bundle is 2048x2048 pixels, and 10000 directions are sampled. The performance of our renderer is over 200 M rays per second on a commodity GPU.
Assuming everything scales linearly, this means it would take about 16 milliseconds (i.e. 60 fps) on a GTX 580 to compute a GI lightmap with ray bundles of 512x512 pixels and 100 ray bundle directions (= 100 directional samples): 181 s for seven maps is roughly 26 s per map, and 16x fewer pixels times 100x fewer directions is a 1600x reduction in work. That should still yield great quality real-time global illumination, and this tech could potentially be used for making real-time photorealistic games on current GPUs. It doesn't work, however, for objects with highly glossy and perfectly specular materials.
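A quick sanity check of that scaling, using only the numbers from the quote:

```python
# 7 light maps take 181 s on a GTX 580 at 2048x2048 with 10000 directions.
seconds_per_map = 181.0 / 7                  # ~25.9 s per light map

pixel_scale = (2048 * 2048) / (512 * 512)    # 16x fewer rays per direction
direction_scale = 10000 / 100                # 100x fewer directions
speedup = pixel_scale * direction_scale      # 1600x less work overall

ms_per_map = seconds_per_map / speedup * 1000.0
print(round(ms_per_map, 1))                  # ~16.2 ms, i.e. roughly 60 fps
```

Of course this assumes perfectly linear scaling with ray count and ignores per-frame fixed costs, so treat it as an upper bound on the optimism rather than a performance guarantee.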