Friday, January 15, 2010

Arion Render, a hybrid GPU/CPU unbiased renderer

Amazing and extremely fast GPU/CPU raytracing from the maker of Fryrender:

Videos and info here: http://www.randomcontrol.com/arion

The nice thing is that it's also a production renderer, just like Octane Render. There's no separate render core for preview and final rendering; they are one and the same!

The race for realtime "production" rendering is definitely on...

Tuesday, January 12, 2010

Octane Render: a realtime GPU based unbiased renderer

The flood of realtime GPU path tracers and renderers keeps growing, and Octane Render is the latest newcomer. It's made by Terrence Vergauwen, a Belgian and compatriot of mine :-), formerly the lead developer of LuxRender. It's incredibly fast (currently CUDA only): www.refractivesoftware.com
Be sure to check out the video. It might not be as impressive as the iray video, but that one was running on 15 Tesla GPUs, while the Octane video was recorded with a single GTX 260!

The ever-growing list of GPU renderers:

1. V-Ray RT GPU
2. iray (mental images)
3. LuxRender OpenCL (smallLuxGPU, David Bucciarelli)
4. Octane Render (Refractive Software)
5. BrazilRT (Caustic Graphics)
6. FryRender, Maxwell ????

Thursday, December 10, 2009

SVO path tracer WIP

Inspired by the recent flood of real-time path tracers (both commercial and non-commercial:

V-Ray RT,
mental images iray,
Nvidia Optix,
Caustic's Brazil RT,
David Bucciarelli's SmallptGPU OpenCL path tracer (http://davibu.interfree.it/opencl/smallptgpu/smallptGPU.html),
albertoven's Raydiant (albertoven.com),
mxadd CUDA path tracer (http://mxadd.org/CenterFrame.php?wherejump=projects),
javor's CUDA path tracer (http://javor.tech.officelive.com/tmp.aspx))

and by the promise of virtually infinite geometry through voxel rendering, I decided to take the best of both worlds and write an SVO path tracer as a hobby project ;-). My goal is to make a very simple demo showing a Cornell box with an SVO-voxelized Stanford Dragon in it, interactively path traced (brute-force, unbiased indirect lighting) with CUDA (or OptiX, if SVO raycasting is easy enough to implement there). The whole scene must fit in video memory, no streaming (yet), because the path tracing algorithm would choke badly on it.
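To make the data-structure side concrete, below is a minimal sketch of the kind of compact node I have in mind, plus a top-down descent to the cell containing a point. The layout (an 8-bit child mask plus a first-child index) is my own assumption for illustration, roughly in the spirit of the published SVO work, not the exact GigaVoxels structure.

```cuda
#include <cstdint>

// Sketch of a compact SVO node: an 8-bit child mask plus the index of the
// first existing child. Children are stored contiguously in one big array,
// so a node costs only 8 bytes. This exact layout is an assumption made for
// illustration, not the GigaVoxels structure.
struct SVONode
{
    uint32_t childMask;   // bit i set -> child octant i exists (i = 0..7)
    uint32_t firstChild;  // index of this node's first existing child
};

// Descend from the root to the deepest existing node containing point p,
// with p given inside the unit cube [0,1)^3.
__device__ uint32_t descend(const SVONode* nodes, float3 p, int maxDepth)
{
    uint32_t node = 0;                              // root is stored at index 0
    float3 centre = make_float3(0.5f, 0.5f, 0.5f);  // centre of the current cell
    float half = 0.25f;                             // half-size of the child cells

    for (int depth = 0; depth < maxDepth; ++depth)
    {
        // Pick the child octant the point falls into.
        int oct = (p.x >= centre.x ? 1 : 0) |
                  (p.y >= centre.y ? 2 : 0) |
                  (p.z >= centre.z ? 4 : 0);

        if ((nodes[node].childMask & (1u << oct)) == 0)
            break;                                  // empty space below this node

        // Children are packed: the offset of child 'oct' equals the number of
        // set mask bits below it.
        uint32_t below = nodes[node].childMask & ((1u << oct) - 1u);
        node = nodes[node].firstChild + __popc(below);

        // Shrink the current cell towards the chosen octant.
        centre.x += (oct & 1) ? half : -half;
        centre.y += (oct & 2) ? half : -half;
        centre.z += (oct & 4) ? half : -half;
        half *= 0.5f;
    }
    return node;
}
```

At 8 bytes per node, a voxelized dragon plus Cornell box of a few tens of millions of nodes should fit comfortably in the ~1 GB of a GTX 285 without any streaming.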

I also want to do a comparison between a polygon version and an SVO version of the aforementioned scene, hoping to demonstrate that path tracing voxels is much faster than path tracing polygons, due to the much simpler ray-voxel intersection calculation and the automagic multiresolution benefits (see Cyril Crassin's "Beyond Triangles: Gigavoxels Effects in Video Games").
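To make the "much simpler intersection" point concrete, here are the two tests side by side: a slab-test ray/axis-aligned-box intersection (which is essentially what stepping through voxel cells reduces to) and the classic Möller-Trumbore ray/triangle test. Both are textbook routines written for this post, with their own small float3 helpers, not code taken from any of the renderers mentioned above.

```cuda
// A few tiny float3 helpers so the snippet is self-contained.
__device__ inline float3 sub3(const float3& a, const float3& b) { return make_float3(a.x - b.x, a.y - b.y, a.z - b.z); }
__device__ inline float  dot3(const float3& a, const float3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
__device__ inline float3 cross3(const float3& a, const float3& b) { return make_float3(a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x); }

// Ray vs. axis-aligned box (slab test). 'invDir' is 1/direction, precomputed.
// Returns true on a hit and the entry distance in tNear.
__device__ bool intersectAABB(const float3& orig, const float3& invDir,
                              const float3& boxMin, const float3& boxMax, float& tNear)
{
    float tx0 = (boxMin.x - orig.x) * invDir.x, tx1 = (boxMax.x - orig.x) * invDir.x;
    float ty0 = (boxMin.y - orig.y) * invDir.y, ty1 = (boxMax.y - orig.y) * invDir.y;
    float tz0 = (boxMin.z - orig.z) * invDir.z, tz1 = (boxMax.z - orig.z) * invDir.z;
    float tmin = fmaxf(fmaxf(fminf(tx0, tx1), fminf(ty0, ty1)), fminf(tz0, tz1));
    float tmax = fminf(fminf(fmaxf(tx0, tx1), fmaxf(ty0, ty1)), fmaxf(tz0, tz1));
    tNear = tmin;
    return tmax >= fmaxf(tmin, 0.0f);
}

// Ray vs. triangle (Moller-Trumbore): noticeably more arithmetic per test,
// and a polygon scene needs an acceleration structure full of these.
__device__ bool intersectTriangle(const float3& orig, const float3& dir,
                                  const float3& v0, const float3& v1, const float3& v2, float& t)
{
    float3 e1 = sub3(v1, v0), e2 = sub3(v2, v0);
    float3 p  = cross3(dir, e2);
    float det = dot3(e1, p);
    if (fabsf(det) < 1e-8f) return false;        // ray (nearly) parallel to the triangle
    float inv = 1.0f / det;
    float3 s = sub3(orig, v0);
    float u = dot3(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return false;
    float3 q = cross3(s, e1);
    float v = dot3(dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return false;
    t = dot3(e2, q) * inv;
    return t > 0.0f;
}
```

Besides the cheaper individual test, the octree doubles as both the acceleration structure and the LOD hierarchy, which is where the multiresolution benefit comes from.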

I'm still in the research phase, reading articles on (volume) path tracing and SVOs (Cyril Crassin's GigaVoxels paper is extremely informative, and I think you could simply extend its raycasting into a path tracing approach; a rough sketch of that idea is below). More to come soon.
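The sketch assumes diffuse-only materials; svoTrace() and the Hit struct are placeholders standing in for whatever the primary-ray SVO raycaster already provides, the RNG is a throwaway xorshift, and the float3 operators come from the CUDA SDK's cutil_math.h sample header. None of this is taken from the GigaVoxels code.

```cuda
#include "cutil_math.h"   // float3 operators, dot, cross, normalize (CUDA SDK sample header)

// Hypothetical hit record and raycast stub: stand-ins for whatever the
// SVO raycaster already returns. Replace svoTrace() with the real traversal.
struct Hit { float3 pos, normal, albedo, emission; bool valid; };

__device__ Hit svoTrace(const float3& orig, const float3& dir)
{
    Hit h; h.valid = false;   // placeholder: the real SVO raycast goes here
    return h;
}

// Tiny xorshift RNG (seed must be non-zero) so the sketch needs no library RNG.
__device__ inline float rnd(unsigned int& s)
{
    s ^= s << 13; s ^= s >> 17; s ^= s << 5;
    return (s & 0x00FFFFFFu) / 16777216.0f;   // uniform in [0,1)
}

// Cosine-weighted direction in the hemisphere around normal n.
__device__ float3 cosineSample(const float3& n, unsigned int& seed)
{
    float phi = 6.2831853f * rnd(seed);
    float r2  = rnd(seed), r2s = sqrtf(r2);
    float3 w = n;
    float3 a = fabsf(w.x) > 0.1f ? make_float3(0, 1, 0) : make_float3(1, 0, 0);
    float3 u = normalize(cross(a, w));
    float3 v = cross(w, u);
    return normalize(u * (cosf(phi) * r2s) + v * (sinf(phi) * r2s) + w * sqrtf(1.0f - r2));
}

// The raycaster becomes a path tracer by looping over bounces: add any
// emitted light we hit, attenuate the throughput by the diffuse albedo
// (cosine and pdf cancel for cosine-weighted sampling), then continue the
// path in a newly sampled direction.
__device__ float3 radiance(float3 orig, float3 dir, unsigned int& seed, int maxBounces)
{
    float3 L          = make_float3(0, 0, 0);
    float3 throughput = make_float3(1, 1, 1);
    for (int bounce = 0; bounce < maxBounces; ++bounce)
    {
        Hit h = svoTrace(orig, dir);
        if (!h.valid) break;                            // path escaped the scene
        L          = L + throughput * h.emission;       // light source hit
        throughput = throughput * h.albedo;             // diffuse reflectance
        orig = h.pos + h.normal * 1e-3f;                // offset against self-intersection
        dir  = cosineSample(h.normal, seed);
    }
    return L;
}
```

Called once per pixel per frame with a fresh seed and averaged over frames, this is all an interactive unbiased preview needs; the image simply converges as samples accumulate.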


Thursday, November 26, 2009

Soon...


I hope so goddammit ;-). Anything on the new Ruby, Fusion Render Cloud or that mysterious anime game would be more than fine.

In the meantime, there's an excellent article about cloud gaming in the November issue of Game Developer, written by Jake Cannell:

http://gamedeveloper.texterity.com/gamedeveloper/200911?pg=11#pg11
(you can read the last three pages without a subscription)

Thursday, October 22, 2009

Nvidia's realtime ray tracing in the cloud: RealityServer and iray



mental images' RealityServer now has the option to render on a GPU cluster with the CUDA-based iray.

A very impressive interactive ray tracing demo, first shown publicly at Nvidia's GPU Tech Conference: http://www.megaupload.com/?d=RCYDVG6S (H.264, 180 MB)

A video of iray integrated in Google SketchUp: http://www.youtube.com/watch?v=i6LBhws8l-A

RealityServer is sold in different configurations, ranging from 8 to 100+ Tesla GPUs. The biggest cluster configurations are targeted at rendering online virtual worlds and online entertainment, and can serve multiple concurrent users per GPU.




More info:

http://www.hpcwire.com/features/NVIDIA-Pitches-GPU-Computing-in-the-Cloud-65217572.html

http://brightsideofnews.com/news/2009/10/20/nvidia-launches-gpu-based-realityserver-30.aspx?pageid=0


Nvidia also showed a demo of an interactively raytraced Bugatti with global illumination running on OptiX at GTC:


video: http://www.megaupload.com/?d=YAOSYWK1 (H.264, 82 MB)

Wednesday, October 14, 2009

New CryENGINE3 trailer



in HD: http://www.youtube.com/watch?v=7tPfM1QnPlo&hd=1

Again the trailer explicitly mentions "Integrated Voxel Objects". I don't think it refers to the voxel sculpting tool for making things like caves in Sandbox Editor 2. I believe it is a reference to the SVO tech that I discussed in the previous post, because it is shown in the trailer among other features that are specific to CryENGINE3: procedural content, multi-core support, streaming infinite worlds, (advanced AI), deferred lighting, realtime global illumination, CryENGINE 3 Live Create (whatever that is).

Tuesday, September 29, 2009

Sparse Voxel Octree in CryEngine 3

Slides from the Crytek presentation at CEDEC 2009 reveal that the SVO technology is going to be part of CryEngine 3. The difference with id Software's (Jon Olick's) SVO tech seems to be that Crytek rasterizes the SVO, while id uses raycasting.
One thing that made me smile: "GPU rasterized version of SVOs & its realtime content creation only available in CryENGINE 3" is listed as a con. Nice joke, Crytek!

Other slides list the pros and cons of point-based rendering, raytracing and rasterization: http://www.4gamer.net/games/092/G009273/20090903001/



If Crytek continues innovating at this pace, there’s a good chance that CryEngine3 will become the Unreal Engine 3 of the next generation and will attract many licensees.

Tuesday, September 22, 2009

OTOY tech will power movie/game in 2012

OTOY and Big Lazy Robot are working on a game/movie that will apparently use OTOY's voxel raytracing technology. It will be rendered server-side and deliver perfectly photorealistic graphics. The game and movie should be released in 2012:



http://www.brightsideofnews.com/news/2009/9/21/worlds-first-ray-traced-pc-game-and-movie-to-arrive-in-2012.aspx


Tuesday, August 18, 2009

The Future of Game Graphics according to Crytek

Cevat Yerli gave a keynote at GDC Europe about Crytek's next engine and its rendering techniques. Incrysis has pictures of the keynote slides: http://www.incrysis.com/index.php?option=com_content&task=view&id=818&Itemid=1

Two interesting slides:





According to Cevat Yerli, their next-generation engine will use a mix of ray tracing, rasterization, point-based rendering and SVOs (translated from http://www.golem.de/0908/69105.html):
Yerli already talked about the technique a year ago; other graphics programmers like John Carmack and Jon Olick are researching it as well. According to Yerli, sparse voxel octrees will form the basis of the next version of the CryEngine, but it will only be ready in a few years.
From http://www.gamasutra.com/php-bin/news_index.php?story=24865
He then focused on the actual technical innovations that he feels will make a difference in graphics. For example, tech like point-based rendering is potentially faster than triangle-based rendering at certain higher qualities, and works well with levels of detail.

On the other hand, point-based rendering might define a certain super-high polygon look for games, Yerli said. However: "There's a lot of games today in the Top 10 which don't need that", he conceded, and content creation tools are almost exclusively based around triangles right now.

He also noted ray-tracing as a possible rendering method to move towards, and particularly recommended rasterization and sparse voxel octrees for rendering. Such principles will form "the core" of future technology for Crytek's next engine, Yerli said, and the goal is to "render the entire world" with the voxel data structure.
Concluding, Yerli suggested that, after 2013, there are opportunities with new APIs and hardware platforms to "mix and match" between multiple rendering models, with "a Renaissance of graphics programming", and visual fidelity on a par with movies such as Shrek and Ice Age rendered in real time.

Friday, August 14, 2009

Next gen GPU from AMD to be unveiled on Sep 10



Now let's hope the "you won't believe your eyes" refers to a new Ruby demo from OTOY!

Carmack keynote at QuakeCon 09

A complete video of this year's keynote can be found at http://www.quakeunity.com/file=2919

Carmack considers cloud computing an interesting way forward for different game types. He thinks that a reduction of the latency to 50ms is achievable. However, he believes that the current internet infrastructure still needs a lot of work before fast-paced "twitchy" shooters like Quake are possible.

Two live blogging reports:

Oh boy, someone's asking about Cloud Computing.
Carmack says it's "wonderful."
Talking about how parallel processing is where it's at now.
But there are physical limits.
Especially in terms of how much power computers/consoles are drawing.
Cloud Computing prevents server-side cheating.
Still some serious disadvantages when it comes to fast-twitch games like Quake, though.
With 50 millisecond lag, though, anything's possible.
Some games even have that much lag internally.
Carmack thinks Cloud Computing could be a significant force in a few years -- perhaps even a decade.
http://www.vg247.com/2009/08/13/quakecon-press-conference-liveblog-today-at-930pm-bst/


Question about cloud computing and OnLive
Carmack says cloud computing is wonderful
But Carmack says about cloud computing for gaming that you start winding up coming up against power limits
So says common computing resources may be helpful, because the power bricks on something like a 360 are showing a looming power problem
Latency is the killer, Carmack says
Says The Sims would work with a cloud setup
But thinks twitch gaming will be the last kind of game that could work
Says that cloud computing would limit cheating
Thinks you could get games down to 50 millisecond lags as they're streamed via cloud computing
Wouldn't be shocked if in ten years cloud computing is a significant paradigm for some kinds of games
http://kotaku.com/5336589/the-john-carmack-keynote-liveblogging-quakecon


On OnLive-type services... latency is the issue, but a lot more classes of games than people think could be feasible, an example being "The Sims"; twitch games like Quake would be the hardest... upside is client-side cheating vanishes. Key will be optimizing network stacks for reasonable latency. Definitely thinks it's not a crazy idea and has very interesting potential.

http://forum.beyond3d.com/showthread.php?t=54817

Sunday, August 9, 2009

The race for real-time ray tracing (Siggraph 2009)

This year's Siggraph showed that a lot of the big companies in 3D are heavily involved in realtime and interactive raytracing research.

Nvidia: OptiX, mental images (RealityServer, iray)

Intel: LRB (Larrabee)

AMD: nothing AMD-specific announced yet

Caustic Graphics: CausticRT, BrazilRT, integration in 3ds Max Design 2010, LightWork Design, Robert McNeel & Associates, Realtime Technology AG (RTT AG), Right Hemisphere and Splutterfish

Then there was also this extremely impressive demonstration of V-Ray RT for GPUs, which caught many by surprise:

video: https://www.youtube.com/watch?v=DJLCpS107jg



and http://www.spot3d.com/vrayrt/gpu20090725.mov

V-Ray RT rendering on a GTX 285 using CUDA: a Cornell box with 5 bounces of physically correct global illumination at 40 fps, and an 800k-polygon Colosseum with 5 GI bounces at around 4 fps (with progressive rendering). Since it will be ported to OpenCL, GPUs from AMD and Intel will be able to run it as well.
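The "progressive rendering" part is conceptually simple: every frame adds one more path-traced sample per pixel to an accumulation buffer, and the display shows the running average, so the image starts noisy and converges over time (and restarts whenever the camera moves). A minimal CUDA sketch of that resolve step, purely my own illustration and nothing to do with V-Ray's actual code:

```cuda
// Progressive refinement: every frame, add the new per-pixel radiance sample
// to an accumulation buffer and display the running average. On camera
// movement the host clears 'accum' and resets 'frameCount' to zero, so the
// image restarts noisy and converges again.
__global__ void accumulateAndResolve(const float3* frameSamples, // this frame's radiance per pixel
                                     float3*       accum,        // running sum over all frames
                                     uchar4*       display,      // 8-bit output image
                                     int numPixels, int frameCount)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numPixels) return;

    float3 sum = make_float3(accum[i].x + frameSamples[i].x,
                             accum[i].y + frameSamples[i].y,
                             accum[i].z + frameSamples[i].z);
    accum[i] = sum;

    float inv = 1.0f / (float)(frameCount + 1);   // mean of all samples so far
    float r = fminf(sum.x * inv, 1.0f);           // naive clamp instead of proper tonemapping
    float g = fminf(sum.y * inv, 1.0f);
    float b = fminf(sum.z * inv, 1.0f);
    display[i] = make_uchar4((unsigned char)(r * 255.0f),
                             (unsigned char)(g * 255.0f),
                             (unsigned char)(b * 255.0f), 255);
}
```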

The Chaos Group and mental images presentations indicate that rendering is going to move from the CPU to the GPU very soon and will become increasingly realtime.

Caustic Graphics would like to see their cards end up in a next-gen console, as they target gaming as well. During their presentation at the Autodesk booth, they also mentioned cloud gaming as an option: put a lot of Caustic accelerators in a server and create a fully raytraced game rendered server-side. Chaos Group could in fact do the same: they could use their V-Ray RT GPU tech in a cloud rendering environment and make a photorealistic game doing realtime raytracing on a bunch of GPUs (V-Ray RT GPU supports multi-GPU setups and distributed rendering), with fully accurate and dynamic GI rendered on the fly. And if they don't do it, I'm sure mental images will with iray and RealityServer.

There were also some interesting presentations about the future of realtime graphics and alternative rendering pipelines, which all suggested a bigger focus on ray tracing and REYES and indicated that pure rasterization will become less important in the not-so-distant future.

On the game development front, Tim Sweeney (and of course Carmack with the SVO stuff) is exploring ray tracing/ray casting for his next-generation engine: http://news.cnet.com/8301-13512_3-10306215-23.html
and
http://graphics.cs.williams.edu/archive/SweeneyHPG2009/TimHPG2009.pdf

According to Sweeney, next-generation rendering pipelines could include a mix of ray tracing, REYES, voxel raycasting and other volume rendering techniques, all implemented in a GPGPU language: REYES for characters and other dynamic objects, voxels for the static environment and foliage, and ray tracing for reflections, refractions and maybe some form of global illumination.

Friday, August 7, 2009

John Carmack talking about cloud computing...

... and getting super excited! Check it out for yourself:

http://www.youtube.com/watch?v=N_Ge-35Ld70

This 10-minute video shows the last part of a three-part video interview. Carmack talks a little bit about the sparse voxel octree raycasting research that he's excited to get back to. Then he wanders off into cloud computing, and his excitement visibly goes through the roof.

From http://www.eurogamer.net/articles/digitalfoundry-carmack-next-gen-blog-entry :

"The big question is, are we going to be able to do a ray-casting primitive for a lot of things?" he ponders. "Certainly we'll still be doing a lot of conventional stuff like animated characters and things like that very likely will be drawn not incredibly differently from how they're drawn now. Hopefully we'll be able to use some form of sparse voxel octree representation cast stuff for some of the things in the world that are gonna be rigid-bodied... maybe we'll have deformations on things like that. But that's a research project I'm excited to get back to in the relatively near future. We can prototype that stuff now on current hardware and if we're thinking that... this type of thing will be ten times faster on the hardware that ends up shipping, we'll be able to learn a lot from that."

However, while he predicts that the leaps in cutting edge console technology are set to continue (certainly there is no hint from him that Microsoft or Sony will follow a Wii-style strategy of simply adding minor or incremental upgrades to their existing hardware), we are swiftly reaching the point where platform holders will be unable to win their battles against the laws of physics.

"We talk about these absurd things like how many teraflops of processing and memory that are going into our game machines," Carmack says, speculating off-hand that the next gen consoles will have at least 2GB of internal RAM. "It's great and there's going to be at least another generation like that, although interestingly we are coasting towards some fundamental physical limits on things. We've already hit the megahertz wall and eventually there's going to be a power density wall from which you won't get more processing out there..."

That being the case, he speculates that the game-makers could move into different directions to provide new game experiences and at that point, the almost mythical cloud computing concept could make an impact.

"There'll be questions of whether we shift to a cloud computing infrastructure... lots of interesting questions about whether you have the computing power in your living room versus somewhere else," he says, noting that while latency is a fundamental issue, the sheer scope of storage available online opens up intriguing possibilities. "Certainly the easier aspect of that is 'net as storage' where it's all digital distribution and you could wind up doing an idTech 5-like thing... and blow it up to World of Warcraft size so you need a hundred petabytes of storage in your central game system. We can do that now! It's not an absurd thing to talk about. Games are already in the tens of millions of dollars in terms of budget size and that's probably going to continue to climb there. The idea of putting millions of dollars into higher-sized storage... it's not unreasonable to at least consider."

Sunday, August 2, 2009

OTOY at Siggraph 2009 on Aug 3

SIGGRAPH 2009 Panel Explores Eye-Definition Computing and the Future of Digital Actors:

http://finance.yahoo.com/news/SIGGRAPH-2009-Panel-Explores-bw-367156942.html?x=0&.v=1



Interview with David Perry of Gaikai on some of the technical details (scaling, latency) of his cloud gaming service:

http://www.gamesindustry.biz/articles/david-perry-part-two

I've mentioned the blog EnterTheSingularity in a previous post, and its author Jake Cannell keeps writing very interesting and elaborate blog posts on voxels and cloud gaming, such as:

http://enterthesingularity.blogspot.com/2009/07/voxel-tracing.html

http://enterthesingularity.blogspot.com/2009/07/next-generation-of-gaming.html

Friday, July 10, 2009

GigaVoxels at Siggraph

Cyril Crassin will be giving a talk at Siggraph 2009 on voxel raycasting, and he has published a new short paper:

Beyond Triangles: Gigavoxels Effects in Video Games

Some screenshots of raycasted voxel scenes rendered with the GigaVoxels technique can be found here.

I also found an interesting blog (via Timothy Farrar's blog) called EnterTheSingularity, which features a very interesting and in-depth post about the "perceived" latency of cloud gaming services:

http://enterthesingularity.blogspot.com/2009_04_01_archive.html

The author (Jake Cannell, a programmer on Mercenaries 2) has clearly thought the idea of cloud gaming through quite well.

Thursday, July 2, 2009

Gaikai, another cloud gaming service

http://www.gamespot.com/news/6212860.html

http://games.venturebeat.com/2009/07/01/gaikai-demos-how-to-serve-games-over-the-internet/

Perry said the games can run on less than a megabit a second of Internet bandwidth. Most DSL connections offer around three megabits a second, coming downstream into the home. In the demo, Perry said, the server was about 800 miles away, which results in a round trip (ping rate) of 21 milliseconds. That’s a split second and not really noticeable. For a server much closer, the ping rate is about 10 milliseconds. While both OnLive and Otoy used custom servers with graphics chips, Perry said his service can run across low-end custom servers.

Perry contends the ping rate is fast enough to play most games. He showed racing games such as Mario 64 and other games such as World of Warcraft and Eve Online, all playing without downloaded software. He also showed how he could use Adobe Photoshop across the server.
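As a quick sanity check on those numbers (my own back-of-the-envelope estimate, not from the article): signals in optical fibre travel at roughly two thirds of c, about 200,000 km/s, so a server 800 miles (~1,290 km) away already implies a physical lower bound of about 13 ms for the round trip. A measured ping of 21 ms is therefore entirely plausible once routing and server overhead are added:

```latex
% Lower bound on the round-trip time for a server ~800 miles (~1290 km) away,
% assuming signals travel through fibre at roughly 2/3 of the speed of light.
\[
t_{\mathrm{RTT}} \gtrsim \frac{2 \times 1290\ \text{km}}{2 \times 10^{5}\ \text{km/s}}
                 \approx 12.9\ \text{ms}
\]
```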

Wednesday, July 1, 2009

Naughty Dog's Uncharted 2 does cloud computing

Naughty Dog gave a live chat yesterday in which they mentioned cloud computing as an engine feature.

Excerpt from the live chat:

2:37
[Comment From Pound_lb003]
What crazy technical terms can you throw at us, that the ND Engine 2.0 boasts?
2:38
[Comment From wsowen02]
How varied are the environments this time?
2:40
[Comment From Luke]
You told us the snow is going to be the best snow we ever saw. We got some footage of that snow in trailers, but can you tell us more, of how this snow is going to work? By the way, you're an awesome company Naughty Dog.
2:40
Evan Wells: Screen Space Ambient Occlusion, deferred rendering, cloud computing, paralax mapping, high dynamic range tonemapping, per object motion blur, cascade shadows, sub surface scattering simulation...


Source: http://blog.us.playstation.com/2009/06/30/uncharted-2-live-chat-with-naughty-dog



Seeing the term "cloud computing" among all these other high-end features shows that it has reached a similar drool-inducing status. I wonder what exactly is being computed in the cloud and whether it is similar to what OTOY is doing. It's probably wishful thinking, but maybe there is a link with PS Cloud, as Naughty Dog has also contributed a great deal to the PlayStation EDGE tools (together with Jon Olick). Naughty Dog is the best game developer on consoles in my opinion (especially in the fields of graphics, animation and cinematic storytelling), so it wouldn't surprise me if they are the first console developer to jump on the cloud computing train.

Update: interesting info about PS Cloud:

http://playstationlifestyle.net/2009/06/29/sony-drumming-up-a-digital-storm-with-playstationcloudcom/

Ever since the trademark filing of PS Cloud, speculation has been buzzing. The trademark covers “entertainment services, namely, providing an on-line video game that users may access through the internet”

We have found more evidence of PlayStation Cloud being accessible “through the internet”…

While doing some research, I performed a whois lookup on the domain playstationcloud.com. The results of my query showed that SCEA San Diego Studios is the owner of the domain.

SCEA San Diego recently mentioned while speaking to Gamasutra that they would be “exclusively” working on PSN game development. This goes against all speculation so far pegging PlayStation Cloud as a social networking service. Of course Sony San Diego could very well be developing a service or application, for the PSN with PlayStation Cloud.

Whatever PS Cloud is, it seems as though it’s going to be coming from Sony’s San Diego Studio.